
OERC Blog

A Weblog from the NN/LM Outreach Evaluation Resource Center

Qualitative Evaluation Week

The American Evaluation Association (AEA) just concluded a week-long blog theme about qualitative evaluation, which we’ve summarized below for your reference and to consider as part of your own assessment efforts:

  1. The Role of Context – the authors of this entry previously shared five elements of high-quality qualitative evaluation, and here they reference those elements while emphasizing that evaluators also need to understand the role that setting, relationships, and other contextual factors play in the data.
  2. Purposeful Sampling – a great explanation of why to avoid convenience sampling (interviewing people simply because they happen to be around), along with a caution about qualitative evaluation terminology: consider avoiding the word ‘sampling,’ which many people associate with random probability sampling.
  3. Interviewing People who are Challenging – establishing rapport leads to good qualitative data, but what does an interviewer do if there seems to be conflict with the interviewee? The details about how to manage your own feelings and approach the conversation with a curious mindset are very helpful!
  4. Asking Stupid Questions – this example from a bilingual HIV/AIDS training is especially insightful about the importance of clarifying sexual terms, setting aside the evaluator’s concerns about looking ‘stupid,’ and the deeper engagement and discussion that resulted from the group.
  5. Practical Qualitative Analysis – many helpful tips and lessons, including the reminder to group participants’ responses that answer the same question together, even when those replies come from different parts of the survey or interview (see the sketch after this list).
  6. Providing Descriptions – sometimes there are concerns that evaluation ‘only looks at the negative’; including full details about your qualitative data collection and analysis as an additional resource or appendix helps explain steps of the process that otherwise might not be evident.
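
For item 5, here is a minimal sketch of that grouping step in Python. The data, participant IDs, and question wording below are hypothetical and are not from the AEA authors' own method; the point is simply that a few lines can pull together every answer to the same question, no matter where in the instrument it appeared.

```python
from collections import defaultdict

# Hypothetical survey/interview export: each record notes which question a
# reply answers and which part of the instrument it came from.
responses = [
    {"participant": "P1", "section": "intro", "question": "What barriers do you face?",
     "answer": "Limited staff time."},
    {"participant": "P2", "section": "wrap-up", "question": "What barriers do you face?",
     "answer": "Our rural location."},
    {"participant": "P1", "section": "services", "question": "Which services do you value most?",
     "answer": "The training webinars."},
]

# Group every reply to the same question together, even when the replies
# came from different sections of the survey or interview.
grouped = defaultdict(list)
for r in responses:
    grouped[r["question"]].append((r["participant"], r["answer"]))

for question, answers in grouped.items():
    print(question)
    for participant, answer in answers:
        print(f"  {participant}: {answer}")
```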

Need more information about qualitative and other types of evaluation? The Outreach Evaluation Resource Center (OERC) has resources available including our Tools and Resources for Evaluation guide and our freely available Planning and Evaluating Health Information Outreach booklet series.

Assessing a Military Community

A returning soldier kisses his daughter after coming home from Iraq.
Photo by The U.S. Army

http://publiclibrariesonline.org/2015/01/library-services-for-the-new-normal-of-miltary-families/

This week the OERC would like to highlight a community assessment that led to a public library’s outreach to the military community of Cumberland County, North Carolina. This project is described in “Library Services for the ‘New Normal’ of Military Families” by Jennifer Taft and Cynthia Olney, which appears in the November/December issue of Public Libraries. This article demonstrates how assessments can be used to involve community members in the process, leading to their commitment to the success of the outreach project.

Cumberland County is the home of Fort Bragg, one of the nation’s largest military bases. The military community there includes service members, their families, and the people and organizations that work to support the installation. The Cumberland County Public Library and Information Center was looking for ways to better serve this population. Getting an LSTA grant allowed the library to conduct a thorough assessment of the military community and the organizations that serve them.

The assessment process involved the community every step of the way:

  1. Library staff participated in the Fayetteville Community Blueprint Network, made up of organizations that provide support for service members, veterans and their families.
  2. The Library hosted a community forum on post-traumatic stress, which was met with enthusiasm.
  3. The assessment team used key informant interviews and focus groups to collect data.
  4. They also met with additional groups of military parents.
  5. Results were validated by presenting them to the Living in the New Normal Committee, a group made up of representatives from community organizations that work with military families.
  6. The Library created marketing and program strategies based on these results.
  7. These strategies were validated by presenting them to an advisory group of representatives from organizations in Cumberland County or Fort Bragg.

The community assessment described in the article helped the library staff understand the military community in greater depth. For example, the project team originally believed that post-traumatic stress would be a major topic of interest. They learned, instead, that the military deployment cycle is the single biggest disruptor to family life. This cycle, from preparing for the service member’s departure, to adjusting to a single-parent situation, to reintegrating the returning service member back into the family, grows more stressful the more often it occurs. This finding had a significant impact on the library’s planning going forward. Other key findings are covered in the article, along with the library’s plan to respond to the community’s needs.

One of the clearest manifestations of the success of the assessment was the willingness of the community to support the library programs. For example, Community Blueprint Network organizational members participated in a library birthday celebration for the Army. Local military museums agreed to lend museum pieces to the library for display. Most significantly, the library’s requests to participate in on-post activities, which had not been approved before, were now met with enthusiasm by post personnel who had participated in the assessment process.

The OERC Welcomes Karen Vargas

Karen Vargas

The OERC welcomes evaluation specialist Karen Vargas, MSLS, who joined the staff on February 2.  Karen will create and present training sessions on various evaluation topics, contribute regularly to the OERC blog and LibGuide, and provide one-to-one evaluation assistance to the National Network of Libraries of Medicine.

Although Karen is new to the OERC staff, she is well known within NN/LM circles, particularly in the South Central Region. She joined the SCR Regional Medical Library in 2003 as the consumer health outreach coordinator, and later became the region’s outreach and evaluation coordinator.  Karen worked on a number of evaluation projects during her time at the SCR. Prior to joining the National Network, she worked at the Houston Public Library.

Karen will be telecommuting from Houston, Texas. We look forward to her contributions to the OERC.  She agreed to participate in an interview with Cindy Olney so we could introduce her to our blog readers.

What made you want to become part of NN/LM?

I liked all the opportunities to provide training offered through NN/LM. You get to see how this job impacts people’s lives.

What was your favorite evaluation project?

I really enjoyed doing the evaluation of the resource library outreach subcontract program at NN/LM SCR. We involved the resource library directors and outreach contacts in identifying the outcomes that they were interested in. We started with friendly general questions that developed into a process that they were willing to use. We managed to develop specific methods that led to good data. I liked helping people identify what was important to them and figuring out how to track their progress. It makes it more likely they will actually participate in the evaluation. Some of the libraries are still using the forms that we developed.

What did you find challenging about that project?

Multi-site evaluation was a challenge.  We were working with libraries in five states.  It was important that everyone follow directions, but we weren’t always successful in getting that point across. Some people using our forms were not the ones we originally trained and the ones we trained didn’t train their co-workers.  If I were to do it over again, we would have written out the directions more explicitly and had special training sessions for everyone who was collecting data for the project. We could have recorded the training, too.

What made you want to work with the OERC?

We’re helping organizations see the value of their programs. At SCR, we helped network members working on projects to focus on what is working and not waste time on parts that aren’t working. We were helping them feel good about what they were doing.  It’s important, for advocacy, to find out what stakeholders want. But it’s also good to figure out what you want. That’s what makes evaluation exciting. People figuring out what makes them satisfied, what they would see as success, and then figuring out how to measure it.

What I’ve liked best about working with NN/LM over the last 11 years is that, when we took people through the planning steps of writing an award proposal, they often did the project even if we didn’t fund it. They got so involved in planning that they found some way to do the project. I liked seeing them get excited.

What type of evaluation skills are you most interested in developing?

I would like to learn more about interviewing.

What else would you like readers to know about you?

I have a little girl named Sophia. She’s two years old and plays harmonica, ukulele, recorder, and can even get a sound on a trombone!

 

Another Coffee Break: Word and Excel Templates

AEA Coffee

Here at the Outreach Evaluation Resource Center (OERC) we began 2015 by blogging about the CDC Coffee Breaks. For February we’re offering a refill by featuring some notes from a recent American Evaluation Association (AEA) coffee break webcast. Unlike the CDC webcasts, the 20-minute AEA coffee break webcasts are not freely available to the public; they are a benefit of AEA membership. The webcast briefly covered best practices in data visualization using two commonly available tools, Microsoft Word and Excel, and how to automate their use by creating templates for consistent report formatting and an easier workflow.

Some great resources to learn more about how to do this, and to bookmark for future reference, include:

Specific for Word

Specific for Excel
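
If you find yourself rebuilding the same chart styling for every report, one option beyond Excel’s built-in template features is to script it. Below is a minimal sketch using Python and the openpyxl library, with placeholder data and styling choices; it is an alternative approach, not what the AEA webcast itself demonstrated.

```python
# Minimal sketch: write data to a workbook and add a consistently formatted
# bar chart, so every report starts from the same "template" styling.
# (Placeholder data; an alternative to building the template inside Excel.)
from openpyxl import Workbook
from openpyxl.chart import BarChart, Reference

wb = Workbook()
ws = wb.active
ws.title = "Attendance"

rows = [
    ("Quarter", "Attendees"),
    ("Q1", 42),
    ("Q2", 58),
    ("Q3", 35),
    ("Q4", 61),
]
for row in rows:
    ws.append(row)

chart = BarChart()
chart.title = "Workshop attendance by quarter"
chart.y_axis.title = "Attendees"
chart.x_axis.title = "Quarter"
chart.style = 10  # one of openpyxl's built-in chart styles, reused for every report

data = Reference(ws, min_col=2, min_row=1, max_row=len(rows))
categories = Reference(ws, min_col=1, min_row=2, max_row=len(rows))
chart.add_data(data, titles_from_data=True)
chart.set_categories(categories)
ws.add_chart(chart, "D2")

wb.save("quarterly_report.xlsx")
```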

 

52 Weeks of Better Evaluation

BetterEvaluation.org is an international collaboration that encourages the sharing of evaluation methods, approaches, and processes for improvement. BetterEvaluation offers yearly blog themes for its staff and guest writers to focus on, and has wrapped up the highlights of its ’52 Weeks of BetterEvaluation’ 2014 theme in a post at http://betterevaluation.org/node/4682. For 2015 they are featuring ’12 Months of BetterEvaluation,’ with multiple posts each month, starting with impact evaluation in January.

A ‘top 5’ selection from the ‘52 Weeks of BetterEvaluation’ post that is likely to be of interest to National Network of Libraries of Medicine (NN/LM) members includes:

  1. Top ten developments in qualitative evaluation over the past decade (link to part 1, part 2)
  2. Fitting reporting methods to evaluation findings and audiences (link)
  3. Infographics, including step by step instructions in piktochart (link)
  4. Innovation in evaluation (link)
  5. Presenting data effectively (link)

Freebie Friday: Measuring Success Toolkit

Measurement and Evaluation Staircase

Monitoring and evaluation (M&E) is a form of assessment used to help improve program performance and the achievement of program results; it is commonly used by both non-governmental organizations (NGOs) and government agencies. The staircase diagram above describes six questions that M&E can help answer through program planning, monitoring, and evaluation. More information clarifying the difference between monitoring and evaluation, as well as guidance for each of the six questions, is available at this link.

While not specific to health information outreach programs, the Measuring Success Toolkit at https://www.urbanreproductivehealth.org/toolkits/measuring-success from the Urban Reproductive Health Initiative covers health program planning, monitoring, and evaluation. The toolkit draws on the initiative’s multi-country experience of working with the urban poor and the significant health disparities they face, and it may be a helpful resource to consult with your health information outreach partners serving underserved communities. It includes subject-specific M&E resources, such as maternal and child health and HIV/AIDS, and the resources within the toolkit are selected by M&E experts and reviewed quarterly against established criteria to identify important, accurate, and up-to-date resources from diverse perspectives.

Focused Outreach Vermont (from NN/LM NER)

If you are planning or currently conducting an outreach project, you might want to take a look at the Focused Outreach Vermont article in the National Network of Libraries of Medicine, New England Region (NN/LM NER) newsletter (posted January 13, 2015). NN/LM NER’s Focused Outreach Project uses carefully planned outreach activities and strong community-based collaboration to connect underserved communities with NLM resources and services. The Ner’eastah article, which is an abstract of a full report, highlights outreach results through a succinct description of evaluation findings.

I particularly applaud NN/LM NER’s reporting method. They provide a quick overview featuring the results of their efforts, with easy access to full details for those who want them. The full report describes the project’s community assessment process and findings. You also get a more thorough description of documented outcomes, laid out in a highly readable format. A nice added feature is the infographic at the beginning of the report.

This is a great example of how to use evaluation to publicize and advocate for successful programs!


We would like to report more projects that demonstrate effective use of evaluation methods. If you have an example to share, send it to Cindy Olney at olneyc@uw.edu.

Freebie Friday: Mobile Data Solutions Course

Mobile Course Screenshot

Are you curious about using smartphones, tablets, or other mobile devices to collect data for your assessment project, but seeking more information on how to determine whether this is the right approach for your project or program, and how to process the data you collect this way?

Check out http://techchange.org/media/mobile-data-solutions/, which was created as part of the Mobile Solutions Technical Assistance and Research (mSTAR) project, with expertise provided by the U.S. Agency for International Development’s (USAID) Digital Development Lab, and designed by TechChange.

The primary goal of this freely available online course (free registration is required to access it) is to teach you about the mobile tools, processes, and strategies, referred to as mobile data solutions, that let you use mobile devices to their full potential for data collection. The course takes about two hours to complete and can be done at your own pace over time. Your progress is saved, so the next time you access the course you’ll pick up where you stopped.

The learning objectives of the course are

  • Describe examples of mobile data solutions from collection through visualization
  • Articulate the benefit of using these solutions
  • Analyze the challenges and limitations associated with mobile data solutions
  • Assess whether or not particular mobile data solutions are appropriate for a project, program or problem
  • Outline how to design a project or activity to include mobile data solutions
  • Explain the steps involved in implementing mobile data solutions
  • Summarize how to analyze, visualize, and share mobile data

 

Evaluation “Coffee Breaks” from the CDC

Want to build your repertoire of evaluation skills?  Check out this library of evaluation-related podcasts and webinars from the CDC’s Division of Heart Disease and Stroke Prevention.  These are archived documents from 20-minute presentations about evaluation. The usual basic topics are represented, such as “Making Logic Models Work for You”  and “How Do I Develop a Survey?” But a number of the presentations cover topics that are not standard fare. Here are just a few titles that caught my eye:

Most presentations consist of PDFs of PowerPoint slides and talking points, but there are a few podcasts as well.  All presentations seem to be bird’s-eye overviews, but the final slides offer transcripts of Q&A discussion and a list of resources for more in-depth exploration of the topic.  It’s a great way to check out a new evaluation interest!


Say No to Spaghetti: Effective Data Graphs

Line graph with 5 multicolor lines jumbled together

The illustration above is from Stephanie Evergreen’s excellent blog post, which cautions that care is needed when using a single line graph to show change over time for multiple organizations, so you don’t end up with a brightly colored bowl of spaghetti. The solution for passing on this pasta? Create small multiples: a separate graph for each series (each region, in her example), built one at a time on the same scale as the original graph and then stitched together with alignment tools such as a ruler and the Align > Align Top command in your graphing software. Be sure to see the end result and step-by-step guidance on how to create these at http://stephanieevergreen.com/declutter-dataviz-with-small-multiples/.
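
If you build your charts in code rather than in Excel, the same small-multiples idea can be sketched in a few lines of Python with matplotlib. The regional data below is made up for illustration; this is not Evergreen’s own workflow, just the same principle of one panel per series on a shared scale.

```python
import matplotlib.pyplot as plt

years = [2010, 2011, 2012, 2013, 2014]
regions = {  # hypothetical counts, one series per region
    "Northeast": [12, 15, 14, 18, 21],
    "Southeast": [9, 11, 16, 15, 19],
    "Midwest": [20, 18, 17, 22, 25],
    "West": [7, 10, 12, 11, 14],
}

# One small panel per region, all sharing the same y-axis scale,
# instead of one "spaghetti" chart with every line overlaid.
fig, axes = plt.subplots(1, len(regions), figsize=(12, 3), sharey=True)
for ax, (name, values) in zip(axes, regions.items()):
    ax.plot(years, values, color="steelblue")
    ax.set_title(name)
    ax.set_xticks(years[::2])  # thin the tick labels so panels stay readable

axes[0].set_ylabel("Classes taught")
fig.tight_layout()
fig.savefig("small_multiples.png", dpi=150)
```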

Showing change over time as a line graph instead of a bar graph is one of the quantitative data focus areas in our Outreach Evaluation Resource Center (OERC) webinar Data Burger: A ‘Good’ Questionnaire Response Rate Plus Basic Quantitative Analysis. You can listen to a recording of the Data Burger presentation for the Mid Atlantic Region at https://webmeeting.nih.gov/p2mn6k7tkv6/, and please contact us if you’d like to hear more about this or one of our other webinars.


Funded by the National Library of Medicine under contract # HHS-N-276-2011-00008-C.