
OERC Blog

A Weblog from the NN/LM Outreach Evaluation Resource Center

The OERC Welcomes Karen Vargas

Karen Vargas

The OERC welcomes evaluation specialist Karen Vargas, MSLS, who joined the staff on February 2.  Karen will create and present training sessions on various evaluation topics, contribute regularly to the OERC blog and LibGuide, and provide one-to-one evaluation assistance to the National Network of Libraries of Medicine.

Although Karen is new to the OERC staff, she is well known within NN/LM circles, particularly in the South Central Region. She joined the SCR Regional Medical Library in 2003 as the consumer health outreach coordinator, and later became the region’s outreach and evaluation coordinator.  Karen worked on a number of evaluation projects during her time at the SCR. Prior to joining the National Network, she worked at the Houston Public Library.

Karen will be telecommuting from Houston, Texas. We look forward to her contributions to the OERC.  She agreed to participate in an interview with Cindy Olney so we could introduce her to our blog readers.

What made you want to become part of NN/LM?

I liked all the opportunities to provide training offered through NN/LM. You get to see how this job impacts people’s lives.

What was your favorite evaluation project?

I really enjoyed doing the evaluation of the resource library outreach subcontract program at NN/LM SCR. We involved the resource library directors and outreach contacts in identifying the outcomes that they were interested in. We started with friendly general questions that developed into a process that they were willing to use. We managed to develop specific methods that led to good data. I liked helping people identify what was important to them and figuring out how to track their progress. It makes it more likely they will actually participate in the evaluation. Some of the libraries are still using the forms that we developed.

What did you find challenging about that project?

Multi-site evaluation was a challenge. We were working with libraries in five states. It was important that everyone follow directions, but we weren't always successful in getting that point across. Some of the people using our forms were not the ones we originally trained, and the ones we trained didn't train their co-workers. If we were to do it over again, we would write out the directions more explicitly and hold special training sessions for everyone collecting data for the project. We could record the training, too.

What made you want to work with the OERC?

We’re helping organizations see the value of their programs. At SCR, we helped network members working on projects to focus on what is working and not waste time on parts that aren’t working. We were helping them feel good about what they were doing.  It’s important, for advocacy, to find out what stakeholders want. But it’s also good to figure out what you want. That’s what makes evaluation exciting. People figuring out what makes them satisfied, what they would see as success, and then figuring out how to measure it.

What I've liked best about working with NN/LM over the last 11 years is that, when we took people through the planning steps of writing an award proposal, they often did the project even if we didn't fund it. They get so involved in planning that they find some way to do the project. I liked seeing them get excited.

What type of evaluation skills are you most interested in developing?

I would like to learn more about interviewing.

What else would you like readers to know about you?

I have a little girl named Sophia. She's two years old and plays harmonica, ukulele, recorder, and can even get a sound on a trombone!

 

Another Coffee Break: Word and Excel Templates

AEA Coffee

Here at the Outreach Evaluation Resource Center (OERC) we began 2015 blogging about the CDC Coffee Breaks. For February we're offering a refill by featuring some notes from a recent American Evaluation Association (AEA) coffee break webcast. Unlike the CDC's, the 20-minute AEA coffee break webcasts are not freely available to the public; they are an included benefit of AEA membership. The webcast briefly covered best practices in data visualization using two commonly available tools (Microsoft Word and Excel) and showed how to create templates in each for consistent report formatting and an easier workflow.

Some great resources for learning more about how to do this, worth bookmarking for future reference, include:

Specific for Word

Specific for Excel

 

52 Weeks of Better Evaluation

BetterEvaluation.org is an international collaboration that encourages sharing of evaluation methods, approaches, and processes for improvement. BetterEvaluation sets a yearly blog theme for its staff and guest writers, and has wrapped up the highlights of its '52 Weeks of BetterEvaluation' 2014 theme in a post at http://betterevaluation.org/node/4682. For 2015 it is featuring '12 Months of BetterEvaluation', with multiple posts each month, starting with impact evaluation in January.

A 'top 5' selection from the '52 Weeks of BetterEvaluation' post that is likely to be of interest to National Network of Libraries of Medicine (NN/LM) members includes:

  1. Top ten developments in qualitative evaluation over the past decade (link to part 1, part 2)
  2. Fitting reporting methods to evaluation findings and audiences (link)
  3. Infographics, including step by step instructions in piktochart (link)
  4. Innovation in evaluation (link)
  5. Presenting data effectively (link)

Freebie Friday: Measuring Success Toolkit

Measurement and Evaluation Staircase

Monitoring and evaluation (M&E) is a form of assessment used to improve the performance and results of programs; it is widely used by both non-governmental organizations (NGOs) and government agencies. The staircase diagram above presents six questions that M&E can help answer through program planning, monitoring, and evaluation. More information clarifying the difference between monitoring and evaluation, along with guidance for each of the six questions, is available at this link.

While not specific to health information outreach programs, the Measuring Success Toolkit at https://www.urbanreproductivehealth.org/toolkits/measuring-success, from the Urban Reproductive Health Initiative, covers health program planning, monitoring, and evaluation. The toolkit draws on the initiative's multi-country experience working with the urban poor and the significant health disparities they face, and may be helpful to consult when working with your own health information outreach partners in underserved communities. It includes subject-specific M&E resources, such as maternal and child health and HIV/AIDS. The resources within the toolkit are selected by M&E experts and reviewed quarterly against established criteria, with the aim of identifying important resources that offer diverse perspectives and accurate, up-to-date information.

Focused Outreach Vermont (from NN/LM NER)

If you are planning or currently conducting an outreach project, you might want to take a look at the Focused Outreach Vermont article in the National Network of Libraries of Medicine, New England Region (NN/LM NER) newsletter (posted January 13, 2015). NN/LM NER’s Focused Outreach Project uses carefully planned outreach activities and strong community-based collaboration to connect underserved communities with NLM resources and services. The Ner’eastah article, which is an abstract of a full report, highlights outreach results through a succinct description of evaluation findings.

I particularly applaud NN/LM NER’s reporting method. They provide a quick overview, featuring the results of their efforts, with easy access to full details for those who want it. The full report describes the project’s community assessment process and findings. You also get a more thorough description of documented outcomes, laid out in a highly readable format. A nice added feature is the infographic in the beginning of the report.

This is a great example of how to use evaluation to publicize and advocate for successful programs!


We would like to report more projects that demonstrate effective use of evaluation methods. If you have an example to share, send it to Cindy Olney at olneyc@uw.edu.

Freebie Friday: Mobile Data Solutions Course

Mobile Course Screenshot

Are you curious about using smartphones, tablets, or other mobile devices to collect data for your assessment project? Would you like to know how to determine whether this is the right approach for your project or program, and how to process the data you collect this way?

Check out http://techchange.org/media/mobile-data-solutions/, created as part of the Mobile Solutions Technical Assistance and Research (mSTAR) project, with expertise provided by the U.S. Agency for International Development's (USAID) Digital Development Lab and course design by TechChange.

This freely available online course (free registration is required to access it) teaches mobile tools, processes, and strategies for data collection, so that you can use mobile devices (referred to as mobile data solutions) to their full potential. The course takes about two hours to complete and can be done at your own pace over time. Your progress is saved, so the next time you access the course you'll be taken back to the point where you stopped.

The learning objectives of the course are

  • Describe examples of mobile data solutions from collection through visualization
  • Articulate the benefit of using these solutions
  • Analyze the challenges and limitations associated with mobile data solutions
  • Assess whether or not particular mobile data solutions are appropriate for a project, program or problem
  • Outline how to design a project or activity to include mobile data solutions
  • Explain the steps involved in implementing mobile data solutions
  • Summarize how to analyze, visualize, and share mobile data

 

Evaluation “Coffee Breaks” from the CDC

Want to build your repertoire of evaluation skills?  Check out this library of evaluation-related podcasts and webinars from the CDC’s Division of Heart Disease and Stroke Prevention.  These are archived documents from 20-minute presentations about evaluation. The usual basic topics are represented, such as “Making Logic Models Work for You”  and “How Do I Develop a Survey?” But a number of the presentations cover topics that are not standard fare. Here are just a few titles that caught my eye:

Most presentations consist of PDFs of PowerPoint slides and talking points, but there are a few podcasts as well.  All presentations seem to be bird’s-eye overviews, but the final slides offer transcripts of Q&A discussion and a list of resources for more in-depth exploration of the topic.  It’s a great way to check out a new evaluation interest!


Say No to Spaghetti: Effective Data Graphs

Line graph with 5 multicolor lines jumbled together

The illustration above is from Stephanie Evergreen's excellent blog post cautioning that line graphs showing change over time for multiple organizations can easily end up looking like a brightly colored bowl of spaghetti. The solution to this pasta effect? Create small multiples: one graph per group (each region, in her example), drawn one at a time on the same scale as the original graph, then stitched together with your graphing software's alignment tools (such as a ruler and the Align > Align Top command). Be sure to see the end result and step-by-step guidance on how to create these at http://stephanieevergreen.com/declutter-dataviz-with-small-multiples/.
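Evergreen's walkthrough uses Excel, but the underlying idea (one panel per group, every panel on the same scale) carries over to any charting tool. Here is a minimal sketch in Python with matplotlib; the region names and counts are made up purely for illustration:

```python
# Small multiples: one panel per region, all sharing one y-axis scale,
# instead of several tangled lines on a single "spaghetti" chart.
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

# Hypothetical training-session counts per quarter for three regions
data = {
    "Region A": [12, 18, 25, 31],
    "Region B": [40, 38, 35, 33],
    "Region C": [5, 9, 14, 22],
}
quarters = ["Q1", "Q2", "Q3", "Q4"]

# sharey=True is the key step: every panel gets the same scale, so the
# panels stay directly comparable once they sit side by side.
fig, axes = plt.subplots(1, len(data), sharey=True, figsize=(9, 3))
for ax, (region, counts) in zip(axes, data.items()):
    ax.plot(quarters, counts)
    ax.set_title(region)
axes[0].set_ylabel("Training sessions")
fig.tight_layout()
fig.savefig("small_multiples.png")
```

The same-scale requirement is what the blog post's "stitching with alignment tools" accomplishes by hand in Excel; here `sharey=True` does it automatically.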

Showing change over time as a line graph instead of a bar graph is one of the quantitative data focus areas in our Outreach Evaluation Resource Center (OERC) webinar Data Burger: A ‘Good’ Questionnaire Response Rate Plus Basic Quantitative Analysis. You can listen to a recording of the Data Burger presentation for the Mid Atlantic Region at https://webmeeting.nih.gov/p2mn6k7tkv6/, and please contact us if you’d like to hear more about this or one of our other webinars.

More Qualitative Data Visualization Ideas

In September, we blogged about a way to create qualitative data visualizations by chunking a long narrative into paragraphs with descriptive illustrations.

Ann Emery has shown six additional ways to visualize qualitative data:

  1. Strategic word cloud use (a single word cloud, or before/after comparisons)
  2. Quantitative + qualitative combined (a graph of percentages paired with a quote from an open-ended text comment)
  3. Photos alongside participant responses (only appropriate for non-anonymized data)
  4. Icon images beside text narratives
  5. Diagrams explaining processes or concepts (the Washington Post's illustration of a health worker's protective gear for Ebola is a great example)
  6. Graphic timelines

See these examples, with overviews on how to make your own, at http://annkemery.com/qual-dataviz/

Do you need more information about reporting and visualizing your data? We at the Outreach Evaluation Resource Center (OERC) have more resources available under the Reporting and Visualizing tab of our Tools and Resources for Evaluation Guide at http://guides.nnlm.gov/oerc/tools, and we welcome your comments and your suggestions for additional resources to include.

Practical and Ethical Guidelines for Conducting Photovoice Studies

If you think you might want to do a photovoice evaluation study, then you definitely should consult Practical Guidance and Ethical Considerations for Studies Using Photo-Elicitation Interviews by Bugos et al.  The authors reviewed articles describing research projects that employed photovoice and photo-elicitation.  Then, they skillfully synthesized the information into practical and ethical guidelines for doing this type of work.

Photo-elicitation refers specifically to the interviewing methods used to get participants to talk about their photographs and videos. The key contribution of this article is its focus on how to interview. Effective interviewing technique is essential because the photographs are meaningless unless you understand the participants' stories behind them. The practical guidelines help you elicit usable, trustworthy story data after the photographs have been taken.

While interviewing is the main focus of the article, you will find some advice on the photo collection phase as well. This article includes guidance on how to train your participants to protect their own safety and the dignity of their subjects when taking photographs. All of the research projects reviewed for this article received institutional review board approval. If you follow their guidelines, you can have confidence that you are protecting the safety, privacy and confidentiality of all involved.

Here is the full citation for this very pragmatic article:

Bugos E, Frasso R, FitzGerald E, True G, Adachi-Mejia AM, Cannuscio C. Practical Guidance and Ethical Considerations for Studies Using Photo-Elicitation Interviews. Prev Chronic Dis 2014;11:140216. DOI: http://dx.doi.org/10.5888/pcd11.140216

 


Last updated on Saturday, 23 November, 2013

Funded by the National Library of Medicine under contract # HHS-N-276-2011-00008-C.