Here at the Outreach Evaluation Resource Center (OERC) we began 2015 blogging about the CDC Coffee Breaks. For February we’re offering a refill by featuring some notes from a recent American Evaluation Association (AEA) coffee break webcast. Unlike the CDC’s, the 20-minute AEA coffee break webcasts are not freely available to the public but are an included benefit of AEA membership. The webcast briefly covered best practices in data visualization using two commonly available tools (Microsoft Word and Excel), and how to streamline their use by creating templates for consistent report formatting and an easier workflow.
Some great resources to learn more about how to do this, and to bookmark for future reference, include:
Specific to Word
Specific to Excel
BetterEvaluation.org is an international collaboration that encourages the sharing of evaluation methods, approaches, and processes for improvement. BetterEvaluation offers yearly blog themes for its staff and guest writers to focus on, and has wrapped up the highlights of its ’52 Weeks of BetterEvaluation’ 2014 theme in a post at http://betterevaluation.org/node/4682. For 2015 they are featuring ’12 Months of BetterEvaluation’, with multiple posts each month, starting with impact evaluation in January.
A ‘top 5’ selection from the ‘52 Weeks of BetterEvaluation’ post that is likely to be of interest to National Network of Libraries of Medicine (NN/LM) members includes:
- Top ten developments in qualitative evaluation over the past decade (link to part 1, part 2)
- Fitting reporting methods to evaluation findings and audiences (link)
- Infographics, including step-by-step instructions for Piktochart (link)
- Innovation in evaluation (link)
- Presenting data effectively (link)
Monitoring and evaluation (M&E) is a form of assessment used to help improve program performance and the achievement of program results; it is often used by both non-governmental organizations (NGOs) and government agencies. The staircase diagram above describes six questions that M&E can help answer through program planning, monitoring, and evaluation. More information clarifying the difference between monitoring and evaluation, as well as guidance for each of the six questions, is available at this link.
While not specific to health information outreach programs, the Measuring Success Toolkit at https://www.urbanreproductivehealth.org/toolkits/measuring-success from the Urban Reproductive Health Initiative covers health program planning, monitoring, and evaluation. The toolkit draws on the initiative’s multi-country experience of working with the urban poor and the significant health disparities they face, which may be helpful to consult alongside your health information outreach partners serving underserved communities. It includes subject-specific M&E resources, such as maternal & child health and HIV/AIDS, and the resources within the toolkit are selected by M&E experts and reviewed quarterly against established criteria to identify important resources from diverse perspectives with accurate, up-to-date information.
If you are planning or currently conducting an outreach project, you might want to take a look at the Focused Outreach Vermont article in the National Network of Libraries of Medicine, New England Region (NN/LM NER) newsletter (posted January 13, 2015). NN/LM NER’s Focused Outreach Project uses carefully planned outreach activities and strong community-based collaboration to connect underserved communities with NLM resources and services. The Ner’eastah article, which is an abstract of a full report, highlights outreach results through a succinct description of evaluation findings.
I particularly applaud NN/LM NER’s reporting method. They provide a quick overview, featuring the results of their efforts, with easy access to full details for those who want them. The full report describes the project’s community assessment process and findings. You also get a more thorough description of documented outcomes, laid out in a highly readable format. A nice added feature is the infographic at the beginning of the report.
This is a great example of how to use evaluation to publicize and advocate for successful programs!
We would like to report more projects that demonstrate effective use of evaluation methods. If you have an example to share, send it to Cindy Olney at email@example.com.
Are you curious about using smartphones, tablets, or other mobile devices to collect data for your assessment project? Are you seeking more information on how to determine whether this approach is right for your project or program, and how to process the data you collect this way?
Check out http://techchange.org/media/mobile-data-solutions/, which was created as part of the Mobile Solutions Technical Assistance and Research (mSTAR) project, with expertise provided by U.S. Agency for International Development’s (USAID) Digital Development Lab and designed by TechChange.
The primary goal of this freely available online course (free registration is required to access it) is to teach you about mobile tools, processes, and strategies for data collection so that you can use mobile devices (referred to as mobile data solutions) to their full potential. The course takes about two hours to complete and can be done at your own pace over time. Your progress is saved, so the next time you access the course you’ll be taken back to the point where you stopped.
The learning objectives of the course are:
- Describe examples of mobile data solutions from collection through visualization
- Articulate the benefit of using these solutions
- Analyze the challenges and limitations associated with mobile data solutions
- Assess whether or not particular mobile data solutions are appropriate for a project, program or problem
- Outline how to design a project or activity to include mobile data solutions
- Explain the steps involved in implementing mobile data solutions
- Summarize how to analyze, visualize, and share mobile data
Want to build your repertoire of evaluation skills? Check out this library of evaluation-related podcasts and webinars from the CDC’s Division of Heart Disease and Stroke Prevention. These are archived documents from 20-minute presentations about evaluation. The usual basic topics are represented, such as “Making Logic Models Work for You” and “How Do I Develop a Survey?” But a number of the presentations cover topics that are not standard fare. Here are just a few titles that caught my eye:
Most presentations consist of PDFs of PowerPoint slides and talking points, but there are a few podcasts as well. All presentations seem to be bird’s-eye overviews, but the final slides offer transcripts of Q&A discussion and a list of resources for more in-depth exploration of the topic. It’s a great way to check out a new evaluation interest!
The illustration above is from Stephanie Evergreen‘s excellent blog post cautioning that line graphs showing change over time for multiple organizations can end up looking like a brightly colored bowl of spaghetti. The solution to passing on this pasta effect? Creating small multiples: separate graphs, one per region in this example, each drawn one at a time on the same scale as the original graph and then stitched together with alignment tools such as a ruler and the Align > Align Top command in your graphing software. Be sure to see the end result and step-by-step guidance on how to create these at http://stephanieevergreen.com/declutter-dataviz-with-small-multiples/
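Evergreen’s post walks through this in Excel, but the same small-multiples idea can be sketched in a few lines of Python with matplotlib. The region names and counts below are made up for illustration; the key point is `sharey=True`, which enforces the identical scale that makes the panels comparable:

```python
# A minimal small-multiples sketch: one line graph per region,
# all on the same y-axis scale, instead of one spaghetti chart.
import matplotlib
matplotlib.use("Agg")  # render off-screen (no display needed)
import matplotlib.pyplot as plt

# Made-up example data: yearly counts for four hypothetical regions
years = [2011, 2012, 2013, 2014]
regions = {
    "Northeast": [10, 14, 13, 18],
    "Southeast": [8, 9, 15, 12],
    "Midwest":   [12, 11, 10, 16],
    "West":      [7, 13, 14, 15],
}

# sharey=True keeps every panel on the identical scale
fig, axes = plt.subplots(1, len(regions), sharey=True, figsize=(10, 2.5))
for ax, (name, values) in zip(axes, regions.items()):
    ax.plot(years, values)
    ax.set_title(name)
fig.tight_layout()
fig.savefig("small_multiples.png")
```

Because the panels share one scale, a reader can compare regions at a glance without untangling overlapping lines.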
Showing change over time as a line graph instead of a bar graph is one of the quantitative data focus areas in our Outreach Evaluation Resource Center (OERC) webinar Data Burger: A ‘Good’ Questionnaire Response Rate Plus Basic Quantitative Analysis. You can listen to a recording of the Data Burger presentation for the Mid Atlantic Region at https://webmeeting.nih.gov/p2mn6k7tkv6/, and please contact us if you’d like to hear more about this or one of our other webinars.
In September, we blogged about a way to create qualitative data visualizations by chunking a long narrative into paragraphs with descriptive illustrations.
Ann Emery has shown six additional ways to create qualitative data visualizations: 1) strategic word cloud use (one word, or before/after comparisons); 2) quantitative + qualitative combined (a graph of percentages paired with a quote from an open-ended text comment); 3) photos alongside participant responses (only appropriate for non-anonymized data); 4) icon images beside text narratives; 5) diagrams explaining processes or concepts (the Washington Post’s illustration of a health worker’s protective gear for Ebola is a great example); and 6) graphic timelines. See these examples and overviews of how to make your own at http://annkemery.com/qual-dataviz/
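As a rough sketch of the second technique above, pairing a graph of percentages with a verbatim quote, here is a small matplotlib example. The survey themes, percentages, and quote are all invented for illustration:

```python
# A "quantitative + qualitative" visual: a bar chart of percentages
# annotated with a representative participant quote.
import matplotlib
matplotlib.use("Agg")  # render off-screen (no display needed)
import matplotlib.pyplot as plt

# Made-up themes coded from open-ended survey comments
themes = ["Found info faster", "More confident", "No change"]
percents = [62, 28, 10]

fig, ax = plt.subplots(figsize=(7, 3))
ax.barh(themes, percents)
ax.set_xlabel("Percent of respondents")

# Pair the numbers with a verbatim quote from the qualitative data
quote = '"I can finally find answers for my patrons on my own."'
ax.text(0.98, 0.05, quote, transform=ax.transAxes,
        ha="right", va="bottom", fontsize=9, style="italic")
fig.tight_layout()
fig.savefig("quant_plus_qual.png")
```

The quote gives readers a human voice behind the percentages, which is exactly the combination Emery demonstrates.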
Do you need more information about reporting and visualizing your data? We at the Outreach Evaluation Resource Center (OERC) have more resources available for you on the Reporting and Visualizing tab of our Tools and Resources for Evaluation Guide at http://guides.nnlm.gov/oerc/tools, and we welcome your suggestions for additional resources to include, as well as your comments.
If you think you might want to do a photovoice evaluation study, then you definitely should consult Practical Guidance and Ethical Considerations for Studies Using Photo-Elicitation Interviews by Bugos et al. The authors reviewed articles describing research projects that employed photovoice and photo-elicitation. Then, they skillfully synthesized the information into practical and ethical guidelines for doing this type of work.
Photo-elicitation refers specifically to the interviewing methods used to get participants to talk about their photographs and videos. The key contribution of this article is its focus on how to interview. Effective interviewing technique is essential because the photographs are meaningless unless you understand the participants’ stories behind them. The practical guidelines help you elicit usable, trustworthy story data after the photographs have been taken.
While interviewing is the main focus of the article, you will find some advice on the photo collection phase as well. This article includes guidance on how to train your participants to protect their own safety and the dignity of their subjects when taking photographs. All of the research projects reviewed for this article received institutional review board approval. If you follow their guidelines, you can have confidence that you are protecting the safety, privacy and confidentiality of all involved.
Here is the full citation for this very pragmatic article:
Bugos E, Frasso R, FitzGerald E, True G, Adachi-Mejia AM, Cannuscio C. Practical Guidance and Ethical Considerations for Studies Using Photo-Elicitation Interviews. Prev Chronic Dis 2014;11:140216. DOI: http://dx.doi.org/10.5888/pcd11.140216
Rural and medically underserved areas often face both increased health disparities and population health issues, combined with limited resources and fewer healthcare providers to help meet these challenges. Appropriate program evaluation measures can help assess what actually works in rural health settings, since many evidence-based strategies are based on urban populations.
The Rural Assistance Center (raconline.org) has recently issued a freely available online guide at http://www.raconline.org/topics/rural-health-research-assessment-evaluation. The guide:
- Identifies the similarities and differences among rural health research, assessment, and evaluation
- Discusses common methods, such as surveys and focus groups
- Provides contacts within the field of rural health research
- Addresses the importance of community-based participatory research to rural communities
- Looks at the community health needs assessment (CHNA) requirements for non-profit hospitals and public health
- Examines the importance of building the evidence-base so interventions conducted in rural areas have the maximum possible impact
Thanks to National Network of Libraries of Medicine (NN/LM) Network member Gail Kouame from HEALWA for sharing this great resource with us at the Outreach Evaluation Resource Center (OERC)! Do you have an evaluation-related resource to share? We would be happy to consider featuring it in our blog or including it in our Tools and Resources guide at guides.nnlm.gov/oerc/tools.