Archive for the ‘Practical Evaluation’ Category
Do you want to know more about great assessment resources, tools, and lessons learned from others who share your interest in evaluation?
But reluctant to add yet another professional journal to the TBR (to be read) stack in your office?
Check out the American Evaluation Association (AEA) 365 blog at http://aea365.org, where anyone (not only AEA members) can subscribe via email or Really Simple Syndication (RSS) feed. Blog guidelines cap contributions at 450 words per entry. Headers within each entry (Hot Tips, Cool Tricks, Rad Resources, or Lessons Learned) tell you at a glance what the subject is, and because all acronyms are spelled out and jargon is not allowed, no prior knowledge of evaluation or organizations is assumed.
A handy tip: scroll down the right sidebar of the website to find subjects arranged by the AEA Topical Interest Groups (TIGs). Those likely to interest National Network of Libraries of Medicine (NN/LM) members include Data Visualization and Reporting, Disabilities and Other Vulnerable Populations, Health Evaluation, Integrating Technology into Evaluation, and Nonprofits and Foundations Evaluation, to name only a few.
A brief review of a recent entry of interest to NN/LM members: Conducting a Health Needs Assessment of People With Disabilities shared lessons learned from needs assessment work done in Massachusetts, along with a rad resource, the Disability and Health Data System (DHDS), which provides state-level disability health data from the Centers for Disease Control and Prevention (CDC).
The Nielsen Norman Group (NNG) conducts research and publishes information about user experience with interfaces. NNG was an early critic of the troubled healthcare.gov web site: “Healthcare.gov’s Account Setup: 10 Broken Usability Guidelines.” A recent post, “Talking with Participants During a Usability Test,” provided tips for facilitating usability tests that could be very useful whenever you’re facilitating a discussion or conducting an observation. When in doubt about whether to speak to a participant, count to 10 before deciding whether to say something. Consider using the “Echo,” “Boomerang,” or “Columbo” approaches:
- Echo: repeat the participant’s last words or phrase, using an interrogatory tone.
- Boomerang: formulate a nonthreatening question that “pushes” a user’s comment back and prompts them to answer it themselves, such as “What would you do if you were on your own?”
- Columbo: be smart but don’t act that way, like Peter Falk’s detective in the “Columbo” TV series of the 1960s and 1970s.
The full article “Talking with Participants During a Usability Test” includes audio examples of these techniques. You can find much more information about usability testing on the Nielsen Norman Group’s web site, such as “How to Conduct Usability Studies” and “Usability 101: Introduction to Usability.”
So how are those New Year’s resolutions going?
Many of us like to start the year resolving to clean up some part of our lives. Our diet. Our spending habits. The five years of magazine subscriptions sitting by our recliner.
Here’s another suggestion: Resolve to clean up “chart junk” in the charts you add to PowerPoint presentations or written reports.
Now I can pack information into a bar chart with the best of them. But it is no longer in vogue to clutter charts with data labels, gridlines, and detailed legends. This is not just a fashion statement, either. Design experts point out that charts should make their point without a raft of distracting details. If the main point of your chart is not visually obvious, either you have not designed it well or you are presenting a finding that is not particularly significant.
So the next time you create a chart, consider these suggestions (a rough code sketch after the list shows several of them in practice):
- Use your title to communicate the main point of the chart. Take a tip from newspaper headlines and make your title a complete sentence.
- Don’t use three-dimensional displays; they interfere with people’s comprehension of charts.
- Ditch the gridlines or make them faint so they don’t clutter the view.
- Use contrast to make your point. Add a bright color to the bar or line that carries the main point and use gray or another faint color for the comparison bars or lines.
- Be careful when picking colors. Use contrasting colors that are distinguishable to people with colorblindness. If your report will be printed, be sure the contrast still shows up in black and white.
- Consider not using data labels, or just label the bar or line associated with your main point.
- Remove legends and place the labels directly inside the bars or at the ends of the lines.
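To make these suggestions concrete, here is a minimal sketch of a de-junked bar chart in Python with matplotlib (bar_label requires matplotlib 3.4 or later); the programs, scores, and highlight color are invented for illustration:

```python
# A minimal sketch of a "de-junked" bar chart; data are invented.
import matplotlib.pyplot as plt

programs = ["Program A", "Program B", "Program C", "Outreach"]
scores = [62, 58, 67, 84]

fig, ax = plt.subplots()

# Gray for the comparison bars; one bright color for the bar
# that carries the main point.
colors = ["#b0b0b0", "#b0b0b0", "#b0b0b0", "#1f77b4"]
bars = ax.bar(programs, scores, color=colors)

# Headline-style title: a complete sentence stating the finding.
ax.set_title("The outreach program scored highest in satisfaction")

# No gridlines, no legend; label only the bar that makes the point.
ax.bar_label(bars, labels=["", "", "", "84"])

# Strip remaining chart junk: the top and right spines.
for side in ("top", "right"):
    ax.spines[side].set_visible(False)

ax.set_ylabel("Satisfaction score")
plt.tight_layout()
plt.show()
```

The same moves apply in Excel or any other charting tool: one sentence-style title, one highlighted series, and as little else as possible.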
For more comprehensive information on eliminating chart junk, check out this article:
Evergreen S, Metzner C. Design principles for data visualization in evaluation. In: Azzam T, Evergreen S (eds). Data visualization, part 2. New Directions for Evaluation. Winter 2013:5-20.
Last week I attended an excellent webinar session presented by Kylie Hutchinson of Community Solutions Planning & Evaluation about the vast and often jargony world of evaluation terminology. As part of her research, Hutchinson consulted three online evaluation glossaries* and counted thirty-six different definitions of evaluation methods within them. What accounts for so much variation? Common reasons include the perspectives and language used by different sectors and funders, such as education, government, and non-profit organizations.
A helpful tip when working with organizations on evaluation projects: ask to see documents such as annual reports, mission and vision statements, strategic plans, and promotional materials to learn the language they use to communicate about themselves. This will help you decide whether you need to adjust your evaluation terminology, and can guide discussions that clarify the organization’s purpose for the evaluation.
Hutchinson identified several common themes within the plethora of evaluation methods and created color-coded clusters of them within her Evaluation Terminology Map, which uses the bubbl.us online mind mapping program. She also created a freely available Evaluation Glossary app for use on both iPhone and Android mobile devices and has a web-based version under development. For additional resources to better understand health information outreach evaluation, be sure to visit our tools website at http://guides.nnlm.gov/oerc/tools.
* Two of the three online evaluation glossaries referenced are still available online
The 2nd edition of the Planning and Evaluating Health Information Outreach Projects series of three booklets is now available online:
Getting Started with Community-Based Outreach (Booklet 1)
What’s new? More emphasis and background on the value of health information outreach, including its relationship to the Healthy People 2020 Health Communication and Health Information Technology topic areas.
Planning Outcomes-Based Outreach Projects (Booklet 2)
What’s new? Focus on uses of the logic model planning tool beyond project planning, such as providing approaches to writing proposals and reports.
Collecting and Analyzing Evaluation Data (Booklet 3)
What’s new? Step-by-step guide to collecting, analyzing, and assessing the validity (or trustworthiness) of quantitative and qualitative data, using questionnaires and interviews as examples.
These are all available free to NN/LM regional offices and network members. To request printed copies, send an email to email@example.com.
Non-508-compliant PDF versions of all three booklets are available here: http://nnlm.gov/evaluation/guides.html#A2.
The Planning and Evaluating Health Information Outreach series, by Cynthia Olney and Susan Barnes, supplements and summarizes material in Cathy Burroughs’ groundbreaking work from 2000, Measuring the Difference: Guide to Planning and Evaluating Health Information Outreach. Printed copies of Burroughs’ book are also available free—just send an email request to firstname.lastname@example.org.
“There’s probably a better way of doing this.” How many times have you muttered this statement while using Excel to analyze a database download or a spreadsheet of class evaluation data?
Or maybe you would like to try your hand at some of the hot new trends in data visualization, such as data dashboards or infographics, but find that your lack of familiarity with Excel holds you back.
Whether you are a novice or an experienced Excel user, check out Emery Evaluation’s “Excel for Evaluation” web page (http://emeryevaluation.com/excel/), with its series of videos demonstrating efficient ways to use Excel for data analysis and reporting. These videos, created by Ann Emery, are 1-4 minutes long, and each demonstrates a single Excel technique, such as a formula for recoding data or a way to merge data from two separate files. The videos use the 2010 version of Excel, so if you are working with an earlier version, some may not directly apply. They are organized around the steps of good data analysis: importing your data, organizing and cleaning the data, recoding, looking for patterns, calculating statistics, and creating charts.
I’ve been using Excel since 1988, and I STILL always feel as though I’m taking the long way around to completing an analysis. These videos confirmed that I was, indeed, right. There are better ways to use Excel, and Emery’s videos show how.
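And if you ever outgrow Excel, the same moves translate to other tools. Here is a rough sketch of two of the tasks the videos cover, recoding values and merging two files, in Python with pandas; the file names, column names, and rating scale below are hypothetical:

```python
# A rough pandas equivalent of two tasks the videos demonstrate in Excel:
# recoding values and merging data from two separate files.
# File names and column names here are hypothetical.
import pandas as pd

# Load class-evaluation responses and a separate attendee roster.
responses = pd.read_csv("class_evaluations.csv")
roster = pd.read_csv("attendee_roster.csv")

# Recode a text rating into numbers (the analogue of a nested IF
# or a lookup-table formula in Excel).
scale = {"Strongly disagree": 1, "Disagree": 2, "Neutral": 3,
         "Agree": 4, "Strongly agree": 5}
responses["rating_num"] = responses["rating"].map(scale)

# Merge the two files on a shared ID column (the analogue of VLOOKUP
# pulling columns from another sheet).
merged = responses.merge(roster, on="attendee_id", how="left")

# Quick pattern check: average numeric rating by class.
print(merged.groupby("class_name")["rating_num"].mean())
```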
In survey design workshops, we are often asked whether rating scales designed to measure respondents’ opinions and attitudes should have an odd number of points, including a neutral mid-point (i.e., “Neither agree nor disagree”), or whether it’s better to have an even number of points, without a mid-point. Our answer, which probably frustrates our participants to no end, is “it depends.” You have to think through clearly what a “neutral” answer means and choose accordingly. Here is a link to a clever blog entry by Jane Davidson that makes this point very well and gives you ways to think about how many points your rating scales should have:
Boxers or briefs? Why having a favorite response scale makes no sense
(There are some insightful readers’ comments to this blog post that you might find interesting as well.)
At the OERC, we recommend using evaluation questions as a foundation for evaluation projects. The questions are useful in developing data collection methods, analyzing data, and organizing evaluation reports. If you are planning a needs assessment, you can take advantage of a tip sheet that provides needs assessment questions for you: The 6Ds of Needs Assessment. This one-page document will help you identify the information needed to advocate for your project or design your program. The 6Ds of Needs Assessment was created by Kylie Hutchinson, principal evaluator for Community Solutions Planning and Evaluation.
On July 27, 2012, Stephanie Evergreen, eLearning Initiatives Director for the American Evaluation Association, gave a half-hour webinar about the Ignite approach to giving presentations. This approach involves a 5-minute presentation based on 20 slides, each shown for 15 seconds. (Yes, it is similar to Pecha Kucha.) The American Evaluation Association, which is conducting a “Potent Presentations” initiative to help its members improve their reporting skills, has made the recording and slides for this great presentation available in its free AEA Public Library:
In her short, practical webinar, Stephanie demonstrated the Ignite approach with a great presentation about “Chart Junk Extraction”—valuable tips for creating streamlined, readable charts with maximized visual impact. Spend an enjoyable and enlightening few minutes viewing the fast-paced and interesting “Light Your Ignite Training Webinar”—you can even learn how to set your PowerPoint timer to move forward automatically every 15 seconds so that you can practice your Igniting!
Duncan V, Holtslander L. Utilizing grounded theory to explore the information-seeking behavior of senior nursing students. J Med Libr Assoc. 2012 Jan;100(1):20-27.
In this very practical article, the authors describe the steps they took to analyze qualitative data from written records that nursing students kept about their experiences with finding information in the CINAHL database. They point out that, although the ideal way to gather data about students’ information-seeking behavior would be via direct observation, that approach is not always practical. Also, self-reporting via surveys and interviews may create bias because members of sample populations might “censor themselves instead of admitting an information need.” For this study, students were asked to document their search process using an electronic template that included “prompts such as resource consulted, reason for choice, terms searched, outcome, comments, and sources consulted (people).”
After reviewing these searching journals, the authors followed up with interviews.
The “Data analysis and interpretation” section of this article provides a clear, concise description of the grounded theory approach to analyzing qualitative data, moving through initial, focused, and theoretical coding with NVivo 8 software. [Note: as of this writing, the latest version is NVivo 10.]
- Initial codes: “participants’ words were highlighted to create initial codes that reflected as closely as possible the participants’ own words.”
- Focused codes: “more directed, selective, and conceptual than word-by-word, line-by-line, and incident-by-incident coding.”
- Theoretical codes: focused codes were compared and contrasted “in order to develop the emerging theory of the information-seeking process.”
The authors reviewed the coding in follow-up interviews with participants to check the credibility of their findings: “The central theme that united all categories and explained most of the variation among the data was ‘discovering vocabulary.’” They recommend “teaching strategies to identify possible words and phrases to use” when searching for information.
You can do this even if you don’t have access to NVivo software. Here’s an illustration: “Summarize and Analyze Your Data” from the OERC’s Collecting and Analyzing Evaluation Data booklet.
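And if you would like to experiment with the mechanics without any special software, here is a toy sketch in Python. The excerpts and codes are invented, and real grounded-theory coding is interpretive work rather than mechanical matching; this only illustrates the bookkeeping of rolling initial codes up into focused codes:

```python
# A toy illustration of initial and focused coding without NVivo.
# Excerpts and codes are invented; this shows only the bookkeeping,
# not the interpretive judgment that real coding requires.
from collections import Counter

# Initial codes: short labels that stay close to participants' own words.
initial_codes = {
    "I couldn't figure out which subject heading to use": "unsure of heading",
    "I tried different words until something worked": "trying different words",
    "The database didn't recognize my search term": "term not recognized",
    "I asked a friend what word she searched with": "borrowing others' words",
}

# Focused codes: group initial codes under more conceptual categories.
focused_codes = {
    "unsure of heading": "discovering vocabulary",
    "trying different words": "discovering vocabulary",
    "term not recognized": "discovering vocabulary",
    "borrowing others' words": "seeking help",
}

# Count how often each focused code appears across the excerpts.
counts = Counter(focused_codes[c] for c in initial_codes.values())
for code, n in counts.items():
    print(f"{code}: {n} excerpt(s)")
```

Even in this toy form, you can see how a dominant category (here, “discovering vocabulary,” echoing the article’s central theme) emerges once initial codes are compared and grouped.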