
Data Viz: free training and other fun stuff

Coming soon to a computer near you! Chris Lysy of FreshSpectrum is offering a free seven-part data visualization workshop. Chris has provided data viz training for the American Evaluation Association, and his followers also love his cartoon-illustrated evaluation blog. He calls himself the Rachael Ray of data visualization, which makes his course description a nice feature for the OERC's Thanksgiving blog post.

The workshop date is still TBA, but  you can join his mailing list now to get full details when they are released.

FreshSpectrum logo

Also, Thanksgiving activities often include movie-viewing, so here are some fun data visualizations of famous movie quotes by FlowingData to help you through the last afternoon before the holiday weekend.

Poster of chart depictions of famous movie quotes

Freebie Friday: Data Visualization Options Flowchart

What would you like to show? Comparison, composition, relationship, distribution

Looking for an at-a-glance, single-page reference to help you decide which type of chart will communicate your results most clearly?

The PDF flowchart at http://betterevaluation.org/plan/describe/visualise_data is a very handy reference. It guides you to appropriate chart options based on your answer to the question "What would you like to show?": comparison, distribution, composition, or relationship. The Better Evaluation data visualization page also provides brief descriptions of the charts; click through to any of them (a deviation bar graph, for example) for synonyms, a base definition, examples of how the chart is used, advice about its use, and links to resources for creating it.
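If you like to tinker, the same decision logic is easy to sketch in a few lines of code. Here is a minimal, illustrative Python sketch that maps the "What would you like to show?" answer to a few common chart options; the chart lists are my own examples, not an exact reproduction of the Better Evaluation flowchart.

    # Illustrative sketch only: maps the "What would you like to show?" answer
    # to a few common chart options. The chart lists are examples, not an
    # exact reproduction of the Better Evaluation flowchart.
    CHART_OPTIONS = {
        "comparison": ["bar chart", "grouped bar chart", "line chart over time"],
        "distribution": ["histogram", "box plot", "scatter plot"],
        "composition": ["stacked bar chart", "pie chart", "area chart"],
        "relationship": ["scatter plot", "bubble chart"],
    }

    def suggest_charts(goal):
        """Return candidate chart types for a stated visualization goal."""
        key = goal.strip().lower()
        if key not in CHART_OPTIONS:
            raise ValueError(f"Expected one of {sorted(CHART_OPTIONS)}, got {goal!r}")
        return CHART_OPTIONS[key]

    print(suggest_charts("Comparison"))  # ['bar chart', 'grouped bar chart', ...]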

Liberate conversations through Liberating Structures

Nothing beats qualitative (non-numerical) data collection methods for getting a high volume of rich, interesting information from project participants and stakeholders. The downside is that these methods are resource intensive, so you usually are limited to involving a relatively small number of participants in conversation.

But what if you want to collect a lot of qualitative responses from a lot of people?

If you do, check out the Liberating Structures website. It provides step-by-step instructions for activities to engage large groups in conversations for planning and evaluation.  The website offers a menu of 33 activities with extensive planning details, plus ideas for combining activities into an almost unlimited number of group discussion formats.

I participated in a Liberating Structures activity in Denver last month when I attended the Quint*Essential Conference, hosted by five Medical Library Association chapters. Staff from National Network of Libraries of Medicine (NN/LM) regional offices invited all conference attendees to generate and evaluate ideas for future network initiatives. It was a high-energy activity that engaged more than 100 people in providing bold ideas for future activities.

The beauty of Liberating Structures activities is that the guidelines include how to document conversations so meeting facilitators will end their exercises with actual data. In some cases, the data can be quickly analyzed. NN/LM facilitators were able to compile and report results from the Quint discussion in the exhibit hall later that day.

I want to thank Claire Hamasu, the Associate Director of the NN/LM MidContinental Region, for pointing me to the Liberating Structures website and for including me in the Quint Conference activity. I personally look forward to trying more of these activities and hope other readers are inspired to do so as well.

 

Liberating Structures logo

Literature Search Strategy Week at AEA

We at the Outreach Evaluation Resource Center (OERC) have previously covered the American Evaluation Association's (AEA) tip-a-day blog at http://aea365.org/blog as a helpful resource. This week, librarians from the Lamar Soutter Library at the University of Massachusetts Medical School, who are Network members, shared posts about literature search strategies on the AEA blog. Have you been involved in a similar collaboration? Please let us know; we'd love to feature your work in a future OERC blog post!

Literature Search Strategy Week

  1. Best Databases – learn the most effective starting points for biomedical, interdisciplinary, and specialized literature databases, plus a handy Top Ten list.
  2. Constructing a Literature Search – learn the value of a vocabulary roadmap, and the difference between keyword and controlled vocabulary searching.
  3. Grey Literature – strategies for understanding these non-traditional but highly valuable information resources and starting points on where to find them.
  4. Using MyNCBI – learn how to sign up for your free account, save your PubMed search strategies, receive email updates, customize your display and more.
  5. Citation Management – featuring both freely available tools and others you may have access to through your academic organization.

The OERC’s Appreciative Inquiry Project: Seeking Strength-Based Change

AI commons logo

 

For the past couple of months, the OERC has engaged in an Appreciative Inquiry (AI) interview project to get feedback and advice from users about our services. Appreciative Inquiry was developed in the 1980s by David Cooperrider and Suresh Srivastva as an approach to bring “collaborative and strength-based change” to organizations. Its methods are designed to collect information that emphasizes positive aspects of an organization and a vision for a better future. Probably the best-known AI tool is the interview, which covers three basic areas:

  • A peak experience of the interviewee.
  • Why the interviewee found that experience so valuable.
  • What the interviewee wished could happen to bring about more exceptional experiences.

(You can find the OERC’s adaptation of these basic questions here.)

When people are first introduced to AI evaluation processes, they often ask, skeptically, whether this approach leads to positively biased data. I would say no, because we are asking for descriptive rather than evaluative comments. I call the interviews “constructive conversations without criticism.” You come away from the experience thinking “what could be?” rather than “what’s wrong?” For me, the feedback was painless to receive, because our users made recommendations in wishful, rather than judgmental, terms.

I also think AI is a superior way to get frank advice from users if they generally like your organization. When asked for feedback, particularly in interpersonal situations, interviewees may not want to offend the organization’s staff or, worse, cause negative repercussions. When you ask people to talk about dreams and wishes, their imaginations are engaged and fear of being critical falls away. They are free to give you great ideas for moving forward.

If your organization is about to embark on strategic planning of any kind, I highly recommend the AI approach. You can get more information about AI methods at the Appreciative Inquiry Commons or the Center for Appreciative Inquiry websites. For an excellent book on applying AI to evaluation practice, check out Reframing Evaluation through Appreciative Inquiry by Preskill and Catsambas (Sage, 2006).

Note: The OERC  will post results of its AI project in a future blog post, when we have completed our analysis.

 

Freebie Friday: Padlet

Padlet with origami crane

Recently, the AEA365 Tip-a-Day resource, which the Outreach Evaluation Resource Center (OERC) has blogged about before, featured a review and several hot tips for using Padlet, a freely available web-based bulletin board system. The hot tips included using Padlet for anonymous brainstorming in response to a question or idea, and as a backchannel for students or conference attendees to share resources and raise questions for future discussion.

I took a closer look at Padlet’s bulletin board configuration settings and found them intuitive and easy to use, with a choice of backgrounds and freeform, tabular, or grid arrangements of notes on the board. You can create a free Padlet account by signing up directly or by linking an existing Google or Facebook account. Privacy is a key concern, and Padlet offers many clearly explained options: Private (only you and others you invite by email can participate), password-protected access, and Public boards that others can view, write to, or moderate. A recent update adds a variety of ways to share Padlet data, from one-click icons for six different social media channels to downloading the data as a PDF or Excel/CSV file for analysis.
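Because Padlet can export a board as a CSV file, you can also pull the posts into a quick analysis script. The Python sketch below is a minimal example of a first pass at brainstorming data; the "Subject" and "Body" column names are hypothetical, so check the headers in your own export before running it.

    # Minimal sketch for a first pass at an exported Padlet board.
    # Assumes a CSV export with hypothetical "Subject" and "Body" columns;
    # check the column names in your own export before running.
    import csv
    import re
    from collections import Counter

    def summarize_board(path, top_n=10):
        with open(path, newline="", encoding="utf-8") as f:
            rows = list(csv.DictReader(f))
        print(f"{len(rows)} posts found")

        words = Counter()
        for row in rows:
            text = f"{row.get('Subject', '')} {row.get('Body', '')}".lower()
            words.update(re.findall(r"[a-z']{4,}", text))  # ignore very short words
        for word, count in words.most_common(top_n):
            print(f"{count:4d}  {word}")

    summarize_board("padlet_export.csv")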

Please check out a Padlet about the OERC Evaluation Series and leave your input! Posts will be moderated on the Padlet before they display publicly.

Freebie Friday: CDC Program Evaluation Resources

CDC Program Evaluation Guide Cover

Are you new to evaluation, or do you need help planning and implementing program evaluation for your projects from a public or community health perspective?

The Centers for Disease Control and Prevention (CDC) offers a freely available how-to resource: Introduction to Program Evaluation for Public Health Programs: A Self-Study Guide. Examples of public and community health programs that are candidates for program evaluation include direct service interventions, community-based mobilization efforts, research initiatives on issues such as health disparities, advocacy work, and training programs. The guide, available online or as a PDF download, walks through a six-step process (from engaging stakeholders to ensuring use of evaluation findings) and includes a helpful glossary of program evaluation terminology, plus a Resources section of additional publications, toolkits, and more to support public and community health program evaluation work.

A related CDC guide (A Framework for Program Evaluation) is one of several resources we at the Outreach Evaluation Resource Center (OERC) feature in the Evaluation Planning section of our Tools and Resources for Evaluation page at http://guides.nnlm.gov/oerc/tools.

 

Elegantly Simple Evaluation: Talking to Health Care Providers about Patient Health Literacy

logo for the VIVA project

By Yawar Ali and Cindy Olney

As the child of a physician living in South Texas, I’ve witnessed a deficiency of health literacy in patients. I volunteered in my dad’s clinic over spring break. I also participated on a medical relief trip with my father to a nonprofit charitable hospital in Pakistan. At both places, I witnessed difficulty in patient health literacy. – Yawar Ali

In June 2014, Yawar Ali, a rising junior from the South Texas High School for Health Professions, taught physicians and physician assistants in his father’s medical clinics about patient health literacy. He also introduced them to MedlinePlus as an important tool for their patients. Yawar evaluated his project and discovered valuable insights that helped him improve its impact.

Yawar conducted this health information outreach project as an internship offered through the  ¡VIVA! (Vital Information for a Virtual Age) project.  ¡VIVA! is a high school-based initiative in which students are trained to promote MedlinePlus to their classmates, teachers, families, and community members.  It is a student organization led by librarians of the South Texas Independent School District, located in the Lower Rio Grande Valley. The National Library of Medicine (NLM) funds the project.

He developed his presentation using health literacy materials available through the Medical Library Association and presented to three doctors and three PAs.  He taught them seven steps for addressing low patient health literacy and introduced them to MedlinePlus.

Yawar incorporated elegantly simple evaluation techniques into his project. Right after the presentation, he had participants complete a short evaluation form asking how likely they were to use the steps and promote MedlinePlus to patients. They all responded positively, indicating good intentions.

Two weeks after the training, Yawar visited all of the health care providers to conduct brief semi-structured interviews. He asked if they had tried the steps and collected their feedback on the techniques. He also checked to see if they had promoted MedlinePlus to their patients. With some persistence, he was able to conduct a complete interview with each participant.

The feedback he received is of interest to anyone hoping to initiate health information outreach in partnership with primary care clinics, particularly in medically underserved areas:

  • The majority of Yawar’s participants tried teach-back, open-ended questions, and other techniques with their patients; but they were conflicted because such techniques added time to patient appointments. This interfered with their ability to stick to their busy schedules.
  • The health care providers were impressed with MedlinePlus, but they had convenient access to print materials from a database (Healthwise) that was integrated with the clinic’s Electronic Health Records (EHR) system. Furthermore, it was easier to document that they were adhering to the meaningful use requirements of the Medicare and Medicaid EHR Incentive Programs when they got patient information from Healthwise.
  • While the Healthwise database was more convenient for the providers, they recognized that the print information they were providing was limited. They believed their patients could get more comprehensive information from MedlinePlus, but the clinicians did not have a convenient way to promote the resource.

Their feedback prompted a speedy response. The project team secured MedlinePlus brochures from NLM that Yawar delivered to the clinics. The fix was relatively simple, but critical. The team may have never known about this necessary adjustment without Yawar’s elegantly simple evaluation.

Credit: Yawar and Cindy would like to thank ¡VIVA! project team members Lucy Hansen, Sara Reibman, and Ann Vickman for their help on this project.

The ¡VIVA! project has been funded in whole or in part with federal funds from the National Library of Medicine, National Institutes of Health, under Contract No. HHSN-276-2011-0007-C with the Houston Academy of Medicine-Texas Medical Center Library.

 

 

 

Freebie Friday: Audit an Evaluation MOOC

Have you always been curious about Massive Open Online Courses (MOOCs) but never checked one out? Or have you started a MOOC and not finished it because the format was confusing or you didn’t have time to complete all the assignments?

Evaluating Social Programs is currently being offered by the Massachusetts Institute of Technology (MIT) on edX, one of the easier-to-navigate open-access MOOC platforms. The course is now closed to registration for credit, but it can still be audited for free with complete access to the instructional materials, activities, tests, and discussions. Full participation is estimated at approximately four hours per week, but of course you are welcome to skip around and access whatever interests you. I am auditing the MOOC this month, taking note of new resources and information likely to be of interest from a health program evaluation perspective, and will write a summary blog post in November.

The class description is:

This four and a half-week course on evaluating social programs will provide a thorough understanding of randomized evaluations and pragmatic step-by-step training for conducting one’s own evaluation. Through a combination of lectures and case studies from real randomized evaluations, the course will focus on the benefits and methods of randomization, choosing an appropriate sample size, and common threats and pitfalls to the validity of the experiment. While the course is centered around the why, how and when of Randomized Evaluations, it will also impart insights on the importance of a needs assessment, measuring outcomes effectively, quality control, and monitoring methods that are useful for all kinds of evaluations.

If you are interested in participating in Evaluating Social Programs and have not previously taken an edX MOOC, I highly recommend checking out the 15-minute self-paced DemoX, which provides a very user-friendly overview of how to make the best use of edX’s features.

 

Qualitative Data Visualization

Flowchart of text, text with illustrations, then illustrations leading to additional text

Have you assumed that only quantitative information can be used for data visualizations, and that qualitative data wasn’t an option without first coding or otherwise converting this valuable content into quantitative form?

I learned about an innovative and compelling approach to creating qualitative data visualizations with illustrations from FreshSpectrum. The process (as shown in the illustration above) begins with a long narrative, such as a focus group transcription, and chunks it into a few paragraphs per concept, with a unique illustration for each one. In this case custom illustrations of people were used, but you could illustrate concepts with your organization’s existing images or Creative Commons-licensed images. The next step turns the images, with brief captions, into an online data dashboard, where visitors can click on the captioned image that interests them to reach the more detailed narrative. The author describes how to do this within a WordPress portfolio blog template, or with a simpler strategy of creating HTML anchor links to each individual section of a longer text. You can see how this works by clicking an anchor link from the original post (http://freshspectrum.com/blogging-advice/#davidson, for example), which leads to the longer narrative at http://freshspectrum.com/blogging-advice/ (a great source of blogging advice, by the way!)
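If you would like to experiment with the anchor-link version of this approach, here is a minimal Python sketch of one possible way to generate a simple HTML page in which each captioned image links to its longer narrative section further down the same page. The section data, image file names, and layout are all hypothetical placeholders, not FreshSpectrum’s actual setup.

    # Minimal sketch of the anchor-link idea: captioned images at the top of
    # the page link to the longer narrative sections below. All content here
    # is hypothetical placeholder data; adapt it to your own transcript chunks.
    from html import escape

    sections = [
        {"id": "access", "caption": "Access to training", "image": "access.png",
         "narrative": "Participants described difficulty finding time for training..."},
        {"id": "tools", "caption": "Preferred tools", "image": "tools.png",
         "narrative": "Several participants preferred short video tutorials..."},
    ]

    def build_dashboard(sections):
        cards, details = [], []
        for s in sections:
            cards.append(
                '<a href="#{0}"><img src="{1}" alt="{2}"><p>{2}</p></a>'.format(
                    s["id"], escape(s["image"]), escape(s["caption"])))
            details.append(
                '<section id="{0}"><h2>{1}</h2><p>{2}</p></section>'.format(
                    s["id"], escape(s["caption"]), escape(s["narrative"])))
        return "<html><body>\n" + "\n".join(cards + details) + "\n</body></html>"

    with open("dashboard.html", "w", encoding="utf-8") as f:
        f.write(build_dashboard(sections))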

Need more information about reporting and visualizing your data? We at the Outreach Evaluation Resource Center (OERC) have more resources available for you from the Reporting and Visualizing tab of our Tools and Resources for Evaluation Guide at http://guides.nnlm.gov/oerc/tools and welcome your suggestions and comments about the guide.