NN/LM Outreach Evaluation Resource Center

Archive for the ‘News’ Category

Easy Email Newsletters for Keeping Stakeholders Informed

Friday, June 19th, 2015

Do you find it difficult to keep your stakeholders up to date throughout your program? In her AEA Summer Institute class, “An Executive Summary is Not Enough: Effective Evaluation Reporting Techniques,” Kylie Hutchinson of Community Solutions suggested a useful product for staying in touch with stakeholders: Constant Contact. This tool lets you put the latest stats, activities, or planning updates into a newsletter that arrives in the body of an email, so stakeholders can easily follow the progress of your program. In addition, you, the evaluator, get feedback on who opens the email and what they click on.

Here is a sample email created for free in just a few moments, showing NN/LM stakeholders what activities took place in May in an imaginary community college outreach project.

Sample Constant Contact email newsletter

Components of Process Evaluation

Friday, June 12th, 2015

At the American Evaluation Association Summer Institute, Laura Linnan, Director of the Carolina Collaborative for Research on Work & Health at the UNC Gillings School of Global Public Health, led a workshop entitled “Process Evaluation: What You Need to Know and How to Get Started.” According to the CDC, process evaluation is the systematic collection of information on a program’s inputs, activities, and outputs, as well as the program’s context and other key characteristics.

Logic Model Image from CDC

Process evaluation looks at the specific activities that take place during an outreach project: to ensure that planned interventions are carried out consistently at all sites and with all participants, to explain why successes do or do not happen, and to understand the relationships between project components. Process evaluation can be extremely important for making adjustments that keep a project on track, and for deciding how (or whether) to do a project again.

In the workshop I attended, Linnan walked through the details covered in Chapter 1 of the book Process Evaluation for Public Health Interventions and Research by Laura Linnan and Allan Steckler. This chapter presents an overview of process evaluation methods. In it, they define a set of terms that describe the components of process evaluation (Table 1.1). These components are worth understanding, because an evaluator can examine each one in detail to decide which should be evaluated.

  1. Context: aspects of the larger social, political, and economic environment that may affect the intervention
  2. Reach: the proportion of the intended audience that actually participates
  3. Dose delivered: the amount of the intended intervention that is actually provided
  4. Dose received: the extent to which participants engage with and use what was delivered
  5. Fidelity: the extent to which the intervention was delivered as planned
  6. Implementation: a composite of reach, dose, and fidelity describing how much of the planned intervention reached participants
  7. Recruitment: the procedures used to approach and attract participants
In addition, the authors describe a step-by-step process for designing and implementing process evaluation in a flow chart shown in Figure 1.1, including: creating an inventory of process objectives; reaching consensus on process evaluation questions to be answered; creating measurement tools to assess process objectives; analyzing data; and creating user-friendly reports. And as a final note, Linnan and Steckler recommend that stakeholders be involved in every aspect of this process.
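
To make a few of the components above concrete, here is a minimal sketch with invented numbers showing how reach, dose delivered, and dose received might be quantified for a hypothetical outreach training project. The measures and data are illustrative only; they are not taken from the workshop or the book.

```python
# A minimal sketch, with invented numbers, of quantifying three process
# evaluation components for a hypothetical outreach training project.
# Definitions follow Linnan and Steckler; the measures are illustrative.

target_population = 200   # librarians the project intends to reach
participants = 140        # librarians who actually attended any session
sessions_planned = 10     # training sessions the project intended to hold
sessions_held = 9         # training sessions actually delivered
exercises_assigned = 9    # hands-on exercises given across the sessions
exercises_completed = 6   # exercises participants actually finished

reach = participants / target_population            # share of intended audience reached
dose_delivered = sessions_held / sessions_planned   # share of intended intervention delivered
dose_received = exercises_completed / exercises_assigned  # engagement with what was delivered

print(f"Reach:          {reach:.0%}")
print(f"Dose delivered: {dose_delivered:.0%}")
print(f"Dose received:  {dose_received:.0%}")
```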

Lesson Learned: Outputs are Cool!

Friday, June 5th, 2015

AEA Summer Institute Logo

Cindy Olney and I just returned from the American Evaluation Association Summer Institute in Atlanta, GA. The blog posts for the next couple of months will be filled with lessons learned from the Institute. I am going to start with Outputs, because they were the greatest surprise to me.

In his “Introduction to Program Evaluation,” Thomas Chapel, Chief Evaluation Officer for the Centers for Disease Control and Prevention, said that he thought outputs were just as important as outcomes. This was quite shocking to me, since outputs had always seemed like just a way of counting what had been done, and not nearly as interesting as finding out whether the desired outcome had happened.

Outputs are the tangible products of the activities that take place in a project. For example, let’s say the project’s goal is to reduce the number of children with Elevated Blood Lead Levels (EBLL) by screening children to identify the ones with EBLL and then referring them to health professionals for medical management. In this brief project description, the activities would be to:

1) Screen children to identify the ones with EBLL
2) Refer them to health professionals for medical management

If outputs are the tangible products of the activities, they are often conceived as something countable, like “the number of children screened for EBLL” and “the number of referrals.” Counting is how the project manager can verify that the planned activities took place.

However, if you think about how an activity can take place, you can see that some ways of completing it might lead to a successful outcome, and some might not. A better question to ask might be: “what would an output look like that would lead to the outcome we want?”

To use referrals as an example, suppose that during the program 100% of the children identified with EBLL were referred to health professionals, but only 30% of them actually followed up and saw one. If the only information you gathered was the number of referrals, you cannot tell why the follow-through rate was so low. Things that can go wrong with a referral include referring families to physicians who are not taking new patients, or to physicians who do not speak the same language as the child’s parents. So you might want to define the referral output to include those factors. The new output measure could be “the number of referrals to ‘qualified’ physicians,” where ‘qualified’ is defined by the attributes you need to see in the physicians, such as physicians who are taking new patients or who speak the same language as the family.
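
As a rough illustration of refining an output measure, here is a minimal sketch in Python. The data, field names, and ‘qualified’ criteria are invented for the example.

```python
# Hypothetical illustration of refining an output measure: instead of
# counting raw referrals, count referrals to "qualified" physicians.
# All data and field names here are invented for the example.

referrals = [
    {"child_id": 1, "accepting_new_patients": True,  "speaks_family_language": True},
    {"child_id": 2, "accepting_new_patients": False, "speaks_family_language": True},
    {"child_id": 3, "accepting_new_patients": True,  "speaks_family_language": False},
    {"child_id": 4, "accepting_new_patients": True,  "speaks_family_language": True},
]

# Naive output: total number of referrals made.
total_referrals = len(referrals)

# Refined output: referrals to physicians who are taking new patients
# AND speak the same language as the family.
qualified_referrals = [
    r for r in referrals
    if r["accepting_new_patients"] and r["speaks_family_language"]
]

print(f"Referrals made: {total_referrals}")
print(f"Qualified referrals: {len(qualified_referrals)}")
print(f"Share qualified: {len(qualified_referrals) / total_referrals:.0%}")
```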

The lesson for me is that outputs are as important as outcomes. By thinking carefully about outputs at the start of planning, you give the project its best chance of achieving successful outcomes; and by monitoring outputs during process evaluation, you can make needed corrections while the project is under way.

Improve Your Presentations

Friday, May 22nd, 2015

Death by Presentation by Frits Ahlefeldt-Laurvig (CC BY-ND 2.0) No changes.

In a recent blog post, evaluator Stephanie Evergreen suggested that people stop asking for PowerPoint slides at the end of a presentation (Stop Asking if the Slides are Available). Her point is that the slides should support the speaker and be fairly useless on their own. If the audience needs a reminder of what was said, the speaker should provide handouts with main points and resources listed, as well as links to engaging dashboards and infographics.

In her blog post, Evergreen offers 10 points for improving your presentations. Don’t like it when people read their slides? Her first point is to remove text from slides so the audience’s focus goes back to what the speaker is saying. A complementary point is that the graphics on a slide should evoke emotion, to help the audience remember what the speaker is saying.

What makes Evergreen’s 10 points unique in the world of presentation advice is that many of them are about charts and graphs. For example, her point “Choose the right chart so that your results tell the best story” ties what some might see as dry charts into the story your presentation is telling. Another, “Keep it easy to interpret your graphs with close data labels and a descriptive subtitle,” is a suggestion that recurs in her blog and her book, Presenting Data Effectively: Communicating Your Findings for Maximum Impact. For a detailed checklist on making a better graph, take a look at her Data Visualization Checklist.

The OERC Expanding Its Limits at MLA ‘15

Wednesday, May 13th, 2015

The OERC will do its part in keeping Austin weird this week at the 2015 Medical Library Association conference.  If you happen to be around, stop by our presentations on Sunday (5/17) afternoon and say hi.

First up, Cindy Olney and other members of the !VIVA! Peer Tutor Project Team will be exhibiting their poster:

Collaboration without Limits: A High School Student Intern and Health Care Providers Band Together for Patient Health Literacy (#49)
Austin Convention Center, Level One, Exhibit Hall 4
2:00 – 2:55 pm

South Texas ISD librarians Sara Reibman (who designed the poster), Lucy Hansen, and Ann Vickman also will be at the poster session. They have been training high school students to use and promote MedlinePlus and other health information resources since the !VIVA! Peer Tutor Project started in 2002. This poster describes a health information outreach project conducted by STISD high school student Yawar Ali and featured in a previous OERC blog post. We are hoping that Yawar will be joining us via Skype.

Karen Vargas will be presenting the following paper later on Sunday afternoon:

Boot up Your Lifelong Learning: Community Colleges and the National Network of Libraries of Medicine (NN/LM) Outreach Initiative
Austin Convention Center, Level Four, Ballroom E
5:05 – 5:20 pm

Her co-presenters are Lisa Huang from Collin College (McKinney, TX) and Michelle Malizia, University of Houston. The panel will describe how one community college pushed the limits and collaborated with its regional medical library to offer educational opportunities in support of the Five-Year NN/LM Community College Outreach Initiative.

For our colleagues who are also attending MLA ‘15, we hope to see you there!

Billboard with postcard of Austin

Low Cost Mapping Tools on NLM’s Community Health Maps Blog

Friday, May 8th, 2015

Map of Childhood Lead Poisoning Risk in Philadelphia, PA from the CDC Map Gallery

Have you ever wanted to use mapping for your outreach work, but thought that making maps would be too expensive, time-consuming, or just too difficult? The National Library of Medicine has a blog called Community Health Maps: Information on Low Cost Mapping Tools for Community-based Organizations. Its goal is to facilitate the use of geographic information system (GIS) mapping by providing information about low-cost mapping tools, software reviews, best practices, and the experiences of those who have successfully implemented a mapping workflow as part of their work. The blog is moderated by Kurt Menke, a certified GIS professional.

Here are some examples of the kinds of things you can find on the Community Health Maps blog:

  • A short guide for using iForm for field data collection. iForm is an app that runs on iPads, iPhones, and Android devices, and has a free version. Using this app, you could go to different locations, gather data (for example, demographic information about attendance at your program), and view it in tabular or map format (a minimal mapping sketch appears after this list).
  • A description of a project in the Philippines in which young people collected data on the needs of their communities. Technology + Youth = Change showed how a dozen donated phones helped 30 young adults survey and map information on access to water, electricity, jobs, and more.
  • A review of a pilot project done by the Seattle Indian Health Board’s Urban Indian Health Institute on noise pollution and health in the urban environment. One of the goals of the pilot project was to determine whether this kind of data collection and analysis would be feasible with other urban Indian health organizations, so they selected participants who had limited experience with data collection and GIS. The feedback suggested that the GIS software tools were very user-friendly and effective.
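
As a small illustration of how field-collected data can end up on a map, here is a minimal sketch using Python and the open-source folium library. Folium is my choice for the example rather than a tool reviewed on the blog, and the survey points and findings are invented.

```python
# A minimal sketch of mapping field-collected survey points with the
# open-source folium library (pip install folium). The data are invented.
import folium

# Hypothetical field data: site name, latitude, longitude, finding
survey_points = [
    ("Site A", 14.60, 121.00, "No reliable water access"),
    ("Site B", 14.65, 121.05, "Electricity available 4 hrs/day"),
    ("Site C", 14.58, 120.98, "Community clinic open weekly"),
]

# Center the map roughly on the surveyed area
m = folium.Map(location=[14.61, 121.01], zoom_start=12)

# Drop one clickable marker per surveyed site
for name, lat, lon, finding in survey_points:
    folium.Marker(location=[lat, lon], popup=f"{name}: {finding}").add_to(m)

m.save("survey_map.html")  # open the HTML file in any browser
```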

Photo credit: Childhood Lead Poisoning Risk Analysis, Philadelphia, Pennsylvania, from the CDC Map Gallery

Guerrilla Assessment Methods

Friday, May 1st, 2015

Recently, the Association of Research Libraries email discussion list hosted an enthusiastic exchange about guerrilla assessment techniques: low-cost, unconventional data collection methods that gather timely responses from library users. I thought I would share some of the favored methods from this discussion.

Graffiti walls seemed to be the most popular guerrilla method discussed in this group. Users were invited to write responses to one question on white boards or flip charts; or they were asked to write comments on sticky notes and post them to bulletin boards. Questions might be related, for example, to library space use or new furniture choices, or users might write suggestions for new resources. Pictured below is a colorful example of a graffiti wall from Clemson University’s Cooper Library posted by Peggy Tyler. Flip charts also were featured in this space use assessment conducted at University of Pittsburgh University Library System (see the FlipChart Analysis and the Flipchart Survey—Our Response presentations).

Short questionnaires to collect on-the-spot responses from users were also mentioned frequently. Some libraries placed laptops in conspicuous parts of the library to capture responses. Others took advantage of tablets, such as this project conducted at Georgia State University. Sometimes the low-tech approach worked best, featuring paper-and-pencil questionnaires or note cards for written comments.

Photographs also were used creatively to capture helpful assessment information. University of Pittsburgh University Library System staff used photographs to examine use of study space. With so many library users carrying mobile phones with cameras, there is a lot of potential for inviting users to incorporate photographs into their responses to assessment questions. In the ARL-assess discussion, Holt Zaugg at Brigham Young’s Harold B. Lee Library described a study in which student volunteers took pictures of places on campus that they thought fit a certain characteristic (e.g. too noisy, busy place to study).  The staff did follow-up interviews with the student volunteers for added insight about their photographs.

Guerrilla methods may look easy, but they require careful planning and thought. You’ll need well-crafted, focused questions. You also will need an effective promotional strategy to attract user participation. And you’ll want a well-executed schedule for collecting and inputting data so that key information is not lost. Yet guerrilla methods are worth the challenge: they engage both participants and staff in the assessment process, and they are a refreshing alternative to conventional approaches.

Graffiti wall at Cooper Library at Clemson



Infographics Basics: A Picture is Worth 1000 Data Points

Friday, April 24th, 2015

Open Access Week at University of Cape Town infographic

You’ve been collecting great data for your library, and now you have to figure out how to use it to convince someone of something, for example how great your library is. Part of the trick is turning that data into a presentation that your stakeholders understand – especially if you are not there to explain it. Infographics are images that make data easy to understand in a way that gets your message across.

It turns out it doesn’t have to be difficult or expensive to create your own infographics.  Last week I went to a hands-on workshop at the Texas Library Association called “Infographics: One Picture is  Worth 1,000 Data Points,” taught by Leslie Barrett, Education Specialist from the Education Service Center Region 13 in Austin, TX. Using this website as her interactive “handout”, Leslie walked us through the process of creating an infographic (and as a byproduct of this great class, she also demonstrated a number of free instructional resources, such as Weebly, Padlet, and Thinglink).

Starting at the top of the page, click on anything with a hyperlink.  You will find a video as well as other “infographics of infographics”  which demonstrate how and why infographics can be used.  There are also a variety of examples to evaluate as part of the learning process.

Finally, there is information on the design process and on resources that make infographics fairly easy to create. These resources, such as Piktochart and Easelly, offer free plans for simple graphics and experimentation.

Leslie Barrett allowed us to share this website with you, so feel free to get started making your own infographics!

Image credit: Open Access Week at University of Cape Town by Shihaam Donnelly / CC BY-SA 3.0

Keep It Simple with Micro-Surveys

Friday, April 17th, 2015

A hot trend in marketing research is the micro-survey. Also known as bite-sized surveys, these questionnaires are short (about three questions), with the goal of collecting focused feedback to guide specific action.

The micro-survey is a technique for overcoming what is arguably the biggest hurdle in survey assessment: Getting people to respond to your questionnaire. It is a technique that is particularly useful for populations where mobile technology use is on the rise, and where there is competition for everyone’s attention in any given moment.  If we expect our respondents to answer our questionnaires, we can’t saddle them with long, matrix-like questions or require them to flip through numerous web pages. We need to simplify, or we will lose respondents before they ever get to the submit button.

The trick to micro-surveys is to keep each one short, but administer multiple questionnaires over time. You can break a traditional membership or customer questionnaire into several micro-surveys and distribute them periodically. “Survey Doctor” Matthew Champagne offers evidence of the effectiveness of this technique in his blog post about bite-sized surveys, where he describes a project that boasted an 86% response rate.
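
As a simple illustration of breaking a long questionnaire into micro-surveys, here is a minimal sketch in Python. The question bank, survey size, and schedule are invented for the example.

```python
# A minimal sketch of splitting a long questionnaire into three-question
# micro-surveys for periodic distribution. All question text is invented.
from datetime import date, timedelta

question_bank = [
    "How often do you visit the library?",
    "Which resource did you use most this month?",
    "How satisfied are you with our hours?",
    "Have you attended a library workshop?",
    "Would you recommend the library to a colleague?",
    "What one thing should we improve?",
]

SURVEY_SIZE = 3                # questions per micro-survey
INTERVAL = timedelta(weeks=2)  # time between distributions

# Chunk the question bank into consecutive three-question surveys
micro_surveys = [
    question_bank[i:i + SURVEY_SIZE]
    for i in range(0, len(question_bank), SURVEY_SIZE)
]

send_date = date(2015, 5, 1)
for n, survey in enumerate(micro_surveys, start=1):
    print(f"Micro-survey {n}, send on {send_date}:")
    for q in survey:
        print(f"  - {q}")
    send_date += INTERVAL
```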

Of course, the length of your survey is not the only factor contributing to response rate. You should strive to follow the Dillman method, which provides time-tested guidelines for administering surveys. (Here is one researcher’s description of how to use the Dillman method.) Also, take a look at Champagne’s Nine Principles of Embedded Assessment. His website has articles and YouTube videos on how to implement these principles.

If you want to try doing a micro-survey, check out the effective practices described in this blog article from the marketing research company Instantly.


Grid of sticky notes reading: Simplify, Keep it Simple, Less is More, Spend Less, Unclutter, Become Minimalist

An Easier Way to Plan: Tearless Logic Models

Friday, April 10th, 2015

Sample Logic Model

Are you apprehensive when someone says it’s time to do “outcome-based planning using a logic model”? The Wichita State University Community Psychology Practice and Research Collaborative and the Community Psychology Doctoral Program in Wichita, KS, have come up with an easier way to do logic models, described in the article “Tearless Logic Model” in the Global Journal of Community Psychology Practice.

Their goal was to create a facilitated, non-intimidating logic model process that would be more likely to be used in planning.  This approach is designed to give community-based groups, faith-based organizations, smaller nonprofits and people with little experience in strategic planning greater impact when planning community projects.

Tearless Logic Model planning requires only flip charts, magic markers, blue painters’ tape, and a safe space to work with a group. Jargon is eliminated and replaced with simple terms that anyone can understand. For example, instead of asking “what are the anticipated impacts,” a facilitator would ask, “if you really got it right, what would it look like in 10 or 20 years?”
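
To illustrate the substitution idea, here is a hypothetical sketch in Python. Only the “impacts” prompt above comes from the post; the other prompts are invented stand-ins, not taken from the Tearless Logic Model article.

```python
# A hypothetical sketch of the jargon-to-plain-language substitution idea.
# Only the "impacts" prompt comes from the post above; the other prompts
# are invented stand-ins, not taken from the Tearless Logic Model article.
facilitator_prompts = {
    "impacts": "If you really got it right, what would it look like in 10 or 20 years?",
    "outcomes": "What do you hope will be different a year from now?",        # invented
    "activities": "What will your group actually do?",                        # invented
    "inputs": "What people, money, and materials do you have to work with?",  # invented
}

for jargon, prompt in facilitator_prompts.items():
    print(f'Instead of asking about "{jargon}", ask: {prompt}')
```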

Their article contains the step-by-step process that anyone can use, along with links to a Prezi visual guide and a full PDF that includes a helpful template.

Ashlee D. Lien, Justin P. Greenleaf, Michael K. Lemke, Sharon M. Hakim, Nathan P. Swink, Rosemary Wright, Greg Meissen. Tearless Logic Model. Global Journal of Community Psychology Practice [Internet].  2011 Dec [cited 2015 Apr 10];2(2). Available from



Funded by the National Library of Medicine under contract # HHS-N-276-2011-00008-C.