

NN/LM Outreach Evaluation Resource Center

Archive for the ‘Practical Evaluation’ Category

Creative Annual Reports

Friday, October 9th, 2015

Ah, the annual report – at its best we expect to see a glossy booklet with pie charts, short paragraphs and some quotes. At its worst it can be pages of dry text. Our main hope with annual reports is that our stakeholders and others will read them and be impressed with the successes of our organizations.

Last month I ran across the annual report from the Nowra Public Library in New South Wales, Australia, which was so compelling and understandable that over 100,000 people have viewed it on YouTube:

Photo of librarian links to Nowra library video

Since most organizations don’t have the resources to do their own music video (e.g. singers, writers, silly costumes), I thought I would look at a few other examples to consider when it’s time to do your annual report.

One of my all-time favorites is the annual report from the Schusterman Library of The University of Oklahoma-Tulsa. Their annual report is an infographic that shows the data that is collected, but also describes the data in such a way that 1) you have a better feel for what is going on in the library; and also 2) you might think “I didn’t know they would help me with that!”  For example: “7,274 Reference questions answered in person, by phone, by email, and instant message or text on everything from ADHD and child welfare to decision trees, LEED homes, and census reporting.” It is available on their website, and the librarians at the Schusterman Library say they frequently find students looking at it.

The Michigan State University College of Education won a gold ADDY and a Best in Show Award for their 2012 Annual Report (an ADDY is the advertising industry’s largest competition).  Their report featured a tri-fold, die-cut skyline that presented the college’s missions and strengths with an emphasis on “college as community.” The annual report also included a video and a website that gives detailed narratives that show institutional successes in terms of personal stories.

Of course, not all institutions want an unusual annual report.  But it is important to consider the target audience.  Annual reports reach the upper administration, potential funders, and patrons of the library. The success of this year's annual report might shape library users' view of the library for years to come.

The OERC is on the Road Again

Friday, September 25th, 2015

The OERC is on the road again.  Today, Cindy and Beth Layton, Associate Director of the NN/LM Greater Midwest Region, are team-teaching Measuring What Matters to Stakeholders at the Michigan Health Sciences Library Association’s annual conference in Flint, MI.

Logo for Michigan Health Sciences Library Association

This workshop covers strategies for using evaluation to enhance and communicate a library’s value to organizational decision-makers and stakeholders who influence decision makers. The workshop combines updated information with material from the NN/LM MidContinental Region’s Measuring Your Impact and the OERC’s Valuing Your Library workshops that have been taught by a number of regional medical library staff members over the past decade.

On Saturday, Karen is presenting a brand-new workshop for the Texas Library Association’s District 8 Conference called Adding Meaning to Planning: A Step-by-Step Method for Involving Your Community in Meaningful Library Planning.

TLA District 8 Logo

The workshop is a method of involving community members in creating pain-free logic models to ensure that the long-term vision is always in sight when planning.  Karen wrote a blog entry about creating "tearless" logic models here.  This is Karen's first experience creating and delivering a workshop that is purely about library evaluation.

The NN/LM travel season is about to go into full swing.  We know we aren't the only ones out and about with presentations, trainings, and exhibits.  So safe travels, and we will see you in a week with another OERC blog post.

Which Online Survey Tool Should I Use? A Review of Reviews

Friday, September 4th, 2015

Quality survey close up with a thumbtack pointing on the word excellent

Recently we faced the realization that we would have to reevaluate the online survey tool that we have been using. We thought that we would share some of the things that we learn along the way.

First of all, finding a place that evaluates survey products (like Survey Monkey or Survey Gizmo) is not as easy as going to Consumer Reports or Amazon (or CNET, Epinions, or Buzzillions).  A number of websites provide reviews of survey tools, but their quality varies widely.  So for this week our project has been to compare review websites to see what we can learn from and about them.

Here are the best ones I could find that compare online survey tools:

Ultimate Guide to Forms and Surveys, Chapter 7: "The 20 Best Online Survey Builder Tools"

This resource compares 20 different online survey tools. There is a chart with a brief statement of what each survey tool is best for, what you get for free, and the lowest plan cost. Additionally, there is a paragraph description of each tool and what it does best.  Note: this is part of an eBook published in 2015 which includes chapters like "The Best Online Form Builders for Every Task."

"18 Awesome Survey & Poll Apps"

This review was posted on May 27, 2015, which reassures us that the information is most likely up to date.  While the descriptions are very brief, it is good for a quick comparison of the survey products. Each review includes whether or not there is a free account, if the surveys can be customized, and whether or not there are ready-made templates.

Capterra's "Top Survey Software Products"

Check boxes showing the features of the different products

This resource appears to be almost too good to be true. Alas, no date shown means that the specificity in the comparisons might not be accurate.  Nevertheless, this website lists over 200 survey software products, has separate profile pages on each product (with varying amounts of detail), and lists features that each product offers.  You can even narrow down the surveys you are looking for by filtering by feature.  Hopefully the features in Capterra's database are kept updated for each product.  One thing to point out is that at least two fairly well-known survey products (that I know of) are not in their list.

"Top 31 Free Survey Apps"

Another review site with no date listed. This one compares 31 apps by popularity, presumably in the year the article was written. One thing that is unique about this review site is that the in-depth review includes the history and popularity of each app, how each app differs from the others, and who they would recommend the app to.  Many of the reviews include videos showing how to use the app.  Pretty cool.

2015 Best Survey Software Reviews and Comparisons

This website has the feel of Consumer Reports. It has a long article explaining why you would use survey software, how and what the reviewers tested, and the kinds of things that are important when selecting survey software. Also like Consumer Reports, it has ratings of each product (including the experiences of the business, the respondents, and the quality of the support), and individual reviews of each product showing pros and cons. Because the date is included in the review name, the information is fairly current.

This is a starting point. There are individual reviews of online survey products on a variety of websites and blogs, which are not included here.  Stay tuned for more information on online survey tools as we move forward.


Improving Your Data Storytelling in 30 Days

Friday, August 21st, 2015

Here are some more great techniques to help with telling a story to report your evaluation data so it will get the attention it deserves.

Friends at campfire telling stories

Juice Analytics has this truly wonderful collection of resources in a guide called "30 Days to Data Storytelling." With assignments of less than 30 minutes a day, this guide links to data visualization and storytelling resources from sources as varied as Pixar, Harvard Business Review, Ira Glass, the New York Times, and Bono (yes, that Bono).

The document is a checklist of daily activities lasting no longer than 30 minutes per day. Each activity is either an article to read, a video to watch, or a small project to do.

The resources answer valuable questions like:

  • What do you do when you’re stuck?
  • How do I decide between visual narrative techniques?
  • Where can I find some examples of using data visualization to tell a story?

A Rainbow Connection? The Evaluation Rainbow Framework

Friday, August 7th, 2015

Once you start looking online for help with your evaluation project, you will find a veritable storm of evaluation resources out there – so many that it can be hard to choose the best ones for your needs.  But don't worry: once you've looked at this online tool, you will find the rainbow of hope that follows the storm (okay, that was pretty cheesy – stay with me, it gets better).

A group of evaluators from all over the world created the BetterEvaluation website for the purpose of organizing and sharing useful online evaluation resources. The framework they created to organize resources is called the Rainbow Framework because it divides the world of evaluation into seven different "clusters," which are delineated by rainbow colors.  Each cluster is then broken down into a number of tasks, and each task into options and resources.

Here is an example of the Rainbow Framework in action.  Clicking on the yellow "Describe" category opens a window on the right that lists seven tasks: 1) Sample; 2) Use measures, indicators, or metrics; 3) Collect and/or retrieve data; 4) Manage data; 5) Combine qualitative and quantitative data; 6) Analyze data; and 7) Visualize data.

When you click on a specific task, a page listing a variety of Options and Resources will open, like this:

Rainbow Framework Options
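The cluster-to-task hierarchy described above can be sketched as a simple nested structure. This is a hypothetical Python model for illustration, not BetterEvaluation's actual data; only the "Describe" cluster's tasks are filled in, taken from the list above.

```python
# Hypothetical sketch of the Rainbow Framework hierarchy:
# clusters map to lists of tasks (each task would in turn map to
# options and resources on the real site).
rainbow_framework = {
    "Describe": [
        "Sample",
        "Use measures, indicators, or metrics",
        "Collect and/or retrieve data",
        "Manage data",
        "Combine qualitative and quantitative data",
        "Analyze data",
        "Visualize data",
    ],
}

def list_tasks(cluster: str) -> list:
    """Return the tasks for a given cluster, or an empty list if unknown."""
    return rainbow_framework.get(cluster, [])

print(list_tasks("Describe")[0])  # -> Sample
print(len(list_tasks("Describe")))  # -> 7
```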


BetterEvaluation made eight 20-minute "coffee break" webinars in conjunction with AEA that you can watch for free on the BetterEvaluation website. Each webinar describes a cluster, and there is one overview webinar.  The webinars are two years old, so the actual image of the rainbow looks a little different from the webinar video, but the content is still relevant.  Here is a link to the webinar series:

The Rainbow Framework does more than just organize resources. Here are some reasons you might want to use this Framework.

1) Help designing and planning an evaluation

2) Check the quality of an ongoing evaluation

3) Commission an evaluation – the Framework helps you formulate what's important to include when you commission an evaluator, and then when you assess the quality of the proposals

4) Embed stakeholder participation thoughtfully throughout the evaluation

5) Develop your evaluation capacity through lifelong learning, filling in gaps in your knowledge.

So, somewhere over the rainbow, your evaluation skies may be blue…

Getting Started in Evaluation – Evaluation Guides from the OERC

Thursday, July 23rd, 2015

New to the world of evaluation? What is your boss talking about when she says she wants you to measure outcomes, not outputs?  What is an indicator? How many responses should you get from your surveys?

Sometimes people think evaluation is just the form that you fill out at the end of a class or event. But in fact, evaluation can start at the beginning of the project, when you do a community assessment, and it includes building support for your project from your stakeholders. It continues through making an evaluation plan as part of your project, gathering data, analyzing data, and reporting the data back to the stakeholders in a way that is useful.  Here is a model that the CDC uses to describe the evaluation framework:

CDC Framework for Program Evaluation

The Outreach Evaluation Resource Center (OERC) has a series of three booklets entitled Planning and Evaluating Health Information Outreach Projects that guide people through the evaluation process, from needs assessment to analyzing data.  While focusing on “health information outreach” this series of books can be used to learn how to do evaluation for any type of project.

Booklet 1: Getting Started with Community-Based Outreach

  • Getting organized: literature review; assembling team of advisors; taking an inventory; developing evaluation questions
  • Gathering information: primary data; secondary data, and publicly accessible databases
  • Assembling, Interpreting and Acting: summarizing data and keeping stakeholders involved

Booklet 2: Planning Outcomes-Based Outreach Projects

  • Planning your program with a logic model to connect activities to outcomes
  • Planning your process assessment
  • Developing an outcomes assessment plan, using indicators, objectives and an action plan

Booklet 3: Collecting and Analyzing Evaluation Data

  • Designing data collection methods; collecting data; summarizing and analyzing data for:
    • Quantitative methods
    • Qualitative methods

The books can be read in HTML, downloaded as a PDF or physical booklets can be ordered for free from the OERC by sending an email request to:

Learn more about the CDC’s Evaluation Framework:



DIY Tool for Program Success Stories (Program Success Stories Part 2)

Thursday, July 2nd, 2015

Last week, I wrote about program success stories. As a follow-up, I want to introduce you to a story builder tool available at the CDC Injury Prevention and Control web site. The story builder takes you through three steps to produce an attractive, well-written program success story. Each step offers downloadable Microsoft Word documents to walk you through the process.

Step 1: The worksheets are designed to gather and organize project information for your story. I think it would be interesting to use this step as a participatory activity. You could pull together your project team or a group of stakeholders to talk through questions in this worksheet. The discussion would help group members articulate the program’s value from their perspective.

Step 2: This step provides a story builder template to write your story, section by section. Each section has a field to develop a paragraph of your story, with some tips for writing in a compelling, user-friendly way. Each completed field prepares you for the final step.

Step 3: Here, you can download a layout template, where you transfer the paragraphs from your story builder template into the layout. Because this is a Word document, you can change background design, font, or even the size and placement of pictures and call-out quote boxes.

If you are thinking of trying your hand at program success stories, this story building web page provides some useful DIY tools to help you get started.

"What is Your Story" typed on paper in an old typewriter

Components of Process Evaluation

Friday, June 12th, 2015

At the American Evaluation Association Summer Institute, Laura Linnan, Director of the Carolina Collaborative for Research on Work & Health at UNC Gillings School of Public Health, did a workshop entitled Process Evaluation: What You Need to Know and How to Get Started. According to the CDC, process evaluation is the systematic collection of information on a program’s inputs, activities, and outputs, as well as the program’s context and other key characteristics.

Logic Model Image from CDC

Process evaluation looks at the specific activities that take place during an outreach project to ensure that planned interventions are carried out equally at all sites and with all participants, to explain why successes happen or do not happen, and to understand the relationships between the project components. Process evaluation can be extremely important in making adjustments to ensure the project’s success, and determining how or whether to do a project again.

In the workshop I attended, Linnan walked through the details covered in Chapter 1 of the book Process Evaluation for Public Health Interventions and Research by Laura Linnan and Allan Steckler. This chapter presents an overview of process evaluation methods. In it, they define a set of terms that describe the components of process evaluation (Table 1.1). These components are valuable to understand, because evaluators can look in detail at each component to determine which ones should be evaluated.

  1. Context
  2. Reach
  3. Dose delivered
  4. Dose received
  5. Fidelity
  6. Implementation
  7. Recruitment

In addition, the authors describe a step-by-step process for designing and implementing process evaluation in a flow chart shown in Figure 1.1, including: creating an inventory of process objectives; reaching consensus on process evaluation questions to be answered; creating measurement tools to assess process objectives; analyzing data; and creating user-friendly reports. And as a final note, Linnan and Steckler recommend that stakeholders be involved in every aspect of this process.
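Two of the components above lend themselves to simple quantitative measures: "reach" (the proportion of the target audience that participated) and "dose delivered" (the proportion of the intended intervention actually delivered). The sketch below illustrates both; the function names and numbers are invented for illustration and are not from the workshop or the book.

```python
# Illustrative process-evaluation metrics (hypothetical names/numbers).

def reach(participants: int, target_population: int) -> float:
    """Fraction of the intended audience that actually took part."""
    return participants / target_population

def dose_delivered(units_delivered: int, units_planned: int) -> float:
    """Fraction of the planned intervention units actually delivered."""
    return units_delivered / units_planned

# e.g. 150 of 600 targeted students attended; 8 of 10 planned sessions ran
print(f"Reach: {reach(150, 600):.0%}")                 # -> Reach: 25%
print(f"Dose delivered: {dose_delivered(8, 10):.0%}")  # -> Dose delivered: 80%
```

Tracking these during the project (rather than only at the end) is what lets a process evaluation trigger mid-course corrections.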

Lesson Learned: Outputs are Cool!

Friday, June 5th, 2015

AEA Summer Institute Logo

Cindy Olney and I just returned from the American Evaluation Association Summer Institute in Atlanta, GA. The blog posts for the next couple of months will be filled with lessons learned from the Institute. I am going to start with Outputs, because they were the greatest surprise to me.

In his “Introduction to Program Evaluation,” Thomas Chapel, Chief Evaluation Officer for the Centers for Disease Control and Prevention, said that he thought outputs were just as important as outcomes. This was quite shocking to me, since it always seemed like outputs were just the way of counting what had been done, and not nearly as interesting as finding out if the desired outcome had happened.

Outputs are the tangible products of the activities that take place in a project. For example, let’s say the project’s goal is to reduce the number of children with Elevated Blood Lead Levels (EBLL) by screening children to identify the ones with EBLL and then referring them to health professionals for medical management. In this brief project description, the activities would be to:

1) Screen children to identify the ones with EBLL
2) Refer them to health professionals for medical management

If outputs are the tangible products of the activities, they are sometimes thought of as something countable, like "the number of children screened for EBLL" and "the number of referrals." This is how the project manager can verify that the planned activities took place.

However, if you think about the way an activity can take place, you can see that some methods of completing the activities might lead to a successful outcome, and some might not. A better way of thinking about outputs might be to ask, "What would an output look like that would lead to the outcome we are looking for?"

To use "referrals" as an example, let's say that during the program 100% of the children identified with EBLL were referred to health professionals, but only 30% of them actually followed up and went to a health professional. If the only information you gathered was the number of referrals, you cannot tell why the success rate was so low. Some of the things that could go wrong in a referral are that people are referred to physicians who are not taking new patients, or to physicians who don't speak the same language as the parents of the child. So you might want to define the referral output to include those factors. The new output measure could be "the number of referrals to 'qualified' physicians," in which 'qualified' is defined by the attributes you need to see in the physicians, such as physicians who are taking new patients or who speak the same language as the family.
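The referral example above can be made concrete with a small sketch: counting raw referrals versus referrals to "qualified" physicians. The records and field names below are invented for illustration, not real program data.

```python
# Hypothetical referral records for the EBLL example.
referrals = [
    {"accepting_new_patients": True,  "language_match": True,  "followed_up": True},
    {"accepting_new_patients": False, "language_match": True,  "followed_up": False},
    {"accepting_new_patients": True,  "language_match": False, "followed_up": False},
    {"accepting_new_patients": True,  "language_match": True,  "followed_up": True},
]

total = len(referrals)

# A "qualified" referral builds the outcome-relevant attributes into the
# output measure, instead of counting every referral the same.
qualified = sum(
    r["accepting_new_patients"] and r["language_match"] for r in referrals
)
followed_up = sum(r["followed_up"] for r in referrals)

print(f"Referrals: {total}")                          # -> Referrals: 4
print(f"To qualified physicians: {qualified}")        # -> To qualified physicians: 2
print(f"Follow-up rate: {followed_up / total:.0%}")   # -> Follow-up rate: 50%
```

In this toy data, the raw referral count (4) looks like full success, while the qualified count (2) already predicts the low follow-up rate.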

The lesson for me is that outputs are as important as outcomes because by thinking carefully about outputs at the beginning of the planning process, you can ensure that the project has the greatest chance of successful outcomes, and by using outputs during process evaluation, you can make any needed corrections in the process as it is happening to ensure the greatest success of the project.

Use Evaluation Samples as Shortcuts

Friday, May 29th, 2015

VIVA project logo

We are often asked for samples of questionnaires and evaluations for information outreach projects. This peer tutor project from the Texas Rio Grande Valley has posted a number of sample evaluation documents that can be modified for other information outreach projects.

The ¡VIVA! (Vital Information for a Virtual Age) Project is a high school-based health information outreach initiative in which high school students (peer tutors) from the South Texas Independent School District are trained to use and promote MedlinePlus, a consumer-health database of the NIH National Library of Medicine. Since 2001, these teen peer tutors have taught others about MedlinePlus through class demonstrations, student orientations, school open houses, and community events.

Evaluation has been a strong component of this program since its inception. As part of an online Implementation Guide, the ¡VIVA! Peer Tutor Project team posted many sample evaluation forms and documents. These can be modified to fit your own outreach evaluation needs. Here are some examples:

Logic Model and Evaluation Plan: Here is a sample of how to tie a logic model to the project’s evaluation plan based on outcomes.

Interview guides: It can take a long time to form the perfect questions for interviewing individuals or focus groups. See if these questions will work, and if not, see if you can adjust them.

Training Assessments: Here are some basic questionnaires designed to find out what students have learned and how they would rate their training session.

Feel free to use these shortcuts to make evaluation fit more easily into your workflow! If you have any questions about the program or the evaluation forms, feel free to contact Cynthia Olney.

Last updated on Saturday, 23 November, 2013

Funded by the National Library of Medicine under contract # HHS-N-276-2011-00008-C.