

NN/LM Outreach Evaluation Resource Center

Creative Annual Reports

Ah, the annual report – at its best we expect to see a glossy booklet with pie charts, short paragraphs and some quotes. At its worst it can be pages of dry text. Our main hope with annual reports is that our stakeholders and others will read them and be impressed with the successes of our organizations.

Last month I ran across the annual report from the Nowra Public Library in New South Wales, Australia, which was so compelling and understandable that over 100,000 people have viewed it on YouTube.


Since most organizations don’t have the resources to do their own music video (e.g. singers, writers, silly costumes), I thought I would look at a few other examples to consider when it’s time to do your annual report.

One of my all-time favorites is the annual report from the Schusterman Library of The University of Oklahoma-Tulsa. Their annual report is an infographic that shows the data that is collected, but also describes the data in such a way that 1) you have a better feel for what is going on in the library; and 2) you might think “I didn’t know they would help me with that!”  For example: “7,274 Reference questions answered in person, by phone, by email, and instant message or text on everything from ADHD and child welfare to decision trees, LEED homes, and census reporting.” It is available on their website, and the librarians at the Schusterman Library say they frequently find students looking at it.

The Michigan State University College of Education won a gold ADDY and a Best in Show Award for their 2012 Annual Report (the ADDY Awards are the advertising industry’s largest competition).  Their report featured a tri-fold, die-cut skyline that presented the college’s missions and strengths with an emphasis on “college as community.” The annual report also included a video and a website that give detailed narratives showing institutional successes in terms of personal stories.

Of course, not all institutions want an unusual annual report.  But it is important to consider the target audience.  Annual reports reach the upper administration, potential funders, and patrons of the library. The success of this year’s annual report might shape library users’ view of the library for years to come.

DIY: Two Great Photovoice Guides


Photovoice is an evaluation method for the times.  This method engages program stakeholders (learners, service recipients, community members) in taking photographs and using them as springboards to express their experiences and points of view.  With the prevalence of cameras in mobile devices, along with social media forums, most of us are already engaging in the foundational practices underlying photovoice: taking photos, posting them, and sharing our experiences.  Add in some facilitators who provide systematic method design, project management, and ethical oversight, and you have the potential to gather program insights that would go untapped through traditional methods.

Today’s post introduces you to two practical resources written by action researchers describing their lessons learned about conducting photovoice projects. The documents also show, or link to, photos and commentary from contributing participants.


From the Prairie Women’s Health Centre of Excellence

One comprehensive guide comes from the Prairie Women’s Health Centre of Excellence (PWHCE), located in Canada.  The center engages in collaborative, community-based research on social and other determinants of the health of women and girls. The center’s mission is to provide expert advice on social policies related to women’s health. The authors (Beverly Palibroda, Brigette Krieg, Lisa Murdock and Joanne Havelock) published A Practical Guide To Photovoice: Sharing Pictures, Telling Stories and Changing Communities, a nuts-and-bolts photovoice manual. It provides detailed advice, with periodic sidebars summarizing the process. An appendix includes a helpful checklist. You will find sample photovoice entries throughout the document.

The manual was written in 2009.  Since that time, the PWHCE has introduced digital story-telling into its portfolio of participatory methods.  Check out the stories here.


Another guide was produced based on a photovoice project for an educational website that provides authoritative information about brain injury symptoms, diagnosis, and treatment. The project featured the stories of eight members with traumatic brain injury.  The gallery of essays is available here.  Facilitators Laura Lorenz and Barbara Webster developed a succinct facilitator guide based on this project.

If you want to learn how to do a photovoice project, these documents are a great place to start. You also can find other resources in OERC’s blog entries posted in 2012 and  2014.

The OERC is on the Road Again

The OERC is on the road again.  Today, Cindy and Beth Layton, Associate Director of the NN/LM Greater Midwest Region, are team-teaching Measuring What Matters to Stakeholders at the Michigan Health Sciences Library Association’s annual conference in Flint, MI.


This workshop covers strategies for using evaluation to enhance and communicate a library’s value to organizational decision-makers and stakeholders who influence decision makers. The workshop combines updated information with material from the NN/LM MidContinental Region’s Measuring Your Impact and the OERC’s Valuing Your Library workshops that have been taught by a number of regional medical library staff members over the past decade.

On Saturday, Karen is presenting a brand-new workshop for the Texas Library Association’s District 8 Conference called Adding Meaning to Planning: A Step-by-Step Method for Involving Your Community in Meaningful Library Planning.


The workshop presents a method for involving community members in creating pain-free logic models to ensure that the long-term vision is always in sight when planning.  Karen wrote a blog entry about creating “tearless” logic models here.  This is Karen’s first experience creating and delivering a workshop that is purely about library evaluation.

The NN/LM travel season is about to go into full swing.  We know we aren’t the only ones out and about with presentations, trainings, and exhibits.  So safe travels. And we will see you in a week with another OERC blog post.

Elevator Conversations Pt. 2: The OERC Pitch

Last week, I reviewed Tim David’s article “Your Elevator Pitch Needs an Elevator Pitch.” This week, Karen Vargas (my co-blogger) and I decided to challenge ourselves and write an elevator pitch for the Outreach Evaluation Resource Center (aka OERC). So this week’s post is our example of how to implement David’s approach.

The Set Up

For those of you who don’t already know about us, the OERC offers evaluation training and consultation to members of the National Network of Libraries of Medicine (NN/LM). Libraries and organizations from across the US join the NN/LM to help promote health information access and use. They specifically promote resources of the National Library of Medicine, which funds the program. The OERC’s charge includes helping NN/LM members use evaluation to identify their best outreach strategies and share them with colleagues, as well as to promote the results of their work.

The NN/LM is managed by the National Network Office at the National Library of Medicine.  The position of NNO Head is currently vacant.  In anticipation of the day when our new leader is hired, we decided to think about how to pitch the OERC.

The Pitch

Let’s imagine that I’m on the elevator with the new head of NNO.  This is purely hypothetical, of course. Reality may (probably will) vary.  When we incorporated one of David’s elements, we tagged it in parentheses.

OERC: “You know how the RMLs and organizations in the NN/LM do great and noble work funded by NLM, but we aren’t always sure how to get the message across about what we do?” (problem question)

NNO Head: “Well, yes.”

OERC: “Everyone in the network is hungry to know what strategies work well and which ones are a waste of time. We want to be able to share lessons learned about how to do outreach. Ideally, we want to provide solid evidence that allows us to talk credibly about the results of our good work.” (noddable)

NNO Head: “I can agree that’s important.”

OERC: “Well, the OERC helps NN/LM members use evaluation to learn what strategies work well. We also teach them how to take advantage of their evaluation data to get the message out about positive results and lessons learned.” (curiosity statement)

NNO Head: “Really?  How do you do that?”

OERC: “We combine training with one-to-one coaching. For example, an NN/LM outreach coordinator, Michelle Eberle, led one of the New England Region’s Communities of Interest, which is one of NER’s innovative approaches to promoting use of NLM consumer health resources.  Michelle has taken a lot of our training sessions over the years, so she developed an evaluation questionnaire for the COI, then asked me to review it. In the end, she got some good evidence of success and was able to publish an article about her project in MLA News. So that project was shared with a lot of health sciences librarians both in and outside of her region. That’s just one example.  In the past year alone, two of us taught evaluation webinars to about 580 participants and provided consultations on 31 projects.” (example)

The Close

Note: Tim David is a corporate communication consultant, so his elevator pitch was designed to produce a meeting with a potential client.  Our goal is similar.  We would hope for a meeting with the new Head of NNO to present more details about how we support the NN/LM. That would allow him or her to better understand our role in the network (and our value to it). If our elevator pitch worked, we think the conversation would end something like this:

NNO Head: “It sounds as though you have other good stories to share.”

OERC: “When you have some time, we would love to schedule a meeting to share more about some of our other evaluation projects with the NN/LM libraries and organizations. We would be happy to put together a presentation for you.”


David T. Your elevator pitch needs an elevator pitch. Harvard Business Review. 30 Dec 2014.  Retrieved 13 Sept 2015.

Eberle, Michelle L.; Malachowski, Margot; Richetelle, Alberta; Lahoz, Monina; McIntosh, Linda; Moore, Bradley; Searl, Jen; and Kronick, Judy, “Clear: Conversations: A Collaborative Regional Project to Help Patients Improve their Health Visits” (2014). National Network of Libraries of Medicine New England Region (NN/LM NER) Repository. Paper 25. (Michelle’s article about this project was published in the MLA News, August 2014, titled: Clear: Conversations: A Project to Help Patients Improve Their Health Visits.)


Give Your Elevator Pitch a Lift


Forget about elevator speeches.  Think elevator conversations.

Elevator pitches are one of a number of strategies you can use to stealthily promote your organization’s successful programs and services. We cover elevator pitches in an OERC workshop about how to use evaluation to better advocate for your organization. I always thought of elevator pitches as little promotional speeches of elevator-ride length (i.e., 20 seconds) that you can slip into small talk when you run into “someone influential.”  You add nuggets of evaluation findings to these mini-speeches to demonstrate program value.

I now see that I was missing a key element in the elevator pitch exchange: the other person.

I owe this insight to Tim David and his article Your Elevator Pitch Needs an Elevator Pitch, which appeared in the Harvard Business Review (30 Dec 2014).  David emphasizes the importance of engaging your fellow elevator traveler, rather than talking “at” him or her.

As such, you have to prepare a conversation, not a speech.

What I appreciate in particular is how he seamlessly slips in evidence to support his low-key pitch. See, for instance, how he surreptitiously inserts a statistic that he must have obtained from a follow-up evaluation with one of his client organizations.  Specifically, the organization reported that productivity and morale increased 38% after his training. David folds that little fact into the conversation, and it underscores the value his service provided to the organization.

That’s how to tie evaluation to advocacy, folks!

Here are the other tips I took away from the article:

  • Answer polite but perfunctory questions (such as “what does your office do?”) with a surprising answer. This is harder than it looks, so I’m going to have to practice this tip. (“Hi Mom, did you know….?”)
  • Use questions to draw your elevator companion into the conversation. David suggests that you talk no more than 20% of the time. Yield the remainder of the time to the other traveler, but use questions to keep the conversation rolling.
  • Don’t worry too much about that 20-second time frame traditionally recommended for elevator pitches. If you successfully engage your fellow rider, he or she will hold the elevator door open to continue the chat.

We have posted a number of articles about weaving evaluation results into stories (see June 29, July 2, and August 21 of this year). The elevator pitch format is a good addition to your story-telling tool kit. But it is the extra-credit challenge. It will take some practice to be able to present an elevator pitch casually and conversationally. If you’re up for that challenge, then check out Tim David’s article for some excellent guidelines.


Which Online Survey Tool Should I Use? A Review of Reviews

Recently we faced the realization that we would have to reevaluate the online survey tool that we have been using. We thought that we would share some of the things that we learn along the way.

First of all, finding a place that evaluates survey products (like Survey Monkey or Survey Gizmo) is not as easy as going to Consumer Reports or Amazon (or CNET, Epinions, or Buzzillions).  A number of websites provide reviews of survey tools, but their quality varies widely.  So for this week, our project has been to compare review websites to see what we can learn from and about them.

Here are the best ones I could find that compare online survey tools:

Zapier’s Ultimate Guide to Forms and Surveys, Chapter 7 “The 20 Best Online Survey Builder Tools”

This resource compares 20 different online survey tools. There is a chart with a brief statement of what each survey tool is best for, what you get for free, and the lowest plan cost. Additionally, there is a paragraph description of each tool and what it does best.  Note: this is part of an eBook published in 2015 which includes chapters like “The Best Online Form Builders for Every Task.”

“18 Awesome Survey & Poll Apps”

This review was posted on May 27, 2015, which reassures us that the information is most likely up to date.  While the descriptions are very brief, it is good for a quick comparison of the survey products. Each review notes whether or not there is a free account, whether the surveys can be customized, and whether there are ready-made templates.

Capterra’s “Top Survey Software Products”

This resource appears to be almost too good to be true. Alas, no date is shown, which means the specifics in the comparisons might not be accurate.  Nevertheless, this website lists over 200 survey software products, has a separate profile page for each product (with varying amounts of detail), and lists the features that each product offers.  You can even narrow down the surveys you are looking for by filtering by feature.  Hopefully the features in Capterra’s database are kept updated for each product.  One thing to point out is that at least two fairly well-known survey products (that I know of) are not in their list.

“Top 31 Free Survey Apps”

Another review site with no date listed. This one compares 31 apps by popularity, presumably in the year the article was written. One thing that is unique about this review site is that each in-depth review includes the history and popularity of the app, how it differs from other apps, and who they would recommend it to.  Many of the reviews include videos showing how to use the app.  Pretty cool.

2015 Best Survey Software Reviews and Comparisons

This website has the feel of Consumer Reports. It has a long article explaining why you would use survey software, how and what the reviewers tested, and the kinds of things that are important when selecting survey software. Also like Consumer Reports, it has ratings of each product (including the experiences of the business, the respondents, and the quality of the support), and individual reviews of each product showing pros and cons. Because the date is included in the review name, the information is fairly current.

This is a starting point. There are individual reviews of online survey products on a variety of websites and blogs, which are not included here.  Stay tuned for more information on online survey tools as we move forward.


Simply Elegant Evaluation: Appreciative Inquiry at NN/LM MAR


Kate Flewelling is the Outreach Coordinator for the National Network of Libraries of Medicine’s Middle Atlantic Region (NN/LM MAR), which is located at the University of Pittsburgh Health Sciences Library.  For those unfamiliar with the NN/LM, libraries and organizations join the network to help promote health information access and use.  The program is funded by the NIH National Library of Medicine, and NN/LM MAR is one of eight regional medical libraries that coordinate a region of the network. These eight health sciences libraries partner with member organizations and sometimes fund their health information outreach activities.

After attending an NN/LM Outreach Evaluation Resource Center training session, Kate was inspired to conduct an Appreciative Inquiry (AI) evaluation project to explore how NN/LM MAR could attract health professionals into the NN/LM and provide them with useful services.  In March 2015, she conducted eight 30-minute interviews with health professionals. She recruited interviewees from among health professionals who served on a special NN/LM MAR advisory committee or represented health organizations that received funding from NN/LM MAR. She chose interviewees from three states where NN/LM MAR works (Pennsylvania, New York, and Delaware), with representation from organizations in both rural and urban areas. Her interview guide was modeled after one presented in an OERC blog post.

Kate agreed to talk with the OERC to share her experience using Appreciative Inquiry.

OERC: What motivated you to do an Appreciative Inquiry project?

Kate: “I wanted to see why health professionals got involved with us and have been so committed to us. We wanted to come up with selling points to tell other potential health professional members why they should join our network.”

Kate also chose an AI approach because she wanted stories, not numbers.  She was working with a small but diverse group of representatives, so interviews seemed to be a better approach than surveys for getting unique perspectives. She also believed an AI assessment was simple enough to be completed in about a week. In fact, she completed all eight interviews in eight days.


OERC: Did you believe you got responses that had an overly positive bias?

Kate: “I only asked people who loved us, but they also know us. So they have an idea of what we’re doing and a much broader understanding of what we do with outreach because they hear about the whole outreach program. But I got really good feedback. Not criticism, but stuff we could do to improve our services.”

In AI, it is not unusual to interview an organization’s champions, because they often can provide the most informed advice about improving a program. Kate understood that her interviewees had favorable opinions about NN/LM MAR, but she said her interviews still identified holes in their outreach efforts to health professionals. They provided good advice on how to attract other health professionals to the network.

OERC: What did you learn from the study?

Kate: “The experience was great! It gave me good ideas.  I realized we weren’t using them [health professional colleagues] as much as we could. They told me ‘I will pass on whatever you need me to pass on.’  It gave me great ideas for how to use them, use their connections, and develop targeted outreach materials and messages for special audiences.”

She realized that NN/LM MAR could send regular postings to the health representatives, just as they do to health sciences librarians.  The postings just needed to contain more context so that they were targeting a public health or clinical audience.

Kate: “The project also made me realize how far [NN/LM MAR’s] reach has gone in the past four years…It felt like, during our first and second year, throwing spaghetti on the wall to see if it was working with health professionals. But we were trying to make the connections. Now we know, for our most engaged people, what they value about their relationship with us.”

Most of the staff joined NN/LM MAR in 2011, when the University of Pittsburgh Health Sciences Library was awarded the contract to coordinate the Middle Atlantic Region. So the AI project was a member check with their public health and clinical health partners, to see how well the relatively new program was meeting their needs.  Before the AI project, Kate said she knew what NN/LM MAR staff were getting from their relationships with health professionals.  Afterwards, she understood what the health professionals were getting from NN/LM MAR.

OERC: How did you use the information you gathered?

Kate: “Just starting to talk to people at exhibits, I have a sense of what’s going to grab them.”

Kate developed a brochure targeted to health professionals with a summary of NN/LM selling points on the front (gleaned from her AI interviews) and resources of interest on the back. She plans to share the brochure with the other seven NN/LM regional medical libraries. She also believes the NN/LM MAR staff will tap into this information in the future when they plan programs for health professionals.


Improving Your Data Storytelling in 30 Days

Here are some more great techniques to help you tell a story with your evaluation data so it gets the attention it deserves.

Juice Analytics has this truly wonderful collection of resources in a guide called “30 Days to Data Storytelling.” With assignments of less than 30 minutes a day, this guide links to data visualization and storytelling resources from sources as varied as Pixar, Harvard Business Review, Ira Glass, the New York Times, and Bono (yes, that Bono).

The document is a checklist of daily activities lasting no longer than 30 minutes per day. Each activity is either an article to read, a video to watch, or a small project to do.

The resources answer valuable questions like:

  • What do I do when I’m stuck?
  • How do I decide between visual narrative techniques?
  • Where can I find examples of using data visualization to tell a story?

Soup Up Your Annual Reports with Calculator Soup

Summer is annual report time for our organization. Sometimes when I’m putting together my bulleted list of accomplishments for those reports, I feel as though our major wins get lost in the narrative. So I recently turned to an online calculator to help me create better metrics to talk about our center’s annual wins.

One of our objectives for the year was to increase participation in our evaluation training program. We developed new webinars based on our users’ feedback and also increased promotion of our training opportunities. The efforts paid off: training session attendance increased from 291 participants the previous year to 651 this year. Now that is a notable increase, but the numbers sort of disappear into the paragraph, don’t they? So I decided to add a metric to draw attention to this finding: Our participation rate increased 124% over last year’s attendance. Isn’t “percent increase” a simpler and more eye-catching way to express the same accomplishment?

Doing this extra analysis seems simple, but it takes time and gives me angst because it usually requires manual calculation. First I have to look up the formula somewhere. Then I have to calculate the statistic. Then I calculate it again, because I don’t trust myself. Then I calculate it again out of pure obsessiveness.

That’s why I love online calculators. Once I find one I like and test it for accuracy, I bookmark it for future use. From then on, I let the calculator do the computation because it is infinitely more reliable than I am when it comes to running numbers.

One of my favorite sites for online calculators is Calculator Soup, because it has so many of them. You may not ever use 90% of its calculators, but who knows when you might need to compute someone’s age from a birth date or convert days to hours. The calculators also show you the exact steps in their calculations. This allows you to check their work. You also can find formulas that you then can apply in an Excel spreadsheet.
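Those formulas are also easy to drop into a few lines of code. Here is a minimal Python sketch of the percent-change calculation from my annual report example above (the helper function name is my own invention, not something from Calculator Soup):

    # Percent change: (new - old) / old * 100
    def percent_change(old, new):
        return (new - old) / old * 100.0

    # Our training attendance went from 291 participants to 651:
    print(round(percent_change(291, 651)))  # prints 124, i.e. about a 124% increase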

One word of advice: test a calculator for accuracy before adopting it. I always test a new calculator to be sure the designers knew what they were doing. For Calculator Soup, I can vouch for the percent change and the mean/median/mode calculators. If I use any others at that site, I’ll test them as well: I’ll create an easy problem that I can solve manually and make sure my result matches the calculator’s.
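That kind of spot check is easy to script, too. Here is a minimal Python sketch of the sort of hand-checkable problem I mean (the data values are made up for illustration):

    import statistics

    # A toy data set whose answers are easy to work out by hand:
    data = [2, 4, 4, 6, 9]

    print(statistics.mean(data))    # 5 -> should match the calculator's mean
    print(statistics.median(data))  # 4 -> should match the calculator's median
    print(statistics.mode(data))    # 4 -> should match the calculator's mode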

If you want to see what Calculator Soup has to offer, check out their calculator index here.


A Rainbow Connection? The Evaluation Rainbow Framework

Once you start looking online for help with your evaluation project, you will find a veritable storm of evaluation resources out there. So many that it can be hard to know how to choose the best ones for your needs.  But don’t worry: once you’ve looked at this online tool, you will find the rainbow of hope that follows the storm (okay, that was pretty cheesy – stay with me, it gets better).

A group of evaluators from all over the world created a website called BetterEvaluation.org for the purpose of organizing and sharing useful online evaluation resources. The framework they created to organize resources is called the Rainbow Framework because it divides the world of evaluation into seven different “clusters,” which are delineated by rainbow colors.  Each cluster is then broken down into a number of tasks, and each task is broken down into options and resources.

Here is an example of the Rainbow Framework in action.  Clicking on the yellow “Describe” category opens a window on the right that lists seven tasks: 1) Sample; 2) Use measures, indicators, or metrics; 3) Collect and/or retrieve data; 4) Manage data; 5) Combine qualitative and quantitative data; 6) Analyze data; and 7) Visualize data.

When you click on a specific task, a page opens listing a variety of options and resources.


BetterEvaluation made eight 20-minute “coffee break” webinars in conjunction with AEA (the American Evaluation Association) that you can watch for free on the BetterEvaluation website. Each webinar describes a cluster, and there is one overview webinar.  The webinars are two years old, so the actual image of the rainbow looks a little different from the one in the webinar video, but the content is still relevant.  Here is a link to the webinar series:

The Rainbow Framework does more than just organize resources. Here are some reasons you might want to use this Framework.

1) Get help designing and planning an evaluation.

2) Check the quality of an ongoing evaluation.

3) Commission an evaluation – the framework helps you formulate what is important to include when you commission an evaluator, and then helps you assess the quality of the proposals.

4) Embed stakeholder participation thoughtfully throughout the evaluation.

5) Develop your evaluation capacity – lifelong learning – to fill in gaps in your knowledge.

So, somewhere over the rainbow, your evaluation skies may be blue…


Funded by the National Library of Medicine under contract # HHS-N-276-2011-00008-C.