

NN/LM Outreach Evaluation Resource Center

Archive for the ‘News’ Category

Elevator Conversations Pt. 2: The OERC Pitch

Friday, September 18th, 2015

Last week, I reviewed Tim David's article "Your Elevator Pitch Needs an Elevator Pitch." This week, Karen Vargas (my co-blogger) and I decided to challenge ourselves and write an elevator pitch for the Outreach Evaluation Resource Center (aka OERC). This post is our example of how to put David's approach into practice.

The Set Up

For those of you who don’t already know about us, the OERC offers evaluation training and consultation to members of the National Network of Libraries of Medicine (NN/LM). Libraries and organizations from across the US join the NN/LM to help promote health information access and use. They specifically promote resources of the National Library of Medicine, which funds the program. The OERC’s charge includes helping NN/LM members use evaluation to identify their best outreach strategies and share them with colleagues, as well as to promote the results of their work.

The NN/LM is managed by the National Network Office at the National Library of Medicine.  The position of NNO Head is currently vacant.  In anticipation of the day when our new leader is hired, we decided to think about how to pitch the OERC.

The Pitch

Let’s imagine that I’m on the elevator with the new head of NNO.  This is purely hypothetical, of course. Reality may (probably will) vary.  When we incorporated one of David’s elements, we tagged it in parentheses.

OERC: “You know how the RMLs and organizations in the NN/LM do great and noble work funded by NLM, but we aren’t always sure how to get the message across about what we do?” (problem question)

NNO Head: “Well, yes.”

OERC: "Everyone in the network is hungry to know what strategies work well and which ones are a waste of time. We want to be able to share lessons learned about how to do outreach. Ideally, we want to provide solid evidence that allows us to talk credibly about the results of our good work." (noddable)

NNO Head: “I can agree that’s important.”

OERC: "Well, the OERC helps NN/LM members use evaluation to learn what strategies work well. We also teach them how to take advantage of their evaluation data to get the message out about positive results and lessons learned." (curiosity statement)

NNO Head: "Really? How do you do that?"

OERC: "We combine training with one-to-one coaching. For example, an NN/LM outreach coordinator, Michelle Eberle, led one of the New England Region's Communities of Interest, one of NER's innovative approaches to promoting use of NLM consumer health resources. Michelle has taken a lot of our training sessions over the years, so she developed an evaluation questionnaire for the COI herself, then asked me to review it. In the end, she got good evidence of success and was able to publish an article about her project in MLA News, so the project was shared with a lot of health sciences librarians both in and outside her region. That's just one example. In the past year alone, two of us taught evaluation webinars to about 580 participants and provided consultations on 31 projects." (example)

The Close

Note: Tim David is a corporate communication consultant, so his elevator pitch was designed to produce a meeting with a potential client. Our goal is similar: we would hope for a meeting with the new Head of NNO to present more details about how we support the NN/LM. Such a meeting would help him or her better understand our role in (and our value to) the network. If our elevator pitch worked, we think the conversation would end something like this:

NNO Head: “It sounds as though you have other good stories to share.”

OERC: "When you have some time, we would love to schedule a meeting to share more about some of our other evaluation projects with the NN/LM libraries and organizations. We would be happy to put together a presentation for you."


David T. Your elevator pitch needs an elevator pitch. Harvard Business Review. 30 Dec 2014.  Retrieved 13 Sept 2015.

Eberle ML, Malachowski M, Richetelle A, Lahoz M, McIntosh L, Moore B, Searl J, Kronick J. "Clear: Conversations: A Collaborative Regional Project to Help Patients Improve their Health Visits" (2014). National Network of Libraries of Medicine New England Region (NN/LM NER) Repository, Paper 25. (Michelle's article about this project was published in the MLA News, August 2014, titled "Clear: Conversations: A Project to Help Patients Improve Their Health Visits.")


Give Your Elevator Pitch a Lift

Friday, September 11th, 2015


Forget about elevator speeches.  Think elevator conversations.

Elevator pitches are one of a number of strategies you can use to stealthily promote your organization's successful programs and services. We cover elevator pitches in an OERC workshop about how to use evaluation to better advocate for your organization. I always thought of elevator pitches as little promotional speeches of elevator-ride length (about 20 seconds) that you can slip into small talk when you run into "someone influential." You add nuggets of evaluation findings to these mini-speeches to demonstrate program value.

I now see that I was missing a key element in the elevator pitch exchange: the other person.

I owe this insight to Tim David and his article "Your Elevator Pitch Needs an Elevator Pitch," which appeared in the Harvard Business Review (30 Dec 2014). David emphasizes the importance of engaging your fellow elevator traveler, rather than talking "at" him or her.

As such, you have to prepare a conversation, not a speech.

What I particularly appreciate is how he seamlessly slips evidence into his low-key pitch. See, for instance, how he casually inserts a statistic that he must have obtained from a follow-up evaluation with one of his client organizations: the organization reported that productivity and morale increased 38% after his training. David folds that little fact into the conversation, and it underscores the value his service provided to the organization.

That’s how to tie evaluation to advocacy, folks!

Here are the other tips I took away from the article:

  • Answer polite but perfunctory questions (such as “what does your office do?”) with a surprising answer. This is harder than it looks, so I’m going to have to practice this tip. (“Hi Mom, did you know….?”)
  • Use questions to draw your elevator companion into the conversation. David suggests that you talk no more than 20% of the time. Yield the remainder of the time to the other traveler, but use questions to keep the conversation rolling.
  • Don’t worry too much about that 20-second time frame traditionally recommended for elevator pitches. If you successfully engage your fellow rider, he or she will hold the elevator door open to continue the chat.

We have posted a number of articles about weaving evaluation results into stories (see June 29, July 2, and August 21 of this year). The elevator pitch format is a good addition to your storytelling tool kit, but it is the extra-credit challenge: it will take some practice to be able to present an elevator pitch casually and conversationally. If you're up for that challenge, check out Tim David's article for some excellent guidelines.


Which Online Survey Tool Should I Use? A Review of Reviews

Friday, September 4th, 2015

Recently we faced the realization that we would have to reevaluate the online survey tool we have been using. We thought we would share some of the things we learn along the way.

First of all, finding a site that evaluates survey products (like SurveyMonkey or SurveyGizmo) is not as easy as going to Consumer Reports or Amazon (or CNET, Epinions, or Buzzillions). A number of sites on the internet provide reviews of survey tools, but their quality varies widely. So this week our project has been to compare review websites to see what we can learn from and about them.

Here are the best ones I could find that compare online survey tools:

Ultimate Guide to Forms and Surveys, Chapter 7: "The 20 Best Online Survey Builder Tools"

This resource compares 20 different online survey tools. There is a chart with a brief statement of what each survey tool is best for, what you get for free, and the lowest plan cost. Additionally, there is a paragraph describing each tool and what it does best. Note: this is part of an eBook published in 2015, which includes chapters like "The Best Online Form Builders for Every Task."

"18 Awesome Survey & Poll Apps"

This review was posted on May 27, 2015, which reassures us that the information is most likely up to date. While the descriptions are very brief, it is good for a quick comparison of the survey products. Each review notes whether there is a free account, whether the surveys can be customized, and whether there are ready-made templates.

Capterra's "Top Survey Software Products"

This resource appears to be almost too good to be true. Alas, no date is shown, which means the specifics in the comparisons might not be accurate. Nevertheless, this website lists over 200 survey software products, has a separate profile page for each product (with varying amounts of detail), and lists the features each product offers. You can even narrow down the options by filtering by feature. Hopefully the features in Capterra's database are kept updated for each product. One thing to point out is that at least two fairly well-known survey products (that I know of) are not in their list.

"Top 31 Free Survey Apps"

Another review site with no date listed. This one compares 31 apps by popularity, presumably in the year the article was written. One thing that is unique about this review site is that each in-depth review covers the history and popularity of the app, how it differs from the other apps, and who the reviewers would recommend it to. Many of the reviews include videos showing how to use the app. Pretty cool.

2015 Best Survey Software Reviews and Comparisons

This website has the feel of Consumer Reports. It has a long article explaining why you would use survey software, how and what the reviewers tested, and the kinds of things that are important when selecting survey software. Also like Consumer Reports, it has ratings of each product (including the experiences of the business, the respondents, and the quality of the support), and individual reviews of each product showing pros and cons. Because the date is included in the review name, the information is fairly current.

This is a starting point. There are individual reviews of online survey products on a variety of websites and blogs, which are not included here.  Stay tuned for more information on online survey tools as we move forward.


Improving Your Data Storytelling in 30 Days

Friday, August 21st, 2015

Here are some more great techniques to help with telling a story to report your evaluation data so it will get the attention it deserves.

Juice Analytics has a truly wonderful collection of resources in a guide called "30 Days to Data Storytelling." With assignments of less than 30 minutes a day, this guide links to data visualization and storytelling resources from sources as varied as Pixar, Harvard Business Review, Ira Glass, the New York Times, and Bono (yes, that Bono).

The document is a checklist of daily activities lasting no longer than 30 minutes per day. Each activity is either an article to read, a video to watch, or a small project to do.

The resources answer valuable questions like:

  • What do you do when you’re stuck?
  • How do I decide between visual narrative techniques?
  • Where can I find some examples of using data visualization to tell a story?

Soup Up Your Annual Reports with Calculator Soup

Friday, August 14th, 2015

Summer is annual report time for our organization. Sometimes when I’m putting together my bulleted list of accomplishments for those reports, I feel as though our major wins get lost in the narrative. So I recently turned to an online calculator to help me create better metrics to talk about our center’s annual wins.

One of our objectives for the year was to increase participation in our evaluation training program. We developed new webinars based on our users’ feedback and also increased promotion of our training opportunities. The efforts paid off: training session attendance increased from 291 participants the previous year to 651 this year. Now that is a notable increase, but the numbers sort of disappear into the paragraph, don’t they? So I decided to add a metric to draw attention to this finding: Our participation rate increased 124% over last year’s attendance. Isn’t “percent increase” a simpler and more eye-catching way to express the same accomplishment?

Doing this extra analysis seems simple, but it takes time and gives me angst because it usually requires manual calculation. First I have to look up the formula somewhere. Then I have to calculate the statistic. Then I calculate it again, because I don’t trust myself. Then I calculate it again out of pure obsessiveness.

That’s why I love online calculators. Once I find one I like and test it for accuracy, I bookmark it for future use. From then on, I let the calculator do the computation because it is infinitely more reliable than I am when it comes to running numbers.

One of my favorite sites for online calculators is Calculator Soup, because it has so many of them. You may not ever use 90% of its calculators, but who knows when you might need to compute someone’s age from a birth date or convert days to hours. The calculators also show you the exact steps in their calculations. This allows you to check their work. You also can find formulas that you then can apply in an Excel spreadsheet.
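If you prefer a script to a web calculator, the same percent-change arithmetic is easy to reproduce in code. Here is a minimal sketch in Python using this post's attendance numbers; the function name is my own, for illustration only, and is not something taken from Calculator Soup or Excel:

    def percent_change(old_value, new_value):
        """Return the percent change from old_value to new_value."""
        return (new_value - old_value) / old_value * 100

    # Training attendance example from this post:
    previous_year = 291
    this_year = 651
    print(f"Participation increased {percent_change(previous_year, this_year):.0f}%")
    # Prints: Participation increased 124%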

One word of advice: test a calculator for accuracy before adopting it. I always test a new calculator to be sure the designers knew what they were doing: I create an easy problem that I can solve manually and make sure my result matches the calculator's. For Calculator Soup, I can vouch for the percent change and mean/median/mode calculators. If I use any others at that site, I'll test them as well.

If you want to see what Calculator Soup has to offer, check out their calculator index here.


How to Write a Mission Statement Without Losing Your Mind

Friday, July 31st, 2015

Mission statements are important. Organizations use them to declare to the world how their work matters. They are the North Star for employees, guiding their efforts toward supporting organizational priorities.  And mission statements are important to evaluators, because evaluation methods are ultimately designed to assess an organization’s value.  Having those values explicitly stated is very helpful.

Yet most of us would rather clean out the office refrigerator than participate in a mission-writing process. Now imagine involving 30 people in the writing process. Make that the refrigerator and the microwave, right?

That’s why I am so enthusiastic about the Nonprofit Hub’s document A Step-By-Step Exercise for Creating a Mission Statement, which the authors promise  is a tool “for those who want to skip the nitpicking, word choice arguments or needing to create the elusive ‘perfect mission statement.’”

I won't go into details about how their process works, because the guide lays it out elegantly and concisely. It is so succinct that you can read through the process in five minutes. I'll just tell you what I like most:

  • The exercise reportedly takes 1-2 hours, even though you are engaging up to 30 stakeholders in the process.
  • Stories comprise the foundation of the mission statement: people start by sharing stories about the organization’s best work.
  • The individuals do group qualitative analysis on the stories to begin to understand the organization’s cause, activities, and impact.
  • Small groups draft mission statements, with instructions to write short, simple sentences. In fact, 10-word sentences are held up as an ideal. The small groups share back with the large group, where big ideas are identified and discussed.
  • The actual final wording is assigned to a small task force to create after the meeting, which prevents wordsmithing from dampening the momentum (and the mood).
  • In the end, everyone understands and endorses the mission statement because they helped develop it.

This exercise has potential that reaches beyond development of mission statements.  It would be a great exercise for advisory groups to contribute their ideas about future activities. Their advice will be based on your organization’s past successes.  The stories generated are data that can be analyzed for organizational impact.  If you are familiar with Appreciative Inquiry, you’ll recognize the AI influence in this exercise.

The group qualitative analysis process, alone, could be adapted to other situations (see steps 1 and 2).  For example, a small project team could use the process to analyze stories from interviews, focus groups, or even written comments to open-ended survey questions.

Even if mission statements are not on your horizon, check out the Nonprofit Hub’s document. There might be something you can adapt for future planning and evaluation projects.


Getting Started in Evaluation – Evaluation Guides from the OERC

Thursday, July 23rd, 2015

New to the world of evaluation? What is your boss talking about when she says she wants you to measure outcomes, not outputs?  What is an indicator? How many responses should you get from your surveys?

Sometimes people think evaluation is just the form you fill out at the end of a class or event. In fact, evaluation can start at the very beginning of a project, when you do a community assessment, and it includes building support for your project among your stakeholders. It continues through making an evaluation plan as part of your project, gathering data, analyzing data, and reporting the data back to stakeholders in a way that is useful to them. Here is a model that the CDC uses to describe the evaluation framework:

CDC Framework for Program Evaluation

The Outreach Evaluation Resource Center (OERC) has a series of three booklets entitled Planning and Evaluating Health Information Outreach Projects that guide people through the evaluation process, from needs assessment to analyzing data. While the series focuses on health information outreach, it can be used to learn how to do evaluation for any type of project.

Booklet 1: Getting Started with Community-Based Outreach

  • Getting organized: literature review; assembling team of advisors; taking an inventory; developing evaluation questions
  • Gathering information: primary data, secondary data, and publicly accessible databases
  • Assembling, Interpreting and Acting: summarizing data and keeping stakeholders involved

Booklet 2: Planning Outcomes-Based Outreach Projects

  • Planning your program with a logic model to connect activities to outcomes
  • Planning your process assessment
  • Developing an outcomes assessment plan, using indicators, objectives and an action plan

Booklet 3: Collecting and Analyzing Evaluation Data

  • Designing data collection methods; collecting data; summarizing and analyzing data for:
    • Quantitative methods
    • Qualitative methods

The booklets can be read in HTML or downloaded as PDFs, and physical copies can be ordered for free from the OERC by sending an email request to:

Learn more about the CDC’s Evaluation Framework:



Fast Track Interview Analysis: The RITA Method

Friday, July 17th, 2015

If you want a systematic way to analyze interview data, check out the Rapid Identification of Themes from Audio Recordings (RITA) method described in Neal et al. (2015). This method skips the time-consuming transcription process because you conduct your analysis while listening to the recordings. The process also preserves nonverbal elements of your data (e.g., intonation), which are lost when interviews are transcribed. The authors present a case in their article to demonstrate how to use the RITA method.

The five-step RITA process, briefly described below, is meant to be used with multiple raters:

  1. Develop focused evaluation questions. Don't try to extract every detail from the recordings. Instead, write some focused evaluation questions to guide your analysis. For instance, you might want to know how participants applied lessons learned from a class on consumer health information, or what services are desired by a specific type of library user.
  2. Create a codebook. Develop a list of themes by talking with the project team, reviewing interviewer notes, or checking theories or literature related to your project. For their sample case, the authors used eight themes, which is probably the upper limit for the number of themes that can be used effectively in this process. Once you have the list, create a codebook with detailed theme definitions.
  3. Develop a coding form. (See the figure below.) This will be used by all coders to record the absence or presence of each theme in time-specified (e.g., 3-minute) segments of the interview. Coders listen to a time segment, mark any themes that were present, and then repeat the process with the next segment. (The article describes the process for figuring out the most appropriate segment length for your project.) If you want, you can also incorporate codes for "valence," indicating whether comments were expressed in positive, negative, or neutral tones.
  4. Have the coding team pilot-test the codebook and coding form on a small subset of interviews. The team then should refine both documents before coding all recordings.
  5. Code the recordings. At this stage, one coder per interview is acceptable, although the authors recommend that a subset of the interviews be coded by multiple coders and results tested for rater agreement.

RITA sample coding sheet (a spreadsheet with themes in the first column and 3-minute time segments in the top row for recording the presence of themes).
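To make the coding form concrete, here is a minimal sketch of how one coder's form for a single interview might be represented in Python. The theme names and codes below are invented for illustration; they are not taken from the article.

    # Hypothetical RITA-style coding form for one interview, using 3-minute segments.
    # 1 = theme present in the segment, 0 = theme absent.
    coding_form = {
        "applied lesson at work": [0, 1, 1, 0, 0, 1],  # segments 1-6 of an ~18-minute interview
        "barrier encountered":    [1, 0, 0, 1, 0, 0],
        "service requested":      [0, 0, 1, 0, 0, 0],
    }

    # Count how many segments mention each theme.
    for theme, segments in coding_form.items():
        print(f"{theme}: present in {sum(segments)} of {len(segments)} segments")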

While the RITA process may seem time consuming, it is much more efficient than producing verbatim transcripts. Once the authors finalized their coding form, it took a team member about 68 minutes to code a one-hour interview. Because the coded data are numeric, the authors were able to assess inter-rater reliability, and their analysis showed an acceptable level of agreement among coders. Rater agreement adds credibility to your findings and can be helpful if you seek to publish your results.
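As a rough illustration of what checking rater agreement can look like with this kind of 0/1 segment data, here is a simple percent-agreement sketch; the numbers are made up, and the article itself should be consulted for the specific reliability procedure the authors used.

    # Two coders' presence/absence codes for the same theme across the same segments.
    coder_a = [0, 1, 1, 0, 0, 1]
    coder_b = [0, 1, 0, 0, 0, 1]

    matches = sum(1 for a, b in zip(coder_a, coder_b) if a == b)
    percent_agreement = matches / len(coder_a) * 100
    print(f"Percent agreement: {percent_agreement:.0f}%")  # Prints: Percent agreement: 83%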

While the RITA method is applied to qualitative data, it is essentially a quantitative analytic method, producing numbers from text. That leads me to my main concern: by reducing the data to counts, you lose some of the rich detail and subtle nuances that are the hallmarks of qualitative data. However, most evaluation studies use mixed methods to provide a complete picture of their programs. In that spirit, you can simply keep track of time segments that contain particularly great quotes and stories, then transcribe them and include them in your project report. They will nicely complement the findings from your RITA analysis.

Here is the full citation for the Neal et al. article, which provides excellent instructions for conducting the RITA process.

Neal JW, Neal ZP, VanDyke E, Kornbluh M. Expediting the analysis of qualitative data in evaluation: a procedure for the rapid identification of themes from audio recordings (RITA). American Journal of Evaluation. 2015; 36(1): 118-132.







Designing Questionnaires for the Mobile Age

Friday, July 10th, 2015

How does your web survey look on a handheld device?  Did you check?

The Pew Research Center reported that 27% of respondents to one of its recent surveys answered using a smartphone. Another 8% used a tablet. That means over one-third of participants used handheld devices to answer the questionnaire. Lesson learned: Unless you are absolutely sure your respondents will be using a computer, you need to design with mobile devices in mind.

As a public opinion polling organization, the Pew Research Center knows effective practices in survey research. It offers advice on developing questionnaires for handhelds in its article "Tips for Creating Web Surveys for Completion on a Mobile Device." The top suggestion is to be sure your survey software is optimized for smartphones and tablets. The OERC uses SurveyMonkey, which meets this criterion. Many other popular web survey applications do as well. Just be sure to check.

However, software alone will not automatically create surveys that are usable on handheld devices. You also need to follow effective design principles. As a rule of thumb, keep it simple. Use short question formats. Avoid matrix-style questions. Keep the survey short. And don't get fancy: questionnaires with logos and icons take longer to load on smartphones.

This article provides a great summary of tips to help you design mobile-device friendly questionnaires. My final word of advice? Pilot test questionnaires on computers, smartphones, and tablets. That way, you can make sure you are offering a smooth user experience to all of your respondents.



Telling Good Stories about Good Programs

Monday, June 29th, 2015

Sometimes our program successes are a well-kept secret, hidden deep in our final reports under pages of statistics, tables, and descriptive details. There is a way to shine a stronger light on positive program impacts: program success stories. These are short (1-2 page) narratives that are designed to educate policy makers, attract partners, and share effective practices among colleagues.

The Centers for Disease Control and Prevention deserves credit for leading a program success story movement within the public health sector. You can find lots of resources for developing program success stories at the CDC's website. A quick Google search will turn up many success story web pages from public health departments, such as the three listed below:

If you want to create success stories for your program or organization, you need to start with a plan. You want to establish a routine to collect information in a timely manner. To get started, check out the CDC Division of Oral Health’s Tips for Writing an Effective Success Story. For more details, the CDC offers the workbook Impact and Value: Telling Your Program’s Story. The CDC Division of Adolescent and School Health also has a how-to guide for writing success stories: How to Develop a Success Story. Finally, you might find this Success Story Data Collection Tool helpful for organizing and writing your program story.  A data collection sheet could be particularly useful if multiple team members are involved in collecting success story data. The data collection tool is available in PDF or Word formats.



Funded by the National Library of Medicine under contract # HHS-N-276-2011-00008-C.