
NEO Shop Talk

The blog of the National Network of Libraries of Medicine Evaluation Office

Archive for August, 2015

Simply Elegant Evaluation: Appreciative Inquiry at NN/LM MAR

Friday, August 28th, 2015


Kate Flewelling is the Outreach Coordinator for the National Network of Libraries of Medicine’s Middle Atlantic Region (NN/LM MAR), which is located at the University of Pittsburgh Health Sciences Library.  For those unfamiliar with the NN/LM, libraries and organizations join the network to help promote health information access and use.  The program is funded by the NIH National Library of Medicine, and NN/LM MAR is one of eight regional medical libraries that coordinate a region of the network. These eight health sciences libraries partner with member organizations and sometimes fund their health information outreach activities.

After attending an NN/LM Outreach Evaluation Resource Center training session, Kate was inspired to conduct an Appreciative Inquiry (AI) evaluation project to explore how NN/LM MAR could attract health professionals into the NN/LM and provide them with useful services.  In March 2015, she conducted eight 30-minute interviews with health professionals. She recruited interviewees from among health professionals who served on a special NN/LM MAR advisory committee or represented health organizations that received funding from NN/LM MAR. She chose interviewees from three states where NN/LM MAR works (Pennsylvania, New York, and Delaware), with representation from organizations in both rural and urban areas. Her interview guide was modeled after one presented in an OERC blog post.

Kate agreed to talk with the OERC to share her experience using Appreciative Inquiry.

OERC: What motivated you to do an Appreciative Inquiry project?

Kate: "I wanted to see why health professionals got involved with us and have been so committed to us. We wanted to come up with selling points to tell other potential health professional members why they should join our network."

Kate also chose an AI approach because she wanted stories, not numbers.  She was working with a small but diverse group of representatives, so interviews seemed to be a better approach than surveys for getting unique perspectives. She also believed an AI assessment was simple enough to be completed in about a week. In fact, she completed all eight interviews in eight days.


OERC: Were you concerned that the responses had an overly positive bias?

Kate: “I only asked people who loved us, but they also know us. So they have an idea of what we’re doing and a much broader understanding of what we do with outreach because they hear about the whole outreach program. But I got really good feedback. Not criticism, but stuff we could do to improve our services.”

In AI, it is not unusual to interview an organization’s champions, because they often can provide the most informed advice about improving a program. Kate understood that her interviewees had favorable opinions about NN/LM MAR, but she said her interviews still identified holes in their outreach efforts to health professionals. They provided good advice on how to attract other health professionals to the network.

OERC: What did you learn from the study?

Kate: "The experience was great! It gave me good ideas. I realized we weren't using them [health professional colleagues] as much as we could. They told me, 'I will pass on whatever you need me to pass on.' It gave me great ideas for how to use them, use their connections, and develop targeted outreach materials and messages for special audiences."

She realized that NN/LM MAR could send regular postings to the health representatives, just as they do to health sciences librarians.  The postings just needed to contain more context so that they were targeting a public health or clinical audience.

Kate: “The project also made me realize how far [NN/LM MAR’s] reach has gone in the past four years…It felt like, during our first and second year, throwing spaghetti on the wall to see if it was working with health professionals. But we were trying to make the connections. Now we know, for our most engaged people, what they value about their relationship with us.”

Most of the staff joined the NN/LM MAR in 2011, when University of Pittsburgh Health Sciences Library was awarded the contract to coordinate the Middle Atlantic Region. So the AI project was a member check with their public health and clinical health partners, to see how well the relatively new program was meeting their needs.  Before the AI project, Kate said she knew what NN/LM MAR staff was getting from their relationships with health professionals.  Afterwards, she understood what the health professionals were getting from NN/LM MAR.

OERC: How did you use the information you gathered?

Kate: “Just starting to talk to people at exhibits, I have a sense of what’s going to grab them.”

Kate developed a brochure targeted to health professionals, with a summary of NN/LM selling points on the front (gleaned from her AI interviews) and resources of interest on the back. She plans to share the brochure with the other seven NN/LM regional medical libraries. She also believes the NN/LM MAR staff will tap into this information in the future when they plan programs for health professionals.


Improving Your Data Storytelling in 30 Days

Friday, August 21st, 2015

Here are some more great techniques to help with telling a story to report your evaluation data so it will get the attention it deserves.

Juice Analytics has a truly wonderful collection of resources in a guide called "30 Days to Data Storytelling." With assignments of less than 30 minutes a day, this guide links to data visualization and storytelling resources from sources as varied as Pixar, Harvard Business Review, Ira Glass, the New York Times, and Bono (yes, that Bono).

The document is a checklist of daily activities lasting no longer than 30 minutes per day. Each activity is either an article to read, a video to watch, or a small project to do.

The resources answer valuable questions like:

  • What do you do when you’re stuck?
  • How do you decide between visual narrative techniques?
  • Where can you find examples of using data visualization to tell a story?

Soup Up Your Annual Reports with Calculator Soup

Friday, August 14th, 2015

Summer is annual report time for our organization. Sometimes when I’m putting together my bulleted list of accomplishments for those reports, I feel as though our major wins get lost in the narrative. So I recently turned to an online calculator to help me create better metrics to talk about our center’s annual wins.

One of our objectives for the year was to increase participation in our evaluation training program. We developed new webinars based on our users’ feedback and also increased promotion of our training opportunities. The efforts paid off: training session attendance increased from 291 participants the previous year to 651 this year. Now that is a notable increase, but the numbers sort of disappear into the paragraph, don’t they? So I decided to add a metric to draw attention to this finding: Our participation rate increased 124% over last year’s attendance. Isn’t “percent increase” a simpler and more eye-catching way to express the same accomplishment?

Doing this extra analysis seems simple, but it takes time and gives me angst because it usually requires manual calculation. First I have to look up the formula somewhere. Then I have to calculate the statistic. Then I calculate it again, because I don’t trust myself. Then I calculate it again out of pure obsessiveness.
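
If you would rather script the check than repeat it by hand, percent change is a one-line formula: ((new − old) / old) × 100. Here is a minimal sketch in Python (the function name is mine; the attendance numbers are the ones from the report above):

```python
def percent_change(old, new):
    """Percent change from old to new: ((new - old) / old) * 100."""
    if old == 0:
        raise ValueError("old value must be nonzero")
    return (new - old) / old * 100

# Training attendance: 291 participants last year, 651 this year.
print(round(percent_change(291, 651)))  # -> 124
```

Rounding to the nearest whole percent gives the 124% figure quoted above.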

That’s why I love online calculators. Once I find one I like and test it for accuracy, I bookmark it for future use. From then on, I let the calculator do the computation because it is infinitely more reliable than I am when it comes to running numbers.

One of my favorite sites for online calculators is Calculator Soup, because it has so many of them. You may not ever use 90% of its calculators, but who knows when you might need to compute someone’s age from a birth date or convert days to hours. The calculators also show you the exact steps in their calculations. This allows you to check their work. You also can find formulas that you then can apply in an Excel spreadsheet.

One word of advice: test a calculator for accuracy before adopting it. I always test a new calculator to be sure the designers knew what they were doing. For Calculator Soup, I can vouch for the percent change and the mean/median/mode calculator. If I use any others at that site, I’ll test them as well. I’ll create an easy problem that I can solve manually and make sure my result matches the calculator’s.

If you want to see what Calculator Soup has to offer, check out their calculator index.


A Rainbow Connection? The Evaluation Rainbow Framework

Friday, August 7th, 2015

Once you start looking online for help with your evaluation project, you will find a veritable storm of evaluation resources out there. There are so many that it can be confusing to choose the best ones for your needs. But don’t worry: once you’ve looked at this online tool, you will find the rainbow of hope that follows the storm (okay, that was pretty cheesy – stay with me, it gets better).

A group of evaluators from all over the world created a website called BetterEvaluation.org for the purpose of organizing and sharing useful online evaluation resources. The framework they created to organize resources is called the Rainbow Framework because it divides the world of evaluation into seven different “clusters” which are delineated by rainbow colors.  Each cluster is then broken down into a number of tasks, and each task broken down into options and resources.

Here is an example of the Rainbow Framework in action. Clicking on the yellow “Describe” category opens a window on the right that lists seven tasks: 1) Sample; 2) Use measures, indicators, or metrics; 3) Collect and/or retrieve data; 4) Manage data; 5) Combine qualitative and quantitative data; 6) Analyze data; and 7) Visualize data.

When you click on a specific task, a page listing a variety of Options and Resources will open.


BetterEvaluation made eight 20-minute “coffee break” webinars in conjunction with the American Evaluation Association (AEA) that you can watch for free on the BetterEvaluation website. Each webinar describes a cluster, and there is one overview webinar. The webinars are two years old, so the current image of the rainbow looks a little different from the one in the videos, but the content is still relevant. Here is a link to the webinar series: http://betterevaluation.org/events/coffee_break_webinars_2013

The Rainbow Framework does more than just organize resources. Here are some reasons you might want to use this Framework.

1) Design and plan an evaluation

2) Check the quality of an ongoing evaluation

3) Commission an evaluation – it helps you formulate what is important to include when you commission an evaluator and then when you assess the quality of the proposals

4) Embed stakeholder participation thoughtfully throughout the evaluation

5) Develop your evaluation capacity – lifelong learning – to fill gaps in your knowledge

So, somewhere over the rainbow, your evaluation skies may be blue…

Last updated on Monday, June 27, 2016

Funded by the National Library of Medicine under Contract No. UG4LM012343 with the University of Washington.