
NEO Shop Talk

The blog of the National Network of Libraries of Medicine Evaluation Office

Worksheets and Checklists to Help with Evaluation

You may already know that the NEO offers booklets that work through some basic evaluation processes, called Planning and Evaluating Health Information Outreach Projects. You can view them as a PDF or in HTML, or you can order a print version (there are limited copies of these left, so no promises). But I think the under-marketed gems in these booklets are the checklists and worksheets at the end of each one. And the ones in the HTML version of the booklets are Word docs that you can download, modify if you want, and use.

Covers of the three Evaluation Booklets

For example, let's say you'd like to create a survey to find out if you've reached your project's outcomes. A process for this is explained in Booklet 3: Collecting and Analyzing Evaluation Data. At the end of this booklet is a blank worksheet called "Planning a Survey" that you can download and use to work through writing your survey questions. Along with that, there's also an example of a filled-out worksheet, based on a realistic scenario, that helps demonstrate how the worksheet can be used.

The importance of checklists in improving outcomes is underscored in Dr. Atul Gawande's book The Checklist Manifesto. While he's mostly talking about medical scenarios, the same holds for evaluation. Let's face it: even if you feel fairly confident in evaluation, there are a lot of little things to remember, especially if you don't do it all the time.

In Booklet 1: Getting Started with Community-Based Outreach, the checklist items are sorted into the three categories “Step 1 – Get Organized,” “Step 2 – Gather Information,” and “Step 3 – Assemble, Interpret and Act.”  These are the same categories as the chapters in the book.  So for example, one of the items under “Get Organized” is “Gather primary data to complete the picture of your target community.”  If you’d like a reminder or some suggestions of how to go about this, go to the chapter heading “Step 2 – Gather Information” where you can find a list of ways to gather primary data.  These checklists can also be downloaded as a Word document and adapted to your own needs.

I hope this isn’t too meta, but while you’re using evaluation to help you reach your project’s outcomes, you also have a personal outcome of doing a good job with your evaluation plan!  So when you head into your evaluation projects, don’t forget your checklists to make sure all of your evaluation outcomes are successful.

 

Steering by Outcomes: Begin with the End in Mind

If you don’t know where you’re going, you might not get there – Yogi Berra

Toy car sitting on a road map

Next week, Karen and I will be facilitating an online version of one of NEO's oldest workshops, Planning Outcomes-Based Outreach Projects, for the Health Science Information Section of the North Dakota Library Association. The main tool we teach in this workshop is the program logic model, but our key takeaway message is this: figure out where you're going before you start driving.

If you drive to a new place, your navigator app will insist on a destination, right?  Well, I’m like an evaluation consulting app: those who work with me on evaluation planning have to define what they hope to accomplish before we start designing anything.

In fact, I get positively antsy until we nail down the desired end results.  If I’m helping a colleague develop a needs assessment, I want to know how he or she plans to use the data.  To design a program evaluation process, I have to know how the project team defines success. When consulting with others on survey design, I help them determine how each question will provide them with actionable information.

My obsession with outcomes crept into my personal life years ago. Before I sign up for continuing education or personal development workshops, I consider how they will change my life.  When my husband and I plan vacations, we talk about what we hope to gain on our trip. Do we want to connect with friends? See a new landscape? Catch up on some excellent Chicago comedy? Outcomes-thinking may be an occupational hazard for evaluation professionals. Case in point: Have you seen Karen Vargas’s birthday party logic model?

Top 5 Reasons to Love Outcomes

So how did I become an outcomes geek? Here are the top five reasons:

  • Outcomes are motivating: Activities describe work, and who among us needs more work? Outcomes, on the other hand, are visionary. They allow you to imagine and bask in a job well done. Group discussions about outcomes are almost always more uplifting and enthusiastic than discussions about project implementation. Plus, you will attract more key supporters by talking about the positive benefits you hope to attain.
  • Outcomes help you focus: Once you have determined what success looks like, you’ll think more carefully about how to accomplish it.
  • Outcomes provide a reality check: Once you know what you want to accomplish, you’ll think more critically about your project plans. If the logical connection doesn’t hold, you can course-correct before you even start.
  • Planned outcomes set the final scene for your project story: Ultimately, most of us want or have to report our efforts to stakeholders, who, by definition, have a vested interest in our program. Project stories, like fairy tales, unfold in three acts: (Act 1) This is where we started; (Act 2) This is what we did; (Act 3) This is what happened in the end.  Program teams notoriously focus on collecting evaluation data to tell Act 2, while stakeholders are more interested in Act 3.  However, if you articulate your outcomes clearly from the start, you are more likely to collect good data to produce a compelling final act.
  • Identifying expected outcomes helps you notice the unexpected ones. Once you start monitoring for planned outcomes, you’ll pick up on the unplanned ones as well. In my experience, most unplanned outcomes are sweet surprises: results that no one on the team ever imagined in the planning phase.  However, you also may catch the not-so-great outcomes early and address them before any real damage is done.

How to Steer by Outcomes 

When I work with individuals or small project teams, here are the questions we address when trying to identify program outcomes:

  • What will project success look like?
  • What will you observe that will convince you that this project was worth your effort?
  • What story do you want to tell at the end of this project?
  • Who needs to hear your story and what will they want to hear?

These questions help small project teams identify outcomes and figure out how to measure them. If you want a larger group to participate in your outcomes-planning discussion, consider adapting the Nine Whys exercise from Liberating Structures.

Once the outcomes are identified, you’re ready to check the logical connection between your program strategies and your planned results. The logic model is a great tool for this stage of planning. The NEO’s booklet Planning Outcomes-Based Programs provides detailed guidance for how to create project logic models.

Yogi Berra famously said “When you come to a fork in the road, take it.”  I would paraphrase that to say “When you come to a fork in the road, check your outcomes and proceed.”

Summer Evaluation Institute Registration is Open!

AEA Summer Institute Logo

Whether you call what you do evaluation or assessment, the American Evaluation Association’s Summer Evaluation Institute is an amazing event, sure to teach you something and give you a different perspective on the job you do.

The institute, held June 26-29, 2016, is made up of half-day, hands-on training sessions taught by leading professionals in the field of evaluation. It's attended by people from all over the world who want to improve their skills in different aspects of the evaluation process.

Why would you as a librarian want to attend the AEA Summer Evaluation Institute?   Here are some ideas:

Let’s say you were in charge of eliminating much of your print journal collection and increasing your online journals, and you want to figure out what data you should collect that will show that your users are still getting what they need. There’s a great program called “Development and Use of Indicators for Program Evaluation” by Goldie MacDonald that covers criteria for selection of indicators. Goldie MacDonald is a Health Scientist in the Center for Global Health at the U.S. Centers for Disease Control and Prevention (CDC) and a dynamic speaker and trainer.

Are you taking on the planning of an important project, like finding ways to ensure that your hospital administration values the contributions of your library? Logic models are great planning tools, but they are also useful for integrating evaluation plans and strategic plans. How would you like to take a four-hour class on logic models taught by the Chief Evaluation Officer at the CDC, Tom Chapel?

What if you’re a liaison librarian to your university’s biology department and you’re looking for ways to improve collaboration with the faculty? There’s a program called Evaluating and Improving Organizational Collaboration that gives participants the opportunity to increase their capacity to quantitatively and qualitatively examine the development of inter- and intra-organizational partnerships. It’s taught by Rebecca Woodland, Chair of the Department of Educational Policy and Administration at University of Massachusetts Amherst (and recognized as one of the AEA’s most effective presenters).

Maybe you've been responsible for a program at your public library training the community to use MedlinePlus for their health information needs. You've collected a lot of data showing the success of your programs, and you want to make sure your stakeholders take notice of it. How about giving them an opportunity to work with the data themselves? There's a program called A Participatory Method for Engaging Stakeholders with Evaluation Findings, taught by Adrienne Adams of Michigan State University.

This is only a small sampling of the great workshops at the Summer Evaluation Institute.

For those of you who don’t know much about the American Evaluation Association: The AEA is an international professional association devoted to the application and exploration of program evaluation, personnel evaluation, technology, and many other forms of evaluation. Evaluation involves assessing the strengths and weaknesses of programs, policies, personnel, products, and organizations to improve their effectiveness. AEA has approximately 7000 members representing all 50 states in the United States as well as over 60 foreign countries.

Cindy and I will be there – we hope to see you there too!

 

Data Party for Public Librarians

The Engage for Health project team from left to right: Lydia Collins, Kathy Silks, Susan Jeffery, Cindy Olney

Last week, I threw my first data party. I served descriptive statistics and graphs; my co-hosts brought chocolate.

I first learned about data parties from evaluation consultant Kylie Hutchinson’s presentation It’s A Data Party that she gave at the 2016 American Evaluation Association Conference. Also known as data briefings or sense-making sessions, data parties actively engage stakeholders with evaluation findings.

Guest List

My guests were librarians from a cohort of public libraries that participated in the Engage for Health project, a statewide collaboration led by the NN/LM Middle Atlantic Region (MAR) and the Pennsylvania Library Association (PaLA). The NN/LM MAR is one of PaLA's partners in PA Forward, a statewide initiative to engage libraries in activities that address five types of literacy. The project team was composed of Lydia Collins of NN/LM MAR (which also funded the project), Kathy Silks of PaLA, and Susan Jeffery of the North Pocono Public Library. I joined the team to help them evaluate the project and develop reports to bring visibility to the initiative. Specifically, my charge was to use this project to provide experiential evaluation training to the participating librarians.

Librarians from our 18 cohort libraries participated in all phases of the planning and evaluation process. Kathy and Susan managed our participant recruitment and communication. Lydia provided training on how to promote and deliver the program, as well as assistance with finding health care partners to team-teach with the librarians. I involved the librarians in every phase of the program planning and evaluation process. We met to create the project logic model, develop the evaluation forms, and establish a standard process for printing, distributing, and returning the forms to the project team. In the end, librarians delivered completed evaluation forms from 77% of the adult participants in their Engage for Health training sessions.

What We Evaluated

One objective of PA Forward is improving health literacy, so the group's outcome for Engage for Health was to empower people to better manage their health. Specifically, we wanted them to learn strategies that would lead to more effective conversations with their health care providers. Librarians and their health care partners emphasized strategies such as researching health issues using quality online health resources, making a list of medications, and writing down questions to discuss at their appointments. We also wanted them to know how to use two trustworthy online health information sources from the National Library of Medicine: MedlinePlus and NIHSeniorHealth.

Party Activities

Sharing with Appreciative Inquiry. The data party kicked off with Appreciative Inquiry interviews. Participants interviewed each other, sharing their peak experiences and what they valued about those experiences. Everyone then shared their peak experiences in a large group. (See our blog entries here and here for detailed examples of using Appreciative Inquiry.)

Data sense-making: Participants then worked with a fact sheet of graphs and summary statistics compiled from the session evaluation data. As a group, we reviewed our logic model and discussed whether our data showed that we achieved our anticipated outcomes. The group also drew on both the fact sheet and the stories from the Appreciative Inquiry interviews to identify unanticipated outcomes. Finally, they identified metrics they wished we had collected. What was missing?

Consulting Circles: After a morning of sharing successes, the group got together to help each other with challenges. There were three challenge areas the group wanted to address: integration of technology into the classes, finding partners from local health organizations, and promotional strategies. No area was a problem for all librarians: some were quite successful in a given area, while others struggled. The consulting groups were a chance to brainstorm effective practices in each area.

Next steps: As with most funded projects, both host organizations hoped that the libraries would continue providing health literacy activities beyond the funding period. To get the group thinking about program continuity, we ran a 1-2-4-All discussion about next steps. Participants first identified the next steps they would take at their libraries, then provided suggestions to NN/LM MAR and PaLA on how to support their continued efforts.

Post Party Activities

For each of the four party activities, a recorder from each group took discussion notes on a worksheet developed for the activity, then turned it in to the project team. We will incorporate this group feedback into written reports that are currently in progress.

If you are curious about our findings, I will say generally that our data supports the success of this project.  We have plans to publish our findings in a number of venues, once we have a chance to synthesize everything.  So watch this blog space and I’ll let you know when a report of our findings becomes available.

Meanwhile, if you are interested in reading more about data parties, check out this article in the Journal of Extension.

 

Diversity, Texas Libraries and Participatory Data Collection

On April 20, Cindy Olney and I facilitated a program for the Texas Library Association Annual Conference called Open Libraries! Making Your Library Welcome to All. The program was sponsored by TLA's Diversity and Inclusion Committee, and the plan was for attendees to work cooperatively to discover ways to ensure that people of diverse cultures, languages, ages, religions, sexual orientations, physical abilities, and other backgrounds feel welcome at the library. The committee wanted to draw on the wealth of TLA librarians' experiences, so Cindy was invited to gather as much information from the attendees as possible. As co-chair of the TLA Diversity and Inclusion Committee, I co-facilitated the event.

The process used was a modified 1-2-4-All process, which you can find on the Liberating Structures website. Our primary question was "What can my library do to become more welcoming to all people?" We asked everyone in the room to brainstorm together all the different parts of a library that could be modified to make it more welcoming (e.g., reference services, the facility, etc.). We wanted to be sure that everyone thought as broadly and creatively as possible.

TLA Diversity Data Collection Program 2016

The discussion process actually had two parts.  For part one, we gave everyone two minutes to write as many ideas as they could on index cards (one idea per card).  Then we asked people to take two minutes to share their ideas with a partner.  They then shared their ideas with the entire table (up to 10 participants). The group then chose and wrote down the three best ideas and turned them in to the moderators.  Participants were instructed to leave their index cards with their ideas piled in the middle of their tables.

Here are some of the ideas that were generated through this discussion:

  • Welcome signs in different languages
  • Signage
  • Physical spaces – access to mobility

As you can see, the responses were fairly non-specific. We wanted richer descriptions of modifications to programs or services. So part two of the process involved asking participants to develop more detailed plans for making their libraries more welcoming. Using a method involving dog, cat, and sea creature stickers, we moved participants randomly to new tables so that they ended up with a new group of colleagues. They then chose a partner from their new table and, as a pair, randomly drew one idea card from the piles generated in part one of the process. They worked on a plan for that idea for eight minutes. When the moderator called time, they pulled another card and worked on a plan for a second idea. In the final eight minutes of the session, we asked each table to share its ideas with the entire group.

The plans in part two were better articulated and more detailed than those we got in part one. Here are some examples of the kinds of results we got from that exercise:

  • Signage: make it clearer and more colorful; add signage in different languages or use digital signage.
  • Language materials specific to the community and programming in the various languages spoken in the community; ESL classes offered in partnership with community colleges.
  • Invite representatives from ADA/disability advocates to give suggestions on making library desks/areas more accessible.

The whole process was completed in a 50-minute conference program session. The other Diversity and Inclusion co-chair, Sharon Amastae from El Paso, TX, and I were both impressed with the energy and enthusiasm among the attendees in the room.

The results of this data gathering event will be communicated to the TLA membership.  When that project is completed, we’ll let you know here on the NEO Shop Talk blog!

Photo credit: Esther Garcia

 

Meet the New NEO

Cindy Olney and Karen Vargas

Heads up, readers. Look for a name change to our blog on May 1.

That's the day the NN/LM Outreach Evaluation Resource Center will be replaced by the new NN/LM Evaluation Office, a.k.a. NEO. The NEO will have the same staff (Karen Vargas and Cindy Olney) and the same location (headquartered at the University of Washington Health Sciences Library) as the OERC, but it has a new and evolving role in the National Network of Libraries of Medicine (NN/LM).

This time last year, Karen Vargas (evaluation specialist) and I (acting assistant director) began writing a proposal for this new NN/LM office. The University of Washington Health Sciences Library submitted our proposal as part of its larger one for a five-year cooperative agreement to fund the NN/LM Pacific Northwest Regional Medical Library. (Spoiler alert: UW HSL won the award. See the announcement here: https://www.nlm.nih.gov/news/nlm-rml-coop-agreement-2016.html.)

Our new name reflects one of a number of changes in NN/LM’s funding, organization, and management.  Leaders of the NN/LM are re-envisioning what it means to be a national network of organizations that work together to advance the progress of medicine and public health through access to health information. The NEO staff will contribute our evaluation expertise to help the leaders focus on key outcomes and measure progress and accomplishments.

The vision set forth in our proposal is to influence NN/LM’s use of evaluation to engage and learn about its programs, make good decisions, and enhance the visibility of its successes. Our proposed strategies were organized around five main aims. First, we will support the NN/LM leadership’s ability to make data-driven decisions. Second, we will collaborate with the regional medical libraries to increase use of evaluation in their regions.  Third, we will provide quality evaluation training opportunities to build evaluation skills of network members.  Our fourth aim is to increase visibility of NN/LM’s program successes.  Lastly, we plan to provide new written materials about effective and emerging evaluation practices and trends.

The exact nature of our services will be determined by the needs of the NN/LM as we all develop new approaches to working together. We do know that the NEO’s scope will expand beyond health information outreach evaluation to include other areas, such as organizational development and internal services to users and clients. We also want to put more emphasis on evaluation use, both for decision-making and advocating program value to stakeholders. As a teaser, Karen and I plan to develop our own expertise in evaluation reporting, participatory evaluation methods, and digital story-telling. (In fact, Karen’s blog post next week will describe our  recent participatory evaluation experience at the Texas Library Association 2016 meeting.)

The most important news for our blog readers, though, is that our URL will not change for the foreseeable future. In spite of the coming name change, you will still find our weekly blog posts here. So "see" you next week.

 

Our Favorite Evaluation Blogs

successful business woman on a laptop

We really don’t want you to stop reading our blog!  But April is a really busy month for us, so this week we’ll make sure you get your evaluation buzz by letting you know of some other great evaluation blogs.

AEA365 – This is the blog of the American Evaluation Association.  This blog shares hot tips, cool tricks, rad resources, and lessons learned by different evaluators every single day!

Better Evaluation Blog – Better Evaluation is an international collaboration to improve evaluation by sharing information about evaluation methods, processes and approaches. The blog has posts that provide new perspectives  about particular issues in evaluation.

EvaluATE – EvaluATE is the evaluation resource center for the National Science Foundation’s Advanced Technological Education program. Their blog has lessons learned, tips, or techniques on evaluation management, proposal development, evaluation design, data collection and analysis, reporting, and more.

Evergreen Data – Stephanie Evergreen writes a blog about data visualization.  For the record, she has written the book(s) on data visualization, Effective Data Visualization and Presenting Data Effectively.

Visual Brains – Sara Vaca writes about new techniques and ways of visualizing data, information, and figures to communicate evaluation findings and to improve evaluation use, but also for use in other stages such as planning and analyzing.

The OERC Is On The Road in April

 

A young boy having fun driving his toy car outdoors.

The OERC staff will be putting on some miles this month. Karen and Cindy are slated to present and teach at various library conferences and meetings. If you happen to be at any of these events, please look for us and say “hello.”  Here is the April itinerary: 

Cindy will participate in a panel presentation titled “Services to Those Who Serve: Library Programs for Veterans and Active Duty Military Families” at the Public Library Association’s 2016 conference in Denver. The panel presentation will be held from 10:45 – 11:45 am, April 7. She and Jennifer Taft, who is now with Harnett County Public Library,  will present a community assessment project they conducted for the Cumberland County Public Library, described here in the November/December 2014 edition of Public Libraries.

Karen will conduct the OERC workshop “Planning Outcomes-Based Outreach Programs” on April 8 for the Joint Meeting of the Georgia Health Sciences Library Association and the Atlanta Health Science Library Consortium in Decatur, GA. This workshop teaches participants how to develop logic models for both program and evaluation planning.

Cindy and Karen will facilitate two different sessions for the Texas Library Association's annual conference, both on April 20 in Houston. One session will be a large-group participatory evaluation exercise to gather ideas from the TLA membership about how libraries can become more welcoming to diverse populations. The second is an 80-minute workshop on participatory evaluation methods, featuring experiential learning exercises about Appreciative Inquiry, 1-2-4-All, Photovoice, and Most Significant Change methods.

Cindy will join the NN/LM Middle Atlantic Region and the Pennsylvania Library Association to talk about evaluation findings from a collaborative health literacy effort conducted with 18 public libraries across the state. The public libraries partnered with health professionals to run health literacy workshops targeted at improving consumers’ ability to research their own health concerns and talk more effectively with their doctors. The public librarians involved in this initiative worked together to design an evaluation questionnaire that they gave to participants at the end of their workshops. The combined effort of the cohort librarians allowed the group to pool a substantial amount of evaluation data. Cindy will facilitate a number of participatory evaluation exercises to help the librarians interpret the data, make plans for future programming, and develop a communication plan that will allow them to publicize the value of the health literacy initiative to various stakeholders. The meeting will be held April 29 in Mechanicsburg, PA.

In addition, Cindy will be attending a planning meeting at the National Library of Medicine in Bethesda in mid-April with Directors, Associate Directors, and Assistant Directors from the NN/LM. Our library, the University of Washington Health Sciences Library, will receive cooperative agreements for both the NN/LM Pacific Northwest Regional Medical Library and the NN/LM Evaluation Office, which will replace the OERC on May 1. (You can see the announcement here.)  We will let you know more about the NEO later, except to say that we will be moving into the same positions in the NEO that we hold with the OERC. You have not heard the end of us!

Although we will be on the road quite a bit, rest assured we will not let our loyal readers down. So please tune in on Fridays for our weekly posts.

 

What chart should I use?

It's time to put your carefully collected data into a chart, but which chart should you use? And how do you set it up from scratch in your Excel spreadsheet or PowerPoint presentation if you aren't experienced with charts?

Here's one way to start: go to the Chart Chooser at Juice Analytics. It lets you pick a chart type and then download it as an Excel or PowerPoint file. Then you can simply put in your own data and modify the chart the way you want to.

They also have a way to narrow down the options. As a hypothetical example, let's say a fictional health science librarian, Susan, is in charge of the social media campaign for her library. She wants to compare user engagement for her Twitter, Facebook, and blog posts to see if there are any patterns in their trends. Here are some fictional stats showing how difficult it is to spot trends in the raw data.

Monthly stats of blog, Twitter and Facebook engagement

Susan goes to the Juice Analytics Chart Chooser and selects from the options given (Comparison, Distribution, Composition, Trend, Relationship, and Table). She selects Comparison and Trend, and then also selects Excel, because she is comfortable working in Excel. The Chart Chooser offers two options: a column chart and a line chart. Susan thinks the line chart will work best for her, so she downloads it (by the way, you can download both and see which one you like better). After replacing the sample data with her own and making a couple of other small design changes, Susan has her chart in Excel. It shows that user engagement with blog posts and Facebook posts rises and falls in the same pattern, but that Twitter engagement does not follow suit.

Line chart of Blog Twitter and Facebook engagment

By the way, the total time spent selecting the chart, downloading it, putting in the fictional data, and making chart adjustments was less than 15 minutes.  Is it a perfect chart?  Given more time, I would suggest adjusting some more of the chart features (see our January 29, 2016 post The Zen Trend in Data Visualization). But it was a very easy way to pick out a chart that allowed Susan to learn what she needed to from the data.

One thing I want to point out is that this is not a complete list of charts.  This is a good starting place, and depending on your needs, this might be enough. But if you get more involved in data, you might want to take a look at small multiples, lollipop charts, dot plots, and other ways to visualize data.  Check out Stephanie Evergreen’s EvergreenData Blog  for more chart types.

 

Inspirational Annual Reporting with Appreciative Inquiry

Hiker enjoying the view along the Iceberg Lake trail in Glacier National Park

Do you have to file annual reports? How much do you love doing them?

Did I hear someone say “no comment?”

In January, I challenged the outreach librarians of the National Network of Libraries of Medicine Greater Midwest Region (NN/LM GMR) to experiment with a reflective exercise designed to add some inspiration to their annual reporting. The setting was a monthly webinar attended by librarians who led outreach activities at their libraries to promote health information access and use. Because their libraries received funding from the NN/LM GMR, they were required to submit annual reports for their funded activities.

My charge was to teach this group something about evaluation. In response, I presented them with this short (about 15 minute) exercise to be used when they began preparing their annual reports.

When preparing your report, answer these questions. Then write a short paragraph based on your answers and add it to your annual report:

  1. Describe one of the best experiences you had this year conducting outreach for the NN/LM.
  2. What do you value most about that experience?
  3. What do you wish could happen so that you had more experiences like this?

You may recognize these as the three signature questions of the basic Appreciative Inquiry (AI) interview. Appreciative Inquiry is a practice of influencing organizational change by identifying peak experiences and discovering ways to build on them. The book Reframing Evaluation through Appreciative Inquiry (Preskill and Catsambas, Sage, 2006) provides descriptions and examples of how to apply AI to every part of the evaluation process.

My partners for this webinar were host Jacqueline Leskovec, Outreach, Planning and Evaluation Coordinator, and presenter Carmen Howard, who is the Regional Health Sciences Librarian and Visiting Assistant Professor from UIC Library of the Health Sciences Peoria. Carmen headlined the webinar with her presentation about the Nursing Experts: Translating the Evidence (NExT) Guide, which provides resources on evidence-based practice to nurses. Good sport that she was, Carmen helped me demonstrate the exercise to our audience by participating in an AI interview about her outreach project. The outreach librarians then brainstormed ways to use the three questions to prepare their own NN/LM reports. We also talked about how to add their reflective statements to their annual reports, which are entered into an online system.

Soon after that webinar, Carmen wrote an entry about her experience using the three questions that appeared in the NN/LM GMR’s blog The Cornflower. Here is my favorite quote from her entry:

“These three simple questions which only take about 10-15 minutes to answer forced me to stop and reflect on the NExT project. Rather than just being focused on what was next on the to-do list, I was looking back on what had already been accomplished, and better yet, I was thinking about the good stuff.”

The NN/LM GMR outreach librarians have not yet filed their 2016 annual reports, so I can’t tell you how many rose to my challenge. (This exercise was a suggestion, not a requirement.) One other outreach librarian did send an email to say she was using the three questions to have a reflective group discussion with other librarians who participate in NN/LM outreach activities.

I would like to extend the challenge to our readers who may be facing annual reports. Try this exercise and see if you don’t start thinking and writing differently about your efforts over the past year.

If you want to know more about Appreciative Inquiry, we highly recommend this source:

  • Preskill H, Catsambas TT. Reframing Evaluation through Appreciative Inquiry. Thousand Oaks, CA: Sage, 2006.

You also might be interested in the OERC’s other blog posts about Appreciative Inquiry:

If you are interested in earning some continuing education credits from the Medical Library Association while trying your hand at an Appreciative Inquiry project, read this post: Appreciative Inquiry of Oz: Building on the Best in the Emerald City

 

Last updated on Thursday, May 19, 2016

Funded by the National Library of Medicine under contract # HHS-N-276-2011-00008-C.