

NN/LM Outreach Evaluation Resource Center

Archive for the ‘Library Value’ Category

Group Learning about Evaluation

Friday, October 23rd, 2015


I recently got to participate in a very successful roundtable at a library conference. At the South Central Chapter of the Medical Library Association Annual Meeting in Little Rock, AR, I co-moderated an evaluation roundtable entitled “Library assessment: You’ve measured your success – now how do you get people to listen?” with Katie Prentice, Associate Director of the OU-Tulsa Schusterman Library.

What makes roundtables unique among educational opportunities at library conferences is that everyone can participate: unlike presentations or papers, where attendees sit and listen, a roundtable is a moderated discussion of a given topic among the people who attend. Since anyone can chime in, learning is active instead of passive.

About 25 people attended this roundtable and enthusiastically participated in a discussion about library assessment and evaluation data. Katie and I led the discussion with questions that started with what kind of data libraries collect and moved on to what libraries do with that data and how to make it work better for them. Our goal was to use our questions so that the issues and solutions came from the attendees themselves.

As an example, when we asked the question “what would you really like to know about your library and what do you dream of finding out about your users?” one hospital librarian said that she wanted to know how doctors were using the information and how it impacted the patients. Katie Prentice asked “can anyone help her with this?” and another hospital librarian responded that she sends emails to some of her doctors to ask for a sentence or two describing how the information was used.  These sentences, when collected and analyzed, could be a powerful tool to show hospital administration the importance of the library to patient outcomes.
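As a rough illustration of the “collected and analyzed” step, here is a minimal Python sketch that tallies one-sentence responses into outcome themes. The sample responses, theme names, and keyword lists are invented for the example; a real analysis would develop its coding scheme from the responses themselves.

```python
# Minimal sketch: tally one-sentence clinician feedback into outcome themes.
# The responses, themes, and keywords below are invented examples, not data
# from the roundtable.
from collections import Counter

responses = [
    "The articles helped me adjust the patient's treatment plan.",
    "Saved me at least an hour of searching before rounds.",
    "The search results confirmed my original diagnosis.",
]

# Keyword lists standing in for a coding scheme built from the actual responses.
themes = {
    "informed treatment decisions": ["treatment", "therapy", "medication"],
    "supported a diagnosis": ["diagnosis", "diagnostic"],
    "saved clinician time": ["saved", "hour", "time"],
}

counts = Counter()
for text in responses:
    lowered = text.lower()
    for theme, keywords in themes.items():
        if any(word in lowered for word in keywords):
            counts[theme] += 1

for theme, n in counts.most_common():
    print(f"{theme}: {n}")
```

Even a simple count like this can turn a folder of emails into a single sentence for administrators, such as “the library contributed to treatment decisions in this many cases this quarter.”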

Other kinds of evaluation ideas that were generated from attendees at this roundtable were:

  • using heat map software to determine where people go most often on your website (see the sketch after this list)
  • having student workers note which pieces of furniture are being used, in order to improve furniture types and placement in the library
  • using a product like Constant Contact or MailChimp to send library newsletters that include assessment data to the doctors and employees at hospitals
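For the first idea on that list, here is a minimal sketch of the non-visual half of the job: counting which pages on your site get the most traffic from an exported usage log. The file name and the page_path column are assumptions about what an analytics export might look like; a dedicated heat map tool would add the visual overlay of clicks on the page itself.

```python
# Minimal sketch (assumed CSV export): count which pages are visited most often.
# "site_usage_export.csv" and its "page_path" column are illustrative names,
# not a real product's export format.
import csv
from collections import Counter

page_counts = Counter()
with open("site_usage_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        page_counts[row["page_path"]] += 1

print("Most-visited pages:")
for path, hits in page_counts.most_common(10):
    print(f"{hits:6d}  {path}")
```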

While not all roundtables at conferences are this successful, this one demonstrated how librarians brought together in a group can learn from each other and solve problems.

Creative Annual Reports

Friday, October 9th, 2015

Ah, the annual report – at its best we expect to see a glossy booklet with pie charts, short paragraphs and some quotes. At its worst it can be pages of dry text. Our main hope with annual reports is that our stakeholders and others will read them and be impressed with the successes of our organizations.

Last month I ran across the annual report from the Nowra Public Library in New South Wales, Australia, which was so compelling and understandable that over 100,000 people have viewed it on YouTube:

Photo of librarian links to Nowra library video

Since most organizations don’t have the resources to do their own music video (e.g. singers, writers, silly costumes), I thought I would look at a few other examples to consider when it’s time to do your annual report.

One of my all-time favorites is the annual report from the Schusterman Library of The University of Oklahoma-Tulsa. Their annual report is an infographic that not only shows the data that was collected but also describes it in such a way that (1) you get a better feel for what is going on in the library, and (2) you might think, “I didn’t know they would help me with that!”  For example: “7,274 Reference questions answered in person, by phone, by email, and instant message or text on everything from ADHD and child welfare to decision trees, LEED homes, and census reporting.” It is available on their website, and the librarians at the Schusterman Library say they frequently find students looking at it.

The Michigan State University College of Education won a gold ADDY and a Best in Show Award for their 2012 Annual Report (the ADDY Awards are the advertising industry’s largest competition).  Their report featured a tri-fold, die-cut skyline that presented the college’s missions and strengths with an emphasis on “college as community.” The annual report also included a video and a website with detailed narratives that present institutional successes as personal stories.

Of course, not all institutions want an unusual annual report.  But it is important to consider the target audience.  Annual reports reach the upper administration, potential funders, and patrons of the library. The success of this year’s annual report might shape library users’ view of the library for years to come.

The OERC is on the Road Again

Friday, September 25th, 2015

The OERC is on the road again.  Today, Cindy and Beth Layton, Associate Director of the NN/LM Greater Midwest Region, are team-teaching Measuring What Matters to Stakeholders at the Michigan Health Sciences Library Association’s annual conference in Flint, MI.


This workshop covers strategies for using evaluation to enhance and communicate a library’s value to organizational decision-makers and to the stakeholders who influence those decision-makers. The workshop combines updated information with material from the NN/LM MidContinental Region’s Measuring Your Impact and the OERC’s Valuing Your Library workshops, which have been taught by a number of regional medical library staff members over the past decade.

On Saturday, Karen is presenting a brand-new workshop for the Texas Library Association’s District 8 Conference called Adding Meaning to Planning: A Step-by-Step Method for Involving Your Community in Meaningful Library Planning.


The workshop presents a method of involving community members in creating pain-free logic models, so that the long-term vision is always in sight when planning.  Karen wrote a blog entry about creating “tearless” logic models here.  This is Karen’s first experience creating and delivering a workshop that is purely about library evaluation.

The NN/LM travel season is about to go into full swing.  We know we aren’t the only ones out and about with presentations, trainings, and exhibits.  So safe travels. And we will see you in a week with another OERC blog post.

Improving Your Data Storytelling in 30 Days

Friday, August 21st, 2015

Here are some more great techniques for telling a story with your evaluation data so that it gets the attention it deserves.

Juice Analytics has a truly wonderful collection of resources in a guide called “30 Days to Data Storytelling.” With assignments of less than 30 minutes a day, this guide links to data visualization and storytelling resources from sources as varied as Pixar, Harvard Business Review, Ira Glass, the New York Times, and Bono (yes, that Bono).

The guide itself is a checklist of daily activities, each lasting no longer than 30 minutes: an article to read, a video to watch, or a small project to do.

The resources answer valuable questions like:

  • What do you do when you’re stuck?
  • How do I decide between visual narrative techniques?
  • Where can I find some examples of using data visualization to tell a story?

Infographics Basics: A Picture is Worth 1000 Data Points

Friday, April 24th, 2015

Open Access Week at University of Cape Town infographic

You’ve been collecting great data for your library, and now you have to figure out how to use it to convince someone of something – for example, how great your library is. Part of the trick is turning that data into a presentation that your stakeholders understand – especially if you are not there to explain it.  Infographics are images that make data easy to understand in a way that gets your message across.

It turns out it doesn’t have to be difficult or expensive to create your own infographics.  Last week I went to a hands-on workshop at the Texas Library Association called “Infographics: One Picture is Worth 1,000 Data Points,” taught by Leslie Barrett, Education Specialist from the Education Service Center Region 13 in Austin, TX. Using this website as her interactive “handout,” Leslie walked us through the process of creating an infographic (and as a byproduct of this great class, she also demonstrated a number of free instructional resources, such as Weebly, Padlet, and Thinglink).

Starting at the top of the page, click on anything with a hyperlink.  You will find a video as well as other “infographics of infographics,” which demonstrate how and why infographics can be used.  There are also a variety of examples to evaluate as part of the learning process.

Finally, there is information on the design process and on resources that make infographics fairly easy to create.  These resources, such as Piktochart and Easelly, offer free plans for simple graphics and experimentation.
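If you would rather stay entirely in-house, a charting library can produce the individual panels of an infographic. Below is a minimal sketch using Python’s matplotlib (my substitution, not one of the tools Leslie covered); the categories and totals are invented placeholders for your own library statistics.

```python
# Minimal sketch: one infographic-style panel built with matplotlib.
# The categories and totals are invented placeholders.
import matplotlib.pyplot as plt

categories = ["Reference questions", "Instruction sessions", "Interlibrary loans"]
totals = [7000, 300, 1000]

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(categories, totals, color="#2a6f97")
ax.set_title("What the library did this year")
ax.set_xlabel("Count")
for y, value in enumerate(totals):
    ax.text(value, y, f" {value:,}", va="center")  # label each bar with its total
fig.tight_layout()
fig.savefig("library_snapshot.png", dpi=200)  # export for the final layout
```

Export each panel as an image and assemble the panels, headline numbers, and short captions in whatever layout tool you already have.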

Leslie Barrett allowed us to share this website with you, so feel free to get started making your own infographics!

Image credit: Open Access Week at University of Cape Town by Shihaam Donnelly / CC BY-SA 3.0

New Journal: Systematic Reviews

Saturday, April 19th, 2014

Librarians’ expert searching skills provide some great opportunities for collaboration with researchers. BioMed Central’s new open access journal Systematic Reviews centers on a specialized type of expert searching that librarians can provide for their communities. More than a source of protocols and a record of others’ work, this journal has great potential for those of us in academia who want to publish articles sharing with our colleagues what we have done.

Here’s more information from the Aims and Scope:

Systematic Reviews encompasses all aspects of the design, conduct and reporting of systematic reviews. The journal aims to publish high quality systematic review products including systematic review protocols, systematic reviews related to a very broad definition of health, rapid reviews, updates of already completed systematic reviews, and methods research related to the science of systematic reviews, such as decision modeling. The journal also aims to ensure that the results of all well-conducted systematic reviews are published, regardless of their outcome.

It is a long-term goal of the journal to ensure all systematic reviews are prospectively registered in an appropriate database, such as PROSPERO, as these resources for registration become available and are endorsed by the scientific community.

Article types include:

  • Research Articles
  • Commentaries
  • Letters
  • Methodologies
  • Protocols
  • Review Updates

The editors-in-chief comprise an international group hailing from the University of Ottawa; the RAND Corporation and UCLA; and the University of York.

Take a look at this journal! It could be a source of inspiration for any librarian whose emphasis is on expert searching.

Institute for Research Design in Librarianship: 9 days in southern CA; full scholarships available

Monday, December 2nd, 2013

Do you want to learn about how your user groups and communities find and use information? Do you want to gather evidence to demonstrate that your work is making a difference?

Exciting news! You can work on these questions, and questions like them, June 16-26, 2014!

The Institute for Research Design in Librarianship is a great opportunity for an academic librarian who is interested in conducting research. Research and evaluation are not necessarily identical, although they do employ many of the same methods and are closely related. This Institute is open to academic librarians from all over the country. If your proposal is accepted, your attendance at the Institute will be paid for, as will your travel, lodging, and food expenses.

The William H. Hannon Library has received a three-year grant from the Institute for Museum and Library Services (IMLS) to offer a nine-day continuing education opportunity for academic and research librarians. Each year 21 librarians will receive instruction in research design and a full year of support to complete a research project at their home institutions. The summer Institute for Research Design in Librarianship (IRDL) is supplemented with pre-institute learning activities and a personal learning network that provides ongoing mentoring. The institutes will be held on the campus of Loyola Marymount University in Los Angeles, California.

The Institute is particularly interested in applicants who have identified a real-world research question and/or opportunity. It is intended to

“bring together a diverse group of academic and research librarians who are motivated and enthusiastic about conducting research but need additional training and/or other support to perform the steps successfully. The institute is designed around the components of the research process, with special focus given to areas that our 2010 national survey of academic librarians identified as the most troublesome; the co-investigators on this project conducted the survey to provide a snapshot of the current state of academic librarian confidence in conducting research. During the nine-day institute held annually in June, participants will receive expert instruction on research design and small-group and one-on-one assistance in writing and/or revising their own draft research proposal. In the following academic year, participants will receive ongoing support in conducting their research and preparing the results for dissemination.”

Your proposal is due by February 1, 2014. Details are available at the Institute’s Prepare Your Proposal web site.

Factoid: Loyola Marymount is on a bluff above the Pacific Ocean, west of central LA.

New, Improved, and Available Now!

Thursday, September 12th, 2013

The 2nd Edition of the Planning and Evaluating Health Information Outreach Projects series of 3 booklets is now available online:

Getting Started with Community-Based Outreach (Booklet 1)
What’s new? More emphasis and background on the value of health information outreach, including its relationship to the Healthy People 2020 Health Communication and Health Information Technology topic areas.

Planning Outcomes-Based Outreach Projects (Booklet 2)
What’s new? Focus on uses of the logic model planning tool beyond project planning, such as providing approaches to writing proposals and reports.

Collecting and Analyzing Evaluation Data (Booklet 3)
What’s new? Step-by-step guide to collecting, analyzing, and assessing the validity (or trustworthiness) of quantitative and qualitative data, using questionnaires and interviews as examples.

These are all available free to NN/LM regional offices and network members. To request printed copies, send an email to

Non-508-compliant PDF versions of all three booklets are available here.

The Planning and Evaluating Health Information Outreach series, by Cynthia Olney and Susan Barnes, supplements and summarizes material in Cathy Burroughs’ groundbreaking work from 2000, Measuring the Difference: Guide to Planning and Evaluating Health Information Outreach. Printed copies of Burroughs’ book are also available free—just send an email request to

The Value of Academic Libraries

Friday, September 23rd, 2011

Last year the Association of College and Research Libraries issued a very substantial and thorough review of the research that has been done on how to measure library value:  “The Value of Academic Libraries: A Comprehensive Research Review and Report” by Megan Oakleaf.  Although its focus is academia, there are sections reviewing work in public libraries, school libraries, and special libraries.  I recommend the section on special libraries, which has quite an emphasis on medical libraries, including references to past work regarding clinical impacts.

For those who are interested in approaches that libraries have taken to establishing their value, there is potential benefit in reading the entire report cover-to-cover.  For those who want a quick overview, these are sections that I recommend:

  • Executive Summary
  • Defining “Value”
  • Special Libraries

For more information about this report, see “A tool kit to help academic librarians demonstrate their value” from the 9/14/2010 issue of the Chronicle of Higher Education.

The full report is available at

The Critical Incident Technique and Service Evaluation

Thursday, September 22nd, 2011

In their systematic review of clinical library (CL) service evaluation, Brettle et al. summarize evidence showing that CLs contribute to patient care by saving time and providing effective results.  Pointing out the wisdom of using evaluation measures that can be linked to organizational objectives, they advocate using the Critical Incident Technique to collect data on specific outcomes and demonstrate where library contributions make a difference.  In the Critical Incident Technique, respondents are asked about an “individual case of specific and recent library use/information provision rather than library use in general.”  In addition, the authors point to Weightman et al.’s suggested approaches for conducting a practical and valid study of library services:

  • Researchers are independent of the library service.
  • Respondents are anonymous.
  • Participants are selected either by random sample or by choosing all members of specific user groups (a minimal sampling sketch follows this list).
  • Questions are developed with input from library users.
  • Questionnaires and interviews are both used.
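As a minimal sketch of the sampling point above, here is one way to draw an anonymous random sample of registered users to receive a critical-incident questionnaire. The file name, column name, and sample size are assumptions about how a library might keep its user list.

```python
# Minimal sketch (assumed user list): draw a random sample for a
# critical-incident questionnaire. "registered_users.csv" and its "email"
# column are illustrative names.
import csv
import random

with open("registered_users.csv", newline="") as f:
    users = [row["email"] for row in csv.DictReader(f)]

random.seed(2015)  # fixed seed so the draw can be reproduced if audited
sample = random.sample(users, k=min(100, len(users)))

# Hand the list to the researcher (who is independent of the library service);
# the questionnaire itself asks about one specific, recent instance of use.
for email in sample:
    print(email)
```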

Brettle et al. “Evaluating clinical librarian services: a systematic review.”  Health Information and Libraries Journal, March 2011, 28(1):3-22.

Weightman et al. “The value and impact of information provided through library services for patient care: developing guidance for best practice.”  Health Information and Libraries Journal, March 2009, 26(1):63-71.


Funded by the National Library of Medicine under contract # HHS-N-276-2011-00008-C.