Archive for the ‘Library Value’ Category
Last year the Association of College and Research Libraries issued a substantial and thorough review of the research that has been done on how to measure library value: “The Value of Academic Libraries: A Comprehensive Research Review and Report” by Megan Oakleaf. Although its focus is academia, there are sections reviewing work in public libraries, school libraries, and special libraries. I recommend the section on special libraries, which places considerable emphasis on medical libraries, including references to past work regarding clinical impacts.
For those who are interested in approaches that libraries have taken to establishing their value, there is potential benefit in reading the entire report cover-to-cover. For those who want a quick overview, these are sections that I recommend:
- Executive Summary
- Defining “Value”
- Special Libraries
For more information about this report, see “A tool kit to help academic librarians demonstrate their value” from the 9/14/2010 issue of the Chronicle of Higher Education.
The full report is available at http://www.acrl.ala.org/value/.
In their systematic review of clinical library (CL) service evaluation, Brettle et al. summarize evidence showing that CLs contribute to patient care through saving time and providing effective results. Pointing out the wisdom of using evaluation measures that can be linked to organizational objectives, they advocate for using the Critical Incident Technique to collect data on specific outcomes and demonstrate where library contributions make a difference. In the Critical Incident Technique, respondents are asked about an “individual case of specific and recent library use/information provision rather than library use in general.” In addition, the authors point to Weightman, et al.’s suggested approaches for conducting a practical and valid study of library services:
- Researchers are independent of the library service.
- Respondents are anonymous.
- Participants are selected either by random sample or by choosing all members of specific user groups.
- Questions are developed with input from library users.
- Questionnaires and interviews are both used.
Brettle et al., “Evaluating clinical librarian services: a systematic review.” Health Information and Libraries Journal, March 2011. 28(1):3-22.
Weightman et al., “The value and impact of information provided through library services for patient care: developing guidance for best practice.” Health Information and Libraries Journal, March 2009. 26(1):63-71.
The Institute of Museum and Library Services has awarded a grant that will test and implement methodologies measuring the return on investment (ROI) in academic libraries. The goals are to provide evidence and a set of tested methodologies that academic libraries will be able to use in demonstrating their value. The University of Tennessee, Knoxville will be conducting this study in collaboration with the University of Illinois at Urbana-Champaign and the Association of Research Libraries. Dr. Carol Tenopir, professor in the School of Information Sciences, is the project’s lead investigator. This news item from UT Knoxville’s Tennessee Today provides more details: “UT Shares in Grant to Study Value of Academic Libraries.”
In a complementary project, the Association of College and Research Libraries has selected Dr. Megan Oakleaf to conduct a review of the “quantitative and qualitative literature, methodologies and best practices currently in place for demonstrating the value of academic libraries.” The Association plans to issue a completed report later this year.
A belated note about an interesting item at the Medical Library Association meeting in DC this past May. Christine Chastain-Warheit gave a fascinating 5-minute “Lightning Round” presentation, “Can Hospital Librarians Demonstrate Internal Revenue Service-mandated Community Benefit for Their Nonprofit Organizations? Reflecting on Value Provided and Connecting the Hospital Library to Community Benefit.” She pointed out that the IRS Community Benefit standard for not-for-profit hospitals includes activities that promote health in response to community needs. Community Benefit is the basis of the tax-exemption of not-for-profit hospitals. Her institution has agreed that the library’s outreach activities can be included in calculating hospital community benefit efforts for IRS reporting (Poster presented at Medical Library Association Annual Meeting, May 23, 2010). This approach could have good potential for libraries in not-for-profit hospitals demonstrating their value to their institutions and their institutions’ communities, so it’s definitely something to watch. Ms. Chastain-Warheit is Director of Medical Libraries at Christiana Hospital in Newark, DE.
In the October 2008 issue of the MLA News, Terrance Burton presented a quick overview of how the business, dollar-based Return on Investment approach (profit gained from invested dollars) can be expanded to a view of what value is derived from whatever inputs you choose to work with. This broader view is more relevant in the library world, where profit is not the main point; the same is true of many health-related institutions served by libraries. Money is indeed invested in libraries, but it can be challenging to assess the outputs from that investment. Burton suggests that we identify outcomes that we want to measure, use a mix of approaches (quantitative, qualitative, speculative), and accept that any measure will be flawed. Despite flaws, using mixed approaches can provide various indicators to bolster a case.
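The traditional dollar-based calculation that Burton starts from can be sketched in a few lines. This is a minimal illustration with made-up figures, not numbers drawn from Burton’s article:

```python
def roi(gain: float, cost: float) -> float:
    """Traditional dollar-based ROI: net return per dollar invested."""
    return (gain - cost) / cost

# Hypothetical example: $150,000 in estimated benefit derived from a
# $100,000 library budget works out to a 50% return.
print(roi(150_000, 100_000))  # 0.5
```

Burton’s point is that the “gain” term rarely reduces to dollars in a library setting, which is why he recommends substituting a mix of outcome measures for the numerator rather than abandoning the framework.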
MacDonald, J; Bath, P; Booth, A. “Healthcare services managers: What information do they need and use?” Evidence-Based Library and Information Practice. 2008. 3(3):18-38.
This paper presents research results that provide insights into how information influences healthcare managers’ decisions. Information needs included explicit Organizational Knowledge (such as policies and guidelines), Cultural Organizational Knowledge (situational, such as buy-in, controversy, bias, and conflict of interest; and environmental, such as politics and power), and Tacit Organizational Knowledge (gained experientially and through intuition). Managers tended to use internal information (already created or implemented within an organization) when investigating an issue and developing strategies. When selecting a strategy, managers either actively looked for additional external information or simply made a decision without all of the information that they would have liked to have. Managers may be more likely to use external information (i.e., research-based library resources) if their own internal information is well managed. The article’s authors suggest that librarians may have a role in managing information created within an organization in order to integrate it with externally created information resources.
The penultimate session of the 2008 Library Assessment Conference was a panel discussion, Assessment Plans: Four Case Studies. Among the experiences and advice provided by the four expert panelists was this final observation about working with library staff on assessment projects: “If you include them, they’re your partner. If you exclude them, they’re your judge.”
The final session of the Library Assessment Conference featured a panel from academic librarianship and LIS education, who provided brief summations of what they learned during the three days of the conference.
Deborah Carver, Dean of Libraries, University of Oregon Libraries
There is more to the assessment story than numbers: narratives are very important. Assessment is local, so think about what matters most to your institution. Borrow from others and share, but customize your methods for your own environment. Also, use what you have (i.e., data you’re already collecting) as much as possible.
Debra Gilchrist, Dean, Libraries and Media Services, Pierce College Library
Inquiry is central to learning. Accountability is local, and assessment provides vital signs. Stay ahead of the game so you can influence the future; as Betsy Wilson says, accelerate relevance. Can we call assessment something else? Focusing on assessment is like focusing on the test instead of the content of a class. We should come up with a label that puts the focus on outcomes. We should also do more to link library assessment findings to what research says is important (e.g., about student learning).
Paul Beavers, Director, Information Services Group, Wayne State University Libraries
Assessment provides information that we can use in communicating with our patrons so that they can make more demands on us. We can help them understand that they can ask us to do more; we must make sure they have high expectations of us. Assessing the library’s contribution to educational outcomes is a “highest-hanging fruit” for academic libraries.
Peter Hernon, Professor, Simmons Graduate School of Library and Information Science
Evaluation and assessment are different from each other. Program evaluation is collecting and using data to make improvements, while environmental assessment is taking a broad view of the world. LIS education is sadly lacking in preparing future library and information science professionals with research skills that they can use in evaluating library programs.
Good news: wine assessment has a lot in common with library assessment! At the Library Assessment Conference on August 4, wine author/columnist Paul Gregutt described the winery rating system that he developed for his book about Washington wineries. He rated wineries’ quality according to four criteria: value, consistency, style, and contribution. It can make sense to apply these criteria in assessing libraries’ quality: value (people who visit libraries and wineries are both often under time stress and looking for answers about the best products), consistency (customers of libraries and wineries want a personalized experience that is comfortable, reliable, and won’t disappoint), style (a combination of physical characteristics, service, and collection strengths; a big winery or library MUST be well-organized, while smaller ones MUST demonstrate uniqueness and depth), and contribution (to the wine industry or to libraries’ stakeholders via outreach/community programs).
The Library Assessment Conference took place in Seattle from August 4-7 and at the opening session, the audience heard three academic library directors’ perspectives on the “Most Important Challenge for Library Assessment.”
Susan Gibbons, Dean of the University of Rochester’s River Campus Libraries, opened with observations about the attractions of quantitative data: they give you a sense of “precision” and a “correct” answer, they’re perceived as weighty, and their collection can be automated. She emphasized the importance of thinking about what we are counting, and why, and provided the example of a decrease of 10,000 in reference questions answered at the University of Rochester between 1996 and 2006. To learn about the “why” behind this quantitative finding, the library used qualitative approaches. For example, they asked students to take pictures of what they carry with them all the time, to map out their daily movements, to indicate what is useful/not useful by writing on a printed copy of the library’s web page, and to imagine what they would wish for if they had a magic wand. They learned that all students carry cell phones but that their library’s phone number did not appear on its home page on the web (in fact, 40% of ARL libraries’ home pages lack phone numbers!) and that students’ peak time period for studying is from 11pm to 1am. Those findings led to better visibility of the library’s phone number on the web and near library computers. Through the magic wand exercise they learned the importance of providing skills and tools to graduate students early in their careers. She emphasized that local assessment methods are required since every campus is unique and accountability is local. If opportunities are available for wide staff participation in assessment, changes are easier and work better.
Rick Luce, Director of Libraries at Emory University, characterized assessment as a method of planning for improvement, a catalyst for change rather than a quick fix. Performance measures are an organization’s vital signs, expressed through metrics that show innovation, research leadership, brand identity, and gains in market share. Successful organizations offer something that others can’t do, do poorly, or have difficulty doing well. Satisfaction can be studied through questionnaires that function as “happiness meters,” investigation into what’s important, and looking at how an organization rates against the best in an industry. He cautioned that assessment efforts can be hampered by pitfalls such as lack of accountability, too many initiatives, forgetting larger organizational drivers, and lack of discipline. He reminded us that time and patience are needed for real change in organizations. Providing a brief mention of the “Hedgehog” concept (a single, simple idea that guides great organizations’ efforts to be the best) from Jim Collins’ book Good to Great, he urged us to understand what we are passionate about, what we are best at, and what drives our economic engines.
Betsy Wilson, Dean of Libraries at the University of Washington, provided her perspective that the most important challenge for libraries is accelerating relevance. Assessment can help by providing fuel for that acceleration. So, it is extremely important for libraries that assessment becomes part of their organizational lifeblood, turning cultures of complaint into cultures of assessment.
Sullivan, M. “The Promise of Appreciative Inquiry in Library Organizations.” Library Trends. Summer 2004. 53(1):218-229.
According to Sullivan (2004), Appreciative Inquiry is a different approach to organizational change that “calls for the deliberate search for what contributes to organizational effectiveness and excellence” (p. 218). This perspective proposes moving from a traditional “deficit-based approach” in which there is an emphasis on problems to a more positive and collaborative framework. Therefore, Appreciative Inquiry is a unique approach that includes the identification of positive experiences and achievements as a “means to create change based upon the premise that we can effectively move forward if we know what has worked in the past” (p. 219). Furthermore, this approach “engages people in an exploration of what they value most about their work” (p. 219).
Overall, this article discusses the origins and basic principles of Appreciative Inquiry. In particular, the author provides practical suggestions for how libraries can begin to apply the principles and practices of Appreciative Inquiry to foster a more positive environment for creating change in libraries. For example:
- Start a problem-solving effort with a reflection on strengths, values, and best experiences.
- Support suggestions, possible scenarios, and ideas.
- Take time to frame questions in a positive light that will generate hope, imagination, and creative thinking.
- Ask staff to describe a peak experience in their professional work or a time when they felt most effective and engaged.
- Close meetings with a discussion of what worked well and identify individual contributions to the success of the meeting.
- Create a recognition program and make sure that it is possible (and easy) for everyone to participate.
- Expect the best performance and assume that everyone has the best intentions in what they do.
In conclusion, Appreciative Inquiry entails a major shift in thinking about how change can occur in library organizations. By examining what is working, this approach provides a useful and positive framework for transforming libraries.