In the October 2008 issue of the MLA News, Terrance Burton presented a quick overview of how the business, dollar-based Return on Investment approach (profit gained from invested dollars) can be expanded to a view of what value is derived from whatever inputs you choose to work with. This broader view can be more relevant in the library world, where profit is not the main point; indeed, in many health-related institutions served by libraries, profit is not the goal at all. Money is invested in libraries, but it can be challenging to assess the outputs from that investment. Burton suggests that we identify outcomes that we want to measure, use a mix of approaches (quantitative, qualitative, speculative), and accept that any measure will be flawed. Despite those flaws, a mix of approaches can provide various indicators to bolster a case.
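Burton's expansion of ROI can be put in simple arithmetic terms. Here is a minimal sketch: the first function is the standard business ROI definition; the second, generalized "value ratio" is my illustrative reading of the expanded approach, not Burton's own notation, and the dollar figures are invented:

```python
def roi(gain: float, cost: float) -> float:
    """Classic business ROI: net profit per dollar invested."""
    return (gain - cost) / cost

def value_ratio(outcome_value: float, input_value: float) -> float:
    """Generalized form: estimated value of outcomes per unit of
    whatever inputs you choose to measure (dollars, staff hours, etc.)."""
    return outcome_value / input_value

# A hypothetical library that spends $200,000 and estimates
# $500,000 in value delivered to its institution:
print(roi(500_000, 200_000))          # net return per dollar
print(value_ratio(500_000, 200_000))  # value delivered per dollar of input
```

Either number can serve as one indicator among the mixed measures Burton recommends.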
This paper presents research results that provide insights into how information influences healthcare managers’ decisions. Information needs included explicit Organizational Knowledge (such as policies and guidelines), Cultural Organizational Knowledge (situational such as buy-in, controversy, bias, conflict of interest; and environmental such as politics and power), and Tacit Organizational Knowledge (gained experientially and through intuition). Managers tended to use internal information (already created or implemented within an organization) when investigating an issue and developing strategies. When selecting a strategy, managers either actively looked for additional external information or simply made a decision without all of the information they would have liked to have. Managers may be more likely to use external information (ie, research-based library resources) if their own internal information is well-managed. The article’s authors suggest that librarians may have a role in managing information created within an organization in order to integrate it with externally created information resources.
The penultimate session of the 2008 Library Assessment Conference was a panel discussion, Assessment Plans: Four Case Studies. Among the experiences and advice provided by the four expert panelists was this final observation about working with library staff on assessment projects: “If you include them, they’re your partner. If you exclude them, they’re your judge.”
The final session of the Library Assessment Conference featured a panel from academic librarianship and LIS education, who provided brief summations of what they learned during the three days of the conference.
Deborah Carver, Dean of Libraries, University of Oregon Libraries
There is more to the assessment story than numbers: Narratives are very important. Assessment is local, so think about what matters most to your institution. Borrow from others and share, but customize your methods for your own environment. Also, use what you have (ie, data you’re already collecting) as much as possible.
Debra Gilchrist, Dean, Libraries and Media Services, Pierce College Library
Inquiry is central to learning. Accountability is local and assessment provides vital signs. Stay ahead of the game so you can influence the future–as Betsy Wilson says, accelerate relevance. Can we call assessment something else? Focusing on assessment is like focusing on the test instead of the content of a class. We should come up with some label that puts focus on the outcomes. We should also do more with linking library assessment findings to what research says is important (eg about student learning).
Paul Beavers, Director, Information Services Group, Wayne State University Libraries
Assessment provides information that we can use in communicating with our patrons so that they can make more demands on us. We can help them understand that they can ask us to do more; we must make sure they have high expectations of us. Assessing the library’s contribution to educational outcomes is a “highest-hanging fruit” for academic libraries.
Peter Hernon, Professor, Simmons Graduate School of Library and Information Science
Evaluation and assessment are different from each other. Program evaluation is collecting and using data to make improvements, while environmental assessment is taking a broad view of the world. LIS education is sadly lacking in preparing future library and information science professionals with research skills that they can use in evaluating library programs.
Good news: wine assessment has a lot in common with library assessment! At the Library Assessment Conference on August 4, wine author/columnist Paul Gregutt described the winery rating system that he developed for his book about Washington wineries. He rated wineries’ quality according to four criteria: value, consistency, style, and contribution. It can make sense to apply these criteria in assessing libraries’ quality: value (people who visit libraries and wineries are both often under time stress and looking for answers about the best products), consistency (customers of libraries and wineries want a personalized experience that is comfortable, reliable, and won’t disappoint), style (a combination of physical characteristics, service, and collection strengths–a big winery or library MUST be well-organized; smaller ones MUST demonstrate uniqueness and depth), and contribution (to the wine industry or to libraries’ stakeholders via outreach/community programs).
The Library Assessment Conference took place in Seattle from August 4-7 and at the opening session, the audience heard three academic library directors’ perspectives on the “Most Important Challenge for Library Assessment.”
Susan Gibbons, Dean of the University of Rochester’s River Campus Libraries, opened with observations about the attractions of quantitative data: they give you a sense of “precision” and a “correct” answer, they’re perceived as weighty, and their collection can be automated. She emphasized the importance of thinking about what we are counting, and why, and provided the example of the decrease in reference questions answered at the University of Rochester by 10,000 from 1996 to 2006. To learn about the “why” behind this quantitative finding, the library used qualitative approaches. For example, they asked students to take pictures of what they carry with them all the time, to map out their daily movements, to indicate what is useful/not useful by writing on a printed copy of the library’s web page, and to imagine what they would wish for if they had a magic wand. They learned that all students carry cell phones but that their library’s phone number did not appear on its home page on the web (in fact, 40% of ARL libraries’ home pages lack phone numbers!) and that students’ peak time period for studying is from 11pm to 1am. Those findings led to better visibility of the library’s phone number on the web and near library computers. Through the magic wand exercise they learned the importance of providing skills and tools to graduate students early in their careers. She emphasized that local assessment methods are required since every campus is unique and accountability is local. If opportunities are available for wide staff participation in assessment, changes are easier and work better.
Rick Luce, Director of Libraries at Emory University, characterized assessment as a method of planning for improvement–a catalyst for change rather than a quick fix. Performance measures are an organization’s vital signs: metrics that show innovation, research leadership, brand identity, and gains in market share. Successful organizations offer something that others can’t do, do poorly, or have difficulty doing well. Satisfaction can be studied through questionnaires that function as “happiness meters,” investigation into what’s important, and looking at how an organization rates against the best in an industry. He cautioned that assessment efforts can be hampered by pitfalls such as lack of accountability, too many initiatives, forgetting larger organizational drivers, and lack of discipline. He reminded us that time and patience are needed for real change in organizations. Providing a brief mention of the “Hedgehog” concept (a single, simple idea that guides great organizations’ efforts to be the best) from Jim Collins’ book Good to Great, he urged us to understand what we are passionate about, what we are best at, and what drives our economic engines.
Betsy Wilson, Dean of Libraries at the University of Washington, provided her perspective that the most important challenge for libraries is accelerating relevance. Assessment can help by providing fuel for that acceleration. So, it is extremely important for libraries that assessment becomes part of their organizational lifeblood, turning cultures of complaint into cultures of assessment.
This year’s Medical Library Association annual meeting in Chicago had several good sessions in which speakers presented experiences and approaches to assigning dollar values to library services and activities. These included:
- “A Calculator for Measuring the Impact of Health Sciences Libraries and Librarians” presented by Betsy Kelly and Barb Jones of the MidContinental Region, National Network of Libraries of Medicine–Their calculators include the Valuing Library Services Calculator and the Cost Benefit and ROI Calculator. These have the potential to be very useful tools.
- “Connecting with Administrators: Demonstrating the Value of Library Services” presented by Edward J. Poletti of the Central Arkansas Veterans Health Care System in Little Rock, AR–He and VA Library colleagues conducted value studies of shared electronic resources, ILL, and literature searches. His presentation included a list of sources of dollar values such as Fortney’s “Price History for Core Clinical Journals in Medicine and Nursing 2003-2007” and “Doody’s core titles in the health sciences 2007: list overview and analysis.” This paper received honorable mention for the MLA Research Award, and a summary is available at the MLA Federal Libraries Section blog.
- “Bridging the Gap: Using Dollar Values to Demonstrate the Value of Library Services” presented by Julia Esparza of Louisiana State University Health Sciences Center in Shreveport, LA–Her experience with assigning and tracking dollar values included analysis of copying/printing costs and article costs.
- “The Academic Library’s Perspective on Assessment and Institutional Role” presented by James Shedlock of Northwestern University in Chicago, IL–In comparative benchmarking, libraries should look at their “true peers” ie, institutions that are similar programmatically.
- “Quantum Physics and Hospital Library Assessment” presented by Michele Klein-Fedyshin of UPMC Shadyside, Pittsburgh, PA–Assessment must be locally relevant and there are various possible foci, such as the financial impact of local consortia, the impact of library services on nursing certification, prevention of hospital acquired infections, cost savings from library contributions to pay-for-performance, library as drug information center, etc.
- “University Investment in the Library: What’s the Return?” presentations at the MLA Elsevier exhibit reported on an Elsevier-funded case study at the University of Illinois at Urbana-Champaign in which a model for calculating Return on Investment (ROI) was developed.
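The general approach behind dollar-value calculators like those above can be sketched simply: multiply counts of each service delivered by a replacement-cost dollar value per unit, then compare the total to the library’s budget. The service categories, counts, and unit values below are illustrative assumptions, not figures from the MLA or NN/LM calculators:

```python
# Hypothetical annual service counts and per-unit replacement-cost values.
services = {
    "mediated literature searches": (350, 95.00),   # (count, $ value each)
    "interlibrary loans":           (1200, 17.50),
    "full-text articles supplied":  (8000, 31.00),
}

# Total estimated value: sum of count x unit value across services.
total_value = sum(count * unit_value for count, unit_value in services.values())

library_budget = 250_000.00
print(f"Estimated value delivered: ${total_value:,.2f}")
print(f"Cost-benefit ratio: {total_value / library_budget:.2f}")
```

The hard part in practice, as the speakers noted, is sourcing defensible unit values (eg, journal price lists, document delivery fees), not the arithmetic.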
Back in May at the 2008 Medical Library Association meeting in Chicago, a group of health care administrators presented a panel discussion titled “Connecting with Leaders: What Do They Expect?” in which they provided their perspectives regarding their expectations for the health sciences library. This was a group of library supporters, and their comments revealed their expectation that library leaders need to think broadly and creatively about their libraries’ roles. Suggestions included:
- Participate in community outreach to serve the greater good of the institution and its communities
- Work with IT to find ways that the library complements IT
- Develop allegiances; although forming partnerships isn’t easy, a fundamental component of administration is relationship building
- Stay connected and aligned with operational opportunities and priorities
- Participate! In the “journey” toward magnet status; in research to improve patient care; in the institution’s constant staff retooling and retraining; in instructional delivery; in grant proposal creation; in benchmarking to learn what similar institutions have and what admired institutions have
- Think in terms of dollars but remember other values
This session was on Monday, May 19 at 10:35am and, if you have access to the MLA ’08 CD-ROM, it’s definitely a worthwhile listen.
How can benchmarking, ROI, and other metrics illustrate value to users and stakeholders? This standing-room-only session at the Special Libraries Association meeting featured analysts from Outsell, who shared benchmarking results and suggested combining such comparative data with “Market Penetration” (the ratio of your actual to potential users). Panelists also discussed the difference between “Operational Metrics” (measures needed for daily library management activities) and “Strategic Metrics” (measures that show the library’s value to the organization). They described strategic assessment–in which 6-8 strategic actions that support the organization’s critical strategies are identified through user and stakeholder research combined with group brainstorming. After strategic actions are selected, metrics are determined, and ownership is assigned. Stakeholder research includes needs assessment, client satisfaction studies, and return on investment/cost-benefit analysis. Outsell panelists also advocated use of a combination of qualitative and quantitative research, since “numbers alone do not tell the story,” and attention to organization-wide standards (such as Balanced Scorecard and/or Total Quality Management).
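“Market Penetration” as Outsell defines it is just a ratio, which makes it easy to track over time alongside other metrics. A minimal sketch, with invented user counts for illustration:

```python
def market_penetration(actual_users: int, potential_users: int) -> float:
    """Share of the potential user population the library actually reaches."""
    return actual_users / potential_users

# eg, 1,800 registered library users in an organization of 6,000 employees:
print(f"{market_penetration(1800, 6000):.0%}")  # prints "30%"
```

Tracked year over year, the same ratio can serve as either an operational metric (are we reaching more people?) or, paired with stakeholder research, a strategic one.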
This bright and early morning session on 6/16/08 was hosted and organized by the Special Libraries Association’s Government Information Division. Librarians in the audience shared their challenges and best practices for applying metrics to quantify and justify their operations. The PowerPoint from Outsell should be available soon at the division’s web site.