Archive for August, 2008
An article entitled “Demystifying Survey Research: Practical Suggestions for Effective Question Design” was published in the journal Evidence Based Library and Information Practice (2007). The article offers practical suggestions for writing effective questions when designing written surveys. Sample survey questions used in the article illustrate how basic techniques, such as choosing appropriate question forms and incorporating scales, can improve survey questions.
Since this is a peer reviewed, open-access journal, those interested may access the full-text article online at: http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/516/668.
In addition, for those interested in exploring survey research more, I have found the following print resources to be very helpful in this learning process:
Converse, J.M., and S. Presser. Survey Questions: Handcrafting the Standardized Questionnaire. Thousand Oaks, CA: Sage Publications, 1986.
Fink, A. How to Ask Survey Questions. Thousand Oaks, CA: Sage Publications, 2003.
Fowler, F.J. Improving Survey Questions: Design and Evaluation. Thousand Oaks, CA: Sage Publications, 1995.
The penultimate session of the 2008 Library Assessment Conference was a panel discussion, Assessment Plans: Four Case Studies. Among the experiences and advice provided by the four expert panelists was this final observation about working with library staff on assessment projects: “If you include them, they’re your partner. If you exclude them, they’re your judge.”
The final session of the Library Assessment Conference featured a panel from academic librarianship and LIS education, who provided brief summations of what they learned during the three days of the conference.
Deborah Carver, Dean of Libraries, University of Oregon Libraries
There is more to the assessment story than numbers: narratives are very important. Assessment is local, so think about what matters most to your institution. Borrow from others and share, but customize your methods for your own environment. Also, use what you have (i.e., data you’re already collecting) as much as possible.
Debra Gilchrist, Dean, Libraries and Media Services, Pierce College Library
Inquiry is central to learning. Accountability is local, and assessment provides vital signs. Stay ahead of the game so you can influence the future (as Betsy Wilson says, accelerate relevance). Can we call assessment something else? Focusing on assessment is like focusing on the test instead of the content of a class. We should come up with a label that puts the focus on outcomes. We should also do more to link library assessment findings to what research says is important (e.g., about student learning).
Paul Beavers, Director, Information Services Group, Wayne State University Libraries
Assessment provides information that we can use in communicating with our patrons so that they can make more demands on us. We can help them understand that they can ask us to do more; we must make sure they have high expectations of us. Assessing the library’s contribution to educational outcomes is the “highest-hanging fruit” for academic libraries.
Peter Hernon, Professor, Simmons Graduate School of Library and Information Science
Evaluation and assessment are different from each other. Program evaluation is collecting and using data to make improvements, while environmental assessment is taking a broad view of the world. LIS education sadly falls short in equipping future library and information science professionals with the research skills they need to evaluate library programs.
Good news: wine assessment has a lot in common with library assessment! At the Library Assessment Conference on August 4, wine author/columnist Paul Gregutt described the winery rating system that he developed for his book about Washington wineries. He rated wineries’ quality according to four criteria, and it can make sense to apply the same criteria in assessing libraries’ quality:
· Value: people who visit libraries and wineries are both often under time stress and looking for answers about the best products.
· Consistency: customers of libraries and wineries want a personalized experience that is comfortable, reliable, and won’t disappoint.
· Style: a combination of physical characteristics, service, and collection strengths; a big winery or library must be well-organized, while smaller ones must demonstrate uniqueness and depth.
· Contribution: to the wine industry, or to libraries’ stakeholders via outreach and community programs.
The Library Assessment Conference took place in Seattle August 4-7, and at the opening session the audience heard three academic library directors’ perspectives on the “Most Important Challenge for Library Assessment.”
Susan Gibbons, Dean of the University of Rochester’s River Campus Libraries, opened with observations about the attractions of quantitative data: they give you a sense of “precision” and a “correct” answer, they’re perceived as weighty, and their collection can be automated. She emphasized the importance of thinking about what we are counting, and why, and provided the example of a decrease of 10,000 in reference questions answered at the University of Rochester between 1996 and 2006. To learn about the “why” behind this quantitative finding, the library used qualitative approaches. For example, they asked students to take pictures of what they carry with them all the time, to map out their daily movements, to indicate what is useful/not useful by writing on a printed copy of the library’s web page, and to imagine what they would wish for if they had a magic wand. They learned that all students carry cell phones but that their library’s phone number did not appear on its home page on the web (in fact, 40% of ARL libraries’ home pages lack phone numbers!) and that students’ peak time period for studying is from 11pm to 1am. Those findings led to better visibility of the library’s phone number on the web and near library computers. Through the magic wand exercise they learned the importance of providing skills and tools to graduate students early in their careers. She emphasized that local assessment methods are required, since every campus is unique and accountability is local. When opportunities are available for wide staff participation in assessment, changes are easier to make and work better.
Rick Luce, Director of Libraries at Emory University, characterized assessment as a method of planning for improvement–a catalyst for change rather than a quick fix. Performance measures are an organization’s vital signs through metrics that show innovation, research leadership, brand identity, and gains in market share. Successful organizations offer something that others can’t do, do poorly, or have difficulty doing well. Satisfaction can be studied through questionnaires that function as “happiness meters,” investigation into what’s important, and looking at how an organization rates against the best in an industry. He cautioned that assessment efforts can be hampered by pitfalls such as lack of accountability, too many initiatives, forgetting larger organizational drivers, and lack of discipline. He reminded us that time and patience are needed for real change in organizations. Providing a brief mention of the “Hedgehog” concept (a single, simple idea that guides great organizations’ efforts to be the best) from Jim Collins’ book Good to Great, he urged us to understand what we are passionate about, what we are best at, and what drives our economic engines.
Betsy Wilson, Dean of Libraries at the University of Washington, provided her perspective that the most important challenge for libraries is accelerating relevance. Assessment can help by providing fuel for that acceleration. So it is extremely important that assessment become part of libraries’ organizational lifeblood, turning cultures of complaint into cultures of assessment.
Sullivan, M. “The Promise of Appreciative Inquiry in Library Organizations.” Library Trends 53, no. 1 (Summer 2004): 218-229.
According to Sullivan (2004), Appreciative Inquiry is a different approach to organizational change that “calls for the deliberate search for what contributes to organizational effectiveness and excellence” (p. 218). This perspective proposes moving from a traditional “deficit-based approach,” in which the emphasis is on problems, to a more positive and collaborative framework. Accordingly, Appreciative Inquiry identifies positive experiences and achievements as a “means to create change based upon the premise that we can effectively move forward if we know what has worked in the past” (p. 219). Furthermore, this approach “engages people in an exploration of what they value most about their work” (p. 219).
Overall, this article discusses the origins and basic principles of Appreciative Inquiry. In particular, the author provides practical suggestions for how libraries can begin to apply the principles and practices of Appreciative Inquiry to foster a more positive environment for creating change in libraries. For example:
· Start a problem-solving effort with a reflection on strengths, values, and best experiences.
· Support suggestions, possible scenarios, and ideas.
· Take time to frame questions in a positive light that will generate hope, imagination, and creative thinking.
· Ask staff to describe a peak experience in their professional work or a time when they felt most effective and engaged.
· Close meetings with a discussion of what worked well and identify individual contributions to the success of the meeting.
· Create a recognition program and make sure that it is possible (and easy) for everyone to participate.
· Expect the best performance and assume that everyone has the best intentions in what they do.
In conclusion, Appreciative Inquiry entails a major shift in thinking about how change can occur in library organizations. By examining what is working, this approach provides a useful and positive framework for transforming libraries.