
NEO Shop Talk

The blog of the National Network of Libraries of Medicine Evaluation Office

Archive for 2007

Needs Assessment Example

Tuesday, October 2nd, 2007

Perley CM, Gentry GA, Fleming S, Sen KM. Conducting a user-centered information needs assessment: the Via Christi Libraries’ experience.  J Med Libr Assoc  2007 Apr; 95(2):173-181.

This article provides a good example of a needs assessment using multiple evaluation methods. Librarians at the Via Christi Libraries in Wichita, Kansas, provide information services to all employees of the Via Christi Regional Medical Center (VCRMC) and needed to develop a strategic plan to meet the expanding use of their services and the increasing cost of providing access.  The article gives detailed descriptions of how the researchers used a self-administered survey, a telephone survey, and focus groups to gather information of increasing depth from users, and it includes appendices with the survey and focus group questions. The samples used in the project were not random, but the researchers used many venues to capture a solid cross section of their user population, and the multi-method approach allowed them to corroborate findings across different perspectives.  They also describe how they used the findings to develop a strategic plan and list their “lessons learned” about doing needs assessment.  This is not a “how to conduct a needs assessment” article, and the findings are the main point of the piece.  But the concrete description of the methods adds value to the article.

An ROI Calculator for Libraries

Friday, July 20th, 2007

The North Suburban Library System (north suburban Chicago, IL) has created a Return on Investment (ROI) Calculator on its website in keeping with its theme for the month of July 2007: Dollars and Sense: Why Libraries are a Good Investment. For more information about this tool, visit the North Suburban Library System website.

Free Online Course on Outcomes-Based Planning and Evaluation

Wednesday, July 18th, 2007

Shaping Outcomes, an online course on outcomes-based planning and evaluation, will be available free to museum and library professionals this summer and fall. The instructor-mediated course, which will help participants improve program designs and evaluations, was developed through a cooperative agreement between the Institute of Museum and Library Services (IMLS) and Indiana University-Purdue University Indianapolis (IUPUI). Over the approximately five-week course, participants will work at their own pace to learn outcomes-based planning and evaluation concepts and apply them to a program or project at their own institutions. A special course for those interested in teaching Shaping Outcomes or incorporating it into their own curricula will be offered in October 2007. Those interested in learning more about Shaping Outcomes or registering for one of the courses should visit the Shaping Outcomes website or contact the project by e-mail.

Registration for 2007 American Evaluation Association Conference

Tuesday, July 10th, 2007

For those of you interested in attending the 2007 American Evaluation Association Conference in Baltimore (November 7-10), information is now online at the AEA conference Web site.  I also recommend AEA’s pre- and post-conference professional development workshops, which are offered Nov. 5-7 and Nov. 11; workshop descriptions and cost information are available on the AEA site.  Many of these workshops have been taught by the same instructors year after year and have received high evaluations. The quality can vary, however, so perhaps those of you who have previously attended the AEA conference can add your comments about the workshops you liked.

New books about online surveys recommended at the AEA listserv

Tuesday, July 10th, 2007

Because so many RMLs conduct online surveys, I thought I would mention a couple of books recently recommended in discussions at EvalTalk, the American Evaluation Association’s listserv. The first is Don Dillman’s book “Mail and Internet Surveys” (updated 2007, Sage). Don Dillman is probably the best-known survey researcher – that is, he researches the best ways to conduct survey research. I have used this book and found it very useful, although I have not seen this edition. The second book, by Sue and Ritter, is “Conducting Online Surveys” (2007, Sage). One listserv member said he liked the book because (among other things) it addressed issues related to response rates – a concern most of us have about online surveys. If you are a member of AEA, you may get similar information from an upcoming issue of New Directions for Evaluation that deals directly with the use of online surveys in evaluation. The issue, edited by Sue and Ritter, is due out in September. New Directions for Evaluation is free to AEA members.

How to Assess the Value of Libraries from ACT for Libraries

Thursday, July 5th, 2007

Imholz S, Arns JW. Worth Their Weight: An Assessment of the Evolving Field of Library Valuation. New York: Americans for Libraries Council, 2007.

“Worth Their Weight” takes stock of the field of library valuation, defined as the process of assessing the value of a library to its community in actual dollars and cents. The report was issued by Americans for Libraries Council (ALC), “a nonprofit organization dedicated to increasing innovation and investment in the nation’s libraries.” The report describes different valuation models adapted from business and the nonprofit sector, such as social return-on-investment, triple-bottom line accounting, corporate social responsibility reports, and the balanced scorecard. To provide an overview of the valuation field, the report includes summaries of 17 public library valuation and impact studies (with links to the full reports). These summaries include detailed descriptions of the methods used, including actual surveys employed by the libraries. Finally, the report suggests ways to build the field of valuation and apply findings for use in library advocacy. The report relates specifically to public libraries, but the information has applicability to hospital and health science library valuation.

Advice on Writing Competitive Grant Proposals

Wednesday, June 13th, 2007

Langille L, MacKenzie T. Navigating the road to success: a systematic approach to preparing competitive grant proposals. EBLIP 2007; 2(1):23-31.

This article gives pragmatic advice about writing competitive grant proposals, with information organized around 11 basic principles of grant preparation, including “address a specific audience,” “be innovative,” “involve stakeholders,” and “define your objectives and outcomes.” The tips are so concrete that an applicant could create a checklist of characteristics to include in a proposal. The article emphasizes writing to two audiences: the funding agency and reviewers. While the article specifically describes preparation of a research grant proposal, many of the principles apply to any grant application. RML staff members who teach proposal writing to network members, or who provide one-to-one help with applications, may find that the article helps articulate the characteristics of a good proposal. Those reviewing grant applications will also find good criteria for judging them. The article includes a list of online sources to support proposal writing and project planning, and it presents a timeline so that applicants understand the lead time needed to prepare a truly good proposal.

Survey Monkey Re-design

Monday, June 4th, 2007

Apparently Survey Monkey has undergone a major re-design. You might want to take a look before you need to use it again.

In my quick tour around the updated site, I noticed the list manager feature is much improved. It now stores your recipient list and messages with the survey, so there is less opportunity for error (like sending your survey to someone else’s list!). You create, store, and access the distribution list for a specific survey under that survey’s collect button: just hit the collect button, and you’ll find the list there.

The other major change I discovered is that inactive surveys are archived, and it can take up to 24 hours to restore data from an archived survey. (I couldn’t find any note about how long surveys remain inactive before they are archived.) Only the data takes time to restore, though: you do not have to restore a questionnaire if you want to copy it as the basis of a new survey. When you hit the “create survey” button, a list appears of the surveys in the account that can be copied, and archived surveys are included in that list.

It would be great if others using Survey Monkey could give us all a heads-up on changes you find (for better or worse) by adding comments to this post. Instructions for adding comments can be found by clicking on “How to Participate” under “About OERC Blog” to the left.

Qualitative-based Evidence

Thursday, May 24th, 2007

Brophy P. Narrative-based practice. EBLIP 2007; 2(1):149-158.

Given L. Evidence-based practice and qualitative research: a primer for library and information professionals. EBLIP 2007; 2(1):15-21.

In the movement promoting evidence-based library and information practice – defined as the use of formalized strategies for including evidence in daily practice – the definition of the body of knowledge constituting “best evidence” continues to evolve. In the two articles cited above, which share the same issue of EBLIP, authors Brophy and Given argue for the inclusion of qualitative studies in that body of knowledge. While quantitative randomized controlled trials (RCTs) have long been considered the gold standard for producing evidence in the disciplines embracing evidence-based practice, Brophy and Given both argue that social fields like librarianship must look to qualitative studies to answer questions of “why” and “in what context.” They also argue that we cannot fully understand social context without methods that emphasize listening to people, observing behavior, and reviewing textual and pictorial documents.

Brophy’s article, a commentary, promotes the use of a database of high-quality narratives (or stories) to inform practice – something he calls “narrative-based practice.” He writes, “We are more likely to find meaning in the telling of how things have been experienced by others than in the formality of arid statistics and measures” (p. 156). Thus, he believes that narratives must be presented along with statistics to help managers with their “evidence-based” decision making.

Given’s article presents a more informative treatment of qualitative research, with examples of its three primary methodologies: interviews, observation, and analysis of textual data (e.g., participant-created documents like journals, or existing texts like policy manuals and meeting minutes). She also discusses some standard criteria for assessing qualitative research – criteria that differ considerably from those for judging quantitative research.

Any number of articles have been published that argue for the legitimacy of qualitative research, but Brophy and Given go a step further. They believe that qualitative studies are essential to the development of a complete body of knowledge for informing practice.

The Joint Commission’s Strategies for Addressing Patient Diversity and Low Health Literacy

Wednesday, May 2nd, 2007

The Joint Commission (the health care organizations’ accrediting agency formerly known as JCAHO) has launched a public policy initiative to address broad issues that may undermine patient safety and quality of health care. It has published two papers addressing important concerns related to patient safety and quality of care: linguistic and cultural diversity of patients, and low health literacy. A recent publication titled “Hospitals, Language, and Culture: Snapshot of a Nation. A Report of Findings” presents findings from a qualitative study of how 60 hospitals dealt with linguistic and cultural diversity. A white paper, “What Did the Doctor Say? Improving Health Literacy to Improve Patient Safety,” addresses the problem of health literacy and strategies for meeting patients’ communication needs.

The Joint Commission recognizes that the problems arising from diversity and low health literacy must be resolved through engagement of “multiple publics,” and medical librarians represent one “public” that should be engaged. Yet neither publication specifically mentions librarians, who could play a key role in providing access to the information needed to support the strategies suggested in the publications. It appears that librarians may have to advocate for their role.

Fortunately, the MLA’s 2006-2007 priorities include a broad media effort to inform health care administrators of “librarians’ value (ROI) in providing consumer health information and patient education that improves patient safety, welfare and empowerment.” Another MLA priority – research defining the effects of consumer health information and health information literacy on patient outcomes, safety, and health care costs – will provide an “evidence-based” foundation for advocacy. The Joint Commission’s public policy initiative (and the accreditation standards that result from it) could pave the way for collaboration between health care organizations facing the challenges of caring for a diverse patient population and the medical librarians who can support their efforts through access to much-needed information.

Last updated on Monday, June 27, 2016

Funded by the National Library of Medicine under Contract No. UG4LM012343 with the University of Washington.