Logic Models for Library Assessment Planning

The Engaged Librarian: Crafting an Effective Assessment Plan to Determine the Impact of a Key Strategic Library Initiative, presented by Sarah Murphy of The Ohio State University (OSU) at the Library Assessment Conference, provided an overview of using a logic model as part of library strategic planning. Ms. Murphy's presentation slides are available online.

The OSU project combined a theory of change methodology with logic models, using the Kellogg Foundation Logic Model as a template. The team storyboarded data within a data dashboard that was both aligned with and broken down by the applicable OSU strategic vision goals. Ms. Murphy reported that the benefits of the logic model approach included a flexible but structured way to plan library assessment, a collaborative and inclusive process, a clear project focus, the ability to assess both linear and iterative programs and services, and the ability to communicate program accomplishments in engaging ways. During the question and answer session, the presenters noted that they are also Tableau fans (we will write about Tableau in our next post) and that they design the data structures behind their dashboard to avoid information silos.
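For readers who like to see the idea concretely, here is a minimal, hypothetical sketch (not taken from Ms. Murphy's presentation) of how the components of a Kellogg-style logic model might be recorded so they can feed a dashboard broken down by strategic goal. The field names and example entries below are illustrative assumptions only.

# Minimal, hypothetical sketch of a Kellogg-style logic model record.
# Field names and example values are illustrative assumptions, not OSU's actual model.
from dataclasses import dataclass, field
from collections import defaultdict

@dataclass
class LogicModelRow:
    strategic_goal: str                              # institutional goal this program supports
    inputs: list = field(default_factory=list)       # resources invested
    activities: list = field(default_factory=list)   # what the program does
    outputs: list = field(default_factory=list)      # direct products (counts, sessions)
    outcomes: list = field(default_factory=list)     # changes in knowledge, behavior, condition

rows = [
    LogicModelRow(
        strategic_goal="Teaching and learning",
        inputs=["2 librarians", "instruction budget"],
        activities=["course-integrated instruction sessions"],
        outputs=["40 sessions", "900 students reached"],
        outcomes=["students report improved search skills"],
    ),
]

# Group rows by strategic goal, mirroring a dashboard broken down by goal.
by_goal = defaultdict(list)
for row in rows:
    by_goal[row.strategic_goal].append(row)

for goal, goal_rows in by_goal.items():
    print(goal, "->", len(goal_rows), "program(s)")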

If you'd like to learn more about logic models and data dashboards, be sure to check out our freely available Outreach Evaluation Resource Center (OERC) Evaluation Guides, especially Planning Outcomes-Based Outreach Projects. We also offer Data Dashboards: Monitoring Progress toward Program Outcomes as one of our webinars, and a recording of Data Dashboards is available online.

Freebie Friday: Presentations from the Library Assessment Conference

ARL Library Assessment Conference 2014 digital badge for your website and/or blog

We at the Outreach Evaluation Resource Center (OERC), along with 600 others, enjoyed a fabulous week of learning and engagement during the 2014 Library Assessment Conference at the University of Washington in Seattle. We will be covering some of the great assessment resources and information shared at the conference here in our blog (http://nnlm.gov/evaluation/blog/), focusing on items of interest to National Network of Libraries of Medicine (NN/LM) network members.

In the meantime, you may be interested in the freely available copies of most of the conference presentations, which are posted on the main program schedule website at http://libraryassessment.org/schedule/index.shtml, along with photos of the poster session, which will soon be available at http://libraryassessment.org/schedule/2014-posters.shtml.

Data Visualizations at Information is Beautiful

The OERC will be attending the Library Assessment Conference (LAC) at the University of Washington this week, where we will be learning about new trends in library assessment, evaluation, and improvement. Stay tuned to our blog and we will pass along what we learn. The LAC is sponsored by the Association of Research Libraries.

In the meantime, I want to leave you with a fun data visualization site to explore while we're gone. David McCandless creates wonderful infographics for his site "Information is Beautiful." Many are health-related. All are gorgeous. You can find the full list of his data visualizations on the site.

 

A Guide for Conducting Community Conversations

Planning focus groups? You might want to check out the Libraries Transforming Communities Community Conversation Workbook by the American Library Association (ALA).

This workbook is a resource developed for the ALA's Libraries Transforming Communities (LTC) initiative, which provides librarians with training and resources to strengthen their roles as community leaders and change agents. The initiative's goal is to help librarians raise the visibility and value of their libraries within their communities, and it promotes public discussion as a key community engagement strategy.

To that end, the workbook provides invaluable guidance to anyone who wants to conduct discussion groups for community assessment purposes. It offers practical advice on every aspect of convening group discussions, including tips on participant recruitment, a list of discussion questions, facilitator guidelines, note-taking tools, and templates for organizing key findings.

Demonstrating value is of considerable interest to many libraries and organizations these days, and such organizations may want to explore other articles and resources related to the LTC initiative. You can find more information about the initiative at the LTC web page, and you can see how libraries are implementing LTC activities at the initiative's digital portal.

Freebie Friday: Shaping Outcomes Course

logo for Shaping Outcomes class

Do you want to learn more about outcomes-based planning and evaluation (OBPE) for your outreach project, but there's no money in the training budget?

Shaping Outcomes: Making a Difference in Libraries and Museums (shapingoutcomes.org) is a free online course that learners can start anytime and work through at their own pace. While the course uses library- and museum-specific examples, its core concepts of understanding target audience needs, clarifying desired results, developing logic models, and evaluating outcomes apply to most other organizations' outreach projects as well.

The class is organized into five modules (Overview, Plan, Build, Evaluate, Report), with a helpful Glossary of OBPE terminology and a Logic Model template. Shaping Outcomes was developed by the Institute of Museum and Library Services (IMLS) and Indiana University/Purdue University Indianapolis (IUPUI) and was previously offered as an instructor-led class.

More information specific to developing logic models for health information outreach programs is available in Booklet Two: Planning Outcomes-Based Outreach Projects, part of the resources on our Outreach Evaluation Resource Center (OERC) Evaluation Guides page at http://nnlm.gov/evaluation/guides.html.

 

Freebie Friday: Visualization Literacy

With the growing number of technology tools available for data reporting and visualization (be sure to check out some of our Outreach Evaluation Resource Center Reporting and Visualizing tools at http://guides.nnlm.gov/oerc/tools), it can be challenging to know how best to use them to clearly communicate the intended meaning of the data. The concept of visualization literacy, and the broader theme of visual literacy, are often missing from the instructions that guide people through creating their own visualization designs.

A recent entry by Andrew Kirk on the blog of Seeing Data, a United Kingdom research project studying how people understand big data visualizations shown in the media, offers a great review of 8 Articles Discussing Visual and Visualization Literacy. The featured articles are freely available and well worth a read to better understand both visual and visualization literacy, ranging from the importance of Visual Literacy in an Age of Data to How to Be an Educated Consumer of Infographics. Seeing Data asks that you share additional resources via blog comments or its Twitter account, @SeeingData.

Telling Training’s Story: The Success Case Method

“On the average, it is true that most training does not work very well. But some programs work very well with some of the people, and this represents their great potential for being leveraged for even greater results.” Robert O. Brinkerhoff, “Telling Training’s Story.”

Most evaluation methods for training programs reduce data to averages: the average number of things participants learned, the average number of techniques applied on the job, the average number of times a skill was used post-training. Unfortunately, this approach can underestimate the true value of training for the organizations investing in the programs.

In Telling Training’s Story, Brinkerhoff writes that, in reality, the majority of participants gain little from training programs.  They either use some information but get no results, or they simply give up after a few attempts. Sometimes poor instructional design is to blame. More often, low success is caused by contextual variables, such as lack of supervisory support, no opportunity to try out the learning, or program timing. In fact, good instructional design often cannot compensate for these environmental crosscurrents.

Yet Brinkerhoff argues that most training programs can boast a few success cases. There are usually a handful of participants (sometimes more) who apply their new knowledge or skills to produce valuable results for their organizations. Sometimes the value of their contributions alone justifies the program cost; in other cases, boosting the percentage of success cases by just 10% would make the investment worthwhile to the organization.

To truly evaluate a training program, you need to identify any positive outcomes that occur, even if they are traced to a small number of participants, and assess the value of those results. You also need to determine what instructional and contextual factors influence successful use of training information. Then, organizations can make informed decisions about continuing to invest in training.

Brinkerhoff’s Success Case Method (SCM) was designed for in-depth analysis of training programs and their outcomes. The method focuses on high and low success cases. High success cases refer to incidents where participants applied training program information and attained positive results for their organizations. Low success cases are situations in which participants demonstrated no application of the training information.

Investigation of the high-success cases identifies the best possible outcomes that occur when employees apply information gained from the training program. Adding low-success cases into the mix also leads to a thorough understanding of the key factors, both in training design and in the organizational context, that influence participants' use of their new capabilities.
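To make the contrast with averages concrete, here is a small hypothetical sketch in Python of how an average can look unimpressive while a few participants produce substantial results, and how an SCM-style screen might flag the high- and low-success cases for follow-up interviews. The survey scores are invented for illustration; they are not from Brinkerhoff's book.

# Hypothetical illustration of why averages can hide training value, and how
# a Success Case Method screen might flag cases for follow-up interviews.
# All numbers are invented for illustration; they are not from Brinkerhoff's book.
from statistics import mean

# Self-reported "results produced with the new skill" scores (1 = none, 5 = substantial)
survey_scores = {
    "p01": 1, "p02": 2, "p03": 1, "p04": 5, "p05": 2,
    "p06": 1, "p07": 4, "p08": 2, "p09": 5, "p10": 1,
}

print("Average score:", mean(survey_scores.values()))  # a modest-looking 2.4

# SCM-style screen: pick the extremes for in-depth interviews
high_success = [p for p, s in survey_scores.items() if s >= 4]  # applied training, got results
low_success = [p for p, s in survey_scores.items() if s <= 1]   # little or no application

print("High success cases to interview:", high_success)
print("Low success cases to interview:", low_success)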

Brinkerhoff's book Telling Training's Story provides step-by-step guidance for conducting SCM studies. Steps include working with stakeholders to define success and value; developing a program impact model; using rigorous sampling methods; and testing rival hypotheses for your findings. By following these steps, you can present a case study with compelling evidence of your program's value. If you find your program is ineffective, the process will illustrate the factors working against its success.

So, the next time you want to evaluate a training program, consider going beyond “average.”  Check out the Success Case Method.

Source: Brinkerhoff RO. Telling training’s story. San Francisco, CA: Berrett-Koehler Publishers; 2006.

 

Freebie Friday: OERC Data Burger Webinar

Veggie burger by Dan McKay, Flickr Creative Commons license

The Outreach Evaluation Resource Center (OERC) was pleased to present our Data Burger: A "Good" Questionnaire Response Rate plus Basic Quantitative Data Analysis webinar for the National Network of Libraries of Medicine, New England Region (NN/LM NER) Health Care Workforce Community of Interest (COI) this week. Thanks to NER for freely sharing the one-hour webcast recording at https://webmeeting.nih.gov/p87ze5jc0do/.

Are you interested in this and other webinar training sessions the OERC has to offer? You can learn more about Data Burger and other topics in our presentation listing at http://nnlm.gov/evaluation/workshops/; then let us or your Regional office know that you'd like us to present a webinar. We welcome the opportunity to work with you!

 

Freebie Friday: American Evaluation Association Online Public Library

As we at the Outreach Evaluation Resource Center (OERC) discover great evaluation resources available at an even better price (free!), in addition to our own freely available resources (http://guides.nnlm.gov/oerc/tools), we will feature some of them here in an occasional 'Freebie Friday' series for you to explore.

To begin our coverage: did you know the American Evaluation Association (AEA) has an online public library of AEA conference presentations, assessment instruments (such as rubrics and logic models), and more at http://comm.eval.org/communities/resources/libraryview/? The default library view displays the most recently updated files and offers keyword searching, but I recommend going straight to the more advanced search functionality at http://comm.eval.org/communities/resources/searchlibrary/.


Brush Up On Your Excel Skills

If you do evaluation, you likely use Excel. So I recommend bookmarking Ann Emery's web page of short videos, which offer excellent instructions and practical tips for analyzing data with Excel. Most of the videos are two to five minutes long. Some cover the basics, such as how to calculate descriptive statistics. If you find pivot tables difficult to grasp, there's a short video describing their different components. For the experienced user, there are videos on more advanced topics, such as how to automate dashboards using Excel and Word.
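If Excel isn't handy, the same two basic tasks the videos cover can be sketched in Python with pandas. This is our own rough analogue, not material from Emery's videos, and the small workshop dataset below is invented for illustration.

# Rough Python/pandas analogue of two tasks the Excel videos cover:
# descriptive statistics and a simple pivot table. Data is invented for illustration.
import pandas as pd

df = pd.DataFrame({
    "region": ["NER", "NER", "MAR", "MAR", "MAR"],
    "format": ["webinar", "in-person", "webinar", "webinar", "in-person"],
    "satisfaction": [4, 5, 3, 4, 5],
})

# Descriptive statistics (count, mean, std, min, quartiles, max)
print(df["satisfaction"].describe())

# Pivot table: average satisfaction by region and session format
pivot = pd.pivot_table(df, index="region", columns="format",
                       values="satisfaction", aggfunc="mean")
print(pivot)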

Emery is an evaluator and data analyst with Innovation Network in Washington, DC. She has a special interest in data visualization and covers that and other topics on her blog, available at her website.