A Guide for Conducting Community Conversations

Planning focus groups? You might want to check out the Libraries Transforming Communities Community Conversation Workbook by the American Library Association (ALA).

This workbook is a resource developed for the ALA’s Libraries Transforming Communities (LTC) initiative, which provides librarians with training and resources to strengthen their roles as community leaders and change agents. The initiative’s goal is to help librarians raise the visibility and demonstrate the value of their libraries within their communities, and it promotes public discussion as a key community engagement strategy.

To that end, ALA developed the Libraries Transforming Communities Community Conversation Workbook. It offers invaluable guidance to anyone who wants to conduct discussion groups for community assessment purposes, with practical advice on every aspect of convening group discussions: tips on participant recruitment, a list of discussion questions, facilitator guidelines, note-taking tools, and templates for organizing key findings.

Demonstrating value is of considerable interest to many libraries and organizations these days, and such organizations may want to explore other articles and resources related to the LTC initiative. You can find more information about the initiative at the LTC web page and see how libraries are implementing LTC activities at the initiative’s digital portal.

Freebie Friday: Shaping Outcomes Course

Do you want to learn more about outcomes-based planning and evaluation (OBPE) for your outreach project but there’s no money in the training budget to do so?

Shaping Outcomes: Making a Difference in Libraries and Museums (shapingoutcomes.org) is available as a free online course that learners can start at any time and work through at their own pace. While the course’s examples are specific to libraries and museums, its core concepts (understanding target audience needs, clarifying desired results, developing logic models, and evaluating outcomes) are applicable to most other organizations’ outreach projects as well.

The class is organized into five modules (Overview, Plan, Build, Evaluate, Report), with a helpful Glossary for learning OBPE terminology and a Logic Model template. Shaping Outcomes was developed by the Institute of Museum and Library Services (IMLS) and Indiana University-Purdue University Indianapolis (IUPUI) and was previously offered as an instructor-led class.
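If you have never built a logic model, the basic structure is a chain of columns running from resources to results. Below is a minimal sketch of that generic structure, written in Python purely for illustration; the project and entries are hypothetical, and the column labels follow common outcomes-based planning usage rather than the Shaping Outcomes template itself.

```python
# A hypothetical outreach project expressed as generic logic-model columns.
# Column names follow common OBPE usage; they are not copied from the
# Shaping Outcomes course materials.
logic_model = {
    "inputs": ["2 librarians (0.2 FTE each)", "$1,500 travel budget"],
    "activities": ["Monthly MedlinePlus classes at the senior center"],
    "outputs": ["12 classes held", "120 attendees"],
    "short_term_outcomes": ["80% of attendees can find a health topic on MedlinePlus"],
    "long_term_outcomes": ["Attendees report using MedlinePlus before doctor visits"],
}

# Print the model as a simple outline, one column at a time.
for column, entries in logic_model.items():
    print(column.replace("_", " ").title())
    for entry in entries:
        print(f"  - {entry}")
```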

More information specific to developing logic models in health information outreach programs is available in Booklet Two: Planning Outcomes-Based Outreach Projects, part of the resources on our Outreach Evaluation Resource Center (OERC) Evaluation Guides page at http://nnlm.gov/evaluation/guides.html.

 

Freebie Friday: Visualization Literacy

With the growing number of technology tools available for data reporting and visualization (be sure to check out some of our Outreach Evaluation Resource Center Reporting and Visualizing tools at http://guides.nnlm.gov/oerc/tools), it can be challenging to know how best to use them to communicate the intended meaning of your data clearly. Visualization literacy, and the broader theme of visual literacy, is often left out of the instructions that guide people through creating their own visualization designs.

A recent entry by Andrew Kirk on the blog of Seeing Data, a United Kingdom research project studying how people understand big data visualizations shown in the media, offers a great review of 8 Articles Discussing Visual and Visualization Literacy. All are freely available and well worth a read for a better understanding of both visual and visualization literacy. The featured articles range from the importance of Visual Literacy in an Age of Data to How to Be an Educated Consumer of Infographics, and Seeing Data asks that you share additional articles via blog comments or its Twitter account @SeeingData.

Telling Training’s Story: The Success Case Method

“On the average, it is true that most training does not work very well. But some programs work very well with some of the people, and this represents their great potential for being leveraged for even greater results.” Robert O. Brinkerhoff, “Telling Training’s Story.”

Most evaluation methods for training programs reduce data to averages: the average number of things participants learned; the average number of techniques applied on the job; the average number of times a skill was used post-training. Unfortunately, this approach can underestimate the true value of training for the organizations investing in these programs.
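To see why, consider a small invented example (the numbers below are made up for illustration, not drawn from Brinkerhoff’s book): when most participants get little or no result and a few get a great deal, the average obscures where the value actually sits.

```python
# Hypothetical post-training results, e.g., dollars of value attributed to
# each participant's use of the training. Invented numbers for illustration.
results = [0, 0, 0, 0, 0, 0, 0, 50, 200, 5000]

average = sum(results) / len(results)   # 525.0 -- looks modest
top_case = max(results)                 # 5000 -- one standout success case
share_from_successes = sum(r for r in results if r >= 200) / sum(results)

print(f"Average value per participant: {average:.0f}")
print(f"Largest single success case:   {top_case}")
print(f"Share of total value from the few success cases: {share_from_successes:.0%}")
```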

In Telling Training’s Story, Brinkerhoff writes that, in reality, the majority of participants gain little from training programs.  They either use some information but get no results, or they simply give up after a few attempts. Sometimes poor instructional design is to blame. More often, low success is caused by contextual variables, such as lack of supervisory support, no opportunity to try out the learning, or program timing. In fact, good instructional design often cannot compensate for these environmental crosscurrents.

Yet Brinkerhoff argues that most training programs can boast a few success cases. There are usually a handful of participants (sometimes more) who apply their new knowledge or skill to produce valuable results for their organizations. Sometimes the value of their contributions alone justifies the program’s cost; in other cases, boosting the percentage of success cases by just 10% would make the investment worthwhile to the organization.

To truly evaluate a training program, you need to identify any positive outcomes that occur, even if they are traced to a small number of participants, and assess the value of those results. You also need to determine what instructional and contextual factors influence successful use of training information. Then, organizations can make informed decisions about continuing to invest in training.

Brinkerhoff’s Success Case Method (SCM) was designed for in-depth analysis of training programs and their outcomes. The method focuses on high and low success cases. High success cases refer to incidents where participants applied training program information and attained positive results for their organizations. Low success cases are situations in which participants demonstrated no application of the training information.

Investigation of the high-success cases identifies the best possible outcomes that occur when employees apply information gained from the training program. By adding low-success cases into the mix, the method also leads to a thorough understanding of the key factors, both in training design and in the organizational context, that influence participants’ use of their new capabilities.
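In practice, SCM studies typically begin with a brief screening survey of all participants and then follow up with in-depth interviews of the extreme cases. Here is a minimal sketch of that selection step; the participants, field names, and rating scale are hypothetical.

```python
# Hypothetical screening-survey records: each participant rates how much
# result they got from applying the training (0 = none, 5 = substantial).
screening = [
    {"name": "Participant A", "result_rating": 5},
    {"name": "Participant B", "result_rating": 0},
    {"name": "Participant C", "result_rating": 4},
    {"name": "Participant D", "result_rating": 1},
    {"name": "Participant E", "result_rating": 0},
]

# Select the extremes: high-success cases are interviewed about outcomes and
# value; low-success cases are interviewed about barriers and context.
high_success = [p for p in screening if p["result_rating"] >= 4]
low_success = [p for p in screening if p["result_rating"] <= 1]

print("High-success interview candidates:", [p["name"] for p in high_success])
print("Low-success interview candidates: ", [p["name"] for p in low_success])
```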

Brinkerhoff’s book Telling Training’s Story provides step-by-step guidance for conducting SCM studies. Steps include working with stakeholders to define success and value; developing a program impact model; using rigorous sampling methods; and testing rival hypotheses for your findings. By following these steps, you can present a case study with compelling evidence of your program’s value. If you find your program is ineffective, the process will illustrate the factors working against its success.

So, the next time you want to evaluate a training program, consider going beyond “average.”  Check out the Success Case Method.

Source: Brinkerhoff RO. Telling training’s story. San Francisco, CA: Berrett-Koehler Publishers; 2006.

 

Freebie Friday: OERC Data Burger Webinar

Veggie burger by Dan McKay, Flickr Creative Commons license

The Outreach Evaluation Resource Center (OERC) was pleased to present our Data Burger: A “Good” Questionnaire Response Rate plus Basic Quantitative Data Analysis webinar for the National Network of Libraries of Medicine, New England Region (NN/LM NER) Health Care Workforce Community of Interest (COI) this week. Thanks to NER for freely sharing the one-hour webcast recording at https://webmeeting.nih.gov/p87ze5jc0do/.

Are you interested in this and other webinar training sessions the OERC has to offer? You can learn more about Data Burger and other presentations from our listing at http://nnlm.gov/evaluation/workshops/, then let us or your Regional office know that you’d like us to present a webinar. We welcome the opportunity to work with you!

 

Freebie Friday: American Evaluation Association Online Public Library

As we at the Outreach Evaluation Resource Center (OERC) learn about more great evaluation resources available at an even better price (free!), in addition to our own freely available resources (http://guides.nnlm.gov/oerc/tools), we will feature some of them here in an occasional ‘Freebie Friday’ series.

To begin our coverage, did you know the American Evaluation Association (AEA) has an online public library of AEA conference presentations, assessment instruments (such as rubrics and logic models), and more at http://comm.eval.org/communities/resources/libraryview/? The default view displays the most recently updated files and offers keyword searching, but I recommend going straight to the more advanced search functionality at http://comm.eval.org/communities/resources/searchlibrary/.


Brush Up On Your Excel Skills

If you do evaluation, you likely use Excel.  So I recommend bookmarking this web page of short videos with excellent instructions and practical tips for analyzing data with Excel.  Ann Emery produced these videos, most of which are 2-5 minutes long. Some cover the basics, such as how to calculate descriptive statistics. If you find pivot tables difficult to grasp, there’s a short video describing the different components.  For the experienced user, there are videos about more advanced topics such as how to automate dashboards using Excel and Word.
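The videos stick to Excel; if you ever need the same basic descriptive statistics or a simple pivot table outside of Excel, a rough Python/pandas equivalent might look like the sketch below. The data and column names are made up for illustration.

```python
import pandas as pd

# Made-up workshop attendance data for illustration.
df = pd.DataFrame({
    "region": ["North", "North", "South", "South", "South"],
    "attendees": [12, 20, 8, 15, 30],
    "satisfaction": [4.1, 4.5, 3.8, 4.0, 4.7],
})

# Descriptive statistics (count, mean, std, min, quartiles, max).
print(df[["attendees", "satisfaction"]].describe())

# A basic pivot table: average attendees and satisfaction per region.
print(df.pivot_table(index="region", values=["attendees", "satisfaction"], aggfunc="mean"))
```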

Emery is an evaluator and data analyst with Innovation Network in Washington, DC.  She has a special interest in data visualization and covers that and other topics in her blog, also available at her website.

SlideDocs help you create readable reports

Thanks to evolving Internet and mobile technology, most of us now prefer our information presented in tapas-sized quantities, preferably in a visually appealing way. This preference is having an impact on our business communication. Before our eyes, bullet points on PowerPoint slides are gradually giving way to engaging visuals. Infographics are replacing text-dense brochures and executive summaries.

Nancy Duarte, who is motivating the world to clean up its PowerPoint presentations, is venturing into a new frontier: written reports. She suggests creating reports with presentation software like PowerPoint, which is more amenable to manipulating layouts than word processors are. The idea is to present one idea per slide, incorporating visuals, text, and much-appreciated white space.

Duarte calls this type of report a SlideDoc and offers a free tutorial on how to create one (presented in, you guessed it, a PowerPoint document). The site includes downloadable templates to help you create your own SlideDoc report. You also can find Duarte’s Diagrammer here, which I wrote about in a previous post.  I believe, at this point, that I should state explicitly that I do not own stock in her company. I just love her approach to presenting ideas and I think evaluation reports would be read and heard by more people if we all started using her guidelines.

As someone who wants to stop producing coma-inducing reports, I have tried my hand at creating more engaging layouts using Microsoft Word. What I learned is this: Take your anticipated timeline for report-writing and double it. Word does not lend itself to interesting layouts, so inserting pictures, graphs, and call-out quotes is like putting shoes on a toddler. You can do it, but it’s probably going to make you late.

Consequently, my results have been underwhelming.

So I’m excited to try SlideDocs. I already can see how much more flexible PowerPoint will be for arranging text, images, and graphs. I won’t be wasting time adjusting margins and re-positioning graphs that always seem to wander around in Word documents. I also like that I don’t have to learn a new software application, nor do I have to get my audience to download a special reader to view my reports.

Just remember: Shorter isn’t easier when it comes to reports. These “to-the-point” SlideDocs require that you know both your findings and your audience extremely well so you can communicate the most important information in the most succinct way. SlideDocs will help with the layout, but you still have to do the thinking.

How to use Hashtags to Increase Social Media Presence

like 2 by misspixels on Flickr, Creative Commons license

 

Following our coverage of evaluating social media activities last week, have you determined that social media channels are appropriate for your organization?

If so, you will quickly encounter hashtags, which are user-controlled categories prefaced with a pound sign. Hashtags were once limited to Twitter but are now used on most social media sites, including Facebook and Google+. Conversational, concise, and consistent use of up to two hashtags per social media message can roughly double user engagement compared to messages without them. For more statistics on Twitter and user engagement, Buffer’s coverage at http://blog.bufferapp.com/10-new-twitter-stats-twitter-statistics-to-help-you-reach-your-followers is an excellent overview.

How can you show that hashtags increase user engagement with your organization’s messages? Look for performance indicators such as reposts (the use of ‘Share’ on Facebook or retweets on Twitter), replies (comments under the message from Facebook followers, replies to the tweet from Twitter users), the number of clicks on any links included in your message (ideally to your organization’s website and resources), and hashtag usage frequency.
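If you record those counts for each message, a simple with-versus-without comparison can show whether hashtags are earning their keep. The sketch below uses hypothetical counts and field names.

```python
# Hypothetical per-message counts pulled from your social media statistics.
messages = [
    {"has_hashtag": True,  "reposts": 6, "replies": 3, "link_clicks": 14},
    {"has_hashtag": True,  "reposts": 2, "replies": 1, "link_clicks": 9},
    {"has_hashtag": False, "reposts": 1, "replies": 0, "link_clicks": 4},
    {"has_hashtag": False, "reposts": 2, "replies": 1, "link_clicks": 5},
]

def engagement(msg):
    """Total interactions for one message: reposts + replies + link clicks."""
    return msg["reposts"] + msg["replies"] + msg["link_clicks"]

with_tags = [engagement(m) for m in messages if m["has_hashtag"]]
without_tags = [engagement(m) for m in messages if not m["has_hashtag"]]

print("Avg engagement with hashtags:   ", sum(with_tags) / len(with_tags))
print("Avg engagement without hashtags:", sum(without_tags) / len(without_tags))
```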

For tips on how to track these performance indicators, plus additional statistics on hashtag creation and use, check out the helpful infographic at http://www.digitalinformationworld.com/2014/04/using-hashtags-to-boost-your-social-presence-infographic.html.

Evaluating your social media activities

For public sector and nonprofit organizations, social media can be a cost-effective way to engage with users and supporters. However, social media is not without its cost, particularly in terms of staff time.  So organizations have an interest in assessing the value of their social media activities.

One great resource for social media evaluation is Paine’s book, Measure What Matters.  The book contains detailed guidance for evaluating social media use by different types of organizations. A great supplement to Paine’s book is The Nonprofit Social Media Decision Guide by Idealware, which has worksheets that will help you plan your social media strategies and implement recommendations in Measure What Matters.

Below are the key elements of Paine’s evaluation framework:

  • Begin with a solid social media plan that identifies specific goals and objectives. As with any project, you need a plan for social media that links strategies to the organizational mission and includes objectives with targets and key performance indicators. Objectives for social media in the public sector often fall into one of two categories: helping users find the information they need, or building user awareness, engagement, or loyalty. (To inspire you, The Nonprofit Social Media Decision Guide provides a list of potential objectives on page 52.)
  • Define your target audience: Organizations often have many stakeholder groups, so you want to identify the groups most attuned to social media. On page 54 of The Nonprofit Social Media Decision Guide, you’ll find a worksheet for narrowing down your stakeholder audiences to those most receptive to your social media activities.
  • Pick your metrics: Metrics such as views, followers, and measures of engagement with online content will help you monitor reach. Conversions, defined as the actions you want your social media followers to complete, might include becoming members of your organization or actively recommending your organization to colleagues or friends.
  • Identify a source for benchmarks: Benchmarks provide a basis for comparison to assess progress. Organizations often use their own histories as benchmarks, comparing progress against baseline measures. You also may have access to data from a competing or peer organization that you can use for comparison. (A brief worked example tying metrics, conversions, and benchmarks together follows this list.)
  • Pick a measurement tool: Paine’s book describes different measurement methods for evaluating social media, such as content analysis, web analytics, or surveys.
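
To make the metrics, conversion, and benchmark ideas above concrete, here is the small worked example referenced in the list. All figures and targets below are invented for illustration.

```python
# Hypothetical monthly figures for one social media channel.
followers = 1200
post_views = 8500
engagements = 430   # likes, shares, comments
conversions = 25    # e.g., new members or event sign-ups attributed to the channel

engagement_rate = engagements / post_views
conversion_rate = conversions / engagements

# Benchmark: last year's baseline for the same channel (also invented).
baseline_engagement_rate = 0.035

print(f"Engagement rate: {engagement_rate:.1%} (baseline {baseline_engagement_rate:.1%})")
print(f"Conversion rate: {conversion_rate:.1%}")
print("Above baseline" if engagement_rate > baseline_engagement_rate else "Below baseline")
```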

For more information, check out the resources used for this blog post:

  • Katie Delahaye Paine. Measure What Matters: Online Tools for Understanding Customers, Social Media, Engagement, and Key Relationships. Hoboken, NJ: John Wiley & Sons, Inc.; 2011.
  • Idealware. The Nonprofit Social Media Decision Guide. 2013.