Archive for the ‘Practical Evaluation’ Category

Freebie Friday: American Evaluation Association Online Public Library

As we at the Outreach Evaluation Resource Center (OERC) learn about great evaluation resources available at an even better price (free!), in addition to our own freely available resources (http://guides.nnlm.gov/oerc/tools), we will feature some of them here for you to explore in ‘Freebie Friday,’ an occasional series.

To begin our coverage, did you know the American Evaluation Association (AEA) has an online public library of AEA conference presentations, assessment instruments (such as rubrics and logic models), and more, available at http://comm.eval.org/communities/resources/libraryview/? By default, the library displays the most recently updated files and offers keyword searching, but I recommend going straight to the more advanced search functionality at http://comm.eval.org/communities/resources/searchlibrary/.

SlideDocs help you create readable reports

Thanks to evolving Internet and mobile technology, most of us now prefer our information presented in tapas-sized quantities, preferably in a visually appealing way. This preference is having an impact on our business communication. Before our eyes, bullet points on PowerPoint slides are gradually giving way to engaging visuals. Infographics are replacing text-dense brochures and executive summaries.

Nancy Duarte, who is motivating the world to clean up its PowerPoint presentations, is venturing into a new frontier: written reports. She suggests creating reports in presentation software like PowerPoint, which is more amenable to manipulating layouts. The idea is to present one idea per slide, incorporating visuals, text, and much-appreciated white space.

Duarte calls this type of report a SlideDoc and offers a free tutorial on how to create one (presented in, you guessed it, a PowerPoint document). The site includes downloadable templates to help you create your own SlideDoc report. You can also find Duarte’s Diagrammer here, which I wrote about in a previous post. I should state explicitly at this point that I do not own stock in her company. I just love her approach to presenting ideas, and I think evaluation reports would be read and heard by more people if we all started using her guidelines.

As someone who wants to stop producing coma-inducing reports, I have tried my hand at creating more engaging layouts using Microsoft Word. What I learned is this: Take your anticipated timeline for report-writing and double it. Word does not lend itself to interesting layouts, so inserting pictures, graphs, and call-out quotes is like putting shoes on a toddler. You can do it, but it’s probably going to make you late.

Consequently, my results have been underwhelming.

So I‘m excited to try SlideDocs. I already can see how much more flexible PowerPoint will be for arranging text, images, and graphs. I won’t be wasting time adjusting margins and re-positioning graphs that always seem to wander around in Word documents. I also like that I don’t have to learn a new software application, nor do I have to get my audience to download a special reader to view my reports.

Just remember: Shorter isn’t easier when it comes to reports. These “to-the-point” SlideDocs require that you know both your findings and your audience extremely well so you can communicate the most important information in the most succinct way. SlideDocs will help with the layout, but you still have to do the thinking.

Evaluating your social media activities

For public sector and nonprofit organizations, social media can be a cost-effective way to engage with users and supporters. However, social media is not without its costs, particularly in staff time, so organizations have an interest in assessing the value of their social media activities.

One great resource for social media evaluation is Paine’s book, Measure What Matters. The book contains detailed guidance for evaluating social media use by different types of organizations. A great supplement to Paine’s book is The Nonprofit Social Media Decision Guide by Idealware, which has worksheets to help you plan your social media strategies and implement the recommendations in Measure What Matters.

Below are the key elements of Paine’s evaluation framework:

  • Begin with a solid social media plan that identifies specific goals and objectives. As with any project, you need a plan for social media that links strategies to the organizational mission and includes objectives with targets and key performance indicators. Objectives for social media in the public sector often fall into one of two categories: helping users find the information they need, or building user awareness, engagement, or loyalty. (For inspiration, The Nonprofit Social Media Decision Guide provides a list of potential objectives on page 52.)
  • Define your target audience: Organizations often have many stakeholder groups, so you want to identify the groups most attuned to social media. On page 54 of The Nonprofit Social Media Decision Guide, you’ll find a worksheet for narrowing down your stakeholder audiences to those most receptive to your social media activities.
  • Pick your metrics: Metrics such as views, followers, and measures of engagement with online content will help you monitor reach. Conversions, defined as the actions you want your social media followers to complete, might include becoming members of your organization or actively recommending your organization to colleagues or friends. (A minimal sketch of computing such metrics follows this list.)
  • Identify a source for benchmarks. Benchmarks provide a basis for comparison to assess progress. Organizations often use their own histories as benchmarks, comparing progress against baseline measures. You also may have access to data from a competing or peer organization that you can use for comparison.
  • Pick a measurement tool: Paine’s book describes different measurement methods for evaluating social media, such as content analysis, web analytics, or surveys.
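To make the metrics and benchmarks above concrete, here is a minimal sketch, in Python, of how you might compute engagement and conversion rates and compare them against a baseline. This is my own illustration rather than anything from Paine’s book; the field names and numbers are hypothetical, so adapt them to the platforms and conversions defined in your own social media plan.

```python
# Minimal sketch of basic social media metrics compared against a baseline.
# All numbers and field names are hypothetical; adapt them to your own plan.

from dataclasses import dataclass

@dataclass
class MonthlyStats:
    views: int          # reach: how many people saw your content
    engagements: int    # likes, shares, comments, clicks
    conversions: int    # desired actions completed (e.g., new members)

def engagement_rate(stats: MonthlyStats) -> float:
    """Share of viewers who interacted with the content."""
    return stats.engagements / stats.views if stats.views else 0.0

def conversion_rate(stats: MonthlyStats) -> float:
    """Share of viewers who completed the desired action."""
    return stats.conversions / stats.views if stats.views else 0.0

def change_vs_benchmark(current: float, baseline: float) -> float:
    """Percent change relative to a baseline (e.g., last year's figures)."""
    return (current - baseline) / baseline * 100 if baseline else float("nan")

baseline = MonthlyStats(views=4200, engagements=180, conversions=12)   # last year
this_month = MonthlyStats(views=5100, engagements=260, conversions=21)

print(f"Engagement rate: {engagement_rate(this_month):.1%}")
print(f"Conversion rate: {conversion_rate(this_month):.1%}")
print(f"Conversion rate change vs. baseline: "
      f"{change_vs_benchmark(conversion_rate(this_month), conversion_rate(baseline)):+.0f}%")
```

A spreadsheet works just as well for this arithmetic; the point is to track the same metrics consistently over time so your benchmark comparisons are meaningful.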

For more information, check out the resources used for this blog post:

  • Katie Delahaye Paine. Measure What Matters: Online Tools for Understanding Customers, Social Media, Engagement, and Key Relationships. Hoboken, NJ: John Wiley & Sons, Inc., 2011.
  • Idealware. The Nonprofit Social Media Decision Guide, 2013.

4,000+ free diagram templates

“If people can see what you’re saying, they’ll understand it.”  This quote comes from the Perspectives page of Duarte.com, which now offers a free Diagrammer that provides more than 4,000 free, downloadable diagram templates to help you present your evaluation findings visually.

A handy directory helps you determine the type of diagram you need based on the data relationships you want to portray. You then choose a diagram and download it as a PowerPoint-ready image that is completely customizable: you can add text, change font size and color, and even move or eliminate parts of the diagram. The PowerPoint slide can be inserted into a slide file for an oral presentation or saved as an image and inserted into a written evaluation report.

I tried my hand at creating a diagram to show how NN/LM train-the-trainer programs encourage the spread of health information resource use. This is what I came up with:

[Image: diagram created with Duarte’s Diagrammer]

I learned about this tool from an AEA365 blog post by Sheila Robinson. As an example, she included an infographic she designed using the Diagrammer to illustrate American Evaluation Association learning opportunities.

Duarte.com is the company of Nancy Duarte, a master presentation designer who has become a favorite among evaluators on a mission to get their evaluation reports understood and used. If you are interested in punching up the impact of your presentations, you also might want to check out her book “Resonate” (Wiley & Sons, 2010) or watch her popular TED talk, The Secret Structure of Great Talks. Many of her principles can be applied to oral or written presentations.

New Journal: Systematic Reviews

Librarians’ expert searching skills provide some great opportunities for collaboration with researchers. BioMed Central’s new open access journal Systematic Reviews centers on a specialized type of expert searching that librarians can provide for their communities. More than a source of protocols and a record of others’ work, this journal offers those of us in academia a venue for publishing articles that share what we have done with our colleagues.

Here’s more information from the Aims and Scope:

Systematic Reviews encompasses all aspects of the design, conduct and reporting of systematic reviews. The journal aims to publish high quality systematic review products including systematic review protocols, systematic reviews related to a very broad definition of health, rapid reviews, updates of already completed systematic reviews, and methods research related to the science of systematic reviews, such as decision modeling. The journal also aims to ensure that the results of all well-conducted systematic reviews are published, regardless of their outcome.

It is a long-term goal of the journal to ensure all systematic reviews are prospectively registered in an appropriate database, such as PROSPERO, as these resources for registration become available and are endorsed by the scientific community.

Article types include:

  • Research Articles
  • Commentaries
  • Letters
  • Methodologies
  • Protocols
  • Review Updates

The editors-in-chief comprise an international group hailing from the University of Ottawa; the RAND Corporation and UCLA; and the University of York.

Take a look at this journal! It could be a source of inspiration for any librarian whose emphasis is on expert searching.

Community Health Strategies: Strengthening Reach and Impact

Last month the Group Health Research Institute (GHRI) presented a webcast entitled A Healthy Dose: Strengthening Reach and Impact of Community Strategies, drawing on its work with stakeholders who have deep (25-30 years of) experience with community health initiatives. GHRI has noticed a trend over the past ten years for reports on community health projects to go beyond the basics of how many community members were reached and what the initial (short-term) impacts were.

GHRI has also studied data from the Kaiser Foundation and other sources and found that the critical factor in the long-term impact of community health projects is community involvement. Using the Reach, Effectiveness, Adoption, Implementation, Maintenance (RE-AIM) model, which helps translate public health research into practice, the presenters classified project activities from low to high reach (fewer or greater numbers of community members involved) and low to high strength (lesser or greater impact on the health of the community).

Factors that determine impact strength include whether a health project event happens once or is a consistent part of the community’s environment, the degree to which healthy options are the only choices available, and whether the project is supported with promotion and education. Examples discussed during the webcast include building sidewalks (high reach, low strength) and establishing physical education classes at local schools (high reach, high strength).
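To illustrate the underlying arithmetic, here is a minimal sketch that treats a project’s “dose” as the product of its reach and its strength. The numbers and per-person strength values are hypothetical, and this is my own simplified illustration rather than GHRI’s published method; see the article linked below for the real framework.

```python
# Minimal sketch of a "population dose" style calculation: dose = reach * strength.
# Illustration only, with hypothetical numbers; not GHRI's published method.

def population_dose(people_reached: int, target_population: int,
                    strength: float) -> float:
    """Reach (share of the target population touched) times strength
    (estimated behavior-change effect per person reached)."""
    reach = people_reached / target_population
    return reach * strength

# Hypothetical community of 10,000 residents
sidewalks = population_dose(people_reached=8000, target_population=10_000,
                            strength=0.02)   # high reach, low strength
pe_classes = population_dose(people_reached=6000, target_population=10_000,
                             strength=0.15)  # high reach, high strength

print(f"Sidewalks dose:  {sidewalks:.3f}")
print(f"PE classes dose: {pe_classes:.3f}")
```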

The webcast recording and slides are available here, and their published findings in the American Journal of Evaluation are freely available at Using the Concept of “Population Dose” in Planning and Evaluating Community-Level Obesity Prevention Initiatives.

Maximize your response rate

Did you know that the American Medical Association has a specific recommendation for its authors about questionnaire response rate? Here it is, from the JAMA Instructions for Authors:

Survey Research
Manuscripts reporting survey data, such as studies involving patients, clinicians, the public, or others, should report data collected as recently as possible, ideally within the past 2 years. Survey studies should have sufficient response rates (generally at least 60%) and appropriate characterization of nonresponders to ensure that nonresponse bias does not threaten the validity of the findings. For most surveys, such as those conducted by telephone, personal interviews (eg, drawn from a sample of households), mail, e-mail, or via the web, authors are encouraged to report the survey outcome rates using standard definitions and metrics, such as those proposed by the American Association for Public Opinion Research.

Meanwhile, response rates to questionnaires have been declining over the past 20 years, as reported by the Pew Research Center in “The Problem of Declining Response Rates.” Why should we care about the AMA’s recommendation regarding questionnaire response rates? Many of us send questionnaires to health care professionals who, like physicians, are very busy and might not pay attention to our efforts to learn about them. Even JAMA authors such as Johnson and Wislar have pointed out that “60% is only a ‘rule of thumb’ that masks a more complex issue” (Johnson TP, Wislar JS. “Response Rates and Nonresponse Errors in Surveys.” JAMA, May 2, 2012, Vol 307, No. 17, p. 1805). These authors recommend that we evaluate nonresponse bias in order to characterize differences between those who respond and those who don’t. Standard techniques include:

  • Conduct a follow-up survey with nonrespondents
  • Use data about your sampling frame and study population to compare respondents to nonrespondents
  • Compare the sample with other data sources
  • Compare early and late respondents (see the sketch after this list)
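As an illustration of that last technique, here is a minimal sketch of a “wave analysis” that compares early and late respondents on a key measure, on the common assumption that late respondents resemble nonrespondents. The data and the cutoff below are made up for illustration.

```python
# Minimal "wave analysis" sketch: compare early and late respondents on a key
# measure, assuming late respondents resemble nonrespondents. Made-up data.

from statistics import mean

# Each record: (days until the questionnaire was returned, satisfaction score 1-5)
responses = [
    (2, 4), (3, 5), (4, 4), (5, 3), (6, 4), (9, 4),
    (14, 3), (18, 3), (21, 2), (25, 3), (30, 2), (34, 3),
]

cutoff_days = 10  # returned within 10 days = "early"; match to your reminder schedule
early = [score for days, score in responses if days <= cutoff_days]
late = [score for days, score in responses if days > cutoff_days]

print(f"Early respondents (n={len(early)}): mean satisfaction {mean(early):.2f}")
print(f"Late respondents  (n={len(late)}): mean satisfaction {mean(late):.2f}")

# A large gap between the groups suggests nonresponse bias may be a concern;
# a formal test and comparison of respondent demographics against the sampling
# frame would strengthen the assessment.
```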

Johnson and Wislar’s article is not open access, unfortunately, but you can find more suggestions about increasing response rates to your questionnaires in two recent AEA365 blog posts that are open access:

Find more useful advice (e.g., make questionnaires short, personalize your mailings, send full reminder packs to nonrespondents) in this open access article: Sahlqvist S, et al., “Effect of questionnaire length, personalisation and reminder type on response rate to a complex postal survey: randomised controlled trial.” BMC Medical Research Methodology 2011, 11:62.

Free Images for Your Evaluation Reports

The current trend in evaluation reporting is toward fewer words and more images. A number of companies offer high-quality, royalty-free photographs at minimal cost. (Stockfresh, for example, charges as little as $1 per image.) However, no cost is even better than low cost. Freelancers Union, a nonprofit organization dedicated to assisting freelance workers, recently published a list of the best websites for no-cost images. If you are looking for free images for your presentations or reports, check out their article:

https://www.freelancersunion.org/blog/2014/02/07/best-free-image-resources-online/

(The article also describes the difference between public domain, royalty-free and Creative Commons-licensed images.)

Evaluation Tips: Recipe of Evaluation Techniques

Last week I attended a webinar presentation by Stanley Capela entitled Recipe of Evaluation Techniques for the Real World, one of the American Evaluation Association’s (AEA) ongoing 20-minute Coffee Break webinars. The webinars, offered Thursdays at 2 p.m. Eastern time, often cover tools and tips similar to those in the Tip a Day blog but add audience questions and answers and a chance to network with the presenters.

Capela’s recipe focused primarily on internal evaluation in nonprofit or government settings where people want realistic answers from your assessment efforts. His tips include:

  • Value People’s Time – all time is valuable, regardless of who you are working with, and clear communication on the intent of the evaluation helps to make the best use of everyone’s time.
  • Ethical Conduct – working within the parameters of organizational and/or professional association codes of conduct, along with established support from upper-level administration, will help minimize the potential for ethical dilemmas.
  • Know Your Enemies – be aware of those who are resistant to program evaluation and may try to undermine these efforts, and also know that you as an evaluator may be perceived as an enemy by others. Again, clear communication helps!
  • Culture of Accountability – take the time to know the story of those you are working with – where are they coming from? What is their history with previous assessments? Were their needs met, or were there issues that had negative effects on relationships and outcomes?
  • Do Something – avoid cycles of conducting reviews and identifying deficiencies whose only outcome is the development of yet another correction plan. Also note that program evaluation does not solve management problems.
  • A Picture is Worth 1,000 Words – find ways to integrate charts that direct the reader to the most important information clearly and concisely.
  • Let Go of Your Ego – accept that the people running the program itself will most likely ‘get the credit’, and that your measure of success is doing your job to the best of your ability, knowing you made a difference.
  • Give Back – develop a network of trusted colleagues, such as through personal and organizational connections on LinkedIn and other platforms; share ideas; and ask questions, since others have probably encountered a similar situation or can connect you with those who have.

We hope you have found the freely available information in our updated Outreach Evaluation Resource Center (OERC) Evaluation Guides helpful as an additional source of ideas, strategies, and worksheets to include in your evaluation recipe collection!

Webinars and Workshops about Evaluating Outreach

The National Network of Libraries of Medicine Outreach Evaluation Resource Center (OERC) offers a range of webinars and workshops upon request by network members and coordinators from the various regions. Take a look at the list and see if one of the options appeals to you. To request a workshop or webinar, contact Susan Barnes.

The workshops were designed as face-to-face learning opportunities, but we can tailor them to meet distance learning needs by distilling them into briefer webinars or offering them as a series of one-hour webinars.

Don’t see what you’re looking for on this list? Then please contact Susan and let her know!

We’re looking forward to hearing from you.