
Archive for the ‘OERC’ Category

New SurveyMonkey mobile app

Monday, April 14th, 2014

Attention iPad and iPhone users: SurveyMonkey recently launched a mobile app so you can create, send, and monitor your surveys from your phone or tablet. The app is free, although you need a SurveyMonkey account to use it.

With the SurveyMonkey app, you no longer have to rely on your computer to design and manage a survey. The app also allows you to conveniently view your data from any location with Internet access. I think the most notable benefit is that the analytic reports are optimized for mobile devices and are easy to read on small screens.

I have been asked how this app compares to QuickTapSurvey (see my previous blog entry). In my opinion, the app does not make SurveyMonkey comparable to QuickTapSurvey, which is designed specifically to collect onsite visitor feedback in informal settings such as exhibits and museums. SurveyMonkey, by comparison, is designed to collect data through email, websites, or social media. Both apps work best in their respective settings. I think you could adapt SurveyMonkey to collect data at face-to-face events (if there is onsite Internet access), but it probably won’t work as smoothly as QuickTapSurvey.

For more information about the SurveyMonkey mobile app, visit the SurveyMonkey website.

Evaluation Tips: Recipe of Evaluation Techniques

Tuesday, March 18th, 2014

Last week OERC staff attended a webinar presentation by Stanley Capela entitled Recipe of Evaluation Techniques for the Real World. This is one of the American Evaluation Association’s (AEA) ongoing 20-minute Coffee Break webinars. The webinars, offered Thursdays at 2 pm Eastern time, often cover tools and tips similar to those in the Tip a Day blog, but they also allow for audience questions and answers and networking with the presenters.

Capela’s recipe focused primarily on internal evaluation in non-profit or government settings, where people are seeking realistic answers in response to your assessment efforts. His tips include:

  • Value People’s Time – all time is valuable, regardless of whom you are working with, and clear communication about the intent of the evaluation helps make the best use of everyone’s time.
  • Ethical Conduct – working within the parameters of organizational and/or professional association codes of conduct, with the established support of upper-level administration, will help minimize the potential for ethical dilemmas.
  • Know Your Enemies – be aware of those who are resistant to program evaluation and may try to undermine it, and recognize that you, as the evaluator, may be perceived as an enemy by others. Again, clear communication helps!
  • Culture of Accountability – take the time to know the story of those you are working with. Where are they coming from? What is their history with previous assessments? Were their needs met, or were there issues that had negative effects on relationships and outcomes?
  • Do Something – avoid cycles in which reviews are conducted and deficiencies are identified but the only outcome is a correction plan. Also note that program evaluation does not solve management problems.
  • A Picture is Worth 1,000 Words – find ways to integrate charts that direct the reader to the most important information clearly and concisely.
  • Let Go of Your Ego – work from a mindset that accepts that the people running the program will most likely ‘get the credit’; your measure of success is doing your job to the best of your ability and knowing you made a difference.
  • Give Back – develop a network of trusted colleagues (for example, through personal and organizational connections on LinkedIn and other platforms), share ideas, and ask questions; others have probably encountered a similar situation or can connect you with those who have.

We hope the information that we at the Outreach Evaluation Resource Center (OERC) make freely available in our updated Evaluation Guides has been a helpful additional source of ideas, strategies, and worksheets for your evaluation recipe collection!

Free Images for Your Evaluation Reports

Tuesday, March 18th, 2014

The current trend in evaluation reporting is toward fewer words and more images. A number of companies offer high-quality, royalty-free photographs at minimal cost. (Stockfresh, for example, charges as little as $1 per image.) However, no cost is even better than low cost. Freelancers Union, a nonprofit organization dedicated to assisting freelance workers, recently published a list of the best websites for no-cost images. If you are looking for free images for your presentations or reports, check out their article:

https://www.freelancersunion.org/blog/2014/02/07/best-free-image-resources-online/

(The article also describes the difference between public domain, royalty-free, and Creative Commons-licensed images.)

“Evidence” — what does that mean?

Monday, January 27th, 2014


In our health information outreach work we are expected to provide evidence of the value of our work, but there are varying definitions of the word “evidence.” The classical evidence-based medicine approach (featuring results from randomized controlled clinical trials) is a model that is not always relevant in our work. At the 2013 EBLIP7 meeting in Saskatoon, Saskatchewan, Canada, Denise Koufogiannakis presented a keynote address that is now available as an open-access article on the web:

“What We Talk About When We Talk About Evidence,” Evidence Based Library and Information Practice 8.4 (2013)

This article looks at various interpretations of what it means to provide “evidence,” such as:

  • theoretical (ideas, concepts, and models to explain how and why something works),
  • empirical (measuring outcomes and effectiveness via empirical research), and
  • experiential (people’s experiences with an intervention).

Koufogiannakis points out that academic librarians’ decisions are usually made in groups of people working together, and she proposes a new model for evidence-based library and information practice:

1) Articulate – come to an understanding of the problem. Set boundaries and clearly articulate a problem that requires a decision.

2) Assemble – assemble evidence from multiple sources that are most appropriate to the problem at hand. Gather evidence from appropriate sources.

3) Assess – place the evidence against all components of the wider overarching problem. Assess the evidence for its quantity and quality. Evaluate and weigh evidence sources. Determine what the evidence says as a whole.

4) Agree – determine the best way forward and if working with a group, try to achieve consensus based on the evidence and organizational goals. Determine a course of action and begin implementation of the decision.

5) Adapt – revisit goals and needs. Reflect on the success of the implementation. Evaluate the decision and how it has worked in practice. Reflect on your role and actions. Discuss the situation with others and determine any changes required.

Koufogiannakis concludes by reminding us that “Ultimately, evidence, in its many forms, helps us find answers. However, we can’t just accept evidence at face value. We need to better understand evidence – otherwise we don’t really know what ‘proof’ the various pieces of evidence provide.”
