Posted on May 1st, 2015 by Cindy Olney | Filed under News
Guerrilla Assessment Methods
Recently, the Association of Research Libraries (ARL) email list hosted an enthusiastic discussion about guerrilla assessment techniques: low-cost, unconventional data collection methods that gather timely responses from library users. I thought I would share some of the favored methods from this discussion.
Graffiti walls seemed to be the most popular guerrilla method in this group. Users were invited to write responses to a single question on whiteboards or flip charts, or to write comments on sticky notes and post them to bulletin boards. Questions might concern, for example, library space use or new furniture choices, or users might suggest new resources. Pictured below is a colorful example of a graffiti wall from Clemson University’s Cooper Library, posted by Peggy Tyler. Flip charts were also featured in this space use assessment conducted at the University of Pittsburgh University Library System (see the FlipChart Analysis and the Flipchart Survey—Our Response presentations).
Short questionnaires to collect on-the-spot responses from users were also mentioned frequently. Some libraries placed laptops in conspicuous parts of the library to capture responses. Others took advantage of tablets, such as this project conducted at Georgia State University. Sometimes the low-tech approach worked best, featuring paper-and-pencil questionnaires or note cards for written comments.
Photographs were also used creatively to capture helpful assessment information. University of Pittsburgh University Library System staff used photographs to examine use of study space. With so many library users carrying mobile phones with cameras, there is a lot of potential for inviting users to incorporate photographs into their responses to assessment questions. In the ARL-assess discussion, Holt Zaugg at Brigham Young’s Harold B. Lee Library described a study in which student volunteers took pictures of places on campus that they thought fit a certain characteristic (e.g., too noisy, a busy place to study). The staff then conducted follow-up interviews with the student volunteers for added insight into their photographs.
Guerrilla methods may look easy, but they require careful planning and thought. You’ll need well-crafted, focused questions. You will also need an effective promotional strategy to attract user participation. And you’ll want a well-executed schedule for collecting and entering data so that key information is not lost. Yet these guerrilla methods are worth the challenge, because they engage both participants and staff in the assessment process and offer a refreshing alternative to conventional methods.
You’ve been collecting great data for your library, and now you have to figure out how to use it to convince someone of something, such as how great your library is. Part of the trick is turning that data into a presentation your stakeholders can understand, especially if you are not there to explain it. Infographics are images that present data clearly and get your message across.
It turns out it doesn’t have to be difficult or expensive to create your own infographics. Last week I went to a hands-on workshop at the Texas Library Association called “Infographics: One Picture is Worth 1,000 Data Points,” taught by Leslie Barrett, Education Specialist from the Education Service Center Region 13 in Austin, TX. Using this website as her interactive “handout” http://r13hybrarian.weebly.com/infographicstla.html, Leslie walked us through the process of creating an infographic (and as a byproduct of this great class, she also demonstrated a number of free instructional resources, such as Weebly, Padlet, and Thinglink).
Starting at the top of the page, click on anything with a hyperlink. You will find a video as well as other “infographics of infographics” which demonstrate how and why infographics can be used. There are also a variety of examples to evaluate as part of the learning process.
Finally, there is information on the design process and resources that make infographics fairly easy to create. These resources, such as Piktochart and Easelly, have free subscriptions for simple graphics and experimenting.
Leslie Barrett allowed us to share this website with you, so feel free to get started making your own infographics!
Image credit: Open Access Week at University of Cape Town by Shihaam Donnelly / CC BY-SA 3.0
Posted on April 17th, 2015 by Cindy Olney | Filed under News
Keep It Simple with Micro-Surveys
A hot trend in marketing research is the micro-survey. Also known as bite-sized surveys, these questionnaires are short (about three questions), with the goal of collecting focused feedback to guide specific action.
The micro-survey is a technique for overcoming what is arguably the biggest hurdle in survey assessment: getting people to respond to your questionnaire. It is particularly useful for populations where mobile technology use is on the rise and where something is competing for everyone’s attention at any given moment. If we expect our respondents to answer our questionnaires, we can’t saddle them with long, matrix-style questions or require them to click through numerous web pages. We need to simplify, or we will lose respondents before they ever reach the submit button.
The trick to micro-surveys is to keep them short but administer multiple questionnaires over time. You can break a traditional membership or customer questionnaire into several micro-surveys and distribute them periodically. “Survey Doctor” Matthew Champagne offers evidence of this technique’s effectiveness in his blog post about bite-sized surveys, including an example project that boasted an 86% response rate.
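The splitting step above is mechanical: take the full question list and carve it into waves of about three questions each, to be sent weeks apart. Here is a minimal sketch in Python; the helper name and the sample questions are invented for illustration, not taken from Champagne's post.

```python
from itertools import islice

def split_into_micro_surveys(questions, size=3):
    """Break a long question list into micro-survey waves of `size` questions each."""
    it = iter(questions)
    waves = []
    while True:
        wave = list(islice(it, size))  # take the next batch of questions
        if not wave:
            return waves
        waves.append(wave)

# A six-question library questionnaire becomes two three-question micro-surveys.
questions = [
    "How often do you visit the library?",
    "Which resources do you use most?",
    "How satisfied are you with study spaces?",
    "Have you attended a library workshop?",
    "How do you prefer to receive library news?",
    "What one service would you add?",
]
waves = split_into_micro_surveys(questions, size=3)
```

Each wave would then be scheduled as its own short survey, a few weeks apart, rather than sent as one long questionnaire.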
Of course, the length of your survey is not the only factor contributing to response rate. You should strive to follow the Dillman method, which provides time-tested guidelines for administering surveys. (Here is one researcher’s description of how to use the Dillman method.) Also, take a look at Champagne’s Nine Principles of Embedded Assessment. His website has articles and YouTube videos on how to implement these principles.
If you want to try doing a micro-survey, check out the effective practices described in this blog article from the marketing research company Instantly.
Are you apprehensive when someone says it’s time to do “outcome-based planning using a logic model”? The Wichita State University Community Psychology Practice and Research Collaborative and the Community Psychology Doctoral Program in Wichita, KS, have developed an easy way to build logic models, described in the article “Tearless Logic Model” in the Global Journal of Community Psychology Practice.
Their goal was to create a facilitated, non-intimidating logic model process that would be more likely to be used in planning. This approach is designed to give community-based groups, faith-based organizations, smaller nonprofits and people with little experience in strategic planning greater impact when planning community projects.
Tearless Logic Model planning requires only flip charts, magic markers, blue painters’ tape, and a safe space to work with a group. Jargon is eliminated and replaced with simple terms anyone can understand. For example, instead of asking “What are the anticipated impacts?” a facilitator would ask, “If you really got it right, what would it look like in 10 or 20 years?”
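The jargon-to-plain-language substitution can be pictured as a simple lookup table. In the sketch below, only the “impacts” entry is quoted from the article; the other mappings are illustrative guesses at how a facilitator might translate the remaining logic-model terms, not language from the Tearless Logic Model itself.

```python
# Plain-language stand-ins for logic-model jargon. Only the "impacts" question
# is quoted from the Tearless Logic Model article; the rest are hypothetical.
PLAIN_LANGUAGE_PROMPTS = {
    "impacts": "If you really got it right, what would it look like in 10 or 20 years?",
    "inputs": "What do we already have to work with?",
    "activities": "What will we actually do?",
    "outputs": "What will we produce, and how much of it?",
}

def facilitator_prompt(jargon_term):
    """Swap a logic-model jargon term for a question anyone can answer."""
    return PLAIN_LANGUAGE_PROMPTS.get(jargon_term.lower(), jargon_term)
```

A facilitator-friendly worksheet could be generated by running each standard logic-model heading through `facilitator_prompt` before the planning session.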
Ashlee D. Lien, Justin P. Greenleaf, Michael K. Lemke, Sharon M. Hakim, Nathan P. Swink, Rosemary Wright, Greg Meissen. Tearless Logic Model. Global Journal of Community Psychology Practice [Internet]. 2011 Dec [cited 2015 Apr 10];2(2). Available from: http://www.gjcpp.org/pdfs/2011-0010-tool.pdf
Posted on April 3rd, 2015 by Cindy Olney | Filed under News, Storytelling
Story-Telling: The NTOTAP Community Health Advocate Project Showcase
Want to see how stories can raise the visibility of successful programs? Check out the project showcase of Community Health Advocate programs, created under the direction of the Native Telehealth Outreach and Technical Assistance Program (NTOTAP). NTOTAP, a program of the Centers for American Indian & Alaska Native Health, provides training on website design and social media marketing to Native health programs. The short videos, which are project slides with narrative by the community health advocates themselves, provide digital vignettes of community health advocacy activities and accomplishments.
I heard about the NTOTAP showcase from Spero Manson, PhD, Director of the Centers for American Indian and Alaska Native Health. Dr. Manson was a keynote speaker at the Quint*Essential Conference in Denver last October, and he incorporated one of the videos into his presentation. (See the video of Russell George under “Walleen Whitson.”) Dr. Manson’s use of one of the stories demonstrates the versatility of digital story-telling: the audience heard, in a participant’s own words, how a community health program made a difference in his life.
In the field of evaluation, which emphasizes evaluation use, story methods are emerging as an important trend. Project stories seem to have greater reach to program stakeholders than traditional reporting formats. The NTOTAP showcase is a great example of digital project story-telling in action.
Posted on March 27th, 2015 by Karen Vargas | Filed under News, Practical Evaluation
Updated Community Health Status Indicators (CHSI)
The OERC is excited to get the word out about the Centers for Disease Control and Prevention (CDC)’s newly updated and redesigned Community Health Status Indicators (CHSI). The new CHSI 2015 represents the collaboration of public health partners in the public, non-profit, and research communities, including the National Library of Medicine.
The OERC recommends the CHSI as a resource for the data-gathering portion of outreach project planning or needs assessments. CHSI 2015 is an interactive online tool that produces health profiles for all 3,143 counties in the United States. Each profile includes key indicators of health outcomes that describe the population health status of a county. What makes CHSI 2015 an important tool is that it includes comparisons to “peer counties”: groups of counties that are similar to each other based on 19 variables, including population size, percentage of high school graduates, and household income.
For each county, CHSI 2015 provides a Summary Comparison Report. Using Karen Vargas’ childhood home of Union County, PA as an example, this report (right) shows that Union County does better than most of its peer counties on the overall cancer death rate, but worse on the stroke death rate.
By selecting a specific indicator, such as coronary heart disease death rate, the interactive CHSI 2015 will produce a bar chart showing Union County in comparison to its peer counties, as well as the US median and the Healthy People 2020 target (left).
More detailed, downloadable data for each peer county is also available (below), as well as a web page detailing the sources of the data for each indicator. CHSI 2015 also offers a How to Use CHSI web page that explains each feature and provides helpful hints.
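The peer-county idea behind CHSI can be sketched in miniature: standardize each variable so no one measure dominates, then rank counties by distance in the standardized space. The counties and numbers below are invented, and this toy uses only three of the 19 variables; CHSI’s actual peer-grouping methodology is more elaborate, so treat this purely as an illustration of the concept.

```python
import math

# Toy profiles: (population, pct high school grads, median household income).
# These numbers are made up, not CHSI data.
counties = {
    "Union, PA": (45000, 88.0, 52000),
    "Peer A":    (47000, 86.5, 50000),
    "Peer B":    (120000, 90.0, 61000),
    "Peer C":    (43000, 87.0, 51500),
}

def standardize(values):
    """Return z-scores so each variable contributes on a comparable scale."""
    mean = sum(values) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return [(v - mean) / sd if sd else 0.0 for v in values]

# Standardize column-wise, then reassemble one z-score vector per county.
names = list(counties)
columns = list(zip(*counties.values()))
zcolumns = [standardize(col) for col in columns]
zrows = dict(zip(names, zip(*zcolumns)))

def nearest_peers(target, k=2):
    """Rank the other counties by Euclidean distance in z-score space."""
    distances = {n: math.dist(zrows[target], zrows[n]) for n in names if n != target}
    return sorted(distances, key=distances.get)[:k]
```

In this toy data, the much larger “Peer B” ends up farthest from Union County on every standardized variable, so the nearest peers are the two similarly sized counties.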
Posted on March 20th, 2015 by Cindy Olney | Filed under News
Five reasons to attend the AEA Summer Institute 2015
Registration is now open for the American Evaluation Association’s annual Summer Evaluation Institute. The Institute, held in Atlanta, runs for 2.5 days and features 26 half-day training sessions. Here are five reasons I make a point of attending this Institute every year.
Great instructors. The training is offered by some of the most experienced evaluators in the field.
A continuing education bargain. Training costs about $80-90 per half-day session, less for students.
CDC presence. Historically, AEA co-sponsored this annual event with the Atlanta-based Centers for Disease Control and Prevention. While the CDC no longer co-sponsors the Institute, you will still meet lots of CDC staff members and consultants.
Networking opportunities. Between lunches and breaks, you get eight opportunities to chat with your colleagues.
Great location. The Institute is held at the Crowne Plaza Atlanta Perimeter at Ravinia, located in a park-like setting on Atlanta’s perimeter near shopping and restaurants. The hotel is on the MARTA (mass transit) red line, so you can get from the airport to the hotel without facing Atlanta’s legendary traffic. Because I live near Atlanta, I haven’t stayed in the hotel, but I’ve never heard any complaints.
Full-day pre-Institute workshops are held, for an additional charge, on the Sunday before the Institute. You can attend pre-conference sessions without registering for the Institute itself. For example, beginners might want to take “Introduction to Evaluation” taught by Tom Chapel, the Chief Evaluation Officer at the CDC. Chapel organizes the workshop around the CDC’s six-step framework for program evaluation.
The AEA Institute 2015 runs June 1-3, with pre-session workshops conducted on May 31. The cost for the Institute is $395 for members and $480 for nonmembers, with a special student rate of $250. The price covers five training sessions (your choice among the 26 offerings), snacks, and lunch. Pre-Institute workshops are an additional $150 (all participants).
It was an amazing a-ha moment. We kind of blinked at each other, and then simultaneously said ‘We got to do something.’ – Dr. Nancy Hardt, University of Florida
This week, National Public Radio’s (NPR) All Things Considered aired a story about what happened when Dr. Nancy Hardt, an OB-GYN, used data from Medicaid birth records to see where children were born into poverty in Gainesville, FL, hoping to identify ways to intervene and prevent poor childhood health outcomes. She was surprised to see a high-density “hot spot” of births, one square mile in size, appear in dark blue on her map (above). Dr. Hardt was encouraged to share her map with Sheriff Sadie Darnell, who pulled out a map of Gainesville of her own.
Sheriff Darnell’s map of the city’s highest crime rates overlapped exactly with the “hot spot” on Dr. Hardt’s map. By visiting the area, they identified many barriers to good health in the community, including hunger, substandard housing, and a lack of medical care facilities; the closest location for uninsured patients was a two-hour bus ride each way to the county health department. You’ll want to check out the rest of A Sheriff and A Doctor Team Up to Map Childhood Trauma to learn more about a mobile health clinic, what data from additional maps showed, and other steps they have taken since to improve health outcomes for the community.
Have you ever found yourself trying to do an evaluation activity, but needing that one helpful tool? Or perhaps you need a step-by-step guide on how to do a community assessment, or are looking for ways to build evaluation into a project that you are planning?
The OERC has an online guide called Tools and Resources for Evaluation that you and your library can use to evaluate your programs. Here are some of the types of tools and resources described in the Guide.
Community Oriented Outreach
Tips on successful collaborations and tools for improving collaboration with community networks
Toolkits for practical participatory evaluation and processes for conducting outcome-based evaluations
Step-by-step guides on incorporating evaluation planning into your outreach projects
Instructions on using logic models for program planning
Data Collection and Analysis
Tips for questionnaire development
Resources for statistical methods of data analysis
Guides for analyzing qualitative and quantitative data
Reporting and Visualizing
Help with creating popular data dashboards
Descriptions of data visualization methods
Tools and TED Talks on how to present your data
Posted on February 27th, 2015 by Nikki Dettmar | Filed under News
Qualitative Evaluation Week
The American Evaluation Association (AEA) just concluded a week-long blog theme about qualitative evaluation, which we’ve summarized below for your reference and to consider as part of your own assessment efforts:
The Role of Context – the authors of this entry previously shared five elements of high-quality qualitative evaluation; this entry references them while emphasizing that evaluators must also understand what role setting, relationships, and other contextual factors play in the data.
Purposeful Sampling – a great explanation of why to avoid convenience sampling (interviewing people simply because they happen to be around), plus a caution about terminology: consider avoiding the word “sampling,” which many people associate with random probability sampling.
Interviewing People who are Challenging – establishing rapport leads to good qualitative data, but what does an interviewer do if there seems to be conflict with the interviewee? Details about how to manage your own feelings and approach with a curious mindset are very helpful!
Asking Stupid Questions – this example from a bilingual HIV/AIDS training is especially insightful about the importance of clarifying sexual terms, putting aside concerns the evaluator may have about looking ‘stupid,’ and outcomes that led to deeper engagement and discussion from the group.
Practical Qualitative Analysis – many helpful tips and lessons, including a reminder to group together participants’ responses that answer the same question, even when those replies come from different parts of the survey or interview.
Providing Descriptions – evaluation is sometimes criticized as “only looking at the negative”; by including full details about your qualitative data collection and analysis as an additional resource or appendix, you can explain steps of the process that otherwise may not be evident.
Need more information about qualitative and other types of evaluation? The Outreach Evaluation Resource Center (OERC) has resources available including our Tools and Resources for Evaluation guide and our freely available Planning and Evaluating Health Information Outreach booklet series.