
NEO Shop Talk

The blog of the National Network of Libraries of Medicine Evaluation Office

Archive for October, 2015

Boosting Response Rates with Invitation Letters

Friday, October 30th, 2015

"You've got mail" graphicwith mail spelled m@il

Today’s topic: The humble survey invitation letter.

I used to think of the invitation letter (or email) as a “questionnaire delivery device.”  You needed some way to get the URL to your prospective respondents, and the letter (or, more specifically, the email) was how you distributed the link. The invitation email was always an afterthought, hastily composed after the arduous process of developing the questionnaire itself.

Then I was introduced to Donald Dillman's "Tailored Design Method" and learned that I needed to take as much care with the letter as I did with the questionnaire. A carefully crafted invitation has been shown to boost response rates. And response rate is a key concern when conducting surveys, for reasons clearly articulated in this quote from the American Association for Public Opinion Research:

“A low cooperation or response rate does more damage in rendering a survey’s results questionable than a small sample, because there may be no valid way scientifically of inferring the characteristics of the population represented by the non-respondents.” (AAPOR, Best Practices for Research)

With response rate at stake, we need to pay attention to how we write and send out our invitation emails.
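
If you want a quick reference point, the basic response rate calculation is simply the number of completed questionnaires divided by the number of people you invited (AAPOR publishes more refined formulas that account for partial completes and ineligible contacts). Here is a minimal sketch in Python, with purely illustrative numbers:

    # Basic response rate: completed questionnaires / invitations sent.
    # The numbers below are illustrative only.
    invitations_sent = 200
    completed = 62
    print(f"Response rate: {completed / invitations_sent:.0%}")  # -> Response rate: 31%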

This blog post features my most-used tips for writing invitation emails, all of which are included in Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method by Dillman, Smyth, and Christian (2014). Now in its fourth edition, this book is the go-to resource for all aspects of the survey process. It is evidence-based, drawing on an extensive body of research literature on survey practice.

Plan for Multiple Contacts

Don't think "invitation email."  Think "communication plan," because Dillman et al. emphasize the need for multiple contacts with participants to elicit good response rates. The book outlines various mailing schedules, but you should plan for a minimum of four contacts (a simple date-planning sketch follows the list):

  • A preliminary email message to let your participants know you will be sending them a questionnaire (do not include the questionnaire link)
  • An invitation email with a link to your questionnaire (2-3 days after the preliminary message)
  • A reminder notice, preferably sent only to those who have not responded (one week after the invitation email)
  • A final reminder notice, also sent only to those who have not responded (one week after the first reminder)
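
To make the timing concrete, here is a minimal date-planning sketch of the four contacts listed above. The launch date and the three-day gap are illustrative choices within the ranges suggested, not prescriptions from Dillman et al.:

    # Sketch of the four-contact schedule described above.
    # The launch date is illustrative; adjust the offsets to fit your own plan.
    from datetime import date, timedelta

    launch = date(2015, 11, 2)               # preliminary email goes out
    invitation = launch + timedelta(days=3)  # invitation follows 2-3 days later

    schedule = [
        ("Preliminary email (no questionnaire link)", launch),
        ("Invitation email with questionnaire link", invitation),
        ("First reminder (non-respondents only)", invitation + timedelta(weeks=1)),
        ("Final reminder (non-respondents only)", invitation + timedelta(weeks=2)),
    ]

    for contact, send_date in schedule:
        print(f"{send_date:%a %b %d, %Y}: {contact}")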

Tell Them Why Their Feedback Matters

Emphasize how the participants’ feedback will help your organization improve services or programs. This simple request appeals to a common desire among humans to help others. If applicable, emphasize that you need their advice specifically because of their special experience or expertise. It is best to use mail merge to personalize your email messages, so that each participant is personally invited by name to submit their feedback.
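
Most email and survey tools have mail merge built in, but the idea is simple enough to sketch in a few lines of Python. The names and wording below are hypothetical, just to show the personalization step:

    # Hypothetical mail-merge sketch: drop each participant's name into a
    # message template. Names and template text are invented for illustration.
    template = (
        "Dear {name},\n\n"
        "Because of your experience with our library, your feedback would be "
        "especially valuable. Please watch for our questionnaire next week.\n"
    )

    participants = [{"name": "Maria Lopez"}, {"name": "James Chen"}]

    for person in participants:
        print(template.format(name=person["name"]))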

If you are contacting people who have a relationship with your organization, such as your library users or members of your organization, play up that relationship. Also, make a commitment to share results with them at a later date. (And be sure to keep that commitment.)

Make Sure They Know Who’s Asking

With phishing and email scams abounding, people are leery about clicking on URLs if an email message seems “off” in any way. Make sure they know they can trust your invitation email and survey link. Take opportunities to publicize your institutional affiliation. Incorporate logos or letterhead into your emails, when possible.

Provide names, email addresses and phone numbers of one or two members of your evaluation team, so participants know who to contact with questions or to authenticate the source of the email request. You may never get a call, but they will feel better about answering questions if you give them convenient access to a member of the project team.

It is also helpful to get a public endorsement of your survey project from someone who is known and trusted by your participants. You can ask someone influential in your organization to send out your preliminary letter on your behalf. You or your champion can also publicize your project over social media channels or through organizational newsletters or blogs.

And How You Will Protect Their Information

Be explicit about who will have access to individual-level data and who will know how participants answered specific questions. Be sure you know the difference between anonymity (where no one knows what any given participant said) and confidentiality (where identifiable responses are seen only by a few specific people). You can also let participants know how you will protect their identity, but don't go overboard: long explanations can cast doubt on the trustworthiness of your invitation.

Provide Status Updates

While this may seem "so high school," most of us want to act in a manner consistent with our peer group. So if you casually mention in reminder emails that you are getting great feedback from other respondents, you may motivate late respondents who want to match the behavior of their peers.

Gifts Work Better Than Promises

The research consistently shows that sending a small gift to everyone with your preliminary or invitation letter is more effective than promising an incentive to those who complete your questionnaire. If you are bothered by the thought of rewarding those who may never follow through, keep in mind that sending small tokens (worth $2-3) to all participants is the most cost-effective practice involving incentives. More expensive gifts are generally no more influential than small gifts when it comes to response rates. Also, cash works better than gift cards or other nonmonetary incentives, even when the cash is of lower value.

Beyond Invitation Letters

The emails in your survey projects are good tools for enhancing response rate, but questionnaire design matters too: visual layout, item order, and wording all influence response rate. While questionnaire design is beyond the scope of today's post, I recommend The Tailored Design Method to anyone who plans to conduct survey-based evaluation in the near future. The complete source is provided below.

Source: Dillman DA, Smyth JD, and Christian LM. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th edition. Hoboken, NJ: Wiley; 2014.

Group Learning about Evaluation

Friday, October 23rd, 2015

Group of young business people talking at a business meeting in an office.

I recently got to participate in a very successful roundtable at a library conference.  I co-moderated an evaluation roundtable entitled “Library assessment: You’ve measured your success – now how do you get people to listen?” with OU-Tulsa Schusterman Library Associate Director Katie Prentice at the South Central Chapter of the Medical Library Association Annual Meeting in Little Rock, AR.

What makes roundtables unique among educational opportunities at library conferences is that, unlike presentations or papers where attendees sit and listen, everyone can participate. A roundtable is a moderated discussion on a given topic among the people who attend, and since anyone can chime in, learning is active instead of passive.

About 25 people attended this roundtable and enthusiastically participated in a discussion about library assessment and evaluation data. Katie and I led the discussion with questions that started with what kind of data libraries collect and moved on to what libraries do with that data and how to make it work better for them. Our goal was to use our questions to allow the issues and solutions to come from the attendees themselves.

As an example, when we asked the question “what would you really like to know about your library and what do you dream of finding out about your users?” one hospital librarian said that she wanted to know how doctors were using the information and how it impacted the patients. Katie Prentice asked “can anyone help her with this?” and another hospital librarian responded that she sends emails to some of her doctors to ask for a sentence or two describing how the information was used.  These sentences, when collected and analyzed, could be a powerful tool to show hospital administration the importance of the library to patient outcomes.

Other evaluation ideas generated by attendees at this roundtable included:

  • using heat map software to determine where people go most often on your website
  • having student workers note what pieces of furniture are being used to improve furniture types and placement in the library
  • using a product like Constant Contact or Mail Chimp to send library newsletters, including assessment data, to the doctors and employees at hospitals.

While not all roundtables at conferences are this successful, this roundtable demonstrated the ability of librarians brought together in a group to learn from each other and solve problems.

OERC Travels: The Michigan Health Sciences Libraries Association

Friday, October 16th, 2015

Title slide: Measuring What Matters to Stakeholders

I love to talk about evaluation strategies that can make our organizational successes more visible. So I was thrilled to have the opportunity to team-teach a continuing education workshop with Beth Layton, Associate Director of the NN/LM Greater Midwest Region, at the annual Michigan Health Sciences Libraries Association (MHSLA) meeting. We taught Measuring What Matters to Stakeholders on September 25 in Flint, MI.

Our primary objective was to help participants discover how they can use evaluation to collect evidence of their libraries’ contributions to their larger organizations. Participants first identified their library activities that directly support the mission of their organizations and goals of key decision-makers. They then focused on developing evaluation plans for activities with high impact potential. The plans emphasized metrics related to organizational mission. We talked about analytic strategies, including monetary-based approaches such as cost-benefit analysis.  The workshop concluded with tips for communicating with stakeholders.
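
As a rough illustration of what a monetary-based approach involves (this is a sketch with invented numbers, not the NN/LM calculators), cost-benefit analysis assigns a dollar value to the services the library delivered and compares the total to what the library cost to operate:

    # Illustrative cost-benefit sketch. Service counts, per-service values, and
    # the operating cost below are invented; real analyses use local data and
    # published value estimates.
    services = {
        "mediated literature searches": {"count": 350, "value_each": 90.00},
        "document delivery requests":   {"count": 1200, "value_each": 25.00},
        "reference questions":          {"count": 2500, "value_each": 15.00},
    }

    total_benefit = sum(s["count"] * s["value_each"] for s in services.values())
    total_cost = 60000.00  # annual library operating cost (invented)

    print(f"Estimated annual benefit: ${total_benefit:,.2f}")
    print(f"Benefit-cost ratio: {total_benefit / total_cost:.2f}")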

There is a 10-year history behind this workshop. The concept of linking evaluation and library advocacy was introduced to the NN/LM by Maryanne Blake from the NN/LM Pacific Northwest Region (now retired), and Betsy Kelly, Assessment and Evaluation Coordinator at the NN/LM MidContinental Region. They developed the first NN/LM evaluation-for-library-advocacy workshop, Measuring Your Impact (MYI), which exploded in popularity and was taught by regional medical library staff throughout the country. Betsy made additional contributions in this area, including her value and cost-benefit analysis calculators. These calculators are publicly available and have been featured in various NN/LM evaluation classes. Several regions have adapted MYI into shorter workshops to better fit the meeting schedules of associations that host NN/LM workshops.

Beth and I merged the GMR and OERC short versions for our MHSLA workshop. We stuck with the basic goals of the original MYI class, but introduced some new exercises and new approaches to communicating evaluation results.  I developed a new worksheet to help participants practice matching library activities to organizational mission and goals.  For the communication part of the workshop, we practiced messaging with Nancy Duarte’s “sparkline” presentation structure (presented here in her TEDTalk) and Tim David’s version of elevator speeches.

The class was well-received, with 92% of participants giving us an "A" rating (on a scale of "A" to "F"). Eighty-two percent of participants said the class improved their confidence about using evaluation to demonstrate their library's value. Asked how likely they were to use the information from the class, 73% said "very likely" and 27% said "somewhat likely." A quick qualitative analysis of their written comments on the evaluation form indicated the strategies they were most interested in pursuing: Appreciative Inquiry interviews, logic models, and elevator pitches (as described in OERC blog posts here and here).

I want to express my appreciation to Jacqueline Leskovec at the NN/LM GMR, who bravely piloted some of the new content in advance of our MHSLA workshop. Jacqueline is one of the GMR coordinators who adapted MYI for her region. She was on the North Dakota Library Association's meeting program to teach Measuring What Matters to Stakeholders on September 17, more than a week before Beth and I traveled to Flint. Jacqueline tried out our new material and reported back.

I also want to thank our MHSLA participants, who made the class so engaging and enjoyable.  If any of our attendees have successes or lessons learned from using the strategies covered in the class, please contact me at olneyc@uw.edu. I would love to feature your experience in a blog post.

Cindy Olney roleplays elevator pitches with Oren Duda from Windsor Regional Hospital Library (Canada).

Creative Annual Reports

Friday, October 9th, 2015

Ah, the annual report – at its best we expect to see a glossy booklet with pie charts, short paragraphs and some quotes. At its worst it can be pages of dry text. Our main hope with annual reports is that our stakeholders and others will read them and be impressed with the successes of our organizations.

Last month I ran across the annual report from the Nowra Public Library in New South Wales, Australia, which was so compelling and understandable that over 100,000 people have viewed it on YouTube:

Photo of a librarian, linking to the Nowra library video

Since most organizations don’t have the resources to do their own music video (e.g. singers, writers, silly costumes), I thought I would look at a few other examples to consider when it’s time to do your annual report.

One of my all-time favorites is the annual report from the Schusterman Library of The University of Oklahoma-Tulsa. Their annual report is an infographic that not only shows the data collected but also describes it in such a way that 1) you get a better feel for what is going on in the library, and 2) you might think "I didn't know they would help me with that!"  For example: "7,274 Reference questions answered in person, by phone, by email, and instant message or text on everything from ADHD and child welfare to decision trees, LEED homes, and census reporting." It is available on their website, and the librarians at the Schusterman Library say they frequently find students looking at it.

The Michigan State University College of Education won a gold ADDY and a Best in Show Award for their 2012 Annual Report (the ADDY Awards are the advertising industry's largest competition). Their report featured a tri-fold, die-cut skyline that presented the college's missions and strengths with an emphasis on "college as community." The annual report also included a video and a website with detailed narratives that present institutional successes through personal stories.

Of course, not all institutions want an unusual annual report. But it is important to consider the target audience. Annual reports reach the upper administration, potential funders, and patrons of the library. The success of this year's annual report might shape library users' view of the library for years to come.

DIY: Two Great Photovoice Guides

Friday, October 2nd, 2015

Hand holding a tablet, with a photo of a green field at sunrise in the background

Photovoice is an evaluation method for the times. This method engages program stakeholders (learners, service recipients, community members) in taking photographs and using them as springboards to express their experiences and points of view. With the prevalence of cameras in mobile devices, along with social media forums, most of us are already engaging in the foundational practices underlying photovoice: taking photos, posting them, and sharing our experiences. Add in some facilitators who provide systematic method design, project management, and ethical oversight, and you have the potential to gather program insights that would go untapped through traditional methods.

Today's post introduces you to two practical resources written by action researchers describing their lessons learned about conducting photovoice projects. Both documents also include or link to photos and commentary from participating contributors.

From the Prairie Women’s Health Centre of Excellence

One comprehensive guide comes from the Prairie Women's Health Centre of Excellence (PWHCE), located in Canada. The center engages in collaborative, community-based research on social and other determinants of the health of women and girls. The center's mission is to provide expert advice on social policies related to women's health. The authors (Beverly Palibroda, Brigette Krieg, Lisa Murdock and Joanne Havelock) published A Practical Guide To Photovoice: Sharing Pictures, Telling Stories and Changing Communities, a nuts-and-bolts photovoice manual. It provides detailed advice, with periodic sidebars summarizing the process. An appendix includes a helpful checklist. You will find sample photovoice entries throughout the document.

The manual was written in 2009.  Since that time, the PWHCE has introduced digital story-telling into its portfolio of participatory methods.  Check out the stories here.

From Brainline.org

Another guide was produced based on a photovoice project for Brainline.org, an educational website providing authoritative information about brain injury symptoms, diagnosis, and treatment. The project featured the stories of eight members with traumatic brain injury.  The gallery of essays is available here.   Facilitators Laura Lorenz and Barbara Webster developed a succinct facilitator guide based on this project.

If you want to learn how to do a photovoice project, these documents are a great place to start. You can also find other resources in OERC's blog entries posted in 2012 and 2014.


Funded by the National Library of Medicine under Contract No. UG4LM012343 with the University of Washington.