Archive for the ‘Communications Tools’ Category
You may think of a survey invitation letter or email message as simply a delivery mechanism to send the questionnaire link to prospective respondents. The invitation may be an afterthought, hastily composed after the process of developing the questionnaire itself. However, a carefully crafted invitation has been proven to boost response rates, which are a key concern when conducting surveys. The following tips for writing invitation messages are all included in the 4th edition of Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, by Dillman, Smyth, and Christian (2014), an excellent resource for conducting all aspects of the survey process. It is evidence-based, drawing on an extensive body of research literature on survey practice.
Think of the survey invitation as a “communication plan,” utilizing multiple contacts with participants to elicit good response rates. Plan for a minimum of four contacts:
- A preliminary message to let your participants know you will be sending them a questionnaire (do not include the questionnaire link).
- An invitation message with a link to the questionnaire (2-3 days after the preliminary contact).
- A reminder notice, preferably only to those who have not responded (one week after the invitation message).
- A final reminder notice, also specifically to those who have not responded (one week after the first reminder).
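The timing of the four contacts above can be sketched as a small script. The 3-day gap between pre-notice and invitation is an illustrative choice within the 2-3 day range; the labels and dates are examples, not part of Dillman et al.'s method:

```python
from datetime import date, timedelta

def contact_schedule(prenotice: date) -> dict[str, date]:
    """Compute send dates for a four-contact survey communication plan.

    Intervals follow the tips above: invitation 2-3 days after the
    pre-notice (3 days assumed here), then reminders one week apart.
    """
    invitation = prenotice + timedelta(days=3)
    return {
        "preliminary message (no link)": prenotice,
        "invitation with questionnaire link": invitation,
        "first reminder (non-respondents only)": invitation + timedelta(weeks=1),
        "final reminder (non-respondents only)": invitation + timedelta(weeks=2),
    }

# Example: plan a survey launching with a pre-notice on June 1
for label, when in contact_schedule(date(2015, 6, 1)).items():
    print(f"{when:%Y-%m-%d}  {label}")
```

Putting the schedule in one place like this makes it easy to load the dates into a calendar or a mail-scheduling tool so no contact slips through the cracks.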
Emphasize how the participants’ feedback will help your organization improve services or programs. This simple request appeals to a common desire among humans to help others. If applicable, emphasize that you need their advice specifically because of their special experience or expertise. It is best to use mail merge to personalize your email messages, so that each participant is personally invited by name to submit their feedback. If you are contacting people who have a relationship with your organization, such as your library users or members of your organization, play up that relationship. Also, make a commitment to share results with them at a later date. And be sure to keep that commitment!
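At its simplest, a mail merge just fills a message template once per participant. The sketch below shows the idea; the template wording, field names, and URLs are hypothetical:

```python
from string import Template

# Hypothetical invitation template; $name and $link are the merge fields.
INVITE = Template(
    "Dear $name,\n\n"
    "As a valued library user, your feedback will help us improve our "
    "services. Please share your experience here: $link\n\n"
    "We will send you a summary of the results this fall."
)

# In practice these rows would come from your participant list or CRM.
participants = [
    {"name": "Jordan Lee", "link": "https://example.org/survey?id=101"},
    {"name": "Sam Rivera", "link": "https://example.org/survey?id=102"},
]

for p in participants:
    message = INVITE.substitute(p)
    print(message)  # in practice, hand this off to your email-sending tool
    print("---")
```

Most email platforms and survey tools offer built-in merge fields that accomplish the same thing; the point is that each recipient sees their own name, not "Dear Participant."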
Phishing and email scams may make recipients wary of clicking links if an email message seems odd in any way. Make sure participants know they can trust your invitation email and survey link. Take opportunities to publicize your institutional affiliation. Incorporate logos or letterhead into your emails when possible. Provide names, email addresses, and phone numbers of one or two members of your evaluation team, so participants know whom to contact with questions or to authenticate the source of the email request. You may never get a call, but participants will feel better about answering questions if you give them convenient access to a member of the project team. It is also helpful to get a public endorsement of your survey project from someone who is known and trusted by your participants. You can ask someone influential in your organization to send the preliminary letter or message on your behalf. Also, publicize the project over social media channels or through organizational newsletters or blogs.
Be explicit about who will have access to individual-level data. Be sure you know the difference between anonymity (where no one knows what any given participant said) and confidentiality (where identifiable comments are seen only by a few specific people). You can also let participants know how you will protect their identity, but don't go overboard: long explanations can themselves cast doubt on the trustworthiness of your invitation.
And finally, provide status updates when sending reminder messages. If you mention that you are getting great feedback from other respondents, it may motivate the late responders who want to match the behavior of their peers!
Sponsored by the U.S. Department of Health and Human Services’ Office of the Assistant Secretary for Preparedness and Response (ASPR), the Technical Resources, Assistance Center, and Information Exchange (TRACIE) features resource materials, a help line, just-in-time suggestions and tools to share information gleaned from real-life experiences in preparing for, responding to and recovering from disasters. This effort resulted from the collaborative efforts of local, state and federal government agencies, regional health-care coalitions, academia, and partners from the private sector and nongovernmental organizations.
TRACIE provides technical resources and a technical assistance center, a comprehensive national knowledge center, and multiple ways to share information between federal, state and local officials. TRACIE’s technical resources include a living library of audience-tailored and subject matter expert-reviewed topic collections and materials highlighting real-life tools and experiences. TRACIE’s resources include user ratings and comments, which can help users choose the best resource for a particular need. Through TRACIE’s assistance center, state, tribal, local and territorial officials can reach subject matter experts for technical assistance and consultations on a range of topics. Topics for technical assistance vary widely, including pediatric preparedness resources, crisis standards of care, tools to assess the readiness of hospitals and health care coalitions for emergencies, lessons learned about delivering dialysis care during disasters, and more. Officials also can find training related to preparedness, response and recovery. The assistance center is available through a toll-free number, email, and online.
TRACIE also includes an information exchange. Through this forum, health care emergency preparedness stakeholders can discuss, collaborate and share information about pending and actual health threats and promising practices. Users also can exchange templates, plans and other materials through this feature. Users can get advice, including just-in-time advice, from hundreds of health care, disaster medicine, public health and public safety professionals, through TRACIE. TRACIE’s free registration allows users to rate the usefulness of the resources and to access the information exchange.
Photovoice is an evaluation method that engages program stakeholders (learners, service recipients, community members) in taking photographs and using them as springboards to express their experiences and points of view. With the prevalence of cameras in mobile devices, along with social media forums, many of us are already engaged in the foundational practices underlying photovoice: taking photos, posting them, and sharing our experiences. Add in some facilitators who provide systematic method design, project management and ethical oversight; and you have the potential to gather program insights that would go untouched through traditional methods. The following two practical resources are written by action researchers describing their lessons learned about conducting photovoice projects. The documents also show you or link you to photos and commentary from contributing participants.
One comprehensive guide comes from the Prairie Women’s Health Centre of Excellence (PWHCE), located in Canada. The center engages in collaborative, community-based research on social and other determinants of the health of women and girls. The center’s mission is to provide expert advice on social policies related to women’s health. The authors (Beverly Palibroda, Brigette Krieg, Lisa Murdock and Joanne Havelock) published A Practical Guide To Photovoice: Sharing Pictures, Telling Stories and Changing Communities, a nuts-and-bolts photovoice manual. It provides detailed advice, with periodic sidebars summarizing the process. An appendix includes a helpful checklist. You will find sample photovoice entries throughout the document. The manual was written in 2009. Since then, the PWHCE has introduced digital story-telling into its portfolio of participatory methods.
Another guide was produced based on a photovoice project for Brainline.org, an educational website providing authoritative information about brain injury symptoms, diagnosis, and treatment. The project featured the stories of eight people with traumatic brain injury, presented as a gallery of photo essays. Facilitators Laura Lorenz and Barbara Webster developed a succinct facilitator guide based on this project.
Forget about elevator speeches. Think elevator conversations instead. An elevator pitch is a promotional speech of elevator-ride length that you can slip into small talk when you run into “someone influential,” one of a number of strategies for quietly promoting your organization’s successful programs and services. You can add nuggets of evaluation findings to these mini-speeches to demonstrate program value. But you may be missing a key element of the elevator pitch exchange: the other person. For insight, see Tim David’s article Your Elevator Pitch Needs an Elevator Pitch, which appeared in the Harvard Business Review (10 December 2014). It emphasizes engaging your fellow elevator traveler rather than talking “at” him or her, which leads to a conversation rather than a speech. Notice how seamlessly the author slips evidence into his low-key pitch. For example, he casually inserts a statistic, presumably obtained from a follow-up evaluation with one of his client organizations, that productivity and morale increased 38% after his training, underscoring the value his service provided.
Here are several other tips from the article:
- Answer polite but perfunctory questions (such as “what does your office do?”) with a surprising answer.
- Use questions to draw your elevator companion into the conversation. David suggests that you talk no more than 20% of the time. Yield the remainder of the time to the other traveler, but use questions to keep the conversation rolling.
- Don’t worry too much about that 20-second time frame traditionally recommended for elevator pitches. If you successfully engage your fellow rider, he or she will hold the elevator door open to continue the chat.
The elevator pitch format is a good addition to your story-telling tool kit. But it may take some practice to be able to present an elevator pitch casually and conversationally. If you’re up for that challenge, then check out Tim David’s article for some excellent guidelines!
Recently the NN/LM Outreach Evaluation Resource Center (OERC) investigated web sites offering reviews of multiple online survey tools, yielding the following list of five resources as a starting point. In addition, there are individual reviews of online survey products on a variety of websites and blogs, which are not included in this list.
Zapier.com’s Ultimate Guide to Forms and Surveys, Chapter 7 “The 20 Best Online Survey Builder Tools”
This resource compares 20 different online survey tools. There is a chart with a brief statement of what each survey tool does best, what you get for free, and the lowest plan cost. Additionally, there is a paragraph description of each tool and what it does best. Note: this is part of an eBook published in 2015 which includes chapters like “The Best Online Form Builders for Every Task.”
Appstorm.net’s “18 Awesome Survey & Poll Apps”
This review was posted on May 27, 2015, which suggests the information is reasonably up to date. The descriptions are very brief, but they work well for a quick comparison of the survey products. Each review notes whether there is a free account, whether surveys can be customized, and whether ready-made templates are available.
Capterra.com’s “Top Survey Software Products”
This resource appears almost too good to be true. No date is shown, however, so the specifics in the comparisons may not be current. Nevertheless, this website lists over 200 survey software products, has a separate profile page for each product (with varying amounts of detail), and lists the features each product offers. You can even narrow your search by filtering on features. Hopefully the features in Capterra’s database are kept up to date for each product. Note that at least two fairly well-known survey products are missing from the list.
AppAppeal.com’s “Top 31 Free Survey Apps”
Another review site with no date listed. This one compares 31 apps by popularity, presumably as of the year the article was written. What sets this review site apart is that each in-depth review covers the history and popularity of the app, how it differs from other apps, and the recommended users for each app. Many of the reviews include videos showing how to use the app.
TopTenReviews.com’s 2015 Best Survey Software Reviews and Comparisons
This website has the feel of Consumer Reports. It has a long article explaining why you would use survey software, how and what the reviewers tested, and the kinds of things that are important when selecting survey software. Also like Consumer Reports, it has ratings of each product (including the experiences of the business, the respondents, and the quality of the support), and individual reviews of each product showing pros and cons. With the date included in the title of the review, the information is most likely current.
The National Library of Medicine (NLM) has released a new web page, Nursing Resources for Standards and Interoperability. The page is a resource for nurses, students, informaticians, and anyone interested in nursing terminologies for systems development. It describes the role of SNOMED CT and Laboratory Observation Identifiers Names and Codes (LOINC) in implementing Meaningful Use in the United States, specifically for the nursing care domain.
NLM has provided this resource in response to the position statement released by the American Nurses Association (ANA) that reaffirms support for use of recognized terminologies in coding nursing problems, interventions and observations (SNOMED CT), and in nursing assessments and outcomes (LOINC). In addition to SNOMED CT and LOINC, the Nursing Resources for Standards and Interoperability page provides information about other highly utilized nursing terminologies. The resource page provides a new two-minute video tutorial that describes how to use the Unified Medical Language System (UMLS) Metathesaurus Browser to find Concept Unique Identifiers (CUIs) and extract concept-level synonyms between SNOMED CT and other nursing terminologies. Additionally, links to other NLM Terminology resources and helpful resources are provided.
NLM welcomes feedback on the Nursing Resources for Standards and Interoperability page. Please send comments to NLM Customer Service.
Juice Analytics has developed a practical guide to explore how data visualization and storytelling techniques can mix, 30 Days to Data Storytelling. The guide provides a checklist of daily activities lasting no longer than 30 minutes per day. Activities include articles to read, videos to watch, or small projects to complete. The guide links to data visualization and storytelling resources from sources as varied as Pixar, the Harvard Business Review, Ira Glass, the New York Times, and Bono, the lead singer of U2. Use the techniques in this guide to tell a story to report your evaluation data so it gets the attention it deserves!
Over forty new examples have been added throughout Citing Medicine, the NLM style guide for authors, editors, and publishers. New references are for datasets, data repositories, ahead-of-print articles, and more. Corrections and clarifications were made based on user feedback or our own quality assurance efforts. Almost every chapter and two of the appendixes were edited and a new foreword was added. The full list of changes is available in the Content Updates appendix.
Mission statements are important. Organizations use them to declare to the world how their work matters. For employees, they guide efforts toward supporting organizational priorities. And mission statements are important to evaluators, because evaluation methods are ultimately designed to assess an organization’s value. Having those values explicitly stated is very helpful. The Nonprofit Hub’s document A Step-By-Step Exercise for Creating a Mission Statement is a tool that succinctly lays out an effective 1-2 hour process to engage multiple stakeholders in the development of a mission statement, starting with a foundation of shared stories about the organization’s best work. In the end, everyone understands and endorses the mission statement because they helped develop it.
This exercise has potential that reaches beyond development of mission statements. It would be a great exercise for advisory groups to contribute their ideas about future activities, based on the organization’s past successes. The stories generated are data that can be analyzed for organizational impact. The group qualitative analysis process, alone, could be adapted to other situations. For example, a small project team could use the process to analyze stories from interviews, focus groups, or even written comments to open-ended survey questions.
How does your web survey look on a handheld device? The Pew Research Center reported that 27% of respondents to one of its recent surveys answered using a smartphone, and another 8% used a tablet. That means over one-third of participants used handheld devices to answer the questionnaire. The lesson: unless you are absolutely sure your respondents will be using a computer, you need to design surveys with mobile devices in mind. As a public opinion polling organization, the Pew Center knows effective practices in survey research. It offers advice on developing questionnaires for handhelds in its article Tips for Creating Web Surveys for Completion on a Mobile Device. The top suggestion is to be sure your survey software is optimized for smartphones and tablets. SurveyMonkey fits this criterion, as do many other popular Web survey applications.
Software alone will not automatically create surveys that are usable on handheld devices. It is also important to follow effective design principles, such as keeping it simple and using short question formats. Avoid matrix-style questions. Keep your survey short. And don’t get fancy: logos and icons take longer to load on mobile devices. It is also advisable to pilot test questionnaires on computers, smartphones, and tablets, to ensure a smooth user experience for all of your respondents.