Archive for the ‘General’ Category
Data visualization expert Stephen Few explained the problem with pie charts during this interview with the New York Times: “When looking at parts of a whole, the primary task is to rank them to see the relative performance of the parts. That can’t be done easily when relying on angles formed by a slice.” An article by American Evaluation Association president-elect John Gargani argues for retirement of the venerable pie chart. He makes points that are repeated in many anti-pie-chart blog posts. On the other hand, this post by Bruce Gabrielle of Speaking PowerPoint describes situations where pie charts can shine.
In general, most experts believe that the times and places to use pie charts are few and far between. If you have found one of those rare times, then here’s a post at Better Evaluation with design tips to follow. And for humorous examples of what not to do, check out Michael Friendly’s Evil Pies blog!
A data party is another name for a kind of participatory data analysis, where stakeholders are gathered together to help analyze data that you have collected. Here are some reasons to include stakeholders in the data analysis stage:
- It allows stakeholders to get to know and engage with the data.
- Stakeholders may bring context to the data that will help explain some of the results.
- When stakeholders participate in analyzing the data, they are more likely to understand and use it.
- Watching their interactions often reveals the person with the power to act on your recommendations.
To begin the process, you need to know what you hope to gain from the attendees, since you may be able to hold an event like this only once. There are a number of ways to organize the event, such as the World Cafe format, where everyone works together to explore a set of questions, or an Open Space system, in which attendees create their own agenda of questions they want to discuss. Recently the American Evaluation Association held a very successful online unconference using MIT’s Unhangout, an approach that could be used for an online data party with people in multiple locations.
Here are suggested questions to ask at a data party:
- What does this data tell you?
- How does this align with your expectations?
- What do you think is occurring here and why?
- What other information do you need to make this actionable?
At the end of the party it might be time to present some of your findings and recommendations. Considering the work that they have done, stakeholders may be more willing to listen, since people often tend to support what they helped to create.
The Medical Library Association’s October 28 continuing education webinar, Data Visualization Skills and Tools for Librarians, was presented by Lisa Federer, Research Data Informationist at the NIH Library. The session provided information on different aspects of data visualization, including elements of design such as color, line, and contrast. Lisa has also created the LibGuide Creating Infographics with Inkscape, which contains the resources for a class she taught with NIH Informationist Chris Belter. The LibGuide includes a PowerPoint presentation from the lecture part of the class. The slides cover design principles and design elements with links to resources such as Vischeck, a tool for finding out how colors in a chart appear to someone who is color blind, and The 10 Commandments of Typography, with suggestions for choosing font combinations that work well.
The second part of the class is a hands-on section on using Inkscape, a free, open-source graphics program, to make infographics. Inkscape works with vector graphics, which are well suited to image design: images are built from paths defined by mathematical expressions (lines, curves, and polygons), so they can be scaled up or down without losing any quality. If this sounds hard to do, there are Inkscape tutorials available to help. Other vector graphics editors are available, such as Apache OpenOffice Draw, a free program, or Adobe Illustrator. Comparisons with links to detailed information are available in Wikipedia’s “Comparison of Vector Graphics Editors.”
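To see what “paths defined by mathematical expressions” means in practice, here is a minimal sketch (plain Python, no Inkscape required) that writes a tiny SVG file; the file name and shapes are illustrative. Because the shapes are stored as mathematical descriptions rather than pixels, the same file renders crisply at any size:

```python
# A vector image stores shape descriptions, not pixels. Scaling only
# changes how the same math is rendered, so no quality is lost.
svg = """<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 100 100">
  <circle cx="50" cy="50" r="40" fill="steelblue"/>
  <path d="M 20 80 L 50 20 L 80 80 Z" fill="none" stroke="black"/>
</svg>"""

# Write it out; any browser or vector editor (Inkscape included) can open it.
with open("example.svg", "w") as f:
    f.write(svg)
```

Open the resulting file in a browser and zoom in: the circle and triangle stay sharp, which is exactly why vector formats suit infographics.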
You may think of a survey invitation letter or email message as simply a delivery mechanism to send the questionnaire link to prospective respondents. The invitation may be an afterthought, hastily composed after the process of developing the questionnaire itself. However, a carefully crafted invitation has been proven to boost response rates, which are a key concern when conducting surveys. The following tips for writing invitation messages are all included in the 4th edition of Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, by Dillman, Smyth, and Christian (2014), an excellent resource for conducting all aspects of the survey process. It is evidence-based, drawing on an extensive body of research literature on survey practice.
Think of the survey invitation as a “communication plan,” utilizing multiple contacts with participants to elicit good response rates. Plan for a minimum of four contacts:
- A preliminary message to let your participants know you will be sending them a questionnaire (do not include the questionnaire link).
- An invitation message with a link to the questionnaire (2-3 days after the preliminary contact).
- A reminder notice, preferably only to those who have not responded (one week after the invitation message).
- A final reminder notice, also specifically to those who have not responded (one week after the first reminder).
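The four-contact timeline above can be sketched as a simple schedule calculation (the offsets follow the intervals suggested in the list; the function name and example date are ours):

```python
from datetime import date, timedelta

def contact_schedule(preliminary: date) -> dict:
    """Compute send dates for a four-contact survey invitation plan."""
    invitation = preliminary + timedelta(days=3)  # 2-3 days after preliminary
    reminder = invitation + timedelta(weeks=1)    # one week after invitation
    final = reminder + timedelta(weeks=1)         # one week after first reminder
    return {
        "preliminary": preliminary,   # no questionnaire link
        "invitation": invitation,     # includes questionnaire link
        "reminder": reminder,         # non-respondents only
        "final_reminder": final,      # non-respondents only
    }

schedule = contact_schedule(date(2016, 3, 1))
for step, when in schedule.items():
    print(step, when.isoformat())
```

Laying the dates out in advance like this makes it easier to reserve staff time for each mailing before the survey launches.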
Emphasize how the participants’ feedback will help your organization improve services or programs. This simple request appeals to a common desire among humans to help others. If applicable, emphasize that you need their advice specifically because of their special experience or expertise. It is best to use mail merge to personalize your email messages, so that each participant is personally invited by name to submit their feedback. If you are contacting people who have a relationship with your organization, such as your library users or members of your organization, play up that relationship. Also, make a commitment to share results with them at a later date. And be sure to keep that commitment!
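Mail merge is usually handled by your email or survey platform, but the idea can be sketched in a few lines (the names, roles, and wording below are made up for illustration):

```python
# Illustrative mail-merge: each recipient gets a personally addressed
# invitation built from one template. Data here is invented.
template = (
    "Dear {name},\n\n"
    "As a {role} of our library, your advice is especially valuable. "
    "We will share the survey results with you later this year.\n"
)

recipients = [
    {"name": "Dr. Alice Rivera", "role": "member"},
    {"name": "Sam Lee", "role": "frequent user"},
]

messages = [template.format(**r) for r in recipients]
print(messages[0])
```

Whatever tool you use, the principle is the same: one carefully written template, personalized per recipient, rather than a generic "Dear participant" blast.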
Phishing and email scams may make recipients wary of clicking links if an email message seems odd in any way. Make sure participants know they can trust your invitation email and survey link. Take opportunities to publicize your institutional affiliation, and incorporate logos or letterhead into your emails when possible. Provide names, email addresses, and phone numbers of one or two members of your evaluation team, so participants know whom to contact with questions or to authenticate the source of the email request. You may never get a call, but participants will feel better about answering questions if you give them convenient access to a member of the project team. It is also helpful to get a public endorsement of your survey project from someone who is known and trusted by your participants. You can ask someone influential in your organization to send the preliminary letter or message on your behalf. Also, publicize the project over social media channels or through organizational newsletters or blogs.
Be explicit about who will have access to individual-level data. Be sure you know the difference between anonymity (no one, including the evaluation team, can link responses to individual participants) and confidentiality (identifiable responses are seen only by a few specified people). You can also let participants know how you will protect their identity, but don’t go overboard: long explanations can also cast doubt on the trustworthiness of your invitation.
And finally, provide status updates when sending reminder messages. If you mention that you are getting great feedback from other respondents, it may motivate the late responders who want to match the behavior of their peers!
The National Library of Medicine (NLM) is accepting applications for its 2016-17 Associate Fellowship program, a one-year training program designed for recent library science graduates and early-career librarians. All U.S. and Canadian citizens who will have earned an MLS or equivalent degree in library/information science from an ALA-accredited school by August 2016 are eligible to apply. Priority is given to U.S. citizens. Applications and additional information are available on the NLM web site. The application deadline is February 12, 2016. Up to five candidates will be selected for the program.
The program is a one-year residency program (with an optional second year) for recent library science graduates interested in a career in health sciences librarianship. The program combines curriculum and project work and is located at the National Library of Medicine on the campus of the National Institutes of Health in Bethesda, Maryland.
The Associate Fellowship builds knowledge and skills through project work in areas such as:
- Data analysis of programs and services such as extramural grants, indexed journal articles, controlled vocabularies, datasets, and customer inquiries.
- Creation of online tutorials and educational awareness videos.
- Social media outreach.
- And more, including legislative tracking, web site enhancement, disaster information outreach studies, and review of next generation discovery interfaces.
The Associate Fellowship financial support includes:
- Annual stipend of $52,668.
- Additional funding to support purchase of group health insurance.
- Up to $1,500 in relocation support.
- Funding to support attendance at local and national conferences.
Photovoice is an evaluation method that engages program stakeholders (learners, service recipients, community members) in taking photographs and using them as springboards to express their experiences and points of view. With the prevalence of cameras in mobile devices, along with social media forums, many of us are already engaged in the foundational practices underlying photovoice: taking photos, posting them, and sharing our experiences. Add in facilitators who provide systematic method design, project management, and ethical oversight, and you have the potential to gather program insights that would remain untapped through traditional methods. The following two practical resources were written by action researchers describing lessons learned from conducting photovoice projects. The documents also show you, or link you to, photos and commentary from contributing participants.
One comprehensive guide comes from the Prairie Women’s Health Centre of Excellence (PWHCE), located in Canada. The center engages in collaborative, community-based research on social and other determinants of the health of women and girls. The center’s mission is to provide expert advice on social policies related to women’s health. The authors (Beverly Palibroda, Brigette Krieg, Lisa Murdock and Joanne Havelock) published A Practical Guide To Photovoice: Sharing Pictures, Telling Stories and Changing Communities, a nuts-and-bolts photovoice manual. It provides detailed advice, with periodic sidebars summarizing the process, and an appendix includes a helpful checklist. You will find sample photovoice entries throughout the document. The manual was written in 2009. Since then, the PWHCE has introduced digital story-telling into its portfolio of participatory methods.
Another guide was produced based on a photovoice project for Brainline.org, an educational website providing authoritative information about brain injury symptoms, diagnosis, and treatment. The project featured the stories of eight community members living with traumatic brain injury, presented in a gallery of essays. Facilitators Laura Lorenz and Barbara Webster developed a succinct facilitator guide based on this project.
Now available from the National Library of Medicine is an extensive selection from the John E. Fogarty Papers at Providence College, on the National Library of Medicine’s Profiles in Science web site. Profiles in Science is a digital project of the Library that provides online access to archival collections of twentieth-century leaders in science, medicine, and public health. John Edward Fogarty (1913–1967) was an American legislator who became known as “Mr. Public Health” for his outstanding advocacy of federal funding for medical research, health education, and health care services. As Democratic representative for Rhode Island, he served in the U.S. House of Representatives from 1941 to 1967, and chaired the House Appropriations Subcommittee for the Departments of Labor and Health, Education, and Welfare beginning in 1949. Under his leadership the budget for NIH grew from $37 million in 1949 to $1.24 billion in 1967. In 1947, Fogarty became convinced that more medical research and better health services were the surest way to help Americans prosper. As chairman of the subcommittee, he worked with a bipartisan coalition to rapidly expand funding for research at the National Institutes of Health, and to fund improved health and educational services for blind, deaf, and mentally disabled children. Fogarty also sponsored many bills for the construction of research facilities, expansion of medical, dental, and public health programs, and construction of community mental health centers. In fact, he contributed to virtually every piece of health-related legislation passed during this time. Fogarty’s achievements also included legislation to support medical and public libraries, including NLM.
The John E. Fogarty Papers Profiles in Science site features correspondence, legislative records, speeches, interviews, and photographs from the John E. Fogarty Papers held by the Phillips Memorial Library, Special and Archival Collections at Providence College in Providence, RI, along with photographs and other materials provided by the Fogarty family. Visitors to Profiles in Science can view, for example, photos from Fogarty’s early career, correspondence with constituents and colleagues, and the journal he kept during his Navy service in 1945. The site also includes a 2014 interview with former Congressman and Secretary of Defense Melvin R. Laird, whose bi-partisan partnership with Congressman Fogarty was instrumental in passing many pieces of legislation related to health care and medical research. The interview with Secretary Laird was made possible through the generosity of Mary Fogarty McAndrew. An in-depth historical narrative leads to a wide range of primary source materials that provide a window into John Fogarty’s life and major contributions to the growth of medical research, public health, and social legislation. Visitors may also view a brief chronology of Fogarty’s life, and a further readings page, as well as search and browse the collection.
Forget about elevator speeches. Think elevator conversations instead. The elevator pitch, a little promotional speech of elevator-ride length that you can slip into small talk when you run into “someone influential,” is one of a number of strategies for stealthily promoting your organization’s successful programs and services. You can add nuggets of evaluation findings to these mini-speeches to demonstrate program value. But you may be missing a key element in the elevator pitch exchange: the other person. For insight, review the article by Tim David, Your Elevator Pitch Needs an Elevator Pitch, which appeared in the Harvard Business Review (10 December 2014) and emphasizes the importance of engaging your fellow elevator traveler rather than talking “at” him or her. This leads to a conversation rather than a speech. It is notable how the author seamlessly slips in evidence to support his low-key pitch. For example, he casually inserts a statistic, presumably obtained from a follow-up evaluation with one of his client organizations, that productivity and morale increased 38% after his training, underscoring the value his service provided.
Here are several other tips from the article:
- Answer polite but perfunctory questions (such as “what does your office do?”) with a surprising answer.
- Use questions to draw your elevator companion into the conversation. David suggests that you talk no more than 20% of the time. Yield the remainder of the time to the other traveler, but use questions to keep the conversation rolling.
- Don’t worry too much about that 20-second time frame traditionally recommended for elevator pitches. If you successfully engage your fellow rider, he or she will hold the elevator door open to continue the chat.
The elevator pitch format is a good addition to your story-telling tool kit. But it may take some practice to be able to present an elevator pitch casually and conversationally. If you’re up for that challenge, then check out Tim David’s article for some excellent guidelines!
Recently the NN/LM Outreach Evaluation Resource Center (OERC) investigated web sites offering reviews of multiple online survey tools, yielding the following list of five resources as a starting point. In addition, there are individual reviews of online survey products on a variety of websites and blogs, which are not included in this list.
Zapier.com’s Ultimate Guide to Forms and Surveys, Chapter 7 “The 20 Best Online Survey Builder Tools”
This resource compares 20 different online survey tools. There is a chart with a brief statement of what each survey tool does best, what you get for free, and the lowest plan cost, followed by a paragraph-length description of each tool. Note: this is part of an eBook published in 2015 which includes chapters like “The Best Online Form Builders for Every Task.”
Appstorm.net’s “18 Awesome Survey & Poll Apps”
This review was posted on May 27, 2015, so the information is likely to be reasonably current. While the descriptions are very brief, it is good for a quick comparison of the survey products. Each review notes whether there is a free account, whether surveys can be customized, and whether ready-made templates are available.
Capterra.com’s “Top Survey Software Products”
This resource appears almost too good to be true. No date is shown, however, so the specifics in the comparisons may not be current. Nevertheless, this website lists over 200 survey software products, has separate profile pages on each product (with varying amounts of detail), and lists the features that each product offers. You can even narrow down the candidates by filtering by feature. Hopefully the features in Capterra’s database are kept updated for each product. One caveat: at least two fairly well-known survey products are not on the list.
AppAppeal.com’s “Top 31 Free Survey Apps”
Another review site with no date listed. This one compares 31 apps by popularity, presumably in the year the article was written. What is unique about this review site is that each in-depth review covers the history and popularity of the app, how it differs from other apps, and recommended users. Many of the reviews include videos showing how to use the app.
TopTenReviews.com’s 2015 Best Survey Software Reviews and Comparisons
This website has the feel of Consumer Reports. It has a long article explaining why you would use survey software, how and what the reviewers tested, and the kinds of things that are important when selecting survey software. Also like Consumer Reports, it has ratings of each product (including the experiences of the business, the respondents, and the quality of the support), and individual reviews of each product showing pros and cons. With the date included in the title of the review, the information is most likely current.
The U.S. Department of Health and Human Services (HHS) has issued a Notice of Proposed Rule Making (NPRM) to revise the Common Rule for the protection of human participants in research. It is now posted on the Federal Register’s public inspection website and will appear on Tuesday, September 8, in the printed version of the Federal Register for a 90-day public comment period. The major reforms propose to: 1) calibrate oversight to level of risk; 2) enhance respect for research participants; 3) facilitate broad participation in research; 4) increase privacy and security safeguards for research with biospecimens and data; 5) simplify consent documents; and 6) streamline IRB review. All interested stakeholders are encouraged to review the proposed revisions and make comments.