
Archive for the ‘News’ Category

Webinars and Workshops about Evaluating Outreach

The National Network of Libraries of Medicine Outreach Evaluation Resource Center (OERC) offers a range of webinars and workshops upon request by network members and coordinators from the various regions. Take a look at the list and see if one of the options appeals to you. To request a workshop or webinar, contact Susan Barnes.

The workshops were designed as face-to-face learning opportunities, but we can tailor them to meet distance-learning needs by distilling them into briefer webinars or offering them as a series of one-hour webinars.

Don’t see what you’re looking for on this list? Then please contact Susan and let her know!

We’re looking forward to hearing from you.

Interview tips: Talking with participants during a usability test

The Nielsen Norman Group (NNG) conducts research and publishes information about user experience with interfaces. NNG was an early critic of the troubled “healthcare.gov” web site: “Healthcare.gov’s Account Setup: 10 Broken Usability Guidelines.” A recent post (“Talking with participants during a usability test”) provides tips for facilitating usability tests that could be useful whenever you’re facilitating a discussion or conducting an observation. When in doubt about whether to speak to a participant, count to 10 before deciding whether to say something. Consider using the “Echo,” “Boomerang,” or “Columbo” approaches:

  • Echo–repeat the participant’s last word or phrase, using an interrogatory tone.
  • Boomerang–formulate a nonthreatening question that “pushes” a user’s comment back and prompts them to answer it themselves, such as “What would you do if you were on your own?”
  • Columbo–be smart but don’t act that way, as in the “Columbo” TV series from the 1960s and 1970s starring Peter Falk.

The full article “Talking with participants during a usability test” provides audio examples of these techniques. You can find a wealth of additional information about usability testing on the Nielsen Norman Group’s web site, such as “How to Conduct Usability Studies” and “Usability 101: Introduction to Usability.”

Cleaning Up Your Charts

So how are those New Year’s resolutions going?

Many of us like to start the year resolving to clean up some part of our lives. Our diet. Our spending habits. The five years of magazine subscriptions sitting by our recliner.

Here’s another suggestion: Resolve to clean up “chart junk” in the charts you add to PowerPoint presentations or written reports.

Now I can pack information into a bar chart with the best of them. But it is no longer in vogue to clutter charts with data labels, gridlines, and detailed legends. This is not just a fashion statement, either. Design experts point out that charts should make their point without the inclusion of a bunch of distracting details. If the main point of your chart is not visually obvious, you either have not designed it correctly or you are presenting a finding that is not particularly significant.

So the next time you create a chart, consider these suggestions (a code sketch applying them follows the list):

  • Use your title to communicate the main point of the chart. Take a tip from newspaper headlines and make your title a complete sentence.
  • Don’t use three-dimensional displays; they interfere with people’s comprehension of charts.
  • Ditch the gridlines or make them faint so they don’t clutter the view.
  • Use contrast to make your point. Add a bright color to the bar or line that carries the main point and use gray or another faint color for the comparison bars or lines.
  • Be careful in picking colors. Use contrasting colors that are distinguishable to people with colorblindness. If your report is going to be printed, be sure the contrast still shows up when presented in black-and-white.
  • Consider not using data labels, or just label the bar or line associated with your main point.
  • Remove legends and place labels directly inside the bars or at the ends of lines.
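
To make these suggestions concrete, here is a minimal sketch in Python using matplotlib. The data, labels, and title are invented for illustration; the sketch shows a sentence-style title, no gridlines, gray comparison bars with one highlighted bar, and a data label on only the bar that carries the main point.

    # A minimal sketch (invented example data) applying the chart-cleanup
    # suggestions above, using Python and matplotlib.
    import matplotlib.pyplot as plt

    # Hypothetical data: outreach training sessions delivered per quarter.
    quarters = ["Q1", "Q2", "Q3", "Q4"]
    sessions = [12, 15, 14, 23]

    fig, ax = plt.subplots()

    # Use contrast: gray comparison bars, one bright bar for the main point.
    colors = ["#b0b0b0", "#b0b0b0", "#b0b0b0", "#1f77b4"]
    ax.bar(quarters, sessions, color=colors)

    # Make the title a complete sentence that states the chart's main point.
    ax.set_title("Training sessions increased sharply in Q4")

    # Ditch the gridlines and other chart junk.
    ax.grid(False)
    for spine in ["top", "right", "left"]:
        ax.spines[spine].set_visible(False)
    ax.tick_params(left=False)
    ax.set_yticks([])

    # Skip the legend; label only the bar that carries the main point.
    ax.text(3, sessions[3] + 0.5, str(sessions[3]), ha="center")

    plt.tight_layout()
    plt.show()

Whatever charting tool you use, the principle is the same: strip everything that does not support the chart’s one main message.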

For more comprehensive information on eliminating chart junk, check out this article:

Evergreen S, Metzner C. Design principles for data visualization in evaluation. In: Azzam T, Evergreen S (eds). Data visualization, part 2. New Directions for Evaluation. Winter 2013:5-20.

“Evidence” — what does that mean?

In our health information outreach work we are expected to provide evidence of the value of our work, but there are varying definitions of the word “evidence.” The classical evidence-based medicine approach (featuring results from randomized controlled clinical trials) is a model that is not always relevant to our work. At the 2013 EBLIP7 meeting in Saskatoon, Saskatchewan, Canada, Denise Koufogiannakis presented a keynote address that is now available as an open-access article on the web:

Koufogiannakis D. “What We Talk About When We Talk About Evidence.” Evidence Based Library and Information Practice. 2013;8(4).

This article looks at various interpretations of what it means to provide “evidence,” such as:

  • theoretical (ideas, concepts, and models to explain how and why something works),
  • empirical (measuring outcomes and effectiveness via empirical research), and
  • experiential (people’s experiences with an intervention).

Koufogiannakis points out that academic librarians’ decisions are usually made in groups of people working together, and she proposes a new model for evidence-based library and information practice:

1) Articulate – come to an understanding of the problem, set boundaries, and clearly articulate a problem that requires a decision.

2) Assemble – gather evidence from the multiple sources most appropriate to the problem at hand.

3) Assess – place the evidence against all components of the wider overarching problem. Evaluate and weigh the evidence sources for quantity and quality, and determine what the evidence says as a whole.

4) Agree – determine the best way forward and, if working with a group, try to achieve consensus based on the evidence and organizational goals. Then determine a course of action and begin implementing the decision.

5) Adapt – revisit goals and needs. Reflect on the success of the implementation, evaluate how the decision has worked in practice, and reflect on your role and actions. Discuss the situation with others and determine any changes required.

Koufogiannakis concludes by reminding us that “Ultimately, evidence, in its many forms, helps us find answers. However, we can’t just accept evidence at face value. We need to better understand evidence – otherwise we don’t really know what ‘proof’ the various pieces of evidence provide.”

Institute for Research Design in Librarianship: 9 days in southern CA; full scholarships available

Do you want to learn about how your user groups and communities find and use information? Do you want to gather evidence to demonstrate that your work is making a difference?

Exciting news! You can work on these questions, and questions like them, June 16-26, 2014!

The Institute for Research Design in Librarianship is a great opportunity for academic librarians interested in conducting research. (Research and evaluation are not identical, but they are closely related and employ many of the same methods.) The Institute is open to academic librarians from all over the country. If your proposal is accepted, your attendance at the Institute will be paid for, as will your travel, lodging, and food expenses.

The William H. Hannon Library has received a three-year grant from the Institute for Museum and Library Services (IMLS) to offer a nine-day continuing education opportunity for academic and research librarians. Each year 21 librarians will receive instruction in research design and a full year of support to complete a research project at their home institutions. The summer Institute for Research Design in Librarianship (IRDL) is supplemented with pre-institute learning activities and a personal learning network that provides ongoing mentoring. The institutes will be held on the campus of Loyola Marymount University in Los Angeles, California.

The Institute is particularly interested in applicants who have identified a real-world research question and/or opportunity. It is intended to

“bring together a diverse group of academic and research librarians who are motivated and enthusiastic about conducting research but need additional training and/or other support to perform the steps successfully. The institute is designed around the components of the research process, with special focus given to areas that our 2010 national survey of academic librarians identified as the most troublesome; the co-investigators on this project conducted the survey to provide a snapshot of the current state of academic librarian confidence in conducting research. During the nine-day institute held annually in June, participants will receive expert instruction on research design and small-group and one-on-one assistance in writing and/or revising their own draft research proposal. In the following academic year, participants will receive ongoing support in conducting their research and preparing the results for dissemination.”

Your proposal is due by February 1, 2014. Details are available at the Institute’s Prepare Your Proposal web site.

Factoid: Loyola Marymount is on a bluff above the Pacific Ocean, west of central LA.

New, Improved, and Available Now!

The 2nd Edition of the Planning and Evaluating Health Information Outreach Projects series of 3 booklets is now available online:

Getting Started with Community-Based Outreach (Booklet 1)
What’s new? More emphasis and background on the value of health information outreach, including its relationship to the Healthy People 2020 Health Communication and Health Information Technology topic areas.

Planning Outcomes-Based Outreach Projects (Booklet 2)
What’s new? Focus on uses of the logic model planning tool beyond project planning, such as providing approaches to writing proposals and reports.

Collecting and Analyzing Evaluation Data (Booklet 3)
What’s new? Step-by-step guide to collecting, analyzing, and assessing the validity (or trustworthiness) of quantitative and qualitative data, using questionnaires and interviews as examples.

These are all available free to NN/LM regional offices and network members. To request printed copies, send an email to nnlm@uw.edu.

PDF versions of all three booklets (not Section 508 compliant) are available here: http://nnlm.gov/evaluation/guides.html#A2.

The Planning and Evaluating Health Information Outreach series, by Cynthia Olney and Susan Barnes, supplements and summarizes material in Cathy Burroughs’ groundbreaking work from 2000, Measuring the Difference: Guide to Planning and Evaluating Health Information Outreach. Printed copies of Burroughs’ book are also available free—just send an email request to nnlm@uw.edu.

How clinicians use information resources at the point of care–a grounded theory study

An interesting study, conducted by clinicians, of how clinicians use information resources appeared in a recent issue of JAMA Internal Medicine:

Cook DA, Sorensen KJ, Wilkinson JM, Berger RA. “Barriers and decisions when answering clinical questions at the point of care: a grounded theory study.” JAMA Intern Med, published online August 26, 2013. [Epub ahead of print; PMID: 23979118]

This article provides details about the steps the researchers took in their qualitative study of how 50 primary care and subspecialist internal medicine and family medicine physicians use online information resources (such as UpToDate, MD Consult, Micromedex, and publicly available Internet resources) to answer clinical questions at the point of care. You can find details here about how the focus groups were conducted, how the participants were selected, and how the data were collected and analyzed. The article provides a great template for collecting qualitative information via focus groups, and it ends with an unsurprising conclusion:

“Physicians perceive that insufficient time is the greatest barrier to point-of-care learning, and efficiency is the most important determinant in selecting an information source.”

Photovoice – Evaluation through Photography

A picture’s worth a thousand words, and a method called photovoice takes advantage of pictures’ compelling qualities by incorporating photography into research and evaluation. Photovoice is a participatory evaluation method in which program participants are given cameras to capture images that convey their feelings, beliefs and experiences about an issue. The method is used frequently in advocacy projects, allowing the less powerful stakeholders to communicate about issues that impact their lives.

Photovoice seems to be a particularly popular way to engage youth in projects or in evaluation. For examples of photovoice projects with teenagers, check out the two articles listed at the end of this blog entry. The project described in Necheles et al. used photovoice to engage teenagers in identifying influences over their own health behavior. These teens then developed materials such as posters to advocate for healthier lifestyles among their peers. The article by Strack, Magill and McDonagh presents a project in which teens identified problems in their neighborhoods through photovoice. Both articles provide abundant advice for conducting photovoice projects, including how to engage youth in analyzing photos and ideas for presenting results.

Some photovoice projects carry potential risk for participants, and participants must be taught how to obtain and document consent from others who appear in their photos. Consequently, photovoice projects require above-average planning and management. For an excellent resource on managing photovoice projects, check out photovoice.org.

Resources:

Necheles JW, et al. The Teen Photovoice Project: a pilot study to promote health through advocacy. Prog Community Health Partnersh. 2007;1(3):221-229. Available at PubMed Central: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2837515/

Strack RW, Magill C, McDonagh K. Engaging youth through photovoice. Health Promot Pract. 2004;5:49-58. Available at http://www.ncbi.nlm.nih.gov/pubmed/14965435

Photovoice.org, particularly the organization’s methodology section at http://www.photovoice.org/shop/info/methodology-series

How to Ignite Your Presentation: AEA Training Webinar

On July 27, 2012, Stephanie Evergreen, eLearning Initiatives Director for the American Evaluation Association, gave a half-hour webinar about the Ignite approach to giving presentations. This approach involves a five-minute presentation based on 20 slides, each shown for 15 seconds. (Yes, this is similar to Pecha Kucha.) The American Evaluation Association, which is conducting a “Potent Presentations” initiative to help its members improve their reporting skills, has made the recording and slides for this great presentation available in its free AEA Public Library.

In her short, practical webinar, Stephanie demonstrated the Ignite approach with a great presentation about “Chart Junk Extraction”—valuable tips for creating streamlined, readable charts with maximized visual impact.  Spend an enjoyable and enlightening few minutes viewing the fast-paced and interesting “Light Your Ignite Training Webinar”—you can even learn how to set your PowerPoint timer to move forward automatically every 15 seconds so that you can practice your Igniting!

How to Analyze Qualitative Data

Duncan V, Holtslander L. “Utilizing grounded theory to explore the information-seeking behavior of senior nursing students.” J Med Libr Assoc. 2012;100(1):20-27.

In this very practical article, the authors describe the steps they took to analyze qualitative data from written records that nursing students kept about their experiences with finding information in the CINAHL database. They point out that, although the ideal way to gather data about students’ information-seeking behavior would be direct observation, that approach is not always practical. Also, self-reporting via surveys and interviews may introduce bias because members of sample populations might “censor themselves instead of admitting an information need.” For this study, students were asked to document their search process using an electronic template that included “prompts such as resource consulted, reason for choice, terms searched, outcome, comments, and sources consulted (people).”

After reviewing these searching journals, the authors followed up with interviews.

The “Data analysis and interpretation” section of this article provides a clear, concise description of the grounded theory approach to analyzing qualitative data through initial, focused, and theoretical coding in the NVivo 8 software. [Note: as of this writing, the latest version is NVivo 10.]

  • Initial codes:  “participants’ words were highlighted to create initial codes that reflected as closely as possible the participants’ own words.”
  • Focused codes:  “more directed, selective, and conceptual than word-by-word, line-by-line, and incident-by-incident coding.”
  • Theoretical codes:  focused codes were compared and contrasted “in order to develop the emerging theory of the information-seeking process.”

The authors reviewed the coding in follow-up interviews with participants to check the credibility of their findings:  “The central theme that united all categories and explained most of the variation among the data was ‘discovering vocabulary.’”  They recommend “teaching strategies to identify possible words and phrases to use” when searching for information.

You can do this even if you don’t have access to NVivo software. Here’s an illustration: “Summarize and Analyze Your Data” from the OERC’s Collecting and Analyzing Evaluation Data booklet.
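
If you are coding journals or transcripts by hand, even a short script can help you keep track of the codes you assign. Here is a minimal sketch in Python; the excerpts, code names, and groupings below are invented for illustration (they are not taken from the Duncan and Holtslander study) and simply show one way to tally initial codes and roll them up into focused codes.

    # A minimal sketch (invented codes and excerpts) of tracking hand-assigned
    # qualitative codes without specialized software.
    from collections import Counter

    # Hypothetical coded excerpts: (initial code, excerpt from a search journal).
    coded_excerpts = [
        ("trying synonyms", "searched 'pressure sores', then 'pressure ulcers'"),
        ("trying synonyms", "switched 'heart attack' to 'myocardial infarction'"),
        ("unsure of term", "not sure what CINAHL calls this concept"),
        ("asked a person", "asked the librarian which subject heading to use"),
    ]

    # Group related initial codes under broader focused codes by hand.
    focused_codes = {
        "trying synonyms": "discovering vocabulary",
        "unsure of term": "discovering vocabulary",
        "asked a person": "consulting people",
    }

    # Tally how often each initial and focused code was applied.
    initial_counts = Counter(code for code, _ in coded_excerpts)
    focused_counts = Counter(focused_codes[code] for code, _ in coded_excerpts)

    print("Initial codes:", dict(initial_counts))
    print("Focused codes:", dict(focused_counts))

Running this prints each code with its frequency, a small-scale version of the comparison step that leads from focused codes toward a central theme.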