Planning & Evaluating
Health Information
Outreach Projects

Booklet Two

PLANNING OUTCOMES-BASED
OUTREACH PROJECTS

2nd Edition - Outreach Evaluation Resource Center 2013

 

 

Figure 1: Planning the Program and Evaluation Methods

STEP 1

Plan Your Program with a Logic Model

  • Start with Outcomes
  • Connect Activities to Outcomes
  • Identify Inputs Needed to Conduct Activities
  • Finish with a Reality Check
  • Get Input from Your Team of Advisors
STEP 2

Use Your Logic Model for Process Assessment

  • Plan Ahead for Data Collection
  • Conduct Audience Analysis
  • Track Progress, Make Needed Changes, and Identify Lessons Learned
STEP 3

Use Your Logic Model to Develop an Outcomes Assessment Plan

  • Identify Quantifiable Indicators
  • Choose a Target and Time Frame
  • Write Objectives
  • Create an Action Plan

 

 

Figure 2: A Logic Model's "If...Then" Concepts

Inputs

Activities

Outcomes

What we invest

What we do - Who we reach

  • Short-Term (Learning)
  • Intermediate (Action)
  • Long-Term (Conditions)

If we get these resources...

...and conduct these activities to reach these people...

...then we will accomplish these outcomes.

 

 

Figure 3: Basic Logic Model Template *

Program: Health Information Outreach Program
Goal: Improve community members' abilities to find, evaluate, and use health information

Inputs: What we invest
  • Staff
  • Volunteers
  • Time
  • Money
  • Research findings
  • Materials
  • Equipment
  • Technology
  • Partners

Activities: What we do
  • Conduct workshops and meetings
  • Train
  • Deliver services
  • Develop products, curricula, resources
  • Facilitate access to information
  • Work with media

Activities: Who we reach
  • Participants
  • Clients
  • Agencies and community-based organizations (CBOs)
  • Decision-makers
  • Customers
  • Clinical professionals
  • Members of CBOs

Outcomes: Why we do it

Short-term results (Learning)
  • Awareness
  • Knowledge
  • Attitudes
  • Skills
  • Opinions
  • Aspirations
  • Motivations

Intermediate results (Action)
  • Behavior
  • Practice
  • Decision-making
  • Policies
  • Social Action

Long-term results (Conditions)
  • Health
  • Social
  • Economic
  • Civic
  • Environmental

Assumptions

(Should be confirmed before beginning the program)

External Factors

(Should be identified before beginning the program)

*Adapted from the U.S. Government Accounting Office [8], the University of Wisconsin-Extension [7], and the W.K. Kellogg Foundation [9]

 

 

Figure 4: Examples of Different Outcomes

Types of Outcomes and Examples
Individual Level
Cognitive
  • Increased awareness of Internet-based health resources
  • Improved understanding of side effects of a prescription drug
  • Improved knowledge of how to control a chronic health condition such as hypertension or diabetes
Affective
  • Increased confidence in finding good health information
  • Increased confidence in asking questions of a physician
Skills
  • Improved ability to distinguish reliable from unreliable health information
  • Improved ability to manage health issues (e.g., prevent asthma attacks; cook with less salt to manage hypertension)
Quality-of-Care
  • Increased use of Internet resources to supplement information from health care providers
  • Increased use of health information when making health care decisions
Community Level
Environmental
  • Improved community access to the Internet
  • Improved reliability of Internet service in a community organization
Social
  • Increased number of volunteers available to help members of the community access online health resources

 

 

Figure 5: Kirkpatrick's Four Levels of Training Evaluation

Level 4 - Results
  • Long-Term Changes
  • Community-Level Impacts
  • Public Health
Level 3 - Behavior
  • Use Knowledge
  • Change Practices
Level 2 - Learning
  • Change Attitudes
  • Improve Knowledge
  • Increase Skills
Level 1 - Reaction
  • Satisfaction
  • Motivation

Note: The Kirkpatrick model has traditionally been portrayed as a triangle. Kirkpatrick himself changed the image in recent years, and Figure 5 reflects his current portrayal of the model.

 

 

Figure 6: The Innovation-Decision Process

Flowchart depicting the flow: Knowledge → Persuasion → Decision → Implementation → Confirmation.

This is the process that individuals go through when adopting a new product, resource or behavior.

 

 

Figure 7: Assumptions in Program Planning

Category and Examples of Assumptions
Target Population
  • They are interested in your activities
  • They can be motivated to participate
  • They will be available to participate
Environment
  • Convenient and reliable access to computers and the Internet can be obtained
  • Access to convenient and suitable facilities for your activities will be available
Staff
  • Staff members have knowledge and skills to implement the program
  • These staff have the time and resources to work on the project
  • These staff are motivated and committed to participate

 

 

Figure 8: Using a Logic Model in Proposal Writing

Logic Model Column    Proposal Section
Inputs                Budget
Activities            Strategies
Outcomes              Results and evaluation
Assumptions           Reviewers' questions
External Factors      Support and barriers

 

 

Figure 9: Using a Logic Model for Reporting

Logic Model Column    Report Section
Inputs                What you needed
Activities            What you did
Outcomes              What you accomplished
Assumptions           Background
External Factors      Background

 

 

Figure 10: Process Assessment Questions and Methods

Process Question: To what extent were you able to implement your project as planned?
Information to collect:
  • How well did the project staff follow procedures in the plan?
  • What factors increased or decreased the quality of delivery?
Methods:
  • Focused staff feedback sessions
  • Observations of activities

Process Question: To what extent were you able to conduct specific activities as they were planned?
Information to collect:
  • How many promotional items were given away?
  • How many training sessions were offered?
  • How many hours of support were provided to community members?
  • How many hours were computers and Internet labs available to your target community?
Methods:
  • Counts of promotional materials
  • Counts of activities (such as exhibits, training sessions, etc.)
  • Total hours of support provided in project period
  • Total hours of computer and Internet availability
  • Checklists for staff to report what resources they demonstrated or taught

Process Question: How much community interest and activity did your project generate?
Information to collect:
  • How many people attended your activities?
  • How many people completed activities (e.g., participated in all sessions of a multi-day training)?
  • How many people requested assistance?
  • How many people used the equipment or websites made available through your project?
Methods:
  • Attendance counts for events or training sessions
  • Feedback forms from participants asking them to evaluate their experience
  • Reference desk usage counts
  • Visitor counts for computer labs, kiosks, etc.
  • Web traffic statistics for websites

Process Question: To what extent did you reach your intended community?
Information to collect:
  • What proportion of all participants were from your priority target community?
Methods:
  • Percentage of participants from high-need groups (e.g., low-income participants; residents of medically underserved areas)

Process Question: How effective were your recruitment strategies for attracting community members?
Information to collect:
  • What strategies worked well to attract community members, and what barriers impacted recruitment?
  • What strategies helped you maintain participant involvement as needed, and what barriers did you face?
Methods:
  • Written feedback forms asking users what attracted them to activities
  • Counts of participants who completed all activities (e.g., all sessions of a multi-day training)
  • Feedback sessions with project staff
  • Interviews with participants
  • Interviews with members of your community-based partner organization that experienced or contributed to your project

Process Question: What situational factors in the environment, community, or organizations affected project implementation?
Information to collect:
  • What influenced project staff's ability to implement the project?
  • What influenced users' reactions to the program or their ability to participate in activities?
Methods:
  • Focus groups with participants or users
  • Interviews with participants or staff from partnering organizations
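
Several of the process measures above (reach into the priority community, completion of all sessions) reduce to simple ratios over attendance records. A minimal sketch, assuming a list of participant records with hypothetical keys `priority_community` and `completed_all`:

```python
def reach_and_completion(participants):
    """Compute the percentage of participants from the priority community
    and the percentage who completed all sessions.

    participants: list of dicts with hypothetical boolean keys
    'priority_community' and 'completed_all'.
    """
    n = len(participants)
    if n == 0:
        return 0.0, 0.0
    reach = 100 * sum(p["priority_community"] for p in participants) / n
    completion = 100 * sum(p["completed_all"] for p in participants) / n
    return reach, completion

# Illustrative records for four participants (hypothetical data).
records = [
    {"priority_community": True,  "completed_all": True},
    {"priority_community": True,  "completed_all": False},
    {"priority_community": False, "completed_all": True},
    {"priority_community": True,  "completed_all": True},
]
reach, completion = reach_and_completion(records)  # 75.0, 75.0
```

In practice the booleans would come from registration forms (for reach) and attendance sheets (for completion), as described in the table above.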

 

 

Figure 11: Examples of Outcomes with Indicators and Objectives

Outcome: Participants will feel more confident about locating high-quality health information on the Internet
Indicator: Participants will indicate on the training evaluation form that they are more confident about locating high-quality health information on the Internet
Objective: One month after a training session, 50% of participants will report feeling more confident about locating high-quality health information on the Internet

Outcome: Diabetes patients will discuss information they found on MedlinePlus with their diabetes educator
Indicator: Diabetes educator will track the number of diabetes education class participants who bring MedlinePlus information to discuss in class or at appointments
Objective: Three months after the training session, 50% of diabetes patients trained to use MedlinePlus will report having a discussion with their diabetes educator about the information they found on MedlinePlus

Outcome: Teenagers will use MedlinePlus to get health information for a family member after they receive training
Indicator: Teenagers will indicate in a questionnaire that they got MedlinePlus information for family members
Objective: 50% of teenagers trained to use MedlinePlus will report getting health information for a family member within a month after training

Outcome: Library staff will use NLM resources more often after being trained on these resources
Indicator: Library computers will show more hits to the library's NLM resource web page after library staff members have been trained
Objective: There will be a 25% increase in the number of visits to the library's NLM resource web page from the library's computers six months after all library staff members have completed training

 

 

Figure 12: Evaluating Findings Using Success Criteria

Objective: One month after a training session, 50% of participants will report feeling more confident about locating high-quality health information on the Internet

Measurable Indicator: % of participants who report feeling more confident about locating high-quality health information on the Internet
Target: 50% of participants
Time frame: One month after the training session

Data Source: Training participants
Evaluation Method: Post-training electronic questionnaire sent to all training participants
Data Collection Timing: Participants will receive the survey approximately one month after their training
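
Applying the success criterion above is simple arithmetic: compare the proportion of respondents who report more confidence against the 50% target. A minimal sketch (the function name and survey data are illustrative, not part of the booklet):

```python
def meets_target(responses, target=0.50):
    """Check a success criterion against yes/no survey responses.

    responses: list of booleans -- True if the participant reported
    feeling more confident locating high-quality health information.
    target: success criterion as a fraction (here, 50% of participants).
    Returns (proportion, met).
    """
    if not responses:
        return 0.0, False
    proportion = sum(responses) / len(responses)
    return proportion, proportion >= target

# Example: 12 of 20 respondents (60%) report more confidence,
# so the 50% target is met.
proportion, met = meets_target([True] * 12 + [False] * 8)
```

The same calculation works for any objective stated as "X% of participants will report Y within the time frame."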

 

 

Figure 13: Evaluating Findings Using Change Over Time

Objective: There will be a 25% increase in the number of visits to the library's NLM resource web page from the library's computers within six months after all library staff members have completed training

Measurable Indicator: % increase in the number of visits to NLM resource websites
Target: 25% increase
Time frame: Six months after library staff members have been trained

Data Source: Web traffic data from library computers
Evaluation Method: Pre/post training comparison of number of visits
Data Collection Timing: Total number of visits to NLM resources three months prior to staff training (baseline) and total number of visits for the three months after staff training
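
The change-over-time comparison above reduces to a percent-change calculation between the baseline and post-training visit counts. A minimal sketch (the counts and function name are illustrative):

```python
def percent_change(baseline, post):
    """Percent change in visit counts from the baseline period to the
    post-training period, e.g. for comparing web traffic totals."""
    if baseline == 0:
        raise ValueError("baseline count must be nonzero")
    return (post - baseline) / baseline * 100

# Example: 400 visits in the three months before training,
# 520 in the three months after -- about a 30% increase,
# which exceeds the 25% target in the objective.
change = percent_change(400, 520)
target_met = change >= 25
```

Note that both periods should be the same length (here, three months) so that the comparison is not skewed by the measurement window.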