
5 Steps to an Evaluation

Step 4: Create an Evaluation Plan

The fourth step in designing your evaluation is to create the evaluation plan. An evaluation plan describes how the project will be evaluated. It includes a description of the evaluation's purpose, the evaluation questions, a timetable/work plan, a description of the data collection tools to be used, an analysis framework, and a section articulating how data will be used and disseminated. An evaluation plan is often a key component of a grant proposal, but it will also serve as your guide for implementing the evaluation. This phase is completed in three segments:

  1. Defining evaluation questions
  2. Developing the evaluation design
  3. Conducting an ethical review

Defining Evaluation Questions

Evaluation questions help define what to measure and provide clarity and direction to the project.

Process evaluations and outcome evaluations are two common types of evaluations with different purposes. Consider which makes the most sense for you and the objectives of your evaluation (you DO NOT need to do both). Then explore the resources to inform your evaluation plan.

Process Questions - Are you doing what you said you'd do?

  • Process evaluation questions address program operations – the who, what, when, and how many related to program inputs, activities, and outputs.
  • The CDC recommends the following process to guide development of process evaluation questions that reflect the diversity of stakeholder perspectives and the program's most important information needs:
    1. Gather your stakeholders: The engagement of stakeholders involved in planning the program may vary by context. It may be best to meet together to develop the questions, or the person(s) in charge of the evaluation plan may develop a list of questions and solicit feedback before finalizing the list.
    2. Review supporting materials: These may include the program design documents, logic model, work plan, and/or community-level data available through external sources.
    3. Brainstorm evaluation questions: Start with a specific program activity, but be sure to consider the full program. Consider goals and objectives from the strategic plan and the inputs, activities, and outputs from the program logic model to create process evaluation questions.
    4. Sort evaluation questions into categories that are relevant to all stakeholders: It is difficult to limit evaluation questions, but few programs have the time or resources to answer them all. Prioritize those that are most useful to all stakeholders.
    5. Decide which evaluation questions to answer: Prioritize questions that:
       • Are important to program staff and stakeholders
       • Address the most important program needs
       • Reflect program goals and objectives outlined in any program strategy or design documents
       • Can be answered using the time and resources available to program staff, including staff expertise
       • Provide relevant information for making program improvements
    6. Verify questions are linked to the program: Once questions are agreed upon, revisit your strategic plan, work plan, and/or logic model to ensure the questions are linked to these program documents.
    7. Determine how to collect the data required: This includes determining who will be responsible for collecting and analyzing the data, when the data can be collected, and from whom the data will be collected (a simple way to organize this is sketched below).
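
One way to capture that last step is a simple data-collection matrix that pairs each prioritized question with a data source, an owner, and a timeline. The sketch below is a minimal, hypothetical illustration in Python; the questions, sources, and roles are placeholders, not part of the NNLM or CDC materials.

```python
# A minimal data-collection matrix for prioritized evaluation questions.
# All questions, sources, and names below are hypothetical placeholders.
data_collection_plan = [
    {
        "evaluation_question": "How many training sessions were delivered?",
        "data_source": "Attendance logs",
        "collected_by": "Program coordinator",
        "collected_from": "Session sign-in sheets",
        "when": "After each session",
    },
    {
        "evaluation_question": "Did participants find the materials useful?",
        "data_source": "Post-session survey",
        "collected_by": "Evaluation lead",
        "collected_from": "Program participants",
        "when": "End of each quarter",
    },
]

# Quick check that every question has a source, an owner, and a timeline.
for row in data_collection_plan:
    print(row["evaluation_question"])
    print(f"  Source: {row['data_source']} | Who: {row['collected_by']} | When: {row['when']}")
```

The same information could just as easily live in a spreadsheet; the point is that every prioritized question has a named owner, a data source, and a collection schedule before data collection begins.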

Outcome Questions - Are you accomplishing the WHY of what you wanted to do?

  • Outcome evaluation questions address the changes or impact seen as a result of program implementation.
  • Use the same CDC process for developing process evaluation questions to develop outcome evaluation questions.
    • Consider whether the impact assessed relates to short-term, intermediate, or long-term outcomes outlined in your logic model.
  • Outcome Objective Blank Worksheet (bottom section), from Book 2 Worksheet: Outcome Objectives

Evaluation Design

  • Evaluation design influences the validity of the results.
  • Most NNLM grant projects will use a non-experimental or quasi-experimental design.
    • Quasi-experimental evaluations include surveys of a comparison group - individuals not participating in the program but with similar characteristics to the participants - to isolate the impact of the program from other external factors that may change attitudes or behaviors.
    • Non-experimental evaluations only survey program participants (a brief sketch contrasting the two designs follows this list).
    • If comparison groups are part of your evaluation design, use a 'do no harm' approach that makes the program available to those in the comparison group after the evaluation period has ended.
  • More information on Evaluation Design
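
To make the difference between the two designs concrete, the sketch below compares the change in a program group to the change in a comparison group using hypothetical pre/post survey scores. It is a minimal illustration using only the Python standard library, not a substitute for an appropriate statistical test; all scores and variable names are assumptions for the example.

```python
from statistics import mean

# Hypothetical pre/post survey scores (e.g., confidence ratings on a 1-5 scale).
# A real evaluation would load these from its data collection tool's export.
program_pre     = [2, 3, 2, 3, 2]   # participants, before the program
program_post    = [4, 4, 5, 4, 3]   # participants, after the program
comparison_pre  = [2, 3, 3, 2, 3]   # similar non-participants, same time points
comparison_post = [3, 3, 3, 2, 3]

# Change observed in each group
program_change = mean(program_post) - mean(program_pre)
comparison_change = mean(comparison_post) - mean(comparison_pre)

# The comparison group's change estimates what would have happened without
# the program; subtracting it helps isolate the program's contribution
# (the quasi-experimental logic).
estimated_effect = program_change - comparison_change

print(f"Program group change:     {program_change:+.2f}")
print(f"Comparison group change:  {comparison_change:+.2f}")
print(f"Estimated program effect: {estimated_effect:+.2f}")

# A non-experimental design would stop at program_change, since only
# program participants are surveyed and no comparison group exists.
```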

Ethical Considerations


Consider the Belmont Report principles to examine ethical considerations:

  • Respect for Persons: Protecting the autonomy of all people, treating them with courtesy and respect, and allowing for informed consent
  • Beneficence: The philosophy of "Do no harm" while maximizing benefits for the research project and minimizing risks to the research subjects
  • Justice: Ensuring that reasonable, non-exploitative, and well-considered procedures are administered fairly and equally, with a fair distribution of costs and benefits to potential research participants

Trauma-Informed Evaluation

  • Asking someone about trauma is asking that person to recall potentially difficult events from their past.
  • If it is absolutely necessary for the evaluation to ask questions about potentially traumatic events, incorporate a trauma-informed approach so that data are collected in a sensitive way.
  • Amherst H. Wilder Foundation Fact Sheet on Trauma-Informed Evaluation
  • More information on Trauma-Informed Evaluation

Ethical Review

  • The Institutional Review Board (IRB) is an administrative body established to protect the rights and welfare of human research subjects recruited to participate in research activities.
    • The IRB is responsible for reviewing all research involving human participants, whether funded or not, prior to its initiation.
    • The IRB is concerned with protecting the welfare, rights, and privacy of human subjects.
    • The IRB has authority to approve, disapprove, monitor, and require modifications in all research activities that fall within its jurisdiction as specified by both the federal regulations and institutional policy.
  • More information on the IRB, including contact information for your local IRB
  • Depending on the nature of the evaluation, the IRB may exempt a program from approval, but an initial review by the Board is recommended for all programs working with minors.

Jump to Step 4 of a Pathway