
Evaluation 2006 (AEA Annual Meeting) Tidbits from Susan B.

The American Evaluation Association annual meeting was held in Portland, OR, last week, and I attended two workshops and several sessions of interest. Here is a selection of tidbits that I picked up:

  • Integrating Program Planning and Evaluation Using Logic Models–this two-day workshop was taught by Thomas Chapel from the CDC, a very lively and entertaining instructor. He spoke of his teaching strategy of presenting evaluation to people as “systematic thinking that they already do.” The three main concepts that he asked us to take away were:
    1. Evaluation seeks to improve, not necessarily to prove
    2. Evaluation requires program description
    3. Logic models make a program theory clear but not necessarily true

    The instructor used a teaching technique that I thought worked well: he had our working groups (we were seated at round tables, of course) begin to construct a logic model by listing outcomes and activities on post-its, one idea per post-it. Then we arranged the post-its on big sheets of paper on the wall. It was a good way to move things around and add and subtract ideas–and then the group could stand back, discuss the arrangement, and think about the logical chain of events that is envisioned to lead from activities to outcomes. He also introduced the new (to me) idea of establishing a program’s “critical path”–the activities to focus on if conditions prevent accomplishment of the entire program. (A small sketch of a logic model as a data structure appears after this list.)

  • Needs Assessment: A Professional Development Workshop–a one-day workshop taught by James Altschuld from Ohio State. Prof. Altschuld, another lively and entertaining instructor, literally “wrote the book” on needs assessment (Witkin and Altschuld. Planning and Conducting Needs Assessments: A Practical Guide. Sage, 1995). He pointed out that, since needs are so complex and value-laden, a needs assessment must mix qualitative and quantitative methods. The workshop focused on the importance of a needs assessment committee comprising “power-brokers” (people who have influence with the constituencies that they represent) rather than administrators during the initial phases of planning and designing a needs assessment.
  • You Can’t Push a String: Building Skills, Capacity and Enthusiasm for Evaluation at the Frontlines–a panel presentation with three speakers from the CDC (Yamir Salabarria-Pena from the Division of STD Prevention; Maureen Wilce and Kai Young from the Division of TB Elimination) who spoke about the toolkits that each division has created for its partners and grantees (Program Operations Guidelines for STD Prevention: Program Evaluation and A Guide to Developing a TB Program Evaluation Plan), both based on the CDC Framework for Program Evaluation.
  • Culturally-Based Traditions as Best Practices: Working Effectively with Tribal Programs–The Oregon Legislature has directed the state’s Department of Human Services and four other state agencies to spend increasing shares of public dollars on evidence-based services, culminating in 75 percent by the 2009-11 budget period. This session focused on work to recognize that “evidence-based practice” can be seen as a threat by tribal populations for whom there is no written evidence for centuries of successful practices. The three speakers (Caroline Cruz, OR DHS Mental Health and Addictions Division; John Spence, Northwest Indian Training Associates; and Juliette Mackin, NPC Research) have been working to show that tribes are in compliance with the legislature’s mandate, since their practices are based on evidence (albeit oral and nonlinear) and, in fact, tribes have been doing logic models “forever” through stories, analogies, and rituals. They have produced two publications that discuss these concepts: Oregon Tribal Evidence Based and Cultural Best Practices and Tribal Practices Based on Evidence (a training manual first printed by Oregon’s Addictions and Mental Health Division in November 2006 and not yet available on the web).
  • Building Evaluation Capacity in Indian Country–this panel presentation featured Joan LaFrance (who has done valuable work for various NN/LM Tribal Connections projects, most recently with the Conference on Native American Health Information Services in the United States) and two of her colleagues from Mekinak Consulting (Frieda Kirk and Richard Nichols), who spoke of their work with program evaluation in Indian Country and an ongoing NSF-funded collaboration with the American Indian Higher Education Consortium of tribal colleges. This project involves developing and presenting a curriculum, online resources, and an online graduate course aimed at building capacity for indigenous evaluation, featuring strategies relevant to tribes and built on indigenous values, ways of knowing, and ancient ways. Principles that guide their work include the recognition that Indians have always had ways of assessing merit or worth based on traditional values and cultural expressions; that evaluation should serve tribal goals for self-determination and sovereignty; and that tribes can have ownership in evaluation by defining it in their own terms and not just as a response to outsiders’ requirements. The training curriculum, to be offered in Albuquerque for the first time in February, again in Rapid City in March, and at one more location later, connects Western evaluation practice to tribal core values and includes six modules:
    1. Indigenous Grounding
    2. Creating Our Story
    3. Structuring the Evaluation that is Relevant to Our Community and Place
    4. Responsive Data Collection and Measurement
    5. Telling Our Story
    6. Creating a Network for Indigenous Evaluation
  • Make Your Data Come Alive: Engaging Stakeholders in Interpreting Their Data–a one-hour “skill-building workshop” presented by members of Formative Evaluation Research Associates (FERA). FERA includes a “Data Interpretation Workshop” as part of the evaluation support that it provides to its clients. This is a way to work with stakeholders to generate findings and make meaning from them, which can be more effective in some contexts than simply presenting findings. In this workshop, FERA presents project data organized by the evaluation questions that were identified during the planning phase. Each workshop participant–the participants are project stakeholders and managers–fills out a form for the data from each evaluation question. The form asks participants to state one or two key findings, the implications of those findings, and recommendations based on those implications. The results are presented in a “round robin” format, and then FERA gathers them and uses them as the basis for its final report. (A sketch of this form as a simple data structure appears below.)
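
For readers who like to see structures in code, here is a minimal sketch (in Python) of the activities-to-outcomes chain from the logic model workshop, with a flag for Chapel’s “critical path” idea. All class, field, and example names here are my own invention for illustration; this is not a tool from the workshop.

    from dataclasses import dataclass, field

    @dataclass
    class Element:
        # One post-it: an activity or an outcome in the logic model.
        name: str
        kind: str                                     # "activity" or "outcome"
        leads_to: list = field(default_factory=list)  # downstream elements
        critical: bool = False                        # on the program's critical path?

    def critical_path(elements):
        # Return only the elements flagged as critical: the ones to focus on
        # if conditions prevent accomplishment of the entire program.
        return [e for e in elements if e.critical]

    # A toy outreach-program example
    train = Element("Train library staff", "activity", critical=True)
    promote = Element("Promote health information resources", "activity")
    skills = Element("Staff answer consumer health questions", "outcome", critical=True)
    train.leads_to.append(skills)

    for e in critical_path([train, promote, skills]):
        print(f"{e.kind}: {e.name}")

Moving post-its on a wall and moving Element objects in code serve the same purpose: the chain from activities to outcomes stays explicit and easy to rearrange.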
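
Similarly, here is a minimal Python sketch of the FERA interpretation form described in the last item, assuming field names inferred from the session description; this is not FERA’s actual instrument.

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class FormResponse:
        # One participant's completed form for one evaluation question.
        question: str          # the evaluation question the data address
        findings: list         # one or two key findings seen in the data
        implications: list     # what those findings imply
        recommendations: list  # actions suggested by the implications

    def gather(responses):
        # Group completed forms by evaluation question, as the evaluator
        # might when assembling the final report after the round robin.
        by_question = defaultdict(list)
        for r in responses:
            by_question[r.question].append(r)
        return dict(by_question)

    responses = [
        FormResponse(
            question="Did training increase use of the database?",
            findings=["Searches rose sharply after the training sessions"],
            implications=["Training appears to drive usage"],
            recommendations=["Offer the training at the remaining sites"],
        ),
    ]
    for question, forms in gather(responses).items():
        print(question, "->", len(forms), "form(s)")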
