
Planning an Evaluation
Evaluation should be integral to all aspects of program planning and implementation. It is best to design your program with evaluation in mind, collect data on an ongoing basis and then use these data to improve your program.
Planning for and incorporating evaluation into your ongoing program operations will help you to:
- Better understand your target audiences' needs and how to meet these needs.
- Design objectives that are more achievable and measurable.
- Monitor progress toward objectives more effectively and efficiently.
- Learn more from evaluation.
- Increase your program's productivity and effectiveness.
To create and sustain support for program evaluation at your institution:
Integrate evaluation into your strategic planning. As you set goals for your program, identify ways to measure these goals and how you might collect data, analyze it and use the findings. This process will help ensure that your objectives are measurable and that you are collecting information that you will actually use.
Build an evaluation culture by rewarding participation in evaluation, offering evaluation capacity-building opportunities, providing funding for evaluation, communicating a convincing purpose for evaluation and focusing on successful evaluations. Reporting on the impact of program changes made as a result of evaluation findings also demonstrates the importance of evaluation.
Conducting an Evaluation
You should consider the following issues as you plan your evaluation:
1. Program description
A good evaluation is tailored to the specific goals and objectives of a program. Be sure that you understand thoroughly how your program works and that there is agreement about its purpose and goals.
Questions you should be able to answer include:
- What are the goals and purpose of the program?
- What problem or need is it intended to address?
- How is the program designed to work?
- What are the measurable objectives? What are the strategies to achieve the objectives?
- What are expected effects?
- What are the resources and activities associated with the program?
2. Focus and rationale for evaluation
Remember, an evaluation cannot answer all questions for all stakeholders. Focus your program’s evaluation on the most important questions you need answered, whether for your own purposes or as required by NIGMS, NIH or other agencies. Keep in mind that the answers to these questions should be useful for making programmatic decisions. Consider:
- What do you want to know? (often called the "key questions")
- Why is this the right time to conduct an evaluation?
- Who will be involved in or affected by the evaluation or use the findings? (who are the "stakeholders"?)
- What are the evaluation’s purpose, uses, questions, methods, roles, budgets, deliverables and stakeholders?
3. Evaluation design
An evaluation design should include the key questions to be addressed, target populations, key variables and a conceptual framework.
Key questions
A program evaluation’s key questions link directly to the stated purpose of the evaluation and program activities. Defining these questions is crucial: what you want to answer determines the rest of your study design, including what data you will need, as well as the methods for collecting and presenting that data. Study questions should be clear, specific, measurable and answerable. Sample key questions include:
- How is my training program being implemented?
- What factors have limited the ability of my program to achieve its goals?
- What has been the impact of my training program on its participants?
- What is the quality and character of mentorship provided in my training program?
- How and to what extent does my program increase student skills and knowledge about laboratory research?
Target population
Identify which group or groups you need information about in order to answer your key questions. Examples include:
- Trainees and students.
- Project managers.
- Academic coordinators.
- Faculty.
- High-ranking institutional administrators.
Key variables
What specific information is needed to answer your key questions? Determine the most important variables for which you will collect data. Examples include:
- Program resources.
- Population characteristics.
- Program activities.
- Program goals, performance measures, comparison measures.
- External or environmental factors.
Conceptual framework
Consider developing a conceptual framework (also known as a logic model) to illustrate how your program is supposed to achieve its goals. A model is a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, the activities you plan and the changes or results you hope to achieve.
There are several good reasons to use a conceptual framework:
- To increase understanding of a program.
- To link activities to results.
- To help identify variables to measure.
- To reflect group process and shared understanding.
- To strengthen the case for investing in a program.
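As an illustration only, the sketch below captures a logic model for a hypothetical summer research training program as plain data. The program, its activities and its outcomes are invented for the example; the stage names follow the conventional logic model columns (inputs, activities, outputs, outcomes).

```python
# A minimal logic model sketch for a hypothetical training program.
# Every entry is an illustrative assumption, not NIGMS guidance.
logic_model = {
    "inputs": ["program staff", "faculty mentors", "lab space", "grant funding"],
    "activities": ["10-week mentored research projects", "weekly skills workshops",
                   "end-of-summer research symposium"],
    "outputs": ["number of trainees completing projects", "posters presented",
                "workshops delivered"],
    "short_term_outcomes": ["increased lab skills and research knowledge"],
    "long_term_outcomes": ["more trainees entering biomedical graduate programs"],
}

# Reading the model left to right shows the assumed chain from resources to
# results and suggests which variables to measure at each stage.
for stage, items in logic_model.items():
    print(f"{stage}: {', '.join(items)}")
```

Even in this simple form, the model makes it easier to spot which stages currently have no planned measure.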
4. Data collection and analyses
Where are you going to get the information you need for your evaluation, and how are you going to collect it?
- Will you use new data or secondary data?
- Will it be quantitative, qualitative or mixed?
- Are there appropriate comparison groups?
- How will you collect the data?
- Are there ethical or institutional review board (IRB) considerations?
- What are the limitations of the data?
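As a hedged illustration of how these questions can be answered before data collection begins, the sketch below records a simple data collection plan that links key questions to variables, sources and methods. Every question, variable, source and method shown is hypothetical.

```python
# A sketch of a data collection plan for a hypothetical training program
# evaluation; the rows are illustrative assumptions, not a required design.
data_collection_plan = [
    {
        "key_question": "What has been the impact of the program on participants?",
        "variables": ["degree completion", "publications", "career placement"],
        "source": "institutional records and alumni survey",  # secondary + new data
        "method": "database extraction and survey",
        "data_type": "quantitative",
    },
    {
        "key_question": "What is the quality of mentorship provided?",
        "variables": ["trainee-reported mentorship experiences"],
        "source": "current trainees",
        "method": "focus groups and interviews",
        "data_type": "qualitative",
    },
]

# A quick readout helps confirm that every key question has at least one
# planned data source and method before fieldwork starts.
for row in data_collection_plan:
    print(f"{row['key_question']} -> {row['method']} ({row['data_type']})")
```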
Typical data collection strategies
| Method | Pro | Con |
| --- | --- | --- |
| Bibliometric analyses | Quantitative; useful in aggregate as a tool to assess productivity and impact of biomedical research | Measure only quantity; can be artificially influenced |
| Case studies | Provide understanding of the interaction of various influences on the research process | Not necessarily representative within or across programs |
| Database extractions, document reviews | Useful for analyzing archival data: databases, program records, literature review, etc. | Records may be incomplete |
| Expert panels | Useful in research fields, especially when few quantifiable indicators exist | Difficult to obtain a systematic, objective assessment |
| Focus groups | Provide understanding of attitudes and thoughts on a subject; group dynamic can help elicit honest responses | Results cannot be statistically generalized to larger populations; not quantifiable |
| Interviews | Offer insight from the perspective of specific program roles and expertise | Limited perspective; time-intensive |
| Surveys | Generate statistically reliable data via ratings of services, behavior, demographics, etc. | Require a statistically representative sample and adequate response rate |
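The survey row above notes the need for a statistically representative sample and an adequate response rate. As a rough planning aid only, the sketch below applies the standard sample size formula for estimating a proportion, with a finite population correction, and then inflates the result for an assumed response rate. The population size, margin of error, confidence level and response rate are illustrative assumptions to replace with your own.

```python
import math

def required_sample_size(population, margin_of_error=0.05, confidence_z=1.96,
                         proportion=0.5, expected_response_rate=0.7):
    """Estimate how many survey invitations to send for a representative sample.

    Uses the standard formula for estimating a proportion, applies a finite
    population correction, then inflates for the expected response rate.
    All default values are illustrative assumptions.
    """
    # Base sample size for an infinite population (worst case p = 0.5).
    n0 = (confidence_z ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
    # Finite population correction for a known trainee population.
    n = n0 / (1 + (n0 - 1) / population)
    # Inflate to account for non-response so the completed sample is large enough.
    invitations = math.ceil(n / expected_response_rate)
    return math.ceil(n), invitations

# Example: a program with 400 current and former trainees.
completed, invited = required_sample_size(population=400)
print(f"Aim for about {completed} completed surveys; invite about {invited} trainees.")
```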
5. Products of evaluation and their use
Consider what reports and products you need from an evaluation. How are you going to share the findings? How will the findings be conveyed to your various stakeholders? How will the results be used? Do you need to create a follow-up plan to act on the findings? How might your program be affected by the evaluation?
6. Project management—who participates?
| Role | Contributions | Challenges |
| --- | --- | --- |
| Program manager and staff | Program knowledge | Vested interest |
| Evaluator | Evaluation expertise, independence | Limited program knowledge |
| Evaluation advisory committee | Program familiarity, evaluation expertise, organizational context | Limited program knowledge |
| Senior leader/decision maker | Organizational context, resources | Vested interest |
7. Budget estimate
When planning an evaluation, it's important to consider your budget needs up front. Because evaluations are tailored to fit individual programs, their costs vary considerably. One common error is to be unrealistic about the resource and time needs for each step. In general, the effort you put into an evaluation should be commensurate with the effort you put into the program you’re evaluating. Also, you'll need to consider whether your institution has the capacity to conduct an evaluation or whether you need to hire external expertise. Consider how much time and money you're able to dedicate to the evaluation and what kind of evaluator you're likely to need.