SLIDE 1

Planning and Designing Useful Evaluations: A Lecture Based on "Handbook of Practical Program Evaluation"

This lecture will provide an overview of planning and designing useful evaluations based
on the insights from the book "Handbook of Practical Program Evaluation" by Joseph S.
Wholey, Harry P. Hatry, and Kathryn E. Newcomer.

SLIDE 2

The Importance of Planning and Design

Evaluation is a crucial component of any program or intervention. It allows us to assess the program's effectiveness, identify areas for improvement, and ensure that resources are being used efficiently. However, to yield valuable insights, an evaluation needs to be well-planned and designed.

A. Planning a Responsive and Useful Evaluation:

1. Evaluation as an Art: Despite the proliferation of evaluation methodologies, evaluation remains more of an art than a science. It involves difficult trade-offs balancing feasibility, cost, and benefits.

Example: Imagine you're an artist trying to create a painting. You have many colors and
brushes to choose from, but deciding how to use them to create something beautiful
requires skill and intuition. Similarly, evaluation involves choosing from various methods
and techniques to assess a program. It's not just about following a set formula; it
requires creativity and judgment to balance different factors like cost, feasibility, and
benefits, much like an artist balances colors and brushstrokes to create a masterpiece.

2. Balancing Rigor and Resources: Methodological rigor needs to be balanced with available resources. Professional judgment plays a crucial role in arbitrating these trade-offs.

Example: Let's say you're planning a party with a limited budget. You want it to be memorable
and enjoyable for everyone, but you also have to be mindful of how much money you can
spend. Similarly, in evaluation, you want to ensure that your methods are rigorous and accurate
in assessing the program's effectiveness. However, you also have to consider the resources
available, such as time, money, and manpower. Balancing these factors requires careful
consideration and professional judgment.

3. Early Planning: Evaluation planning ideally begins before the program starts,
preferably during the program design phase. Gathering data from the program's
inception allows for more comprehensive evaluation, including pre-program attitudes
and experiences of clients, which might be difficult to obtain later.

Example: Suppose you're building a house. It's essential to plan the layout and gather all the
necessary materials before starting construction. Similarly, in evaluation, it's beneficial to start
planning before the program begins. Gathering data from the program's inception allows
evaluators to capture valuable information, such as people's attitudes and experiences before
they are influenced by the program. This early planning ensures a more comprehensive
evaluation and prevents missing out on crucial data that may be difficult to obtain later.

4. Organic Nature of Evaluation Plans: Evaluation plans are not static but organic, and they are likely to evolve over time. They should remain responsive to changes in context, data availability, and the evaluator's evolving understanding of the situation.

Example: Think of a plant growing in a garden. It starts as a seed and gradually grows,
responding to changes in its environment like sunlight and water availability. Similarly,
evaluation plans are not fixed but evolve over time. As evaluators gather more data and
gain a deeper understanding of the program, they may need to adjust their plans
accordingly. This responsiveness allows evaluation to adapt to changing circumstances
and ensures that the findings remain relevant and useful.

SLIDE 3
FIGURE 1.3. REVISE QUESTIONS AND APPROACHES AS YOU LEARN MORE
DURING THE EVALUATION PROCESS:

Figure 1.3 displays the key steps in planning and conducting an evaluation. It highlights the many feedback loops in order to stress how important it is for evaluators to be responsive to changes in context, data availability, and their own evolving understanding of the situation.

Let's break down the scheme in simple terms:

1. Scoping: This is where you start by outlining what you want to achieve with your
evaluation. You set your evaluation objectives and figure out what specific questions
you need to answer.

2. Formulate evaluation objectives: Decide on the main goals or purposes of your evaluation. What do you hope to achieve by evaluating?

3. Frame evaluation questions: Once you know your objectives, you need to come up
with specific questions that will help you gather the information you need to meet those
objectives.

4. Match methodology to questions: Choose the best methods (like surveys, interviews,
etc.) to get the answers to your evaluation questions.
5. Identify constraints on implementing methodology: Consider any limitations or
challenges you might face in carrying out your chosen methods, such as time, budget,
or access to certain data.

6. Identify means to ensure quality of work: Figure out how you'll make sure your
evaluation is done well and the results are reliable. This might involve things like
double-checking data or using standardized procedures.

7. Anticipate problems and develop contingency plans: Think ahead about what could
go wrong and come up with backup plans to deal with those issues if they arise.

8. Design: This is where you pull your planning decisions together into a concrete evaluation design that spells out how you will apply the methods you've chosen.

9. Feedback loops: Throughout the evaluation process, keep an eye on how things are
going and be ready to make adjustments if needed based on what you're learning.

10. Enhance validity and reliability: Continuously work to make sure your evaluation is
giving you accurate and trustworthy results.

11. Identify caveats: Be aware of any limitations or factors that might affect the
interpretation of your results.
12. Ensure findings will address information needs: Make sure that the conclusions you
draw from your evaluation will actually provide useful information to the people who
need it.

13. Ensure presentation addresses audience(s): When you share your findings, make
sure you're presenting them in a way that makes sense to the people who will be using
them.

14. Write report: Document your evaluation process and results in a clear and organized
report.

15. Data Collection and Analysis: Gather the data you need for your evaluation and then
analyze it to draw conclusions and make recommendations based on your findings.

SLIDE 4

Key Considerations in Planning and Design

The following are some key aspects to consider when planning and designing an
evaluation:

• Identifying Evaluation Questions: The first and most crucial step is to determine what
you want to learn from the evaluation. What are the program's objectives? What
information do stakeholders need to make informed decisions? Formulating clear and
concise evaluation questions will guide the entire evaluation process.
Example: Imagine you're planning a trip. Before you start packing or booking tickets, you need
to decide where you want to go and what you want to do there. Similarly, in evaluation, you
need to determine what you want to learn about the program. For instance, if you're evaluating
a tutoring program, you might ask questions like, "Does the program improve students'
academic performance?" or "Are students satisfied with the tutoring sessions?"

• Matching Methodology to Questions: Different evaluation questions necessitate different methodological approaches. Common designs include experimental designs, quasi-experimental designs, and non-experimental designs. The choice of design depends on the level of rigor needed and the resources available.
Example: Think of different cooking techniques for preparing a meal. If you're baking a cake,
you'll follow a specific recipe and method. If you're grilling burgers, you'll use a different
approach. Similarly, in evaluation, different questions require different methodologies. For
instance, if you want to measure the impact of a new teaching method, you might use an
experimental design with control and experimental groups to compare outcomes.
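
To make this concrete, here is a minimal sketch in Python of how an evaluator might compare outcomes between an experimental group and a control group. The test scores and variable names are invented for illustration (and the significance test uses SciPy); this is not an example from the handbook.

```python
# Illustrative sketch only: hypothetical test scores for a group taught with a
# new method (treatment) and a group taught as usual (control). The numbers
# and variable names are invented for this example.
from statistics import mean
from scipy import stats  # SciPy's two-sample t-test

treatment_scores = [78, 85, 82, 90, 74, 88, 81, 79, 86, 83]
control_scores = [72, 75, 80, 69, 77, 74, 71, 78, 73, 76]

# A simple estimate of the program effect: the difference in mean outcomes.
effect = mean(treatment_scores) - mean(control_scores)

# A two-sample t-test gives a rough sense of whether a difference this large
# could plausibly arise by chance alone.
result = stats.ttest_ind(treatment_scores, control_scores)

print(f"Estimated effect: {effect:.1f} points")
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```

In a real evaluation, the credibility of such a comparison would also depend on how participants were assigned to groups, the sample size, and whether other explanations for the difference can be ruled out.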

• Data Collection and Analysis: Selecting appropriate data collection methods is essential. This may involve surveys, interviews, focus groups, or analyzing existing data records. The data analysis plan should be determined beforehand to ensure you collect the necessary data to answer your evaluation questions.
Example: Imagine you're conducting a survey to gather information about people's favorite
foods. You could either ask them directly or look at their past food orders. Similarly, in
evaluation, you need to select appropriate methods for collecting data, such as surveys,
interviews, or analyzing existing records. Once you collect the data, you analyze it to find
patterns or trends that answer your evaluation questions.
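
As a small illustration, the sketch below (Python, with made-up survey answers) shows the kind of simple tabulation an evaluator might run to find patterns in collected data; the food categories are hypothetical.

```python
# Illustrative sketch only: tabulating made-up survey responses about
# favorite foods. The categories and counts are hypothetical.
from collections import Counter

survey_responses = [
    "pizza", "salad", "pizza", "sushi", "pizza",
    "salad", "burgers", "sushi", "pizza", "salad",
]

# Count each answer and report the share of respondents choosing it.
counts = Counter(survey_responses)
total = len(survey_responses)

for food, n in counts.most_common():
    print(f"{food}: {n} responses ({n / total:.0%})")
```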

• Ensuring Credibility and Relevance: The evaluation's findings should be credible and
trustworthy. This requires employing sound methodological practices and ensuring data
quality. Additionally, the evaluation should be relevant to the stakeholders' information
needs and address their decision-making requirements.
Example: Think of a trustworthy news source that provides accurate and reliable information.
Similarly, in evaluation, it's essential to ensure that the findings are credible and trustworthy.
This involves using sound methodological practices and maintaining data quality. Additionally,
the evaluation should address stakeholders' information needs and help them make informed
decisions about the program's future.

• Planning for Use: Consider how the evaluation results will be used to improve the
program or inform future program development. Disseminating the findings effectively to
relevant audiences is crucial for maximizing the evaluation's impact.
Example: Consider a recipe book with clear instructions on how to prepare different dishes. The
book not only provides recipes but also guides you on how to use the recipes effectively.
Similarly, in evaluation, it's essential to plan how the findings will be used to improve the
program. This may involve disseminating the findings to relevant stakeholders and providing
recommendations for program enhancement based on the evaluation results.

SLIDE 5
Let's delve deeper into the main ideas of planning and design, as well as the GAO's Evaluation Design Process:

B. Planning and Design:

This section emphasizes the importance of meticulous planning and design in the evaluation process. Here's an elaboration based on the handbook:

- Identification of Key Evaluation Questions: The text highlights that identifying key
evaluation questions is the initial and often challenging task during the design phase. It's
crucial to determine what specific questions need to be answered to evaluate a program
effectively. This involves understanding the objectives of the evaluation and anticipating
what information clients need. For instance, in the example of the GAO, they need to
understand the nature of the research questions from congressional requests and how
the information will be used. This ensures that the evaluation is focused and aligned
with the needs of the stakeholders.
- Balancing Resources and Client Needs: The text also discusses the importance of
balancing resources with clients' information needs. Evaluators must consider factors
like budget, time constraints, and available expertise while designing the evaluation.
This is essential for selecting an appropriate evaluation design and strategies for data
collection and analysis. For example, the GAO systematically selects the most
appropriate approach for each study based on a risk-based process, ensuring timely
and quality information delivery at a reasonable cost.

SLIDE 6
Box 1.5. GAO's Evaluation Design Process:

This box provides insights into the structured process followed by the U.S.
Government Accountability Office (GAO) for designing evaluations. Here's a breakdown
of the key points:

- Clarify the Study Objectives: The GAO starts by meeting with congressional
requester staff to understand their information needs and research questions. This step
ensures that the evaluation objectives are well-defined and aligned with congressional
requirements. By clarifying the study objectives, the GAO can determine the scope of
the evaluation and establish expectations regarding the type of information needed and
its intended use.

- Develop and Test the Proposed Approach: The GAO carefully considers different
approaches, data sources, and methodologies to address the research questions
effectively within resource and time constraints. They review existing literature, consult
with experts, and test proposed data collection approaches to ensure reliability and
validity. The development and testing phase allows the GAO to refine the evaluation approach and outline detailed plans for data collection, analysis, and reporting.

Overall, both sections emphasize the importance of systematic planning, clear objective setting, and rigorous methodological considerations in the evaluation process, ensuring that evaluations are well-designed, focused, and capable of providing valuable insights to stakeholders.

Let's break down both Figure 1.4 and Table 1.1:

Figure 1.4: Sample Design Matrix

- Issue/Problem Statement: This is where you define the problem or issue that the
evaluation aims to address. It's important to provide context and identify who will
potentially use the information gathered from the evaluation.

- Researchable Question(s): Here, you specify the questions that the evaluation seeks
to answer. These questions should be clear and directly related to the issue or problem
statement.

- Criteria and Information Required and Source(s): This section outlines the criteria and
information needed to answer each research question and where this information will be
sourced from.

- Scope and Methodology Including Data Reliability Limitations: It describes the scope
of the evaluation and the methodologies that will be employed to answer the research
questions. It also identifies any limitations in data reliability that may affect the analysis.

- What This Analysis Will Likely Allow GAO to Say: This section specifies what
conclusions or insights the analysis is expected to provide based on the research
questions and available data.
In simple terms, the design matrix helps organize the planning of an evaluation by
breaking down the problem, the questions to be answered, the information needed, how
it will be obtained, the methods used, and what can be concluded from the analysis.
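
For illustration only, here is one way a single row of such a design matrix might be represented as structured data in Python. The tutoring-program content below is a hypothetical example, not the handbook's own figure.

```python
# Illustrative sketch only: one row of a design matrix represented as a Python
# dictionary. The tutoring-program content below is a hypothetical example,
# not the handbook's own figure.
design_matrix_row = {
    "issue_problem_statement": (
        "District leaders need evidence on whether the after-school tutoring "
        "program is improving student achievement."
    ),
    "researchable_question": (
        "Does participation in tutoring raise students' math test scores?"
    ),
    "criteria_information_and_sources": [
        "Standardized math scores (district assessment records)",
        "Tutoring attendance logs (program database)",
    ],
    "scope_and_methodology_incl_data_limitations": (
        "Compare participants with similar non-participants over one school "
        "year; attendance records may be incomplete at some sites."
    ),
    "what_the_analysis_will_likely_allow_us_to_say": (
        "Whether participants' score gains differ from those of similar "
        "non-participants, with caveats about data quality."
    ),
}

for field, content in design_matrix_row.items():
    print(f"{field}: {content}")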

Table 1.1: Matching Evaluation Designs to Questions

This table matches different types of evaluation questions with suitable evaluation
designs. It provides illustrative questions along with the corresponding evaluation
designs and the chapters in the text where more information about these designs can
be found.

For example:
- Describe program activities, ongoing program performance (outputs or outcomes), and
stakeholder agreement on program objectives: This question would likely require
methods like performance measurement, evaluability assessment, exploratory
evaluation, or case studies, which are covered in Chapters 4, 5, and 8.

- Probe program implementation and analyze program targeting: This type of question
could be addressed through case studies, implementation evaluation, or exploratory
evaluation, discussed in Chapters 4, 8, 10, and 14.

- Evaluate program impact (results) and identify side effects of program: To answer this
question, you might employ experimental and quasi-experimental designs, case studies,
or cost-effectiveness analyses, discussed in Chapters 6, 7, 8, 10, and 12.

Overall, Table 1.1 helps evaluators match the types of questions they have with
appropriate evaluation designs, ensuring that the chosen methods are suitable for
addressing the specific objectives of the evaluation.
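
The logic of Table 1.1 can also be sketched as a simple lookup from question type to candidate designs. The Python snippet below uses only the three example rows summarized above; the wording is simplified and is not a reproduction of the table.

```python
# Illustrative sketch only: the idea behind Table 1.1 expressed as a lookup
# from question type to candidate evaluation designs, using the three example
# rows summarized above (simplified, not a reproduction of the table).
question_to_designs = {
    "describe activities, performance, and stakeholder agreement": [
        "Performance measurement", "Evaluability assessment",
        "Exploratory evaluation", "Case studies",
    ],
    "probe implementation and analyze targeting": [
        "Case studies", "Implementation evaluation", "Exploratory evaluation",
    ],
    "evaluate impact and identify side effects": [
        "Experimental designs", "Quasi-experimental designs",
        "Case studies", "Cost-effectiveness analysis",
    ],
}

def suggest_designs(question_type: str) -> list[str]:
    """Return candidate designs for a question type, or a reminder to reframe."""
    return question_to_designs.get(
        question_type, ["No match: revisit how the question is framed"]
    )

print(suggest_designs("evaluate impact and identify side effects"))
```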

SLIDE 7
C. Data Collection:

In the context provided, data collection is described as a process that involves potential
political and bureaucratic challenges. It includes identifying barriers that may arise due
to political reasons and practical constraints such as compatibility issues between
different computer systems. Extensive planning is necessary for field data collection to
ensure that evaluators gather relevant data efficiently. Chapters Eleven through
Eighteen of the text offer detailed guidance on selecting and implementing various data
collection strategies.

Example: Imagine you're organizing a scavenger hunt in your neighborhood. Before you send
participants off to search for items, you need to anticipate potential challenges they might face,
like encountering closed-off areas or finding items in hard-to-reach places. Similarly, in
evaluation, data collection involves anticipating and addressing challenges, such as political
resistance or technical issues like incompatible computer systems. Extensive planning ensures
that data can be collected efficiently, much like planning routes and checkpoints for a scavenger
hunt. And just as a guidebook's chapters each cover a different topic, the handbook's later chapters offer detailed strategies for overcoming specific data collection challenges.

SLIDE 8
D. Data Analysis:

Data analysis is portrayed as a critical aspect that influences the entire evaluation
process, including data collection. It requires evaluators to clearly define how each
piece of data will be used. Collecting excessive data is cautioned against, as it can lead
to unnecessary expenses. Developing a detailed data analysis plan as part of the
evaluation design is recommended to help evaluators determine which data elements
are essential, thereby avoiding the collection of unnecessary information.
Additionally, an analysis plan serves to structure the layout of the evaluation report by
identifying the graphs and tables through which the findings will be presented.
Anticipating how the findings might be utilized prompts evaluators to carefully consider
presentations that address the original evaluation questions logically and clearly.

Moreover, the text emphasizes that the effectiveness of evaluation results relies not
only on drafting attractive reports but also on understanding the bureaucratic and
political contexts of the program. Evaluators must craft their findings and
recommendations in a manner that facilitates their use within these contexts, ultimately
aiming to improve program performance.
Example: Think of sorting through a pile of puzzle pieces to complete a jigsaw puzzle. You
wouldn't collect more pieces than you need, or it would make the task overwhelming and costly.
Similarly, in evaluation, data analysis requires careful consideration of how each piece of data
contributes to answering evaluation questions. Collecting excessive data can be like collecting
unnecessary puzzle pieces, leading to unnecessary expenses and complexity. Instead,
developing a detailed data analysis plan, akin to planning how to assemble the puzzle, helps
evaluators focus on essential data elements and avoid unnecessary collection, making the
analysis process more manageable and cost-effective.
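
As a rough sketch of what a data analysis plan might look like in practice, the Python snippet below ties each planned data element to the evaluation question it serves and to the exhibit where it will appear; the element names and questions are hypothetical.

```python
# Illustrative sketch only: a minimal data analysis plan that ties each data
# element to the evaluation question it serves and to the planned exhibit in
# the report. The element names and questions are hypothetical.
analysis_plan = [
    {"data_element": "pre/post test scores",
     "answers_question": "Did participants' performance improve?",
     "planned_output": "Table 1: mean scores before vs. after"},
    {"data_element": "participant satisfaction ratings",
     "answers_question": "Were participants satisfied with the services?",
     "planned_output": "Figure 2: distribution of satisfaction ratings"},
    {"data_element": "staff biographies",
     "answers_question": None,  # no evaluation question uses this element
     "planned_output": None},
]

# Flag data elements that feed no evaluation question -- candidates to drop
# before collection begins, avoiding unnecessary cost.
for item in analysis_plan:
    if item["answers_question"] is None:
        print(f"Consider dropping: {item['data_element']} (no question uses it)")
    else:
        print(f"Keep: {item['data_element']} -> {item['planned_output']}")
```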

SLIDE 9
A. Getting Evaluation Information Used:

The text emphasizes the importance of making sure that evaluation efforts lead to
positive change and contribute to achieving important policy and program goals. The
goal for most evaluators is program improvement, and they must produce convincing
evidence to support recommendations for change. Understanding how program
managers and stakeholders view evaluation is crucial for producing useful information.

Box 1.6. Anticipate These Challenges to the Use of Evaluation and Performance Data:
This box lists typical challenges that may hinder the use of evaluation and performance
data in public and nonprofit organizations. Some of these challenges include:

1. Lack of visible appreciation and support for evaluation among leaders.
2. Unrealistic expectations of what can be measured and proven.
3. A compliance mentality among staff regarding data collection and reporting, coupled with disinterest in using data.
4. Resistance to adding the burden of data collection to staff workloads.
5. Lack of positive incentives for learning about and using evaluation and data.
6. Lack of compelling examples of how evaluation findings or data have led to significant improvements in programs.
7. Poor presentation of evaluation findings.

These challenges highlight the importance of addressing organizational attitudes and perceptions toward evaluation and performance data to foster their effective use.

Box 1.7. Tips on Getting Evaluation Findings and Data Used:

This box provides practical tips for increasing the likelihood that evaluation findings and
data will be used. Some of the key tips include:

1. Understanding and appreciating the audience for presenting evaluation findings.
2. Addressing the most relevant questions for the audience's information needs.
3. Envisioning the final evaluation product early in the design phase.
4. Designing sampling procedures carefully to ensure findings can be generalized.
5. Ensuring measurement validity and addressing alternative explanations for program
outcomes.
6. Conveying the competence of evaluators and the methodology employed to enhance
the credibility of findings.
7. Tailoring reports to address the communication preferences of different target
audiences.
8. Providing an executive summary and a report written clearly and without jargon.
9. Developing strong working relationships with program staff and stakeholders from the
beginning.
Example:

Imagine you're a chef preparing a delicious meal for your guests. You want to ensure
that they not only enjoy the food but also appreciate the effort and thoughtfulness
behind each dish. Similarly, when presenting evaluation findings and data, you want to
make sure that your audience – whether it's your diners or stakeholders in a program –
not only understands but also values the information you're providing.

Here's how you can make your evaluation findings more palatable and enticing:

1. **Know Your Audience**: Just like you would tailor your menu to suit the tastes of
your guests, understand who will be receiving your evaluation findings and what they're
interested in. For instance, if your audience consists of health-conscious individuals,
focus on highlighting the nutritional aspects of your dishes or program outcomes.

2. **Answer Their Burning Questions**: Serve up answers to the most pressing questions your audience may have. If your guests are curious about the ingredients or
cooking techniques used, ensure your evaluation addresses the key aspects they're
eager to know about.

3. **Visualize Success Early On**: Picture the final presentation of your evaluation from
the beginning, much like visualizing how your meal will be plated before you even start
cooking. This helps ensure a cohesive and appealing end result.
4. **Sample with Precision**: Just as you carefully select the best ingredients for your
dish, design your sampling procedures thoughtfully to ensure your evaluation findings
can be generalized and representative of the whole program.

5. **Ensure Quality Ingredients**: Validate your measurements and consider all possible
explanations for your program outcomes, similar to ensuring the quality and authenticity
of your ingredients to create a top-notch dish.

6. **Showcase Your Expertise**: Highlight your skills and the methodology you've
employed to gather and analyze data, just as a chef would proudly display their culinary
expertise to reassure diners of the quality of their meal.

7. **Serve Communication with a Twist**: Tailor your reports to suit the communication
preferences of different audiences, just as you would customize your dishes to cater to
different dietary preferences.

8. **Present with Clarity**: Provide a clear executive summary and report that's free of
jargon, much like serving a beautifully presented and easily digestible dish that's
enjoyed by all.

9. **Build Strong Relationships**: Develop strong relationships with program staff and
stakeholders right from the start, akin to forging a bond with your suppliers and kitchen
team to ensure smooth coordination and collaboration throughout the meal preparation
process.

By following these tips, you can ensure that your evaluation findings are not only well-
received but also effectively utilized to make informed decisions and drive positive
change in your program or organization, much like how a well-executed meal leaves
diners satisfied and eager for more.
These tips emphasize the importance of thoughtful planning, clear communication, and
building trust with stakeholders to facilitate the use of evaluation findings and data in
decision-making and program improvement processes.

SLIDE 10
Challenges in Planning and Design

Planning and designing evaluations can be challenging. Here are some common
obstacles to consider:

• Balancing Rigor and Resources: Striking a balance between methodological rigor and
resource constraints is essential. Evaluators may need to adapt their designs to fit
budgetary limitations or time restrictions.
• Stakeholder Needs and Expectations: Managing the expectations of various
stakeholders (program staff, funders, beneficiaries) can be difficult. It's important to
ensure that the evaluation addresses their information needs while also maintaining
methodological soundness.
• Data Availability and Quality: Existing data may not always be readily available or of
sufficient quality to answer the evaluation questions. Evaluators may need to collect
new data, which can be time-consuming and resource-intensive.

Strategies for Effective Planning and Design

Here are some strategies to enhance the planning and design of your evaluation:

• Involve Stakeholders Early: Involving stakeholders throughout the planning process can help ensure that the evaluation is relevant and addresses their needs.
• Pilot Test the Design: Conducting a pilot test of the data collection instruments can
help identify any issues and refine the evaluation design before full-scale
implementation.
• Clearly Communicate Findings: Present the evaluation results in a clear, concise, and
easy-to-understand manner. Tailor the report to the audience's level of understanding
and avoid technical jargon.
SLIDE 11
Conclusion

By carefully planning and designing your evaluation, you can ensure that it yields valuable and actionable insights. Considering the factors discussed above and employing effective strategies will help you design an evaluation that contributes to program improvement and informs future program development.

Additional Resources

The "Handbook of Practical Program Evaluation" by Wholey, Hatry, and Newcomer


provides a comprehensive overview of program evaluation concepts, methods, and best
practices. It can be a valuable resource for anyone involved in planning, designing, or
conducting evaluations.

In conclusion, the lecture based on the "Handbook of Practical Program Evaluation" underscores the critical importance of planning and designing evaluations that are not
only comprehensive but also practical and useful. By following a systematic approach
outlined in the handbook, evaluators can ensure that evaluations are well-conceived,
effectively executed, and ultimately contribute to informed decision-making and
program improvement.

Key takeaways from the lecture include the necessity of:

1. Clearly defining evaluation objectives and questions to guide the evaluation process.
2. Selecting appropriate evaluation designs and methods tailored to the program's context
and goals.
3. Establishing evaluation criteria and indicators that align with program objectives and
outcomes.
4. Engaging stakeholders throughout the evaluation process to ensure their perspectives
are considered and valued.
5. Collecting and analyzing data rigorously to generate meaningful insights and findings.
6. Communicating evaluation results in a clear, concise, and actionable manner to inform
program stakeholders.

I hope this lecture has provided a helpful overview of planning and designing useful
evaluations. If you have any questions, please don't hesitate to ask.
