
EVALUATION QUESTIONS

Process Evaluation
• How does the program work?
• What services are provided?
• Is the program implemented as planned?
• What are the outputs?
• What works or does not work in the program?


Outcome Evaluation
• Assess program effectiveness
• What are the results?
  – Short term
  – Long term
Choose the type of evaluation to conduct:
• Decide on the data to be collected to monitor performance measures/outputs and outcomes
• Create instruments needed to collect data
• Decide how to analyze and use data

• Need up-front planning
• Need a sense of what you are trying to accomplish
• What data will you collect, and why?
• What data sources are available, and which will you use?
Data collection methods:
• Activity logs/skill sheets: Written documentation of participants' attendance, achievement, or acquisition of skills. Good for "What?" and "How many?" questions.
• Document/records review: Review of written documents such as performance ratings, program logs, tally sheets, and other existing indicators. Good for "What?" and "How many?" questions.
• Focus groups: Moderated discussions on a particular topic or issue. Good for "What?", "How?", and "Why?" questions.
• Interviews: Data collection through oral conversations. Good for "What?" and "Why?" questions.
• Observation: Watching people engaged in activities and recording what occurs. Good for "How?", "What?", and "How many?" questions.
• Questionnaires: Written responses to clearly defined questions. Good for "What?" and "How many?" questions.
Construct your own instruments:
• Attendance sheets
• Activity logs
• Surveys for clients

Use existing instruments:
• Templates available online
• Modify to suit your particular program

Instrument formats:
• Paper
  – Internal forms/checklists
• Electronic (a minimal sketch follows this list)
  – Excel, Access, SPSS
• Online
  – Web-based data collection tools
  – OJJDP’s Data Collection & Technical Assistance Tool (DCTAT)
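To make the "Electronic" option concrete, here is a minimal sketch of tallying attendance from an electronic activity log. It assumes a hypothetical CSV file named activity_log.csv with columns participant_id, date, activity, and attended; a spreadsheet exported from Excel or a web-based collection tool could produce a file like this.

# Minimal sketch: tally attendance per activity from a hypothetical
# CSV activity log with columns participant_id, date, activity, attended.
import csv
from collections import Counter

attendance = Counter()
with open("activity_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Count a participant only if the log marks them as attending.
        if row["attended"].strip().lower() == "yes":
            attendance[row["activity"]] += 1

# Print a simple output measure: attendances per activity.
for activity, count in attendance.most_common():
    print(f"{activity}: {count} attendances")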
Analyze Data:
• How will the data be analyzed? (see the sketch after this list)
• Who will be responsible for data analysis?

Use Data:
• How will the data be used?
• Will the data be made available in reports?
• Who will receive the reports?
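To make the analysis step concrete, here is a minimal sketch of the kind of summary a report might draw from outcome data. The participant scores below are hypothetical placeholders; real data would come from the instruments described above.

# Minimal sketch: summarize hypothetical pre- and post-program scores
# into the figures a report might include (counts, means, average change).
from statistics import mean

scores = [
    {"participant": "A01", "pre": 55, "post": 70},
    {"participant": "A02", "pre": 60, "post": 62},
    {"participant": "A03", "pre": 48, "post": 66},
]

pre_mean = mean(p["pre"] for p in scores)
post_mean = mean(p["post"] for p in scores)
avg_change = mean(p["post"] - p["pre"] for p in scores)

print(f"Participants analyzed: {len(scores)}")
print(f"Mean pre-program score:  {pre_mean:.1f}")
print(f"Mean post-program score: {post_mean:.1f}")
print(f"Average change:          {avg_change:+.1f}")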
A program is evidence-based if:
• evaluation research shows that it produces the expected positive results;
• the results can be attributed to the program itself, not to other factors or events;
• the evaluation is peer-reviewed by experts in the field;
• the program is “endorsed” by a federal agency or respected research organization and included in its list of effective programs.

From: Evidence-based Programs: An Overview
http://www.uwex.edu/ces/flp/families/whatworks_06.pdf
Government-wide move toward accountability:
• Government Performance and Results Act
  – Shift from accountability for process to accountability for results
  – Programs must show effectiveness to justify funding
• President’s Management Agenda


• Evidence-based practices are cost effective
• It is important to provide programs that work
• Use data to create interest and support for program continuation
  – From funders
  – From the community

• Program evaluation helps identify the most successful components of your program
  – Maintain effective staff, procedures, or activities
• Program evaluation is an ongoing process
• Start with a logic model to identify the main elements of the program
• Choose the type of evaluation needed
• Plan data collection and analysis activities in advance
• Use program evaluation results to justify and market the program
Boulmetis, J., & Dutwin, P. (2002). The ABC’s of Evaluation. San Francisco: Jossey-Bass.

The Program Manager’s Guide to Evaluation. Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.
http://www.acf.hhs.gov/programs/opre/other_resrch/pm_guide_eval

Planning a Program Evaluation. University of Wisconsin Cooperative Extension.
http://learningstore.uwex.edu/assets/pdfs/G3658-1.pdf

The Global Social Change Research Project.
http://gsociology.icaap.org/methods/evaluationbeginnersguide.pdf
