PURPOSE
This session addresses common misconceptions and issues colleges experience around evaluation, clarifies the differences between evaluation and research, and walks through how to develop and implement a sound evaluation for an intervention or program.
Center for Applied Research at CPCC 2013
PROGRAM EVALUATION
What is evaluation?
Evaluation is a profession composed of persons with varying interests, potentially encompassing, but not limited to, the evaluation of programs, products, personnel, policy, performance, proposals, technology, research, theory, and even evaluation itself.
Go to http://www.eval.org. At the bottom of the homepage there is a link to a free training package and facilitator's guide for teaching the Guiding Principles for Evaluators.
MORE ON EVALUATION
As defined by the American Evaluation Association, evaluation involves assessing the strengths and weaknesses of programs, policies, personnel, products, and organizations to improve their effectiveness.
Evaluation is the systematic collection and analysis of data needed to make decisions, a process in which most well-run programs engage from the outset. Here are just some of the evaluation activities that are likely already incorporated into many programs or that can be added easily: Pinpointing the services needed (for example, finding out what knowledge, skills, attitudes, or behaviors a program should address).
CONTINUED
Establishing program objectives and deciding on the particular evidence (such as the specific knowledge, attitudes, or behavior) that will demonstrate the objectives have been met. A key to successful evaluation is a set of clear, measurable, and realistic program objectives. If objectives are unrealistically optimistic or are not measurable, the program may not be able to demonstrate that it has been successful even if it has done a good job. Developing or selecting from among alternative program approaches (for example, trying different curricula or policies and determining which ones best achieve the goals).
CONTINUED
Tracking program objectives (for example, setting up a system that shows who gets services, how much service is delivered, how participants rate the services they receive, and which approaches are most readily adopted by staff). Trying out and assessing new program designs (determining the extent to which a particular approach is being implemented faithfully by school or agency personnel).
PROGRAM EVALUATION
Purpose
To establish better products, personnel, programs, organizations, and governments, serving consumers and the public interest; to contribute to informed decision making and more enlightened change; to precipitate needed change; to empower all stakeholders by collecting data from them and engaging them in the evaluation process; and to experience the excitement of new insights. Evaluators aspire to construct and provide the best possible information that might bear on the value of whatever is being evaluated.
Definition of Evaluation
Study designed and conducted to assist some audience to assess an object's merit and worth.
(Stufflebeam, 1999)
Identification of defensible criteria to determine an evaluation object's value (worth or merit), quality, utility, effectiveness, or significance in relation to those criteria. (Fitzpatrick, Sanders & Worthen, 2004)
Definition of Evaluation
Goal 1: Determine the merit or worth of an evaluand.
(Scriven, 1991)
Goal 2: Provide answers to the significant evaluative questions that are posed. Evaluation is a value judgment based on defensible criteria.
Evaluation Questions
Evaluation questions provide the direction and foundation for the evaluation; without them, the evaluation will lack focus. The evaluation's focus will determine the questions asked.
Needs Assessment Questions? Process Evaluation Questions?
TYPES OF EVALUATION
Process evaluation determines whether the processes are happening according to plan. The processes of a program are the nitty-gritty details, the "dosage" of activities that students, patients, or clients receive. It is the who is going to do what, and when. It answers the question: Is this program being delivered as it was intended?
TYPES OF EVALUATION
Outcome evaluation (the most critical piece for accreditation) determines how participants do on short-range, mid-range, or long-range outcomes. It usually involves setting program goals and outcome objectives. It answers the question: Is this program working, and are participants accomplishing what we intended for them to accomplish?
TYPES OF EVALUATION
Impact evaluation determines how the results affected the student group, college, community, or family (the larger group, over time). It answers the question: Is this program having the impact it was intended to have? (So you must start with intentions.)
[Diagram: Needs Assessment → Process → Did it Work? (Outcomes) → Long-range Outcomes → Impact]
IR DEPARTMENTS
Evaluation vs. Research
Use: Evaluation is intended for use; use is the rationale. Research produces knowledge and lets the natural process determine use.
Questions: In evaluation, the decision-maker, not the evaluator, comes up with the questions to study. In research, the researcher determines the questions.
Judgment: Evaluation compares what is with what should be; does it meet established criteria? Research studies what is.
Setting: Evaluation occurs in an action setting; priority is to the program, not the evaluation. In research, priority is to the research, not to what is being studied.
Roles: In evaluation, there is friction between the evaluator's role and the program giver's role because of the judgmental quality of evaluation. In research, there is no such friction between researcher and funder.
In Community Colleges
ANALYSIS PARALYSIS: "Let's slice and dice the data more and more and more." Too much data to analyze. Colleges don't know what it tells them. How do we make a decision about priorities and strategies from 200 pages of data tables?
LOGIC MODELS
Relationships, linkages. Multiple models.
Source: Adapted from UW-Extension: http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
Where are you going? How will you get there? What will tell you that you've arrived?
INPUTS (what we invest): Extension invests time and resources.
OUTPUTS (what we do): We conduct a variety of educational activities targeted to individuals who participate.
OUTCOMES (what results): Participants gain knowledge, change practices, and have improved financial well-being.
Source: E. Taylor-Powell, University of Wisconsin-Extension, Cooperative Extension
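To make the inputs → outputs → outcomes chain concrete, here is a minimal sketch (Python; the program name, list items, and LogicModel structure are our illustration, not part of the Taylor-Powell model) of how a team might record a logic model so each element stays explicit:

from dataclasses import dataclass, field

@dataclass
class LogicModel:
    # One program's logic model: what we invest, what we do, what results.
    program: str
    inputs: list[str] = field(default_factory=list)    # what we invest
    outputs: list[str] = field(default_factory=list)   # what we do
    outcomes: list[str] = field(default_factory=list)  # what results

# Hypothetical example, loosely following the Extension model above.
model = LogicModel(
    program="Financial education workshops",
    inputs=["staff time", "curriculum", "partner funding"],
    outputs=["educational activities delivered to participants"],
    outcomes=["participants gain knowledge",
              "participants change practices",
              "participants improve financial well-being"],
)
print(model)

A record like this can later be extended with an indicator for each outcome; those indicators become the targets your evaluation questions check against.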
Example: parent education program
INPUTS: staff, money, partners, research
OUTPUTS: develop parent education curriculum
OUTCOMES: parents better understand their own parenting style; parents gain skills in effective parenting practices; long term, strong families
Assumptions:
External factors:
Example: chain of outcomes for smoke-free (SF) worksites
Outputs: increased awareness of the importance of SF worksites; increased knowledge of SF worksite benefits and options (audiences include unions).
Outcomes: increased commitment, support, and demand for SF worksites; demonstrations of public support for SF worksites; SF worksite policies drafted; SF worksites.
Source: E. Taylor-Powell, University of Wisconsin-Extension, Cooperative Extension
Needs Assessment Questions? INPUTS: What resources are needed for starting this intervention strategy?
CHAIN OF OUTCOMES
SHORT → MEDIUM → LONG-TERM
Food safety: seniors increase knowledge of food contamination risks → practice safe cooling of food and follow food preparation guidelines → lowered incidence of foodborne illness.
Financial management: participants increase knowledge and skills in financial management → establish financial goals and use a spending plan → reduced debt and increased savings.
Childcare: community increases understanding of childcare needs (short-term).
Community garden: empty inner-city parking lot converted to a community garden → youth and adults learn gardening skills, nutrition, food preparation, and management → money saved, nutrition improved, residents enjoy a greater sense of community.
Source: E. Taylor-Powell, University of Wisconsin-Extension, Cooperative Extension
AT YOUR TABLES...
Select an ATD student success initiative at your college that you plan to evaluate before you make the decision to scale it up. (If you can't think of one, use the online learning example in your handouts.)
1. Why did you develop this program with these program characteristics?
2. What do you think students (or participants) will get out of this program (what changes)?
3. How do you tie specific program content to specific expected changes or improvements in participants?
How would you create these targets or benchmarks? Do you need a comparison group? What is an acceptable level of improvement or change?
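As one hypothetical illustration of setting and checking such a target (Python; the rates, the 5-point threshold, and all names below are invented for illustration, not a recommended standard):

def improvement(intervention_rate: float, comparison_rate: float) -> float:
    # Percentage-point gain of the intervention group over the comparison group.
    return intervention_rate - comparison_rate

intervention_success = 72.0  # hypothetical: course success rate (%) for program students
comparison_success = 65.0    # hypothetical: rate (%) for similar students not in the program
target_gain = 5.0            # hypothetical: acceptable improvement, set before results are in

gain = improvement(intervention_success, comparison_success)
print(f"Gain: {gain:.1f} percentage points; "
      f"{'meets' if gain >= target_gain else 'does not meet'} the target.")

Whatever the numbers, choose the comparison group to be as similar as possible to the intervention group, and set the acceptable level of change before the results come in.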
Will your data collection tools be online or pencil-and-paper (what are the benefits of each)? When do they need to be ready? Who needs copies? Create an evaluation timeline.
CLOSING
Establish your plan. Follow your plan. Assign responsibility for it. Expect big things. Use results to improve what you do (close the loop).