
G3658-1W

Revised February 2008

Planning a Program Evaluation: Worksheet

Steps in Program Evaluation


1. Engage stakeholders

2. Focus the evaluation
 Describe the program – logic model
 Define evaluation purpose
 Identify use/users
 Determine key questions
 Select indicators
 Determine design
 Determine sample

3. Collect data
 Define sources
 Select method(s)
 Pilot test
 Set schedule

4. Analyze & interpret
 Process data
 Analyze
 Interpret
 What did you learn?
 What are the limitations?

5. Use
 Share findings and lessons learned
 Use in decision making
 Determine next steps

Manage the evaluation (throughout):
 Human Subjects Protection
 Timeline
 Responsibilities
 Budget

Standards of evaluation:
 Utility   Feasibility   Propriety   Accuracy

University of Wisconsin-Extension • Cooperative Extension • Program Development & Evaluation


UW-Extension provides equal opportunities in employment and programming, including Title IX and ADA.
© 2006 by the Board of Regents of the University of Wisconsin System http://www.uwex.edu/ces/pdande
Engage Stakeholders
Who should be involved?
Administration, teachers, students

How might they be engaged?


By raising the issue of the evaluation early to obtain their support and cooperation; by soliciting their feedback, which will be the most useful information we gather; and by keeping them informed of the evaluation's results.

Focus the Evaluation


What are you going to evaluate? Describe program (logic model).
I am going to evaluate the Independent Study program at Royal West Campus. It is designed to allow students to
finish one high school course at a time while working or parenting full time (or just while generally having such a
busy life that attending school on a full-time basis is not possible). In the program, students are required to attend
only once a week, checking in with their teacher to hand in completed homework and pick up the homework
package for the next week. The program is somewhat similar to a correspondence course in its self-directed
nature, but with teacher support on a weekly basis.

What is the purpose of the evaluation?


To determine the effectiveness of the program. Anecdotally, teachers know the program has a high drop-out rate due to the ease with which students become disconnected from the school. In addition to examining the statistics more closely, the evaluation will engage the stakeholders to determine what the goals of the program are, assess the effectiveness of the program's implementation, and finally assess whether those goals are being met.

Who will use the evaluation? How will they use it?
Who/users — How will they use the information?

Administration: To make evidence-based decisions regarding the program's implementation and outcomes, with a view to possible changes.

Teachers: To better understand their role in the program, both as it currently stands and as it potentially might look: at present, some teachers struggle to fully comprehend what they should be doing other than simply providing homework packages.

Students: To be better informed at the outset about what the program is designed to do, as well as the expectations for how best to achieve those aims.

What questions will the evaluation seek to answer?


Outcomes: What do people do differently as a result of the program? What, if any, are the unintended secondary
or negative effects? How efficiently are clientele and agency resources being used?
Implementation: What are the roles of the individual stakeholders? Are the delivery methods effective? Who
actually carries out the program and how well do they do so?
Context: How well does the program fit in the local setting—with educational needs and learning styles of target
audience? What in the socio-economic environment inhibits or contributes to program success or failure? What in
the setting are givens and what can be changed?
Need: What needs are appropriately addressed through the Independent Study program? What are the
characteristics of the target population? What changes do people see as possible or important?

What information do you need to answer the questions?
What I wish to know — Indicators: how will I know it?

Outcomes questions: Comparison of feedback from stakeholders in the program versus what they see in the regular "Daily Instruction" program.

Implementation questions: Assessment of stakeholders' perceptions, coupled with a comparison of the data: how many students finish courses in the Independent Study program versus the regular Daily Instruction program.

Context questions: Will determine the role students play in their own success. That is to say, are the results different for students of different backgrounds? For example, do upgraders finish courses in the I.S. program more frequently than Adult 12 students?

Need questions: Regardless of success rate, is the success of the I.S. program the fact that it allows some students (regardless of how many) to finish courses they otherwise never would have? Is the unique nature of the program enough to justify its existence?
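As an illustrative sketch of the completion-rate comparison between the two programs, the tabulation might look like the following. All figures here are hypothetical placeholders, not actual school data.

```python
# Sketch: comparing course-completion rates between the Independent Study
# (I.S.) and Daily Instruction programs. Numbers are hypothetical.

def completion_rate(completed, enrolled):
    """Return the share of enrolled students who completed a course."""
    return completed / enrolled if enrolled else 0.0

# program -> (courses completed, students enrolled); placeholder figures
programs = {
    "Independent Study": (18, 60),
    "Daily Instruction": (45, 75),
}

for name, (completed, enrolled) in programs.items():
    rate = completion_rate(completed, enrolled)
    print(f"{name}: {completed}/{enrolled} = {rate:.0%}")
```

The real comparison would substitute the school's own completion statistics and could be broken down further by student background (e.g., upgraders versus Adult 12 students), per the context questions above.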

When is the evaluation needed?


Mid-June works well because the school is currently going through its own visioning process for next year. We
are currently examining all our programs in an effort to continue to improve and solidify them.

What evaluation design will you use?


Investigative surveys of all stakeholders will be the key piece to obtaining the necessary feedback to do an
effective evaluation of the program. The key will be to establish whether the goals of the program match the
outcomes.

Collect the information


What sources of information will you use?

Existing information: Course completion stats, drop-out rates, final averages


People: Observations and reflections from stakeholders
Pictorial records and observations:

What data collection method(s) will you use?

Y Survey                  Y Document review
  Interview               Y Testimonials
Y Observation             Y Expert panel
  Group techniques          Simulated problems or situations
  Case study                Journal, log, diary
  Tests                     Unobtrusive measures
  Photos, videos            Other (list)

Instrumentation: What is needed to record the information?
Perhaps not the answer that was sought for this question, but time and patience are what is needed most. The nature of the program is such that students do not attend regularly, and those who do attend are obviously those for whom the program already works in its current form. As a result, if we are to get an accurate reflection of the program and how it affects all the stakeholders, we will need feedback from those for whom the program has not worked as well. This means patiently waiting to connect with those who do not connect with the school regularly enough to be successful.
Other than time, we will need to collate the information gathered from student surveys and teacher meetings so that sense can be made of the feedback.

When will you collect data for each method you’ve chosen?
Method — When collected (before program / during program / immediately after / later)

Survey, testimonials: during the program and immediately after (timing depends on the student)
Observation, expert panel: during the program
Document review: during the program and later

Will a sample be used?

Y No
  Yes — If yes, describe the procedure you will use.

Pilot testing: when, where, how?

Analyze and Interpret
How will the data be analyzed?
Data analysis methods: The questions on both the survey and those used in meetings will all end up with written answers, meaning the results will have to be read and made sense of manually. Some questions will be open-ended, while others will be closed; some information will therefore have to be synthesized by the reader(s), while other information can be reduced to numerical data.

Who is responsible: Myself, along with possibly another teacher, and perhaps our Director.
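A minimal sketch of how the closed-ended survey answers could be reduced to numerical data is below; the question and response values are hypothetical examples, not actual survey items.

```python
# Sketch: tallying closed-ended survey responses into counts.
# The responses below are hypothetical placeholders.
from collections import Counter

# One entry per student answering, e.g., "Which part of the program
# helps you most?"
responses = [
    "weekly check-in", "homework package", "weekly check-in",
    "teacher support", "weekly check-in", "homework package",
]

tally = Counter(responses)  # response -> number of students
for answer, count in tally.most_common():
    print(f"{answer}: {count}")
```

Open-ended answers would still be read and synthesized manually; only the closed-ended items lend themselves to this kind of count.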

How will the information be interpreted—by whom?


Most prominently, our Director will use the information gleaned to help assess what to do with the program for
next school year. We are going through a visioning process at the moment specifically so we can inform students
who are hoping to pre-register this spring for next fall’s start-up. The more information we have on our programs
now and what they might look like next year, the better we can direct our students in terms of which programs
provide the best fit for them.

What did you learn? What are the limitations?


Hopefully we will learn about any differences between how our program has been conceived and how it is being delivered. The idea of this program is fantastic, but the results have not been. It will also be important to note what we learn about how the program is perceived by the students who are supposed to benefit from it; perhaps our ideas and ideals do not match how they conceive of the program.
The limitations, of course, stem from the subjectivity of the information gathered from participants in the program. That is, will students be willing to offer their full honesty, even when that honesty might incriminate them? Are we going to learn what students are actually doing in the program, or what they would ideally do in it?

Use the Information


How will the evaluation be communicated and shared?
To whom — When/where/how to present

Admin: Written report as well as summary statements and possibly graphs and charts, delivered in a closed-door meeting to elaborate on the findings.

Staff: Highlighted findings of the report (short summary statements), presented in a meeting so as to continue the process of improving the program by including the stakeholders in the decision-making process surrounding next year's implementation.

Students: The students won't see the evaluation report per se, but they will see the new and improved version of the program when we roll it out. We'll stress the differences to them in order to encourage them to sign up for a program that will (hopefully) better meet their needs of obtaining high school credits without daily attendance.

Authors: Taylor-Powell, E.; Steele, S.; and Douglah, M.


Next steps?

Manage the evaluation Standards


Y Human subjects protection      Y Utility
Y Management chart               Y Feasibility
Y Timeline                       Y Propriety
Y Responsibilities               Y Accuracy
  Budget
