
EVALUATION

• Evaluation is the identification of the value or usefulness of something

• Evaluation = appraisal = rating = assessment
CHARACTERISTICS OF EVALUATION
• Value Focus → focuses on judging the value of a policy or program
• Fact-Value Interdependence → evaluation not only carries value for a particular group in society, but is also a consequence of solving a problem
CHARACTERISTICS OF EVALUATION
• Present and Past Orientation → concerned with current and past outcomes rather than future ones

• Value Duality → values in evaluation carry two meanings: as ends and as means
FUNCTIONS OF EVALUATION
• Performance → the realization of the needs, values, and opportunities addressed by a policy, program, or plan
• Clarification and critique of the values underlying the choice of goals and objectives
• Application of other policy analysis methods
Why Evaluate?
• Determine planning & program outcomes
• Identify planning & program strengths
• Identify and improve weaknesses
• Justify use of resources
• Increased emphasis on accountability
• Professional responsibility to show
effectiveness of planning & program
What is Planning Evaluation?
• Purposeful, systematic, and careful
collection and analysis of information used
for the purpose of documenting the
effectiveness and impact of planning &
programs, establishing accountability, and
identifying areas needing change and
improvement
Kinds of Evaluation
• Outcome
• Implementation
• Formative
• Summative
Timing of Evaluation
• Formative
– conducted while the program is happening, to make changes as it is being implemented

• Summative
– at the end of a program to document results
Overview – The 9-step Process
Step 1: Scope/Purpose of
Evaluation
• Why are you doing the evaluation?
– mandatory? program outcomes? program
improvement?
• What is the scope? How large will the
effort be?
– large/small; broad/narrow
• How complex is the proposed evaluation?
– many variables, many questions?
• What can you realistically accomplish?
Resource Considerations
• Resources
  – $$
  – Staff
    • who can assist?
    • need to bring in expertise?
    • do it yourself?
    • advisory team?
  – Time
• Set priorities
• How you will use the information
Step 2: Specify Evaluation
Questions
• What is it that you want to know about your
planning & program?
– operationalize it (make it measurable)

Do not move forward if you cannot answer this question!


Sources of Questions
• Strategic plans
• Mission statements
• Policies
• Needs assessment
• Goals and objectives
• National standards and guidelines
Broad & Narrow Questions
• From the list of questions, identify
those that might be considered broad
and those that might be considered
narrow
• How large will the resources need to
be to answer the question?
Identifying Evidence Needed to
Answer Your Questions
• What evidence do you have to answer
your question?
• Need to think about what information
you need in order to answer your
evaluation questions
• List evidence you need to have to
answer the question
Step 3: Specify Evaluation Design
• Relates to when data should be collected
• Status (here and now; snapshot)
• Comparison (group A vs. group B;
program A vs. program B)
• Change (what happened as a result of a
program; what differences are there
between time A and time B)
• Longitudinal (what happens over
extended time)
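
The "change" design above can be illustrated with a minimal Python sketch that compares the same indicator at time A and time B for the same participants; the participant IDs and scores are invented purely for illustration.

```python
# Minimal sketch of a "change" evaluation design: compare the same
# indicator measured before (time A) and after (time B) a program.
# Participant IDs and scores are made-up example data.
from statistics import mean

scores_time_a = {"p01": 52, "p02": 61, "p03": 47, "p04": 70}  # baseline
scores_time_b = {"p01": 64, "p02": 66, "p03": 58, "p04": 72}  # after program

changes = {pid: scores_time_b[pid] - scores_time_a[pid] for pid in scores_time_a}

print("Change per participant:", changes)
print("Average change:", round(mean(changes.values()), 1))
```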
Step 4: Create a Data Collection
Action Plan
Organize Your Evaluation With a Data
Collection Action Plan
A simple template lists, for each evaluation question, the following columns:
• Evaluation Question
• What Is Collected
• How Collected / What Technique
• From Whom / Data Sources
• When Collected
• How Data Are to Be Analyzed, and By Whom
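
One lightweight way to work with such a plan is to store each row as a small record. The sketch below is only an illustration: the field names mirror the template columns above and the example values are hypothetical.

```python
# Hypothetical data collection action plan stored as structured data;
# each dictionary mirrors one row of the template above.
action_plan = [
    {
        "evaluation_question": "Did participants' knowledge of zoning rules improve?",
        "what_is_collected": "Pre/post quiz scores",
        "how_collected": "Self-constructed questionnaire",
        "data_sources": "Workshop participants",
        "when_collected": "First and last session",
        "analysis_and_by_whom": "Average change in scores; planning staff",
    },
]

# Quick check that every row fills in every column of the template.
required = {"evaluation_question", "what_is_collected", "how_collected",
            "data_sources", "when_collected", "analysis_and_by_whom"}
for row in action_plan:
    missing = required - row.keys()
    if missing:
        print("Row is missing:", ", ".join(sorted(missing)))
```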
Step 5: Collect Data
• How much data do you need?
  – 100% of the target audience is ideal, but may be too expensive and time consuming
  – If not 100%, a sample is OK as long as it is representative of the group as a whole (the population)
Types of Samples
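
Since the slide above only names the topic, here is a small standard-library Python sketch of two common options when surveying 100% of the population is too costly: a simple random sample and a stratified sample. The population and strata are invented for illustration.

```python
# Sketch of two common sampling approaches for Step 5 (illustrative data only).
import random

random.seed(42)  # make the example reproducible

population = [f"resident_{i:03d}" for i in range(1, 201)]  # 200 people

# 1) Simple random sample: every member has an equal chance of selection.
simple_sample = random.sample(population, k=30)

# 2) Stratified sample: sample within each subgroup so all are represented.
strata = {
    "district_A": population[:120],
    "district_B": population[120:],
}
stratified_sample = []
for name, members in strata.items():
    k = max(1, round(0.15 * len(members)))  # about 15% of each stratum
    stratified_sample.extend(random.sample(members, k))

print(len(simple_sample), "in simple random sample")
print(len(stratified_sample), "in stratified sample")
```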
Data Collection Options
• Commercial instrument
• Survey/questionnaire
• Focus group/interviews
• Observations
• Archived information
Data Collection Considerations
• When should you collect the information?
• Who should collect it?
Commercial Instruments &
Decision-Making Checklist
• It is sometimes best to use published or research instruments
  – particularly for tough constructs
  – since they are not made specifically for you, they may not answer your question entirely
• This checklist will help you review the data collection instruments you are considering using in your evaluation
Self-Constructed Instruments:
Questionnaires
• Focus on evidence you need
• Use simple language
• Ask only what you need; keep it short
• Don’t use jargon
• Each question should focus on one idea
• Make sure terms are clear
• Make it easy for person to answer the
questions (check rather than write, where
possible)
• Use extended response when you want
details
Self-Constructed Instruments:
Focus Groups/Interviews
• Good to use when you want extended and
detailed responses
• Craft an agenda and stick to it
• Keep groups small (6-10); time short (1-1.5
hours)
• Specify objectives of session
• Questions need to be clear; one question at
a time
• Encourage everyone to participate
• Use opportunity to probe deeper on a topic
Step 6: Analyze Data
What is Data Analysis?
• Data collected during program evaluation are
compiled and analyzed (counting; number
crunching)
• Inferences are drawn as to why some results
occurred and others did not
• Can be very complex depending on your
evaluation questions
• We will focus on simple things that can be done
without expert consultants
Types of Data Analysis
• Simple frequency counts
• Sort by relevant categories
• Calculate percentages
• Showing change or differences
• Reaching an objective or goal
• Observing trends
• Graph results
• Calculate averages
• Calculate weighted averages
• Rank order weighted averages
• Graph weighted averages
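
To make the simpler techniques in the list above concrete, here is a short Python sketch using made-up survey responses; it shows frequency counts, percentages per category, and a weighted average of a 1-5 rating item (the weights here are simply the rating values, one common convention).

```python
# Illustrative Step 6 analysis of made-up survey responses.
from collections import Counter

# Responses to "How useful was the planning workshop?" on a 1-5 scale.
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4, 4, 5]
total = len(responses)

# Simple frequency counts.
counts = Counter(responses)
print("Counts:", dict(sorted(counts.items())))

# Percentages per response category.
percentages = {rating: 100 * n / total for rating, n in sorted(counts.items())}
print("Percentages:", {r: f"{p:.1f}%" for r, p in percentages.items()})

# Weighted average: each rating weighted by how many people chose it.
weighted_average = sum(rating * n for rating, n in counts.items()) / total
print("Weighted average rating:", round(weighted_average, 2))
```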
Using Focus Group/Interview
Information
• Qualitative findings from focus groups,
extended response items, etc., should
be analyzed in a different way
– Code words/frequency
– Identify themes
– Pull quotes
– Summarize and draw conclusions
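
As a very small illustration of the coding step, the sketch below counts how many comments mention each agreed-upon code word; the comments and code list are invented, and real thematic analysis still relies on human judgment, not just counting.

```python
# Toy sketch: code focus-group comments by counting mentions of code words.
from collections import Counter

comments = [
    "Parking near the park is a problem every weekend.",
    "We need safer sidewalks and better parking downtown.",
    "The sidewalks are cracked and hard to use with a stroller.",
]

codes = ["parking", "sidewalk", "safe"]  # hypothetical coding scheme

code_counts = Counter()
for comment in comments:
    text = comment.lower()
    for code in codes:
        if code in text:  # simple substring match
            code_counts[code] += 1

# The most frequently mentioned codes suggest candidate themes to explore.
print(code_counts.most_common())
```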
Drawing Conclusions
• Examine results carefully and objectively
• Draw conclusions based on your data
• What do the results signify about your planning & program?

Complete the Interpreting Results!
Unintended Consequences
• Watch for positive and negative
outcomes that you did not plan on
Step 7: Document Findings
What to Include in Your Documentation
• Program description
• Evaluation questions
• Methodology (how and from whom and when)
• Response rate
• Methods of analysis
• Conclusions listed by evaluation question
• General conclusions and findings
• Action items
• Recommendations for program improvement and
change
Document the
Successes and Shortfalls
• Highlight and brag about positive
outcomes
• Document shortfalls
– Provides opportunities to
• improve planning & program
• make recommendations to benefit the
planning & program
Step 8: Disseminate Information
• Inform all your relevant stakeholders
on results
• Dissemination methods should differ
by your target audience
Dissemination Techniques
– Reports
– Journal articles
– Conferences
– Career Newsletter/Tabloids
– Presentations
– Brochures
– TV and newspaper interviews
– Executive summary
– Posting on Web site
Feedback to Program
Improvement
• You can use evaluation findings to make
planning & program improvements
– Consider adjustments
– Re-examine/revise planning & program strategies
– Change programs or methodologies
– Increase time with the planning & program
• Use your results as a needs assessment for
future efforts
