
MODULE 6 Educational Evaluation

Module Overview

Hello! Welcome to Module 6. In this module, you will explore evaluation as a tool for improving the quality of educational services and programs. Specifically, you will examine the significance of educational evaluation and a variety of evaluation approaches, methods, and techniques. The module emphasizes the CIPP evaluation model, which remains a practical lens for grasping the impact of evaluation on academic institutions and the educational system.

Module Objectives/Outcomes

Upon the completion of this module, you should be able to:
1. articulate the role of evaluation in ensuring the quality of education academic institutions provide;
2. compare various evaluation approaches, methods and techniques; and
3. explain the CIPP evaluation model as well as its utility in schools.

Lessons in the Module

In this module, you will be exploring these topics:
Lesson 1: Educational Evaluation
Lesson 2: Evaluation Approaches
Lesson 3: Evaluation Methods & Techniques
Lesson 4: The CIPP Evaluation Model

LESSON 1 Educational Evaluation

Learning Outcomes

Upon the completion of this lesson, you should be able to:
1. demonstrate concrete knowledge and understanding of educational evaluation; and
2. explain the rationale behind the conduct of educational evaluations.
Time Frame 2 hours
Ready to begin a new lesson? Good! Lesson 1 introduces you to the utilization of evaluation as a valuable decision-making tool for educators, administrators, and program designers. This lesson also encourages you, as future teachers, to become active participants in the accreditation endeavors of your future employers.

Activity

Activity 1: Word Cloud

Search the meanings of the words and acronyms shown in the word cloud below. Which terms seem unfamiliar to you?

Analysis

Reflect on the following processing questions.

1. What is your understanding of quality education? Which government agencies set the standards of quality education?
2. How can you ascertain whether a school provides relevant and quality academic programs?
3. What assessment data will support a school’s claim of excellence and quality?
4. Who are the stakeholders of educational evaluation?
5. Why do schools submit themselves to external accreditation?
Abstraction

Educational evaluation is an essential part of educational policy-making, planning, and implementation. It is a systematic, continuous, and comprehensive process of determining the merit, worth, and significance of school initiatives and programs (Navarro, R. L. & Santos, R., 2013). Its main tenet is the holistic appraisal of the learner, his/her environment, and his/her accomplishments (Cajigal, R. & Mantuano, M., 2014).

Educational evaluation is not limited to the teacher-student engagement. All programs or activities in the school that went through deliberate planning and implementation require an assessment of their worth and value (Reyes, E. & Dizon, E., 2015). Educational evaluation is thus deemed imperative as a tool in the continuous quality improvement of schools as institutions. In layman’s terms, educational evaluation is the process of ascertaining the quality of education provided by schools.

As a tool for decision making, educational evaluation generates data that may trigger changes in the current practices, programs, initiatives, activities, and policies of schools. The results of evaluation become the basis for the formulation of appropriate educational decisions and actions (Kubiszyn, T. & Borich, G., 2000).

Instructional & Grading: Inside the classroom, teachers reach instructional decisions with respect to the extent of attainment of the intended learning outcomes. Data are obtained from test results and performance scores. Analysis leads teachers to implement adjustments in the delivery of the lessons and the design of assessment tasks. This also includes decisions on the promotion or retention of students in a particular grade level.

Diagnostic: Assessment of the strengths and weaknesses of the learners allows teachers to identify the root cause/s of difficulty. Diagnostic assessment provides relevant information regarding the readiness of the students. Intervention and remediation programs must be based on needs assessment.

Selection & Placement: Evaluation data may also be gathered to select the
students to be admitted to a program or activity. Moreover, the placement
decision is made once the student is admitted to the school and usually intends
to identify students who need remediation or enrichment classes.

Guidance & Counseling: Guidance and counseling initiatives are deemed more suitable if they are products of assessment. This includes the use of socio-metric and standardized personality tests, anecdotal records, and clinical observations. Evaluation results may become the basis for guidance and counseling initiatives in response to the needs of the learners.

Program/Curriculum: Based on the results of evaluation, school administrators may decide to continue, discontinue, revise, or pursue a program, activity, or curriculum. Evaluation shall lead to better planning and implementation of succeeding school endeavors. Hence, evaluation should be an imperative in every school’s processes and procedures.

Administrative Policy: Given the available resources of the school, a thorough evaluation of the efficiency of the utilization of funding and assets shall provide the basis for modifications in plans, policies, and processes. Decisions on whether to acquire new facilities, machinery, and materials and whether to add more staff must be based on gathered data.

According to the American Evaluation Association (2018), the five guiding principles for evaluators are systematic inquiry, competence, integrity, respect for people, and common good and equity.

Evaluation is an intrinsic and essential component of teaching and learning. The results of an evaluation in an educational setting may determine whether a student passes to the next grade level, a teacher gets promoted, a particular textbook will be used, a course will continue to be offered, a laboratory will require renovation, or a school regulation will be modified. Educational assessment typically uses preselected measurements such as norm-referenced standardized tools to measure and evaluate the quality of learners, instructors, classes, institutions, or the educational system as a whole (ACSME, 2007).
Competency evaluation is a means for teachers to determine the ability of their students, not necessarily through a standardized test. Performance evaluation ascertains the extent of the capability to demonstrate a particular skill. Course evaluation assesses the quality of the delivery of a given course, while program evaluation determines whether a program “works”. All of these are components of educational evaluation.

The evaluation process goes through four phases: planning, implementation, analysis, and reporting.

Planning: In the planning phase, there must be constructive alignment among objectives, programs, and evaluation criteria. What are the program’s conceptual underpinnings? What information is needed to make decisions? Which stakeholders will be directly involved in the process? Designing the data collection tool is also a foremost concern in this phase.

Implementation: In the implementation phase, the primary concern is the administration of the data collection tool. Extra care in data gathering and handling is a must to ensure the authenticity of findings.

Analysis: In the analysis phase, objectivity in interpretation and credibility of the findings are to be established. Appropriate quantitative and qualitative data analysis tools must be utilized carefully.

Reporting: In the reporting phase, the evaluation results are translated in concordance with the context of the recipients of the findings. Data presentation must lead to clarity and not confusion. Consequently, the results will lead to planning for program changes.

All these four phases complete the evaluation cycle regardless of the
evaluation approach employed by the academic institutions.

As evidence of the significance of educational evaluation, many schools pursue accreditation endeavors. DepEd, CHED, and TESDA have established respective standards for K-12, tertiary, and technical-vocational education. These standards have become the basis of the evaluation tools of several external accrediting agencies such as PACUCOA, PAASCU, ACSCU, and AACCUP.
Application

Activity 2: Pattern Fan

Fill up the template shown by writing the important ideas you have learned about educational evaluation.

EDUCATIONAL EVALUATION
1. ____________________________________
2. ____________________________________
3. ____________________________________

Activity 3: Case Analysis

Given the following cases, identify the type of educational decision and the assessment action/s to be done.

Case                                                        Educational Decision      Assessment Action/s
1. Readiness of Grade 10 students to take the NAT
2. Implementation of the Remediation Classes in Reading
3. Shift from Face-to-Face Delivery to Full Online Mode
4. Utilization of a learning management system

Congratulations on completing Lesson 1! Educational evaluation is an essential mechanism for the smooth flow of processes and operations. Educational decisions based on evaluation data are deemed trustworthy. However, what are the approaches and methods in educational evaluation? This shall be the focal point of the next lesson.

LESSON 2 Evaluation Approaches

Learning Outcomes

Upon the completion of this lesson, you should be able to:
1. distinguish the different evaluation approaches.
Time Frame 2 hours
Ready to begin a new lesson? Good! Lesson 2 introduces you to the different evaluation approaches that schools may employ to ascertain the attainment of intended outcomes. In particular, you will explore the key features of these evaluation approaches.

Activity

Activity 1: Background Knowledge Probe

Based on your understanding of the types of assessment, identify which type must be employed according to the intention of the evaluator. Indicate your response by putting a check under the appropriate column.

Intention                                                                 Formative Assessment    Summative Assessment
1. Group students according to their achievement levels.
2. Provide timely feedback to students.
3. Help students to feel safe to take risks and make mistakes in the classroom.
4. Certify learning and award qualification.
5. Diagnose student learning needs.
6. Motivate students to increase effort and achievement.
7. Actively engage students in their own learning process.
8. Provide information about student performance to stakeholders.


Analysis

Reflect on the following processing questions.

1. How do you distinguish diagnostic, formative and summative assessment from each other?
2. Is assessment synonymous with evaluation? If not, contrast them with one another.
3. What is educational evaluation? Why do we conduct evaluation? What can we derive from evaluation results? Who benefits from the results of the evaluation?
Abstraction

Effective evaluation ensures that students are properly placed, learning problems and progress are appropriately diagnosed, the performance of teachers is improved and enriched, and academic standards are achieved and sustained. Thus, the choice of the evaluation approach to use is a valid issue.

Evaluation approaches refer to the different ways to view, design, and conduct evaluation activities. Some evaluation approaches provide solutions to problems; others improve existing processes and procedures. Generally, any evaluation process may employ either a formative or a summative approach depending upon the intent of the evaluation activity.

Formative evaluation is an on-going process that allows for feedback to be implemented during a program cycle. Formative evaluation is deemed a process-oriented approach where feedback is generated while the program is being run (Boulmetis, J. & Dutwin, P., 2005). Formative evaluation includes several types (Trochim, W., 2020):

FORMATIVE EVALUATION METHODS

Needs assessment – identifies who needs the program, how great the need is, and what might work to meet the need

Structured conceptualization – defines the program based on the target population and perceived outcomes

Implementation evaluation – monitors the correspondence between the plan and the actual delivery

Process evaluation – investigates the process of delivering the program, including alternative delivery procedures

Summative evaluation takes place at the end of a program cycle, providing an overall description of its effectiveness. Summative evaluation measures the extent of attainment of the program objectives. The results enable schools to determine the future direction of a program or initiative. Summative evaluation includes several types (Trochim, W., 2020):

SUMMATIVE EVALUATION METHODS

Goal-based evaluation – ascertains whether the intended goals of the program or project were achieved

Outcome evaluation – identifies the effects on students of participating in the program

Impact evaluation – determines the effect of the program on larger stakeholders like the community and the educational system

Cost-benefit analysis – investigates the cost-effectiveness of the program

Moreover, House (1978) and Stufflebeam & Webster (1980) classified approaches
for conducting evaluations based on epistemology, perspective, and orientation.

The classification, by epistemology (ethic), perspective, and orientation — political (pseudo-evaluation), questions (quasi-evaluation), and values (true evaluation) — is summarized below.

Objectivist (Utilitarian) epistemology
  Elite (Managerial) perspective
    Political (Pseudo-evaluation): politically controlled; public relations
    Questions (Quasi-evaluation): experimental research; management information systems; testing programs; objective-based; content analysis
    Values (True evaluation): decision-oriented; policy studies
  Mass (Consumers) perspective
    Questions (Quasi-evaluation): accountability
    Values (True evaluation): consumer-oriented

Subjectivist (Intuitionist/Pluralist) epistemology
  Elite (Professional) perspective
    Values (True evaluation): accreditation/certification; connoisseur
  Mass (Participatory) perspective
    Values (True evaluation): adversary; client-centered

In terms of the ways of obtaining knowledge, the objectivist epistemology is associated with utilitarian ethics, which holds that something is good if society as a whole is satisfied with it and that the knowledge acquired can be validated externally through publicly exposed evaluation methods and data. The subjectivist epistemology is associated with intuitionist/pluralist ethics, which posits that there is no single interpretation of “good” and that evaluation entails looking into both explicit and tacit knowledge.

In terms of perspective, evaluation approaches may be categorized as elitist or mass-based. An elitist perspective focuses on the views of administrators and/or experts in the field or profession. On the contrary, the mass-based perspective puts the consumers at the apex of evaluation and is highly participatory in nature. The consumers may refer to the students, parents, community, and employers.

In terms of orientation, evaluation approaches may be clustered into political, question, and values orientations. The political orientation or pseudo-evaluation approaches tend to selectively present information and are skewed towards certain perspectives or ideas. These types of evaluation include public relations-inspired evaluation (a feel-good evaluation focused on the positives of a program), politically controlled evaluation (multiple truths uncovered), and evaluation by pretext (the client has a hidden agenda for conducting the evaluation that is unknown to the evaluator).

The question orientation or quasi-evaluation approaches entail the collection of evidence to ascertain whether any change that has occurred is due to the program or intervention or to other confounding factors. An elitist quasi-evaluation employs experimental research (causal relationships), management information systems (scientific efficiency), testing programs (individual differences), objective-based studies (outcome-objective relationship), and content analysis (communication data). A mass-based quasi-evaluation, on the other hand, determines the extent of accountability based on well-defined performance expectations and accurate accounting of outcomes.

The values orientation or true evaluation approaches are concerned not only with goals but also with whether the goals are worth achieving. The evaluator considers the impact, accomplishments, and consequences of the program. A decision-oriented approach promotes the use of evaluation as a premise for educational decisions and planning activities. Policy studies include evaluation approaches that focus on assessing the potential costs and benefits of competing policies. A consumer-oriented approach determines how the school has satisfied the clientele’s needs and expectations.

Additionally, accreditation is a mechanism that allows academic institutions to prove that they meet a general standard of quality. It is the formal recognition by an authoritative body of the competence to operate with respect to specified criteria. As a process, it is a form of peer review in which an association of schools, colleges, and universities evaluates a particular institution based on an agreed set of norms, encouraging improvement of every affiliate member. As a result, schools receive recognition from the agency for having met the prescribed minimum requirements.

Certification, on the other hand, represents a written assurance by a third party of the conformity of a product, process, or service to specified requirements. In the Philippine context, this may refer to the authority granted to operate certain programs in schools and universities. Connoisseurship, an outgrowth of art appreciation, advocates the use of qualitative evaluation. It attempts to discern the subtle but significant aspects of classroom life, schooling, and education as a whole.

The adversary approach makes use of debate as its methodology. Two opposing
views on issues are presented with a neutral party acting as the referee. Moreover,
the client-centered approach places the unique needs of the clients at its core.

Evaluation ascertains how well a program, a practice, an intervention, or an initiative achieves its goals. It helps in determining what works well and what could be improved. The selection of the evaluation approach to employ, however, is dictated by the intent of the institution to be evaluated.

Application

Activity 2: Grid Analysis

Identify the evaluation criterion being measured when you seek answers to these key evaluation questions. Put a check on the appropriate column.

Questions                                                                 Relevance   Effectiveness   Efficiency   Impact
1. What range of outcomes has the school contributed to society, environment and economy?
2. Is the initiative delivering on outputs and outcomes as planned?
3. Has the initiative been delivered on budget?
4. Is the school impacting positively on key groups and issues?
5. Are there aspects of the program that could have been done differently?

Activity 3: Balancing Act

Choose three evaluation approaches. Describe the advantages and disadvantages of each approach. Present your response using the template below.

APPROACH          ADVANTAGES          DISADVANTAGES

Congratulations on completing Lesson 2! Evaluation approaches are distinguished by the nature of the questions they attempt to answer. It is therefore vital to begin evaluation by being clear on what is wanted from it; in other words, begin with the end in mind. In the next lesson, you shall explore the different methods and techniques in evaluation.

LESSON 3 Evaluation Methods and Techniques

Learning Outcomes

Upon the completion of this lesson, you should be able to:
1. describe the salient features of the different methods and techniques of evaluation; and
2. identify the key strengths and weaknesses of each method and technique.
Time Frame 1 hour
Ready to begin a new lesson? Good! Lesson 3 introduces you to the different evaluation methods and techniques that schools may utilize to gather essential descriptions of school performance. In particular, you will explore the key features of these evaluation approaches and methodologies.

Activity

Activity 1: Word Search

Placed in this puzzle are terms related to evaluation methodologies. Can you find all of them?

L D M R H I S T O R I C A L I
F I X E X P E R I M E N T F E
Z B E N C H M A R K I N G O K
E V A L U A T I O N Q J Q C Q
Q U A L I T A T I V E V L U Z
F E A S I B I L I T Y L A S C
K S I S Y L A N A E S N V I O
L E C R O F Q L R O T C A F M
W E I V R E T N I I W B W V P
X O B S E R V A T I O N Y M E
P D L K H C R A E S E R H A T
U L Q A U Q T E U N Y Q V R I
O E B Y T I L A U Q M K S K T
R I C N V E S R U O C T J E O
G F F E Y S N T R O H O C T R

Analysis

Reflect on the following processing questions.
1. How do you distinguish approach, method and technique?
2. In the process of assessment, what types of data can you generate?
3. Among the evaluation methods from the puzzle, which will yield quantitative data? Qualitative data?
4. How do you decide which method or technique to use?

Abstraction

Evaluation helps schools seek answers to questions such as “How are we doing?”, “How do we know?”, and “What are we going to do now?” It is ideal for investigating the influence of courses of action on the school’s vision, mission, goals, learning and teaching practices, responses to changes, and operational procedures. Quality education program evaluation includes both qualitative and quantitative measures and evidence.

In deciding which evaluation methodology to employ, academic institutions must deal with theoretical and practical issues. Theoretical issues include the value of the type of data, the perceived scientific rigor of the data, and the underlying philosophies of evaluation. Practical issues encompass the credibility of results, the skills of the staff, and financial and time constraints (NSF, 2010).

The quantitative method focuses on “what” and “how many”, while the qualitative method focuses on “why” and “how”. To choose between them, you may use the guide below.

Use QUALITATIVE methods when:
• data is intended to provide a robust understanding of the program’s context;
• the emphasis is on examining, comparing, and interpreting patterns and themes; and
• the plan is to use interviews, focus groups, observations, narratives, and ethnographies.

Use QUANTITATIVE methods when:
• data is projected to lead to a general description of the program;
• the emphasis is on the use of statistical tools to analyze and interpret data; and
• the plan is to utilize surveys, journals, questionnaires, experiments, and secondary data.

This comparison of the two methods is too simplistic. Both methods may or may not satisfy the canons of scientific rigor. Quantitative methods may seem precise if used properly and carefully; but if respondents fail to fully comprehend the items in a survey, the findings may be badly affected. Qualitative method setbacks, however, include the difficulty of gathering credible data sources, the time-consuming and costly nature of data collection, and the intricacy of data analysis and interpretation (Patton, 2002). Nowadays, the use of mixed methods is advocated to take advantage of the strengths of each method.

Different evaluation techniques have different purposes, work in different contexts, and give you different types of feedback. Depending on what you expect to obtain from the evaluation, you might find some techniques more useful than others. Listed below are the common techniques employed in education evaluations (NSF, 2010).

Technique: Salient Features

Surveys
• gather descriptive responses to a wide range of close-ended or open-ended questions
• require a large number of respondents
• may be administered by pen-and-paper or via web-based online data collection systems
• can be easily analyzed by existing software
• provide a general picture but may fail to consider the audience’s context

Interviews
• regard the participants’ viewpoint as meaningful and recognizable
• require a well-selected group of participants based on a defined inclusion criterion
• may be done face-to-face or through telephone/video interview
• may use a carefully worded questionnaire (structured) or free-wheeling probing (unstructured)
• require skillfulness and flexibility in interviewing
• prone to information distortion by the interviewee
• produce a vast volume of information that may be difficult to transcribe

Focus groups
• combine elements of interviewing and participant observation
• explicitly use group dynamics to generate data and insights
• may be conducted in a room or through web-based discussion platforms
• may be used at both the formative and summative stages of an evaluation
• less costly than individual in-depth interviews

Observations
• gather firsthand data on the interventions, processes, or behaviors
• occur in natural, unstructured, and flexible settings
• need qualified and highly-trained observers
• may push some participants to behave differently
• may be prone to distortion due to the selective perception of the observer

Tests
• provide means to assess a subject’s knowledge and capacity to apply knowledge
• may be in selected-response or constructed-response formats
• may be interpreted based on a certain norm or criterion
• are criticized as fragmented, superficial, and punitive
• provide objective information that can be scored in a straightforward manner
• may be distorted via coaching or cheating

Checklists
• use a standard list of action items, steps, or elements that the clientele should have demonstrated in completing a task, program, or activity
• can be cheap and easy and cover a wide array of factors
• depth and breadth are limited

Document Studies
• use existing documents and secondary data
• useful in analyzing trends and patterns over time
• prone to doubts about authenticity, completeness, and suitability
• time-consuming to analyze, and data may be difficult to access

Key Informant
• entails selection or invitation of participants based on their skills, background, and involvement in the program
• provides an “insider” perspective concerning the issue evaluated
• prone to informants’ biases and impressions
• requires observance of a professional relationship between evaluator and informants to avoid tainting the results

Case Studies
• provide a specific illustrative case or exemplar of the issue evaluated
• allow a thorough exploration of interactions between treatment and contextual factors
• require well-trained data collection and reporting teams
• may be exposed to excessive interpretation and generalization

Other evaluation techniques include cohort studies, social network analysis, self-completion questionnaires, feasibility studies, and force field analysis, among others.

Educational evaluation may need both qualitative and quantitative methods because of the diversity of issues addressed. The choice of methods should fit the need for the evaluation, the availability of resources and time, and the capability of the staff. While every evaluator has his/her own preference, the dominant notion is that no one method is always best.

Application

Activity 2: Pyramid

From the lessons you have learned, fill up the pyramid of thoughts below.

IDEAS
1.
2.
CONCEPTS
3.
4.
GENERALIZATIONS
5.
6.

Activity 3: T-Chart

Contrast quantitative and qualitative methods based on the specified features.

QUANTITATIVE          FEATURE                                        QUALITATIVE
                      Value of the types of data
                      Relative scientific rigor of data
                      Underlying philosophies of evaluation
                      Sample techniques employed

Congratulations on completing Lesson 3! Evaluation need not be conducted in an adversarial mode. Evaluations are designed for various audiences who might be somewhat skeptical about the methods and techniques employed. Evaluators, therefore, need to ensure credibility and objectivity. In the next lesson, you shall focus on a specific evaluation model.

LESSON 4 The CIPP Evaluation Model

Learning Outcomes

Upon the completion of this lesson, you should be able to:
1. explain the elements of the CIPP evaluation model; and
2. elaborate how the CIPP model can be used in the school setting.
Time Frame 1 hour
Ready to begin a new lesson? Good! Lesson 4 highlights the CIPP model, which is deemed an efficient tool for conducting educational evaluations. You will also explore its efficiency and effectiveness as a model of evaluation.

Activity

Activity 1: Classify Them

Identify the tool that can be used in each level of evaluation activities.

Opinion polls             Interest inventories
Observation guides        Focus-group discussion
Personality inventories   Interview guides
Tracer studies            Rating scales
Anecdotal records         National test results

Classroom Level Evaluation          School System Level Evaluation

Analysis

Reflect on the following processing questions.

1. Which tools are best used in the classroom level of evaluation? The school system level of evaluation?
2. Which tool/s may be used in either level?
3. What must be the basis in selecting the tool to use in evaluation?
4. What is your understanding of the term model?
5. Are you familiar with any educational evaluation model? How about the CIPP model?

Abstraction

Conducting an evaluation of a school program, project, intervention, curriculum, or initiative requires specific and systematic procedures. Extensive studies by experts have yielded quite a number of evaluation models. In this lesson, however, you shall focus only on the CIPP model developed by the Phi Delta Kappa committee chaired by Daniel Stufflebeam (1971).
The CIPP (context, input, process, and product) evaluation model holds that evaluation is conducted to reach a well-founded decision. It does not assume a linear relationship among its components. This model can be used for both formative and summative kinds of evaluation activity. By alternately focusing on program context, inputs, process, and products, the CIPP model encompasses all phases of an educational program: planning, implementation, and evaluation. The first three elements of the CIPP model are suitable for formative evaluation, while the fourth element is ideal for summative studies. The components of the model are summarized in the diagram adapted from Stufflebeam (2003) below.

(Diagram adapted from Stufflebeam, 2003: the context, input, process, and product evaluation components arranged around the program's CORE VALUES.)

Context Evaluation: The context evaluation component of the model establishes the connection between the program goals and evaluation. The evaluator describes the environment and determines the needs of the program beneficiaries. The unmet needs, problems, issues, and challenges are identified and evaluated.

Input Evaluation: The input evaluation component of the model determines how resources are utilized to achieve program objectives and goals. Data regarding the school’s mission, goals, and plans are collected, leading to the assessment of the responsiveness of program strategies to the stakeholders’ needs. A comparison with alternative strategies used in similar programs is also undertaken at this stage. The input evaluation complements the context evaluation.

Process Evaluation: The process evaluation component of the model reviews the program quality. It ascertains whether the program is implemented as planned. Program activities are monitored, documented, and assessed. Feedback mechanisms and continuous quality improvement are of utmost concern at this stage.

Product Evaluation: The product evaluation component of the model measures the impact of the program on target beneficiaries. Evaluators assess the program’s effectiveness and sustainability. As a summative component, decisions on whether to continue, modify, or terminate the program are established at this stage.

As a whole, the CIPP model looks at evaluation both in terms of processes
and products in all the various phases of school program, project,
intervention, curriculum, or initiative implementation. Outcomes and
projected objectives are matched and the discrepancies between them are
considered as basis for future plans and decisions.

Application

Activity 2:

Identify the CIPP stage where you can obtain responses to these key questions.

1. How do students use what they have learned?
2. Has the teacher’s reputation improved or been ruined as a result?
3. Should courses be integrated or separated?
4. What is the relation of the course to other courses?
5. What is the entry level skill of the students?
6. Is the course content clearly defined?
7. What books do teachers have?
8. Is there an effective two-way communication?
9. What is the workload of students?
10. How well do students participate?
11. Is there any informal assessment?
12. How is discipline maintained?
13. Do the objectives derive from aims?
14. What are the living conditions of students?
15. Is there a need for a course?

Activity 3

Determine in which CIPP component each of these evaluator activities falls.
1. Determine the extent to which the program reached an appropriate group of beneficiaries.
2. Assess the program’s work plan and schedule for sufficiency, feasibility, and viability.
3. Maintain an up-to-date profile of the program.
4. Assess program goals in light of beneficiaries’ assessed needs and potentially useful assets.
5. Assess the program’s proposed strategy for responsiveness to assessed needs and feasibility.
6. Periodically interview beneficiaries, program leaders, and staff to obtain their assessments of the program’s progress.

Congratulations on completing Lesson 4! Schools often have both internal and external reasons for evaluating their programs. A primary external reason is the requirement of accrediting bodies and government agencies. It is therefore essential to have a strong evaluation process supported by a robust evaluation model like the CIPP.

MODULE ASSESSMENT
Choose the option that provides the correct response.
1. How is assessment related to a course’s learning objectives?
a. Assessment and learning objectives are essentially the same thing.
b. The learning objectives are based on the way students are assessed.
c. Teachers use assessment to ensure a course’s objectives are met.
d. They are not at all related to one another.
2. If a teacher gives an exam and everyone fails, what should he/she do?
a. Give the exam again.
b. Determine why students missed the questions they missed.
c. Make the exam easier.
d. Adjust his/her teaching style.
3. Feedback is important because …
a. It allows students to learn from their mistakes.
b. It makes the student feel good about themselves.
c. It explains the grade that was assigned.
d. Teachers are supposed to give their students feedback.
4. Which is NOT true about formative evaluation?
a. Its focus is program improvement.
b. It judges the worth of a program while in progress.
c. It is primarily diagnostic in nature.
d. It is concerned with the overall effectiveness of a program.
5. Which is NOT true about summative evaluation?
a. It is done at the completion of a program.
b. Gathered data determine the worth of the program.
c. It is generally high stakes.
d. It entails comparing against some benchmark.
6. When is focus group more preferable than in-depth interview?
a. Peer pressure would inhibit responses and cloud results.
b. Subject matter is not so sensitive.
c. Group interaction is deemed nonproductive.
d. A greater volume of issues must be covered.
7. Which is a good tool for obtaining information when in-depth probing is not
necessary?
a. observation c. case study
b. survey d. key informant
8. If the university was implementing a new online learning scheme this school year,
which might be regarded as stakeholders?
a. students and teachers c. IT support officers
b. staff development officers d. All of these
9. Which key question is addressed in the input evaluation stage?
a. What are the impediments to meeting necessary or useful needs?
b. How cost-effective is each identified approach?
c. Was the program running efficiently?
d. Were the intended outcomes of the program realized?

10. Which key question is addressed in the product evaluation stage?
a. What are the longer-term implications of the program outcomes?
b. Did participants accept and carry out their roles?
c. How feasible is each of the identified approaches?
d. What relevant opportunities exist?

MODULE SUMMARY

Educational evaluation is evaluation that is conducted specifically in an educational setting. It is the process of appraising particular aspect/s of the educational process. It encompasses the evaluation of school programs, plans, courses, interventions, initiatives, or curricula.

The purpose of educational evaluation may be formative or summative. Formative evaluation begins during program development and continues while the program is ongoing. This entails monitoring and feedback that enhance the implementation along the way. Summative evaluation occurs at the completion of the program. Its emphasis is on the achievement of the program goals and objectives.

Evaluation approaches refer to the underlying principles and assumptions in designing the evaluation process. These approaches may be categorized in terms of epistemology, perspective, and orientation. Evaluation methods, moreover, are categorized as quantitative or qualitative. There are a lot of methods and techniques for evaluation, but there is no single best method. The choice of method and technique relies on the data and information you expect to obtain from the evaluation.

Stufflebeam’s CIPP model is a framework that provides a guide through the conception, design, implementation, and evaluation of educational processes and programs. The context evaluation identifies and defines program goals based on the needs, problems, assets, and opportunities relevant to the program. The evaluator might utilize document review, demographic data analysis, interviews, surveys, records analysis, and focus groups.

Input evaluation assesses current practices against other alternative practices with
emphasis on feasibility and effectiveness. Methods useful in this stage include literature
review, visiting exemplary programs, dialogue with experts and benchmarking. Process
evaluation is simply an implementation evaluation which highlights accountability and
documentation of the “lessons learned” in the process. Tools utilized in this stage include
observation, document review, participant interviews, and periodic monitoring reports.
Lastly, product evaluation examines the anticipated and unanticipated outcomes of the
program, positive or negative. Evaluators may use impact surveys, group interviews, case
studies and stakeholders’ judgments of the program.

Educational evaluation should assess and report a program’s merit, worth, probity, and significance. Whatever lessons are learned by the entity being evaluated are also indicated in the evaluation report. Hence, schools must establish efficient internal and external evaluation mechanisms and initiatives.

REFERENCES

American Evaluation Association. (2018). American Evaluation Association guiding principles for evaluators. Retrieved from https://www.eval.org/p/cm/ld/fid=51 on July 10, 2020.

Australian Conference on Science and Mathematics Education (ACSME). (2007). WEAS: A web-based educational assessment system. Retrieved from https://dl.acm.org/doi/pdf/10.1145/1233341.1233365 on July 10, 2020.

Boulmetis, J. & Dutwin, P. (2005). The ABCs of evaluation: Timeless techniques for program and project managers (2nd ed.). San Francisco: Jossey-Bass.

Cajigal, R. M. & Mantuano, M. D. (2014). Assessment of learning 2. Manila: Adriana Publishing Co., Inc.

House, E. R. (1978). Assumptions underlying evaluation models. Educational Researcher, 7(3), 4-12.

Navarro, R. L. & Santos, R. (2013). Authentic assessment of student learning outcomes: Assessment of learning 2. Manila: Lorimar Publishing Inc.

National Science Foundation. (2010). The 2010 user-friendly handbook for project evaluation. Retrieved from https://www.purdue.edu/research/docs/pdf/2010NSFuser-friendlyhandbookforprojectevaluation.pdf on July 10, 2020.

Patton, M. Q. (2002). Qualitative evaluation and research methods. Newbury Park, CA: Sage Publishing Inc.

Powell Tate. (2020). Evaluation approaches & types. Retrieved from http://toolkit.pellinstitute.org/evaluation-101/evaluation-approaches-types/ on July 10, 2020.

Reyes, E. & Dizon, E. (2015). Curriculum development. Manila: Adriana Publishing Co., Inc.

Spaulding, D. T. (2008). Program evaluation in practice: Core concepts and examples for discussion and analysis. San Francisco, CA: Jossey-Bass.

Stufflebeam, D. L., & Webster, W. J. (1980). An analysis of alternative approaches to evaluation. Educational Evaluation and Policy Analysis. Retrieved from https://www.jstor.org/stable/1163593 on July 10, 2020.

