
STUFFLEBEAM'S CIPP MODEL IN CURRICULUM EVALUATION

_____________________

A Written Report Presented to


Dr. Corazon B. dela Peña of the Graduate School of
San Pedro College, Davao City

_____________________

In Partial Fulfilment of the Requirements

of NSG 407 (Curriculum Development in Nursing)

by

Michael Philip F. Cervantes, RN

March 2016

I.

INTRODUCTION
Curriculum evaluation is an essential phase of curriculum development. Through evaluation a faculty discovers whether a curriculum is fulfilling its purpose and whether students are actually learning. (DiFlorio, 1989)
A curriculum evaluation model, meanwhile, is a framework that guides
the evaluation of a curriculum. Variations in models arise from differing philosophies,
conceptions, and definitions of evaluation. Accordingly, models differ in the emphasis
placed on the curricular aspects to be examined, the approaches to data gathering,
and the basis for judging the curriculum. The models provide a path for planning and
conducting evaluation, not a detailed roadmap.
In nursing education, evaluation of the total curriculum is
comprehensive, because a holistic evaluation is most appropriate for a unified
curriculum. It is typically based on standards of quality and incorporates
quantitative and qualitative approaches. (Iwasiw, 2015)
One such holistic examination of the nursing curriculum is the CIPP
evaluation model. CIPP is a program evaluation model developed by
Daniel Stufflebeam and colleagues in the 1960s. CIPP is an acronym for Context,
Input, Process, and Product; the model requires the evaluation of all four
in judging a program's value. CIPP is a decision-focused approach to evaluation
that emphasises the systematic provision of information for program management
and operation, with the intention not to prove, but rather to improve, the program
itself. (Stufflebeam, 2003)

II.

OBJECTIVES

At the end of this oral presentation, students will be able to:


1. Define curriculum evaluation model
2. Explain Stufflebeam's CIPP model
3. Apply the CIPP model in evaluating the nursing curriculum
4. Distinguish Stufflebeam's CIPP model from other models of curriculum evaluation
5. Summarize the core components, namely context, input, process, and product.

III.

COMPREHENSIVE CONTENT
STUFFLEBEAM'S CIPP MODEL

A. Stufflebeam's Biography
Daniel Leroy Stufflebeam, education educator
Born in Waverly, Iowa on September 19, 1936
BA, State University of Iowa, 1958
MS, Purdue University, 1962
PhD, 1964; postgraduate study, University of Wisconsin, 1965
Professor and Director, Ohio State University Evaluation Center, Columbus, 1963-1973
Professor of Education and Director, Western Michigan University Evaluation Center, Kalamazoo, 1973
Author of monographs and 15 books; recipient of the Paul Lazarsfeld Award of the Evaluation Research Society, 1985
Member, American Educational Research Association; National Council on Measurement in Education; American Evaluation Association

B. CIPP Evaluation Model


The CIPP Evaluation Model is a comprehensive framework for guiding
evaluations of programs, projects, personnel, products, institutions, and systems.
The CIPP evaluation model is designed to systematically guide both
evaluators and stakeholders in posing relevant questions and conducting
assessments at the beginning of a project, while it is in progress, and at its
end.

The model's first instalment, published before all four CIPP parts were
introduced, appeared more than 35 years ago (Stufflebeam, 1966)
and stressed the need for process as well as product evaluations.


Corresponding to the letters in the acronym CIPP, this model's core parts
are context, input, process, and product evaluation.


The model's fourth instalment (Stufflebeam, 1972) showed how the model
could and should be used for summative as well as formative evaluation.

Figure 1. Schematic Diagram of CIPP Model (Stufflebeam, 1972)

The third instalment (Stufflebeam et al., 1971) set the four types of evaluation within
a systems/improvement-oriented framework.

Figure 2. The CIPP Evaluation Model (Stufflebeam, 1971)

Context, input, process, and product evaluations emphasized that goal-setting should be guided by context evaluation, including a needs assessment, and that program planning should be guided by input evaluation, including assessments of alternative program strategies.

Figure 3. Schematic Diagram of CIPP Model (Stufflebeam, 1967)


Four Sections of the Model
1. Context: This first step is conducted when a new program is being planned. The
associated evaluation questions are also useful when an established program is
undergoing planned change or must adapt to changed circumstances.
2. Input: This step is conducted to determine how to use resources, assess the
capability of the responsible agency, assess strategies for achieving objectives, and
assess the design for implementing a selected strategy.
3. Process: The objective of process evaluation is to provide information for
programming decisions, predict defects in procedural design, and maintain
records of the procedures as they occur.
4. Product: This final step is conducted as often as necessary during the program's
life, to measure and interpret attainments.
Strengths

Combines formative evaluation (Context, Input, Process) with summative

evaluation (Product).
All of the sections combined ensure no part of the program is overlooked.
Clear format for evaluators and stakeholders to follow.

Limitations

This evaluation needs very careful planning or it will not succeed.

Multiple data collection techniques are needed to address each type of data or
evaluation question.

Rationale
The CIPP evaluation model is recommended as a framework for guiding evaluations of:

Programs
Projects
Products
Institutions
Educational Evaluations

It is a recommended option if you are looking for improvement, feedback, and
understanding of project context.
This model is best suited for programs where various stakeholders are identified and
incorporated into the evaluation process in a meaningful way.
The CIPP evaluation model would be used when evaluating improvements and to
examine different cases or situations within the project.
Identifies program goals and objectives by assessing needs, problems, and
opportunities relevant to the program.
A CIPP evaluation is conducted when a new program is being planned. (Frye &
Hemmer, 2012)
It is also useful for established programs undergoing change or adapting to changed
circumstances.
CIPP Focuses on:

Educational context of the program (what comes before, after, or concurrently for learners)
Learners' characteristics, variability, preparation for learning, and learning
opportunities, including patient census, adequacy of funding to support
program needs, and leadership support. (Frye & Hemmer, 2012)

Program products or outcomes


Improvement
Involvement of multiple stakeholders in the process.

Evaluation Questions Common to CIPP Evaluation Studies


CONTEXT - What needs to be done?

Assess the overall environmental readiness of the project, examine whether
existing goals and priorities are attuned to needs, and assess whether
proposed objectives are sufficiently responsive to assessed needs.

Also referred to as needs assessment.

What is the relation of the course to other courses?

Is the time adequate?

What are critical or important external factors?

Should courses be integrated or separate?

What are the links between the course and research/extension activities?

Is there a need for the course?

Is the course relevant to the job needs?

INPUT - How should it be done?

Refers to the ingredients of the curriculum which include the goals, instructional
strategies, the learners, the teachers, the contents and all the materials needed.

What is the entering ability of students?

What are the learning skills of students?

What is the motivation of the students?

What are the living conditions of students?

What is the students' existing knowledge?

Are the aims suitable?

Are objectives derived from the aim?

Are the objectives SMART (specific, measurable, attainable, relevant, time-bound)?

Is the course content clearly defined?

Does the content match student abilities?

Is the content relevant to practical problems?

What is the theory/practice balance?

What resources/equipment are available?

What books do the teachers have?

What books do the students have?

How strong are the teaching strategies of the teachers?

What time is available for preparation, compared with the workload?

What knowledge, skills and attitudes, related to the subject, do the teachers
have?

How supportive is the classroom environment?

How many students are there?

How many teachers are there?

How is the course organized?

What regulations relate to the training?

PROCESS - Is it being done?

Refers to the ways and means of how the curriculum has been implemented.

Monitors the project implementation process.

Assesses the extent to which participants accept and carry out their roles.

What is the workload of the students?

How well/actively do students participate?

Are there any problems related to teaching?

Are there any problems related to learning?

Is there effective two-way communication?

Is knowledge only transferred to students, or do they use and apply it?

Are there any problems which students face in using/applying/analysing the knowledge and skills?

Is the teaching and learning process continuously evaluated?

Are the teaching and learning affected by practical/institutional problems?

What is the level of cooperation/interpersonal relations between teachers and students?

How is discipline maintained?

PRODUCT - Did the project succeed?

Indicates if the curriculum accomplishes its goals.


Measure, interpret, and judge a project's outcomes by assessing their merit,
worth, significance, and probity.

Ascertain the extent to which the needs of all the participants were met.

Is there one final exam at the end or several during the course?

Is there any informal assessment?

What is the quality of the assessment?

What are the students' KSA (knowledge, skills, and attitudes) levels after the course?

Is the evaluation carried out for the whole process?

How do students use what they learned?

How was the overall experience for the teachers and for the students?

What are the main lessons learned?

Is there an official report?

Has the teachers' reputation improved or suffered as a result?

Application of CIPP in Nursing


Evaluation Framework for Nursing Education Programs: Application of the CIPP Model, by Dr. Mina Singh
Purpose
To assure stakeholders and funders of the nursing education systems that there is a
process in place to ensure the identification and implementation of continuous
improvement efforts. Using the CIPP model, members of the nursing education system
are keeping themselves accountable for the performance of the nursing program.
Four Key Factors Needed (Based on CIPP)
1. Create an evaluation matrix, making it a living document.
2. Form a working group or a program evaluation committee.
3. Determine who will conduct the evaluation: internal vs. external.

4. Ensure the evaluators understand and adhere to the program evaluation
standards of utility, feasibility, propriety, and accuracy. (Singh, 2004)
Components of the Matrix

Major questions
Sub questions
Indicators
Sources of data
Methods of data collection

Context Evaluation
Major Question: 1.1 Are the mission and program goals being met?

Sub-Question 1.1.1: Is the current strategic plan being met?
Indicators: Focus of the mission statement; congruency with the strategic plan
Sources of Data: Memos, minutes; strategic plan; administrators
Methods of Data Collection: Review of documents; interviews

Sub-Question 1.1.2: Is the curriculum meeting the program goals and objectives?
Indicators: Congruence between the curriculum and the program goals and objectives
Sources of Data: Curriculum; program goals and objectives; administrators
Methods of Data Collection: Review of documents; interviews

Product Evaluation
Major Question: 4.1 What impacts and outcomes, both intended and unintended, have resulted from this program?

Sub-Question 4.1.1: How did the program make a difference in the students' caring behaviours, ethical/professional knowledge, ability to integrate theory and research into practice, accountability, and understanding of self?
Indicators: Degree of change over time in demonstration of caring behaviours; ethical and professional knowledge; ability to integrate theory and research; basic and higher-level cognitive skills; academic performance; attitude; organizational skills; learning ability
Sources of Data: Students; faculty; potential employers
Methods of Data Collection: Interviews; surveys

Summary

The model's underlying purpose is to provide clients with timely
and valid information that allows them to identify areas for development and
improvement.
It allows the evaluator to focus on four areas: Context, Input, Process, and
Products. Evaluation of these areas can be conducted individually, sequentially, or in
parallel, depending on the situation. The basic idea is that these evaluations
complement the information requirements of the stakeholders rather than replace
existing information or reports. (Guerra-Lopez, 2008)

References
Alkin, M.C. (2004). Evaluation roots: Tracing theorists' views and influences. Thousand
Oaks, CA: Sage.
Frye, A.W., & Hemmer, P.A. (2012). Program evaluation models and related theories: AMEE
guide no. 67. Medical Teacher, 34, 288-299. doi:10.3109/0142159X.2012.668637
Guerra-Lopez, I.J. (2008). Performance evaluation: Proven approaches for improving
program and organizational performance. San Francisco, CA: Jossey-Bass.
Singh, M. (2004). Evaluation framework for nursing education programs: Application of
the CIPP model. International Journal of Nursing Education Scholarship, 1(1), 1-16.
Stufflebeam, D.L. (1971). The use of experimental design in educational evaluation.
Journal of Educational Measurement, 8(4), 267-274.
Stufflebeam, D.L., et al. (1971). Educational evaluation and decision making. Itasca, IL: Peacock.
Stufflebeam, D.L., & Shinkfield, A.J. (2007). Evaluation theory, models, and applications
(pp. 325-362). San Francisco, CA: Jossey-Bass.