
International Faculty Workshop for
Continuous Program Improvement

September 28-29, 2006

Moscow
Introductions


Continuous Program Improvement
Moderator:

Gloria Rogers
Associate Executive Director
Professional Services
ABET, Inc.

Facilitator:

David Hornbeck
Adjunct Accreditation Director for Technology
ABET, Inc.


ABET Faculty Workshop

To Promote
Continuous Quality
Improvement in Engineering
Education


Workshop Expectations


Workshop Will Develop:

1. An understanding of program development and management based on
learning outcomes.
2. An awareness of definitions and linkages among:
   Program Educational Objectives
   Program Outcomes
   Assessment
   Evaluation
   Constituencies


Workshop Will Develop:

3. An awareness of assessment tools and their:
   Variety
   Assets
   Utility
   Relevance
   Limitations
4. An understanding of the structure & cyclic nature of Continuous
Quality Improvement:
   Planning
   Implementation
   Assessment
   Evaluation
   Feedback
   Change


Workshop Format

We utilize both small group and plenary sessions
We introduce concepts via critique of case study examples
We apply concepts through group preparation of example scenarios
We share results & develop understanding through interactive plenary
sessions

Workshop Day 1

Identify attributes of effective educational objectives
Identify attributes of effective program outcomes
Investigate key components of effective assessment plans and processes
Prepare written program outcomes

Workshop Day 2

Investigate the attributes of a variety of assessment tools
Develop assessment & evaluation plans for the program educational
objectives
Develop assessment & evaluation plans for the set of program outcomes
Summarize points of learning
Discuss lessons learned by ABET in its experience with outcomes-based
criteria

Workshop Procedures

A. Record all your work produced in small group sessions
B. Identify recorded work by table and breakout room number
C. Reporting in plenary sessions: each group selects a leader, a
   recorder & a reporter for each exercise
D. A workbook of all material & exercises will be provided to each
   participant

Introduction to ABET
Continuous Program
Improvement


Goal of ABET
To promote Continuous Quality
Improvement in Applied Sciences,
Computing, Engineering, and Technology
education through faculty guidance and
initiative.


The Paradigm Shift

Accreditation Reform

Philosophy

Institutions & programs define missions and objectives
   Focus on the needs of their constituents
   Enable program differentiation
   Encourage creativity in curricula

Emphasis on outcomes
   Skills/knowledge required for professional practice
   Technical and non-technical elements

Programs demonstrate that they are
   Meeting their objectives
   Satisfying accreditation criteria

Emphases

Practice of Continuous Improvement
   Input of constituencies
   Process reliability & sustainability
   Outcomes, Objectives, and Assessment
   Technical and Professional Knowledge required by the Profession

Resources linked to Program Objectives
   Students
   Faculty and Support Personnel
   Facilities
   Institutional Support and Funding


Primary Expectations of
Programs
Adequate preparation of graduates
for engineering careers
Effective Continuous Quality
Improvement Processes

The Focus

Meaningful Educational Objectives
Effective Program Outcomes
Practical Assessment Tools
Effective & Sustainable Assessment Plan
Robust and Credible Evaluation Plan


ABET Definitions

Program Educational Objectives: broad statements that describe the
career and professional accomplishments that the program is preparing
graduates to achieve within the first few years after graduation.

Program Outcomes: narrower statements that describe what students are
expected to know and be able to do by the time of graduation. These are
the skills, knowledge, and behaviors that enable graduates to achieve
the Program Educational Objectives. They are acquired by students as
they matriculate through the program.


ABET Definitions

Assessment: processes to identify, collect, and prepare data that are
needed to evaluate the achievement of Program Outcomes and Program
Educational Objectives.

Evaluation: processes that interpret data accumulated through
assessment. Evaluation determines the extent to which Program Outcomes
or Program Educational Objectives are being achieved. Evaluation
results in decisions & actions that improve a program.


Continuous Quality Improvement is a systematic pursuit of excellence
and satisfaction of the needs of constituencies in a dynamic and
competitive environment.


Continuous Quality Improvement

Must be systematic and systemic
Is the dynamic behavior of an organization
Must be shared at all organizational levels
May be motivated by external factors
Must be sustained by internal behavior
Requires that the continuous pursuit of excellence determine
philosophies, plans, policies and processes of the organization
Requires continuous interaction between internal and external
constituencies
Focuses on the needs of constituencies

CQI Starts with Basic Questions

Who are our constituencies?
What services do we provide?
Do constituencies understand our objectives?
What services, facilities and policies are necessary to ensure that we
continue to satisfy our constituencies?
Do our suppliers and institutional leadership understand and support
our needs?

...More Basic Questions

What steps do we perform to provide our services?
Are our constituencies satisfied with our services?
How do we measure our effectiveness?
How do we use these measures to continuously improve our services?
Are we achieving our objectives and improving?


Assessment: Foundation of CQI

Assessment of inputs & processes establishes the capability or capacity
of a program
Assessment of outcomes measures how effectively the capability has been
used
Outcomes assessment improves:
   Effectiveness
   Learning
   Accountability

CQI as an Operating Philosophy

Quality improvement comes from within the institution
Continuous improvement requires the planned integration of objectives,
performance metrics, & assessment
Continuous improvement is cyclical
Assessment of performance is the baseline for future assessment
Educational objectives, mission, and needs of constituencies must be
harmonized to achieve CQI

Role of ABET Accreditation


ABET accreditation provides periodic
external assessment in support of the
continuous quality improvement program
of the institution.


Potential Constituencies

Students, parents, employers, faculty, alumni
Industry advisors, accrediting agencies
Educational administration: department, school, college, etc.
Government agencies: local, state, federal
Transfer colleges that supply students
Graduate programs that accept graduates
Donors, contributors, supporters

Step 1: Who are your constituencies?

Identify possible constituencies.
What are the expectations of each constituency?
How will constituencies be satisfied?
When will constituencies be satisfied?
What relative priority do constituencies hold?
How will constituencies be involved in your CQI?


Pick Your Constituencies

Select no more than three constituencies to focus on for the workshop
exercises
Assign a person to represent each of these constituencies at each table
Consider what influence the choice of constituencies will have on
Educational Objectives and Outcomes
September 28-2

Moscow

30

Objectives: Exercise 1


Outcomes: Exercise 2


Report Out on
Exercise 1 and Exercise 2


Objectives Summary

Each addresses one or more needs of a constituency
Must be understandable by the constituency being served
Should be limited to a manageable number of statements
Should be broader statements than the Program Outcomes
Every Objective must be supported by at least one Program Outcome

Outcomes Summary

Each describes an area of knowledge and/or skill that a person can
demonstrate
Should be stated such that a student can demonstrate it upon completion
of the program and before graduation
Must be a unit of knowledge/skill that supports at least one
Educational Objective
Collectively, Outcomes define the skills and knowledge imparted by the
degree program
Outcomes statements normally do not include measures or performance
expectations

Assessment Basics


Program Assessment of Student Learning
September 28-29, 2006

Gloria Rogers, Ph.D.
Associate Executive Director, Professional Services
ABET, Inc.

Foundational Truths

Programs are at different places in the maturity of their assessment
processes
Programs have different resources available to them (e.g., number of
faculty, availability of assessment expertise, time)
Each program has faculty who are at different places in their
understanding of good assessment practice

Hierarchy of assessment learning

NOVICE (Knowledge): "Everyone who makes a presentation is an expert and
I am a sponge."

INTERMEDIATE (Comprehension, Application, Analysis): "I apply what I
have learned and begin to analyze the effectiveness of my assessment
processes."

ADVANCED (Synthesis, Evaluation): "I can take what I have learned and
put it in context. I begin to question what I hear, challenge
assumptions and make independent decisions about effective practices
for my program."

[Systems view of a program, one column per question]

Input (What comes into the system?):
   Student background
   Faculty background
   Educational resources

Processes (What are we doing with the inputs?):
   Programs & services offered; populations served
   Faculty teaching loads/class size
   Policies, procedures, governance

Outputs (How many?):
   Student grades; graduation rates; employment statistics
   Publication numbers/faculty development activities; credit hrs
   delivered
   Statistics on resource availability, participation rates

Outcomes (What is the effect?):
   What have students learned; what skills have they gained; attitudes
   developed? Student learning and growth
   Faculty publication citations data; faculty development

Input and Processes (columns repeated from the previous slide)

Assessment of inputs and processes only establishes the capability or
capacity of a program (how many courses and what is covered, background
of faculty, nature of facilities, etc.)

Outputs

Assessment of outputs serves as indirect measures or proxies for
effectiveness; they provide general indicators of achievement.

Examples: student grades; graduation rates; employment statistics;
publication numbers/faculty development activities; credit hrs
delivered; statistics on resource availability, participation rates

Outcomes

Assessment of outcomes provides direct measures of the effectiveness of
what has been done with that capability/capacity related to individual
learning and growth.

Examples: what have students learned; what skills have they gained;
attitudes developed? Faculty publication citations data; faculty
development; student learning and growth

Taxonomy of Approaches to Assessment (Terenzini, JHE Nov/Dec 1989)

Dimensions: Level of Assessment (Who?): individual vs. group; Purpose
of Assessment (Why?): learning/teaching (formative) vs. accountability
(summative); Object of Assessment (What?): knowledge, skills, attitudes
& values, behavior.

Individual + Formative:
   Competency-Based Instruction
   Assessment-Based Curriculum
   Individual Performance Tests
   Placement: Advanced Placement Tests, Vocational Preference Tests,
   Other Diagnostic Tests

Individual + Summative (Gatekeeping):
   Admissions Tests
   Rising Junior Exams
   Comprehensive Exams
   Certification Exams

Group + Formative (Program Enhancement):
   Individual assessment results may be aggregated to serve program
   evaluation needs

Group + Summative (Campus and Program Evaluation):
   Program Reviews
   Retention Studies
   Alumni Studies
   Value-added Studies

ABET Terms

Term: Definition (some other terms for the same concept)

Objectives: Broad statements that describe the career and professional
accomplishments that the program is preparing graduates to achieve.
(Goals, outcomes, purpose, etc.)

Outcomes: Statements that describe what students are expected to know
and be able to do by the time of graduation. (Objectives, standards,
etc.)

Performance Criteria: Specific, measurable statements identifying the
performance(s) required to meet the outcome; confirmable through
evidence. (Performance indicators, standards, rubrics, specifications,
metrics, outcomes, etc.)

Assessment: Processes that identify, collect, use and prepare data that
can be used to evaluate achievement. (Evaluation)

Evaluation: Process of reviewing the results of data collection and
analysis and making a determination of the value of findings and action
to be taken. (Assessment)

Assessment for Quality Assurance (continuous-improvement loop)

Constituents and the institutional Mission shape the Educational
Objectives, which are themselves assessed and evaluated. The loop then
runs: Educational Objectives -> Learning Outcomes -> Measurable
Performance Criteria -> Educational Practices/Strategies ->
Assessment: Collection, Analysis of Evidence -> Evaluation:
Interpretation of Evidence -> Feedback for Continuous Improvement,
closing the loop back to the outcomes and objectives.

(Gloria Rogers, ABET, Inc.)

Classroom Assessment Concepts

Context: subject matter, faculty member, pedagogy, student, facility

Subject: Strength of Materials
Topics: Material Properties, Beams, Torsion, Columns, Fatigue
Terminology: stress, strain, tensile strength, ductility, shear force,
bending moment, angle of twist, power transmission, Euler buckling,
crack growth, S-N curves

Assessment Focus:
   Evaluate individual student performance (grades)
   Evaluate teaching/learning

Timeline: 1 semester/quarter

(G. Rogers, ABET)

Objective: Work effectively with others

Outcome: Ability to function on multidisciplinary team

Performance Criteria:
   Researches and gathers information
   Makes contributions
   Fulfills duties of team roles
   Takes responsibility
   Shares work equally
   Values other viewpoints
   Listens to other teammates

(G. Rogers, ABET, Inc.)

Program Assessment: Institutional Context

Student pre-college traits feed into three interacting (reciprocal
causation) sets of experiences over a timeline of xx years, all within
surrounding environmental factors:
   Coursework & curricular patterns (classes chosen; major)
   Classroom experience (pedagogy; facilities; climate; faculty &
   student characteristics)
   Out-of-class experiences (co-curricular; co-ops; internships;
   support services)
Together these produce the educational outcomes.

Adapted from Terenzini, et al. 1994, 1995

Differences between classroom and program assessment

Degree of complexity
Time span
Accountability for the assessment process
Cost
Level of faculty buy-in
Level of precision of the measure

Work Effectively in Teams (blank rubric)

Scale: Unsatisfactory 1 | Developing 2 | Satisfactory 3 | Exemplary 4 |
Score

Rows:
   Contribute: Research & Gather Information
   Take Responsibility: Fulfill Team Role's Duties
   Take Responsibility: Share Equally
   Value Others' Viewpoints: Listen to Other Teammates
   Average

Work Effectively in Teams (completed rubric; each row also carries a
Score column)

Contribute: Research & Gather Information
   1 Unsatisfactory: Does not collect any information that relates to
     the topic.
   2 Developing: Collects very little information--some relates to the
     topic.
   3 Satisfactory: Collects some basic information--most relates to the
     topic.
   4 Exemplary: Collects a great deal of information--all relates to
     the topic.

Take Responsibility: Fulfill Team Role's Duties
   1 Unsatisfactory: Does not perform any duties of assigned team role.
   2 Developing: Performs very few duties.
   3 Satisfactory: Performs nearly all duties.
   4 Exemplary: Performs all duties of assigned team role.

Take Responsibility: Share Equally
   1 Unsatisfactory: Always relies on others to do the work.
   2 Developing: Rarely does the assigned work--often needs reminding.
   3 Satisfactory: Usually does the assigned work--rarely needs
     reminding.
   4 Exemplary: Always does the assigned work without having to be
     reminded.

Value Others' Viewpoints: Listen to Other Teammates
   1 Unsatisfactory: Is always talking--never allows anyone else to
     speak.
   2 Developing: Usually doing most of the talking--rarely allows
     others to speak.
   3 Satisfactory: Listens, but sometimes talks too much.
   4 Exemplary: Listens and speaks a fair amount.

Average
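To make the Score and Average cells concrete, here is a minimal Python
sketch, assuming invented scores on the 1-4 scale above; the student
names and numbers are hypothetical, not workshop data.

from statistics import mean

CRITERIA = [
    "Research & gather information",
    "Fulfill team role's duties",
    "Share equally",
    "Listen to other teammates",
]

# scores[student][criterion], on the rubric's 1 (Unsatisfactory)
# to 4 (Exemplary) scale; values invented for illustration
scores = {
    "student_a": dict(zip(CRITERIA, [3, 4, 3, 2])),
    "student_b": dict(zip(CRITERIA, [2, 3, 3, 3])),
}

# Per-student average: the "Average" cell on an individual scoring sheet
for student, row in scores.items():
    print(student, round(mean(row.values()), 2))

# Program-level view: cohort average per criterion, the kind of value
# plotted in the "Example of Results" charts later in this workshop
for criterion in CRITERIA:
    print(criterion, round(mean(r[criterion] for r in scores.values()), 2))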

Developing performance criteria

Two essential parts:
   Content reference: subject content that is the focus of instruction
   (e.g., steps of the design process, chemical reaction, scientific
   method)
   Action verb: directs students to a specific performance (e.g., list,
   analyze, apply)
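Because a criterion needs a leading action verb matched to a level of
learning, a crude automated check is easy to sketch. The verb lists
below are short illustrative samples, not ABET's reference table.

# Abbreviated, illustrative verb lists keyed to levels of learning;
# a real reference table (see the next slides) is much fuller
ACTION_VERBS = {
    "novice (knowledge)": {"list", "define", "describe", "identify"},
    "intermediate (application/analysis)": {"apply", "analyze",
                                            "compare", "solve"},
    "expert (synthesis/evaluation)": {"design", "evaluate", "create",
                                      "synthesize"},
}

def verb_level(criterion):
    """Return the learning level implied by the criterion's leading
    verb, or None if the verb is vague or unrecognized."""
    first_word = criterion.split()[0].lower()
    for level, verbs in ACTION_VERBS.items():
        if first_word in verbs:
            return level
    return None

print(verb_level("Analyze ideas objectively to discern feasible solutions"))
print(verb_level("Understand the design process"))  # None: not measurable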

NOVICE (Knowledge): INTRODUCE
INTERMEDIATE (Comprehension, Application, Analysis): REINFORCE
EXPERT (Synthesis, Evaluation): DEMONSTRATE/CREATE

Clarity of performance criteria

Use of action verbs consistent with appropriate level of learning
Reference table

Writing Measurable
Outcomes: Exercise 3


Report Out on Exercise 3


Examples

www.engrng.pitt.edu/~ec2000

What is acceptable level of performance?

Developing scoring rubrics
rubrics

What is a rubric, anyway?

A rubric is a set of categories which define and describe the important
components of the work being completed, critiqued, or assessed. Each
category contains a gradation of levels of completion or competence
with a score assigned to each level and a clear description of the
performance that must be met to attain the score at each level.
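Read literally, that definition describes a small data structure:
categories, each with scored levels and descriptions. A minimal Python
sketch, using two categories borrowed from the teamwork rubric shown
earlier (the describe function is illustrative, not ABET tooling):

# Each category maps scores to a clear description of the performance
# required to attain that score; descriptors are quoted from the
# teamwork rubric earlier in this workshop
rubric = {
    "Contribute: Research & Gather Information": {
        1: "Does not collect any information that relates to the topic.",
        2: "Collects very little information--some relates to the topic.",
        3: "Collects some basic information--most relates to the topic.",
        4: "Collects a great deal of information--all relates to the topic.",
    },
    "Take Responsibility: Share Equally": {
        1: "Always relies on others to do the work.",
        2: "Rarely does the assigned work--often needs reminding.",
        3: "Usually does the assigned work--rarely needs reminding.",
        4: "Always does the assigned work without having to be reminded.",
    },
}

def describe(category, score):
    """Return the performance description attached to a score."""
    return rubric[category][score]

print(describe("Take Responsibility: Share Equally", 3))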

Purpose of Rubric (What do you want it to do?)

Information to/about student competence (Analytic):
   Communicate expectations
   Diagnosis for purpose of improvement and feedback

Overall examination of the status of student performance (Holistic)

Generic or Task-Specific?

Generic
   General rubric that can be used across similar performances (used
   across all communication tasks or problem solving tasks)
   Big picture approach
   Element of subjectivity

Task-specific
   Can only be used for a single task
   Focused approach
   Less subjective

How many points on the scale?

Consider both the nature of the performance and the purpose of scoring
Recommend 3 to 6 points to describe student achievement at a single
point in time
If focused on a developmental curriculum (growth over time), more
points are needed (i.e., 6-11)

RUBRIC TEMPLATE
Student Outcome _______________________________

One row per performance; columns are scale points (numeric
w/descriptor). Each cell lists identifiable performance characteristics
reflecting that level.

Effective Writing Skills (scale: Unsatisfactory 1 | Developing 2 |
Satisfactory 3 | Exemplary 4)

Content: Supporting Detail
   1: Includes inconsistent or few details which may interfere with
      meaning of text
   2: Includes some details, but may include extraneous or loosely
      related material
   3: Provides adequate supporting detail to support solution/argument
   4: Provides ample supporting detail to support solution/argument

Organization: Organizational Pattern
   1: Little evidence of organization or any sense of wholeness or
      completeness
   2: Achieves little completeness and wholeness though organization
      attempted
   3: Organizational pattern is logical and conveys completeness and
      wholeness with few lapses
   4: Organizational pattern is logical and conveys completeness and
      wholeness

Style: Language and Word Choice
   1: Has limited or inappropriate vocabulary for the audience and
      purpose
   2: Limited and predictable vocabulary, perhaps not appropriate for
      intended audience and purpose
   3: Uses effective language and appropriate word choices for intended
      audience and purpose
   4: Uses effective language; makes engaging, appropriate word choices
      for audience and purpose

Standard (conventions; descriptors only partially recovered)
   1: Does not follow the ...
   2: Generally does not ...
   3: Generally follows the ...
   4: Consistently follows ...

Work Effectively in Teams (the completed teamwork rubric shown earlier,
repeated here as a second example)

Example of Results

At a level expected for a student who will graduate?

Example of Results: Teaming Skills

[Chart of average rubric scores for: 1. Research & gather information;
2. Fulfill team role's duties; 3. Shares equally; 4. Listens to
teammates]

Example of Results: Communication Skills

[Chart of average rubric scores for four performance criteria]

Linking Results to Practice

Development of a curriculum map: linking curriculum content/pedagogy to
knowledge, practice and demonstration of learning outcomes
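A curriculum map can be held as a simple mapping from courses to the
outcomes they address; here is a minimal Python sketch. The course
codes come from the example map below, but the outcome tags and the
Introduce/Reinforce/Demonstrate levels attached to them are invented
for illustration.

# course -> {outcome: level}; levels follow the Introduce/Reinforce/
# Demonstrate hierarchy from the earlier slide. The assignments are
# illustrative, not the example program's actual mapping.
curriculum_map = {
    "RH 131 Fresh Comp": {"communication": "Introduce"},
    "EM 104 Graph Comm": {"communication": "Reinforce"},
    "CH 406 Design I": {"communication": "Demonstrate",
                        "teams": "Reinforce"},
}

def courses_covering(outcome):
    """List courses (with level) where a given outcome is addressed."""
    return [f"{course} ({levels[outcome]})"
            for course, levels in curriculum_map.items()
            if outcome in levels]

print(courses_covering("communication"))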

Course-level curriculum map (matrix legend):
   Outcome Explicit: this outcome is explicitly stated as being a
   learning outcome for this course.
   Demonstrate Competence: students are asked to demonstrate their
   competence on this outcome through homework, projects, tests, etc.
   Formal Feedback: students are given formal feedback on their
   performance on this outcome.
   Not Covered: this outcome is not addressed in these ways in this
   course.
   Note: clicking on the link "view rubric" will show the scoring
   rubric for that particular performance criterion related to the
   outcome.

Outcome/Performance Criteria (each marked Yes or No in the columns
above):

Recognition of ethical and professional responsibilities
   1. Demonstrate knowledge of professional codes of ethics.
   2. Evaluate the ethical dimensions of professional engineering,
      mathematical, and scientific practices.

An ability to work effectively in teams
   1. Share responsibilities and duties, and take on different roles
      when applicable.
   2. Analyze ideas objectively to discern feasible solutions by
      building consensus.
   3. Develop a strategy for action.

An ability to communicate effectively in oral, written, graphical, and
visual forms
   1. Identify the readers/audience, assess their previous knowledge
      and information needs, and organize/design information to meet
      those needs.
   2. Provide content that is factually correct, supported with
      evidence, explained with sufficient detail, and properly
      documented.
   3. Test readers/audience response to determine how well ideas have
      been relayed.

Curriculum map for Communication Skills

[Four-year course grid by term (Fall/Winter/Spring), 1st through 4th
year: from first-year courses such as CM 111 Chem I, RH 131 Fresh Comp,
EM 104 Graph Comm, and MA 111 Calc I, through CH 406/407 Design I/II,
CH 402/403/408 ChE Labs, CH 409 Prof Prac, and HSS/free electives,
marking where communication skills appear in the curriculum]

Assessment Methods


Assessment Methods

Written surveys and questionnaires
Exit and other interviews
Standardized exams
Locally developed exams
Archival records
Focus groups
Portfolios
Simulations
Performance appraisal
External examiner
Oral exams
Behavioral observations

Direct Measures

Direct measures provide for the direct examination or observation of
student knowledge or skills against measurable learning outcomes.

Indirect Measures

Indirect measures of student learning ascertain the opinion or
self-report of the extent or value of learning experiences.

Direct
   Standardized exams
   Locally developed exams
   Portfolios
   Simulations
   Performance appraisal
   External examiner
   Oral exams
   Behavioral observations

Indirect
   Written surveys and questionnaires
   Exit and other interviews
   Archival records
   Focus groups

Tools: Exercise 4


Assignment

After you have shared methods, choose at least two methods (preferably
three) that are appropriate for the performance criteria chosen
   At least one DIRECT measure
Use an overhead transparency to record your findings
Include your rationale for the decision

Report out on Exercise 4


Validity

Relevance: the assessment option measures the educational outcome as
directly as possible
Accuracy: the option measures the educational outcome as precisely as
possible
Utility: the option provides formative and summative results with clear
implications for educational program evaluation and improvement

Bottom Lines

All assessment options have advantages and disadvantages
The ideal methods are those that best fit program needs, satisfactory
validity, and affordability (time, effort, and money)
It is crucial to use a multi-method/multi-source approach to maximize
validity and reduce the bias of any one approach

Assessment Method Truisms

There will always be more than one way to measure any learning outcome
No single method is good for measuring a wide variety of different
student abilities
There is generally an inverse relationship between the quality of
measurement methods and their expediency
It is important to pilot test to see if a method is appropriate for
your program

Data Collection Process

Why? Know your question
What? Focus on a few criteria for each outcome
Who? Students (cohorts); faculty (some)
When?

Sampling

For program assessment, sampling is acceptable and even desirable for
programs of sufficient size.
The sample must be representative of all students.
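A minimal sketch of drawing such a sample, assuming simple random
sampling over a hypothetical cohort; a real plan would also check that
the draw is representative across subgroups (majors, transfer students,
etc.).

import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical cohort of 240 students
cohort = [f"student_{i:03d}" for i in range(240)]

# Draw 40 students without replacement (about a 17% sample)
sample = random.sample(cohort, k=40)

print(len(sample), sample[:3])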

Data collection (multi-year cycle)

Yr 1: Define outcomes / map curriculum
Yr 2: Data collection
Yr 3: Evaluation & design of improvements
Yr 4: Implement improvements & data collection
Yr 5: ...

How do objectives differ from outcomes in the data collection process?

Learning Outcomes related to: (data collection scheduled across the
years 03-04 through 08-09)

A recognition of ethical and professional responsibilities
An understanding of how contemporary issues shape and are shaped by
mathematics, science, & engineering
An ability to recognize the role of professionals in the global society
An understanding of diverse cultural and humanistic traditions
An ability to work effectively in teams
An ability to communicate effectively in oral, written, graphical, and
visual forms
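The grid staggers outcomes across years so that not everything is
measured every year. A minimal sketch of such a schedule as a lookup
table; the particular year assignments below are invented for
illustration.

# outcome -> academic years in which its data are collected
# (assignments are illustrative only)
schedule = {
    "Ethical and professional responsibilities": ["03-04", "06-07"],
    "Work effectively in teams": ["04-05", "07-08"],
    "Communicate effectively": ["05-06", "08-09"],
}

def outcomes_for(year):
    """Which outcomes are scheduled for data collection in a given year?"""
    return [o for o, years in schedule.items() if year in years]

print(outcomes_for("04-05"))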

Closing the loop (annual cycle, month by month)

Institute assessment cmte. prepares reports for submission to Dept.
Heads of the collected data (e.g. surveys, e-portfolio ratings).
Eval Committee receives and evaluates all data; makes report and refers
recommendations to appropriate areas.
Institute acts on the recommendations of the Eval. Comm.
Reports of actions taken by the Institute and the targeted areas are
returned to the Eval Comm. for iterative evaluation.

Student Learning Outcomes at the PROGRAM level

Learning Outcome ________________________________________________

Template columns: Performance Criteria | Strategies | Assessment
Method(s) | Context for Assessment | Time of data collection |
Assessment Coordinator | Evaluation of Results

Results _____ (date):
Actions _____ (date):
Second-Cycle Results ____ (date):

grogers@abet.org

Checklist

Assessment question is known and explicit
Outcomes are defined and the number of performance criteria is
manageable
Data are efficiently and systematically collected
Assessment methods are appropriate to program context
Results are evaluated
Evaluation is more than looking at the results of learning outcomes
Action is appropriate

Things I wish I had known:

Capitalize on what you are already doing
One size does not fit all
You don't have to measure everything all the time
More data are not always better
Pick your battles
Take advantage of local resources
Don't wait for perfection
Go for the early win
Decouple from faculty evaluation

Tools to help you work through the assessment process:

Assessment of student learning outcomes
Assessment processes in business and industry
Assessment rubrics
Electronic portfolios
Assessment terminology
Using grades for assessment
Using surveys and questionnaires for assessment
Data collection
General assessment articles and presentations
Assessment workshops and conferences

April 13-14, 2007

www.rose-hulman.edu/assessent2007

ABET Lessons Learned


ABET Lessons Learned (1/6)

Start as soon as possible
Develop a comprehensive plan
Begin implementing the plan as quickly as possible
Do not allow the early steps to consume excessive time and create
delays in the process
Close Continuous Improvement loops as soon as possible
Use consultants with caution; there can be positive and negative
effects

ABET Lessons Learned (2/6)

It is extremely important to define terminology
When reported to constituents or external evaluators, evidence should
be organized by Outcomes and Objectives rather than by courses
Evidence should show evaluation and assessment processes are in place
and working
The accumulation of experience with outcomes assessment and continuous
improvement will build confidence for all constituencies

ABET Lessons Learned (3/6)

Coordination between program assessment and institutional assessment
can enhance both
When presenting information for accreditation reviews:
   Descriptions of the CI process should be accompanied by evidence of
   data reduction, analysis, and the resultant actions
   Text should be used to explain, interpret, and strengthen tabular or
   statistical data

ABET Lessons Learned (4/6)

Each program should have some unique Outcomes that are different from
those in accreditation criteria and those in other programs at the same
institution. The absence of unique Outcomes can imply that the program
does not have a clear sense of mission.
The most successful programs are those with faculty members who have
participated in training sessions and communicated with faculty at
other institutions
It is important for the program Administration to be aware and
supportive of Continuous Improvement activities

ABET Lessons Learned (5/6)

Continuous Improvement programs should employ a variety of assessment
tools with a mixture of short and long time cycles
Surveys should be only one of several evaluation tools used in
Continuous Improvement
Requirements for faculty, facilities, etc. should be linked to
objectives, outcomes, and Continuous Improvement
There has been no apparent relationship between the degree of success
and the size of the institution

ABET Lessons Learned (6/6)

Programs that have successfully implemented Continuous Improvement have
had two characteristics in common:
   There will be at least one faculty member who is highly committed to
   developing and guiding implementation
   There will be sincere involvement of the faculty members in the
   program

Introduction to ABET


Introduction to ABET Accreditation

Federation of 28 professional societies
Board of Directors representing those societies
Four Commissions
   Applied Science Accreditation Commission (ASAC)
   Computing Accreditation Commission (CAC)
   Engineering Accreditation Commission (EAC)
   Technology Accreditation Commission (TAC)
Accreditation Council
   Representatives of each commission
   Coordination, harmonization of processes

Accreditation Process

Commission responsibilities
   Conduct evaluations of programs
   Determine accreditation actions
Commission makeup
   Commissioners are volunteers appointed by societies
   Commissioners chair accreditation teams
Accreditation Team
   Chair + one Program Evaluator for each program
   Program Evaluators (PEVs) are volunteers from societies


ABET Accreditation Statistics

Commission                           ASAC    CAC    EAC    TAC
Total Programs Accredited              72    240   1793    740
Programs Evaluated in 2004-05          15     70    373    206
Increase in Programs, 1995-2005      +57%   +85%   +18%   -16%

ABET Longitudinal Study


Engineering Change: A Study of the Impact of EC2000*

Lisa R. Lattuca, Project Director and Co-PI
Patrick T. Terenzini, Co-PI
J. Fredericks Volkwein, Co-PI

Pennsylvania State University
Center for the Study of Higher Education

*EC2000 = outcomes-based accreditation criteria for the Engineering
Accreditation Commission of ABET

Key Questions
1. What impact, if any, has EC2000 had on
the preparation of graduating seniors to
enter the engineering profession?
2. What impact, if any, has EC2000 had on
practices that may be related to changes in
student preparation?


Significance of the Engineering Change Study

The first national study of the impact of outcomes-based accreditation
in the U.S.
A model for assessments in other ABET Commissions.
A pre-EC2000 benchmark (1994) for graduating seniors' preparation.
The first post-EC2000 data point (2004) on graduating seniors'
preparation.

Engineering Change: Studying the Impact of EC2000

EC2000 -> program changes (curriculum & instruction; faculty culture;
policies & practices) -> student experiences (in-class; out-of-class)
-> outcomes (student learning, criteria 3.a-k; employer ratings), with
continuous improvement feeding back through the whole chain.

Engineering Disciplines Examined

Aerospace
Chemical
Civil
Computer
Electrical
Industrial
Mechanical

Data Sources and Response Rates

Data Sources              Target Population   Responses   Response Rate
Programs                             203           147          72%
Faculty                            2,971         1,243          42%
Deans                                 40           40+          98%
1994 Graduates (Pre-)             13,054         5,494          42%
2004 Graduates (Post-)            12,921         4,330          34%
Employers                        unknown         1,622          N/A

Conclusions
Recent graduates are measurably better prepared than
those of a decade ago in all nine EC2000 outcomes.
The most substantial improvements are in Societal and
Global Issues, Applying Engineering Skills, Group Skills,
and Ethics and Professionalism.
Changes in faculty practices are empirically linked to
these increases in preparation.
Although 25% of employers report decreases in
problem-solving skills, 80% still think graduates are
adequately or well-prepared in that skill area.

Conclusions
A complex array of changes in programs, faculty
practices, and student experiences systematically
enhance student learning.
These changes are consistent with what one would
expect to see if EC2000 was having an impact.
Changes at the classroom level are particularly
effective in promoting the a-k learning outcomes.


Conclusions

Students also learn engineering skills through out-of-class
experiences.
Finally, a faculty culture that supports assessment and continuous
improvement is also important.
Most deans' comments echoed the study findings: EC2000 is an accelerant
for change in engineering programs.

Looking Forward
ABET has set the stage for systematic continuous review
of engineering education.
Engineering Change provides important evidence that an
outcomes-based model is an effective quality assurance
mechanism.
Evidence arrives just in time to inform the national
debate.


ABET Participation Project


Participation Project
PILOT Report
July 22, 2006


Partnership to Advance Volunteer Excellence (PAVE)

Design and implement a comprehensive and effective program that
optimizes the use of the expertise and experience of the volunteer
professionals that participate in ABET's outcomes-based accreditation
process.

Key Components
Develop competency model for Program
Evaluators
Design a more effective recruitment and
selection process
Design a more effective training process
Design a method of performance
assessment and improvement

What are competencies?


Competencies are behaviors (which include
knowledge, skills, and abilities) that define a
successful PEV (program evaluator)
Set expectations
Align with vision, values, and strategy
Drive continuous improvement


Competencies

Effective Communicator
   Easily conducts face-to-face interviews
   Writes clearly and succinctly
   Presents focused, concise oral briefings
Professional
   Conveys professional appearance
   Is committed to contributing and adding value
   Is considered a person with high integrity and ethical standards

Competencies

Interpersonally Skilled
   Friendly and sets others at ease
   Listens and places input into context
   Open-minded and avoids personal bias
   Forthright: doesn't hold back what needs to be said
   Adept at pointing out strengths & weaknesses in a
   non-confrontational manner
Technically Current
   Demonstrates required technical credentials for the position
   Engaged in lifelong learning and current in their field

Competencies

Organized
   Is focused on meeting deadlines
   Focuses on critical issues and avoids minutiae
   Displays take-charge initiative
   Takes responsibility and works under minimum supervision
Team Oriented
   Readily accepts input from team members
   Works with team members to reach consensus
   Values team success over personal success

Becoming an ABET Program Evaluator

Member society selects a PEV candidate via the competency model and
assigns a mentor.

PHASE I: Candidate works preliminary modules on-line; candidate
successfully completes the on-line modules.

PHASE II: Candidate attends visit simulation training (society support
facilitators); candidate successfully completes visit simulation
training (society lead facilitator).

PHASE III: Candidate attends program-specific training (society);
observer visit (optional).

Society approves the PEV for assignment -> Program Evaluator.

Training Pilot

Pre-work CD with checks for understanding
Mentor assigned
Self-study
Complete pre-visit forms
1.5 days simulating campus visit
   Sunday team meeting
   Display materials and lab interview
   Draft statement homework
   Monday night meeting

Evaluation Pilot

Performance appraisal forms:
   Describe how competencies are demonstrated pre-visit and during the
   visit
   Provide performance metrics
   Require comments for ratings below "met expectations"
Peer, Team Chair, Program

Partnership to Advance
Volunteer Excellence
Determine best implementation strategies
together
Information-sharing, action planning and
collaboration to carry the good work forward
Increase the value of accreditation for your
programs

Points of Learning


Questions & Answers

