September 28-29, 2006
Moscow
Introductions
Continuous Program
Improvement
Moderator:
Gloria Rogers
Associate Executive Director
Professional Services
ABET, Inc.
Facilitator:
David Hornbeck
Adjunct Accreditation Director for Technology
ABET, Inc.
To Promote
Continuous Quality
Improvement in Engineering
Education
Workshop Expectations
Variety
Assets
Utility
Relevance
Limitations
Planning
Implementation
Assessment
Evaluation
Feedback
Change
Workshop Format
We utilize both small group and plenary
sessions
We introduce concepts via critique of case
study examples
We apply concepts through group
preparation of example scenarios
We share results & develop understanding
through interactive plenary sessions
Workshop Day 1
Identify attributes of effective educational
objectives
Identify attributes of effective program
outcomes
Investigate key components of effective
assessment plans and processes
Prepare written program outcomes
Workshop Day 2
Investigate the attributes of a variety of
assessment tools
Develop assessment & evaluation plans for the
program educational objectives
Develop assessment & evaluation plans for the
set of program outcomes
Summarize points of learning
Discuss lessons learned by ABET in its
experience with outcomes-based criteria
Workshop Procedures
A. Record all your work produced in small
group sessions
B. Identify recorded work by table and
breakout room number
C. Reporting in Plenary Sessions: Each group
selects a leader, a recorder & a reporter for
each exercise
D. A workbook of all material & exercises will be
provided to each participant
Introduction to ABET
Continuous Program
Improvement
Goal of ABET
To promote Continuous Quality
Improvement in Applied Sciences,
Computing, Engineering, and Technology
education through faculty guidance and
initiative.
Accreditation Reform
Philosophy
Institutions & programs define missions and
objectives
Focus on the needs of their constituents
Enable program differentiation
Encourage creativity in curricula
Emphasis on outcomes
Skills/knowledge required for professional practice
Technical and non-technical elements
Emphases
Practice of Continuous Improvement
Input of constituencies
Process reliability & sustainability
Outcomes, Objectives, and Assessment
Technical and Professional Knowledge required by
the Profession
Students
Faculty and Support Personnel
Facilities
Institutional Support and Funding
Primary Expectations of
Programs
Adequate preparation of graduates
for engineering careers
Effective Continuous Quality
Improvement Processes
The Focus
ABET Definitions
Program Educational Objectives: broad statements that describe the career
and professional accomplishments that the program is preparing graduates
to achieve
ABET Definitions
Assessment: one or more processes that identify, collect, and prepare
data to evaluate the achievement of program outcomes and program
educational objectives
CQI as an Operating
Philosophy
Quality improvement comes from within the institution
Continuous improvement requires the planned integration
of objectives, performance metrics, & assessment
Continuous improvement is cyclical
Assessment of performance is the baseline for future
assessment
Educational objectives, mission, and needs of
constituencies must be harmonized to achieve CQI
Potential Constituencies
Students, parents, employers, faculty, alumni
Industry advisors, accrediting agencies
Educational administration: department, school,
college, etc.
Government agencies: local, state, federal
Transfer colleges that supply students
Graduate programs that accept graduates
Donors, contributors, supporters
Objectives: Exercise 1
Outcomes: Exercise 2
Report Out on
Exercise 1 and Exercise 2
Objectives Summary
Each addresses one or more needs of a
constituency
Must be understandable by the constituency
being served
Should be limited to a manageable number
of statements
Should be broader statements than the
Program Outcomes
Every Objective must be supported by at
least one Program Outcome
Outcomes Summary
Each describes an area of knowledge and/or skill that a
person can demonstrate
Should be stated such that a student can demonstrate
the outcome upon completion of the program and before graduation
Must be a unit of knowledge/skill that supports at least
one Educational Objective
Collectively, Outcomes define the skills and knowledge
imparted by the degree program
Outcomes statements normally do not include measures
or performance expectations
Assessment Basics
Program Assessment of
Student Learning
September 28-29, 2006
Foundational Truths
Programs are at different places in the assessment process
[Figure: Bloom's taxonomy pyramid | Knowledge, Comprehension, Application, Analysis; learner levels Novice, Intermediate, Advanced]
Input | Processes | Outputs | Outcomes
Input: what comes into the system?
Processes: what are we doing with the inputs?
Outputs: how many?
Outcomes: what is the effect? (What have students learned; what skills have they gained; what attitudes have they developed?)

Student Background | programs & services offered; populations served | student grades; graduation rates; employment statistics | student learning and growth
Faculty Background | faculty teaching loads / class size | publication numbers / faculty development activities; credit hrs delivered | faculty publication citation data; faculty development
Educational Resources | policies, procedures, governance | statistics on resource availability; participation rates
Assessment of Outputs
Assessment of outputs serves as an indirect measure, or proxy, for
effectiveness; outputs provide general indicators of achievement.
Examples: student grades; graduation rates; employment statistics;
publication numbers / faculty development activities; credit hrs
delivered; statistics on resource availability and participation rates.
Assessment of Outcomes
Outcomes ask: what have students learned; what skills have they gained;
what attitudes have they developed?
Examples of outcome evidence: student learning and growth; faculty
publication citation data; faculty development.
Level of Assessment (Who?): Individual vs. Group
Purpose: Learning/Teaching (Formative) vs. Accountability (Summative)

Individual, Formative: competency-based instruction; assessment-based curriculum; individual performance tests; placement
Individual, Summative: gatekeeping; admissions tests; rising junior exams; comprehensive exams; certification exams
Group, Formative: program enhancement (individual assessment results may be aggregated to serve program evaluation needs)
Object of Assessment (What?): knowledge; skills; attitudes & values; behavior
ABET Terms
Terms: Objectives | Outcomes | Performance Criteria | Assessment | Evaluation
Definition (Objectives): broad statements that describe the career and
professional accomplishments that the program is preparing graduates to achieve.
Performance Criteria are also known as performance indicators, standards,
rubrics, specifications, metrics, outcomes, etc.
Continuous improvement loop: Mission and Constituents feed the
Educational Objectives; Objectives lead to Learning Outcomes; Outcomes
are expressed as measurable Performance Criteria and delivered through
Educational Practices/Strategies; Assessment (collection and analysis of
evidence) and Evaluation (interpretation of evidence) close the loop,
producing feedback for continuous improvement.
Classroom Assessment Concepts
Context: subject matter; faculty member; pedagogy; student; facility
Subject: Strength of Materials
Topics: material properties; beams; torsion; columns; fatigue
Terminology: stress; strain; tensile strength; ductility; shear force; bending moment; angle of twist; power transmission; Euler buckling; crack growth; S-N curves
Assessment focus: evaluate individual student performance (grades); evaluate teaching/learning
Timeline: 1 semester/quarter
G.Rogers, ABET
Objective: work effectively with others
Outcome: ability to function on a multidisciplinary team
Performance criteria: researches and gathers information; makes contributions; takes responsibility; values other viewpoints
G.Rogers--ABET, Inc.
Program Assessment
Set within the institutional context and environmental factors:
student pre-college traits feed into coursework & curricular patterns
(classes chosen; major), the classroom experience (pedagogy; facilities;
climate; faculty & student characteristics), and out-of-class experiences
(co-curricular; co-ops; internships; support services), which together
produce educational outcomes.
Timeline: xx years. Arrows indicate reciprocal causation.
Teamwork rubric (blank)
Scale: Developing (2) | Satisfactory (3) | Exemplary (4) | Score
Criteria: contribute; research & gather information; take responsibility; fulfill team role's duties; share equally; average
Teamwork rubric (partially completed example)
Scale: Developing (2) | Satisfactory (3) | Exemplary (4) | Score
Criteria: contribute; research & gather information; take responsibility; fulfill team role's duties; share equally; average
Sample cell descriptors, lowest to highest level: "Is always talking--never allows anyone else to speak." | "Listens, but sometimes talks too much." | "Listens and speaks a fair amount."
Sample top-level descriptor: "Collects a great deal of information--all relates to the topic."
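Scores from a rubric like the one above are typically averaged per criterion across students before evaluation. A minimal sketch of that aggregation (the ratings are invented sample data; the 1-4 scale follows the slide's Developing/Satisfactory/Exemplary levels):

```python
# Sketch: average teamwork-rubric ratings per criterion across students.
# Scale assumed 1-4 (2 = Developing, 3 = Satisfactory, 4 = Exemplary).
CRITERIA = [
    "Contribute",
    "Research & Gather Information",
    "Take Responsibility",
    "Fulfill Team Role's Duties",
    "Share Equally",
]

def criterion_averages(ratings):
    """ratings: one dict per student, mapping criterion -> score (1-4)."""
    n = len(ratings)
    return {c: sum(r[c] for r in ratings) / n for c in CRITERIA}

# Invented sample data for two students.
sample = [
    {"Contribute": 3, "Research & Gather Information": 4,
     "Take Responsibility": 2, "Fulfill Team Role's Duties": 3,
     "Share Equally": 4},
    {"Contribute": 4, "Research & Gather Information": 3,
     "Take Responsibility": 3, "Fulfill Team Role's Duties": 3,
     "Share Equally": 2},
]
averages = criterion_averages(sample)
print(averages["Contribute"])  # 3.5
```

Per-criterion averages, rather than a single overall grade, are the kind of numbers behind the "Example of Results" charts later in the workshop.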
Action Verbs
An action verb directs students to a specific performance
(e.g., list, analyze, apply)
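A crude automated check can flag outcome statements that open with a vague verb instead of a specific action verb. A sketch; the vague-verb list is an illustrative assumption, not an ABET standard:

```python
# Sketch: flag outcome statements that do not start with a specific
# action verb. The vague-verb list is an illustrative assumption.
VAGUE_VERBS = {"understand", "know", "appreciate", "learn", "be", "become"}

def starts_measurably(outcome: str) -> bool:
    """True if the statement opens with something other than a vague verb."""
    first_word = outcome.split()[0].lower().strip(",.")
    return first_word not in VAGUE_VERBS

print(starts_measurably("Analyze a closed-loop control system"))  # True
print(starts_measurably("Understand thermodynamics"))             # False
```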
[Figure: Bloom's taxonomy pyramid | Knowledge, Comprehension, Application, Analysis, Synthesis, Evaluation; learner levels Novice, Intermediate, Expert; curriculum stages Introduce, Reinforce, Demonstrate/Create]
Writing Measurable
Outcomes: Exercise 3
Examples
www.engrng.pitt.edu/~ec2000
Developing scoring
rubrics
Purpose of Rubric
(What do you want it to do?)
Generic or Task-Specific?
Generic
Task-specific
RUBRIC TEMPLATE
Student Outcome _______________________________
Rows: one performance criterion per row.
Columns: scale levels (numeric with descriptor).
Cells: identifiable performance characteristics reflecting each level.
Writing rubric (scale: Developing 2 | Satisfactory 3 | Exemplary 4)

Content / Supporting Detail, lowest to highest:
includes inconsistent or few details, which may interfere with the meaning of the text;
includes some details, but may include extraneous or loosely related material;
provides adequate supporting detail to support the solution/argument;
provides ample supporting detail to support the solution/argument.

Organization / Organizational Pattern, lowest to highest:
little evidence of organization or any sense of wholeness or completeness;
achieves little completeness and wholeness, though organization is attempted;
organizational pattern is logical and conveys completeness and wholeness with few lapses;
organizational pattern is logical and conveys completeness and wholeness.

Style / Language and Word Choice, lowest to highest:
has limited or inappropriate vocabulary for the audience and purpose;
uses effective language; makes engaging, appropriate word choices for audience and purpose.

Conventions, lowest to highest: "Does not follow the standard ..."; "Generally follows the ..."; "Consistently follows ..."
Example of Results
Example of Results
Teaming Skills
[Chart: average scores on four numbered performance criteria; labels and values not recoverable]
Example of Results
Communication Skills
[Chart: average scores on four numbered performance criteria; labels and values not recoverable]
Course survey of outcome coverage:
Outcome Explicit: this outcome is explicitly stated as a learning outcome for this course.
Demonstrate Competence: students are asked to demonstrate their competence on this outcome through homework, projects, tests, etc.
Formal Feedback: students are given formal feedback on their performance on this outcome.
Not Covered: this outcome is not addressed in these ways in this course.
Note: clicking the "view rubric" link shows the scoring rubric for that particular performance criterion related to the outcome.
Table columns: Outcome/Performance Criteria | Outcome Explicit | Demonstrate Competence | Formal Feedback | Not Covered | comment (optional). Entries in the example are all "Yes".
Example criterion: "3. Test readers/audience response to determine how well ideas have been relayed." View rubric or make a comment (optional).
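A survey like this reduces to a course-to-categories map that a program can query for gaps. A sketch under assumed data: the course codes are taken from the curriculum map, but their category assignments here are invented for illustration:

```python
# Sketch: query a course-level outcome-coverage survey for gaps.
# Category names mirror the survey; course assignments are invented.
EXPLICIT = "Outcome Explicit"
DEMONSTRATE = "Demonstrate Competence"
FEEDBACK = "Formal Feedback"
NOT_COVERED = "Not Covered"

coverage = {
    "CH 406 Design I": {EXPLICIT, DEMONSTRATE, FEEDBACK},
    "CH 402 ChE Lab I": {DEMONSTRATE},
    "RH 131 Fresh Comp": {EXPLICIT, FEEDBACK},
    "MA 111 Calc I": {NOT_COVERED},
}

def missing_demonstration(coverage):
    """Courses that claim the outcome but never have students demonstrate it."""
    return [course for course, cats in coverage.items()
            if EXPLICIT in cats and DEMONSTRATE not in cats]

print(missing_demonstration(coverage))  # ['RH 131 Fresh Comp']
```

Such a map makes it easy to see where an outcome is claimed on paper but never practiced or assessed.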
[Curriculum map: four-year course sequence by Fall/Winter/Spring terms for a chemical engineering program, from first-year foundations (e.g., MA 111 Calc I, PH 111 Physics I, RH 131 Fresh Comp) through the capstone sequence (e.g., CH 406 Design I, CH 407 Design II, CH 409 Prof Prac), plus HSS and free electives]
Assessment Methods
Assessment Methods
Written surveys and questionnaires
Exit and other interviews
Standardized exams
Locally developed exams
Archival records
Focus groups
Portfolios
Simulations
Performance appraisal
External examiner
Oral exams
Behavioral observations
Direct Measures
Direct measures provide for the direct
examination or observation of student
knowledge or skills against measurable
learning outcomes
Indirect Measures
Indirect measures of student learning
ascertain opinions or self-reports of the
extent or value of learning experiences
Direct vs. Indirect
Indirect: written surveys and questionnaires; exit and other interviews; archival records; focus groups
Tools: Exercise 4
Assignment
After you have shared methods, choose
at least two methods (preferably three)
that are appropriate for the performance
criteria chosen
At least one DIRECT measure
Use overhead transparency to record
your findings
Include your rationale for the decision
Validity
relevance - the assessment option measures
the educational outcome as directly as
possible
accuracy - the option measures the
educational outcome as precisely as possible
utility - the option provides formative and
summative results with clear implications for
educational program evaluation and
improvement
Bottom Lines
All assessment options have advantages and
disadvantages
The "ideal" methods are those that best fit program
needs, satisfactory validity, and affordability (time,
effort, and money)
Crucial to use multi-method/multi-source approach
to maximize validity and reduce bias of any one
approach
What?
Focus on a few criteria for each outcome
Sampling
For program assessment, sampling is
acceptable and even desirable for
programs of sufficient size.
Sample is representative of all students
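The mechanics of drawing such a sample are simple. A sketch; the 25% fraction and the floor of 10 students are illustrative assumptions, not workshop guidance:

```python
# Sketch: simple random sample of student work for program-level scoring.
# The fraction and minimum sample size are illustrative assumptions.
import random

def sample_students(roster, fraction=0.25, minimum=10, seed=None):
    """Return a random sample; fall back to the whole roster if it is small."""
    k = max(minimum, round(len(roster) * fraction))
    if k >= len(roster):
        return list(roster)
    return random.Random(seed).sample(roster, k)

roster = [f"student_{i:03d}" for i in range(120)]
picked = sample_students(roster, seed=42)
print(len(picked))  # 30
```

For representativeness across tracks, cohorts, or demographics, stratify the roster first and sample within each stratum rather than sampling the roster as a whole.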
Data collection
Yr 1: define outcomes / map curriculum
Yr 2: data collection
Yr 3: evaluation & design of improvements
Yr 4: implement improvements & data collection
Yr ...: the cycle continues
[Timeline: academic years 03-04 through 08-09]
[Annual assessment calendar, Aug through Jul: the institute assessment committee prepares reports of the collected data (e.g., surveys, e-portfolio ratings) for submission to department heads; the evaluation committee then receives and evaluates all data, makes a report, and refers recommendations to appropriate areas.]
Strategies
Assessment method(s)
Context for assessment
Time of data collection
Assessment coordinator
Evaluation of results
grogers@abet.org
Checklist
Assessment question is known and explicit
Outcomes are defined and the number of
performance criteria is manageable
Data are efficiently and systematically collected
Assessment methods are appropriate to program
context
Results are evaluated
Evaluation is more than looking at the results of
learning outcomes
Action is appropriate
www.rose-hulman.edu/assessent2007
Introduction to ABET
Introduction to ABET
Accreditation
Federation of 28 professional societies
Board of Directors representing those societies
Four Commissions
Accreditation Council
Representatives of each commission
Coordination, harmonization of processes
Accreditation Process
Commission responsibilities
Conduct evaluations of programs
Determine accreditation actions
Commission makeup
Commissioners are volunteers appointed by societies
Commissioners chair accreditation teams
Accreditation Team
Chair + one Program Evaluator for each program
Program Evaluators (PEVs) are volunteers from
societies
Increase in Number of Programs from 1995-2005
ASAC | 72 | 15 | +57%
CAC | 240 | 70 | +85%
EAC | 1793 | 373 | +18%
TAC | 740 | 206 | -16%
Engineering Change:
A Study of the Impact of EC2000*
Lisa R. Lattuca, Project Director and Co-PI
Patrick T. Terenzini, Co-PI
J. Fredericks Volkwein, Co-PI
Key Questions
1. What impact, if any, has EC2000 had on
the preparation of graduating seniors to
enter the engineering profession?
2. What impact, if any, has EC2000 had on
practices that may be related to changes in
student preparation?
Significance of the
Engineering Change Study
The first national study of the impact of
outcomes-based accreditation in the U.S.
A model for assessments in other ABET
Commissions.
A pre-EC2000 benchmark (1994) for
graduating seniors' preparation.
The first post-EC2000 data point (2004) on
graduating seniors' preparation.
Engineering Change:
Studying the Impact of EC2000
EC2000 leads to PROGRAM CHANGES (curriculum & instruction; faculty
culture; policies & practices), which shape STUDENT EXPERIENCES
(in-class; out-of-class), which produce OUTCOMES (student learning
(3.a-k); employer ratings); continuous improvement feeds the results
back into the programs.
Aerospace
Chemical
Civil
Computer
Electrical
Industrial
Mechanical
Data Sources | Number | Responses | Response Rate
Programs | 203 | 147 | 72%
Faculty | 2,971 | 1,243 | 42%
Deans | 40 | 40+ | 98%
Graduating seniors (2004) | 13,054 | 5,494 | 42%
Graduates (1994) | 12,921 | 4,330 | 34%
Employers | unknown | 1,622 | N/A
Conclusions
Recent graduates are measurably better prepared than
those of a decade ago in all nine EC2000 outcomes.
The most substantial improvements are in Societal and
Global Issues, Applying Engineering Skills, Group Skills,
and Ethics and Professionalism.
Changes in faculty practices are empirically linked to
these increases in preparation.
Although 25% of employers report decreases in
problem-solving skills, 80% still think graduates are
adequately or well-prepared in that skill area.
Conclusions
A complex array of changes in programs, faculty
practices, and student experiences systematically
enhances student learning.
These changes are consistent with what one would
expect to see if EC2000 was having an impact.
Changes at the classroom level are particularly
effective in promoting the a-k learning outcomes.
Conclusions
Students also learn engineering skills through out-of-class experiences.
Finally, a faculty culture that supports assessment and
continuous improvement is also important.
Most deans' comments echoed the study findings:
EC2000 is an accelerant for change in engineering
programs.
Looking Forward
ABET has set the stage for systematic continuous review
of engineering education.
Engineering Change provides important evidence that an
outcomes-based model is an effective quality assurance
mechanism.
Evidence arrives just in time to inform the national
debate.
Participation Project
PILOT Report
July 22, 2006
Key Components
Develop competency model for Program
Evaluators
Design a more effective recruitment and
selection process
Design a more effective training process
Design a method of performance
assessment and improvement
Competencies
Effective Communicator
Easily conducts face-to-face interviews
Writes clearly and succinctly
Presents focused, concise oral briefings
Professional
Conveys professional appearance
Is committed to contributing and adding value.
Is considered a person with high integrity and ethical
standards
Competencies
Interpersonally Skilled
Friendly and sets others at ease
Listens and places input into context
Open minded and avoids personal bias
Forthright; doesn't hold back what needs to be said
Adept at pointing out strengths & weaknesses in a non-confrontational manner
Technically Current
Demonstrates required technical credentials for the
position
Engaged in lifelong learning and current in their field
Competencies
Organized
Is focused on meeting deadlines
Focuses on critical issues and avoids minutia
Displays take-charge initiative
Takes responsibility and works under minimum
supervision
Team Oriented
Readily accepts input from team members
Works with team members to reach consensus
Values team success over personal success
Program Evaluator training flow
Phase I: the member society selects a PEV candidate via the competency
model; the society assigns a mentor; the candidate successfully
completes the on-line modules.
Phase II: the candidate attends visit simulation training (support
facilitators, society) and successfully completes it (lead facilitator,
society).
Phase III: the candidate attends program-specific training (society),
with an optional observer visit; the society approves the PEV for
assignment as a Program Evaluator.
Training Pilot
Pre-Work CD with Checks for Understanding
Mentor Assigned
Self-Study
Complete Pre-visit forms
1.5 days simulating campus visit
Sunday team meeting
Display materials and lab interview
Draft statement homework
Monday night meeting
Evaluation Pilot
Performance Appraisal forms:
Describe how competencies are demonstrated pre-visit and during the visit
Provide performance metrics
Require comments for ratings below "met expectations"
Completed by peers, the team chair, and the program
Partnership to Advance
Volunteer Excellence
Determine best implementation strategies
together
Information-sharing, action planning and
collaboration to carry the good work forward
Increase the value of accreditation for your
programs
Points of Learning