Marye Anne Fox and Norman Hackerman, Editors; Committee on Recognizing, Evaluating, Rewarding, and Developing Excellence in Teaching of Undergraduate Science, Mathematics, Engineering, and Technology; Center for Education; Division of Behavioral and Social Sciences and Education; National Research Council
Unless otherwise indicated, all materials in this PDF are copyrighted by the National Academy of Sciences. Distribution, posting, or copying of this PDF is strictly prohibited without written permission of the National Academies Press.
THE NATIONAL ACADEMIES PRESS 500 Fifth Street, N.W. Washington, DC 20001
NOTICE: The project that is the subject of this report was approved by the Governing Board of the
National Research Council, whose members are drawn from the councils of the National Academy of
Sciences, the National Academy of Engineering, and the Institute of Medicine. The members of the
committee responsible for the report were chosen for their special competences and with regard for
appropriate balance.
This study was conducted under an award from the Presidents of the National Academies. Any
opinions, findings, conclusions, or recommendations expressed in this publication are those of the
author(s) and do not necessarily reflect the views of the organizations or agencies that provided
support for the project.
Additional copies of this report are available from the National Academies Press, 500 Fifth Street,
N.W., Lockbox 285, Washington, DC 20055; (800) 624-6242 or (202) 334-3313 (in the Washington
metropolitan area); Internet, http://www.nap.edu
Suggested citation: National Research Council. (2003). Evaluating and improving undergraduate
teaching in science, technology, engineering, and mathematics. Committee on Recognizing, Evaluating,
Rewarding, and Developing Excellence in Teaching of Undergraduate Science, Mathematics,
Engineering, and Technology, M.A. Fox and N. Hackerman, Editors. Center for Education, Division of
Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
Preface
Americans have long appreciated the need for high-quality education and have invested accordingly, at levels from preschool through graduate education. Because of the impact of science and technology on the nation’s economic growth, these fields have received substantial government and private research funding at colleges and universities. Indeed, since World War II, federal funding through peer-reviewed grants and contracts has placed in the hands of university faculty the primary responsibility for more than half of the nation’s basic research in these fields. This investment has contributed significantly to making the United States a world leader in the discovery and application of new knowledge and has produced a well-respected system for graduate training in science and engineering. In recent years, additional financial support from industry and nonprofit organizations has provided new opportunities for graduate and undergraduate students at many universities to participate in original research projects. Recognition of the importance of original peer-reviewed research in institutions of higher learning is clearly laudable. As Robert Gavin noted in the 2000 publication Academic Excellence: The Role of Research in the Physical Sciences at Undergraduate Institutions, “research activity plays a central role in keeping the faculty up to date in the field and improves their teaching.”

Because of the key role of science, technology, engineering, and mathematics (STEM), mechanisms for careful scrutiny and evaluation of the quality of research in these fields are highly developed, and academic scientists and engineers often derive reward and recognition from their research achievements. As is the case with most scholarship, the criteria used in these evaluations differ from one discipline to
Acknowledgments
The committee members and staff acknowledge the contributions of a number of people for providing presentations, additional data, and valuable insight to the committee both during and between committee meetings: John V. Byrne, President Emeritus, Oregon State University and Director, Kellogg Commission on the Future of State and Land-Grant Universities; Barbara Cambridge, Director, Teaching Initiatives, American Association for Higher Education, and Director, Carnegie Academy Campus Program; R. Eugene Rice, Director, Assessment Forum, American Association for Higher Education; Alan H. Schoenfeld, Professor of Education, University of California-Berkeley, and Chair, Joint Policy Board on Mathematics Task Force on Educational Activities.

At the National Research Council (NRC), we also would like to acknowledge Michael J. Feuer, former director of the NRC’s Center for Education (CFE) and currently executive director of the NRC’s Division of Behavioral and Social Sciences and Education, for providing critical support and leadership during the writing and report review phases of this study; Kirsten Sampson Snyder, CFE Reports Officer, for her support and guidance in shepherding this report through report review and in working on the final stages of production; Rona Briere and Kathleen (Kit) Johnston for their editing skills and insight; Eugenia Grohman and Yvonne Wise, for their assistance and support in revising the report at several stages of its development; and also Rodger W. Bybee, former executive director of the NRC’s Center for Science, Mathematics, and Engineering Education, and current Executive Director of the Biological Sciences Curriculum Study. Dr. Bybee helped conceive this study and offered support and guidance for it during the time that he was affiliated with the NRC.

This report has been reviewed in draft form by individuals chosen for their diverse perspectives and technical expertise, in accordance with procedures approved by the NRC’s Report Review Committee. The purpose of this independent review is to provide candid and critical comments that will assist the institution in making the published report as sound as possible and to ensure that the report meets institutional standards for objectivity, evidence, and responsiveness to the study charge. The review comments and draft manuscript remain confidential to protect the integrity of the deliberative process. We wish to thank the following individuals for their participation in the review of this report:

David F. Brakke, Dean, College of Science and Mathematics, James Madison University
Brian P. Coppola, Department of Chemistry, University of Michigan
James Gentile, Dean, Natural Science Division, Hope College, Holland, Michigan
Melvin D. George, Department of Mathematics, University of Missouri
Daniel Goroff, Associate Director, Derek Bok Center for Teaching and Learning and Department of Mathematics, Harvard University
Peter D. Lax, Courant Institute of Mathematical Sciences, New York University
Susan B. Millar, College of Engineering, University of Wisconsin, Madison
Robert E. Newnham, Department of Materials Science and Engineering, Pennsylvania State University
Sheri D. Sheppard, Associate Professor of Mechanical Engineering, Stanford University, and
Michael J. Smith, Education Director, American Geological Institute.

Although the reviewers listed above have provided many constructive comments and suggestions, they were not asked to endorse the conclusions or recommendations nor did they see the final draft of the report before its release. The review of this report was overseen by Frank G. Rothman, Brown University, and Pierre C. Hohenberg, Yale University. Appointed by the NRC, they were responsible for making certain that an independent examination of this report was carried out in accordance with institutional procedures and that all review comments were carefully considered. Responsibility for the final content of this report rests entirely with the authoring committee and the institution.
Contents
EXECUTIVE SUMMARY 1
5. Evaluation Methodologies 71
8. Recommendations 115
REFERENCES 128
APPENDIXES
INDEX 203
Executive Summary
who wish to explore the scholarship of teaching and learning; and (3) applying such formative evaluation techniques to departmental programs, not only to individual faculty.2

Four fundamental premises guided the committee’s deliberations:

(1) Effective postsecondary teaching in science, mathematics, and technology should be available to all students, regardless of their major.
(2) The design of curricula and the evaluation of teaching and learning should be collective responsibilities of faculty in individual departments or, where appropriate, through interdepartmental arrangements.
(3) Scholarly activities that focus on improving teaching and learning should be recognized as bona fide endeavors that are equivalent to other scholarly pursuits. Scholarship devoted to improving teaching effectiveness and learning should be accorded the same administrative and collegial support that is available for efforts to improve other research and service endeavors.
(4) Faculty who are expected to work with undergraduates should be given support and mentoring in teaching throughout their careers; hiring practices should provide a first opportunity to signal institutions’ teaching values and expectations of faculty.

Underlying these premises is the committee’s recognition that science, mathematics, and engineering instructors face a number of daunting challenges: the need to apply principles of human learning from research in cognitive science to the assessment of learning outcomes, to teach and advise large numbers of students with diverse interests and varying reasons for enrolling, to prepare future teachers, to provide faculty and students with engaging laboratory and field experiences, and to supervise students who undertake original research. Simultaneously addressing these challenges requires knowledge of and enthusiasm for the subject matter, familiarity with a range of appropriate pedagogies, skill in using appropriate tests, ease in professional interactions with students within and beyond the classroom, and active scholarly assessment to enhance teaching and learning.

Yet the committee found that most faculty who teach undergraduates in the STEM disciplines have received little formal training in teaching techniques, in assessing student learning, or in evaluating teaching effectiveness. Formal programs aimed at improving

2 Detailed definitions of formative and summative evaluation can be found in Chapter 5.
teaching are still rare. A firm commitment to open intradepartmental communication about teaching effectiveness is therefore critical to any convincing evaluation of teaching based on these premises. And because considerable variation exists across institutions and disciplines, there is no single formula or pathway to effective evaluation of teaching.

The research literature suggests that some combination of the following kinds of formative and summative evidence about student learning can be helpful in evaluating and improving a faculty member’s teaching: Departmental and other colleagues can provide informed input about teaching effectiveness through direct observation, analysis of course content and materials, or information about the instructor’s effectiveness in service and interdisciplinary courses. Undergraduates and graduate teaching assistants could offer useful information based on their experiences in the instructor’s courses and laboratories, the instructor’s supervision of research, and the quality of academic advising. Additionally, graduate students could comment on the supervision and mentoring they have received as they prepare for teaching. The faculty member being evaluated could provide self-assessment of his or her teaching strengths and areas for improvement; this assessment could be compared with the other independent evidence. The instructor’s willingness to seek external support to improve teaching and learning also is evidence of her or his commitment to effective undergraduate teaching.

Effective evaluation also emerges from a combination of sources of evidence. Current students, those who had taken a course in previous years, and graduating seniors and alumni could provide evidence about the instructor’s role in their learning. Graduate teaching assistants could discuss the instructor’s approaches to teaching, levels of interactions with students, and the mentoring that they receive in improving their own teaching skills. Departmental and other faculty colleagues, both from within and outside the institution, could evaluate the currency of the materials the instructor presents and his or her level of participation and leadership in improving undergraduate education. The faculty member being evaluated can provide critical information about his or her teaching challenges and successes through self-reflection and other evidence of effective teaching, student learning, and professional growth. Institutional data and records offer insights about changes in enrollments in a faculty member’s courses over time, the percentage of students who drop the instructor’s courses, and the number of students who go on to take additional
peer reviews and teaching portfolios used for promotion, tenure, and post-tenure review. Such assessments should be designed to provide fair and objective information to aid faculty in the improvement of their teaching. Building consensus among faculty, providing necessary resources, and relying on the best available research on teaching, learning, and measurement are critical for this approach to evaluation.

(1.4) Individual faculty—beginners as well as more experienced teachers—and their departments should be rewarded for consistent improvement of learning by both major and nonmajor students. All teaching-related activities—such as grading, reporting of grades, curriculum development, training of teaching assistants, and related committee work—should be included in evaluation systems adopted for faculty rewards.

(1.5) Faculty should accept the obligation to improve their teaching skills as part of their personal commitment to professional excellence. Departments and institutions of higher education should reinforce the importance of such professional development for faculty through the establishment and support of campus resources (e.g., centers for teaching and learning) and through personnel policies that recognize and reward such efforts. At the same time, institutions should recognize that disciplines approach teaching differently and that such differences should be reflected in evaluation procedures.

Much of this report offers recommendations to faculty about how they can use evaluation to improve their teaching. Accordingly, the following set of recommendations is directed toward policy makers, administrators, and leaders of organizations associated with higher education.

2. Recommendations for Presidents, Overseeing Boards, and Academic Officers

(2.1) Quality teaching and effective learning should be highly ranked institutional priorities. All faculty and departmental evaluations and accreditation reviews should include rigorous assessment of teaching effectiveness. University leaders should clearly assert high expectations for quality teaching to newly hired and current faculty.

(2.2) Campus-wide or disciplinary-focused centers for teaching and learning should be tasked with providing faculty with opportunities for ongoing professional development that include understanding how people learn, how to improve current instruction through student feedback (formative evaluation), and how educational research can be translated into improved teaching
for improving teaching and learning. To the extent possible, these investments should not be made at the expense of sponsored research.

(4.2) Funding agencies and research sponsors should undertake a self-examination by convening expert panels to examine whether agency policies might inadvertently compromise a faculty member’s commitment to quality undergraduate teaching.

(4.3) Accreditation agencies and boards should revise policies to emphasize quality undergraduate learning as a primary criterion for program accreditation.

(4.4) Professional societies should offer opportunities to discuss undergraduate education issues during annual and regional meetings. These events might include sessions on teaching techniques and suggestions for overcoming disciplinary and institutional barriers to improved teaching.

(4.5) Professional societies should encourage publication of peer-reviewed articles in their general or specialized journals on evolving educational issues in STEM.
PART I
What Is Known:
Principles, Research Findings,
and Implementation Issues
1
Recent Perspectives on Undergraduate Teaching and Learning
IMPETUS FOR AND CHALLENGES TO CHANGE

Calls for Accountability from Outside of Academe

The reforms within K–12 education that have been enacted in almost every state and many districts include systems for measuring achievement and accountability. State legislatures, departments of education, school boards, and the general public expect those responsible for educating students to be held specifically accountable for the quality of the outcomes of their work (Rice et al., 2000).

The call for accountability is also being clearly heard at the postsecondary level. State legislatures are demanding that public universities provide quantifiable evidence of the effectiveness of the academic programs being supported with tax dollars. Other bodies, including national commissions, institutional governing boards, and professional accrediting agencies, also have begun to recommend that universities and colleges be held more accountable for student learning (see, e.g., National Center for Public Policy and Higher Education, 2001; see also Chapter 3, this report).

One aspect of the call for accountability in higher education is particularly important for faculty in STEM. Corporate leaders and the public alike are focusing on the need for a scientifically and technologically literate citizenry and a skilled workforce (Capelli, 1997; Greenspan, 2000; International Technology Education Association, 2000; Murnane and Levy, 1996; National Council of Teachers of Mathematics, 2000; NRC, 1996a, 1999a, 2000d). Corporate leaders also have made it increasingly clear that their workforce needs more than basic knowledge in science, mathematics, and technology. They expect those they hire to apply that knowledge in new and unusual contexts, as well as to communicate effectively, work collaboratively, understand the perspectives of colleagues from different cultures, and continually update and expand their knowledge and skills (Capelli, 1997; Greenspan, 2000; Rust, 1998).

Calls for Change from Within Academe

While public pressure for reforming undergraduate teaching and learning and holding educators accountable for such improvements is real and growing, recent surveys also suggest that increasing numbers of faculty are advocating strongly for quality teaching and are paying close attention to how effective teaching is recognized, evaluated, and rewarded within departments
and at the institutional level (Ory, 2000). In a recent survey of doctoral students, for example, 83 percent indicated that teaching “is one of the most appealing aspects of faculty life, as well as its core undertaking” (Golde and Dore, 2001, p. 21).

In recent interviews with new faculty members, Rice et al. (2000)2 reported that interviewees overwhelmingly expressed enjoyment of and commitment to teaching and working with students. However, early-career faculty expressed concerns about how their work is evaluated. They perceive that expectations for their performance are vague and sometimes conflicting. They also indicated that feedback on their performance often is insufficient, unfocused, and unclear. Many expressed concern about the lack of a “culture of collegiality” or a “teaching community” at their institutions (Rice et al., 2000).

During the past decade, there also has been increasing concern among senior faculty and administrators about improving undergraduate STEM education. These efforts have been spurred by reports from a variety of national organizations (e.g., Boyer, 1990; Boyer Commission, 1998; NRC, 1996b, 1997a; NSF, 1996; Project Kaleidoscope, 1991, 1994) calling for reform in these disciplines. Professional societies also are devoting serious attention to enhancing undergraduate teaching and learning in these disciplines (e.g., Council on Undergraduate Research <http://www.cur.org>; Doyle, 2000; McNeal and D’Avanzo, 1997; NRC, 1999b, 2000b; Howard Hughes Medical Institute <http://www.hhmi.org>; National Institute for Science Education <http://www.wcer.wisc.edu/nise>; Project Kaleidoscope <http://www.pkal.org>; Rothman and Narum, 1999; and websites and publications of increasing numbers of professional societies in the natural sciences, mathematics, and engineering).

2 This report by Rice et al. (2000) is a product of the American Association for Higher Education’s (AAHE’s) ongoing Forum on Faculty Roles and Rewards. The report provides the results of structured interviews that were undertaken with 350+ new faculty members and graduate students aspiring to be faculty members from colleges and universities around the country. The aim of that study was to obtain perspectives from those who are just beginning their academic careers and to offer guidance for senior faculty, chairs, deans, and others in higher education who will be responsible for shaping the professoriate of the future. Rice et al. offer ten “Principles of Good Practice: Supporting Early-Career Faculty,” accompanied by an action inventory to prompt department chairs, senior colleagues, and other academic leaders to examine their individual and institutional practices. These principles and specific action items are also available in a separate publication by Sorcinelli (2000), which is available at <http://www.aahe.org/ffrr/principles_brochure.htm>.
Knowing What Students Know: The Science and Design of Educational Assessment, NRC, 2001); and on other research findings that relate closely to the responsibilities of undergraduate faculty and could lead to direct improvements in undergraduate education. Overviews and summaries of research on learning and the application of that scholarship to the assessment of learning are provided at the end of this chapter in Annex Boxes 1-1 and 1-2, respectively.

Many college faculty are not familiar with that literature, however, nor do they have the time, opportunity, or incentives to learn from it. Moreover, assessing whether students actually have learned what was expected requires that faculty rethink course objectives and their approaches to teaching. Extending the assessment of learning outcomes beyond individual courses to an entire departmental curriculum requires that faculty collectively reach consensus about what students should learn and in which courses that knowledge and those skills should be developed.

STATEMENT OF TASK AND GUIDING PRINCIPLES

The committee conducted its work according to the following statement of task from the NRC:

The goal of this project is to develop resources to help postsecondary science, technology, engineering, and mathematics (STEM) faculty and administrators gain deeper understanding about ways convincingly to evaluate and reward effective teaching by drawing on the results of educational research. The committee will prepare a National Research Council report on the evaluation of undergraduate STEM teaching, with a focus on pedagogical and implementation issues of particular interest to the STEM community. The report will emphasize ways in which research in human learning can guide the evaluation and improvement of instruction, and will discuss how educational research findings can contribute to this process.

In responding to this charge, the committee embraced four fundamental premises, all of which have implications for how teaching is honored and evaluated by educational institutions:

• Effective postsecondary teaching in STEM should be available to all students, regardless of their major.
• The design of curricula and the evaluation of teaching and learning should be collective responsibilities of faculty in individual departments or, where appropriate, performed through other interdepartmental arrangements.
• Scholarly activities that focus on improving teaching and learning should be recognized as bona fide endeavors that are equivalent to other scholarly pursuits. Scholarship devoted to improving
Research in the cognitive, learning, and brain sciences has provided many new insights about how humans organize knowledge, how experience shapes understanding, how individuals differ in learning strategies, and how people acquire expertise. From this emerging body of research, scientists and others have been able to synthesize a number of underlying principles of human learning. That knowledge can be synthesized into the following seven principles of learning:
5. Learners’ motivation to learn and sense of self affect what is learned, how much is learned, and how much effort will be put into the learning process.

Both internal and external factors motivate people to learn and develop competence. Regardless of the source, learners’ level of motivation strongly affects their willingness to persist in the face of difficulty or challenge. Intrinsic motivation is enhanced when students perceive learning tasks as interesting and personally meaningful, and presented at an appropriate level of difficulty. Tasks that are too difficult can frustrate; those that are too easy can lead to boredom. Research also has revealed strong
6. The practices and activities in which people engage while learning shape what is learned.

Research indicates that the way people learn a particular area of knowledge and skills and the context in which they learn it become a fundamental part of what is learned. When students learn some subject matter or concept in only a limited context, they often miss seeing the applicability of that information to solving novel problems encountered in other classes, in other disciplines, or in everyday life situations. By encountering a given concept in multiple contexts, students develop a deeper understanding of the concept and how it can be used and applied to other contexts. Faculty can help students apply subject matter to other contexts by engaging them in learning experiences that draw directly upon real-world applications, or exercises that foster problem-solving skills and strategies that are used in real-world situations. Problem-based and case-based learning are two instructional approaches that create opportunities for students to engage in practices similar to those of experts. Technology also can be used to bring real-world contexts into the classroom.4
4 Specific techniques for structuring problem-based learning and employing technology in college classrooms are discussed on the website of the National Institute for Science Education. Suggestions for creative uses of technology are available at <http://www.wcer.wisc.edu/nise/cl1/ilt/default.asp>. Each site also provides further references. Additional resources on problem-based learning are found in Allen and Duch (1998).
SOURCE: Excerpted and modified from NRC (2002b, Ch. 6). Original references
are cited in that chapter.
• Although assessments used in various contexts and for differing purposes often look quite different, they share common principles. Assessment is always a process of reasoning from evidence. Moreover, assessment is imprecise to some degree. Assessment results are only estimates of what a person knows and can do. It is essential to recognize that one type of assessment is not appropriate for measuring learning in all students. Multiple measures provide a more robust picture of what an individual has learned.

• Every assessment, regardless of its purpose, rests on three pillars: a model of how students represent knowledge and develop competence in the subject domain, tasks or situations that allow one to observe students’ performance, and an interpretation method for drawing inferences from the performance evidence thus obtained.

• Educational assessment does not exist in isolation. It must be aligned with curriculum and instruction if it is to support learning.

• Research on learning and cognition indicates that assessment practices should extend beyond an emphasis on skills and discrete bits of knowledge to encompass more complex aspects of student achievement.

• Studies of learning by novices and experts in a subject area demonstrate that experts typically organize factual and procedural knowledge into schemas that support recognition of patterns and the rapid retrieval and application of knowledge. Experts use metacognitive strategies to monitor their understanding when they solve problems and perform corrections of their learning and understanding (see Annex Box 1-1, principle 3, for additional information about metacognition). Assessments should attempt to determine whether a student has developed good metacognitive skills. They should focus on identifying specific strategies that students use for problem solving.

• Learning involves a transformation from naïve understanding into more complete and accurate comprehension. Appropriate assessments can both facilitate this process for individual students and assist faculty in revising their approaches to teaching. To this end, assessments should focus on making students’ thinking visible to both themselves and their instructors so that faculty can select appropriate instructional strategies to enhance future learning.

• One of the most important roles for assessment is the provision of timely and informative feedback to students during instruction and learning so that their practice of a skill and its subsequent acquisition will be effective and efficient.

• Much of human learning is acquired through discourse and interactions with others. Knowledge is often associated with particular social and cultural contexts, and it encompasses understanding about the meaning of specific practices, such as asking and answering questions. Effective assessments need to determine how well students engage in communicative practices that are appropriate to the discipline being
RECENT PERSPECTIVES 23
SOURCE: Excerpted and modified from NRC (2001, pp. 2–9). References to
support these statements are provided in that report.
EVALUATING AND IMPROVING UNDERGRADUATE TEACHING
In light of these predictions, what steps are institutions of higher education and supporting organizations taking to mobilize faculty and resources to enhance learning for undergraduate students?

Graduate students, faculty, and administrators from all types of postsecondary institutions in the United States are increasingly interested in the revamping of teaching practices to enhance student learning in science, technology, engineering, and mathematics (STEM) (see Rothman and Narum, 1999). In part, this increased interest has stemmed from observations by faculty that their approaches to teaching may not result in the expected levels of student learning (e.g., Hestenes, 1987; Hestenes and Halloun, 1995; Mazur, 1997; Wright et al., 1998). Some faculty and departments are confronting the pedagogical and infrastructural challenges of offering smaller classes (e.g., the need for additional instructors to teach more sections), especially for introductory courses. Others are using innovative approaches to teaching based on emerging research in the cognitive and brain sciences about how people learn (e.g., National Research Council [NRC], 2000c). Still others are experimenting with the effectiveness of different learning strategies to accommodate the broader spectrum of students who now enroll in STEM courses as undergraduates.

Many individual faculty and departments are actively engaged in moving undergraduate education from a faculty-centered teaching model to a student-centered learning model (Barr and Tagg, 1999). Moreover, numerous campuses in the United States and abroad are establishing teaching and learning centers.1 As these centers evolve, they are supporting new pedagogies and more efficient methods of assessing teaching and learning, and are serving as focal points for efforts to advance the scholarship of teaching and learning (Boyer, 1990; Glassick et al., 1997; Ferrini-Mundy, personal communication). Many of these centers are increasingly tailoring their assistance to faculty to reflect differences in approaches and emphases among disciplines. Experts in these discipline-based centers are often disciplinary faculty with expertise in pedagogical content knowledge, assessment of learning, and other issues specific to their disciplines (see also Huber and Morreale, 2002).

1 A list of websites of teaching and learning centers of colleges and universities in Asia, Australia and New Zealand, Europe, and North America is available at <http://www.ku.edu/~cte/resources/websites.html>.
Although it appears obvious, any list of characteristics of high-quality teaching of STEM that is centered on desired student outcomes must begin with the premise that faculty members must be well steeped in their disciplines. They must remain active in their areas of scholarship to ensure that the content of their courses is current, accurate, and balanced, especially when presenting information that may be open to alternative interpretation or disagreement by experts in the field. They also should allow all students to appreciate “. . . interrelationships among the sciences and the sciences’ relationship to the humanities, social sciences, and the political, economic, and social concerns of society” (NRC, 1999a, p. 26).

Knowledge of subject matter can be interpreted in other ways. For example, several recent reports (e.g., Boyer Commission, 1998; NRC, 1999a; National Science Foundation [NSF], 1996) have emphasized that the undergraduate experience should add value in tangible ways to each student’s education. Faculty must teach subject matter in ways that encourage probing, questioning, skepticism, and integration of information and ideas. They should provide students with opportunities to think more deeply about subject matter than they did in grades K–12. They should enable students to move intellectually beyond the subject matter at hand.

Faculty who possess deep knowledge and understanding of subject matter demonstrate the following characteristics:

• They can help students learn and understand the general principles of their discipline (e.g., the processes and limits of the scientific method).
• They are able to provide students with an overview of the whole domain of the discipline (e.g., Coppola et al., 1997).
• They possess sufficient knowledge and understanding of their own and related sub-disciplines to answer most students’ questions and know how to help students find appropriate information.
• They stay current through an active research program or through scholarly reading and other types of professional engagement with peers.
• They are genuinely interested in what they are teaching.
• They understand that conveying the infectious enthusiasm that accompanies original discovery, application of theory, and design of new products and processes is as important to learning as helping students understand the subject matter.

2. Skill, Experience, and Creativity with a Range of Appropriate Pedagogies and Technologies

Deep understanding of subject matter is critical to excellent teaching, but not sufficient. Effective teachers also understand that, over the course of their

2 Examples are Microbiology Education, published by the American Society of Microbiology; Journal of Chemical Education, published by the Division of Chemical Education of the American Chemical Society; and Physics Today, published by the American Institute of Physics.
cal processes, faculty can help them open their eyes to the ethical issues and political decisions that often affect science and technology (e.g., Coppola and Smith, 1996).

Professionalism in a faculty member’s relationships and interactions with students also should be based on criteria such as the following:

• Faculty meet with all classes and assigned teaching laboratories, post and keep regular office hours, and hold exams as scheduled.
• They demonstrate respect for students as individuals; this includes respecting the confidentiality of information gleaned from advising or student conferences.
• They encourage the free pursuit of learning and protect students’ academic freedom.
• They address sensitive subjects or issues in ways that help students deal with them maturely.
• They contribute to the ongoing intellectual development of individual students and foster confidence in the students’ ability to learn and discover on their own.
• They advise students who are experiencing problems with course material and know how to work with them in venues besides the classroom to help them achieve. On those occasions when students clearly are not prepared to undertake the challenges of a particular course, faculty should be able to counsel them out of the course or suggest alternative, individualized approaches for learning the subject matter.
• They uphold and model for students the best scholarly and ethical standards (e.g., University of California Faculty Code of Conduct).5

5. Involvement with and Contributions to One’s Profession in Enhancing Teaching and Learning

Effective teaching needs to be seen as a scholarly pursuit that takes place in collaboration with departmental colleagues, faculty in other departments in the sciences and engineering, and more broadly across disciplines (Boyer, 1990; Glassick et al., 1997; Kennedy, 1997). Faculty can learn much by working with colleagues both on and beyond the campus, thereby learning to better integrate the materials they present in their own courses with what is being taught in other courses (Hutchings, 1996; NRC, 1999a).

5 The University of California System’s Faculty Code of Conduct Manual is available at <http://www.ucop.edu/acadadv/acadpers/apm/>.
more endemic to these disciplines. Some of the more general challenges include improving the assessment of learning outcomes and preparing future teachers. More discipline-specific challenges include teaching a broad range and large numbers of students, providing engaging laboratory and field experiences, and encouraging students to undertake original research that increasingly is highly sophisticated and technical.

The committee took particular note of Astin et al.’s (1996) Assessment Forum: Nine Principles of Good Practice for Assessing Student Learning. Because these authors articulate succinctly the position the committee has taken in this report, their principles are presented verbatim in Box 2-1. These principles also could be applied in evaluating departmental programs.

learn how the scientific process works, what scientists do and how and why they do it. They can provide research opportunities for practicing teachers; act as scientific partners; provide connections to the rest of the scientific community; assist in writing grant proposals for science-education projects; provide hands-on, inquiry-based workshops for area teachers (e.g., NRC, 2000a); provide teachers access to equipment; and ask teachers to review educational material for its accuracy and utility.

When scientists teach their undergraduate classes and laboratories, potential science teachers are present. Scientists should recognize that as an opportunity to promote and act as a model of both good process and accurate content teaching and so strive to improve their own teaching (NRC, 1996c, p. 3).
This committee agrees with the conclusions expressed by other NRC committees (NRC 1999a, 2000b) that science faculty in the nation’s universities should, as one of their primary professional responsibilities, model the kinds of pedagogy that are needed to educate both practicing and prospective teachers. Those NRC reports provide a series of recommendations for how chief academic officers and faculty can work together to promote more effective education for teachers of mathematics and science. These recommendations include developing courses that provide all students with a better understanding of the relationships among the sciences, that integrate fundamental science and mathematics, and that help students understand how these areas of knowledge relate to their daily lives and to the world economy. Standards for teacher education and professional development for teachers are an integral component of the National Science Education Standards (NRC, 1996a); much useful information can be found in that document to help postsecondary faculty understand their role in promoting more effective teacher education. Contributing authors in Siebert and Macintosh (2001) offer advice and numerous examples of how the principles contained in the National Science Education Standards can be applied to higher education settings.

An impending shortage of qualified K–12 teachers over the next decade (National Center for Education Statistics, 1999) will compound the shortage that already exists for elementary and secondary school science and mathematics
In calculating academic rewards, it has been painfully difficult to evaluate the quality of
research as separated from its mass. Nevertheless, departments and deans find that for
passing judgment on peers, research productivity is a much more manageable criterion than
teaching effectiveness. Faculty gossip, student evaluations, and alumni testimonials have all
been notoriously weak reeds, and reliable self-evaluation is all but impossible…. At this
point promotion and tenure committees still find teaching effectiveness difficult to measure.
Publication is at least a perceptible tool; the relative ease of its use has reinforced the
reliance on it for tenure and promotion decisions. Evaluating good teaching will always be
difficult, but effective integration of research and teaching should be observable, as should
the development of interdisciplinary approaches to learning. Departments and deans must
be pressed to give significant rewards for evidence of integrated teaching and research and
for the imagination and effort required by interdisciplinary courses and programs. When
publication is evaluated, attention should be paid to the pedagogical quality of the work as
well as to its contribution to scholarship.
Boyer Commission on Educating Undergraduates in the Research University (1998, p. 41)
education has many parallels with the research enterprise. The products of sound teaching are effective student learning1 and academic achievement. The major challenge for colleges and universities is to establish as an institutional priority and policy the need for both individual and collective (i.e., departmental) responsibility and accountability for improving student learning. As this report demonstrates, criteria and methodologies for assessing teaching effectiveness and productivity in ways that are comparable with the measurement of productivity in scholarship are becoming increasingly available (e.g., Gray et al., 1996; Licata and Morreale, 1997, 2002; National Institute of Science Education, 2001b). Many of these criteria and methods are examined in Part II of this report.

While we now know a great deal more about practices that can contribute to effective teaching and learning (see, e.g., Annex Box 1-1, Chapter 1), criteria and methods for assessing undergraduate teaching performance in accordance with that emerging knowledge have not yet seen widespread use. Instead, the measure of a teacher’s effort often is reduced to the numbers of courses or laboratory sections he or she teaches, the numbers of students taught, or grade distributions. These are not measures of outcomes and results. End-of-course student evaluations are common, but even they usually lead to a numeric ranking, which often confuses evaluation of the teacher and the course. Because many factors, such as the size of the course, its grade distributions, or whether it is being taken as an elective or distribution requirement, can influence responses on such evaluations (see Chapter 4), rankings are rarely directly comparable among courses or instructors.

The committee maintains that the goals and perception of excellence in research and teaching at the undergraduate level can and must become more closely aligned. Five key areas in which steps can be taken to this end are (1) balancing the preparation provided for careers in research and teaching; (2) increasing support for effective teaching on the part of professional organizations; (3) developing and implementing improved means for evaluating undergraduate teaching and learning; (4)

1 There are numerous definitions of what constitutes effective student learning. For purposes of this report, the committee has adopted the definition from the NRC report How People Learn: Brain, Mind, Experience, and School: Expanded Edition (National Research Council [NRC], 2000c, p. 16): “To develop competence in an area of inquiry, students must (a) have a deep foundation of factual knowledge, (b) understand facts and ideas in the context of a conceptual framework, and (c) organize knowledge in ways that facilitate retrieval and application.”
expertise while referring only obliquely to qualifications for teaching. During interviews, candidates for positions usually are required to present in colloquia or other venues details on their current interests, achievements, and future plans for research, but may not be asked to demonstrate either teaching prowess or knowledge of critical teaching and learning issues in STEM education. Orientation for new faculty, if it exists at all, is often completed within a few days prior to the beginning of the academic year. During orientation or earlier, new faculty may learn of the existence of a teaching and learning center on campus, which can provide access to resources that would be useful for development and refinement of their teaching skills. Even when such centers exist,2 however, faculty may or may not be encouraged to use their services.

Indeed, many faculty in the STEM disciplines who teach undergraduates are unfamiliar with the burgeoning research on education and human learning. This lack of knowledge and awareness leaves them ill equipped to mentor the next generation of faculty in new pedagogies or in the use of techniques for effectively assessing student learning. For many faculty, their most successful instructional methods are usually self-taught—a reflection at least in part of the ways they themselves were taught—and consistent with personal styles and areas of expertise. Such methods are not necessarily transferable to student assistants or less-senior colleagues. Moreover, teaching as modeled by faculty advisors has been based primarily on the lecture, to the point that the unstated assumption of graduate or postdoctoral students could very well be that this is the only “real” form of teaching. While lectures may be an effective method when used by certain faculty in certain settings, a mix of pedagogies is likely to be more successful, particularly for the broader spectrum of students that now characterizes the nation’s undergraduate population (Cooper and Robinson, 1998; McKeachie, 1999; McNeal and D’Avanzo, 1997; Shipman, 2001; Springer et al., 1998; Wyckoff, 2001).

Senior colleagues could serve as sources of teaching support, advice, and feedback for new faculty, but those new faculty may be reluctant to initiate such a relationship for several reasons. One is the tradition of academic freedom, in which classrooms are viewed as private domains where faculty members have

2 Teaching and learning centers on many campuses are providing leadership in addressing these issues. A list of these centers around the world can be found at <http://www.ku.edu/~cte/resources/websites.html>.
3 Additional information about this program is available at <http://www.acs.org/portal/Chemistry?PID=acsdisplay.html&DOC=education/curriculum/context.html>.

4 See, for example, Undergraduate Education in Physics: Responding to Changing Expectations <http://www.aps.org/educ/conf97/01.Chairs.homepage.html>.

5 Additional information is available at <http://alidoro.catchword.com/vl=85083249/cl=13/nw=1/rpsv/catchword/aibs/00063568/v50n3/s13/p277l>.

6 Additional information about this convocation is available at <http://www.sigmaxi.org/forum/1999Forum/forum99.htm>.

7 Additional information about this symposium is available at <http://www.sigmaxi.org/forum/1999Forum/forum99.htm>.

8 This new classification system is available at <http://www.carnegiefoundation.org/Classification/index.htm>.

9 Additional information is available at <http://www.carnegiefoundation.org/>.

10 Additional information about this prize is available at <http://www.case.org/awards>.

11 Additional information about this forum and its related activities is available at <http://www.aahe.org/assessment/>.

12 Additional information about the Project Kaleidoscope program, including specific case studies and publications that are available in print and on the organization’s website, is available at <http://www.pkal.org>.

13 Additional information about this NSF initiative is available at <http://www.ehr.nsf.gov/ehr/DUE/programs/asa/>.

14 Additional information is available at <http://www.pewtrusts.com/ideas/index.cfm?issue=22>.
15 Additional information about the organization’s increasing emphasis on examining and disseminating new ideas about assessment is available at <http://www.hhmi.org/grants/undergraduate/assessment/>.

16 Additional information is available at <http://www.abet.org/accreditation.html>.

17 Additional information about the ABET conferences is available at <http://www.abet.org/annual_meeting_cover.html>.

18 APA’s Principles for Quality Undergraduate Psychology Programs is available at <http://www.apa.org/ed/stmary.html>.
4
Evaluating Teaching in Science, Technology, Engineering, and Mathematics: Principles and Research Findings
more accurate, particularly when based on the views of current and former students, colleagues, and the instructor or department being reviewed. The process of evaluating teaching has been found to work best when all faculty members in a given department (or, in smaller colleges, from across the institution) play a strong role in developing policies and procedures. This is the case because evaluation criteria must be clear, well known and understood, scheduled regularly, and acceptable to all who will be involved with rendering or receiving evaluation (Alverno College Faculty, 1994; Gardiner et al., 1997; Loacker, 2001; Wergin, 1994; Wergin and Swingen, 2000).2

Evidence that can be most helpful in formatively evaluating an individual faculty member’s teaching efficacy and providing opportunities for further professional development includes the following points:

Input from Students and Peers

• Evidence of learning from student portfolios containing samples of their writing on essays, examinations, and presentations at student research conferences or regional or national meetings. Additional direct and indirect classroom techniques that demonstrate student learning are discussed in Chapter 5.
• Informed opinions of other members of the faculty member’s department, particularly when those opinions are based on direct observation of the candidate’s teaching scholarship or practice. The ability to offer such input comes from the reviewer’s observing a series of the candidate’s classes, attending the candidate’s public lectures or presentations at professional association meetings, serving on curricular committees with the candidate, or team teaching with the candidate. Opinions of faculty colleagues also can be based on their observations of student performance in courses that build upon those taught by the faculty member being evaluated.
• Input by faculty from “user” departments for service courses and from related disciplines for interdisciplinary courses. Such information can be very helpful in determining whether students are learning subject matter in ways that will enable them to transfer that learning

2 Alverno College has sponsored a comprehensive research program on assessment of student learning and means of tying that assessment to ongoing improvement of both teaching by individuals and departmental approaches to education. For additional information, see Alverno College Faculty (1994). A more recent monograph edited by Loacker (2001) describes Alverno’s program, with a focus on how students experience self-assessment and learn from it to improve their performance. Then, from the perspective of various disciplines, individual faculty explain how self-assessment works in their courses.
4 Under its Course, Curriculum, and Laboratory Improvement program, the National Science Foundation (NSF) now supports faculty members who adopt and adapt successful models for courses and pedagogy in their own teaching. Additional information about this program is available at <http://www.ehr.nsf.gov/ehr/due/programs/ccli/>.

5 The U.S. Department of Education’s Educational Resources Information Center system cites more than 2,000 articles on research that focus on student evaluations. Additional information is available at <http://ericae.net/scripts/ft/ftcongen.asp?wh1=STUDENT+EVALUATION>.
member.6 To achieve the reliability for summative evaluations advocated by Gilmore et al., a newly hired tenure-track faculty member would need to be evaluated each term for each course taught during each of his or her pretenure years.

On the other hand, the need for such constant levels of student evaluation could have the negative effect of stifling creativity and risk taking by the instructor in trying new teaching or assessment techniques. Indeed, on some campuses, academic administrators are waiving the requirement for counting student evaluations as part of faculty members’ dossiers (although such evaluations may be collected from students) when those faculty agree to introduce alternative approaches to teaching their courses and assessing student learning (Project Kaleidoscope, personal communication).

How reliable are student evaluations when faculty members teach different types of courses, such as large, lower division lecture classes and small graduate research courses? According to the results of one study (Murray et al., 1990), instructors who received high ratings in one type of course did not necessarily receive similar ratings in other types of courses they taught. These differences may or may not be directly associated with variations in teaching effectiveness in different courses. For example, the same instructor may receive better evaluations for a course that students elect to take than for a course that fulfills a general education requirement.

Research employing coefficient alpha analyses7 to establish the reliability (relative agreement) of items within factors or scale scores has revealed students’ ratings of faculty over short periods of time (test–retest within a semester) to be stable. These results suggest that student evaluations are unlikely to be subject to day-to-day changes in the moods of either students or teachers (Marsh, 1987).

Validity

Validity is the degree to which evidence and theory support interpretations of test scores. The process of validation involves accumulating evidence to provide a sound scientific basis for proposed score interpretations (AERA, APA, and NCME, 1999).

The key questions related to validity of student evaluations are how well results from student evaluations correlate with other measures of teaching effectiveness and student learning, and whether students learn more from effective than ineffective teachers. To explore the relationship between learning and student evaluations, Cohen (1981) examined multisection courses that administered common final examinations. Mean values of teaching effectiveness from student evaluations in each section were then correlated with the class’s mean performance on the final examination. A meta-analysis of 41 such studies reporting on 68 separate multisection courses suggested that student evaluations are a valid indicator of teacher effectiveness (Cohen, 1981). Correlations between student grades and student ratings of instructors’ skills in course organization and communication were higher than those between student grades and student ratings of faculty–student interaction.

One limitation of Cohen’s study is that multisection courses are typically lower division courses. Therefore, the question arises of whether similar correlations exist for upper level courses, where higher level learning outcomes are generally critical. Two recent studies have shed light on this question by using students’ ratings of their own learning as a proxy measure of examination achievement scores. In both studies, analyses of large datasets revealed a highly statistically significant relationship between a student’s self-rated learning and his or her rating of teacher effectiveness in the course (Cashin and Downey, 1999; Centra and Gaubatz, 2000b).

Other validity studies have compared students’ evaluations of their instructors with those prepared by trained observers for the same instructors. In one study, the trained observers noted that teachers who had received high ratings from students differed in several ways from those who had received lower ratings. Highly rated teachers were more likely to repeat difficult ideas several times and on different occasions, provide additional examples when necessary, speak clearly and expressively, and be sensitive to students’ needs (Murray, 1983). In short, student evaluations appear to be determined by the instructor’s actual classroom behavior rather than by other indicators, such as a pleasing personality (see Ambady and Rosenthal, 1993).

Although all of the studies cited above address short-term validity (end-of-course measures), critics have argued that students may not appreciate de-

6 Gilmore et al. (1978) observe that if fewer than 15 students per class provide the ratings, a greater number of courses need to be rated—preferably 10.

7 Coefficient alpha analysis is related to factor analysis and is used to verify the major dimensions (factors) and the items within each dimension in an instrument. Coefficient alpha determines the extent to which the items within a factor or scale are intercorrelated and thus measure a similar characteristic.
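The two statistics this discussion relies on, coefficient alpha for reliability and the Pearson correlation for validity, are straightforward to compute. The sketch below is illustrative only; the rating and examination numbers are invented for the example, not drawn from the studies cited.

```python
import math

def cronbach_alpha(items):
    """Coefficient (Cronbach's) alpha for a rating instrument.

    items[j][i] is respondent i's score on item j.  Alpha approaches 1
    as the items intercorrelate, i.e., as they measure a similar
    characteristic.
    """
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(items)                 # number of items on the form
    n = len(items[0])              # number of respondents
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(item) for item in items) / var(totals))

def pearson(xs, ys):
    """Pearson correlation, e.g., between section-mean student ratings
    and section-mean final-examination scores in a multisection study."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / math.sqrt(sum((x - mx) ** 2 for x in xs) *
                           sum((y - my) ** 2 for y in ys))

# Invented data: four students answer three parallel rating items.
ratings = [[4, 2, 5, 3], [4, 3, 5, 3], [5, 2, 4, 3]]
alpha = cronbach_alpha(ratings)

# Invented data: mean rating and mean exam score for five course sections.
section_rating = [3.1, 3.8, 2.6, 4.2, 3.5]
section_exam = [71.0, 78.0, 64.0, 85.0, 74.0]
r = pearson(section_rating, section_exam)
```

With these toy numbers, alpha comes out above 0.9 (a highly intercorrelated form) and r is strongly positive; actual multisection validity studies (e.g., Cohen, 1981) aggregate many such correlations meta-analytically rather than relying on a handful of sections.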
ematics and the natural sciences were found to be more likely to receive lower ratings than classes in other disciplines (Cashin, 1990; Feldman, 1978). The differences were not apparent for all dimensions, however—the organization of courses and the fairness of tests and assignments were two areas in which students rated the disciplines similarly. Lower ratings for natural science and mathematics classes in such dimensions as faculty–student interaction, course difficulty and pace, and presentation format (lecture versus discussion) suggested that these courses were less student-oriented, more difficult, faster-paced, and more likely to include lecture presentations. What this appears to indicate is that students did not like these aspects of the courses and may have learned less (Centra, 1993).

Student ratings can be influenced by many other variables that may interact with or counteract the influence of discipline or course format. For example, studies have shown that students tend to give slightly higher ratings to courses in their major field or to courses they chose to take, as opposed to those they were required to take. The likely reason is that students (and possibly teachers as well) are generally less interested in required courses. These often include introductory or survey courses that meet distribution requirements in a college’s general education sequence, but that students may perceive as having little to do with their immediate academic interests or future needs.

Contrary to what one might otherwise expect, studies have found that instructors who received higher ratings did not assign less work or “water down” their courses (Marsh, 1987; Marsh and Roche, 1993; Marsh and Roche, 2000). Natural science courses not only were generally rated less highly, but also were judged to be more difficult. In this particular case, students within those disciplines who gave teachers high ratings also noted that those teachers assigned more work.

The student characteristics most frequently studied for their effects in biasing evaluations of teaching include grade point average, expected grade in the course, academic ability, and age. According to most studies (e.g., Marsh and Roche, 2000; McKeachie, 1979, 1999), none of these characteristics consistently affects student ratings. Despite this finding, some instructors still firmly believe that students give higher ratings to teachers from whom they expect to receive high grades.

Instructor characteristics that could possibly influence ratings are gender, race, and the students’ perception that the faculty member is especially “entertaining” during instruction (Abrami et al., 1982). Several studies have analyzed
the effect of gender—of both the evaluating student and the teacher—on student evaluations. Most of these studies indicate there is no significant difference in ratings given to male and female instructors by students of the same or the opposite sex (Centra and Gaubatz, 2000a; Feldman, 1993). In certain areas of the natural sciences and engineering in which women faculty members are a distinct minority, female teachers have been found to receive higher ratings than their male counterparts from both male and female students. Female teachers also were more likely than male teachers to use discussion rather than lecturing as a primary method for teaching, which may help account for the higher ratings they received (Centra and Gaubatz, 2000a).

The question of whether teachers who are highly entertaining or expressive receive higher ratings from students has been examined in a series of “educational-seduction” studies (Abrami et al., 1982; Naftulin et al., 1973). In one study, researchers employed a professional actor to deliver a highly entertaining but inaccurate lecture. The actor received high ratings in this single lecture, particularly on his delivery of content. A reasonable conclusion from these studies is that by teaching more enthusiastically, teachers will receive higher ratings (Centra, 1993).

Graduating Seniors and Alumni

Evaluations of an instructor’s teaching by graduating seniors and alumni can be useful in providing information about the effectiveness of both individual teachers and the department’s overall curriculum. Current students can comment on day-to-day aspects of teaching effectiveness, such as the instructor’s ability to organize and communicate ideas. Graduating seniors and alumni can make judgments from a broader, more mature perspective, reflecting and reporting on the longer-term value and retention of what they have learned from individual instructors and from departmental programs. They may be particularly effective contributors to evaluations based on exit interviews (Light, 2001). There are, however, drawbacks to surveying seniors and alumni, including difficulties in locating graduates and deciding which students to survey (e.g., the percentage of students included in an evaluation process based on random surveys versus those recommended by the faculty member being evaluated), and the hazy memory alumni may have about particular instructors (Centra, 1993).

Teaching Assistants

Teaching assistants are in a unique position to provide information about
60 E VA L U AT I N G A N D I M P R O V I N G U N D E R G R A D U A T E T E A C H I N G
the teaching skills of the faculty members with whom they work. They also can offer useful insight and perspective on the collection of courses and curricula offered by their academic department (Lambert and Tice, 1992; National Research Council [NRC], 1995b, 1997b, 2000b). Because teaching assistants routinely observe classes and work with students throughout the term, they can comment on course organization, the effectiveness of an instructor’s presentations and interactions with students, the fairness of examinations, and the like. Teaching assistants also can assess how well the instructor guides, supervises, and contributes to the development and enhancement of his or her own pedagogical skills. As continuing graduate students, however, teaching assistants may be vulnerable to pressures that make it difficult to provide candid evaluations. Thus when they are asked to evaluate their instructors, special precautions, such as ensuring confidentiality, must be taken.

Faculty Colleagues

Compared with the extensive research on the utility8 of student evaluations of teaching, few studies exist concerning the efficacy of peer review, and those available tend to be limited in scope. Research has demonstrated that extended direct observation of teaching by peers can be a highly effective means of evaluating the teaching of an individual instructor (e.g., American Association for Higher Education [AAHE], 1995; Hutchings, 1996). However, colleges and universities do not use classroom observation widely in the assessment of teaching.

A common but erroneous assumption is that peer evaluations of teaching, including evaluations by department chairs, are best conducted through classroom observation (Seldin, 1998). Even when peer evaluation does involve extensive classroom observation, problems can occur. For example, some research has shown that when an instructor’s evaluation is based solely on classroom observation, the raters exhibit low levels of concurrence in their ratings (Centra, 1975). This may be because many faculty and administrators have had little experience in conducting such reviews in ways that are fair and equitable to those being reviewed. Another reason may be that such observation is not part of the culture of teaching and learning within a department. It may be possible to train faculty in observation analysis, providing them with the skills, criteria, and

8 Utility denotes the extent to which using a test to make or inform certain decisions is appropriate, economical, or otherwise feasible. The criterion of fairness is beginning to replace utility in the scholarly literature on measurement.
standards needed for consistent ratings of a colleague’s classroom performance. However, such efforts are time-consuming and require more serious dedication to the task than is usually given to teaching evaluations in higher education.

Some studies have shown that faculty believe they are better able to judge the research productivity of their colleagues than their teaching effectiveness. Kremer (1990) found that evaluations of research were more reliable than evaluations of teaching or service. In that study, as is generally the case, faculty had access to more information about their colleagues’ research than about their teaching or service. According to other studies, when faculty members have an extensive factual basis for their evaluations of teaching, there is higher reliability in their ratings. For example, Root (1987) studied what happened when six elected faculty members independently rated individual dossiers of other faculty. The dossiers included course outlines, syllabi, teaching materials, student evaluations, and documentation of curriculum development. The faculty members being evaluated also submitted information about their scholarly and service activities. Using cases that illustrated high and low ratings, the six-member committee reviewed and discussed criteria for evaluation before making their ratings. The reliabilities of the evaluations (based on average intercorrelations) were very high (above 0.90) for each of the three performance areas. In fact, Root concluded that even a three-member committee working in similar fashion would be able to provide sufficiently reliable evaluations and in a very short period of time—no more than an hour or two. This study supports the use of colleague evaluations for summative decisions providing that the committee has previously discussed evaluative criteria and expected standards of performance, and has a number of different sources of data on which to base its evaluations.

This is a particularly critical point because at present, although tenure and promotion committees at the college or university level always include faculty representatives, such faculty usually do not have the authority or the time needed to make their own independent evaluation of a candidate’s performance in teaching, research, or service. Instead they must rely almost entirely on other sources, such as written or oral evaluations from colleagues in the candidate’s discipline or student evaluations.

When conducted properly, review and evaluation by one’s colleagues can be an effective means of improving teaching at the college level, providing feedback for ongoing professional development in
teaching, and enabling more informed personnel decisions (AAHE, 1993; Chism, 1999; French-Lazovik, 1981; Hutchings, 1995, 1996; Keig and Waggoner, 1994). AAHE recently undertook an extensive, multiyear initiative to examine ways of maximizing the effectiveness of peer review of teaching. A website describes the results and products of this initiative in detail.9 The ideas reviewed below reflect the findings of the AAHE initiative and other sources as cited.

Evaluation of Course Materials

Departments can obtain valuable information about course offerings from individual instructors by asking faculty to review and offer constructive criticism of each other’s course materials and approaches to teaching and learning. Faculty who teach comparable courses or different sections of the same course or who are particularly knowledgeable about the subject matter can conduct reviews of selected course materials. They can analyze those materials with regard to such matters as the accuracy of information, approaches to encouraging and assessing student learning, and the consistency of expectations among instructors who teach different sections of the same course (Bernstein and Quinlan, 1996; Edgerton et al., 1991; Hutchings, 1995, 1996).

Instructional Contributions

In addition to classroom observation, faculty colleagues can examine and comment on an instructor’s teaching-related activities. These kinds of evaluations might include examining syllabi, distributed materials, or the content of tests and how well the tests align with course goals. They might also address the faculty member’s involvement with curriculum development, supervision of student research, contributions to the professional development of colleagues and teaching assistants, publication of articles on teaching in disciplinary journals, authorship of textbooks, development of distance-learning or web-based materials, and related activities (Centra, 1993).

Use of Students for Classroom Observation

As noted above, peer observation can be an effective evaluation technique if the observers are trained in the process. Understandably, observation of colleagues remains a highly sensitive issue for some faculty members. In some cases, the presence of the observer may even affect the instructional dynamics of the course. For this reason, and also on the grounds of fairness and balance, the

9 Information about AAHE’s peer review of teaching initiative is available at <http://www.aahe.org/teaching/Peer_Review.htm>.
time, it was found that while teachers tended to rate themselves higher than their students did, they identified the same relative strengths and weaknesses as did other evaluators (Centra, 1973; Feldman, 1989). Therefore, self-evaluations may be most useful in improving instruction, although corroborating evidence from other sources may be necessary to underscore needed changes. For summative purposes, however, most of the faculty queried in one survey agreed with the findings of research: self-evaluations lack validity and objectivity (Marsh, 1982). Although quantifiable self-evaluations should thus probably not be used in summative evaluations, teaching portfolios can be useful in improving instruction if they are considered in conjunction with independent evaluations from students, colleagues, or teaching improvement specialists.

Institutional Data and Records

Grade Distributions, Course Retention, and Subsequent Enrollment Figures

Historical records of grade distributions and enrollments within a department may provide supplemental information about a faculty member’s teaching when compared with data collected from colleagues who have taught similar courses or are teaching different sections of the same course. However, this kind of evidence should be interpreted very cautiously since many factors other than teaching effectiveness may account for the findings. For example, recent changes in an institution’s policy on dropping courses may influence which students decide to leave or remain in a course and when they elect to do so, independently of the instructor’s teaching effectiveness. If, however, records show that a larger-than-normal fraction of the students in a professor’s course regularly drop out and repeat the class at a later time, the attrition may be relevant to the quality of the instructor’s teaching. Similarly, questions might be raised about an instructor’s teaching effectiveness (especially in lower division courses) if a higher-than-normal fraction of students who have declared an interest in majoring in the subject area fails to enroll in higher level courses within the department (e.g., Seymour and Hewitt, 1997). In contrast, an unusual grade distribution may reflect some anomaly in a particular class and should be considered in that light. For example, while the motives or competence of an instructor who consistently awards high grades might be questioned, it is entirely possible that this individual has engaged his or her students in regular formative evaluations, which has helped them overcome academic problems and learn more than might otherwise be
PART II
Evaluation Methodologies
Part I of this report describes recent research on ways to rethink and restructure teaching and learning, coupled with new approaches to evaluation and professional development for faculty. Those findings have the potential to reshape undergraduate education in science, technology, engineering, and mathematics (STEM) for a much larger number of undergraduates. However, developing strategies for implementing and sustaining such changes requires the commitment of all members of a college or university community.

In a teaching and learning community, the most effective evaluation is that which encourages and rewards effective teaching practices on the basis of student learning outcomes (Doherty et al., 2002; Shapiro and Levine, 1999). Assessment of student learning at its best enables students to identify their own strengths and weaknesses and to determine the kinds of information they need to correct their learning deficiencies and misconceptions. When such evaluation is properly employed, students learn that they can engage in self-assessment and continuous improvement of performance throughout their lives.

Accordingly, this chapter offers practical guidance to postsecondary faculty and administrators on ways to institute a system of both evaluation and professional development that can contribute to significant gains in teaching effectiveness for faculty who teach undergraduates. The chapter describes how input from students (undergraduates and graduate teaching assistants), colleagues, and faculty self-evaluation can be used for evaluating individual instructors. It also describes the advantages and disadvantages of these various approaches.

As stated in Chapter 1, ongoing formative assessment of student learn-
ing can have powerful benefits both in improving learning and in helping faculty improve their teaching on the basis of the feedback they receive from a variety of sources. The information gathered during such assessments also can serve as a basis for more formal, summative evaluations that have an impact on important personnel decisions.

The technique of outcomes assessment as a means of measuring student learning and the use of that information to improve teaching are considered first. Additional strategies and methods for formative evaluation follow. The chapter concludes with a series of suggestions for improving summative evaluation of faculty. The committee emphasizes that the approaches described in this chapter are but a sampling of the techniques that appear in the research literature on improving the evaluation of teaching and student learning. They are
To many, the word “assessment” simply means the process by which we assign students
grades. Assessment is much more than this, however. Assessment is a mechanism for
providing instructors with data for improving their teaching methods and for guiding and
motivating students to be actively involved in their own learning. As such, assessment
provides important feedback to both instructors and students.
Assessment gives us essential information about what our students are learning and about
the extent to which we are meeting our teaching goals. But the true power of assessment
comes in also using it to give feedback to our students. Improving the quality of learning in
our courses involves not just determining to what extent students have mastered course
content at the end of the course; improving the quality of learning also involves determining
to what extent students are mastering content throughout the course.
included here on the basis of the committee’s analysis of the research literature and the expertise of individual committee members, and with the expectation that each institution will adapt or modify these approaches according to its individual needs.

IMPROVING TEACHING BY EXAMINING STUDENT LEARNING: OUTCOME ASSESSMENT

One approach to improving student learning is outcome assessment—the process of providing credible evidence that an instructor’s objectives have been obtained. Outcome assessment enables faculty to determine what students know and can do as a result of instruction in a course module, an entire course, or a sequence of courses. This information can be used to indicate to students how successfully they have mastered the course content they are expected to assimilate. It can also be used to provide faculty and academic departments with guidance for improving instruction, course content, and curricular structure. Moreover, faculty and institutions can use secondary analysis of individual outcome assessments to demonstrate to prospective students, parents, college administrators, employers, accreditation bodies, and legislators that a program of study produces competent graduates (Banta, 2000).

Outcome Assessment Activities

Faculty members, both individually and as colleagues examining their department’s education programs, have found the following activities helpful when undertaking outcome assessment:

• Developing expected student learning outcomes for an individual course of study, including laboratory skills.
• Determining the point in a student’s education (e.g., courses, laboratories, and internships) at which he/she should develop the specified knowledge and skills.
• Incorporating the specified learning outcomes in statements of objectives for the appropriate courses and experiences.
• Selecting or developing appropriate assessment strategies to test student learning of the specified knowledge and skills.
• Using the results from assessment to provide formative feedback to individual students and to improve curriculum and instruction.
• Adjusting expected learning outcomes if appropriate and assessing learning again. Such a process can lead to continual improvement of curriculum and instruction.
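The activities above form an iterative loop: specify expected outcomes, assess, feed the results back, adjust, and assess again. As a purely illustrative sketch of the review step of that loop (the outcome names, course label, and mastery thresholds below are invented, not drawn from this report), one might model it as:

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    """One expected learning outcome and the standard set for it."""
    name: str
    course: str       # point in the curriculum where it is developed
    threshold: float  # fraction of students expected to show mastery

def review_outcomes(outcomes, mastery_rates):
    """Compare assessment results with expectations and flag outcomes
    that fall short, so the curriculum (or the stated outcome itself)
    can be adjusted before the next assessment cycle."""
    return [(o.name, o.course, mastery_rates.get(o.name, 0.0))
            for o in outcomes
            if mastery_rates.get(o.name, 0.0) < o.threshold]

# One hypothetical cycle of the loop.
outcomes = [Outcome("titration technique", "CHEM 101 lab", 0.80),
            Outcome("error analysis", "CHEM 101 lab", 0.75)]
results = {"titration technique": 0.86, "error analysis": 0.61}
print(review_outcomes(outcomes, results))  # flags "error analysis" only
```

The point of the sketch is only that the loop closes: whatever is flagged feeds back into the curriculum or into a revised statement of the outcome, and the assessment is run again.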
Formative Evaluation by Undergraduate Students

Research has shown that the best way

1 In contrast, Marsh and Roche (1993) report that feedback gathered at the end of a course had significantly greater long-term impact on the improvement of teaching than midcourse evaluations.
offer both formative and summative feedback from their students about how various elements of their courses are helping the students learn. This innovative website also allows students to complete the survey form on line and provides instructors with a statistical analysis of the students’ responses.2

The results of studies on formative evaluations of student learning indicate that the techniques described below require modest effort, are easy to carry out, and consume very little class time. In addition, faculty can obtain regular feedback from their students through the use of course listservs, electronic mail, or a website for student feedback connected to a course’s website.

Repeated Measurements of Student Learning and Teaching Effectiveness

The typical end-of-course student evaluation form is an indirect assessment tool that can help an instructor understand what worked to assist learning in a course and what did not. Instructors may feel that students’ scores on final examinations in their courses provide a valid measure of student learning and that this measure can also be used to assess their effectiveness as a teacher summatively. However, many factors other than the instructor’s teaching competence can affect examination results, including prior knowledge; students’ preconceptions; and their ability, interest, and skills in the subject area (Centra, 1993). Another factor is student effort. Even the most effective teachers can do only so much to motivate students. Although most college teachers try to motivate students to learn, in the end students must take responsibility for their own learning and academic achievement.

For the past three years, the Carnegie Foundation for the Advancement of Teaching and Pew Forum on Undergraduate Learning (2002) have published annually the National Survey of Student Engagement: The College Student Report. Each of these reports is compiled from responses to a questionnaire whose respondents consist of thousands of first-year and senior undergraduates at 4-year colleges and universities.3 The students are asked about the extent to which they participate in classroom and campus activities shown by research studies to be important to learning. Questions from the

2 Additional information and links to the survey forms are available at <http://www.wcer.wisc.edu/salgains/instructor/>.
3 The list of institutions that participated in this project is available at <http://www.indiana.edu/~nsse/>.
2001 survey instrument are provided in Appendix B.4 This instrument and its parent, the College Student Experiences Questionnaire (Indiana University, 2000), can provide important information about the quality of effort students are committing to their work.

If a teaching evaluation form is distributed only at the end of a course, it cannot help the instructor make useful modifications for students who are currently enrolled. A better way to assess student learning and teaching effectiveness is to test students at the beginning and then again at the end of a course and inspect the “gain scores.” An instructor’s willingness and ability to use gain scores to improve a course may be considered favorably during a summative evaluation of teaching. At the same time, gain scores are easily misinterpreted and manipulated and may not be statistically reliable (both pre- and post-tests are characterized by unreliability that is compounded when the two are used together). Therefore, they should not be used exclusively to examine student learning for purposes of summative evaluation.

Another indirect measure of student learning that some faculty have found particularly useful is a questionnaire that lists the learning outcomes for a course or series of courses. Students may be asked to indicate how much the course or the entire curriculum increased their knowledge and skills in the specified areas.

For maximum usefulness, teachers may want to add their own course-related items to student evaluation forms, as well as encourage written or oral communication from students, including computer-assisted feedback. Evaluations of laboratory, field, and extra clinical or discussion sections require special questions, as do evaluations of student advising (NISE, 2001a).

Direct Questioning of Students

The easiest way to find out whether students understand what is being said is to ask them directly. But unless instructors have developed sufficient rapport and mutual respect among the students in their class, they should avoid questions or situations that could make it awkward for students to respond (“Who is lost?”) or are so generic as to lead to nonresponses (“Are there any questions?”). Instead, instructors should pose questions that encourage more specific responses (e.g., “How many of you are understanding what we are talking about?”). Various forms of information technology, such as in-class response keypads, can facilitate asking such questions, allowing students to

4 The survey instruments for both 2000 and 2001 are also available at <http://www.indiana.edu/~nsse/>.
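The caution above about gain scores can be made concrete with a standard result from classical test theory: when the pre- and post-test score variances are equal, the reliability of the simple difference (post minus pre) is (r̄ − r_xy)/(1 − r_xy), where r̄ is the average of the two tests’ reliabilities and r_xy is the correlation between them. The numbers in the sketch below are invented for illustration; the formula, not the data, is the point:

```python
def gain_score_reliability(r_pre: float, r_post: float, r_xy: float) -> float:
    """Classical-test-theory reliability of a simple gain score
    (post minus pre), assuming equal pre- and post-test variances."""
    mean_reliability = (r_pre + r_post) / 2
    return (mean_reliability - r_xy) / (1 - r_xy)

# Two individually reliable tests (0.80 each) whose scores correlate 0.60
# yield a gain score that is only half reliable.
print(round(gain_score_reliability(0.80, 0.80, 0.60), 2))  # 0.5
```

Because the pre–post correlation is usually substantial when the same students take related tests, the denominator shrinks the result sharply, which is the compounding of unreliability the text describes.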
answer without fearing that they will be singled out or ridiculed by their peers if they indicate their lack of understanding.

Even better, instructors can ask students to paraphrase briefly the key points or essence of a discussion or lecture. At the end of a class session, students can be asked individually or in pairs to write a brief summary of the main ideas presented and submit it to the instructor (anonymously). If this method is used, students should clearly understand that the written summary is not a quiz and will not be graded.

Minute Papers and Just-in-Time Teaching

At the end of a class, instructors can ask students to write for a minute or two on one of the following kinds of questions: “What is the most significant thing you’ve learned today?” “What points are still not clear?” or “What question is uppermost in your mind at the end of today’s class?” Responses can help instructors evaluate how well students are learning the material. Student responses to the second and third questions also can help instructors select and structure topics for the next class meeting. Large numbers of such short papers can be read quickly, and a review of unclear concepts can take place at the next class meeting (Angelo and Cross, 1993; Schwartz, 1983).

A similar approach, developed by the physics education community, is “just-in-time” teaching (Dougherty, 1999). Students are asked to respond to one or two short questions posed by the instructor the day before a subject is to be taught. They submit their responses via e-mail or to a website. These responses give the instructor a good idea of what the students do and do not understand about the concepts to be considered. The instructor can then adjust the amount of time spent on explaining the concepts, working through problems, or providing examples that will help the students learn and understand the concepts.

Student Teams

Another documented approach involves asking a team of students to work throughout the term on continuous course evaluation (Baugher, 1992; Greene, 2000; Wright et al., 1998). The team members are encouraged to administer questionnaires and interview their peers about how the instructor is or is not promoting learning.

For larger classes, a liaison committee of two to four students can be established that meets periodically with the instructor to discuss difficulties or dissatisfactions. Membership on the committee can be rotated from a list of volunteers as long as the entire class knows who the liaisons are at any given
time. Alternatively, students who are not enrolled in a course can be hired to attend the class and offer ongoing feedback to the instructor (e.g., Greene, 2000).

Students’ Course Notes

With students’ permission, instructors can ask to borrow a set of notes. This technique allows teachers to see what students consider to be the main points presented and whether there is misinformation or confusion about various topics. Alternatively, to ensure student anonymity, students can be asked to photocopy selected portions of their notes and submit them to the instructor without identifying information (Davis, 1993).

Chain Notes

In small classes, it may be possible to pass around a piece of paper midway through a session and ask students to jot down the main point of what is being discussed at that moment. The instructor then has a listing of what students consider to be the key concepts discussed in that class period, which can be used (Angelo and Cross, 1993).

Student Study Groups

Students can be encouraged to form small study groups and to send representatives to discuss any difficulties or questions with the instructor. Study groups provide students with opportunities to learn from one another, and a group may find it easier to seek assistance from the instructor. In turn, having group representatives rather than individual students approach the instructor can reduce the amount of time required to answer repetitive questions, especially in larger classes.

Informal Conversations

Instructors can seek feedback through informal conversations with students during office hours, before or after class, or through e-mail. They can ask students about what has been working well or what is problematic. Instructors should not pose these questions to students in ways or at times that might force them to answer quickly. Questions should be directed to those students the teacher thinks would be most likely to respond candidly. Whenever this kind of feedback is solicited, instructors should keep in mind that such evidence is anecdotal and may not be representative of the entire class. However, informal responses from individual students can serve as the basis for index card questions to the entire class (discussed next). Asking such questions based on informal conversations with students can also reinforce the message that the instructor is listening to students and takes input from them seriously.
spective on the range of pedagogical approaches used by the colleague in his or her teaching.

Role of Colleagues in “Formal” Formative Evaluation

Informal discussions and efforts to improve instruction among faculty members take place daily, but some departments and institutions employ more systematic and formal efforts to assist in the improvement of teaching through formative evaluation. In addition to the evaluation questionnaires reprinted in Appendix C, the following approaches to formative evaluation can be especially useful for the purposes of faculty professional development.

Faculty mentoring. Increasingly, some departments are assigning senior faculty as mentors to untenured faculty. Boice (1992) found that it was not necessary for successful mentors to be from the same department. Whether from within or outside of the faculty member’s department, the ideal faculty mentor appears to play four major roles: friend, source of information, career guide, and intellectual guide (NRC, 1997b; Sands et al., 1991).

At a variety of higher education institutions, Katz and Henry (1988) developed a strategy of transdisciplinary mentoring based on faculty working together to understand both how students learn and how to improve their teaching. Referred to as the Master Faculty Program, this initiative involves faculty working together in pairs or in triads. Faculty members observe each other’s classes and interview each other’s students several times during the semester. Interviewers’ questions emphasize student learning in the course (for example, topics that may be difficult or reactions to specific class sessions). With these observations in hand, the faculty participating in the program meet periodically to discuss candidly, and confidentially, how each participant has or has not fostered student learning. Chandler (1991) has documented the generally positive results of this type of program involving 300 faculty at 21 different colleges and universities.

Formative evaluation by faculty colleagues from other institutions. Faculty at higher education institutions across the country and around the world can provide formative evaluation to colleagues via the Internet. They can comment on the content of a faculty member’s websites for courses, old examination questions, assignments, and student responses to questions posed by the faculty member. This kind of input from colleagues at other institutions could be included as part of a teaching portfolio or dossier for summative evaluation, but also has great potential for ongoing formative feedback.

Projects of the American Association for Higher Education. The American Association for Higher Education (AAHE) has promoted collaboration in assessing and improving teaching through a variety of projects. One such project, conducted in the mid-1990s, involved 12 universities and stressed peer review as a means of formative evaluation. In this project, participants monitored their progress in improving student learning. AAHE’s (1993) Making Teaching Community Property: A Menu for Peer Collaboration and Peer Review provides many other examples of peer review efforts that contribute to formative evaluation and improved professional development in teaching for faculty.

More recently, AAHE, the Carnegie Foundation for the Advancement of Teaching, and the Carnegie Academy for the Scholarship of Teaching and Learning jointly developed a program for peer collaboration based on ideas and criteria advanced by Boyer (1990) and Glassick and colleagues (1997). The goals of the program are to support the development of a scholarship of teaching and learning that will foster significant, long-lasting learning for all students. The program also seeks to enhance the practice and profession of teaching and bring to the scholarship of teaching the same kinds of recognition and reward afforded for other forms of scholarly work (Hutchings, 2000).6

6 Additional information about this program is available at <http://www.carnegiefoundation.org/CASTL/index.htm>.

Examples of the criteria being advanced for evaluating a faculty member’s scholarship in teaching are presented in Box 5-1, excerpted from Glassick et al. (1997, p. 36). Centra (2001) has extended these criteria to allow for evaluation of the scholarship of teaching and learning as practiced by academic departments and institutions (see Box 5-2).

Self-Evaluation

Self-reports and self-reflections on an instructor’s teaching and promotion of student learning can be important sources of information for evaluating a teacher’s effectiveness (Hutchings, 1998). These self-reports, which may be part of a required annual report or a teaching portfolio, are more useful and appropriate for formative or professional development purposes than for summative personnel decisions. Faculty who have not previously performed self-evaluation may require assistance from teaching and learning centers.

As a summary of a professor’s major teaching accomplishments and
BOX 5-1

Clear Goals: Does the scholar state the basic purposes of his or her work
clearly? Does the scholar define objectives that are realistic and achievable? Does
the scholar identify important questions in the field?
Adequate Preparation: Does the scholar show an understanding of existing
scholarship in the field? Does the scholar bring the necessary skills to his or her
work? Does the scholar bring together the resources necessary to move the project
forward?
Appropriate Methods: Does the scholar use methods appropriate to the
goals? Does the scholar apply effectively the methods selected? Does the scholar
modify procedures in response to changing circumstances?
Significant Results: Does the scholar achieve the goals? Does the scholar’s
work add consequentially to the field? Does the scholar’s work open additional areas
for further exploration?
Effective Presentation: Does the scholar use a suitable style and effective
organization to present his or her work? Does the scholar use appropriate forums for
communicating work to its intended audiences? Does the scholar present his or her
message with clarity and integrity?
Reflective Critique: Does the scholar critically evaluate his or her own work?
Does the scholar bring an appropriate breadth of evidence to his or her critique?
Does the scholar use evaluation to improve the quality of future work?
strengths (Shore et al., 1986), the teaching portfolio may include the following kinds of evidence of teaching effectiveness:

• Products of good teaching (for example, student workbooks or logs, student pre- and post-examination results, graded student essays)
• Material developed by the individual (course and curriculum development materials, syllabi, descriptions of how various materials were used in teaching, innovations the instructor has attempted and an evaluation of their success, videotapes of teaching)
• Material or assessments from others (student work and evaluations, input from colleagues or alumni)
• Development of new courses
• Descriptions of how the individual has remained current in the field, such as using knowledge gained from attending professional conferences (Edgerton et al., 1991; Shore et al., 1986)
BOX 5-2 (column headings): Making Teaching Public; Focusing on Teaching Practices and Learning Outcomes; Having Content and Pedagogical Knowledge
• External support obtained for such purposes as improving teaching or purchasing instrumentation for teaching laboratories

Videotaping

Videotaping is a useful strategy that enables instructors to see what they do well and what needs to be improved. In consultation with an expert from the campus’s teaching and learning center, instructors can determine whether they exhibit such classroom behaviors as dominating a discussion, allowing students enough time to think through questions, or encouraging all students to participate in discussions. Faculty who have been videotaped find the experience extremely helpful, especially if they discuss the analysis with someone having expertise in classroom behavior. Videotaping is best used for formative evaluation.

Before-and-After Self-Assessment

Faculty members can use before-and-after self-assessment to determine whether course outcomes meet their expectations. Before a course begins, the instructor writes brief comments about the types of students for whom the course is intended. Given that audience, the instructor lists the most important course and learning goals and the teaching strategies she or he will design to achieve them. Once the semester has been completed, the instructor prepares a similar brief description of the types of students who actually enrolled, the instructional methods that were used, and how the students’ achievement of major goals was measured. The evaluation should address (1) goals the instructor believes were met and evidence of student learning and academic achievement, (2) goals that were not realized, (3) the nature of and possible reasons for discrepancies between the instructor’s original intentions and actual outcomes, and (4) how the instructor might modify the course in the future to achieve more of the intended goals. These self-assessments can become part of a teaching portfolio that can later be used for more summative types of evaluation.

Another form of before-and-after assessment may help instructors who are interested in examining their teaching behaviors and effectiveness rather than course outcomes. For this technique, instructors use the end-of-course evaluation form, but complete the questionnaire before their course begins (predicting how they think they will do) and again at the end of the semester (how they believe they did). They also may wish to fill out a questionnaire at the end of the term based on what they expect, on average, their students will say about their teaching. In most cases, such self-evaluations are
Global ratings of the course overall or the teacher’s instructional effectiveness also are common to most student questionnaires. For courses in science and engineering, special questions about the efficacy of laboratories, fieldwork, and research experiences also can be included as part of the standardized form or posed in a separate questionnaire. For example, the University of Washington provides separate evaluation forms for laboratories, as well as for clinics and seminars (e.g., University of Washington Office of Educational Assessment7). Appendix B provides more specific information about and several examples of student questionnaires for evaluating undergraduate teaching. See also Davis (1988) for a compilation of questions that can be used on an end-of-course questionnaire.

7 Additional information is available at <http://www.washington.edu/oea/>.

It is important to note that questionnaires usually do not permit students to assess such characteristics as an instructor’s level of knowledge of subject matter. Students cannot and should not evaluate instructors in this regard. Instead, faculty peers and colleagues should assess these characteristics of an instructor’s teaching. Additional detail about the use of student evaluations for summative purposes is provided in Appendix A.

Summative Evaluation by Graduate Teaching Assistants

If a department wishes to involve teaching assistants in performing summative evaluations of faculty or improving a department’s educational offerings and approaches to teaching and learning, both the teaching assistants and faculty must feel confident that the procedures for gathering information will preserve the assistants’ anonymity. Teaching assistants need to know before participating how the information will be used and who will see the data.

When evaluations from teaching assistants are to be used for personnel decisions, the department might consider asking for written assessments. Alternatively, a system might be established whereby teaching assistants would be interviewed informally by a member of the evaluation committee and their comments recorded and submitted collectively. In either case, teaching assistants should be asked to indicate the basis for their assessment. Such information might include the number of courses they have taught with the instructor, descriptions of their training and supervisory activities, the nature and amount of their contact with undergraduate students, whether they
The materials to be considered by the committee could include a variety of teaching-related materials, all of which would be supplied by the candidate: course syllabi and examinations, teaching and learning aids, and evidence of the impact of the candidate’s teaching on students’ learning and intellectual growth. The faculty member also could be asked to submit documentation for the following: currency of course content, participation in the design of courses, contributions to curriculum and instruction, supervision of student research, advising duties, preparation of teaching assistants (if appropriate), and individual and collaborative efforts to improve teaching effectiveness.

Candidates should also prepare and submit a self-assessment of their teaching effectiveness. The self-assessment could address questions such as the following: What are the goals of your teaching? Why were these goals selected? How did you know whether students were gaining competence and learning the material? How well did the courses meet your learning goals for your students, and how do you know? What problems, if any, did you encounter in attempting to meet these goals? How did you conduct the course and challenge and engage students? How did your methods take into account the levels and abilities of students? How satisfied were you with the course? What were the strong and weak points of your teaching? What would you change or do differently the next time you teach the course? What did you find most interesting and most frustrating about the course?

The candidate’s department chair also could provide the committee with student evaluations from courses taught previously, names and addresses of student advisees, dissertation advisees, enrollees in past and current courses, and the candidate’s cumulative teaching portfolio if one has been prepared. The candidate should see the list of materials submitted to the committee and be given the opportunity to supplement it.

Through brief interviews, telephone calls, letters, or brief survey questionnaires issued to the candidate’s current and former students from a variety of courses, the committee could compile a picture of students’ views of the teacher that would supplement the written evaluation reports from past courses. In addition, each committee member could observe and evaluate at least two of the candidate’s classes.

Studies of such ad hoc committees revealed that members met several times to discuss their individual findings, used a rating form, and prepared a report, which was then submitted to a departmental tenure and promotion committee (see Centra, 1993, pp. 129–131 for details). Given the highly
assumed that a faculty member would continue to have access to all materials in his or her portfolio, unless letters solicited or submitted in confidence were protected under rules of the university.

Feedback from Graduating Seniors and Alumni

As part of the department’s regular academic program review, graduating seniors and alumni could be surveyed. Relevant survey information about an individual instructor’s teaching effectiveness would be placed in his or her teaching portfolio. Instructors should be made aware that such information will be included in their portfolios and be allowed to provide written comments or responses, where permissible.

Departmental Panel on Teaching Effectiveness and Expectations

In addition to an ad hoc department committee to monitor candidates’ progress in teaching, as discussed above, the department as a whole could establish a faculty panel that would summarize the department’s policies and procedures regarding expectations for teaching effectiveness, the methods and criteria used to judge that effectiveness, and the role of evaluation in academic personnel decisions. The panel would remind faculty of the resources available to them through the institution for improving their teaching. Members of such a panel might include a former recipient of the campus teaching award, a respected senior faculty member who teaches introductory and lower division courses, and a newly tenured associate professor.

Oversight Committee to Monitor Departmental Curriculum and Instruction

The department chair could establish a permanent faculty committee to monitor the quality and effectiveness of instruction by all members of the department. This committee would also oversee all evaluations of curriculum, teaching, and student learning and, where appropriate, nominate faculty for the campus’s or college’s teaching awards.

Legal Considerations

All stakeholders who are involved with the evaluation of teaching must act in accordance with institutional policies that have been designed to ensure legally equitable and fair treatment of all involved parties. Such policies might require, for example, that:

• The faculty be involved in the design of an evaluation system, as well as in evaluations of colleagues.
• The institution comply with all procedures specified in contracts or
This report thus far has synthesized the findings of research on evaluating effective teaching, and has offered specific recommendations to leaders in the higher education community for changing the climate and culture on their campuses such that the evaluation of teaching will be valued, respected, and incorporated in the fabric of the institution. The report also has emphasized that any system of teaching evaluation should serve as a critical basis for improving student learning.

The previous chapter provides a framework that departments and institutions can apply to evaluate the teaching of individual faculty. It emphasizes the need for ongoing formative evaluation that offers faculty members ample opportunities, resources, and support systems for improving their teaching prior to any summative evaluations that might be rendered by the department or institution. This chapter presents specific criteria that can be used when summative evaluations are undertaken. These criteria are organized according to the five characteristics of effective teaching outlined in Chapter 2. It should be emphasized that the criteria suggested below are based on the committee’s identification of best practices from an examination of the scholarly literature, but they are not exhaustive. Each evaluating department or institution is encouraged to select and, if necessary, modify those criteria from the compendium presented below that best suit its specific circumstances. As emphasized in Chapter 5, those who evaluate faculty teaching should be careful to use multiple—and defensible—sources of evaluation, particularly for summative purposes.
Evaluation of Individual Faculty
TABLE 6-1 Data Sources and Forms of Evaluation for Evaluating Knowledge and Enthusiasm
for Subject Matter
• Are organized and clearly communicate to students their expectations for learning and academic achievement.
• Focus on whether students are learning what is being taught and view the learning process as a joint venture between themselves and their students.
• Give students adequate opportunity to build confidence by practicing skills.
• Ask interesting and challenging questions.
• Encourage discussion and promote active learning strategies.
• Persistently monitor students’ progress toward achieving learning goals through discussions in class, out-of-class assignments, and other forms of assessment.
• Have the ability to recognize those students who are not achieving to their fullest potential and then employ the professional knowledge and skill necessary to assist them in overcoming academic difficulties.

The following questions might be posed for evaluation for this characteristic:

• Does the instructor clearly communicate the goals of the course to students?
• Is the instructor aware of alternative instructional methods or teaching strategies and able to select methods of instruction that are most effective in helping students learn (pedagogical content knowledge)?
• To what extent does the instructor set explicit goals for student learning and persist in monitoring students’ progress toward achieving those goals?
TABLE 6-2 Data Sources and Forms of Evaluation for Evaluating Skill in and Experience with
Appropriate Pedagogies and Technologies
Data sources and forms of evaluation for this characteristic are shown in Table 6-2.

3. UNDERSTANDING OF AND SKILL IN USING APPROPRIATE TESTING PRACTICES

Summarizing the discussion of this characteristic in Chapter 2, effective teachers:

• Assess learning in ways that are consistent with the learning objectives of a course and integrate stated course objectives with long-range curricular goals.
• Know whether students are truly learning what is being taught.
• Determine accurately and fairly students’ knowledge of the subject matter and the extent to which learning has occurred throughout the term (not just at the end of the course).

The following questions might be posed for evaluation for this characteristic:

• Is the instructor aware of a range of tools that can be used to assess student learning?
• Does the instructor select assessment techniques that are valid, reliable, and consistent with the goals and learning outcomes of the course?
TABLE 6-3 Data Sources and Forms of Evaluation for Evaluating Proficiency in Assessment
TABLE 6-4 Data Sources and Forms of Evaluation for Evaluating Professionalism with
Students Within and Beyond the Classroom
assistants? Does the instructor treat his or her assistants with courtesy and as professional colleagues?

Data sources and forms of evaluation for this characteristic are shown in Table 6-4.

5. INVOLVEMENT WITH AND CONTRIBUTIONS TO ONE’S PROFESSION IN ENHANCING TEACHING AND LEARNING

Much can be learned from teachers who work with colleagues both on and beyond the campus. Effective teaching needs to be seen as a scholarly pursuit that takes place in collaboration with departmental colleagues, faculty in other departments in the sciences and engineering, and even more broadly across disciplines. Such conversations enable faculty to better integrate the course materials they present in their courses with what is being taught in other courses.

Summarizing the discussion of this characteristic in Chapter 2, effective teachers:

• Work with colleagues both on and beyond campus, collaborating with departmental colleagues; faculty in
TABLE 6-5 Data Sources and Forms of Evaluation for Evaluating Professional Involvement and
Contributions
Evaluation of Departmental
Undergraduate Programs
The discussion in this report thus far has focused primarily on attributes of effective teaching by individual faculty members. The central theme has been that evidence of high-quality student learning should be the major criterion for measuring a faculty member’s teaching effectiveness. The report has also emphasized the importance of using multiple indicators and different kinds of evaluators (e.g., students, alumni, graduate assistants, colleagues), as well as increasing reliance on ongoing formative evaluation, to provide a more holistic view of an individual’s teaching effectiveness.

The committee believes that similar expectations can and should apply to academic departments and colleges. Departments should regularly evaluate their current undergraduate programs and their commitment to fostering an environment that recognizes and reinforces effective teaching practices and student learning. This position is consistent with the National Science Foundation’s report Shaping the Future (NSF, 1996, pp. 63–64), which recommends that college and university governing boards and academic administrators:

• Accept responsibility for the learning of all students and make that clear not only by what the institution says but also by putting in place mechanisms to discharge that responsibility at the institutional and departmental levels.
• Hold accountable and develop reward systems for departments and programs, not just individuals, so that the entire group feels responsible for effective STEM (science, technology, engineering, and mathematics) learning for all students.
future teachers (see McNeal and D’Avanzo, 1997).

There is growing consensus on the characteristics of effective undergraduate programs in STEM, but too little effort has been expended to date on determining how measures of quality might be made more consonant and consistent with national efforts to improve undergraduate STEM education (e.g., Boyer Commission, 1998; National Research Council [NRC], 1995a, 1996a, 1999a; NSF, 1996, 1998; Rothman and Narum, 1999) or to align such programs more closely with national standards and benchmarks in these disciplines for grades K–12 (American Association for the Advancement of Science [AAAS], 1993; International Technology Education Association [ITEA], 2000; National Council of Teachers of Mathematics [NCTM], 1989, 2000; NRC, 1996b).

Members of academic departments, in conjunction with the principal academic and executive officers on their campuses, need to examine critically the criteria they currently use to evaluate the efficacy of their approaches to undergraduate education. The first step in accomplishing this task is for each department to adopt a mission statement on improving teaching and student learning. Other issues on which departmental members might focus include classroom teaching, academic advising for students, and the roles of teaching laboratories and independent research opportunities in enhancing student learning. Faculty and administrators also need to reach consensus on the underlying assumptions, guidelines, and metrics they will use to improve undergraduate programs.

Many of the issues surrounding the evaluation of teaching for individual faculty also apply to the collective performance of academic departments. The principles set forth in this report for evaluating the teaching effectiveness of individuals can easily be reshaped to apply to academic departments. This chapter lays a foundation for such discussions.

Unlike the rest of the report, this chapter offers no findings or recommendations. Instead, it articulates a series of questions that members of departments might ask themselves and each other as they examine their unit’s role in fostering the improvement of undergraduate education. These questions are organized in accordance with the major responsibilities of departments in the STEM disciplines.
• Does the department encourage faculty members to integrate the curriculum of lower and upper division courses?

Providing academic advising and career planning:

• Does the department view academic and career advising as central to its mission?
• Does the department encourage faculty members to become more effective academic and career advisors and provide the necessary resources and time for the purpose?
• Does the department encourage undergraduate students to undertake real-world work and academic experiences through summer and academic-year internships?
• Does the department bring people to campus for presentations to students about career options and opportunities?

assessing the role and nature of teaching laboratories in the department’s curriculum? For example, is there general agreement on whether laboratory exercises should parallel coursework, provide students with learning experiences not directly related to work in classrooms, or some combination of the two?
• Does the department encourage faculty to develop inquiry-based laboratory exercises that encourage students to develop their own hypotheses, design original experiments, and analyze data?
• Have members of the department discussed the criteria for assessing students’ work in laboratories?
• Is the department familiar with the use of virtual laboratories and the current status of research comparing real and simulated approaches to laboratory teaching and learning?

Encouraging students to engage in independent research:
Recommendations
should be accorded the same administrative and collegial support that is available for other research and service endeavors. Faculty who wish to pursue scholarly work by improving teaching or engaging in educational research should be expected to conform to standards of quality similar to those for other types of scholarship (see Box 5-1 in Chapter 5). They also should be rewarded in ways that are comparable to those associated with other forms of scholarship during personnel decisions on such matters as tenure, promotion, and merit increases in salary.

(1.3) Valid summative assessments of teaching should not rely only on student evaluations, but should include peer reviews and teaching portfolios used for promotion, tenure, and post-tenure review.1 Such assessments should be designed to provide fair and objective information to aid faculty in the improvement of their teaching. Building consensus among faculty, providing necessary resources, and relying on the best available research on teaching, learning, and measurement are critical for this approach to evaluation.

1 Other organizations, such as the American Association for Higher Education (AAHE), are currently engaged in efforts to explore issues associated with post-tenure review of faculty, including the effectiveness of their teaching. Therefore, the committee did not consider this issue in detail and offers no specific recommendations about policies for post-tenure review of faculty. Additional information about the program at AAHE and its recent publications on this issue (e.g., Licata and Morreale, 1997, 2002) is available at <http://www.aahe.org/Bulletin/aprilf1.htm>. Links to numerous other resources and policy statements on post-tenure review at individual colleges and universities are available at <http://www.google.com/search?hl=en&ie=UTF-8&oe=UTF-8&q=post-tenure+review>.

As discussed in Chapters 4 and 5, teaching portfolios, including a careful self-evaluation by the person being evaluated, can be an important tool for documenting a faculty member’s accomplishments in facilitating student learning and academic achievement. Such portfolios can be used for performing summative evaluation, but equally important, for maintaining a record of personal accomplishments and teaching issues that can serve as the basis for ongoing professional development. Regardless of whether formalized teaching portfolios are required for evaluation of teaching, faculty should collect a broad array of evidence of teaching effectiveness that can be used for both formative and summative evaluations. This evidence could include, but not be limited to, the following:

• Covering content at a level appropriate to course goals (particularly for a course in a vertical sequence).
RECOMMENDATIONS 119
4 The National Science Foundation recently initiated a program that addresses this recommendation. Its Centers for Learning and Teaching program is designed to "…provide a rich environment that melds research, teacher professional development, and education practice." Additional information about this initiative is available at <http://www.nsf.gov/cgi-bin/getpub?nsf00148>.
References
Abd-El-Khalick, F., Bell, R.L., and Lederman, N.G. (1998). The nature of science and instructional practice: Making the unnatural natural. Science Education 82(4), 417–437.

Abrami, P.C., Leventhal, L., and Perry, R.P. (1982). Educational seduction. Review of Educational Research 52, 446–464.

Allen, D.E., and Duch, B. (1998). Thinking towards solutions: Problem-based learning activities for general biology. Philadelphia, PA: Saunders College.

Allen, D., Groh, S.E., and Allen, D.E. (Eds.). (2001). The power of problem-based learning: A practical "how-to" for teaching undergraduate courses in any discipline. Sterling, VA: Stylus.

Alverno College Faculty. (1994). Student assessment-as-learning at Alverno College (3rd ed.). Milwaukee, WI: Alverno College.

Ambady, N., and Rosenthal, R. (1993). Half a minute: Predicting teacher evaluations from thin slices of nonverbal behavior and physical attractiveness. Journal of Personality and Social Psychology 64(3), 431–441.

American Association for the Advancement of Science. (1989). Science for all Americans. Washington, DC: Author.

American Association for the Advancement of Science. (1990). The liberal art of science. Washington, DC: Author.

American Association for the Advancement of Science. (1993). Benchmarks for science literacy. New York: Oxford University Press.

American Association for Higher Education. (1993). Making teaching community property: A menu for peer collaboration and peer review. Washington, DC: Author.

American Association for Higher Education. (1995). From idea to prototype: The peer review of teaching—A project workbook. Washington, DC: Author.

American Association for Higher Education. (1996). Teaching, learning and technology (TLT) roundtable workbook. Washington, DC: Author.

American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.

American Psychological Association. (2002). National guidelines and suggested learning outcomes for the undergraduate psychology major (draft). Washington, DC: Author.

Anderson, E. (Ed.). (1993). Campus use of the teaching portfolio: Twenty-five profiles. Washington, DC: American Association for Higher Education.

Anderson, L.W., Krathwohl, D.R., and Bloom, B.S. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. Boston: Allyn and Bacon.

Angelo, T.A. (1995, November). Assessing (and defining) assessment. AAHE Bulletin 48(2).

Angelo, T.A. (1999). The campus as a learning community: Seven promising shifts and seven powerful levers. In B.A. Pescosolido and R. Aminzade (Eds.), The social worlds of higher education: Handbook for teaching in a new century (pp. 110–116). Thousand Oaks, CA: Pine Forge Press.

Angelo, T.A., and Cross, K.P. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco: Jossey-Bass.

Armstrong, L. (2000). Distance learning: An academic leader's perspective on a disruptive product. Change 32(6), 20–27.

Astin, A.W., Banta, T.W., Cross, K.P., El-Khawas, E., Ewell, P.T., Hutchings, P., Marchese, T.J., McClenney, M., Mentkowski, M., Miller, M.A., Moran, E.T., and Wright, B.D. (1996). Assessment forum: 9 principles of good practice for assessing student learning. Washington, DC: American Association for Higher Education. See <http://www.aahe.org/principl.htm>.

Baker, P. (1999). Creating learning communities: The unfinished agenda. In B.A. Pescosolido and R. Aminzade (Eds.), The social worlds of higher education: Handbook for teaching in a new century (pp. 95–109). Thousand Oaks, CA: Pine Forge Press.

Banta, T.W. (2000). That second look. Assessment Update 10(4), 3–14.

Banta, T.W., Lund, J.P., Black, K.E., and Oblander, F.W. (1996). Assessment in practice: Putting principles to work on college campuses. San Francisco: Jossey-Bass.

Barr, R.B., and Tagg, J. (1999). From teaching to learning: A new paradigm for undergraduate education. In B.A. Pescosolido and R. Aminzade (Eds.), The social worlds of higher education: Handbook for teaching in a new century (pp. 565–581). Thousand Oaks, CA: Pine Forge Press.

Basinger, J. (2000, August 7). Carnegie issues broad changes in system of classifying colleges. Chronicle of Higher Education. See <http://chronicle.com/daily/2000/08/2000080701n.htm>.

Baugher, K. (1992). LEARN: The student quality team manual. Nashville, TN: LEARN.

Bernstein, D.J. (1996, Spring). A departmental system for balancing the development and evaluation of college teaching: A commentary on Cavanaugh. Innovative Higher Education 20(4), 241–248.

Bernstein, D.J., and Quinlan, K.M. (Eds.). (1996, Spring). The peer review of teaching [Special issue]. Innovative Higher Education 20(4).

Bleak, J., Neiman, H., Sternman, C., and Trower, C. (2000). Faculty recruitment study: Statistical analysis report, August 2000. Cambridge, MA: Harvard Graduate School of Education.

Bloom, B.S. (Ed.). (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York: Longmans, Green.

Boice, R. (1992). The new faculty member. San Francisco: Jossey-Bass.

Borgman, C.L., Bates, M.J., Cloonan, M.V., Efthimiadis, E.N., Gilliland-Swetland, A.J., Kafai, Y.B., Leazer, G.H., and Maddox, A.B. (1996). Social aspects of digital libraries. In Proceedings of the University of California Los Angeles–National Science Foundation Social Aspects of Digital Libraries Workshop, February 15–17, 1996. See <http://is.geis.ucla.edu/research/dl/index.html>.

Boyer, E.L. (1990). Scholarship reconsidered: Priorities of the professoriate. Princeton, NJ: Carnegie Foundation for the Advancement of Teaching.

Boyer Commission on Educating Undergraduates in the Research University. (1998). Reinventing undergraduate education: A blueprint for America's research universities. Menlo Park, CA: Carnegie Foundation for the Advancement of Teaching. See <http://notes.cc.sunysb.edu/Pres/boyer.nsf>.

Brand, M. (2000). Changing faculty roles in research universities: Using the pathways strategy. Change 32(6), 42–45.

Braskamp, L., and Ory, J. (1994). Assessing faculty work: Enhancing individual and institutional performance. San Francisco: Jossey-Bass.

Brinko, K.T. (1993). The practice of giving feedback to improve teaching: What is effective? Journal of Higher Education 64(5), 574–593.

Brookfield, S.D. (1995). Becoming a critically reflective teacher. San Francisco: Jossey-Bass.

Buck, G.A., Hehn, J.G., and Leslie-Pelecky, D.L. (Eds.). (2000). The role of physics departments in preparing K–12 teachers. College Park, MD: American Institute of Physics.
Cambridge, B.L. (1996, Spring). The paradigm shifts: Examining quality of teaching through assessment of student learning. Innovative Higher Education 20(4), 287–298.

Cambridge, B.L. (Ed.). (1997). Assessing impact: Evidence and action. Washington, DC: American Association for Higher Education.

Cambridge, B.L. (1999, December). The scholarship of teaching and learning: Questions and answers from the field. AAHE Bulletin. See <http://www.aahe.org/Bulletin/dec99f2.htm>.

Cambridge, B.L. (Ed.). (2001). Electronic portfolios: Emerging practices in student, faculty, and institutional learning. Washington, DC: American Association for Higher Education.

Capelli, P. (Ed.). (1997). Change at work: Trends that are transforming the business of business. Washington, DC: National Policy Association.

Carnegie Foundation for the Advancement of Teaching and Pew Forum on Undergraduate Learning. (2002). National survey of student engagement: The college student report. Menlo Park, CA: Author. See <http://www.indiana.edu/~nsse/l>.

Cashin, W.E. (1990). Students do rate different academic fields differently. In M. Theall and J. Franklin (Eds.), Student ratings of instruction: Issues for improving practice. San Francisco: Jossey-Bass.

Cashin, W.E., and Downey, R.G. (1999). Using global student rating items for summative evaluation: Convergence with an overall instructor evaluation criterion. Paper presented at the American Educational Research Association, April, Washington, DC.

Centra, J.A. (1973). Item reliabilities, the factor structure, comparison with alumni ratings. (Student Instructional Report No. 3). Princeton, NJ: Educational Testing Service.

Centra, J.A. (1974). The relationship between student and alumni ratings of teachers. Educational and Psychological Measurement 34(2), 321–326.

Centra, J.A. (1975). Colleagues as raters of classroom instruction. Journal of Higher Education 46, 327–337.

Centra, J.A. (1979). Determining faculty effectiveness. San Francisco: Jossey-Bass.

Centra, J.A. (1987). Faculty evaluation: Past practices, future directions. Manhattan, KS: Kansas State University, Center for Faculty Evaluation and Development.

Centra, J.A. (1993). Reflective faculty evaluation: Enhancing teaching and determining faculty effectiveness. San Francisco: Jossey-Bass.

Centra, J.A. (1994). The use of the teaching portfolio and student evaluations for summative evaluation. Journal of Higher Education 65(5), 555–570.

Centra, J.A. (1998). The development of the student instructional report II, Higher Education Assessment Program. Princeton, NJ: Educational Testing Service.

Centra, J.A. (2001). A model for assessing the scholarship of teaching. Presentation at the 9th Annual American Association for Higher Education Conference on Faculty Roles and Rewards, February, Tampa, FL.

Centra, J.A., and Creech, F.R. (1976). The relationship between student, teacher, and course characteristics and student ratings of teacher effectiveness. (Report No. PR-76-1). Princeton, NJ: Educational Testing Service.

Centra, J.A., and Gaubatz, N.B. (2000a). Is there gender bias in student evaluations of teaching? Journal of Higher Education 70(1), 17–33.

Centra, J.A., and Gaubatz, N.B. (2000b). Student perceptions of learning and instructional effectiveness in college courses, Higher Education Assessment Program. (Report No. 9). Princeton, NJ: Educational Testing Service.

Chandler, S. (1991). Issues in evaluating multi-institutional models. Paper presented at the American Educational Research Association Symposium, Chicago.

Chickering, A.W., and Gamson, Z.F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin 39, 3–7.

Chism, N.V.N. (1999). Peer review of teaching: A sourcebook. Boston: Anker.

Clark, J., and Redmond, M. (1982). Small group instructional diagnosis. (ERIC Ed. 217954). Seattle: University of Washington, Department of Biology Education.

Cochran, K.F. (1997). Pedagogical content knowledge: Teachers' integration of subject matter, pedagogy, students, and learning environments. Research Matters—to the Science Teacher, #9702.

Cohen, P.A. (1981). Student ratings of instruction and student achievement: A meta-analysis of multisection validity studies. Review of Educational Research 51, 281–309.
Collis, B., and Moonen, J. (2001). Flexible learning in a digital world: Experiences and expectations. London, UK: Kogan Page.

Cooper, J., and Robinson, P. (1998). Small-group instruction: An annotated bibliography of science, mathematics, engineering and technology resources in higher education. (Occasional Paper #6). Madison: University of Wisconsin–Madison, National Institute for Science Education.

Coppola, B.P., and Smith, D.H. (1996). A case for ethics. Journal of Chemical Education 73, 33–34.

Coppola, B.P., Ege, S.N., and Lawton, R.G. (1997). The University of Michigan undergraduate chemistry curriculum 2—Instructional strategies and assessment. Journal of Chemical Education 74, 84–94.

Coppola, B.P., and Jacobs, D. (2002). Is the scholarship of teaching and learning new to chemistry? In M.T. Huber and S. Morreale (Eds.), Disciplinary styles in the scholarship of teaching and learning: Exploring common ground. Washington, DC: American Association for Higher Education and the Carnegie Foundation for the Advancement of Teaching.

Cornell University. (1993). Teaching evaluation handbook. Ithaca, NY: Cornell University, Office of Instructional Support.

Cross, K.P., and Steadman, M.H. (1996). Classroom research: Implementing the scholarship of teaching. San Francisco: Jossey-Bass.

Daffinrud, S.M., and Herrera, O.L. (2000). Evaluation methods and findings: The Field-Tested Learning Assessment Guide (FLAG) final report. Madison, WI: University of Wisconsin, LEAD Center.

Darling-Hammond, L. (Ed.). (1997). Doing what matters most: Investing in quality teaching. New York: National Commission on Teaching and America's Future.

Davis, B.G. (1988). Sourcebook for evaluating teaching. Berkeley: University of California at Berkeley, Office of Educational Development.

Davis, B.G. (1993). Tools for teaching. San Francisco: Jossey-Bass.

Diamond, R.M., and Adams, B.E. (Eds.). (1995). The disciplines speak: Rewarding the scholarly, professional, and creative work of faculty. Washington, DC: American Association for Higher Education.

Diamond, R.M., and Adams, B.E. (Eds.). (2000). The disciplines speak II: More statements on rewarding the scholarly, professional, and creative work of faculty. Washington, DC: American Association for Higher Education.

Diamond, R.M., and Gray, P.J. (1998). 1997 National Study of Teaching Assistants. Syracuse, NY: Syracuse University Center for Instructional Development.

Doherty, A., Riordan, T., and Roth, J. (Eds.). (2002). Student learning: A central focus for institutions of higher education. (A report and collection of institutional practices of the Student Learning Initiative.) Milwaukee, WI: Alverno College Institute.

Dougherty, A. (1999). Just-in-time teaching: Using Web feedback to guide learning and teaching. See <http://ww2.lafayette.edu/~doughera/talks/ets9912/>.

Doyle, M.P. (Ed.). (2000). Academic excellence: The role of research in the physical sciences at undergraduate institutions. Tucson, AZ: Research Corporation.

Drucker, A.J., and Remmers, H.H. (1951). Do alumni and students differ in their attitudes toward instructors? Journal of Educational Psychology 42(3), 129–143.

Ebert-May, D., Brewer, C.A., and Allred, S. (1997). Innovation in large lectures: Teaching for active learning through inquiry. BioScience 47(9), 601–607.

Edgerton, R., Hutchings, P., and Quinlan, K. (1991). The teaching portfolio: Capturing the scholarship in teaching. Washington, DC: American Association for Higher Education.

Ehrmann, S.C. (2000, November/December). On the necessity of grassroots evaluation of educational technology: Recommendations for higher education. See <http://horizon.unc.edu/TS/assessment/2000-11.asp>.

Ehrmann, S.C. (In press). Technology and educational revolution: Ending the cycle of failure. Liberal Education. See draft at <http://www.tltgroup.org/resources/V_Cycle_of_Failure.html>.

Emerson, J.D., Mosteller, F., and Youtz, C. (2000). Students can help improve college teaching: A review and an agenda for the statistics profession. In C.R. Rao and G. Szekely (Eds.), Statistics for the 21st century. New York: Marcel Dekker.
Emert, J.W., and Parish, C.R. (1996). Assessing concept attainment in undergraduate core courses in mathematics. In T.W. Banta, J.P. Lund, K.E. Black, and F.W. Oblander (Eds.), Assessment in practice. San Francisco: Jossey-Bass.

Ewell, P.T. (1991). To capture the ineffable: New forms of assessment in higher education. Review of Research in Education 17, 75–125.

Feldman, K.A. (1978). Course characteristics and college students' ratings of their teachers and courses: What we know and what we don't. Research in Higher Education 9, 199–242.

Feldman, K.A. (1984). Class size and college students' evaluations of teachers and courses: A closer look. Research in Higher Education 21(1), 45–116.

Feldman, K.A. (1989). Instructional effectiveness of college teachers as judged by teachers themselves, current and former students, colleagues, administrators and external (neutral) observers. Research in Higher Education 30, 137–189.

Feldman, K.A. (1993). College students' views of male and female college teachers: Part II—Evidence from students' evaluations of their classroom teachers. Research in Higher Education 34(2), 151–211.

Fisch, L. (1998). AD REM: Spotters. The national teaching and learning forum. Phoenix, AZ: Oryx Press.

Fosnot, C. (Ed.). (1996). Constructivism: Theory, perspectives, and practice. New York: Teachers College Press.

Freedman, R.L.H. (1994). Open-ended questioning. New York: Addison Wesley.

French-Lazovik, G. (1981). Documentary evidence in the evaluation of teaching. In J. Millman (Ed.), Handbook of teacher evaluation (pp. 73–89). Newbury Park, CA: Sage.

Gabel, D.L. (Ed.). (1994). Handbook of research on science teaching and learning. New York: Macmillan.

Gabelnick, F., MacGregor, J., Matthews, R., and Smith, B. (1990). Learning communities: Creating connections among students, faculty, and disciplines. San Francisco: Jossey-Bass.

Gaff, J.G., Pruitt-Logan, A.S., and Weibl, R.A. (2000). Building the faculty we need: Colleges and universities working together. Washington, DC: Association of American Colleges and Universities.

Gardiner, L., Anderson, C., and Cambridge, B. (Eds.). (1997). Learning through assessment: A resource guide for higher education. Washington, DC: American Association for Higher Education.

Gavin, R. (2000). The role of research at undergraduate institutions: Why is it necessary to defend it? In M.P. Doyle (Ed.), Academic excellence: The role of research in the physical sciences at undergraduate institutions (pp. 9–17). Tucson, AZ: Research Corporation.

Gilmore, G.M., Kane, M.T., and Naccarato, R.W. (1978). The generalizability of student ratings of instruction: Estimation of teacher and course components. Journal of Educational Measurement 15(1), 1–13.

Glassick, C.E., Huber, M.T., and Maeroff, G.I. (1997). Scholarship assessed: Evaluation of the professoriate. San Francisco: Jossey-Bass.

Goheen, R.F. (1969). The human nature of a university. Princeton, NJ: Princeton University Press.

Golde, C.M., and Dore, T.M. (2001). At cross purposes: What the experiences of today's doctoral students reveal about doctoral education. Philadelphia, PA: Pew Charitable Trusts.

Gray, J., Diamond, R.M., and Adam, B.E. (1996). A national study on the relative importance of research and undergraduate teaching at colleges and universities. Syracuse, NY: Syracuse University Center for Instructional Development.

Greene, E. (2000, June 23). Some colleges pay students to go to class—to evaluate teaching. The Chronicle of Higher Education, A18.

Greenspan, A. (2000). The economic importance of improving math-science education. Testimony to the U.S. House of Representatives Committee on Education and the Workforce, Washington, DC. See <http://www.federalreserve.gov/boarddocs/testimony/2000/20000921.htm>.

Grouws, D.A. (1992). Handbook of research on mathematics teaching and learning. New York: Macmillan.

Herron, J.D. (1996). The chemistry classroom: Formulas for successful teaching. Washington, DC: American Chemical Society.

Hestenes, D. (1987). Toward a modeling theory of physics instruction. American Journal of Physics 55, 440–454.
Hestenes, D., and Halloun, I. (1995). Interpreting the force concept inventory. Physics Teacher 33(8), 502–506.

Heterick, R., and Twigg, C. (1999). Lectures are not cheap! See <http://www.center.rpi.edu/LForum/LM/Sept99.html>.

Huber, M.T. (1999). Disciplinary styles in the scholarship of teaching: Reflections on the Carnegie Academy for the Scholarship of Teaching and Learning. Paper presented at the 7th International Improving Student Learning Symposium, Improving Student Learning Through the Disciplines, Menlo Park, CA.

Huber, M.T. (2001, July/August). Balancing acts: Designing careers around the scholarship of teaching. Change, 21–29.

Huber, M.T., and Morreale, S. (Eds.). (2002). Disciplinary styles in the scholarship of teaching and learning: Exploring common ground. Washington, DC: American Association for Higher Education and the Carnegie Foundation for the Advancement of Teaching.

Hutchings, P. (Ed.). (1995). From idea to prototype: The peer review of teaching, a project workbook. Washington, DC: American Association for Higher Education.

Hutchings, P. (Ed.). (1996). Making teaching community property: A menu for peer collaboration and peer review. Washington, DC: American Association for Higher Education.

Hutchings, P. (Ed.). (1998). The course portfolio: How faculty can examine their teaching to advance practice and improve student learning. Washington, DC: American Association for Higher Education.

Hutchings, P. (Ed.). (2000). Opening lines: Approaches to the scholarship of teaching and learning. Menlo Park, CA: Carnegie Foundation for the Advancement of Teaching.

Indiana University. (2000). The college student experiences questionnaire. See <http://www.indiana.edu/~cseq/cseq_content.htm>.

International Technology Education Association. (2000). Standards for technological literacy: Content for the study of technology. Reston, VA: Author. See <http://www.iteawww.org/TAA/STLstds.htm>.

Ireton, M.F.W., Manduca, C.A., and Mogk, D.W. (1996). Shaping the future of undergraduate earth science education: Innovation and change using an earth system approach. Washington, DC: American Geophysical Union.

Johnson, D.W., Johnson, R.T., and Smith, K. (1998). Active learning: Cooperation in the college classroom. Edina, MN: Interaction Books.

Joint Policy Board for Mathematics. (1994). Recognitions and rewards in the mathematical sciences. Washington, DC: American Mathematical Society.

Katz, J., and Henry, M. (1988). Turning professors into teachers: A new approach to faculty development and student learning. New York: Macmillan.

Keig, L., and Waggoner, M.D. (1994). Collaborative peer review: The role of faculty in improving college teaching. (ASHE-ERIC Higher Education Report #2). Washington, DC: George Washington University.

Kellogg Commission on the Future of State and Land-Grant Universities. (1997). Returning to our roots: The student experience. Washington, DC: National Association of State Universities and Land-Grant Colleges.

Kennedy, D. (1997). Academic duty. Cambridge, MA: Harvard University Press.

King, P.M., and Kitchener, K.S. (1994). Developing reflective judgment: Understanding and promoting intellectual growth and critical thinking in adolescents and adults. San Francisco: Jossey-Bass.

Koon, J., and Murray, H.G. (1995). Using multiple outcomes to validate student ratings of overall teacher effectiveness. Journal of Higher Education 66(1), 61–81.

Kremer, J. (1990). Construct validity of multiple measures in teaching, research, and service and reliability of peer ratings. Journal of Educational Psychology 82, 213–218.

Lambert, L.M., and Tice, S.L. (Eds.). (1992). Preparing graduate students to teach: A guide to programs that improve undergraduate education and develop tomorrow's faculty. Washington, DC: American Association for Higher Education.

Landis, C.R., Ellis, A.B., Lisensky, G.C., Lorenz, J.K., Meekder, K., and Wamser, C.C. (Eds.). (2001). Chemistry concept tests: A pathway to interactive classrooms. Upper Saddle River, NJ: Prentice Hall.

Lederman, N.G., and O'Malley, M. (1990). Students' perceptions of tentativeness in science: Development, use, and sources of change. Science Education 74, 225–239.
ing: A human constructivist view. San Diego, CA: Academic Press.

Mullin, R. (2001). The undergraduate revolution: Change the system or give incrementalism another 30 years? Change 33(5), 54–58.

Murnane, R.J., and Levy, F. (1996). Teaching the new basic skills: Principles for educating children to thrive in a changing economy. New York: Free Press.

Murray, H.G. (1983). Low inference classroom teaching behaviors and student ratings of college teaching effectiveness. Journal of Educational Psychology 71, 856–865.

Murray, H.G., Rushton, P.J., and Paunonen, S.V. (1990). Teacher personality traits and student instructional ratings in six types of university courses. Journal of Educational Psychology 82, 250–261.

Murray, H.G., Gillese, E., Lennon, M., Mercer, P., and Robinson, M. (1996). Ethical principles in university teaching. Vancouver, BC: The Society for Teaching and Learning in Higher Education. See <http://www.umanitoba.ca/academic_support/uts/stlhe/Ethical.html>.

Naftulin, D.H., Ware, J.E., and Donnelly, F.A. (1973). The Doctor Fox lecture: A paradigm of educational seduction. Journal of Medical Education 48, 630–635.

Narum, J. (1995). Structures for science: A handbook on planning facilities for undergraduate natural science communities (Vol. III). Washington, DC: Project Kaleidoscope.

National Academy of Sciences. (1997). Preparing for the 21st century: The education imperative. Washington, DC: National Academy Press. See <http://www.nap.edu/catalog/9537.html>.

National Center for Education Statistics. (1999). Teacher quality: A report on the preparation and qualification of public school teachers. Washington, DC: U.S. Department of Education.

National Center for Public Policy and Higher Education. (2001). Measuring up 2000: The state-by-state report card for higher education. San Jose, CA: Author. See <http://www.highereducation.org>.

National Council of Teachers of Mathematics. (1989). Curriculum and evaluation standards for school mathematics. Reston, VA: Author.

National Council of Teachers of Mathematics. (2000). Principles and standards for school mathematics. Reston, VA: Author.

National Institute for Science Education. (2001a). Learning through technology. Madison, WI: Author. See <http://www.wcer.wisc.edu/nise/cl1/ilt/>.

National Institute for Science Education. (2001b). Field-tested learning assessment guide. Madison, WI: Author. See <http://www.wcer.wisc.edu/nise/cl1/flag/>.

National Institute for Science Education. (2001c). Cooperative learning. See <http://www.wcer.wisc.edu/nise/CL1/CL/>.

National Research Council. (1991). Moving beyond myths: Revitalizing undergraduate mathematics. Washington, DC: National Academy Press. See <http://books.nap.edu/catalog/1782.html>.

National Research Council. (1995a). Engineering education: Designing an adaptive system. Washington, DC: National Academy Press. See <http://books.nap.edu/catalog/4907.html>.

National Research Council. (1995b). Reshaping the graduate education of scientists and engineers. Washington, DC: National Academy Press. See <http://books.nap.edu/catalog/4935.html>.

National Research Council. (1996a). National science education standards. Washington, DC: National Academy Press. See <http://books.nap.edu/catalog/4962.html>.

National Research Council. (1996b). From analysis to action. Washington, DC: National Academy Press. See <http://books.nap.edu/catalog/9128.html>.

National Research Council. (1996c). The role of scientists in the professional development of science teachers. Washington, DC: National Academy Press. See <http://books.nap.edu/catalog/2310.html>.

National Research Council. (1997a). Science teaching reconsidered: A handbook. Washington, DC: National Academy Press. See <http://books.nap.edu/catalog/5287.html>.

National Research Council. (1997b). Adviser, teacher, role model, friend: On being a mentor to students in science and engineering. Washington, DC: National Research Council. See <http://books.nap.edu/catalog/5789.html>.
National Research Council. (1998a). High stakes: Testing for tracking, promotion, and graduation. Washington, DC: National Academy Press. See <http://books.nap.edu/catalog/6336.html>.

National Research Council. (1998b). Developing a digital national library for undergraduate science, mathematics, engineering, and technology education: Report of a workshop. Washington, DC: National Academy Press. See <http://books.nap.edu/catalog/5952.html>.

National Research Council. (1999a). Transforming undergraduate education in science, mathematics, engineering, and technology. Washington, DC: National Academy Press. See <http://www.nap.edu/catalog/6453.html>.

National Research Council. (1999b). Improving student learning: A strategic plan for education research and its utilization. Washington, DC: National Academy Press. See <http://books.nap.edu/catalog/6488.html>.

National Research Council. (1999c). Global perspectives for local action: Using TIMSS to improve U.S. mathematics and science education. Washington, DC: National Academy Press. See <http://books.nap.edu/catalog/9723.html>.

National Research Council. (1999d). Myths and tradeoffs: The role of tests in undergraduate admissions. Washington, DC: National Academy Press. See <http://books.nap.edu/catalog/9632.html>.

National Research Council. (2000a). Building a workforce for the information economy. Washington, DC: National Academy Press. See <http://books.nap.edu/catalog/9830.html>.

National Research Council. (2000b). Educating teachers of science, mathematics, and technology: New practices for the new millennium. Washington, DC: National Academy Press. See <http://books.nap.edu/catalog/9832.html>.

National Research Council. (2000c). How people learn: Brain, mind, experience, and school: Expanded edition. Washington, DC: National Academy Press. See <http://www.nap.edu/catalog/9853.html>.

National Research Council. (2000d). Inquiry and the national science education standards: A guide for teaching and learning. Washington, DC: National Academy Press. See <http://www.nap.edu/catalog/9596.html>.

National Research Council. (2000e). LC21: A digital strategy for the Library of Congress. Washington, DC: National Academy Press. See <http://books.nap.edu/catalog/9940.html>.

National Research Council. (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: National Academy Press. See <http://www.nap.edu/catalog/10019.html>.

National Research Council. (2002a). Learning and understanding: Improving advanced study of science and mathematics in U.S. high schools. Washington, DC: National Academy Press. See <http://www.nap.edu/catalog/10129.html>.

National Research Council. (2002b). Scientific research in education. Washington, DC: National Academy Press. See <http://www.nap.edu/catalog/10236.html>.

National Science Board. (2000). Science and engineering indicators—2000. Arlington, VA: Author. See <http://www.nsf.gov/search97cgi/vtopic>.

National Science Foundation. (1996). Shaping the future: New expectations for undergraduate education in science, mathematics, engineering, and technology. (NSF 96–139). Arlington, VA: Author. See <http://www.nsf.gov/cgi-bin/getpub?nsf96139>.

National Science Foundation. (1998). Information technology: Its impact on undergraduate education in science, mathematics, engineering, and technology. (NSF 98–82). Arlington, VA: Author. See <http://www.nsf.gov/cgi-bin/getpub?nsf9882>.

Neff, R.A., and Weimer, M. (Eds.). (1990). Teaching college: Collected readings for new instructors. Madison, WI: Magna.

Novak, J. (1998). Learning, creating, and using knowledge: Concept maps as facilitative tools in schools and corporations. Mahwah, NJ: Lawrence Erlbaum.

Nyquist, J.D., Abbott, R.D., Wulff, D.H., and Sprague, J. (Eds.). (1991). Preparing the professoriate of tomorrow to teach: Selected readings in TA training. Dubuque, IA: Kendall/Hunt.
136 EVALUATING AND IMPROVING UNDERGRADUATE TEACHING
Ory, J.C. (2000). Teaching evaluation: Past, present, and future. In K.E. Ryan (Ed.), Evaluating teaching in higher education: A vision for the future: New directions in teaching and learning (pp. 13–18). San Francisco, CA: Jossey-Bass.
Osterlind, S.J. (1989). Constructing test items. Boston: Kluwer Academic.
Overall, J.U., and Marsh, H.W. (1980). Students' evaluations of instruction: A longitudinal study of their stability. Journal of Educational Psychology 72, 321–325.
Palomba, C.A., and Banta, T.W. (1999). Assessment essentials, planning, implementing, and improving assessment in higher education. San Francisco: Jossey-Bass.
Perry, R.P., and Smart, J.C. (Eds.). (1997). Effective teaching in higher education: Research and practice. New York: Agathon Press.
Pescosolido, B.A., and Aminzade, R. (Eds.). (1999). The social worlds of higher education: Handbook for teaching in a new century. Thousand Oaks, CA: Pine Forge Press.
Pike, G.R. (1995). The relationship between self-reports of college experiences and test scores. Journal of Research in Higher Education 36(1), 1–21.
Project Kaleidoscope. (1991). What works, building natural science communities: A plan for strengthening undergraduate science and mathematics (Vol. 1). Washington, DC: Author.
Project Kaleidoscope. (1994). What works, leadership: Challenges for the future (Vol. II). Washington, DC: Author.
Project Kaleidoscope. (1998). Shaping the future of undergraduate science, mathematics, engineering and technology education: Proceedings and recommendations from the PKAL day of dialogue. Washington, DC: Author. See <http://www.pkal.org/template2.cfm?2c_id=301>.
Reis, R.M. (1997). Tomorrow's professor: Preparing for academic careers in science and engineering. Piscataway, NJ: IEEE Press.
Remmers, H.H. (1934). Reliability and halo effect on high school and college students' judgments of their teachers. Journal of Applied Psychology 18, 619–630.
Rice, R.E., Sorcinelli, M.D., and Austin, A.E. (2000). Heeding new voices: Academic careers for a new generation. Washington, DC: American Association for Higher Education.
Root, L.S. (1987). Faculty evaluation: Reliability of peer assessments of research, teaching, and service. Research in Higher Education 26, 71–84.
Rosenthal, R. (1976). Experimenter effects in behavioral research. New York: Appleton-Century-Croft.
Rothman, F.G., and Narum, J.L. (1999). Then, now, and in the next decade: A commentary on strengthening undergraduate science, mathematics, engineering and technology education. Washington, DC: Project Kaleidoscope. See <http://www.pkal.org/news/thennow100.html>.
Rust, E. (1998). Business cares about math and science achievement. In Business Coalition for Education Reform, The formula for success: A business leader's guide to supporting math and science achievement (pp. 11–14). Washington, DC: National Alliance for Business.
Sanderson, A., Phua, V.C., and Herda, D. (2000). The American faculty poll. Chicago: National Opinion Research Center.
Sands, R.G., Parson, L.A., and Duane, J. (1991). Faculty mentoring faculty in a public university. Journal of Higher Education 62, 174–193.
Schwartz, C. (1983). ABC's of teaching with excellence: A Berkeley compendium for teaching with excellence. See <http://teaching.berkeley.edu/compendium/>.
Scriven, M. (1981). Summative teacher evaluation. In J. Millman (Ed.), Handbook of teacher evaluation. Beverly Hills, CA: Sage.
Scriven, M. (1993). Hard-won lessons in program evaluation. New Directions for Program Evaluation 58, summer.
Seldin, P. (1991). The teaching portfolio. Boston: Anker.
Seldin, P. (1998, March). How colleges evaluate teaching: 1988 vs. 1998. AAHE Bulletin 3–7.
Seymour, E. (In press). Tracking the processes of change in U.S. undergraduate education in science, mathematics, engineering, and technology. Science Education.
Seymour, E., and Hewitt, N.M. (1997). Talking about leaving: Why undergraduates leave the sciences. Boulder, CO: Westview Press.
Shapiro, N.S., and Levine, J.H. (1999). Creating learning communities: A practical guide to winning support, organizing for change, and implementing programs. San Francisco, CA: Jossey-Bass.
Shipman, H.L. (2001). Hands-on science, 680 hands at a time. Journal of College Science Teaching 30(5), 318–321.
Shore, B.M., et al. (1986). The teaching dossier: A guide to its preparation and use. Montreal: Canadian Association of University Teachers.
Shulman, L. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher 15, 4–14.
Shulman, L.S. (1993). Teaching as community property: Putting an end to pedagogical solitude. Change 25(6), 6–7.
Shulman, L. (1995). Faculty hiring: The pedagogical colloquium—three models. AAHE Bulletin 47(9), 6–9.
Siebert, E.D., and McIntosh, W.J. (Eds.). (2001). College pathways to the science education standards. Arlington, VA: National Science Teachers Association.
Sorcinelli, M.D. (1999). The evaluation of teaching: The 40-year debate about student, colleague, and self-evaluations. In B.A. Pescosolido and R. Aminzade (Eds.), The social worlds of higher education: Handbook for teaching in a new century (pp. 195–205). Thousand Oaks, CA: Pine Forge Press.
Sorcinelli, M.D. (2000). Principles of good practice: Supporting early career faculty. Guidance for deans, department chairs, and other academic leaders. New Pathways II Project Forum on Faculty Roles and Rewards. Washington, DC: American Association for Higher Education. See <http://www.aahe.org/ffrr/principles_brochure.htm>.
Springer, L., Stanne, M.E., and Donovan, S.S. (1998). Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: A meta-analysis (Research Monograph No. 11). Madison: University of Wisconsin-Madison, National Institute for Science Education.
Suskie, L. (Ed.). (2000). Assessment to promote deep learning. Washington, DC: American Association for Higher Education. See <http://www.aahe.org/catalog/iteminfo.cfm?itemid=1&itemid=127&g=t>.
Suter, L., and Frechtling, J. (2000). Guiding principles for mathematics and science education research methods: Report of a workshop. (NSF 00-113). Arlington, VA: National Science Foundation. See <http://nsf.gov/cgi-bin/getpub?nsf00113>.
Svinicki, M., and Menges, R. (Eds.). (1996). Honoring exemplary teaching. New directions for teaching and learning. (No. 65). San Francisco: Jossey-Bass.
Uno, G.E. (1997). Handbook on teaching undergraduate science classes: A survival manual. Norman, OK: University of Oklahoma Press.
Walvoord, B.F., and Anderson, V.J. (1998). Effective grading: A tool for learning and assessment. San Francisco: Jossey-Bass.
Wergin, J. (1994). The collaborative department: How five campuses are inching toward cultures of collective responsibility. Washington, DC: American Association for Higher Education.
Wergin, J., and Swingen, J.N. (2000). Departmental assessment: How some campuses are effectively evaluating the collective work of faculty. Washington, DC: American Association for Higher Education.
Wiggins, G. (1998). Educative assessment: Designing assessments to inform and improve student performance. San Francisco: Jossey-Bass.
Wright, J.C., Millar, S.B., Kosciuk, S.A., Penberthy, D.L., Williams, P.H., and Wampold, B.E. (1998). A novel strategy for assessing the effects of curriculum reform on student competence. Journal of Chemical Education 75, 986–992.
Wyckoff, S. (2001). Changing the culture of undergraduate science teaching. Journal of College Science Teaching 30(5), 306–312.
Zimpher, N. (1998). Ten changing demands on college teachers in the future. Presented at Changing Demands on College Teachers: A Conference for Teaching Support Providers, April 27, Columbus, Ohio. See <http://www.acs.ohio-state.edu/education/ftad/Publications/ten-nancy.html>.
Appendix A
Selected Student Evaluation Instruments
Another useful approach is for the instructor to evaluate student learning throughout the term. Instructors can use the information obtained from these regular assessments of student learning to improve their teaching and make midcourse corrections in the approaches they are using. Faculty members can thus conduct their own classroom research, gathering measures of student learning to improve their teaching (Brookfield, 1995; National Institute for Science Education, 2001b). An instructor's use of such approaches, the range of test instruments employed (e.g., short-answer and essay questions, computer simulations, and laboratory-based problems, in addition to multiple-choice and similar kinds of questions), and the ways in which the instructor responds to indicators of student learning can be useful measures of teaching effectiveness.

Instructors also can benefit from knowing whether students who have taken their courses have mastered concepts and skills that will be needed for subsequent, higher level courses. Thus, questions about specific concepts the students will have been expected to learn can be included in pre/post-testing. Alternatively, as part of their evaluation of program effectiveness, academic departments can develop assessment instruments that can be used to examine whether students have learned well the knowledge and skills they need to move through a vertically structured departmental curriculum.

GUIDELINES FOR THE USE OF STUDENT EVALUATIONS

Having examined the research literature and practices in several different types of institutions of higher education, the committee offers here guidelines for the use of student evaluations, particularly in making decisions about a faculty member's professional life. Centra (1993: especially 89–93) offers a detailed discussion of the issues involved; the suggestions offered below are based in part on that analysis.

Make clear to faculty and students how results of student evaluations will be used. Faculty members, administrators, and students need to understand both how the results will be used and who will have access to them.

Use student evaluation as only one piece of relevant information from several sources. Because student evaluations represent student views only, other sources of information (colleagues, self-reports, evidence of student learning) must be considered. Student evaluations are relatively easy to obtain, but that should not result in giving them undue weight. Note that when multiple sources of evaluation data are used,
consensus must be reached on how each source will be weighted when making decisions about teaching effectiveness.

Use several sets of evaluation results. For personnel decisions, a pattern of evaluation results derived from different courses taught over more than one semester should be used. Using results from five or more classes is generally best. Also, the results of student evaluations should be compared with a historical record for that class or type of class, if such data are available.

Have a sufficient number of students evaluate each course. Averaging responses from a sufficient number of students minimizes the effects of a few divergent opinions. Reliability estimates (see Chapter 4 for a definition of reliability as used in psychometrics) are excellent for classes of 25 students or more. In classes with fewer students, it is critical to examine patterns of student responses across a number of classes. Reliability estimates for classes of 15 or more are at an acceptable level. For very large classes, a representative or random sample of students totaling 25 or more can be selected to complete the form. An effort should be made to encourage at least 60 percent of enrolled students to participate in the evaluation, and at least 15–25 questionnaires are needed for results to be considered reliable. If the class has fewer than 10 students, it is best not to summarize the data. For sufficiently large sample sizes, means and standard deviations are used most frequently to summarize data.

Consider some course characteristics in interpretations. While any single course variable may not have a great effect, a combination (e.g., small classes, course subject area) could affect a teacher's mean rating.

Use comparative data. Comparisons among instructors within an institution or, better yet, across a large number of similar institutions can help in interpreting results by minimizing the effects of any skewed distributions.

Do not overestimate small differences. Because student evaluations typically are quantified, there may be a tendency to assign them a precision they do not possess or warrant. A 10-percentile difference between instructors generally does not represent a practical distinction.

For personnel decisions, emphasize global evaluations and estimates of learning. Overall ratings of instruction or of a course tend to correlate highly with measured student achievement—more highly than ratings dealing with different teaching styles and presentation methods. Students' estimates of their own learning also can be useful
It is important to keep in mind that traditional student rating forms often do not reflect an instructor's effectiveness in less traditional teaching or testing environments.

Limit the use of rating forms. The use of student rating forms may reach a point of diminishing returns. If they are overused, neither students nor instructors will give them the level of attention required for fair evaluation of teaching or continued professional development by the faculty member in question.
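The numerical guidelines above (no summaries for classes with fewer than 10 respondents, reliability tiers at 15 and 25 respondents, a 60 percent response-rate target, and means with standard deviations as the usual summaries) can be collected into a few lines of code. The following Python sketch is illustrative only: the function name and the output format are our own, and the tier labels paraphrase the committee's guidance rather than quote any standard instrument.

```python
from statistics import mean, stdev

def summarize_ratings(ratings, enrolled):
    """Summarize one course's student-evaluation ratings following the
    guidelines above: no summary for fewer than 10 respondents; mean and
    standard deviation otherwise; reliability tiers at 15 and 25
    respondents; a 60 percent response-rate target.
    (Illustrative sketch; names and output format are assumptions.)"""
    n = len(ratings)
    if n < 10:
        return None  # fewer than 10 students: best not to summarize
    if n >= 25:
        reliability = "excellent"
    elif n >= 15:
        reliability = "acceptable"
    else:
        reliability = "examine patterns across several classes"
    return {
        "n": n,
        "mean": round(mean(ratings), 2),
        "sd": round(stdev(ratings), 2),
        "reliability": reliability,
        "met_response_target": n >= 0.60 * enrolled,
    }
```

For example, 30 completed forms in a class of 40 would fall in the "excellent" reliability tier and meet the 60 percent response-rate target, while a class with 9 respondents would produce no summary at all.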
Appendix B
Samples of Questionnaires Used to Evaluate Undergraduate Student Learning
Hampshire College
  End-of-Semester Course Evaluation Forms, 178–182
  Instructor Objectives Report, 183–184
3. During the current school year, about how much reading and writing have you done?
   (None / Between 1 and 4 / Between 5 and 10 / Between 11 and 20 / More than 20)
   a. Number of assigned textbooks, books, or book-length packs of course readings
   b. Number of books read on your own (not assigned) for personal enjoyment or academic enrichment
   c. Number of written papers or reports of 20 pages or more
   d. Number of written papers or reports between 5 and 19 pages
   e. Number of written papers or reports of fewer than 5 pages

4. Mark the box that best represents the extent to which your examinations during the current school year have challenged you to do your best work.
   (Scale of 1 = Very little to 7 = Very much)

5. Overall, how would you evaluate the quality of academic advising you have received at your institution?
   (Excellent / Good / Fair / Poor)

6. Which of the following have you done or do you plan to do before you graduate from your institution?
   (Yes / No / Undecided)
   a. Practicum, internship, field experience, co-op experience, or clinical assignment
   b. Community service or volunteer work
   c. Work on a research project with a faculty member outside of course or program requirements
   d. Foreign language coursework
   e. Study abroad
   f. Independent study or self-designed major
   g. Culminating senior experience (comprehensive exam, capstone course, thesis, project, etc.)

7. About how many hours do you spend in a typical 7-day week doing each of the following?
   (Hours per week: 0 / 1-5 / 6-10 / 11-15 / 16-20 / 21-25 / 26-30 / More than 30)
   a. Preparing for class (studying, reading, writing, rehearsing, and other activities related to your academic program)
   b. Working for pay on campus
   c. Working for pay off campus
   d. Participating in co-curricular activities (organizations, campus publications, student government, social fraternity or sorority, intercollegiate or intramural sports, etc.)
   e. Relaxing and socializing (watching TV, partying, exercising, playing computer and other games, etc.)
   f. Providing care for dependents living with you (parents, children, spouse, etc.)
8. To what extent has your institution experience contributed to your knowledge, skills, and personal development in the following areas?
   (Very much / Quite a bit / Some / Very little)

10. Mark the box that best represents the quality of your relationships with people at your institution. Relationships with:
13. Write in your year of birth: 19 __ __

14. Your sex
    (Male / Female)

15. Are you of Hispanic, Latino, or Spanish origin?
    (Yes / No)

16. What is your racial or ethnic identification? (Mark all that apply)
    (American Indian or other Native American / Asian American or Pacific Islander / Black or African American / White / Other: Specify)

17. Are you an international student or foreign national?
    (Yes / No)

18. What is your current classification in college?
    (Freshman/first-year / Sophomore / Junior / Senior / Unclassified)

19. Since high school, which of the following types of schools have you attended other than the one you are attending now? (Mark all that apply)
    (Vocational-technical school / Community or junior college / 4-year college other than this one / None / Other: Specify)

20. Did you begin college at your current institution or elsewhere?
    (Started here / Started elsewhere)

21. Thinking about this current academic term, how would you characterize your enrollment?
    (Full-time / Less than full-time)

22. Are you a member of a social fraternity or sorority?
    (Yes / No)

23. Do you intend to teach at some pre-kindergarten through high school grade level within a year or two of completing your degree program?
    (Yes / No / Undecided)

24. Which of the following best describes where you are living now while attending college?
    (Dormitory or other campus housing (not fraternity/sorority house) / Residence (house, apartment, etc.) within walking distance of the institution / Residence (house, apartment, etc.) within driving distance / Fraternity or sorority house)

25. Did either of your parents graduate from college?
    (Yes, both parents / Yes, father only / Yes, mother only / No / Don't know)

26. Which of these fields best describes your major(s) or your expected major(s)? Mark only one major in each column: Primary Major; Second Major (if applicable; not minor, concentration, etc.)
    Agriculture
    Biological/life sciences (biology, biochemistry, botany, zoology, etc.)
    Business (accounting, business admin., marketing, management, etc.)
    Communications (speech, journalism, television/radio, etc.)
    Computer and information sciences
    Education
    Engineering
    Ethnic, cultural studies, and area studies
    Foreign languages and literature (French, Spanish, etc.)
    Health-related fields (nursing, physical therapy, health technology, etc.)
    Humanities (English, literature, philosophy, religion, etc.)
    Liberal/general studies
    Mathematics
    Multi/interdisciplinary studies (international relations, ecology, environmental studies, etc.)
    Parks, recreation, leisure studies, sports management
    Physical sciences (physics, chemistry, astronomy, earth sciences, etc.)
    Public administration (city management, law enforcement, etc.)
    Social sciences (anthropology, economics, history, political science, psychology, sociology, etc.)
    Visual and performing arts (art, music, theater, etc.)
    Undecided
    Other: Specify
Appendix C
Examples of Questions for Conducting Peer Evaluations

This report has emphasized the importance of using multiple approaches in evaluating teaching effectiveness. As discussed in Chapters 4 and 5, feedback from faculty colleagues can be a highly useful source of information for improving teaching and learning. However, research has indicated that faculty colleagues can be far more effective in this role if they are trained in how to conduct peer evaluations and if they work from an accepted set of criteria.

The forms included in this appendix serve as examples of peer evaluation surveys. French-Lazovik's (1981) form is designed to assist faculty in evaluating their colleagues on the basis of written materials that are provided in a dossier. The forms from Syracuse University and The University of Texas outline behaviors that colleagues can observe directly when they visit their colleagues' classrooms.
Appendix D
Biographical Sketches of
Committee Members
Index
Harvard University Derek Bok Center for Teaching and Learning, 163-165
  End-of-Semester Course Evaluation Form, 164-165
  Mid-Course Evaluation Form, 163
How People Learn: Brain, Mind, Experience, and School, 14
Howard Hughes Medical Institution, 48

I

Implementation of evaluation methodologies, 96-99
  departmental panels on teaching effectiveness and expectations, 98
  feedback from graduating seniors and alumni, 98
  formative discussions between the department chair and individual faculty members, 97
  helpful policies and procedures, 96-97
  legal considerations, 98-99
  oversight committee to monitor departmental curriculum and instruction, 98
  regular meetings between new faculty members and the department chair, 97
  sharing faculty-generated teaching portfolios, 97-98
Independent research
  encouraging students to engage in, 113-114
Informal conversations, 80
Input from students and peers, 52-53
  evidence of learning from student portfolios, 52
  faculty from "user" departments for service courses and from related disciplines for interdisciplinary courses, 52-53
  informed opinions of other members of the faculty member's department, 52
  summary of professional attainments of undergraduate students engaging in research under the faculty member being evaluated, 53
  undergraduate and graduate students, 53
  undergraduate and graduate teaching assistants, 53
Institutional data and records, 66-67
  data for evaluating teaching quality and effectiveness from, 66-67
  evidence about student learning from, 3-4
  grade distributions, course retention, and subsequent enrollment figures, 66-67
  quality and performance of undergraduate research students, 67
Instructional contributions
  data for evaluating teaching quality and effectiveness from, 63
Instructor Objectives Report, 183-184
Integrated learning, 33
Intellectual development of individual students
  contributions to ongoing, 31
Interdepartmental cooperation
  in improving undergraduate STEM education, 114
International Technology Education Association (ITEA), 110
ITEA. See International Technology Education Association

J

Just-in-time teaching, 79

K

Kansas State University IDEA Center, 166-177
  Faculty Information Form for Student Evaluations, 168-169
  Sample Results of Student Evaluations, 170-177
  Student Reactions to Instruction and Courses, 166-167
Knowing What Students Know: The Science and Design of Educational Assessment, 15
Knowledge of subject matter, 27-28, 101
  answering students' questions and guiding information searches, 28
  data sources and forms of evaluation for evaluating, 102
  helping students learn and understand the general principles of their discipline, 28
  providing students with an overview of the whole domain of the discipline, 28
  staying current through an active research program or through scholarly reading, 28
M

Making Teaching Community Property: A Menu for Peer Collaboration and Peer Review, 86
Master Faculty Program, 85
Mathematical Association of America, 46, 109
Mentoring
  of faculty by other faculty, 85
Mid-Course Evaluation Form, 163
Minute papers, 79
Miracosta Community College, 81n
Multidimensional learning, 33

N

National Center for Education Statistics, 35
National Center for Public Policy and Higher Education, 12
National Council of Teachers of Mathematics (NCTM), 46, 110
National Council on Measurement in Education (NCME), 55
National Institute for Science Education, 30, 72, 76

O

Observation, 84-85
Outcomes assessment, 73-76
  adjusting expected learning outcomes as appropriate, 73
  benefits of, 75-76
  determining when in a student's education specific knowledge and skills should be developed, 73
  developing expected student learning outcomes for an individual course of study, 73
  incorporating specified learning outcomes in statements of objectives for courses, 73
  scoring, 74-75
  selecting appropriate assessment strategies to test student learning of specified knowledge, 73
  using to provide formative feedback to individual students, 73
Outside evaluators, 81
Oversight committee
  to monitor departmental curriculum and instruction, 98
P

Pedagogical content knowledge, 16n
Pedagogies and technologies
  ability to recognize students not achieving to their fullest potential and assisting them in their academic difficulties, 29
  contextually appropriate, 29n
  data sources and forms of evaluation for evaluating skill in and experience with, 103
  enabling teaching, 25
  encouraging discussion and promoting active learning strategies, 29
  organized and clear communication to students of expectations for learning and academic achievement, 29
  persistently monitoring students' progress toward achieving learning goals, 29
  skill, experience, and creativity with a range of appropriate, 28-30, 101-103
  viewing the learning process as a joint venture with the students, 29
Peer reviews of teaching
  including in valid summative assessments of teaching, 4-5, 119-120
  providing both objective and subjective assessment of a faculty member's commitment to quality teaching, 7, 125
Pew Charitable Trust, 48
Pew Forum on Undergraduate Education, 147-150
Pew Forum on Undergraduate Learning, 77, 145
Portfolio Clearinghouse, The, 65
Portfolios. See Faculty teaching portfolios
Predictions about undergraduate teaching, 25
  changes in evaluation and documentation of teaching, 25
  changing appearance of higher education facilities, 25
  curriculum and program design becoming inseparable from teaching and learning, 25
  diversity seen as asset-based, 25
  focus of teaching shifting away from content transmission, 25
  nature and quality of assessment, 25
  a new scholarship of teaching, 25
  pedagogies students experienced prior to college changing their expectations about good teaching, 25
  teaching becoming more public than ever before, 25
  technology enabling teaching, 25
Preparation
  adequacy of, 87
  of future teachers, 32
Preparing for Peer Evaluation, 95
Primary trait analysis
  in scoring outcome assessments, 74-75
Principles of good practice for assessing student learning, 33-35
  educational values, 33
  illuminating questions people really care about, 34
  involving a larger set of conditions that promote change, 34-35
  involving representatives from across the educational community, 34
  meeting responsibilities to students and to the public, 35
  ongoing, not episodic, 34
  paying attention to outcomes and equally to the experiences leading to them, 33
  programs with clear, explicitly stated purposes, 33
  understanding learning as multidimensional, integrated, and revealed in performance over time, 33
Principles of learning, 20-22
  effect of learners' motivation to learn and sense of self on what and how much is learned and how much effort is put into learning, 21-22
  effect of the practices and activities engaged in while learning on what is learned, 22
  enhancement of learning through socially supported interactions, 22
  facilitation of learning through metacognitive strategies that identify, monitor, and regulate cognitive practices, 21
  facilitation of learning with understanding when new and existing knowledge is structured around major concepts and principles of the discipline, 20
  learners' different strategies, approaches, patterns of abilities, and learning styles coming from their heredity and prior experiences, 21
  learners' use of what they already know to construct new understandings, 20
Professional interactions with students, 30-31, 104-106
  advising students experiencing problems with course material, 31
  contributing to the ongoing intellectual development of individual students, 31
INDEX 211
  data sources and forms of evaluation for evaluating professionalism with students within and beyond the classroom, 106
  demonstrating respect for students as individuals and respecting their privacy, 31
  encouraging the free pursuit of learning and protecting students’ academic freedom, 31
  meeting all classes and labs, posting and keeping regular office hours, and holding exams as scheduled, 31
  upholding and modeling for students the best in scholarly and ethical standards, 31
Professional organizations
  encouraging publication of peer-reviewed articles on evolving educational issues in STEM, 7, 127
  increasing support for effective teaching, 46-49
  offering opportunities to discuss undergraduate education issues during annual and regional meetings, 7, 127
Program design
  becoming inseparable from teaching and learning, 25
Programs That Work, 47
Project Kaleidoscope, 56

Q

Quality and performance
  of undergraduate research students, 67
Quality of teaching and effective learning
  ranking more highly in institutional priorities, 5, 122
Questionnaires used to evaluate undergraduate student learning, 19, 145-184
  Carnegie Mellon University Eberly Center for Teaching Excellence, 153-162
  The College Student Report 2001, 147-150
  Hampshire College, 178-184
  Harvard University Derek Bok Center for Teaching and Learning, 163-165
  Kansas State University IDEA Center, 166-177
  Student Instructional Report II, 151-152
Questions for conducting peer evaluations of teaching, 19, 185-195
  Suggested Form for Peer Review of Undergraduate Teaching Based on Dossier Materials, 186-187
  Syracuse University’s Classroom Observation Worksheet, 188-192
  University of Texas at Austin’s Checklist of Teaching Skills, 193-195
Questions people really care about
  beginning with, 34

R

Recommendations for evaluating teaching effectiveness, 4-8, 115-127
  accreditation agencies and boards should revise policies to emphasize quality undergraduate learning as a primary criterion for program accreditation, 7, 127
  campus-wide and disciplinary-focused centers for teaching and learning should be tasked with providing faculty with opportunities for ongoing professional development, 5-6, 122-123
  for deans, department chairs, and peer evaluators, 6-7, 124-126
  department heads should provide personnel recommendations containing separate ratings on teaching, research, and service, 7, 125
  departments should contribute to campus-wide awareness of the premium placed on improved teaching, 6-7, 125
  departments should periodically review a departmental mission statement that includes appropriate emphasis on teaching and student learning, 6, 124
  departments should provide funds to faculty to enhance teaching skills and knowledge, 7, 125-126
  departments should support faculty moving to greater emphasis on instruction or educational leadership, 7, 126
  effective peer reviews of teaching should provide both objective and subjective assessment of a faculty member’s commitment to quality teaching, 7, 125
  faculty should be encouraged to develop curricula that transcend disciplinary boundaries through a combination of incentives, 6, 124
  faculty should be supported in their obligation to improve their teaching skills through
U

Undergraduate teaching and learning, 11-24. See also Effective undergraduate teaching
  impetus for and challenges to change, 12-15

W

Worcester Polytechnic Institute, 64n