
RESEARCH AND TEACHING

A New Tool for Measuring Student Behavioral Engagement in Large University Classes
By Erin S. Lane and Sara E. Harris

We developed a classroom observation protocol for quantitatively measuring student engagement in large university classes. The Behavioral Engagement Related to Instruction (BERI) protocol can be used to provide timely feedback to instructors as to how they can improve student engagement in their classrooms. We tested BERI on seven courses with different instructors and pedagogy. BERI achieved excellent interrater agreement (>95%) after a one-hour training session for new observers. It also showed consistent patterns of variation in engagement with instructor actions and classroom activity. Most notably, it showed substantially higher engagement among the same group of students when interactive teaching methods were used compared with more traditional didactic methods. The same general variations in student engagement with instructional methods were present in all parts of the room and for different instructors.

The main goal of this work was to develop a practical and objective observation protocol that could be used in large university classrooms to produce quantitative data identifying student behavioral engagement level at any given time. Engagement levels could then be used to determine the teaching practices that most effectively engage students and to provide timely feedback to instructors. In achieving this goal, we wanted our observation protocol to use a simple coding system that would be quick and easy for observers to learn and that could be easily presented to the course instructor as timely feedback immediately after class. Here we present the Behavioral Engagement Related to Instruction (BERI) protocol, an observation tool that has achieved good interrater agreement with little training, that can provide timely and formative feedback to the course instructor in the form of a graphical representation of student engagement over time for a class, and that has results that can be easily related to the pedagogy and curricular content of the lecture.

The theoretical basis for this work stems from constructivism, which describes learning as an active process through which individuals construct understanding of new information by linking it to previous experiences and knowledge (Bransford, Brown, & Cocking, 2000). Students learn best when they are actively engaged and can therefore deeply encode material (Baddeley, Lewis, Eldridge, & Thomson, 1984; Hake, 1998). In large-lecture settings, however, students are more likely to experience anonymity and distraction, leading to decreased engagement in class (Blatchford, Edmonds, & Marin, 2003; Fenollar, Roman, & Cuesta, 2007). Similarly, students may find it difficult to concentrate over an extended period of time in traditional lectures when they are in passive learning mode (Young, Robinson, & Alberts, 2009). When students stop paying attention and disengage with course content, they are unable to construct their own knowledge (Young et al., 2009). Therefore, it is important for instructors in large classes to examine what they might do to more deeply engage students to promote learning. But how can instructors measure the level of student engagement and the effectiveness of their interventions to improve it?

Currently there are several tools to measure engagement in educational settings. A variety of self-report questionnaires have been used in research on student engagement (Chapman, 2003); however, the validity of the data yielded by these measures has been questioned because students' ability to accurately assess their own behaviors is highly variable (Assor &

Journal of College Science Teaching, Vol. 44, No. 6, 2015



Connell, 1992). Several studies have used checklists and rating scales for instructors to measure engagement in their own courses (Chapman, 2003). These measures are flawed by poor reliability and by bias toward a positive result about the instructor's effectiveness.

Classroom observations have been widely used in education over the past couple of decades to measure effective teaching practices (Waxman & Padron, 2004). However, from Waxman and Padron's (2004) review of classroom observation studies, it is evident that most instruments focus on the instructor's behaviors as the unit of measurement rather than on the students' behaviors. Chapman (2003) specifically reviewed a number of third-party classroom observation instruments to measure student engagement. The majority of these protocols assess whether students are engaged during a specific moment or over a given time interval, and various behaviors are considered "engaged"—such as reading (aloud or silently), paying attention, or looking for materials (e.g., Greenwood & Delquadri, 1988).

However, all of the protocols reviewed by Chapman (2003) were designed for use in K–12 classes with 30–35 students. In these methods, a small representative sample of students is selected (typically five) and observations are rotated across students so that each student is observed continuously for about one minute at a time before moving on to the next student. Thus, in a typical observation session of 30 minutes, approximately 5 minutes of observation data can be collected on each student, with six to seven sessions being required to observe a full class of 30–35 students. Although these methods may be effective at identifying the engagement levels of individual students, because each student is observed individually it would be difficult to relate teaching practices to levels of engagement in the classroom. In addition, these methods are time-consuming and not practical for use in large university classes.

There are a handful of published classroom observation instruments that have been developed for postsecondary education settings with large classes: the Reformed Teaching Observation Protocol (RTOP; Piburn et al., 2000), the VaNTH Observation System (VOS; Harris & Cox, 2003), STROBE (O'Malley et al., 2003), and the Classroom Observation Protocol for Undergraduate STEM (COPUS; Smith, Jones, Gilbert, & Wieman, 2013). RTOP does not explicitly deal with student engagement and is more focused on the teacher's implementation of the lesson plan. VOS and STROBE measure student engagement levels as part of a larger series of items to characterize the classroom climate and capture only coarse measurements of student engagement through a global judgment of the proportion of the class as a whole that is on task. In addition, RTOP, VOS, and STROBE rely on complicated coding systems, scorings, and questionnaires to contextualize what is happening in the lecture at a given time. These protocols therefore require lengthy user manuals and training sessions, and there is a danger that the instructor will find feedback difficult to use because of the complexity of the coding system. COPUS includes documenting what students are doing in class but does not explicitly measure levels of engagement.

Various definitions of student engagement have appeared in the education literature. The term student engagement is used to describe a variety of different things: from time-on-task studies to studies that investigate the quality of effort, interest, and willingness to participate in learning activities (Kuh, 2009). Student engagement levels have also been indexed by cognitive, affective, or behavioral criteria (Chapman, 2003). Cognitive criteria assess the extent to which students are expending mental effort on the learning tasks. Affective criteria assess the level of students' investment in, and emotional reactions to, the learning tasks. Behavioral criteria assess the extent to which students are actively participating in the relevant learning tasks presented.

Here we present an observation protocol that measures university student behavioral engagement, which we define simply as on-task behavior in the classroom. The BERI protocol was developed after we noticed differences in student behavioral patterns in the classroom that seemed to be related to different instructional methods (i.e., didactic parts of a lesson versus more interactive parts). This research examines the following questions:

1. Can we quantify differences in student behavioral engagement with a simple, valid, and reliable observation protocol?
2. How does student behavioral engagement vary with classroom activities, instructional methods, and between instructors?

Methodology

Developing the protocol
The first stage of developing BERI was to conduct unstructured observations of several large classes to note patterns of student behavior that would define engagement. First, all student in-class behaviors were recorded, described, and classified

in Table 1 (engaged behaviors) or Table 2 (disengaged behaviors). Although observation of one of these behaviors does not guarantee that the student is engaged/disengaged, we have limited the set to pronounced behaviors for which there is widespread agreement among observers as to what each represents. Also, although BERI is able to identify on-task behaviors, it is unable to differentiate the depth at which the student is cognitively interacting with the material. For example, frantic writing may be a sign of active student paraphrasing or of passive note copying.

The next stage was to determine the optimal number of students to simultaneously observe. We wanted the observations to be as unobtrusive as possible and decided that the observer would sit in one place during the entirety of the class. We also did not want the students to notice or be distracted by the observer, so we decided the observer would sit behind the group of students being observed.

TABLE 1
Descriptions of student in-class behaviors that indicate they are engaged.

Engaged:
- Listening: Student is listening to lecture. Eye contact is focused on the instructor or activity, and the student makes appropriate facial expressions, gestures, and posture shifts (i.e., smiling, nodding in agreement, leaning forward).
- Writing: Student is taking notes on in-class material, the timing of which relates to the instructor's presentation or statements.
- Reading: Student is reading material related to class. Eye contact is focused on and following the material presented in lecture or preprinted notes. When a question is posed in class, the student flips through their notes or textbook.
- Engaged computer use: Student is following along with lecture on computer or taking class notes in a word processor or on the presentation. Screen content matches lecture content.
- Engaged student interaction: Student discussion relates to class material. Student verbal and nonverbal behavior indicates he or she is listening to or explaining lecture content. Student is using hand gestures or pointing at notes or screen.
- Engaged interaction with instructor: Student is asking or answering a question or participating in an in-class discussion.

TABLE 2
Descriptions of student in-class behaviors that indicate they are disengaged.

Disengaged:
- Settling in/packing up: Student is unpacking, downloading class material, organizing notes, finding a seat, or packing up and leaving classroom.
- Unresponsive: Student is not responsive to lecture. Eyes are closed or not focused on instructor or lecture material. Student is slouched or sleeping, and student's facial expressions are unresponsive to instructor's cues.
- Off-task: Student is working on homework or studying for another course, playing with phone, listening to music, or reading non-class-related material.
- Disengaged computer use: Student is surfing web, playing game, chatting online, checking e-mail.
- Disengaged student interaction: Student discussion does not relate to class material.
- Distracted by another student: Student is observing other student(s) and is distracted by an off-task conversation or by another student's computer or phone.
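For analysis or tooling, Tables 1 and 2 can be collapsed into a lookup from an observed behavior to the code an observer would record. The sketch below is our own illustrative encoding, not part of the published protocol; the lowercase labels are shorthand for the table rows:

```python
# Behavior categories from Tables 1 and 2, collapsed into two sets.
ENGAGED = {
    "listening", "writing", "reading",
    "engaged computer use", "engaged student interaction",
    "engaged interaction with instructor",
}
DISENGAGED = {
    "settling in/packing up", "unresponsive", "off-task",
    "disengaged computer use", "disengaged student interaction",
    "distracted by another student",
}

def code_for(behavior):
    """Map an observed behavior label to the code recorded for it:
    "E" (engaged), "D" (disengaged), or "U" (uncertain/unclassifiable)."""
    behavior = behavior.strip().lower()
    if behavior in ENGAGED:
        return "E"
    if behavior in DISENGAGED:
        return "D"
    return "U"

print(code_for("Writing"))       # -> E
print(code_for("Off-task"))      # -> D
print(code_for("view blocked"))  # -> U
```

Anything that does not match a listed behavior falls through to "U", mirroring the protocol's uncertain category.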




The observer still needed to have an unobstructed view of the students (could see their faces, computer screens, and notes) and to be in close enough proximity to get the general context of student interactions. After several trials we chose 10 students as the optimal number, with the observer sitting in the row directly behind the students being observed and at an angle so that the students were still within the observer's sight line. It is worth noting that all of our observations were conducted in large-lecture theatres with tiered seating, which allowed the observer to sit higher than the students being observed and thereby get a better view. For alternative classroom settings, the number of students or the observer's position may have to be adjusted so that all students in the observation group can be adequately seen. Using the behaviors in Tables 1 and 2 as a guide, we recorded how many of the 10 students were engaged ("E"), disengaged ("D"), or of uncertain ("U") engagement. If the observer's view of a student was temporarily blocked or the student's behavior was unclear, he or she was entered into the uncertain category. It took approximately 3 to 10 seconds to gauge the level of engagement of each student, with a 10-student cycle taking approximately one minute to complete. We did not record the specific behavior of each individual; rather, for each 10-student cycle we recorded one observation point (e.g., "8/10 students engaged") that was time stamped at the start of the cycle.

FIGURE 1
Coversheet for the Behavioral Engagement Related to Instruction (BERI) protocol observations.

Using the protocol
Prior to conducting a classroom observation, the observer receives a printed copy of the instructor's notes or lesson plan. The observer randomly chooses a place to sit during the class where 10 students can be clearly seen. At the beginning of the class, the observer fills out a coversheet that contains general information and notes about the class (Figure 1). Once the class starts, observation points are recorded directly onto the copy of the instructor's notes in the section corresponding to what is being covered. This ensures that the instructor will later be able to relate engagement with what was happening in their class at any specific time. An observation point is taken for every page of notes, for any major change in activity or content, or at 2-minute intervals, depending on which time interval is shorter. Changes in the classroom activity (e.g., clicker question, in-class discussion, demonstration) or instructor behaviors (e.g., moving around the classroom, using humor or real-world examples) are recorded under each observation point. Because classroom activities and instructor behaviors vary in different courses and with different instructors, this section is open-ended and determined by the observer. Instructor questions to the class and student questions to the instructor are also documented with the following information: the section of the room in which the question/answer originated and how the interaction is followed up (e.g., entire class, subgroup of students, one student). In most cases in this study, the classroom activity was clearly outlined in the class notes, so the observer needed only to record activities, behaviors, and questions that were not preplanned.
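The cycle just described — code each of the 10 students "E", "D", or "U", then log one time-stamped summary point — can be sketched as a small record type. This is our own illustration; the published protocol is pen-and-paper, and the field names are hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical encoding of one BERI observation point: one code per
# observed student ("E" engaged, "D" disengaged, "U" uncertain),
# time-stamped at the start of the 10-student cycle.
VALID_CODES = {"E", "D", "U"}

@dataclass
class ObservationPoint:
    timestamp: datetime
    codes: list          # one code per observed student, in seat order
    activity: str = ""   # e.g., "lecture", "clicker question"
    notes: str = ""      # open-ended context (instructor behavior, questions)

    def __post_init__(self):
        if len(self.codes) != 10:
            raise ValueError("BERI observes groups of 10 students")
        if not set(self.codes) <= VALID_CODES:
            raise ValueError("codes must be E, D, or U")

    def summary(self):
        """The 'x/10 students engaged' shorthand recorded on the notes."""
        return f"{self.codes.count('E')}/10 students engaged"

point = ObservationPoint(
    timestamp=datetime(2015, 1, 14, 10, 24),
    codes=["E", "E", "D", "E", "U", "E", "E", "D", "E", "E"],
    activity="clicker question",
)
print(point.summary())  # -> 7/10 students engaged
```

The `summary()` string mirrors the per-cycle observation point written into the copy of the instructor's notes.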

Training and interrater reliability
The next stage was to test BERI to ensure that we could train observers and get consistent results. Ten different observers—instructors, graduate student teaching assistants, and science education researchers—were trained to use BERI. Training consisted of a brief description of the methodology and purpose of the protocol, followed by a supervised observation session during a routine 50-minute undergraduate class. The trained observer and trainee sat side by side and agreed to observe 10 students that they both felt confident they could adequately see. To ensure that these observations were completed at exactly the same time, the trained observer indicated with unobtrusive hand signals when to start the observation cycle. Both observers independently cycled through each of the 10 students in sequence and for each student recorded "E" for engaged, "D" for disengaged, or "U" for uncertain. At the end of each cycle and after class, the trainer and trainee compared results and discussed any differences. The trained observer and the trainee then attended a second class together, where they again sat together, agreed on the students they were going to observe, and used hand gestures to cue the start of an observation cycle; however, this time they did not share or discuss their results. The data from these sessions were then used to calculate the interrater reliability of observers, defined as the percentage of identically marked Es, Ds, and Us for individual students for each pair of observers over all sessions the observers attended together.

After only one practice session, new observers were able to produce reliable results compared with their trained partner. Data from 2,154 judgments of individual student engagement from six pairs of observers in three different educational settings with five different instructors were used to evaluate interrater reliability. The average interrater agreement was calculated to be 96.5%. Trained observers put students in the uncertain category only 3.3% of the time, and this typically happened when the observer's view of the student was obstructed. Observers in training placed students into the uncertain category 7.3% of the time. On the basis of this high percentage agreement between observers, it appears that one 50-minute practice class is adequate training for new observers to produce reliable results. This confirms that the behaviors in Tables 1 and 2 are obvious when observed. Debriefing after an observation cycle during the training sessions revealed sources of discrepancy between observers; in many of the cases, an observed student was displaying a subtle behavior. For example, a student may have appeared to be listening to the instructor when in fact the student was sending a text message on his or her cell phone so discreetly that one of the observers did not notice. Another small source of discrepancy occurred when an observer was not familiar with the course material and was not able to determine whether students were writing or reading notes for the class in progress or doing work for a different course. This was more common shortly before exam time and was usually recognized after a couple of observation cycles, when the observer realized that the student was not changing his or her behavior in response to what was happening in class. Although we believe that anyone can be trained to use the classroom observation protocol, familiarity with the material being covered in class does help the observer in cases like this.

Testing validity
To test the validity of BERI, we compared observation data within single class periods, within a single course over multiple class periods, and across multiple courses.

First, we wanted to determine if 10 students could provide an accurate representation of engagement for the whole class. To do this, we conducted an observation in which three trained observers took simultaneous observations of three groups of 10 students in different sections of the classroom.

Second, to determine the dependence of engagement on location in the classroom, we observed 25 classes in a single course over the semester and compared the average student engagement for classes where the observer sat in the front of the class (9 classes) versus the back (16 classes). The expectation was that students in the front of the class would be more engaged than students in the back of the class.

Third, we tested BERI on seven classes of varying sizes (50–300 students), in different subjects within science, at different course levels (first year to fourth year), and with 14 different instructors with varying instructional methods.

Linking engagement patterns to pedagogy
One of the main goals in developing BERI was to see if instructional methods could be associated with student engagement. To test this, we did the following:

1. We observed the same group of students and the same instructor on two different days. On the first day the instructor used a well-prepared traditional, didactic lecture, and on the second day




delivered an interactive lecture in which students completed a preclass reading and applied their knowledge by answering in-class questions using personal response systems (clickers).
2. We conducted 27 observations of a course in which two instructors with different levels of experience using student-focused pedagogy taught the same group of students on different days using very similar teaching methods. Both instructors followed a similar format for each class: they presented learning goals for the lecture; mixed didactic lecture with in-class questions, discussions, and real-world examples; tested student knowledge using clicker questions that counted for grades; and ended with a summary of each lecture.

Results and discussion

Validity
Our findings from simultaneous observations of students in different sections of the class by three trained observers suggest that although the overall level of engagement varies between different groups of students, the general trends over time in student engagement remain consistent. This result implies that students are engaged or disengaged by the same in-class activities regardless of where they are sitting in the classroom. Therefore, data from a single observer reporting on 10 students can serve as a proxy for trends in engagement for the full class and provide useful feedback to a course instructor.

On the basis of an independent t-test, we also determined that the average level of student engagement is significantly higher in the front of the class than in the back, with 8.4 ± 0.12 (front) and 7.5 ± 0.14 (back) students out of 10 engaged, respectively (p < .001). This difference may be because students who are more likely to be engaged choose to sit in the front or because the proximity of the instructor causes students to be more engaged (Benedict & Hoag, 2004).

BERI was tested and was able to differentiate engagement levels in classes of varying sizes, course levels, and subject matters within science. However, in classes where the instructor wrote notes on a blackboard and students did not receive a copy of the notes before or after class, engagement levels were found to be consistently high. This was because students were taking notes for the entirety of the class period and were classified as engaged, as they fit into the writing category of Table 1. It is worth noting that although we have identified a list of overt student behaviors that indicate whether the student is engaged or disengaged, we cannot confirm how deeply the student is cognitively processing the material to be learned. Students who engage in rote note taking may not actually be cognitively engaged in the class material. We think that BERI is best used in large lecture theater classes where students are given the notes in advance or at the end of class, and it is best used to determine whether students are on-task or off-task in the classroom.

One concern with classroom observations is that students may realize that they are being observed and therefore alter their behavior. All indications are that the students were unaware that they were being observed. Students were not explicitly told that observers were recording their behaviors, and the observers always sat behind the observed students and dressed casually so as to blend in with the student population. Several observers were asked by students to participate in student activities, indicating that the students next to them did not realize they were not part of the regular student population. Cameras could potentially replace observers; however, human judgments about engagement, based on the footage, would still be needed. We have not explored the challenges cameras would present.

FIGURE 2
Average student engagement (number of students engaged) grouped by instructional activity for each instructor in a first-year Oceanography course. Error bars represent standard error. The number of observation sets (n) used to calculate the mean and standard error is indicated above each bar. For example, the bar representing "Instructor 2-Clicker" is the average of 82 instances in which 10 students were observed during a clicker question. These numbers also roughly show the distribution of activities during class time, with "Lecture" most common, followed by "Clicker."

Linking engagement patterns to pedagogy
Having an instructor change the method of instruction had a large effect on student engagement. On the day the instructor used a well-prepared traditional didactic lecture, engagement for the 10 students averaged 5.7 ± 0.09 over the class period. In contrast, when the same instructor gave an interactive lecture, engagement increased to 8.6 ± 0.14 for the same group of students.

We also found that the level of engagement related to various instructional methods is similar for different instructors. Figure 2 shows data on student engagement from a large first-year class where teaching is split between two different instructors with different levels of experience with interactive teaching techniques. Both instructors used similar instructional methods, and average engagement for each instructor was grouped on the basis of the classroom activity. Based on average numbers of students engaged out of 10, we found that the most engaging activities are clicker questions and clicker question follow-up ("Post Clicker" in Figure 2); the least engaging activities are instructor lecture, summaries, and learning goals. These data also show that the most common activity during the observation semester was lecture, followed by clicker questions and follow-up (Figure 2). This type of information can help instructors reflect on their current practices and iterate toward more engaging methods.

FIGURE 3
Sample of engagement data over a 50-minute class period, showing classroom activities that are more/less engaging. Data like these are provided to instructors shortly after observation.

Some differences between instructors for the same instructional practice are revealing. For example, although there is little difference for many practices, there is a notable difference in the average level of engagement for the clicker question follow-up ("Post Clicker" in Figure 2). These observations were conducted during the first term in which Instructor 2 used clickers in class, whereas Instructor 1 had used clickers for 3 years.
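The per-activity averages behind Figure 2 reduce to grouping observation points by classroom activity and computing a mean and standard error per group. A stdlib-only sketch — the observation counts below are invented for illustration, not the paper's data:

```python
from collections import defaultdict
from statistics import mean, stdev

def engagement_by_activity(points):
    """points: iterable of (activity, engaged_count) pairs, where
    engaged_count is the number of students engaged out of 10.
    Returns {activity: (n, mean, standard_error)}."""
    groups = defaultdict(list)
    for activity, engaged in points:
        groups[activity].append(engaged)
    out = {}
    for activity, counts in groups.items():
        n = len(counts)
        # Standard error of the mean: sample stdev / sqrt(n).
        se = stdev(counts) / n ** 0.5 if n > 1 else 0.0
        out[activity] = (n, mean(counts), se)
    return out

observations = [("lecture", 6), ("lecture", 5), ("lecture", 7),
                ("clicker", 9), ("clicker", 8)]
for activity, (n, m, se) in engagement_by_activity(observations).items():
    print(f"{activity}: n={n}, mean={m:.1f}, SE={se:.2f}")
```

With real BERI data, the `activity` labels would come from the open-ended activity notes recorded alongside each observation point.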




As with any pedagogy, it takes practice to improve one's skill, both in creating clicker questions that will generate discussion and in responding in real time to clicker results in ways that encourage engagement. These observation data can serve to illuminate for Instructor 2 a specific instructional skill on which to focus and improve.

Providing formative feedback to instructors
Figure 3 shows a typical student-engagement-versus-time plot for a 50-minute lecture. This demonstrates how BERI allows an instructor to easily see how student engagement rose and fell in relation to what was happening in class, including the instructor's own behaviors. For example, using Figure 3 and observation data recorded in a copy of the instructor's class notes, the instructor can see that when he or she asked a clicker question in class, there were high levels of student engagement. Engagement dropped when the instructor discussed learning goals for the lecture at the 24-minute mark; however, when the instructor subsequently discussed the relevance of the class material and its application to the real world, engagement increased. When the instructor lectured for more than 10 consecutive minutes toward the end of class using equation-heavy slides, engagement dropped to 40%, but when a clicker question was used in the last minute of class, engagement rose significantly. Providing instructors with formative feedback such as this helps them see how they can improve engagement in their teaching.

Conclusion
BERI is the first systematic classroom observation instrument for large university classes that provides quantitative data identifying student behavioral engagement. It can be used with high reliability after only one hour of training. It provides a quantitative measure of how the level of engagement depends on an instructor and the teaching practices used by that instructor. Although our main purpose here is to introduce the instrument and examples of its use, BERI could supplement a wide variety of work, from testing new pedagogies to informing professional development for faculty members to relating engagement to student performance. We believe this instrument will be a valuable tool for improving instructors' teaching practices to achieve greater student engagement and, ultimately, improved learning. ■

Acknowledgments
This work was supported at the University of British Columbia (UBC) through the Carl Wieman Science Education Initiative. Appreciation goes to the many Science Teaching and Learning Fellows who tested and used the protocol. Approval to observe classrooms and instruction at UBC and publish results of that work is provided to the Carl Wieman Science Education Initiative by UBC under the policy on institutional research.

References
Assor, A., & Connell, J. P. (1992). The validity of students' self-reports as measures of performance-affecting self-appraisals. In D. H. Schunk & J. Meece (Eds.), Student perceptions in the classroom (pp. 25–46). Hillsdale, NJ: Erlbaum.
Baddeley, A. D., Lewis, V., Eldridge, M., & Thomson, N. (1984). Attention and retrieval from long-term memory. Journal of Experimental Psychology: General, 113, 518–540.
Benedict, E. M., & Hoag, J. (2004). Seating location in large lectures: Are seating preferences or location related to course performance? Journal of Economic Education, 35, 215–231.
Blatchford, P., Edmonds, S., & Marin, C. (2003). Class size, pupil attentiveness and peer relations. British Journal of Educational Psychology, 73, 15–36.
Bransford, J., Brown, A. L., & Cocking, R. R. (2000). How people learn: Brain, mind, experience, and school (expanded ed.). Washington, DC: National Academies Press.
Chapman, E. (2003). Alternative approaches to assessing student engagement rates. Practical Assessment, Research & Evaluation, 8(13). Retrieved from http://pareonline.net
Fenollar, P., Roman, S., & Cuesta, P. J. (2007). University students' academic performance: An integrative conceptual framework and empirical analysis. British Journal of Educational Psychology, 77, 873–891.
Greenwood, C. R., & Delquadri, J. (1988). Code for Instructional Structure and Student Academic Response (CISSAR). In M. Hersen & A. S. Bellack (Eds.), Dictionary of behavioural assessment (pp. 120–122). Elmsford, NY: Pergamon.
Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66, 64–74.
Harris, A. H., & Cox, M. F. (2003). Developing an observation system to capture instructional differences in engineering classrooms. Journal of Engineering Education, 92, 329–336.
Kuh, G. D. (2009). The National Survey of Student Engagement: Conceptual and empirical foundations. New Directions for Institutional Research, 141, 5–21.
O'Malley, K. J., Moran, B. J., Haidet, P., Schneider, V., Morgan, R. O., Kelly, P. A., . . . Richards, B. (2003). Validation of an observation instrument for measuring student engagement in health professions settings. Evaluation and the Health Professions, 26, 86–103.
Piburn, M., Sawada, D., Falconer, K., Turley, J., Benford, R., & Bloom, I. (2000). Reformed Teaching Observation Protocol (RTOP) (ACEPT Technical Report No. IN-003). Tempe, AZ: Arizona State University.
Smith, M. K., Jones, F. H. M., Gilbert, S. L., & Wieman, C. E. (2013). The Classroom Observation Protocol for Undergraduate STEM (COPUS): A new instrument to characterize university STEM classroom practices. CBE—Life Sciences Education, 12, 618–627.
Waxman, H. C., & Padron, Y. N. (2004). The uses of the Classroom Observation Schedule to improve classroom instruction. In H. C. Waxman, R. G. Tharp, & R. Soleste Hilberg (Eds.), Observational research in U.S. classrooms: New approaches for understanding cultural and linguistic diversity (pp. 72–96). Cambridge, England: Cambridge University Press.
Young, M. S., Robinson, S., & Alberts, P. (2009). Students pay attention! Combating the vigilance decrement to improve learning during lectures. Active Learning in Higher Education, 10, 41–55.

Erin S. Lane is a former science teaching and learning fellow and Sara E. Harris (sharris@eos.ubc.ca) is a senior instructor, both in the Earth, Ocean and Atmospheric Sciences Department and the Carl Wieman Science Education Initiative at the University of British Columbia in Vancouver, British Columbia, Canada.

