Linda M. Stearns, Jim Morgan, Mary Margaret Capraro, & Robert M. Capraro
Teaching is a complex activity that requires making multiple ongoing decisions and sporadic, responsive actions, all while performing preplanned, prescribed tasks. This article describes a classroom evaluation instrument designed to assess the enactment of the essential elements of project-based learning (PBL). Considerations for implementing the instrument's use, as well as some design methods, are discussed.
The Use of a Teacher Observation Instrument of PBLs in STEM Classrooms
The recent emphasis on Science, Technology, Engineering, and Mathematics (STEM) education has resulted in the prolific growth of programs and professional development offerings to address this need. However, little research investigating the implementation of these programs has been conducted.
Many agree that STEM education is important because it affects not only the field of education but also the United States' competitiveness in the global market across various fields, including economics (Committee on Prospering in the Global Economy of the 21st Century, 2007; Pfeiffer, Overstreet, & Park, 2010). A heightened awareness of STEM education in
the general public has occurred due to media reports of what is called the STEM pipeline
problem, which is a decrease in the number of students pursuing careers in STEM fields
(Sanders, 2009).
The No Child Left Behind Act (2001) has addressed the growing concern over the STEM pipeline problem by advocating for greater attention to science and mathematics education. STEM education in K-12 prepares students for STEM careers but can also benefit students who pursue other paths (National Governors Association [NGA], 2008). In fact, STEM education should be present in all schools, and post-secondary institutions should be involved and play an important role in helping schools build infrastructure and improve articulation between K-12 schools and universities (NGA, 2008).
In a recent study, Bhattacharjee (2009) found that a lack of incentive for pursuing
STEM careers was the cause for decreasing numbers seeking STEM degrees,
contradicting the previously held notion that it was due to inadequate K-12 STEM
preparation. Even though the number of high school STEM education credits students earned increased steadily from 1990 to 2005, just taking more courses was not the solution. Despite a statistically significant increase in the number of STEM courses high school students completed, there was a decline in the number of students graduating from college with STEM-related degrees (Laird, Alt, & Wu, 2009).
Because increasing the number of STEM courses taken in high school has not been sufficient, the quality of STEM education should be the focus of national attention. Marshall (2010) advocated for improvement in the quality of STEM courses. An effective STEM curriculum should nurture students' problem solving and inventive thinking. The STEM curriculum should focus learning on creative exploration, projects, problem solving, and innovation, not fact memorization, and should help students to make connections between the content taught in various classes (Pfeiffer et al., 2010).
With improved STEM education, a better understanding of STEM will lead more students to STEM careers.
Observation of Teachers
In order to improve the quality of STEM education, classes designed to encourage problem solving and innovation require effective teachers and ongoing support. "There is considerable evidence from different studies suggesting that how
teachers behave in the classroom, the instructional approaches they employ, significantly
affect the degree to which students learn" (Van Tassel-Baska, Quek, & Feng, 2008, p. 85). Effective teachers produce gains in student achievement regardless of the students' abilities (Sanders & Rivers, 1996), and many similar studies have reported comparable findings. Effective teaching must include some form of evaluation, and classroom observation instruments can be an effective tool for measuring performance-based teaching strategies.
Effective observations require a certain amount of training, and well-designed observation instruments can have a high degree of face validity (Volpe, DiPerna, Hintze, & Shapiro, 2005). This is not to say that no validity threats are present. For example, the following threats have been identified within behavioral observations: (a) poorly defined behavior categories, (b) low inter-observer reliability, (c) observer drift, (d) reactivity of the observed behaviors, (e) inappropriate code selection, and (f) observer bias. These threats can be mitigated through observer training and careful instrument design.
Using an external observer to describe and evaluate teaching practices can give a more objective account of classroom instruction. Observers should be well trained to identify factors that are fundamentally important to a school's academic success. In addition, the observers should understand how the school's goals and initiatives, past and current professional development, and the content covered in a course are aligned. Evaluators may use an observation tool that can be designed in as many ways as there are teaching methods (Dinkelman, 2003; Felder & Silverman, 1988).
Many researchers have relied on a carefully designed observational instrument (Simon & Boyer, 1969; O'Malley et al., 2003). The observation tool could yield a descriptive account of targeted performances. This can be achieved with a conceptual rubric that contains a numeric range of descriptors for each item, a counting system, or a coding system (Taylor-Powell & Steele, 1996). This observational tool could serve to monitor progress toward increasing a desirable trait or diminishing an undesirable one. The information gathered through the observation tool can be used for teacher reflection and to customize professional development; without such development, effective change may not occur (Van Tassel-Baska et al., 2007). A study of teacher evaluation practices found that few teachers made substantial changes in their instruction unless the evaluation was aligned with the current reform. Teachers cannot just be shown a new idea or practice but need
experimentation and a culture of learning to fully implement the new practice or idea
(Franke, Carpenter, Levi, & Fennema, 2001). Evaluation of PD is critical to establishing
improved actions by teachers in consequential ways that benefit students. There are
several ways to appraise the effectiveness of PD; one way is through observation of
teachers in their classrooms (Guskey, 2002). VanTassel-Baska et al. (2008) argued that
professional development was most effective over the course of three years when
interspersed with classroom observations that tracked the targeted instructional behaviors.
Without the classroom observation checkpoints, the teachers' progress on the targeted behaviors could not be evaluated; the observation, however, should focus only on the behaviors addressed in the PD (VanTassel-Baska et al., 2008).
Franke et al. (2001) looked at several classroom teachers and their ongoing learning after a professional
development program. They found that the teachers who actively listened to their
children’s mathematical thinking through their own observations and observations from
others within their classroom were able to modify and improve their instructional
decisions to benefit their students. Without the observation, the teachers did not have a
clear understanding of their students’ thought processes and how their lesson appeared to
the students.
Teachers can benefit from observation, but the observer needs to provide feedback to the educator so they may
evaluate and adjust their teaching to benefit the students (Patrick, 2009). Another way for
PD to be evaluated is through lesson study, which permits teachers to refine their lessons
to create a quality lesson. It can be used with a collaborative group of teachers working
together to write lessons, observe, provide feedback, and refine one another’s lessons in
the STEM field, in order to boost the STEM curriculum at their school (Liddicoat, 2008).
To help teachers apply the knowledge learned from professional development, Krause,
Culbertson, Oehrtman, and Carlson (2008) suggested the use of Professional Learning
Communities (PLCs) in which groups of teachers came together to support one another, observe, and engage in academic thought to further their knowledge about STEM. Their research
showed that the PLCs worked to create a positive community of collaboration with
common goals to advance the STEM pipeline. Hamos et al. (2009) also advocated for
STEM teachers to work together in PLCs rather than in isolation to improve the quality of STEM instruction.
The observation instrument discussed here was created by a team of professors and graduate students from The Aggie STEM Center at Texas A&M University. This instrument was specifically created to evaluate
specific observable objectives of teachers when presenting PBLs in their classrooms. The
teachers who are evaluated with this instrument have gone through extensive training on
PBL with the Aggie STEM Center. The teachers have been prepared in each of the
measured objectives. Both the team of observers and the teachers have been trained on
the use of the instrument. The instrument results are provided and discussed with the
teachers. Whole group statistics are discussed with teachers, administrators, and school
board members. Teachers may provide their own justifications for items marked not evident or not applicable. The objectives measured by the instrument include: (a) PBL Structure, (b) PBL Facilitation, (c) Student Participation, (d) Resources, (e) Assessment, and (f) Classroom Learning Environment. The number of items under each objective varies. Each item is evaluated on a scale from 1 to 5, with 1 meaning no evidence and 5 meaning strong evidence. The observer is encouraged to write the
justification for the score assigned to each item. Occasionally, the item will not apply to
what is observed at that time. This may happen if the observer is only present for part of
the PBL. The observer can then indicate that the behavior was not applicable or not observed.
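The scoring just described lends itself to simple numeric summaries. As an illustration only (the score values, item groupings, and function names below are hypothetical, not part of the instrument), per-objective averages that skip not-applicable items might be computed as follows:

```python
# Hypothetical sketch: summarizing 1-5 observation ratings per objective.
# Objective names follow the instrument; the ratings themselves are invented.
scores = {
    "PBL Structure": [4, 5, 3, 4, 5, 4, 3, 4],   # items 1-8
    "PBL Facilitation": [5, 4, None, 4, 3],      # None marks a not-applicable item
}

def objective_mean(ratings):
    """Average the 1-5 ratings, skipping items marked not applicable (None)."""
    rated = [r for r in ratings if r is not None]
    return sum(rated) / len(rated) if rated else None

summary = {objective: objective_mean(ratings) for objective, ratings in scores.items()}
print(summary)
```

Averages of this kind could feed the whole-group statistics that are discussed with teachers, administrators, and school board members.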
References
Bhattacharjee, Y. (2009). Study finds science pipeline strong, but losing top students.
Science, 326 (5953), 654.
Committee on Prospering in the Global Economy of the 21st Century: An Agenda for
American Science and Technology, National Academy of Sciences, National
Academy of Engineering, Institute of Medicine. (2007). Rising above the
gathering storm: Energizing and employing America for a brighter economic
future. Washington, DC: National Academy Press.
Dinkelman, T. (2003). Self-study in teacher education. Journal of Teacher Education,
54(1), 6-18.
Felder, R. M., & Silverman, L. K. (1988). Learning and teaching styles in engineering
education. Engineering Education, 78(7), 674-681.
Franke, M. L., Carpenter, T. P., Levi, L., & Fennema, E. (2001). Capturing teachers’
generative change: A follow-up study of professional development in
mathematics. American Educational Research Journal, 38, 653-689.
Guskey, T. R. (2002). Does it make a difference? Evaluating professional development.
Educational Leadership, 59(6), 46-51.
Hlebowitsh, P. S. (2005). Designing the school curriculum. Boston: Pearson Education.
Kimball, S. M. (2002). Analysis of feedback, enabling conditions, and fairness
perceptions. Madison: University of Wisconsin-Madison, Wisconsin Center for
Education Research, Consortium for Policy Research in Education.
Krause, S., Culbertson, R., Oehrtman, M., & Carlson, M. (2008). High school teacher
change, strategies, and actions in a professional development project connecting
mathematics, science, and technology. Proceedings of Frontiers in Education
Conference, 1-3, 265-270.
Laird, J., Alt, M., & Wu, J. (2009). STEM coursetaking among high school graduates,
1990-2005. MPR Research Brief.
Liddicoat, S. (2008). NASA enriched collaborative K-12 STEM teacher professional
development institutes within the California State University system.
Proceedings of Frontiers in Education Conference, 1-3, 1488-1493.
Marshall, S. P. (2010). Re-imagining specialized STEM academies: Igniting and
nurturing decidedly different minds, by design. Roeper Review, 32, 48-60.
Merrell, K. W. (1999). Behavioral, social, and emotional assessment of children and
adolescents. Mahwah, NJ: Erlbaum.
National Governors Association. (2008). Promoting STEM education: A communications
toolkit. Retrieved March 3, 2010, from http://www.nga.org
O’Malley, K. J., Moran, B. J., Haidet, P., Seidel, C. L., Schneider, V., Morgan, R. O.,
Kelly, P. A., & Richards, B. (2003). Validation of an observation instrument for
measuring student engagement in health professions settings. Evaluation &
Health Professions, 26(1), 86-103.
Pfeiffer, S. I., Overstreet, M., & Park, A. (2010). The state of science and mathematics
education in state-supported residential academies: A nationwide survey.
Roeper Review, 32, 25-31.
Rosenshine, B. (1970). Evaluation of classroom instruction. Review of Educational
Research, 40(2), 279-300.
Patrick, P. (2009). Professional development that fosters classroom application. Modern
Language Journal, 93, 280-287.
Sanders, M. (2009). STEM, STEM education, STEMmania. The Technology Teacher, 68
(4), 20-26.
Sanders, W. L., & Rivers, J. C. (1996). Cumulative and residual effects of teachers on
future students’ academic achievement. Knoxville: University of
Tennessee, Value-Added Research and Assessment Center.
Simon, A., & Boyer, E. G. (1969). Mirrors for behavior: An anthology of classroom
observation instruments. Retrieved from ERIC database. (ED031613)
Taylor-Powell, E., & Steele, S. (1996). Collecting evaluation data: Direct observation.
Program Development and Evaluation.
Van Tassel-Baska, J., Quek, C., & Feng, A. X. (2007). The development and use of a
structured teacher observation scale to assess differentiated best practice.
Roeper Review, 29(2), 84-92.
VanTassel-Baska, J., Feng, A. X., Brown, E., Bracke, B., Stambaugh, T., French,
H., . . .Bai, W. (2008). A study of differentiated instructional change over 3 years.
The Gifted Child Quarterly, 52, 297-312.
Volpe, R. J., DiPerna, J. C., Hintze, J. M., & Shapiro, E. S. (2005). Observing students in classroom
settings: A review of seven coding schemes. School Psychology Review, 34(4),
454-474.
Appendix
PBL Title
____________________________________________________________
PBL Description
________________________________________________________
_____________________________________________________________
To what extent was the following present? Please mark the box that best displays your response
on a scale of 1 to 5, where 5 = to a great extent and 1 = no evidence.
I. PBL Structure
1. The PBL has a well-defined outcome.
2. The PBL contains rigorous subject area content.
3. The PBL lends itself to multiple, creative and unique tasks in which students can demonstrate
a continuum of knowledge and understanding.
4. The PBL covers subject/grade level TEKS.
5. The tasks or the overall PBL will likely lead to higher order thinking.
6. The PBL is not a stand-alone lesson.
7. The PBL is interdisciplinary.
8. The students worked in organized small groups.
II. PBL Facilitation
9. The teacher clearly stated goals and tasks.
10. The teacher facilitated the students to remain on-task.
11. The teacher asked effective open-ended questions.
12. The teacher worked with members of all small groups.
13. The teacher achieved the objectives they identified.
IV. Resources
17. The appropriate resources were used.
18. The resources were readily available and in working order.
19. The students were proficient in using the resources (e.g., calculators, textbooks, computers).
20. The materials were familiar to the students.
V. Assessment
21. The assessment(s) was/were continuous and varied.
22. Evidence of holistic assessments existed (e.g., rubrics for participation/engagement, early
stages of the PBL, or group work).
23. The students could explain the expectations.
24. The students understood what they needed to do and how it was evaluated on the rubric.
___________________________________________________________________________
___________________________
Observer Date