Res Sci Educ (2009) 39:17–38

DOI 10.1007/s11165-007-9072-7

Factors Affecting the Implementation of Argument
in the Elementary Science Classroom: A Longitudinal
Case Study

Anita M. Martin & Brian Hand

Published online: 15 December 2007
© Springer Science + Business Media B.V. 2007

Abstract This longitudinal case study describes the factors that affect an experienced
teacher’s attempt to shift her pedagogical practices in order to implement embedded
elements of argument into her science classroom. Research data were accumulated over
2 years through video recordings of science classes. The Reformed Teacher Observation
Protocol (RTOP) is an instrument designed to quantify changes in classroom environments
as related to reform as defined by the National Research Council (National science
education standards. Washington, DC: National Academy Press, 1996b) and the National
Research Council (Fulfilling the promise: Biology education in the nation’s schools,
Washington, DC: National Academy Press, 1990) and was used to analyze videotaped
science lessons. Analysis of the data shows that there was a significant shift in the areas of
teacher questioning and student voice. Several levels of subsequent analysis were
completed related to teacher questioning and student voice. The data suggest a relationship
between these areas and the implementation of scientific argument. Results indicate that the
teacher moved from a traditional, teacher-centered, didactic teaching style to instructional
practices that allowed the focus and direction of the lesson to be affected by student voice.
This was accomplished by a change in teacher questioning that included a shift from factual
recall to more divergent questioning patterns allowing for increased student voice. As
student voice increased, students began to investigate ideas, make statements or claims and
to support these claims with strong evidence. Finally, students were observed refuting
claims in the form of rebuttals. This study informs professional development related to
experienced teachers in that it highlights pedagogical issues involved in implementing
embedded elements of argument in the elementary classroom.

Keywords Embedded elements of argument · Science Writing Heuristic (SWH) · Student voice · Teacher questioning

A. M. Martin (*) · B. Hand
Science Education Department, Teaching and Learning, University of Iowa,
N228 LC, Iowa City, IA 52242, USA
e-mail: Anita-martin@uiowa.edu

B. Hand
e-mail: Brian-hand@uiowa.edu

Introduction

Clearly the arrival of the National Science Education Standards (NRC 1996a, b) and the
recommendations of the National Research Council (NRC 1990) have impacted science
teaching. These standards have shifted the central focus of science instruction to include a
teaching methodology that requires students to be more active participants in their learning
(Weiss et al. 2003; Zady et al. 2003) and thus more involved in scientific thinking and
process skills (Duschl et al. 1999; NRC 1990). To achieve this focus an inquiry-based
approach to teaching science has been recommended by the National Science Teachers
Association and the NRC as a way to improve students’ abilities to think like scientists
(Handelsman et al. 2004; Kuhn and Dean 2005). A shift to inquiry-based approaches is
thought to more closely mirror the actions undertaken by scientists in their own community.
This requires a shift away from more traditional transmission of content teaching (Abell et
al. 2000), toward teaching strategies that require students to develop skills of argument such
as making claims, using evidence, and requiring peers to evaluate claims based on the
strength of evidence (Naylor et al. 2007; Osborne et al. 2001).

Theoretical Background

Reform Efforts in Science Education

With the adoption of the National Science Education Standards (NRC 1996a, b), new
demands are being placed on the science teacher. No longer are traditional methodologies,
such as the simple transmission of facts, valued; environments where “experts
discover, teachers tell, and students remember facts, theories, or procedures” (Lapadat
2000, p. 1) have been labeled inefficient in promoting student learning. Instead, science
classrooms are now viewed as places centered on engaging students in investigating
scientific ideas as active individuals who bring past experiences and prior knowledge
to the learning task (NRC 1990). According to the American
Association for the Advancement of Science (1993) scientific inquiry can be defined as an
attempt to develop explanations about the natural world by using evidence and logic. While
inquiry teaching is being recommended, a major dilemma that has arisen is the lack of clear
definition for classroom teachers (Beck et al. 2000; Fu and Shelton 2002; Newman et al.
2004). What constitutes inquiry-based instruction for teachers is clouded by previous
initiatives such as “discovery learning” and “hands-on science.” Newman et al. (2004)
reported that the teaching of inquiry to pre-service teachers created several challenges
including the students’ lack of exposure to learning science through inquiry, and the breadth
of meaning for the term “inquiry.” They found that it was difficult to provide sufficient
opportunities for both modeling inquiry science and teaching inquiry as a pedagogical
strategy. Issues are similar for elementary teachers, with researchers reporting a lack of
understanding of inquiry and insufficient skills or experiences to effectively teach science
through inquiry (Crawford 2000; Lederman and Niess 2000). This is consistent with
research by Bright and Yore (2002) who reported that educators have not overcome the
barriers to effective science instruction. Research continues to focus on the changes
necessary to teach science in ways that are consistent with reform efforts that call for the
development of inquiry classrooms.
Components characteristic of inquiry-based classrooms include active investigations,
dialogical interactions, and collaborative work among students. The
National Science Education Standards reported that effective investigations are ones that
allow students to actively search for answers to their questions (Lew 2001; NRC 1996a, b).
Research on dialogical interactions acknowledges the importance of classroom talk as a
means for extending conceptual understanding (Abell et al. 2000; Carlsen 1997; Driver et
al. 2000; Lemke 1990). When students are provided opportunities to work collaboratively
to solve problems and discuss alternative views, advancements in understanding are
reported (Duschl 2003; Polman 2004). Inquiry classrooms are environments where students
observe objects and events, ask questions, construct explanations, test those explanations
against current scientific knowledge, and report their ideas to others using claims and
evidence. They use critical thinking by considering alternative explanations in an attempt to
increase their understanding of scientific knowledge through the use of reasoning and
thinking skills (NRC 2000).
Consequently, students should have opportunities to explore and think about how their
previous ideas fit with the new ideas generated as a consequence of these inquiry activities.
Researchers have suggested that the construction of new meaning involves the interplay
between students’ previous knowledge and these new ideas as they question new
information, share their thinking in social contexts, and make public their newly
synthesized views with the class (Driver et al. 1994; Goodnough 2006; Van Zee 2000).
As recommended in the National Science Education Standards, inquiry classrooms should
be places where students make connections between their current understanding of science
and the scientific knowledge found through the use of a variety of sources. Such resources
include textbooks, websites, as well as peer interactions. Polman (2004) argues for the need
for inquiry classrooms to become places where the “dialogical activity structure” of the
classroom is changed from a teacher-centered environment to one more characteristic of
student-to-student interactions.
Significant, then, are dialogical interactions among students. Mercer et al. (2004) suggest
that there are two distinct contexts in which language occurs in classroom environments:
teacher-led situations and peer group interactions. In addition, Lemke (1990) suggests that
it is within these peer group interactions that students become fluent speakers of science
and that it is through this use of language that students begin to make sense of their newly
constructed ideas. However, studies consistently report that student-to-student talk is rarely
observed in science classrooms (Newton et al. 1999; Wallace and Narayan 2002; Weiss et
al. 2003; Wragg 1993). For example, Wragg (1993) found that in analyzing student
questioning, 38% of the questions were related to procedural issues, not subject matter
understanding. These classroom findings demonstrate a lack of opportunity for students to
engage in meaningful dialogue in school settings across content areas and with diverse
student populations. More recently, in science classes, Newton et al. (1999) found that
dialogic interactions that include elements of argument, that is, discussions based on
claims and evidence, are uncommon, even though research in science education recommends
inquiry classrooms that use the language of scientists because it is thought to enhance
scientific understanding by creating frameworks that encourage the development of
reasoning. Wallace and Narayan (2002) suggest a core component of the science classroom
is one where students are “learning to use language, think and act in ways that enable one
to be identified as a member of the scientific literate community and participate in activities
of that community” (p. 4), that is, engaged in the dialogical processes scientists use. Norris and
Phillips (2003) support the idea that language is not simply part of science, but an “essential
constitutive element.” Simon, Erduran, and Osborne (2006) report on the critical nature of
dialogical interactions as a function of scientific argument, suggesting that the construction
of argument is a core discursive activity of science. This is further supported by Ernest
(1998) who contends that, for students, involvement in oral argument is a critical
component in the development of epistemological knowledge of the discipline. He suggests
that individuals use such discourse opportunities to both construct knowledge claims and to
participate in the dialectical process of criticism and warranting of others’ knowledge
claims (p. 240).

Argumentation as a Central Tenet of Inquiry

Researchers such as Driver et al. (2000) have argued for the need to shift science away from
just a set of accumulated facts toward science inquiry where the building of argument is
seen as critical. Science instruction therefore is focused on efforts to allow students to
construct explanations about their world, and to make them public by sharing them in small
groups and whole class situations. These authors and others (Duschl and Ellenbogen 2002;
Lew 2001; Luykx and Lee 2007) view the classroom as a place for students to be
enculturated into the ways of the scientific community.
Bereiter et al. (1997) assert that “any group could potentially function as a scientific
community and this includes elementary classrooms” (p. 333). By participating in such a
community, students must make claims, provide evidence for such claims, and be prepared
for challenges from other students. These processes are seen as critical in the promotion of
argumentation and its unique discourse patterns (Andrews et al. 1993; Kuhn 1992). Ritchie
and Tobin (2001) support this idea of social discourse within communities of practice and
suggest that teachers’ roles must include promoting dialogical discourse within the
classroom. In citing McGinn and Roth (1999) they argue that “through increasing
competence with science discourses and their participation in classroom discussions,
student discourse increasingly comes to resemble the discourses used by working
scientists” (p. 19).
Importantly, then, argument is seen as a tool that is utilized by the scientific community
in order to establish canonical knowledge through inquiry processes, and that without such
discourse scientists would be unable to judge the strength of claims and evidence in order to
produce new knowledge that is accurate (Kuhn 1993). Therefore, Driver et al. (1994) stress
the importance of dialogue in building scientific arguments because “scientific under-
standings are constructed when individuals engage socially in talk and activities about
shared problems or tasks. Making meaning is thus a dialogic process involving persons-in-
conversation” (p. 7). Other researchers such as Duschl and Ellenbogen (2002, p. 2) have
described argumentation as “a genre of discourse and an epistemological framework central
to doing science.” Science education, then, is beginning to demand a closer link between
science as it is taught and science as it is practiced (Hogan and Maglienti 2001; Kuhn 1970;
NSES 1996; Quinn 1997). Students need to be engaged in making claims, building evidence
and reporting findings as part of the argumentation process of science (Millar and Osborne
1998). Using these tools of argument, students are able to become actively involved in the
making of scientific knowledge (Siegel 1995) by reflective thinking that involves the
comparison of the new information to their prior knowledge and then assessing the validity of
the differences between them (Wallace and Narayan 2002). Adopting such a learning
environment provides students with opportunities to understand the true nature of the
scientific community and to become active members by modeling the practices of scientists
in their own communities within the elementary science classroom (Lew 2001; Quinn 1997).
Researchers looking at the implementation of argument within school environments,
such as Driver et al. (2000) have suggested that a critical component in promoting argument
is the need for more dialogical interaction. Support for this finding is echoed in Mercer et
al.’s (1999) study on dialogic interaction, in which they found classroom talk to be infrequent. Yip
(2001) also supports this notion by claiming that many teachers feel pressured into teaching
a set curriculum in a prescribed amount of time, thereby thwarting their attempts at debate
that may interfere with the goal of the daily lesson plan. However, Solomon (1998)
suggests that some teachers lack the skills necessary to promote classroom debate, while
others are not convinced of its value.
As researchers attempt to consider the components of scientific argument for better
understanding, Toulmin’s (1958) model of argument has become foundational. In this
model, argument consists of a framework of claims, warrants and backings, and rebuttals.
Argument based on this model is supported by NSES (1996) and NRC (1990) and more
recently in studies such as Erduran et al. (2004). Importantly, scientific argument is seen as
dialogical in nature where participants are required to make claims based on sound evidence
and to present those claims to peers who either accept or refute them. Thus dialogical
argument best occurs where multiple views are discussed and considered, so as to arrive at
classroom consensus (Erduran et al. 2004).

Teacher’s Role and Dialogic Interactions

Conditions that promote dialogical interaction require a shift in the role of the teacher
(Kelly and Chen 1999; Schwarz et al. 2003). Traditionally, within the science classroom,
the teacher’s role has been viewed as teacher-centered with an emphasis on transmission of
scientific ideas. In this view of teaching, students are simply viewed as blank slates
that take up new information. With reform efforts firmly centered on the active role of
students, there is a growing dissatisfaction among educators with traditional teaching modes
and conventional discourse interactions. Discourse patterns in traditional, teacher-centered
classrooms, such as the initiate–reply–evaluate (IRE), have focused on ensuring that
students have received and can replicate knowledge given to them by the teacher as well as
allowing the teacher to maintain control of the classroom environment (Macbeth 2003;
Mehan 1979). While this pattern served the teacher well in the lecture-dominated classroom
where whole group instruction is the norm and is “premised on known answers and teacher-
driven activity,” it is inconsistent with an inquiry-learning philosophy (Polman and Pea
2000). In fact, the choice of this type of teacher questioning pattern has been shown to shut
down classroom conversation (Carlsen 1997). In a major review concerning components of
high-quality instruction, researchers found that across the country teachers generally used
low-level “fill-in-the-blank” type questions that were asked in “rapid-fire fashion” in order
to evaluate students’ understanding instead of using questioning as a tool to further
conceptual understanding (Weiss et al. 2003). While the content that teachers taught was
generally not within their control, teachers reported that decisions about instructional
strategies were their own and were based on factors such as their beliefs about the subject
matter, pedagogy, and the students within their classrooms (Weiss and Pasley 2004).
Treagust (2007), in his synthesis of the research on instructional methods and strategies in
science instruction, reports that the amount of classroom discourse is directly affected by
teacher questioning and that higher level questioning has been shown to improve the
amount and the quality of talk that occurs in the science classroom. Therefore, the role of
the teacher is critical in creating an environment where dialogical activities that encourage
student voice are practiced (Lapadat 2002). Student voice, then, in this context is defined as
the opportunity for students to engage in dialogical interactions with the teacher as well
as in social contexts with peers. Driver et al. (2000) claim that it is the role of teachers
to make the tools of science available to students, allowing them to engage in socially
constructed discourse. As teachers create opportunities for students to use these tools in the
classroom, increases in scientific reasoning skills have been observed (Hogan and Maglienti
2001; Kitchener and Fischer 1990; Pera 1994).
The Science Writing Heuristic (SWH) approach has proven successful in promoting
student voice and scientific reasoning (Hand et al. 2005). The SWH approach allows
students the opportunity to set their own investigative questions, propose methods to
address these questions and to carry out the investigations. The components of the SWH
approach include the elements of reflective thinking and writing, and it is through these
activities that the classroom becomes an environment where the development of scientific
reasoning can be found. The SWH approach in its simplest form asks students to be
reflective about their approach to learning science. The following questions frame scientific
investigations when using the SWH approach in the classroom:

What are my questions?
How can I test these?
How did I set up my investigation?
What can I now claim?
What evidence do I have to back up my claim?
How do my ideas compare to others?
How have my ideas changed from this investigation?

The ability to think and write about science concepts can be achieved through reflective
activities using such frameworks as the Science Writing Heuristic (SWH) approach, which
provides a necessary structure for the teacher while allowing the flexibility to meet the
individual needs of the students (Hand et al. 2004). Built into the SWH approach are
teacher questioning strategies that seek to activate prior knowledge by eliciting student
voice as well as structures that allow students to have a stronger voice in the classroom.
Therefore, in summary, the SWH approach consists of at least four major skills areas:
Teacher’s Role, Teacher Questioning, Student Voice, and Science Argument.
It has been found that in teacher-centered classrooms a high percentage of voice is
teacher voice and student voice is heard for the sole purpose of reciting memorized pieces
of information (Weiss et al. 2003). As teachers implement the SWH approach, student voice
changes to incorporate their past experiences that frame their present level of understanding
(Hand et al. 2005). It is this voice that becomes stronger as teachers challenge students to
debate their findings in small groups where negotiations with peers occur.
As student voice increases, teachers tend to give up control of discourse and begin to
move from a teacher-centered style of teaching to that of a student-centered learning
environment (Cobb and Bauersfeld 1995). Progressive views in science education would
support roles that focus classroom interactions around student voice (Furman and Barton
2006). Students become active participants by framing their own questions for study either
individually, in small groups or by consensus with the class as a whole (Hand et al. 2004).
By allowing this type of student decision-making, the classroom becomes a place where
richer understandings about science ideas are able to occur (NRC 1990). Teachers then
become facilitators of student-generated inquiry discussions that promote embedded
elements of argument (Seymour and Lehrer 2006). Students become more
comfortable making claims and supporting them with stronger evidence and are more able
to critically analyze the claims and evidence of their peers (Driver et al. 2000). This ability
to reason scientifically, using the tools of argument, is the primary focus of science
education reform (NRC 1990; NSES 1996).

The purpose of this study is to examine the barriers faced by an experienced teacher
when implementing pedagogical changes in her science teaching in order that elements of
argument become characteristic of her elementary science classroom. In particular the
researchers were interested in addressing two questions:
1. Are there particular elements of the teacher’s practice that need to be addressed in
promoting change to the SWH approach?
2. Are the elements of practice independent of each other, that is, is teacher questioning a
necessary criterion for promoting student voice?
While research on preservice teachers’ beliefs and attitudes about inquiry and science
process skills is abundant, the number of studies examining the issues that experienced
teachers face when promoting argument at the elementary level is limited; thus the researchers
believe that this study is necessary to extend present understanding in science education.

Methods

Research Design

A qualitative study was designed to analyze videotaped science lessons of an experienced
fifth grade teacher as she attempted to implement the SWH approach in her science
classroom as part of a professional development project. A single case study was designed
as a follow-up study, after the professional development time period. Thirteen science
lessons were video-recorded over a 2-year period. The videotapes sampled the
teacher’s science instruction while the teacher was involved in professional development in
science with a liaison of the researchers. This support person was a retired high school
science teacher involved in professional development initiatives with several teachers that
were implementing the SWH approach.

Context

The school in this study is located in a small rural Midwestern town with a population of
670. The ethnicity of the student population is 97% Caucasian with 27% of the student
body having free and reduced lunch status. The school has a total population of 161
students.

Participants

The teacher involved in this study is an elementary teacher with 16 years’ experience and
a strong background in reading/language arts and science. She has been involved in
professional development activities in the area of science for the past 4 years. This teacher
was selected using a purposeful sampling technique because she was attempting to
incorporate the SWH approach in her classroom at a high level of implementation.
The professional development liaison is a retired science teacher who taught middle
school general science as well as high school environmental science. He was involved in a
series of professional development initiatives with the second author over the last 5 years
including previous research projects related to the implementation of the Science Writing
Heuristic (SWH) approach. Approximately twice per month he observed the teacher’s
science lessons and met with the teacher in order to provide feedback related to the
pedagogical practices needed for implementing the SWH approach. The primary focus of
their discussions, then, centered on skills related to shifts in instructional practices, not on
lesson planning. Discussion threads were based on the issues that the classroom teacher
was facing and the changes necessary to resolve those issues. Communication was
available between the teacher and the professional development liaison on a continual
basis throughout the project through email and telephone conversations. This professional
development process was grounded in a supportive role for the teacher’s concerns as she
implemented the pedagogical changes necessary to use the SWH approach.

Table 1 Comparison of RTOP and SWH categories

Student voice
RTOP 1. Instructional strategies respected students’ prior knowledge/preconceptions.
SWH: Connections: There is an emphasis on determining student knowledge and building teacher plans based on this knowledge.

RTOP 5. Focus and direction of lesson determined by ideas from students.
SWH: Connections: Teacher builds or activates students’ prior knowledge with some evidence of using it to make instructional decisions.

RTOP 16. Students communicated their ideas to others.
SWH: Focus on learning: Student sharing with argumentation/connections in either small group, group to group or whole group. Connections: Language activities flow naturally throughout the SWH. Science argument: Teacher promotes linkages to big ideas and begins to promote debate on these ideas.

RTOP 18. High proportion of student talk and a significant amount was student to student.
SWH: Focus on learning: Student sharing with argumentation/connections in either small group, group to group or whole group. Dialogical interaction: Communication effectively varies from teacher to student and from student to student according to the situation.

RTOP 19. Students’ questions and comments determined focus and direction of classroom discourse.
SWH: Connections: Teacher effectively builds or activates student prior knowledge with evidence of using this to make instructional decisions. Dialogical interaction: Teacher is not compelled to give the right answer, shifting focus to the big idea. Teacher uses all levels of questioning, and adjusts levels to individual students.

Teacher role
RTOP 24. Teacher acted as resource person, supporting and enhancing student investigations.
SWH: Focus on learning: Teacher effectively plans for teacher and student instruction as needed and appropriate.

RTOP 25. The metaphor “teacher as listener” was very characteristic of this classroom.
SWH: Dialogical interaction: Teacher used questions to explore student thinking. Teacher’s response to student answers probes, connects, and extends questions.

Science argument
RTOP 13. Students were actively engaged in thought-provoking activities that involved critical assessment of procedures.
SWH: Connections: Science activities promote big ideas clearly and extend student learning. Connections can be seen from beginning to end and are articulated by students.

RTOP 14. Students were reflective about their learning.
SWH: Science argumentation: Teacher demands connections between question, claims, evidence and reflection.

RTOP 15. Intellectual rigor, constructive criticism, and the challenging of ideas was valued.
SWH: Focus on learning: Student sharing with argumentation/connections in small groups, group to group and whole group with few prompts. Science argumentation: Teacher promotes linkage to big ideas and promotes debate on these ideas.

RTOP 21. Active participation was encouraged and valued.
SWH: Science argument: Teacher requires students to link claims and evidence. Teacher scaffolds questions, claims, evidence and reflection. Promotes linkages to big ideas, and promotes debate of these ideas.

RTOP 22. Students were encouraged to generate conjectures, alternative solution strategies, and ways of interpreting evidence.
SWH: Science argumentation: Teacher scaffolds questions, claims, evidence and reflection. Promotes reflection to big ideas and promotes debate of these ideas.

Questioning
RTOP 17. Teacher questioning triggered divergent modes of thinking.
SWH: Dialogical interaction: Students are asked to explain and challenge each others’ responses rather than the teacher passing judgment. Teacher asks many-layered questions (i.e. Bloom’s Taxonomy). Teacher is not compelled to give the “right” answer, shifting focus to the big idea.

Data Collection

Different types of data sources were collected. The primary data source was 13 video
recordings of science lessons taught by the teacher over the 2-year period prior to this
study; these videotapes were analyzed for the present study. Additional data sources
include final interviews with the professional development liaison and the teacher.

Measurement Instruments

The Reformed Teacher Observation Protocol (RTOP) was developed by the Evaluation
Facilitation Group of the Arizona Collaborative for Excellence in the Preparation of
Teachers (Sawada et al. 2000) and designed to measure “reformed” teaching through
observation and to analyze characteristics of a classroom on a quantitative scale regarding
efforts of reform. For these reasons, the RTOP was selected to analyze this teacher’s
implementation of the SWH approach, which has been shown to be consistent with the
reform efforts associated with the creation of inquiry-based classrooms.
Previous studies using the RTOP instrument have found a high correlation between these
scores and science and math achievement (Lawson et al. 2002; MacIsaac and Falconer
2002). Estimated reliability for the RTOP has previously been reported as r² = 0.954
(Sawada et al. 2000).

The authors undertook an alignment task between the previously mentioned SWH
components and the RTOP’s 25 item descriptors using an interpretational analysis method.
Importantly, the results from this alignment task indicated that 13 RTOP descriptors focused
on related SWH major skills areas. The other skill areas were more related to lesson design
and specific content which were not addressed during the professional development project
and are not explicit aims of this study.
As can be seen in Table 1, four collapsed RTOP categories were created that correlated
with the SWH. Because of the modifications of the RTOP, concern over the reliability of
the new instrument required additional calculations. Internal reliability estimates were
calculated for these collapsed categories using RTOP scores from video-taped lessons
(student voice r² = 0.982; elements of argument r² = 0.977; and teacher’s role r² = 0.985).
Each of these categories contained several sub descriptors; the teacher questioning category
only had one sub descriptor, so a reliability estimate was not possible. Three
independent raters scored three randomly selected videotapes in order to confirm inter-rater
reliability, and a Pearson’s coefficient of 0.822 was calculated. The RTOP data meet
Levene’s Test for Equality of Variances. Table 1 shows the comparison of RTOP subscale
indicators and corresponding SWH categories.
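As a rough illustration of the inter-rater check described above, Pearson's coefficient can be computed from two raters' item-level scores. The scores below are hypothetical placeholders, not the study's data; only the computation is the point.

```python
from math import sqrt

def pearson(x, y):
    # Pearson product-moment correlation between two score lists
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical RTOP item scores (0-4 scale) from two independent raters
rater_a = [2, 3, 1, 4, 2, 3, 0, 2]
rater_b = [2, 3, 2, 4, 1, 3, 0, 2]
r = pearson(rater_a, rater_b)
```

With three raters, as in the study, pairwise coefficients would be computed and summarized; how the single reported value of 0.822 aggregates the pairs is not specified in the text.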

Data Analysis

Videotape analysis

The analysis of the videotapes evolved into a complex process that was completed at
multiple levels. Each videotape was initially analyzed using the RTOP by reviewing each
tape and scoring it on a zero to four scale, with zero representing behaviors that did not
occur and four representing behaviors that were very descriptive of the classroom.
Descriptive statistics were then calculated from hard count data on teacher questioning,
dialogical interactions, and embedded elements of argument. Each of these areas was
analyzed through a finer lens via continual review of the videotapes. The following
outline lists the multiple levels of analysis utilized in this study:
1. Teacher questioning
a. Initial analysis of total number of questions
b. Analysis of number of factual recall and yes/no type questions
c. Analysis of questions eliciting student voice
2. Dialogical Interactions
a. Analysis of percentage of class time devoted to teacher voice versus student voice
3. Elements of Argument
a. Student voice analysis to distinguish argument and non argument
b. Analysis to determine observed use of terminology of claims and evidence
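The bookkeeping behind these levels of analysis can be sketched as a simple tally of coded segments. The category names mirror the outline above, while the durations are illustrative placeholders rather than the study's data.

```python
from collections import defaultdict

# Each coded segment: (category, minutes of class time). Values are hypothetical.
segments = [
    ("teacher_voice", 7.0),
    ("student_voice_non_argument", 3.0),
    ("student_voice_argument", 5.0),
    ("teacher_voice", 5.0),
]

# Accumulate minutes per category
totals = defaultdict(float)
for category, minutes in segments:
    totals[category] += minutes

# Percentage of class time per category, as in the dialogical analysis
total_min = sum(totals.values())
shares = {c: round(100 * m / total_min) for c, m in totals.items()}
```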

Teacher Questioning

Aside from the scoring analysis of the RTOP, descriptive statistics were used to analyze
changes in teacher questioning patterns. All of the teacher’s questions were tallied and
categorized by question type. The framework used for this classification is Bloom’s
Taxonomy of Educational Objectives: Cognitive Domain, where questions can be
categorized by their cognitive performance level (Bloom and Krathwohl 1956). This
framework allowed the researchers to place teacher questions into categories based on the
cognitive demand placed on the student by the question. In this model, factual recall and
yes/no type questions are considered to have the lowest level of cognitive demand, placing
them into the knowledge and comprehension levels of the taxonomy, while questions
demanding a higher level of thought are questions that require the student to use the
processes of application, analysis, synthesis, and evaluation. When students bring prior
knowledge and past experiences to bear on the question or evaluate a peer’s claim, they are
using a higher cognitive level than when responding to simple recall questions. Teacher
questions that elicit student voice can be categorized into this higher cognitive domain as
can questions that require elements of argument such as claims and evidence.
Employing this framework, the researchers divided the teacher’s questions into yes/no
and factual recall questions and then questions that elicit student voice. Questioning that
elicited student voice was divided further into that of argument and non-argument
categories. Each of these areas of questioning was again analyzed through continual review
of the video-tapes for specificity of purpose. Data reflecting total percentages of the
teacher’s questioning that elicited student voice were compared to the number of questions
related to factual recall and yes/no questions. Subsequent data analysis was completed by
reviewing video-tapes and determining the percentages of “why” questions that encourage
students to provide further evidence. Additional data analysis was completed by comparing
the number of questions that elicit student voice across time.

Dialogical Interactions

Next, an analysis of the percentage of class time devoted to student vs. teacher voice was
completed. This was completed by reviewing all 13 tapes (628 total minutes of class time)
and recording the number of minutes that the teacher talked and then recording the number
of minutes that students talked either to one another, to the teacher, or to the class as a
whole.

Elements of Argument

Further analysis was conducted looking at the percentage of student voice that could be
classified as student voice related to elements of scientific argument. This analysis was
achieved by recording situations (# of minutes) that were descriptive of the terms associated
with science argument, specifically that of claims and evidence. A comparison was made
between student voice that was not related to science argument to the amount of time that
was associated with direct use of these terms.

Transcriptional Analysis

Qualitative data was collected through the transcriptional analysis of dialogical interactions
between the teacher and students. Classroom discourse was reviewed and transcribed to
determine the presence as well as the amount of student voice that occurred during each
video-taped science lesson. This data was recorded in minutes of class time for teacher
voice and student voice. Additional transcription using the SWH framework that
emphasizes question, claim, evidence and reflection as components of argument, allowed
the researchers to further determine the presence of argument within student voice.
Teacher–student dialogical interactions were coded as either: teacher voice, student voice-
non-argument, or student voice-argument. Sections of classroom discourse were transcribed
in order to provide the reader with examples of typical dialogues across the 2 years of
this study.

Interviews

Semi-structured interviews were conducted with the professional development liaison and
the classroom teacher. The researchers probed the emerging issues stemming from the
RTOP data as well as the transcriptional analysis and then adapted the interview questions
to focus on these issues.
This triangulation of data brings validity to the study's three claims:
1. The role of teacher questioning is critical in implementing argument in the elementary
science classroom.
2. As a shift in questioning patterns occurs, student voice is elicited and encouraged.
3. As student voice increases, embedded elements of argument are practiced.
These claims are substantiated in the following results section.

Results

Three claims were generated from the data. As described in the methods section above, the
data presented represent a combination of numerical analysis from the RTOP, quantitative
and qualitative analysis of video-taped science lessons, and transcribed interview data from
the teacher and the professional development liaison.
Thirteen 5th grade science lessons were video-taped. Approximately two-thirds of the
way through the study, there appeared to be a major shift in the teacher’s pedagogical
practices which resulted in a delineation into two major categories: a first phase and a
second phase. The first phase consists of video tape recordings of five science classes while
the second phase consists of data from eight video tapings during the last 6 months of the
professional development project. The total number of minutes in each of these phases was
approximately equal, with the first phase totaling 306 min and the second phase totaling
322 min. The decision to consider two phases was based on two criteria. First, observation
of the first five videotapes indicated that the teacher was attempting to adapt her
questioning strategies, but did not focus on students’ prior knowledge. The researchers
observed a marked change in the amount of student voice that began to appear in videotape
five and became more pronounced in the remaining eight videotapes. The terms “first phase”
and “second phase” will be used to discuss our findings. Importantly, these two phases can
be seen as a transition from a teacher-centered classroom to a stronger focus on student
voice and embedded elements of argument. An additional data source that assisted the
delineation of two phases was the analysis of the final interview with the liaison. In this
interview, the professional development person was asked to reflect on specific aspects of
the teacher’s transition to a more student-centered classroom. Specifically, he was asked:
Researcher: What changes did you see in the teacher that opened the door for such a
change in the amount of student voice observed in the video-tapes?
Professional Development Liaison: “It was when she finally gave up control of the
research questions and let the students come up with them on their own. They came up
with the same question that she wanted them to study, but the difference was that it
was their own question. This takes more time, and teachers don’t think that they have
the time, but, it is valuable time spent.”
A further review of the video-tapes indicated that the shift in control began with the
teacher allowing the students to determine the research question in video-tapes 5 and 6.
Video-tape 5 indicated that the teacher was leading the students heavily, but in video-tape 6
she allowed them to select the research question without pressure. Thus we adopted the
delineation of the two phases.
Approximately 1.5 years into the teacher's professional development, changes
were observed in several areas as supported by the RTOP score shifts across the two phases.
The data presented in Table 2 represent the average score for the five lessons viewed in the
first phase, while the data in the second phase represent averages of the eight lessons from
that phase. These averages represent data points on a Likert scale ranging from 0 to 4,
with 0 signifying behaviors that never occurred and 4 representing behaviors that were very
descriptive of the classroom. The table also includes the RTOP descriptors.
As can be seen from Table 2, there is a pedagogical shift from the first phase to the
second phase. The teacher’s questioning patterns change, the role of student voice shifts
and aspects of science argument begin to appear during the second phase. Table 3 is a two
sample t-test comparison of the RTOP means across the two phases. The analysis of
variance described in the claims sections discusses this shift.

Table 2 Mean RTOP scores during the first phase and second phase on a 0–4 Likert scale

Collapsed categories and RTOP descriptors                          First phase  Second phase

Questioning
  #17 Teacher questioning triggered divergent modes of thinking.       0.4          2.8
Student voice
  #1  Instructional strategies respected students' prior
      knowledge/preconceptions.                                        0.4          2.8
  #5  Focus and direction of lesson determined by ideas from
      students.                                                        0.0          3.0
  #16 Students communicated their ideas to others.                     0.0          3.0
  #18 High proportion of student talk and a significant amount
      was student to student.                                          0.2          2.4
  #19 Students' questions and comments determined focus and
      direction of classroom discourse.                                0.0          3.3
Science argument
  #13 Students were actively engaged in thought-provoking
      activities that involved critical assessment of procedures.      0.2          2.7
  #14 Students were reflective about their learning.                   0.0          3.5
  #15 Intellectual rigor, constructive criticism, and the
      challenging of ideas were valued.                                0.4          3.4
  #21 Active participation was encouraged and valued.                  0.8          3.1
  #22 Students were encouraged to generate conjectures, alternative
      solution strategies, and ways of interpreting evidence.          0.8          3.3
Teacher's role
  #24 Teacher acted as resource person, supporting and enhancing
      student investigations.                                          0.4          3.1
  #25 The metaphor "teacher as listener" was very characteristic
      of this classroom.                                               0.2          3.0

0 = never occurred; 4 = very descriptive of classroom



Table 3 A two sample t-test comparison of the RTOP means across the two phases

RTOP collapsed categories   First phase (M, SD)   Second phase (M, SD)   t stat   p value

Questioning                     0.40, 0.55            2.75, 0.89          5.282   <0.001
Student voice                   0.12, 0.33            2.88, 0.72         17.878   <0.001
Science argument                0.44, 0.65            3.20, 0.91         13.171   <0.001
Teacher's role                  0.30, 0.68            3.06, 0.57         11.168   <0.001
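The t statistics in Table 3 can be reproduced (to rounding) with a pooled-variance two-sample t computed from the summary statistics, provided each lesson-by-descriptor score in a collapsed category is treated as one observation (e.g. 5 descriptors × 5 and 8 lessons for student voice). That sample-size convention is our inference from the numbers, not something the table states.

```python
from math import sqrt

def pooled_t(m1, s1, n1, m2, s2, n2):
    # pooled-variance two-sample t statistic from means, SDs, and sample sizes
    sp2 = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
    return (m2 - m1) / sqrt(sp2 * (1 / n1 + 1 / n2))

# Questioning: one descriptor, so n = 5 and 8 lessons (reported t = 5.282)
t_questioning = pooled_t(0.40, 0.55, 5, 2.75, 0.89, 8)

# Student voice: five descriptors, so n = 25 and 40 (reported t = 17.878)
t_voice = pooled_t(0.12, 0.33, 25, 2.88, 0.72, 40)
```

Small discrepancies against the reported values arise because the table's means and standard deviations are rounded to two decimals.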

Claim 1: The Role of Teacher Questioning is Critical in Implementing Argument
in the Elementary Science Classroom

Results from Tables 2 and 3 show that the use of questioning by the teacher differed in the
second phase. In the first phase the RTOP average was 0.4 out of 4.0, while the second
phase RTOP average was 2.8 out of 4.0. The two sample t-test also supports the claim that
the teacher shifted her questioning pattern from a limited number of divergent questions in
the first phase to a more consistent use in the second phase.
During the first phase, typical questioning was of an initiate–respond–evaluate (IRE)
pattern where the teacher was in charge and evaluated each response. An excerpt from one
of the first phase lessons on saturated fats illustrates this pattern. The following discourse
documents the kind of information sought by the teacher, and indicates a use of prodding to
elicit student response. When there is no response to a question, the teacher asks that facts
be read from a book to the class. The following example was typical of the discourse during
the first phase.
T: We found 4 kinds of fats. What are they? Name them.
Students name fats.
T: Very good.
T: What’s a trans fat?
No response.
T: Cory, read this.
Student reads something.
T: Okay, he just read what we learned. Are trans fats good fats or bad fats?
S: Bad fats.
T: Right. Very good. If it is a hardened vegetable oil, what is it?
T: Think of something else. What must they add?
T: Think. I know you know it.
T: Think. What is it?
T: Say it. You know it.
T: Say it. I know you know it. If I tell you, you’re gonna say, “I knew that”.
However, during the second phase this discourse pattern changed. The teacher was
observed using different types of questions with a focus centered on eliciting a
stronger student voice. During this phase, the researchers began to observe a consistent
attempt by the teacher to use discourse that included both the terminology of claims and
evidence and an emphasis on students’ use and application of this terminology, as
demonstrated by the RTOP scores for student voice and science argument. Additional
support for this claim can be seen in the following excerpt on weathering and erosion,
which demonstrates that the previous IRE pattern has begun to change:
T: Can we see examples of what frost action does around here anywhere?
S: Yes. The sidewalks.
T: What about the sidewalks?
S: When the water freezes in the winter it cracks the sidewalks.
T: Do you agree or disagree, class?
Class: Agree.
T: We’re making some pretty bold claims. Can we back them up?
A second level of analysis of the teacher’s questioning pattern was completed to analyze
the shift in questioning adopted by the teacher. This analysis focused on the beginning of
each science class where the lesson was teacher centered and devoted to question and
answer. Three sample video-tapes from each phase were analyzed, with a combined total of
54 min of class time (taken from the beginning of each of the three classes videotaped) for
each phase. All questions that were asked were recorded. As can be seen in Table 4 there
was a clear difference in emphasis by the teacher across these two phases.
An initial analysis revealed that a total of 389 questions were asked in 54 min of analyzed
tape in the first phase. During the second phase 177 total questions were asked, resulting in 54%
fewer questions during the same amount of time. The Chi-squared analysis, χ²(1, N = 566) =
117.54, p < 0.001, demonstrates a statistically significant difference in the number of yes/no
and factual recall questions compared to the number of questions, which are later categorized
as questions that elicit student voice. During the first phase 77% of all the teacher questions
could be categorized as either a yes/no or factual recall type question. In the second phase
only 29% of the teacher’s questions were of this nature. Thus it can be noted that the teacher
has begun to shift the initial focus of the lesson from a teacher directed activity to a much
more open discursive practice. Because of this disparity, it was important to further analyze
the kinds of questions that replaced the factual recall questions indicative of the first phase.
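The chi-squared value reported above can be recovered from the counts in Table 4 under the assumption of a 2×2 contingency of phase by question type (yes/no plus factual recall versus all other questions), with no continuity correction; a minimal sketch:

```python
def chi_square_2x2(a, b, c, d):
    # Pearson chi-squared for the 2x2 table [[a, b], [c, d]], no correction
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# First phase: 298 low-level of 389 questions; second phase: 51 of 177
x2 = chi_square_2x2(298, 389 - 298, 51, 177 - 51)  # ~117.54
```

That this matches the reported χ²(1, N = 566) = 117.54 suggests the authors compared low-level versus other questions across the two phases, though the exact contingency table is not printed.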

Claim 2: As a Shift in Questioning Patterns Occurs, Student Voice is Elicited and Encouraged

As the questioning pattern was further examined, the data show a parallel shift between the
teacher’s questioning pattern and an attempt to elicit student voice. In support of this claim,

Table 4 Totals of all questions asked by the teacher in 54 min of class time taken from the beginning of each
of three sample classes from each phase

Question type                                   First phase   Second phase

Yes/No                                              188             27
Factual Recall                                      110             24
Total number of questions of these types (a)        298             51
Total number of questions in this phase (b)         389            177

χ²(1, N = 566) = 117.54, p < 0.001
(a) Total Yes/No and Factual Recall questions in Phases 1 and 2
(b) Total number of questions includes questions not categorized as either yes/no or factual recall questions
in the first phase less than one percent of the teacher’s questions were “why?” questions (3
out of 389), while in the second phase fifteen percent of the questions were “why?”
questions (26 out of 177). We believe that such a questioning pattern inherently draws out
student opinion and provides opportunities for individual reasoning to be observed. This
claim is supported by research on questioning previously discussed in the literature review
and the SWH approach which includes a tool for teachers that links them to Bloom’s
Taxonomy. Video analysis suggests that in the second phase, student voice was elicited to
assess students’ prior knowledge and played a role in the direction and focus of the class.
Changes in RTOP scores for student voice (# 1, 5, 16, 18, 19) from the first phase to the
second phase support this notion (from 0.4 to 2.8; 0.0 to 3.0; 0.0 to 3.0; 0.2 to 2.4; 0.0 to 3.3
respectively).
During the first phase, the researchers observed that the teacher’s central focus was on
the lesson for that day and that attempts made by the students to engage in related areas of
interest were not recognized. Student responses did not affect the direction or foci of the
lesson as can be documented by the following teacher responses: “We’ll get to that later.”
“No, let’s focus on this.” “Does your answer relate to this? No. We’ll get to that later.” In
the first phase, the teacher used those kinds of statements in order to redirect students 12
times when they attempted to share prior knowledge or ask related questions. However
during the same amount of class time in the second phase the students were redirected only
once. During this second phase similar attempts were met with a different response by the
teacher as can be illustrated by the excerpts below:

“You have lots of questions about glaciers. Write down your questions. What can we do
to find out the answers to your questions?”
“What if we simulated how something so heavy could flow?”
“Would that answer your questions?”

Teacher: Do you agree that how the plates move is related to volcanoes?
Students: “Yes.” “No.”
Teacher: Hmmm. We’re divided on that, aren’t we? We’d better work that out,
shouldn’t we? Let’s go back to our resources: books, web, and come back together.

These examples demonstrate a change in the teacher’s focus in the classroom. Not only
does the teacher acknowledge student voice, but these excerpts illustrate a persistence in
seeking out student questions and student opinions. We make the case that a pedagogical
shift that allows student voice to affect instructional direction creates a classroom
environment that enhances ownership for learning and creates active participation by the
students. This is supported by the shift in RTOP scores for subsets 21, 22 across the two
phases (from 0.8 to 3.1; 0.8 to 3.3).
Additional analysis of the video tapes revealed that the teacher employed new question
types in the second phase of the project. The data in Table 5 provide a further analysis
of the shift in question type across the two phases, listing higher order questions that
suggest a purposeful attempt by the teacher to reveal students’ presently held conceptions
(or misconceptions) and to seek their opinion.
As can be seen in Table 5, there were no observed cases of questions that elicit student
response in the first phase. An analysis shows that in the second phase 65 out of 177 were

Table 5 Analysis of number of questions that elicit student voice

Question type                                   First phase   Second phase

Do you agree or disagree with the claim?             0              31
What do you mean by that?                            0              18
How are they alike or different?                     0              16
Student voice questions/total questions asked      0/389          65/177

asked for the purpose of eliciting student voice (37% of the total questions asked). This
finding can be supported by the quantitative data from the RTOP scores for items 18 and 19
where the data shifts from scores of 0.2 to 2.4 and 0.0 to 3.3. Table 5 also illustrates the
introduction of the terminology of claims and evidence in the classroom. The researchers
then analyzed whether an increase in student voice in general, would impact student voice
directly related to elements of argument.
Additional support for this claim can be seen in the following excerpt from the final
interview with the teacher.
Researcher: You’re different than you were when you started. How are you different?
Is it your management of kids? Or your knowledge? Or student interactions? Or your
questioning?
Teacher: Well, that’s gotten a lot easier. And I like that. They run the conversations
now. And I think they do get a lot more out of that. They put it in their own words.
They actually will question each other. And kids, I don’t know what it is about them,
but they can explain it to each other better than I ever could. They understand it. It’s
better than I ever imagined.

Claim 3: As Student Voice Increases, Embedded Elements of Argument are Practiced

During the second phase as student voice increased, the teacher began using the terms
“claims and evidence.” Students began to make claims in small groups and to the class as a
whole. Following this shift, the students were observed using evidence to back up these
claims. When students made claims, the teacher asked the other students if they agreed or
disagreed with the claim. As this shift occurred, the amount of time devoted to teacher
voice decreased while student voice increased. This change allowed more time for
discourse among and between students. Additional analysis of the tapes was completed to
compare the percentage of class time allocated for teacher voice and student voice as well
as the percentage of student voice related to argument. Results are displayed in Table 6.

Table 6 Percentage of class time for teacher and student voice

                    First phase   Second phase

Teacher voice           62%           46%
Student voice
  Non argument          38%           12%
  Argument               0%           35%

Second phase does not total 100% because 7% of class time was devoted to reflective
writing, which did not exist in the first phase.
During the first phase, sixty-two percent of all classroom discourse observed was
teacher voice. Student voice represented 38% of total class time and is considered non
argument student voice. Student voice that is categorized as argument includes the use of
terminology such as claims and evidence, warrants and backings or rebuttals. During the
first phase student voice centered on answering the teacher’s recall questions. Students were
not observed making individual claims and providing evidence. Learning situations that
embed elements of argument include the practice of presenting findings to small groups or
the whole class and defending those claims with evidence through warrants and backings.
As the teacher transitioned to a student-centered classroom and teacher voice diminished,
more opportunities for student voice were observed. Also, during the second phase, the
researchers found that the majority of student voice included aspects of scientific
argumentation (first phase: 0%; second phase: 35%). As noted in Table 6, there were
no observations of reflective thinking and writing in the first phase, but 7% of class time in
the second phase was devoted to these reflective practices, decreasing the total class time
devoted to voice for the second phase to 93%.
During the first phase, only one lesson included situations that required students to make
claims; during the second phase, however, six out of eight lessons included the terminology
of claims and evidence. As elements of argument were practiced, students' peers became
reviewers of the claims and began to evaluate them based on the strength or weakness of
the argument. In the second phase, rebuttals were observed more frequently. At
first the rebuttals were weak as can be seen in the following transcribed episode:
Cory: I disagree with that Tara.
Teacher: Let’s hear it.
Cory: Okay, Tara, name one thing, just one thing that is weathering that builds up the
Earth’s crust?
Tara: Volcanoes.
Cory: No, a volcano is not weathering.
Teacher: Can you back up that claim?
Cory: I know a volcano is not weathering. Oh, how did I say that before.
As embedded elements were practiced, the students became better able to refute and
defend claims. The following excerpt from the second phase depicts such an example.
Mara: Conifers don’t grow in the Tundra.
Teacher: Why don’t you see a lot of trees in the tundra?
Mara: They can’t survive because of the frozen ground.
Teacher: What do you mean by that?
Mara: Their roots need to go deep into the ground, but they can’t because it’s frozen.
Teacher: Did she back up her claim, class?
We observed that the pattern of argument paralleled the development of student voice.
Comparing RTOP scores related to elements of argument (13, 14, 15, 21, 22) and student
voice (16, 18, 19) during the first phase and the second phase indicates a clear difference.
As student voice and elements of argument were implemented in the classroom, the
environment changed. With more student to student interaction, the teacher’s role became
one of listener and resource person. This is illustrated by the RTOP subsets 24 and 25. The
average RTOP score for these subsets in the first phase was 0.4 and 0.2 respectively, while
the second phase results are 3.1 and 3.0, indicating a dramatic shift in the teacher’s role.

Discussion

Results from this study highlight the difficulties that an experienced teacher faced in
shifting her practice from a traditional approach to a more student-centered orientation. This
experienced teacher was a successful classroom practitioner who struggled with the change
process not because she was unwilling, but due to the shift in pedagogical practices that
were required when using the SWH approach.
The major findings of the study have resulted in three major claims related to
implementation of dialogue and argumentation within this grade 5 classroom. These were:
there is a critical role for the teacher in promoting argument; shifting the questioning pattern
produces more active student voice; and as student voice increases, elements of science
argumentation are practiced.
An important outcome of this research is that the shift in pedagogical orientation is not
easily achieved and requires time to occur. Experienced teachers have developed successful
pedagogical strategies and these have become entrenched as automatic ways of operating.
Even with constant interaction with a professional development person, there was an
18-month period before significant shifts in the teacher's practices were observed. The
taken in shifting the teacher’s practices has implications for professional development.
Experienced teachers are reluctant to give up their pedagogical strategies (Bright and Yore
2002; Yip 2001). These ways of teaching are a part of the teacher’s repertoire of skills that
have proven successful over time. As teachers are asked to shift these practices, it is
important to study the necessary support that experienced teachers need to accomplish this
change (Jeanpierre et al. 2005). Specific types of support that are necessary for a shift in
pedagogy that would allow teachers to implement inquiry-based classrooms and approaches
such as the SWH should be considered for future study.
Significantly, this shift was facilitated by the introduction of the argument terms of
claims and evidence, which allowed the teacher to change her pedagogical orientation.
By using these terms in the form of questions, a natural shift in the dialogue occurred as the
teacher moved from a monological direction to increased dialogical interaction between
herself and the students. Scores from the RTOP and the dialogical analysis clearly show this
shift. Such results suggest that implementation of inquiry approaches such as the SWH
should focus on teacher questioning. However, the
researchers would point out that other studies have shown that there are other factors that
also need to be addressed in terms of teacher change when implementing inquiry-based
approaches.
The researchers have shown that during the second phase of the study, students took
ownership of their learning, which this study suggests is evidenced by an increase in
student voice and subsequently an increase in the use of terminology such as claims and
evidence. This study also demonstrates that as students become familiar with these terms,
their ability to evaluate their peers' claims increases, and rebuttals and the refuting of
claims can be observed.
Importantly, then, this study suggests that as student voice increases and the teacher
begins to require students to provide support for their findings, the use of the terminology
of argument such as claims and evidence is a result of embedding this language into science
instruction. Our study indicates that as students use these science argument terms in the
context of “talking through” the results of their investigations, their use of the terms
increases. Future studies that look at the issue of explicit instruction of argument terms or
the use of embedded elements of argument in the elementary science classroom could
advance understanding in this area.
Finally, as professional development initiatives are determined, consideration of how to
increase the amount of student–student dialogical interaction within the elementary science
classroom should be pushed to the forefront. These findings are consistent with recent
literature on science argument showing that, with an increase in student voice, students
develop the skills necessary to design investigations, make claims, and provide evidence
for their findings (Mercer et al. 2004; Naylor et al. 2007; Seymour and Lehrer 2006).

References

Abell, S. K., Anderson, G., & Chezem, J. (2000). Science as argument and explanation: Exploring concepts
of sound in third grade. In J. Minstrell, & E. H. van Zee (Eds.) Inquiring into inquiry learning and
teaching in science (pp. 65–79). Washington, DC: American Association for the Advancement of
Science.
American Association for the Advancement of Science (1993). Benchmarks for science literacy. New York:
Oxford University Press.
Andrews, R., Costello, P., & Clarke, S. (1993). Improving the quality of argument, Final Report pp. 5–16.
Hull, UK: Esmee Fairbairn Charitable Trust/University of Hull.
Beck, J., Czerniak, C., & Lumpe, A. (2000). An exploratory study of teacher’s beliefs regarding the
implementation of constructivism in their classrooms. Journal of Science Teacher Education, 11, 323–
343.
Bereiter, C., Scardamalia, M., Cassells, C., & Hewitt, J. (1997). Postmodernism, knowledge building and
elementary science. Elementary School Journal, 97, 329–340.
Bloom, B. S., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of
educational goals, by a committee of college and university examiners. Handbook I: Cognitive domain.
New York: Longmans, Green.
Bright, P., & Yore, L. (2002). Elementary preservice teachers' beliefs about the nature of science and their
influence on classroom practice. Paper presented at the Annual Meeting of the National Association for
Research in Science Teaching, New Orleans, LA. (ED46182)
Carlsen, W. S. (1997). Never ask a question if you don’t know the answer: Tension in teaching between modeling
scientific argument and maintaining law and order. Journal of Classroom Interaction, 32(2), 14–23.
Cobb, P., & Bauersfeld, H. (1995). The emergence of mathematical meaning: Interaction in classroom
cultures. Studies in mathematical thinking and learning series. Hillsdale, NJ: Lawrence Erlbaum
Associates, Inc.
Crawford, B. A. (2000). Embracing the essence of inquiry: New roles for science teachers. Journal of
Research in Science Teaching, 37, 916–937.
Driver, R., Asoko, H., Leach, J., Mortimer, E., & Scott, P. (1994). Constructing scientific knowledge in the
classroom. Educational Researcher, 23(7), 5–12.
Driver, R., Newton, P., & Osborne, J. (2000). Establishing the norms of scientific argumentation in
classrooms. Science Education, 84, 287–312.
Duschl, R. A. (2003). Assessment of inquiry. In J. M. Atkin, & J. E. Coffey (Eds.) Everyday assessment in
the science classroom (pp. 41–59). Arlington, VA: NSTA Press.
Duschl, R.A., & Ellenbogen, K. (2002). Argumentation processes in science learning. Paper presented at the
conference on Philosophical, Psychological, and Linguistic Foundations for Language and Science
Literacy Research, University of Victoria, BC, Canada.
Duschl, R., Ellenbogen, K., & Erduran, S. (1999). Understanding dialogic argumentation. Paper presented at
the annual meeting of American Educational Research Association, Montreal, QC, Canada.
Erduran, S., Simon, S., & Osborne, J. (2004). TAPping into argumentation: Developments in the application
of Toulmin et al.’s argument pattern for studying science discourse. Science Education, 88, 915–933.
Ernest, P. (1998). Social constructivism as a philosophy of mathematics. New York: State University of New
York Press.
Fu, D., & Shelton, N. R. (2002). Teaching collaboration between a university professor and a classroom
teacher. Teaching Education, 13(1), 91–102.
Furman, M., & Barton, A. (2006). Capturing urban student voices in the creation of a science mini-
documentary. Journal of Research in Science Teaching, 43, 667–694.
Goodnough, K. (2006). Enhancing pedagogical content knowledge through self-study: An exploration of
problem-based learning. Teaching in Higher Education, 11, 301–318.
Hand, B., Norton-Meier, L., Gunel, M., & Akkus, R. (2005). K-6 Science Writing Heuristic Project: An MSP
project funded by the Iowa Department of Education. Presentation to The Iowa Department of
Education, Des Moines, IA.
Hand, B., Wallace, C. W., & Yang, E. (2004). Using a science writing heuristic to enhance learning outcomes
from laboratory activities in seventh-grade science: Quantitative and qualitative aspects. International
Journal of Science Education, 26(2), 131–149.
Handelsman, J., Ebert-May, D., Beichner, R., Bruns, P., Chang, A., & DeHaan, R. (2004). Scientific
teaching. Science, 304, 521–522.
Hogan, K., & Maglienti, M. (2001). Comparing the epistemological underpinnings of students’ and
scientists’ reasoning about conclusions. Journal of Research in Science Teaching, 38, 663–687.
Jeanpierre, B., Oberhauser, K., & Freeman, C. (2005). Characteristics of professional development that effect
change in secondary science teachers’ classroom practices. Journal of Research in Science Teaching, 42,
668–690.
Kelly, G. J., & Chen, C. (1999). The sound of music: Constructing science as sociocultural practices through
oral and written discourse. Journal of Research in Science Teaching, 36, 883–915.
Kitchener, K. S., & Fischer, K. W. (1990). A skill approach to the development of reflective thinking. In D.
Kuhn (Ed.) Developmental perspectives on teaching and learning thinking skills: Vol. 21 (pp. 48–62).
New York: Karger.
Kuhn, T. S. (1962). The structure of scientific revolutions. Chicago: University of Chicago Press.
Kuhn, T. S. (1970). The structure of scientific revolutions (2nd ed.). Chicago: University of Chicago Press.
Kuhn, D. (1992). Thinking as argument. Harvard Educational Review, 62, 155–178.
Kuhn, D. (1993). Science as argument. Science Education, 77, 319–337.
Kuhn, D., & Dean Jr., D. (2005). Is developing scientific thinking all about learning to control variables?
Psychological Science, 16, 866–870.
Lawson, A., Benford, R., Bloom, I., Carlson, M., Falconer, K., Hestenes, D., et al. (2002). Evaluating college
science and mathematics instruction: A reform effort that improves teaching skills. Journal of College
Science Teaching, 31, 388–393.
Lapadat, J. (2000). Construction of science knowledge: Scaffolding conceptual change through discourse.
Journal of Classroom Interactions, 35(2), 1–14.
Lapadat, J. (2002). Relationships between instructional language and primary students’ learning. Journal of
Educational Psychology, 94, 278–290.
Lederman, N. G., & Niess, M. L. (2000). Problem solving and solving problems: Inquiry about inquiry.
School Science and Mathematics, 100, 113–116.
Lemke, J. (1990). Talking science: Language, learning, and values. Norwood, NJ: Ablex.
Lew, L. (2001). Development of constructivist’s behaviors among four new science teachers prepared at the
University of Iowa. Unpublished doctoral dissertation, University of Iowa, Iowa City.
Luykx, A., & Lee, O. (2007). Measuring instructional congruence in elementary science classrooms:
Pedagogical and methodological components of a theoretical framework. Journal of Research in Science
Teaching, 44, 424–447.
Macbeth, D. (2003). Hugh Mehan’s Learning Lessons reconsidered: On the differences between the
naturalistic and critical analysis of classroom discourse. American Educational Research Journal, 40,
239–280.
MacIsaac, D., & Falconer, K. (2002). Reforming physics education via RTOP. The Physics Teacher, 40, 479–
485.
McGinn, M., & Roth, W. -M. (1999). Preparing students for competent scientific practice: Implications of
recent research in science and technology studies. Educational Researcher, 28(3), 14–24.
Mehan, H. (1979). Learning lessons: Social organization in the classroom. Cambridge, MA: Harvard
University Press.
Mercer, N., Dawes, L., Wegerif, R., & Sams, C. (2004). Reasoning as a scientist: Ways of helping children to
use language to learn science. British Educational Research Journal, 30, 359–377.
Mercer, N., Wegerif, R., & Dawes, L. (1999). Children’s talk and the development of reasoning in the
classroom. British Educational Research Journal, 25, 95–111.
Millar, R., & Osborne, J. (1998). Beyond 2000: Science education for the future. London: King’s College
London.
National Research Council (1990). Fulfilling the promise: Biology education in the nation’s schools.
Washington, DC: National Academy Press.
National Research Council (1996a). The role of scientists in the professional development of science
teachers. Washington, DC: National Academy Press.
National Research Council (1996b). National science education standards. Washington, DC: National
Academy Press.
National Research Council (2000). Inquiry and the national science education standards. Washington, DC:
National Academy Press.
Naylor, S., Keogh, B., & Downing, B. (2007). Argumentation and primary science. Research in Science
Education, 37, 17–39.
Newman, W., Abell, S., Hubbard, P., McDonald, J., Otaala, J., & Martini, M. (2004). Dilemmas of teaching
inquiry in elementary science methods. Journal of Science Teacher Education, 15, 257–279.
Newton, P., Driver, R., & Osborne, J. (1999). The place of argumentation in the pedagogy of school science.
International Journal of Science Education, 21, 553–576.
Norris, S. P., & Phillips, L. M. (2003). How literacy in its fundamental sense is central to scientific literacy.
Science Education, 87, 224–240.
Osborne, J. F., Erduran, S., Simon, S., & Monk, M. (2001). Enhancing the quality of argument in school
science. School Science Review, 82(301), 63–70.
Pera, M. (1994). The discourses of science. Chicago: University of Chicago Press.
Polman, J. (2004). Dialogic activity structures for project-based learning environments. Cognition and
Instruction, 22, 431–466.
Polman, J., & Pea, R. (2000). Transformative communication as a cultural tool for guiding inquiry science.
Science Education, 85, 223–238.
Quinn, V. (1997). Critical thinking in young minds. London: David Fulton.
Ritchie, S., & Tobin, K. (2001). Actions and discourses for transformative understanding in a middle school
science class. International Journal of Science Education, 23, 283–299.
Sawada, D., Piburn, M., Falconer, K., Turley, J., Benford, R., & Bloom, I. (2000). Reformed teaching
observation protocol (RTOP) (Tech. Rep No. IN00-1). Tempe, AZ: Arizona State University, Arizona
Collaborative for Excellence in the Preparation of Teachers.
Schwarz, B., Neuman, Y., Gil, J., & Ilya, M. (2003). Construction of collective and individual knowledge in
argumentation activity. Journal of the Learning Sciences, 12, 219–256.
Seymour, J., & Lehrer, R. (2006). Tracing the evolution of pedagogical content knowledge as the
development of interanimated discourses. Journal of the Learning Sciences, 15, 549–582.
Siegel, H. (1995). Why should educators care about argumentation? Informal Logic, 17(2), 159–176.
Simon, S., Erduran, S., & Osborne, J. (2006). Learning to teach argumentation: Research and development in
the science classroom. International Journal of Science Education, 28, 235–260.
Solomon, J. (1998). About argument and discussion. School Science Review, 80(291), 57–62.
Toulmin, S. (1958). The uses of argument. Cambridge: Cambridge University Press.
Treagust, D. (2007). General instructional methods and strategies. In S. Abell, & N. Lederman (Eds.)
Handbook on research in science education (pp. 373–391). Mahwah, NJ: Lawrence Erlbaum Associates.
Van Zee, E. (2000). Using questioning to guide student’s thinking. Journal of the Learning Sciences, 6, 227–
269.
Wallace, C., & Narayan, R. (2002). Acquiring the social language of science: Building science language
identities through inquiry-based investigations. Paper presented at the Conference on Philosophical,
Psychological, and Linguistic Foundations for Language and Science Literacy Research, University of
Victoria, B.C., Canada.
Weiss, I., & Pasley, J. (2004). What is high quality instruction? Educational Leadership, 61(5), 24–28.
Weiss, I., Pasley, J., Smith, P., Banilower, E., & Heck, D. (2003). Looking inside the classroom: A study of K-
12 mathematics and science education in the United States. Chapel Hill, NC: Horizon Research.
Wragg, E. (1993). Primary teaching skills. London: Routledge.
Yip, D. (2001). Promoting the development of a conceptual change model of science instruction in
prospective secondary biology teachers. International Journal of Science Education, 23, 755–770.
Zady, M., Portes, P., & Ochs, V. (2003). Examining classroom interactions related to differences in students’
science achievement. Science Education, 87, 40–63.