
INGRID S. WEILAND, RICK A. HUDSON and JULIE M. AMADOR

PRESERVICE FORMATIVE ASSESSMENT INTERVIEWS: THE DEVELOPMENT OF COMPETENT QUESTIONING
Received: 13 June 2012; Accepted: 18 January 2013

ABSTRACT. Recent research and reform documents in mathematics and science education
have highlighted the importance of preservice teacher education that focuses on
understanding students’ reasoning and modifying instruction accordingly. Utilizing the
constructs of core practices and professional noticing as lenses with which to examine
the development of preservice teachers’ questioning practice, we conducted a case study
of 1 pair of preservice teachers as they performed weekly formative assessment interviews
to elicit student thinking during 1 semester. Our study was driven by the following
research questions: (1) How do preservice teachers develop their questioning practice and
ability to notice students’ thinking about mathematical and science concepts? (2) How can
these questioning practices be further developed? Results suggest that with weekly
practice and reflection, preservice teachers can develop their questioning practice within
the context of face-to-face interaction with students and that the ways they question
students can change when given opportunities to interact with them and analyze their
thinking. By having participants attend to what they were professionally noticing about
students’ thinking, we contend that they learned to adapt their questioning techniques to
ask students more competent questions. Participants also exhibited 2 areas of questioning
practice ripe for improvement—asking leading questions and missing opportunities to
probe students’ thinking. While providing clinical field experiences to preservice teachers
is not novel, we suggest the importance of foregrounding practices that elicit student
mathematical and scientific thinking through an iterative process of enactment and
reflection to develop questioning practice and the ability to notice.

KEY WORDS: core practices, formative assessment, preservice teacher education, professional noticing, questioning

Recent research and reform documents in mathematics and science education have highlighted the importance of training preservice teachers
to develop an understanding of students’ reasoning and to modify their
instruction accordingly (NCTM, 2000; NSTA, 2000); however, coming to
understand how students think requires that teachers pose problems and
competently probe in ways that draw out and build on students’
reasoning. Teachers can then make sense of the students’ language and
actions by continually interpreting student behavior and thinking from

Electronic supplementary material The online version of this article (doi:10.1007/s10763-013-9402-3) contains supplementary material, which is available to authorized users.

International Journal of Science and Mathematics Education (2014) 12: 329–352
© National Science Council, Taiwan 2013

students’ perspectives (Steffe & Thompson, 2000). With this understanding, the teacher can develop hypotheses about students’ cognition and
design tasks to provoke creative activity (Hackenberg, 2005).
The act of interviewing students has been used in the mathematics and
science education communities to examine the thinking of students
individually or collectively in small groups. Piagetian “clinical inter-
views” have been used to study students’ concepts (Posner & Gertzog,
1982). “Diagnostic interviews” have been called “interviews about
instances” (Bell, Osborne, & Tasker, 1985). Steffe & Thompson (2000)
have used multiple, sustained teaching interactions to examine students’
longitudinal development and to test their teacher/researcher hypotheses
about how concepts develop. They refer to this longitudinal approach as a
teaching experiment, which includes a number of teaching episodes,
typically with pairs of children. They stated, “we have to accept the
students’ mathematical reality as being distinct from ours…students’
mathematics is indicated by what they say and do as they engage in
mathematical activity, and a basic goal of the researchers in a teaching
experiment is to construct models of students’ mathematics” (p. 268).
This study documents how preservice teachers develop their question-
ing practice and ability to professionally notice as a result of their
participation in an experimental field experience which focuses on
eliciting students’ mathematical and scientific thinking. The field
experience is designed to provide preservice teachers with close
interactions with elementary students and to allow them to develop
personal models of student reasoning. The preservice teachers engage in
formative assessment interviews—adapted from the teaching experiment
methodology (Steffe & Thompson, 2000)—to help them understand
children’s mathematical and scientific reasoning through questioning. In
this article, we address the following questions:
1. From participating in this field experience, how do preservice teachers
develop their questioning practice and ability to notice students’
thinking about mathematical and science concepts?
2. How can the preservice teachers’ questioning practices be further
developed?

THEORETICAL FRAMEWORK AND RELEVANT LITERATURE

Many educational researchers have noted that effective teacher preparation programs provide preservice teachers with clinical experiences in order to engage in specific research-based practices. Preservice teachers are then given opportunities to improve through repeated use and reflection
(Darling-Hammond, 2010; Grossman, Hammerness & McDonald, 2009;
Lampert, 2010). These research-based practices, which have been called
“core practices” (Grossman et al., 2009), “high-leverage practices” (Ball,
Sleep, Boerst, & Bass, 2009), and “generative practices” (Franke &
Kazemi, 2001), are considered key elements of teaching that can
potentially result in highly effective instruction. Common to these three
terms are the following criteria: the practices (a) improve learning and achievement of
students, (b) are done frequently, (c) can be applied across content and
contexts, and (d) support the integrity of the teaching profession.
Lampert (2010) notes the various uses of the term “practice” and how
differing definitions can affect approaches to teacher education. In this
paper, we adopt the definition provided by Grossman and colleagues,
which stipulates that practices are developed through social interaction (in
this case, with students) whereby preservice teachers engage in
“pedagogies of enactment.” Pedagogies of enactment allow preservice
teachers to bridge the theory they learn in methods courses to interactions
with students in the classroom. Preservice teachers must engage in
teaching practices with students and with subject matter in order to
develop understandings of the complexity of teaching as well as a
professional identity. Furthermore, teacher educators must focus on the
theoretical and practical aspects of any given practice.
Questioning meets the aforementioned criteria of core practices and is
driven by both theory and action. The theoretical underpinnings of
questioning practice lie within cognitively guided instruction (Carpenter,
Fennema, & Franke, 1996). Cognitively guided instruction approaches
teaching with the notion that both teachers and students arrive in the
classroom with prior conceptions of phenomena and that teachers must
uncover this prior knowledge, interpret the knowledge with regard to
misconceptions or missing information, and modify instruction to build
on current understandings. Students bring “informal and intuitive”
knowledge to the classroom, and it is the responsibility of the teacher to
attend to these understandings during instruction. At the core of this
framework is focusing on what students know rather than what they do
not know. In action, questioning is a core practice that can enhance
teachers’ understandings of what students know by uncovering students’
prior conceptions of a mathematical or science topic. Questioning allows
teachers to elicit students’ thinking (Carpenter et al., 1996), interpret this
thinking, and modify instruction accordingly. In this study, we examined
how preservice teachers develop the core practice of questioning to elicit
students’ thinking about mathematical and science concepts. Our
participants engaged in weekly formative assessment interviews, which
provided them opportunities to interact with students and engage in
pedagogies of enactment to focus on particular students’ thinking.
Utilizing questioning as a core practice that allows preservice teachers
to focus on students’ thinking, we examined the development of
preservice teachers’ ability to question students during weekly formative
assessment interviews that occurred over the course of one semester.

Teacher Questioning
Inquiry-based instruction requires teachers to be able to question
students in systematic ways that build on students’ knowledge and
focus students’ attention on important scientific and mathematical
content. The opportunities children have to learn are directly
impacted by the questions they are asked. NCTM (2007) advocates
that teachers pose “questions and tasks that elicit, engage, and
challenge each student’s thinking” and ask “students to clarify and
justify their ideas orally and in writing” (p. 45). Such questioning
practices provide a stark contrast to the Initiate–Respond–Evaluate
discourse patterns that have often characterized classroom discourse
in traditional teaching settings (Mehan, 1979).
International comparisons of teachers’ classroom questioning practices
have shown that the questioning practices of American teachers, in
particular, differ greatly from their peers in other countries. Kawanaka &
Stigler (1999) found that American teachers are more likely to ask low-level
questions (e.g. questions that can be answered with a response of yes/no)
than teachers in Germany or Japan. They are also less likely than their peers
to ask students to explain or describe their thinking, although the percentage
of these higher-order questions asked in all three countries remained small.
Several research studies have investigated the modes and charac-
teristics of teachers’ questions in classroom and small-group settings.
These studies have shown that teachers’ questions can serve an
important role to guide students’ thinking (van Zee & Minstrell,
1997) and to promote student justification and generalization
(Martino & Maher, 1999). Sahin & Kulm (2008) characterized
teachers’ questions into three types: factual, probing, and guiding.
These questions differ, respectively, by requiring students to provide
a predetermined response, by prompting students to explain or justify
their thinking, and by scaffolding students’ thinking toward a
particular concept. In observing two mathematics teachers’ questioning practices, Sahin & Kulm (2008) found that the teachers, who
were using reform-based textbooks, tended to ask a preponderance of
factual questions. Franke, Webb, Chan, Ing, Freund, & Battey (2009)
reported on teacher questioning practices after participating in
professional development sessions focused on algebraic reasoning.
These teachers consistently asked questions to elicit student thinking
(e.g. How did you get that?), but the ways they responded to
students’ explanations varied widely, ranging from general questions
to responses specific to the students’ explanations. As Sahin & Kulm
(2008) and Franke et al. (2009) have documented, even when
teachers have access to reform-based curriculum materials and
professional development aimed at enhancing teacher knowledge,
teachers still struggle to question students productively. Chin (2006)
examined in-service teachers’ use of questioning in the science
classroom and proposed a framework for understanding “question-
ing-based discourse.” She found that teachers responded to students
in four ways: (a) affirm the answer and continue direct instruction,
(b) affirm the answer and then ask probing questions, (c) correct the
student, or (d) evaluate the response or provide neutral comments by
asking follow-up questions. While these findings are interesting,
Chin’s research did not focus on interviews specifically nor did her
participants include preservice teachers, therefore highlighting a need
for continued study.
Although the importance of teachers’ questions has been docu-
mented widely in educational research, very little research has
focused on how teachers’ questioning practice develops. Moyer &
Milewicz (2002) investigated preservice teachers’ use of interviews
through an examination of their abilities to ask students questions
about their mathematical thinking. They examined questioning
abilities during one interview session and found that preservice
teachers’ questions fell into three categories: checklisting (listing
questions off of the interview protocol), instructing rather than
assessing, and probing or follow-up questions. Although this study
provides insight about the types of questioning that preservice
teachers employ, it does not address how preservice teachers’
questioning changes over time. Using two data points throughout a
practicum experience, Ralph (1999) argued that preservice teachers
improved the clarity of their questions, increased wait time for
student responses, and posed questions to a wider variety of
students. However, one of the limitations of Ralph’s study is an
overreliance on self-assessment of questioning abilities. These studies
suggest that there is still much to understand about how novice teachers learn to develop their questioning practices.

Professional Noticing
A strong relationship exists between teachers’ monitoring of students’
thinking and the ability to pose timely questions to deepen student
understanding (Martino & Maher, 1999). To do so, teachers must first come
to professionally notice and understand the methods of reasoning students
are using in the act of problem solving and scientific inquiry (Jacobs, Lamb,
Philipp, & Schappelle, 2011). The construct of professional noticing is a
“way to understand how teachers make sense of complex classrooms”
(Jacobs et al., 2011, p. 98) and encompasses where teachers focus attention
and how they interpret and reflect on their discoveries about student
thinking. Van Es & Sherin (2002, 2008) describe professional noticing in
mathematics as what teachers attend to during instruction; for the purposes of
this paper, we apply the construct of professional noticing to the content
areas of both mathematics and science as we examine the core practice of
questioning because of its applicability across disciplines (Sherin, Russ, &
Colestock, 2011). (Further use of the term “noticing” in this paper refers to
this construct, unless otherwise specified).
Van Es & Sherin (2002, 2008) claim that the act of noticing involves three
related aspects: (a) identifying what is noteworthy about classroom
interactions, (b) connecting classroom events to broader principles of
teaching and learning, and (c) using what one knows about a situation’s
context to reason about the situation. In van Es and Sherin’s view, teacher
education has not traditionally focused on helping teachers learn to interpret
classroom interactions in the midst of instruction. Recent work by Jacobs et
al. (2011) has found that deciding how to respond while teaching requires
expertise in attending to children’s strategies and interpreting their thinking
and that professional noticing can be learned through support.
Researchers have begun to investigate what supports are necessary to help
teachers focus their professional noticing on aspects of teaching that are
inherent to reform initiatives, such as the classroom discourse required to
build conceptual understanding. Such interventions are intended to move
teachers past noticing superficial aspects of reform teaching like classroom
management or use of manipulatives. For example, Scherrer & Stein (2012)
and van Es & Sherin (2008) found that teacher participants improved in their
ability to recognize and interpret children’s mathematical thinking when they
shifted the focus of what they noticed.
Van Es (2011) argues that professional noticing during discourse encourages the interpretation of student thinking and formulation of
evidence that serves as a catalyst for future teaching decisions.
Like Jacobs, Lamb & Philipp (2010), our interest lies in what teachers
notice about children’s thinking, which includes three interrelated skills:
(a) attending to children’s strategies, (b) interpreting children’s mathe-
matical understandings, and (c) deciding how to respond to students on
the basis of their understandings. Their results align with Jacobs et al.
(2011) and suggest that the ability to notice is not routine for novice
teachers, but is more prevalent among emerging teacher leaders. Research
by Star, Lynch, & Perova (2011) found that teacher preparation programs
can positively impact preservice teachers’ abilities to attend to thinking
strategies and interpret understandings. Such findings suggest that the
ability to professionally notice can be learned even early on, and teaching
experience coupled with continuous professional development that
focuses on children’s thinking has the potential to support further
development.

METHOD

Context
At the university where the study took place, all preservice teachers
majoring in elementary education enroll concurrently in a mathematics
methods course, a science methods course, and a field experience focused
on both mathematics and science instruction during one semester. As a
part of a larger research project, preservice teachers in the experimental
section of the field experience engage in an iterative weekly process
throughout the semester which consists of conducting formative assess-
ment interviews, building sharable models of student thinking, planning
and teaching whole-class lessons, and collectively reflecting via a
modified version of lesson study (Lewis, 2000). This paper focuses
solely on the formative assessment interview component of the
experimental field experience approach.
During the first 2 weeks of the field experience, preservice teachers
become oriented with the experimental approach and the school setting.
Next, they spend 6 weeks focusing on mathematics and 5 weeks focusing
on science. Each week, a pair of preservice teachers conducts a formative
assessment interview (FAI) with two elementary students from their host
classroom. The FAIs, modeled after Steffe & Thompson’s (2000)
teaching experiments, are intended to help preservice teachers gain an
understanding of how individual students think and reason about specific
content. The purpose is also formative in that the findings influence
instructional decisions regarding planning and teaching of the whole-class
lessons taught the following week. As a part of this process, the
preservice teachers develop weekly interview protocols based on a
mathematics or science topic.

Research Design and Data Collection


To investigate the preservice teachers’ questioning, we conducted a case
study, following the same pair of preservice teachers over the course of a
semester. We considered the pair as a single case as they worked together
closely to develop and conduct FAIs. Case studies are best used when
“how” questions are being asked about a contemporary set of events over
which the researcher has little or no control (Yin, 2009). We therefore
employed a case study methodology to understand how two preservice
teachers questioned elementary students within the context of FAIs and
how their questioning techniques evolved over time.
The FAIs with elementary students were video-recorded for 10 weeks
throughout the semester; during this time, the two preservice teachers
rotated between the two roles as lead and witnessing interviewer. The first
six interviews focused on mathematics content, while the final four
interviews addressed science content. Each video was transcribed
verbatim, and student and preservice teacher actions were included to
construct a rich record of each interview. To triangulate data and support
the construct validity of our results, we analyzed the FAI protocols
written by the preservice teachers and the videos and transcripts of
formative assessment interviews. We provide thick description of the case
study using these data to support the validity of our findings. Preservice
teachers completed a Formative Assessment Interview Reflection and
Planning form (see Electronic Supplementary Material (ESM), Appendix
A) immediately following each interview as part of the class. We
analyzed these to understand the preservice teachers’ noticing of student
thinking. Finally, we engaged in peer debriefing of our findings with
colleagues to ensure that our work resonates with other mathematics and
science education researchers (Creswell, 2002).

Participants
Our case study includes two participants, Kelly and Caleb, who were both
elementary education majors in their junior year at a large, American
university. At the time of the study, Kelly and Caleb were working
with four peers in a first-grade classroom as part of their field
experience course. We selected this pair to study because their field
instructor indicated that, given their professional dispositions, they would likely be receptive to our examination of their questioning practices. In
addition, based on the results of a survey designed to measure
teachers’ beliefs (Hudson, Kloosterman & Galindo, 2012), Kelly
indicated she felt more confident to teach mathematics than science.
Caleb, on the other hand, felt confident to teach both mathematics
and science.

Data Analysis
The purpose of this study was to examine the change in questioning
techniques used by preservice teachers over time and in depth. We
documented the types of questions posed by preservice teachers over
the course of the semester in which they participated in the
experimental field experience, striving to understand how they were
questioning students during FAIs and seeking to explicate changes in
their questioning over time.
To analyze the data, we created a framework based on recent
studies focused on questioning. Reliability of case study methodology
can be addressed by following a case study protocol (Yin, 2009); we
therefore utilized research procedures similar to those described by
Moyer & Milewicz (2002). We began with the codes created by
Moyer & Milewicz (2002) because they provided a way to categorize
and label preservice teachers’ questions during interviews with
students. To identify questioning types, we first considered two
categories presented by Moyer & Milewicz (2002): instructing rather
than assessing and probing or follow-up questions. We did not
include their third category, “checklisting,” which signifies instances
when questions are posed as a list from the protocol, because we
considered such questions as instances of posing problems and
therefore coded them as such. We used this framework to arrive at the
following three major types of questions: instructing rather than
assessing, probing or follow-up questions, and problem-posing
questions. We then subcategorized these three question types as
described below.
To begin, an initial set of interview data was coded by each of the three
authors independently. Only questions posed in the interrogative form
were coded; thus, imperative statements that merely implied the asking of
a question were not included in the analysis. To promote inter-rater reliability and consistent coding, the authors met to reconcile coding
differences and to refine the questioning coding framework. Based on
these initial discussions, “instructing rather than assessing questions”
were subcategorized into leading questions or teaching/telling.
“Probing or follow-up questions” were subcategorized into incorrect
response, nonspecific, and competent questions. “Problem-posing
questions” were subcategorized into framing questions, protocol
questions, and new questions. The prevalence of various prompts in
the initial analysis also led to one additional code: repeated questions.
Further descriptions and examples of these codes are provided in
Table 1 and a summary of codes is depicted in Fig. 1.
Following the initial analyses, all subsequent interview transcripts
were coded by two of the authors and differences in codes were
mutually discussed and reconciled until agreement was reached on
how the question should be coded. After coding all questions, the
number of each type of question for each FAI was counted and
percentages were calculated for the question types.
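
To make this tallying step concrete, the short sketch below (our illustration, not the authors’ analysis tooling; the data literals are hypothetical) counts coded questions overall and per FAI and converts the counts to percentages, the same arithmetic that yields figures such as 105/539 ≈ 19.5 % in Table 1.

```python
# Sketch of the tallying step: count question codes overall and per FAI,
# then express each count as a percentage. Data entries are invented.
from collections import Counter

# One (FAI number, code) pair per coded question; the study had 539 in total.
coded_questions = [
    (1, "problem-posing: protocol"),
    (1, "follow-up: nonspecific"),
    (2, "follow-up: competent"),
    # ...
]

overall = Counter(code for _, code in coded_questions)
total = sum(overall.values())
for code, count in overall.most_common():
    print(f"{code}: {count} ({100 * count / total:.1f} %)")

# Per-FAI percentages, the form of data reported in Tables 2 and 3
fai_totals = Counter(fai for fai, _ in coded_questions)
for (fai, code), count in sorted(Counter(coded_questions).items()):
    print(f"FAI {fai}, {code}: {100 * count / fai_totals[fai]:.1f} %")
```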
To investigate the relationship between Kelly and Caleb’s
questioning practices and their abilities to notice, we analyzed the
FAI Reflection and Planning forms. We created a rubric containing
five dimensions (see ESM Appendix B) to analyze the interpretation
of student thinking and inferences about student understanding.
Following Jacobs et al. (2010), the rubric was structured around
three categories of professional noticing: robust (three points),
limited (two points), and lacking (one point). Therefore, combined
scores for each form ranged from a high score of 15 to a low score
of 3. Once the rubric was refined, each author independently coded
the forms with 90.2 % scoring agreement and the scores were
reconciled so that there was a single score for each dimension. After
all the codes were reconciled, we computed a summative score and
quantitatively analyzed the data to examine changes in Kelly and
Caleb’s noticing patterns over the course of the semester.
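
As an illustration of the rubric arithmetic described above (a minimal sketch with invented scores, not the authors’ instrument), each form receives five dimension ratings of 1 to 3 that are summed to a 3 to 15 noticing score, and coder agreement can be expressed as simple percent agreement:

```python
# Sketch of the noticing-rubric scoring: five dimensions, each rated
# robust (3), limited (2), or lacking (1); the ratings below are invented.
def summative_score(dimension_scores):
    """Sum five dimension ratings; the result ranges from 3 to 15."""
    assert len(dimension_scores) == 5
    assert all(s in (1, 2, 3) for s in dimension_scores)
    return sum(dimension_scores)

def percent_agreement(coder_a, coder_b):
    """Share of dimension-level ratings on which two coders agree."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100 * matches / len(coder_a)

coder_1 = [3, 2, 2, 3, 1]   # hypothetical ratings for one form
coder_2 = [3, 2, 1, 3, 1]
print(summative_score(coder_1))              # 11
print(percent_agreement(coder_1, coder_2))   # 80.0 (the study reports 90.2 % overall)
```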

FINDINGS

Kelly and Caleb engaged in a variety of questioning techniques throughout the semester. From the nine types of questions outlined in
Table 1, a total of 539 questions were coded over ten FAIs. Each question
type is elaborated below.
TABLE 1
Codes and examples from data (code descriptions adapted from Moyer & Milewicz, 2002)

Problem-posing: protocol (105 questions, 19.5 %)
Description: Task/question is directly read or slightly modified from the task/question that appeared in the FAI protocol.
Example: Kelly: “So, Mr. Caleb …has eleven students in his class, okay? He has both boys and girls in his class. How many different combinations of boys and girls could he have in his class?”

Problem-posing: framing (17 questions, 3.2 %)
Description: Task/question that does not appear on the FAI protocol, but that the PST uses to introduce or frame the task/question that appears on the protocol.
Example: Protocol states: “If I started with 10 cubes, how many are under the cup?” Kelly: “I’m going to take some and I’m going to put them under a cup, okay. So pay attention here. Why don’t you guys count how many are not in the cup right now?”

Problem-posing: new (21 questions, 3.9 %)
Description: Task/question that does not appear on the FAI protocol, but that the preservice teacher poses to students.
Example: Protocol states: “Have the student move magnets over the iron filings. Ask students what they notice and see if they can explain what is happening.” Caleb had asked the students to do the above task. He posed a new problem when stating: “Okay, so now you use magnets and you can try to make this move, see what you find.”

Problem-posing: repeat (80 questions, 14.8 %)
Description: Task/question that has already been asked, but the teacher repeats in order to refocus the students’ attention or react to student questions about the initial problem posed.
Example: Caleb: “So what do you notice about the colors? See right there you’re touching white. What do you notice about that?” [problem-posing: protocol] Student: “I don’t know. It’s very sticky.” Caleb: “Do you notice anything about the colors?” [problem-posing: repeat]

Instructing: teaching and telling (27 questions, 5.0 %)
Description: Question moves away from assessing student thinking to teaching the student by telling them a fact or concept.
Example: Caleb: “Well, if you look at this, what it does, well, it is supposed to kind of balance it out.… Okay? So, see if I push really, really hard, this isn’t lined up.”

Instructing: leading question (28 questions, 5.2 %)
Description: Question intended to focus students’ attention on a particular aspect or that provides prompting hints or clues.
Example: Kelly: “So, that, that [problem] uses this symbol, (points to addition sign) right? If we were to write that out.”

Follow-up: nonspecific (105 questions, 19.5 %)
Description: Question in reaction to a student response, but one that lacks specificity or does not react to student words or actions.
Example: Caleb: “Why do you think so?”

Follow-up: competent (70 questions, 13.0 %)
Description: Question in reaction to a student response in which the preservice teacher attempted to build on the student’s thinking to cause the student to justify thinking, explore reasoning, or create a cognitive conflict to reorganize thinking.
Example: Caleb: “Do you notice anything about the colors?” [problem-posing: repeat] Student: “Yeah, the different colors.” Caleb: “So when you move the magnets, do you notice anything about how the different colors affect each other?” [follow-up: competent]

Follow-up: incorrect (12 questions, 2.2 %)
Description: Question in reaction to a student response that is solely presented to tell the student that they are incorrect. In these instances, the PST does not follow up on questions that they perceive the student has answered correctly.
Example: Caleb: (Long pause) “So, since you are blowing, do you think you always have to touch an object to make it move?” [problem-posing: protocol] Student: “Yeah.” Caleb: “You do?” [follow-up: incorrect]

Total questions coded: 539
[Figure 1 is a pie chart of the overall distribution of question codes: Protocol 20 %, Nonspecific 19 %, Repeat 15 %, Clarifying 14 %, Competent 13 %, IRTA 10 %, Other 9 %.]
Fig. 1. Summary of overall questioning codes. IRTA = instructing-rather-than-assessing (teaching and telling or leading questions)

Problem-Posing Questions
Kelly and Caleb asked problem-posing: framing questions 17 times
(3.2 % of questions asked). During an FAI focusing on the concept of
balance using a triple beam balance, Kelly asked the following problem-
posing: framing question:

Kelly: What happens is if these two [sides of the balance] have the same amount in them,
then this stick is going to match this in the middle. So, see if I push really, really hard, this
isn’t lined up. Okay? Does this remind you of something? Maybe something you have seen?
Student: Yeah.
Kelly: Yeah, do you know what I am talking about?
Student: Yeah.
Kelly: Yeah, a see saw, or a teeter totter or something. Yeah. Have you ever played on
one of those before?

Kelly familiarized the student with the concept of balance by referring to a teeter totter. She made the assumption that the student had prior knowledge
of balance through experiences on a teeter totter and utilized this assumed
prior knowledge to frame the following protocol question, “What do you
notice happens when I put one of these blocks on one side?”
In addition to framing questions, Kelly and Caleb asked questions from
the protocols they had prepared prior to the FAI. Problem-posing:
protocol (19.5 %) and problem-posing: repeat (14.8 %) questions were
those that were taken directly from the written protocol or changed only
slightly so that the intent of the question remained the same.
New problems were asked 21 times (3.9 %) and consisted of
questions that deviated from the protocol in terms of conceptual
content rather than minor wording. For example, during FAI 2, Caleb
sought to find out what his students thought about composing and
decomposing the number “eight.” In the FAI protocol, Caleb planned
to ask the following questions:
• If I have 4 chips and 4 cubes, how many chips and cubes do I have all together?
• Do I have to use 4 chips and 4 cubes to get 8 or is there maybe another way to get 8 total objects?
• Are there any other combinations of chips and cubes I can use so that there are still 8 total items on the table?

During the FAI, the students presented different combinations of chips and cubes so there were a total of eight items. For example, one student showed two chips and six cubes to demonstrate 2 + 6 = 8.
The student used a strategy of replacing a cube for each chip. As the
student did this, Caleb posed a new question, “Can you take away all
of them?” This question was not originally in the protocol, but was
asked as a new task based on the student–teacher interaction. In this
case, the student was able to represent the situation and exclaimed
that it represented, “Eight plus zero.”
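
For reference (our sketch, not part of the study materials), the task space behind Caleb’s protocol questions is small enough to enumerate directly: every whole-number split of 8 into chips and cubes, including the “eight plus zero” case the student produced.

```python
# Enumerate all whole-number (chips, cubes) pairs that total 8.
combinations = [(chips, 8 - chips) for chips in range(9)]
for chips, cubes in combinations:
    print(f"{chips} chips + {cubes} cubes = 8")
# Nine combinations in all, from 0 + 8 up to 8 + 0.
```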
Often after asking questions, Kelly and Caleb would clarify the
response, asking the student to repeat his or her answer or repeating
the response in the form of a question (i.e. Student: Eleven. Kelly:
Eleven?). Approximately 13.0 % of all questions Kelly and Caleb
asked were focused on clarifying student responses.

Instructing-Rather-Than-Assessing Questions
We also identified instances when Kelly and Caleb instructed rather
than assessed when asking questions. At times, they told the student
the answer within their question (5.0 %) or incorporated leading
questions (5.2 %) that suggested to the students a solution path. For
example, in FAI 5, Kelly guided the student in solving the problem to the point that it was difficult to determine the student’s thinking. The student
was given the following problem: “Bill had five donuts, George gave
him some more and now Bill’s total is eighteen. How many did
George give him?” The following ensued:
Student: I don’t know. I think it is eight.
Kelly: Keep working, you can use the cubes if you need to, Erica.
Student: That is nineteen.
Kelly: Nineteen. How did you get that? He gave him nineteen to get to eighteen?
Student: Eighteen, nineteen.
Kelly: Okay, here, think about this. So, he had five, right? You can use the counters
[counts out five counters for him]. He had five and he got some, we don’t know how
many though, and he ended with eighteen. How many do you think he gave him?

Kelly told the student to keep working, implying that the answer was incorrect.
Kelly led the student to use cubes to solve the problem, which hindered her
ability to uncover how the student was solving the problem. Rather than
eliciting the student’s thinking, Kelly continued to guide the student to use
counters. In fact, she even counted out the five counters. This extensive
guidance hindered the potential for the student to represent and solve the
problem. This incident was therefore coded as instructing: teaching and telling.

Follow-Up Questions
The most notable trends in the data were evident in the follow-up
questioning that took place after a problem had been posed and the
student had responded. As described above, follow-up questions were
separated into three main categories: follow-up: incorrect, follow-up:
nonspecific, and follow-up: competent. Questions that indicated to the
student an incorrect answer were deemed follow-up: incorrect (2.2 % of all
questions). Of particular interest to our first research question is that
19.5 % of all questions asked were coded as follow-up: nonspecific and
13.0 % were coded as follow-up: competent. Examples of these findings
are described in detail below.

Development of Competent Questioning


Follow-up: nonspecific questions were those that were general in nature,
meaning they focused on exposing student thinking and reasoning, but
did not necessarily cause the student to reorganize or conceptualize
mathematical or scientific processes or draw directly on the actions and
utterances of the student. For example, Kelly interviewed a student about
her understanding of balance. When the student was able to add weight to
one side of a triple beam balance so that both sides were equal, she
followed up with a nonspecific probe: “So, why do you think that
happened?” Rather than pose a general question, Kelly could have asked
a more detailed question that incorporated the student’s action such as,
“What do you think made the two sides balanced?” or, “What did adding
more weight to one side do?” Table 2 shows the decrease in the percentage of these questions over the course of the semester.

TABLE 2
Percentage of follow-up: nonspecific questions

FAI    1     2     3     4     5     6    7     8     9     10
% NS   15.7  26.8  25.6  43.2  23.9  8.1  12.2  18.0  11.5  14.9

% NS: percentage of questions coded as “follow-up: nonspecific” by FAI
During FAI 1, Kelly and Caleb primarily asked questions from the protocol (23.5 %), clarification questions (17.6 %), or follow-up: nonspecific questions
(15.7 %). However, over the course of the semester, Kelly and Caleb’s use of
follow-up: nonspecific questions decreased and their use of competent follow-
up questions increased. Questions coded as competent were directly related to how the student responded; they caused some type of disequilibrium within the student’s thinking or attempted to draw students’ attention to conceptual meaning. These sorts of
questions asked the students to consider how the new information would make
sense given their existing schema. This question type is exemplified by Kelly
when asking the student about balancing a ruler.
Kelly: …Let me see you balance it with just your finger. (Student balanced the ruler
on her finger)
Student: Easy.
Kelly: How do you know to put your finger there?
Student: It’s the very middle.
Kelly: It’s the very middle. Well, why does the very middle work, Amy? (Paused)
Why does it work on the very middle?

In this excerpt, Kelly asked the student questions that would cause her to
speculate beyond her initial conjectures and reflect more deeply. Kelly’s
questions focused on the actions and responses of the student and
encouraged the student to understand the action within the concept of

balance. Table 3 shows the increasing percentage of competent follow-up questions Kelly and Caleb asked throughout the ten FAIs.

TABLE 3
Percentage of follow-up: competent questions

FAI   1    2    3    4    5    6     7     8     9     10
% C   3.9  8.9  5.1  4.5  6.2  10.8  26.5  14.0  22.1  17.0

% C: percentage of questions coded as “follow-up: competent” by FAI
Kelly and Caleb both developed more competent questioning practice
throughout the course of the semester. The percentage of competent
follow-up questions during FAI 1 was 3.9 % of questions, the minimum
for the entire semester. However, the percentage of competent questions
rose as high as 26.5 % in FAI 7 and ended at 17.0 % in FAI 10 (see Table 3).

Noticing
Kelly and Caleb’s ability to professionally notice their students’ thinking was
measured by their responses on the FAI Reflection and Planning forms. In
their abilities to detail the students’ strategies, interpret the students’
understanding, and decide how to respond based on the students’
understanding, Kelly gradually improved while Caleb fluctuated throughout
the semester. Recall that the FAI Reflection and Planning forms were coded
as robust (three points), limited (two points), and lacking (one point) on five
dimensions, resulting in a maximum score of 15 (see Fig. 2).
Caleb steadily improved from FAI 1 to 4, but exhibited a dramatic dip
by FAI 6, the last of the mathematics FAI Reflection and Planning forms.
The sixth reflection lacked the detail and rigor he provided in previous
forms. Caleb improved again during the science FAIs, yet never exceeded
his scores on the third and fourth mathematics FAI Reflection and
Planning forms. Kelly, on the other hand, showed a steady improvement
over time up to FAI Reflection and Planning form 8.

[Figure 2 is a line graph plotting noticing score (y-axis, 0 to 14) against interview number (x-axis, 1 to 10) for Caleb and Kelly.]
Fig. 2. Kelly and Caleb’s noticing scores based on FAI Reflection and Planning form analysis

Areas for Further Development


Data analysis revealed growth in Kelly and Caleb’s questioning
practice (i.e. competent follow-up questions); however, both preser-
vice teachers exhibited areas that could be further developed to
enhance their abilities to elicit student thinking. Analysis of FAI
video and transcripts to explore research question 2 suggested many
opportunities Kelly and Caleb missed to ask further questions to
explore student thinking more deeply. Leading questions resulted in
the preservice teacher not gaining an accurate understanding of the
students’ thinking by foregrounding their own solution strategies
because in these cases, the student tended to agree with the
preservice teacher rather than provide their own rationale. For
example, in FAI 3, Kelly asked the student how many different
combinations of boys and girls one could have in a class with 11
total students. The following conversation ensued.
Student: Nine girls and one boy.
Kelly: If there’s eleven kids in the class...
Student: I’m kidding, two boys.
Kelly: Okay, so write out what you had. So you had nine girls, right?
Student: And two boys.
Kelly: And two boys? And together that makes?

In this example, Kelly provided the student prompting and failed to adequately gauge the student’s reasoning. She began by reminding the
student that there were 11 total students. While this may have been
appropriate to clarify for the student, Kelly focused on the incorrect
response rather than on how the student solved the problem. Furthermore,
Kelly failed to uncover the student’s thinking as she continued to support
him by telling him to write down what he was thinking. This may have
guided him to solve the problem in a certain way instead of using other
strategies such as manipulatives or mental mathematics. Finally, Kelly
said to the student, “And together that makes,” providing him with a clue
to sum the two numbers. Perhaps the student would not have added the
two numbers had Kelly not led him with that prompt. Had Kelly focused
more on the student’s reasoning, she may have discovered how the
student was thinking about the decomposition of the number 11.
Therefore, Kelly and Caleb needed to learn that allowing students to
attempt their own methods and assessing their reasons for this method can
often uncover misconceptions that the student may hold.
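
For comparison (again our sketch, not the authors’), the decomposition task itself has a small, fully enumerable answer space; a probing rather than leading question could have let the student explore it on his own terms:

```python
# Enumerate the (girls, boys) splits of an 11-student class that contain
# at least one of each, per the protocol's "both boys and girls" wording.
pairs = [(girls, 11 - girls) for girls in range(1, 11)]
for girls, boys in pairs:
    print(f"{girls} girls + {boys} boys = 11")
print(len(pairs), "combinations")  # 10; the student's 9 + 2 is one of them
```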
In addition to asking leading questions, there were several instances where Kelly and Caleb could have asked follow-up questions, but missed
the opportunity to do so. In FAI 7, Kelly explored the concept of balance
with the student. She had the student place dried beans and wooden
blocks on a scale and observed what happened as they placed various
objects on each side of the scale. Once the scale was balanced with a
block on one side and beans on the other, Kelly asked the student to place
two more blocks on the scale and balance it again. The following
conversation ensued:
Kelly: Can you find a way to balance this, using one block on this side and one block
on this side and some beans in there?
(Student places a block on one side. She begins placing another block on the same side.)
Student: Here?
Kelly: Well, you already got the one here, so what if you put another one over here?
(Kelly puts block on the other side). What do you think is going to happen?

In this episode, Kelly missed an important opportunity to allow the student to explore the concept of balance on her own. The student may have shared
information related to her understanding of balance if Kelly had allowed the student
to place the block on the same side of the scale and asked, “What happened when
you put the block on that side?” Instead, Kelly instructed the student where to
correctly place the block and asked her what she thought was going to happen after
the student had already placed the block on the correct side.
Both Kelly and Caleb demonstrated questioning practice that suggested
areas for improvement. They asked questions that led students to solve
problems in a certain way rather than allowing students to solve the
problem in their own way regardless of the accuracy of the response. This
resulted in their inability to capture the full thought processes of the
students. In addition, Kelly and Caleb missed a number of opportunities
to ask follow-up questions to probe the students’ thinking further, which
may have elicited a deeper level of the students’ thinking.

DISCUSSION

Educators have suggested that the ability to competently question is a key component of teaching (Bell et al., 1985; Harlen, 2001). The types of
questions teachers ask can influence the opportunities that students have
to learn about content (Hackenberg, 2005) and what, in turn, teachers can
learn about students. The core practice of questioning contributes to
revealing student thinking, allowing teachers to modify their instruction
based on student needs. Although some researchers suggest that preservice teachers are not developmentally ready to engage in advanced practices (Kagan, 1992), similar to Levin, Hammer & Coffey (2009), we
believe that preservice teachers can develop core practices that foreground
student thinking. The many aspects of professional noticing—attending to
strategies, interpreting understandings, and deciding how to respond—
play an important role in student-centered teaching (Jacobs et al., 2010;
Scherrer & Stein, 2012; van Es & Sherin, 2002, 2008). We believe that
the ability to competently question students is a core practice that must be
developed through continued practice and support and that development
of this practice is influenced by one’s ability to notice.
Consistent with Darling-Hammond (2010), the results of this study suggest
that with weekly practice and reflection, preservice teachers can develop
their questioning practice within the context of face-to-face interaction with
students and that the ways they question students can change when given
opportunities to interact with them and analyze their thinking. The
importance of placing student thinking at the center of instruction is not
novel (Carpenter et al., 1996); however, we believe that preservice teacher
education could place greater emphasis on preparing teachers to uncover
student thinking. We believe that a focus on questioning practice and the
ability to notice is one way to do so. In this study, Kelly and Caleb improved
their ability to competently pose follow-up questions based on students’
responses to delve deeper into the students’ understanding. Although all
teachers engage in the act of noticing during instruction (Jacobs et al., 2010;
Sherin, Jacobs & Philipp, 2011a; Sherin et al., 2011b), what they notice and
choose to attend to may differ. Through weekly field experiences and
reflection, Kelly and Caleb were not only able to practice their abilities to
question through weekly FAIs but also received feedback from their field
instructor (via grading of FAI protocols) and from the students themselves
through their responses. We postulate that the reflection forms provided
Kelly and Caleb with an important scaffold to think critically about their
professional noticing of student thinking, and in Kelly’s case, we saw
continual growth. We contrast this to the experiences of many preservice
teachers who may never have the opportunity to dissect the mental
constructions of children.
We posit that the experimental field experience approach provided rich
opportunities for Kelly and Caleb to develop the core practice of
questioning and their abilities to notice. Our analyses suggest that this
development occurred simultaneously, particularly with regard to Kelly.
Although one might suggest that any engagement in consistent practice
would contribute to improved performance, another explanation for the
change we observed is the development of Kelly and Caleb’s ability
to notice. In this study, Kelly and Caleb demonstrated progress in
their ability to competently question students to reveal components
of their mathematical and scientific reasoning. By reflecting on what
they professionally noticed about students’ thinking, we contend that
Kelly and Caleb learned to adapt their questioning practice to offer
more competent questions in their interactions with students. We
postulate that they were attending to their students’ thinking (as
demonstrated by increasingly foregrounding student thinking within
their follow-up questions), interpreting their students’ responses (as
demonstrated by responding with meaningful follow-up questions and
reflecting on student responses in the FAI Reflection and Planning form), and
deciding how to respond (as demonstrated by their increased propensity to
ask relevant follow-up questions not included in the FAI protocol and to
make suggestions for further lessons in their FAI Reflection and Planning
forms). By building their knowledge of each student’s thinking from week to
week, they were formulating an understanding of the individual students’
thinking as well as building broader models of how students engage in specific
content.
Our study builds on the prior work of Moyer & Milewicz (2002)
by considering how preservice teachers’ questioning practices change
over time. Like others (Franke et al., 2009; Kawanaka & Stigler,
1999; Ralph, 1999; Sahin & Kulm, 2008), we have documented the
struggles that teachers face in asking students provocative questions,
but we also have shown that given appropriate scaffolds, preservice
teachers can make changes in their questioning throughout the course of a
semester. While providing clinical field experiences to preservice teachers is
not novel, we suggest the importance of foregrounding student mathematical
and scientific thinking through an iterative process of enactment and
reflection to develop questioning practice and the ability to notice. In this
paper, we note areas for further development that could potentially improve
preservice teachers’ questioning practice. These areas could be specifically
targeted in a field experience course, such as the one described in this study.

ACKNOWLEDGMENTS

We would like to thank Anderson Norton III, Meredith Park Rogers, Enrique Galindo, and Valarie Akerson. Support for this project was provided by the National Science Foundation (project no. 0732143).

REFERENCES

Ball, D., Sleep, L., Boerst, T. & Bass, H. (2009). Combining the development of practice
and the practice of development in teacher education. The Elementary School Journal,
109(5), 458–474.
Bell, B., Osborne, R. & Tasker, R. (1985). Finding out what children think. In R. Osborne
& P. Freyberg (Eds.), Learning in science: The implications of children’s ideas (pp.
151–165). Portsmouth: Heinemann.
Carpenter, T., Fennema, E. & Franke, M. (1996). Cognitively guided instruction: A
knowledge base for reform in primary mathematics education. The Elementary School
Journal, 97(1), 3–20.
Chin, C. (2006). Classroom interaction in science: Teacher questioning and feedback to
students’ responses. International Journal of Science Education, 28, 1315–1346.
Creswell, J. (2002). Research design: Qualitative, quantitative, and mixed methods
approaches. Los Angeles, CA: Sage.
Darling-Hammond, L. (2010). Teacher education and the American future. Journal of
Teacher Education, 61(1–2), 35–47.
Franke, M. L. & Kazemi, E. (2001). Teaching as learning within a community of practice:
Characterizing generative growth. In T. Wood, B. Nelson & J. Warfield (Eds.), Beyond
classical pedagogy in elementary mathematics: The nature of facilitative teaching (pp.
47–74). Mahwah, NJ: Lawrence Erlbaum.
Franke, M. L., Webb, N. M., Chan, A. G., Ing, M., Freund, D. & Battey, D. (2009).
Teacher questioning to elicit students’ mathematical thinking in elementary school
classrooms. Journal of Teacher Education, 60, 380–392.
Grossman, P., Hammerness, K. & McDonald, M. (2009). Redefining teaching, re-
imagining teacher education. Teachers and Teaching: Theory and Practice, 15(2),
273–289.
Hackenberg, A. (2005). A model of mathematical learning and caring relations. For the
Learning of Mathematics, 25(1), 44–47.
Harlen, W. (Ed.). (2001). Primary science …taking the plunge: How to teach primary
science more effectively for ages 5 to 12 (2nd ed.). Portsmouth, NH: Heinemann.
Hudson, R. A., Kloosterman, P. & Galindo, E. (2012). Assessing preservice teachers’
beliefs about the teaching and learning of mathematics and science. School Science and
Mathematics, 112, 433–442.
Jacobs, V. R., Lamb, L. L. C. & Philipp, R. A. (2010). Professional noticing of children’s
mathematical thinking. Journal for Research in Mathematics Education, 41, 169–202.
Jacobs, V., Lamb, L., Philipp, R. & Schappelle, B. (2011). Deciding how to respond on
the basis of children’s understandings. In M. G. Sherin, V. Jacobs & R. Philipp (Eds.),
Mathematics teacher noticing (pp. 97–116). New York: Routledge.
Kagan, D. M. (1992). Professional growth among preservice and beginning teachers. Review
of Educational Research, 62(2), 129–169.
Kawanaka, T. & Stigler, J. W. (1999). Teachers’ use of questions in eighth-grade
mathematics classrooms in Germany, Japan, and the United States. Mathematical
Thinking and Learning, 1, 255–278.
Lampert, M. (2010). Learning teaching in, from and for practice: What do we mean?
Journal of Teacher Education, 61(1–2), 21–34.
Levin, D., Hammer, D. & Coffey, J. (2009). Novice teachers’ attention to student
thinking. Journal of Teacher Education, 60(2), 142–154.

Lewis, C. (2000). Lesson study: The core of Japanese professional development. Invited
presentation to the Special Interest Group on Research in Mathematics Education at the
Annual Meeting of the American Educational Research Association, New Orleans, LA,
April.
Martino, A. M. & Maher, C. A. (1999). Teacher questioning to promote justification and
generalization in mathematics: What research practice has taught us. The Journal of
Mathematical Behavior, 18, 53–78.
Mehan, H. (1979). Learning lessons: Social organization in a classroom. Cambridge,
MA: Harvard University Press.
Moyer, P. & Milewicz, E. (2002). Learning to question: Categories of questioning used by
preservice teachers during diagnostic mathematics interviews. Journal of Mathematics
Teacher Education, 5, 293–315.
NCTM. (2000). Principles and standards for school mathematics. Reston, VA: National
Council of Teachers of Mathematics.
NCTM. (2007). Mathematics teaching today. Reston, VA: National Council of Teachers
of Mathematics.
National Science Teachers Association. (2000). NSTA position statement: Science teacher
preparation. Retrieved 10 August 2009 from http://www.nsta.org/about/positions/
preparation.aspx.
Posner, G. & Gertzog, W. (1982). The clinical interview and the measurement of
conceptual change. Science Education, 66(2), 195–209.
Ralph, E. G. (1999). Developing novice teachers’ oral-questioning skills. McGill Journal
of Education, 34, 29–47.
Sahin, A. & Kulm, G. (2008). Sixth grade mathematics teachers’ intentions and use of
probing, guiding, and factual questions. Journal of Mathematics Teacher Education, 11,
221–241.
Scherrer, J. & Stein, M. K. (2012). Effects of a coding intervention on what teachers learn
to notice during whole-group discussion. Journal of Mathematics Teacher Education.
doi:10.1007/s10857-012-9207-2.
Sherin, M., Jacobs, V. & Philipp, R. (Eds.). (2011a). Mathematics teacher noticing:
Seeing through the teacher’s eyes. New York: Routledge.
Sherin, M., Russ, R. & Colestock, A. (2011b). Accessing mathematics teachers’ in-the-
moment noticing. In M. G. Sherin, V. Jacobs & R. Philipp (Eds.), Mathematics teacher
noticing (pp. 79–94). New York: Routledge.
Star, J., Lynch, K. & Perova, N. (2011). Using video to improve preservice mathematics
teachers’ abilities to attend to classroom features. In M. G. Sherin, V. Jacobs & R.
Philipp (Eds.), Mathematics teacher noticing (pp. 117–133). New York: Routledge.
Steffe, L. & Thompson, P. (2000). Teaching experiment methodology: Underlying
principles and essential elements. In A. E. Kelly & R. A. Lesh (Eds.), Handbook of
research design in mathematics and science education (pp. 267–306). Mahwah, NJ:
Lawrence Erlbaum.
van Es, E. (2011). A framework for learning to notice student thinking. In M. G. Sherin,
V. Jacobs & R. Philipp (Eds.), Mathematics teacher noticing (pp. 134–151). New York:
Routledge.
van Es, E. A. & Sherin, M. G. (2002). Learning to notice: Scaffolding new teachers’
interpretations of classroom interactions. Journal of Technology and Teacher
Education, 10, 571–596.

van Es, E. A. & Sherin, M. G. (2008). Mathematics teachers’ “learning to notice” in the
context of a video club. Teaching and Teacher Education, 24(2), 244–276.
van Zee, E. & Minstrell, J. (1997). Using questioning to guide student thinking. The
Journal of the Learning Sciences, 6, 227–269.
Yin, R. (2009). Case study research: Designs and methods (4th ed.). Thousand Oaks, CA:
Sage.

Ingrid S. Weiland
College of Education and Human Development
University of Louisville
1905 S. 1st Street, #273, Louisville, KY, 40292, USA
E-mail: ingrid.weiland@louisville.edu

Rick A. Hudson
University of Southern Indiana
Evansville, IN, USA
E-mail: rhudson@usi.edu

Julie M. Amador
University of Idaho
Coeur d’Alene, ID, USA
E-mail: jamador@uidaho.edu
