INGRID WEILAND, RICK A. HUDSON AND JULIE AMADOR
ABSTRACT. Recent research and reform documents in mathematics and science education
have highlighted the importance of preservice teacher education that focuses on
understanding students’ reasoning and modifying instruction accordingly. Utilizing the
constructs of core practices and professional noticing as lenses with which to examine
the development of preservice teachers’ questioning practice, we conducted a case study
of 1 pair of preservice teachers as they performed weekly formative assessment interviews
to elicit student thinking during 1 semester. Our study was driven by the following
research questions: (1) How do preservice teachers develop their questioning practice and
ability to notice students’ thinking about mathematical and science concepts? (2) How can
these questioning practices be further developed? Results suggest that with weekly
practice and reflection, preservice teachers can develop their questioning practice within
the context of face-to-face interaction with students and that the ways they question
students can change when given opportunities to interact with them and analyze their
thinking. By having participants attend to what they were professionally noticing about
students’ thinking, we contend that they learned to adapt their questioning techniques to
ask students more competent questions. Participants also exhibited 2 areas of questioning
practice ripe for improvement—asking leading questions and missing opportunities to
probe students’ thinking. While providing clinical field experiences to preservice teachers
is not novel, we suggest the importance of foregrounding practices that elicit student
mathematical and scientific thinking through an iterative process of enactment and
reflection to develop questioning practice and the ability to notice.
Teacher Questioning
Inquiry-based instruction requires teachers to be able to question
students in systematic ways that build on students’ knowledge and
focus students’ attention on important scientific and mathematical
content. The opportunities children have to learn are directly
impacted by the questions they are asked. NCTM (2007) advocates
that teachers pose “questions and tasks that elicit, engage, and
challenge each student’s thinking” and ask “students to clarify and
justify their ideas orally and in writing” (p. 45). Such questioning
practices provide a stark contrast to the Initiate–Respond–Evaluate
discourse patterns that have often characterized classroom discourse
in traditional teaching settings (Mehan, 1979).
International comparisons of teachers’ classroom questioning practices
have shown that the questioning practices of American teachers, in
particular, differ greatly from their peers in other countries. Kawanaka &
Stigler (1999) found that American teachers are more likely to ask low-level
questions (e.g. questions that can be answered with a response of yes/no)
than teachers in Germany or Japan. They are also less likely than their peers
to ask students to explain or describe their thinking, although the percentage
of these higher-order questions asked in all three countries remained small.
Several research studies have investigated the modes and charac-
teristics of teachers’ questions in classroom and small-group settings.
These studies have shown that teachers’ questions can serve an
important role to guide students’ thinking (van Zee & Minstrell,
1997) and to promote student justification and generalization
(Martino & Maher, 1999). Sahin & Kulm (2008) characterized
teachers’ questions into three types: factual, probing, and guiding.
These questions differ, respectively, by requiring students to provide
a predetermined response, by prompting students to explain or justify
their thinking, and by scaffolding students’ thinking toward a
particular concept. In observing two mathematics teachers’ questioning
practices, Sahin & Kulm (2008) found that the teachers, who
were using reform-based textbooks, tended to ask a preponderance of
factual questions. Franke, Webb, Chan, Ing, Freund, & Battey (2009)
reported on teacher questioning practices after participating in
professional development sessions focused on algebraic reasoning.
These teachers consistently asked questions to elicit student thinking
(e.g. How did you get that?), but the ways they responded to
students’ explanations varied widely, ranging from general questions
to responses specific to the students’ explanations. As Sahin & Kulm
(2008) and Franke et al. (2009) have documented, even when
teachers have access to reform-based curriculum materials and
professional development aimed at enhancing teacher knowledge,
teachers still struggle to question students productively. Chin (2006)
examined in-service teachers’ use of questioning in the science
classroom and proposed a framework for understanding “question-
ing-based discourse.” She found that teachers responded to students
in four ways: (a) affirm the answer and continue direct instruction,
(b) affirm the answer and then ask probing questions, (c) correct the
student, or (d) evaluate the response or provide neutral comments by
asking follow-up questions. While these findings are interesting,
Chin’s research did not focus on interviews specifically nor did her
participants include preservice teachers, therefore highlighting a need
for continued study.
Although the importance of teachers’ questions has been docu-
mented widely in educational research, very little research has
focused on how teachers’ questioning practice develops. Moyer &
Milewicz (2002) investigated preservice teachers’ use of interviews
through an examination of their abilities to ask students questions
about their mathematical thinking. They examined questioning
abilities during one interview session and found that preservice
teachers’ questions fell into three categories: checklisting (listing
questions off of the interview protocol), instructing rather than
assessing, and probing or follow-up questions. Although this study
provides insight about the types of questioning that preservice
teachers employ, it does not address how preservice teachers’
questioning changes over time. Using two data points throughout a
practicum experience, Ralph (1999) argued that preservice teachers
improved the clarity of their questions, increased wait time for
student responses, and posed questions to a wider variety of
students. However, one of the limitations of Ralph’s study is an
overreliance on self-assessment of questioning abilities. These studies
underscore the need for further research on how preservice teachers’
questioning practice develops over time.
Professional Noticing
A strong relationship exists between teachers’ monitoring of students’
thinking and the ability to pose timely questions to deepen student
understanding (Martino & Maher, 1999). To do so, teachers must first come
to professionally notice and understand the methods of reasoning students
are using in the act of problem solving and scientific inquiry (Jacobs, Lamb,
Philipp, & Schappelle, 2011). The construct of professional noticing is a
“way to understand how teachers make sense of complex classrooms”
(Jacobs et al., 2011, p. 98) and encompasses where teachers focus attention
and how they interpret and reflect on their discoveries about student
thinking. Van Es & Sherin (2002, 2008) describe professional noticing in
mathematics as what teachers attend to during instruction; for the purposes of
this paper, we apply the construct of professional noticing to the content
areas of both mathematics and science as we examine the core practice of
questioning because of its applicability across disciplines (Sherin, Russ, &
Colestock, 2011). (Further use of the term “noticing” in this paper refers to
this construct, unless otherwise specified).
Van Es & Sherin (2002, 2008) claim that the act of noticing involves three
related aspects: (a) identifying what is noteworthy about classroom
interactions, (b) connecting classroom events to broader principles of
teaching and learning, and (c) using what one knows about a situation’s
context to reason about the situation. In van Es and Sherin’s view, teacher
education has not traditionally focused on helping teachers learn to interpret
classroom interactions in the midst of instruction. Recent work by Jacobs et
al. (2011) has found that deciding how to respond while teaching requires
expertise in attending to children’s strategies and interpreting their thinking
and that engaging in professional noticing can be learned with support.
Researchers have begun to investigate what supports are necessary to help
teachers focus their professional noticing on aspects of teaching that are
inherent to reform initiatives, such as the classroom discourse required to
build conceptual understanding. Such interventions are intended to move
teachers past noticing superficial aspects of reform teaching like classroom
management or use of manipulatives. For example, Scherrer & Stein (2012)
and van Es & Sherin (2008) found that teacher participants improved in their
ability to recognize and interpret children’s mathematical thinking when they
shifted the focus of what they notice.
METHOD
Context
At the university where the study took place, all preservice teachers
majoring in elementary education enroll concurrently in a mathematics
methods course, a science methods course, and a field experience focused
on both mathematics and science instruction during one semester. As a
part of a larger research project, preservice teachers in the experimental
section of the field experience engage in an iterative weekly process
throughout the semester which consists of conducting formative assess-
ment interviews, building sharable models of student thinking, planning
and teaching whole-class lessons, and collectively reflecting via a
modified version of lesson study (Lewis, 2000). This paper focuses
solely on the formative assessment interview component of the
experimental field experience approach.
During the first 2 weeks of the field experience, preservice teachers
become oriented with the experimental approach and the school setting.
Next, they spend 6 weeks focusing on mathematics and 5 weeks focusing
on science. Each week, a pair of preservice teachers conducts a formative
assessment interview (FAI) with two elementary students from their host
classroom. The FAIs were modeled after Steffe & Thompson’s (2000)
teaching experiment methodology.
Participants
Our case study includes two participants, Kelly and Caleb, who were both
elementary education majors in their junior year at a large, American
university. At the time of the study, Kelly and Caleb were working
with four peers in a first-grade classroom as part of their field
experience course. We selected this pair to study because their field
instructor indicated that their professional dispositions would likely be
receptive to our examination of their questioning practices. In
addition, based on the results of a survey designed to measure
teachers’ beliefs (Hudson, Kloosterman & Galindo, 2012), Kelly
indicated she felt more confident to teach mathematics than science.
Caleb, on the other hand, felt confident to teach both mathematics
and science.
Data Analysis
The purpose of this study was to examine the change in questioning
techniques used by preservice teachers over time and in depth. We
documented the types of questions posed by preservice teachers over
the course of the semester in which they participated in the
experimental field experience, striving to understand how they were
questioning students during FAIs and seeking to explicate changes in
their questioning over time.
To analyze the data, we created a framework based on recent
studies focused on questioning. Reliability of case study methodology
can be addressed by following a case study protocol (Yin, 2009); we
therefore utilized research procedures similar to those described by
Moyer & Milewicz (2002). We began with the codes created by
Moyer & Milewicz (2002) because they provided a way to categorize
and label preservice teachers’ questions during interviews with
students. To identify questioning types, we first considered two
categories presented by Moyer & Milewicz (2002): instructing rather
than assessing and probing or follow-up questions. We did not
include their third category, “checklisting,” which signifies instances
when questions are posed as a list from the protocol, because we
considered such questions as instances of posing problems and
therefore coded them as such. We used this framework to arrive at the
following three major types of questions: instructing rather than
assessing, probing or follow-up questions, and problem-posing
questions. We then subcategorized these three question types as
described below.
To begin, an initial set of interview data was coded by each of the three
authors independently. Only questions posed in the interrogative form
were coded; thus, imperative statements that merely implied the asking of
a question were not included in the analysis.
FINDINGS
TABLE 1
Question codes, descriptions (adapted from Moyer & Milewicz, 2002), examples, and frequencies

Code: Problem-posing: protocol
Frequency: 105 (19.5 %)
Description: Task/question is directly read or only slightly modified from the task/question that appeared in the FAI protocol.
Example: Kelly: So, Mr. Caleb …has eleven students in his class, okay? He has both boys and girls in his class. How many different combinations of boys and girls could he have in his class?

Code: Problem-posing: framing
Frequency: 17 (3.2 %)
Description: Task/question that does not appear on the FAI protocol, but that the PST uses to introduce or frame the task/question that appears on the protocol.
Example: Protocol states: “If I started with 10 cubes, how many are under the cup?” Kelly: I’m going to take some and I’m going to put them under a cup, okay. So pay attention here. Why don’t you guys count how many are not in the cup right now?

Code: Problem-posing: new
Frequency: 21 (3.9 %)
Description: Task/question that does not appear on the FAI protocol, but that the preservice teacher poses to students.
Example: Protocol states: Have the student move magnets over the iron filings. Ask students what they notice and see if they can explain what is happening. Caleb had asked the students to do the above task. He posed a new problem when stating: Okay, so now you use magnets and you can try …

Code: Instructing: teaching and telling
Frequency: 27 (5.0 %)
Description: Question moves away from assessing student thinking to teaching the student by telling them a fact or concept.
Example: Caleb: Well, if you look at this, what it does, well, it is supposed to kind of balance it out .…Okay? So, see if I push really, really hard, this isn’t lined up.

Code: Instructing: leading question
Frequency: 28 (5.2 %)
Description: Question intended to focus students’ attention on a particular aspect or that provides prompting hints or clues.
Example: Kelly: So, that, that [problem] uses this symbol, (points to addition sign) right? If we were to write that out.

Code: Follow-up: nonspecific
Frequency: 105 (19.5 %)
Description: Question in reaction to a student response, but that lacks specificity or does not react to the student’s words or actions.
Example: Caleb: Why do you think so?

Code: Follow-up: competent
Frequency: 70 (13.0 %)
Description: Question in reaction to a student response in which the preservice teacher attempted to build on the student’s thinking to cause the student to justify thinking, explore reasoning, or create a cognitive conflict to reorganize thinking.
Example: Caleb: Do you notice anything about the colors? [problem-posing repeat] Student: Yeah, the different colors. Caleb: So when you move the magnets, do you notice anything about how the different colors affect each other? [follow-up: competent]

Code: Follow-up: incorrect
Frequency: 12 (2.2 %)
Description: Question in reaction to a student response that is solely presented to tell the student that they are incorrect. In these instances, the PST does not pose follow-up questions when they perceive the student has responded correctly.
Example: Caleb: (Long pause) So, since you are blowing, do you think you always have to touch an object to make it move? [problem-posing: protocol] Student: Yeah. Caleb: You do? [follow-up: incorrect]

Total questions: 539
[Figure: distribution of all coded questions by type: protocol 20 %, nonspecific 19 %, repeat 15 %, clarifying 14 %, competent 13 %, instructing rather than assessing (IRTA) 10 %, other 9 %]
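The counts and percentages reported above, and the per-interview percentages reported in Tables 2 and 3 below, amount to simple tallies over the coded questions. The following is a minimal sketch of such a tally, assuming the coded data are available as (FAI number, code) pairs; the sample data and function names are illustrative, not the authors’ actual analysis scripts.

from collections import Counter

# Illustrative coded data: (FAI number, question code) pairs; the full data set
# would contain one entry for each of the 539 questions across the ten FAIs.
coded_questions = [
    (1, "problem-posing: protocol"),
    (1, "follow-up: nonspecific"),
    (1, "follow-up: competent"),
    (2, "instructing: leading question"),
]

def overall_percentages(questions):
    """Share of all questions receiving each code (cf. the distribution above)."""
    totals = Counter(code for _, code in questions)
    n = len(questions)
    return {code: 100 * count / n for code, count in totals.items()}

def percentage_by_fai(questions, target_code):
    """Per-interview percentage of a single code (cf. Tables 2 and 3)."""
    by_fai = {}
    for fai, code in questions:
        asked, matched = by_fai.get(fai, (0, 0))
        by_fai[fai] = (asked + 1, matched + (code == target_code))
    return {fai: 100 * m / a for fai, (a, m) in sorted(by_fai.items())}

print(overall_percentages(coded_questions))
print(percentage_by_fai(coded_questions, "follow-up: nonspecific"))

Applied to the full corpus of coded questions, the first function would reproduce a distribution like the one in the figure above, and the second the per-FAI rows of Tables 2 and 3.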
Problem-Posing Questions
Kelly and Caleb asked problem-posing: framing questions 17 times
(3.2 % of questions asked). During an FAI focusing on the concept of
balance using a triple beam balance, Kelly asked the following problem-
posing: framing question:
Kelly: What happens is if these two [sides of the balance] have the same amount in them,
then this stick is going to match this in the middle. So, see if I push really, really hard, this
isn’t lined up. Okay? Does this remind you of something? Maybe something you have seen?
Student: Yeah.
Kelly: Yeah, do you know what I am talking about?
Student: Yeah.
Kelly: Yeah, a see saw, or a teeter totter or something. Yeah. Have you ever played on
one of those before?
In contrast, problem-posing: protocol questions (105 questions, 19.5 %) were
those that were taken directly from the written protocol or changed only
slightly so that the intent of the question remained the same.
New problems were asked 21 times (3.9 %) and consisted of
questions that deviated from the protocol in terms of conceptual
content rather than minor wording. For example, during FAI 2, Caleb
sought to find out what his students thought about composing and
decomposing the number “eight.” In the FAI protocol, Caleb planned
to ask the following questions:
If I have 4 chips and 4 cubes, how many chips and cubes do I have
all together?
Do I have to use 4 chips and 4 cubes to get 8 or is there maybe
another way to get 8 total objects?
Are there any other combinations of chips and cubes I can use so that
there are still 8 total items on the table?
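For reference, the second and third questions center on the combinations of chips and cubes that total eight objects. The full space of answers is small and is enumerated here only for orientation; the enumeration is not part of the FAI protocol.

# Every (chips, cubes) pair totaling 8 objects: nine combinations in all,
# or seven if the student must use at least one chip and one cube.
combinations = [(chips, 8 - chips) for chips in range(9)]
print(combinations)
# [(0, 8), (1, 7), (2, 6), (3, 5), (4, 4), (5, 3), (6, 2), (7, 1), (8, 0)]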
Instructing-Rather-Than-Assessing Questions
We also identified instances when Kelly and Caleb instructed rather
than assessed when asking questions. At times, they told the student
the answer within their question (5.0 %) or incorporated leading
questions (5.2 %) that suggested to the students a solution path. For
example, in FAI 5, Kelly guided the student in solving to the point
that it was difficult to determine the student’s thinking. The student
was given the following problem: “Bill had five donuts, George gave
him some more and now Bill’s total is eighteen. How many did
George give him?” In the exchange that followed, Kelly told the student to keep
working, implying that the answer was incorrect.
Kelly led the student to use cubes to solve the problem, which hindered her
ability to uncover how the student was solving the problem. Rather than
eliciting the student’s thinking, Kelly continued to guide the student to use
counters. In fact, she even counted out the five counters. This extensive
guidance hindered the potential for the student to represent and solve the
problem. This incident was therefore coded as instructing: teaching and telling.
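For reference, the task is a missing-addend problem, so the value the student needed to find is fixed by the problem statement: with g standing for the number of donuts George gave,

5 + g = 18, so g = 18 - 5 = 13.

What the interview was meant to uncover was not this value but the strategy the student would use to reach it (counting on from five, for example), which is precisely what the extensive guidance obscured.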
Follow-Up Questions
The most notable trends in the data were evident in the follow-up
questioning that took place after a problem had been posed and the
student had responded. As described above, follow-up questions were
separated into three main categories: follow-up: incorrect, follow-up:
nonspecific, and follow-up: competent. Questions that indicated to the
student an incorrect answer were deemed follow-up incorrect (2.2 % of all
questions). Of particular interest to our first research question is that
19.5 % of all questions asked were coded as follow-up: nonspecific and
13.0 % were coded as follow-up: competent. Examples of these findings
are described in detail below.
TABLE 2
Percentage of follow-up: nonspecific questions
FAI     1     2     3     4     5     6     7     8     9     10
% NS   15.7  26.8  25.6  43.2  23.9   8.1  12.2  18.0  11.5  14.9
% NS: percentage of questions coded as “follow-up: nonspecific” by FAI
…more weight to one side do?” Table 2 shows the decrease in the percentage of
these types of questions over the course of the semester.
During FAI 1, Kelly and Caleb primarily asked questions from the protocol
(23.5 %) or asked clarification (17.6 %) or follow-up: nonspecific questions
(15.7 %). However, over the course of the semester, Kelly and Caleb’s use of
follow-up: nonspecific questions decreased and their use of competent follow-
up questions increased. Questions coded as competent were directly related to
how the student responded; they either caused some type of disequilibrium
within the student’s thinking or attempted to draw the student’s attention to
conceptual meaning. These sorts of
questions asked the students to consider how the new information would make
sense given their existing schema. This question type is exemplified by Kelly
when asking the student about balancing a ruler.
Kelly: …Let me see you balance it with just your finger. (Student balanced the ruler
on her finger)
Student: Easy.
Kelly: How do you know to put your finger there?
Student: It’s the very middle.
Kelly: It’s the very middle. Well, why does the very middle work, Amy? (Paused)
Why does it work on the very middle?
In this excerpt, Kelly asked the student questions that would cause her to
speculate beyond her initial conjectures and reflect more deeply. Kelly’s
questions focused on the actions and responses of the student and
encouraged the student to understand the action within the concept of balance.
TABLE 3
Percentage of follow-up: competent questions
FAI     1     2     3     4     5     6     7     8     9     10
% C     3.9   8.9   5.1   4.5   6.2  10.8  26.5  14.0  22.1  17.0
% C: percentage of questions coded as “follow-up: competent” by FAI
Noticing
Kelly and Caleb’s ability to professionally notice their students’ thinking was
measured by their responses on the FAI Reflection and Planning forms. In
their abilities to detail the students’ strategies, interpret the students’
understanding, and decide how to respond based on the students’
understanding, Kelly gradually improved while Caleb fluctuated throughout
the semester. Recall that the FAI Reflection and Planning forms were coded
as robust (three points), limited (two points), and lacking (one point) on five
dimensions, resulting in a maximum score of 15 (see Fig. 2).
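Concretely, each form receives a one- to three-point rating on each of the five dimensions, and the noticing score is the sum of those ratings. The sketch below illustrates the scoring scheme; the example ratings are hypothetical, and the five rubric dimensions themselves are not enumerated in this excerpt.

# Point values for the three rubric levels used on the Reflection and Planning forms.
LEVELS = {"robust": 3, "limited": 2, "lacking": 1}

def noticing_score(ratings):
    """Sum the 1-3 point ratings across the five dimensions (maximum 15)."""
    assert len(ratings) == 5, "one rating is expected per dimension"
    return sum(LEVELS[r] for r in ratings)

# A hypothetical form rated robust on three dimensions and limited on two scores 13.
print(noticing_score(["robust", "robust", "limited", "robust", "limited"]))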
Caleb steadily improved from FAI 1 to 4, but exhibited a dramatic dip
by FAI 6, the last of the mathematics FAI Reflection and Planning forms.
The sixth reflection lacked the detail and rigor he provided in previous
forms. Caleb improved again during the science FAIs, yet never exceeded
his scores on the third and fourth mathematics FAI Reflection and
Planning forms. Kelly, on the other hand, showed a steady improvement
over time up to FAI Reflection and Planning form 8.
Fig. 2. Kelly and Caleb’s noticing scores by interview number, based on FAI Reflection and Planning form analysis
DISCUSSION
ACKNOWLEDGMENTS
REFERENCES
Ball, D., Sleep, L., Boerst, T. & Bass, H. (2009). Combining the development of practice
and the practice of development in teacher education. The Elementary School Journal,
109(5), 458–474.
Bell, B., Osborne, R. & Tasker, R. (1985). Finding out what children think. In R. Osborne
& P. Freyberg (Eds.), Learning in science: The implications of children’s ideas (pp.
151–165). Portsmouth: Heinemann.
Carpenter, T., Fennema, E. & Franke, M. (1996). Cognitively guided instruction: A
knowledge base for reform in primary mathematics education. The Elementary School
Journal, 97(1), 3–20.
Chin, C. (2006). Classroom interaction in science: Teacher questioning and feedback to
students’ responses. International Journal of Science Education, 28, 1315–1346.
Creswell, J. (2002). Research design: Qualitative, quantitative, and mixed methods
approaches. Los Angeles, CA: Sage.
Darling-Hammond, L. (2010). Teacher education and the American future. Journal of
Teacher Education, 61(1–2), 35–47.
Franke, M. L. & Kazemi, E. (2001). Teaching as learning within a community of practice:
Characterizing generative growth. In T. Wood, B. Nelson & J. Warfield (Eds.), Beyond
classical pedagogy in elementary mathematics: The nature of facilitative teaching (pp.
47–74). Mahwah, NJ: Lawrence Erlbaum.
Franke, M. L., Webb, N. M., Chan, A. G., Ing, M., Freund, D. & Battey, D. (2009).
Teacher questioning to elicit students’ mathematical thinking in elementary school
classrooms. Journal of Teacher Education, 60, 380–392.
Grossman, P., Hammerness, K. & McDonald, M. (2009). Redefining teacher, re-
imagining teacher education. Teachers and Teaching: Theory and Practice, 15(2),
273–289.
Hackenberg, A. (2005). A model of mathematical learning and caring relations. For the
Learning of Mathematics, 25(1), 44–47.
Harlen, W. (Ed.). (2001). Primary science …taking the plunge: How to teach primary
science more effectively for ages 5 to 12 (2nd ed.). Portsmouth, NH: Heinemann.
Hudson, R. A., Kloosterman, P. & Galindo, E. (2012). Assessing preservice teachers’
beliefs about the teaching and learning of mathematics and science. School Science and
Mathematics, 112, 433–442.
Jacobs, V. R., Lamb, L. L. C. & Philipp, R. A. (2010). Professional noticing of children’s
mathematical thinking. Journal for Research in Mathematics Education, 41, 169–202.
Jacobs, V., Lamb, L., Philipp, R. & Schappelle, B. (2011). Deciding how to respond on
the basis of children’s understandings. In M. G. Sherin, V. Jacobs & R. Philipp (Eds.),
Mathematics teacher noticing (pp. 97–116). New York: Routledge.
Kagan, D. (1992). Professional growth among preservice and beginning teachers. Review
of Educational Research, 62(2), 129–169.
Kawanaka, T. & Stigler, J. W. (1999). Teachers’ use of questions in eighth-grade
mathematics classrooms in Germany, Japan, and the United States. Mathematical
Thinking and Learning, 1, 255–278.
Lampert, M. (2010). Learning teaching in, from and for practice: What do we mean?
Journal of Teacher Education, 61(1–2), 21–34.
Levin, D., Hammer, D. & Coffey, J. (2009). Novice teachers’ attention to student
thinking. Journal of Teacher Education, 60(2), 142–154.
Lewis, C. (2000). Lesson study: The core of Japanese professional development. Invited
presentation to the Special Interest Group on Research in Mathematics Education at the
Annual Meeting of the American Educational Research Association, New Orleans, LA,
April.
Martino, A. M. & Maher, C. A. (1999). Teacher questioning to promote justification and
generalization in mathematics: What research practice has taught us. The Journal of
Mathematical Behavior, 18, 53–78.
Mehan, H. (1979). Learning lessons: Social organization in a classroom. Cambridge,
MA: Harvard University Press.
Moyer, P. & Milewicz, E. (2002). Learning to question: Categories of questioning used by
preservice teachers during diagnostic mathematics interviews. Journal of Mathematics
Teacher Education, 5, 293–315.
NCTM. (2000). Principles and standards for school mathematics. Reston, VA: National
Council of Teachers of Mathematics.
NCTM. (2007). Mathematics teaching today. Reston, VA: National Council of Teachers
of Mathematics.
National Science Teachers Association. (2000). NSTA position statement: Science teacher
preparation. Retrieved 10 August 2009 from http://www.nsta.org/about/positions/
preparation.aspx.
Posner, G. & Gertzog, W. (1982). The clinical interview and the measurement of
conceptual change. Science Education, 66(2), 195–209.
Ralph, E. G. (1999). Developing novice teachers’ oral-questioning skills. McGill Journal
of Education, 34, 29–47.
Sahin, A. & Kulm, G. (2008). Sixth grade mathematics teachers’ intentions and use of
probing, guiding, and factual questions. Journal of Mathematics Teacher Education, 11,
221–241.
Scherrer, J. & Stein, M. K. (2012). Effects of a coding intervention on what teachers learn
to notice during whole-group discussion. Journal of Mathematics Teacher Education.
doi:10.1007/s10857-012-9207-2.
Sherin, M., Jacobs, V. & Philipp, R. (Eds.). (2011a). Mathematics teacher noticing:
Seeing through the teacher’s eyes. New York: Routledge.
Sherin, M., Russ, R. & Colestock, A. (2011b). Accessing mathematics teachers’ in-the-
moment noticing. In M. G. Sherin, V. Jacobs & R. Philipp (Eds.), Mathematics teacher
noticing (pp. 79–94). New York: Routledge.
Star, J., Lynch, K. & Perova, N. (2011). Using video to improve preservice mathematics
teachers’ abilities to attend to classroom features. In M. G. Sherin, V. Jacobs & R.
Philipp (Eds.), Mathematics teacher noticing (pp. 117–133). New York: Routledge.
Steffe, L. & Thompson, P. (2000). Teaching experiment methodology: Underlying
principles and essential elements. In A. E. Kelley & R. A. Lesh (Eds.), Handbook of
research design in mathematics and science education (pp. 267–306). Mahwah, NJ:
Lawrence Erlbaum.
van Es, E. (2011). A framework for learning to notice student thinking. In M. G. Sherin,
V. Jacobs & R. Philipp (Eds.), Mathematics teacher noticing (pp. 134–151). New York:
Routledge.
van Es, E. A. & Sherin, M. G. (2002). Learning to notice: Scaffolding new teachers’
interpretations of classroom interactions. Journal of Technology and Teacher
Education, 10, 571–596.
van Es, E. A. & Sherin, M. G. (2008). Mathematics teachers’ “learning to notice” in the
context of a video club. Teaching and Teacher Education, 24(2), 244–276.
van Zee, E. & Minstrell, J. (1997). Using questioning to guide student thinking. The
Journal of the Learning Sciences, 6, 227–269.
Yin, R. (2009). Case study research: Designs and methods (4th ed.). Thousand Oaks, CA:
Sage.
Ingrid S. Weiland
College of Education and Human Development
University of Louisville
1905 S. 1st Street, #273, Louisville, KY, 40292, USA
E-mail: ingrid.weiland@louisville.edu
Rick A. Hudson
University of Southern Indiana
Evansville, IN, USA
E-mail: rhudson@usi.edu
Julie M. Amador
University of Idaho
Coeur d’Alene, ID, USA
E-mail: jamador@uidaho.edu