Journal of Advanced Academics
2014, Vol. 25(3) 214–236
© The Author(s) 2014
DOI: 10.1177/1932202X14538032
joa.sagepub.com

Examining the Relationship Between Teachers’ Instructional Practices and Students’ Mathematics Achievement
Abstract
The purpose of this study was to determine whether relationships existed between
teachers’ implementation of two specific discourse-related instructional practices
and students’ mathematics achievement in geometry and measurement as part of
a research study on the effectiveness of an advanced mathematics curriculum for
kindergarten and Grades 1 and 2. The mathematics units incorporated the following
instructional practices: engaging students in verbal communication in mathematics
and encouraging the use of appropriate mathematical vocabulary. Hierarchical linear
modeling was used to determine the relationships between teachers’ use of the
instructional practices and the students’ mathematics achievement. Results indicated
that significant, positive relationships existed; the teachers’ implementation scores
for the verbal communication and encouraging mathematical language instructional
practices were predictors of student mathematics achievement as measured by
students’ percentage gain scores on the Open-Response Assessments. Implications
of these findings for mathematics instruction are discussed.
Keywords
mathematics achievement, mathematics discourse, mathematical vocabulary, mathematics
instruction, primary grades
Corresponding Author:
Janine M. Firmender, Saint Joseph’s University, 5600 City Avenue, Philadelphia, PA 19131, USA.
Email: janine.firmender@sju.edu
For students, this mathematical community is the classroom where they can engage
in both verbal and written forms of mathematical communication. In fact, NCTM
(2000) identifies mathematical communication as one of the five mathematical pro-
cesses that teachers should develop in students, and the Common Core Standards for
Mathematical Practice (NGA & CCSSO, 2010) state that students should be provided
opportunities to “construct viable arguments and critique the reasoning of others” (p.
6), as well as “communicate precisely to others” (p. 7). Furthermore, the National Council of Teachers of English (NCTE; Whitin & Whitin, 2000) views verbal and written communication in mathematics as “tools for collaboration, discovery, and reflection” (p. 2). When students engage in mathematical communication, they can share
their own ideas and analyze others’ ideas to further their understanding of mathemati-
cal concepts (NCTM, 2000).
Engaging students in mathematical communication requires the teacher to take on
a specific role to facilitate discussions and foster the development of students’ com-
munication skills. A teacher who implements such instructional practices listens to
students’ ideas and challenges students’ thinking by asking them to justify their ideas.
In addition, the teacher pursues certain ideas in more depth, provides additional infor-
mation as necessary, and monitors students’ participation in discussions (NCTM,
1991; Walshaw & Anthony, 2008). For this to occur, students must view the classroom
as a mathematical learning community or a community of practice in which they are
“expected to propose and defend mathematical ideas and conjectures and to respond
thoughtfully to the mathematical arguments of their peers” (Goos, 2004, p. 259).
Furthermore, the student’s role in this mathematical learning community is to ask
questions, pose problems, present and use varied strategies to justify solutions, con-
sider examples and counterexamples, and examine mathematical evidence through
listening and responding to others (NCTM, 1991). Engaging students in verbal com-
munication or discourse about mathematics and encouraging students’ use of appropri-
ate mathematical language are therefore two specific instructional practices related to
the engagement of students as mathematicians during instruction and development of
a community of learners.
In a classroom established as a community of practice, teachers engage students in
discussions of mathematical ideas (Goos, 2004) or verbal communication. Teachers
must teach students how to engage in these types of discussions, however (NCTM,
2000); and teachers’ use of this practice develops over time (Hufferd-Ackles, Fuson,
& Sherin, 2004; Walshaw & Anthony, 2008). One way to facilitate students’ verbal
communication in the mathematics classroom is the teacher’s use of talk moves
(Chapin, O’Connor, & Anderson, 2009). These talk moves, developed by Chapin and
colleagues (2009) and integrated previously in a mathematics curriculum for mathe-
matically talented students (Gavin, Casa, Adelson, Carroll, & Sheffield, 2009; Gavin
et al., 2007), have an identified purpose and help teachers facilitate students’ participa-
tion in mathematical discussions (see Table 1).
Although engaging students in verbal communication is a recommended instruc-
tional practice in mathematics education (NCTM, 1991; NGA & CCSSO, 2010), its
use is varied. For example, approximately 36% of teachers (n = 99) indicated that they
used critical discourse in their instruction very frequently, 24% frequently, 27% sometimes, and 13% seldom (McKinney, Chappell, Berry, & Hickman, 2009). Conversely, only 19% of teachers reported using “social interactions” in mathematics very frequently compared with 53% of teachers reporting that they seldom engage students in social interactions in mathematics (McKinney et al., 2009).

[Table 1 note: Talk moves as implemented in Project M2 (Gavin, Casa, Chapin, & Sheffield, 2010, 2011a, 2011b, 2012a, 2012b, 2013) and adapted from Chapin, O’Connor, and Anderson (2009).]
The body of research related to mathematics discourse is broad as the systematic
reviews on the topic by Ryve (2011) and Walshaw and Anthony (2008) indicate. These
reviews on mathematics discourse research focused on the conceptualization of dis-
course (Ryve, 2011) and how teachers promote discourse in the classroom (Walshaw
& Anthony, 2008). Additional research, focused on how engaging students in verbal
communication influences their mathematics achievement and concept development,
has demonstrated that this instructional practice can be beneficial (Cross, 2009; Dixon,
Egendoerfer, & Clements, 2009; Inagaki, Hatano, & Morita, 1998; Kazemi & Stipek,
2001). However, this research is often conducted with students beyond the primary
grade level; only one of these studies (Dixon et al., 2009) examined its use with stu-
dents in one of the primary grades (Grade 2).
Instructional practices for developing and encouraging the use of appropriate math-
ematical vocabulary include both direct and indirect methods (Marzano, 2004).
Teachers can influence students’ mathematics achievement when they provide repeated
exposure to vocabulary through modeling of the appropriate usage of mathematical
language. Incidents of “teacher use” of words relating to number during instruction
were significantly related to preschool-age students’ mathematics achievement
(Ehrlich, 2007) and growth in mathematics knowledge (Klibanoff, Levine,
Huttenlocher, Vasilyeva, & Hedges, 2006). By expecting students to engage in and
correctly use mathematical vocabulary in both verbal and written communication,
teachers can foster students’ development and appropriate use of mathematical vocab-
ulary (Casa et al., 2013; Gavin et al., 2009; Gavin, Casa, Adelson, & Firmender, 2013;
Gavin, Casa, Firmender, & Carroll, 2013; Thompson & Rubenstein, 2000). Another
strategy for focusing on mathematical vocabulary development is the use of a “word
wall.” A word wall may be used to display mathematical vocabulary related to a cur-
rent unit of study (Rubenstein & Thompson, 2002), or can include both mathematical
vocabulary words and corresponding picture cards that represent the words’ meanings
(Casa et al., 2013; Gavin, Casa, Adelson, & Firmender, 2013; Gavin, Casa, Firmender,
& Carroll, 2013).
Students may struggle with developing and using the language of mathematics for
multiple reasons. For instance, some vocabulary words have different meanings in mathematical and non-mathematical contexts (e.g., volume as a measure of how much space a three-dimensional shape takes up, a measure of loudness, or an identification number for a periodical), some sound identical to common non-mathematical words (e.g., sum and some), and some are used in more than one mathematical context (e.g., second to indicate a measure of time or an ordinal number; Rubenstein & Thompson, 2002; Schleppegrell, 2007).
The instructional practices of engaging students in verbal communication and
encouraging students’ use of appropriate mathematical vocabulary share the character-
istic of being related to language. However, each of these instructional practices also
has distinguishing characteristics. When engaging students in verbal communication
or discourse, the teacher must present students with a task worthy of discussion (Dixon
et al., 2009; NCTM, 2000) and facilitate student discussion around making sense of
the task, strategies for solving or working on the task, providing justification for strate-
gies and solutions, and evaluating other students’ reasoning (NCTM, 2000). Although
encouraging students’ use of appropriate mathematical vocabulary may be done dur-
ing discussions, a distinction arises when teachers allow students’ use of informal
language when discussing mathematical ideas instead of emphasizing mathematically
correct vocabulary. In addition, the use of appropriate mathematical vocabulary is not
limited to discussions. For example, teachers may use a word wall to display vocabu-
lary words and corresponding iconic representations, students may represent mathe-
matical vocabulary using pictures or symbols, and students may use appropriate
vocabulary in their written mathematical communication.
Given the potential influence of these two instructional practices on students’ math-
ematical achievement and the distinctions between the two practices, the current study
the CCSS (NGA & CCSSO, 2010) until Grade 2. In addition, only 36% of U.S. sev-
enth graders and 48% of U.S. eighth graders were able to answer questions on this
concept correctly during the 1995 Third International Mathematics and Science Study
(TIMSS; Beaton et al., 1996).
The instructional practices embedded within the Project M2 units included engag-
ing students in thinking and acting like mathematicians, creating a community of
learners through mathematical communication, fostering verbal communication and
written communication, using a talk frame graphic organizer to connect verbal and
written communication, encouraging the use of appropriate mathematical vocabulary,
and differentiating instruction (Casa et al., 2013; Gavin, Casa, Adelson, & Firmender,
2013; Gavin, Casa, Firmender, & Carroll, 2013). Although the Project M2 units encom-
passed all of the above instructional practices, the current study focuses on the teach-
ers’ implementation of two of these: verbal mathematical communication and
encouraging the use of appropriate mathematical vocabulary. As recommended by
Ryve (2011), descriptions of these two instructional practices are provided.
The verbal communication (discourse) instructional practice is based on the com-
munication process standard (NCTM, 2000), and the characteristics of this practice as
it relates to Project M2 and the current study are as follows:
The communication is a community activity, so students discuss their ideas with both the
teacher and their peers; students communicate to make sense of the problems and possible
solutions; both the reasoning process (ideas on how to solve the problem) and the product
(the answer) are valued, with the reasoning process overall being more highly regarded;
students regularly justify their ideas; misunderstanding and misconceptions—both
offered by students and introduced by teachers—are used as opportunities to guide the
discussion; and several ideas are considered and revised as students get closer to accepted
mathematical truths. (Gavin, Casa, Chapin, & Sheffield, 2011a, p. 13)
Sample
The sample for the current study included the 36 teachers and 601 students who previ-
ously participated in the Project M2 curriculum implementation research study as part
of the field test intervention groups. Twelve teachers and 193 students from Grade 2,
12 teachers and 191 students from Grade 1, and 12 teachers and 217 students from
kindergarten participated in the Project M2 field test intervention during the 2008-
2009, 2009-2010, and 2010-2011 academic years, respectively. Missing data, however, are an issue in all educational research due to student mobility. In the cases
where pre- and postassessment data were not available for a student, the data were
eliminated listwise. As student mobility/absence was the reason for the missing data,
it was assumed that missing data were random. In addition, for two of the Grade 2 classes, the level of implementation of the instructional practices had been documented using a previous version of the Project M2 Teacher Observation Scale that did not align exactly with the final version, which necessitated excluding those classes from the current study. Given that this was a paperwork issue, these data were assumed to be missing at random and eliminated listwise. Therefore, the final sample for the current study included
12 teachers and 210 students from kindergarten, 12 teachers and 186 students from
Grade 1, and 10 of the teachers and 164 students from Grade 2 who participated in the
Project M2 field test intervention during the 2008-2009, 2009-2010, and 2010-2011
academic years, respectively. This represents all of the kindergarten and Grade 1 teachers and students, but only 10 of the 12 Grade 2 classes that participated in the Project M2 intervention. Demographic details for the students
and teachers in the original sample and the sample after missing data were eliminated
listwise are presented in Tables 2 and 3.
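The listwise-deletion step described above is straightforward to express in code. The sketch below is illustrative only — the record layout and field names are invented, not the study’s actual data files:

```python
# Each record holds a student's pre- and postassessment scores; None marks a
# missing score. Field names are illustrative, not the study's actual variables.
students = [
    {"id": 1, "pre": 0.35, "post": 0.70},
    {"id": 2, "pre": None, "post": 0.65},   # absent at pretest
    {"id": 3, "pre": 0.40, "post": None},   # moved away before posttest
    {"id": 4, "pre": 0.52, "post": 0.88},
]

def listwise_delete(records):
    """Keep only students with both pre- and postassessment scores."""
    return [r for r in records if r["pre"] is not None and r["post"] is not None]

complete = listwise_delete(students)
print([r["id"] for r in complete])  # students 1 and 4 retained
```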
Data Collection
The quantitative data analyzed in this study were collected as part of the Project M2
research study during the field test of the curriculum units. A description of the development and use of the instruments used to collect these data follows.
Because available measures did not include open-ended questions, the Open-Response Assessment for each grade level was constructed by the researchers (Casa, Copley, & Gavin, 2010; Osiecki, Casa, &
Gavin, 2009; Spinelli, 2008). These assessments were used as part of the Project M2
research study to assess the mathematics achievement in the areas of geometry and
measurement for kindergarten, Grade 1, and Grade 2 students.
The development of the Open-Response Assessments for each grade level began
with an extensive analysis of the geometry and measurement content for the appropri-
ate grade level. Open-response style items for each of the assessments were con-
structed for a pilot test. The pilot versions of the Open-Response Assessments were
sent to reviewers in the fields of mathematics, mathematics education, and early child-
hood education who rated each item on three characteristics: (a) identification of the appropriate mathematical content area (geometry or measurement); (b) relevance to
the mathematical content area; and (c) difficulty of the item for the specified grade
level. The scores for each item were analyzed to determine which items would be
retained and/or revised for the final versions of the Open-Response Assessments. The
reviewers were also asked to comment on the wording and developmental appropriateness of the questions for the specified grade level. Items were revised based on this review and content analysis, and a pilot test was developed. The pilot tests of the
Open-Response Assessments were conducted with two groups of students, a group
who had experienced the pilot version of the Project M2 kindergarten and Grades 1 and
2 geometry and measurement units and a group who had experienced the regular
mathematics curriculum. The reliability coefficients for each grade level’s Open-
Response Assessment are α = .81 for kindergarten, α = .79 for Grade 1, and α = .82 for
Grade 2.
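The reliability coefficients reported here are alpha coefficients of internal consistency. As a reminder of what such a coefficient measures, the following minimal sketch computes Cronbach’s alpha from per-student item scores; the toy scores are invented and do not come from the study:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha from a list of per-student item-score lists."""
    n_items = len(item_scores[0])

    def var(xs):
        # Sample variance across students
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([s[i] for s in item_scores]) for i in range(n_items)]
    total_var = var([sum(s) for s in item_scores])
    return (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)

# Toy data: four students' scores on three open-response items (invented).
scores = [[2, 1, 2], [3, 3, 3], [1, 1, 0], [2, 2, 2]]
print(round(cronbach_alpha(scores), 2))
```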
The kindergarten Open-Response Assessment contained items that were adminis-
tered in small group and individual formats; Grades 1 and 2 Open-Response
Assessments were administered in a whole-class setting. In all cases, the classroom
teacher was present during the administration of the assessment, but did not have
access prior to or after the assessment. Two trained members of the Project M2 research
team scored all students’ Open-Response Assessments using the previously estab-
lished rubrics. Any discrepancy in item scores resulted in the item being scored by a
third Project M2 research team member.
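The two-rater scoring protocol with third-rater adjudication can be sketched as follows; the function and its names are hypothetical, intended only to make the decision rule concrete:

```python
def score_item(rater1, rater2, rater3=None):
    """Resolve an item score: agreement stands; a discrepancy goes to a third rater.

    rater3 is a callable standing in for the third team member's rescore; it is
    consulted only when the first two raters disagree.
    """
    if rater1 == rater2:
        return rater1
    if rater3 is None:
        raise ValueError("discrepant item requires a third rater")
    return rater3()

print(score_item(2, 2))             # agreement: the score stands
print(score_item(1, 2, lambda: 2))  # discrepancy: the third rater decides
```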
Project M2 Teacher Observation Scale. This scale was developed as one of several mea-
sures used to monitor the fidelity of implementation of the Project M2 curriculum and
embedded instructional practices and to assist with professional development during
the field test. The scale items described teacher behaviors that would be evident if the teacher
was implementing the specific instructional practices (Gavin & Casa, 2008). A trained
Project M2 professional development staff member completed this treatment fidelity
scale after each weekly classroom observation. The extent of the teachers’ implementa-
tion of the verbal communication instructional practice, including the talk moves, was
recorded for the nine items (see Appendix A). For example, an item to measure the
implementation of verbal communication was “[The talk move,] agree/disagree and
why was used to have students apply their understanding to someone else’s thoughts
and defend their position” (Gavin & Casa, 2008, p. 2). The verbal communication item
scores were coded as 1 for “yes,” 0.5 for “somewhat,” and 0 for “no.” The mathematical
language instructional practice was measured with three items (see Appendix B), an
example of which is “The teacher or students referred to the word wall” (Gavin & Casa,
2008, p. 3). The teachers’ implementation of this instructional practice was
observed, and the items were coded as 1 for “yes” and 0 for “no.” See Appendices A and
B for the complete set of observation items related to these two instructional practices.
Data Analysis
The students in the intervention group were nested within classrooms and experienced
the Project M2 mathematics content and instructional practices as implemented by
their teacher. It is therefore likely that students within classrooms experienced some
level of statistical dependence. For this reason, the intra-class correlation (ICC) or the
proportion of variance that is between classes on the dependent variable (Raudenbush
& Bryk, 2002) was examined, and hierarchical linear modeling (HLM) was used in the
data analysis to account for the non-independence (McCoach & Adelson, 2010;
Raudenbush & Bryk, 2002) of the students on the dependent variable. ICCs of between
.10 and .20 are common in educational research (McCoach, 2010). In the HLM analy-
ses, the students from all grade levels (n = 560) were included at Level 1 and the
classes/teachers represented the clusters (n = 34), or Level 2 units. With the sample
including over 30 Level 2 clusters, the estimates of the parameter coefficients and
variance components should not be biased (Maas & Hox, 2005).
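The ICC referred to here is the ratio of the between-class variance to the total variance (Raudenbush & Bryk, 2002). A minimal sketch, with invented variance components:

```python
def icc(between_class_var, within_class_var):
    """Intra-class correlation: share of outcome variance lying between classes."""
    return between_class_var / (between_class_var + within_class_var)

# Toy variance components (not the study's estimates): an ICC of .30 would mean
# 30% of the variance in the outcome lies between classes.
print(round(icc(0.006, 0.014), 2))
```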
The goal of the HLM analyses was to determine the relationship between the teach-
ers’ implementation of the specific Project M2 instructional strategies and students’
mathematics achievement. To do this, we ran a series of multilevel models with HLM
7 (Raudenbush, Bryk, & Congdon, 2010) using restricted maximum likelihood
(REML) estimation due to the small sample size (Raudenbush & Bryk, 2002). In addition, three variables were calculated: the students’ percentage gain scores pre- to posttest on the Open-Response Assessment and the teachers’ implementation scores for each of the two instructional practices under investigation, verbal communication and mathematical language. The students’ percentage gain scores pre- to posttest on
the Open-Response Assessment were calculated using the kindergarten and Grades 1
and 2 student pre- and posttest scores on the Open-Response Assessment. Due to the
possible unreliability of gain scores, the variances in the students’ scores on the Open-
Response Assessment at pre- and posttest were examined and determined to be
unequal: the variance at posttest (0.037) was larger than the variance at pretest (0.011), a pattern that would be expected of reliable gain scores (Fulcher & Willse, 2007). Descriptive statistics for variables entered in the HLM are provided in Table 4.
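The gain-score reliability check described above compares pre- and posttest variances. The sketch below illustrates the logic with invented scores; the exact gain-score formula used in the study is not shown in this excerpt, so the proportion-of-maximum form here is an assumption:

```python
def percentage_gain(pre, post, max_score):
    """Pre-to-post gain expressed as a proportion of the maximum possible score
    (an illustrative assumption; the article reports gains as proportions,
    e.g., 0.43 = 43%)."""
    return (post - pre) / max_score

def variance(xs):
    # Sample variance
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# Toy pre/post proportions for a handful of students (invented):
pre = [0.30, 0.35, 0.28, 0.40]
post = [0.70, 0.80, 0.62, 0.90]
gains = [percentage_gain(p, q, 1.0) for p, q in zip(pre, post)]

# Reliable gain scores should show more spread at posttest than at pretest.
print(variance(post) > variance(pre))
```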
The verbal communication and mathematical language instructional practice scores
for each teacher were based on the number of observations completed. The average
number of observations across grade levels and teachers was 10. The verbal communication instructional practice score for each teacher was calculated as the mean of the verbal communication items, averaged across each of the teacher’s observations in which at least seven of the nine items in the verbal communication section of the scale were completed.
The mathematical language instructional practice score for each teacher was calculated analogously: the mean of the mathematical language items was computed for each observation in which at least two of the three items in the mathematical language section of the scale were completed, and these means were averaged across observations. The descriptive statistics for these variables are provided
in Table 4. In addition, the correlation between the teacher implementation scores for
verbal communication and mathematical language was calculated (r = .500, p = .003),
indicating a moderate correlation between the two variables. The verbal communica-
tion and mathematical language instructional practice scores for the teachers were
entered into two separate HLM analyses as a Level 2 predictor of the students’ per-
centage gain score on the Open-Response Assessment.
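The scoring rules just described — per-observation item means, a completeness threshold, and averaging across a teacher’s observations — can be sketched as follows; the observation data are invented:

```python
# Coding scheme as described for the fidelity scale: verbal communication items
# were coded yes = 1, somewhat = 0.5, no = 0.
VERBAL_CODES = {"yes": 1.0, "somewhat": 0.5, "no": 0.0}

def observation_mean(responses, codes, min_completed):
    """Mean item code for one observation, or None if too few items completed."""
    completed = [codes[r] for r in responses if r is not None]
    if len(completed) < min_completed:
        return None
    return sum(completed) / len(completed)

def teacher_score(observations, codes, min_completed):
    """Average the per-observation means across a teacher's observations,
    skipping observations below the completeness threshold (e.g., at least
    seven of the nine verbal communication items)."""
    means = [observation_mean(obs, codes, min_completed) for obs in observations]
    means = [m for m in means if m is not None]
    return sum(means) / len(means)

# Two invented observations for one teacher (None = item not completed):
obs1 = ["yes", "somewhat", "yes", "yes", "no", "yes", "somewhat", "yes", None]
obs2 = ["yes", "yes", "no", None, None, None, "yes", None, None]  # too incomplete
print(teacher_score([obs1, obs2], VERBAL_CODES, min_completed=7))
```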
After calculating the variables, we ran a completely unconditional model that con-
tained only the dependent variable, student percentage gain score on the Open-
Response Assessment at Level 1. Results from the unconditional model were used to
determine the ICC, which was .30. This indicates that 30% of the variance in the per-
centage gain scores on the Open-Response Assessment is between classes. To enhance
the precision with which estimates are made in the model (Raudenbush, 1997), the
ITBS standard score at pretest was entered as a grand mean centered covariate into the
random coefficients model at Level 1. This covariate explained an additional 10% of
the between-class variance. We then estimated a full two-level model that included
two dummy variables to represent the three grade levels of the teachers. The coding
was such that Kindergarten was the referent grade level. Also included at Level 2 in
the full, contextual model was the focal variable, teachers’ level of implementation
score for verbal communication or encouraging mathematical language for the instruc-
tional practice, as a grand mean centered, Level 2 predictor.
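Written out in conventional two-level notation (Raudenbush & Bryk, 2002), the full model just described has the following form. This is our reconstruction — with PRACTICE standing for either implementation score — not a display reproduced from the article:

```latex
\begin{aligned}
\text{Level 1:}\quad & \text{GAIN}_{ij} = \beta_{0j}
  + \beta_{1j}\bigl(\text{ITBS}_{ij} - \overline{\text{ITBS}}_{\cdot\cdot}\bigr) + r_{ij} \\
\text{Level 2:}\quad & \beta_{0j} = \gamma_{00} + \gamma_{01}\,\text{GRADE1}_j
  + \gamma_{02}\,\text{GRADE2}_j
  + \gamma_{03}\bigl(\text{PRACTICE}_j - \overline{\text{PRACTICE}}_{\cdot}\bigr) + u_{0j} \\
& \beta_{1j} = \gamma_{10}
\end{aligned}
```

The fixed Level 2 equation for β1j reflects the decision, described in the Results, not to allow the ITBS pretest slope to vary randomly.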
Results
The separate models that were constructed to determine if a relationship exists between
the students’ percentage gain score on the Open-Response Assessment and the teach-
ers’ implementation scores for the instructional practices of verbal communication and
mathematical language were based on the same unconditional and random coefficients
models because the verbal communication and mathematical language implementa-
tion scores were entered into the model as Level 2 predictors.
The unconditional model with the students’ (K-Grade 2) percentage gain score on
the Open-Response Assessments as the dependent variable and no predictors at either
level in the model was analyzed first. The student’s percentage gain score on the Open-
Response Assessment was a function of the intercept (γ00), the student-level residual
(r), and the teacher-level variance (τ00). In the unconditional model with no other pre-
dictors, a student’s predicted percentage gain from pre- to posttest on the Open-
Response Assessment would be 43%; this is statistically significantly different from
zero (p < .001).
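In standard HLM notation (Raudenbush & Bryk, 2002), the unconditional model described here can be written as follows (a reconstruction; the article’s own display is not reproduced in this excerpt):

```latex
\text{CNGPER\_OR}_{ij} = \gamma_{00} + u_{0j} + r_{ij},
\qquad u_{0j} \sim N(0, \tau_{00}), \quad r_{ij} \sim N(0, \sigma^{2})
```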
The students’ standard scores on the Kindergarten ITBS Mathematics, Grade 1
ITBS Mathematics, and Grade 2 ITBS Math Concepts subtests at pretest were entered
at Level 1 in the random coefficients model as a predictor of the percentage gain score
on the Open-Response Assessment and to account for differences between students on
initial mathematics achievement. The ITBS Mathematics standard score at pretest was
entered as a grand mean centered, Level 1 covariate. Analysis of the results of the
random coefficients model indicated that the ITBS standard score at pretest was a
statistically significant predictor of the percentage gain score on the Open-Response
Assessment (p < .001); however, the variance of the slope for the ITBS standard score at pretest (τ11) was not statistically significant (p = .083). It was therefore determined that the ITBS pretest slope would not be allowed to vary randomly, and the random coefficients
model was run again. The results of the random coefficients model are displayed in
Table 5. The addition of the ITBS standard score at the time of pretest as a covariate in
the random coefficients model explains an additional 10% of the between-class vari-
ance over the unconditional model.
Verbal Communication
To determine if there is a relationship between the teachers’ implementation of the
verbal communication instructional practice and the percentage gain score on the
Open-Response Assessment, the teachers’ verbal communication implementation
score variable (M = 0.71, SD = 0.16) from the Project M2 Observation Scale was
entered as a grand mean centered, Level 2 predictor in a contextual model based on the
final random coefficients model. The two dummy variables representing the grade
level of the teachers were also entered into the model as a Level 2 predictor of the
percentage gain score on the Open-Response Assessment.
Analysis of the results of the contextual model indicated that the teachers’ verbal
communication implementation score and the two grade-level variables were not sig-
nificant predictors of the slope for the ITBS standard score at pretest covariate. The
Table 5. Summary of REML Parameter Estimates for the HLM Models for Percentage Gain Scores on the Open-Response Assessment.

Fixed effects                      Unconditional     Random coefficients   Contextual (verbal)   Contextual (math language)
Model for CNGPER_OR (β0)
  Intercept (γ00)                  0.43*** (0.02)    0.43*** (0.02)        0.53*** (0.02)        0.54*** (0.02)
  GRADE1 (γ01)                                                             −0.09** (0.03)        −0.10** (0.03)
  GRADE2 (γ02)                                                             −0.24*** (0.04)       −0.24*** (0.04)
  VERBAL (γ03)                                                             0.27*** (0.06)        NA
  MATHLANG (γ03)                                                           NA                    0.19** (0.06)
Model for ITBS pretest slope (β1)
  Intercept (γ10)                                    0.004*** (0.001)      0.005*** (0.001)      0.004*** (0.001)

Note. The chi-square test for homogeneity of variances indicated that the variances for the classes are homogeneous for both the verbal communication model, χ²(33) = 7.00, p > .500, and the mathematical language model, χ²(33) = 7.00, p > .500. REML = restricted maximum likelihood; HLM = hierarchical linear modeling; ITBS = Iowa Tests of Basic Skills.
**p < .01. ***p < .001.
fixed effects of the verbal communication implementation score and the two grade-level variables on the slope of the ITBS pretest covariate were therefore eliminated from
the model based on these results. This revised contextual model was then run, and the
parameters from the final contextual model are shown in Table 5. The final contextual
model including the verbal communication implementation score as a Level 2 predic-
tor is as follows:
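The display equation for this model appears to have been lost in extraction. From the Table 5 estimates, the final contextual model would take roughly the following form (our reconstruction, with grand-mean-centered predictors):

```latex
\widehat{\text{GAIN}}_{ij} = 0.53 \;-\; 0.09\,\text{GRADE1}_j \;-\; 0.24\,\text{GRADE2}_j
  \;+\; 0.27\bigl(\text{VERBAL}_j - \overline{\text{VERBAL}}\bigr)
  \;+\; 0.005\bigl(\text{ITBS}_{ij} - \overline{\text{ITBS}}\bigr)
```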
The final contextual model that includes the teachers’ verbal communication imple-
mentation score and the teachers’ grade level as predictors of the students’ percentage
gain score on the Open-Response Assessment accounts for an additional 0.3% of the between-class variance in the students’ percentage gain score over the random coefficients model. Given that γ00 = 0.53, a kindergarten student with a mean ITBS standard score at pretest and whose teacher is at the mean on the verbal communication scale would be predicted to gain 53% pre- to posttest on
the Open-Response Assessment. γ03 is the parameter of interest in this analysis and
represents the predicted change in the percentage gain score on the Open-Response
Assessment as a teacher’s verbal communication implementation score increases,
after controlling for grade level and the ITBS standard score at pretest. The verbal
communication implementation score is a significant predictor of the student percent-
age gain score on the Open-Response Assessment; a kindergarten student at the aver-
age of the ITBS standard score at pretest and whose teacher was rated to have always
implemented the verbal communication practice (scored a 1 on each item) on the
Project M2 Observation Scale would be predicted to gain 80% pre- to posttest on the
Open-Response Assessment.
Mathematical Language
To determine if there is a relationship between the teachers’ implementation of the
mathematical language instructional practice and the percentage gain score on the
Open-Response Assessment, the teachers’ mathematical language implementation
score (M = 0.60, SD = 0.20) from the Project M2 Observation Scale was entered as a
grand mean centered, Level 2 predictor in a contextual model based on the appropriate
random coefficients model. The grade-level variables were also entered into the model
as Level 2 predictors of the percentage gain score on the Open-Response Assessment.
Analysis of the results of the contextual model indicated that the teachers’ mathe-
matical language implementation score and the grade-level variables were not signifi-
cant predictors of the slope of the ITBS standard scores at pretest. Therefore, fixed
effects of the grade-level variables and the teachers’ mathematical language imple-
mentation score on the effect of the slope of the ITBS standard scores at pretest were
eliminated from the model. The revised contextual model was then run and the param-
eters from the final contextual model are shown in Table 5. The final contextual model
that included the teachers’ implementation score for the mathematical language
instructional practice is as follows:
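As with the verbal communication model, the display equation appears to have been lost in extraction. From the Table 5 estimates, the final contextual model would take roughly the following form (our reconstruction, with grand-mean-centered predictors):

```latex
\widehat{\text{GAIN}}_{ij} = 0.54 \;-\; 0.10\,\text{GRADE1}_j \;-\; 0.24\,\text{GRADE2}_j
  \;+\; 0.19\bigl(\text{MATHLANG}_j - \overline{\text{MATHLANG}}\bigr)
  \;+\; 0.004\bigl(\text{ITBS}_{ij} - \overline{\text{ITBS}}\bigr)
```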
The final contextual model that includes the teachers’ mathematical language implementation score and the teachers’ grade level as predictors of the students’ percentage gain score on the Open-Response Assessment accounts for an additional 0.3% of the between-class variance in the students’ percentage gain score over the random coefficients model. The γ00 parameter is interpreted as the predicted percentage gain score on the Open-Response Assessment for a
kindergarten student with a mean ITBS standard score at pretest and whose teacher is
at the mean on the mathematical language scale. The predicted percentage gain from
pre- to posttest on the Open-Response Assessment would be 54%. The parameter of
interest in this analysis is γ03 and represents the predicted change in the percentage
Discussion
At a time when this country’s educational climate is focused on accountability and
reform (NCTM, 2000; NGA & CCSSO, 2010), investigations into what may influence
student performance and achievement are important. Investigating specific reform
practices separately contrasts with other research on instructional practices of the
reform movement in mathematics, which often has been conducted using a framework
that examines the mathematical reform practices as a whole (Gimbert et al., 2007;
Huffman et al., 2003; Spillane & Zeuli, 1999). The current study was conducted to
determine the relationship between specific instructional practices that teachers
implemented as part of a curriculum research study and students' achievement in
geometry and measurement, as measured by the kindergarten and Grades 1 and 2
Open-Response Assessments.
The body of research on verbal communication or discourse in mathematics class-
rooms attempts to address a number of issues related to this instructional practice.
Some of these issues include how it is conceptualized (Ryve, 2011), how teachers
develop the use of the practice (Hufferd-Ackles et al., 2004; Walshaw & Anthony,
2008), how teachers can engage students in verbal communication (Chapin et al.,
2009; Walshaw & Anthony, 2008), how often teachers use the instructional practice
(McKinney et al., 2009), and whether the teachers' use of the practice relates to
student achievement.
Appendix A
Project M2 Observation Scale—Verbal Communication Items
1. The teacher focused on all major mathematical ideas appropriate for the
observed part, and student ideas provided the foundation for the discussions.
2. Revoicing was used by the teacher to clarify and make sense of student ideas.
Students were asked if their ideas were interpreted accurately.
3. Repeat/rephrase was used for students to validate ideas, get students to express
themselves clearly, and/or call attention to ideas. Students were asked if their
ideas were interpreted accurately.
Appendix B
Project M2 Observation Scale—Mathematical Language Items
1. The word wall was used interactively.
2. The teacher or students referred to the word wall.
3. The teacher encouraged students to use correct mathematical language.
Authors’ Note
The opinions, conclusions, and recommendations expressed in this article are those of the
authors and do not necessarily reflect the position or policies of the National Science Foundation.
Funding
The authors disclosed receipt of the following financial support for the research, authorship,
and/or publication of this article: Part of the work reported herein used archived data collected
during a project funded by the National Science Foundation (Grant DRL-0733189).
References
Beaton, A. E., Mullis, I. V. S., Martin, M. O., Gonzalez, E. J., Kelly, D. L., & Smith, T. A.
(1996). Mathematics achievement in the middle school years. Boston, MA: TIMSS Study
Center.
Burton, L., & Morgan, C. (2000). Mathematicians writing. Journal for Research in Mathematics
Education, 31, 429-453.
Casa, T. M., Copley, J., & Gavin, M. K. (2010). Project M2 kindergarten geometry unit assess-
ment. Storrs, CT: Neag Center for Gifted Education and Talent Development.
Casa, T. M., Firmender, J. M., Gavin, M. K., & Carroll, S. R. (2013). The influence of challeng-
ing geometry and measurement units on the achievement of kindergarteners. Manuscript
in preparation.
Chapin, S. H., O’Connor, C., & Anderson, N. C. (2009). Classroom discussions: Using math
talk to help students learn, K-6 (2nd ed.). Sausalito, CA: Math Solutions.
Huffman, D., Thomas, K., & Lawrenz, F. (2003). Relationship between professional devel-
opment, teachers’ instructional practices, and the achievement of students in science and
mathematics. School Science and Mathematics, 103, 378-387.
Inagaki, K., Hatano, G., & Morita, E. (1998). Construction of mathematical knowledge through
whole class discussion. Learning and Instruction, 8, 503-526.
Iowa Tests of Basic Skills. (2003). The Iowa tests: Guide to research and development. Itasca,
IL: Riverside.
Kazemi, E., & Stipek, D. (2001). Promoting conceptual thinking in four upper-elementary
mathematics classrooms. The Elementary School Journal, 102, 59-80.
Klibanoff, R. S., Levine, S. C., Huttenlocher, J., Vasilyeva, M., & Hedges, L. V. (2006).
Preschool children’s mathematical knowledge: The effect of teacher “math talk.”
Developmental Psychology, 42, 59-69. doi:10.1037/0012-1649.42.1.59
Maas, C. J. M., & Hox, J. J. (2005). Sufficient sample sizes for multilevel modeling. Methodology,
1, 86-92. doi:10.1027/1614-1881.1.3.86
Marzano, R. J. (2004). Building background knowledge for academic achievement. Alexandria,
VA: Association for Supervision and Curriculum Development.
McCoach, D. B. (2010). Hierarchical linear modeling. In G. R. Hancock & R. O. Mueller
(Eds.), Quantitative methods in the social and behavioral sciences: A guide for
researchers and reviewers (pp. 123-140). New York, NY: Routledge.
McCoach, D. B., & Adelson, J. L. (2010). Dealing with dependence (Part I): Understanding the
effects of clustered data. Gifted Child Quarterly, 54, 152-155.
McKinney, S. E., Chappell, S., Berry, R. Q., & Hickman, B. T. (2009). An examination of
the instructional practices of mathematics teachers in urban schools. Preventing School
Failure, 53, 278-284.
National Association for the Education of Young Children & National Association of Early
Childhood Specialists in State Departments of Education. (2003). Early childhood curricu-
lum, assessment, and program evaluation. Washington, DC: National Association for the
Education of Young Children.
National Council of Teachers of Mathematics. (1991). Professional standards for teaching
mathematics. Reston, VA: Author.
National Council of Teachers of Mathematics. (2000). Principles and standards for school
mathematics. Reston, VA: Author.
National Governors Association Center for Best Practices & Council of Chief State School
Officers. (2010). Common core state standards for mathematics. Washington, DC: Author.
Retrieved from http://www.corestandards.org/assets/CCSSI_Math%20Standards.pdf
Osiecki, C., Casa, T. M., & Gavin, M. K. (2009). Project M2 Grade 1 Open-Response
Assessment. Storrs, CT: Neag Center for Gifted Education and Talent Development.
Raudenbush, S. W. (1997). Statistical analysis and optimal design for cluster randomized trials.
Psychological Methods, 2, 173-185.
Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data
analysis methods (2nd ed.). Thousand Oaks, CA: SAGE.
Raudenbush, S. W., Bryk, A. S., & Congdon, R. (2010). HLM 7: Hierarchical linear and non-
linear modeling. Lincolnwood, IL: Scientific Software International.
Rubenstein, R. N., & Thompson, D. R. (2002). Understanding and supporting children’s math-
ematical vocabulary development. Teaching Children Mathematics, 9, 107-112.
Ryve, A. (2011). Discourse research in mathematics education: A critical evaluation of 108
journal articles. Journal for Research in Mathematics Education, 42, 167-198.
Schleppegrell, M. J. (2007). The linguistic challenges of mathematics teaching and learning: A
research review. Reading & Writing Quarterly, 23, 139-159.
Spillane, J. P., & Zeuli, J. S. (1999). Reform and teaching: Exploring patterns of practice in
the context of national and state mathematics reforms. Educational Evaluation and Policy
Analysis, 21, 1-27.
Spinelli, A. M. (2008). Project M2 Grade 2 Open-Response Assessment. Storrs, CT: Neag
Center for Gifted Education and Talent Development.
Thompson, D. R., & Rubenstein, R. N. (2000). Learning mathematics vocabulary: Potential
pitfalls and instructional strategies. Mathematics Teacher, 93, 568-574.
van Oers, B. (1996). Learning mathematics as a meaningful activity. In L. P. Steffe, P. Nesher,
P. Cobb, G. A. Goldin, & B. Greer (Eds.), Theories of mathematical learning (pp. 91-113).
Mahwah, NJ: Lawrence Erlbaum.
Wakefield, D. V. (2000). Math as a second language. The Educational Forum, 64, 272-279.
Walshaw, M., & Anthony, G. (2008). The teacher’s role in classroom discourse: A review
of recent research into mathematics classrooms. Review of Educational Research, 78,
516-551.
Whitin, P., & Whitin, D. J. (2000). Math is language too: Talking and writing in the mathemat-
ics classroom. Urbana, IL: National Council of Teachers of English.