Developing and Validating A Reliable TPACK Instrument For Secondary Mathematics Preservice Teachers
Jeremy Zelkowski
Jim Gleason
University of Alabama
Dana C. Cox
Miami University
Stephen Bismarck
University of South Carolina—Upstate
Abstract
Within the realm of teaching middle and high school mathematics, the abil-
ity to teach mathematics effectively using various forms of technology is now
more important than ever, as expressed by the recent development of the
Common Core State Standards for Mathematical Practice. This article pres-
ents the development process and the results from 15 institutions and more
than 300 surveys completed by secondary mathematics preservice teach-
ers. The results suggest that technological, pedagogical, and content knowl-
edge; technology knowledge; content knowledge; and pedagogical knowledge
constructs are valid and reliable, whereas pedagogical content knowledge,
technological content knowledge, and technological pedagogical knowledge
domains remain difficult for preservice teachers to separate and self-report.
(Keywords: TPACK, survey, secondary, mathematics education, teacher edu-
cation, preservice)
With the recent development and adoption of the Common Core
State Standards for Mathematics (CCSSM) (NGA & CCSSO, 2010)
come new standards for mathematical practice that re-emphasize
the importance of technological tools in mathematics classrooms. This
renewed commitment to helping students model their world and strategi-
cally select tools for mathematical pursuits essentially requires teachers of
mathematics to be knowledgeable and effective at teaching mathematics
with technology and assessing the learning of mathematics by students
who use technology when doing mathematics (Stein, Smith, Henningsen,
& Silver, 2010). The CCSSM are not the first set of standards that attend to
the importance of integrating technology into the teaching and learning of
mathematics. The National Council of Teachers of Mathematics (NCTM)
Theoretical Background
2008; Strawhecker, 2005; Van Voorst, 2004), though more so at the primary
grade levels than secondary grade levels.
Because of this work, teacher education programs have placed much greater attention on the development of these teacher knowledge domains (Ball, Lubienski, & Mewborn, 2001; Dede & Soybas, 2011; Frykholm & Glasson, 2005; Kahan, Cooper, & Bethea, 2003; Wilson, Floden, & Ferrini-Mundy, 2001). With the inclusion of TK as an additional teacher knowledge domain and the advanced technology developments for teaching mathematics, teacher education programs for teachers of mathematics now must examine their effectiveness at developing seven domains of knowledge in PSTs. These seven domains include TK, PK, CK, PCK, technological pedagogical knowledge (TPK), technological content knowledge (TCK), and TPACK (see Figure 1).
For a more general treatment and description of each of these domains,
we refer the reader to Schmidt, Baran, Thompson, Mishra, Koehler, and Shin
(2009) and the literature above. Here, we elaborate on TK, TCK, and TPACK
in the context of secondary mathematics. To begin crafting these working
definitions, we acknowledge the considerable amount of accomplished work
in the field (see “TPACK in secondary mathematics” below).
TK in secondary mathematics. TK in secondary mathematics refers to
knowledge of technologies that are relevant to secondary mathematics
Method
Our goal was to develop an instrument that answers the need to monitor
the development of TPACK in preservice secondary mathematics teachers
throughout preparation programs. Adapting the TPACK survey for elemen-
tary PSTs (Schmidt et al., 2009), this project focused on developing and
validating a reliable content-specific survey for the population of interest:
preservice secondary mathematics teachers.
Instrument Development
We began with the reliable and nonreliable items from the Schmidt et al. (2009) survey, deleting items that did not specifically pertain to the teaching of secondary mathematics. Specifically, we deleted the items relating to social studies, literacy, and science, as the instrument was established
for generalists at the elementary level. Next, we wrote an additional 22 items
to fill gaps in the seven knowledge domain constructs (TK, PK, CK, PCK,
TPK, TCK, TPACK) by focusing on specific content areas in mathematics.
We wrote these items to adhere to item development guidelines (Fink, 2003;
Fink & Kosecoff, 1998; Fowler, 2008; Patten, 2001). Six researchers, including two external to the project, with expertise in secondary mathematics
education reviewed all of the items to be included in the survey for content
validity (Lawshe, 1975), even though the Schmidt survey development team
had previously vetted many of the survey items. We then revised seven items
based on the feedback from the expert panel. Three items received minor
editing, such as moving a word to a different location in the item or using a
word with similar meaning that was more specific to mathematics. Six items
received major editing, such as completely restructuring the item wording.
After this revision, the final survey instrument contained 62 items consisting of 8 TK items, 8 CK items, 8 PK items, 7 PCK items, 7 TCK items, 12 TPK items, and 12 TPACK items. Each item used a 5-point Likert scale response (SD = "strongly disagree," D = "disagree," N = "neither agree nor disagree," A = "agree," SA = "strongly agree").
For the purposes of this study, we designed a cover sheet to collect additional demographic data, including participants' age, level in college, practicum experience, gender, and ethnicity. We also designed an additional cover sheet to gather information about the context in which data were collected, including the name of the course in which the survey was administered, the date of the survey, whether students had completed a technology course, whether technology was used in mathematics content courses, and the grade band of teacher certification for secondary mathematics at each institution.
Participants
The research team sought to collect data from a variety of secondary math-
ematics teacher preparation programs across the United States in an effort
to maximize the diversity in our national sample. For this project, diversity
included size of institution, type of institution, size of secondary mathemat-
ics education program, demographics of student population, experience of
faculty teaching program courses, and geographic location. We contacted
24 secondary mathematics education faculty around the country to inquire
whether they would be willing to administer the survey to their PSTs. Half (12) agreed to administer the survey, in addition to the three institutions of the co-authors. The sample was thus a mix of convenience and strategically chosen institutions intended to produce a diverse sample. To maximize authentic responses and completion rates, paper surveys were administered in classes rather than as online out-of-class surveys (Adams & Gale, 1982; Lefever, Dal, & Matthiasdottir, 2007; Norris & Conn, 2005). Faculty were encouraged to administer the survey at the start of class but were permitted to administer it at their convenience, and faculty had graduate students or teaching assistants administer the survey in accordance with IRB requirements. As a result of these efforts, our survey completion rate was over 90%, and, as discussed later, the returned surveys appeared to contain predominantly legitimate responses.
We collected 315 surveys completed by PSTs prior to student teaching
from 15 institutions across the United States (see Table 1) with class sizes
ranging from 4 to 26 students during the 2010–11 academic year. Survey
Table 1. Participant Demographics (%)

Variable        All     MW     NE     SE      W     RI     RT
Age
  < 19          0.3    1.5    0.0    0.0    0.0    0.7    0.0
  19–22        68.3   89.7   66.7   62.8   40.6   67.1   68.2
  23–26        19.0    5.9   15.8   22.6   43.8   19.9   20.3
  27–30         2.9    0.0    3.5    3.6    6.3    2.1    4.1
  > 30          9.5    2.9   12.3   11.7    9.4   11.0    8.8
Class
  Sophomore     8.6   36.8    3.5    0.0    0.0    6.8   12.2
  Junior       17.8   16.2   26.3   16.8    0.0   17.1   16.2
  Senior       50.8   41.2   40.4   61.3   50.0   43.8   58.8
  Graduate     20.6    2.9   26.3   21.2   50.0   30.8   12.2
Gender
  Male         32.4   39.7   24.6   32.1   43.8   32.2   25.0
  Female       67.3   60.3   75.4   67.9   53.1   67.8   75.0
Practicum
  Yes          58.4   46.0   44.0   70.0   59.0   69.9   47.3
  No           41.6   54.0   56.0   30.0   41.0   30.1   52.7
Ethnicity
  African Am    4.1   36.8    1.8    5.8    3.1    6.2    2.0
  Am Indian     0.3    0.0    0.0    0.7    0.0    0.0    0.7
  Asian         1.3    0.0    3.5    0.7    3.1    2.1    0.7
  Hisp/Latino   3.5    1.5    1.8    0.7   25.0    0.7    6.1
  Pac Islander  1.0    0.0    1.8    0.7    3.1    0.7    0.7
  White        85.7   94.1   84.2   88.3   56.3   87.0   83.1
  Other         2.2    1.5    5.3    0.7    3.1    0.7    6.1

Notes. All = full-sample percentage. Regional classifications (MW = Midwest, NE = Northeast, SE = Southeast, W = West) are based on higher education consortia of states (e.g., the Western Interstate Commission for Higher Education). Institutional classifications are based on Carnegie classification: RI = research intensive; RT = some research and/or teaching only. Some percentages may not sum to 100 due to non-responses and/or rounding.
found an additional eight cell-entry errors and corrected them. The project PI then randomly selected 10% of the surveys and found no entry errors. This process of checking, correcting, and rechecking confirms that the 19,530 cells in the spreadsheet had a virtually zero entry error rate before the analyses.
The final sample reported in Table 1 reflects 49.6% of responses from secondary mathematics PSTs at eight research-intensive institutions, with the remainder coming from seven institutions with some research activity and/or a teaching focus. The national sample consisted of responses from institutions in the Midwest (23.1%), Northeast (19.4%), Southeast (46.6%), and West (10.9%) regions of the United States. Relative to the 2010 U.S. census populations, the sample over-represents the Southeast by approximately 10 percentage points and under-represents the West by approximately 10 percentage points; the Midwest and Northeast are within 2% of their regions' census proportions.
Data Analysis
To determine the structure of the TPACK instrument, we randomly split the
data into two equal groups based on the overall TPACK summation score of
Likert scale items so that the distribution between the two groups was the
same. We used an exploratory factor analysis (EFA) with the first group of
147 student surveys to determine which items should be retained for the fi-
nal instrument. We used a confirmatory factor analysis on the second group
of 147 student surveys to verify the structure obtained through the explor-
atory factor analysis. This method gives more strength and validity to the
research method and results (Fink, 2003; Litwin, 1995; Thompson, 2004). To
determine the internal reliability of each of the subscales, we used a graded
response model (Samejima, 1996) with the full sample of 294 surveys to
analyze the instrument in addition to the traditional Cronbach alpha for
reliability measures.
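The split procedure described above can be sketched in Python. This is a minimal illustration, not the study's actual code or data: the synthetic Likert responses and the pairwise assignment rule are assumptions, chosen so that the two halves share essentially the same total-score distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the 294 usable surveys x 62 Likert items (1-5).
responses = rng.integers(1, 6, size=(294, 62))

def split_by_total_score(data, rng):
    """Split rows into two groups with matched total-score distributions.

    Rows are sorted by their summed Likert score; within each successive
    pair of adjacent rows, one is randomly assigned to each group, so
    both halves follow essentially the same score distribution.
    """
    order = np.argsort(data.sum(axis=1))
    group_a, group_b = [], []
    for i in range(0, len(order), 2):
        pair = order[i:i + 2]
        rng.shuffle(pair)
        group_a.append(pair[0])
        if len(pair) > 1:
            group_b.append(pair[1])
    return data[group_a], data[group_b]

efa_half, cfa_half = split_by_total_score(responses, rng)
print(efa_half.shape, cfa_half.shape)  # two groups of 147 surveys each
```

The EFA half would then feed a seven-factor varimax-rotated extraction and the CFA half the confirmatory check, as in the analyses reported below.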
Results
Table 2. Factor Loadings for EFA with Varimax Rotation of Seven Factors
Item Factor 1 Factor 2 Factor 3 Factor 4 Factor 5 Factor 6 Factor 7
TK01 -0.175 0.738 -0.226
TK02 -0.274 0.837
TK03 0.860 -0.187 0.141
TK04 0.845 -0.121
TK05 0.695
TK06 0.739 -0.134 0.169 0.103
TK07 0.228 0.513 -0.102 0.157
TK08 0.120 -0.421 0.203 -0.113
CK09 -0.196 -0.102 0.732
CK10 0.103 0.899 -0.276
CK11 0.778 -0.265
CK12 0.503 0.170
CK13 -0.159 0.564 0.199
CK14 0.128 0.491 0.345
CK15 -0.183 -0.127 0.257 0.640
CK16 -0.154 0.119 0.451 0.203
PK17 0.627 0.130 -0.101
PK18 -0.120 0.828
PK19 -0.168 0.147 0.907 -0.179
PK20 -0.149 0.766
PK21 0.252 0.613 -0.165 -0.125
PK22 0.443 0.262
PK23 0.436 -0.144 0.191
PK24 0.630 0.132 0.110
PCK25 0.386 0.146 0.233
PCK26 0.168 0.659
PCK27 -0.180 0.110 0.751
PCK28 0.342 0.235 0.209 -0.167
PCK29 0.193 0.129 0.189 0.151 0.109
PCK30 -0.151 0.180 0.591
PCK31 -0.161 -0.173 0.130 0.501 0.144 0.291 0.152
TCK32 0.318 -0.100 0.219 -0.162 0.441
TCK33 -0.145 0.536 -0.194
TCK34 0.494 -0.107 0.375 -0.123 -0.323
TCK35 0.608 0.124 -0.259 0.102 -0.390
TCK36 0.872 -0.116 -0.128
TCK37 -0.111 0.754
TCK38 0.349 0.148 -0.171 -0.213
TPK39 0.733 -0.111 -0.121 -0.206
TPK40 0.713 -0.127 0.146 0.150 -0.128 -0.172
Table 2 continued
Item Factor 1 Factor 2 Factor 3 Factor 4 Factor 5 Factor 6 Factor 7
TPK41 0.488 0.131 0.192 -0.195
TPK42 0.485 0.198 0.106 0.109 -0.303
TPK43 0.583 0.156 -0.236
TPK44 -0.134 0.256
TPK45 0.207 0.542 0.113
TPK46 0.143 0.188 -0.112 0.120 0.322
TPK47 0.457 0.146 0.129 0.159 -0.172
TPK48 0.124 -0.187
TPK49 0.254 -0.118 0.262 -0.105
TPK50 0.147 0.159 -0.112 -0.254
TPACK51 0.759 -0.115 0.149 0.122
TPACK52 0.987 -0.150 -0.197 -0.156
TPACK53 1.009 -0.141 -0.120 -0.147 -0.101
TPACK54 0.495 0.255 -0.119
TPACK55 0.717 0.149 0.219
TPACK56 0.505 0.204 0.354
TPACK57 0.321 0.157 0.184 -0.193 0.534 0.335
TPACK58 0.210 -0.131 0.748 0.269
TPACK59 0.758 0.249
TPACK60 0.623 -0.185 0.119 0.219 0.259
TPACK61 0.119 0.913 0.491
TPACK62 -0.121 0.802 0.390
Notes. Factor loadings ≥ 0.500 are in boldface. Test of the hypothesis is that seven factors are sufficient. Chi-square statistic is
2375.54 on 1,478 degrees of freedom, p<0.001.
Table 3. Factor Loadings for EFA with Varimax Rotation of Four Factors.
Item Factor 1 Factor 2 Factor 3 Factor 4
TK01 -0.163 0.706
TK02 -0.248 0.848 0.102
TK03 0.843 -0.174
TK04 0.828 -0.114
TK05 0.115 0.669
TK06 0.713
TK07 0.212 0.462
TK08 0.120 -0.405 0.161
CK09 -0.197 0.755
CK10 0.725
CK11 -0.113 0.655
CK12 0.536 0.108
CK13 0.636
CK14 0.118 0.662
CK15 -0.113 -0.118 0.794
CK16 -0.166 0.632
PK17 0.667
PK18 0.801
PK19 -0.118 0.158 0.822
PK20 -0.123 0.758
PK21 0.195 -0.176 0.656
PK22 0.172 0.166 0.479
PK23 0.479
PK24 0.104 0.571
TPACK51 0.864 -0.103 -0.120
TPACK52 0.857 -0.151
TPACK53 0.923 -0.143
TPACK54 0.516 0.233 -0.154
TPACK55 0.798
TPACK56 0.239 0.490
TPACK57 0.570 0.154
TPACK58 0.386 0.236
TPACK59 0.829
TPACK60 0.750 0.133 -0.147
TPACK61 0.507 0.216 -0.141
TPACK62 0.281 0.100 0.305
Notes. Factor loadings ≥ 0.500 are in boldface. Test of the hypothesis is that four factors are sufficient. Chi-square statistic is
861.66 on 492 degrees of freedom, p<0.001.
small loadings onto the desired factor and/or loadings onto undesired factors, we established a final instrument with 22 items (6 TK, 5 CK, 5 PK, 6 TPACK). The iterative process included identifying items that loaded onto multiple factors; for example, we deleted TPACK62 because it cross-loaded almost equally on two constructs. Items with a factor loading below 0.5 were considered for deletion, and remaining items with a factor loading above 0.5 were examined for high correlations with other items. The remaining deletions were due to such high inter-item correlations; for example, TPACK54 correlated highly (>0.8) with TPACK51, and we determined that TPACK51 was a much stronger and more meaningful item than TPACK54.
Further, some of the indices indicated that the fit would be greatly improved
by removing CK10 and allowing TPACK59 and TPACK60 to interact. This
is natural, as the geometry and algebra items for TPACK should have some
correlation for secondary mathematics PSTs because much of the content
is interconnected (NCTM, 2000). The CFA for four factors is presented in
Table 4 (p. 188).
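The pruning rule described above can be sketched as follows. The loading matrix and the cross-loading cutoff are illustrative assumptions (the article states only the 0.5 primary-loading criterion), not the study's actual values:

```python
import numpy as np

# Illustrative loading matrix: rows are items, columns are factors.
loadings = np.array([
    [0.82, 0.10, 0.05],   # clean loading on factor 1 -> keep
    [0.45, 0.12, 0.08],   # maximum loading < 0.5 -> drop
    [0.55, 0.52, 0.03],   # cross-loads on two factors -> drop
    [0.04, 0.11, 0.76],   # clean loading on factor 3 -> keep
])

def retained_items(L, primary=0.5, secondary=0.4):
    """Keep items whose largest |loading| meets the primary threshold
    and whose second-largest |loading| stays below the (assumed)
    cross-loading cutoff."""
    abs_sorted = np.sort(np.abs(L), axis=1)     # ascending per row
    top, second = abs_sorted[:, -1], abs_sorted[:, -2]
    return np.where((top >= primary) & (second < secondary))[0]

print(retained_items(loadings))  # -> [0 3]
```

In the study itself, this loading-based screen was followed by the inter-item correlation check (e.g., dropping TPACK54 for its >0.8 correlation with TPACK51), which a simple loading rule alone does not capture.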
Further, because the analysis of the instrument uses a graded response model to estimate the standard error curve, each subscale must be confirmed to be unidimensional. To verify unidimensionality, we analyzed each subscale (TK, CK, PK, TPACK) using all 294 students.
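Under Samejima's graded response model, the probability of responding at or above each Likert category is a logistic boundary curve, and category probabilities are differences of adjacent boundary curves. A minimal sketch, with an illustrative item (the discrimination and boundary parameters are assumptions, not estimates from this study):

```python
import numpy as np

def grm_category_probs(theta, a, b):
    """Category probabilities under Samejima's graded response model.

    theta : latent trait value
    a     : item discrimination
    b     : increasing boundary difficulties, one per boundary
            (len(b) + 1 response categories)
    """
    b = np.asarray(b, dtype=float)
    # Boundary curves P(X >= k | theta), bracketed by 1 and 0.
    p_star = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    bounds = np.concatenate(([1.0], p_star, [0.0]))
    # Adjacent differences give the probability of each category.
    return bounds[:-1] - bounds[1:]

# Illustrative item: discrimination 1.5, four boundaries for a 5-point scale.
probs = grm_category_probs(theta=0.0, a=1.5, b=[-2.0, -0.8, 0.5, 1.8])
print(probs, probs.sum())  # five category probabilities summing to 1
```

Fitting such a model to each subscale yields the item information, and hence the standard error of measurement curves reported in Figures 2 through 5.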
Internal reliability. The Technology Knowledge (TK) subscale had a
Cronbach’s alpha of 0.8899 with the graded response model (see Figure 2,
p. 189) giving a standard error of measurement below 0.35 for most of the
respondents. For the TK subscale, 74% of the respondents were within one
standard deviation of the mean, and 96% were within two standard devia-
tions of the mean.
The Content Knowledge (CK) subscale had a Cronbach’s alpha of 0.8554 with
the graded response model (see Figure 3, p. 189) giving a standard error of mea-
surement below 0.53 for participants within one standard deviation of the mean.
For the CK subscale, 73% of the respondents were within one standard deviation
of the mean, and 99% were within two standard deviations of the mean.
The Pedagogical Knowledge (PK) subscale had a Cronbach’s alpha of
0.8768 with a graded response model (see Figure 4, p. 190), giving a stan-
dard error of measurement below 0.5 for all of the participants. For the PK
sub-scale, 72% of the respondents were within one standard deviation of the
mean and 93% were within two standard deviations of the mean.
The TPACK subscale had a Cronbach’s alpha of 0.8966 with a graded re-
sponse model (see Figure 5, p. 190) giving a standard error of measurement
below 0.5 for almost all of the participants. For the TPACK subscale, 71% of
the respondents were within one standard deviation of the mean, and 94%
were within two standard deviations of the mean.
As Cronbach’s alpha is above 0.85 for each of the four constructs, internal reliability is very good to excellent for measuring differences at the group level, and the subscales therefore meet the desired
Table 4. Factor Loadings for CFA with Varimax Rotation of Four Factors after Iterative Process for Item Removal
Item TPACK TK CK PK
TK01 0.663
TK02 -0.179 0.801
TK03 0.853 -0.161
TK04 0.831 -0.132
TK05 0.133 0.665
TK06 0.667
CK09 -0.228 0.774
CK10 0.826
CK11 0.754
CK12 0.579 0.102
CK13 0.593
CK14 0.108 0.555
PK17 0.142 0.625
PK18 0.821
PK19 -0.109 0.133 0.827
PK20 -0.121 0.730
PK21 0.208 0.608
TPACK51 0.801 -0.122
TPACK52 0.880 -0.103
TPACK53 0.929
TPACK55 0.710 0.111
TPACK59 0.750
TPACK60 0.678 0.139 0.147 -0.152
Notes. Factor loadings ≥ 0.500 are in boldface. Test of the hypothesis is that four factors are sufficient. Chi-square statistic is
272.38 on 167 degrees of freedom, p<0.001.
Table 5. Confirmatory Factor Analysis Statistical Fit Indices for Final Model and Subscales
Description Chi-square df p NNFI CFI RMSEA
Four-factor model 374.40 202 0.00000 0.940 0.947 0.052
TK subscale 37.67 9 0.00002 0.947 0.968 0.104
CK subscale 18.82 5 0.00207 0.945 0.972 0.097
PK subscale 8.60 5 0.12607 0.990 0.995 0.049
TPACK subscale 44.55 8 0.00000 0.932 0.964 0.125
Notes. Four-factor model statistics and each subscale unidimensionality statistics.
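The RMSEA values in Table 5 follow from the chi-square statistics. A common point-estimate formula is sketched below; note that software packages use slightly different denominators (n versus n − 1) and corrections, so the result may differ from a reported value in the second or third decimal place:

```python
import math

def rmsea(chi_square, df, n):
    """Point estimate of RMSEA from a chi-square model test:
    sqrt(max(chi2 - df, 0) / (df * (n - 1))). Values near or below
    0.05 are conventionally read as close fit."""
    return math.sqrt(max(chi_square - df, 0.0) / (df * (n - 1)))

# Four-factor model from Table 5: chi-square 374.40 on 202 df, n = 294.
print(round(rmsea(374.40, 202, 294), 3))
```

By this formula the four-factor model lands close to the 0.052 in Table 5, while the larger subscale RMSEAs (e.g., the TPACK subscale) reflect their small degrees of freedom.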
Figure 6. Confirmatory factor analysis model diagram for final four-factor instrument.
TK. On the other hand, if PSMTs have strong TK, it does not imply that they perceive themselves as having strong pedagogical knowledge. The key point of this sense-making is that the correlation of CK with PK and TK may simply imply that higher CK is likely when PSMTs have higher TK and PK, not that a PSMT with lower TK or PK will have lower content knowledge. It could be
possible that PSMTs with more education methods courses may have stronger
CK, as K–12 content is focused on more in those courses, as opposed to math-
ematics department upper-division content courses (e.g., analysis, abstract
algebra). PSMTs who have completed upper-division mathematics courses
may perceive themselves as having lower CK because their strong self-efficacy
in mathematics may be challenged in theoretical-based courses.
With these results, our goal of developing a valid and reliable instrument to monitor and assess preservice mathematics teachers' TPACK development within teacher education programs was realized in a four-factor instrument. Although no instrument is perfect, ours provides a robust survey aimed at one specific population that is traditionally very small at any single institution of higher education, making it nearly impossible to quantify TPACK components at scale through research at one or even a few institutions. This work adds to the TPACK literature a large-scale quantitative study of a historically small population, institution to institution, in the
United States. Our instrument builds on the work of Mishra and Koehler
(2005a, 2005b, 2006) through the adapting of the Schmidt et al. work (2009).
We extend the TPACK literature domain with this instrument as compared
to previous study findings for different populations.
The main goal of this research project was to develop a valid and reliable
instrument for the population of secondary mathematics PSTs to help math-
ematics teacher educators (MTEs) monitor and assess their own programs’
ability to develop TPACK in their PSTs. The data from this research yielded survey items that reliably monitor and assess the constructs of TK, PK, CK, and TPACK, so that historically small populations of students at individual institutions can be examined with a valid and reliable instrument of program effectiveness. This instrument can inform a single program course and can provide overall information for an entire program; applied as a pre-post survey across a program, however, it does not provide detailed information about course-specific material. A limitation
arises with programs where secondary mathematics PSMTs’ content knowl-
edge is targeted by a mathematics department and their pedagogical knowl-
edge is targeted by an education department, and there is a lack of commu-
nication between both departments. This is not uncommon in secondary
mathematics teacher preparation programs across the United States.
This study was not specifically about what students know, but rather an assessment of what they think they know. PSMTs' personal beliefs change positively and negatively over time during preservice preparation, and this instrument gives researchers and educators the ability to reliably measure these beliefs regarding TPACK and its contributing factors of PK, TK, and CK during preparation programs. PSMTs can easily be overconfident or underconfident; our experience indicates that PSMTs exposed to new ways of thinking during coursework can improve or diminish their sense of self-efficacy. Because learning can move in both directions, our study provides an opportunity for programs and educators to understand their learning environment and improve TPACK development in PSMTs.
Limitations
A limitation worth noting is potential variation in the administration of the surveys. Specific instructions were given to each site, but we must rely on the assumption that the professionals followed those guidelines and administered the surveys in a similar fashion at each institution.
More significant, we were unable to isolate PCK, TCK, and TPK knowledge
domains specifically in the population of secondary mathematics PSTs. We
acknowledge that this is limiting in terms of the use of our survey, yet we
see unearthing such a limitation as a key finding in expanding the TPACK
literature domain.
Acknowledgments
The authors wish to thank individuals who helped us by administering or arranging the admin-
istration of our survey to the secondary mathematics preservice teachers at their respective insti-
tutions. Thank you, Ginny Bohme, Daniel Brahier, Anna Marie Conner, Kelly Edenfield, Carla
Gerberry, Shannon Guerrero, Gary Martin, Ginger Rhodes, Wendy Sanchez, Tommy Smith, Toni
Smith, Jami Stone, Anthony Thompson, and Jan Yow. We also wish to express our gratitude to the
preservice secondary mathematics teachers who engaged and diligently completed surveys at each
of the respective institutions of higher education. Finally, we wish to thank Barbara and Robert
Reys for their visionary mission to link early career mathematics teacher educators through the
STaR program to advance the future of the mathematics education field. Without their hard work
and the mentoring professionals of the STaR program, the data collection for this project may
have taken years and the professional relationships might never have been established.
Author Notes
Jeremy Zelkowski, PhD, is an assistant professor of secondary mathematics education in the
Department of Curriculum and Instructions in the College of Education at the University of
Alabama, Tuscaloosa. His research interests focus on TPACK development in preservice teachers,
appropriate use of technology in teaching mathematics, mathematical college readiness, and pro-
fessional development learning communities. Please address correspondence regarding this article
to Dr. Jeremy Zelkowski, Department of Curriculum and Instruction, University of Alabama, 902
University Blvd, Tuscaloosa, AL 35487-0232. Email: jzelkowski@bamaed.ua.edu
Jim Gleason, PhD, is an associate professor in the Department of Mathematics in the College of Arts
& Sciences at the University of Alabama, Tuscaloosa. His research interests focus on teacher content
knowledge, assessment, and psychometrics pertaining to mathematics teaching and learning.
Dana C. Cox, PhD, is an assistant professor in the Department of Mathematics in the College of
Arts & Sciences at Miami University, Oxford, Ohio. Her research interests focus on curriculum
design, TPACK development in preservice teachers, and studying the emergent professional vision
of teaching with technology held by preservice teachers.
References
Abbitt, J. T. (2011). Measuring technological pedagogical content knowledge in preservice
teacher education: A review of current methods and instruments. Journal of Research on
Technology in Education, 43(4), 281–300.
Adams, L. L. M., & Gale, D. (1982). Solving the quandary between questionnaire length and
response rate in education research. Research in Higher Education, 17(3), 231–240.
Angeli, C., & Valanides, N. (2009). Epistemological and methodological issues for the
conceptualization, development, and assessment of ICT-TPCK: Advances in technological
pedagogical content knowledge (TPCK). Computers & Education, 52(1), 154–168.
Archambault, L. M., & Barnett, J. H. (2010). Revisiting technological pedagogical content
knowledge: Exploring the TPACK framework. Computers & Education, 55(4), 1656–1662.
Archambault, L., & Crippen, K. (2009). Examining TPACK among K–12 online distance
educators in the United States. Contemporary Issues in Technology and Teacher Education
(CITE Journal), 9(1), 71–88.
Association of Mathematics Teacher Educators (AMTE). (2006). Preparing teachers to use
technology to enhance the learning of mathematics. Retrieved August 1, 2008, from http://
www.amte.net/resources
Association of Mathematics Teacher Educators (AMTE). (2009). Mathematics TPACK
framework. Retrieved October 13, 2009, from http://www.amte.net/resources
Ball, D. L. (1990). Prospective elementary and secondary teachers’ understanding of division.
Journal for Research in Mathematics Education, 21(2), 132–144.
Ball, D. L., Hill, H. C., & Bass, H. (2002). Developing measures of mathematics knowledge for
teaching. Ann Arbor, MI: Study of Instructional Improvement.
Ball, D. L., Lubienski, S., & Mewborn, D. (2001). Research on teaching mathematics: The
unsolved problem of teachers’ mathematical knowledge. In V. Richardson (Ed.), Handbook
of research on teaching (4th ed.). New York: Macmillan.
Barkatsas, A., Kasimatis, K., & Gialamas, V. (2009). Learning secondary mathematics
with technology: Exploring the complex interrelationship between students’ attitudes,
engagement, gender and achievement. Computers & Education, 52(3), 562–570.
Beaudin, M., & Bowers, D. (1997). Logistics for facilitating CAS instruction. In J. Berry,
J. Monaghan, M. Kronfellner, & B. Kutzler (Eds.), The state of computer algebra in
mathematics education (pp. 126–135). Lancashire, England: Chartwell-York.
Bos, B. (2007). The effect of the Texas Instrument interactive instructional environment on the
mathematical achievement of eleventh grade low achieving students. Journal of Educational
Computing Research, 37(4), 351–368.
Bos, B. (2011). Professional development for elementary teachers using TPACK. Contemporary
Issues in Technology and Teacher Education, 11(2), 167–183.
Bowers, J., & Stephens, B. (2011). Using technology to explore mathematical relationships:
A framework for orienting mathematics courses for prospective teachers. Journal of
Mathematics Teacher Education, 14(4), 285–304. doi:10.1007/s10857-011-9168-x
Brown, C. A., & Borko, H. (1992). Becoming a mathematics teacher. In D. A. Grouws (Ed.),
Handbook of research on mathematics teaching and learning. New York: Macmillan.
Cronbach, L. J., & Shavelson, R. J. (2004). My current thoughts on coefficient alpha and
successor procedures. Educational and Psychological Measurement, 64(3), 391–418.
Dede, Y., & Soybas, D. (2011). Preservice mathematics teachers’ experiences about function
and equation concepts. EURASIA Journal of Mathematics, Science & Technology Education,
7(2), 89–102.
Demana, F., & Waits, B. K. (1990). Implementing the standards: The role of technology in
teaching mathematics. Mathematics Teacher, 83(1), 27–31.
Demana, F., & Waits, B. K. (1998). The role of graphing calculators in mathematics reform.
Retrieved from ERIC database. (ED458108).
DeVellis, R. F. (2011). Scale development: Theory and applications. Thousand Oaks, CA: Sage
Publications.
Doerr, H. M., & Zangor, R. (2000). Creating meaning for and with the graphing calculator.
Educational Studies in Mathematics, 41(2), 143–163.
Drier, H. S. (2001). Teaching and learning mathematics with interactive spreadsheets. School
Science and Mathematics 101(4), 170–179.
Duda, J. (2011). Mathematical creative activity and the graphic calculator. International
Journal for Technology in Mathematics Education, 18(1), 3–14.
Dugdale, S., & Kibbey, D. (1990). Beyond the evident content goals—Part I: Tapping the depth
and flow of the educational undercurrent. Journal of Mathematical Behavior, 9, 201–228.
Dunham, P. H., & Dick, T. P. (1994). Connecting research to teaching: Research on graphing
calculators. Mathematics Teacher, 87(6), 440–445.
Dunham, P., & Hennessey, S. (2008). Equity and the use of educational technology in
mathematics. In M. K. Heid & G. W. Blume (Eds.), Research on technology and the teaching
and learning of mathematics: Vol. 1. Research syntheses (pp. 345–418). Charlotte, NC:
Information Age.
Fink, A. (2003). How to manage, analyze, and interpret survey data (2nd edition). Thousand
Oaks, CA: Sage Publications.
Fink, A., & Kosecoff, J. (1998). How to conduct surveys: A step-by-step guide. Thousand Oaks,
CA: Sage Publications.
Fowler, F. J. (2008). Survey research methods (4th edition). Thousand Oaks, CA: Sage
Publications.
Frykholm, J., & Glasson, G. (2005). Connecting science and mathematics instruction:
Pedagogical context knowledge for teachers. School Science and Mathematics, 105(3),
127–140.
Green, S. B., Lissitz, R. W., & Mulaik, S. A. (1977). Limitations of coefficient alpha as an index
of test unidimensionality. Educational and Psychological Measurement, 37(4), 827–838.
Guckin, A., & Morrison, D. (1991). Math*Logo: A project to develop proportional reasoning
in college freshmen. School Science and Mathematics, 91(2), 77–81.
Harris, J. B., & Hofer, M. J. (2011). Technological pedagogical content knowledge (TPACK)
in action: A descriptive study of secondary teachers’ curriculum-based, technology-related
instructional planning. Journal of Research on Technology in Education, 43(3), 211–229.
Harris, J. B., Grandgenett, N., & Hofer, M. (2010, March). Testing a TPACK-based technology
integration assessment rubric. Paper presented at the annual meeting of the Society for
Information Technology and Teacher Education (SITE), San Diego, CA.
Hill, H. C. (2011). The nature and effects of middle school mathematics teacher learning
experiences. Teachers College Record, 113(1), 205–234.
Hofer, M., & Grandgenett, N. (2012). TPACK development in teacher education: A
longitudinal study of preservice teachers in a secondary M.A.Ed. program. Journal of
Research on Technology in Education, 45(1), 83–106.
Hofer, M., Grandgenett, N., Harris, J., & Swan, K. (2011). Testing a TPACK-based
technology integration observation instrument. In C. D. Maddux, D. Gibson, B. Dodge,
M. Koehler, P. Mishra, & C. Owens (Eds.). Research highlights in technology and teacher
education 2011 (pp. 39–46). Chesapeake, VA: Society for Information Technology &
Teacher Education.
Hollebrands, K. F. (2003). High school students’ understandings of geometric transformations
in the context of a technological environment. Journal of Mathematical Behavior, 22(1),
55–72.
Kahan, J. A., Cooper, D. A., & Bethea, K. A. (2003). The role of mathematics teachers’
content knowledge in their teaching: A framework for research applied to a study
of student teachers. Journal of Mathematics Teacher Education, 6(3), 223–252.
doi:10.1023/A:1025175812582
Kaput, J. (1995). Creating cybernetic and psychological ramps for the concrete to the abstract:
Examples from multiplicative structures. In D. Perkins, J. Schwartz, M. West, & M. Wiske
(Eds.), Software goes to school: Teaching for understanding with new technologies (pp.
130–154). New York: Oxford University Press.
Kastberg, S., & Leatham, K. (2005). Research on graphing calculators at the secondary level:
Implications for mathematics teacher education. Contemporary Issues in Technology and
Teacher Education [Online serial], 5(1).
Keating, T., & Evans, E. (2001). Three computers in the back of the classroom: Preservice
teachers’ conceptions of technology integration. In R. Carlsen, N. Davis, J. Price, R. Weber, &
D. Willis (Eds.), Society for Information Technology and Teacher Education Annual, 2001 (pp.
1671–1676). Norfolk, VA: Association for the Advancement of Computing in Education.
Kelly, B., Carnine, D., Gersten, R., & Grossen, B. (1986). The effectiveness of videodisc
instruction in teaching fractions to learning-disabled and remedial high school students.
Journal of Special Education Technology, 7(2), 5–17.
Kendal, M., & Stacey, K. (2001). The impact of teacher privileging on learning differentiation
with technology. International Journal of Computers for Mathematical Learning, 6(2),
143–165.
Koehler, M. J., & Mishra, P. (2005a). Teachers learning technology by design. Journal of
Computing In Teacher Education, 21(3), 94–102.
Koehler, M. J., & Mishra, P. (2005b). What happens when teachers design educational
technology? The development of technological pedagogical content knowledge. Journal of
Educational Computing Research, 32(2), 131–152.
Koh, J. L., Chai, C. S., & Tsai, C. C. (2010). Examining the technological pedagogical content
knowledge of Singapore pre-service teachers with a large-scale survey. Journal of Computer
Assisted Learning, 26(6), 563–573. doi:10.1111/j.1365-2729.2010.00372.x
Kutzler, B. (2000). The algebraic calculator as a pedagogical tool for teaching mathematics.
International Journal of Computer Algebra in Mathematics Education, 7(1), 5–23.
Lampert, M., & Ball, D. L. (1998). Teaching, multimedia, and mathematics: Investigations of real
practice. New York: Teacher’s College Press.
Lawless, K. A., & Pellegrino, J. W. (2007). Professional development in integrating technology
into teaching and learning: Knowns, unknowns, and ways to pursue better questions and
answers. Review of Educational Research, 77(4), 575–614.
Lawshe, C. H. (1975). A quantitative approach to content validity. Personnel Psychology, 28(4),
563–575.
Leatham, K. R. (2007). Pre-service secondary mathematics teachers’ beliefs about the nature
of technology in the classroom. Canadian Journal of Science, Mathematics, and Technology
Education. 7(2/3), 183–207.
Lefever, S., Dal, M., & Matthiasdottir, A. (2007). Online data collection in academic research:
Advantages and limitations. British Journal of Educational Technology, 38(4), 574–582.
Li, Q. (2003). Would we teach without technology? A professor's experience of teaching
mathematics education incorporating the internet. Educational Research, 45(1), 61–77.
Li, Q., & Ma, X. (2010). A meta-analysis of the effects of computer technology on school
students’ mathematics learning. Educational Psychology Review, 22(3), 215–243.
Litwin, M. S. (1995). How to measure survey reliability and validity. Thousand Oaks, CA: Sage
Publications.
López, O. S. (2010). The digital learning classroom: Improving English language learners’
academic success in mathematics and reading using interactive whiteboard technology.
Computers & Education, 54(4), 901–915. doi:10.1016/j.compedu.2009.09.019
Lux, N., Bangert, A., & Whittier, D. (2011). The development of an instrument to assess
preservice teachers’ technological pedagogical content knowledge. Journal of Educational
Computing Research, 45(4), 415–431.
Ma, L. P. (1999). Knowing and teaching elementary mathematics. Mahwah, NJ: Lawrence
Erlbaum.
McKinney, S., & Frazier, W. (2008). Embracing the principles and standards for school
mathematics: An inquiry into the pedagogical and instructional practices of mathematics
teachers in high-poverty middle schools. Clearing House: A Journal of Educational
Strategies, Issues, and Ideas. 81(5), 201–210.
Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A
framework for integrating technology in teachers’ knowledge. Teachers College Record,
108(6), 1017–1054.
Mitchell, B., Bailey, J. L., & Monroe, E. (2007). Integrating technology and a standards-based
pedagogy in a geometry classroom: A mature teacher deals with the reality of multiple
demands and paradigm shifts. Computers in the Schools, 24(1–2), 75–91.
National Council of Teachers of Mathematics. (2000). Principles and standards for school
mathematics. Reston, VA: Author.
National Governors Association (NGA) Center for Best Practices & Council of Chief State
School Officers (CCSSO). (2010). The Common Core State Standards initiative: Preparing
America’s students for college and career. Washington, DC: Author.
Niess, M. L. (2001). A model for integrating technology in preservice science and mathematics
content-specific teacher preparation. School Science & Mathematics, 101(2), 102–109.
Niess, M. L. (2005). Preparing teachers to teach science and mathematics with technology:
Developing a technology pedagogical content knowledge. Teaching and Teacher Education,
21(5), 509–523.
Niess, M. L. (2011). Investigating TPACK: Knowledge growth in teaching with technology.
Journal for Educational Computing Research. 44(3), 299–317.
Niess, M. L. (2013). Central component descriptors for levels of technological pedagogical
content knowledge. Journal of Educational Computing Research, 48(2), 173–198.
Niess, M. L., Ronau, R. N., Shafer, K. G., Driskell, S. O., Harper, S. R., Johnston, C., Browning,
C., Özgün-Koca, S. A., & Kersaint, G. (2009). Mathematics teacher TPACK standards and
development model. Contemporary Issues in Technology and Teacher Education [Online
serial], 9(1), 4–24.
Norris, J., & Conn, C. (2005). Investigating strategies for increasing student response rates to
online-delivered course evaluations. Quarterly Review of Distance Education, 6(1), 13–29.
Norton, S., McRobbie, C. J., & Cooper, T. J. (2000). Exploring secondary math teachers’
reasons for not using computers in their teaching: Five case studies. Journal of Research on
Computing in Education. 33(1), 87–109.
Özgün-Koca, S. A., Meagher, M., & Edwards, M. T. (2010). Preservice teachers’ emerging
TPACK in a technology-rich methods class. Mathematics Educator, 19(2), 10–20.
Page, M. S. (2002). Technology-enriched classrooms: Effects on students of low socioeconomic
status. Journal of Research on Technology in Education, 34(4), 389–409.
Patten, M. L. (2001). Questionnaire research: A practical guide (2nd edition). Los Angeles:
Pyrczak Publishing.
Pierce, R., & Ball, L. (2009). Perceptions that may affect teachers’ intention to use technology
in secondary mathematics classes. Educational Studies in Mathematics, 71(3), 299–317.
Quinn, R. J. (1998). Technology: PSTs’ beliefs and the influence of a mathematics methods
course. Clearing House. 71(6), 375–377.
Sahin, I. (2011). Development of survey of technological pedagogical and content knowledge
(TPACK). Turkish Online Journal of Educational Technology—TOJET, 10(1), 97–105.
Salinger, G. L. (1994). Educational reform movements and technology education. Technology
Teacher, 53(5), 6–8.
Samejima, F. (1996). Graded response model. In W. J. van der Linden & R. K. Hambleton
(Eds.), Handbook of modern item response theory. New York: Springer-Verlag.
Schmidt, D. A., Baran, E., Thompson, A. D., Mishra, P., Koehler, M. J., & Shin, T. S. (2009).
Technological pedagogical content knowledge (TPACK): The development and validation
of an assessment instrument for preservice teachers. Journal of Research on Technology in
Education, 42(2), 123–149.
Schmidt, K., Kohler, A., & Moldenhauer, W. (2009). Introducing a computer algebra system
in mathematics education—Empirical evidence from Germany. International Journal for
Technology in Mathematics Education, 16(1), 11–26.
Schmitt, N. (1996). Uses and abuses of coefficient alpha. Psychological Assessment, 8(4),
350–353.
Shapley, K., Sheehan, D., Maloney, C., & Caranikas-Walker, F. (2011). Effects of technology
immersion on middle school students’ learning opportunities and achievement. Journal of
Educational Research, 104(5), 299–315. doi:10.1080/00220671003767615
Shoaf, M. (2000). A capstone course for pre-service secondary mathematics teachers.
International Journal of Mathematical Education in Science & Technology, 31(1), 151–160.
Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational
Researcher, 15(2), 4–14.
Silverman, J., & Thompson, P. W. (2008). Toward a framework for the development of
mathematical knowledge for teaching. Journal of Mathematics Teacher Education, 11(6),
499–511. doi:10.1007/s10857-008-9089-5
Stallings, L. L. (1995). Teachers’ stories learning to use computing technologies in mathematics
teaching. Unpublished doctoral dissertation, University of Georgia, Athens.
Stein, M. K., Smith, M. S., Henningsen, M. A., & Silver, E. A. (2010). Implementing standards-
based mathematics instruction: A casebook for professional development. New York: Teachers
College Press.
Strawhecker, J. (2005). Preparing elementary teachers to teach mathematics: How field
experiences impact pedagogical content knowledge. Issues in the Undergraduate
Mathematics Preparation of School Teachers, 4(Curriculum), 12 pp. Retrieved from ERIC
database. (EJ835513).
Tharp, M. L., Fitzsimmons, J. A., & Ayers, R. L. B. (1997). Negotiating a technological shift:
Teacher perception of the implementation of graphing calculators. Journal of Computers in
Mathematics and Science Teaching, 16(4), 551–575.
Thomas, M. O. J. (2006). Teachers using computers in mathematics: A longitudinal study.
In C. Hoyles, J.-B. Lagrange, L. H. Son, & N. Sinclair (Eds.), Proceedings for the 17th ICMI Study
Conference: Technology Revisited, Hanoi University of Technology, December 3–8, 2006
(c17).
Thompson, A. G. (1992). Teachers’ beliefs and conceptions: A synthesis of the research. In
D.A. Grouws (Ed.), Handbook of Research in Mathematics Teaching and Learning (pp.
127–146). New York: MacMillan.
Thompson, B. (2004). Exploratory and confirmatory factor analysis: Understanding concepts
and applications. Washington, DC: American Psychological Association.
Torff, B., & Tirotta, R. (2010). Interactive whiteboards produce small gains in elementary
students’ self-reported motivation in mathematics. Computers & Education, 54(2), 379–383.
doi:10.1016/j.compedu.2009.08.019
Ursini, S., Santos, D., & Juarez Lopez, J. A. (2005). Teachers’ resistance using technology:
Source of ideas for a pedagogical proposal. In F. Olivero & R. Sutherland (Eds.), Proceedings
of the 7th International Conference on Technology in Mathematics Teaching (ICTMT) (pp.
189–197). Bristol, UK: University of Bristol.
Van Voorst, C. (2004). Capstone mathematics courses for teachers. Issues in the Undergraduate
Mathematics Preparation of School Teachers, 4(Curriculum), 11 pp. Retrieved from ERIC
database. (EJ835511).
Wenglinsky, H., & Educational Testing Service. (1998). Does it compute? The relationship
between educational technology and student achievement in mathematics. Retrieved from
ERIC database. (ED425191).
Wilson, S., Floden, R., & Ferrini-Mundy, J. (2001). Teacher preparation research: Current
knowledge, gaps and recommendations. A research report prepared for the U.S. Department
of Education by the Center for the Study of Teaching and Policy in collaboration with
Michigan State University. Retrieved from http://depts.washington.edu/ctpmail/PDFs/
TeacherPrep-WFFM-02-2001.pdf
Young, J. R., Young, J. L., & Shaker, Z. (2012). Describing the preservice teacher technological
pedagogical content knowledge (TPACK) literature using confidence intervals. Tech Trends,
56(5), 25–33.
Zbiek, R. (1998). Prospective teachers’ use of computing tools to develop and validate
functions as mathematical models. Journal for Research in Mathematics Education, 29(2),
184–201.
Zbiek, R. M., & Hollebrands, K. (2008). A research-informed view of the process of
incorporating mathematics technology into classroom practice by inservice and preservice
teachers. In M. K. Heid & G. W. Blume (Eds.), Research on technology and the teaching
and learning of mathematics: Vol. 1. Research syntheses (pp. 287–344). Charlotte, NC:
Information Age.
Manuscript received April 3, 2013 | Initial decision April 26, 2013 | Revised manuscript accepted May 20, 2013
Appendix A
Administered Survey
Thank you for taking the time to complete this survey. Please answer each question to the
best of your knowledge. You should answer the demographic information first, then read
each item and choose your first belief. You need not spend a lengthy time on any
one item. You should be finished in about 15 minutes.
Your thoughtfulness and candid responses will be greatly appreciated. Your confi-
dentiality will not be compromised and your name will not, at any time, be associated
with your responses.
Your responses will be kept completely confidential and will not influence your
course grade.
Demographic Information
Age range:
a. Under 19
b. 19–22
c. 23–26
d. 27–30
e. 30+
Year in college:
a. Freshman
b. Sophomore
c. Junior
d. Senior
e. Graduate student
Gender:
a. Male
b. Female
c. I prefer not to say
Ethnicity:
a. African American or black
b. Alaskan Native
c. American Indian
d. Asian
e. Hispanic or Latino
f. Pacific Islander
g. White or Caucasian
h. Other: _____________
i. I prefer not to say
Technology is a broad concept that can mean a lot of different things. For the purpose
of this questionnaire, technology is referring to digital technology/technologies—that
is, the digital tools we use, such as computers, laptops, iPods, handhelds, interactive
whiteboards, computer software programs, graphing calculators, etc.
Please answer all of the questions, and if you are uncertain of or neutral about your
response, you may always select “Neither agree nor disagree.”
All items used a five-point response scale: SD (strongly disagree), D (disagree), N (neither agree nor disagree), A (agree), and SA (strongly agree).
TCK35 I know about technologies that I can use for understanding and doing geometry.
TCK36 I know about technologies that I can use for understanding and doing trigonometry.
TCK37 I know about technologies that I can use for understanding and doing calculus.
TCK38 I know that using appropriate technology can improve one’s understanding of mathematics
concepts.
TPK39 I can choose technologies that enhance the teaching of a lesson.
TPK40 I can choose technologies that enhance students’ learning for a lesson.
TPK41 My teacher education program has caused me to think more deeply about how
technology could influence the teaching approaches I use in my classroom.
TPK42 I am thinking critically about how to use technology in my classroom.
TPK43 I can adapt the use of the technologies that I am learning about to different teaching
activities.
TPK44 Different teaching approaches require different technologies.
TPK45 I have the technical skills I need to use technology appropriately in teaching.
TPK46 I have the classroom management skills I need to use technology appropriately in
teaching.
TPK47 I know how to use technology in different instructional approaches.
TPK48 My teaching approaches change when I use technologies in a classroom.
TPK49 Knowing how to use a certain technology means that I can use it for teaching.
TPK50 Different technologies require different teaching approaches.
TPACK51 I can use strategies that combine mathematics, technologies, and teaching approaches
that I learned about in my coursework in my classroom.
TPACK52 I can choose technologies that enhance the mathematics for a lesson.
TPACK53 I can select technologies to use in my classroom that enhance what I teach, how I teach,
and what students learn.
TPACK54 I can provide leadership in helping others to coordinate the use of mathematics,
technologies, and teaching approaches at my school and/or district.
TPACK55 I can teach lessons that appropriately combine mathematics, technologies, and teaching
approaches.
TPACK56 Integrating technology in teaching mathematics will be easy and straightforward for me.
TPACK57 I can teach lessons that appropriately combine ratio and proportion, technologies, and
teaching approaches.
TPACK58 I can teach lessons that appropriately combine probability and statistics, technologies, and
teaching approaches.
TPACK59 I can teach lessons that appropriately combine algebra, technologies, and teaching
approaches.
TPACK60 I can teach lessons that appropriately combine geometry, technologies, and teaching
approaches.
TPACK61 I can teach lessons that appropriately combine trigonometry, technologies, and teaching
approaches.
TPACK62 I can teach lessons that appropriately combine calculus, technologies, and teaching
approaches.
The survey was presented in a four-page format so that items did not appear to run together.
Appendix B