Developing and Validating a Reliable TPACK Instrument for
Secondary Mathematics Preservice Teachers

JRTE | Vol. 46, No. 2, pp. 173–206 | ©2013 ISTE | iste.org/jrte
DOI: 10.1080/15391523.2013.10782618


Jeremy Zelkowski
Jim Gleason
University of Alabama

Dana C. Cox
Miami University

Stephen Bismarck
University of South Carolina—Upstate

Abstract

Within the realm of teaching middle and high school mathematics, the abil-
ity to teach mathematics effectively using various forms of technology is now
more important than ever, as expressed by the recent development of the
Common Core State Standards for Mathematical Practice. This article pres-
ents the development process and the results from 15 institutions and more
than 300 surveys completed by secondary mathematics preservice teach-
ers. The results suggest that technological, pedagogical, and content knowl-
edge; technology knowledge; content knowledge; and pedagogical knowledge
constructs are valid and reliable, whereas pedagogical content knowledge,
technological content knowledge, and technological pedagogical knowledge
domains remain difficult for preservice teachers to separate and self-report.
(Keywords: TPACK, survey, secondary, mathematics education, teacher edu-
cation, preservice)

With the recent development and adoption of the Common Core
State Standards for Mathematics (CCSSM) (NGA & CCSSO, 2010)
come new standards for mathematical practice that re-emphasize
the importance of technological tools in mathematics classrooms. This
renewed commitment to helping students model their world and strategi-
cally select tools for mathematical pursuits essentially requires teachers of
mathematics to be knowledgeable and effective at teaching mathematics
with technology and assessing the learning of mathematics by students
who use technology when doing mathematics (Stein, Smith, Henningsen,
& Silver, 2010). The CCSSM are not the first set of standards that attend to
the importance of integrating technology into the teaching and learning of
mathematics. The National Council of Teachers of Mathematics (NCTM)

and the Association of Mathematics Teacher Educators (AMTE) have also
made specific recommendations regarding the role of technology in the
mathematics classroom and for the development of mathematics teachers.
NCTM’s Technology Principle provides a vision of mathematics teaching
where technology is essential, driving curricular changes in the mathematics
that can and should be learned in the classroom. The technology principle
states, “Technology is essential in teaching and learning mathematics; it
influences the mathematics that is taught and enhances students’ learning”
(NCTM, 2000, p. 24). To enact this vision, AMTE (2006) recommends that
“mathematics teacher preparation programs must ensure that all math-
ematics teachers and teacher candidates have opportunities to acquire the
knowledge and experiences needed to incorporate technology in the context
of teaching and learning mathematics” (p. 1). The specialized knowledge that
teachers require to effectively integrate technology into teaching practices is
currently referred to as technological, pedagogical, and content knowledge
(TPACK) (Mishra & Koehler, 2006).
Given that some mathematics teachers utilize technology effectively and
consistently in their classrooms, while others seem to avoid technology, we
must ask whether teachers’ values and beliefs are at the heart of this paradox.
Further, because of the growing national perspective that K–12 students
should have access to technological tools and the hope that this access will
greatly influence the way mathematics is taught and learned, it behooves
teacher preparation programs to examine their current programs
and ensure that they are producing technologically effective mathematics
teachers. The purpose of this study was to develop, establish the reliability of,
and validate constructs for, an instrument that would enable preservice sec-
ondary mathematics teacher educators to examine their preservice teachers’
(PSTs’) self-perceptions of TPACK and evaluate their program’s effectiveness
at developing PSTs’ TPACK.

Theoretical Background

Defining Mathematical TPACK


TPACK emerged in the literature as a theoretical framework for teachers’
knowledge of effective technology use in the classroom for teaching and
learning (Mishra & Koehler, 2006). The TPACK framework was developed
by expanding the construct of pedagogical content knowledge (PCK) to in-
clude the integration of technology knowledge (TK) for teaching (see Figure
1). Shulman’s (1986) construct of PCK referred to the teaching knowledge
that specifically blends both content knowledge (CK) and pedagogical
knowledge (PK). Specific to mathematics, CK and PK, as well as PCK,
frameworks have been proposed and studied relatively extensively during
the past two decades (Ball, 1990; Ball, Hill, & Bass, 2002; Brown & Borko,
1992; Hill, 2011; Lampert & Ball, 1998; Ma, 1999; Silverman & Thompson,
2008; Strawhecker, 2005; Van Voorst, 2004), though more so at the primary
grade levels than secondary grade levels.

Figure 1. Seven knowledge domains central to the TPACK framework (http://tpack.org).
Because of this work, teacher education programs have placed much
greater attention on the development of these teacher knowledge do-
mains (Ball, Lubienski, & Mewborn, 2001; Dede & Soybas, 2011; Fryk-
holm & Glasson, 2005; Kahan, Cooper, & Bethea, 2003; Wilson,
Floden, & Ferrini-Mundy, 2001). With the inclusion of TK as an additional
teacher knowledge domain and the advanced technology developments
for teaching mathematics, teacher education programs for teachers of
mathematics now must examine their program effectiveness at develop-
ing seven domains of knowledge in PSTs. These seven domains include
TK, PK, CK, PCK, technological pedagogical knowledge (TPK), techno-
logical content knowledge (TCK), and TPACK (see Figure 1).
For a more general treatment and description of each of these domains,
we refer the reader to Schmidt, Baran, Thompson, Mishra, Koehler, and Shin
(2009) and the literature above. Here, we elaborate on TK, TCK, and TPACK
in the context of secondary mathematics. To begin crafting these working
definitions, we acknowledge the considerable amount of accomplished work
in the field (see “TPACK in secondary mathematics” below).
TK in secondary mathematics. TK in secondary mathematics refers to
knowledge of technologies that are relevant to secondary mathematics
teaching and learning. The list of technologies specific to secondary math-
ematics classrooms is ever growing and includes technologies developed
specifically for the mathematics classroom as well as mainstream tech-
nologies that serve computational and/or instructional needs. In the first
category, the list surely includes computer algebra systems (CAS), dynamic
mathematical software (e.g., Fathom, Geogebra, Geometer’s Sketchpad,
TI-Nspire Teacher’s Edition Software), online Java applets, and graphing
handhelds. In the second category, the list minimally includes calculat-
ing devices, spreadsheets, data collection devices such as calculator-based
laboratories (CBLs), interactive whiteboards, personal response systems,
and many others.
TCK in secondary mathematics. TCK in secondary mathematics refers
to the knowledge that incorporating technology fundamentally changes
the mathematics that students can learn. Three examples will highlight
the impact of technology on mathematical content in the classroom. First,
dynamic mathematical software transforms the classroom into a mathemati-
cal laboratory, giving students the opportunity to engage with mathematical
concepts and relationships in an empirical environment. This, in turn, opens
up new perspectives and problem-solving techniques that would otherwise
not be possible. Second, graphing handheld devices give students immedi-
ate access to multiple representations of mathematical relationships. These
representations and the connections between them are not only reasoning
tools, but also communication tools that assist students as they learn to
negotiate shared meanings. Third, spreadsheets and data collection devices
give students access to larger, more complex data sets with which to model
real-world phenomena. These data sets would be incomprehensible to stu-
dents working exclusively with pencil and paper.
TPACK in secondary mathematics. Beyond the questions of what technolo-
gies are relevant to the teaching of mathematics and what mathematics can
be taught with the inclusion of technology is the question of how best to
leverage the power of technology in the classroom. Knowledge about how
technologies can influence the teaching and learning of mathematics is
referred to as TPACK. Teachers of mathematics must possess enough spe-
cialized knowledge to make critical classroom decisions of pedagogy with
respect to mathematical content and appropriate technologies. Internation-
ally, research has shown students learn more mathematics and learn it more
deeply by using technology effectively and appropriately during the teaching
and learning of mathematics—including some studies showing the closing
of achievement gaps in underrepresented students (Barkatsas, Kasimatic,
& Gialamas, 2009; Bos, 2007; Doerr & Zangor, 2000; Duda, 2011; Dugdale
& Kibbey, 1990; Dunham & Dick, 1994; Guckin & Morrison, 1991; Hol-
lebrands, 2003; Kaput, 1995; Li & Ma, 2010; López, 2010; Mitchell, Bailey, &
Monroe, 2007; Page, 2002; Schmidt, Kohler, & Moldenhauer, 2009; Weng-
linsky & Educational Testing Service, 1998; Zbiek, 1998). However, some
studies have shown little to no achievement effects specific to the integration
of technology into the mathematics classroom (Kelly, Carnine, Gersten, &
Grossen, 1986; Shapley, Sheehan, Maloney, & Caranikas-Walker, 2011; Torff
& Tirotta, 2010). The distinguishing factor between the research findings
appears to be fundamentally grounded in how the technology is used peda-
gogically by the teacher, their beliefs about technology, and whether students
are using the device to make mathematical connections and check for un-
derstanding rather than simply for computation, also referred to as answer-
getting. Teachers’ effective technology implementation in the mathematics
classroom has been slow, according to the literature, for various reasons,
including but not limited to: (a) resistance to change, (b) lack of confidence,
(c) access, (d) usefulness, (e) fears (i.e., technical failures, students know-
ing more about technology than the teacher), and (f) inhibitors (Beaudin &
Bowers, 1997; Keating & Evans, 2001; Leatham, 2007; Li, 2003; McKinney
& Frazier, 2008; Norton, McRobbie, & Cooper 2000; Pierce & Ball, 2009;
Quinn, 1998; Stallings, 1995; Thomas, 2006; Thompson, 1992; Ursini, San-
tos, & Juarez Lopez, 2005).
There are many examples in the field of studies that purport to document,
define, or teach the appropriate use of technology in the mathematics class-
room (i.e., Demana & Waits 1990, 1998; Drier, 2001; Kutzler, 2000; Niess,
2001, 2005; Salinger, 1994; Shoaf, 2000). With this existing body of
research, the AMTE (2009) adopted the Mathematics TPACK framework for
teacher educators. Specifically, the Mathematics TPACK framework consists
of four essential components: (1) Design and develop technology-enhanced
mathematics learning environments and experiences, (2) Facilitate math-
ematics instruction with technology as an integrated tool, (3) Assess and
evaluate technology-enriched mathematics teaching and learning, and (4)
Engage in ongoing professional development to enhance TPACK. Further,
Niess and her colleagues (2009) produced the TPACK development model
for teachers. They indicate that:
… many mathematics teachers’ PCK lacks a solid and consistent inte-
gration of modern digital technologies in mathematics curriculum and
instruction. Technologies, such as dynamic geometry tools or advanced
graphing calculators with computer algebra systems (CAS), are primarily
used for modeling and providing examples, where students imitate the
actions and use the technologies for verification, demonstration, and drill
and practice. In essence then, while digital technologies have evolved,
strategies for their effective integration into the learning of mathematics
have not evolved as rapidly (p. 6).
With the recent development and adoption of the CCSSM, along with
the advancement of educational technologies specific to mathematics, the
expectation to teach and learn mathematics with the appropriate use of
technology has increased dramatically, as can be seen in the standards for
mathematical practices (NGA & CCSSO, 2010). It has become increasingly
important for researchers to develop instruments for monitoring the profes-
sional growth of preservice mathematics teachers’ ability and self-efficacy to
use technology effectively and appropriately (or not) for teaching mathemat-
ics (i.e., TPACK). As mathematics teachers increase their self-awareness
and confidence in the use of technology in the classroom, many increase
their pedagogical toolbox that utilizes technology to enhance the teaching
and learning of mathematics (Lawless & Pellegrino, 2007). Since its incep-
tion in 2005, TPACK has been of great interest among those concerned with
preservice preparation programs, yet many of the large-scale studies have
been along the lines of Schmidt et al. (2009) and involve preservice elemen-
tary generalists, where preservice populations are large, rather than second-
ary specialists in mathematics with preservice populations that are generally
small at universities across the United States. What was not clear in the
literature was a systematic way to longitudinally study TPACK development on
a grand scale. The purpose of our study was to create an instrument for such
research.

Developing an Instrument to Measure TPACK for Preservice Secondary Mathematics Teachers

In their recent review of the TPACK literature, Young, Young, and Shaker
(2012) studied the instruments and methods of studying TPACK in pre-
service teachers. Their review revealed no specialized instrument for
secondary mathematics TPACK. The TPACK development model (Niess
et al., 2009) has been a framework for TPACK development in preservice
teachers, yet the research community lacks the instruments to measure and
monitor TPACK development. Some performance assessments exist for use
with preservice elementary (i.e., Angeli & Valanides, 2009) or inservice (i.e.,
Bos, 2011; Harris, Grandgenett, & Hofer, 2010) teachers, and these generally
take the form of surveys.
One benefit of using a survey format is that it measures self-efficacy.
The self-efficacy of preservice and inservice teachers influences whether or
not they can and will implement technology supported classroom lessons
(Bowers & Stephens, 2011; Dunham & Henessey, 2008; Kastberg & Leatham,
2005; Kendal & Stacey, 2001; Harris & Hofer, 2011; Niess, 2013; Tharp,
Fitzsimmons, & Ayers, 1997; Zbiek & Hollebrands, 2008). Özgün-Koca,
Meagher, and Edwards (2010) used the mathematics technology attitudes
survey (MTAS) as part of their secondary mathematics preservice prepara-
tion program. Hofer and Grandgenett (2012) conducted a one-year longi-
tudinal study on a cohort of alternative certification preservice teachers but
limited the study to a small sample size (<10) and less than one year of data
collection. Surveys have been designed and tested in large-scale studies on a
number of different populations, including K–12 online educators (Archam-
bault & Crippen, 2009; Archambault & Barnett, 2010) as well as preservice
teachers, including both primary and secondary in Singapore (Koh, Chai,
& Tsai, 2010) and the United States (Sahin, 2011) and U.S. elementary PSTs
(Lux, Bangert, & Whittier, 2011; Schmidt et al., 2009).
Although we can assume that Koh, Chai, and Tsai (2010) and Sahin
(2011) included secondary preservice mathematics teachers in their samples,
neither distinguished this group within the whole. There is an absence in the
literature with respect to a valid and reliable survey to monitor and assess
TPACK development in secondary mathematics PSTs. In a recent review of
the literature on measuring TPACK among PSTs, Abbitt (2011) conjectures
that the survey developed by Schmidt et al. (2009) “may also provide a solid
foundation for similar surveys that would be applicable to PSTs who will be
teaching other content areas and grade levels” (p. 292), a conjecture that is
echoed by Niess (2011), who adds that more work is needed to validate such
an instrument, including more large-scale studies.
Our current large-scale study aimed to establish a valid and reliable
instrument for monitoring and assessing TPACK development in secondary
mathematics PSTs, including the seven interrelated knowledge domains that
TPACK encompasses. In theory, a valid and reliable TPACK instrument for
this population would permit secondary mathematics teacher preparation
programs to longitudinally study program effectiveness at TPACK develop-
ment, fine-tune program components, conduct large-scale multi-institution-
al research studies, and make changes to maximize TPACK development in
the next generation of teachers of mathematics.

Method
Our goal was to develop an instrument that answers the need to monitor
the development of TPACK in preservice secondary mathematics teachers
throughout preparation programs. Adapting the TPACK survey for elemen-
tary PSTs (Schmidt et al., 2009), this project focused on developing and
validating a reliable content-specific survey for the population of interest:
preservice secondary mathematics teachers.

Instrument Development
We began by examining the reliable and nonreliable items from the Schmidt
et al. (2009) survey and deleting items that did not specifically pertain to the
teaching of secondary mathematics. Specifically, we deleted the items relat-
ing to social studies, literacy, and science, as the instrument was established
for generalists at the elementary level. Next, we wrote an additional 22 items
to fill gaps in the seven knowledge domain constructs (TK, PK, CK, PCK,
TPK, TCK, TPACK) by focusing on specific content areas in mathematics.
We wrote these items to adhere to item development guidelines (Fink, 2003;
Fink & Kosecoff, 1998; Fowler, 2008; Patten, 2001). Six researchers, includ-
ing two external to the project, with expertise in secondary mathematics
education reviewed all of the items to be included in the survey for content
validity (Lawshe, 1975), even though the Schmidt survey development team
had previously vetted many of the survey items. We then revised seven items
based on the feedback from the expert panel. Three items received minor
editing, such as moving a word to a different location in the item or using a
word with similar meaning that was more specific to mathematics. Six items
received major editing, such as completely restructuring the item wording.
After this revision, the final survey instrument contained 62 items consisting
of 8 TK items, 8 CK items, 8 PK items, 7 PCK items, 7 TPK items, 12 TCK
items, and 12 TPACK items. Each item used a 5-point Likert scale response
(SD = “strongly disagree,” D = “disagree,” N = “neither agree nor disagree,” A
= “agree,” SA = “strongly agree”).
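
For reference in the analyses that follow, the response coding and item allocation can be captured in a brief Python sketch (the 1–5 numeric assignment is an assumption; the article reports only the response labels):

```python
# Response coding and item allocation for the 62-item draft instrument.
# The 1-5 numeric assignment is an assumption; the article reports only labels.
LIKERT = {"SD": 1, "D": 2, "N": 3, "A": 4, "SA": 5}

ITEM_COUNTS = {"TK": 8, "CK": 8, "PK": 8, "PCK": 7,
               "TPK": 7, "TCK": 12, "TPACK": 12}
assert sum(ITEM_COUNTS.values()) == 62  # matches the draft survey length
```
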
For the purposes of this study, we designed a cover sheet to collect addi-
tional demographic data, including participants’ age, level in college, practi-
cum experience, gender, and ethnicity. We also designed an additional cover
sheet to gather information about the context in which data was collected,
including the name of the course for which the survey was administered,
date of survey, whether students completed a technology course, if technol-
ogy is used in mathematics content courses, and the grade-band of teacher
certification for secondary mathematics at each respective institution.

Participants
The research team sought to collect data from a variety of secondary math-
ematics teacher preparation programs across the United States in an effort
to maximize the diversity in our national sample. For this project, diversity
included size of institution, type of institution, size of secondary mathemat-
ics education program, demographics of student population, experience of
faculty teaching program courses, and geographic location. We contacted
24 secondary mathematics education faculty around the country to inquire
whether they would be willing to administer the survey to their PSTs. Half
(12) agreed to arrange administration of the survey, in addition to the three
institutions of the co-authors. The sample was a mix of convenience and
strategically chosen institutions to produce a diverse sample. To maximize
authentic responses and completion rates, paper surveys were administered
in classes rather than online out-of-class surveys (Adams & Gale, 1982;
Lefever, Dal, & Matthiasdottir, 2007; Norris & Conn, 2005). Faculty were
encouraged to arrange administering the survey at the start of class but were
permitted to administer the survey at their convenience. Faculty had gradu-
ate students or teaching assistants administer the survey in accordance with
IRB requirements. Due to these efforts, our survey completion rate
was over 90%, and, as we will mention later, the returned surveys seemed to
contain predominantly legitimate responses.
We collected 315 surveys completed by PSTs prior to student teaching
from 15 institutions across the United States (see Table 1) with class sizes
ranging from 4 to 26 students during the 2010–11 academic year. Survey

Table 1. Demographic Data of Total Responses


% by region % by institution type

Variable % MW NE SE W RI RT
Age
< 19 0.3 1.5 0.0 0.0 0.0 0.7 0.0
19–22 68.3 89.7 66.7 62.8 40.6 67.1 68.2
23–26 19.0 5.9 15.8 22.6 43.8 19.9 20.3
27–30 2.9 0.0 3.5 3.6 6.3 2.1 4.1
> 30 9.5 2.9 12.3 11.7 9.4 11.0 8.8
Class
Sophomore 8.6 36.8 3.5 0.0 0.0 6.8 12.2
Junior 17.8 16.2 26.3 16.8 0.0 17.1 16.2
Senior 50.8 41.2 40.4 61.3 50.0 43.8 58.8
Graduate 20.6 2.9 26.3 21.2 50.0 30.8 12.2
Gender
Male 32.4 39.7 24.6 32.1 43.8 32.2 25.0
Female 67.3 60.3 75.4 67.9 53.1 67.8 75.0
Practicum
Yes 58.4 46.0 44.0 70.0 59.0 69.9 47.3
No 41.6 54.0 56.0 30.0 41.0 30.1 52.7
Ethnicity
African Am 4.1 36.8 1.8 5.8 3.1 6.2 2.0
Am Indian 0.3 0.0 0.0 0.7 0.0 0.0 0.7
Asian 1.3 0.0 3.5 0.7 3.1 2.1 0.7
Hisp/Latino 3.5 1.5 1.8 0.7 25.0 0.7 6.1
Pac Islander 1.0 0.0 1.8 0.7 3.1 0.7 0.7
White 85.7 94.1 84.2 88.3 56.3 87.0 83.1
Other 2.2 1.5 5.3 0.7 3.1 0.7 6.1
Notes. Regional classifications (Midwest, Northeast, Southeast, West) are based on higher education consortia of states (e.g., Western
Interstate Commission for Higher Education). Institutional classification is based on Carnegie classification and labeled as either
RI = Research Intensive or RT = some research and/or teaching only. Some percentages may not sum to 100 due to non-responses
and/or rounding.

administrators at different sites reported that the survey took an average of
15–20 minutes for participants to complete after the instructions were read
and any questions were addressed. Upon receiving the completed surveys
from each institution, the project’s principal investigator worked with his
graduate assistant to begin coding the paper survey responses into spread-
sheet format. We created a simple spreadsheet for all 315 returned surveys,
which included the demographic responses in Table 1 as well as the re-
sponses to the 62-item survey. To guarantee accurate coding, an additional
graduate student randomly selected 25% of the surveys and rechecked the
original spreadsheet entries. Four entry cells were miscoded (an error rate
below 0.1%), so the graduate student rechecked the remaining 75% of the surveys,
found an additional eight cell-entry errors, and corrected them. The project
PI then randomly selected 10% of the surveys and found no entry errors.
This verification method of checking, correcting, and rechecking ensures
that the 19,530 cells in the spreadsheet had virtually a zero error rate
for entry before the analyses.
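
A minimal sketch of this double-entry verification, assuming the two coders' entries live in two identically laid-out spreadsheets (the file names are hypothetical):

```python
# Compare two independently keyed copies of the survey spreadsheet cell by cell.
# File names and layout are assumptions for illustration.
import pandas as pd

original = pd.read_csv("survey_entries_original.csv")  # first coder's entries
recheck = pd.read_csv("survey_entries_recheck.csv")    # second coder's re-entry

mismatches = original.ne(recheck)      # True wherever the two entries disagree
n_errors = int(mismatches.values.sum())
print(f"{n_errors} of {mismatches.size} cells disagree "
      f"({n_errors / mismatches.size:.3%})")

# Rows containing at least one disagreement, for manual re-inspection
# against the paper surveys:
print(original[mismatches.any(axis=1)])
```
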
The final sample reported in Table 1 reflects 49.6% of responses from sec-
ondary mathematics PSTs at eight research-intensive institutions, with the
remainder coming from seven institutions with some research activ-
ity and/or teaching only. The national sample consisted of responses from
institutions in the Midwest (23.1%), Northeast (19.4%), Southeast (46.6%),
and West (10.9%) regions of the United States. Compared with 2010 U.S.
census populations, our sample overrepresented the Southeast by approximately
10% and underrepresented the West by about 10%. The Midwest and Northeast
are within 2% of their regions’ census shares.

Determining the Final Sample for Analysis


As the participants were asked to complete the survey in class, they may
have felt compelled or obligated to complete the survey without genuinely
engaging with the items. To account for this possible lack of engagement,
we used four criteria for removing such cases from the sample. First, the
research team determined that
students who gave nearly the same response for all items on the survey dem-
onstrated a lack of engagement, so we examined all 315 completed surveys
for less than 10% variance in the responses. Four surveys met this first filter
criterion, with respective total response variations of 1.6%, 3.2%,
4.9%, and 6.4% (the next closest variation was 10.8%). We deleted these
four surveys. Second, if we found zero variance on any one of the three pages
of survey items (approximately 20 items), we removed those surveys. We
removed 14 remaining surveys that met this criterion, 4 of which had more
than one page with zero variance, indicating nonengagement in reading the
items and distinguishing any variation on roughly 20 items. Third, we per-
formed a manual inspection of surveys to look for patterning to determine if
students appeared to fill in their responses in unlikely ways. We determined
patterning on two surveys not already deleted. One survey had responses
that were 1, 2, 3, 4, 5, 4, 3, 2, 1, etc., item to item, and the other alternated
between two responses for every item. Fourth, we applied the criterion of
completeness (all 62 items answered) and removed one incomplete survey
from the sample. In all, we removed 21 surveys (6.67% of the sample) from
the 315 surveys completed, resulting in a sample of 294 surveys for the main
analysis on 62 items (18,228 cells).
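
A sketch of these four filters, under assumptions about how the responses are stored (a DataFrame `df` of 1–5 codes, one row per survey) and about the exact variation statistic, which the article does not spell out:

```python
# Engagement filters applied to the 315 completed surveys.
import numpy as np
import pandas as pd

PAGES = [slice(0, 21), slice(21, 42), slice(42, 62)]  # assumed page boundaries

def engaged(row: pd.Series) -> bool:
    items = row.to_numpy(dtype=float)
    if np.isnan(items).any():                 # filter 4: incomplete survey
        return False
    # Filter 1: near-constant responding. "Variance" is interpreted here as
    # the share of adjacent responses that differ -- an assumption, since the
    # article does not give its exact formula.
    if np.mean(items[1:] != items[:-1]) < 0.10:
        return False
    # Filter 2: zero variance on any single page of roughly 20 items.
    if any(np.ptp(items[p]) == 0 for p in PAGES):
        return False
    # Filter 3 (patterned responding, e.g., 1,2,3,4,5,4,3,2,1,...) was caught
    # by manual inspection in the study and is not automated here.
    return True

# kept = df[df.apply(engaged, axis=1)]  # 294 surveys remained for analysis
```
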

Data Analysis
To determine the structure of the TPACK instrument, we randomly split the
data into two equal groups based on the overall TPACK summation score of
Likert scale items so that the distribution between the two groups was the
same. We used an exploratory factor analysis (EFA) with the first group of
147 student surveys to determine which items should be retained for the fi-
nal instrument. We used a confirmatory factor analysis on the second group
of 147 student surveys to verify the structure obtained through the explor-
atory factor analysis. This split-sample approach strengthens the validity of the
method and results (Fink, 2003; Litwin, 1995; Thompson, 2004). To
determine the internal reliability of each of the subscales, we used a graded
response model (Samejima, 1996) with the full sample of 294 surveys to
analyze the instrument in addition to the traditional Cronbach alpha for
reliability measures.
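
One common way to implement such a distribution-matched random split (an assumption; the article does not specify its algorithm) is to sort by the total score and randomly assign one member of each adjacent pair to each half:

```python
# Split the filtered sample into two halves (EFA and CFA groups) whose total
# TPACK summation scores have matching distributions. The column name
# "total_score" is assumed for illustration.
import numpy as np

rng = np.random.default_rng(2011)  # arbitrary seed for reproducibility

def matched_split(df, score_col="total_score"):
    ordered = df.sort_values(score_col).reset_index(drop=True)
    group_a, group_b = [], []
    for pair in range(len(ordered) // 2):
        i, j = 2 * pair, 2 * pair + 1
        if rng.integers(2):        # coin flip decides which half gets which
            group_a.append(i); group_b.append(j)
        else:
            group_a.append(j); group_b.append(i)
    return ordered.iloc[group_a], ordered.iloc[group_b]

# efa_half, cfa_half = matched_split(surveys)  # 147 surveys each
```
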

Results

Exploratory Factor Analysis (EFA)


Based on the theoretical construct that TPACK includes TK, CK, PK, PCK,
TCK, and TPK, our initial analysis explored the possibility of seven fac-
tors, one for each trait. However, the initial EFA on the data showed distinct
factors only for TK (Factor 2), CK (Factor 5), PK (Factor 3), and TPACK
(Factor 1) (see Table 2, p. 184). The items written to measure PCK, TCK,
and TPK loaded on various factors with no obvious pattern, with no more than
three items loading on any single factor. Therefore, we decided to remove the
PCK, TCK, and TPK items from the remainder of the analysis and to leave
the measurement of these blended two-domain latent traits for future studies
with a new item development process.
After the removal of the PCK, TCK, and TPK items from the analysis, we
completed an exploratory factor analysis on the remaining items with four
factors. The four-factor analysis matches the data well, with TK loading on
Factor 2, CK loading on Factor 3, PK loading on Factor 4, and TPACK load-
ing on Factor 1 (see Table 3, p. 186).
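
A sketch of this exploratory step using the factor_analyzer package (an assumption; the article does not name its software, and `efa_half_items` stands in for the first half of the sample):

```python
# Varimax-rotated EFA on the first half of the sample (147 surveys),
# restricted to the TK, CK, PK, and TPACK items after the PCK/TCK/TPK
# items were dropped.
from factor_analyzer import FactorAnalyzer

fa = FactorAnalyzer(n_factors=4, rotation="varimax")
fa.fit(efa_half_items)   # rows: surveys; columns: remaining Likert items
print(fa.loadings_)      # inspect for loadings >= 0.500 on the expected factor
```
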

Confirmatory Factor Analysis (CFA) and Internal Reliability Measures


To finalize the instrument, we performed a CFA using a four-factor structure
and included all items with a factor loading greater than or equal to 0.500.
During the analysis, we paid particular attention to a number of fit indices and
statistics. Thompson (2004) indicates that the CFA fit indices and statistics
of particular interest for our instrument type are the chi-squared
statistical significance test, non-normed fit index (NNFI), comparative fit in-
dex (CFI), and root-mean-square error of approximation (RMSEA). Further,
Thompson indicates that all four should be carefully examined and considered
to determine a final model. The non-normed and comparative fit statistics
should be close to 0.95 or greater to indicate an excellent model fit (0.90–0.95
is acceptable to good). The RMSEA estimates how well the model parameters
will perform at reproducing the population covariances. The RMSEA should


Table 2. Factor Loadings for EFA with Varimax Rotation of Seven Factors
Item Factor 1 Factor 2 Factor 3 Factor 4 Factor 5 Factor 6 Factor 7
TK01 -0.175 0.738 -0.226
TK02 -0.274 0.837
TK03 0.860 -0.187 0.141
TK04 0.845 -0.121
TK05 0.695
TK06 0.739 -0.134 0.169 0.103
TK07 0.228 0.513 -0.102 0.157
TK08 0.120 -0.421 0.203 -0.113
CK09 -0.196 -0.102 0.732
CK10 0.103 0.899 -0.276
CK11 0.778 -0.265
CK12 0.503 0.170
CK13 -0.159 0.564 0.199
CK14 0.128 0.491 0.345
CK15 -0.183 -0.127 0.257 0.640
CK16 -0.154 0.119 0.451 0.203
PK17 0.627 0.130 -0.101
PK18 -0.120 0.828
PK19 -0.168 0.147 0.907 -0.179
PK20 -0.149 0.766
PK21 0.252 0.613 -0.165 -0.125
PK22 0.443 0.262
PK23 0.436 -0.144 0.191
PK24 0.630 0.132 0.110
PCK25 0.386 0.146 0.233
PCK26 0.168 0.659
PCK27 -0.180 0.110 0.751
PCK28 0.342 0.235 0.209 -0.167
PCK29 0.193 0.129 0.189 0.151 0.109
PCK30 -0.151 0.180 0.591
PCK31 -0.161 -0.173 0.130 0.501 0.144 0.291 0.152
TCK32 0.318 -0.100 0.219 -0.162 0.441
TCK33 -0.145 0.536 -0.194
TCK34 0.494 -0.107 0.375 -0.123 -0.323
TCK35 0.608 0.124 -0.259 0.102 -0.390
TCK36 0.872 -0.116 -0.128
TCK37 -0.111 0.754
TCK38 0.349 0.148 -0.171 -0.213
TPK39 0.733 -0.111 -0.121 -0.206
TPK40 0.713 -0.127 0.146 0.150 -0.128 -0.172
TPK41 0.488 0.131 0.192 -0.195
TPK42 0.485 0.198 0.106 0.109 -0.303
TPK43 0.583 0.156 -0.236
TPK44 -0.134 0.256
TPK45 0.207 0.542 0.113
TPK46 0.143 0.188 -0.112 0.120 0.322
TPK47 0.457 0.146 0.129 0.159 -0.172
TPK48 0.124 -0.187
TPK49 0.254 -0.118 0.262 -0.105
TPK50 0.147 0.159 -0.112 -0.254
TPACK51 0.759 -0.115 0.149 0.122
TPACK52 0.987 -0.150 -0.197 -0.156
TPACK53 1.009 -0.141 -0.120 -0.147 -0.101
TPACK54 0.495 0.255 -0.119
TPACK55 0.717 0.149 0.219
TPACK56 0.505 0.204 0.354
TPACK57 0.321 0.157 0.184 -0.193 0.534 0.335
TPACK58 0.210 -0.131 0.748 0.269
TPACK59 0.758 0.249
TPACK60 0.623 -0.185 0.119 0.219 0.259
TPACK61 0.119 0.913 0.491
TPACK62 -0.121 0.802 0.390
Notes. Factor loadings ≥ 0.500 are in boldface. Test of the hypothesis is that seven factors are sufficient. Chi-square statistic is
2375.54 on 1,478 degrees of freedom, p<0.001.

be zero if we can produce the exact population covariance. Values approxi-
mately 0.06 or less are considered very reasonable for a model fit. Once we
determined the finalized instrument through a CFA, we checked internal
reliability for measuring the internal consistency on each unidimensional
construct. The statistics for internal reliability are the Cronbach’s alpha, stan-
dard error of measurement, and number of respondents within one and two
standard deviations (Cronbach & Shavelson, 2004; Green, Lissitz, & Mulaik,
1977; Schmitt, 1996). Cronbach’s alpha is used to indicate the degree to
which a set of items measures the unidimensional construct and is acceptable
when greater than 0.70, good when greater than 0.80, and excellent when
greater than 0.90. Reporting the standard errors and respondents within one
and two standard deviations demonstrates a normalized set of responses
from the population and determines if there is a strong skew in the respons-
es for each construct.
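
The reliability statistics named above are straightforward to compute; the following is a minimal sketch, using the classical standard error of measurement, SEM = SD * sqrt(1 - alpha), as a simpler stand-in for the graded-response-model error curves the study actually reports:

```python
# Cronbach's alpha and classical standard error of measurement for one
# subscale (respondents x items matrix of Likert codes).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

def standard_error_of_measurement(items: np.ndarray) -> float:
    alpha = cronbach_alpha(items)
    return items.sum(axis=1).std(ddof=1) * np.sqrt(1 - alpha)
```
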
Confirmatory factor analysis (CFA). We conducted the CFA using the second
group of 147 students. The initial analysis raised some minor questions of
fit to explore and refine. Using an iterative process of removing items with


Table 3. Factor Loadings for EFA with Varimax Rotation of Four Factors.
Item Factor 1 Factor 2 Factor 3 Factor 4
TK01 -0.163 0.706
TK02 -0.248 0.848 0.102
TK03 0.843 -0.174
TK04 0.828 -0.114
TK05 0.115 0.669
TK06 0.713
TK07 0.212 0.462
TK08 0.120 -0.405 0.161
CK09 -0.197 0.755
CK10 0.725
CK11 -0.113 0.655
CK12 0.536 0.108
CK13 0.636
CK14 0.118 0.662
CK15 -0.113 -0.118 0.794
CK16 -0.166 0.632
PK17 0.667
PK18 0.801
PK19 -0.118 0.158 0.822
PK20 -0.123 0.758
PK21 0.195 -0.176 0.656
PK22 0.172 0.166 0.479
PK23 0.479
PK24 0.104 0.571
TPACK51 0.864 -0.103 -0.120
TPACK52 0.857 -0.151
TPACK53 0.923 -0.143
TPACK54 0.516 0.233 -0.154
TPACK55 0.798
TPACK56 0.239 0.490
TPACK57 0.570 0.154
TPACK58 0.386 0.236
TPACK59 0.829
TPACK60 0.750 0.133 -0.147
TPACK61 0.507 0.216 -0.141
TPACK62 0.281 0.100 0.305
Notes. Factor loadings ≥ 0.500 are in boldface. Test of the hypothesis is that four factors are sufficient. Chi-square statistic is
861.66 on 492 degrees of freedom, p<0.001.

small loadings onto the desired factor and/or loadings onto undesired fac-
tors, we established a final instrument with 22 items (6 TK, 5 CK, 5 PK, 6
TPACK). The iterative process included determining items that loaded onto
multiple factors. For example, we deleted TPACK62 due to cross-loading
almost equally on two constructs. Items with less than a 0.5 factor loading
were considered for deletion. Remaining items with a factor loading over
0.5 were examined for high correlations with other items. The remaining
items deleted were due to high correlations with other items. For example,
TPACK54 had a high correlation (>0.8) with TPACK51. We determined that
TPACK51 was a much stronger and more meaningful item than TPACK54.
Further, some of the indices indicated that the fit would be greatly improved
by removing CK10 and allowing TPACK59 and TPACK60 to interact. This
is natural, as the geometry and algebra items for TPACK should have some
correlation for secondary mathematics PSTs because much of the content
is interconnected (NCTM, 2000). The CFA for four factors is presented in
Table 4 (p. 188).
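
For illustration (the article does not name its CFA software, and `cfa_half_items` stands in for the second half of the sample), the final four-factor model, including the TPACK59–TPACK60 error covariance and the removal of CK10, could be specified in semopy's lavaan-style syntax:

```python
# Final four-factor CFA with correlated TPACK59/TPACK60 error terms,
# fit on the second half of the sample. Variable names follow the item
# labels used throughout the article.
import semopy

MODEL_DESC = """
TK    =~ TK01 + TK02 + TK03 + TK04 + TK05 + TK06
CK    =~ CK09 + CK11 + CK12 + CK13 + CK14
PK    =~ PK17 + PK18 + PK19 + PK20 + PK21
TPACK =~ TPACK51 + TPACK52 + TPACK53 + TPACK55 + TPACK59 + TPACK60
TPACK59 ~~ TPACK60
"""

model = semopy.Model(MODEL_DESC)
model.fit(cfa_half_items)        # DataFrame with one column per item
print(semopy.calc_stats(model))  # chi-square, CFI, TLI (NNFI), RMSEA, etc.
```
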
Further, as the analysis of the instrument includes using a graded re-
sponse model to estimate the standard error curve, each of the subscales
must be confirmed to be unidimensional. To verify the unidimensionality,
we analyzed each subscale (TK, CK, PK, TPACK) using all 294 students.
Internal reliability. The Technology Knowledge (TK) subscale had a
Cronbach’s alpha of 0.8899 with the graded response model (see Figure 2,
p. 189) giving a standard error of measurement below 0.35 for most of the
respondents. For the TK subscale, 74% of the respondents were within one
standard deviation of the mean, and 96% were within two standard devia-
tions of the mean.
The Content Knowledge (CK) subscale had a Cronbach’s alpha of 0.8554 with
the graded response model (see Figure 3, p. 189) giving a standard error of mea-
surement below 0.53 for participants within one standard deviation of the mean.
For the CK subscale, 73% of the respondents were within one standard deviation
of the mean, and 99% were within two standard deviations of the mean.
The Pedagogical Knowledge (PK) subscale had a Cronbach’s alpha of
0.8768 with a graded response model (see Figure 4, p. 190), giving a stan-
dard error of measurement below 0.5 for all of the participants. For the PK
sub-scale, 72% of the respondents were within one standard deviation of the
mean and 93% were within two standard deviations of the mean.
The TPACK subscale had a Cronbach’s alpha of 0.8966 with a graded re-
sponse model (see Figure 5, p. 190) giving a standard error of measurement
below 0.5 for almost all of the participants. For the TPACK subscale, 71% of
the respondents were within one standard deviation of the mean, and 94%
were within two standard deviations of the mean.
As Cronbach’s alpha is above 0.85 for each of the four subscales,
the internal reliability for measuring differences at the group level is
very good to excellent, and therefore the subscales meet the desired


Table 4. Factor Loadings for CFA with Varimax Rotation of Four Factors after Iterative Process for Item Removal
Item TPACK TK CK PK
TK01 0.663
TK02 -0.179 0.801
TK03 0.853 -0.161
TK04 0.831 -0.132
TK05 0.133 0.665
TK06 0.667
CK09 -0.228 0.774
CK10 0.826
CK11 0.754
CK12 0.579 0.102
CK13 0.593
CK14 0.108 0.555
PK17 0.142 0.625
PK18 0.821
PK19 -0.109 0.133 0.827
PK20 -0.121 0.730
PK21 0.208 0.608
TPACK51 0.801 -0.122
TPACK52 0.880 -0.103
TPACK53 0.929
TPACK55 0.710 0.111
TPACK59 0.750
TPACK60 0.678 0.139 0.147 -0.152

Factor 1 Factor 2 Factor 3 Factor 4
SS loadings 3.949 3.541 2.963 2.738
Proportion Variance 0.172 0.154 0.129 0.119
Cumulative Variance 0.172 0.326 0.454 0.573
Factor Correlations
Factor 1 -0.333 0.438 0.347
Factor 2 -0.187 -0.348
Factor 3 0.348

Notes. Factor loadings ≥ 0.500 are in boldface. Test of the hypothesis is that four factors are sufficient. Chi-square statistic is
272.38 on 167 degrees of freedom, p<0.001.

requirements for survey research instruments (DeVellis, 2011). The iterative
item removal process resulted in a model with good to excellent fit indices.
The model fit index statistics are presented in Table 5 (p. 191), and the over-
all model is presented as a diagram (see Figure 6, p. 192).
The chi-square statistic for the PK subscale is worth further discussion.
Thompson (2004) indicates this statistic is important but not the end-all
statistic. However, all the fit indices need serious consideration as well.

Figure 2. TK subscale standard error and participant distribution.

Figure 3. CK subscale standard error and participant distribution.

Figure 4. PK subscale standard error and participant distribution.

Figure 5. TPACK subscale standard error and participant distribution.

Table 5. Confirmatory Factor Analysis Statistical Fit Indices for Final Model and Subscales
Description Chi-square df p NNFI CFI RMSEA
Four-factor model 374.40 202 0.00000 0.940 0.947 0.052
TK subscale 37.67 9 0.00002 0.947 0.968 0.104
CK subscale 18.82 5 0.00207 0.945 0.972 0.097
PK subscale 8.60 5 0.12607 0.990 0.995 0.049
TPACK subscale 44.55 8 0.00000 0.932 0.964 0.125
Notes. Statistics are shown for the four-factor model and for the unidimensionality of each subscale.

Our analysis leads us to conclude that the PK subscale is likely valid because
all three fit indices are the strongest compared to the TK, CK, and TPACK
subscales. The NNFI, CFI, and RMSEA are all beyond excellent with respect
to acceptable ranges for these fit indices for the PK subscale. Table 6 (p. 193)
presents the final model with parameter estimates for the finalized items on
the TPACK instrument for the TK, CK, PK, and TPACK constructs. The
data presented includes the parameter estimates, standard errors, z-values,
and model error term (e#) on each item.

Discussion

Key Findings and Extending the TPACK Literature


The methodology and data analysis provide a consistent protocol for de-
veloping and testing an instrument that can be used for examining TPACK
knowledge in secondary PSTs. In doing so, we report three major findings.
First and foremost, the results from this research yield a sound instru-
ment for monitoring and measuring secondary mathematics PSTs’ self-
assessment of TK, PK, CK, and TPACK. This is the first survey instrument
designed specifically for secondary mathematics PSTs. Unlike prior studies
(e.g. Archambault & Crippen, 2009; Schmidt et al., 2009) that focused on
teachers of multiple disciplines, our study was unique because it focused
specifically on the next generation of teachers of mathematics.
Second, our findings revealed that PCK, TCK, and TPK items did not
produce a measurable factor for each knowledge domain. This finding
brings up an issue with trying to develop an instrument that can identify
knowledge domains that overlap between two categories for secondary
mathematics teachers. This result adds a finding to the literature that differs
from Schmidt et al. (2009), whose instrument yielded a reliable seven-factor
structure with elementary PSTs. TPK, PCK, and TCK remain
elusive to quantify and measure well through the survey items we tested. In
fact, TPK, PCK, and TCK self-reporting may be more difficult for preservice
mathematics teachers to grasp during preservice preparation. TPK, PCK,
and TCK require PSTs to mentally remove one knowledge construct, which
is abstract and likely difficult for PSTs to do when responding to these items.
This could stem from the compartmentalized nature of many programs that


Figure 6. Confirmatory factor analysis model diagram for final four-factor instrument.


Table 6. Final Survey Instrument Model Statistics for Four-Factor Model


Item    Parameter Coefficient    Standard Error (SE)    Error Term (e#)    Error Term SE
TK01 0.629 0.049 0.472 0.042
TK02 0.638 0.040 0.251 0.026
TK03 0.744 0.046 0.316 0.033
TK04 0.765 0.049 0.395 0.039
TK05 0.679 0.049 0.442 0.041
TK06 0.575 0.039 0.273 0.026
CK09 0.312 0.029 0.174 0.016
CK11 0.411 0.030 0.145 0.016
CK12 0.451 0.032 0.161 0.018
CK13 0.360 0.031 0.184 0.018
CK14 0.524 0.044 0.364 0.035
PK17 0.488 0.034 0.205 0.020
PK18 0.500 0.032 0.156 0.017
PK19 0.499 0.034 0.195 0.019
PK20 0.505 0.032 0.163 0.017
PK21 0.461 0.034 0.220 0.021
TPACK51 0.465 0.034 0.230 0.021
TPACK52 0.568 0.032 0.126 0.015
TPACK53 0.557 0.033 0.156 0.017
TPACK55 0.553 0.036 0.214 0.022
TPACK59 0.531 0.038 0.260 0.024
TPACK60 0.515 0.041 0.342 0.031
Notes. Z-values for all 22 items are >8.97 with p<0.0001. The error term (SE) value for the interaction between TPACK59 and
TPACK60 is 0.113 (0.021).

divide pedagogy, technology, and content knowledge into separate courses.
However, this may be supported by the reliability of the TK, PK, and CK
items. Additional reasons may include the fact that specialized secondary
education content areas are so content focused that PSTs may have difficulty
removing one of the three knowledge domains in their own minds.
Lastly, the correlations between the four factors (see Figure 6) revealed that
TK and PK have a very low correlation (0.020), whereas the others all have
moderate correlations (≥0.348), so we examined these factor correlations
further. We interpret these findings to imply that TPACK is correlated
moderately, and nearly equally, with TK, PK, and CK, indicating that TPACK
development may be influenced by each factor to about the same degree.
Similarly, CK’s correlations with PK and TK are roughly equal, implying
that CK may be influenced by both to a similar extent.
However, the low TK–PK relationship may indicate that preservice secondary
mathematics teachers (PSMTs) who have made up their minds about pedagogical
strategies (PK) do not necessarily perceive themselves as having strong (or
weak) TK. Likewise, if PSMTs have strong TK, it does not imply that they
perceive themselves as having strong pedagogical beliefs. The key takeaway
from this sense-making is that CK’s solid correlation with PK and TK may
simply imply that higher CK is likely when PSMTs have higher TK and PK,
not that a PSMT with lower TK or PK will have lower content knowledge. It could be
possible that PSMTs with more education methods courses may have stronger
CK, as those courses focus more on K–12 content than do mathematics
department upper-division content courses (e.g., analysis, abstract
algebra). PSMTs who have completed upper-division mathematics courses
may perceive themselves as having lower CK because their strong self-efficacy
in mathematics may be challenged in theoretical-based courses.
With these results, our goal of developing a valid and reliable instrument
to monitor and assess preservice mathematics teachers’ TPACK develop-
ment within teacher education programs was realized in the form of a four-factor
instrument. Although no instrument is perfect, our instrument provides a ro-
bust survey aimed at one specific population that traditionally is very small
at respective institutions of higher education, making it nearly impossible to
quantify TPACK components at scale through research at one or even a few
institutions. This adds to the TPACK literature with a large-scale quantita-
tive study of a historically small population, institution to institution, in the
United States. Our instrument builds on the work of Mishra and Koehler
(2005a, 2005b, 2006) through the adapting of the Schmidt et al. work (2009).
We extend the TPACK literature domain with this instrument as compared
to previous study findings for different populations.
The main goal of this research project was to develop a valid and reliable
instrument for the population of secondary mathematics PSTs to help math-
ematics teacher educators (MTEs) monitor and assess their own programs’
ability to develop TPACK in their PSTs. The data from this research resulted
in survey items to reliably monitor and assess the constructs of TK, PK, CK,
and TPACK so that historically small populations of students at institutions can
use a valid and reliable instrument to examine their program effectiveness.
This instrument can be used to better inform a single program course and
can provide overall information for an entire program. If the instrument
is applied as a pre–post survey for a program, the information does not
provide detailed information regarding course specific material. A limitation
arises with programs where secondary mathematics PSMTs’ content knowl-
edge is targeted by a mathematics department and their pedagogical knowl-
edge is targeted by an education department, and there is a lack of commu-
nication between both departments. This is not uncommon in secondary
mathematics teacher preparation programs across the United States.
This study was not specifically about what students know, but rather about
assessing what they think they know. PSMTs’ personal beliefs
change positively and negatively over time during preservice prepara-
tion. This instrument gives researchers and educators the ability to reliably

194 | Journal of Research on Technology in Education | Volume 46 Number 2


Copyright © 2013, ISTE (International Society for Technology in Education), 800.336.5191
(U.S. & Canada) or 541.302.3777 (Int’l), iste@iste.org, iste.org. All rights reserved.
Developing and Validating a Reliable TPACK Instrument

measure these beliefs regarding TPACK and the contributing factors of PK,
TK, and CK during preservice preparation programs. PSMTs can easily be
overconfident or lack confidence. Our experience shows that PSMTs exposed
to new ways of thinking during coursework can improve or diminish their
sense of self-efficacy. Learning can go in both
directions, so our study provides an opportunity for programs and educators
to understand their learning environment to improve TPACK development
in PSMTs.

Limitations
A limitation worth noting is the potential for variation in the administration
of the surveys. Specific instructions were given to each site, and we must rely
on the assumption that the professionals followed those guidelines and
administered the surveys in a similar fashion at each institution.
More significantly, we were unable to isolate the PCK, TCK, and TPK knowledge
domains specifically in the population of secondary mathematics PSTs. We
acknowledge that this is limiting in terms of the use of our survey, yet we
see unearthing such a limitation as a key finding in expanding the TPACK
literature domain.

Implications for Practice, Directions for Future Research


It is imperative that we develop TPACK in our preservice mathematics
teachers in order to fully enact the vision for teaching set forth by AMTE,
CCSSM, and NCTM. The finalized instrument presented in Table 6 (p. 193)
represents a valid, reliable, and manageable 22-item survey (or 16 items
without TPACK items) with which secondary mathematics teacher educa-
tion programs can monitor and assess PSMTs’ perceptions of their devel-
opment of TPACK. Further, for the successful integration of technology
into the teaching of secondary mathematics, preservice teachers need to be
comfortable with their own perceived abilities to do so during their prepara-
tion programs. Our instrument will give researchers and MTEs the ability to
conduct TPACK research in preservice courses and programs using a variety
of methods.
First, using the instrument in a pre–post format can detect changes over a semester-long course or during the student teaching internship. We see the instrument, for example, aiding in discovering which cooperating teachers are effective at helping PSMTs grow their TPACK during student teaching. The instrument can also be used longitudinally over two years, providing insights into when, and through which aspects of preparation programs, PSMTs’ perceptions of TPACK knowledge grow. We also see this instrument as useful, accompanied by an observational instrument (e.g., Hofer et al., 2011), in developing TPACK practices in the classroom; this research remains to be done. Additionally, there may be potential in using the instrument to help PSTs recognize their own
professional growth and development. Helping PSTs overcome their preconceived beliefs is difficult, yet this instrument can aid in the development and assessment of those beliefs early in a preparation program.
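
As one concrete illustration of the pre–post format, a paired comparison of construct means before and after a course could look like the sketch below. This is hypothetical and not the analysis performed in this study; the column names are invented, and it assumes one matched record per PSMT with construct means on the 1–5 scale.

import pandas as pd
from scipy import stats

# Hypothetical matched records: one row per PSMT.
df = pd.DataFrame({
    "tpack_pre":  [2.8, 3.1, 3.5, 2.6, 3.0],
    "tpack_post": [3.4, 3.6, 3.9, 3.2, 3.1],
})

# Paired t-test on the matched pre/post scores, plus the average gain.
t_stat, p_value = stats.ttest_rel(df["tpack_post"], df["tpack_pre"])
mean_gain = (df["tpack_post"] - df["tpack_pre"]).mean()
print(f"mean gain = {mean_gain:.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")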
In the future, mixed-methods research designs would be well suited to examining TPACK in PSTs, using this survey along with classroom observational techniques, including interviews and video analyses. Moreover, combining our instrument with observations during student teaching may provide a rich data set of both quantitative and qualitative data, allowing for strong, valid, and more generalizable research findings. Further, it would be useful to begin to index specific curricular events (e.g., instructional strategies, assignments, goals) or clinical experiences against growth in TPACK and improvement in the technological self-efficacy of PSTs. This can be accomplished by using the instrument as an element of large-scale comparative studies.
There is a great deal of work to be done with respect to TPACK develop-
ment in the next generation of secondary mathematics preservice teachers.
We hope our instrument will aid in some of that work.

Acknowledgments
The authors wish to thank individuals who helped us by administering or arranging the admin-
istration of our survey to the secondary mathematics preservice teachers at their respective insti-
tutions. Thank you, Ginny Bohme, Daniel Brahier, Anna Marie Conner, Kelly Edenfield, Carla
Gerberry, Shannon Guerrero, Gary Martin, Ginger Rhodes, Wendy Sanchez, Tommy Smith, Toni
Smith, Jami Stone, Anthony Thompson, and Jan Yow. We also wish to express our gratitude to the preservice secondary mathematics teachers who engaged with and diligently completed the surveys at each of the respective institutions of higher education. Finally, we wish to thank Barbara and Robert
Reys for their visionary mission to link early career mathematics teacher educators through the
STaR program to advance the future of the mathematics education field. Without their hard work
and the mentoring professionals of the STaR program, the data collection for this project may
have taken years and the professional relationships might never have been established.

Author Notes
Jeremy Zelkowski, PhD, is an assistant professor of secondary mathematics education in the
Department of Curriculum and Instruction in the College of Education at the University of
Alabama, Tuscaloosa. His research interests focus on TPACK development in preservice teachers,
appropriate use of technology in teaching mathematics, mathematical college readiness, and pro-
fessional development learning communities. Please address correspondence regarding this article
to Dr. Jeremy Zelkowski, Department of Curriculum and Instruction, University of Alabama, 902
University Blvd, Tuscaloosa, AL 35487-0232. Email: jzelkowski@bamaed.ua.edu

Jim Gleason, PhD, is an associate professor in the Department of Mathematics in the College of Arts
& Sciences at the University of Alabama, Tuscaloosa. His research interests focus on teacher content
knowledge, assessment, and psychometrics pertaining to mathematics teaching and learning.

Dana C. Cox, PhD, is an assistant professor in the Department of Mathematics in the College of
Arts & Sciences at Miami University, Oxford, Ohio. Her research interests focus on curriculum
design, TPACK development in preservice teachers, and studying the emergent professional vision
of teaching with technology held by preservice teachers.

Stephen Bismarck, PhD, is an assistant professor of middle/secondary mathematics education in the School of Education at the University of South Carolina—Upstate in Spartanburg. His research interests focus on appropriate use of technology, developing mathematical knowledge for teaching with preservice teachers, and exploring best practices for building conceptual knowledge.

References
Abbitt, J. T. (2011). Measuring technological pedagogical content knowledge in preservice
teacher education: A review of current methods and instruments. Journal of Research on
Technology in Education, 43(4), 281–300.
Adams, L. L. M., & Gale, D. (1982). Solving the quandary between questionnaire length and
response rate in education research. Research in Higher Education, 17(3), 231–240.
Angeli, C., & Valanides, N. (2009). Epistemological and methodological issues for the
conceptualization, development, and assessment of ICT-TPCK: Advances in technological
pedagogical content knowledge (TPCK). Computers & Education, 52(1), 154–168.
Archambault, L. M., & Barnett, J. H. (2010). Revisiting technological pedagogical content
knowledge: Exploring the TPACK framework. Computers & Education, 55(4), 1656–1662.
Archambault, L., & Crippen, K. (2009). Examining TPACK among K–12 online distance
educators in the United States. Contemporary Issues in Technology and Teacher Education
(CITE Journal), 9(1), 71–88.
Association of Mathematics Teacher Educators (AMTE). (2006). Preparing teachers to use
technology to enhance the learning of mathematics. Retrieved August 1, 2008, from http://www.amte.net/resources
Association of Mathematics Teacher Educators (AMTE). (2009). Mathematics TPACK
framework. Retrieved October 13, 2009, from http://www.amte.net/resources
Ball, D. L. (1990). Prospective elementary and secondary teachers’ understanding of division.
Journal for Research in Mathematics Education, 21(2), 132–144.
Ball, D. L., Hill, H. C., & Bass, H. (2002). Developing measures of mathematics knowledge for
teaching. Ann Arbor, MI: Study of Instructional Improvement.
Ball, D. L., Lubienski, S., & Mewborn, D. (2001). Research on teaching mathematics: The
unsolved problem of teachers’ mathematical knowledge. In V. Richardson (Ed.), Handbook
of research on teaching (4th ed.). New York: Macmillan.
Barkatsas, A., Kasimatis, K., & Gialamas, V. (2009). Learning secondary mathematics
with technology: Exploring the complex interrelationship between students’ attitudes,
engagement, gender and achievement. Computers & Education, 52(3), 562–570.
Beaudin, M., & Bowers, D. (1997). Logistics for facilitating CAS instruction. In J. Berry,
J. Monaghan, M. Kronfellner, & B. Kutzler (Eds.), The state of computer algebra in
mathematics education (pp. 126–135). Lancashire, England: Chartwell-York.
Bos, B. (2007). The effect of the Texas Instrument interactive instructional environment on the
mathematical achievement of eleventh grade low achieving students. Journal of Educational
Computing Research, 37(4), 351–368.
Bos, B. (2011). Professional development for elementary teachers using TPACK. Contemporary
Issues in Technology and Teacher Education, 11(2), 167–183.
Bowers, J., & Stephens, B. (2011). Using technology to explore mathematical relationships:
A framework for orienting mathematics courses for prospective teachers. Journal of
Mathematics Teacher Education, 14(4), 285–304. doi:10.1007/s10857-011-9168-x
Brown, C. A., & Borko, H. (1992). Becoming a mathematics teacher. In D. A. Grouws (Ed.),
Handbook of research on mathematics teaching and learning. New York: Macmillan.
Cronbach, L. J., & Shavelson, R. J. (2004). My current thoughts on coefficient alpha and
successor procedures. Educational and Psychological Measurement, 64(3), 391–418.
Dede, Y., & Soybas, D. (2011). Preservice mathematics teachers’ experiences about function
and equation concepts. EURASIA Journal of Mathematics, Science & Technology Education,
7(2), 89–102.
Demana, F., & Waits, B. K. (1990). Implementing the standards: The role of technology in
teaching mathematics. Mathematics Teacher, 83(1), 27–31.
Demana, F., & Waits, B. K. (1998). The role of graphing calculators in mathematics reform.
Retrieved from ERIC database. (ED458108).
DeVellis, R. F. (2011). Scale development: Theory and applications. Thousand Oaks, CA: Sage Publications.
Doerr, H. M., & Zangor, R. (2000). Creating meaning for and with the graphing calculator.
Educational Studies in Mathematics, 41(2), 143–163.
Drier, H. S. (2001). Teaching and learning mathematics with interactive spreadsheets. School Science and Mathematics, 101(4), 170–179.
Duda, J. (2011). Mathematical creative activity and the graphic calculator. International
Journal for Technology in Mathematics Education, 18(1), 3–14.
Dugdale, S., & Kibbey, D. (1990). Beyond the evident content goals—Part I: Tapping the depth
and flow of the educational undercurrent. Journal of Mathematical Behavior, 9, 201–228.
Dunham, P. H., & Dick, T. P. (1994). Connecting research to teaching: Research on graphing
calculators. Mathematics Teacher, 87(6), 440–445.
Dunham, P., & Hennessey, S. (2008). Equity and the use of educational technology in
mathematics. In M. K. Heid & G. W. Blume (Eds.), Research on technology and the teaching
and learning of mathematics: Vol. 1. Research syntheses (pp. 345–418). Charlotte, NC:
Information Age.
Fink, A. (2003). How to manage, analyze, and interpret survey data (2nd ed.). Thousand
Oaks, CA: Sage Publications.
Fink, A., & Kosecoff, J. (1998). How to conduct surveys: A step-by-step guide. Thousand Oaks,
CA: Sage Publications.
Fowler, F. J. (2008). Survey research methods (4th ed.). Thousand Oaks, CA: Sage
Publications.
Frykholm, J., & Glasson, G. (2005). Connecting science and mathematics instruction:
Pedagogical context knowledge for teachers. School Science and Mathematics, 105(3),
127–140.
Green, S. B., Lissitz, R. W., & Mulaik, S. A. (1977). Limitations of coefficient alpha as an index
of test unidimensionality. Educational and Psychological Measurement, 37(4), 827–838.
Guckin, A., & Morrison, D. (1991). Math*Logo: A project to develop proportional reasoning
in college freshmen. School Science and Mathematics, 91(2), 77–81.
Harris, J. B., & Hofer, M. J. (2011). Technological pedagogical content knowledge (TPACK)
in action: A descriptive study of secondary teachers’ curriculum-based, technology-related
instructional planning. Journal of Research on Technology in Education, 43(3), 211–229.
Harris, J. B., Grandgenett, N., & Hofer, M. (2010, March). Testing a TPACK-based technology
integration assessment rubric. Paper presented at the annual meeting of the Society for
Information Technology and Teacher Education (SITE), San Diego, CA.
Hill, H. C. (2011). The nature and effects of middle school mathematics teacher learning
experiences. Teachers College Record, 113(1), 205–234.
Hofer, M., & Grandgenett, N. (2012). TPACK development in teacher education: A longitudinal study of preservice teachers in a secondary M.A.Ed. program. Journal of Research on Technology in Education, 45(1), 83–106.
Hofer, M., Grandgenett, N., Harris, J., & Swan, K. (2011). Testing a TPACK-based
technology integration observation instrument. In C. D. Maddux, D. Gibson, B. Dodge,
M. Koehler, P. Mishra, & C. Owens (Eds.). Research highlights in technology and teacher
education 2011 (pp. 39–46). Chesapeake, VA: Society for Information Technology &
Teacher Education.
Hollebrands, K. F. (2003). High school students’ understandings of geometric transformations
in the context of a technological environment. Journal of Mathematical Behavior, 22(1),
55–72.
Kahan, J. A., Cooper, D. A., & Bethea, K. A. (2003). The role of mathematics teachers’
content knowledge in their teaching: A framework for research applied to a study
of student teachers. Journal of Mathematics Teacher Education, 6(3), 223–252. doi:10.1023/A:1025175812582
Kaput, J. (1995). Creating cybernetic and psychological ramps for the concrete to the abstract:
Examples from multiplicative structures. In D. Perkins, J. Schwartz, M. West, & M. Wiske
(Eds.), Software goes to school: Teaching for understanding with new technologies (pp.
130–154). New York: Oxford University Press.
Kastberg, S., & Leatham, K. (2005). Research on graphing calculators at the secondary level:
Implications for mathematics teacher education. Contemporary Issues in Technology and
Teacher Education [Online serial], 5(1).
Keating, T., & Evans, E. (2001). Three computers in the back of the classroom: Preservice
teachers’ conceptions of technology integration. In R. Carlsen, N. Davis, J. Price, R. Weber, &
D. Willis (Eds.), Society for Information Technology and Teacher Education Annual, 2001 (pp.
1671–1676). Norfolk, VA: Association for the Advancement of Computing in Education.
Kelly, B., Carnine, D., Gersten, R., & Grossen, B. (1986). The effectiveness of videodisc
instruction in teaching fractions to learning-disabled and remedial high school students.
Journal of Special Education Technology, 7(2), 5–17.
Kendal, M., & Stacey, K. (2001). The impact of teacher privileging on learning differentiation
with technology. International Journal of Computers for Mathematical Learning, 6(2),
143–165.
Koehler, M. J., & Mishra, P. (2005a). Teachers learning technology by design. Journal of
Computing In Teacher Education, 21(3), 94–102.
Koehler, M. J., & Mishra, P. (2005b). What happens when teachers design educational
technology? The development of technological pedagogical content knowledge. Journal of
Educational Computing Research, 32(2), 131–152.
Koh, J. L., Chai, C. S., & Tsai, C. C. (2010). Examining the technological pedagogical content
knowledge of Singapore pre-service teachers with a large-scale survey. Journal of Computer
Assisted Learning, 26(6), 563–573. doi:10.1111/j.1365-2729.2010.00372.x
Kutzler, B. (2000). The algebraic calculator as a pedagogical tool for teaching mathematics.
International Journal of Computer Algebra in Mathematics Education, 7(1), 5–23.
Lampert, M., & Ball, D. L. (1998). Teaching, multimedia, and mathematics: Investigations of real
practice. New York: Teacher’s College Press.
Lawless, K. A., & Pellegrino, J. W. (2007). Professional development in integrating technology
into teaching and learning: Knowns, unknowns, and ways to pursue better questions and
answers. Review of Educational Research, 77(4), 575–614.
Lawshe, C. H. (1975). A quantitative approach to content validity. Personnel Psychology, 28(4),
563–575.
Leatham, K. R. (2007). Pre-service secondary mathematics teachers’ beliefs about the nature
of technology in the classroom. Canadian Journal of Science, Mathematics, and Technology Education, 7(2/3), 183–207.
Lefever, S., Dal, M., & Matthiasdottir, A. (2007). Online data collection in academic research:
Advantages and limitations. British Journal of Educational Technology, 38(4), 574–582.
Li, Q. (2003). Would we teach without technology? A professor's experience of teaching
mathematics education incorporating the internet. Educational Research, 45(1), 61–77.
Li, Q., & Ma, X. (2010). A meta-analysis of the effects of computer technology on school
students’ mathematics learning. Educational Psychology Review, 22(3), 215–243.
Litwin, M. S. (1995). How to measure survey reliability and validity. Thousand Oaks, CA: Sage
Publications.
López, O. S. (2010). The digital learning classroom: Improving English language learners’
academic success in mathematics and reading using interactive whiteboard technology.
Computers & Education, 54(4), 901–915. doi:10.1016/j.compedu.2009.09.019
Lux, N., Bangert, A., & Whittier, D. (2011). The development of an instrument to assess
preservice teachers’ technological pedagogical content knowledge. Journal of Educational
Computing Research, 45(4), 415–431.
Ma, L. P. (1999). Knowing and teaching elementary mathematics. Mahwah, NJ: Lawrence
Erlbaum.
McKinney, S., & Frazier, W. (2008). Embracing the principles and standards for school
mathematics: An inquiry into the pedagogical and instructional practices of mathematics
teachers in high-poverty middle schools. Clearing House: A Journal of Educational
Strategies, Issues, and Ideas, 81(5), 201–210.
Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A
framework for integrating technology in teachers’ knowledge. Teachers College Record,
108(6), 1017–1054.
Mitchell, B., Bailey, J. L., & Monroe, E. (2007). Integrating technology and a standards-based
pedagogy in a geometry classroom: A mature teacher deals with the reality of multiple
demands and paradigm shifts. Computers in the Schools, 24(1–2), 75–91.
National Council of Teachers of Mathematics. (2000). Principles and standards for school
mathematics. Reston, VA: Author.
National Governors Association (NGA) Center for Best Practices & Council of Chief State
School Officers (CCSSO). (2010). The Common Core State Standards initiative: Preparing
America’s students for college and career. Washington, DC: Author.
Niess, M. L. (2001). A model for integrating technology in preservice science and mathematics content-specific teacher preparation. School Science and Mathematics, 101(2), 102–109.
Niess, M. L. (2005). Preparing teachers to teach science and mathematics with technology:
Developing a technology pedagogical content knowledge. Teaching and Teacher Education,
21(5), 509–523.
Niess, M. L. (2011). Investigating TPACK: Knowledge growth in teaching with technology. Journal of Educational Computing Research, 44(3), 299–317.
Niess, M. (2013). Central component descriptors for levels of technological pedagogical
content knowledge. Journal of Educational Computing Research, 48(2), 173–198.
Niess, M. L., Ronau, R. N., Shafer, K. G., Driskell, S. O., Harper S. R., Johnston, C., Browning,
C., Özgün-Koca, S. A., & Kersaint, G. (2009). Mathematics teacher TPACK standards and
development model. Contemporary Issues in Technology and Teacher Education [Online
serial], 9(1), 4–24.
Norris, J., & Conn, C. (2005). Investigating strategies for increasing student response rates to
online-delivered course evaluations. Quarterly Review of Distance Education, 6(1), 13–29.
Norton, S., McRobbie, C. J., & Cooper, T. J. (2000). Exploring secondary mathematics teachers’ reasons for not using computers in their teaching: Five case studies. Journal of Research on Computing in Education, 33(1), 87–109.
Özgün-Koca, S. A., Meagher, M., & Edwards, M. T. (2010). Preservice teachers’ emerging
TPACK in a technology-rich methods class. Mathematics Educator, 19(2), 10–20.
Page, M. S. (2002). Technology-enriched classrooms: Effects on students of low socioeconomic
status. Journal of Research on Technology in Education, 34(4), 389–409.
Patten, M. L. (2001). Questionnaire research: A practical guide (2nd ed.). Los Angeles:
Pyrczak Publishing.
Pierce, R., & Ball, L. (2009). Perceptions that may affect teachers’ intention to use technology
in secondary mathematics classes. Educational Studies in Mathematics, 71(3), 299–317.
Quinn, R. J. (1998). Technology: PSTs’ beliefs and the influence of a mathematics methods
course. Clearing House, 71(6), 375–377.
Sahin, I. (2011). Development of survey of technological pedagogical and content knowledge
(TPACK). Turkish Online Journal of Educational Technology—TOJET, 10(1), 97–105.
Salinger, G. L. (1994). Educational reform movements and technology education. Technology
Teacher, 53(5), 6–8.
Samejima, F. (1996). Graded response model. In W. J. van der Linden & R. K. Hambleton
(Eds.), Handbook of modern item response theory. New York: Springer-Verlag.
Schmidt, D. A., Baran, E., Thompson, A. D., Mishra, P., Koehler, M. J., & Shin, T. S. (2009).
Technological pedagogical content knowledge (TPACK): The development and validation
of an assessment instrument for preservice teachers. Journal of Research on Technology in
Education, 42(2), 123–149.
Schmidt, K., Kohler, A., & Moldenhauer, W. (2009). Introducing a computer algebra system
in mathematics education—Empirical evidence from Germany. International Journal for
Technology in Mathematics Education, 16(1), 11–26.
Schmitt, N. (1996). Uses and abuses of coefficient alpha. Psychological Assessment, 8(4),
350–353.
Shapley, K., Sheehan, D., Maloney, C., & Caranikas-Walker, F. (2011). Effects of technology
immersion on middle school students’ learning opportunities and achievement. Journal of
Educational Research, 104(5), 299–315. doi:10.1080/00220671003767615
Shoaf, M. (2000). A capstone course for pre-service secondary mathematics teachers.
International Journal of Mathematical Education in Science & Technology, 31(1), 151–160.
Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational
Researcher, 15(2), 4–14.
Silverman, J., & Thompson, P. W. (2008). Toward a framework for the development of
mathematical knowledge for teaching. Journal of Mathematics Teacher Education, 11(6),
499–511. doi:10.1007/s10857-008-9089-5
Stallings, L. L. (1995). Teachers’ stories learning to use computing technologies in mathematics
teaching. Unpublished doctoral dissertation, University of Georgia, Athens.
Stein, M. K., Smith, M. S., Henningsen, M. A., & Silver, E. A. (2010) Implementing standards-
based mathematics instruction: A casebook for professional development. New York: Teachers
College Press.
Strawhecker, J. (2005). Preparing elementary teachers to teach mathematics: How field
experiences impact pedagogical content knowledge. Issues in the Undergraduate
Mathematics Preparation of School Teachers, 4(Curriculum), 12 pp. Retrieved from ERIC
database. (EJ835513).
Tharp, M. L., Fitzsimmons, J. A., & Ayers, R. L. B. (1997). Negotiating a technological shift:
Teacher perception of the implementation of graphing calculators. Journal of Computers in
Mathematics and Science Teaching, 16(4), 551–575.
Thomas, M. O. J. (2006). Teachers using computers in mathematics: A longitudinal study. In C. Hoyles, J. Lagrange, L. H. Son, & Sinclair (Eds.), Proceedings for the 17th ICMI Study Conference: Technology Revisited, Hanoi University of Technology, December 3–8, 2006 (c17).
Thompson, A. G. (1992). Teachers’ beliefs and conceptions: A synthesis of the research. In D. A. Grouws (Ed.), Handbook of research on mathematics teaching and learning (pp. 127–146). New York: Macmillan.
Thompson, B. (2004). Exploratory and confirmatory factor analysis: Understanding concepts
and applications. Washington, DC: American Psychological Association.
Torff, B., & Tirotta, R. (2010). Interactive whiteboards produce small gains in elementary
students’ self-reported motivation in mathematics. Computers & Education, 54(2), 379–383.
doi:10.1016/j.compedu.2009.08.019
Ursini, S., Santos, D., & Juarez Lopez, J. A. (2005). Teachers’ resistance using technology:
Source of ideas for a pedagogical proposal. In F. Olivero & R. Sutherland (Eds.), Proceedings
of the 7th International Conference on Technology in Mathematics Teaching (ICTMT) (pp.
189–197). Bristol, UK: University of Bristol.
Van Voorst, C. (2004). Capstone mathematics courses for teachers. Issues in the Undergraduate
Mathematics Preparation of School Teachers, 4(Curriculum), 11 pp. Retrieved from ERIC
database. (EJ835511).
Wenglinsky, H., & Educational Testing Service. (1998). Does it compute? The relationship
between educational technology and student achievement in mathematics. Retrieved from
ERIC database. (ED425191).
Wilson, S., Floden, R., & Ferrini-Mundy, J. (2001). Teacher preparation research: Current
knowledge, gaps and recommendations. A research report prepared for the U.S. Department
of Education by the Center for the Study of Teaching and Policy in collaboration with
Michigan State University. Retrieved from http://depts.washington.edu/ctpmail/PDFs/TeacherPrep-WFFM-02-2001.pdf
Young, J. R., Young, J. L., & Shaker, Z. (2012). Describing the preservice teacher technological
pedagogical content knowledge (TPACK) literature using confidence intervals. Tech Trends,
56(5), 25–33.
Zbiek, R. (1998). Prospective teachers’ use of computing tools to develop and validate
functions as mathematical models. Journal for Research in Mathematics Education, 29(2),
184–201.
Zbiek, R. M., & Hollebrands, K. (2008). A research-informed view of the process of
incorporating mathematics technology into classroom practice by inservice and preservice
teachers. In M. K. Heid & G. W. Blume (Eds.), Research on technology and the teaching
and learning of mathematics: Vol. 1. Research syntheses (pp. 287–344). Charlotte, NC:
Information Age.

Manuscript received April 3, 2013 | Initial decision April 26, 2013 | Revised manuscript accepted May 20, 2013

Appendix A

Administered Survey
Thank you for taking time to complete this survey. Please answer each question to the
best of your knowledge. You should answer demographic information first, then read
each item and choose your first belief. You need not spend any lengthy time on any
one item. You should be finished in about 15 minutes.
Your thoughtfulness and candid responses will be greatly appreciated. Your confi-
dentiality will not be compromised and your name will not, at any time, be associated
with your responses.
Your responses will be kept completely confidential and will not influence your
course grade.

Demographic Information

Age range:
a. Under 19
b. 19–22
c. 23–26
d. 27–30
e. 30+

Year in college:
a. Freshman
b. Sophomore
c. Junior
d. Senior
e. Graduate student

Have you completed a practicum or clinical placement in a middle grades or high school math classroom?
a. Yes
b. No

Gender:
a. Male
b. Female
c. I prefer not to say

Ethnicity:
a. African American or black
b. Alaskan Native
c. American Indian
d. Asian
e. Hispanic or Latino
f. Pacific Islander
g. White or Caucasian
h. Other: _____________
i. I prefer not to say

Technology is a broad concept that can mean a lot of different things. For the purpose
of this questionnaire, technology is referring to digital technology/technologies—that
is, the digital tools we use, such as computers, laptops, iPods, handhelds, interactive
whiteboards, computer software programs, graphing calculators, etc.
  Please answer all of the questions, and if you are uncertain of or neutral about your
response, you may always select “Neither agree nor disagree.”
All items appeared with an SD, D, N, A, SA (strongly disagree to strongly agree) response setup.

TK1 I know how to solve my own technical problems.
TK2 I can learn technology easily.
TK3 I keep up with important new technologies.
TK4 I frequently play around with the technology.
TK5 I know about a lot of different technologies.
TK6 I have the technical skills I need to use technology.
TK7 I have had sufficient opportunities to work with different technologies.
TK8 When I encounter a problem using technology, I seek outside help.
CK9 I have sufficient knowledge about mathematics.
CK10 I can use mathematical ways of thinking.
CK11 I have various strategies for developing my understanding of mathematics.
CK12 I know about various examples of how mathematics applies in the real world.
CK13 I have a deep and wide understanding of algebra.
CK14 I have a deep and wide understanding of geometry.
CK15 I have a deep and wide understanding of calculus.
CK16 I have a deep and wide understanding of advanced undergraduate mathematics.
PK17 I know how to assess student performance in a classroom.
PK18 I can adapt my teaching based upon what students currently understand or do not
understand.
PK19 I can adapt my teaching style to different learners.
PK20 I can assess student learning in multiple ways.
PK21 I can use a wide range of teaching approaches in a classroom setting.
PK22 I am familiar with common student understandings and misconceptions.
PK23 I know how to organize and maintain classroom management.
PK24 I know when it is appropriate to use a variety of teaching approaches (e.g., problem/proj-
ect-based learning, inquiry learning, collaborative learning, direct instruction) in a classroom
setting.
PCK25 I know how to select effective teaching approaches to guide student thinking and learning
in mathematics.
PCK26 I know different teaching approaches to teach ratio and proportion concepts.
PCK27 I know different strategies/approaches for teaching probability and statistics concepts.
PCK28 I know different strategies/approaches for teaching algebra concepts.
PCK29 I know different strategies/approaches for teaching geometry concepts.
PCK30 I know different strategies/approaches for teaching trigonometry concepts.
PCK31 I know different strategies/approaches for teaching calculus concepts.
TCK32 I know about technologies that I can use for understanding and doing ratio and
proportion.
TCK33 I know about technologies that I can use for understanding and doing probability and
statistics.
TCK34 I know about technologies that I can use for understanding and doing algebra.
TCK35 I know about technologies that I can use for understanding and doing geometry.
TCK36 I know about technologies that I can use for understanding and doing trigonometry.
TCK37 I know about technologies that I can use for understanding and doing calculus.
TCK38 I know that using appropriate technology can improve one’s understanding of mathematics
concepts.
TPK39 I can choose technologies that enhance the teaching of a lesson.
TPK40 I can choose technologies that enhance students’ learning for a lesson.
TPK41 My teacher education program has caused me to think more deeply about how
technology could influence the teaching approaches I use in my classroom.
TPK42 I am thinking critically about how to use technology in my classroom.
TPK43 I can adapt the use of the technologies that I am learning about to different teaching
activities.
TPK44 Different teaching approaches require different technologies.
TPK45 I have the technical skills I need to use technology appropriately in teaching.
TPK46 I have the classroom management skills I need to use technology appropriately in
teaching.
TPK47 I know how to use technology in different instructional approaches.
TPK48 My teaching approaches change when I use technologies in a classroom.
TPK49 Knowing how to use a certain technology means that I can use it for teaching.
TPK50 Different technologies require different teaching approaches.
TPACK51 I can use strategies that combine mathematics, technologies, and teaching approaches
that I learned about in my coursework in my classroom.
TPACK52 I can choose technologies that enhance the mathematics for a lesson.
TPACK53 I can select technologies to use in my classroom that enhance what I teach, how I teach,
and what students learn.
TPACK54 I can provide leadership in helping others to coordinate the use of mathematics,
technologies, and teaching approaches at my school and/or district.
TPACK55 I can teach lessons that appropriately combine mathematics, technologies, and teaching
approaches.
TPACK56 Integrating technology in teaching mathematics will be easy and straightforward for me.
TPACK57 I can teach lessons that appropriately combine ratio and proportion, technologies, and
teaching approaches.
TPACK58 I can teach lessons that appropriately combine probability and statistics, technologies, and
teaching approaches.
TPACK59 I can teach lessons that appropriately combine algebra, technologies, and teaching
approaches.
TPACK60 I can teach lessons that appropriately combine geometry, technologies, and teaching
approaches.
TPACK61 I can teach lessons that appropriately combine trigonometry, technologies, and teaching
approaches.
TPACK62 I can teach lessons that appropriately combine calculus, technologies, and teaching
approaches.

This survey was presented in a four-page format so that items did not appear to run together.

Appendix B

Finalized Survey Items


TK1 I know how to solve my own technical problems.
TK2 I can learn technology easily.
TK3 I keep up with important new technologies.
TK4 I frequently play around with the technology.
TK5 I know about a lot of different technologies.
TK6 I have the technical skills I need to use technology.
CK9 I have sufficient knowledge about mathematics.
CK11 I have various strategies for developing my understanding of mathematics.
CK12 I know about various examples of how mathematics applies in the real world.
CK13 I have a deep and wide understanding of algebra.
CK14 I have a deep and wide understanding of geometry.
PK17 I know how to assess student performance in a classroom.
PK18 I can adapt my teaching based upon what students currently understand or do not
understand.
PK19 I can adapt my teaching style to different learners.
PK20 I can assess student learning in multiple ways.
PK21 I can use a wide range of teaching approaches in a classroom setting.
TPACK51 I can use strategies that combine mathematics, technologies, and teaching approaches
that I learned about in my coursework in my classroom.
TPACK52 I can choose technologies that enhance the mathematics for a lesson.
TPACK53 I can select technologies to use in my classroom that enhance what I teach, how I
teach, and what students learn.
TPACK55 I can teach lessons that appropriately combine mathematics, technologies, and
teaching approaches.
TPACK59 I can teach lessons that appropriately combine algebra, technologies, and teaching
approaches.
TPACK60 I can teach lessons that appropriately combine geometry, technologies, and teaching
approaches.
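
For readers who wish to score the finalized instrument, the item-to-construct grouping above translates directly into code. The sketch below reproduces the Appendix B groupings; the scoring function itself is a hypothetical illustration that assumes responses coded 1 (SD) through 5 (SA).

# Construct groupings of the 22 finalized items (16 without the TPACK items).
FINAL_ITEMS = {
    "TK":    ["TK1", "TK2", "TK3", "TK4", "TK5", "TK6"],
    "CK":    ["CK9", "CK11", "CK12", "CK13", "CK14"],
    "PK":    ["PK17", "PK18", "PK19", "PK20", "PK21"],
    "TPACK": ["TPACK51", "TPACK52", "TPACK53",
              "TPACK55", "TPACK59", "TPACK60"],
}

def construct_scores(record):
    # record: dict mapping item IDs to coded responses (1-5).
    # Returns the mean score for each construct.
    return {construct: sum(record[item] for item in items) / len(items)
            for construct, items in FINAL_ITEMS.items()}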
