To cite this article: Rhonda Christensen & Gerald Knezek (2001) Instruments
for Assessing the Impact of Technology in Education, Computers in the Schools:
Interdisciplinary Journal of Practice, Theory, and Applied Research, 18:2-3, 5-25, DOI:
10.1300/J025v18n02_02
Downloaded by [University of Waterloo] at 10:42 03 March 2015
INSTRUMENTS AND TESTING
Rhonda Christensen
Gerald Knezek
The 1998 CEO Forum study also estimated that the $3 billion spent on technology in U.S. schools represents just over 1% of total education spending, and that it will cost about $15 billion to make all our schools "technology rich." This is about $300 per student, or about 5% of total education spending, and about five times what we now spend on technology (CEO Forum, 1998). If the recommended 30% of this $300 per
Indicators of Success
scored lower than their peers on a national math test, the study
found. Students who used “drill and practice” software also scored
lower. But students who used computers for simulations and
real-life applications of math concepts scored higher, especially
those students in middle school. The study suggests that school
districts should focus attention on professional development for
teachers to make sure they know how to use computers with their
students effectively. (p. 1)
Two theoretical approaches that form a logical basis for selecting assessment instruments of technology integration are presented in this paper. The first is diffusion of innovation, while the second is a structural model of technology integration.
Early research by Rogers (1983) found that adoption of innovations is an active process that involves much reinvention. Adopters must reinvent the innovation and make it their own if they are to continue using it. Hall and Rutherford (1974) developed the Concerns-Based Adoption Model (CBAM) in the early 1970s for the study of adoption of any educational innovation. CBAM Stages of Concern (1974) and Levels of Use of the Innovation (Loucks, Newlove, & Hall, 1975) classifications have been adapted for information technology innovations and used by researchers on several continents over the past three decades. The Levels of Use form, described in more detail in the following section, is based on CBAM work.
cess to technology tools are all required for successful technology integration. The model is grounded in educational psychology principles that determine key variables influencing the school learning environment (Klausmeier & Goodwin, 1975), and it relies on the multivariate technique of Structural Equation Modeling (Schumacker, 1996) to guide measurement techniques. The instruments presented in the booklet Instruments for Assessing Educator Progress in Technology Integration are generally organized along these lines (Knezek, Christensen, Miyashita, & Ropp, 2000). Descriptions of key instruments from this book, and related instruments relevant to the content of this paper, follow.
Attitude Instruments
The Snapshot Survey by Norris and Soloway (Norris, Box, & Soloway, 1999; Norris, Soloway, Knezek, Topp, Young, & Box, 2000) addresses
Skill/Competency
Level of Proficiency
Data were gathered from 1,135 K-12 teachers from 13 school districts in north central Texas during 1998. This region of Texas contains predominantly rural schools. Educators were asked to complete the Stages of Adoption of Technology form (see Appendix A), the TAC, and the TAT during March 1998.

The average Stage of Adoption value across these 1,135 educators was 4.13. High correlations were found between stages of adoption and computer anxiety (r = .67, p < .0005, N = 973), in the direction of higher stages being associated with reduced anxiety. Higher stages of adoption were also strongly associated with increased computer enjoyment (r =
were that both home use of a computer and home access to the Internet were very strong discriminators for high or low stages of adoption (Knezek & Christensen, 1999). The nature of the relationship appears to be that home access makes a large contribution to technology integration advancement at the higher stages.
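As an illustration of the kind of correlational check reported above, the sketch below computes a Pearson r between stage of adoption and an attitude score. The data are synthetic stand-ins, not the Texas survey responses, and the variable names are invented for the example.

```python
# Illustrative sketch with synthetic data -- not the authors' analysis.
# Computes a Pearson correlation between stage of adoption (1-6) and a
# hypothetical "lack of anxiety" score that tends to rise with stage.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 973  # number of cases with both measures, as in the reported r = .67
stage = rng.integers(1, 7, size=n)               # Stages of Adoption, 1-6
lack_of_anxiety = stage + rng.normal(0, 1.2, n)  # synthetic attitude score

r, p = pearsonr(stage, lack_of_anxiety)
print(f"r = {r:.2f}, p = {p:.3g}, N = {n}")
```

With real data, `stage` and the attitude score would come from the survey forms; the positive r mirrors the reported direction (higher stages, less anxiety, given the positive coding noted for Table 2).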
A discriminant function analysis, a form of regression analysis in which the dependent variable (stage of adoption) is treated as categorical, was carried out on the data to determine whether teacher attitudes toward information technology are sufficiently strongly related to advances in stages of adoption to be useful as predictors of stage. The seven major attitude subscales on the TAC, plus computer enjoyment and anxiety subscales that were completed by teachers and students in the study, were used as predictors for stage of adoption. Based on their reported attitudes toward information technology, 46% of the teachers in the sample could be correctly classified into their own reported stage. The expectation for correct classification by chance would be 1/6 = 16.7%. Roughly 90% accuracy is possible if one is willing to accept plus or minus one stage as a success. This is often acceptable in a situation where the average stage of an entire school, rather than that of an individual teacher, is the goal (Knezek & Christensen, 1999).
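A minimal sketch of this classification approach, using synthetic data and scikit-learn's linear discriminant analysis in place of the original procedure, scores both exact agreement and agreement within one stage:

```python
# Illustrative sketch with synthetic data -- not the authors' analysis.
# Classifies stage of adoption (categorical, 1-6) from attitude scores,
# then scores exact accuracy and accuracy within plus-or-minus one stage.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n, n_subscales = 900, 9  # e.g., 7 TAC subscales + enjoyment + anxiety
stage = rng.integers(1, 7, size=n)
# Hypothetical attitude scores that drift upward with stage, plus noise
X = (stage[:, None] * rng.uniform(0.5, 1.5, n_subscales)
     + rng.normal(0, 2.0, (n, n_subscales)))

model = LinearDiscriminantAnalysis().fit(X, stage)
pred = model.predict(X)
exact = np.mean(pred == stage)                   # cf. 46% vs. 16.7% chance
within_one = np.mean(np.abs(pred - stage) <= 1)  # cf. the ~90% figure
print(f"exact: {exact:.1%}, within one stage: {within_one:.1%}")
```

With predictors carrying real signal, exact accuracy exceeds the 16.7% chance rate, and the within-one-stage rate is necessarily at least as high as the exact rate.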
CECS 1100, Computer Applications in Education. This is a tool-based course in which students learn to use an integrated word processing, spreadsheet, and database package.

CECS 3440, Technology and the Teacher. This course includes the use of presentation software and theories as well as the instructional design of presentation materials.

CECS 4100, Computers in the Classroom. Students in this class learn to find appropriate instructional software for the classroom, develop a multimedia project, find Internet resources for classroom use, develop a technology-infused lesson plan, and make a Web page to link to their instructional resources.

CECS 4800, Technology Integration Mentoring. This course provides field-based technology integration experiences to pre-service educators.
[Figure: Mean Stage of Adoption, pretest and posttest, for the typical teacher (n = 1,135) and for CECS 1100, CECS 3440, and CECS 4100 students.]
Results of analysis of the data from 1,135 rural Texas educators are presented in Table 2 and graphically illustrated in Figure 2 (Christensen & Knezek, 2000b). Based on analysis of the data, it appears that teachers who are in Stage One (awareness) also rated themselves lower in computer enjoyment, computer avoidance, e-mail, productivity, and overall per-
Note. Anxiety, Avoidance and Negative Impact are coded in a positive direction. Higher means represent a lack of
anxiety, lack of avoidance and lack of perceived negative impact.
[Figure 2: Mean ratings on the attitude subscales (F1-Enjoyment, F2-Anxiety, F3-Avoidance, F4-Email, F5-Negative Impact, F6-Perception, F7-Semantic, Productivity) and the teacher proficiency measures (EmailT, WWWT, MMT, ProdT, ProdCL); scale from 2 to 7.]
pretest in early fall, to 4.20 at the time of the posttest in December 1999. Proficiency with integrated applications increased from a group mean average of 2.85 on the pretest to 4.18 by the posttest. Teachers' perceived proficiency in teaching with technology improved from an average rating of 3.25 at the time of the pretest to 4.16 at posttest time. Proficiency with the World Wide Web improved from 3.60 at the pretest to 4.53 at the posttest. All of these changes are highly significant in the statistical sense (p < .0005).
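Significance claims of this kind typically rest on a paired (repeated-measures) t test. The sketch below shows the computation on synthetic pre/post ratings; the class size and gain are invented for illustration:

```python
# Illustrative sketch with synthetic data -- not the authors' analysis.
# A paired t test of pre-post proficiency gains of the kind used to
# declare the reported changes "highly significant" (p < .0005).
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(2)
n = 60                                # hypothetical number of teachers
pre = rng.normal(3.25, 0.6, n)        # cf. teaching-with-technology pretest mean
post = pre + rng.normal(0.9, 0.5, n)  # cf. gain toward the 4.16 posttest mean

t, p = ttest_rel(post, pre)
print(f"t({n - 1}) = {t:.2f}, p = {p:.2e}")
```

The test treats each teacher as his or her own control, which is why even modest mean gains can yield very small p-values when the gains are consistent across teachers.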
Furthermore, as shown in Figure 3, technology skill profiles for the groups with different reported stages of adoption were quite distinct for all groups except Stage Five versus Stage Six. This implies that attributes other than technology skills (such as teacher attitudes) may be the distinguishing factor between Stage Five and Stage Six levels of technology integration.

[Figure 3: Mean TPSA subscale scores (TPTT, TPIA, TPWWW, TPEMAIL) for teachers at Stages 2 through 6 of adoption.]

Data were gathered from UNT pre-service teachers using the Technology Proficiency Skills Assessment (TPSA) for the first time during fall 1999. Initial findings, based on one CECS 4100 class of 21 students, are graphically displayed in Figure 4. Students enrolled in the Computers in the Classroom course reported significant skill gains (p < .001) in four areas measured by the TPSA. The effect sizes of these gains ranged from .86 to 1.39. The average gain in level of technology proficiency was approximately one standard deviation.

[Figure 4: TPSA pretest and posttest means for the CECS 4100 class in four areas: E-mail, WWW, Integrated Applications, and Technology in Teaching.]
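Effect sizes in this range correspond to Cohen's d: the mean gain divided by a pooled standard deviation, so a d near 1 is a gain of roughly one standard deviation. A sketch with synthetic ratings (the sample values are invented):

```python
# Illustrative sketch with synthetic data -- not the authors' analysis.
# Cohen's d for a pre-post gain, the effect-size metric behind the
# reported range of .86 to 1.39.
import numpy as np

def cohens_d(pre: np.ndarray, post: np.ndarray) -> float:
    """Mean gain divided by the pooled standard deviation."""
    pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2)
    return (post.mean() - pre.mean()) / pooled_sd

rng = np.random.default_rng(3)
pre = rng.normal(3.6, 0.7, 21)   # hypothetical class of 21, cf. Figure 4
post = rng.normal(4.5, 0.7, 21)

print(f"d = {cohens_d(pre, post):.2f}")
```

Unlike a p-value, d does not shrink as the sample grows, which makes it the more informative summary for a small class such as this one.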
Pre-post changes in ISTE competencies gathered from a class of pre-service teacher education students during fall 1999 are shown in Figure 5. This direct-appraisal, nine-item form developed by Christensen has yielded an internal consistency reliability estimate of .92 across 188 pre-service educators and 40 university faculty (Christensen & Knezek, 2000c). Items for this instrument are listed in Appendix C.
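An internal consistency estimate of this kind is usually Cronbach's alpha. The sketch below computes it on synthetic nine-item responses; the response model is invented for illustration:

```python
# Illustrative sketch with synthetic data -- not the authors' analysis.
# Cronbach's alpha for a nine-item, 5-point instrument, the statistic
# behind the reported internal consistency estimate of .92.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (respondents x items) matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(4)
n, k = 228, 9                     # cf. 188 pre-service educators + 40 faculty
trait = rng.normal(0, 1, (n, 1))  # shared competency factor
responses = np.clip(np.rint(3 + trait + rng.normal(0, 0.7, (n, k))), 1, 5)

print(f"alpha = {cronbach_alpha(responses):.2f}")
```

Alpha rises when the items share variance (here, the common `trait` factor), which is exactly what a high value like .92 indicates about the nine competency items.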
WST Model
[Figure 5: Fall 1999 pretest and posttest means for the CECS 4100 class on ISTE competency survey items Q1-Q9 and their overall mean.]
[Figure: The Will, Skill, Tool (WST) model of technology integration. Will is indicated by the TAC-7F and TAT-5F; Skill by the TPSA subscales (TPSA-EMAIL, TPSA-WWW, TPSA-IA, TPSA-TT); Tools by CHOURS, HOMECOM, HOMEWWW, and FACILITIES; Classroom Integration by CBAM-LOU and STAGES; and Achievement by SAT, TAAS, and other standardized scores.]
proper stages of adoption, based solely on the will, skill, and tool parameters indicated in the model (Knezek, Christensen, Hancock, & Shoho, 2000).
Impact of Technology Investment on Student Achievement. In a re-
CONCLUSION
REFERENCES
Bialo, E. R., & Sivin-Kachala, J. (1995). Effectiveness of technology in schools, 1990-1994. Washington, DC: Software Publishers Association.
CEO Forum. (1998). Year 2 STaR report: Professional development: A link to better
learning. [Online]. Available: http://www.ceoforum.org
Christensen, R. (1997). Effect of technology integration education on the attitudes of
teachers and their students. [Online]. Unpublished doctoral dissertation, University
of North Texas, Denton, Texas. Available: http://courseweb.tac.unt.edu/rhondac
Christensen, R. (1999). Technology in Education Preservice Competency Survey.
[Online]. Available: http://www.iittl.unt.edu/unt/students/tpsa2.htm
Christensen, R., & Knezek, G. (1998). Parallel forms for measuring teachers’ attitudes
toward computers. In S. McNeil, J. Price, S. Boger-Mehall, B. Robin, & J. Willis
(Eds.), Technology and teacher education annual 1998 (Vol. 2, pp. 820-824). Char-
lottesville, VA: Association for the Advancement of Computing in Education.
Christensen, R., & Knezek, G. (1999). Stages of adoption for technology in education.
Computers in New Zealand Schools, 11(3), 25-29.
Christensen, R., & Knezek, G. (2000a). Advancement of student technology integra-
tion skills through university preservice coursework. In D. Willis, J. Price, & J. Wil-
lis (Eds.), Technology and teacher education annual 2000 (Vol. 2, pp. 1505-1510).
Charlottesville, VA: Association for the Advancement of Computing in Education.
Christensen, R., & Knezek, G. (2000b). Strategies for integrating technology into the
classroom. [Online]. Pre-conference symposium paper presented at the Society for
Information Technology in Teacher Education Annual Conference, San Diego, CA.
Available: http://www.iittl.unt.edu/IITTL/presentations
Christensen, R., & Knezek, G. (2000c, April 25). Internal consistency reliability for the
Technology in Education Competency Survey. Paper presented to the Preparing
Tomorrow’s Teachers Evaluator’s Workshop, American Educational Research As-
sociation Annual Meeting, New Orleans, LA.
Coley, R. J., Cradler, J., & Engel, P. K. (1998). Computers and classrooms: The status
of technology in U.S. schools (Policy Information Report). Princeton, NJ: Policy
Information Center, Educational Testing Service.
Collis, B. A., Knezek, G. A., Lai, K. W., Miyashita, K. T., Pelgrum, W. J., Plomp, T., &
Sakamoto, T. (1996). Children and computers in school. Mahwah, NJ: Erlbaum.
Griffin, D., & Christensen, R. (1999). Concerns-Based Adoption Model (CBAM)
Levels of Use of an Innovation (CBAM-LOU). Denton, TX: Institute for the Inte-
gration of Technology into Teaching and Learning.
Russell, A. L. (1995). Stages in learning new technology: Naive adult email users. Computers & Education, 25(4), 173-178.
Schumacker, R. E. (1996). A beginner’s guide to structural equation modeling.
Mahwah, NJ: Erlbaum.
U.S. Congress, Office of Technology Assessment. (1995, April). Teachers and tech-
nology: Making the connection (OTA-EHR-616). Washington, DC: U.S. Govern-
ment Printing Office.
Woodrow, J. E. (1992). The influence of programming training on the computer liter-
acy and attitudes of preservice teachers. Journal of Research on Computing in Edu-
cation, 25(2), 200-218.
APPENDIX A
Stages of Adoption of Technology
Instructions: Please read the descriptions of each of the six stages related to adoption of technology.
Choose the stage that best describes where you are in the adoption of technology.
Stage 1: Awareness
I am aware that technology exists but have not used it–perhaps I'm even avoiding it.
Stage 2: Learning the process
I am currently trying to learn the basics. I am often frustrated using computers. I lack confidence when using computers.
Stage 3: Understanding and application of the process
I am beginning to understand the process of using technology and can think of specific tasks in which it might be useful.
Stage 4: Familiarity and confidence
I am gaining a sense of confidence in using the computer for specific tasks. I am starting to feel comfortable using the computer.
Stage 5: Adaptation to other contexts
I think about the computer as a tool to help me and am no longer concerned about it as technology. I can use it in many applications and as an instructional aid.
Stage 6: Creative application to new contexts
I can apply what I know about technology in the classroom. I am able to use it as an instructional tool and integrate it into the curriculum.
From: Christensen, R. (1997). Effect of technology integration education on the attitudes of teachers and their students. Doctoral dissertation, University of North Texas. Based on Russell, A. L. (1995). Stages in learning new technology. Computers & Education, 25(4), 173-178.
APPENDIX B
Concerns-Based Adoption Model (CBAM)
Levels of Use of an Innovation
Level 0 Non-use
I have little or no knowledge of information technology in education, no involvement with it, and I am do-
ing nothing toward becoming involved.
Level 1 Orientation
Level 2 Preparation
Level 3 Mechanical Use
I focus most effort on the short-term, day-to-day use of information technology with little time for reflec-
tion. My effort is primarily directed toward mastering tasks required to use the information technology.
Level 4A Routine
I feel comfortable using information technology in education. However, I am putting forth little effort and
thought to improve information technology in education or its consequences.
Level 4B Refinement
I vary the use of information technology in education to increase the expected benefits within the class-
room. I am working on using information technology to maximize the effects with my students.
Level 5 Integration
I am combining my own efforts with related activities of other teachers and colleagues to achieve im-
pact in the classroom.
Level 6 Renewal
I reevaluate the quality of use of information technology in education, seek major modifications of, or al-
ternatives to, present innovation to achieve increased impact, examine new developments in the field,
and explore new goals for myself and my school or district.
APPENDIX C
Technology in Education Competency Survey
1 = Strongly Disagree (SD)
2 = Disagree (D)
3 = Undecided (U)
4 = Agree (A)
5 = Strongly Agree (SA)
SD D U A SA
Adapted from Preparedness of graduates (pp. 41-43), International Society for Technology in Education (1999). Will new teachers be prepared to teach in a digital age? A national survey on information technology in teacher education. Santa Monica, CA: Milken Family Foundation, Milken Exchange on Education Technology.