
J. EDUCATIONAL COMPUTING RESEARCH, Vol. 38(2) 183-200, 2008

A LONGITUDINAL ASSESSMENT OF TEACHER
EDUCATION STUDENTS’ CONFIDENCE
TOWARD USING TECHNOLOGY

NATALIE B. MILMAN
The George Washington University

PHILIP E. MOLEBASH
Loyola High School & Loyola Marymount University

ABSTRACT

This article describes the findings of a longitudinal study examining how
teachers’ confidence toward using technology has evolved, if at all, since
taking an educational technology course during their preservice teacher
education experience. Pre- and post-surveys were administered to undergraduate
and graduate teacher education students at the beginning and end of an
educational technology course, as well as several years later. Findings show
that the overall instructional and personal confidence scores at follow-up were
significantly higher than the pre-course survey means, although lower than the
post-course survey means.

Since the A Nation at Risk report (National Commission on Excellence in
Education, 1983), there has been an outpouring of reform movements from P-12
to post-secondary levels of education in the United States. Later reforms arose
from efforts by such organizations as the Holmes Group and the Carnegie Forum
on Education and the Economy, which focused on educational restructuring
and the improvement of the quality of education. Consequently, schools, colleges,
and departments of education (SCDEs) across the country have undergone and
continue to undergo transformation.


© 2008, Baywood Publishing Co., Inc.


doi: 10.2190/EC.38.2.d
http://baywood.com

For the past decade, reform efforts in many SCDEs have involved improving
the technology education and preparation of preservice teachers. The need to
improve technology education for preservice teachers has been fueled by reports
(American Council on Education, 1999; CEO Forum of Educational Technology,
1999; ISTE, 1999; NCATE, 1997; Persichitte, Tharp, & Caffarella, 1997; U.S.
Congress Office of Technology Assessment, 1995; Willis & Mehlinger, 1996)
that suggest SCDEs are not adequately preparing preservice teachers to use
technology. This problem is not due to the fact that preservice teachers are not
participating in any type of educational technology coursework. In fact a large
number of preservice teacher education students are taking technology-related
coursework (ISTE, 1999; Kleiner, Thomas, & Lewis, 2007; Persichitte et al.,
1997). Yet, many SCDEs that participated in a 2006 survey about their educational
technology instruction of preservice teachers reported experiencing a number of
barriers to integrating technology within their programs (Kleiner et al., 2007);
therefore, it is difficult to ascertain the depth and breadth of preservice teachers’
technology instruction. According to the U.S. Congress Office of Technology
Assessment (OTA, 1995), it is the kind of instruction, not just its availability,
that is critical. For instance, fewer than one in ten graduates from teacher educa-
tion programs believed they were prepared to use more engaging technologies
such as “multimedia packages, electronic presentations, collaborations over
networks, or problem solving software” (p. 185). More than a decade later, NCES
reports corroborate these findings: Only 23% of public school teachers with up
to three years of experience and only 29% of those with four to nine years
of experience indicated that they felt very well prepared for integrating tech-
nology (Parsad, Lewis, & Farris, 2001) and only “One-third of teachers reported
feeling well or very well prepared to use computers and the Internet for instruc-
tion” (Wells & Lewis, 2006, p. 9). Until these issues are resolved, teachers will
continue “emerging from their preservice training to become part of the problem
of integrating technology into the classroom rather than part of the solution”
(Lawson, 1998, p. 1).
In an effort to address this problem, the University of Virginia’s Curry School
of Education, recognized for its leadership in integrating technology for over
a decade, has been participating in a number of endeavors to infuse technol-
ogy more effectively into its instructional program. One of its efforts was the
establishment of the Center for Technology and Teacher Education (CTTE)
in 1997; a goal of the CTTE is to prepare preservice teacher education stu-
dents to be educational technology leaders. To cultivate these leaders, the CTTE
promotes the integration of technology into methods courses and requires all
of its preservice teacher education students to complete a two-credit intro-
ductory technology course, EDLF 345, Introduction to Educational Technology.
Unlike most stand-alone technology courses, EDLF 345 incorporates content-
specific instruction in utilizing technology. Grouping students into three
different areas of emphasis—elementary, secondary Humanities, and secondary
Math/Science—allows instructors to design classes that move beyond mastery
of basic technology skills to instruction that encourages students to think of
how technology may be used in instructional practice.
Another objective of the CTTE was to gauge students’ prior technology train-
ing, usage, and attitudes toward technology to provide students with a better
learning experience and to determine if students were meeting state technology
standards. To meet these objectives, a pilot pre- and post-survey instrument was
administered to students enrolled in EDLF 345 during the fall semesters of 1998,
1999, and 2000. To investigate the longitudinal effects of the Curry School’s
technology efforts, a follow-up of this survey was administered to these former
students in 2006, most of whom had by then been teaching for several years.
With the survey now completed three times over a span of up to seven years
by almost 100 former Curry School teachers, we are able to investigate a variety of
longitudinal factors. The purpose of this study was to examine the attitudes toward
technology, confidence in using technology, and technology usage patterns of
teachers five to seven years after completing an educational technology course
designed for preservice teachers. Specifically, however, due to space limitations,
this article focuses only on teachers’ confidence levels about using technology,
both instructionally and personally.

LITERATURE REVIEW

Preservice Teachers’ Technology Instruction


Just where and how do preservice teachers learn to use technology? As the
introduction section indicated, most SCDEs offer some kind of technology
instruction, whether it is a stand-alone course or instruction infused across all
teacher education classes. A 2007 NCES report by Kleiner, Thomas, and Lewis
offers the most current information about teacher education programs. The authors
provide analyses of a “2006 national survey of all Title IV degree-granting
four-year postsecondary institutions on how teacher candidates within teacher
education programs for initial licensure are being prepared to use educational
technology once they enter the field” (p. 1). In addition, they examine the topics,
practices, and strategies (e.g., three-credit course) for teaching educational tech-
nology within these institutions, barriers to integrating technology within these
teacher education programs, information about how educational technology fits
into field experiences (student teaching), educational technology professional
development, and training opportunities for faculty, and perceived outcomes
of the educational technology preparation of preservice teachers within these
institutions.
Findings indicate that of the Title IV degree-granting four-year institutions
with teacher education programs for initial licensure, 93% reported “teaching
educational technology within methods courses” (Kleiner et al., 2007, p. 8), 88%
integrating technology into all or some (12%) of their programs, and 100% “teaching
the use of Internet resources and communication tools for instruction in all or
some teacher education programs” (Kleiner et al., 2007, p. 6). Moreover, 51% of
these “institutions offered three- or four-credit stand-alone courses in educational
technology in their programs, and about a third (34%) offered one- or two-credit
stand-alone courses in educational technology” (Kleiner et al., 2007, p. 8). It is
important to note, however, that this report excludes institutions that only have
graduate programs in teacher education. Even so, it provides an interesting picture
of how some institutions prepare preservice teachers to use technology.

Preservice Teacher Education Students’
Attitudes toward Technology

Willis and Mehlinger (1996) contend that attitudes of preservice and inservice
teachers are studied far more than any other aspect of technology in teacher
education. It is probably a frequent topic of study in teacher education because it
is easy to collect such data but also, more importantly, because self-efficacy, an
individual’s perceptions about his or her own ability to perform a specific function
(Bandura, 1993, 1997), is a good predictor of behavior. Those with low self-
efficacy tend to shy away from situations where they believe they have little
control or ability to handle a task. Consequently, those with low self-efficacy
toward technological innovation are likely to feel high levels of anxiety, and as a
result, resist learning to use computers. Those same feelings of inadequacy about
technology regulate the degree of commitment and perseverance an individual
is willing to put forth to the learning situation (Albion, 1999, 2001; Olivier &
Shapiro, 1993). Further, Cassidy and Eachus (2002) have found that self-
efficacy is a major indicator of the level and depth in which individuals use
computers. Therefore, by gauging levels of preservice teacher self-efficacy in
using technology, teacher educators can predict whether technology-related
coursework is effective or not. The factor under consideration in this study,
confidence, is closely tied to issues of self-efficacy.
Studies have found that typical preservice teachers are somewhat anxious
about computers, feel unprepared to use them, but want to learn about them
(Blythe & Nuttall, 1992; Lichtman, 1979; Mueller, Husband, Christou, & Sun,
1991). Willis and Mehlinger (1996) also noted studies that found completion
of a course on educational computing improves attitudes toward technology in
the classroom of inservice teachers (Baird, Ellis, & Kuerbis, 1989; Berger &
Carlson, 1988; Madsen & Sebastiani, 1987) and preservice teachers (Anderson,
1991; Huppert & Lazarowitz, 1991; Savenye, Davidson, & Orr, 1992). Other
more recent studies support these findings as well (Albion, 2001; Brinkeroff,
2006; Ertmer, Addison, Lane, Ross, & Woods, 1999; Gunter, Gunter, & Wiens,
1998; Jung, Rhodes, & Vogt, 2006; Kumar & Kumar, 2003; Milbrath & Kinzie,
2000; Nanjappa & Lowther, 2004; Okinaka, 1992; Richardson-Kemp & Yan,
2003; Stuve & Cassady, 2005; Wang, Ertmer, & Newby, 2004; Watson, 2006;
Yildirim, 2000).
Even though some of the more current studies expand upon previous
findings on teachers’ self-efficacy and technology (Stuve & Cassady, 2005; Wang
et al., 2004; Watson, 2006), other studies provide a more complex picture about
teachers’ perceptions of technology. For example, several researchers examine the
relationship between technology and teachers’ beliefs (Albion & Ertmer, 2002;
Ertmer, 2005; Judson, 2006; Levin & Wadmany, 2006; Lumpe & Chambers,
2001; Swain, 2006), as well as the assessment and influence of dispositions
vis-à-vis technology (Jung et al., 2006; Vannatta & Fordham, 2004). In general,
these studies show that although self-efficacy plays a large role in the degree
to which technology is utilized, factors such as owning a computer and the
quality of instruction, among several others, are often just as important as
one’s confidence in using technology.

DESIGN AND METHODOLOGY


After consulting with program faculty and researching other existing
instruments (Atkins & Vasu, 1998; Becker, 1994; Dawson, 1997; Delcourt &
Kinzie, 1993), we chose to create a new survey that utilized portions of Dawson’s
(1997) survey (i.e., confidence in instructional and personal use of technology). It
was chosen because it specifically addressed the Virginia Technology Standards
for Instructional Personnel (Virginia Department of Education, 1998). By 2002,
all preservice teacher education students in the state of Virginia were required to
demonstrate proficiency in these standards. In addition, two tested instruments
(Becker, 1994; Delcourt & Kinzie, 1993) were used in its development.
Participants were undergraduate and graduate students enrolled in EDLF 345
at the University of Virginia’s Curry School of Education during the fall semester
of 1998, fall semester of 1999, spring semester of 2000, and fall semester of
2000. Participants completed the survey instrument prior to taking EDLF 345,
just after completing EDLF 345, and again in a follow-up survey five to seven
years later in 2006. The pre-, post-, and follow-up survey instruments examined
the areas outlined in Table 1. Most of the survey items required answers in a
4-point Likert scale format, with responses ranging from Strongly Disagree (1) to
Strongly Agree (4). The items examined in this manuscript, survey items 5 and 6
(see Table 1), used the 4-point Likert scale format.
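To make this scoring concrete, the sketch below shows how 4-point Likert responses of this kind might be coded numerically and averaged into a respondent-level scale score. The item labels and responses are hypothetical, not taken from the actual instrument.

```python
# A minimal sketch of Likert-scale scoring, assuming hypothetical items and
# responses; the study's actual instrument and item numbering differ.
import pandas as pd

LIKERT = {"Strongly Disagree": 1, "Disagree": 2, "Agree": 3, "Strongly Agree": 4}

# Two hypothetical respondents answering three confidence items.
raw = pd.DataFrame({
    "item_a": ["Agree", "Strongly Agree"],
    "item_b": ["Disagree", "Agree"],
    "item_c": ["Agree", "Agree"],
})

coded = raw.replace(LIKERT)        # map response labels to 1-4 scores
scale_score = coded.mean(axis=1)   # equally weighted mean per respondent
print(scale_score)                 # roughly 2.67 and 3.33 for these two respondents
```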
The study used a one-group repeated measures design. The pre-
and post-surveys were administered to 262 undergraduate and graduate students
enrolled in the sections of EDLF 345 offered from 1998 to 2000. Response rates
for the pre- and post-surveys were 100% for the 1998 and 2000 groups, and
97.8% for the 1999 group. We located relatively reliable current contact infor-
mation for 175 (67%) of the 262 former students. These potential follow-up
participants were e-mailed and asked to complete a Web-based version of the
instrument. Of these 175 potential participants, 99 responded and fully completed
the follow-up survey, a response rate of 56%. Data reported here reflect analyses
from the 99 participants who completed all three surveys. Data were entered into
and analyzed with repeated measures analysis of variance using the statistical
software program SPSS.

Table 1. Pre- and Post-Survey Items

Pre-survey                                         Post-survey and follow-up survey
1. Student demographics                            1. —
2. Previous computer instruction (8 items)         2. —
3. Current use of technology (10 items)            3. Current use of technology (10 items)
4. Attitudes toward using a variety of             4. Attitudes toward using a variety of
   technology (10 items)                              technology (10 items)
5. Confidence in instructional use of              5. Confidence in instructional use of
   technology (21 items)                              technology (21 items)
6. Confidence in personal use of                   6. Confidence in personal use of
   technology (22 items)                              technology (22 items)
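As a quick arithmetic check of the sample figures reported above (262 students completing the pre- and post-surveys, 175 located for follow-up, 99 completing it), the snippet below reproduces the contact and response rates.

```python
# Back-of-the-envelope check of the rates reported in the text.
enrolled = 262    # students who completed the pre- and post-surveys
located = 175     # former students with reliable contact information
completed = 99    # fully completed follow-up surveys

print(f"Located: {located / enrolled:.1%}")                   # ~66.8%, reported as 67%
print(f"Follow-up response rate: {completed / located:.1%}")  # ~56.6%, reported as 56%
```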
Rather than perform individual analyses of each survey item, the researchers
focused on factors derived from principal axis factor analyses performed in 1999
to summarize data related to the post-survey items for both confidence portions
of the instrument (Confidence in Instructional Use of Technology and Confidence
in Personal Use of Technology). The factors may be considered as scales with
items equally weighted. After items were grouped into factors, reliability analyses
were performed on the items in each factor.
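The reliability analysis described here is conventionally computed as Cronbach's alpha over the items grouped into each factor. A minimal sketch is shown below, using simulated Likert responses rather than the study's data.

```python
# Cronbach's alpha for a set of items treated as one equally weighted scale.
# Simulated data; the study's actual item responses are not reproduced here.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of numeric scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

rng = np.random.default_rng(0)
trait = rng.integers(1, 5, size=(99, 1))                            # shared signal per respondent
items = np.clip(trait + rng.integers(-1, 2, size=(99, 6)), 1, 4)    # six correlated items
print(round(cronbach_alpha(items), 3))
```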
The research questions were:

1. Does instructional confidence in using technology increase as a result of
taking EDLF 345, and does it remain elevated five to seven years after the
completion of this course?
2. Does personal confidence in using technology increase as a result of taking
EDLF 345, and does it remain elevated five to seven years after the completion
of this course?

From these questions, two null hypotheses were formulated:

Null Hypothesis 1: There will be no significant change in instructional confidence
in using technology across all factors.
Null Hypothesis 2: There will be no significant change in personal confidence in
using technology across all factors.

A repeated measures analysis of variance design was used to test for significant
changes in confidence among course participants.
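The analysis itself was run in SPSS; the sketch below is a rough Python equivalent of the one-group repeated measures design using statsmodels' AnovaRM and simulated scores (the means and standard deviations loosely echo those reported later in the Results). Note that AnovaRM does not apply sphericity corrections, so the Huynh-Feldt adjustment described in the Results would be a separate step.

```python
# A sketch of the one-group repeated measures ANOVA, using simulated scores
# (not the study's data); the original analysis used SPSS.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
n = 99  # participants who completed all three surveys

long = pd.DataFrame({
    "subject": np.tile(np.arange(n), 3),
    "time": np.repeat(["pre", "post", "followup"], n),
    "confidence": np.concatenate([
        rng.normal(2.61, 0.47, n),   # pre-course (values echo the reported means)
        rng.normal(3.43, 0.35, n),   # post-course
        rng.normal(3.21, 0.44, n),   # follow-up
    ]),
})

# Within-subjects factor: time of measurement (pre vs. post vs. follow-up).
result = AnovaRM(long, depvar="confidence", subject="subject", within=["time"]).fit()
print(result)  # F test for the main effect of time (no sphericity correction applied)
```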

Context and Participants


The Curry School of Education offers a five-year teacher education program,
in which students earn a bachelor’s degree in an academic major, as well as a
Master of Teaching. All study participants were preservice teachers at the Curry
School sometime between 1998 and 2002. All were enrolled in EDLF 345,
Introduction to Educational Technology, in their first semester in the teacher
preparation program. Most students entered the program during their third
(junior) year at the University; however, several students had earned bachelor’s
degrees prior to taking the course (they were working only on a Master of
Teaching). Although students possessed a firm understanding of their content
areas, many lacked knowledge of educational methodology because they had
just begun to take educational methodology coursework within the school of
education. In addition, there was a wide scope of technology ability among
students, ranging from beginning to intermediate levels and above.
Students who exhibited intermediate-to-advanced skills at the beginning of the
semester were encouraged to enroll in EDLF 545, Instructional Computing, which
has a student-teaching component.

Course Description
EDLF 345, Introduction to Educational Technology, is a two-credit, performance-
based course. A primary objective of EDLF 345 is to ensure that all teacher
education students have a foundational level of technology expertise. An addi-
tional goal involves presenting technology instruction using a model that prepares
future teachers to employ these tools in a content-specific context. The course
introduced a variety of software throughout the semester. However, the emphasis
was not on learning how to use software. Rather, the instructors espoused a
constructivist theoretical perspective that emphasized learner-centered, problem-
based learning. Students completed assignments (competencies) that typically
integrated content area standards. Therefore, students not only learned to use
various digital tools, but also ways to implement these pedagogically with their
future students.
Each week, students completed a competency, an activity designed to give
students a minimal level of hands-on experience and practice integrating each
topic reviewed into their content area. Competencies consisted of a basic
assignment graded on a satisfactory/unsatisfactory basis, provided the competency
addressed all of the basic criteria stipulated. Unsatisfactory competencies
could be resubmitted for full points, allowing students to properly master each
week’s topic if they experienced difficulty. Depending upon the content focus of
the class, competencies might have included creating a newsletter using a word
processing program, presenting a lesson using PowerPoint, creating Web pages,
making spreadsheets, and developing WebQuests. Competencies usually
addressed a Virginia Standard of Learning (VSOL), as well as national curriculum
standards specific to the student’s content area. Students also maintained an
electronic journal for reflecting about their developing beliefs and experience
using technology over the course of the semester. Finally, the capstone project
for the course was the development of a “showcase” technology portfolio
(Shackelford, 1997) consisting of samples of students’ work, including a self-
evaluation of the quality of the work, an explanation of why the work was included
in the portfolio, and how—as a teacher—students might use the technology
represented toward their instructional goals (Kovalchick, 1997).
Starting in the 1998-1999 academic year, a major shift in the course arose out
of the CTTE’s and teacher education faculty’s desire to improve the course to
make it more meaningful for students by offering content-specific instruction.
Moreover, faculty expressed the need for students to enter their courses with
knowledge of how to apply technology in the students’ content areas. In the previous
year, it had become clear that such attempts were not as successful as they could have
been, given how students were grouped for the course. As a result, the CTTE
and the director of teacher education decided to divide EDLF 345 into three
separate areas of emphasis to facilitate integration of technology tools directly
into content-specific curriculum. The three different areas of emphasis were:
Elementary, Secondary Humanities, and Secondary Math & Science. Although
students in each section learned similar technology tools and applications, instruc-
tion and examples were aimed at the section’s particular content area. Consequently,
unlike in previous years, students experienced ways to develop an understanding
of how to apply these technology tools in their content areas.
There were several advantages to having students with similar content interests
in the same class. One was the opportunity for discussions about how the tech-
nological skills being learned can be integrated into content curricula. These
discussions were more fruitful in the content-specific courses offered than in
the mixed classes of previous years because students could debate with those
with similar teaching goals—for instance, how best to use these tools with a
tenth-grade high school science class. Another advantage was that students shared
ideas and lesson outlines they had created individually or cooperatively.
For instance, a student in the elementary section wrote in her e-journal: “Today’s
presentations were incredible. Seeing what other students came up with gave
me great ideas for the classroom.” On most occasions students shared many
innovative and creative ways of integrating technology into their content area.
Clearly, the opportunity to share ideas with peers was enhanced by providing
content-specific sections of EDLF 345. An additional advantage was the way
in which the course prepared teacher candidates for uses of technology incor-
porated into content methods courses. Methods course instructors were therefore
able to focus almost strictly on pedagogical issues, rather than set aside valuable
class time for the purpose of teaching technology skills.

RESULTS

Factor Analyses/Reliability

For the Confidence in Instructional Use of Technology items (21 items), four
factors were obtained from the analysis. Factor 1 contained six items reflecting
Instructional Confidence to Select, Evaluate, and Use Technology Tools in
Planning and Delivering Instruction. A reliability coefficient α = .8649 was found
for items grouped in this factor. Factor 2 contained six items reflecting
Instructional Confidence to Teach Students to Effectively Find and Use Electronic
Data/Information (item reliability α = .8648). Factor 3 contained five items
reflecting Instructional Confidence to Teach Students General Technology Terms
and General Technology Use (item reliability α = .8334). Factor 4 contained
three items reflecting Instructional Confidence to Facilitate Student Problem
Solving (item reliability α = .7523).
For the Confidence in Personal Use of Technology items (22 items), six factors
were obtained from the analysis. Factor 1 contained four items reflecting Personal
Confidence to Use Mindtools (Spreadsheets, Databases) (item reliability α = .8902).
Factor 2 contained three items reflecting Personal Confidence to Operate
a Computer (item reliability α = .8787). Factor 3 contained seven items reflecting
Personal Confidence to Explain Technology Tools or Use New Technology
Products (item reliability α = .8088). Factor 4 contained two items reflecting
Personal Confidence to Explain and Understand Legal and Ethical Issues of
Technology (item reliability α = .6929). Factor 5 contained four items reflecting
Personal Confidence to Use and Create Internet Resources (item reliability α = .7644).
Factor 6 contained two items reflecting Personal Confidence to Use a
Word Processor and E-mail (item reliability α = .5564).

Repeated Measures ANOVA

The Instructional Confidence scores were analyzed in an analysis of variance
(ANOVA) with time of measurement as the repeated factor (pre-test prior to
EDLF 345 vs. post-test after EDLF 345 vs. follow-up five to seven years later).
A repeated measures ANOVA was chosen because the same instrument was
administered repeatedly (three times). The sphericity assumption was not met
for any factor; therefore, the Huynh-Feldt correction was applied to adjust the
degrees of freedom in the ANOVA test to produce a more accurate significance
(p) value (Field & Hole, 2003).
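For readers who want to see what the correction does, the sketch below estimates the Greenhouse-Geisser epsilon from the covariance matrix of the three repeated measurements, converts it to the Huynh-Feldt estimate, and uses it to shrink both degrees of freedom. The formulas follow standard textbook definitions; this is an illustration, not a reproduction of the study's SPSS output.

```python
# Sketch of the Huynh-Feldt sphericity correction: estimate epsilon and scale
# the ANOVA degrees of freedom by it. Standard formulas; illustrative only.
import numpy as np

def huynh_feldt_epsilon(scores: np.ndarray) -> float:
    """scores: participants x time-points matrix (wide format)."""
    n, k = scores.shape
    S = np.cov(scores, rowvar=False)                                  # k x k covariance matrix
    S_dc = S - S.mean(axis=1, keepdims=True) - S.mean(axis=0, keepdims=True) + S.mean()
    eps_gg = np.trace(S_dc) ** 2 / ((k - 1) * np.sum(S_dc ** 2))     # Greenhouse-Geisser
    eps_hf = (n * (k - 1) * eps_gg - 2) / ((k - 1) * (n - 1 - (k - 1) * eps_gg))
    return min(eps_hf, 1.0)                                           # epsilon is capped at 1

# With k = 3 time points and n = 99 participants, the corrected degrees of
# freedom are eps * (k - 1) and eps * (k - 1) * (n - 1); an epsilon near .92
# yields degrees of freedom close to the F(1.85, 180.88) reported below.
```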
When combining all factors together, the main effect of time of measurement
was significant, F(1.85, 180.88) = 137.67, p < .0005. To delve deeper into the
differences among the three calculated means, post-hoc pairwise comparisons (pre-test
vs. post-test, post-test vs. follow-up, and pre-test vs. follow-up) were performed
using the Bonferroni adjustment for multiple comparisons. Analyses show that
EDLF 345 was effective in increasing overall instructional confidence using
technology. The overall instructional confidence score increased from a mean of
2.61 (SD = .47) before the course (pre-test) to a mean of 3.43 (SD = .35, p < .005)
immediately following the course (post-test). Although scores dipped on the
follow-up survey, down to a mean of 3.21 (SD = .44, p < .005), they remained
significantly higher than the pre-course mean (p < .005).
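The post-hoc step amounts to three paired comparisons with a Bonferroni adjustment. A sketch with simulated scores (means and standard deviations echoing those just reported, not the study's data) is shown below.

```python
# Sketch of Bonferroni-adjusted pairwise comparisons among the three
# measurement times, using simulated scores rather than the study's data.
from itertools import combinations
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 99
scores = {
    "pre": rng.normal(2.61, 0.47, n),
    "post": rng.normal(3.43, 0.35, n),
    "followup": rng.normal(3.21, 0.44, n),
}

pairs = list(combinations(scores, 2))      # pre/post, pre/followup, post/followup
for a, b in pairs:
    t, p = stats.ttest_rel(scores[a], scores[b])
    p_adj = min(p * len(pairs), 1.0)       # Bonferroni: multiply by the number of tests
    print(f"{a} vs {b}: t = {t:.2f}, Bonferroni-adjusted p = {p_adj:.4f}")
```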
Analyses of individual factors related to instructional confidence yielded results
similar to the overall instructional confidence results discussed above, with
dramatic gains from pre-test to post-test and modest dips from post-test to the
follow-up survey administered five to seven
years later. Despite the modest decrease from post-test to follow-up, statistically
significant improvements were still apparent when making a pairwise comparison
from pre-test to follow-up (p < .005 in every case). One anomaly with the
instructional confidence factors was with Factor 3, Instructional Confidence to
Teach Students General Technology Terms and General Technology Use. As
with all factors, a significant gain was made from pre-test to post-test, but for
this factor the decrease from post-test to follow-up was significant at a weaker level (p < .05 as opposed
to p < .005 as seen with the other factors). Table 2 presents a summary of the
results of the analysis of variance tests for instructional confidence, as well as a
sample survey question for each factor.
Personal Confidence scores were also analyzed in an analysis of variance
with time of measurement (pre-test prior to EDLF 345 vs. post-test after EDLF
345 vs. follow-up five to seven years later). The sphericity assumption was
met with Factors 3, 4, and 5, but not with Factors 1, 2, and 6. To err on the side
of caution, the Huynh-Feldt correction was applied to all six factors. When
combining all factors together, the main effect of time of measurement was
significant, F(1.99, 190.99) = 201.83, p < .0005. Post-hoc pairwise comparisons
(pre-test vs. post-test, post-test vs. follow-up, and pre-test vs. follow-up) were
performed using the Bonferroni adjustment for multiple comparisons. Analyses
showed that EDLF 345 was effective in increasing overall personal confidence
using technology. The overall personal confidence score increased from a mean
of 2.67 (SD = .38) before the course to a mean of 3.42 (SD = .35, p < .005)
immediately following the course. Although scores dipped from post-test to
the follow-up survey to a mean of 3.32 (SD = .35, p < .05), they remained at
statistically significantly higher levels than the pre-course mean
(p < .005).
Table 2. Instructional Confidence ANOVA

Factor (Instructional Confidence to:)                     F value   Significance   Sample survey question (I feel confident I can:)

No Factor                                                 137.67    .000           —
Factor 1: Select, Evaluate, and Use Technology Tools
  in Planning and Delivering Instruction                  155.64    .000           26. Select technology resources to support instruction
Factor 2: Teach Students to Effectively Find and Use
  Electronic Data/Information                              87.89    .000           39. Teach search strategies for use on the Internet and CD-ROMs
Factor 3: Teach Students General Technology Terms
  and General Technology Use                               57.61    .000           34. Teach students basic technology vocabulary (i.e., cursor, memory)
Factor 4: Facilitate Student Problem Solving               56.72    .000           36. Teach students to use technologies for problem solving

Analyses of Factors 1 and 5 yielded results very similar to those related to


overall personal confidence discussed above. Namely, scores made significant
gains from pre-test to post-test, dipped somewhat on the five-to-seven-year
follow-up survey, but were still significantly higher than initial pre-test scores.
Analysis of individual factors related to Factors 2, 3, 4, and 6, however, yielded
some interesting anomalies not fully consistent with the overall results and those
related to Factors 1 and 5. For Factor 2, Personal Confidence to Operate a
Computer, gains made from pre-test to post-test were significant, p < .05, but the
typical decreases seen from post-test to follow-up were not encountered, p > .8.
For Factor 3, Personal Confidence to Explain Technology Tools or Use New
Technology Products, a significant gain was made from pre-test to post-test, with
this number remaining virtually unchanged for the follow-up, p > .9. For Factor 4,
Personal Confidence to Explain and Understand Legal and Ethical Issues of
Technology, modest gains were made from pre-test (M = 2.23, SD = .60) to
post-test (M = 2.67, SD = .67, p < .005); there were also gains made from post-test
to follow-up five to seven years later (M = 2.91, SD = .55, p < .005). This is
the only factor in both the personal and instructional confidence categories to
see gains made from post-survey to follow-up. Finally, for Factor 6, Personal
Confidence to Use a Word Processor and E-mail, gains made from pre-test to
follow-up were not statistically significant (p > .05), which can likely be attrib-
uted to teachers’ already high levels of confidence using technology for these
purposes before they started their teacher education coursework. Table 3 presents
a summary of the results of the analysis of variance tests for personal confidence
and a sample survey question within each factor.

DISCUSSION OF RESULTS

Findings from an earlier study (Milman & Molebash, 2000; Molebash &
Milman, 2001) on these same participants using the pre- and post-survey results
showed significant increases in personal and instructional confidence in using
technology. These findings were not surprising from a common sense standpoint
considering the “technological high” many preservice teachers experience after
completing an introductory technology course. This idea is supported by other
studies that have resulted in similar conclusions (Gunter et al., 1998; McInerney,
McInerney, & Sinclair, 1990; Okinaka, 1992; Von Holzen & Price, 1990).
However, few, if any, studies have continued to track these teachers into the
teaching field as this study has.
One of the purposes of this study was to gauge teachers’ confidence levels about
using technology, both instructionally and personally, five to seven years
after taking an educational technology course designed for preservice teachers.
Due to space limitations, this manuscript only offers analyses of the confidence
portions of the instrument. The results of this study illustrate that although there
was a modest dip in confidence over the years, confidence remained significantly
higher than the levels participants exhibited before taking EDLF 345.

Table 3. Personal Confidence ANOVA

Factor (Personal Confidence to:)                          F value   Significance   Sample survey question (I feel confident I can:)

No Factor                                                 201.83    .000           —
Factor 1: Use Mindtools (Spreadsheets, Databases)         163.96    .000           18. Use a database by sorting an existing database and creating a new database to manage and interpret information
Factor 2: Operate a Computer                                5.40    .000           2. Start and use software programs
Factor 3: Explain Technology Tools or Use New
  Technology Products                                     191.29    .000           5. Identify and explain basic computer components
Factor 4: Explain and Understand Legal and Ethical
  Issues of Technology                                     44.98    .000           11. Explain copyright laws as they relate to the Internet
Factor 5: Use and Create Internet Resources               141.72    .000           13. Use search strategies to retrieve electronic information
Factor 6: Use a Word Processor and E-mail                  10.04    .000           8. Use electronic mail (e-mail)

The Curry School is widely recognized for the integration of technology
throughout all of its methods courses and, as a result, other studies focusing on
methodology courses have explored the technology skills and strategies empha-
sized in EDLF 345. Although this study did not examine confidence immediately
following these courses, it is likely that they played a role in maintaining, and
perhaps increasing, teachers’ confidence in using technology in the short- and
long-term. Many other factors also might have affected teachers’ confidence
such as well-known barriers to technology use and integration (OTA, 1995).

FUTURE RESEARCH
Through this study, a rich set of data has been collected that fosters the future
investigation of several variables related to the preparation teachers had to
integrate technology and its relationship, if any, with their current practice of using
technology. This study examined the within-subject effects of instructional and
personal confidence in using technology, but there is much to be gained by
investigating a variety of potential between-subject effects, such as gender, grade
level, content area, urban vs. rural schools, etc. Furthermore, other quantitative
data responses included in the follow-up survey, such as the Current Usage of
Technology and Attitudes Toward Technology, will also be analyzed, as well as
the qualitative data collected (although not requested, many participants included
very long descriptions/reactions to the questions contained in the post-survey
they completed years later). Perhaps most exciting about the responses collected
was that 98 of the 99 participants agreed to allow the researchers to contact them
for further interviews and classroom observations. The stories these teachers
have to share regarding the successes and frustrations they have experienced
integrating technology in their teaching will be invaluable to teacher preparation
programs seeking to develop programs of study that will yield teachers who
integrate technology effectively. A follow-up study might also examine how the
technology course impacts student learning in these teachers’ classrooms, if
at all. Finally, there continues to be a need for longitudinal research to help
researchers, educators, and policymakers better understand whether or not
SCDEs are adequately preparing teacher candidates to teach with technology
and to prepare their own students to use it.

REFERENCES
Albion, P. (1999). Self-efficacy beliefs as an indicator of teachers’ preparedness for
teaching with technology. In J. Price et al. (Eds.), Proceedings of Society for
Information Technology and Teacher Education International Conference 1999
(pp. 1602-1608). Chesapeake, VA: AACE.
Albion, P. (2001). Some factors in the development of self-efficacy beliefs for computer
use among teacher education students. Journal of Technology and Teacher Education,
9(3), 321-347.
Albion, P. R., & Ertmer, P. A. (2002). Beyond foundations: The role of vision and belief
in teachers’ preparation for integration of technology. TechTrends, 46(5), 34-38.
American Council on Education. (1999). To touch the future: Transforming the way
teachers are taught. Washington, DC: Author.
Anderson, S. (Ed.). (1991). Computer attitudes, experience, and usage of computer confer-
encing by preservice educators. Charlottesville, VA: Association for the Advancement
of Computing in Education.
Atkins, N. E., & Vasu, E. S. (1998). The teaching with technology instrument. Learning
and Leading with Technology, 25(8), 35-59.
Baird, W., Ellis, J., & Kuerbis, P. (1989). Enlist micros: Training science teachers to use
microcomputers. Journal of Research in Science Teaching, 26(7), 567-598.
Bandura, A. (1993). Perceived self-efficacy in cognitive development and functioning.
Educational Psychologist, 28(2), 117-148.
Bandura, A. (1997). Self-efficacy: The exercise of control. New York: W. H. Freeman.
Becker, H. J. (1994). How exemplary computer-using teachers differ from other teachers:
Implications for realizing the potential of computers in schools. Journal of Research
on Computing in Education, 26(3), 291-321.
Berger, C., & Carlson, E. (1988). Measuring computer literacy of teacher trainers. Journal
of Educational Computing Research, 4(3), 247-253.
Blythe, K., & Nuttall, W. (1992). Reflections on impact of an information technology
development programme upon the planned curriculum of an institution engaged in
initial teacher training. Developing Information Technology in Teacher Education, 2, 3-50.
Brinkeroff, J. (2006). Effects of a long-duration professional development academy
on technology skills, computer self-efficacy, and technology integration beliefs and
practices. Journal of Research on Technology in Education, 39(1), 22-43.
Cassidy, S., & Eachus, P. (2002). Developing the Computer User Self-Efficacy (CUSE) scale:
Investigating the relationship between CSE, gender and experience with computers.
Journal of Educational Computing Research, 26(2), 133-153.
CEO Forum of Educational Technology. (1999, February). Professional development:
A link to better learning. Washington, DC: Author. Retrieved March 24, 2000, from
http://www.ceoforum.org/reports.cfm?RID=2
Dawson, K. (1997). Factors that influence the classroom use of technology. Unpublished
doctoral dissertation, University of Virginia, Charlottesville.
Delcourt, M. B., & Kinzie, M. B. (1993). Computer technologies in teacher education:
The measurement of attitudes and self-efficacy. Journal of Research and Development
in Education, 27(1), 35-41.
Ertmer, P. A. (2005). Teacher pedagogical beliefs: The final frontier in our quest for
technology integration? Educational Technology Research and Development, 36(5),
57-59.
Ertmer, P. A., Addison, P., Lane, M., Ross, E., & Woods, D. (1999). Examining teachers’
beliefs about the role of technology in the elementary classroom. Journal of Research
on Computing in Education, 32(1), 55-71.
Field, A. P., & Hole, G. (2003). How to design and report experiments. Thousand Oaks,
CA: Sage Publications.
Gunter, G. A., Gunter, R. E., & Wiens, G. A. (1998). Teaching pre-service teachers
technology: An innovative approach. Paper presented at the Society for Information
Technology and Teacher Education International Conference, Washington, DC. (ERIC
Document Reproduction Service No. ED 421 112.)
Huppert, J., & Lazarowitz, R. (1991). Student teachers’ attitudes toward computer use in
science classrooms. Journal of Computing in Teacher Education, 7(3), 12-16.
International Society for Technology in Education (ISTE). (1999). Will new teachers be
prepared to teach in a Digital Age? A national survey on information technology in
teacher education. Santa Monica, CA: Milken Exchange on Education Technology.
Retrieved March 14, 2006, from
http://www.milkenexchange.org/research/iste_results.html
Judson, E. (2006). How teachers integrate technology and their beliefs about learn-
ing: Is there a connection? Journal of Technology and Teacher Education, 14(3),
581-597.
Jung, E., Rhodes, D. M., & Vogt, W. P. (2006). Disposition assessment in teacher
education: A framework and instrument for assessing technology disposition. The
Teacher Educator, 41(4), 207-233.
Kleiner, B., Thomas, N., & Lewis, L. (2007). Educational technology in teacher education
programs for initial licensure. Washington, DC: National Center for Education
Statistics, Institute of Education Sciences, U.S. Department of Education.
Kovalchick, A. (1997). Technology portfolios as instructional strategy: Designing a
reflexive approach to preservice technology training. TechTrends, 42(4), 31-37.
Kumar, P., & Kumar, A. (2003). Effects of a Web-based project on preservice and inservice
teachers’ attitude toward computers and their technology skills. Journal of Computing
in Teacher Education, 19(3), 87-92.
Lawson, J. (1998). Chair’s message. Outlook, 20(1), 1-2.
Levin, T., & Wadmany, R. (2006). Teachers’ beliefs and practices in technology-based
classrooms: A developmental view. Journal of Research on Technology in Education,
39(2), 157-181.
Lichtman, D. (1979). Survey of educators’ attitudes toward computers. Creative Com-
puting, 5(1), 48-50.
Lumpe, A. T., & Chambers, E. (2001). Assessing teachers’ context beliefs about tech-
nology use. Journal of Research on Technology in Education, 34(1), 93-107.
Madsen, J., & Sebastiani, L. (1987). The effect of computer literacy instruction on teachers’
knowledge of and attitudes toward microcomputers. Journal of Computer-Based
Instruction, 14(2), 68-72.
McInerney, V., McInerney, D. M., & Sinclair, K. E. (1990). Computer anxiety and student
teachers: Interrelationships between computer anxiety, demographic variables and
an intervention strategy. Paper presented at the annual conference of the AARE,
Sydney, New South Wales, Australia. (ERIC Document Reproduction Service No.
ED 352 940.)
Milbrath, Y.-C., & Kinzie, M. (2000). Computer technology training for prospective
teachers: Computer attitudes and perceived self-efficacy. Journal of Technology and
Teacher Education, 8(4), 373-396.
Milman, N. B., & Molebash, P. E. (2000). Preservice teacher education students’ tech-
nology competence: Where do we stand? Paper presented at the American Educational
Research Association Annual Meeting, New Orleans, Louisiana.
Molebash, P., & Milman, N. B. (2001). Preservice teacher education students’ prior
training, usage, and attitudes towards using technology. Paper presented at the
American Educational Research Association Annual Meeting, Seattle, Washington.
Mueller, R., Husband, T., Christou, C., & Sun, A. (1991). Preservice teacher attitudes
towards computer technology: A log-linear analysis. Mid-Western Educational
Researcher, 4(2), 23-27.
Nanjappa, A., & Lowther, D. (2004). The role of computer self-efficacy in tech-
nology integration beliefs of school teachers in India. In C. Crawford et al. (Eds.),
Proceedings of Society for Information Technology and Teacher Education Inter-
national Conference 2004 (pp. 1295-1302). Chesapeake, VA: AACE.
National Commission on Excellence in Education. (1983). A nation at risk. Washington,
DC: U.S. Government Printing Office.
National Council for Accreditation of Teacher Education (NCATE). (1997). Technology
and the new professional teacher: Preparing for the 21st century classroom.
Washington, DC: Author.
Okinaka, R. (1992). The factors that affect teacher attitude towards computer use. Pomona,
CA. (ERIC Document Reproduction Service No. ED 345 039.)
Olivier, T. A., & Shapiro, F. (1993). Self-efficacy and computers. Journal of Computer-
Based Instruction, 20(3), 81-85.
Parsad, B., Lewis, L., & Farris, E. (2001). Teacher preparation and professional develop-
ment: 2000. Washington, DC: National Center for Education Statistics, U.S. Depart-
ment of Education.
Persichitte, K. A., Tharp, D. D., & Caffarella, E. P. (1997). The use of technology by
schools, colleges, and departments of education. Washington, DC: American Asso-
ciation of Colleges for Teacher Education.
Richardson-Kemp, C., & Yan, W. (2003). Urban school teachers’ self-efficacy beliefs
and practices, innovation practices, and related factors in integrating technology.
In C. Crawford et al. (Eds.), Proceedings of Society for Information Technology and
Teacher Education International Conference 2003 (pp. 1073-1076). Chesapeake,
VA: AACE.
Savenye, W., Davidson, G., & Orr, K. (1992). Effects of an educational computing course
on preservice teachers’ attitudes and anxiety toward computers. Journal of Computing
in Childhood Education, 3(1), 31-41.
Shackelford, R. L. (1997). Student portfolios: A process/product learning and assess-
ment strategy. The Technology Teacher, 55(8), 31-36.
Stuve, M., & Cassady, J. (2005). A factor analysis of the NETS performance profiles:
Searching for constructs of self-concept and technology professionalism. Journal of
Technology and Teacher Education, 13(2), 304-324.
Swain, C. (2006). Preservice teachers’ self-assessment using technology: Determining
what is worthwhile and looking for changes in daily teaching and learning practices.
Journal of Technology and Teacher Education, 14(1), 29-59.
U.S. Congress Office of Technology Assessment (OTA). (1995). Teachers and technology:
Making the connection. Washington, DC: U.S. Government Printing Office.
Vannatta, R. A., & Fordham, N. (2004). Teacher dispositions as predictors of class-
room technology use. Journal of Research on Technology in Education, 36(3),
253-271.
Virginia Department of Education. (1998, March). Technology standards for instructional
personnel. Richmond, VA: Author. Retrieved March 24, 2000, from
http://www.pen.k12.va.us/VDOE/Compliance/TeacherED/tech.html
Von Holzen, R., & Price, R. V. (1990). Five year trends in computer students’ attitudes and
skill-levels. Lubbock, TX. (ERIC Document Reproduction Service No. ED 333 860.)
Wang, L., Ertmer, P. A., & Newby, T. J. (2004). Increasing preservice teachers’ self-
efficacy beliefs for technology integration. Journal of Research on Technology in
Education, 36(3), 231-250.
Watson, G. (2006). Technology professional development: Long-term effects on teacher
self-efficacy. Journal of Technology and Teacher Education, 14(1), 151-165.
Wells, J., & Lewis, L. (2006). Internet access in U.S. public schools and classrooms:
1994–2005. Washington, DC: U.S. Department of Education, National Center for
Education Statistics.
Willis, J. W., & Mehlinger, H. D. (1996). Information technology and teacher education.
In J. Sikula (Ed.), Handbook of research on teacher education (pp. 978-1029). New
York: Macmillan Library Reference.
Yildirim, S. (2000). Effects of an educational computing course on preservice and inservice
teachers: A discussion and analysis of attitudes and use. Journal of Research on
Computing in Education, 32(4), 479-495.

Direct reprint requests to:


Dr. Natalie B. Milman
2134 G ST, NW
Washington, DC 20052
e-mail: nmilman@gwu.edu
