JEFFREY R. STOWELL
DAN BENNETT
Eastern Illinois University, Charleston
ABSTRACT
Increased use of course management software to administer course exams
online for face-to-face classes raises the question of how well test anxiety
and other emotions generalize from the classroom to an online setting. We
hypothesized that administering regular course exams in an online format
would reduce test anxiety experienced at the time of the exam and improve
exam scores. We recruited 69 participants from a psychology course to take
classroom- and online-delivered exams, using a counterbalanced crossover
design. We found that students who normally experience high levels of test
anxiety in the classroom had reduced test anxiety when taking online exams,
while the reverse was true for those low in classroom anxiety. Furthermore,
the relationship between test anxiety and exam performance was weaker in
an online setting than in the classroom. We recommend that instructors
evaluate the potential impact of these findings when considering offering
examinations online.
METHOD
Participants
All 69 students enrolled in the first author’s Psychology of Learning course
participated in exchange for course credit. Seventeen participants were men
and 52 were women, with a mean age of 21.3 (SD = 2.2) years. The majority
(91%) of the students were junior or senior status, and 59 (86%) were Caucasian,
8 (12%) African American, 1 (1%) American Indian, and 1 (1%) Asian American.
Seventy-five percent of the participants had taken an online exam before participating in this study.
Instruments
AEQ
We assessed test-related emotions with the Achievement Emotions Questionnaire (AEQ; Pekrun, Goetz, Titz, & Perry, 2002), which participants completed immediately before each exam (AEQ-Before) and immediately after it (AEQ-During; see Procedure).
Exams
From the first author's Psychology of Learning class, we selected the third
and fourth regular exams for the study (there were four regular exams during
ONLINE AND CLASSROOM TEST ANXIETY / 165
the semester, each spaced 3 weeks apart). The online exams were identical to the
classroom exams, and students had the same amount of time, 50 minutes, to
complete them once they began. Each exam covered learning topics (e.g., operant
conditioning, vicarious learning) discussed in the textbook and the classroom
since the previous exam and contained true/false and multiple-choice questions,
with about 30% of the maximum 50 points coming from short-answer questions.
Each exam accounted for 11% of the total course grade. For the online exams, we
used course management software (WebCT 8.0) to deliver the exam questions all
at once, allowing students to answer them in any order they desired. Students
completed classroom exams, administered by the instructor, on paper, recording
their answers on a Scantron sheet.
Procedure
Students whose last names started with A-M (n = 35) completed their third
exam online anytime during a 5-day window that started when the other students
(N-Z, n = 34) took their exam in the classroom. We designed the online testing
condition to match the standard practice of giving students a window of time to
complete an online exam (this flexibility being one of the major advantages over
classroom testing). We instructed students taking the exam in the classroom not
to share information about the exam with students who would be taking it online.
For the fourth exam, we switched the format for all participants (i.e., A-M in the
classroom, N-Z online), resulting in a counterbalanced crossover design in which
we could compare each participant's data under both testing formats.
At the time of both exams, we instructed participants to complete the
AEQ-Before immediately prior to taking the exam and the AEQ-During
immediately afterwards. When participants took their exams online in WebCT,
the AEQ surveys were administered online, but four participants failed to
complete them at the appropriate time, leaving online AEQ data for 65
participants. For various reasons, three students did not take their paper exam
with the rest of the class, so we excluded their AEQ data and classroom exam
scores from analyses. Where relevant, our reported measure of effect size is
partial eta squared (ηp²).
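For effects with one numerator degree of freedom, as reported in the Results below, partial eta squared can be recovered directly from the F statistic and its degrees of freedom. The following short sketch (ours, not part of the original article; the authors used SPSS) reproduces the reported values:

```python
# Partial eta squared for an F test with df1 numerator and df2 denominator
# degrees of freedom: eta_p^2 = (F * df1) / (F * df1 + df2).
def partial_eta_squared(F, df1, df2):
    return (F * df1) / (F * df1 + df2)

# Effects reported in the Results section:
print(round(partial_eta_squared(24.19, 1, 63), 2))  # 0.28 (classroom anxiety)
print(round(partial_eta_squared(5.80, 1, 63), 2))   # 0.08 (anxiety x format)
print(round(partial_eta_squared(21.26, 1, 60), 2))  # 0.26 (median split x format)
```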
RESULTS
The group of students that took the online exam first (A-M) did not differ
significantly from the other students in the class on demographic factors, test
anxiety, or exam scores (see Table 1). General Linear Model (GLM) analyses
in SPSS found no significant main effects of group or interactions with group,
indicating that the counterbalanced design was effective in reducing
treatment-order effects.
166 / STOWELL AND BENNETT
Table 1. Characteristics of the two testing-order groups, M (SD) or %

                                 Online first (A-M)   Classroom first (N-Z)
Demographics
  Grade point average            3.11 (0.48)          3.13 (0.50)
  Age                            21.2 (1.7)           21.4 (2.5)
  Female                         80.0%                70.6%
  Exam score on prior exams      79.1%                80.3%
  Taken an online exam before    73.5%                76.5%
Dependent measures
  Exam 3, test anxiety score     34.0 (8.9)           30.8 (8.4)
  Exam 4, test anxiety score     34.0 (11.0)          33.8 (7.4)
  Exam 3, exam score             42.5 (4.2)           41.3 (6.3)
  Exam 4, exam score             36.8 (6.5)           36.4 (6.6)
Exam Performance
We considered classroom-based test anxiety to be a baseline estimate of general
test anxiety that could influence both classroom and online exam scores. Thus, we
treated classroom anxiety as a covariate in a GLM model with exam format as
a repeated measures factor (classroom, online) and exam score as the dependent
variable. We found a main effect of classroom test anxiety, which was associated
with poorer exam performance across exam formats, F(1, 63) = 24.19, p < .001,
ηp² = .28. More importantly, the relationship between classroom test anxiety
and exam score depended on exam format, F(1, 63) = 5.80, p = .02, ηp² = .08.
Follow-up correlational analyses helped illustrate the nature of this interaction:
whereas high classroom test anxiety was associated with poor performance in
the classroom (r = –.57, p < .001, n = 66), it was less strongly associated with
exam performance online (r = –.28, p = .02, n = 65).
Furthermore, the correlation between anxiety and performance in the classroom
differed significantly from the correlation between anxiety and performance
online (r = –.29, p = .02, n = 65), z = 1.95, p = .03 (one-tailed), suggesting
that the relationship between anxiety and exam performance was stronger in the
classroom setting than in the online setting.
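The article does not name the test used for this comparison, but the reported z is consistent with Fisher's r-to-z transformation applied to the two correlations as if they came from independent samples. A sketch under that assumption (ours, for illustration):

```python
import math
from statistics import NormalDist

def compare_correlations(r1, n1, r2, n2):
    """Fisher r-to-z comparison of two correlations, treating the samples
    as independent (an approximation here, since the same students
    contributed to both correlations)."""
    z1, z2 = math.atanh(r1), math.atanh(r2)
    se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    z = (z1 - z2) / se
    p_one_tailed = 1.0 - NormalDist().cdf(z)
    return z, p_one_tailed

# Magnitudes of the classroom (r = .57, n = 66) and online (r = .29, n = 65)
# anxiety-performance correlations:
z, p = compare_correlations(0.57, 66, 0.29, 65)
print(round(z, 2), round(p, 2))  # 1.95 0.03
```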
Although our primary focus was on test anxiety, the other negative emotions of
anger, shame, and hopelessness were also associated with poorer exam
performance, while the positive emotions of enjoyment, hope, and pride were
associated with better exam performance (see Table 2).
Table 2. Correlations between exam-related emotions and exam scores

                    Online exam score   Classroom exam score
Online emotions
  Enjoyment         0.35**              0.01
  Hope              0.30*               0.18
  Pride             0.36**              0.05
  Anger             –0.34**             –0.16
  Anxiety           –0.29*              –0.26*
  Shame             –0.41**             –0.16
  Hopelessness      –0.42**             –0.25
Classroom emotions
  Enjoyment         0.23                0.40**
  Hope              0.25*               0.29*
  Pride             0.27*               0.39**
  Anger             –0.07               –0.30*
  Anxiety           –0.28*              –0.56**
  Shame             –0.11               –0.59**
  Hopelessness      –0.23               –0.59**

*p < .05; **p < .01.
Test Anxiety
To determine if differences between students’ classroom and online anxiety
depended on classroom (i.e., “baseline”) anxiety levels, we entered classroom test
anxiety as a factor (high/low, median split) in a GLM model, with exam format as a
repeated measures factor, and anxiety level as the dependent variable. There was
no main effect of exam format on anxiety level, F(1, 60) = 1.14, p > .28; however,
we found a strong interaction between classroom anxiety level and exam format,
F(1, 60) = 21.26, p < .001, ηp² = .26. Students who experienced high levels of
anxiety in the classroom had significantly lower test anxiety scores when taking
the exam online, t(32) = –5.03, p < .001, whereas students low in classroom
anxiety had significantly higher test anxiety levels when taking the exam online,
t(28) = 2.08, p < .05 (see Figure 1).
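The follow-up comparisons above are paired t-tests on each student's classroom and online anxiety scores within the high- and low-anxiety subgroups. A minimal sketch of that computation, using hypothetical scores rather than the study's data:

```python
import math
from statistics import mean, stdev

def paired_t(classroom, online):
    """Paired t statistic for within-subject classroom vs. online scores."""
    diffs = [o - c for c, o in zip(classroom, online)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1  # t statistic and degrees of freedom

# Hypothetical anxiety scores for five high-anxiety students (illustration
# only; not the study's data):
classroom = [40, 38, 42, 36, 39]
online = [33, 35, 36, 34, 35]
t, df = paired_t(classroom, online)
```

A negative t here, as in the reported t(32) = –5.03, indicates lower anxiety online than in the classroom.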
The majority of the students in our sample had taken online exams before
(75%). Of those who had, 59% indicated a preference for the online format,
whereas only 35% of the 17 students who were new to online testing indicated a
similar preference. Experience in taking online exams did not affect online
anxiety levels or online exam scores, ps > .48. It also had a negligible effect
on the previous GLM models when included as a covariate.
[Figure 1. Mean test anxiety scores by exam format for students high versus low in classroom test anxiety.]
Overall, slightly more than half of the students indicated a preference for taking
their exams online (54%), compared to the traditional classroom setting (46%).
Interestingly, those who indicated a preference for online exams had higher
classroom test anxiety (M = 35.5, SD = 9.1) than those who indicated a preference
for the classroom setting (M = 29.0, SD = 9.8), t(64) = 2.78, p = .007.
Our sample included a relatively small number of men, but they reported
significantly lower levels of test anxiety than women (M = 29.9, SD = 7.4 versus
M = 34.3, SD = 7.7), F(1, 60) = 6.28, p = .02, ηp² = .10. This sex difference did
not depend on exam format, p > .52.
DISCUSSION
Contrary to our hypothesis that online testing would be less anxiety-provoking
than classroom testing, students overall reported comparable levels of test
anxiety and performed equally well under both exam conditions. However, these
findings were not true of all students. Indeed, we found moderately sized effects that
format of the exam, which would have been difficult because the student responses
were not anonymous. Although it is difficult to rule out the possibility that students
used the internet to look up test answers or collaborated with a fellow student
during the online exam, we found nearly identical means, standard deviations,
and grade distributions across exam formats. Thus, if students did cheat, there was
no appreciable effect on their grades. Alternatively, students might have
performed worse online had it not been for the opportunity to cheat, resulting
in performance comparable to their classroom performance. Finally, students’
perceptions about
cheating are probably exaggerated, as Engler, Landau, and Epstein (2008) noted
that students overestimate the likelihood of their peers cheating in relation to
the likelihood of themselves cheating.
In conclusion, online testing is a double-edged sword, with potential benefits
for some students but negative consequences for others. The greatest benefits
appear to accrue to students who ordinarily experience high levels of negative
emotions during classroom examinations. To reduce unpleasant emotional
reactions and improve the exam performance of certain students, instructors may
consider offering online testing as an alternative to classroom examinations.
At present, however, not all students appear ready or willing to embrace
online testing.
REFERENCES
Alexander, M. W., Bartlett, J. E., Truell, A. D., & Ouwenga, K. (2001). Testing in a
computer technology course: An investigation of equivalency in performance between
online and paper and pencil methods. Journal of Career and Technical Education, 18,
69-80. Retrieved from http://scholar.lib.vt.edu/ejournals/JCTE/v18n1/alexander.html
Allen, I. E., & Seaman, J. (2008). Staying the course: Online education in the United States.
Needham, MA: The Sloan Consortium. Retrieved from http://www.sloan-c.org/
publications/survey/pdf/staying_the_course.pdf
Engler, J. N., Landau, J. D., & Epstein, M. (2008). Keeping up with the Joneses: Students’
perceptions of academically dishonest behavior. Teaching of Psychology, 35, 99-102.
Godden, D. R., & Baddeley, A. D. (1975). Context-dependent memory in two natural
environments: On land and underwater. British Journal of Psychology, 66, 325-331.
Hartley, J., & Nicholls, L. (2008). Time of day, exam performance and new technology.
British Journal of Educational Technology, 39, 555-558. doi: 10.1111/j.1467-8535.2007.00768.x
Hembree, R. (1988). Correlates, causes, effects, and treatment of test anxiety. Review of
Educational Research, 58, 47-77. doi: 10.3102/00346543058001047
Lazarus, R. S. (1999). Stress and emotion: A new synthesis. New York: Springer.
Marszalek, J. (2007). Computerized adaptive testing and the experience of flow in
examinees [Abstract]. Dissertation Abstracts International Section A: Humanities and
Social Sciences, 67, 2465. Retrieved from http://search.ebscohost.com/login.aspx?
direct=true&db=psyh&AN=2007-99001-021&site=ehost-live
Paek, P. (2005). Recent trends in comparability studies. Pearson. Retrieved from
http://www.pearsonedmeasurement.com/research/research.htm
Pekrun, R., Goetz, T., Titz, W., & Perry, R. P. (2002). Academic emotions in students’
self-regulated learning and achievement: A program of qualitative and quantitative
research. Educational Psychologist, 37, 91-105. Retrieved from http://search.epnet.
com/login.aspx?direct=true&db=afh&an=6790645
Powers, D. E. (2001). Test anxiety and test performance: Comparing paper-based and
computer-adaptive versions of the Graduate Record Examination (GRE®) general test.
Journal of Educational Computing Research, 24, 249-273.
Russell, M., Goldberg, A., & O’Connor, K. (2003). Computer-based testing and validity:
A look back into the future. Assessment in Education: Principles, Policy & Practice,
10, 279-293. doi: 10.1080/0969594032000148145
Schult, C. A., & McIntosh, J. L. (2004). Employing computer-administered exams in
general psychology: Student anxiety and expectations. Teaching of Psychology, 31,
209-211. doi: 10.1207/s15328023top3103_7
Seipp, B. (1991). Anxiety and academic performance: A meta-analysis of findings. Anxiety
Research, 4, 27-41. doi: 10.1080/08917779108248762
Shermis, M. D., & Lombard, D. (1998). Effects of computer-based test administrations
on test anxiety and performance. Computers in Human Behavior, 14, 111-123. doi:
10.1016/S0747-5632(97)00035-6
Shermis, M. D., Mzumara, H. R., & Bublitz, S. T. (2001). On test and computer anxiety:
Test performance under CAT and SAT conditions. Journal of Educational Computing
Research, 24, 57-75. doi: 10.2190/4809-38LD-EEUF-6GG7
Street, J. E. (2008). Examining the validity of testing in an online learning environment
[Abstract]. Dissertation Abstracts International Section A: Humanities and Social
Sciences, 69(5-A), 1750. Retrieved from http://search.ebscohost.com/login.aspx?
direct=true&db=psyh&AN=2008-99210-028&site=ehost-live
Wise, S. L., Barnes, L. B., Harvey, A. L., & Plake, B. S. (1989). Effects of computer anxiety
and computer experience on the computer-based achievement test performance of col-
lege students. Applied Measurement in Education, 2, 235. Retrieved from http://search.
ebscohost.com/login.aspx?direct=true&db=alpha&AN=7365024&site=ehost-live
Wise, S. L., Roos, L. L., Plake, B. S., & Nebelsick-Gullett, L. J. (1994). The relationship
between examinee anxiety and preference for self-adapted testing. Applied Measure-
ment in Education, 7, 81-91. doi: 10.1207/s15324818ame0701_6
Zeidner, M. (1998). Test anxiety: The state of the art. New York: Plenum Press.