
Computers in Human Behavior 17 (2001) 225–248

www.elsevier.com/locate/comphumbeh

Exploring the use of multimedia examination formats in undergraduate teaching: results from the field testing
Min Liu*, Erini Papathanasiou, Yung-Wei Hao
Department of Curriculum and Instruction, University of Texas — Austin, Austin, TX 78712, USA

Abstract
Students’ and faculty’s perspectives toward using the multimedia examination format will
influence whether digital testing can be implemented successfully. This study examined
the students’ attitudes and anxiety toward taking the multimedia examinations, factors (i.e.
gender, major, student classification, and computer experience) that may influence their atti-
tudes, and faculty’s attitudes toward creating multimedia tests. Students in two undergraduate
courses [a regular course (n=100), and an online course (n=97)] participated in this research.
The results showed that there was strong support from the students and faculty for using
multimedia exams as a primary assessment form. They embraced the interactive technology,
and felt the incorporation of rich media in assessment could provide additional support for
their learning and teaching. While students from both the regular and online courses found it
acceptable to be evaluated using the multimedia format, multimedia testing was accepted
more when it was a closer reflection of the instruction. The findings showed that students
with more computer experience had lower anxiety and better attitudes toward using the
multimedia format. Students entering colleges in recent years were more technologically
literate. No significant differences in overall attitude and anxiety scores were found between
male and female students, or between liberal arts and natural sciences majors. © 2001 Elsevier Science Ltd. All rights reserved.

Keywords: Technology-based assessment; Multimedia examination; Computerized testing; Digital testing in undergraduate education

* Corresponding author. Tel.: +1-512-471-5211; fax: +1-512-471-8460.


E-mail address: mliu@mail.utexas.edu (M. Liu).

0747-5632/01/$ - see front matter © 2001 Elsevier Science Ltd. All rights reserved.
PII: S0747-5632(01)00008-5

1. Theoretical framework

Technology-based assessment generally refers to the use of computers and other electronic media (e.g. video) to assess and evaluate the progress of individuals in edu-
cational settings (Greenwood & Rieth, 1994). The advantages of computerized testing
have been well documented in the literature. Supporters maintain that testing on computers can minimize the errors involved in administering the test by standardizing the testing environment, thereby increasing the reliability of the test (Eaves, 1984–1985, 1985; Wise & Plake, 1990; Zandvliet & Farragher, 1997). It provides immediate scoring, which expedites decisions that need to be made based upon the test results. Computerized testing can provide examinees with feedback, and information can be collected regarding test-taking behaviors. The examiner, for instance, can collect data such as how many items have been reviewed by the examinee, or how much time is spent on each item (Wise & Plake, 1990). Computerized assessment also facilitates the evaluation of test items, including analyses of item bias, reliability, and difficulty (Eaves, 1984–1985, 1985). In contrast to paper-based testing, computer-based testing gives the examiner the capability to enrich the display of information by integrating multimedia elements. Finally, computerized tests can be individualized and proceed in different ways based upon the examinee’s responses.
Research has been done on the effectiveness of computerized testing. Zandvliet
and Farragher (1997) investigated the equivalence of a computer-administered test
(CAT) with a written format of the test. Analysis of the test scores indicated that
there were no significant differences attributable to the test format. Although the CAT required more time than the written test, the authors attributed this difference to the fact that the students were unfamiliar with the computer format. They also
conducted a survey to identify students’ attitudes towards computerized testing. The
results showed that the students clearly preferred the computer-administered test.
Isham (1997) examined whether an interactive computerized assessment would help
interior design students improve their visualization skills. A paper assessment was
first developed to validate assessment problems for the computerized version. The
computerized assessment was based on the paper assessment and was similar in
content and sequence, but included animations. The results of the study indicated
that the average score of students who took the computerized version of the test was
higher than the score of those who took the paper version. The author suggested that the inclusion of animations in the assessment helped students with their visualization skills. Soo and Ngeow (1998) compared students using a multimedia
self-assessment English proficiency course with students in a teacher-taught profi-
ciency class. Factors such as gender, race, and learning style were analyzed. The
results showed that students who used the self-paced multimedia program achieved
significantly higher TOEFL scores, and completed the program in less time than the students in a
traditional course. Multimedia instruction was effective across all genders, races,
and learning styles.
Although computerized testing has been in existence for more than a decade, its use has been largely restricted to certain types of testing (such as the computerized GRE), to certain subjects that lend themselves more readily to computer use, or to certain instructors with more sophisticated computer knowledge. The use of multimedia in computerized testing has also been limited. Currently, however, there is much interest in integrating multimedia technology into assessment, especially at the college level. This interest is partly sparked by several recent trends. More college faculty members are integrating multimedia technology into their teaching, and innovative instruction assisted by computer technology should be accompanied by innovative uses of assessment. Advances in technology make the creation of a multimedia test possible for the majority of college faculty, even those with limited computer knowledge. With Web technology, more courses are being taught online, and online instruction calls for online assessment. Although paper-and-pencil testing is still the norm today, there is a strong interest in, and a need for, exploring the use of online multimedia assessment so that the evaluation reflects the instruction more adequately, and testing becomes not just an evaluation tool but also a learning tool (Byrnes et al., 1991).
To implement multimedia assessment successfully, at least four factors have to be
considered: (1) students’ perspective toward taking a multimedia exam; (2) faculty’s
perspective toward making a multimedia exam; (3) content of a multimedia exam;
and (4) technical issues. Fig. 1 lists some of the questions one needs to answer for
each perspective. Research in each of these areas is needed.
Three factors could influence students’ view toward computerized testing: their
computer attitude, anxiety, and experience. Leg and Buhr (1992) found that exami-
nees’ attitude towards computerized testing was very positive and only a few stu-
dents with limited computer experience reported some levels of anxiety. The authors
stated that computer anxiety did not appear to affect students’ performance. This
finding is consistent with the results of studies by Reed and Overbaugh (1993) and
Chin, Donn and Conry (1991) who also found that computer anxiety was not a
significant factor in predicting performance. However, other studies indicated that
there was a correlation between computer anxiety and low scores (Perkins, 1995;
Reed & Palumbo, 1987). Perkins’ study indicated that students who owned a com-
puter and therefore had more computer experience demonstrated lower levels of
anxiety and performed better. Shermis and Lombard (1998) investigated the degree
to which computer and test anxiety were predictors of performance across
three computer-administered placement tests in math, English, and reading. Age
and computer anxiety were significant predictors for performance only in reading
assessment, but not for math or English.
The purpose of this study is to examine the acceptability of multimedia examina-
tions in undergraduate education from the perspectives of students and faculty.
There are at least two situations in which a multimedia test is used today: (1) the
instructor incorporates some form(s) of multimedia technology in delivering
instruction in a classroom and students take a multimedia test, and (2) the instructor
delivers instruction online incorporating multimedia and students take a multimedia
test. This second use, when the assessment reflects the instruction more closely, can
happen via the web for remote access or via a local network in a proctor-present
situation. This study investigates multimedia testing in both situations by comparing
how students and faculty in each situation view the use of multimedia examinations.

Fig. 1. Factors of importance.

2. Research questions

The research questions are:

1. What are students’ attitudes toward using the multimedia exam?


2. Do students have computer anxiety in taking the multimedia exam?
3. What factors may influence students’ attitudes toward using the multimedia
exam? The factors under investigation are gender, major, student classification,
and computer experience.

4. What are instructors’ attitudes toward using the multimedia exam?

Students and faculty in these questions refer to those in both situations.

3. Methodology

3.1. Participants

Two groups of undergraduate students from a large southwestern university participated in this research. The demographic information of the two groups was comparable (Table 1).

Table 1
Demographic information

                                     Regular course (n=100)     Online course (n=97)
                                     n       Percentage         n       Percentage

Gender
  Male                               40      40                 39      40
  Female                             60      60                 58      60

Major
  Liberal Arts                       58a     60                 65      67
  Natural Sciences                   8       8                  13      14
  Otherb                             17      18                 12      12
  Undecided                          13      14                 7       7

Classification
  Freshmen                           23      23                 15      15
  Sophomore                          37      37                 25      26
  Junior                             24      24                 21      22
  Senior                             16      16                 36      37

Computer owners
  Yes                                84c     85                 83      86
  No                                 15      15                 14      14

Frequent computer users
  Yes                                86      86                 94      97
  No                                 14      14                 3       3

Having taken a computerized test
  Yes                                82d     88                 89      92
  No                                 11      12                 8       8

a Total n=96 for the Regular course for this item. Missing cases=4.
b Other majors such as fine art, communication, business and education.
c Total n=99 for the Regular course for this item. Missing cases=1.
d Total n=93 for the Regular course for this item. Missing cases=7.

The first group of participants (n=100) were students enrolled in a course in which the instructor delivered the instruction in the classroom (the regular course). This course is typical of what is found in many universities: the instructor delivered lectures using some form of multimedia technology (e.g. PowerPoint) with some graphics, and sometimes limited video and audio were included in the delivery of the instruction. Students had labs in which to ask questions, and then took a multimedia exam created using the multimedia templates described below. The instructor made the effort to redesign the multimedia exam to include not only the questions used on the paper exams from previous semesters, but also new questions that would utilize multimedia. The data were collected during the academic year of 1998–1999.
The second group (n=97) were students enrolled in an online self-paced course in
which a multimedia CD-ROM was the primary instructional material with an
accompanying web site providing additional resources, test dates, reminders and
practice tests. This web site also served as a communication point between the stu-
dents and the instructor. Students studied the materials in their own time, and were
evaluated on their knowledge through online quizzes and exams created with
the multimedia templates (as discussed later). The exams incorporated much of the
multimedia from the CD-ROM and students could take the exams at a time con-
venient to them. Because of the self-paced nature of the course and the digital
instructional materials used, the assessment in this case reflected more closely what
was being taught. The data were collected during the academic year of 1999–2000.

3.2. Multimedia examination templates

The multimedia examination templates used in this research were developed at the University of Texas — Austin for use in the sciences with support from the
National Science Foundation. The goal for creating these templates is to take
advantage of the recent advances in digital technology and utilize multimedia so as
to produce richer and more interactive evaluations than the paper medium allows.
Because the undergraduate teaching in various sciences often shares the use of
similar types of student performance assessments (e.g. laboratory assignments,
quizzes, mid-term and final exams), it is possible to produce a series of multimedia
exam templates that can be customized for use across disciplines. Instructors using
these templates can ‘‘copy, cut, and paste’’ their questions into the formats, add
multimedia materials, and utilize the interactive question formats to make their own
exams. Multimedia exams created using the templates are currently delivered on a
local network in a lab setting. A future goal is to deliver them on the web for remote
access.
The program allows instructors to build their customized exams. Instructors can
create their own exams from scratch or import their questions from an existing
word-processing file. They can edit the questions at any time, and add various media
to the questions. The program supports various file formats for image, audio, and
video. When the instructor clicks a button, the program will display a list of all the
media files available. From there she can select the file she wants to associate with
the question.
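
The paper does not describe the templates at the code level, so the following is only a minimal sketch, in Python, of the authoring workflow just outlined: a question is pasted or imported, and the instructor picks from a list of available media files to associate with it. The class names, supported formats, and file names are hypothetical.

from dataclasses import dataclass, field
from pathlib import Path
from typing import List, Optional

# Media formats a template program of this kind might accept (illustrative only).
SUPPORTED_MEDIA = {".jpg", ".png", ".gif", ".wav", ".mov", ".avi"}

@dataclass
class ExamQuestion:
    """One question pasted or imported from a word-processing file."""
    prompt: str
    choices: List[str] = field(default_factory=list)
    answer: Optional[str] = None
    media: List[Path] = field(default_factory=list)

def available_media(media_dir: Path) -> List[Path]:
    """List the media files an instructor could associate with a question."""
    return sorted(p for p in media_dir.iterdir() if p.suffix.lower() in SUPPORTED_MEDIA)

# Example: an instructor pastes a question and attaches an image to it.
question = ExamQuestion(prompt="Which bone is shown in the image?",
                        choices=["Femur", "Tibia", "Humerus", "Radius"],
                        answer="Femur")
question.media.append(Path("femur_lateral.jpg"))
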

Each exam can have up to 10 sections, and each section can use any of six different types of questions: multiple choice, true/false, filling-in, text matching, point matching, and plotting questions. A screen shot for each question type is given in Figs. 2–7, and a brief scoring sketch for the interactive formats follows the list.

(a) Multiple choice: a student chooses the right answer from a list of choices.
(b) True/False: a student indicates her choice by clicking on the box in front of
the words ‘‘true’’ or ‘‘false’’.
(c) Filling-in: this question requires a student to complete unfinished statements in
the space provided.
(d) Text matching: a student is to match a statement from one column with a
definition in another column.
(e) Point Matching: in this question format, a student must place items on an
image such as a photograph, a timeline, or a map. The possible places are made
apparent with a circle. The student is expected to match the circles with one of the
answer choices. A student first highlights an answer and then clicks the correct
location on the image for that answer choice.
(f) Plotting: this question type also involves the placement of items on an image.
But a student may click anywhere on the image to enter her answer. Items are
presented to the student sequentially and she must plot each item by clicking the
correct location on the plotting graph. After all points have been placed the stu-
dent can record her answers.
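
The point-matching and plotting formats hinge on deciding whether a click lands close enough to a target location. The paper does not state the scoring rule, so the Python sketch below simply assumes a circular tolerance around each target; the coordinates, radius, and function names are hypothetical.

import math
from typing import Dict, Tuple

Point = Tuple[float, float]

def check_point_match(click: Point, target: Point, radius: float = 15.0) -> bool:
    """A click counts as correct if it lands inside the circle around the target."""
    return math.dist(click, target) <= radius

def score_plotting(answers: Dict[str, Point], key: Dict[str, Point],
                   radius: float = 15.0) -> float:
    """Fraction of plotted items that fall within tolerance of their true location."""
    correct = sum(check_point_match(answers[item], key[item], radius)
                  for item in key if item in answers)
    return correct / len(key)

# Example: two landmarks plotted close enough to the (made-up) answer key.
key = {"orbit": (120.0, 80.0), "mandible": (140.0, 210.0)}
answers = {"orbit": (125.0, 84.0), "mandible": (138.0, 205.0)}
print(score_plotting(answers, key))  # prints 1.0
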

Fig. 2. Multiple choice question.



Fig. 3. True/False question.

Fig. 4. Filling-in question.



Fig. 5. Text matching question.

The program allows instructors to set the difficulty levels of the questions, generate random questions and alternative forms of the same exam, and thus ensure the security and fairness of the exams for all students. Once a test is over, scores are calculated and displayed immediately for the students. Though students cannot retake the same exam after viewing the answers, they can review the content and exam formats to get ready for the next exam.
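
How random questions and alternative forms might be drawn from a question bank, with immediate scoring afterwards, can be sketched as follows. This is an illustration under assumed data structures, not the templates’ actual logic.

import random
from typing import Dict, List

def build_alternate_form(bank: Dict[str, List[dict]], per_section: int,
                         seed: int) -> List[dict]:
    """Draw a random subset of questions from each section of a question bank,
    so different students can receive different but comparable forms."""
    rng = random.Random(seed)
    form = []
    for section, questions in bank.items():
        form.extend(rng.sample(questions, min(per_section, len(questions))))
    return form

def score_form(form: List[dict], responses: Dict[str, str]) -> float:
    """Immediate scoring: compare each response with the stored answer key."""
    correct = sum(1 for q in form if responses.get(q["id"]) == q["answer"])
    return 100.0 * correct / len(form)

# Example: the form a student receives depends on the seed (e.g. a student ID).
bank = {"anatomy": [{"id": "a1", "answer": "Femur"}, {"id": "a2", "answer": "True"},
                    {"id": "a3", "answer": "Tibia"}]}
form_for_student = build_alternate_form(bank, per_section=2, seed=12345)
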
The multimedia exam templates described here share characteristics with a traditional paper-and-pencil exam in that they can contain the same content and sequence as a paper exam, and use the same common testing formats such as multiple choice, true/false, and matching items. However, the new exam format offers some innovative features that would be very hard to replicate and implement in
the paper exams. First, the presentation of questions is greatly enhanced through the
use of images, video, animation, and sound. The use of multimedia is particularly
attractive to visual learners who need assistance to understand abstract concepts.
The multimedia elements, directly related to the content of the questions, can aid the
learner in understanding the questions, and enhance learners’ ability to recall
the information necessary to answer the questions. The multiple representations of
each question, through verbal and non-verbal stimuli, can accommodate learners
with different learning styles. This is strongly supported by the dual code theory,
which suggests that the retrieval of information facilitated through multiple stimuli
can cue the right information (Ormrod, 1995).
The second important feature of this new exam format is that it allows a more
dynamic and interactive type of questioning and answering facilitated through the
technology. The questions can present real world situations more easily, where
the student is asked to make observations, take measurements, collect data, form hypotheses, and solve a problem. With this type of evaluation, students are assessed not only on the final answer, but also on the steps they follow to reach it. The goal of these questions is, therefore, not only for the students to find a solution, but also to learn from the problem-solving process.

Fig. 6. Point matching question.

Fig. 7. Plotting question.

3.3. Data sources

3.3.1. Questionnaire
A questionnaire was devised to examine how the students feel about taking the
multimedia exams. The questionnaire has three parts: demographic information,
Likert scale items, and open-ended questions (Appendix). There are 26 items on a 5-
point Likert-scale with 1 being ‘‘I strongly disagree’’ and 5 being ‘‘I strongly agree’’.
These 26 items address two areas related to taking the multimedia exams: Anxiety
towards taking multimedia exams (five items), and attitudes towards taking multi-
media exams (21 items). The items on anxiety towards taking the multimedia exams
were rephrased from a modified version of Spielberger’s self-evaluation anxiety
scale (Reed & Palumbo, 1987). For example, the original statement was ‘‘I feel
upset when I am given a computer assignment’’ (Reed & Palumbo, 1987). This
statement was modified to be ‘‘I feel upset when I take a multimedia exam’’ for
this study. Other examples include ‘‘I feel at ease when I take a multimedia exam,’’
and ‘‘I feel nervous when I think about multimedia exams.’’ The attitude items were
based upon Richards, Johnson, and Johnson’s computer attitude scale (1986), and
the multimedia and hypermedia literature. Examples include ‘‘I had difficulties
completing the exam because it was a multimedia exam,’’ ‘‘I see no reason in
replacing paper-pencil exams with multimedia exams,’’ ‘‘Multimedia exam is a
more effective way of assessment than a paper-pencil test,’’ and ‘‘Multimedia ele-
ments (sound, images, video) helped me to answer the questions.’’ More items were
included on the attitude as it was of primary interest to this study. The ques-
tionnaire was examined by three content experts for validity and pilot-tested by a
group of five students before being used. When used again for the online course in
the subsequent year, some items were rephrased for clarity and two additional items
on using multimedia to enhance learning were added. The open-ended questions in
the questionnaire included such questions as ‘‘How do you like or dislike taking a
multimedia exam’’, ‘‘Which part of the multimedia did you like the most? Why?’’,
and ‘‘Is taking this multimedia exam more challenging to you? If so, in what
ways?’’
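
The paper reports only the subscale structure (five anxiety items and 21 attitude items on a 5-point scale), not how item responses were aggregated. The Python sketch below shows one plausible scoring scheme; the decision to reverse-score positively worded items, and the item numbers used in the example, are assumptions for illustration only, not details taken from the study.

from typing import Dict, Iterable

def subscale_mean(responses: Dict[int, int], items: Iterable[int],
                  reverse: Iterable[int] = (), scale_max: int = 5) -> float:
    """Average a set of 5-point Likert items, reverse-scoring where indicated."""
    reverse = set(reverse)
    values = [scale_max + 1 - responses[i] if i in reverse else responses[i]
              for i in items]
    return sum(values) / len(values)

# Hypothetical responses to five anxiety-related items; positively worded items
# (e.g. "I feel calm ...") are reverse-scored so that higher means more anxiety.
responses = {1: 4, 6: 4, 8: 2, 10: 2, 14: 5}
anxiety_score = subscale_mean(responses, items=[1, 6, 8, 10, 14], reverse=[1, 6, 14])
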

3.3.2. Interviews and observations


Systematic interviews were conducted to seek more in-depth information on why
the students responded in a certain way. Interviews were conducted both with the
instructors and the students. Three interviews were conducted with three different
groups of students for each population. The interviews were semi-structured. Three
questions were used to start the interviews (e.g. ‘‘Do you think the multimedia ele-
ments helped you while answering the questions?’’). The researchers made an effort
to keep the interview process open, allowing students to express their opinions
freely. Follow-up questions were used if further explanations were needed. Some
interview questions overlapped the open-ended questions on the questionnaire. The
reason for doing this was that not all students wrote sufficiently long and meaningful responses on the questionnaire. An additional two interview questions were added for
the online course (e.g. ‘‘What difficulty do you have when you use the multimedia
exam program?’’). An extensive interview was conducted with the instructors using
10 questions. Example questions were ‘‘Why did you decide to use the multimedia
exam for your course?,’’ ‘‘Is the multimedia exam for this class the same with the
paper one you used before? In what ways are they the same or different,’’ and ‘‘How
easy or difficult was it to make the multimedia exam as compared to the paper
exam?’’ In addition, observations of the review sessions and the testing sessions for
the self-paced online course were made to see how students used the multimedia
exams in the actual testing setting.

3.4. Analysis

To answer the research questions, two-way ANOVAs were run with the grouping
(the Regular course vs. the Online course) as one independent variable and gender,
major, or student classification as the other independent variable. The dependent
variables were anxiety and attitude toward taking the multimedia exam. Simple
regression analyses were run to see if computer experience could in any way predict
students’ anxiety and attitude toward taking the multimedia exam.
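
No analysis code is given in the paper; the sketch below simply shows how comparable two-way ANOVAs and a simple regression could be run in Python with statsmodels, assuming a hypothetical data file with one row per student and the column names shown.

import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical file: one row per student, with group membership and scale scores.
df = pd.read_csv("multimedia_exam_survey.csv")

# Two-way ANOVA: grouping (Regular vs. Online) by gender on the anxiety score;
# the same model can be refit with major or classification in place of gender.
anova_model = ols("anxiety ~ C(group) * C(gender)", data=df).fit()
print(sm.stats.anova_lm(anova_model, typ=2))

# Simple regression: does computer experience predict anxiety (or attitude)?
regression = ols("anxiety ~ computer_experience", data=df).fit()
print(regression.params, regression.pvalues)
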
Following the guidelines by Miles and Huberman (1994), the responses to the
open-ended questions and the interview data were analyzed using a two-level
scheme. At the first level, codes (labels indicating the meaning of the descriptive
information) were generated directly from the written responses or interviews
through multiple passes of the data examination. At the second level, meaningful
chunks of the data were regrouped according to the research questions. Redundant
information was taken out. The qualitative data were used to provide additional in-
depth information for the research questions. The observation data were used to
corroborate the findings from the open-ended questions of the questionnaire
and the interview data.
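
As a toy illustration of the two-level scheme (descriptive codes first, then regrouping by research question), consider the sketch below; the codes and category labels are invented for the example and are not the study’s actual coding frame.

from collections import defaultdict

# Level 1: each response chunk is tagged with a descriptive code (made-up examples).
coded_chunks = [
    {"code": "immediate_feedback", "text": "You get your score right away."},
    {"code": "visual_recall", "text": "Pictures give you clues when remembering."},
    {"code": "screen_scanning", "text": "You cannot scan all questions at once."},
]

# Level 2: codes are regrouped under the research question they speak to.
code_to_question = {
    "immediate_feedback": "RQ1 attitudes",
    "visual_recall": "RQ1 attitudes",
    "screen_scanning": "RQ2 anxiety and concerns",
}

by_question = defaultdict(list)
for chunk in coded_chunks:
    by_question[code_to_question[chunk["code"]]].append(chunk["text"])
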

4. Results

4.1. Results from the quantitative analyses

Students’ anxiety toward using the multimedia exams was measured on a Likert
scale of 1–5; the lower the number, the lower the anxiety. The overall anxiety score
was 2.6. The results of the two-way ANOVAs showed that there were no significant
interactions between the grouping (Regular class vs. Online class) and gender, major
or classification variables (Table 2). But the main effect of grouping was significant
at P < 0.01: M(Regular) = 3.00, M(Online) = 2.20 (Table 2). The online class had sig-
nificantly lower anxiety toward taking the multimedia exams than the regular class.
The mean score on attitude for the entire group was 3.23; the higher the number,
the better the attitude. The two-way ANOVAs showed that there was a significant
interaction between the grouping and gender (Table 2). The male students in the
online class had significantly better attitudes than the female students in the online
class while the attitudes for both male and female students were about the same in
the regular class. There were no other significant interactions. But the main effect of
grouping was significant at P < 0.01: M(Regular) = 2.93, M(Online) = 3.53 (Table 2).
The online class had significantly better attitudes toward taking the multimedia
exams than the regular class.
The results of the simple regression showed a low but significant negative relationship between the students’ computer experience and their anxiety toward taking the multimedia exam: r = -0.16, t(195) = -2.22, P < 0.05, beta = -0.04, beta weight = -0.16. That is, the more computer experience the students had, the lower their anxiety was. There was also a low but significant positive relationship between the students’ computer experience and their attitudes toward taking the multimedia exam: r = 0.18, t(193) = 2.52, P < 0.05, beta = 0.03, beta weight = 0.18. That is, the more computer experience the students had, the better their attitudes were.
Table 2
Means and standard deviations (in parentheses) of Anxiety and Attitude scores

                      n      Anxiety        Attitude

Regular class         100    3.00 (0.57)    2.93 (0.33)
  Male                40     3.04 (0.67)    2.92a (0.30)
  Female              60     2.98 (0.49)    2.94a (0.35)
  Liberal Arts        58     3.02 (0.53)    2.90 (0.33)
  Natural Sciences    8      2.80 (0.74)    2.83 (0.29)
  Other Majors        17     3.00 (0.68)    3.01 (0.41)
  Undecided           13     2.88 (0.37)    3.08 (0.23)
  Freshman            23     2.86 (0.59)    3.11 (0.21)
  Sophomore           37     3.09 (0.55)    2.97 (0.29)
  Junior              24     3.06 (0.51)    2.74 (0.39)
  Senior              16     2.91 (0.66)    2.84 (0.33)

Online class          97     2.20b (0.60)   3.53b (0.62)
  Male                39     2.13 (0.69)    3.75c (0.50)
  Female              58     2.24 (0.53)    3.39c (0.66)
  Liberal Arts        65     2.18 (0.61)    3.51 (0.68)
  Natural Sciences    13     2.39 (0.58)    3.60 (0.48)
  Other Majors        12     2.02 (0.63)    3.73 (0.48)
  Undecided           7      2.29 (0.46)    3.27 (0.50)
  Freshman            15     1.93 (0.42)    3.62 (0.66)
  Sophomore           25     2.10 (0.69)    3.70 (0.56)
  Junior              21     2.38 (0.46)    3.32 (0.70)
  Senior              36     2.27 (0.63)    3.51 (0.60)

a Missing one case.
b Significant main effect; significantly different from the Regular class, P<0.01.
c Significant interaction effect (Grouping × Gender), P<0.01.

The demographic data indicated that the regular class and the online class had
similar demographics. Both groups had 40% male students and 60% female
students. Both had more liberal arts majors, and similar breakdowns of freshmen,
sophomores, juniors, and seniors. The majority of each group owned computers.
However, there were more frequent computer users in the online course than in the
regular course (Table 3). There were noticeably more students familiar with using
email, surfing on the web and using CD-ROMs in the online course than in the reg-
ular course (Table 3). That is, there were fewer computer novices in the online class.

4.2. Results from the qualitative analyses

Students particularly liked the visual and interactive aspects of the multimedia
exams. The visuals helped students recall the information so they could spend less
time memorizing the details and more time on understanding the concepts. The
interactive questioning makes best use of the multimedia technology and places
more emphasis on problem solving. To answer those interactive questions accu-
rately, students must understand not only the concepts, but also how to apply them
in solving a problem. Typical student comments from the interviews and the written responses included:

I like the interactive part, where there was this whole case that you had to work
on and do different stuff, collect and analyze data. . . And each part builds on
the previous one. And the good thing is let’s say you missed one question and
then you move to the next part and you have to use the results of that ques-
tions, it will give you the correct one to move from there. I think that’s good,
because you actually learn something from the test. You do something wrong,
but then you realize why it’s wrong.

[I liked] The interactive part as opposed to the Multiple choice part. I liked the
fact that I was able to take measurements, rotate things and see them from dif-
ferent angles. I also liked plotting a graph, because you were able to change your
answers, see the different possibilities and estimate which one is the correct.

[Do you like the multimedia exam?] Yes, because like in the lab and in the class
we use these materials and we have these 3D models. And then the professor
uses the same things in the exam. And it’s like another way to help you
remember.

Pictures and animations can give you clues when you are trying to remember
something.

Multimedia exam appeals to me visually and therefore, seems easier, although the material may not be. It’s more accessible.

Table 3
Computer use and computer experience

Computer use                                             Regular course (n=100)     Online course (n=97)
                                                         n      Percentage          n      Percentage

1. Word Processing
   Yes                                                   90     90                  96     99
   No                                                    10     10                  1      1
2. E-mail
   Yes                                                   83     83a                 96     99
   No                                                    17     17                  1      1
3. Surfing on the Internet
   Yes                                                   75     75                  92     95
   No                                                    25     25                  5      5
4. Using CD software (e.g. computer games)
   Yes                                                   39     39                  69     71
   No                                                    61     61                  28     29
5. Web designing using editors (e.g. FrontPage, Claris HomePage)
   Yes                                                   14     14                  18     20
   No                                                    86     86                  79     80
6. Web designing using HTML
   Yes                                                   14     14                  21     22
   No                                                    86     86                  76     78
7. Using authoring programs (e.g. Authorware, Director)
   Yes                                                   3      3                   5      5
   No                                                    97     97                  92     95
8. Using programming languages (e.g. Java, C++)
   Yes                                                   9      9                   13     13
   No                                                    91     91                  84     87
Overall Computer Skillsa
   Novice (computer uses 1–3, see above)                 17     17                  5b     5
   Intermediate (computer uses 4–6)                      80     80                  78     82
   Expert (computer uses 7–8)                            3      3                   12     13

a Any differences greater than 12 are highlighted in the table.
b Total n=95 for the Online course for this item. Missing cases=2.

Being able to manipulate things, take measurements and things like that where
you wouldn’t be able to do on a piece of paper.

Yes, because I spent less [time] remembering how something exactly looks like
and I was focus [ing] more what its feature[s] mean so that when it comes to
application I can use that instead of spending ten minutes trying to remember
what it looks like.

When asked to discuss the delivery media of the exams, more students preferred
the multimedia exam format to the paper format. For the regular course, 44% said
they preferred the multimedia exam format, 30% said they preferred the paper for-
mat and 26% said they did not have a particular preference. For the online course,
65% preferred the multimedia exam format, 24% preferred the paper format and
11% had no particular preference. There are several reasons for liking the multi-
media format. The most frequently cited reasons are (1) immediate feedback, and (2)
the use of visuals such as graphics, animation, video and audio (Table 4). When
students finished the exam, the computer calculated the scores and let the students
know their grades right away. While doing practice tests, students would get
the results immediately so they would know which part they needed to review. The
multimedia format saved students from waiting for results, so they could spend that time studying.
Many students were conscious of protecting the environment, citing that there would be no waste of paper with the digital media. Some stu-
dents found the multimedia format more interesting and fun to use than the paper
format. Other students liked the multimedia format because they could take the
exam at a time most convenient to them (Table 4).
However, some students were apprehensive about using the multimedia format.
One major concern was that valuable time was lost in scanning the test to get a feel
for the test items when questions were presented one at a time on the screen.
According to interface design principles, it is important not to present a lot of
information on the limited screen space so as not to overload the learners. One
question per screen is appropriate especially when it is accompanied by graphics,
animation, or video. Yet, if there were 30 questions on the exam, a learner would
need to view 30 screens to get an idea of the question types. It is much easier to scan
the questions on paper. One student said, ‘‘With the multimedia exam you can see
one question at a time, whereas with a paper exam you can scan over all questions
and get a feel for the test.’’ Another concern was the technical difficulty some stu-
dents experienced, as computers sometimes crashed or froze, or the software sometimes did not work properly. In addition, a number of students mentioned that they
were not comfortable with the digital media. One said, ‘‘I don’t like computers. I’m
not into technology at all. I still use a typewriter for my papers.’’ A few students
mentioned that reading from a computer screen was tiring.
Students had different opinions about which testing environment was more
stressful. Some preferred the multimedia format because they liked the flexible
scheduling and the use of visual media, which other students found distracting.
Many students stated that taking the multimedia exam was not more challenging
than taking the paper exam. One student stated: ‘‘It’s not more challenging because
you don’t have to be a computer genus to be able to do it. If anything, it’s more
convenient.’’
Interviews conducted with the professors showed that they were enthusiastic
about using the multimedia exams in their teaching. According to the professors,
the multimedia format makes it possible to include more interactive, sophisticated, and dynamic questioning.

Table 4
Reasons for liking the multimedia exam format

Reasons listed from the most frequently to least frequently cited with sample statements

Immediate feedback
‘‘Speed. You get your score right away. No waiting to see how you did.’’
‘‘Instant feedback, because I knew right away what I did wrong.’’
‘‘Finding out grades as soon as done.’’
‘‘Multimedia. Because it was faster and much less of a hassle. ’’
‘‘The multimedia, because it goes faster than the traditional exams. Also the results are given to you right
after you take the exam.’’

Using visuals (i.e. graphics, audio, video, animation)


‘‘Visual stimulation helps recall information.’’
‘‘Graphics, pictures. They help trigger facts/concepts that were in the text.’’
‘‘Graphics are helpful. They help making connections to answers.’’

No waste of paper
‘‘Multimedia. It doesn’t waste paper, quick results.’’
‘‘I don’t like pencil-paper tests, which are wasteful.’’

Interactivity
‘‘[Multimedia format is a] better application of the material I learned.’’
‘‘Interactive part, because it’s more like problem solving, gives more info to answer the questions.’’
‘‘Interactive part is more engaging. No need for rote memorization.’’

Interesting, easy, and fun to use


‘‘I prefer multimedia exam. It seems to be easier and more interesting to use.’’
‘‘It shows the questions in a clear way. It’s easier to understand what the question is asking.’’
‘‘I think multimedia is helpful, compared to the traditional ones. It’s fun, it’s authentic.’’
‘‘It’s easier to answer questions. It’s quick. I don’t need to bring any stationery here to take the exam, very
convenient.’’

Relaxing and less anxiety


‘‘I like the multimedia environment, it’s less stressful and that helps resolve test anxiety.’’
‘‘I like multimedia exams due to the general lack of difficulty moving from one screen to another. I only
see one question at a time. The lack of feeling overwhelmed. It makes testing more relaxing.’’
‘‘Multimedia tests. Hmm, it’s fun. Color, graphics, I surf around, feel less pressured.’’
‘‘I like more relaxing atmosphere to take exams. Multimedia exams are casual for me.’’
‘‘I prefer casual. It’s been hard to take exams. Multimedia exams help relieve the hard feelings.’’
‘‘I feel more comfortable with multimedia exams. I don’t feel so pressured.’’

Flexible schedule of the test times


‘‘and you could take them on your own schedule. . .’’
‘‘Multimedia tests. It’s self-paced. I can come to take the test at any time I want to.’’
‘‘Yes. It’s self-paced. I can adjust my schedule weekly, very flexible.’’

The multimedia exam templates include not only questions typically found in a paper exam, but also other types of questions and activities.
These question formats allow more realistic problem presentation and can engage
students in observing, analyzing and synthesizing information better than the paper
medium allows. In these types of questions students are graded not only on the final
result, but also on the process and procedure they follow. Consequently, learning
becomes an integral part of the assessment, as one professor said ‘‘. . .the exam itself
is more than just a testing process. It is a learning process as well.’’ Another com-
mented that ‘‘. . . these questions can become very real and encourage critical think-
ing.’’ ‘‘With the multimedia exam we will be able to prepare the students towards
those higher level skills and assess them on those, too’’.
The multiple representation of information through multimedia should help stu-
dents recall information necessary to answer the questions. One professor com-
mented, ‘‘Many students are visually oriented and having the multimedia content
allows me to present materials in a completely different manner. The exams also more
closely mirror the content that is taught.’’ The professor who taught the regular course recognized that, due to time constraints, insufficient video, audio, and animation were built into the exam to take full advantage of the templates. He hoped to
add more dynamic and interactive features the next time he used the templates.
When asked to discuss the content of the multimedia exams, the professors stated
that rather than simply converting from the paper exams, they redesigned the exams
to take advantage of the capability offered by multimedia technology and the tem-
plates. They emphasized that the process of creating the multimedia exam required a
shift in their thinking and gave them an opportunity to examine how to use various
media to support learning. As a result, the professors indicated that the initial
investment in creating a multimedia exam was greater than for a paper exam.
One stated, ‘‘It is a lot more than simply converting a paper exam to a computer
exam, because the dynamics of the multimedia exam and the materials you are going
to use are very different.’’ But the professors believe that though these exams gen-
erally take longer to produce, ‘‘by using random question banks, they will have a
longer shelf-life and are easier to revise.’’

5. Discussion

5.1. Students’ perspective toward taking multimedia exams

An important factor that could impact the successful implementation of the multimedia exam format is how the students feel about using it in their learning. We looked at two different groups: students enrolled in a course in which instructor-led teaching was accompanied by the multimedia assessment, and students in an online course in which the delivery medium for both teaching and testing was digital. The students were in various majors, and ranged from freshmen to seniors. The
findings on attitudes showed that the overall attitude toward using the multimedia
format was very positive. Students cited the immediate feedback, the use of the
visual media and the interactive features as the main reasons for liking the multi-
media exams. They felt the use of rich media helped them understand concepts and
develop problem-solving skills more effectively. Students from the online course felt
more positively than the students from the regular course. This is possibly due to the
fact that the multimedia assessment reflected more closely the instruction delivered
for the online course. Another explanation could be that taking online instruction
helped alleviate some of the initial concerns with the technology and offered students
more practice using the technology. By studying the instructional materials in the
digital format and taking practice quizzes similar to the final tests, students in
the online course were more prepared. Students less enthusiastic about the multi-
media format were those who were less comfortable with computers in general. As students become more used to the media through practice, their discomfort with the technology is likely to diminish.
The results have indicated that the more computer experience one has, the better
attitude and lower anxiety one will have. This finding supports the literature that
acquiring more computer experience is a key to lower computer anxiety (Leg &
Buhr, 1992; Perkins, 1995; Reed & Overbaugh, 1993). The encouraging trend shown
in this study is that college students have become more technologically literate in
recent years. Ninety percent of the students were familiar with word-processing in 1998–1999, while in 1999–2000 ninety-nine percent of the students used word-processing software. The use of email, Web, and CD-ROM technologies increased from 83 to 99%, from 75 to 95%, and from 39 to 71%, respectively, from one year to the next (Table 3). As a result, the percentage of computer novices dropped from 17 to only 5% in a year. Students have also become better versed in more advanced uses of computers. This trend should continue as technol-
ogy becomes more integrated into the K-12 and college education.
The population of this study consisted of more female students and more liberal
arts majors. It is encouraging to find that there were no significant differences in the
overall attitude and anxiety scores between male and female students, or between
liberal arts and natural sciences majors. The gender gap, found in computing litera-
ture in the past, is beginning to narrow as technology becomes an integral part
of our education and everyday living.
The concerns about using multimedia in testing were mostly related to the limita-
tions of the technology in its current state. Students experienced technical problems
such as computer crashes. Students lost precious time when they had to restart or
switch computers. As the hardware and software continue to improve, computers
will become more powerful and reliable, and reading from the computer screen will
become easier.

5.2. Faculty’s perspective toward making multimedia exams

The power of multimedia is to enhance the presentation and representation of information through video, audio, animation, and graphics, and to support dynamic and interactive questioning, neither of which the paper medium can easily
accomplish. Though the instructors recognized that the templates could provide the
same content and sequence as the paper exams, they chose to redesign the exams to
take advantage of the multimedia and interactive features made possible by the new
formats. The multimedia exams were enhanced versions of the paper exams
according to the instructors. In addition to including what was on the paper exams,
the questions were enriched both in what was presented and in how it was presented to the students.
Such a multimedia examination offers students a new experience and reflects more
closely what is being taught. In the online course, multimedia in the form of a CD-
ROM and a web site was the primary teaching medium. It would be quite inad-
equate to assess what the students have learned through a paper exam. The pro-
fessor who taught the online course said, ‘‘I found that I was using an ever
increasing amount of media in my course lectures and labs that I could not incor-
porate into traditional paper exams. So, I was not able to test the material that I
taught. The multimedia exams solved this problem for me.’’
Though the multimedia exam templates are very easy to use and instruction is available to guide the instructors, creating a multimedia exam is not a simple conversion process, as the professors in the study pointed out. Collecting and creating the digital media (before a multimedia exam can be created) requires a significant amount of time and skill. Redesigning the exams requires a paradigm shift in one’s thinking. Instructors who intend to use the multimedia technology in their teaching and testing have to be willing to examine the way they teach and the way their students learn. They have to be willing to be adventurous in their teaching and experiment with the technology for the purpose of enhancing their students’ learning. This means they cannot use the same ‘‘old’’ exams and need to keep up with advances in the technology. It is also recognized that the multimedia templates developed in this research can be improved. As more professors begin to use the templates, they will find innovative ways to create the exams and utilize the potential of the technology.

6. Conclusion

So how acceptable is it to use the multimedia exam as a primary assessment form in undergraduate education? The results of this field testing suggest that there is
strong support from the students and the faculty. They recognize and embrace the
interactive nature of the technology, and feel the incorporation of rich media in
assessment can provide additional support for their learning and teaching. While
students from both the regular and online courses found it acceptable to be evalu-
ated using the multimedia format, multimedia testing is accepted more when it is a
closer reflection of the instruction. The findings have shown that students with more
computer experience have lower anxiety and better attitudes toward using the
multimedia format. Encouragingly, more college students are becoming computer
literate and the gender gap in computer use is being narrowed.
The possibilities made available through the Internet and multimedia force us to reexamine the ways we teach, learn, and evaluate. With more and more courses being offered online, computerized testing is a logical means of evaluation. If technology continues to be a part of the delivery medium in teaching, it is natural to deliver assessment in the digital format as well. Examining how to make testing a part of the learning process is an important challenge. After all, the goal of evaluation is not merely to test students, but to help them learn. Understanding students’ and faculty’s perspectives can help implement digital testing successfully.

Acknowledgements

The authors wish to thank John Kappelman for the research opportunity, and Claud Bramblett and Rob Scott for their cooperation in some of the data collection.

Appendix A. Questionnaire on the use of multimedia examinations

Section I. General Information


A. Demographic Characteristics
Major: ___________________
Male ______ Female _______
Identify your student classification.
___ Freshman
___ Sophomore
___ Junior
___ Senior
___ Graduate program

B. Computer Skills and Experience:


1. Do you own a computer? Y __ / N __
2. How often do you use a computer?
a. Never
b. Rarely (Few times a month)
c. Often (Weekly)
d. Very Often (Daily)
3. Which of the following applications do you use computers for:
a. Word Processing____
b. Using E-mail____
c. Surfing on the Internet____
d. Using CD software (e.g. computer games)____
e. Web designing using editors (e.g. FrontPage, Claris HomePage)____
f. Web designing using HTML____
g. Using authoring program (e.g. Authorware, Director)____
h. Using programming languages (e.g. Java, C++)____
4. Rank your overall competence using computers:
Novice Intermediate Expert
1 2 3 4 5 6 7 8
Note: 1=Can use word-processing only.
4=Competent in the first four uses above.
8=Competent in all of the above uses.

5. Have you ever taken a computerized test? Y __ / N __



Section II. Attitudes towards multimedia examinations


Circle the number that accurately describes how you feel for each of the following
statements
(1 = strongly disagree, 5 = strongly agree)

1. I feel calm when I think about the multimedia exam. 1 2 3 4 5
2. I had difficulties completing the exam because it was a multimedia exam. 1 2 3 4 5
3. Multimedia elements (sound, images, video) give me clues, which help me answer the questions. 1 2 3 4 5
4. It took me more time to complete the exam because it was in the multimedia format. 1 2 3 4 5
5. Multimedia elements help, when I am trying to remember things. 1 2 3 4 5
6. I feel at ease when I take a multimedia exam. 1 2 3 4 5
7. Concepts in the questions were clearer, because multimedia elements were used. 1 2 3 4 5
8. I feel upset when I take a multimedia exam. 1 2 3 4 5
9. It was easier to complete the exam because it was in the multimedia format. 1 2 3 4 5
10. I feel nervous when I think about multimedia exams. 1 2 3 4 5
11. Multimedia elements make the multimedia exam more appealing than the paper-exam. 1 2 3 4 5
12. Taking a multimedia exam requires more computer skills than I have. 1 2 3 4 5
13. Multimedia elements (sound, images, video) help the visual learners. 1 2 3 4 5
14. I am relaxed when I type answers into a multimedia exam. 1 2 3 4 5
15. Multimedia elements will not enhance my performance. 1 2 3 4 5
16. I prefer taking a multimedia exam than taking a paper-pencil exam. 1 2 3 4 5
17. Multimedia elements did not help in answering the questions. 1 2 3 4 5
18. I see no reason in replacing paper-pencil exams with multimedia exams. 1 2 3 4 5
19. Multimedia elements make the exam more interesting. 1 2 3 4 5
20. I feel confused when I try to think answers I need to type into a multimedia exam. 1 2 3 4 5
21. The combination of multimedia elements helped me to recall information necessary for answering the questions. 1 2 3 4 5
22. Multimedia exam is a more effective way of assessment than a paper-pencil test. 1 2 3 4 5
23. The combination of the multimedia elements facilitates students with different learning styles. 1 2 3 4 5
24. Multimedia exams can measure my performance better than paper-pencil exams. 1 2 3 4 5
25. I am motivated to learn the subject because of the instant feedback from multimedia exams. 1 2 3 4 5
26. I’m more likely to take a course if it is evaluated in the multimedia format. 1 2 3 4 5

Section III. Answer the following questions


1. How do you like or dislike taking a multimedia exam?
2. Which part of this multimedia exam do you like the most? Why?
3. Is taking this multimedia exam more challenging to you? If so, in what ways?
4. Which of the multimedia elements would you like to have been used more in
the test? Why?
5. Have you taken the online sample exam before you take the real multimedia
exam? How many times did you practice it? Is it helpful?

References

Byrnes, M. E., Forehand, G. A., Rice, M. W., Garrison, D. R., Griffin, E., McFadden, M., & Stepp-
Bolling, E. R. (1991). Putting assessment to work: computer-based assessment for development educa-
tion. Journal of Developmental Education, 14(3), 2–4, 6,8.
Chin, C. H. L., Donn, J. S., & Conry, R. F. (1991). Effects of computer-based tests on the achievement,
anxiety, and attitudes of grade 10 science students. Educational and Psychological Measurement, 51,
735–745.
Eaves, R. C. (1984–1985). Educational assessment in the United States. Diagnostique, 10(1–4), 67–78.
Eaves, R. C. (1985). Educational assessment in the United States. In Sandals, L. (Ed.), An overview of the
uses of computer-based assessment and diagnosis (1992). Canadian Journal of Educational Commu-
nication, 21(1), 67–78.
Greenwood, C., & Rieth, H. (1994). Current dimensions of technology-based assessment in special edu-
cation. Exceptional Children, 61(2), 105–113.
Isham, D. (1997). Developing a computerized interactive visualization assessment. The Journal of
Computer-Aided Environmental Design and Education 3(1). Online. Scholarly Communications Project,
University Libraries, Virginia Tech. Available at: http://scholar.lib.vt.edu/ejournals/JCAEDE/v3n1/.
(21 January 1999).
Leg, S. M., & Buhr, D. C. (1992). Computerized adaptive testing with different groups. Educational
Measurement, 1(2), 23–28.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis (2nd ed.). Thousand Oaks, CA: Sage
Publications.
Ormrod, J. E. (1995). Human learning. New Jersey: Prentice-Hall.
Perkins, R. F. (1995). Using hypermedia programs to administer tests: effects on anxiety and perfor-
mance. Journal of Research on Computing in Education, 28(2), 209–220.
Reed, M. D., & Overbaugh, R. (1993). The effects of prior experience and instructional format on teacher
education students’ computer anxiety and performance. Computers in the Schools, 9(2/3), 75–89.
Reed, M. W., & Palumbo, D. (1987). Computer anxiety: its relationship with computer experience and locus
of control. Paper presented at the Annual Conference of the Eastern Educational Research Association,
Boston, MA.
Richards, P. S., Johnson, D. W., & Johnson, R. T. (1986). A scale for assessing student attitudes toward
computers: preliminary findings. Computers in the Schools, 3(2), 31–38.
Shermis, M. D., & Lombard, D. (1998). Effects of computer-based test administrations on test anxiety
and performance. Computers in Human Behavior, 14(1), 111–123.
Soo, K., & Ngeow, Y. (1998). Effective English as a second language (ESL) instruction with interactive
multimedia: the MCALL project. Journal of Educational Multimedia and Hypermedia, 7(1), 71–89.
Wise, S., & Plake, B. (1990). Computer-based testing in higher education. Measurement and Evaluation in
Counseling and Development, 23(1), 3–10.
Zandvliet, D., & Farragher, P. (1997). A comparison of computer-administered and written tests. Journal
of Research on Computing in Education, 29(4), 423–438.
