
Computers in Human Behavior 23 (2007) 1721–1739
www.elsevier.com/locate/comphumbeh

Exploring dimensions to online learning


Raafat George Saadé *, Xin He, Dennis Kira
Department of Decision Sciences and MIS, John Molson School of Business, Concordia University,
1455 De Maisonneuve Blvd., Montreal, Que., Canada

Available online 1 December 2005

Abstract

The study presented in this paper sought to explore several dimensions of online learning. Identifying these dimensions entails basic issues of great relevance to educators today. The primary question is: what are the factors that contribute to the success or failure of online learning? In order to answer this question, we need to identify the important variables that (1) measure the learning outcome and (2) help us understand the learning experience of students using specific learning tools. In this study, the dimensions we explored are students' attitude, affect, motivation and perception of Online Learning Tool usage. A survey methodology was used in which validated items from previous relevant research were adopted. 105 students completed the questionnaire. An exploratory factor analysis (EFA) was applied to the captured data. Results of the EFA identified the items that are relevant to the present context and that can be used to measure the dimensions of online learning. Affect and perception were found to have strong measurement capabilities with the adopted items, while motivation was the most weakly measured.
© 2005 Elsevier Ltd. All rights reserved.

Keywords: Dimensions; Affect; Perceptions; Motivation; Learning; Attitudes

1. Introduction

The past ten years have seen a rapid increase in Internet use. Individuals are continu-
ously using the Internet to perform a wide range of tasks such as research, shopping and
learning. Education has expanded from the traditional in-class environment to the new

*
Corresponding author. Tel.: +1 450 682 5563.
E-mail address: rsinfo@sympatico.ca (R.G. Saadé).

0747-5632/$ - see front matter © 2005 Elsevier Ltd. All rights reserved.
doi:10.1016/j.chb.2005.10.002

digital phenomenon where teaching is assisted by computers (Richardson & Swan, 2003). Today, we find a vast number of courses, seminars, certificates and other educational offerings on the Internet. This wave of educational material and online learning has challenged the effectiveness of the traditional educational approach still in place at universities and other educational institutions. Consequently, these institutions are struggling to redefine and restructure their strategies for providing education and delivering knowledge (Association of European Universities, 1996). With today's student demographics, educational institutions are rushing to meet the needs of the new learner by designing and setting up online learning tools as support to the computer assisted classroom.
In online learning, there seem to be two major research clusters: one deals with the development of good designs while the other deals with the assessment of students' satisfaction with an online course as it relates to a traditional face-to-face course. There is a sense of great expectation surrounding the development and use of online courses because of their versatility, flexibility and personalization potential. Many researchers today (Poole & Lorrie, 2003; Saadé, 2003; Shih, Munoz, & Sanchez, 2004; Sunal, Sunal, Odell, & Sundberg, 2003) advocate that these expectations should be contrasted with proper empirical studies and rigorous analysis of results evaluating the efficiency of the online learning courses being implemented. The implications of this type of research touch many domains due to the multi-dimensionality of online learning.
For starters, the instructor who once simply walked into a classroom to teach must now construct a virtual environment to support and engage in distance learning. To that effect, the instructor is required not only to know the subject matter but to learn theories related to education, computer science, and behavior as well. Moreover, the instructor also needs to learn the success factors of online learning and modify not only his/her role but character traits as well. Finally, the instructor needs to develop different communication skills, be more creative and imaginative, and develop broader computer competencies. It is safe to assume that the implications for an instructor wishing to supplement his/her classroom teaching or teach a complete online course are personal as much as they are professional. That need to create online space has led many instructors to view themselves as "builders" first and foremost and then as "teachers" (Blythe, 2001; Gillette, 1999).
So far, the efforts devoted to reporting on experiences with online learning have not been substantiated and have not progressed beyond the "no significant difference" finding when compared to classroom-based courses. It is also known that applying information technologies to the learning process modifies not only the instructor's role but also that of the students. To complicate matters further, the design of an online course entails addressing many elements that influence the learning process. Such elements include, but are not limited to, instructional strategy, pedagogy integration, the learning theory used, the learning tools made available, the facilitation of communication processes among learners and instructor, and the technologies used. All these design variables also depend on the type of course being implemented online.
In effect, we end up with a mix of inter-related components identified by four major categories, namely the human component, the design component, the instructional component and the performance component. Each one of these components contains inter-related variables. As a result, when it comes to the development of an online course, it

is necessary to analyze how the roles of instructors and students are affected and to
develop appropriate procedures, best practices or guidelines that allow the design and
evaluation of the impact of online courses on the learning process and to foster learning
online (Savenye, Olina, & Niemczyk, 2001).
Considering the design of an online course, the instructor should be aware of two major approaches, namely system-centered and user-centered. The system-centered approach creates a model as viewed by the instructor, while the user-centered approach solicits the participation of the student in the design process (Blythe, 2001). Regardless of the design approach, the instructor should also be aware of the instructional design model to be used. Many models have been developed (over 60 reported), and the common development steps include analysis, design and development (or strategy), and evaluation (Savenye et al., 2001; Smith & Ragan, 1999). Other models include learner analysis, stating objectives, selecting methods, media and materials, utilizing media and materials, requiring learner participation, and evaluating and revising the instruction (Savenye et al., 2001).
It is evident from the foregoing how much effort is required of the learner and of the instructor in taking, and in designing and implementing, an online course. An online course includes different tools. Online learning tools can be defined as any web sites, software, or computer-assisted activities that intentionally focus on and facilitate learning on the Internet. Learning tools that have been investigated by researchers include web-based dynamic practice systems, multimedia applications and game-based learning modules (Eklund & Eklund, 1996; Irani, 1998; Poole & Lorrie, 2003; Saadé, 2003; Sunal et al., 2003). These learning tools focus on specific learning aspects and try to meet the learning needs of a particular group of learners.
With the wide use of technology in today's learning environment, we should no longer be concerned with finding out which is better, face-to-face or technology-enhanced instruction (Daley et al., 2001). In fact, a student's experience with a course entails not only the final grade but also how far the learning objectives have been attained. Holistic experiences with the course should also be emphasized. Online learning presents new opportunities for more engaged and student-centered learning, thereby enhancing the learning experience. Our primary preoccupation should be whether students really learn with the intervention of online learning tools. If yes, what are the variables that contribute to the success of online learning tools? If no, then what is going wrong and how can we enhance the learning tool in question? To understand the process of learning using online learning tools, we need to identify the important variables that measure the learning outcome of students using a specific learning tool, and also the variables that help us understand students' learning experience with the learning tool.
Looking at the previous literature, we identify six variables that researchers consider important to the learning outcome and learning experience with online learning tools. These variables (or constructs) are affect, learner's perception of the course, perceived learning outcome, attitude, intrinsic motivation and extrinsic motivation. In this study, a survey methodology was followed. We adopted items (questions) for these variables from different studies and performed an Exploratory Factor Analysis (EFA) to test the validity of the variable sets in the present context. The ultimate objective of this paper is to identify those variables that may play a significant role in learning while using online learning tools.

1.1. Dimensions of online learning

A recent study by Sunal et al. (2003) analyzed a body of research on best practice in asynchronous and synchronous online instruction in higher education. The study indicated that online learning is viable and resulted in the identification of potential best practices. Most studies on student behavior were found to be anecdotal rather than evidence based. Researchers today are concerned with exploring student behavior and attitudes towards online learning.
In an attempt to study behavior and attitudes towards online learning tools, we found several theoretical models that have been proposed, including innovation diffusion theory, the theory of reasoned action, the theory of planned behavior, and the technology acceptance model (TAM). All generally agree that an individual's beliefs and perceptions of IT have a significant influence on his or her intentions to use it and on eventual usage. These theories were mostly used to explain the variance in employees' motivation and perceptions in using different software in the context of corporate training. However, few of the studies considered affect, and none were found using course-related beliefs and perceptions.
The evaluation of behavior and attitude factors in the context of online learning tools is not well developed, and studies remain scarce. Motivated by the need for more concrete and accurate evaluation tools, we identified six important factors that may be used to better understand student behavior and attitude towards online learning. These factors, which we shall refer to as the dimensions of online learning, are affect, perception of the course, perceived learning outcome, attitude, intrinsic motivation and extrinsic motivation.

1.1.1. Affect
Affect refers to an individual's feelings of joy, elation, pleasure, depression, distaste, discontentment, or hatred with respect to a particular behavior (Triandis, 1979). Triandis (1979) argued that the literature showed a strong relationship between affect and behavior. In a business context, a positive relationship was observed between affect and senior management's use of executive information systems. Positive affect towards technology leads to gaining experience, knowledge and self-efficacy regarding technology, while negative affect leads to avoiding technology, and thereby to not learning about it or developing perceived control (Arkkelin, 2003).

1.1.2. Learner's perception of the course


Students' perceptions of using technology as part of the course learning process have been found to be mixed (Kum, 1999; Picciano, 2002). Some students were uncomfortable with the student-centered nature of the course and were put off by the increased demands of computer-based instruction, which reduced student engagement in the course and led to a decline in student success (Lowell, 2001). Learners' perception of the course may influence behavior due to non-familiarity with the learning tool used. Until students fully understood what was expected of them, they often acted with habitual intent based on an imprecise understanding or perception of the course (Davies, 2003).

1.1.3. Perceived learning outcome


Perceived learning outcome is defined as the observed results in connection with the use
of learning tools. Perceived learning outcome was measured with three items: (1) perfor-
mance improvement; (2) grades benefit; and (3) meeting learning needs. Previous studies

have shown that perceived learning outcomes and satisfaction are related to changes in the traditional instructor's role from leader to facilitator and moderator in an online learning environment (Faigley, 1990; Feenberg, 1987; Krendl & Lieberman, 1988). Researchers have also reported that students who have a positive perceived learning outcome may have more positive attitudes about the course and their learning, which may in turn lead them to make greater use of online learning tools.

1.1.4. Attitude
Most of the online learning literature concentrates on student and instructor attitudes towards online learning (Sunal et al., 2003). Marzano and Pickering (1997) indicated that students' attitudes impact the learning they achieve. Research has also been conducted to validate this assertion and to extend it to the online environment (Daley et al., 2001). Moreover, the Technology Acceptance Model (TAM) (Davis, Bagozzi, & Warshaw, 1989) suggests that attitudes towards use directly influence intentions to use the computer and, ultimately, actual computer use. Davis et al. (1989) demonstrate that an individual's initial attitudes regarding a computer's ease of use and usefulness influence attitudes towards use.

1.1.5. Intrinsic motivation


Researchers also studied motivational perspectives to understand behavior. Davis,
Bagozzi, and Warshaw (1992) have advanced this motivational perspective to understand
behavioral intention and to predict the acceptance of technology. They found intrinsic and
extrinsic motivation to be key drivers of behavioral intention to use (Vallerand, 1997;
Venkatesh, 1999). Wlodkowski (1999) defined intrinsic motivation as an evocation, an
energy called forth by circumstances that connect with what is culturally significant to
the person. Intrinsic motivation is grounded in learning theories and is now being used
as a construct to measure user perceptions of game/multimedia technologies (Venkatesh,
1999; Venkatesh & Davis, 2000; Venkatesh, Speier, & Morris, 2002).

1.1.6. Extrinsic motivation


Extrinsic motivation was defined by Deci and Ryan (1985) as the performing of a behavior to achieve a specific reward. From a student's perspective, extrinsic motivation for learning may include getting a higher grade on exams, getting awards, getting prizes and so on. Much research has already verified that extrinsic motivation is an important factor influencing learning. However, other research suggests that extrinsic motivation is not as effective as intrinsic motivation in motivating learning or the use of technology to facilitate learning.

2. Methodology

An exploratory factor analysis approach was followed to test the validity of the dimensions of online learning. EFA uses mathematical criteria to create factor models from the data. It simplifies the structure of the data by grouping together observed variables that are inter-correlated under one "common" factor (or, in the context of this study, dimension). Prior to the presentation of the EFA approach and results, we describe the tool used, the experimental setup including participants and procedure, and the questionnaire used.

2.1. The online learning tool

The Online Learning Tool was developed so that students could practice and then assess their knowledge of the content material and concepts in an introductory management information systems course. The learning tool helps students rehearse as well as learn by prompting them with multiple choice and true/false questions. The learning tool is web based and can be accessed using any web browser. The web was selected to implement the learning tool because the technology is available from many locations: around campus, at friends' places, in Internet cafes and at home. Access would therefore not be a barrier to usage of the technology.

2.1.1. Design
In this study, the learning tool was developed using the systems approach to design. The practice of the systems approach is represented by the "waterfall model", in which the designer/instructor identified the objective(s) of the learning tool, followed by the generation of a formal specification of the functioning system (learning tool). The specifications were then given to a programmer who created the online learning tool described above. When a first version of the tool was completed, it was tested for functionality and robustness and verified against the initial requirements. After some refinement, the learning tool was revised and made ready for use.
The learning tool is programmed in HTML and scripting languages with Active Server Pages (ASP) support to communicate with the database. The HTML and ASP pages are very simple in design and do not include graphics, images or any other distracting objects. Each page includes one or two buttons that students can click on. This design allows the student to focus on the task at hand rather than on exploration.
The learning tool is made up of three components: (1) the front end, which interacts with the user; (2) the middle layer, which stores and controls the interaction session; and (3) the back end, which includes the database of questions. The front end is simple and allows the student to log into the web site and select whether he/she wants to practice or be evaluated. The middle layer keeps track of the student's performance and controls the logic behind selecting questions from the database and prompting the student with them. The back end (database) contains the multiple choice and true/false question pools, students' answers to the questions, and the time they spent answering each set of questions.
Since the Online Learning Tool was developed for the web, students were able and allowed to use the system anywhere, anytime. The system monitored students' activities such that usage time, chapters accessed and average scores per chapter were stored and time stamped.
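
To make the monitoring concrete, the following minimal Python sketch models the kind of bookkeeping the middle layer performs. It is an illustration only: the actual tool was written in ASP with a database, and all names here (SessionTracker, ChapterActivity, record_answer) are hypothetical.

import time
from dataclasses import dataclass, field

@dataclass
class ChapterActivity:
    # Hypothetical record of one student's work on one chapter.
    questions_answered: int = 0
    correct: int = 0
    time_spent: float = 0.0              # seconds
    timestamps: list = field(default_factory=list)

class SessionTracker:
    """Sketch of the middle layer: stores usage time, chapters
    accessed and average score per chapter, time stamped."""
    def __init__(self):
        self.activity = {}               # (student_id, chapter) -> ChapterActivity

    def record_answer(self, student_id, chapter, is_correct, seconds):
        rec = self.activity.setdefault((student_id, chapter), ChapterActivity())
        rec.questions_answered += 1
        rec.correct += int(is_correct)
        rec.time_spent += seconds
        rec.timestamps.append(time.time())

    def average_score(self, student_id, chapter):
        rec = self.activity[(student_id, chapter)]
        return rec.correct / rec.questions_answered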

2.1.2. Implementation
Students were asked to use the Online Learning Tool, which counted for 10% of their final mark. The remaining part of the course grade was distributed between a midterm exam

(25%), a project (20%) and a final exam (45%). The Online Learning Tool interface was simple and contained two major components. The first component included a practice engine where students could practice multiple choice and true/false questions without being monitored or having any of their activities stored. The second component is a test site similar to the practice site of Component 1. Both parts have the same interface, engine and pool of questions.
The Online Learning Tool is integrated into the instructional design of the course with some pedagogical elements in mind. First, the questions in the practice (Component 1) and assessment (Component 2) components of the Online Learning Tool are retrieved from the same pool in the database. This implies that some questions will repeat and therefore encourages students to use cognitive skills such as short-term memory, working memory, recognition and recollection. This is especially true because the students are notified that the pool of questions is fixed and that questions will reappear; that is, students need to be very attentive during the exercise/practice process. Questions included multiple choice and true/false, and students were given immediate feedback on their responses. Second, the assessment (Component 2) of the students' level of acquired knowledge for a specific chapter of the course is not limited by the number of questions that the students are asked to answer but only by their willingness to practice. Students have the flexibility to answer as many questions as they wish. In other words, students may practice again and be re-assessed (tested) as many times as they wish. The final assessment mark, however, is calculated as the average of all the assessments taken. For example, the student is required to do a minimum of 20 questions. If the student, after answering 20 questions, receives an average of 75% and wishes to increase this average, then the student may practice some more (using Component 1), return to the assessment part (Component 2) and attempt 10 more questions. If the student scores 80% on the second set of questions, then the running average is (80 + 75)/2 = 77.5%. Third, the answers to a few questions in every chapter were intentionally specified incorrectly. That is, if the student selects the correct answer, the Online Learning Tool will tell the student that the answer is wrong. Students are notified about this fact and are encouraged to find those errors and report them. Students are given bonus points for finding those wrong questions/answers.
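
As a check of the arithmetic above, a two-line Python sketch of the running-average rule (the function name is hypothetical; the actual tool computed this server-side):

def running_average(scores):
    """Final assessment mark: the average of all assessment sets taken."""
    return sum(scores) / len(scores)

# Example from the text: a first set averaging 75%, then a second set at 80%.
print(running_average([75, 80]))  # 77.5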

2.1.3. Pedagogical implications


In a systems approach to the development of online learning tools, the specifications are formed as a product of the mind of the designer. From this process, a designer's image of the system emerges. The online learning tool presented in this study was designed by the instructor with the following aims:

1. to get students more involved in testing their knowledge of topics learned by practicing
questions online and
2. to provide them with the opportunity to work together with the objective of maximizing
the points gained for this component of the course.

Although the second aim may seem to be merely about practicing questions, it is actually in line with the constructivist philosophy through the way practicing is promoted. Specifically, students are encouraged to work together to answer the questions, thereby giving

them the opportunity to communicate about what they have learned; they are also instructed to identify the wrong questions, with the incentive of gaining further points. Constructivism is based on the premise that knowledge is constructed by the individual through his or her interactions with the environment. In the present case, the environment includes material resources, other students and interactive tools. Constructivism has its roots in cognitive psychology, which holds that individuals build their own understanding of their world through experience, maturation and interaction, including interaction with other individuals.
The implications of constructivism for a learning environment include the customization of curricula to prior knowledge, the tailoring of teaching strategies to student backgrounds and the use of open-ended questions that promote dialogue among learners. Also, the constructivist approach is viewed by many (including the authors of this paper) as more than just activities. The mere fact of providing the opportunity for learners to be part of their own learning process fosters active involvement and learning, which clearly demonstrates greater respect for the student as a learner and as a human being.
The design and implementation of the online learning tool used in this study at least partially meets the constructivist approach: learners are more active in the learning process by interacting with the learning tool, and they are given the opportunity to take part in their learning process by getting involved with others to score higher grades and find the mistakes.

2.2. Participants and procedure

A total of 105 undergraduate students in the John Molson School of Business, Concordia University, participated in using the Online Learning Tool. The student sample represents a group:

• 37% between the ages of 18 and 22, 21% between 22 and 24, 10% between 24 and 26, and 32% above 26 years old (Fig. 1);
• with a majority (57%) claiming to have 2–5 years of experience using the Internet and 33% claiming more than 5 years of Internet experience (Fig. 2);

Fig. 1. Age group (pie chart: 37% 18–22 yrs, 21% 22–24 yrs, 10% 24–26 yrs, 32% over 26 yrs).



Fig. 2. Experience using the Internet (pie chart: 10% 1–2 yrs, 57% 2–5 yrs, 33% more than 5 yrs).

Fig. 3. Daily frequency of Internet use (categories: less than 15 min; 15 min–1 h; 1–2 h; 2–5 h; more than 5 h; pie chart segments of 5%, 5%, 14%, 33% and 43%).

• with the majority (90%) indicating that they use the Internet more than 1 h a day (see Fig. 3).

A flowchart describing the suggested students' learning process with the Online Learning Tool integrated is shown in Fig. 4 below. Steps 1–5 form a cycle that needs to be followed for every chapter. First, the student should study chapter C(i) prior to using the Online Learning Tool (step 1). Once the student has studied chapter C(i), he/she can log in (via the Internet) and select to practice answering questions P(i,j,k) associated with the chapter i studied, where j and k represent multiple choice and true/false questions, respectively (step 2). The practicing component prompts the student with a set of five questions at a time. The student answers the questions and requests to be evaluated. The Online Learning Tool then identifies the correct and incorrect answers. The student can verify the results and, when ready, click on the 'Next' button to be prompted with another randomly selected set of questions (step 2). The student can practice as much as he/she feels is necessary (step 3), after which he/she can take the test for the specific chapter, T(i,j,k) (step 4). The student can then continue with another cycle identified by a new chapter to study and practice (step 5). At any time, a student can request an activity report, which includes a detailed view of what and how much they practiced, and a summary report, which provides them with running average performance data.

Fig. 4. The Online Learning Tool process (flowchart: (1) study chapter C(i) → (2) practice questions P(i,j,k) → (3) ready for test? → (4) assess T(i,j,k) → (5) go to next chapter; (6) reports on P(i,j,k) and T(i,j,k)).
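
The cycle of Fig. 4 can also be written out as a short Python sketch of the suggested learning process. This is illustrative pseudocode rather than the actual tool; the callables passed in are hypothetical.

def study(chapter):
    """Step 1: offline studying of chapter C(i) (placeholder)."""
    pass

def learning_process(chapters, practice, assess, ready_for_test):
    """Per-chapter cycle from Fig. 4.

    practice(i)       -- prompts sets of five questions P(i,j,k) for chapter i
    assess(i)         -- runs the monitored test T(i,j,k) and returns a score
    ready_for_test(i) -- the student's own judgment (step 3)
    """
    scores = {}
    for i in chapters:                  # step 5: continue with the next chapter
        study(i)                        # step 1
        while not ready_for_test(i):    # step 3
            practice(i)                 # step 2: practice P(i,j,k)
        scores[i] = assess(i)           # step 4: assessment T(i,j,k)
    return scores                       # step 6: basis for the activity reports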

2.3. Questionnaire

Validated constructs were adopted from different relevant prior research (Agarwal & Karahanna, 2000; Davis, 1989; Venkatesh, Morris, Davis, & Davis, 2003). The wording of items was changed to account for the context of the study. All items shown in Appendix A were measured on a 5-point scale with anchors from "Strongly disagree" to "Strongly agree", with the exception of 'learners' perception of the course', whose anchors ranged from 0% to 100%. The questionnaire included negatively worded items, and the items were shuffled to reduce the monotony of questions measuring the same construct.
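
One preprocessing detail implied by the negatively worded items (for example ATT6, "Learning tools are not helpful...") is reverse coding before analysis. The paper does not describe this step explicitly, so the following Python sketch is an assumption of standard practice, with an illustrative item list:

import pandas as pd

NEGATED_ITEMS = ["ATT6"]  # illustrative; the paper does not list the negated items

def reverse_code(responses: pd.DataFrame, items, lo=1, hi=5):
    """Map 1<->5 and 2<->4 (3 stays fixed) on the 5-point scale
    for the named columns, so all items point the same way."""
    out = responses.copy()
    for col in items:
        out[col] = hi + lo - out[col]
    return out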

3. Results and discussion

3.1. Student feedback on affect (AFF)

Fig. 5 represents the student feedback on affect. The affect reported by the sampled students is not positive. More than 50% of the students reported that they find the "learning tool" to be a nuisance. Forty percent of them also reported frustration in using the "learning tool". The same proportion of students reported anxiety and tension in using the "learning tool".

Fig. 5. Student feedback on affect (response distributions, strongly disagree to strongly agree, for the three affect items: the "learning tool" makes me anxious and tenser; the "learning tool" frustrates me; I find the learning tool to be a nuisance).



The negative affect in using the "learning tool" was not due to technical problems, since very few technology-related problems were reported. Students most probably had negative affect because the course (which is also reflected in the number of chapters to practice for) contained a large amount of information. It has previously been observed that negative affect causes students to avoid using a "learning tool" (Arkkelin, 2003). In the present study, students continued using the learning tool because their scores were part of their final mark (5%).

3.2. Student feedback on learners' perception of the course (PC)

Fig. 6 represents the student feedback on learners' perception of the course. The perception of the course was positive. More than 50% of the students indicated that the learning tool is important for the course. Close to 95% of the students felt that they would score above 50% in the course, with half of them expecting a mark above 75%. Approximately 75% of the students appeared to invest more than 50% of their effort in this course. In relation to the course material, nearly 60% of the students rated this course, compared to other courses they had taken, at around 75% on the value of its topics and content, at around 75% on difficulty, and at around 75% on their enthusiasm for taking it.

3.3. Student feedback on attitude (ATT)

Fig. 7 represents the student feedback on attitude towards using the "learning tool". Close to 60% of the students found that "learning tools" are helpful for better understanding the course content. Also, 60% of the students reported that the advantages of the "learning tool" outweigh the disadvantages. Most students felt that the "learning tool" had little influence in improving their interaction with fellow students, in helping their performance in other courses, or in making them feel more productive. These results were expected because the "learning tool" targeted students' learning of specific topics in this course, not in other courses. Also, the "learning tool" was not designed to enhance collaboration among students.

Fig. 6. Student feedback on learners' perception of the course (PC) (ratings from 0% to 100% on: importance of the "learning tool" for the course; expected grade; effort invested; value of topics and content compared to other courses; difficulty compared to other courses; enthusiasm compared to other courses).



Fig. 7. Student feedback on attitude (agreement, strongly disagree to strongly agree, with the six attitude items listed in Appendix A).

What is interesting is that 10% of the students actually did feel that the "learning tool" would help them in other courses, claimed that it improved the quality of interaction with other students, and felt that they were more productive using it.

3.4. Student feedback on perceived learning outcome (PLO)

Fig. 8 represents the student feedback on perceived learning outcome. As shown, perceived learning outcome is very positive. More than 60% of the students indicated that the "learning tool" meets their learning needs and does not waste their time.

Fig. 8. Student feedback on perceived learning outcome (agreement, strongly disagree to strongly agree, with the five PLO items listed in Appendix A).



Their understanding of the topic improved by using the tool. Close to 50% of the students reported that they understood the strategy of the "learning tool" and were able to adjust their learning to maximize the advantage of using it.

3.5. Student feedback on intrinsic and extrinsic motivation (IM and EM)

Fig. 9 represents student feedback on motivation. More than 80% of the students reported that the "learning tool" being a support throughout the semester motivated them to use it more regularly. This indicates that students used the "learning tool" because they believed it supported their learning in the course throughout the semester. At the same time, 80% of the students reported that they used the "learning tool" more seriously because it was part of the grading scheme. Both intrinsic and extrinsic motivation thus played an important role in learning.

3.6. Exploratory factor analysis

It is important to choose a proper factor analysis method to conduct the exploratory factor analysis (EFA) effectively. Although principal axis factoring has been the most widely used method in the past, maximum likelihood (ML) has a number of desirable features that make it the currently preferred method for conducting exploratory factor analysis. ML estimation methods have been shown to be robust to mild to moderate departures from normality in generating appropriate factor solutions (Boomsma, 1987). ML attempts to find the most likely population parameter estimates to have produced the observed correlation matrix, assuming that the observed correlation matrix comes from a sample drawn from a multivariate normal population (Lawley & Maxwell, 1965). Therefore, in this study, we apply ML in the factor analysis.
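
For illustration, a maximum-likelihood EFA of this kind can be run with the Python factor_analyzer package. The sketch below assumes the item responses sit in a data frame with one column per questionnaire item; the file name is hypothetical.

import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical file: one row per respondent, one column per item (AFF1 .. EM2).
items = pd.read_csv("questionnaire.csv")

# Maximum-likelihood extraction; no rotation yet, six factors as retained later.
fa = FactorAnalyzer(n_factors=6, method="ml", rotation=None)
fa.fit(items)

eigenvalues, _ = fa.get_eigenvalues()  # eigenvalues of the correlation matrix
print(eigenvalues)                     # input to the Kaiser criterion and scree-plot
print(fa.loadings_)                    # item-by-factor loading matrix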

Fig. 9. Student feedback on intrinsic and extrinsic motivation (agreement, strongly disagree to strongly agree, with the IM and EM items listed in Appendix A).



3.6.1. Factor analysis


First, we performed an initial factor analysis to observe the relationships among the factors and their indicators. Some variables were well defined by a factor (AFF1, AFF2 and AFF3 with Factor 4; PLO1, PLO2 and PLO5 with Factor 5). However, other variables, such as ATT1, loaded on both Factor 1 (0.580) and Factor 2 (0.578). During subsequent factor analysis we rotated the matrix to improve our ability to interpret the loadings (to maximize the high loading of each observed variable on one factor and minimize its loadings on the other factors). We also analyzed the factor eigenvalues. The eigenvalue for a given factor reflects the variance in all the variables which is accounted for by that factor (Nagpaul, 1999). A critical limit for the eigenvalue is 1.0, such that factors with values above that limit are acceptable. This identifies the number of factors that can be extracted from the item pool. Factors with eigenvalues greater than 1.0 contain acceptable variance among the observed items; those with values less than 1.0 are dropped from the factor solution.
Analysis of eigenvalues is done by considering a scree-plot. A scree-plot (such as the one shown in Fig. 10) presents the factors possibly extracted from the pool of items and their associated eigenvalues. The adequacy of the factors can be determined by examining the scree-plot.
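
A scree-plot such as Fig. 10 can be drawn directly from those eigenvalues; a minimal matplotlib sketch follows (the eigenvalue list here is made up for illustration; in practice use the array returned by the factor analysis above):

import matplotlib.pyplot as plt

# Illustrative eigenvalues only; in practice take the array returned by
# FactorAnalyzer.get_eigenvalues() in the previous sketch.
eigenvalues = [4.1, 2.0, 1.6, 1.3, 1.1, 1.0, 0.45, 0.35, 0.3, 0.25, 0.2, 0.15]

plt.plot(range(1, len(eigenvalues) + 1), eigenvalues, marker="o")
plt.axhline(1.0, linestyle="--")  # Kaiser criterion: retain eigenvalues > 1.0
plt.xlabel("Factor Number")
plt.ylabel("Eigenvalue")
plt.show()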
Factor analysis was performed on the original set of items. After this initial analysis (without rotation) of the eigenvalues, six factors were retained. Through rotation, we would like each item to be identified more strongly with one factor or another. There are three basic approaches to rotation: (a) graphic, which is difficult to apply when the clustering of variables is unclear or there are more than two factors (Kim & Mueller, 1978); (b) analytic, which consists of orthogonal and oblique schemes; and (c) rotation to a target matrix, where the researcher already has a pattern of relationships in mind. In this study, we used analytic rotation, as it is the most widely used method. We used orthogonal rotation, which tends to maximize the loadings on one factor and minimize the loadings on the other factor or factors. The most commonly used rotation scheme for orthogonal factors is Varimax, which attempts to minimize the number of variables that have high loadings on one factor.

Fig. 10. Eigenvalues for factors reflecting variance in items (scree-plot: eigenvalue vs. factor number).



The EFA was performed in three steps: (1) unrotated on all items; (2) rotated on all
items and (3) rotated and refined. In step 3, refined implies that we dropped all the items
that did not meet the inclusion criteria. Also, in each step we analyzed the factor matrix,
eigenvalues and the scree-plot. Here, we present the final solution.
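
The three steps can be expressed compactly in the same Python setting. The 0.5 loading cutoff below is an illustrative assumption, since the paper states only that dropped items failed the inclusion criteria; the data file is hypothetical as before.

import pandas as pd
from factor_analyzer import FactorAnalyzer

items = pd.read_csv("questionnaire.csv")  # hypothetical item-response file

# Step 1: unrotated maximum-likelihood solution on all items.
fa_unrotated = FactorAnalyzer(n_factors=6, method="ml", rotation=None)
fa_unrotated.fit(items)

# Step 2: varimax-rotated solution on all items.
fa_rotated = FactorAnalyzer(n_factors=6, method="ml", rotation="varimax")
fa_rotated.fit(items)

# Step 3: refine -- drop items whose highest absolute loading is below a
# cutoff, then refit the rotated model on the retained items.
loadings = pd.DataFrame(fa_rotated.loadings_, index=items.columns)
retained = loadings[loadings.abs().max(axis=1) >= 0.5].index
fa_final = FactorAnalyzer(n_factors=6, method="ml", rotation="varimax")
fa_final.fit(items[retained])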

3.6.2. Retained solution


Due to low correlations and low factor loadings, the following items were rejected: AFF1, PC2, PC4, PC5, PC6, ATT6, PLO1, PLO2, PLO4, IM1, and EM1. After dropping these items, the final analysis is presented. Table 1 summarizes the relationships among the factors and their observed indicators. Items with high values are bold to contrast the loading on their respective factor. Items that belong together should have relatively higher loadings on the same factor. For example, PC1 and PC3 load 0.679 and 0.927 on Factor 4, which is high compared to the other variables, which load 0.338 or less on the same factor.
We can immediately see that the variables are well defined by a factor (PC1 and PC3 with Factor 4; ATT1, ATT2, ATT3, ATT4 and ATT5 with Factor 1; AFF2 and AFF3 with Factor 3; PLO3 and PLO5 with Factor 2).
The adequacy of the model can be determined by examining the scree-plot in Fig. 10. After three factors are retained, the eigenvalues of the additional factors approach a horizontal line (the scree); therefore, additional factors would add little to the model. We can also observe from Fig. 10 that the eigenvalues of the first six factors are larger than 0.5, and that from the seventh factor onward the eigenvalues drop below 0.5. If a factor has a low eigenvalue, it contributes little to the explanation of variance in the variables and may be ignored (Nagpaul, 1999).

4. Limitations and conclusion

Dimensions that influence online learning have been investigated by researchers under different experimental settings. In this study, we gathered items from the literature and tested the validity of these items in the context of the use of an online learning tool. The implications of our findings are confined by the limits within which we interpret the results, and these limitations must be acknowledged.

Table 1
Factor loadings on respective items
Variable Factor 1 Factor 2 Factor 3 Factor 4 Factor 5 Factor 6
AFF2 0.155 0.140 0.877 0.050 0.122 0.208
AFF3 0.200 0.006 0.650 0.200 0.035 0.147
PC1 0.282 0.151 0.073 0.679 0.326 0.079
PC3 0.236 0.170 0.095 0.927 0.005 0.193
ATT1 0.666 0.240 0.182 0.338 0.236 0.250
ATT2 0.672 0.116 0.083 0.044 0.253 0.206
ATT3 0.740 0.202 0.185 0.212 0.101 0.043
ATT4 0.674 0.142 0.115 0.100 0.008 0.036
ATT5 0.588 0.189 0.327 0.214 0.431 0.060
PLO3 0.144 0.555 0.086 0.098 0.297 0.076
PLO5 0.424 0.732 0.143 0.123 0.367 0.016
IM2 0.176 0.188 0.034 0.276 0.335 0.574
EM2 0.119 0.250 0.235 0.210 0.556 0.123

From the participants' perspective, bias in the sample of learners may be due to the sample size and demographic composition. Moreover, the course is an introductory MIS course containing many chapters and additional topics that we ask the students to learn. This is especially difficult for students who have never been exposed to the field of information technology. Therefore, generalizing the findings in terms of behavior and intentions to other courses and schools may be limited. As a result, we need to identify the boundary conditions of the dimensions as they relate to demographic variables such as age, gender and Internet competencies, and to other course properties. In fact, the nature of the course is an important variable that contributes to the success or failure of online learning. In effect, some courses lend themselves to being delivered online while others do not. Similarly, some students have the skills to follow online learning tools while others do not.
Considering the questionnaire, it is not free of subjectivity. The respondents' self-report measures are not necessarily direct indicators of improved learning outcomes. Furthermore, although a proper validation process of the instrument was followed, questions collected from other research may not be precise and appropriate in the context of this study. The conclusions drawn are based on the usage of one specific online learning tool, not all online learning tools. Other learning tools can be designed for different tasks and for different platforms (in this case, web-based), and this study was based on a single distinct technology. The findings, therefore, may not generalize across a wide set of learning technologies.
The effectiveness of an online learning tool in facilitating students' learning, and the learners' learning outcomes, can be measured along many dimensions. In this study, we chose six important dimensions that have been investigated in different research and tested the validity of these dimensions in the current context. These six dimensions are affect, learner's perception of the course, attitude, perceived learning outcome, intrinsic motivation and extrinsic motivation. In this validation process, all six dimensions show content and construct validity. The invalid items in the constructs were eliminated and not considered in the final solution of the factor analysis. Student feedback on the question items and the factor analysis provide:

• validity of the dimensions that influence the effectiveness of online learning,
• controls to revalidate under different experimental setups,
• researchers with valid questionnaire items to test models or hypotheses under different contexts, hence facilitating the analysis of mediating effects on student experiences, and
• quantitative results that may help the researcher/instructor understand the dynamics of the online learning tool and identify critical elements to enhance the tool in helping students perform better in their learning process.

Faculty knowledge about course design is the most significant bottleneck to better teaching and learning in higher education (Fink, 2003). It is essential that any faculty member who wishes to put a course online attain a solid understanding of the major design and pedagogical principles involved. There are many design approaches, such as the systems approach, the user-centered approach and the backward approach (Wiggins, 1998). With respect to pedagogy, because e-learning is a fairly recent phenomenon, its underlying pedagogical principles have not yet been fully worked out (Bixler & Spotts, 2000; Govindasami, 2002). However, starting from traditional pedagogical principles, new extended online pedagogical methods can be created. These new pedagogical principles must form the very basis for the inclusion of features in online learning courses.

Appendix A. Question items in the questionnaire

Affect
AFF1. I find the "learning tool" to be a nuisance.
AFF2. Having the "learning tool" as part of the course requirement frustrates me.
AFF3. Having the "learning tool" as part of the course requirements makes me anxious and tenser.

Learner's perception of the course
PC1. Compared to other courses you have taken, how would you rate your enthusiasm for this course?
PC2. Compared to other courses you have taken, how difficult do you feel this course to be?
PC3. Compared to other courses you have taken, how would you rate the value of the topics and content of this course?
PC4. How much effort have you invested in this course?
PC5. What grade are you expecting to obtain in this course?
PC6. How would you rate the importance of the "learning tool" for this course?

Perceived learning outcome
PLO1. I recognized the value of the repetition of the questions while using the "learning tool".
PLO2. I quickly understood how the "learning tool" works and devised a strategy aimed at increasing my score.
PLO3. My understanding of the topic improved by using the "learning tool".
PLO4. The "learning tool" does not waste my time.
PLO5. The "learning tool" meets my learning needs.

Attitude
ATT1. I felt more productive by using the "learning tool".
ATT2. Students taking the course with me encouraged me to use the "learning tool".
ATT3. I find that the "learning tool" will help my performance in other courses also using it.
ATT4. The "learning tool" has improved the quality of interaction with other students taking the course with me.
ATT5. The advantages of the "learning tool" outweigh the disadvantages.
ATT6. "Learning tools" are not helpful for better understanding the course content.

Intrinsic motivation
IM1. I feel that if the "learning tool" is a support tool to the course, I would be more motivated to use it.
IM2. A "learning tool" that supports the course throughout the semester makes me more motivated to use it regularly.

Extrinsic motivation
EM1. Having the "learning tool" as part of the grading scheme for the course motivates me to use it more seriously.
EM2. The only reason I use the "learning tool" is because it is a requirement for the course.

References

Agarwal, R., & Karahanna, E. (2000). Time flies when you're having fun: cognitive absorption and beliefs about
information technology usage. MIS Quarterly, 24(4), 665–694.
Arkkelin, D. (2003). Putting Prometheus' feet to the fire: student evaluations of Prometheus in relation to their
attitudes towards and experience with computers, computer self-efficacy and preferred learning style. <http://
faculty.valpo.edu/darkkeli/papers/syllabus03.htm> (last accessed 17.04.04).
Association of European Universities. (1996). Restructuring the university: universities and the challenge of new
technologies. CRE Document No. 1.
Bixler, B., & Spotts, J. (2000). Screen design and levels of interactivity in web-based training. (Online) Available
from <http://www.clat.psu.edu/homes/jds/john/research/ivla98.htm>.
Blythe, S. (2001). Designing online courses: user-centered practices. Computers and Composition, 18, 329–
346.
Boomsma, A. (1987). Structural equation modeling by example: Applications in educational, sociological, and
behavioral research (pp. 160–188). Cambridge, England: Cambridge University Press.
Daley, B. J., Watkins, K., Williams, S. W., Courtenay, B., Davis, M., & Mike, D. (2001). Exploring learning in a
technology-enhanced environment. Educational Technology and Society, 4(3).
Davies, R. S. (2003). Learner intent and online courses. The Journal of Interactive Online Learning, 2(1).
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1992). Extrinsic and intrinsic motivation to use computers in the
workplace. Journal of Applied Social Psychology, 22(14), 1111–1132.
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology.
MIS Quarterly, 13, 319–340.
Davis, D. F., Bagozzi, P. R., & Warshaw, R. P. (1989). User acceptance of computer technology: a comparison of
two theoretical models. Management Science, 35(8), 982–1003.
Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human behavior. New York:
Plenum.
Eklund, J., & Eklund, P. (1996). Integrating the web and the teaching of technology: cases across two universities.
In Proceedings of the AusWeb96, the second Australian World Wide Web conference, Gold Coast, Australia.
<http://wwwdev.scu.edu.au/sponsored/ausweb/ausweb96/educn/eklund2> (last accessed February, 2004).
Faigley, L. (1990). Subverting the electronic network: teaching writing using networked computers. In D. Daiker
& M. Morenberg (Eds.), The writing teacher as researcher: Essays in the theory and practice of class-based
research. Portsmouth: Boynton/Cook.
R.G. Saadé et al. / Computers in Human Behavior 23 (2007) 1721–1739 1739

Feenberg, A. (1987). Computer conferencing and the humanities. Instructional Science, 16, 169–186.
Fink, L. D. (2003). Creating significant learning experiences: An integrated approach to designing college courses.
San Fransisco: Jossey-Bass.
Gillette, D. (1999). Pedagogy, architecture, and the virtual classroom. Technical Communication Quarterly, 80,
21–36.
Govindasami, T. (2002). Successful implementation of e-learning pedagogical considerations. Internet and Higher
Education, 4, 287–299.
Irani, T. (1998). Communication potential, information richness and attitude: a study of computer mediated
communication in the ALN classroom. ALM Magazine, 2(1).
Kim, J., & Mueller, C. (1978). An introduction to factor analysis. Sage series on quantitative applications in the
social sciences (Vol. 13). Newbury Park, CA: Sage.
Krendl, K. A., & Lieberman, D. A. (1988). Computers and learning: a review of recent research. Journal of
Educational Computing Research, 4(4), 367–389.
Kum, L. C. (1999). A study into students' perceptions of web-based learning environment. In HERDSA annual
international conference, Melbourne (pp. 12–15).
Lawley, D., & Maxwell, A. (1965). Factor analysis as a statistical method. London: Butterworth.
Lowell, R. (2001). The Pew Learning and Technology Program Newsletter. <http://www.math.hawaii.edu/~dale/
pew.html> (last accessed February, 2004).
Marzano, R. J., & Pickering, D. J. (1997). Dimensions of learning trainer's manual. Alexandria, VA: Mid-Continent
Research for Education and Learning.
Nagpaul, P. S. (1999). Guide to advanced data analysis using IDAMS software. New Delhi, India: National
Institute of Science Technology and Development Studies.
Picciano, G. A. (2002). Beyond student perceptions: issues of interaction, presence and performance in an online
course. Journal of Asynchronous Learning Networks, 6(1), 21–40.
Poole, B. J., & Lorrie, J. (2003). Education online: Tools for online learning. Education for an information age,
teaching in the computerized classroom (4th ed.).
Richardson, C. J., & Swan, K. (2003). Examining social presence in online courses in relation to students'
perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1), 68–88.
Saadé, G. R. (2003). Web-based educational information system for enhanced learning (EISEL): student assessment. Journal of Information Technology Education, 2, 267–277. <http://jite.org/documents/Vol2/v2p267-277-26.pdf> (last accessed February, 2004).
Savenye, C. W., Olina, Z., & Niemczyk, M. (2001). So you are going to be an online writing instructor: issues in
designing, developing and delivering an online course. Computers and Composition, 18, 371–385.
Shih, P., Munoz, D., & Sanchez, F. (2004). The effect of previous experience with information and communication technologies on performance in a web-based learning program. Computers in Human Behavior (in press).
Smith, P. J., & Ragan, T. J. (1999). Instructional design. Upper Saddle River, NJ: Merrill.
Sunal, W. D., Sunal, S. C., Odell, R. M., & Sundberg, A. C. (2003). Research-supported best practices for
developing online learning. The Journal of Interactive Online Learning, 2(1), 1–40.
Triandis, C. H. (1979). Values, attitudes, and interpersonal behavior. In Nebraska symposium on motivation, 1979:
Beliefs, attitudes and values (pp. 159–295). Lincoln, NE: University of Nebraska Press.
Vallerand, R. J. (1997). Toward a hierarchical model of intrinsic and extrinsic motivation. Advances in
Experimental Social Psychology, 29, 271–374.
Venkatesh, V. (1999). Creation of favorable user perceptions: exploring the role of intrinsic motivation. MIS
Quarterly, 23(2), 239–260.
Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance model: four
longitudinal field studies. Management Science, 46(2), 186–204.
Venkatesh, V., Speier, C., & Morris, M. G. (2002). User acceptance enablers in individual decision-making about
technology: toward an integrated model. Decision Sciences, 33, 297–316.
Venkatesh, V., Morris, M. G., Davis, F. D., & Davis, G. B. (2003). User acceptance of information technology:
toward a unified view. MIS Quarterly, 27, 425–478.
Wiggins, G. P. (1998). Educative assessment: Designing assessments to inform and improve student performance.
San Fransisco: Jossey-Bass.
Wlodkowski, R. J. (1999). Enhancing adult motivation to learn, revised edition, a comprehensive guide for teaching
all adults. San Francisco, CA: Jossey-Bass.
