
Computers & Education 55 (2010) 155–164

Contents lists available at ScienceDirect

Computers & Education


journal homepage: www.elsevier.com/locate/compedu

A study of student satisfaction in a blended e-learning system environment


Jen-Her Wu a, Robert D. Tennyson b,*, Tzyh-Lih Hsia c

a Department of Information Management, National Sun Yat-Sen University, 70 Lien-Hai Road, Kaohsiung 80424, Taiwan
b University of Minnesota, 56 East River Road, Minneapolis, Minnesota 55455, United States
c Department of Information Management, Chinese Naval Academy, P.O. Box No. 90175 Tsoying, Kaohsiung 813, Taiwan

Article info

Article history:
Received 18 March 2009
Received in revised form 23 December 2009
Accepted 31 December 2009

Keywords:
e-Learning
Satisfaction
Learner control
Internet
Teacher-directed
Learner-directed
Synchronous
Asynchronous
Face-to-face

Abstract

This study proposes a research model that examines the determinants of student learning satisfaction in a blended e-learning system (BELS) environment, based on social cognitive theory. The research model is tested using a questionnaire survey of 212 participants. Confirmatory factor analysis (CFA) was performed to test the reliability and validity of the measurements. The partial least squares (PLS) method was used to validate the measurement and hypotheses. The empirical findings indicate that computer self-efficacy, performance expectations, system functionality, content feature, interaction, and learning climate are the primary determinants of student learning satisfaction with BELS. The results also show that learning climate and performance expectations significantly affect learning satisfaction. Computer self-efficacy, system functionality, content feature and interaction significantly affect performance expectations. Interaction has a significant effect on learning climate. The findings provide insight into those factors that are likely significant antecedents for planning and implementing a blended e-learning system to enhance student learning satisfaction.

© 2010 Elsevier Ltd. All rights reserved.

1. Introduction

Classroom learning typically occurs in a teacher-directed instructional context with face-to-face interaction in a live synchronous environment. In contrast to this form of instruction is an approach that promotes learner-directed learning. With emerging Internet commer-
cialization and the proliferation of information technologies, online or electronic learning (e-learning) environments offer the possibilities
for communication, interaction and multimedia material delivery that enhance learner-directed learning (Wu, Tennyson, Hsia, & Liao,
2008). Although e-learning may increase access flexibility, eliminate geographical barriers, improve convenience and effectiveness for indi-
vidualized and collaborative learning, it suffers from some drawbacks such as lack of peer contact and social interaction, high initial costs
for preparing multimedia content materials, substantial costs for system maintenance and updating, as well as the need for flexible tutorial
support (Kinshuk & Yang, 2003; Wu et al., 2008; Yang & Liu, 2007). Furthermore, students in virtual e-learning environments may expe-
rience feelings of isolation, frustration and confusion (Hara & Kling, 2000) or reduced interest in the subject matter (Maki, Maki, Patterson,
& Whittaker, 2000). In addition, student satisfaction and effectiveness for e-learning has also been questioned (Piccoli, Ahmad, & Ives, 2001;
Santhanam, Sasidharan, & Webster, 2008).
Given these concerns about and dissatisfaction with e-learning, educators are searching for alternative instructional delivery solutions to relieve the above problems. The blended e-learning system (BELS) has been presented as a promising alternative learning approach (Graham, 2006). BELS refers to an instructional system that combines multiple learning delivery methods, most often face-to-face classroom instruction with asynchronous and/or synchronous online learning. It is characterized as maximizing the advantages of both face-to-face and online education.
While BELS has been recognized as having a number of advantages (e.g., instructional richness, access to knowledge content, social interaction, personal agency, cost effectiveness, and ease of revision (Osguthorpe & Graham, 2003)), insufficient learning satisfaction is still an obstacle to successful BELS adoption (So & Brush, 2008). In fact, research findings from Bonk and colleagues have shown that learners had difficulty adjusting to BELS environments due to potential problems in computer and Internet access, learners' abilities and

* Corresponding author.
E-mail address: rtenny@umn.edu (R.D. Tennyson).

0360-1315/$ - see front matter © 2010 Elsevier Ltd. All rights reserved.
doi:10.1016/j.compedu.2009.12.012

beliefs in the use of technology, blended course design, participant interaction, and blended environment integration (Bonk, Olson, Wisher, & Orvis, 2002). These findings imply that an effective BELS environment should consider the human and technology factors that affect learner satisfaction with BELS, such as individual attitudes, participant interaction, educational technologies, and course design (Wu et al., 2008). Thus, more careful analysis of learners, educational technologies, and social contexts in BELS environments is needed (El-Deghaidy & Nouby, 2008).
The adoption of BELS to support learning makes it important to probe the crucial determinants that entice learners to use BELS and enhance their learning satisfaction. The degree of student learning satisfaction with BELS courses plays an important role in evaluating the effectiveness of BELS adoption. Hence, understanding what determines student learning satisfaction can provide management insight for developing effective strategies that allow educational institution administrators and instructors to
create new educational benefits and value for their students. BELS environments differ from both typical classrooms and virtual e-learning, yet a review of previous research in learning technology shows a lack of studies examining the crucial factors that determine learning satisfaction with BELS, such as individual cognition, technological environments, and social contexts, as stated above. More in-depth research is therefore needed to understand what determines student learning satisfaction in a BELS environment and to investigate how these determinants influence student perceptions of BELS contexts and how they are interrelated. This study therefore proposes a research model, based on social cognitive theory (Bandura, 1986), to investigate the primary determinants affecting student learning satisfaction in a BELS environment. We also empirically validate the proposed model and examine the relationships among those latent variables.

2. Basic concepts and theoretical foundation

2.1. Blended e-learning system

Blended learning is described as a learning approach that combines different delivery methods and styles of learning. The blend may combine any form of instructional technology (e.g., videotape, CD-ROM, computer-assisted instruction, web-based learning) with classroom teaching. Recently there has been an increasing movement toward blending e-learning and face-to-face activities, with students participating in collaborative learning and interacting with their instructors and classmates. This is called "blended e-learning" or a "blended e-learning system" (Graham, 2006; Singh, 2003).
Graham (2006) defined BELS as a mixing of instruction from two historically separate learning environments: classroom teaching and
full e-learning. The term emphasizes the central role of computer-based technologies (e-learning systems) in blended learning, focusing on
access and flexibility, enhancing classroom teaching and learning activities, and transforming the way individuals learn. From a course design perspective, a BELS course can lie anywhere along the continuum anchored at opposite ends by fully face-to-face and fully virtual e-learning approaches (Rovai & Jordan, 2004). Kerres and De Witt (2003) identified three critical components of BELS: the content of the learning materials, the communication between learners and tutors and among learner peers, and the construction of the learners' sense of place and direction within the activities that constitute the learning environment. This is an important distinction because it is certainly possible to enhance regular face-to-face courses with online resources without displacing classroom contact hours. Accordingly, we define BELS as the combination of online and face-to-face instruction and the convergence of traditional face-to-face learning and e-learning environments.
Several BELSs, such as WebCT (www.webct.com) and the Cyber University of NSYSU (cu.nsysu.edu.tw), integrate a variety of functions to facilitate learning activities. For example, these systems can integrate instructional material (via audio, video, and text), e-mail, live chat sessions, online discussions, forums, quizzes, and assignments. With these kinds of systems, instructional delivery and communication between instructors and students can occur at the same time (synchronously) or at different times (asynchronously). Such systems can provide instructors and learners with multiple, flexible instructional methods, educational technologies, interaction mechanisms, and learning resources, and apply them in an interactive learning environment to overcome the limitations of classroom and e-learning. As a result, these online learning systems may better accommodate the needs of learners or instructors who are geographically dispersed and have conflicting schedules (Pituch & Lee, 2006). As BELS emerges as perhaps the most prominent instructional delivery solution, it is vital to explore what determines learning satisfaction in a blended e-learning environment.

2.2. Social cognitive theory

Social cognitive theory (Bandura, 1986) serves as the initial foundation in this study for exploring what determines student learning satisfaction in a blended e-learning environment. Social cognitive theory is a widely accepted and empirically validated model for understanding and predicting human behavior and for identifying methods by which behavior can be changed. Several studies have applied it as a
theoretical framework to predict and explain an individual's behavior in IS settings. The theory argues that human development occurs through continuous interaction with the outside environment, and that environmental influences must pass through one's cognitive processes before they affect one's behavior. It proposes a triadic reciprocal causation among cognitive factors, environmental factors, and human behavior: behavior is affected by both cognitive factors and environmental factors (Wood & Bandura, 1989). Cognitive factors refer to personal cognition, affect, and biological events. Environmental factors refer to the social and physical environments that can affect a person's behavior.
Environments influence an individual's behavior through his or her cognitive mechanisms. Hence, social cognitive theory posits two critical cognitive factors, performance expectations and self-efficacy, that influence individual behavior. It gives prominence to the concept of self-efficacy, defined as one's judgment of and belief in one's confidence and capability to perform a specific behavior, recognizing that our performance expectations for a behavior will be meaningless if we doubt our capability to successfully execute the behavior in the first place. Self-efficacy can enhance human accomplishment and well-being; it helps determine how much effort people will expend on a behavior, how long they will persevere when confronting obstacles, and how resilient they will be in the face of adverse situations. The theory further argues that self-efficacy influences performance expectations, and performance expectations in turn influence behavior. Thus, self-efficacy and performance expectations are held to be the principal cognitive determinants of individual behavior.
Regarding environmental factors, ample educational literature and research shows that the learning environment affects a learner's behavior and performance. Traditionally, the learning environment was defined in terms of the physical and social environments in a classroom setting. Piccoli et al. (2001) expanded this traditional definition and identified five environmental factors that clarify how an e-learning environment differs from classroom-based education: technology, content, interaction, learning model, and learner control. These factors can be classified into two categories that are particularly relevant to BELS-specific environments. The first category relates to the technological environment, which includes system functionality and content features. The second
category relates to social environments that include interactions (between learners and instructors or between learners and other learners)
and learning climate.

3. Research model and hypotheses

Based on the foregoing theoretical underpinnings, we consider social cognitive theory applicable to the BELS learning context. Accordingly, three dimensions, namely learners' cognitive beliefs (self-efficacy and performance expectations), the technological environment (system functionality and content features), and the social environment (interaction and learning climate), are identified and elucidated as the primary determinants of student learning satisfaction with BELS, as shown in Fig. 1.

3.1. Cognitive factors

Cognitive factors refer to the learners’ cognitive beliefs that influence their behaviors in using BELS. Two main cognitive variables: com-
puter self-efficacy and performance expectations are believed to be the most relevant factors affecting human behavior in using an infor-
mation system (IS) (Compeau & Higgins, 1995; Compeau, Higgins, & Huff, 1999; Venkatesh, Morris, Davis, & Davis, 2003). The social
cognitive theory defined performance expectations as the perceived consequences of a behavior and further noted they are a strong force
guiding individuals’ actions. The performance expectations are derived from individual judgments regarding valuable outcomes that can be
obtained through a requisite behavior. Individuals are more likely to perform behaviors that they believe will result in positive benefits
than those which they do not perceive as having favorable consequences.
Performance expectations are defined as the degree to which a learner believes that using BELS will help him or her to attain gains in
learning performance. This definition is similar to the concept of perceived usefulness in Davis's (1989) technology acceptance model (Venkatesh et al., 2003). The influence of performance expectations on individual behavior in using computer systems has been
demonstrated by Compeau and Higgins (1995), Compeau et al. (1999) and Venkatesh et al. (2003). Prior research in education or com-
puter-mediated learning has found that performance expectations are positively related to students’ learning performance (Bolt & Koh,
2001) and satisfaction (Martins & Kellermanns, 2004; Shih, 2006).
Individual attitudes are a function of beliefs, including the behavioral beliefs directly linked to a person’s intention to perform a defined
behavior (Ajzen & Fishbein, 1980). Based on the theory of reasoned action (Taylor & Todd, 1995), user acceptance is an important indicator that measures a user's positive attitude toward an IS and predicts his or her behavior while using the system. Satisfaction is a good surrogate
for user acceptance and is often used to measure learners’ attitude in computer-mediated learning studies (Chou & Liu, 2005; Piccoli et al.,
2001). Thus, we conceptualize the student's attitude toward BELS as learning satisfaction with BELS, defined as the sum of a student's behavioral beliefs and attitudes resulting from aggregating all the benefits the student receives from using BELS. Therefore, the following hypothesis is proposed.

H1: A higher level of performance expectations for BELS use will positively associate with a higher level of learning satisfaction with BELS.

The second cognitive factor to be applied in this research is self-efficacy. In general, it refers to an individual’s beliefs about his or her capa-
bilities to successfully perform a particular behavior. According to social cognitive theory, individuals form their perceptions of self-efficacy
toward a task based on cues received from four information sources: (1) past experience and familiarity with similar activities, (2) vicarious learning, (3) social support and encouragement, and (4) attitudes toward the task. Bandura (1986) noted that self-efficacy is task-specific and
its measures should be tailored to the targeted domain context. Accordingly, several studies have investigated self-efficacy beliefs towards
tasks such as computers and IS-related behaviors (Compeau & Higgins, 1995; Compeau et al., 1999). Derived from the general definition of
self-efficacy, computer self-efficacy was defined as an individual's ability to use information technology to accomplish computer-related
tasks or jobs (Marakas, Yi, & Johnson, 1998). Computer self-efficacy was also validated as a determinant of IS acceptance and use.
We define computer self-efficacy as the confidence in one’s ability to perform certain learning tasks using BELS. Prior research has
shown that increases in computer self-efficacy improve initiative and persistence, which lead to improved performance or outcome expec-
tations (Francescato et al., 2006; Johnston, Killion, & Oomen, 2005; Piccoli et al., 2001), including attitude and behavioral intention (Venk-
atesh & Davis, 2000). In the context of computer-mediated learning, empirical evidence indicates that increases in computer self-efficacy
improve students’ confidence in their computer-related capabilities, which in turn leads to a perception of positive performance expecta-
tions to the learning courses (Bolt & Koh, 2001; Jawahar & Elango, 2001; Santhanam et al., 2008; Shih, 2006). That is, computer self-efficacy
can reduce barriers to learning with BELS. If students have higher computer self-efficacy and can control BELS, they will perceive the system's usefulness and value, which in turn motivates their intention to use BELS. Accordingly, the following hypothesis is proposed:

H2: A higher level of individual’s computer self-efficacy will positively associate with a higher level of performance expectations for BELS use.

3.2. Technological environment

The quality and reliability of an e-learning system, as well as easy access to appropriate educational technologies, material content, and course-related information, are important determinants of e-learning effectiveness (Piccoli et al., 2001). Thus, system functionality and content features are identified as the critical technological environment factors for BELS. They are expected to influence learners' use and acceptance of BELS. Prior research has shown that system functionality significantly affects user beliefs in various computer-related contexts (Igbaria, Guimaraes, & Davis, 1995; Venkatesh & Davis, 2000). For instance, research findings showed that specific system functionality is a critical factor influencing e-learning system usage (Hong, Thong, Wong, & Tam, 2002; Pituch & Lee, 2006). Pituch and Lee (2006) defined system functionality as the perceived ability of an e-learning system to provide flexible access to instructional and assessment media. Accordingly, we define system functionality as the perceived ability of BELS to provide flexible access to instructional and assessment media. Such media allow students, for example, to access course materials and content, turn in homework assignments, and complete tests and quizzes online.
In general, content is used to identify various divergent formats and types of information. In this study, content refers to technology-based materials and course-related information that may provide value for learners in the context of BELS. BELS achieves its goals of sharing and delivering course content through various forms of media such as tutorials, online discussions, or web-based courses. Given the diversity of delivery methods, a considerable issue is how to design and represent hybrid content in the formats or types best suited to delivery or access via BELS (So & Brush, 2008). Appropriate BELS content features, as well as effective design, representation of hybrid course content, and transparent content knowledge transfer, are core components of BELS design (Piccoli et al., 2001). Drawing on previous research (Zhang, Keeling, & Pavur, 2000), we define content feature as the characteristics and presentation of course content and information in BELS. Text, hypertext, graphics, audio and video, computer animations and simulations, embedded tests, and multimedia information are examples of content features in a BELS environment.
System functionality and content features have the potential to directly affect the perceived usefulness of an IS (Hong et al., 2002; Pituch & Lee, 2006), a concept similar to performance expectations. Empirical evidence suggests that both content features (Zhang et al., 2000) and system functionality (Pituch & Lee, 2006) affect the effectiveness of computer-mediated learning. That is to say, learners perceiving a higher level of system functionality and content features in BELS will report a higher level of performance expectations for BELS use. In addition, in the BELS environment, diverse content features can be delivered and accessed only with the support of appropriate system functionality (Pituch & Lee, 2006; So & Brush, 2008). Thus, we consider that content features depend highly on the power and quality of the system functionality of BELS. Therefore, the following hypotheses are proposed:

H3: A higher level of system functionality of BELS will positively associate with a higher level of performance expectations for BELS use.

H4: A higher level of content features in BELS will positively associate with a higher level of performance expectations for BELS use.

H5: A higher level of system functionality in BELS will positively associate with a higher level of content features in BELS.

3.3. Social environment

In computer-mediated instructional design, there is an increasing focus on facilitating human interaction in the form of online collab-
oration, virtual communities, and instant messaging in the BELS context (Graham, 2006). From the group interactions perspective, social
environment factors, such as collaborative learning (Francescato et al., 2006), learning climate (Chou & Liu, 2005) and social interaction
(Johnston et al., 2005) are important antecedents of beliefs about using an e-learning system. Prior research (Pituch & Lee, 2006) shows
that social interaction has a direct effect on the usage of an e-learning system. Interactions among students, interactions between faculty and students, and learning collaboration are key to the effectiveness of the learning process. In addition, the emotional learning climate is an important indicator of learning effectiveness.
Interaction is defined in our study as the social interactions among students themselves, the interactions between instructors and stu-
dents, and collaboration in a BELS environment. Learning climate is defined as the learning atmosphere in the BELS context. Johnston et al.
(2005) argued that contact and interaction with instructors and learners is a valid predictor of performance. A positive learning climate
encourages and stimulates the exchange of ideas, opinions, information, and knowledge in the organization, which leads to better learning
satisfaction (Prieto & Revilla, 2006). That is, when learners believe that BELS provides effective student-to-student and student-to-instruc-
tor interactions and improves learning climate, they will be more satisfied with BELS. Therefore, the following hypotheses are proposed:

H6: A higher level of interaction will positively associate with a higher level of performance expectations for BELS use.

H7: A higher level of interaction will positively associate with a higher level of learning climate.

H8: A higher level of learning climate will positively associate with a higher level of learning satisfaction with BELS.

4. Method

4.1. Instrument development

To develop the self-report instrument, a number of prior relevant studies were reviewed to ensure that a comprehensive list of measures was included. All measures for each construct were taken from previously validated instruments and modified for the BELS context.

For instance, the measures for learning satisfaction were selected from Chiu, Hsu, and Sun (2005) and Wu and Wang (2005). Measures for
computer self-efficacy and performance expectations were taken from Compeau and Higgins (1995). The measures for content feature
were adapted from Zhang et al. (2000) and Molla and Licker (2001). The measures for functionality were taken from Pituch and Lee
(2006). The measures for student and instructor interactions were taken from Johnston et al. (2005), Kreijns, Kirschner, and Jochems
(2003), and Pituch and Lee (2006). Finally, the measures for the learning climate were selected from Chou and Liu (2005). Supplementary
material lists the definition of each construct, its measures, and the references.
The questionnaire consisted of two major parts: one recording the respondent's basic data and another recording responses to our research constructs. The basic data portion recorded the subject's demographic information (e.g., gender, age, highest education, computer experience, and so forth). The second part recorded the subject's perception of each variable in the model, with items for each construct. All items were measured on a 7-point scale ranging from 1 (strongly disagree) to 7 (strongly agree).
Once the initial questionnaire was developed, an iterative personal interview process with professionals, instructors, and students from
blended learning courses (including four instructors and five students from three different universities) was conducted to verify the com-
pleteness, wording, and appropriateness of the instrument and to confirm the content validity. Feedback from the interview processes
served as the basis for correcting, refining, and enhancing the experimental scales. For example, scale items were eliminated if they rep-
resented the same aspects with only slightly different wording and modified if the semantics were ambiguous in order to enhance the psy-
chometric properties of the survey instrument. At the end of the pre-test, there were seven constructs with 21 items in total to be used for
the survey.

4.2. Participants

The empirical data were collected using a cross-sectional survey methodology. Participants in this study were students who had the opportunity to take courses via BELS. We distributed 518 paper-based and online questionnaires to the target universities, which were purposively selected from the universities and colleges in Taiwan that had actually implemented BELS courses. Because BELS applications are still at an early stage in Taiwan, such universities are relatively rare. Data were collected via snowball and convenience sampling. To counter the low response rates typical of survey studies, we recruited a local contact person at each target university to take charge of distributing the questionnaire. Two hundred and seventy-six questionnaires were returned. Sixty-four responses were incomplete and had to be discarded, leaving 212 valid responses for the statistical analysis and a valid response rate of 40.93% of the initial sample. Among the valid responses, 84 were collected in physical classrooms and 128 were gathered in online learning environments. Potential non-response bias was assessed by comparing early and late respondents on several demographic characteristics. The results indicated no statistically significant demographic differences between the early (first semester) and late (second semester) respondents, suggesting that non-response bias was not a serious concern. The respondent profiles and the non-response bias analysis results are shown in Table 1.
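The early-versus-late comparison above is a standard chi-square test of independence. As an illustrative sketch (not the authors' actual analysis code), the gender row of Table 1 can be reproduced with SciPy; only the counts are taken from Table 1, everything else is our own scaffolding:

```python
from scipy.stats import chi2_contingency

# Early vs. late respondents by gender (counts from Table 1)
#           early  late
observed = [[73,   33],   # male
            [72,   34]]   # female

# correction=False gives the uncorrected Pearson chi-square,
# which matches the 0.022 reported in Table 1 for gender
chi2, p, dof, expected = chi2_contingency(observed, correction=False)
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.3f}")
```

A p-value above 0.05 indicates no significant early/late difference on that characteristic, which is the basis for concluding that non-response bias is not a serious concern.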

5. Results

The partial least squares (PLS) method was applied for the data analysis in this study. PLS is, in general, recommended for predictive research models that emphasize theory development, whereas linear structural relations (LISREL) modeling is recommended for confirmatory analysis and requires more stringent adherence to distributional assumptions (Jöreskog & Wold, 1982). PLS performs a confirmatory factor analysis (CFA). In a CFA, the pattern of loadings of the measurement items on the latent constructs is explicitly specified in the model. The fit of this pre-specified model is then examined to determine its convergent and discriminant validity. This factorial validity concerns whether the loading patterns of the measurement items correspond to the theoretically anticipated factors (Gefen & Straub, 2005). Convergent validity is shown when each measurement item correlates strongly with its assumed theoretical construct, while discriminant validity is shown when each measurement item correlates weakly with all constructs except the one with which it is theoretically associated. The evaluation of model fit was conducted in two stages (Chin, 1998; Gefen & Straub, 2005). First, measurement validation was performed, in which the construct validity and reliability of the measures were assessed. The structural model with its hypotheses was then tested. The statistical analysis thus involved a two-phase approach in which the psychometric properties of all scales were first assessed through CFA, and the structural relationships were then validated using bootstrap analysis.
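The bootstrap phase can be sketched in a few lines. The example below is a deliberate simplification with synthetic data: it bootstraps an ordinary least-squares slope as a stand-in for a PLS path coefficient (the actual study estimated a full latent-variable model):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in data: one predictor score driving one outcome score
# (the real study analyzed 212 survey responses across seven constructs)
n = 212
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(scale=0.8, size=n)

def slope(x, y):
    # OLS slope, used here as a stand-in for a PLS path coefficient
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

# Resample respondents with replacement and re-estimate each time
boot = [slope(x[idx], y[idx])
        for idx in (rng.integers(0, n, size=n) for _ in range(2000))]

est = slope(x, y)
t_stat = est / np.std(boot, ddof=1)   # bootstrap t-value
print(f"path = {est:.3f}, bootstrap t = {t_stat:.2f}")
```

A |t| above roughly 1.96 corresponds to significance at the 0.05 level, which is how PLS packages typically report path significance.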

5.1. Measurement validation

For the first phase, the analysis was performed in relation to the attributes of individual item reliability, construct reliability, average
variance extracted (AVE), and discriminant validity of the indicators as measures of latent variables. The assessment of item loadings, reli-
ability, convergent validity, and discriminant validity was performed for the latent constructs through a CFA. Reflective items should be
uni-dimensional in their representation of the latent variable and therefore correlated with each other. Item loadings should be above
0.707, showing that more than half of the variance is captured by the constructs. The results indicate that all items of the instrument
had significant loadings higher than the recommended value of 0.707. As shown in Table 2, all constructs exhibit good internal consistency
as evidenced by their composite reliability scores. The composite reliability coefficients of all constructs and the AVE in the proposed model
(see Fig. 1) are more than adequate, ranging from 0.821 to 0.957 and from 0.605 to 0.849, respectively.
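For reference, both indices can be computed directly from standardized item loadings λ: composite reliability is (Σλ)² / ((Σλ)² + Σ(1 − λ²)), and AVE is the mean of the squared loadings. A minimal sketch with hypothetical loadings (not the study's data):

```python
import numpy as np

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    lam = np.asarray(loadings, dtype=float)
    num = lam.sum() ** 2
    return num / (num + (1 - lam ** 2).sum())

def average_variance_extracted(loadings):
    """AVE = mean squared standardized loading."""
    lam = np.asarray(loadings, dtype=float)
    return (lam ** 2).mean()

# Hypothetical standardized loadings for a three-item construct
loadings = [0.82, 0.88, 0.79]
cr = composite_reliability(loadings)
ave = average_variance_extracted(loadings)
print(f"CR = {cr:.3f}, AVE = {ave:.3f}")
```

With these illustrative loadings, CR ≈ 0.87 and AVE ≈ 0.69, values inside the ranges reported for the study's constructs.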
To assess discriminant validity, (1) indicators should load more strongly on their corresponding construct than on the other constructs in the model, and (2) the AVE should be larger than the inter-construct correlations (Chin, 1998). AVE measures the variance captured by a latent construct, that is, the explained variance: for each construct, it is the ratio of the variance in its measurement items extracted by the construct to the measurement error attributed to those items. As a rule of thumb, the square root of the AVE of each construct should be larger than that construct's correlation with any other construct in the model (Chin, 1998) and should be at least 0.50 (Fornell & Larcker, 1981). As the results in Table 3 show, all constructs meet the above requirements. The

Table 1
Respondents' profile and the results of non-response bias analysis (N = 212).

Variable                          Classification                          Total (%)      Early respondents (%)   Late respondents (%)   χ² (Sig.)
Gender                            Male                                    106 (0.500)    73 (0.344)              33 (0.156)             0.022 (0.50)
                                  Female                                  106 (0.500)    72 (0.340)              34 (0.160)
Age                               18–30                                   101 (0.476)    48 (0.453)              53 (0.500)             1.344 (0.855)
                                  31–40                                   82 (0.387)     41 (0.387)              41 (0.387)
                                  41–50                                   23 (0.108)     14 (0.132)              9 (0.085)
                                  51–60                                   4 (0.019)      2 (0.019)               2 (0.019)
                                  >61                                     2 (0.009)      1 (0.009)               1 (0.009)
Type of job                       Student                                 8 (0.038)      3 (0.014)               5 (0.024)              4.806 (0.440)
                                  Industry                                30 (0.142)     12 (0.057)              18 (0.085)
                                  Manufacturing                           57 (0.269)     27 (0.127)              30 (0.142)
                                  Service                                 10 (0.047)     5 (0.024)               5 (0.024)
                                  Finance                                 59 (0.278)     36 (0.170)              23 (0.108)
                                  Others                                  48 (0.226)     23 (0.108)              25 (0.118)
Education level                   Senior high school                      0 (0.000)      0 (0.000)               0 (0.000)              8.824 (0.32)
                                  College (2 years)                       10 (0.047)     1 (0.005)               9 (0.042)
                                  University (4 years)                    116 (0.547)    60 (0.283)              56 (0.264)
                                  Graduate school                         86 (0.406)     45 (0.212)              41 (0.193)
BELS experience                   Pure physical classroom experience      15 (0.071)     7 (0.033)               8 (0.038)              0.371 (0.946)
                                  Pure virtual classroom experience       42 (0.198)     20 (0.094)              22 (0.104)
                                  Physical more than virtual experience   105 (0.495)    53 (0.250)              52 (0.245)
                                  Virtual more than physical experience   50 (0.236)     26 (0.123)              24 (0.113)
BELS participation (years)        <0.5 years                              35 (0.165)     18 (0.085)              17 (0.080)             2.695 (0.747)
                                  0.5–1 years                             95 (0.448)     50 (0.236)              45 (0.212)
                                  2 years                                 48 (0.226)     25 (0.118)              23 (0.108)
                                  3 years                                 11 (0.052)     6 (0.028)               5 (0.024)
                                  4 years                                 4 (0.019)      2 (0.009)               2 (0.009)
                                  >4 years                                19 (0.090)     5 (0.024)               14 (0.066)
BELS participation (times)        1 time                                  44 (0.208)     24 (0.113)              20 (0.094)             4.710 (0.452)
                                  2 times                                 43 (0.203)     22 (0.104)              21 (0.099)
                                  3 times                                 30 (0.142)     15 (0.071)              15 (0.071)
                                  4 times                                 13 (0.061)     9 (0.042)               4 (0.019)
                                  5 times                                 10 (0.047)     6 (0.028)               4 (0.019)
                                  ≥6 times                                72 (0.340)     30 (0.142)              42 (0.198)
Time spent in BELS (per week)     <1 h                                    62 (0.292)     33 (0.156)              29 (0.137)             4.729 (0.450)
                                  1–3 h                                   75 (0.354)     33 (0.156)              42 (0.198)
                                  3–5 h                                   43 (0.203)     22 (0.104)              21 (0.099)
                                  5–7 h                                   20 (0.094)     10 (0.047)              10 (0.047)
                                  7–9 h                                   6 (0.028)      4 (0.019)               2 (0.009)
                                  >9 h                                    6 (0.028)      4 (0.019)               2 (0.009)
Average years of computer usage                                           11.79 years    13.7 years              10.7 years             27.076 (0.133)
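The non-response bias analysis in Table 1 compares early against late respondents on each profile variable with a chi-square test. As an illustration (a sketch; the paper does not state which software or statistic variant was used, but Pearson's chi-square without continuity correction reproduces the reported value for the gender row):

```python
from scipy.stats import chi2_contingency

# Early vs. late respondent counts for the Gender row of Table 1
#                 Early  Late
observed = [[73,    33],   # Male
            [72,    34]]   # Female

# correction=False gives the plain Pearson statistic (no Yates correction)
chi2, p, dof, expected = chi2_contingency(observed, correction=False)
print(round(chi2, 3), p)  # chi-square ~= 0.022; p well above 0.05
```

A non-significant p-value indicates no detectable difference between early and late respondents on that variable, i.e., no evidence of non-response bias.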

Table 2
Results of confirmatory factor analysis.

Construct Items Composite reliability AVE


Computer self-efficacy (CSE) 3 0.821 0.605
System functionality (SF) 3 0.905 0.761
Content feature (CF) 2 0.890 0.802
Interaction (I) 3 0.915 0.782
Performance expectations (PE) 3 0.940 0.838
Learning climate (LC) 3 0.926 0.807
Learning satisfaction (LS) 4 0.957 0.849

values for reliability are all above the suggested minimum of 0.7 (Hair, Anderson, Tatham, & Black, 1998). Thus, all constructs display adequate reliability and discriminant validity: each construct shares more variance with its own indicators than with the other constructs. The convergent and discriminant validity of all constructs in the proposed research model can therefore be assured.
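The reliability and validity criteria used above are simple functions of the standardized item loadings. The sketch below (with hypothetical loadings, not the study's data) shows how composite reliability and AVE are computed, and checks the Fornell–Larcker rule against the CSE column of Table 3:

```python
import numpy as np

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    lam = np.asarray(loadings, dtype=float)
    s2 = lam.sum() ** 2
    return s2 / (s2 + (1.0 - lam ** 2).sum())

def average_variance_extracted(loadings):
    """AVE = mean squared standardized loading."""
    lam = np.asarray(loadings, dtype=float)
    return (lam ** 2).mean()

# Hypothetical loadings for a three-item construct (all above the 0.707
# cutoff, so each item shares more than half its variance with the construct)
lam = [0.75, 0.80, 0.85]
cr = composite_reliability(lam)
ave = average_variance_extracted(lam)

# Fornell-Larcker criterion, using the CSE column of Table 3: the square root
# of a construct's AVE (0.778 for CSE) must exceed its correlations with all
# other constructs
cse_sqrt_ave = 0.778
cse_correlations = [0.539, 0.492, 0.527, 0.389, 0.425, 0.440]
assert all(r < cse_sqrt_ave for r in cse_correlations)
```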

[Fig. 1 is a path diagram: performance expectations → learning satisfaction (H1); computer self-efficacy → performance expectations (H2); system functionality → performance expectations (H3); content feature → performance expectations (H4); system functionality → content feature (H5); interaction → performance expectations (H6); interaction → learning climate (H7); learning climate → learning satisfaction (H8).]

Fig. 1. The research model for BELS learning satisfaction.

Table 3
Correlations between constructs.

      CSE     SF      CF      PE      I       LC      LS
CSE   0.778
SF    0.539   0.872
CF    0.492   0.609   0.896
PE    0.527   0.534   0.596   0.916
I     0.389   0.507   0.608   0.662   0.884
LC    0.425   0.513   0.593   0.761   0.727   0.898
LS    0.440   0.534   0.601   0.798   0.614   0.740   0.921

Note: The diagonal values (shaded in the original) are the square roots of the average variance extracted.

5.2. Hypotheses testing

In the second phase of the statistical analysis, the structural model was assessed to determine the extent to which the relationships specified by the proposed model were consistent with the available data. Because the PLS method does not directly provide significance tests or confidence interval estimates for the path coefficients, a bootstrapping technique was used to estimate their significance. Bootstrap analysis was performed with 200 subsamples, and the path coefficients were re-estimated using each of these samples. The resulting parameter vector estimates were used to compute parameter means, standard errors, the significance of path coefficients, indicator loadings, and indicator weights. This approach is consistent with recommended practices for estimating the significance of path coefficients and indicator loadings (Löhmoeller, 1984) and has been used in prior information systems studies (Chin & Gopal, 1995; Hulland, 1999).
Hypothesis testing was performed by examining the size, the sign, and the significance of the path coefficients and of the weights of the dimensions of the constructs. Results of the analysis for the structural model are presented in Fig. 2. The estimated (standardized) path coefficient and its associated significance level are shown next to each link, and the R² statistic next to each dependent construct. The statistical significance of the weights can be used to determine the relative importance of the indicators in forming a latent construct. All specified paths between constructs in our research model had significant path coefficients, providing support for the model.
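The bootstrap logic is straightforward to sketch. The example below is a simplified illustration with simulated standardized scores, not the study's data (a real analysis would re-estimate the full PLS model on every resample): it draws 200 resamples with replacement, re-computes a single standardized path coefficient each time, and forms a pseudo t-statistic from the bootstrap standard error.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated standardized scores for one path (predictor -> outcome);
# n = 212 mirrors the study's sample size
n = 212
x = rng.standard_normal(n)
y = 0.5 * x + rng.standard_normal(n)

def std_path(x, y):
    # With a single standardized predictor, the standardized path
    # coefficient equals the Pearson correlation
    return np.corrcoef(x, y)[0, 1]

estimate = std_path(x, y)

# 200 bootstrap subsamples, as in the study: resample cases with
# replacement and re-estimate the path each time
boot = np.array([std_path(x[idx], y[idx])
                 for idx in (rng.integers(0, n, size=n) for _ in range(200))])

se = boot.std(ddof=1)   # bootstrap standard error of the path coefficient
t_stat = estimate / se  # pseudo t-statistic used to judge significance
```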
One indicator of the predictive power of path models is the explained variance, or R², value (Barclay, Higgins, & Thomson, 1995; Chin & Gopal, 1995). R² values are interpreted in the same manner as those obtained from multiple regression analysis: they indicate the amount of variance in a construct that is explained by the path model (Barclay et al., 1995). The results indicate that the model explained 67.8% of the variance in learning satisfaction. Similarly, 37.1% of the variance in content feature, 55.1% of the variance in performance expectations, and 52.9% of the variance in learning climate were explained by the related antecedent constructs. The path coefficient from computer self-efficacy to performance expectations is 0.229 and that from interaction to learning climate is 0.727. The magnitude and significance of these path coefficients provide further evidence in support of the nomological validity of the research model. Table 4 summarizes the direct, indirect, and total effects of the PLS analysis.
As for the cognitive factors, the paths from computer self-efficacy to performance expectations (H2) and from performance expectations to learning satisfaction (H1) are supported by significant path coefficients. That is, students with higher computer self-efficacy have higher performance expectations, which in turn lead to higher learning satisfaction.

As for the technological environment factors, the significant path coefficients also support Hypotheses H3 and H4, the paths from system functionality and content feature to performance expectations. In addition, Hypothesis H5, the path from system functionality to content feature, is supported. It is interesting to note, however, that the indirect effect of system functionality on performance expectations was stronger than its direct effect (see Table 4). This seems to indicate that system functionality alone may not be sufficient for improving performance expectations when the BELS content features are not well matched or designed.
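The indirect and total effects in Table 4 follow directly from products and sums of the direct path coefficients in Fig. 2. For system functionality, the mediated route through content feature can be verified by hand:

```python
# Direct path coefficients reported in Fig. 2 / Table 4
sf_to_cf = 0.609         # system functionality -> content feature
cf_to_pe = 0.171         # content feature -> performance expectations
sf_to_pe_direct = 0.092  # system functionality -> performance expectations

# Indirect effect = product of the coefficients along the mediated path;
# total effect = direct + indirect
sf_to_pe_indirect = sf_to_cf * cf_to_pe                # ~= 0.104 (Table 4)
sf_to_pe_total = sf_to_pe_direct + sf_to_pe_indirect   # ~= 0.196 (Table 4)

print(round(sf_to_pe_indirect, 3), round(sf_to_pe_total, 3))  # 0.104 0.196
```

The indirect effect (0.104) exceeds the direct effect (0.092), which is exactly the pattern the text highlights.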

[Fig. 2 (PLS results): computer self-efficacy → performance expectations, 0.229***; system functionality → performance expectations, 0.092*; content feature → performance expectations, 0.171**; interaction → performance expectations, 0.422***; system functionality → content feature, 0.609***; interaction → learning climate, 0.727***; performance expectations → learning satisfaction, 0.557***; learning climate → learning satisfaction, 0.315***. R² = 37.1% (content feature), 55.1% (performance expectations), 52.9% (learning climate), 67.8% (learning satisfaction). *p < 0.05, **p < 0.01, ***p < 0.001.]

Fig. 2. PLS analysis results.

Table 4
Standardized causal effects of the PLS analysis.

Dependent latent variable    Independent latent variable    Direct   Indirect   Total    T-statistic
Content feature              System functionality           0.609    –          0.609    11.849***
Performance expectations     Computer self-efficacy         0.229    –          0.229    3.717***
                             System functionality           0.092    0.104      0.196    1.358*
                             Content feature                0.171    –          0.171    2.011**
                             Interaction                    0.422    –          0.422    5.203***
Learning climate             Interaction                    0.727    –          0.727    18.849***
Learning satisfaction        Computer self-efficacy         –        0.128      0.128    3.693***
                             System functionality           –        0.109      0.109    2.307**
                             Content feature                –        0.095      0.095    1.802*
                             Interaction                    –        0.465      0.465    7.175***
                             Performance expectations       0.557    –          0.557    7.006***
                             Learning climate               0.315    –          0.315    3.804***

*p < 0.05; **p < 0.01; ***p < 0.001.
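Similarly, the total effects on learning satisfaction in Table 4 can be rebuilt from the direct coefficients in Fig. 2, since every exogenous construct reaches learning satisfaction only through performance expectations and/or learning climate. (Small discrepancies in the third decimal arise because Fig. 2 reports coefficients rounded to three decimals.)

```python
# Direct path coefficients from Fig. 2
pe_to_ls, lc_to_ls = 0.557, 0.315
cse_to_pe, i_to_pe, i_to_lc = 0.229, 0.422, 0.727

# Total effects on learning satisfaction, via the two mediators
cse_to_ls = cse_to_pe * pe_to_ls                    # ~= 0.128 (Table 4: 0.128)
i_to_ls = i_to_pe * pe_to_ls + i_to_lc * lc_to_ls   # ~= 0.464 (Table 4: 0.465)
```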

As for the social environment factors, Hypotheses H6 and H7, the paths from interaction to performance expectations and to learning climate, are supported; that is, interaction influences both performance expectations and learning climate. Hypothesis H8, the path from learning climate to learning satisfaction, is also supported by a significant path coefficient; that is, learning climate influences learning satisfaction. Overall, both performance expectations and a positive learning climate have a direct effect on learning satisfaction, with performance expectations making the greatest (total) contribution to learning satisfaction.

6. Conclusion

BELS environments have become a prominent instructional delivery alternative in e-learning. This study presents a theoretical model, based on social cognitive theory, for investigating the key determinants of student learning satisfaction in a BELS environment. The results provide strong evidence for the nomological validity of each construct and its effects on learning satisfaction, as shown in Fig. 2. The R² of 55.1% for the performance expectations construct provides good support for the hypothesized impact of computer self-efficacy, system functionality, content feature, and interaction on this dependent variable. In addition, the R² of 37.1% for the content feature construct supports the hypothesized impact of system functionality on content feature, and the R² of 52.9% for the learning climate construct supports the hypothesized impact of interaction on learning climate. Finally, the R² of 67.8% for the learning satisfaction construct indicates that learning satisfaction as perceived by learners is shaped, directly and indirectly, through performance expectations and learning climate. As a whole, therefore, the model has strong explanatory power for student learning satisfaction with BELS.
The significant path coefficients, the effect sizes, and the R² values reinforce our confidence in the hypothesis-testing results and support the proposed associations with learning satisfaction in the BELS setting. The results demonstrate that BELS learning satisfaction is affected by the interplay among cognitive, technological environment, and social environment factors, confirming that technology alone does not cause learning to occur. This is consistent with the theoretical perspective of social cognitive theory, which views human behavior as a reciprocal interplay of cognitive factors, environment, and behavior (Bandura, 1986).

The empirical results indicate that performance expectations and learning climate are two strong determinants of learning satisfaction with BELS. Computer self-efficacy, system functionality, content feature, and interaction contribute to learning satisfaction indirectly, via these two determinants. Thus, as students become more confident and capable of learning with BELS and more accustomed to BELS learning environments, they will likely expect more benefits from the use of BELS, foster a positive learning climate, and, overall, be more satisfied with BELS learning. These findings provide initial insights into the factors that are likely significant antecedents for planning and implementing a BELS to enhance student learning satisfaction. The contributions and implications of this study include the following:
A BELS environment should enhance students' performance expectations and foster a positive learning climate. Our findings indicate that performance expectations make the greatest contribution to learning satisfaction. This suggests that instructors should take advantage of BELS capabilities in designing and teaching courses so as to strengthen students' belief that they can achieve improved outcomes with BELS. A positive learning climate also significantly affects students' learning satisfaction, which suggests that both instructors and learners should foster and sustain a positive learning atmosphere within the BELS learning context. Consequently, if students believe that using BELS is worthwhile, valuable, and simple, they will be more likely to accept it, resulting in greater satisfaction.
Educational institutions should provide incentives and support to enhance students' computer self-efficacy. The empirical results demonstrate that computer self-efficacy has a significant positive influence on performance expectations. This implies that learners need the computer competence necessary to exploit BELS and to control their own learning activities. Therefore, educational institution administrators and instructors should provide sufficient incentives and administrative support to encourage students to actively participate in BELS courses and to enhance their computer self-efficacy. BELS should also provide built-in help to fit various learners' needs in different learning circumstances.
BELS should offer appropriate system functionality and content features with multimedia presentation and flexibility. The results show that system functionality and content features have a positive influence on performance expectations. These findings suggest that (1) BELS should offer useful information through synchronous and asynchronous learning and a content-rich design that satisfies students' needs, and (2) BELS should provide various types of content presentation (e.g., multimedia), customized functions that give learners control over the system, and flexible access to fit students' varied learning requirements. Educational institutions may also offer BELS-related technical training and awareness programs to enhance students' comprehension of BELS.
BELS should provide effective interaction tools and instructors should openly encourage interaction. The results demonstrate that participant interaction has a significant positive influence on both performance expectations and learning climate; indeed, interaction makes the greatest (total) contribution to performance expectations. These findings suggest that, when implementing BELS courses, instructors should openly encourage positive interaction to increase participant communication and collaborative learning via the system. In general, learning climate is a function of, and positive feedback from, participant interaction in a BELS environment, and a positive learning climate can make learning easy and natural. Thus, if a BELS supports a good social environment that facilitates student-to-student and student-to-instructor interaction (e.g., interactive communication and collaborative learning), learners will be more likely to participate actively in interaction, fostering a better learning climate along with greater perceived BELS performance expectations and learning satisfaction.
Although our study provides insights into what determines student learning satisfaction in a BELS environment, it has several limitations that also represent opportunities for future research. First, the model was validated using sample data gathered from the target universities in Taiwan. The fact that the participants come from one country limits the generalizability of the results; other samples from different nations, cultures, and contexts should be gathered to confirm and refine the findings of this study. Second, given the self-report instrument used (e.g., for measuring computer self-efficacy, system functionality, and content feature), the typical shortcomings associated with self-report measures must be recognized when interpreting the results. Third, this research sets a timely stage for future work on the determinants of learning satisfaction in a BELS environment; a longitudinal design examining the relationships among the identified research variables would be a useful extension to the current study. Finally, the determinants identified here cannot be exhaustive, and future work should endeavor to uncover additional determinants of student learning satisfaction with BELS.

Appendix A. Supplementary material

Supplementary data associated with this article can be found, in the online version, at doi:10.1016/j.compedu.2009.12.012.

References

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Englewood Cliffs, NJ: Prentice-Hall.
Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice Hall.
Barclay, D., Higgins, C., & Thomson, R. (1995). The partial least squares approach to causal modeling, personal computer adoption and use as an illustration. Technology Studies,
2, 285–309.
Bolt, M. A., & Koh, H. C. (2001). Testing the interaction effects of task complexity in computer training using the social cognitive model. Decision Sciences, 32, 1–19.
Bonk, C. J., Olson, T. M., Wisher, R. A., & Orvis, K. L. (2002). Learning from focus group: An examination of blended learning. Journal of Distance Education, 17(3), 97–118.
Chin, W. W. (1998). The partial least squares approach to structural equation modeling. In G. A. Marcoulides (Ed.), Modern methods for business research (pp. 298–336).
Mahwah, NJ: Erlbaum.
Chin, W. W., & Gopal, A. (1995). Adoption intention in GSS: Relative importance of beliefs. The Data Base for Advances in Information Systems, 26, 42–63.
Chiu, C. M., Hsu, M. H., & Sun, S. Y. (2005). Usability, quality, value and e-learning continuance decisions. Computers & Education, 45, 399–416.
Chou, S. W., & Liu, C. H. (2005). Learning effectiveness in a web-based virtual learning environment: A learner control perspective. Journal of Computer Assisted Learning, 21,
65–76.
Compeau, D. R., & Higgins, C. A. (1995). Computer self-efficacy: Development of a measure and initial test. MIS Quarterly, 19, 189–211.
Compeau, D. R., Higgins, C. A., & Huff, S. (1999). Social cognitive theory and individual reactions to computing technology: A longitudinal study. MIS Quarterly, 23, 145–158.
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13, 318–339.
EL-Deghaidy, H., & Nouby, A. (2008). Effectiveness of a blended e-learning cooperative approach in an Egyptian teacher education programme. Computers & Education, 51,
988–1006.
Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18(1), 39–50.

Francescato, D., Porcelli, R., Mebane, M., Cuddetta, M., Klobas, J., & Renzi, P. (2006). Evaluation of the efficacy of collaborative learning in face-to-face and computer-supported
university contexts. Computers in Human Behavior, 22, 163–176.
Gefen, D., & Straub, D. W. (2005). A practical guide to factorial validity using PLS-graph: Tutorial and annotated example. Communications of the Association for Information
Systems, 16(5), 91–109.
Graham, C. R. (2006). Chapter 1: Blended learning system: Definition, current trends, future directions. In C. J. Bonk & C. R. Graham (Eds.), Handbook of blended learning. San
Francisco, CA: Pfeiffer.
Hair, J. F., Anderson, R. E., Jr., Tatham, R. L., & Black, W. C. (1998). Multivariate data analysis with readings (5th ed.). NJ: Prentice Hall.
Hara, N., & Kling, R. (2000). Students’ distress with a web-based distance education course: An ethnographic study of participants’ experiences. Information, Communication
and Society, 3, 557–579.
Hong, W., Thong, J. Y. L., Wong, W. M., & Tam, K. Y. (2002). Determinants of user acceptance of digital libraries: An empirical examination of individual differences and system
characteristics. Journal of Management Information Systems, 18(3), 97–124.
Hulland, J. (1999). Use of partial least squares (PLS) in strategic management research: A review of four recent studies. Strategic Management Journal, 20, 195–204.
Igbaria, M., Guimaraes, T., & Davis, G. B. (1995). Testing the determinants of microcomputer usage via a structural equation model. Journal of Management Information Systems, 11(4), 87–114.
Jawahar, I. M., & Elango, B. (2001). The effect of attitudes, goal setting and self-efficacy on end user performance. Journal of End User Computing Practice, 13(2), 40–45.
Johnston, J., Killion, J., & Oomen, J. (2005). Student satisfaction in the virtual classroom. The Internet Journal of Allied Health Sciences and Practice, 3(2).
Jöreskog, K. G., & Wold, H. (1982). The ML and PLS techniques for modeling with latent variables: Historical and comparative aspects. In K. G. Jöreskog & H. Wold (Eds.),
Systems under indirect observation: Causality structure and prediction (pp. 219–243). Amsterdam: North Holland.
Kerres, M., & De Witt, C. (2003). A didactical framework for the design of blended learning arrangements. Journal of Educational Media, 28(2/3), 101–113.
Kinshuk, D., & Yang, A. (2003). Web-based asynchronous synchronous environment for online learning. United States Distance Education Association Journal, 17(2), 5–17.
Kreijns, K., Kirschner, P., & Jochems, W. (2003). Identifying the pitfalls for social interaction in computer-supported collaborative learning environments: A review of the
research. Computers in Human Behavior, 19, 335–353.
Löhmoeller, J. B. (1984). LVPLS 1.6 program manual: Latent variable path analysis with partial least squares estimation. Köln: Universität zu Köln, Zentralarchiv für Empirische Sozialforschung.
Maki, R. H., Maki, W. S., Patterson, M., & Whittaker, P. D. (2000). Evaluation of a web-based introductory psychology course: Learning and satisfaction in on-line versus lecture courses. Behavior Research Methods, Instruments, & Computers, 32, 230–239.
Marakas, G. M., Yi, M. Y., & Johnson, R. D. (1998). The multilevel and multifaceted character of computer self-efficacy: Toward clarification of the construct and an integrative
framework for research. Information Systems Research, 9, 126–162.
Martins, L. L., & Kellermanns, F. W. (2004). A model of business school students’ acceptance of a web-based course management system. Academy of Management Learning and
Education, 3, 7–26.
Molla, A., & Licker, P. S. (2001). E-commerce systems success: An attempt to extend and respecify the DeLone and McLean model of IS success. Journal of Electronic Commerce
Research, 2(4), 1–11.
Osguthorpe, R. T., & Graham, C. R. (2003). Blending learning environments: Definitions and directions. The Quarterly Review of Distance Education, 4(3), 227–233.
Piccoli, G., Ahmad, R., & Ives, B. (2001). Web-based virtual learning environments: A research framework and a preliminary assessment of effectiveness in basic it skills
training. MIS Quarterly, 25, 401–426.
Pituch, K. A., & Lee, Y. (2006). The influence of system characteristics on e-learning use. Computers & Education, 47, 222–244.
Prieto, I. M., & Revilla, E. (2006). Formal and informal facilitators of learning capability: The moderating effect of learning climate. IE working paper. WP06-09. 21-02-2006.
Rovai, A., & Jordan, H. (2004). Blended learning and sense of community: A comparative analysis with traditional and fully online graduate courses. The International Review of
Research in Open and Distance Learning, 5(2), 1–12.
Santhanam, R., Sasidharan, S., & Webster, J. (2008). Using self-regulatory learning to enhance e-learning-based information technology training. Information Systems Research,
19, 26–47.
Shih, H. P. (2006). Assessing the effects of self-efficacy and competence on individual satisfaction with computer use: An IT student perspective. Computers in Human Behavior,
22, 1012–1026.
Singh, H. (2003). Building effective blended learning programs. Educational Technology, 44(1), 5–27.
So, H. J., & Brush, T. A. (2008). Student perceptions of collaborative learning, social presence and satisfaction in a blended learning environment: Relationships and critical
factors. Computers & Education, 51, 318–336.
Taylor, S., & Todd, P. A. (1995). Decomposition and crossover effects in the theory of planned behavior: A study of consumer adoption intentions. International Journal of
Research in Marketing, 12, 137–156.
Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46, 186–204.
Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27, 425–478.
Wood, R., & Bandura, A. (1989). Social cognitive theory of organization management. Academy of Management Review, 14, 361–384.
Wu, J. H., Tennyson, R. D., Hsia, T. L., & Liao, Y. W. (2008). Analysis of e-learning innovation and core capability using a hypercube model. Computers in Human Behavior, 24,
1851–1866.
Wu, J. H., & Wang, S. C. (2005). What drives mobile commerce? An empirical evaluation of the revised technology acceptance model. Information & Management, 42, 719–729.
Yang, Z., & Liu, Q. (2007). Research and development of web-based virtual on-line classroom. Computers & Education, 48, 171–184.
Zhang, X., Keeling, K. B., & Pavur, R. J. (2000). Information quality of commercial website home pages: An explorative analysis. In Proceedings of the 21st international conference on information systems (pp. 164–175). Brisbane, Australia.
