Author's personal copy
Article history:
Available online 3 November 2010

Keywords:
Technology acceptance
UTAUT
Digital learning environment
Teacher
Observed use

Abstract

In this study, secondary school teachers' acceptance of a digital learning environment (DLE) was investigated. Questionnaires were administered three times (T1/T2/T3) during the same school year, with the Unified Theory of Acceptance and Use of Technology (UTAUT) as theoretical framework. Next to the questionnaires, user logs were collected during the entire school year. A total of 72 teachers completed a questionnaire on at least one occasion: 64 teachers responded at T1, 41 at T2, and 55 at T3. We first investigated which factors influence teachers' acceptance of a DLE. The main predictors of DLE acceptance were performance expectancy and social influence by superiors to use the DLE. Effort expectancy and facilitating conditions were of minor importance. We then investigated how well the amount of final observed use could be predicted, and found that at T1 about one third, at T2 about one fourth and at T3 about half of the variance in observed use was predicted by attitude, behavioral intention and self-reported frequency of use. Our study showed that to maximize use of a DLE, its usefulness should be demonstrated, while school boards or principals should strongly encourage teachers to (start to) use the DLE.

© 2010 Elsevier Ltd. All rights reserved.
doi:10.1016/j.chb.2010.10.005
tion (Fishbein & Ajzen, 1975), Social Cognitive Theory (Bandura, 1986), Innovation Diffusion Theory (Rogers & Shoemaker, 1971), or the Theory of Interpersonal Behavior (Triandis, 1980), several models were developed, with the Technology Acceptance Model (TAM) (Davis, Bagozzi, & Warshaw, 1989) as the most prominent model. TAM, building on the Theory of Reasoned Action, states that the acceptance of a technology depends on two types of beliefs: the technology's perceived usefulness and its perceived ease of use. TAM has been applied in several hundreds of studies in a wide range of settings, also in the field of education (e.g. Sanchez-Franco, 2010; Teo, Lee, & Chai, 2008). Typically no more than 40% of the variance in the dependent variable is explained, leaving room for additional antecedents of acceptance (Legris, Ingham, & Collerette, 2003), resulting in many follow-up studies focusing on model expansion or refinement. Ultimately, this led to a field of research in which the knowledge was dispersed and lacked structure, until Venkatesh, Morris, Davis, and Davis (2003) synthesized the available body of evidence. Eight widespread (technology) acceptance theories were taken into account, and through an empirical study, four recurrent constructs were retained that form the base of the Unified Theory of Acceptance and Use of Technology (UTAUT):

– Performance expectancy (PE): this encompasses perceived usefulness (Davis, 1989) and other constructs regarding the usefulness of the technology, and is defined as "the degree to which an individual believes that using the system will help him or her to attain gains in job performance" (Venkatesh et al., 2003);
– Effort expectancy (EE): this encompasses constructs concerning the ease of use of the technology, such as perceived ease of use (Davis, 1989), and is defined as "the degree of ease associated with the use of the system" (Venkatesh et al., 2003);
– Social influence (SI): this encompasses constructs relating to norms in the social environment of the individual regarding his/her use of the technology, e.g. subjective norms (Fishbein & Ajzen, 1975). Social influence is defined as "the degree to which an individual perceives that important others believe he or she should use the new system" (Venkatesh et al., 2003);
– Facilitating conditions (FC): this construct is very broad as it encompasses training, support, infrastructure, and knowledge. It was distilled from perceived behavioral control (Ajzen, 1991), facilitating conditions (Thompson, Higgins, & Howell, 1991) and compatibility (Moore & Benbasat, 1991), and is defined as "the degree to which an individual believes that an organizational and technical infrastructure exists to support use of the system" (Venkatesh et al., 2003).

Next to these four constructs, UTAUT also contains four variables that moderate the relationships between the predictors and intention or use: gender, age, experience with the technology and voluntariness of use. UTAUT was found to explain up to 70% of the variance in behavioral intention, thereby outperforming its originating models (Venkatesh et al., 2003).

1.2. Technology acceptance in education

The introduction and use of computers (or technology in general) in education has attracted the attention of several researchers in the past. Two major lines of research can be discerned: on the one hand acceptance studies (e.g. Hu et al., 2003; Ma, Andersson, & Streith, 2005; Teo, 2009; Teo et al., 2008), and on the other hand more educational research in which computer attitudes, teacher beliefs and the integration of computers in the classroom are studied (e.g. Hermans, Tondeur, van Braak, & Valcke, 2008; Mueller, Wood, Willoughby, Ross, & Specht, 2008; Sang, Valcke, van Braak, & Tondeur, 2010; Shapka & Ferrari, 2003; van Braak, Tondeur, & Valcke, 2004).

Acceptance studies measure teachers' or student teachers' acceptance of computers, operationalized as the intention to use (Hu et al., 2003; Ma et al., 2005; Teo, 2009) or attitudes towards computers (Teo et al., 2008). As in acceptance studies in settings other than education, usefulness was a consistently strong predictor of acceptance (Hu et al., 2003; Ma et al., 2005; Teo, 2009; Teo et al., 2008). In general, the effect of ease of use was not that strong (Teo et al., 2008) or only indirectly significant through usefulness (Hu et al., 2003; Ma et al., 2005; Teo, 2009). The effect of subjective norms on acceptance was inconsistent: Teo et al. (2008) identified it as a direct predictor of acceptance, while Hu et al. (2003) found it to be influential only in the beginning, and Ma et al. (2005) found no effect. Three studies found facilitating conditions (Teo, 2009; Teo et al., 2008) or the related construct compatibility (Hu et al., 2003) to influence acceptance indirectly through perceived ease of use and/or perceived usefulness. Other predictors of computer acceptance were attitude (Teo, 2009), (computer) self-efficacy (Hu et al., 2003; Kao & Tsai, 2009; Teo, 2009), job relevance (Hu et al., 2003) and technological complexity (Teo, 2009).

In educational sciences, several studies found that computer attitudes have a positive influence on the integration of computers in education. In these studies, the term (computer) attitudes may refer to very diverse constructs:

– General computer attitude: this encompasses confidence, anxiety and enjoyment/liking (Hermans et al., 2008; Shapka & Ferrari, 2003; van Braak et al., 2004);
– Attitude towards computers in the classroom (Mueller et al., 2008; Sang et al., 2010; van Braak et al., 2004), comprising items related to the usefulness of the computer as a tool.

The importance of providing facilitating conditions is also a recurrent theme in this line of research. The following constructs, which may be considered categories of facilitating conditions, were mentioned as important for integrating computers in education: equipment resources and support from school administrators (Smarkola, 2008), institutional support (Kadijevich, 2006), training, access to ICT resources and ongoing support (Williams, Coles, Wilson, Richardson, & Tuson, 2000), and computer training (van Braak et al., 2004). Other factors with a positive influence on the integration of computers in the classroom were self-efficacy (Sang et al., 2010; Shapka & Ferrari, 2003) and computer experience (Hermans et al., 2008; Mueller et al., 2008).

1.3. Operationalizing acceptance

Technology acceptance can be measured in several ways. Originally, models were devised for situations where users could choose to use (or not use) a technology, and this was reflected in the operationalization of acceptance: one accepts a technology if s/he uses (or intends to use) the technology. However, in many cases users do not have a choice; they simply have to use the technology, so other conceptualizations of acceptance might be better (Warshaw & Davis, 1985b). Below, the most common operationalizations of acceptance are listed:

– Use or use behavior (Halawi & McCarthy, 2008; Landry, Griffeth, & Hartman, 2006; Venkatesh et al., 2003): observed or self-reported. Observed use can be considered the ultimate measure of acceptance: e.g. the duration of use computed from system logs (Venkatesh et al., 2003), or recording the actions a subject undertakes while completing a task (Shapka & Ferrari, 2003). A problem with use (both observed and self-reported) is that it requires subjects to have some experience with the technology. When the implementation of a technology is still being planned, other measures of acceptance should be used;
For this study, attitude, behavioral intention and use will be included as measures for acceptance. In view of the conceptual overlap with, and the dominance of, behavioral intention in this field of research, behavioral expectation will not be taken into account. Use will be measured as self-reported frequency of use and observed frequency of use from log files.

1.4. Purpose

The purpose of this study is to scrutinize secondary school teachers' acceptance of a digital learning environment. Two research questions are put forward. We will first investigate which factors contribute to secondary school teachers' acceptance of a DLE. As we draw on UTAUT as theoretical framework, the first research question to be addressed in this study is:

RQ1: To what degree can performance expectancy, effort expectancy, social influence and facilitating conditions predict the acceptance of a digital learning environment, measured as attitude, behavioral intention, self-reported frequency of use, and observed near-term use?

Second, we will also investigate whether the amount of final observed use of the DLE can be predicted. As we dispose of both self-reported and observed measures, we will assess how well the self-reported measures of acceptance predict actual use. We hypothesize that these measures of acceptance have both a direct and an indirect influence on the amount of observed use: a direct influence because attitude, behavioral intention and self-reported use have all served in past research as measures for acceptance in the absence of a measure of observed use; an indirect influence because acceptance measures are interrelated: attitude influences intention (Davis et al., 1989), and intention in its turn influences self-reported use (Venkatesh et al., 2003). Putting this together leads to the model depicted in the research question 2 pane of Fig. 1, while the second research question is formulated as follows:

RQ2: To what degree can attitude toward use of a DLE, behavioral intention to use the DLE and self-reported frequency of use of the DLE predict the final observed use of the DLE?

Combining the two research questions leads to the research model in Fig. 1. By combining the research questions, we will be able to distill the factors that lead to maximal use of the DLE.

This study intends to add to the current literature on (educational) technology acceptance in three respects:

(1) by examining professional users (teachers);
(2) by administering questionnaires on three occasions during one school year; this way the evolution over time of the teachers' opinions concerning the technology can be revealed;
(3) by collecting, in addition to the questionnaires, use behavior from log files. This is a major strength of the study, as most studies in this field of research have to rely on self-reported measures of acceptance (Legris et al., 2003).

The combination of these three characteristics distinguishes this study from other (educational) technology acceptance studies.

Fig. 1. Research model.

2. Materials and methods

2.1. Technology

The digital learning environment under scrutiny is Smartschool (www.smartschool.be). Smartschool offers its users (administrative staff, school board, teachers and pupils) both basic and very advanced functionality. The three core functionalities of Smartschool are:

– digital learning environment: consisting of 16 modules. In the DLE, teachers can set up learning paths, create exercises, take tests, collect and store tasks, etc.;
– communication: Smartschool has an internal messaging system for communication between users, public discussions can be conducted in forums, and users can read important messages from the school board on the bulletin board;
– administration: this comprises, for example, taking surveys, online timetables, and an intradesk where users can submit important documents.

Next to these core functionalities, extra features can be added to Smartschool, like an online scorecard, or linking the upload zone with Ephorus (www.ephorus.nl) to check student papers for plagiarism.

2.2. Data collection

2.2.1. Study population

The participants were members of the teaching staff (total population of 90 teachers) of a secondary school. The school is situated in the Dutch-speaking part of Belgium. In this school, three streams
of education are offered: general, technical and vocational education.

2.2.2. Instrument

The acceptance part of the questionnaire was made up of 21 items (22 at T2 and T3, see Section 3.1). The items were adapted from Duyck et al. (2008), Moore and Benbasat (1991), and Venkatesh et al. (2003), and tailored to an educational context. The following scales (number of items per scale between brackets) were included in the questionnaire: performance expectancy (four items), effort expectancy (three items at T1, four at T2 and T3, see Section 3.1), social influence (four items), facilitating conditions (three items), attitude (three items), behavioral intention (two items), and self-reported frequency of use (two items). As the use of the digital learning environment was mandatory, we were only interested in social influence exerted by superiors, and the SI-scale was adjusted accordingly. All items had to be rated on a 7-point Likert scale ranging from "complete disagreement (1)" to "complete agreement (7)", except for self-reported use, which ranged from "never (1)" to "daily (7)".

Next to these items, demographic information (gender, age, domain of teaching) was collected, while the teachers could also indicate which of the 16 modules in the DLE part they used. At the end of the questionnaire, there was room for remarks or complaints.

For this study, use is derived from system logs containing the date and time a user logged in to the system. Two measures were computed:

– Near-term use: the number of days a teacher logged in to the system during the month following the survey, respectively the use in September (T1), November (T2) and June (T3);
– Final use: the number of days a teacher logged in to the system during the school year 2006–2007 (from September 1st to June 30th).

2.2.3. Procedure

The questionnaire was administered three times during the same school year. The first questionnaire (T1) was taken during a plenary preparatory meeting at the end of August, prior to the start of the school year. At this meeting, Smartschool was formally introduced to the teaching staff, although it had been accessible since May and had already been pretested. At the meeting, the principal strongly encouraged the teachers to use Smartschool during lessons and for school tasks. He also announced that Smartschool would replace the official bulletin board, hence that use of Smartschool was mandatory. Teachers were given time to complete the questionnaire during the meeting, and the responses were collected at the end; this way, 64 usable responses were collected.

The second (T2) and third (T3) questionnaires were handed out to the teachers via their personal pigeonholes in the teachers' room. Completed responses could be posted in a sealed box in the teachers' room. The second questionnaire was handed out at the end of October, right before fall break, and 41 usable responses were collected. The last questionnaire was handed out at the end of May, and 55 usable responses were returned. A total of 72 (unique) teachers completed at least one questionnaire; user logs (showing date and time the user logged in to Smartschool) were collected for these 72 teachers.

2.3. Data-analysis

Prior to the analysis of the research questions, some preliminary analyses will be run. First, the reliability of the scales will be established using Cronbach's α. Then descriptive statistics will be computed and the correlations between the constructs will be calculated.

For the first research question, we want to investigate which factors contribute to the acceptance of the DLE, whether this changes over time, and how well the predictors predict the acceptance of the DLE. To this end, ordinary least squares regression analyses will be run in SPSS 15, per measure of acceptance, pooled over the measurements and per measurement (T1, T2 or T3).

Path analysis using AMOS 6.0 will be applied to address the second research question, as we not only want to investigate how well the self-reported measures of acceptance predict observed use, but also how the self-reported measures interrelate. To test the fit between our model and the data, the following fit measures will be used: normed χ², root mean square error of approximation (RMSEA), comparative fit index (CFI), and adjusted goodness-of-fit index (AGFI). The recommendations of Hu and Bentler (1999) are used: below .05 for RMSEA and higher than .95 for CFI and AGFI, while the normed χ² should be below 3.0 (Teo, Lee, Chai, & Wong, 2009).

3. Results

3.1. Preliminary analyses: Reliability, descriptive statistics, correlations

Table 1 displays the reliability and descriptive statistics of the scales and measures that were used throughout this study. The reliability of the FC-scale was below the threshold for acceptable reliability (.70) (Nunnally & Bernstein, 1994); however, by removing the item "Smartschool is not compatible with other systems I use", the reliability of this scale was drastically improved. For the EE-scale, there was a problem with one item ("I fear that learning to work with Smartschool will not go fast and will take a lot of time") at T1, therefore the item was replaced by "Learning to work

Table 1
Reliability and mean of the scales per measurement.
Notes: a,b,c values in the same line with the same superscript differ at p < .05 (independent samples t-test, two-sided); d number of days of Smartschool use during the school year; e number of days of Smartschool use during the month following the survey.
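The reliability figures reported in Table 1 rest on Cronbach's α, which for a scale of k items equals k/(k−1) · (1 − Σ item variances / variance of the summed score). A minimal sketch of that computation, using hypothetical 7-point Likert ratings rather than the study's data:

```python
# Cronbach's alpha for one scale:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
# The ratings below are hypothetical 7-point Likert responses, NOT the study's data.

def sample_var(xs):
    """Unbiased sample variance."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of ratings per item, respondents in the same order."""
    k = len(items)
    n_resp = len(items[0])
    # Total score per respondent across all items of the scale
    totals = [sum(item[i] for item in items) for i in range(n_resp)]
    return k / (k - 1) * (1 - sum(sample_var(it) for it in items) / sample_var(totals))

# Three items rated by five respondents (hypothetical):
ratings = [
    [5, 6, 4, 7, 5],
    [4, 6, 5, 7, 4],
    [5, 7, 4, 6, 5],
]
print(round(cronbach_alpha(ratings), 2))  # 0.89
```

By the Nunnally and Bernstein (1994) criterion applied in this study, an α of at least .70 would count as acceptable.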
withheld for the EE-scale for the remainder of the analyses.

Five significant differences between the mean scale ratings were observed (Table 1). Perceptions of the ease of use of Smartschool (EE) were significantly higher at T3 compared to both T1 (t(117) = 2.392, p = .018) and T2 (t(94) = 2.670, p = .009). Mean scores on facilitating conditions increased significantly from T1 to T2 (t(102) = 2.426, p = .017). Finally, self-reported use (s-r use) at T1 was significantly lower compared to both T2 (t(100.923) = 6.667, p < .001) and T3 (t(115.741) = 6.823, p < .001).

The descriptive statistics show that the users rated the performance expectancy of Smartschool low. Another remarkable, yet expected, finding is the high mean score on the SI-scale at all times.

The correlation analysis did not reveal unexpected findings, but

Table 2
Results of regression analysis.

Pooled            ATT      BI       s-r use   Near-term use
Time^a            .04      .04      .43***    .07
PE^a              .71***   .42***   .14       .36**
EE^a              .24***   .13      .07       .06
SI^a              .02      .28***   .23***    .20*
FC^a              .01      .08      .21*      .03
Time × PE^a       .05      .05      .03       .02
Time × EE^a       .05      .08      .05       .05
Time × SI^a       .01      .07      .09       .10
Time × FC^a       .05      .07      .10       .08
Adj. R²           .80      .35      .38       .12
Sig. R² change^b  p = .60  p = .62  p = .29   p = .54

Notes: ^a the values reported are standardized β regression coefficients; ^b this refers to the significance level of the change in R² after adding the interaction terms.

third of the variance was explained. The pooled analysis provided more information. A main effect of time (b = .43) was found, indicating that self-reported use increased over time. Both facilitating conditions (b = .21) and social influence (b = .23) had a direct effect on self-reported use. Variance explained was a lot higher compared to the analyses per measurement. Adding interaction terms did not lead to any increase in the proportion of variance explained.

self-reported use, and self-reported use observed use. There was one minor exception: the influence of attitude on observed use increased over time. While nonexistent at T1 (b = .02) and T2 (b = .07), the influence of attitude on observed use was marginally significant at T3 (b = .20, p < .10).

4. Discussion
Hu, P. J. H., Clark, T. H. K., & Ma, W. W. (2003). Examining technology acceptance by school teachers: A longitudinal study. Information & Management, 41, 227–241.

Kadijevich, D. (2006). Achieving educational technology standards: The relationship between student teachers' interest and institutional support offered. Journal of Computer Assisted Learning, 22, 437–443.

Kao, C. P., & Tsai, C. C. (2009). Teachers' attitudes toward web-based professional development, with relation to Internet self-efficacy and beliefs about web-based learning. Computers & Education, 53, 66–73.

Landry, B. J. L., Griffeth, R., & Hartman, S. (2006). Measuring student perceptions of blackboard using the technology acceptance model. Decision Sciences Journal of Innovative Education, 4, 87–99.

Legris, P., Ingham, J., & Collerette, P. (2003). Why do people use information technology? A critical review of the technology acceptance model. Information & Management, 40, 191–204.

Ma, W. W. K., Andersson, R., & Streith, K. O. (2005). Examining user acceptance of computer technology: An empirical study of student teachers. Journal of Computer Assisted Learning, 21, 387–395.

Marchewka, J. T., Liu, C., & Kostiwa, K. (2007). An application of the UTAUT model for understanding student perceptions using course management software. Communications of the IIMA, 7, 93–104.

Moore, G. C., & Benbasat, I. (1991). Development of an instrument to measure the perceptions of adopting an information technology innovation. Information Systems Research, 2, 192–222.

Mueller, J., Wood, E., Willoughby, T., Ross, C., & Specht, J. (2008). Identifying discriminating variables between teachers who fully integrate computers and teachers with limited integration. Computers & Education, 51, 1523–1537.

Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory. New York: McGraw-Hill.

Pynoo, B., Devolder, P., Voet, T., Vercruysse, J., Adang, L., & Duyck, P. (2007). Attitude as a measure of acceptance: Monitoring IS implementation in a hospital setting. In Proceedings of the sixth pre-ICIS annual workshop on HCI research in MIS (pp. 20–24).

Rogers, E. M., & Shoemaker, F. F. (1971). Communication of innovations: A cross-cultural approach. New York: Free Press.

Sanchez-Franco, M. J. (2010). WebCT – The quasimoderating effect of perceived affective quality on an extending technology acceptance model. Computers & Education, 54, 37–46.

Sang, G., Valcke, M., van Braak, J., & Tondeur, J. (2010). Student teachers' thinking processes and ICT integration: Predictors of prospective teaching behaviors with educational technology. Computers & Education, 54, 103–112.

Shapka, J. D., & Ferrari, M. (2003). Computer-related attitudes and actions of teacher candidates. Computers in Human Behavior, 19, 319–334.

Smarkola, C. (2008). Efficacy of a planned behavior model: Beliefs that contribute to computer usage intentions of student teachers and experienced teachers. Computers in Human Behavior, 24, 1196–1215.

Smartschool. (2010). <http://www.smartschool.be/> Accessed 21.05.10.

Teo, T. (2009). Modelling technology acceptance in education: A study of pre-service teachers. Computers & Education, 52, 302–312.

Teo, T., Lee, C. B., & Chai, C. S. (2008). Understanding pre-service teachers' computer attitudes: Applying and extending the technology acceptance model. Journal of Computer Assisted Learning, 24, 128–143.

Teo, T., Lee, C. B., Chai, C. S., & Wong, S. L. (2009). Assessing the intention to use technology among pre-service teachers in Singapore and Malaysia: A multigroup invariance analysis of the technology acceptance model (TAM). Computers & Education, 53, 1000–1009.

Thompson, R. L., Higgins, C. A., & Howell, J. M. (1991). Personal computing: Toward a conceptual model of utilization. MIS Quarterly, 15, 125–143.

Triandis, H. C. (1980). Values, attitudes, and interpersonal behavior. In M. M. Page & H. E. Howe (Eds.), Nebraska symposium on motivation, 1979: Beliefs, attitudes, and values (pp. 195–259). Lincoln: University of Nebraska Press.

van Braak, J., Tondeur, J., & Valcke, M. (2004). Explaining different types of computer use among primary school teachers. European Journal of Psychology of Education, 19, 407–422.

Venkatesh, V., & Bala, H. (2008). Technology acceptance model 3 and a research agenda on interventions. Decision Sciences, 39, 273–315.

Venkatesh, V., Brown, S. A., Maruping, L. M., & Bala, H. (2008). Predicting different conceptualizations of system use: The competing roles of behavioral intention, facilitating conditions, and behavioral expectation. MIS Quarterly, 32, 483–502.

Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27, 425–478.

Warshaw, P. R., & Davis, F. D. (1985a). Disentangling behavioral intention and behavioral expectation. Journal of Experimental Social Psychology, 21, 213–228.

Warshaw, P. R., & Davis, F. D. (1985b). The accuracy of behavioral intention versus behavioral expectation for predicting behavioral goals. Journal of Psychology, 119, 599–602.

Williams, D., Coles, L., Wilson, K., Richardson, A., & Tuson, J. (2000). Teachers and ICT: Current use and future needs. British Journal of Educational Technology, 31, 307–320.