
International Journal of Education and Development using Information and Communication Technology (IJEDICT), 2016, Vol. 12, Issue 1, pp. 20-36

Test of e-Learning Related Attitudes (TeLRA) scale: Development, reliability and validity study

D. H. Kisanga and G. Ireson

Nottingham Trent University, United Kingdom

ABSTRACT

The Tanzanian education system is in transition from face-to-face classroom learning to e-learning. E-learning is a new learning approach in Tanzanian Higher Learning Institutions [HLIs]
and with teachers being the key stakeholders of all formal education, investigating their attitude
towards e-learning is essential. So far, however, there has been little consideration given to
research that examines teachers’ attitudes towards e-learning in Tanzanian HLIs and
consequently, there is no standard attitude scale that has been developed to measure this. This
paper presents the development and validation of a scale of teachers’ attitude to e-learning.
Whilst the scale was initially developed to assess the attitudes of teachers in HLIs, the authors believe, having piloted it with pre-service trainee teachers in England, that it transfers across national boundaries. The final instrument contains 36 items with a Cronbach alpha score of 0.857. Although the developed attitude scale was intended for use in HLIs, it may also be of interest to researchers investigating attitudes in other sectors.

Keywords: Attitude, TeLRA scale, e-learning.

Abbreviations: HLIs – Higher Learning Institutions; ICT – Information and Communications Technology; TeLRA – Test of e-Learning Related Attitudes.

INTRODUCTION
 
E-learning has, over recent years, become ever more popular and it is gaining wide acceptance
as a “non-traditional” mode of accessing higher education (UNESCO, 2009). Researchers investigating the role of e-learning in education systems have suggested it as a viable alternative for coping with constraints on access to education (Garrison & Anderson, 2003; Weller, 2007; Clarke, 2008; Garrison, 2011). E-learning improves the efficiency, effectiveness, quality, timeliness and accessibility of education at all learning levels (COL 2003; Littlejohn and Pegler 2007; Salmon 2011).

The success of e-learning programmes depends, to a considerable extent, on teachers’ attitudes towards e-learning systems (van Raaij & Schepers, 2008). Teachers play a key role in the integration of e-learning in education, such that their attitudes towards e-learning have a significant impact not only on students’ attitude formation towards e-learning (Pynoo et al., 2012)
but also on the education transformation agenda as a whole. However, a willingness to change
from traditional learning approaches like face-to-face to e-learning is a social phenomenon where
attitude has an important role to play.

The concept of attitude has been one of the most influential of all social and psychological constructs (Fishbein & Ajzen, 1980). Throughout the history of social psychology,
social scientists have used attitude to explain human actions, since they regarded attitudes as a
behavioural disposition. This study defines attitude to be positive or negative evaluative
judgement of an entity based on affective, cognitive or behavioural experience (Schwarz, 2007). It
implies that people’s evaluative judgement of an entity depends on how they feel about it
(affective evaluation), knowledge they have about the object (cognitive evaluation) and how they
have acted towards it in the past (behavioral evaluation) (Eagly & Chaiken, 2007). Early literature
on teachers’ attitudes towards technology development, adoption and implementation defines
attitude toward technology as an affective or evaluative judgement about the technology in
question (Davis, Bagozzi & Warshaw, 1989; Barki & Hartwick, 1994). Technology which is
believed to be both important and personally relevant is more likely to generate people’s positive
attitude towards it (Rogers, 2003).

Examination of the numerous approaches used to measure attitude reveals that attitude scales, particularly summated-rating Likert scales, are the most commonly used (Albaum, 1997; Johns, 2010). A Likert, or Likert-like, scale employs self-reporting methods with a series of questions focused on assessing attitudes. Respondents rate the attitudinal object by choosing the option that best reflects their level of agreement or disagreement (Likert, 1932). The response format in the
Likert scale has been adopted in various attitude scales amongst which is the Test of Science
Related Attitudes, TOSRA (Fraser, 1981). TOSRA is a five-point scale that requires respondents
to express their degree of agreement to each statement as either strongly agree, agree, not sure,
disagree or strongly disagree.

The purpose of the work presented here was to develop and validate an attitude scale to assess
teachers’ attitudes towards e-learning. Constructs of the TOSRA scale were modified to develop
the Test of e-Learning Related Attitudes (TeLRA) scale. The TeLRA scale can be considered to
be a reliable and valid measure of attitudes towards e-learning, based on a sample of teachers from Tanzanian HLIs. Justification for developing and using the TeLRA scale, as well as evidence for the reliability and validity of the scale, is also discussed.

Conceptual framework
 
This study reports part of the research conducted in Tanzanian HLIs that investigated teachers’
attitudes towards e-learning. The study was guided by Davis’ (1986) Technology Acceptance
Model (TAM) (see Figure 1).

 
 
[Figure 1: The Technology Acceptance Model (TAM): External Variables (EV) → Perceived Usefulness (U) and Perceived Ease of Use (EoU) → Attitude Toward Using (A) → Behavioural Intention to Use (BI) → Actual System Use. Source: Davis et al. (1989).]

TAM is an information systems theory that predicts how the user comes to accept and use
technology. The model consists of four constructs: external variables (EV), teachers’ perceived
usefulness (U), teachers’ perceived ease of use (EoU) and the teachers’ attitude (A) toward e-
learning. Two constructs from TAM namely, behaviour intention and actual system use were not
included in the conceptual framework because usage of e-learning in Tanzanian HLIs is still in its
infancy (Hooker et al., 2011; Sanga et al., 2013), and therefore, attitude (A) was selected to be a
dependent variable. In this model, EV was suggested to influence A through the two constructs U and EoU, with single directional arrows representing one-way impact. TAM is helpful for both prediction and explanation in the sense that, through users’ internal beliefs and other significant variables, the researcher can identify reasons that lead to adoption or rejection of e-
learning and find appropriate corrective measures or explanations for that decision (Davis et al.,
1989; Turner et al., 2010). The TAM is easy to extend and validate whilst results from applying
the extended TAM are often accepted as being accurate predictors of adoption as well as usage
(Davis 1989; Legris et al., 2003).

LITERATURE REVIEW
 
Trends over the past two decades in ICT attitude scales show the development of attitude
measures towards computers (Nickell & Pinto, 1986; Francis, 1993; Richter, Naumann &
Groeben, 2000) and towards e-learning (Bernard, Brauer, Abrami & Surkes, 2004; Wilkinson,
Roberts & While, 2010; Teo, 2010b; Morse, Gullekson, Morris & Popovich, 2011; Hernandez-
Ramos, Martinez-Abad, Penalvo, Garcia & Rodriguez-Conde, 2014). In this study, e-learning is
defined as all kinds of electronically supported learning (whether in networked/non-networked
environments) where the learner interacts with teachers, content and other learners regardless of
place and time (Sangra et al., 2012).

Although use of attitude scales in e-learning has provided rich data for analysis and interpretation
of research findings, the literature has identified weaknesses with many existing scales (Garland
& Noyes, 2008; Teo, 2010b; Hernandez-Ramos, Martinez-Abad, Penalvo, Garcia & Rodriguez-
Conde, 2014). For example, in their analysis of previous computer attitude scales developed in the 1980s and 1990s, Garland & Noyes (2008) found that the stability of most of these scales has declined since they were first developed. Although the examined scales were all reliable, Garland & Noyes (2008, p. 563) argue that “the traditional style of computer attitude scale is no longer as relevant as when first developed.” Wilkinson, Roberts & While (2010, p. 1369) describe such scales as “dated with technological developments”. This implies that, with individuals’ changing cultural and ICT experiences as well as new technological developments, educators need scales that demonstrate predictive validity (Garland & Noyes, 2008) as well as reflect such developments.

A further weakness of attitude scales is their inability to be used in diverse populations. Literature
reveals various validated attitude scales towards e-learning with different constructs each
applicable to a particular context. For example, Bernard et al. (2004, p. 31) used factor analysis to test the development and validation of a 38-item attitude scale for predicting achievement in online learning. This analysis revealed four themes: “general beliefs about distance education, confidence in prerequisite skills, self-direction and initiative, and desire for interaction.” This scale did not meet the authors’ requirements, which were to investigate teachers’ attitudes towards e-learning rather than online learning in particular.

The attitude scale presented by Wilkinson, Roberts & While (2010) measures students’ attitudes
towards e-learning across five themes: IT skills, IT experience, IT use, IT access, Attitude to
computers and Attitude to computers in education. Further analysis of this scale revealed that its items concentrated mainly on measuring skills and experience with computers and the Internet; thus, in the authors’ opinion, it lacked the diversity of attitudinal aspects geared to the concept of attitudes towards e-learning. Although the scale demonstrated both external and internal reliability, it required further improvement to produce a useful scale (Wilkinson, Roberts & While, 2010).

Moreover, Teo (2010b) developed a 21-item E-learning Acceptance Measure (ElAM) scale with three sub-scales: Tutor Quality, Perceived Usefulness and Facilitating Conditions. Similarly, ElAM did not meet our specifications in that it was aimed at measuring users’ acceptance of e-learning rather than users’ attitudes towards e-learning. Although the scale was developed and validated in two different studies, its validity remained limited to the sample used (Teo, 2010b).

In recent years, Morse, Gullekson, Morris & Popovich, (2011, p. 482) developed a 17-item
Attitudes Towards the Internet Scale (ATIS) with three themes: General Internet Usage, Negative
Internet Attitudes and Task Facilitation. Contrary to the aim of the current study, ATIS focused
only on one aspect, which was attitude towards the Internet. In line with Teo (2010b), it was
suggested that ATIS needs to be validated in other domains to enhance its reliability (Morse,
Gullekson, Morris & Popovich, 2011). A more recent study by Hernandez-Ramos et al., (2014)
developed a 15-item single construct attitude scale to examine teachers’ attitudes towards the
use of ICT. The scale demonstrated acceptable internal consistency but focused only on
measuring attitudes towards use and was validated among teachers of a single university (Ibid.).
Findings from the analysis of the above studies show that all the scales demonstrated acceptable psychometric properties, but they lack utility across different cultural domains and/or their items lack the diversity of aspects or themes geared to the measurement of attitudes towards e-learning.

Consequently, Fraser’s (1981) Test of Science Related Attitudes (TOSRA) scale was adapted in
this study to develop Test of e-Learning Related Attitudes (TeLRA). TOSRA was developed to
measure attitudes towards science among secondary school students, and it had seven ten-item themes: Social Implications of Science, Normality of Scientists, Attitude to Scientific Inquiry, Adoption of Scientific Attitudes, Enjoyment of Science Lessons, Leisure Interest in Science and Career Interest in Science (Fraser, 1981). TOSRA uses a five-point Likert style response format with response categories ranging from Strongly Agree to Strongly Disagree (Ibid.). TOSRA has acceptable internal reliability and discriminant validity, and has undergone a test-retest examination with 238 students, administered over a two-week period between the two studies (Fraser, 1981).

TOSRA was adapted in this study because of its cross-cultural validity. It has been tested in
Australia and the United States (Fraser, 1981), Indonesia (Fraser, Aldridge & Adolphe, 2010),
Turkey (Telli, den Brok & Cakiroglu, 2010), as well as Pakistan (Anwer, Iqbal & Harriso, 2012).
Most recently, one sub-scale, Enjoyment of Science Lessons, has been validated in Albania, Kosovo, Romania, Poland and Austria (Emilov, 2013). Moreover, each theme has conceptually similar items, and through factor analysis all themes demonstrated unidimensionality (Fraser, Aldridge & Adolphe, 2010). Furthermore, TOSRA has been modified and
applied to measure attitudes towards Mathematics through a Test of Mathematics Related
Attitudes, TOMRA (Taylor, 2004; Hoang, 2008; Chow, 2011).

Although TOSRA was originally designed for measuring secondary school students’ attitudes
towards science, literature shows that with a careful review and modification of themes, TOSRA
can also be used among teachers. For example, Chin (2005) adopted TOSRA in measuring
teachers’ attitudes towards science in Taiwan. Similarly, Santiboon (2013) adopted TOSRA and
developed Test of Administrator-Related Attitudes (TOARA) to measure teachers’ attitudes
towards school administration in Thailand. Other applications of TOSRA can also be found in local sources.

To date, however, little consideration has been given to research that examines teachers’ attitudes towards e-learning at Tanzanian HLIs. Accordingly, no standard attitude scale has been developed to measure teachers’ attitudes towards e-learning.
Ndume, Tilya & Twaakyondo, (2008) conducted a survey to establish the acceptance and
challenges of e-learning as well as design an assistive tool for people with disability at Tanzanian
HLIs. However, in their research, no particular scale was developed to measure teachers’
acceptance of e-learning for which validity and reliability could be established. They suggested that future research should focus on investigating individuals’ perceptions of e-learning and the factors contributing to those perceptions. Therefore, knowledge of teachers’ attitudes towards e-learning in Tanzania is limited by the lack of valid and reliable measures. The majority of scales
discussed in this study were developed for different contexts. Thus, the aim of this study was to
develop and validate an attitude response scale that measured teachers’ attitudes towards e-
learning in Tanzanian HLIs.

 
METHOD
 
Development of the TeLRA scale had four stages. Stage one included item development through a review of the literature and assessment of TOSRA scale items, guided by the conceptual framework adapted from TAM (see Figure 1). Stage two comprised face and content validation by experts. Stage three was reliability testing, and stage four involved the pilot study. The main study, describing the reliability results and the factor analysis of the TeLRA scale, is presented in the Results section.

Stage 1: Item development


 
Development of the TeLRA scale’s items was guided by the four constructs of the conceptual framework (see Figure 1); by Rogers’ (2003) five characteristics of innovations (relative advantage, compatibility, complexity, trialability and observability of a technology), which help to explain their different rates of adoption; and by the TOSRA scale. In addition, two questions, “I believe using e-learning will improve my job performance” and “Using computer systems requires a lot of mental effort,” which had previously been extensively tested, were adapted from the journal articles of Davis (1989), Legris et al. (2003), Turner et al. (2010) and Teo (2010a).

Six separate themes were constructed, as they were found relevant for the purpose of this study: Social implications of e-learning (13 items), Attitude toward e-learning (13 items), Benefits from e-learning (15 items), Enjoyment of computer experiences (13 items), Leisure interest in e-learning affairs (11 items) and Interest in teaching through e-learning technologies (13 items). The purpose of starting with many items in our scale was to include as many aspects as possible related to attitudes towards e-learning, thereby maximising its face and content validity as well as supporting further analysis.

The first theme was intended to measure teachers’ general beliefs about e-learning. The second theme aimed to measure teachers’ affective and cognitive evaluation of e-learning. The third construct aimed to measure teachers’ cognitive information about e-learning, that is, the knowledge they have about the value of e-learning to education and to their career as a whole. The last three constructs were intended to examine teachers’ affective evaluation of e-learning in terms of their interaction with computers, interest in e-learning innovations, and future participation in e-learning. The variation of themes in the TeLRA scale development was aimed at including items that not only influence attitudes towards e-learning, but whose scores could also be used to predict teachers’ future attitudes towards e-learning. Moreover, the selected items of the TeLRA scale are free from cultural differences: they can be responded to by participants from a range of different national, social and cultural contexts. In order to avoid non-committal responses, the TeLRA scale used a four-point Likert response format with degrees of agreement ranging from 1 (strongly disagree), 2 (disagree), 3 (agree) to 4 (strongly agree).

Stage 2: Face and content validity


 
The 78-item TeLRA scale was submitted to experts to determine its face and content validity. Evaluation was conducted in terms of language clarity, adequacy and representative coverage of the domain, readability and complexity level of the items, and the appropriate time taken to complete the questions (Cohen, Manion & Morrison, 2011). Eighteen items were found to be either ambiguous, a repetition of another item or measuring a different concept; these items were deleted. Three items were slightly revised, leaving a sixty-item test. The new 60-item scale was re-evaluated and all three experts reported back with the judgement that the scale appeared to be measuring the intended construct.

Stage 3: Reliability testing


 
The 60-item TeLRA scale went through a field test with 30 pre-service teachers at Nottingham Trent University in England to establish its reliability before it was adopted in the pilot study. Reliability of the scale was measured by computing Cronbach’s alpha, which indicates the extent to which all items in the scale measure the same underlying attribute (Cronbach, 1951; Pallant, 2010; Bryman & Cramer, 2011). The Cronbach alpha score obtained was 0.877.
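
For readers without SPSS, this reliability check can be reproduced with a short script. The following is a minimal sketch, assuming the field-test responses are held in a pandas DataFrame with one column per item; the file name and column layout are hypothetical and not part of the original study.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]                            # number of items
    item_vars = items.var(axis=0, ddof=1)         # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical usage: one row per respondent, one column per item, coded 1-4.
responses = pd.read_csv("field_test_responses.csv")
print(f"Cronbach's alpha = {cronbach_alpha(responses):.3f}")
```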

However, 24 items were found to have low item-total correlation values (less than 0.25), indicating that they were measuring a different concept from the rest of the scale (Bryman & Cramer, 2004), and they were therefore removed. Removing these items enhanced the reliability to 0.888. The refined TeLRA scale had 36 items (see Table 1).
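
The item-screening step described above can likewise be reproduced with a corrected item-total correlation, that is, the correlation of each item with the sum of the remaining items. Below is a minimal sketch under the same hypothetical data assumptions, using the 0.25 cut-off applied in this study.

```python
import pandas as pd

def corrected_item_total(items: pd.DataFrame) -> pd.Series:
    """Correlation of each item with the total score of all the *other* items."""
    total = items.sum(axis=1)
    return pd.Series({col: items[col].corr(total - items[col]) for col in items.columns})

# Flag items falling below the 0.25 cut-off used in this study (Bryman & Cramer, 2004).
r_it = corrected_item_total(responses)            # 'responses' as in the previous sketch
print("Candidates for removal:", r_it[r_it < 0.25].index.tolist())
```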

Stage 4: Pilot study


 
The 36-item TeLRA scale was used in the pilot study conducted at the University Computing Center, University of Dar es Salaam, in Tanzania. The institution has similar characteristics to those involved in the main study. Twenty-six out of thirty teachers participated in the study. The TeLRA scale returned a Cronbach alpha score of 0.871. No ambiguities were reported in the test items. The small change in the coefficient from that obtained at Nottingham Trent University can be attributed to the participants coming from two different cultural backgrounds. However, the scale was still highly reliable. Therefore, all items were retained for the main study so as to measure any possible change brought about by a bigger sample.

The main study


 
A cross-sectional survey was used to collect data from 258 teachers randomly selected from four
HLIs, which were not engaged in e-learning programmes. Data were collected using paper
questionnaires. Prior to the analysis, all responses were coded as 1 = Strongly disagree, 2 = Disagree, 3 = Agree and 4 = Strongly agree for positively worded items. Subsequently, responses were reverse coded as 1 = Strongly agree, 2 = Agree, 3 = Disagree and 4 = Strongly disagree for all negatively worded items.
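
As a concrete illustration of this coding step, the sketch below reverse-codes the negatively worded items on the four-point scale (a value v becomes 5 − v, so 1 and 4, and 2 and 3, are swapped). The column labels are hypothetical; the negatively worded items correspond to those carrying the “_rev” suffix in Table 2.

```python
import pandas as pd

# Hypothetical labels for the negatively worded statements (cf. the *_rev items in Table 2).
NEGATIVE_ITEMS = ["att7", "att9", "att10", "att11", "att12", "att13", "att17", "att18",
                  "att19", "att20", "att21", "att26", "att27", "att28", "att29", "att30",
                  "att33", "att35", "att36"]

def reverse_code(responses: pd.DataFrame, negative_items=NEGATIVE_ITEMS) -> pd.DataFrame:
    """On a 1-4 scale, reverse coding maps 1->4, 2->3, 3->2 and 4->1 (i.e. 5 - value)."""
    coded = responses.copy()
    coded[negative_items] = 5 - coded[negative_items]
    return coded
```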
26 IJEDICT

Table 1: The 36-item TeLRA scale

Test of e-Learning Related Attitudes (TeLRA) Scale

Information about teachers’ understanding and attitudes about e-learning.

Instructions
• There is no wrong answer; each response will be treated as a correct one. Your opinion is what is required in this
study.
• Do not think too long about each statement. It should take you around 10 minutes to complete.
• For each statement, put a tick (✓) to show your level of agreement: Strongly Disagree, Disagree, Agree, and Strongly Agree. Do not tick across two boxes.

No. | Statement | Strongly Disagree | Disagree | Agree | Strongly Agree

1 E-learning is very economical for educational institutions to adopt.


2 I believe using e-learning will improve the quality of my work.
3 Computers make work more interesting.
4 I prefer reading articles in e-learning.
5 It is easier to revise electronic educational materials than printed material
6 I prefer using a computer to prepare my lessons.
7 I feel uncomfortable reading a text book on a computer screen than a physical text book.
8 I enjoy teaching using computers.
9 Delivering a lecture through electronic technologies is very difficult.
10 E-learning requires expensive technical support.
11 E-learning reduces quality of knowledge attained.
12 Interacting with the computer system is often frustrating.
13 A face-to-face method is more learner-centred than E-learning methods.
14 I believe using e-learning technologies will improve my job performance.
15 Communicating through social networks is fun.
16 I like reading magazines on new technology innovations.
17 Teaching through e-learning is tiresome.
18 E-learning increases learners’ social isolation.
19 E-learning technologies are difficult to use.


20 Using computer systems requires a lot of mental effort.
21 Discussions on e-learning technologies are uninteresting.
22 My institution has enough teaching-learning resources to carry out e-learning.
23 E-learning will increase teachers’ efficiency.
24 Working with computers is exciting.
25 I like discussing about new e-learning innovations.
26 Supporting learners in an e-learning environment is very difficult.
27 E-learning infrastructure is very expensive for the government to afford.
28 It will be difficult for me to become skilful in the use of e-learning tools.
29 I make errors frequently when using a Computer.
30 Using a computer at home is very frustrating.
31 Using e-learning technologies will allow me to accomplish more work than would otherwise be possible.
32 I enjoy computer games very much.
33 E-learning is a threat to teachers’ employment.
34 E-learning will provide me with better learning opportunities than traditional means of learning.
35 I find computer online interaction unexciting.
36 Communicating through electronic mails is annoying.

RESULTS
 
In order to obtain a conceptually small and meaningful number of themes, all 36 items of the TeLRA scale were subjected to Principal Component Analysis (PCA) using SPSS version 21. Prior to performing the PCA, the suitability of the data for analysis was assessed. This involved the Kaiser-Meyer-Olkin measure of sampling adequacy, which requires a value greater than 0.6, and Bartlett’s test of sphericity, which should be significant at p < 0.05 (Tabachnick & Fidell 2013, p. 619). In this study the Kaiser-Meyer-Olkin value was 0.82, exceeding the recommended value of 0.6, and Bartlett’s test of sphericity was statistically significant (p < 0.001), which verified the suitability of the data for the PCA. The PCA revealed the presence of 10 factors with eigenvalues exceeding 1. Examination of a scree plot revealed a clear break after the third component (see Figure 2).
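
Although the analysis here was run in SPSS, the same suitability checks can be reproduced elsewhere; the following is a minimal sketch using the Python factor_analyzer package, with a hypothetical file holding the 258 × 36 matrix of coded responses.

```python
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# Hypothetical file: 258 rows (teachers) x 36 columns (coded TeLRA items).
items = pd.read_csv("telra_main_study.csv")

chi_square, p_value = calculate_bartlett_sphericity(items)   # Bartlett's test of sphericity
kmo_per_item, kmo_total = calculate_kmo(items)                # Kaiser-Meyer-Olkin measure

print(f"KMO = {kmo_total:.2f} (should exceed 0.6)")
print(f"Bartlett chi-square = {chi_square:.1f}, p = {p_value:.4f} (should be below 0.05)")
```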

Using only the scree plot to determine the number of factors to retain can be subjective (Bryman & Cramer, 2004); therefore, a parallel analysis was conducted. Parallel analysis compares the size of the eigenvalues obtained from the SPSS output with eigenvalues obtained from a randomly generated data set of the same number of attitude scale variables and the same sample size (Pallant, 2010). Five factors, which explained a total of 42.6% of the variance, were obtained.
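
Parallel analysis itself is straightforward to reproduce: random data sets of the same dimensions are generated repeatedly, and components are retained only while the observed eigenvalues exceed the corresponding random ones. Below is a minimal sketch under those assumptions (comparison against the mean random eigenvalue is one common convention; percentile thresholds are also used).

```python
import numpy as np

def parallel_analysis(data: np.ndarray, n_iterations: int = 100, seed: int = 0) -> int:
    """Return how many leading components have eigenvalues larger than random data."""
    rng = np.random.default_rng(seed)
    n_obs, n_vars = data.shape

    # Eigenvalues of the correlation matrix of the observed data, largest first.
    observed = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]

    # Eigenvalues from correlation matrices of random normal data of the same shape.
    random_eigs = np.empty((n_iterations, n_vars))
    for i in range(n_iterations):
        simulated = rng.standard_normal((n_obs, n_vars))
        random_eigs[i] = np.linalg.eigvalsh(np.corrcoef(simulated, rowvar=False))[::-1]
    threshold = random_eigs.mean(axis=0)

    # Retain components only while the observed eigenvalue beats the random benchmark.
    retained = 0
    for obs, rand in zip(observed, threshold):
        if obs > rand:
            retained += 1
        else:
            break
    return retained

# Hypothetical usage with the 258 x 36 matrix of coded responses:
# n_factors = parallel_analysis(items.to_numpy())
```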

Figure 2: A scree plot

To enhance the interpretation of factors, only items with factor loadings of 0.5 or higher in both the pattern and structure matrices were selected for inclusion, because “the greater the loading, the more the variable is a pure measure of the factor” (Tabachnick & Fidell, 2013, p. 654). The theme Interest in teaching using e-learning technologies emerged with only one item, att9_rev (see Table 2); since ideally the authors would like three or more items loading on each factor, it was excluded. Consequently, four factors were obtained with 22 items (see Table 2). The factors are named, respectively: Challenges of e-learning (7 items); Benefits from e-learning (7 items); Attitude on using computer systems (3 items); and Leisure interest on e-learning innovations and use of computers (5 items), maintaining half of the themes introduced at the beginning of the study.
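
The reporting of both pattern and structure matrices in Table 2 implies an oblique rotation, although the paper does not name the rotation method; oblimin is assumed in the sketch below, which also applies the 0.5 loading cut-off. It again relies on the factor_analyzer package and the hypothetical data frame from the earlier sketches.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Principal-component extraction with an oblique (oblimin) rotation; five components
# are extracted to mirror the five-factor parallel-analysis result reported above.
fa = FactorAnalyzer(n_factors=5, method="principal", rotation="oblimin")
fa.fit(items)                                      # 'items' as in the previous sketch

pattern = pd.DataFrame(fa.loadings_, index=items.columns)     # pattern matrix
structure = pd.DataFrame(fa.structure_, index=items.columns)  # structure matrix

# Keep items loading at |0.5| or higher in both matrices, as described in the text.
keep = (pattern.abs().max(axis=1) >= 0.5) & (structure.abs().max(axis=1) >= 0.5)
print(pattern[keep].round(3))
```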

In order to examine the unidimensionality of the established factors, each factor was separately subjected to PCA. Only one factor was extracted in each case, confirming that the items in each case were measuring the same underlying concept. We further conducted a reliability test for each factor and obtained Cronbach alpha scores from 0.641 to 0.788. However, we observed that item Att32, “I enjoy computer games very much” (see Table 2), in the fourth factor had a corrected item-total correlation of less than 0.3, indicating that the item was measuring something different from the theme as a whole. The omission of this item boosted the reliability of the theme from 0.641 to 0.651 (see Table 3).

Table 2: Pattern and structure matrix for PCA of a 36-item TeLRA scale
Columns: Item No. | Item | Pattern Matrix, Components 1–5 | Structure Matrix, Components 1–5 | Communalities (Extraction)
Challenges of e-learning
att26_rev Supporting learners in an e-learning environment is very difficult. 0.699 0.067 -0.057 0.020 0.117 0.704 0.177 -0.223 0.059 -0.019 0.516

att10_rev E-learning requires expensive technical support. 0.610 -0.180 0.035 0.052 -0.012 0.575 -0.070 -0.093 0.024 -0.091 0.362

att21_rev Discussions on e-learning technologies are uninteresting. 0.609 0.056 -0.055 -0.002 0.088 0.616 0.150 -0.198 0.032 -0.030 0.392

att20_rev Using computer systems requires a lot of mental effort. 0.605 -0.122 -0.186 0.121 -0.061 0.643 0.036 -0.329 0.124 -0.167 0.471
att27_rev E-learning infrastructure is very expensive for the government to afford. 0.582 -0.016 -0.299 0.074 0.146 0.627 0.113 -0.429 0.107 0.021 0.503
att18_rev E-learning increases learners’ social isolation. 0.527 0.081 -0.051 -0.125 -0.167 0.578 0.166 -0.191 -0.084 -0.271 0.384

att19_rev E-learning technologies are difficult to use. 0.527 -0.041 -0.164 0.289 -0.183 0.599 0.158 -0.321 0.309 -0.287 0.503

att12_rev Interacting with the computer system is often frustrating. 0.478 -0.082 -0.067 0.157 -0.336 0.543 0.088 -0.211 0.162 -0.416 0.432

att11_rev E-learning reduces quality of knowledge attained. 0.457 0.269 0.084 -0.121 -0.200 0.512 0.332 -0.064 -0.046 -0.306 0.384

att33_rev E-learning is a threat to teachers’ employment. 0.400 -0.001 -0.334 -0.080 0.305 0.424 0.044 -0.398 -0.049 0.209 0.37
att13_rev A face-to-face method is more learner-centred than E-learning methods. 0.377 0.343 0.099 -0.291 -0.144 0.426 0.343 -0.021 -0.203 -0.242 0.37

att7_rev I feel uncomfortable reading a text book on a computer screen than a physical text book. 0.300 0.060 0.094 -0.153 -0.275 0.331 0.099 0.004 -0.132 -0.325 0.215
Benefits from e-learning
Att2 I believe using e-learning will improve the quality of my work. -0.131 0.735 -0.016 -0.104 0.111 -0.029 0.675 -0.055 0.068 0.035 0.498
Att14 I believe using e-learning technologies will improve my job performance. 0.000 0.624 -0.136 0.118 -0.132 0.161 0.686 -0.230 0.280 -0.230 0.522

Att23 E-learning will increase teachers’ efficiency. 0.107 0.553 -0.146 0.257 0.298 0.188 0.609 -0.230 0.398 0.188 0.548

Att1 E-learning is very economical for educational institutions to adopt. -0.024 0.552 -0.046 -0.014 0.125 0.055 0.533 -0.094 0.119 0.051 0.302

Att5 It is easier to revise electronic educational materials than printed material. -0.193 0.552 -0.049 0.020 -0.206 -0.054 0.559 -0.087 0.154 -0.252 0.378
Att3 Computers make work more interesting. 0.080 0.546 -0.042 0.088 0.112 0.163 0.570 -0.123 0.223 0.019 0.351

Att4 I prefer reading articles in e-learning 0.031 0.508 0.135 0.093 -0.072 0.098 0.529 0.055 0.207 -0.136 0.308
Att34 E-learning will provide me with better learning opportunities than traditional means of learning. 0.109 0.475 -0.063 0.122 -0.134 0.229 0.548 -0.166 0.246 -0.225 0.356
Att6 I prefer using a computer to prepare my lessons. 0.060 0.412 -0.121 0.245 -0.295 0.215 0.534 -0.226 0.360 -0.376 0.459

Attitude on using computer systems


att28_rev It will be difficult for me to become skilful in the use of e-learning tools. 0.061 0.183 -0.677 -0.067 -0.124 0.271 0.274 -0.719 0.031 -0.215 0.578
att30_rev Using a computer at home is very frustrating. 0.051 0.004 -0.655 0.090 -0.043 0.217 0.117 -0.678 0.142 -0.109 0.473

att29_rev I make errors frequently when using a Computer. 0.120 -0.069 -0.572 0.227 0.001 0.251 0.073 -0.609 0.257 -0.063 0.432

att35_rev I find computer online interaction unexciting. 0.083 0.110 -0.492 -0.034 -0.116 0.237 0.189 -0.532 0.034 -0.186 0.323

att36_rev Communicating through electronic mails is annoying. 0.232 0.049 -0.472 -0.185 0.104 0.328 0.085 -0.511 -0.133 0.021 0.348
Att31 Using e-learning technologies will allow me to accomplish more work than would otherwise be possible. 0.004 0.278 -0.337 0.195 -0.159 0.163 0.387 -0.399 0.290 -0.229 0.336
Leisure interest on e-learning innovations and use of computers
Att25 I like discussing about new e-learning innovations. 0.106 0.042 -0.014 0.707 0.019 0.134 0.229 -0.095 0.721 -0.019 0.535

Att16 I like reading magazines on new technology innovations. -0.084 0.204 0.077 0.583 0.030 -0.057 0.316 0.033 0.623 0.013 0.435

Att24 Working with computers is exciting. 0.063 0.243 -0.170 0.553 0.069 0.147 0.396 -0.249 0.624 0.001 0.493

Att32 I enjoy computer games very much. 0.230 -0.006 0.380 0.536 -0.006 0.156 0.116 0.286 0.513 -0.022 0.419

Att15 Communicating through social networks is fun. -0.215 0.026 -0.257 0.504 -0.017 -0.132 0.144 -0.248 0.523 -0.014 0.36
att9_rev Delivering a lecture through electronic technologies is very difficult. 0.173 0.047 -0.077 0.053 -0.689 0.320 0.191 -0.186 0.087 -0.733 0.589

Att22 My institution has enough teaching-learning resources to carry out e-learning. 0.135 0.281 0.299 0.116 0.509 0.025 0.227 0.269 0.156 0.471 0.405
Att8 I enjoy teaching using computers. 0.045 0.268 0.192 0.401 -0.453 0.134 0.411 0.082 0.461 -0.488 0.538

att17_rev Teaching through e-learning is tiresome. 0.132 0.105 -0.349 -0.117 -0.444 0.305 0.200 -0.421 -0.054 -0.508 0.44

Table 3: Mean inter-item correlations and Cronbach’s alpha values for each factor

Factor | No. of items | Mean inter-item correlations | Cronbach's alpha
1 Challenges of e-learning | 7 | 0.348 | 0.788
2 Benefits from e-learning | 7 | 0.300 | 0.731
3 Attitude on using computer systems | 3 | 0.416 | 0.680
4 Leisure interest on e-learning innovations and use of computers | 4 | 0.318 | 0.651

We further conducted the PCA after deleting items with factor loadings of less than 0.5. The results show that all items loaded well on the same factors except att13, “A face-to-face method is more learner-centred than E-learning methods.” On rechecking the reliability of the scale, only att13 was found to have a higher alpha-if-item-deleted value than the rest of the items. Removal of this item yielded a Cronbach’s alpha coefficient of 0.806.

We finally repeated the PCA with 22 items, fixing the number of factors to four. Again, all items loaded well on the same factors. However, Factor 1 became Benefits from e-learning (7 items); Factor 2, Challenges of e-learning (7 items); Factor 3, Leisure interest on e-learning innovations and use of computers (5 items); and the last factor was Attitude on using computer systems (3 items). The four-factor solution explained a total of 47% of the variance, with Factor 1 contributing 20.9%, Factor 2 contributing 12.7%, Factor 3 contributing 6.8% and Factor 4 contributing 6.6%.

DISCUSSION
 
The purpose of this study was to develop and validate a Test of e-Learning Related Attitudes (TeLRA) scale. The TeLRA scale offers an alternative to existing e-learning measures, focusing on teachers’ attitudes towards e-learning in HLIs. Most of the existing measures have been developed in highly technologically advanced countries where e-learning is in full operation in education settings, and many lack cross-cultural validity. Moreover, their wording could not meet our specific requirements. Thus, to address these limitations, the constructs of the TeLRA scale were initially defined by adapting the cross-culturally validated TOSRA scale. Items were developed based on a literature review guided by the conceptual framework adapted from TAM. We described the reliability and validation process and, finally, examined factorial validity.

The factor analysis identified four distinct factors (see Table 2) with factor loadings greater than an absolute value of 0.5, supporting half of our initial constructs of the scale. The cut-off point of 0.5 was aimed at producing a scale with a strong, distinct and limited number of themes that describe teachers’ attitudes towards e-learning. The mean inter-item correlations in each factor were greater than 0.3, indicating unidimensionality within each factor.

The themes of the TeLRA scale after PCA also conform to the conceptual framework adapted from TAM. The conceptual framework consisted of four constructs: external variables, teachers’ perceived usefulness, teachers’ perceived ease of use and teachers’ attitude toward e-learning, which depends on the first three constructs. The themes Challenges of e-learning, Benefits from e-learning, and Leisure interest on e-learning innovations and use of computers correspond, respectively, to external variables, teachers’ perceived usefulness and teachers’ perceived ease of use of computers. In addition, half of the emergent themes matched the initially predicted themes adapted from the TOSRA scale. This further enhanced the scale’s construct validity.

Furthermore, the results displayed in Table 2 suggest that the evaluation process of e-learning could also be governed by affective components (Bodur et al., 2000; van den Berg et al., 2006; Dempsey & Mitchell 2010) and by cognitive links between e-learning and its associated attributes (Fazio 2007). Fazio (2007, p. 608) claims that individuals’ evaluative judgement about an entity can be constructed out of “attributes that characterize the entity and their favourability.” This implies that teachers could construct positive or negative attitudes towards e-learning through the feelings and emotions they associate with e-learning and/or on the basis of its salient attributes at the time (ibid.).

As raised earlier, the diversity of aspects used to measure attitudes has been demonstrated by the behaviour of individual items. Most items concentrated on the themes Challenges of e-learning, Benefits from e-learning and Leisure interest on e-learning innovations and use of computers. This suggests that issues concerning teachers’ attitudes towards e-learning need to be addressed particularly in these areas. Teachers may be supportive of e-learning if they believe that it can be implemented with minimal challenges, is beneficial to their career and to education, and is easy to implement.

Despite the fact that the TeLRA scale has demonstrated acceptable internal reliability, we have not examined, at this stage, either external reliability or predictive validity. Demonstrating external reliability would have further informed researchers of its reliability over time. In addition, a test-retest examination would have made it possible to compute a Confirmatory Factor Analysis (CFA), whose results would have provided empirical confirmation of the themes derived by the PCA, further enhancing its validity. It is the authors’ view that conducting CFA with the same data set does not significantly add to the analysis, since the two techniques are so closely related that a factor structure derived from a PCA would almost always fit well in a CFA using the same data set (Hurley et al., 1997; Matsunaga 2010). Similarly, predictive validity would have established how well results from the TeLRA scale can be used to predict teachers’ future attitudes towards e-learning. These issues will be addressed in future studies.

However, the extracted four-factor solution, with established factorial validity, can be described as highly representative of the concept of attitudes towards e-learning. With the transformation of education from face-to-face learning to e-learning (URT, 2003; URT, 2007) happening throughout Tanzania, initiatives for adopting e-learning are also emerging in many HLIs. Thus, through the use of the TeLRA scale, education institutions can identify particular opportunities as well as threats that can influence teachers’ attitudes towards e-learning. As teachers are the key stakeholders in all formal education, successful adoption and implementation of e-learning in HLIs will depend heavily on their perceptions of, and readiness to use, e-learning in their classrooms. Given the slight difference in the Cronbach’s alpha of the TeLRA scale when used in England and Tanzania, the authors do not expect significant variations in results when it is used in other countries.

Acknowledgment
 
We wish to acknowledge Nottingham Trent University (NTU) for granting us permission to
conduct this study. Opinions expressed in this study are those of the authors and do not
represent views of the NTU.

REFERENCES

Anwer, M., Iqbal, H. M., & Harrison, C. 2012. “Students’ Attitude towards Science: A Case of
Pakistan”. Pakistan Journal of Social and Clinical Psychology. Vol. 9, No. 2, pp. 3-9.
Retrieved online 10/11/2013 at: http://www.gcu.edu.pk/ClinPsyJourVol.htm

Albaum, G. 1997. “The Likert scale revisited: An alternate version”. Journal of the Market Research
Society, Vol. 39, no. 2, pp. 331-348.

Barki, H., & Hartwick, J. 1994. Measuring User Participation, User Involvement, and User
Attitude. MIS Quarterly, Vol. 18, no. 1, pp. 59-82.
Beavers, A. S., Lounsbury, J. W., Richards, J. K., Huck, S. W., Skolits, G. J., & Esquivel, S. L.
2013. “Practical Considerations for Using Exploratory Factor Analysis in Educational
Research”. Practical Assessment, Research and Evaluation, Vol.18, no. 6, pp. 1-13.

Bernard, R., Brauer, A., Abrami, P., & Surkes, M. 2004. “The development of a questionnaire for
predicting online learning achievement”. Distance Education, Vol. 25, no. 1, pp. 31-47.

Bryman, A., & Cramer, D. 2004. Constructing Variables. In M. Hardy and A. Bryman, (Eds.)
Handbook of data analysis (pp. 17-34). London: SAGE Publication Ltd.

Bryman, A., & Cramer, D. 2011. “Quantitative Data Analysis with IBM SPSS 17, 18 and 19: A
Guide for Social Scientists”. London: Routledge.

Bodur, H. O., Brinberg, D., & Coupey, E. 2000. “Belief, Affect, and Attitude: Alternative Models of
the Determinants of Attitude”. Journal of Consumer Psychology, Vol. 9, no. 1, pp. 17 -
28.

Chin, C-C. 2005. “First-year Pre-service Teachers in Taiwan-Do they enter the teacher program
with satisfactory scientific literacy and attitudes toward science”? International Journal
of Science Education, Vol. 27, no. 13, pp. 1549–1570.

Chow, T. F. 2011. “Students' difficulties, conceptions and attitudes towards learning algebra : an
intervention study to improve teaching and learning”. Doctoral dissertation, Curtin
University Retrieved online 10/11/2013 at http://espace.library.curtin.edu.au/cgi-
bin/espace.pdf?file=/2013/04/26/file_1/191518.

Clarke, A. 2008. “E-learning skills”. 2nd edn. England: Palgrave Macmillan.
Cohen, L., Manion, L., & Morrison, K. 2011. “Research Methods in Education”. 7th edn. London:
Routledge.

COL (Commonwealth of Learning), 2003. “A Virtual University for Small States of the
Commonwealth”. 15th Conference of Commonwealth Education Ministers Edinburgh,
October 2003, Edinburgh. Scotland: Commonwealth of Learning.

Cronbach, L. J. 1951. “Coefficient Alpha and the Internal Structure of Tests”. Psychometrika, Vol.
16, no. 3, pp. 297-334.

Davis, F. D. 1986. “A Technology Acceptance Model for Empirically Testing New End-User
Information Systems: Theory and Results”. Doctoral dissertation, Sloan School of
Management, Massachusetts Institute of Technology. Retrieved online 14/04/2012 at
http://dspace.mit.edu/handle/1721.1/15192 .

Davis, F. D. 1989. “Perceived Usefulness, Perceived Ease of Use, and User Acceptance of
Information Technology”. MIS Quarterly, Vol. 13, no. 3, pp. 319-340.

Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. 1989. “User Acceptance of Computer
Technology: A Comparison of Two Theoretical Models”. Management Science, Vol. 35,
no. 8, pp. 982-1003.

Dempsey, M., & Mitchell, A. 2010. “The Influence of Implicit Attitudes on Choice When
Consumers Are Confronted with Conflicting Attribute Information”. Journal of Consumer
Research, Vol. 37, no. 4, pp. 614-625.

Eagly, A., & Chaiken, S. 2007. “The Advantages of an Inclusive Definition of Attitude”. Social
Cognition, Vol. 25, no. 5, pp. 582-602.

Emilov, I. 2013. “Constructivist Practices in Teaching Chemistry: A Cross-National Study”.


Athens: ATINER'S Conference Paper Series No: EDU2013-0610. Pp. 1-17. Retrieved
online 20/11/2013 at http://www.atiner.gr/papers/EDU2013-0610.pdf

Fazio, R. H. 2007. “Attitudes as Object-Evaluation Associations of Varying Strength”. Social


Cognition, Vol. 25, no. 5, pp. 603-637.

Fishbein, M., & Ajzen, I. 1980. “Understanding Attitudes and Predicting Social Behaviour”. USA:
Prentice-Hall, Inc.

Francis, L. J. 1993. “Measuring attitude toward computers among undergraduate college


students: The affective domain”. Computers and Education, Vol. 20, no. 3, pp. 251-255.

Fraser, B. J. 1981. “TOSRA: Test of Science-Related Attitudes handbook”. Australia: Australian


Council for Research Educational Research.

Fraser, B. J., Aldridge, J. M., & Adolphe, F. S. 2010. “A Cross-National Study of Secondary
Science Classroom Environments in Australia and Indonesia”. Research Science
Education Journal. Vol. 40, pp. 551-571. Retrieved online 10/11/2013 at
http://disde.minedu.gob.pe/

Hurley, A. E., Scandura, T. A., Schriesheim, C. A., Brannick, M. T., Seers, A., Vandenberg, R. J., & Williams, L. J. 1997. “Exploratory and confirmatory factor analysis: guidelines, issues, and alternatives”. Journal of Organizational Behaviour, Vol. 18, pp. 667-683.

Legris, P., Ingham, J., & Collerette, P. 2003. “Why do people use information technology? A
critical review of the technology acceptance model”. Journal of Information and
Management, Vol. 40, pp. 191–204.

Likert, R. 1932. “A Technique for the Measurement of Attitudes”. Archives of Psychology, No.140.
New York: New York University.

Littlejohn, A. and Pegler, C., 2007. “Preparing for Blended e-Learning”. London: Routledge.

Garland, K. J., & Noyes, J. M. 2008. “Computer attitude scales: How relevant today”? Computers
in Human Behaviour, Vol. 24, no. 2, pp. 563-575.
Garrison, D. R. 2011. “E-learning in the 21st Century”. 2nd edn. London: Routledge Taylor & Francis Group.

Garrison, D. R. & Anderson, T. 2003. “E-learning in the 21st Century: A Framework for Research and Practice”. London: Routledge Falmer.

Guri-Rosenblit, S., & Gros, B. 2011. “E-Learning: Confusing Terminology, Research Gaps and
Inherent Challenges”. Journal of Distance Education. Vol. 25, no. 1. Retrieved online
13/05/2012 at http://www.jofde.ca/index.php/jde/article/view/729

Hernandez-Ramos, J. P., Martinez-Abad, F., Penalvo, F. J. G., Garcia, M. E. H., & Rodriguez-
Conde, M. J. 2014. “Teachers’ attitude regarding the use of ICT. A factor reliability and
validity study”. Computers in Human Behaviour. Vol. 31. Pp. 509-516.

Hoang, T. N. 2008. “The Effects of Grade Level, Gender, and Ethnicity on Attitude and Learning
Environment in Mathematics in High School”. International Electronic Journal of
Mathematics Education. Vol. 3, No. 1, pp. 47-59. Retrieved online 10/11/2013 at
http://www.iejme.com/012008/d3.pdf

Johns, R. 2010. “Likert Items and Scales. Survey Question Bank: Methods Fact Sheet 1”.
Retrieved online 19/11/2013 at
http://surveynet.ac.uk/sqb/datacollection/likertfactsheet.pdf

Korkmaz, O. 2012. “A validity and reliability study of the Online Cooperative Learning Attitude
Scale (OCLAS)”. Computers and Education, Vol. 59, no. 4, pp. 1162-1169.

Morse, B. J., Gullekson, N. L., Morris, S. A., & Popovich, P. M. 2011. “The development of a
general Internet attitudes scale”. Computers in Human Behaviour, Vol. 27, no. 1, pp.
480-489.

Matsunaga, M. 2010. “How to Factor-Analyse Your Data Right: Do’s, Don’ts, and How-To’s”.
International Journal of Psychological Research, Vol. 3, no. 1, pp. 97-110.

Ndume, V., Tilya, F.N., & Twaakyondo, H. 2008. “Challenges of Adaptive e-Learning at Higher
Learning Institutions: A Case Study in Tanzania”. International Journal of Computing
and ICT Research, Vol. 2, no. 1, pp. 47 - 59. Retrieved online 17/01/2012 at
http://ijcir.org/volume2-number1/article6.pdf

Nickell, G. S., & Pinto, J. N. 1986. “The computer attitude scale”. Computers in Human
Behaviour, Vol. 2, no. 4, pp.301-306.
Pallant, J. 2010. “SPSS Survival Manual: A step by step guide to data analysis using SPSS”. 4th edn. Australia: Allen & Unwin Book Publishers.

Pynoo, B., Tondeur, J., Van Braak, J., Duyck, W., Sijnave, B., & Duyck, P. 2012. “Teachers’
acceptance and use of an educational portal”. Computers and Education, Vol. 58, no. 4,
pp. 1308-1317.

Richter, T., Naumann, J., & Groeben, N. 2000. “Attitudes toward the computer: construct
validation of an instrument with scales differentiated by content”. Computers in Human
Behaviour, Vol. 16, no. 5, pp. 473-491.
Rogers, E. M. 2003. “Diffusion of Innovations”. 5th edn. New York: Simon & Schuster, Inc.

Salmon, G., 2011. “E-Moderating: The key to teaching and learning online” [eBook]. 3rd Ed.
London: Routledge. Retrieved online 10/03/2014 at
http://lib.myilibrary.com/Open.aspx?id=336371&src=0.

Sangra, A., Vlachopoulos, D., & Cabrera, N. 2012. “Building an Inclusive Definition of E-Learning:
An Approach to the Conceptual Framework”. International Review of Research in Open
and Distance Learning, Vol. 13, no. 2, pp. 145-159.

Santiboon, T. 2013. “Factors Influencing Teachers’ Perceptions of their School Administrators’


Interpersonal Behaviours in the Primary School Administrations in Thailand”. Standard
Journal of Education and Essay, Vol. 1, no. 7, pp. 98– 105.

Schwarz, N. 2007. “Attitude construction: evaluation in context”. Social Cognition, Vol. 25, no. 5,
pp. 638-656.
Tabachnick, B. G., & Fidell, L. S. 2013. “Using Multivariate Statistics”. 6th edn. USA: Pearson Education, Inc.

Taylor, B. A. 2004. “The influence of classroom environment on high school students'


mathematics anxiety and attitudes”. Doctoral dissertation, Curtin University. Retrieved
online 10/11/2013 at
http://espace.library.curtin.edu.au/R/HN6263X3YA7IHRGLF57L3RTM5TKL5II3FFEFTM
B4XV91F3RE48-01172

Telli, S., den Brok, P., & Cakiroglu, J. 2010. “The importance of teachers-student interpersonal
relationships for Turkish students’ attitudes towards science”. Research in Science &
Technology Education, Vol. 28, no. 3, pp. 261-276.

Teo, T. 2010a. “A Path Analysis of Pre-service Teachers' Attitudes to Computer use: Applying
and Extending the Technology Acceptance Model in an Educational Context”.
Interactive Learning Environments, Vol. 18, no. 1, pp. 65-79.

Teo, T. 2010b. “Development and validation of the E-learning Acceptance Measure (ElAM)”. The
Internet and Higher Education, Vol. 13, no. 3, pp. 148-152.

Turner, M., Kitchenham, B., Brereton, P., Charters, S., & Budgen, D. 2010. “Does the technology
acceptance model predict actual use? A systematic literature review”. Information and
Software Technology, Vol. 52, pp. 463–479.

UNESCO 2009. “Trends in Global Higher Education: Tracking an Academic Revolution”. A


Report Prepared for the UNESCO 2009 World Conference on Higher Education. Paris.
URT. 2003. “National Information and Communications Technologies Policy”. Tanzania: Ministry of Communications and Transport. [URT – United Republic of Tanzania].

URT. 2007. “Information and Communication Technology (ICT) Policy for Basic Education”.
Tanzania: Ministry of Education and Vocational Training.

van den Berg, H., Manstead, A. S. R., van der Pligt, J., & Wigboldus, D. H. J. 2006. “The impact
of affective and cognitive focus on attitude formation”. Journal of Experimental Social
Psychology, Vol. 42, pp. 373–379.

van Raaij, E, M., & Schepers, J. J. L. 2008. “The acceptance and use of a virtual learning
environment in China”. Computers and Education, Vol. 50, no. 3, pp. 838-852.

Weller, M. 2007. “Virtual Learning Environments: Using, choosing and developing your VLE”.
London: Routledge Taylor and Francis Group.

Wilkinson, A., Roberts, J., & While, A. E. 2010. “Construction of an instrument to measure student
information and communication technology skills, experience and attitudes to e-
learning”. Computers in Human Behaviour, Vol. 26, no. 6, pp. 1369-1376.

Copyright for articles published in this journal is retained by the authors, with first publication rights granted
to the journal. By virtue of their appearance in this open access journal, articles are free to use, with proper
attribution, in educational and other non-commercial settings.

Original article at: http://ijedict.dec.uwi.edu/viewarticle.php?id=2087
