
Scandinavian Journal of Educational Research

ISSN: 0031-3831 (Print) 1470-1170 (Online) Journal homepage: http://www.tandfonline.com/loi/csje20

Examining the Relationship between Teachers’ Self-Efficacy, their Digital Competence, Strategies to Evaluate Information, and use of ICT at School

Ove Edvard Hatlevik

To cite this article: Ove Edvard Hatlevik (2016): Examining the Relationship between Teachers’
Self-Efficacy, their Digital Competence, Strategies to Evaluate Information, and use of ICT at
School, Scandinavian Journal of Educational Research, DOI: 10.1080/00313831.2016.1172501

To link to this article: http://dx.doi.org/10.1080/00313831.2016.1172501

Published online: 23 May 2016.


Download by: [University of Nebraska, Lincoln] Date: 29 May 2016, At: 14:31
SCANDINAVIAN JOURNAL OF EDUCATIONAL RESEARCH, 2016
http://dx.doi.org/10.1080/00313831.2016.1172501

Examining the Relationship between Teachers’ Self-Efficacy, their Digital Competence, Strategies to Evaluate Information, and use of ICT at School

Ove Edvard Hatlevik
The Norwegian Centre for ICT in Education, Oslo, Norway

ABSTRACT
Research indicates that self-efficacy in teaching is a key issue for carrying out good teaching practice. The aim of this paper was to examine the relationship between teachers’ self-efficacy in information and communication technologies (ICT), their strategies to evaluate information, their digital competence, and use of ICT at school. A sample of 332 teachers participated in a survey. The teachers answered self-report questions and responded to a multiple-choice test on digital competence. Structural equation modelling was used to test a hypothesized model of the relationship between self-efficacy in basic ICT, self-efficacy in online collaboration, strategies to evaluate information, digital competence, and the use of ICT. The analysis confirmed that the empirical data supported the hypothesized model. Significant factor loadings and positive relationships between the factors were found. Overall, the factors in the model explained 41% of teachers’ digital competence, 49% of their self-efficacy in online collaboration, and 36% of their use of ICT at school.

ARTICLE HISTORY
Received 25 June 2015
Accepted 9 March 2016

KEYWORDS
Teachers’ self-efficacy; digital competence; strategies to evaluate information; structural equation modelling

During the last decade, a rapid development of information and communication technologies (ICT)
has taken place globally for work and business purposes, for instance, in hospitals, banks, and retail
trade, to mention but a few. There are national and international initiatives to implement ICT in the
educational system (European Commission, 2013; Ferrari, 2012; Fraillon, Ainley, Schulz, Friedman,
& Gebhardt, 2014). When examining the use of ICT in the educational systems in Europe, the Euro-
pean Commission (2013) found that countries have different priorities and are choosing various
strategies regarding how to implement and use ICT at school. However, within most countries,
the role and competence of teachers is discussed when it comes to understanding the quality of
schools.
According to Christophersen, Elstad, Turmo, and Solhaug (2016), teachers’ efficacy in teaching is important for understanding how teachers are capable of achieving good teaching practice. Teachers’ efficacy concerns their beliefs and confidence about being capable of “carry[ing] out good teaching in the classroom” (Christophersen et al., 2016, p. 2).
Krumsvik (2011) emphasizes the importance of addressing teacher self-efficacy when it comes to
their teaching practice with ICT. Today there are both national and international attempts to use ICT
for teaching and learning purposes (Binkley et al., 2012; Ferrari, 2012; Fraillon et al., 2014). In Nor-
way, digital skills and competence aims have been explicitly described in the curriculum for primary
and secondary school since 2006 (Norwegian Directorate for Education and Training, 2012).

CONTACT Ove Edvard Hatlevik ove.e.hatlevik@iktsenteret.no


© 2016 Scandinavian Journal of Educational Research

The object of this paper is to examine teachers’ confidence in the use of technology, how they
evaluate online information, their tested digital competence, and their use of ICT at school. A
model about this relationship is developed from previous research, and structural equation model-
ling is used to test the model with empirical data.

1. Perspectives
1.1. Context: ICT in School
Since 2006, digital skills and competence have been part of the Norwegian curriculum for primary
and secondary school. Subject-specific competence goals are specified for the end of 2nd, 4th, 7th, and 10th
grade. Some of these competence goals contain descriptions of using digital tools and media for
learning purposes (e.g., drawing a graph, using an online syllabus, evaluating online information, using
technology critically, being aware of personal safety). Additionally, each subject has a description of how to

understand digital skills in that subject and, further, how digital tools or media can be used to
enhance students’ learning, understanding, and achievements.
Digital skills and competences are on the agenda for teachers in primary and secondary schools,
but it does not seem that the teacher education institutions have done enough to prepare pre-service
teacher students for the new curriculum (Engen, Giæver, & Mifsud, 2014; Gudmundsdottir, Hatle-
vik, Ottestad, & Wiberg, 2014; Krumsvik, 2014; Tømte, Kårstein, & Olsen, 2013). According to
Krumsvik (2014), the area of ICT is still rather new to teacher educators and the digitalization of
teacher education has to be prioritized. Gudmundsdottir et al. (2014) collected data from a sample
of 356 newly qualified teachers and reported a mismatch between what these teachers had
learned through their teacher education programmes and the demands for professional digital
competence placed on teachers at schools. One reason could be that an analysis of the curricula of all
Norwegian teacher education institutions indicates that the institutions provide fragmented
initiatives relating to ICT use and competence development, which do not seem to be grounded in or
supported by management (Tømte et al., 2013). Further, Engen et al. (2014) emphasize how the current
structure of the teacher education programme does not provide teacher students with opportunities
to develop professional digital competence in line with the school curriculum and the
expectations of the digital competence of newly qualified teachers. When it comes to putting digital
competence on the agenda, it is necessary to align teacher educators, teacher education institutions,
and policy (Krumsvik, 2014). One solution could be to develop “a framework for professional devel-
opment” (Krumsvik, 2014, p. 278) adapted to teacher educators working at the institutions.
However, it is not just teacher education that lacks emphasis on digital skills and competence. The
International Computer and Information Literacy Study (Fraillon et al., 2014) shows that Norwegian
teachers report positive attitudes about ICT at school, but they also report limited use of ICT during
lessons. When comparing European schools, it seems that Norwegian municipalities do not provide
incentives for teachers to pursue professional development in digital competence (European Commission,
2013). Training has little significance for salary, duties, or status. The training of teachers can be
characterized as a combination of organized in-house training (Fraillon et al., 2014) and trial-and-error
on one's own (Egeberg et al., 2012). Overall, there seems to be little cooperation on joint development
and sharing of teaching plans among Norwegian teachers (Fraillon et al., 2014).

1.2. Self-Efficacy
There are several examples of research on self-efficacy among teachers (Christophersen et al., 2016;
Fanni, Rega, & Cantoni, 2013; Klassen & Chiu, 2010; Krumsvik, 2011; Teo, 2014; Tondeur, Hermans,
van Brak, & Valcke, 2008).
The concept of self-efficacy builds on a theoretical framework emphasizing the assumption that
people are active agents in shaping the directions of their careers (Sáinz & Eccles, 2012). Self-efficacy

captures people's perceived expectations about their capability to complete a task or achieve a goal
(Bandura, 1997, 2006; Sáinz & Eccles, 2012). The assumption is that when people are confident
about being able to complete a task, they can be more willing to direct their concentration and persist
at the task (Bandura, 2006; Klassen & Chiu, 2010). Self-efficacy seems therefore to be important for
the choice and performance of activities.
Researchers have found that teachers’ behaviours and choices during lessons can be predicted by
their self-efficacy (Klassen & Chiu, 2010). For example, teachers’ self-efficacy seems to have a positive
correlation with higher levels of academic achievement, more effective teacher practices, and higher
levels of commitment to being a teacher (Skaalvik & Skaalvik, 2010; Viel-Ruma, Houchins, Jolivette,
& Benson, 2010). The perceived importance of the task or of the context could also influence people's
choices and priorities.
Bandura (1997) distinguishes between a domain-specific self-efficacy and a more general self-efficacy.
Measures of self-efficacy should “reflect a particular context or domain of functioning, rather
than global functioning” (Klassen & Chiu, 2010, p. 741). It seems that a more domain- or task-specific
self-efficacy has a greater potential to capture learning and achievements within the specific
domain compared with the more general self-efficacy. When it comes to developing questions and
items for research purposes, Klassen and Chiu (2010) emphasize examining people’s forward-look-
ing capabilities and not a more global ability. One reason could be that it is easier to have realistic
beliefs about a more specific task (“I can give a presentation with digital tools”) compared with a
more general task (“I am a good teacher”).
The concept of self-efficacy is important when it comes to using ICT at school (Fanni et al., 2013;
Klassen & Chiu, 2010; Krumsvik, 2011; Tondeur et al., 2008). Hammond, Reynolds, and Ingram
(2011) conducted a study looking at reasons for why teachers use ICT. They found that teachers
with lower levels of self-efficacy in respect of ICT “were among the least frequent users of ICT”
(p. 196). So, Choi, Lim, and Xiong (2012) examined teacher students and their computer use.
They reported that computer efficacy was related to both personal computer use and prospective
computer use. Further, Teo (2014) stated that the intention to use technology could be influenced
by self-efficacy and technological complexity. Fanni et al. (2013) scrutinized the relationship between
computer self-efficacy and teacher self-efficacy. They report that increased levels of computer self-
efficacy can lead to higher levels of confidence in being an efficient teacher with ICT. Other research-
ers have underpinned a positive relationship between task-specific self-efficacy and achievements in
ICT (Abele & Spurk, 2009; Broos & Roe, 2006; Yang & Cheng, 2009).
When it comes to self-efficacy and ICT in school, Krumsvik (2011) distinguishes between being
confident about using ICT on your own, that is, self-efficacy in basic ICT, and being confident about
using ICT for teaching or didactical purposes, that is, using ICT to enhance online collaboration
among students.

1.3. Strategies to Evaluate Information


Researchers have described how learning strategies can be important for learning and understanding
(Schunk, Pintrich, & Meece, 2008; Weinstein, Husman, & Dierking, 2000). Krumsvik (2011) states
that learning strategies are also an important part of teachers' professional digital competence.
Learning strategies can involve thoughts, beliefs, feelings, and behaviours (Weinstein et al., 2000,
p. 733) that can support the learner’s acquisition, reflection, and understanding of knowledge, skills,
and information.
Puustinen and Rouet (2009) are concerned with what people do when they search for and reflect
upon online information. They underline the importance of validating digital
sources and evaluating whether information is trustworthy or not. They suggest that people need to
consider the relevance and credibility of online information.
Ferrari (2013) developed a framework for understanding and developing digital competence. The
framework has five areas of digital competence. Information is one area within this framework.

Ferrari distinguishes between: (1) browsing for information, (2) evaluating information, and (3) stor-
ing information. Category 2, evaluating information, is in line with what Puustinen and Rouet (2009)
describe as evaluating whether information is relevant and credible. According to Ferrari (2013),
evaluating information means being able to “gather, process, understand and critically evaluate
information” (p. 5). Teachers working with digital sources and online information could therefore
benefit from developing strategies for the evaluation of information.
What about the relationship between self-efficacy, learning strategies, and digital competence?
Research shows a positive relationship between self-efficacy and adaptive learning strategies (i.e.,
organizing and evaluating information) (Weinstein et al., 2000). Further, Schunk et al. (2008) indi-
cate a positive relationship between learning strategies and achievements (i.e., digital performance)
(Hatlevik, 2012; Hatlevik, Gudmundsdottir, & Loi, 2015).

1.4. Digital Competence



Digital skills and competence are emphasized in many national curricula (Fraillon et al., 2014). There
seem to be national and international expectations about what students should be capable of per-
forming and achieving in digital environments (Balanskat & Gertsch, 2010; Binkley et al., 2012;
ETS, 2001; Ferrari, 2012; Fraillon et al., 2014).
In order to develop digital skills and competence among students, teachers have to fulfil two require-
ments. First, they must be able to deliver the digital competence aims set in the curriculum to the stu-
dents (Lee & Tsai, 2010). Second, they must be able to use technology in their own teaching so that they
can help students to manage the digital competence aims in the curriculum (Krumsvik, 2011).
Several researchers show how people can learn to use technology for their own learning (Claro
et al., 2012; Gui & Argentin, 2011; Katz & Macklin, 2007). This paper addresses digital competence
as a concept to describe what students are capable of achieving in digital environments or with digital
tools (Calvani, Fini, Ranieri, & Picci, 2012; Ferrari, 2013; Hatlevik & Christophersen, 2013). The
benefits of using the concept of digital competence are that: (1) it emphasizes technical understanding,
creativity, critical evaluation, and awareness of technology; (2) it is supported by a framework for understanding
and developing digital competence (Ferrari, 2013); and (3) it is in line with the competence aims
in the Norwegian curriculum (Norwegian Directorate for Education and Training [NDfET], 2012).
In connection with Norwegian schools and curricula, it may be appropriate to define digital compe-
tence as students’ capability to “use digital tools, media and resources efficiently and responsibly, to
solve practical tasks, find and process information, design digital products and communicate con-
tent” (NDfET, 2012, p. 12).
In this paper, digital competence is operationalized through the following four sub-categories: (1)
search and process, (2) produce, (3) digital responsibility, and (4) communication. These four sub-categories
are compared with two other international frameworks (Ferrari, 2013; Fraillon et al.,
2014). To search and process means being able to “search for, navigate in, sort out, categorize and
interpret digital information appropriately and critically” (NDfET, 2012, p. 12). This is rather similar
to the definition of Information (Ferrari, 2013, p. 5) and the descriptions of accessing, evaluating and
managing information (Fraillon et al., 2014, p. 35). The ability to convert, reapply, and create digital
elements is central to the sub-category produce; this is similar to Content creation as an area (Ferrari,
2013, p. 5) and to the aspects of transforming and creating information (Fraillon et al., 2014, p. 35).
The sub-category digital responsibility emphasizes students’ Internet awareness and their knowledge
of digital security and privacy protection; and is comparable with the area Safety (Ferrari, 2013, p. 5)
and the aspect of Using information safely and securely (Fraillon et al., 2014, p. 35). Communication
means being able to present and publish findings and to collaborate during learning activities; it is in
line with the area Communication (Ferrari, 2013, p. 5) and the aspect of Sharing information (Frail-
lon et al., 2014, p. 35). There are also differences between the content of these frameworks, because
the framework for the Norwegian curriculum does not explicitly mention knowledge of computer
use (Fraillon et al., 2014) or problem solving with technology (Ferrari, 2013).

To conclude, the frameworks presented in this section consist of areas or sub-categories of digital
competencies. These three frameworks (Ferrari, 2013; Fraillon et al., 2014; NDfET, 2012) have differ-
ences when it comes to the numbers and labels of the areas/sub-categories, but there are many simi-
larities when it comes to the content of the frameworks.

1.5. Research Questions and Hypotheses


This paper addresses the relationship between teachers’ self-efficacy, strategies to evaluate information,
digital competence, and use of ICT. This study tries to answer two research questions (RQ):

RQ1: What characterizes the quality of the measurement of self-efficacy and of the test of digital competence?
RQ2: What characterizes the relationship between self-efficacy in basic ICT, self-efficacy in online
collaboration, strategies to evaluate information, digital competence, and the use of ICT at school?

Six hypotheses, about the relationship between these variables, were developed from existing
research literature (see Figure 1).
Two hypotheses address how self-efficacy in basic ICT is related to both self-efficacy in online
collaboration and strategies to evaluate information:
H1. Teachers’ self-efficacy in basic ICT predicts their self-efficacy in online collaboration.
H2. Teachers’ self-efficacy in basic ICT predicts strategies to evaluate information.

The next two hypotheses deal with the variables self-efficacy in basic ICT and strategies to evaluate
information, predicting digital competence.
H3. Teachers’ self-efficacy in basic ICT predicts digital competence.
H4. Strategies to evaluate information predict digital competence.

Finally, the model contains two hypotheses about variables predicting teachers' use of ICT at
school.
H5. Self-efficacy in online collaboration predicts ICT use at school.
H6. Digital competence predicts teachers' use of ICT at school.

Figure 1. Theoretical model showing the relationship between teachers’ self-efficacy, strategy for organizing information, digital
competence, and use of ICT.

2. Methods
2.1. Procedures and Participants
This is a cross-sectional study. Sample preparation consisted of two steps. First, 500 schools were
randomly selected based on official information about schools in Norway. Second, school leaders
were asked to select 1–3 teachers from their schools to participate in the study. The school leaders
received information about how to select teachers from their schools and how to give teachers access to a
web-based questionnaire. The total sample consisted of 312 teachers (45.2% male and 54.8% female)
from 140 primary and secondary schools (Table 1). The response rate was approximately 28% at the
school level, which was lower than expected.

2.2. Measures

The study consisted of a questionnaire with self-report questions, and a test section with multiple-
choice questions. All the questions and statements are presented in Table 2.

Background Information
The teachers were asked about their age and gender.

Self-Efficacy in Basic ICT


Self-efficacy was measured with statements reflecting teachers’ beliefs about their capability (Ban-
dura, 2006; Klassen & Chiu, 2010). Four statements were used to assess teachers’ beliefs regarding
self-efficacy in basic ICT. Teachers’ responses were scored in the following way: 1 = “No,” 2 =
“Yes, with help from others,” and 3 = “Yes, alone.”
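The internal consistency values (CA) reported in Table 2 are Cronbach's alpha coefficients. As an illustration, alpha for a multi-item scale such as the four-item basic-ICT measure can be computed as below. This is a minimal sketch with hypothetical response data, not the study's actual responses:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score lists (one inner list per item).

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    k = len(items)
    n = len(items[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

    item_vars = sum(variance(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - item_vars / variance(totals))

# Hypothetical responses from five teachers to the four basic-ICT items,
# scored 1 = "No", 2 = "Yes, with help from others", 3 = "Yes, alone".
items = [
    [3, 2, 3, 1, 2],
    [3, 2, 3, 1, 3],
    [2, 2, 3, 1, 2],
    [2, 1, 3, 1, 2],
]
print(round(cronbach_alpha(items), 2))  # → 0.95
```

With real survey data the same function would be applied to each scale separately, one inner list per item.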

Self-Efficacy in Online Collaboration


Two statements were used to assess teachers’ beliefs regarding their confidence about organizing stu-
dents’ online collaboration with ICT. Teachers’ responses were scored in the following way: 1 =
“No,” 2 = “Yes, with help from others,” and 3 = “Yes, alone.”

Strategies to Evaluate Information


Three statements were used to measure the teachers’ strategies for evaluating information when gathering
it online. A Likert scale was used for the responses (1 = Strongly disagree, 2 = Disagree, 3 = Agree, and
4 = Strongly agree).

Use of ICT
The teachers were given one question about how often they used a computer each week for admin-
istrative purposes and one question about how often they used a computer each week for teaching
purposes. In coding the teachers’ answers, 1 = “3 hours or less each week,” 2 = “Between 3 and 6

Table 1. Gender, age, and use of ICT.

Type                                     %
Gender
  Male                                   45.2
  Female                                 54.8
Age
  20–29                                  8.8
  30–39                                  31.5
  40–49                                  32.1
  50–69                                  27.6
Use ICT more than 6 hours each week
  During their lessons                   30.4
  For administrative purposes            66.9
SCANDINAVIAN JOURNAL OF EDUCATIONAL RESEARCH 7

Table 2. Results showing means, factor loadings, and standard errors for all items in the structural equation model (N = 312, CFI =
0.960, TLI = 0.955, RMSEA = 0.021 (LO 90 = 0.00, HI 90 = 0.031), WRMR = 0.806).
Mean Loading SE
Self-efficacy basic skills (CA = 0.73)
I can use a spreadsheet to draw a graph 2.32 0.63** 0.05
I can download and install programmes 2.77 0.61** 0.06
I can edit digital photos or graphics 2.55 0.52** 0.07
I can make a database 1.55 0.73** 0.05
Self-efficacy online collaboration (CA = 0.71)
I can use collaborative writing tools on the Internet 2.33 0.66** 0.08
I can use social media together with students for collaborative/group work 2.01 0.78** 0.08
Strategies to evaluate information (CA = 0.61)
When I find information online, I check if the information is in line with my task 3.67 0.69* 0.06
When I have found information online, I check if this is in line with information from other sources 3.41 0.66* 0.07
When I have found information about a theme online, I am concerned with the author 3.39 0.44* 0.07
Digital competence (CA = 0.73)
Can you trust information from Wikipedia? 0.92 0.68* 0.11

What statement about Wikipedia is not correct? 0.60 0.41* 0.07


You want to change competence aims in the curriculum. What can you do? 0.55 0.39* 0.07
What is the main difference between a digital map and a map on paper? 0.95 0.83* 0.12
Can others identify which websites you have visited and the search word you have used? 0.69 0.55* 0.07
What is not a browser? 0.89 0.71* 0.08
What is a cookie? 0.86 0.66* 0.08
What is the name of a computer operating system? 0.65 0.68* 0.06
Can you trust information from [an online encyclopaedia]? 0.71 0.50* 0.07
What does it mean that the email is encrypted? 0.78 0.49* 0.08
Can a virus attack your computer if you download a film from the Internet? 0.67 0.64* 0.06
Can you remove an online photo? 0.64 0.44* 0.07
Who is the owner of the images a person publishes on his/her Facebook profile? 0.83 0.55* 0.08
Can a person delete his/her Facebook profile? 0.68 0.60* 0.06
Can you publish a photo on a blog site? 0.38 0.35* 0.08
Use of ICT (CA = 0.52)
Use of ICT for administrative purposes 3.32 0.84* 0.08
Use of ICT for teaching during lessons 2.18 0.42* 0.08
Cronbach's alpha (CA) is included for all factors. ICT = information and communication technologies.
*p < .01.

hours,” 3 = “Between 7 and 9 hours each week,” 4 = “Between 10 and 12 hours each week,” and 5 =
“More than 13 hours per week.”

Digital Competence Measure


In addition to the self-report questions, the teachers answered a test with 15 multiple-choice ques-
tions measuring digital competence. Each multiple-choice question had four alternative response
options, with only one correct answer per question. Teachers' answers were coded with 1 point
for each correct answer and 0 for each incorrect answer.
The questions were developed on the basis of the competence aims in the national curriculum at
the end of 7th grade. The questions covered various aspects of the national curriculum, such as digital
responsibility, how to search for and process information, and how to produce information, through
authentic questions.

2.3. Data Collection and Protection


The study followed the guidelines on data collection and privacy protection from the Norwegian
Social Science Data Services (www.nsd.uib.no).

2.4. Data Analysis


Structural equation modeling (SEM) was used to answer the two research questions and to test the
six hypotheses. It can be described as a multivariate regression model with both dependent and

independent variables that can include factor analyses and can be used for different purposes (i.e.,
confirmatory factor analysis [CFA] and path analyses with manifest or latent variables).
In this paper, the CFA option in Mplus 7 was used to examine the quality of the measurement of
self-efficacy and digital competence (RQ 1).
In order to answer RQ 2 and to test the six hypotheses, SEM was used to specify a model based on
the six hypotheses. Self-efficacy in basic ICT was linked to self-efficacy in online collaboration (H1),
strategies to evaluate information (H2), and digital competence (H3). Further, the model linked strat-
egies to evaluate information to digital competence (H4). Finally, self-efficacy in online collaboration
(H5) and digital competence (H6) was linked with ICT use at school.
Self-efficacy in basic ICT was estimated with four items, self-efficacy in online collaboration
with two items, strategies to evaluate information with three items, and use of ICT at school
with two items. Digital competence was estimated with 15 categorical items (correct or not).
The mean and variance adjusted weighted least squares estimator in Mplus was used to test

the model.
Mplus provides information for examining the quality of the tested model. A
chi-square test was run; a non-significant chi-square indicates that the data do not differ
significantly from the model. However, when analysing categorical data, the chi-square statistic has to be
interpreted with caution.
Common indices were used to evaluate the fit of the model: the comparative fit index (CFI), the
Tucker-Lewis fit index (TLI), the root mean square error of approximation (RMSEA), and the standardized
root mean square residual (SRMR). Levels of CFI and TLI ≥ 0.95, RMSEA ≤ 0.08, and SRMR ≤
0.06 are acceptable (Brown, 2006; Kline, 2010). When including categorical data in the analyses,
the SRMR is replaced by the weighted root mean square residual (WRMR), and a WRMR ≤ 1 indicates
a good fit.
Overall, the quality of the internal structure of the model is of high importance. The factor
loadings have to meet the following criteria: (1) they have to be statistically significant
at the 5% level; (2) factor loadings close to 0.40 or above are desirable, but factor loadings above
0.20 are accepted.
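The cutoff rules above can be collected into a small helper. Applied to the fit statistics later reported for the total model (CFI = 0.960, TLI = 0.955, RMSEA = 0.021, WRMR = 0.806), every criterion is met. This is only an illustrative sketch of the decision rules cited from Brown (2006) and Kline (2010), not part of the study's Mplus workflow:

```python
def fit_acceptable(cfi, tli, rmsea, wrmr=None, srmr=None):
    """Check model fit against the cutoffs used in the paper:
    CFI and TLI >= 0.95, RMSEA <= 0.08, SRMR <= 0.06,
    and (for categorical indicators) WRMR <= 1 in place of SRMR."""
    checks = {
        "CFI": cfi >= 0.95,
        "TLI": tli >= 0.95,
        "RMSEA": rmsea <= 0.08,
    }
    if wrmr is not None:          # categorical data: WRMR replaces SRMR
        checks["WRMR"] = wrmr <= 1.0
    elif srmr is not None:
        checks["SRMR"] = srmr <= 0.06
    return all(checks.values()), checks

ok, detail = fit_acceptable(cfi=0.960, tli=0.955, rmsea=0.021, wrmr=0.806)
print(ok)  # → True: all indices within the cited cutoffs
```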

3. Results
3.1. Properties of Items
The data were analysed using IBM SPSS, version 21, and Mplus 7.0. Before starting the analyses, all
items were tested for normality. Skewness provides information about how the distribution of an
item compares with the normal distribution, and kurtosis refers to whether the distribution of the data
is flat or pointed. All the items in the study have reasonable levels of skewness. Three
items had higher levels of kurtosis, but the levels were within what Lau and Yuen
(2015, p. 3) assume to be normal.
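The normality screening described here can be reproduced with the conventional moment-based estimators of skewness and excess kurtosis. This sketch uses hypothetical Likert responses, and the formulas are the standard moment ratios rather than necessarily the exact estimators SPSS reports:

```python
def skewness(xs):
    """Moment-based sample skewness: m3 / m2^(3/2); 0 for a symmetric distribution."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m3 = sum((x - m) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

def excess_kurtosis(xs):
    """Moment-based excess kurtosis: m4 / m2^2 - 3; 0 for a normal distribution,
    negative for a flat distribution, positive for a pointed one."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / m2 ** 2 - 3

# Hypothetical Likert responses (1-4) to one of the strategy items.
responses = [4, 4, 3, 4, 3, 4, 2, 4, 3, 4]
print(round(skewness(responses), 2), round(excess_kurtosis(responses), 2))  # → -0.99 -0.22
```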
Missing data from respondents have implications for the analyses. In this study, missing data
could be handled by data imputation, as data were missing completely at random; Little's Missing
Completely at Random test (Polit & Beck, 2004) was not statistically significant
(chi-square = 164.9, df = 146, p = 0.14).
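The reported p-value can be sanity-checked from the chi-square statistic and its degrees of freedom. The sketch below uses the Wilson–Hilferty cube-root normal approximation to the chi-square upper tail (an approximation of the tail probability only, not an implementation of Little's test itself) and reproduces p ≈ 0.14:

```python
import math

def chi2_sf(x, df):
    """Approximate upper-tail probability P(Chi2_df > x) via the
    Wilson-Hilferty cube-root normal approximation."""
    c = 2.0 / (9.0 * df)
    z = ((x / df) ** (1.0 / 3.0) - (1.0 - c)) / math.sqrt(c)
    return 0.5 * math.erfc(z / math.sqrt(2.0))  # 1 - Phi(z)

p = chi2_sf(164.9, 146)
print(round(p, 2))  # → 0.14, matching the non-significant MCAR result
```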

3.2. Characteristics of Teachers


In total, 312 teachers (54.8% female and 45.2% male) (Table 1) answered the questionnaire. The
majority of the teachers were aged between 30 and 49 years old. The teachers were asked about
their use of ICT. A total of 66.9% of the teachers used ICT more than 6 hours each week for prep-
aration and administrative purposes, whereas 30.4% of the teachers in the study reported that they
used ICT more than 6 hours each week for teaching purposes.

3.3. CFA
Confirmatory factor analyses were used to examine the quality of (1) the measurement of self-efficacy
and (2) the test of digital competence.
To test the measurement of self-efficacy, a model with four items loading on self-efficacy in basic
skills and two items loading on self-efficacy in online collaboration was tested. The results indicated a
very good model fit as the chi-square test was not significant, CFI = 1.00, TLI = 1.00, RMSEA = 0.00
(LO 90 = 0.00 and HI 90 = 0.00), SRMR = 0.00. All factor loadings were significant (p < 0.01) and
above 0.70.
To examine the test of digital competence, a model with 15 items loading (categorical data) on the
latent variable digital competence was run. The fit results indicated a very good model fit because the
chi-square test was not significant, CFI = 0.97, TLI = 0.97, RMSEA = 0.02 (LO 90 = 0.00 and HI 90 =
0.04), WRMR = 0.80. All factor loadings were significant (p < 0.01) and 14 out of 15 factor loadings
were above 0.40.

3.4. Psychometric Properties: Testing the Total Model


A model with the six hypotheses (Figure 1) converged to an acceptable solution. The fit results of
the model are acceptable as the chi-square was not significant, CFI = 0.960, TLI = 0.955, RMSEA =
0.021 (LO 90 = 0.00 and HI 90 = 0.031), WRMR = 0.806.
The internal factor structure of the model seems to be good as all factor loadings are above 0.35
and statistically significant (p < 0.01). The means, factor loadings, and standard errors of the items
are presented in Table 2.
Figure 2 contains the regression paths between the factors in the model and the explained var-
iance in the dependent variables. All the relationships specified in the model Hypotheses 1–6 are stat-
istically significant. Self-efficacy in basic ICT predicts self-efficacy in online collaboration (H1: β = .70,
p < .01), strategies to evaluate information (H2: β = .36, p < .01), and digital competence (H3: β = .46,
p < .01). Strategies to evaluate information predict digital competence (H4: β = .31, p < .01). Self-efficacy in online collaboration (H5: β = .18, p < .01) and digital competence (H6: β = .50, p < .01) predict ICT use.

Figure 2. Tested model with regression coefficients and explained variation in the dependent variables. *p < .05, **p < .01.

The variables in the model explain 13% of the variation in strategies to evaluate information, 41%
of the variation in digital competence, 49% of the variation in self-efficacy in online collaboration, and
36% of ICT use.
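As a rough illustration of how the reported coefficients combine, the indirect effect of self-efficacy in basic ICT on ICT use can be approximated with the standard product-of-coefficients (path-tracing) rule. The sketch below assumes no direct path from basic ICT self-efficacy to ICT use, consistent with the six hypothesized paths; the `paths` dictionary and the `total_effect` helper are illustrative devices, not part of the author's analysis:

```python
# Standardized path coefficients as reported for the tested model (Figure 2).
paths = {
    ("basic", "collab"): 0.70,           # H1
    ("basic", "strategies"): 0.36,       # H2
    ("basic", "competence"): 0.46,       # H3
    ("strategies", "competence"): 0.31,  # H4
    ("collab", "use"): 0.18,             # H5
    ("competence", "use"): 0.50,         # H6
}

def total_effect(src, dst, paths):
    """Sum the products of coefficients over every directed path src -> dst."""
    direct = paths.get((src, dst), 0.0)
    via = sum(coef * total_effect(mid, dst, paths)
              for (s, mid), coef in paths.items() if s == src and mid != dst)
    return direct + via

# Indirect effect of basic ICT self-efficacy on ICT use,
# accumulated over the three routes through the model:
print(round(total_effect("basic", "use", paths), 3))  # → 0.412 with the coefficients above
```

Under this rule, the three routes contribute .70 × .18, .46 × .50, and .36 × .31 × .50 respectively, which is one way to read the model's claim that basic ICT self-efficacy matters for ICT use even without a direct path.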

4. Discussion
This paper addresses the relationship between teachers’ self-efficacy, strategies to evaluate infor-
mation, digital competence, and use of ICT. Two research questions were developed for this paper.
The first research question is about the quality of the measurement of self-efficacy and the digital
competence test. The items used to identify self-efficacy in basic ICT and self-efficacy in online collabor-
ation were found to be valid measures. The analyses of these questions supported a distinction between
self-efficacy in basic ICT and online collaboration, as recently suggested by Krumsvik (2011). These
items are examples of a task-specific self-efficacy because they describe how confidence about capability
is connected to a certain domain of functioning (Bandura, 1997). This is in line with Klassen and Chiu’s
(2010) recommendation to ask questions about specific tasks in order to measure realistic beliefs.
Further, the analysis of the digital competence test supported that the 15 test items could be used to
build the latent variable digital competence. The 15 tasks were developed from specific competence aims
in the curriculum, and the tasks imply a more scholastic perspective of ICT instead of a more recreational
use of ICT. The test of digital competence with these 15 tasks does not cover the entire curriculum, but the
test measures selected parts of how digital competence is described in the curriculum.
The second research question was operationalized into six hypotheses. The analysis of the
suggested model shows that all the six examined hypotheses are supported by the collected data.
When examining the first hypothesis, the results of the SEM analyses indicate that teachers’ self-effi-
cacy in basic ICT predicts their self-efficacy in online collaboration. There is a high correlation
between teachers reporting confidence in solving basic ICT tasks and reporting self-confidence
using ICT with students in online collaboration. This is in line with the assumption that self-efficacy in basic ICT is important for teachers to develop self-efficacy to use ICT for teaching purposes (Krumsvik, 2011).
When it comes to the second hypothesis, the results show that self-efficacy in basic ICT predicts
strategies to evaluate information. Previous research suggests that increased levels of motivation,
operationalized as self-efficacy, could lead to more commitment and increased levels of learning
strategies (i.e., how to process information) (Schunk et al., 2008; Weinstein et al., 2000).
When it comes to explaining digital competence (Hypotheses 3 and 4), the results indicate that
both self-efficacy in basic ICT and strategies to evaluate information can predict variation in
measured digital competence. Self-efficacy in basic ICT and digital competence are positively related
with each other, as reported by Fanni et al. (2013), and there seems to be a partial overlap between
self-efficacy and digital competence. One explanation of this partial overlapping is that self-reported
efficacy measures something different from assessed digital competence (Hatlevik et al., 2015). According to Hargittai and Shafer (2006), it is important to distinguish between tested and self-reported digital competence. When it comes to strategies to evaluate
information, Schunk et al. (2008) suggest that use of appropriate strategies could lead to better
achievements. Krumsvik (2011) also emphasizes the importance of learning strategies when describ-
ing teachers’ digital competence.
Finally, we look at the results from testing the fifth and sixth hypotheses. The results show that both self-efficacy in online collaboration and digital competence can explain variation in teachers' use of ICT. This indicates that a teacher's beliefs about his or her own capability to carry out online collaboration activities for students are positively related to their use of ICT in education. It is also reported by
recent research that teachers’ self-efficacy is important for their use of ICT at school (Klassen & Chiu,
2010). When it comes to the sixth hypothesis, it seems that higher achievements in digital competence
can lead to more frequent use of ICT. It is essential that digital competence is positively related to use of
ICT, because higher levels of digital competence can contribute to a more sensible and critical use of

technology in school. This is an argument for developing frameworks and approaches in order to inte-
grate digital competence into Norwegian teacher education (Krumsvik, 2014).
A reform in Norwegian schools has attempted to integrate digital competencies into subjects, but the training of teachers seems not to have been put on the agenda
(Tømte et al., 2013). The Teaching and Learning International Survey (TALIS) 2013 shows that Nor-
wegian teachers are demanding opportunities to upgrade and increase their competence in the use of
ICT. Learning to use ICT skills for teaching and using new technologies in the workplace were the
second and third most needed areas for professional development among Norwegian teachers in the
TALIS (Organisation for Economic Co-operation and Development, 2014). One possible expla-
nation could be the lack of systematic training of teachers working in schools (Gudmundsdottir
et al., 2014). Teacher educators have to find ways to integrate digital competence into their teacher education programmes and the professional development of teachers (Krumsvik, 2014). Hopefully, a focus on continuing education for teachers can contribute to increased self-efficacy, enhanced strategies for
evaluating information, and improved digital competencies (in alignment with the descriptions of
competence aims in the curriculum).

5. Conclusion
5.1. Limitations
The limitations of the study have to be mentioned. First, the response rate was only 25.3% at the
school level. Second, when using an online survey, teachers and schools with a positive attitude
towards technology could be overrepresented. Third, a self-selection bias could occur because it
was not mandatory to participate in the study. However, as we found variation in self-efficacy, digital competence, and ICT use between the teachers, a rather heterogeneous sample of teachers seems to have participated. Finally, the hypothesized model has a good fit, but other models could fit the data equally well, and other constructs could be used to further explain the relationships between the factors in the model. Nevertheless, the results from the study give insight
into factors predicting digital competence.

5.2. Concluding Remarks and Further Research


To conclude, the theoretical concepts, the theoretical model, and the hypotheses were supported by
the data. The questions used to measure teachers' ICT confidence seem to be consistent and appropriate for identifying teachers' self-efficacy, both in basic ICT and in online collaboration. The questions did not target a general type of self-efficacy; they were ICT-specific.
Second, the tasks in the test were developed against the background of the national curriculum. This study shows that the tasks used to measure digital competence provide an acceptable and valid measure. The test does not cover the digital competence concept in full, but covers selected topics
of digital competence.
Third, the analyses support the relationships between ICT self-efficacy, strategies for evaluating
information, digital competence, and use of ICT. The results show that teachers’ ICT self-efficacy is
important for strategies to evaluate information and for teaching practice. Self-efficacy can explain
variation both in teachers’ digital competence and their use of ICT at school. This underpins how
important it is that teachers have confidence in their own capabilities when it comes to using
ICT for teaching purposes. The results also show that strategies to evaluate information can be
a key factor in the development of teachers' digital competence.
Overall, it is important for further research to examine the relationship between self-efficacy, use
of ICT, and digital competence with a longitudinal research design. Further, more research is also
required on how to develop teachers’ self-efficacy, their strategies to evaluate information, and
their digital competence according to the competence aims in the curriculum.

Disclosure Statement
No potential conflict of interest was reported by the author.

References
Abele, A. E., & Spurk, D. (2009). The longitudinal impact of self-efficacy and career goals on objective and subjective
career success. Journal of Vocational Behavior, 74, 53–62.
Balanskat, A., & Gertsch, C. A. (2010). Digital skills working group. Review of national curricula and assessing digital
competence for students and teachers: Findings from 7 countries. Brussels: European Schoolnet.
Bandura, A. (1997). Self-efficacy: The exercise of control. New York, NY: Freeman.
Bandura, A. (2006). Guide for constructing self-efficacy scales. In F. Pajares, & T. Urdan (Eds.), Adolescence and edu-
cation: Vol. 5. Self-efficacy and adolescence (pp. 307–337). Greenwich, CT: Information Age.
Binkley, M., Erstad, E., Herman, J., Raizen, S., Ripley, M., Miller-Ricci, M., & Rumble, M. (2012). Defining 21st century
skills. In P. Griffin, B. McGaw, & E. Care (Eds.), Assessment and teaching of 21st century skills (pp. 17–66).
Dordrecht, the Netherlands: Springer. doi:10.1007/978-94-007-2324-5.


Broos, A., & Roe, K. (2006). The digital divide in the playstation generation: Self-efficacy, locus of control and ICT
adoption among adolescents. Poetics, 34, 306–317.
Brown, T. A. (2006). Confirmatory factor analysis for applied research. London: The Guilford Press.
Calvani, A., Fini, A., Ranieri, M., & Picci, P. (2012). Are young generations in secondary school digitally competent? A
study on Italian teenagers. Computers & Education, 58, 797–807.
Christophersen, K. A., Elstad, E., Turmo, A., & Solhaug, T. (2016). Teacher education programmes and their contri-
bution to student teacher efficacy in classroom management and pupil engagement. Scandinavian Journal of
Educational Research, 60, 240–254.
Claro, M., Preiss, D. D., San Martín, E., Jara, I., Hinostroza, J. E., Valenzuela, S., & Nussbaum, M. (2012). Assessment of
21st century ICT skills in Chile: test design and results from high school level students. Computers & Education, 59
(3), 1042–1053. doi:10.1016/j.compedu.2012.04.004
Educational Testing Service (ETS). (2001). Digital transformation. A framework for ICT literacy. A report of the inter-
national ICT literacy panel. Princeton, NJ: Educational Testing Service. Retrieved January 10, 2013 from http://
www.ets.org/iskills/about/research/
Egeberg, G., Gudmundsdottir, G. B., Hatlevik, O. E., Ottestad, G., Skaug, J. H., & Tømte, K. (2012). Monitor 2011.
Skolens digitale tilstand. Oslo: Senter for IKT i utdanningen.
Engen, B. K., Giæver, T. H., & Mifsud, L. (2014, March 5–7). Mind the gap: ICT in the Norwegian national curriculum
and the 2010 Teacher Education Reform. Paper presented at the Nordic Educational Research Association (NERA)
conference. Education for sustainable development, Lillehammer, Norway.
European Commission. (2013). Survey of schools: ICT in education. Luxembourg: The European Union.
Fanni, F., Rega, I., & Cantoni, L. (2013). Using self-efficacy to measure primary school teachers’ perception of ICT:
Results from two studies. International Journal of Education and Development using Information and
Communication Technology (IJEDICT), 9(1), 100–111.
Ferrari, A. (2012). Digital competence in practice: An analysis of frameworks (Report EUR 25351 EN). Luxembourg:
Publications Office of the European Union.
Ferrari, A. (2013). DIGICOMP: A framework for developing and understanding digital competence in Europe.
Luxembourg: JRC Scientific and Policy Reports EUR26036EN.
Fraillon, J., Ainley, J., Schulz, W., Friedman, T., & Gebhardt, E. (2014). Preparing for Life in a Digital Age. The IEA
International Computer and Information Literacy Study International Report. Cham: Springer.
Gudmundsdottir, G. B., Hatlevik, O. E., Ottestad, G., & Wiberg, L. K. (2014). New teachers’ digital competence and
experiences of ICT in teacher education programmes in Norway. 17th UNESCO-APEID International
Conference, Bangkok, Thailand.
Gui, M., & Argentin, G. (2011). Digital skills of internet natives: different forms of internet literacy in a random sample
of northern Italian high school students. New Media & Society, 13(6), 963–980.
Hammond, M., Reynolds, L., & Ingram, J. (2011). How and why do student teachers use ICT? Journal of Computer
Assisted Learning, 27, 191–203.
Hargittai, E., & Shafer, S. (2006). Differences in actual and perceived online skills: The role of gender. Social Science
Quarterly, 87(2), 432–448. doi:10.1111/j.1540-6237.2006.00389.x
Hatlevik, O. E. (2012). Analyzing factors influencing students’ productive use of computers: A structural equation
model. The International Journal of Technology, Knowledge and Society, 7(4), 11–28. Retrieved from http://ijt.
cgpublisher.com/product/pub.42/prod.787
Hatlevik O. E., & Christophersen, K.-A. (2013). Digital competence at the beginning of upper secondary school:
Identifying factors explaining digital inclusion. Computers & Education, 63, 240–247.

Hatlevik, O. E., Gudmundsdottir, G. B., & Loi, M. (2015). Digital diversity among upper secondary students: A multi-
level analysis of the relationship between cultural capital, self-efficacy, strategic use of information and digital com-
petence. Computers and Education, 81, 345–353.
Katz, I. R., & Macklin, A. S. (2007). Information and communication technology (ICT) literacy: Integration and assess-
ment in higher education. Princeton, NJ: Educational Testing Service. Retrieved February 29th 2012 from http://
www.ets.org/iskills/about/research/
Klassen, R. M., & Chiu, M. M. (2010). Effects on teachers' self-efficacy and job satisfaction: Teacher gender, years of experience, and job stress. Journal of Educational Psychology, 101(3), 741–756.
Kline, R. B. (2010). Principles and practice of structural equation modeling (3rd ed.). New York, NY: Guilford Press.
Krumsvik, R. J. (2011). Digital competence in Norwegian teacher education and schools. Högre Utbildning, 1, 39–51.
Krumsvik, R. J. (2014). Teacher educators’ digital competence. Scandinavian Journal of Educational Research, 58(3),
269–280. doi:10.1080/00313831.2012.726273.
Lau, W. W. F., & Yuen, A. H. K. (2015). Factorial invariance across gender of a perceived ICT literacy scale. Learning
and Individual Differences. http://dx.doi.org/10.1016/j.lindif.2015.06.001
Lee, M. H., & Tsai, C. C. (2010). Exploring teachers’ perceived self efficacy and technological pedagogical content
knowledge with respect to educational use of the World Wide Web. Instructional Science, 38, 1–21. doi:10.1007/s11251-008-9075-4
Norwegian Directorate for Education and Training (NDfET). (2012). Framework for basic skills. Retrieved October 10,
2014, from http://www.udir.no/PageFiles/66463/FRAMEWORK_FOR_BASIC_SKILLS.pdf?epslanguage=no
Organisation for Economic Co-operation and Development. (2014). TALIS 2013 results: An international perspective
on teaching and learning. Paris: OECD Publishing.
Polit, D. F., & Beck, C. T. (2004). Nursing research: Principles and methods (7th ed). Philadelphia, PA, USA: Lippincott
Williams & Wilkins.
Puustinen, M., & Rouet, J.-F. (2009). Learning with new technologies: Help seeking and information searching
revisited. Computers & Education, 53, 1014–1019.
Sáinz, M., & Eccles, J. (2012). Self-concept of computer and math ability: Gender implications across time and within
ICT studies. Journal of Vocational Behavior, 80, 486–499.
Schunk, D. H., Pintrich, P. R., & Meece, J. L. (2008). Motivation in education: Theory, research, and applications. Upper
Saddle River, NJ: Pearson/Merrill Prentice Hall.
Skaalvik, E. M., & Skaalvik, S. (2010). Teacher self-efficacy and teacher burnout: A study of relations. Teaching and
Teacher Education, 26, 1059–1069.
So, H. J., Choi, H., Lim, Y. J., & Xiong, Y. (2012). Little experience with ICT: Are they really the Net generation student-
teachers? Computers and Education, 59, 1234–1245.
Teo, T. (2014). Unpacking teachers’ acceptance of technology: Tests of measurement invariance and latent mean
differences. Computers & Education, 75, 127–135.
Tondeur, J., Hermans, R., van Braak, J., & Valcke, M. (2008). Exploring the link between teachers' educational belief profiles and different types of computer use in the classroom. Computers in Human Behavior, 24, 2541–2553.
Tømte, C., Kårstein, A., & Olsen, D. S. (2013). IKT i lærerutdanningen. På vei mot profesjonsfaglig digital kompetanse?
Oslo, Norway: NIFU.
Viel-Ruma, K., Houchins, D., Jolivette, K., & Benson, G. (2010). Efficacy beliefs of special educators: The relationships
among collective efficacy, teacher self-efficacy, and job satisfaction. Teacher Education and Special Education: The
Journal of the Teacher Education Division of the Council for Exceptional Children, 33(3), 225–233. doi:10.1177/
0888406409360129.
Weinstein, C. E., Husman, J., & Dierking, D. R. (2000). Self-regulation interventions with a focus on learning strategies.
In I. M. Boekerts, P. R. Pintrich, & M. Zeidner (Eds.), The handbook of self-regulation (pp. 728–747). London:
Academic Press.
Yang, H.-L., & Cheng, H.-H. (2009). Creative self-efficacy and its factors: An empirical study of information system
analysts and programmers. Computers in Human Behavior, 25(2), 429–438. doi:10.1016/j.chb.2008.10.005.
