9th International Conference of Education, Research and Innovation
Conference Proceedings
Seville (Spain), 14-16 November 2016
Published by
IATED Academy
iated.org
ICERI2016 Proceedings
9th International Conference of Education, Research and Innovation
November 14th-16th, 2016 — Seville, Spain
Edited by
L. Gómez Chova, A. López Martínez, I. Candel Torres
IATED Academy
ISBN: 978-84-617-5895-1
ISSN: 2340-1095
Depósito Legal: V-2569-2016
Abstract
This paper summarises the first phase of the development of an evidence-based self-assessment tool
(SAT) for the integration and effective use of digital technologies in schools across Europe, based on
the European Framework for Digitally-Competent Educational Organisations (DigCompOrg,
http://europa.eu/!dV98uF). In particular, the paper presents an analysis of existing school self-
assessment tools and how they correspond to the DigCompOrg conceptual model. A literature review
identified nine tools for further analysis, representing different approaches to the self-assessment of
the use of digital technologies in schools. The findings confirm that there has been no attempt until
now to develop a pan-European tool that is evidence-based and can add value by promoting
transparency, comparability and peer-learning across Europe both for schools and educational
policymakers. The findings reveal a number of general considerations that include a focus on
leadership and governance practices; emphasis on digital infrastructure and resources;
acknowledgment of teachers’ role and the need for capacity building; the need for integration of digital
technologies across the curriculum; and the need for cross-fertilisation and peer-learning in the
development and implementation of self-assessment tools. These considerations will guide the next
phases of the study, i.e. the design of a prototype SAT for digitally-competent schools. Such a tool can
empower policymaking within each country, and enable school communities (i.e., school leaders,
teachers and students) to periodically self-reflect on their school’s current state of development and on
future steps in realising effective digital-age learning.
Keywords: digital-age learning; digitally-competent schools; digital technologies in education;
Framework for Digitally-Competent Educational Organisations (DigCompOrg).
1 INTRODUCTION
It is widely acknowledged that learning in and for the digital age represents a formidable challenge
for policy-makers, schools, teachers, students and society in general [1-4]. There
is ample evidence of the use of digital technologies in exciting and promising ways at all levels of
education. However, to consolidate this progress, educational institutions need to regularly review
their teaching, learning and organisational practices so as to become progressively digitally-competent
at an organisational level [e.g. 5]. To do so, educational institutions need (a) a clear
understanding of what it means to be digitally competent as an organisation, and (b) tools to assist
them in self-assessing their current state of development and in formulating plans for improvement.
The European Framework for Digitally-Competent Educational Organisations (DigCompOrg) provides
a comprehensive and generic conceptual model for the effective integration of digital technologies by
educational institutions [2]. The DigCompOrg model is the result of a research study designed and
conducted by the Joint Research Centre of the European Commission for the Directorate General for
Education and Culture (DG EAC). The DigCompOrg framework identifies seven core elements and 15
sub-elements that characterise all educational organisations. There is also scope for the addition of
sector-specific elements and sub-elements (e.g., elements and sub-elements particular to higher
education). DigCompOrg expands the sub-elements to 74 descriptors that add further detail to the
elements and sub-elements of the model. Diagrammatically, the elements, sub-elements and
descriptors of DigCompOrg are presented as sectors of a circle, with an emphasis on their
interrelatedness and interdependence (see Fig. 1).
The DigCompOrg model focuses on the teaching, learning, assessment and related learning support
activities undertaken by a given educational organisation. It does not extend to core administrative
tasks that require ICT competences, as for these an organisation can draw on experience from any
sector. The ultimate aim of DigCompOrg, however, is to improve student learning through the use
of digital technologies.
The DigCompOrg framework is designed to help both educational institutions and policymakers in
planning the effective use of digital learning technologies at all levels of the educational journey:
primary and secondary schools, Vocational Education and Training (VET) centres as well as higher
education institutions such as universities, technical universities and polytechnics. Several Member
States have translated DigCompOrg into their own languages and have been using it to promote the
integration of digital technologies in their education systems¹. As its uptake expands, DigCompOrg can
facilitate a common understanding across the Member States as well as transparency and
comparability between related initiatives throughout Europe. However, DigCompOrg remains a
conceptual model, as it has not yet been piloted or implemented in real organisational settings.
It is recognised that in order to be useful in practice in particular settings, the DigCompOrg framework
requires adaptation and customisation, and work is currently being undertaken to validate a version of
the framework specifically applicable to schools (primary and secondary) and VET (initial). This paper
traces the first phase of the follow-up study ‘DigCompOrg School Pilots’, currently being conducted by
JRC and a consortium of experts from the UK, Ireland, Denmark, Spain, Italy and Estonia for DG EAC.
The study aims to (a) validate a customised version of the framework and, based on this validation
exercise, (b) develop an evidence-based organisational-level self-assessment tool (SAT) for the
innovative and effective use of digital technologies in schools across Europe. Such a SAT can enable
school communities (via the collection of data from school leaders, teachers and students) to
periodically self-reflect on their current state of development and on their plans for future steps in
realising effective digital-age learning.
2 METHODOLOGY
We used a mixed-method approach for the first phase of the DigCompOrg School Pilots project, which
involved desk research; analysis of existing tools that promote the integration of digital technologies in
schools at national/international level; and expert consultation (see Fig. 2). The next phases of this
project will focus on the following:
• A user consultation survey to validate the adaptation of the generic DigCompOrg framework for a ‘schools’ context;
• A series of expert and user consultations to underpin the design of a prototype SAT, based on the descriptors of the DigCompOrg conceptual model (as adapted for ‘schools’) and the analysis of existing tools;
• Pilot implementation of the prototype SAT in a number of schools across Europe;
• Additional (mainly qualitative) research in selected schools, as well as expert and stakeholder consultation based on the analysis of the results from the pilot implementation of the SAT;
• A consolidated version of the DigCompOrg SAT.

¹ See for instance the translations of the model in Spanish (http://bit.ly/27R4ZWx), Estonian (http://bit.ly/2cAM3WV) and Lithuanian (http://bit.ly/2d8jocI).
In this paper, we explain the process for selecting a number of existing tools that are of particular
interest and relevance to the DigCompOrg SAT design initiative. These are tools that schools across
Europe use to support self-assessment of their uptake of digital technologies, or maturity in the use of
these technologies. We also present the findings from the analysis of these tools and the key lessons
learnt. The analysis and lessons learnt will influence the next phases of the DigCompOrg School Pilots
study, in particular the development and piloting of the DigCompOrg SAT in at least four countries and
in five languages (English, Spanish, Danish, Estonian, Italian).
Of the tools in this inventory, nine were considered very relevant to the development of the SAT envisaged in the
context of DigCompOrg School Pilots. These are marked with an asterisk (see Table 1).
A further step in locating relevant tools involved consultation with the members of the Education and
Training 2020 Working Group on Digital Skills and Competences [6], an expert group comprising
representatives from the education ministries of Member States, relevant EU bodies or agencies,
education and training associations and European social partners as well as independent experts. This
consultation identified an additional four tools, not initially included in the original DigCompOrg
inventory.
Finally, an online search for more tools was conducted, covering a wide range of materials such as
technical, evaluation and policy reports; websites, wikis and blogs; journal and conference papers;
promotional literature (e.g., leaflets); and slideshow presentations.
Through this three-step approach, thirteen self-assessment tools developed and/or used at regional,
national or international level in Europe were identified. These tools are summarised in Table 1 below.
The basic criteria for the initial selection of the tools were the following:
• Verification that each tool is (or can be) used by European primary, secondary or VET schools
for the self-assessment of their practices in integrating and effectively using digital technologies
for teaching and learning;
• Verification that the development and/or implementation of each tool is recent or ongoing.
² In some cases, acronyms have been adopted by the authors of this paper as abbreviations when discussing the tools analysed; these are not necessarily acronyms created by the authors of the tools.
As can be seen in Table 1, the vast majority of the tools (9 out of 13) have a national character, one is
intended for regional use, two have been developed in the context of European projects (i.e. Ae-MoYS
and FCMM) and one has an international character (i.e. Microsoft SRT). It is worth noting that the majority
of the tools rely on public funding (regional, national or European) for their development and/or
implementation, and only one has been developed by a private company. In addition, one tool (DSoD)
has been developed through a public/private partnership. All of the tools are available free of
charge for use by schools, with the exception of NAACE SRF, which requires an annual subscription of
£50 (plus VAT).
In order to select the tools to be analysed further, we applied the following criteria:
• Verification that reliable data for the development and/or implementation of the tool is available in the English language³;
• Verification that each case reflects an approach that can provide insights for the development of
DigCompOrg SAT;
• Verification that the selected tools reflect the broadest possible variety in terms of:
o Implementation phase (pilot, scale, mainstream);
o Type of the tool (e.g., questionnaire, matrix, online, in print);
o Geographical coverage (regional, national, international);
o Users involved in providing the information (school leaders, teachers, students);
o Scope of usage (ranging from its use solely by individuals, to its use at the level of the
organisation or beyond, e.g., aggregated data used at the education system level).
Based on the criteria presented above, nine tools were finally selected for further analysis,
representing different approaches to the self-assessment of the use of digital technologies in schools
(see Table 2). Given the nature of this exercise, the analysis is limited to providing a narrative
overview of the tools analysed and does not in its own right present an empirical synthesis of their
effectiveness and impact.
³ As a result, four potentially relevant tools available in other languages (i.e., LIKA, available in Swedish; e-Škole, available in Croatian; DigiPeegel, available in Estonian; and AGITIC, available in Spanish) are not included in the analysis presented below.
Table 2 (continued). Overview of the tools analysed: purpose, benchmarking options and notes.

FCMM
Purpose: enables teachers and schools to assess the level of innovation with technology.
Benchmarking: comparability with national & international average.
Notes: part of the Future Classrooms toolkit; diagnostic report to plan for the next level of maturity; released under a Creative Commons licence.

Microsoft SRT
Purpose: change management tool for ICT integration.
Benchmarking: no.
Notes: focus on creating a vision for the use of ICT; support to manage the change process.

Opeka
Purpose: evaluation of teachers' and schools' digital competences and culture.
Benchmarking: comparison with other teachers from the same school or the same town, with teachers who teach the same subject, or with all teachers.
Notes: qualitative research is conducted to validate tool results; the questionnaire also includes questions about the quality of the tool itself; information from the tool is used to modify the Finnish ICT policy in education.

e-Learning Roadmap
Purpose: where schools are currently positioned in e-Learning and where they would like to go.
Benchmarking: no.
Notes: printed planning tool, part of a Handbook for planning and implementing eLearning; enables whole-school planning and self-evaluation.

School Mentor
Purpose: reflect on the facilitation and execution of pedagogical use of ICT.
Benchmarking: no; only the school has access to the results and can decide whether or not to give access to the school's managerial agency.
Notes: intended for school heads, but for use in collaboration with other staff.

NAACE SRF
Purpose: structured route for reviewing and improving schools' use of technology.
Benchmarking: no.
Notes: originally developed by Becta; a school reaches a certain level (with supporting evidence) and applies for the national quality accreditation ICT Mark (http://bit.ly/2d3o0iU).
Table 2 demonstrates that the selected tools follow diverse approaches for the self-assessment of the
use of digital technologies by schools. In terms of focus, the tools show a quite convergent approach:
most of them aim to guide schools to self-assess and self-reflect on their current state of development
and to support them in following a structured programme for change and improvement. Several tools
are used to create the school's vision (e.g., Microsoft SRT) and action plan (e.g., Ae-MoYS) for a
more effective uptake of digital technologies. The eLemer tool from Hungary follows an interesting
approach, asking users to identify evidence (such as lesson plans, school regulations, e-portfolios,
etc.) that is available to support the assertions made in their self-assessments.
Regarding the ownership of the data generated by the tools and its use for benchmarking, the
approaches are divergent. Some of the tools offer the opportunity for comparisons at local, national or
even international level. On the other hand, several tools intentionally do not offer this functionality and
the results are available only to the school itself. In the case of School Mentor, the school can decide
whether or not to give access to the data to the school's managerial agency, which is the local
municipality.
Some of the tools (e.g., Opeka, eLemer) use aggregated but anonymised data for informing policy
makers at local or national levels in order to influence the related policies for the integration and
effective use of digital technologies by schools.
Table 3 below presents an overview of the tools analysed in terms of their type, extensiveness
(number of items included) and the scales or maturity models they use.
Table 3. Type and length of the tools analysed.
As can be seen in Table 3, the tools can be divided into two groups. The first uses the format of a
questionnaire, providing schools with a number of statements/questions for self-assessing their use of
digital technologies for learning. The second group uses matrices with a maturity model (of 4 to 5
levels) and a number of descriptors defined against this model. Opeka from Finland also includes
statements related to the quality of the tool itself and some 10 questions to elicit background
information. Several other tools likewise ask users for background information; Microsoft SRT is the
exception, as it does not ask for such information.
In most of the tools analysed, school heads provide the data. In some cases, all teachers in a given
school are involved. Only the eLemer tool includes students. NAACE SRF offers a variety of options
for providing the data: one person (e.g., school head); the whole senior management/leadership team;
staff working in teams and providing data for each area/key element; all staff working together to
review all statements.
The key elements of the selected tools were mapped against the DigCompOrg conceptual model (see
Table 4) in order to identify commonalities in the way the different tools cluster the questionnaire
statements or the matrix items they use. It should be noted that the nine tools analysed are intended to
represent the diversity of tools in this field and not to represent those which map most
comprehensively to the framework. The key observations from Table 4 are synthesised and discussed
in the next section.
4 DISCUSSION
The desk research reveals the use of a variety of tools and approaches for the self-assessment of the
use of digital learning technologies by schools in several European countries. The tools analysed map
very well to the DigCompOrg conceptual model, which is comprehensive and holistic in nature. The
findings from the analysis of the key characteristics of existing tools have led to a number of general
observations, presented briefly below, which will inform the design process for the DigCompOrg SAT.
0822
4.1 Focus on leadership and governance practices
The role of school leadership is crucial for the innovative and effective use of digital technologies, as
leaders have to provide the enabling conditions, such as vision and strategic planning.
Table 4. Mapping the key elements of the tools against the DigCompOrg conceptual model.

DigCompOrg elements: (1) Leadership & governance practices; (2) Teaching & learning practices; (3) Professional development; (4) Assessment practices; (5) Content & curricula; (6) Collaboration & networking; (7) Infrastructure.

FCMM: (1) Organisational eMaturity; Management of Teaching, Learning & Assessment; (2) Educational Processes (Pedagogy; Learner Role); Educational Outcomes (Learning Objectives); (3) Capacity building; (4) Management of Teaching, Learning & Assessment; (5) Educational Resources (Underpinning Technology); (6) Teacher-student collaboration; (7) Tools and resources.

Microsoft SRT: (1) Leadership & a Culture of Innovation; (2) Teaching, Learning & Assessment; (3) Capacity Building; (4) Teaching, Learning & Assessment; (6) Leadership & a Culture of Innovation; Learning Environment; (7) Learning Environment.

eLEMER: (1) Management; (2) Learners & learning; Teachers & teaching; (3) Teachers & teaching; (4) Management; (7) Infrastructure.

Opeka: (1) Digital learning culture; (2) ICT-skills; (3) ICT-skills; (6) Digital learning culture; (7) Devices and software.

School Mentor: (1) Organisation; Administration & framework conditions; Mapping & planning; (2) Pedagogical practice; Digital competence; (6) In Organisation: Communication & External Communication; (7) School resources.

Ae-MoYS: (1) Leadership & Vision; School ICT culture; (2) School ICT culture; (3) Professional Development; (5) ICT in the Curriculum; (6) School ICT culture; (7) Resources & Infrastructure.

e-Learning Roadmap: (1) Leadership & planning; e-Learning Culture; (2) e-Learning Culture; (3) Professional Development; (5) ICT in the curriculum; (6) e-Learning Culture; (7) ICT infrastructure.

DSoD: (1) Leadership and Vision; School ICT culture; (2) School ICT culture; (3) Continuing Professional Development; (5) ICT integration in the curriculum; (6) School ICT culture; (7) Resources & infrastructure.

NAACE SRF: (1) Leadership & management; (2) Teaching and learning; (3) Professional development; (4) Assessment of digital capability; (5) Use of ICT in the curriculum; (7) Resources.
4.3 Acknowledgment of teachers' role and the need for capacity building
Most of the tools put emphasis on the role of teachers and the need for their continuing professional
development so that they can become confident and competent users of digital technologies for learning. Some
tools, such as School Mentor, place specific emphasis on the digital competence of students and
teachers that is required to underpin the use of digital technologies in an innovative and effective way
across the curriculum.
schools. The next step is an intensive consultation process about the development of the tool, based
on the DigCompOrg conceptual model, involving:
• More than 15 experts from the UK, Ireland, Denmark, Spain, Italy and Estonia;
• School leaders, teachers and students from more than 70 schools from Denmark, Spain, Italy
and Estonia;
• Educational stakeholders and policy makers at local, regional, national and European level.
Both the analysis of existing tools and the wide consultation process aim to provide input and insights
leading to the development of a prototype SAT, to be piloted in schools from four EU education
systems: Spain, Italy, Denmark and Estonia during 2017. The pilot implementation of the tool will
involve not only school leaders and teachers but also students, so as to obtain the most holistic view of the
innovative and effective use of digital technologies for learning in the participating schools. The
quantitative analysis of the data from the pilot implementation of the DigCompOrg SAT will be
complemented by qualitative research (e.g., focus groups and case studies) and outcomes will be
discussed with education experts, stakeholders and policy makers in order to develop and make
available the consolidated version of DigCompOrg SAT.
The primary aim of the consolidated DigCompOrg SAT, which is expected to be released by the end of
2017, is twofold. On the one hand, it aims to encourage self-reflection and self-assessment within
individual schools as they progressively deepen their engagement with digital learning and pedagogy.
On the other hand, it aims to support policy makers in designing, implementing and evaluating policy
interventions for the integration and effective use of digital learning technologies. Overall, the
DigCompOrg SAT aims to stimulate evidence-based dialogue and sharing of experiences, thus
contributing to European and Member State policy priorities to modernise schools in Europe and to
promote effective digital-age learning.
DISCLAIMER
The views expressed in this article are purely those of the authors and should not be regarded as the
official position of the European Commission.
REFERENCES
[1] OECD, "Students, Computers and Learning: Making the Connection," 2015.
[2] P. Kampylis, Y. Punie, and J. Devine, "Promoting Effective Digital-Age Learning: A European Framework for Digitally-Competent Educational Organisations," EUR 27599 EN, 2015. doi:10.2791/54070. [Online]. Available: https://ec.europa.eu/jrc/en/digcomporg
[3] European Commission, "Draft 2015 Joint Report of the Council and the Commission on the implementation of the Strategic framework for European cooperation in education and training (ET2020) - New priorities for European cooperation in education and training," SWD(2015) 161 final. [Online]. Available: https://ec.europa.eu/transparency/regdoc/rep/1/2015/EN/1-2015-408-EN-F1-1.PDF
[4] European Network of Education Councils (EUNEC), "Learning in the Digital Age: Report of the seminar of the European Network of Education Councils, Athens, 5-6 May 2014, with the support of the European Commission DG Education and Culture," Brussels: EUNEC Secretariat, 2014.
[5] European Commission, "Opening up Education: Innovative teaching and learning for all through new Technologies and Open Educational Resources," COM(2013) 654 final. [Online]. Available: http://ec.europa.eu/education/news/doc/openingcom_en.pdf
[6] European Commission, "ET 2020 Working Groups." [Online]. Available: http://europa.eu/!Xg99VX