
Learning Analytics in Higher Education – Challenges and Policies: A Review of Eight Learning Analytics Policies

Yi-Shan Tsai
The University of Edinburgh
Edinburgh, United Kingdom
+44 131 651 6243
yi-shan.tsai@ed.ac.uk

Dragan Gasevic
The University of Edinburgh
Edinburgh, United Kingdom
+44 131 651 3837
dragan.gasevic@ed.ac.uk

ABSTRACT

This paper presents the results of a review of eight policies for learning analytics of relevance to higher education, and discusses how these policies have tried to address prominent challenges in the adoption of learning analytics, as identified in the literature. The results show that more consideration needs to be given to establishing communication channels among stakeholders and to adopting pedagogy-based approaches to learning analytics. The review also reveals a shortage of guidance for developing data literacy among end-users and for evaluating the progress and impact of learning analytics. Moreover, it highlights the need to establish formalised guidelines to monitor the soundness, effectiveness, and legitimacy of learning analytics. As interest in learning analytics among higher education institutions continues to grow, this review will provide insights into policy and strategic planning for the adoption of learning analytics.

CCS Concepts

• Applied computing → Education; • Security and privacy → Human and societal aspects of security and privacy

Keywords

Learning analytics, policy, code of practice, challenge, strategy, higher education

1. INTRODUCTION

While interest in learning analytics remains high among higher education institutions, some hesitate to embrace it due to various challenges regarding data procurement, institutional capabilities and buy-in from relevant stakeholders [13]. There are many unanswered questions with respect to the effectiveness and usefulness of learning analytics for specific institutional contexts and problems, even for institutions that have taken the initiative to adopt learning analytics [10]. This pervading uncertainty calls for an investigation into the challenges faced by higher education institutions in their adoption of learning analytics, and into the policies that are meant to frame the practice of learning analytics and ensure that it is effective and appropriate.
In light of this, this paper first reviews literature about the state of learning analytics adoption and related challenges, based on 23 empirical studies of learning analytics (Section 4). This is then followed by a review of existing policies for learning analytics (Section 5) and a discussion of how these policies have addressed the identified challenges (Section 6), so as to provide insights into future policy development. In the context of this paper, a 'learning analytics policy' is understood as a set of guidelines that includes both legislative regulations and non-legislative principles for the use of learning analytics. Therefore, a code of practice for learning analytics is also considered a policy in this study.

2. BACKGROUND

Educational institutions are complex adaptive systems, which tend to be stable and resistant to change due to a range of political, social, cultural and technical norms [17]. The challenge of bringing about change in higher education institutions, where such complex and adaptive systems exist, has therefore been described as a 'wicked problem' [17]. It is suggested that persistent, dedicated and strategic efforts are required for the adoption of learning analytics [3][26], which highlights the imperative to analyse the challenges institutions face in their adoption of learning analytics, so as to develop a comprehensive policy that is visionary, pertinent to the context, aligned with the institution's mission, and addresses the challenges faced by the institution.

There have been approaches to learning analytics policy development, such as the cause-effect framework and the RAPID Outcome Mapping Approach (ROMA) [17][36]. The former is used to identify the relationships between multiple linkages in complex systems where step-by-step action is needed to bring about effective change. The latter contains seven steps: define (and redefine) policy objectives, map political context, identify key stakeholders, identify desired behaviour changes, develop an engagement strategy, analyse internal capacity to effect change, and establish monitoring and learning frameworks. The framework is meant to be used iteratively and reflectively so as to allow refinement and adaptation of goals and solutions. For example, Ferguson and others used the ROMA framework to analyse three institution-wide learning analytics cases, which gives a strategic view of the practice and suggests practical steps to consider at each phase of the ROMA cycle [13].
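To make the iterative character of ROMA concrete, the sketch below models the seven steps as a cycle in which the final (monitoring) step feeds back into the policy objectives. This is a minimal illustration of the process described above, not an implementation from the ROMA literature; the function and variable names are our own.

```python
# The seven ROMA steps, as listed by Young and Mendizabal [36].
ROMA_STEPS = [
    "define (and redefine) policy objectives",
    "map political context",
    "identify key stakeholders",
    "identify desired behaviour changes",
    "develop an engagement strategy",
    "analyse internal capacity to effect change",
    "establish monitoring and learning frameworks",
]

def refine_objectives(objectives, log):
    # Placeholder: in practice, redefining objectives is a human,
    # evidence-based judgement informed by the monitoring step.
    return objectives

def run_roma_cycle(objectives, iterations=3):
    """Walk through the seven steps repeatedly, refining objectives each pass."""
    log = []
    for i in range(iterations):
        for step in ROMA_STEPS:
            # Each step would produce evidence (stakeholder maps, engagement
            # plans, monitoring data); here we only record that it happened.
            log.append((i, step, list(objectives)))
        # The monitoring step feeds back into the objectives, which is what
        # makes the approach iterative and reflective.
        objectives = refine_objectives(objectives, log)
    return objectives, log

if __name__ == "__main__":
    goals, trace = run_roma_cycle(["improve student retention"])
    print(len(trace), "step records; final objectives:", goals)
```

The point of the loop structure is simply that objectives are revisited after every pass, rather than fixed once at the outset.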
It is of great importance that institutions adopt learning analytics under clear guidelines that are grounded in the cultural, social, economic and political contexts specific to each institution, and that are based on existing best practices for learning analytics and on learning theories. However, there has not been much analysis of existing learning analytics policies, especially not with respect to how they address the challenges that have emerged in the empirical literature on learning analytics adoption. This paper sheds light on existing practices and makes a call for more methodological, comprehensive and coordinated work by the community.

3. METHODOLOGY
In order to understand the state of learning analytics adoption, the challenges that higher education institutions face in implementing learning analytics, and how existing learning analytics policies have tried to address these challenges, we searched relevant literature using simple key words, including 'learning analytics' and 'policy or policies', in journals and databases that are well known for their collections of learning analytics and educational research. Table 1 gives an overview of the sources that we consulted. Some of the journals were included at a later stage, when we conducted a "snowball" search by following up references cited in the literature included at the initial stage. It was also at this stage that we were able to retrieve some learning analytics policies.

Table 1. Sources of the bibliographical research

Databases | SCOPUS, Wiley Online Library, ERIC, ACM, IEEE
Journals | Journal of Learning Analytics, Journal of Computer Assisted Learning, Journal of Educational Technology & Society, American Behavioural Scientist, The Information Society, Computers & Education
Proceedings | The LAK Conference
Organisational databases | LACE publications, EDUCAUSE library

After the first stage of automatic search using key words, we filtered out less relevant literature based on titles and abstracts, using our selection criteria of topics and publication types (Table 2). We were then able to conduct the 'snowball' search with the remaining literature, as mentioned earlier, and examine papers using the same filter criteria.

Table 2. Filter criteria for the bibliographical research

Included – Topics: ethics and privacy, policies, institutional strategies, institutional readiness, institutional capacities, learning analytics, academic analytics
Included – Types of publications: research reports, conference proceedings, journal articles, book chapters, policy documents, all years of publication, English language
Excluded – Topics: affordances of learning analytics models and tools, interventions at class or individual levels, approaches to analytics, studies not based in higher education institutions
Excluded – Types of publications: PowerPoint presentations, blog articles, news articles, workshops

While the topics included in the filter criteria were exclusive to the context of learning analytics, we decided to include literature that looked at "academic analytics" if two conditions were met: one of our topics of interest was covered, and the insights learnt were applicable to discussions about learning analytics. In order to get an overview of institutional adoption of learning analytics, we excluded literature that examined technological frameworks for learning analytics or small-scale interventions that were not centrally supported by institutions. We also excluded literature that explored methods for data analysis, as well as studies that were not carried out in the higher education context.
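In effect, this screening applies a predicate to each retrieved record. The sketch below shows one way such criteria could be expressed in code; the Record fields and the abbreviated criteria sets are our own illustrative choices based on Table 2, not an artefact of the actual review.

```python
from dataclasses import dataclass

# Abbreviated from Table 2; illustrative only.
INCLUDED_TOPICS = {
    "ethics and privacy", "policies", "institutional strategies",
    "institutional readiness", "institutional capacities",
    "learning analytics", "academic analytics",
}
EXCLUDED_TYPES = {"powerpoint presentation", "blog article", "news article", "workshop"}

@dataclass
class Record:
    title: str
    topics: set
    publication_type: str
    language: str = "English"
    in_higher_education: bool = True

def passes_filter(record):
    """Apply the Table 2 criteria: a topic of interest must be covered, the
    publication type must not be excluded, and the study must be in English
    and set in higher education."""
    return (bool(record.topics & INCLUDED_TOPICS)
            and record.publication_type.lower() not in EXCLUDED_TYPES
            and record.language == "English"
            and record.in_higher_education)

hits = [
    Record("Readiness survey", {"institutional readiness"}, "journal article"),
    Record("Tool demo slides", {"learning analytics"}, "PowerPoint presentation"),
]
kept = [r for r in hits if passes_filter(r)]  # keeps only the journal article
```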
During the snowball phase of our bibliographical research, we identified five existing policies for learning analytics: Jisc's "Code of Practice" [24], LACE's "DELICATE checklist" [11], LEA's Box's "Privacy and Data Protection Policy" [29], the UK National Union of Students' (NUS) "Learning Analytics: A Guide for Students' Unions" [18], and the Open University's "Policy on Ethical use of Student Data for Learning Analytics" [31]. We then used the Google search engine to look for other learning analytics policies, using the key words "learning analytics AND policies", and manually examined individual cases that had been studied in the literature included in the second phase. We found three more institutional policies with these methods: Charles Sturt University's "CSU Learning Analytics Code of Practice" [9], Nottingham Trent University's "Use of Learning Analytics to Support Student Success Policy" [21], and the University of Sydney's "Principles for the Use of University-held Student Personal Information for Learning Analytics at the University of Sydney" [25].

The bibliographical research finished in July 2016 and rendered 71 pieces of literature, of which 25 were empirical studies, 38 were desk studies, and eight were policies for learning analytics. We conducted a systematic literature review of 23 empirical studies, after filtering out two further studies based on their degree of relevance to our research interest. The findings of the empirical studies are presented in Section 4, while the review of the existing policies is presented in Section 5.

4. RESULTS – STATE AND CHALLENGES IN LEARNING ANALYTICS ADOPTION

This review identified six prominent challenges among higher education institutions in terms of the adoption of learning analytics. We first present the findings related to the state of learning analytics adoption (Section 4.1) and then the challenges identified in the empirical literature (Section 4.2).

4.1 The State of Learning Analytics Adoption

A review of the state of learning analytics shows that interest is high among higher education institutions, but adoption remains immature. In the US higher education sector, learning analytics has been used to eliminate impediments to retention and student success, and to create personalised learning environments [6]. However, there is greater interest in monitoring or measuring student progress than in predicting learning success or prescribing intervention strategies [35]. Moreover, learning analytics remains an interest rather than a major priority at most institutions [5].

A similar phenomenon was observed in the Australian higher education sector. A study conducted by Colvin and others revealed that only 2 of the 32 institutions under study had reached the advanced stage – having evidence of implementation of multiple interventions or initiatives informed by data [10]. The rest of the cases were either at the preparatory stage of learning analytics or at an early stage of implementation.

The adoption of learning analytics in the UK higher education sector is also in its infancy. A survey (N=53) conducted by the Heads of e-Learning Forum – which includes heads of e-learning from 130 UK-based universities – among its members discovered that 25 respondents had not implemented learning analytics at all, 18 were working towards implementation, 9 had partially implemented it, and only 1 had fully implemented learning analytics across their institution [19]. Moreover, another study, which took a qualitative approach to investigate the adoption of learning analytics in 12 UK institutions, discovered that few interviewees were willing to claim significant outcomes from their learning analytics activities to date, due to the nascent stage of learning analytics technologies and practice [23].
4.2 Challenges in the Adoption of Learning Analytics

In addition to technical challenges in data and system integration [5, 34], several studies have identified challenges related to strategic planning and policy. Six primary challenges have been identified:

- Challenge 1: There is a shortage of leadership capabilities to ensure that the implementation of learning analytics is strategically planned and monitored.
- Challenge 2: There are infrequent institutional examples of equal engagement with different stakeholders at various levels.
- Challenge 3: There is a shortage of pedagogy-based approaches to removing learning barriers that have been identified by analytics.
- Challenge 4: There are insufficient training opportunities to equip end users with the ability to employ learning analytics.
- Challenge 5: There are a limited number of studies empirically validating the impact of analytics-triggered interventions.
- Challenge 6: There is limited availability of policies that are tailored to learning analytics-specific practice, addressing issues of privacy and ethics as well as the challenges identified above.
4.2.1 Challenge 1 – Shortage of Leadership

Studies found that the maturity of learning analytics is closely related to the presence of leadership. Yanosky and Arroway identified a lack of advanced analytics-based projections and of proactive responses to analytics results among US higher education institutions, and claimed that this pattern had not changed since 2012 [35][8]. They attributed one of the causes to the shortage of dedicated leadership, in that leadership was associated with higher-level analytics maturity. The lack of support from key leadership in learning analytics activities among higher education institutions in the USA was again identified in a recent report published by EDUCAUSE [5].

In the Australian study, Colvin and others observed different degrees of leadership commitment in two clusters of institutions. One cluster perceived learning analytics as a vehicle or tool for measurement or efficiency gains. The other cluster perceived learning analytics as a means to aid reflection on the connection between retention and antecedent teaching, learning and student experience factors [10]. The first cluster tended to lack reflection on the relationship between implemented strategies and the development of organisational capacity for learning analytics, whereas the second cluster exhibited a higher degree of readiness factors, more mature strategy development, and more involvement of senior leadership. Although this study does not suggest that there is a lack of leadership in the adoption of learning analytics among Australian higher education institutions, Colvin and others stress that a strategic vision that responds to the needs of an organisation is critical for long-term impact. Furthermore, they argue that the development of strategic institutional capability is a key prerequisite for the growth of implementation capability for learning analytics and for overall institutional uptake by staff members. The latter has been identified as one of the stoppers of learning analytics, in that resistance to change is pervasive in higher education institutions [5, 17]. Therefore, the involvement of leadership, and visionary implementation that considers the interests of various stakeholders, is imperative to the development of analytics.
4.2.2 Challenge 2 – Shortage of Equal Engagement

Learning analytics distinguishes itself from academic analytics by its learner-centred and learner-concerned nature. However, few studies have tried to explore students' opinions regarding the use of their data for learning analytics or its impact on their learning journeys. Among these studies, Slade and Prinsloo looked at 300 posts from the University Students' Consultative Forum to understand students' views regarding the use of their data for learning analytics [28]. Similarly, two publications by EDUCAUSE present both positive and negative views of students at Purdue University regarding the impact of a learning analytics project – Course Signals – on their learning [4, 27]. Unfortunately, neither of these publications offers a clear description of the methods adopted to collect and analyse student opinions. Another study, by Drachsler and Greller, made an attempt to engage students in exploring the current level of understanding of, and expectations towards, learning analytics among relevant stakeholders [11]. However, survey returns from students were few, possibly due to uneven distribution among the stakeholders and exclusive dissemination channels. This study flags the top-down bias that exists in current research on learning analytics and the need to engage students strategically.

In addition to proactive engagement with students, sound communication between other stakeholders, to establish a common understanding of learning analytics and institutional readiness, still needs facilitation. Oster and others investigated factors that influenced institutions' evaluations of their readiness to implement learning analytics [22]. Their study discovered that information officers tended to give higher ratings to an institution's readiness than other roles within the institution. Their explanation is that technology professionals have daily interaction with institutional data in various ways, from collection to management and reporting, and hence their familiarity and comfort with data may have led to discrepancies in the view of institutional readiness when compared with other stakeholders in their institutions. As a result, Oster and colleagues suggest that bridging perceptions among stakeholders within an institution is critical to ensuring a cohesive and collaborative implementation. Similarly, the Heads of e-Learning Forum (HeLF) conducted a survey on the adoption of learning analytics in UK higher education, and 41 of the 53 respondents (heads of e-learning) indicated that there was limited understanding of the possible benefits and outcomes of learning analytics across their institutions, while 4 suggested 'no understanding at all' [19]. In addition, this question received 13 free-text comments, among which all but one referred to different levels of awareness and understanding within and across departments; in particular, teams within technical areas had the greatest understanding. The aforementioned studies reveal a gap in understanding between different stakeholders within an institution, which could potentially become an obstacle to the institutional embrace of learning analytics.

4.2.3 Challenge 3 – Shortage of Pedagogy-based Approaches

Although learning analytics is claimed to have great potential to reform the way people learn and the way teaching is delivered, pedagogical approaches are not always considered as part of the strategy for learning analytics. For example, Macfadyen and Dawson investigated the extent to which an institution made informed decisions based on analytics results [16]. They found that the institution was prone to addressing technical challenges while the development of pedagogical plans was ignored. Another study, exploring what influenced educators' beliefs concerning the adoption of learning analytics tools, identified the weakness of such tools in moving from spotting student weaknesses and risk levels to providing pedagogically informed suggestions [1]. Despite the fact that the overall perception of the usefulness of learning analytics was positive among the participants, the only variable found to have a significant relation with the intention to adopt learning analytics tools was the educator encountering learning content that needed improvement. Similarly, Dyckhoff found that teachers were particularly concerned with how teaching tools correlate with learning behaviours and outcomes [12]. She suggested that the design of a prototype for an exploratory learning analytics tool must differentiate the learning offerings that the tool can afford. The abovementioned studies highlight the importance of considering pedagogical requirements and solutions in the design of learning analytics tools and projects.
4.2.4 Challenge 4 – Shortage of Sufficient Training

A shortage of skilled people has been identified as one of the elements in the gaps between needs and solutions in the adoption of learning analytics [20]. For example, a pilot survey that examined readiness for learning analytics among nine institutions in the USA found that one of the respondents'¹ greatest concerns was the lack of analytics ability among staff [3]. The skill shortage makes it hard to move learning analytics towards an institution-wide scale. Goldstein and Katz explored the characteristics of institutions that claimed to have achieved successful outcomes from the use of analytics systems, and discovered that the effectiveness of an institution's training programme and the presence of staff skilled at analytics were key to success [14]. Wasson and Hansen advocate that relevant training opportunities should be offered to all relevant stakeholders to improve their understanding of learning analytics and to equip them with the skills to operate the tools and interpret data [32]. This can potentially bridge the gap in understanding and capabilities of learning analytics among different stakeholders, as identified in Challenge 2.

¹ Respondents were involved in work related to learning analytics, data analysis, and research related to educational technologies.
4.2.5 Challenge 5 – Shortage of Studies Empirically Validating the Impact

Establishing successful cases has been identified as a requirement to persuade senior staff who can allocate budgets to support learning analytics [19]. However, evaluating the success of learning analytics, or demonstrating advanced employment of it, has proven challenging. The reason for this is attributed to the fact that the majority of higher education institutions that have taken initiatives to adopt learning analytics are still at a preparatory or early stage [23][10]. Moreover, the lag time required to measure the effects of analytics-triggered interventions has made access to learning analytics outcomes even harder [5]. So far, success claimed for learning analytics has mainly been based on data collected during a short period of time. For example, a study that collected data from four institutions during the 2012 spring semester found a significant difference in mean course grades between groups of students who had received interventions and those who had not [7]. While the data set is significantly large and useful for obtaining preliminary insights, the study is not able to demonstrate long-term impacts or sustained effects brought about by analytics-triggered interventions.

4.2.6 Challenge 6 – Shortage of Learning Analytics-Specific Policies

While institutions generally have regulations regarding the use of data, the ambiguous and divergent views towards ethical issues across countries have created much difficulty in the development of learning analytics frameworks, and hence impeded the advancement of learning analytics [15]. In light of this challenge, Drachsler and Greller created an eight-point checklist named DELICATE, in order to facilitate a trusted implementation of learning analytics [11]. However, a survey conducted by the aforementioned Heads of e-Learning Forum in the United Kingdom revealed that principles around the ethical use of data for learning analytics, as well as codes of practice, were mainly under consideration, with very few institutions having addressed these issues to date: 13 of the 53 UK-based institutions that responded to the survey claimed to have considered principles and best practices around the ethical use of data, and 5 of the 53 had adopted a code of practice [19]. The results of this UK report show that the current practice of learning analytics at higher education institutions lacks clear guidance designed for learning analytics-specific practice.
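For reference, the eight DELICATE points, as published by Drachsler and Greller [11] and paraphrased here, can be captured as a simple checklist structure; the auditing function below is our own illustration, not part of the checklist itself.

```python
# The eight DELICATE points (paraphrased from Drachsler and Greller [11]);
# the audit logic around them is our own illustrative sketch.
DELICATE = {
    "Determination": "decide on the purpose of learning analytics",
    "Explain": "be open about intentions and objectives",
    "Legitimate": "operate within legal frameworks",
    "Involve": "involve all stakeholders and the data subjects",
    "Consent": "make a contract with the data subjects",
    "Anonymise": "make the individual not retrievable",
    "Technical aspects": "establish procedures to guarantee privacy",
    "External partners": "ensure external providers also comply",
}

def audit_policy(points_addressed):
    """Return the DELICATE points a draft policy has not yet addressed."""
    return [f"{name}: {text}" for name, text in DELICATE.items()
            if name not in points_addressed]

# A hypothetical draft policy covering consent and anonymity only:
for gap in audit_policy({"Consent", "Anonymise"}):
    print("still to address –", gap)
```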
Another key element in learning analytics policies is strategy, which allows leaders to purposefully, tactically, and continuously move projects towards their goals [2, 20]. Macfadyen and Dawson advocate 'visionary data analysis'; that is, presenting data in a logical way that highlights progress and room for growth against a backdrop of institutional targets [16]. They believe that learning analytics data needs to be presented with consideration of the socio-technical sphere in order to motivate organisational adoption and cultural change. The idea that learning analytics needs to be implemented under a strategic vision that responds to the needs of an organisation concurs with the arguments made by Colvin and others, as mentioned earlier [10]. Further, Ferguson and others promote the RAPID Outcome Mapping Approach (ROMA) as a framework for learning analytics to achieve learning and teaching goals [13]. It is recommended that a policy that ensures the practice of learning analytics is legal, ethical and strategic should be installed in every higher education institution.

The six challenges identified in the literature highlight the fact that learning analytics needs to be implemented with consideration of multiple dimensions: institutional contexts, stakeholders at various levels, pedagogical applications, institutional capacities, success evaluation, legal and ethical considerations, and a strategy that aligns with the institutional mission. Thus, it is imperative that higher education institutions develop learning analytics-specific policies, or update existing policies to meet the requirements of learning analytics, and make them relevant to the institutional contexts and all stakeholders therein.

5. RESULTS – EXISTING LEARNING ANALYTICS POLICIES

The owners of the eight policies fall into two groups:

1. Support organisations and research consortiums:

a. Jisc (a non-profit organisation that supports digital services and solutions in the UK higher, further education and skills sectors).
b. LACE (Learning Analytics Community Exchange, an EU-funded project which aims to integrate communities working on learning analytics and educational data mining from schools, the workplace and universities).
c. LEA's Box (an EU-funded project which aims to create a learning analytics toolbox to enable goal-oriented and proactive educational assessment that will provide formative support to learners).
d. National Union of Students (NUS), UK (a confederation of 600 students' unions from more than 95 per cent of all higher and further education unions in the UK, committed to promoting, defending and extending student rights).

2. Higher education institutions:

a. Nottingham Trent University (NTU), UK
b. The Open University (OU), UK
c. Charles Sturt University (CSU), Australia
d. The University of Sydney (USyd), Australia

In the following sections, we present these organisations' expectations of learning analytics and the features of their policies. We then summarise key topics that have been covered by one or more policies, including strategy, legal/organisational obligations, privacy protection, and data management and governance (Table 3).

Table 3. Aspects of review

Dimensions | Aspects
Strategy | Goal setting, methods, evaluation of impact, assurance of validity, communication and support, and user roles
Obligations | Legal and organisational obligations
Privacy protection | Data anonymity, informed consent, and opt-out options
Data management and governance | Data handling process and access to data

Whenever applicable, the results are summarised based on points that the policies have in common, in addition to tables that present points distinct to individual policies.

5.1 Expectations

While all eight owners of the policies acknowledge that the use of learning analytics must serve the purpose of enhancing learning, six of them explicitly state their expectations of learning analytics. These are summarised into four goals:

- Providing timely intervention
- Providing personalised learning
- Strengthening student-teacher relationships
- Developing a data-informed culture

5.2 Features of the Policies

While privacy protection and the ethical use of data are of primary concern to all, each of the eight policies is distinct in some features, which have shaped the emphases of these policies (Table 4).

Table 4. Features of the eight policies

Jisc | The code of practice covers issues around data, responsibilities, interventions and adverse impacts of learning analytics. There is a particular emphasis on the effectiveness of learning analytics, which is reflected in the detailed suggestions for methods.
LACE | The policy is presented succinctly, with eight principles that focus on dealing with issues of privacy and ethics. It is known as the DELICATE checklist.
LEA's Box | The document is meant to balance individual privacy and beneficial uses of data when developing technological projects. It is part of a 44-page-long report, which includes an overview of literature on ethical issues, privacy and data protection, in addition to privacy and data protection regulations.
NUS | The policy is meant to state NUS' beliefs about proper practices of learning analytics and where it stands in defending students' rights when issues arise. NUS adopts Jisc's Code of Practice for legal and ethical regulations.
NTU | The policy is developed specifically for the use of the student dashboard at NTU, and focuses on the methodologies that are adopted.
OU | The policy details the scope for, and oversight of, the ethical use of data, in addition to the policy statement (eight principles). The policy statement is further developed into a page-long version for easy communication² with end users.
CSU | The policy contains seven principles grounded in relevant literature and a table of commitments that explains what these principles should look like in practice.
USyd | The policy contains eight principles that succinctly summarise the purpose, process, obligations, responsibilities, rights of students, and implications of learning analytics for students.

² The eight principles comprise a four-page-long publicity document, called "Using Information to Support Student Learning".
5.3 Strategy

Strategy identified in the eight policies comprises six components: goal setting, methods, evaluation of impact, validity assurance, communication and support, and user roles.

5.3.1 Goal Setting

With the exception of LEA's Box, all of the policies investigated explicitly define goals for learning analytics. Jisc and LACE have stressed the need to set up goals but have not provided specifications for them, perhaps due to the diversity of the institutions in their partnerships. While the rest of the policies all suggest that enhancing learning and teaching should be the ultimate goal of learning analytics, some provide additional information as to what learning analytics should and should not do (Table 5).

Table 5. The 'should' and 'should not' of learning analytics goals

NUS | Learning analytics must support the student-teacher partnership, which is at the heart of education.
NTU | Student dashboards should enable rather than replace dialogues with students. Learning analytics is not to be used for the purposes of assessment.
CSU | Learning analytics should enable "personalised management" of the relationship between the university and its students and employees. Moreover, it should provide input into decision-making for all university staff.
OU | Learning analytics should inform institutional strategies to improve student retention and progression (macro level), and drive interventions and develop personalised learning paths (micro level).

Both NUS and NTU show concern about the student-teacher relationship, while CSU and OU highlight the goal of using data to inform decisions.

It is worth mentioning that CSU makes it clear that data generated by learning and teaching systems "will not be used as an official record of the University, and do not, in themselves, create an obligation to act" [9]. That is to say, the university holds the right to make the final decision as to the extent to which it will act in response to analytics results. By contrast, OU states, "Where data indicates that there is potential for action to be taken which might better support students in achieving their study goals or in reaching their potential, the University has a responsibility to act on this" [31].
5.3.2 Methods

All eight policies make suggestions on the methods that should be used to approach data, while some also consider interventions, resources and engagement with students. Table 6 summarises the suggested methods.

Table 6. A summary of suggested methods

The types of data that will be collected for learning analytics need to be stated. | All but NTU
The ways student data will be collected and used should be clearly explained. | All
The approaches to analysis and interventions should be explained. | Jisc
The circumstances for interventions should be specified. | Jisc, NTU, OU and CSU
Resources that will be allocated for learning analytics should be specified, with consideration of students with different requirements. | Jisc
The use of learning analytics should respect the differences between individual students. | OU and CSU
Bias or other adverse impacts may occur as a result of analytics, and such unintended results should be prevented by all means. | Jisc, OU and CSU
Students should be engaged actively in the adoption of learning analytics. | NTU, OU and CSU

The NTU policy encourages students to use student dashboards as a tool to reflect on their engagement with their studies. OU states that students should be engaged as active agents in the implementation, so that students take responsibility for learning and the university can provide more accurate interpretations of data and tailored interventions for individual students. Similarly, CSU suggests that students should be encouraged to be active "managers" of their own learning through the use of analytics, and that interventions should be made under professional, sensitive and fair judgements while promoting student-centred practices.

5.3.3 Evaluation of Impact

There is generally a lack of plans for evaluating the impact of learning analytics in most policies. The USyd policy is the only one that mentions plans to review the practice of learning analytics regularly, to examine its relevance to the goal of enhancing learning experiences and outcomes. NUS states that it will continue to review the effects of learning analytics on policy issues and on the public accountabilities of higher education institutions. Specifically, NUS objects to any possibility of the "datafication" of student behaviours and a "dictatorship of data" in education.

5.3.4 Assurance of Validity

Many of the policies have made suggestions to enhance the validity of learning analytics in terms of data quality and comprehensiveness (Table 7).

Table 7. Considerations of data validity

Learning analytics cannot capture or present a complete picture of a learning process. | All but LACE and USyd
Limitations of analytics must be revealed, and inaccuracies should be disclosed and minimised. | Jisc and LEA's Box
Learning analytics tools should allow the recording of date stamps for new inferences or information, so as to keep data updated and accurate. | LEA's Box

5.3.5 Communication and Support

Communication and support are essential to the smooth implementation of any project within an organisation. While Jisc, LACE, and LEA's Box have not dealt with these aspects, NUS and the four universities have attended to them to different degrees (Table 8).

Table 8. Elements of communication and support

Students will receive support to resolve disputes encountered in their learning environments, and NUS members (officers and staff at higher education institutions) will receive assistance to engage with their institutions on learning analytics issues and defend students' rights. | NUS
The purpose, boundaries and methods of the student dashboard, and the expectation that students reflect on their own learning process using the dashboard, should be communicated when students start university (e.g., at induction or an early tutorial). | NTU
Learning analytics users need to be informed about how the results of learning analytics may affect them, and what their rights and obligations are. | OU and CSU
Staff and students will be provided with a set of guidance notes to engage them with learning analytics, and the university will provide training to develop the required skills across the institution. | OU
Students would be notified of any privacy breaches related to their information and would be informed of their rights as to how to make a formal complaint. | USyd

Although each of the four universities states that communication about the adoption of learning analytics needs to be in place, none of them mentions any two-way communication channels for different stakeholders to share ideas and experience and to work together on learning analytics.
5.3.6 User Roles

In terms of the roles that users play in the implementation of learning analytics, Jisc, NTU, OU and CSU all emphasise that students should be treated as active agents, with certain degrees of responsibility to manage their data, make decisions related to their learning, and engage with their studies. In addition, Jisc suggests that higher education institutions should specify the obligations for students and staff to act on analytics, and that staff should have a sound working knowledge of the legal and ethical practice of learning analytics.

5.4 Legal/Organisational Obligations

All of the policies were developed under the framework of national or international policies for data protection (Table 9).

Table 9. National and international data protection policies

Jisc, NTU, OU, NUS | UK | Data Protection Act 1998 (NUS adopts Jisc's Code of Practice)
LACE | Europe | EU Directive 95/46/EC
LEA's Box | Europe | Act No. 101/2000 Coll. (Czech Republic); DSG 2000 (Austria); Data Protection Act 1998 (UK); 1995 Data Protection Directive (Directive 95/46/EC) (EU)
CSU, USyd | Australia | NSW Privacy and Personal Information Protection Act 1998 (PPIPA); National Statement on Ethical Conduct in Human Research (NSECHR); NSW Health Records and Information Privacy Act 2002 (USyd only)

In addition to the aforementioned laws, all four universities consulted relevant policies existing in their institutions. Table 10 summarises those that were listed in the four policies.

Table 10. Institutional data protection policies

NTU | The University's data protection policy
OU | The University's Teaching and Learning Policy
CSU | CSU Privacy Management Plan and 19 other policies held in the CSU Policy Library
USyd | The University's Privacy Policy 2013; Privacy Management Plan; University of Sydney Act 1989

It is also worth noting that, with the upcoming application of the General Data Protection Regulation (GDPR) in 2018 [30], higher education institutions in Europe and the UK are likely to have to update their existing policies soon.

5.5 Privacy Protection

Considerations of privacy issues deal with the identification of data, informed consent, and options to opt out of data collection.

5.5.1 Data Anonymity

Table 11 summarises the considerations of data anonymity in the eight policies.

Table 11. Considerations of data anonymity

Any data collected for or generated by learning analytics must remain anonymous. | All but USyd
Data must remain anonymous when it is transferred between multiple sources and aggregated. | Jisc, LACE, and LEA's Box
Whenever third parties are involved, special attention needs to be paid to data anonymity, and any data sharing needs to comply with the privacy requirements in the current policy as well as institutional and national/international policies. | LACE, LEA's Box, OU, and CSU

Although the USyd policy does not explicitly mention data anonymity in its short list of principles, it points out that students will be appropriately notified about how their data will be disclosed, and that the university will adopt appropriate safeguards to protect the security and integrity of university-held student information.

5.5.2 Informed Consent

All of the policies state that consent must be obtained before data is collected from students, and three provide additional information about the contexts of consent seeking (Table 12).

Table 12. Contexts of consent seeking

Informed consent will only be sought at student enrolment. | NTU
Personal interventions also require consent from students. | Jisc
Whenever HTTP cookies are used in a learning analytics system, user consent should be obtained. | LEA's Box
5.5.3 Opt-out Options

Table 13 summarises the considerations of opportunities to opt out of data collection for learning analytics in the eight policies.

Table 13. Opt-out options

Users should be given the option to opt out of the data pool or collection process. | All but NTU and USyd
Students can amend consent agreements on a periodical basis. | OU and CSU
Users should be able to opt out without any consequences. | LACE
Any potential adverse consequences of opting out of data collection must be clearly explained. | Jisc
Learning analytics mechanisms must allow a specific user's data to be withdrawn at any time. | LEA's Box

While the USyd policy makes no mention of opt-out options, NTU clearly states that such an option is not available, and its justification reads:

As part of their enrolment conditions, students give permission to the University to use and process data. It would not be possible to deliver courses or manage support for students without this data. It is therefore not possible for students to opt out from having their data in the Dashboard or other University core information systems [21].

By contrast, OU and CSU offer the option to amend consent agreements on a periodical basis. However, it is not clear whether students are given the opportunity to amend their consent to existing data collection or to give consent to the collection of a new set of data. Unlike the universities, which are more concerned about data completeness, the research consortiums offer more generic statements regarding data opt-out.
regarding data opt-out. der the circumstance that useful information in helping NTU
students can be provided.
5.6 Data Management and Governance Student data must not be sold or shared with commer-
A parallel issue to privacy concerns is data management and gov- NUS
cial third parties.
ernance. The current review deals with the following aspects: the
data-handling process and access to data.
6. DISCUSSION
5.6.1 Data Handling Process In response to the six challenges identified in the literature, this
Each of the eight policies deals with data handling to various de- section will examine whether and how the eight existing policies
grees of detail, but all policies indicate that the process for han- address these and provide a bigger picture of these challenges in
dling student (and staff) data must be transparent, with clear ex- the wider environment of higher education. Certain discussions
planation on ways such data will be used. Additional information will focus on the four higher education institutions alone.
about data handling is provided in Table 14. 6.1 Leadership Involvement and Learning
Table 14. Considerations of data transparency Analytics Specific Policies
The fact that the four universities have developed institutional
Considerations of data transparency Policies policies for learning analytics suggests that there is some degree
LACE, LEA’s of senior management involvement and support from the institu-
The methods used to collect data have to be tions. Moreover, the governing bodies of these policies will be
Box, OU, CSU
disclosed to the subjects of the data collection. responsible for ensuring that the practice of learning analytics
and USyd
complies with the guidelines for data management and with the
The information about how data will be stored LACE and goals to enhance learning and teaching. However, the fact that
needs to be provided. CSU only four policies from higher education institutions were re-
trieved indicates some possible scenarios. First, most institutions
Users need to be notified about where their
either implement learning analytics without formalised guidelines
data has travelled in any integration process
LEA’s Box or with guidelines that are not originally developed specifically to
between multiple entities and informed about
meet requirements of learning analytics or specific institutional
any changes made to the analytics process.
contexts. Second, some institutions are at a nascent stage of adopt-
5.6.2 Access to Data ing learning analytics, and they allow this pilot stage to be part of
All of the policies state that users should be given the right to the process for policy development. Third, the adoption of learn-
access their own data that is held on institutional systems, and ing analytics in some institutions is a grassroots movement, and it
some (LACE, LEA’s Box, OU, and USyd) explicitly indicate that has not gained support from senior management yet.
users should be able to manage and update such data. Table 15 6.2 Communications between Stakeholders
summarises considerations of data access in various contexts. The Thoughts that have been put into communications between rele-
first four items focus on students’ access, while the remaining vant stakeholders in the eight policies are limited to practical as-
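The access provisions above amount to role-based rules: students see their own data, staff see an authorised range, and third-party access must be explicitly defined. The sketch below illustrates such an authorisation check; the roles and rules are our own hypothetical reading of Table 15, not a specification from any of the policies.

```python
# Hypothetical role-based access rules loosely derived from Table 15.
ACCESS_RULES = {
    "student": {"own_records"},                       # access to one's own data
    "teacher": {"own_records", "course_analytics"},   # an authorised range (cf. LEA's Box, NTU)
    "third_party": set(),                             # nothing unless explicitly defined (cf. Jisc)
}

def can_access(role, resource, owner, requester):
    """Grant access only within the role's authorised range; personal
    records additionally require that the requester is the owner."""
    allowed = ACCESS_RULES.get(role, set())
    if resource == "own_records":
        return resource in allowed and owner == requester
    return resource in allowed

assert can_access("student", "own_records", owner="s1", requester="s1")
assert not can_access("student", "own_records", owner="s1", requester="s2")
assert not can_access("third_party", "course_analytics", owner="s1", requester="vendor")
```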
6. DISCUSSION

In response to the six challenges identified in the literature, this section examines whether and how the eight existing policies address them, and places them in the bigger picture of the wider higher education environment. Certain discussions focus on the four higher education institutions alone.

6.1 Leadership Involvement and Learning Analytics Specific Policies

The fact that the four universities have developed institutional policies for learning analytics suggests that there is some degree of senior management involvement and support within these institutions. Moreover, the governing bodies of these policies will be responsible for ensuring that the practice of learning analytics complies with the guidelines for data management and with the goals of enhancing learning and teaching. However, the fact that only four policies from higher education institutions were retrieved points to some possible scenarios. First, most institutions either implement learning analytics without formalised guidelines or with guidelines that were not originally developed specifically to meet the requirements of learning analytics or of specific institutional contexts. Second, some institutions are at a nascent stage of adopting learning analytics, and they allow this pilot stage to be part of the process of policy development. Third, the adoption of learning analytics in some institutions is a grassroots movement, and it has not yet gained support from senior management.

6.2 Communications between Stakeholders

The thought that has been put into communication between relevant stakeholders in the eight policies is limited to practical aspects of what, how and why data is collected. It is noticeable that this communication strategy is predominantly top-down. None of the policies proposes dynamic channels to allow two-way communication among stakeholders at different levels. Although the USyd policy states that students will be informed of their right to make formal complaints, this is limited to the communication of negative opinions. The exchange of positive and constructive ideas should also be valued, as this can lead to an understanding of best practices.

6.3 Pedagogy-based Approaches

None of the eight policies makes an attempt to suggest any pedagogy-based approach that i) teaching staff may take in response to learning analytics results, ii) technology developers should consider when designing and developing learning analytics tools, or iii) institutions should consider when acquiring external learning analytics tools. Although teaching staff are expected to make decisions based on analytics results, it is not clear whether they have been involved in the selection of the best parameters to evaluate student progress or engagement; nor is it clear what alternatives teachers have when analytics results do not provide actionable insights into teaching design.

6.4 Skills for Learning Analytics

While institutions normally have general instructions on the use of learning analytics systems, NTU states that engagement with students on learning analytics needs to happen as early as when they start their studies at the university. As discussed earlier, Jisc, NTU, OU and CSU expect students to engage with learning analytics actively by managing their data and taking action based on analytics results. The emphasis on putting learning analytics in the hands of students highlights the need to develop 'data literacy' – the skill to accurately interpret and critique presented analyses of data [33]. However, among all the universities, only OU has promised to develop the skills required for learning analytics across the organisation. It is unclear, though, what kinds of "skills" are considered necessary, and further discussion with input from learning analytics experts and system designers is required to increase institutional capacity for learning analytics.

6.5 Evidence of Effectiveness

Of the eight policies, only the USyd policy mentions the need to validate learning analytics outcomes against desired goals. Moreover, it is clear that all the policies have neglected the importance of evaluating interventions introduced to learning or teaching design. The literature suggests that not many mature cases are available for evaluations of long-term impact, due to the nascent stage of learning analytics [23][10][5][7]. This is reflected in the policies under current review.

Evaluation should be included as an integral part of a learning analytics policy regardless of the maturity stage of implementation, as it offers opportunities for institutions to learn from previous experiences. Some policy development methods have emphasised this element. For example, the RAPID Outcome Mapping Approach (ROMA) includes the development of monitoring and learning frameworks as one of its seven critical phases in policy development [36][17].

Moreover, a solid process of evaluation allows institutions to build successful cases to promote learning analytics, not only within institutions but also in the wider higher education sector [35]. For example, two respondents in the study conducted by Newland and others suggested that case studies demonstrating the benefits of learning analytics would provide credible evidence for raising awareness and understanding among all stakeholders [19].

7. CONCLUSION

The review shows that the eight policies have not given enough consideration to the establishment of two-way communication channels and to pedagogical approaches. Most of the policies also lack guidance for the development of data literacy among end-users and for the evaluation of the impact and effectiveness of learning analytics. Nevertheless, the existing learning analytics policies have established some examples that are of value for reference, particularly for institutions that are planning to develop their own institutional policies for learning analytics. The review also confirms that one of the biggest challenges in the adoption of learning analytics is the lack of institutional policies developed specifically to guide the practice of learning analytics with consideration of an individual institution's own cultural, economic, political and technical context, as posited in policy development approaches such as the RAPID Outcome Mapping Approach (ROMA) [17, 36].

It should be noted that the findings and recommendations presented in this paper are by no means all-encompassing. A small subset of the retrieved policies was composed in a succinct style for efficient communication with their targeted audiences. It is not clear whether these institutions have more detailed versions of their policies that are only available to internal stakeholders. We would like to encourage all higher education institutions to publish their policies for learning analytics, in both succinct and fully comprehensive formats. The former enables rapid communication with relevant stakeholders, while the latter can provide opportunities for further engagement and keep higher education institutions accountable to their students and employees. We would also like to encourage the community to create a common (Web) space where all such policies will be indexed and shared³.

³ A short collection of existing learning analytics policies is available on the website of a European Commission-funded project – SHEILA (http://sheilaproject.eu/la-policies/).

We acknowledge that gaps exist between policy and practice. Where these policies have failed to address certain areas, it does not necessarily mean that the organisations have neglected those areas in their practice of learning analytics, and vice versa. The intention of this review is to highlight areas that policy makers can consider when updating an existing policy or developing a new one for learning analytics, so as to ensure that the implementation of learning analytics is appropriate, effective and legitimate.

8. ACKNOWLEDGMENTS

We would like to thank the European Commission for funding the SHEILA project (Ref. 562080-EPP-1-2015-1-BE-EPPKA3-PI-FORWARD). The project involved collaborative input from all the partners involved, and their contributions are highly appreciated. This document does not represent the opinion of the European Community, and the European Community is not responsible for any use that might be made of its content.

9. REFERENCES

[1] Ali, L., Asadi, M., Gašević, D., Jovanović, J. and Hatala, M. 2013. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study. Computers & Education. 62 (Mar. 2013), 130–148.
[2] Arnold, K.E., Lonn, S. and Pistilli, M.D. 2014. An Exercise in Institutional Reflection: The Learning Analytics Readiness Instrument (LARI). Proceedings of the Fourth International Conference on Learning Analytics And Knowledge (New York, NY, USA, 2014), 163–167.
[3] Arnold, K.E., Lynch, G., Huston, D., Wong, L., Jorn, L. and Olsen, C.W. 2014. Building Institutional Capacities and Competencies for Systemic Learning Analytics Initiatives. Proceedings of the Fourth International Conference on Learning Analytics And Knowledge (New York, NY, USA, 2014), 257–260.
[4] Arnold, K.E. and Pistilli, M.D. 2012. Course Signals at Purdue: Using Learning Analytics to Increase Student Success. Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (New York, NY, USA, 2012), 267–270.
[5] Arroway, P., Morgan, G., O'Keefe, M. and Yanosky, R. 2016. Learning Analytics in Higher Education. ECAR.
[6] Baer, L.L., Duin, A.H., Norris, D. and Brodnick, R. 2013. Crafting Transformative Strategies for Personalized Learning/Analytics. Proceedings of the Third International Conference on Learning Analytics and Knowledge (New York, NY, USA, 2013), 275–277.
[7] Baron, J., Whatley, M., Schilling, J. and McGovern, M. 2013. Scaling Learning Analytics across Institutions of Higher Education. EDUCAUSE.
[8] Bichsel, J. 2012. Analytics in Higher Education: Benefits, Barriers, Progress, and Recommendations. ECAR.
[9] Charles Sturt University 2015. CSU Learning Analytics Code of Practice. Charles Sturt University.
[10] Colvin, C., Rogers, T., Wade, A., Dawson, S., Gasevic, D., Shum, S.B., Nelson, K., Alexander, S., Lockyer, L., Kennedy, G., Corrin, L. and Fisher, J. 2015. Student Retention and Learning Analytics: A Snapshot of Australian Practices and a Framework for Advancement. The Australian Government Office for Learning and Teaching.
[11] Drachsler, H. and Greller, W. 2016. Privacy and Analytics: It's a DELICATE Issue. A Checklist for Trusted Learning Analytics. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (New York, NY, USA, 2016), 89–98.
[12] Dyckhoff, A.L. 2011. Implications for Learning Analytics Tools: A Meta-analysis of Applied Research Questions. International Journal of Computer Information Systems and Industrial Management Applications. 3 (2011), 594–601.
[13] Ferguson, R., Macfadyen, L.P., Clow, D., Tynan, B., Alexander, S. and Dawson, S. 2014. Setting Learning Analytics in Context: Overcoming the Barriers to Large-Scale Adoption. Journal of Learning Analytics. 1, 3 (Sep. 2014), 120–144.
[14] Goldstein, P.J. and Katz, R.N. 2005. Academic Analytics: The Uses of Management Information and Technology in Higher Education. ECAR.
[15] Hoel, T. and Chen, W. 2015. Data Sharing for Learning Analytics – Questioning the Risks and Benefits. Proceedings of the 23rd International Conference on Computers in Education (ICCE 2015).
[16] Macfadyen, L. and Dawson, S. 2012. Numbers Are Not Enough. Why E-learning Analytics Failed to Inform an Institutional Strategic Plan. Faculty of Education – Papers (Archive). (Jan. 2012), 149–163.
[17] Macfadyen, L.P., Dawson, S., Pardo, A. and Gaševic, D. 2014. Embracing Big Data in Complex Educational Systems: The Learning Analytics Imperative and the Policy Challenge. Research & Practice in Assessment. 9 (2014), 17–28.
[18] National Union of Students 2015. Learning Analytics: A Guide for Students' Unions. National Union of Students.
[19] Newland, B., Martin, L. and Ringan, N. 2015. Learning Analytics in UK HE 2015: A HeLF Survey Report.
[20] Norris, D.M. and Baer, L.L. 2013. Building Organizational Capacity for Analytics.
[21] Nottingham Trent University 2015. Use of Learning Analytics to Support Student Success Policy. Nottingham Trent University.
[22] Oster, M., Lonn, S., Pistilli, M.D. and Brown, M.G. 2016. The Learning Analytics Readiness Instrument. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (New York, NY, USA, 2016), 173–182.
[23] Sclater, N. 2014. Learning Analytics: The Current State of Play in UK Higher and Further Education. Jisc.
[24] Sclater, N. and Bailey, P. 2015. Code of Practice for Learning Analytics. Jisc.
[25] SEG Education Committee 2016. Principles for the Use of University-held Student Personal Information for Learning Analytics at The University of Sydney. The University of Sydney.
[26] Siemens, G., Dawson, S. and Lynch, G. 2013. Improving the Quality and Productivity of the Higher Education Sector: Policy and Strategy for Systems-level Deployment of Learning Analytics. Society for Learning Analytics Research for the Australian Office for Learning and Teaching.
[27] Signals: Applying Academic Analytics: 2010. http://er.educause.edu/articles/2010/3/signals-applying-academic-analytics. Accessed: 2016-07-27.
[28] Slade, S. and Prinsloo, P. 2014. Student Perspectives on the Use of Their Data: Between Intrusion, Surveillance and Care. Challenges for Research into Open & Distance Learning: Doing Things Better – Doing Better Things (Oxford, UK, Oct. 2014), 291–300.
[29] Steiner, C.M., Masci, D., Johnson, M., Türker, A., Drnek, M. and Kickmeier-Rust, M. 2014. Privacy and Data Protection Policy. Deliverable D2.3. LEA's Box.
[30] The European Parliament and the Council of the European Union 2016. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA Relevance). Official Journal of the European Union. OJ, L119 (May 2016), 1–88.
[31] The Open University 2014. Policy on Ethical use of Student Data for Learning Analytics. The Open University.
[32] Wasson, B. and Hansen, C. 2015. Data Literacy and Use for Teaching. Measuring and Visualizing Learning in the Information-Rich Classroom. P. Reimann, S. Bull, M. Kickmeier-Rust, R. Vatrapu, and B. Wasson, eds. Routledge. 56–73.
[33] Wolff, A., Moore, J., Zdrahal, Z., Hlosta, M. and Kuzilek, J. 2016. Data Literacy for Learning Analytics. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (New York, NY, USA, 2016), 500–501.
[34] Yanosky, R. 2009. Institutional Data Management in Higher Education. ECAR.
[35] Yanosky, R. and Arroway, P. 2015. The Analytics Landscape in Higher Education. ECAR.
[36] Young, J. and Mendizabal, E. 2009. Helping Researchers Become Policy Entrepreneurs: How to Develop Engagement Strategies for Evidence-based Policy-making. Overseas Development Institute.
