
Technology-Supported Personalised Learning:
A Rapid Evidence Review

Date: July 2020

Authors: Louis Major and Gill A. Francis

#EdTechHub @GlobalEdTechHub edtechhub.org
Creative Commons Attribution 4.0 International: https://creativecommons.org/licenses/by/4.0/

About this document

Recommended citation: Louis Major and Gill A. Francis (2020). Technology-Supported Personalised Learning: A Rapid Evidence Review. (EdTech Hub Rapid Evidence Review). DOI: 10.5281/zenodo.4556925

Licence: Creative Commons Attribution 4.0 International
https://creativecommons.org/licenses/by/4.0/
You — dear readers — are free to share (copy and redistribute the material in any medium or format) and adapt (remix, transform, and build upon the material) for any purpose, even commercially. You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.

Available at: https://docs.edtechhub.org/lib/A2II5ZV7

Notes: EdTech Hub is supported by UK aid and the World Bank; however, the views expressed in this document do not necessarily reflect the views of the UK Government or the World Bank.
 

Rapid Evidence Reviews 


This publication is one of a series of Rapid Evidence Reviews produced by EdTech Hub. The purpose of the Rapid Evidence Reviews is
to provide education decision-makers with accessible, evidence-based 
summaries of good practice in specific areas of EdTech. They are focused on 
topics which are particularly relevant in the context of widespread global 
challenges to formal schooling as a result of Covid-19. All the Rapid Evidence 
Reviews are available at https://edtechhub.org/research/.

   

Contents 
Abbreviations and acronyms
Summary
1. Introduction
  1.2. Background
  1.3. Purpose
  1.4. Application
  1.5. Research questions
2. Methodology
  2.1. Scoping review
  2.2. Literature search and eligibility criteria
  2.3. Limitations
  2.4. Theme identification
3. Findings
  RQ1. How has technology-supported personalised learning been implemented in low- and middle-income countries?
  RQ2. What key themes are reported in the literature that may inform a response to the Covid-19 pandemic?
    Theme 1: Improving access and adapting to the diverse needs of learners
    Theme 2: The role of teachers and appropriate teacher professional development
    Theme 3: Pedagogical and motivational affordances
    Theme 4: Potential challenges and barriers in implementation
4. Recommendations
5. Annex A: Bibliography
6. Annex B: Search terms
7. Annex C: Data description spreadsheet
 


Abbreviations and acronyms 


CAI Computer Assisted Instruction 

CTS Cognitive Tutoring Systems 

EdTech Educational Technology 

ICT  Information and Communications Technology 

ITS Intelligent Tutoring Systems 

LMICs Low- and middle-income countries 

RCT Randomised Controlled Trial 

RER Rapid Evidence Review 

TaRL Teaching at the Right Level 

 
 

 
 

   


Summary 
This Rapid Evidence Review (RER) provides an overview of existing research 
on the use of technology to support personalised learning in low- and 
middle-income countries (LMICs). The RER has been produced in response to 
the widespread global shutdown of schools resulting from the outbreak of 
Covid-19. It therefore emphasises transferable insights that may be applicable 
to educational responses resulting from the limitations caused by Covid-19. In 
the current context, lessons learnt from the use of technology-supported 
personalised learning — in which technology enables or supports learning 
based upon particular characteristics of relevance or importance to learners — 
are particularly salient given its potential to adapt to learners’ needs by ‘teaching at the right level’.
This RER provides a summary of the potential benefits of 
technology-supported personalised learning as well as identifying possible 
limitations and challenges. It is intended to inform educational decision-makers,
including donors and those in government and NGOs, about the potential to 
use technology-supported personalised learning as a response to the current 
pandemic. The findings and recommendations are also anticipated to be of 
interest to other education stakeholders (e.g. researchers and school leaders).  
The RER involved a systematic search for academic and grey literature to 
address the overarching question: ​What is known about personalised 
learning through using technology that can be of value in responding 
effectively to mass school shutdowns in LMICs?​ After a rigorous screening 
process, 24 studies (in 12 countries) published since 2006 were analysed. 
Details on the inclusion criteria, as well as the associated limitations, are 
explained in the methodology section. Two specific research questions (RQs) 
guided the enquiry:  
■ RQ1: How has technology-supported personalised learning been implemented in LMICs?
■ RQ2: What key themes are reported in the literature that may inform a response to the Covid-19 pandemic?
While a number of potential research limitations must be taken into account, 
on the whole, an encouraging and positive impact on learning outcomes is 
reported. Indeed, the RER demonstrates that there is a growing base of strong 
evidence on the impact of technology-supported personalised learning to 
support school-age learners in LMIC contexts.  
Research involving a range of digital technologies and learners of various ages 
is reported. Studies mainly target instruction in mathematics and science 
although there are examples of research involving the development of
non-cognitive skills. Importantly, the RER corroborates previous research 
which suggests there is no agreed definition of technology-supported 
personalised learning. It notes that ‘personalised learning’ does not necessarily 
mean ‘individualised learning’; it can include group-level adaptation and 
collaborative learning. Levels of personalisation also appear to fall on a continuum, from highly responsive to the user to less responsive. A further interesting finding is that studies report using technology as either a supplementary (providing additional opportunities for students to practise instructional content outside of regular classroom instruction), integrative (using technology during instruction to facilitate teaching and learning), or substitute (investigating the possibility of using personalised technology in lieu of teaching) approach.
Structured according to four themes, the findings of the thematic analysis 
reveal further insights: 
1. Improving access to education and adapting to the diverse needs of learners: This theme examines how technology-supported personalised learning enables access to quality educational materials, adapts to learners’ needs by ‘teaching at the right level’, extends learning, and potentially closes educational gaps for the most marginalised.
2. The role of teachers and appropriate professional development: This theme examines the central role of teachers and teacher professional development in enabling technology-supported personalised learning, in addition to addressing potential constraints on teaching and learning.
3. Pedagogical and motivational affordances: This theme examines the pedagogical affordances of technology-supported personalised learning and the impact these can have on learner motivation.
4. Potential challenges and barriers in implementation: This theme examines implications with regard to cost and infrastructure, in addition to potential issues for scalability and sustainability.
The key findings and recommendations from this review are:
■ Technology-supported personalised learning appears to offer significant promise to improve learning outcomes, including potentially ‘out-of-class’ and ‘out-of-school’ learning.
■ The adaptive nature of technology-supported personalised learning to ‘teach at the right level’ is key, as it enables students to learn at their own pace and according to their current proficiency.
■ Technology-supported personalised learning may be most beneficial in closing educational gaps for lower-attaining students, potentially including those returning to school after an absence.
■ Any introduction of personalised learning technology should not be interpreted as decreasing the importance of the teacher, but rather enhancing it.
■ Implications for cost and infrastructure are unclear, but using existing hardware solutions is likely to help to reduce costs and increase access.

   


1. Introduction 
The Covid-19 pandemic has resulted in widespread and unprecedented global disruption to education.¹ Physical distancing policies to suppress the spread of Covid-19, which often advise that students and teachers cannot congregate in schools in the conventional manner, have led to a global expansion of the use of technology within education.

This RER provides a summary of existing research evidence on the use of 
technology to support personalised learning in LMICs. It offers insights and 
evidence that can assist in the development and implementation of effective 
EdTech interventions across the globe and in situations of disruption to 
education and distance learning within the current context.  

1.2. Background 
Personalising education by adapting learning opportunities and instruction to 
individual capabilities and dispositions has been a long-standing objective 
among educators (Natriello, 2017). Indeed, everyday practice in schools 
globally almost always involves a degree of personalisation as teachers and 
students respond to each other’s constantly shifting needs, aims and desires 
(Beetham, 2005; Holmes et al., 2018). The idea of personalised learning is 
therefore not new. There are, however, variations in how personalisation is 
realised in practice. 

Research on technology’s role in enabling learning that is better suited to the
characteristics and needs of learners can be traced back several decades (and 
even beyond, to groundbreaking work on ‘teaching machines’ by Pressey and 
Skinner in the 1920s and 1950s respectively: Holmes et al., 2018). In more recent 
years, stimulated by the increasing availability and sophistication of digital 
technology, it has been argued that the adaptive and personalisable 
affordances of EdTech offer a way of addressing challenges facing education 
systems around the world. Potentially these affordances can open up new, 
scalable opportunities for greater personalisation that adjust the learning 
experience (e.g. based on age, ability, prior knowledge and/or personal 
relevance; FitzGerald et al., 2018). They may also enable diverse 
representations of content that reflect learners’ own preferences and cultural 
reference points, in addition to the ability to automatically capture and 
respond to students’ learning patterns with data. 

¹ See: en.unesco.org/covid19/educationresponse


1.3. Purpose 
In the context of LMICs in particular, personalised learning carries significant 
promise in improving the state of education (Zualkernan, 2016): for instance, 
with regard to identifying and teaching at the ‘right’ (i.e. the learner’s current) 
level; reducing the negative effects of high pupil–teacher ratios; increasing 
access to education; and improving learning outcomes (Kishore & Shah, 2019). 
The Covid-19 global health emergency has accelerated interest in how EdTech 
can support personalised learning given the nature of schooling is likely to be 
seriously affected in the medium to long term due to the introduction of 
physical distancing, school closures and other policies intended to alleviate 
the impact of the virus. As a result, there is an urgent need to identify existing 
research on technology-supported personalised learning in order to inform an 
effective response to the crisis. This is particularly the case for LMICs where 
marginalised learners risk falling even further behind.² This RER, alongside
others, contributes to an emerging evidence base on the use of technology for 
education during the Covid-19 pandemic, and organises the most relevant 
literature into coherent themes for the consideration of key stakeholders. 

1.4. Application 
This RER is intended to inform educational decision-makers, including donors 
and those in government and NGOs, about the potential to use 
technology-supported personalised learning as a response to the current 
pandemic. The findings and recommendations are also anticipated to be of 
interest to other education stakeholders (e.g. researchers and school leaders). 
Given that the circumstances surrounding EdTech interventions differ greatly 
across LMIC and other education systems, as with other related reviews (e.g. 
Escueta et al., 2017), focusing on research undertaken in LMIC contexts allows 
for the integration of findings in a way that can yield meaningful policy 
implications.  

1.5. Research questions 


This study asks the overarching question: What is known about personalised learning through using technology that can be of value in responding effectively to mass school shutdowns in LMICs?
Two specific research questions (RQs) guide this enquiry:  

² Estimates suggest the pandemic could lead to approximately US$10 trillion of lost lifetime earnings across the current global cohort of primary and secondary students, while substantial reductions in education budgets are also a possibility (Azevedo et al., 2020).


RQ1. How has technology-supported personalised learning been implemented in LMICs?
● Where has research been undertaken?  
● Which learners have been involved in the researched interventions? 
● What approaches to technology-supported personalised learning are 
reported? 
● How does technology-supported personalisation relate to learning 
outcomes? 
RQ2. ​What key themes are reported in the literature that may inform a 
response to the Covid-19 pandemic?  
1.5.1. Definition and scope of the study 
Like many concepts in education, there is no universal definition of 
personalised learning (Holmes et al., 2018). Indeed, Cuban (2018) describes 
personalised learning as “​like a chameleon; it appears in different forms​”. 
According to Cuban, these forms can be conceptualised as a ‘continuum’ of 
approaches: from teacher-led classrooms to student-centred classrooms, with 
‘hybrid’ approaches in between. Such ambiguity has led to the idea of 
personalised learning being conflated with individualised learning and 
differentiated learning, and sometimes also confused with problem- or 
inquiry- or project-based learning (Holmes et al., 2018).  
Although definitions of personalised learning vary, broadly stated there is 
agreement that it is learner-centred and flexible, and responsive to individual 
learners’ needs (Gro, 2017).³ As reflected by the keywords used to search the
literature (encompassing areas such as computer-aided instruction and 
intelligent tutoring systems among others; see methodology), an intentionally 
broad view of technology-supported personalised learning as an ‘umbrella’
term was adopted from the outset. Influenced by FitzGerald and colleagues 
(2018), in this RER ​we conceptualise technology-supported personalised 
learning as: the ways in which technology enables or supports learning 
based upon particular characteristics of relevance or importance to 
learners​. This may refer to technology-supported instruction in which: the 
pace of learning is adjusted; the instructional approach is optimised for the 
needs of each learner (e.g. through learning objectives, content or tools); 
learning is driven by learner interests; learners are empowered to choose 
what, how and when they learn (Office of Educational Technology, 2017).  

³ While beyond the scope of the RER, note that the contentious and widely disputed idea of ‘learning styles’ does not feature in mainstream definitions or approaches to personalised learning. See: www.theguardian.com/education/2017/mar/12/no-evidence-to-back-idea-of-learning-styles


1.5.2. Structure of the RER 


Following this introduction, the methodological approach is discussed, 
including details of the scoping review, the literature search, eligibility criteria 
and possible limitations of the methodology. Then, detailed findings are 
presented in response to the research questions (including four themes that 
emerged from a thematic analysis of identified literature). The report 
concludes by providing a summary of key findings and recommendations.  

   


2. Methodology 
The methodological approach for this RER was informed by the Cochrane 
Collaboration Rapid Reviews Methods Group interim guidance on producing 
rapid reviews (Garrity et al., 2020) in addition to the framework for undertaking 
a scoping review (Arksey & O’Malley, 2005; Levac et al., 2010). 

2.1. Scoping review  


A rigorous and systematic form of secondary research, scoping reviews involve 
collecting, evaluating and presenting available evidence at a ‘high level’. 
Differing from ‘conventional’ systematic reviews in that they are better able to 
account for studies with varying intentions and designs, scoping reviews 
provide an accessible and summarised overview of existing research to inform 
policymakers and other stakeholders (Levac et al., 2010).  
Preliminary search terms were developed based on the research questions 
and after considering the titles, abstracts and keywords of research which was 
known beforehand to be important and relevant (even if not focusing 
exclusively on LMICs e.g. the review by FitzGerald et al., 2018). Search terms 
were iteratively refined during pilot searches that revealed potentially useful 
studies and terms (identified following further analysis of titles, abstracts and 
keywords). Using this approach, a final set of 35 search terms was compiled 
(Annex B).  

2.2. Literature search and eligibility criteria 


Automated searches were undertaken during May 2020 using Google Scholar 
and the Searchable PUblication Database (SPUD), an extensive searchable 
publication database (3+ million records to date) developed by the EdTech 
Hub team. Unlike a ‘traditional’ systematic review, which may screen all search 
results, the rapid review methodology employed relied on a system of quotas. 
As such, only the most relevant results (up to a maximum of the first 20 pages 
of results as ranked by Google Scholar) were selected for the first round of 
screening. In total, the search strings returned 38,335 results across Google 
Scholar and SPUD, with 198 potential candidate studies being identified 
through the automated searches.  
Figure 1 provides an overview of the search process. The title and abstract 
screening, as well as all other subsequent screenings, were conducted 
according to the eligibility criteria in Table 1. Where research was identified to be potentially important despite not strictly meeting the eligibility criteria, it was retained in a complementary collection in case it was useful later. ‘Grey
literature’ (e.g. non-peer reviewed reports) was accepted if relevant to the 
scope of the RER. All data were shared by the research team through online 
documents and folders (e.g. Google Docs, Zotero). 
Figure 1: Literature search and screening process.

   


Table 1: Eligibility criteria for literature searches and screening.

Population
  Inclusion criteria: Involving elementary and/or secondary school students (ranging from 5 to 19 years old) based in LMICs.
  Exclusion criteria: Involving learners in higher or tertiary education only.

Intervention
  Inclusion criteria: Falling under the broad ‘umbrella’ of technology-supported personalised learning.
  Exclusion criteria: Studies focusing on access to technology with little consideration for how this is personalised to the needs of learners, or personalised learning with no use of technology.

Outcomes
  Inclusion criteria: Reporting effects on academic performance (e.g. measured by grades or performance on tests) or relating to student needs/preferences (e.g. motivation to learn).
  Exclusion criteria: Focusing on the development and testing of software with no learner data.

Study design
  Inclusion criteria: Describing primary empirical research (i.e. acquired by means of observation, experimentation or survey), both quantitative and qualitative.
  Exclusion criteria: Reviews and meta-analyses, or studies providing a ‘lessons learned’ account without presenting any empirical evidence.

Date
  Inclusion criteria: Published 2006–2020.

Language
  Inclusion criteria: English-language publications only.

After full-text screening according to the eligibility criteria, 41 relevant studies were identified. In addition, the reference lists of studies identified during the automated searches were examined as a further check to ensure that relevant research was not missed. This ‘backward snowballing’ strategy resulted in 11 additional studies being identified. Further studies (n=10) were
also identified via expert referral. In total, 62 studies were identified. Hassler 
and colleagues’ (2016) adaptation of Gough’s (2007) ‘weight of evidence’ 
framework was applied to determine those studies of most value. This 
involved one member of the research team independently scanning identified 
studies before making an evaluation of ‘low’, ‘medium’ or ‘high’ for each of the 
following criteria: 
■ Methodological trustworthiness:​ the trustworthiness of a study’s results 
based on an evaluation of the research approach used.  
■ Relevance to the RER:​ relevance of a study for the specific purposes of 
this review, namely how technology-supported personalised learning 
can be of value in responding effectively to mass school shutdowns 
during Covid-19. 
Any study categorised as ‘low’ for trustworthiness (n=19), relevance (n=12), or both (n=7) was omitted from further analysis (n=38). These studies were excluded primarily because they reported only minimal empirical findings or considered technology-supported personalisation in a limited way. This process resulted in the inclusion of 24 studies that met a minimum threshold of ‘medium trustworthiness’ and ‘medium relevance’.
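To make the screening logic concrete, the following minimal sketch expresses the ‘weight of evidence’ filter described above in code: a study is retained only if neither its trustworthiness nor its relevance rating is ‘low’, i.e. both are rated at least ‘medium’. The study records and field names here are hypothetical illustrations, not data or tooling from the review itself.

```python
# Minimal sketch of the 'weight of evidence' screening filter described above.
# The study records below are hypothetical illustrations, not data from the review.
from dataclasses import dataclass

@dataclass
class Study:
    citation: str
    trustworthiness: str  # rated 'low', 'medium' or 'high'
    relevance: str        # rated 'low', 'medium' or 'high'

def meets_threshold(study: Study) -> bool:
    """Keep a study only if both criteria are rated at least 'medium'."""
    return study.trustworthiness != "low" and study.relevance != "low"

candidates = [
    Study("Example study A (2015)", "high", "medium"),
    Study("Example study B (2012)", "low", "high"),    # excluded: low trustworthiness
    Study("Example study C (2018)", "medium", "low"),  # excluded: low relevance
]

included = [s for s in candidates if meets_threshold(s)]
excluded = [s for s in candidates if not meets_threshold(s)]
print(f"Included: {len(included)}, excluded: {len(excluded)}")
```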
To address RQ1, a process of data extraction involving the 24 included studies 
was undertaken. Initially, this involved extracting data to determine the key 
characteristics of studies (i.e. where has research been undertaken? Which 
learners have been involved in the researched interventions? What 
approaches to technology-supported personalised learning are reported? 
How does technology-supported personalisation relate to learning 
outcomes?). Having established this overview of the research landscape, 
thematic analysis was applied to address RQ2. Whereas data extraction (e.g., 
numbers of participants) is objective and not interpretive, thematic analysis 
(or ‘thematic synthesis’; Thomas & Harden, 2008) involves telling the story that 
emerges across the findings reported by the included studies. Informed by 
established guidelines for narrative syntheses (Ryan, 2013), the research team: 
read studies to become familiar with their similarities and differences; 
discussed emerging relationships within and between studies; iteratively 
revised and refined themes to agree on a final set of themes.  

2.3. Limitations 
The search only considered English-language research published from 2007 
onwards. The choice of keywords used or omitted, publication bias, or the 
selection and/or nature of digital libraries searched may have had an impact 
on the eventual findings. Due to the constraints of the RER timeframe,
activities such as data extraction and quality assessment were necessarily 
undertaken primarily by one researcher in a short period of time, and thus 
some subjectivity or error may have been introduced. Time constraints also 
likely limited how comprehensively the research questions were addressed. It 
is also important to note that findings may not be generalisable to the current 
Covid-19 context, given the majority of reported research was undertaken in a 
school or ‘school-like’ context prior to the pandemic. Concerns have also been 
raised about whether learning gains from using personalised technology are 
actually attributable to the use of the software (e.g. as opposed to additional 
lessons conducted by a teacher; Buchel et al., 2020). A further limitation of 
research in this area is that the software is not always fully described; often the 
name of the software is omitted, and the full capacity of the software is not 
outlined. These factors may limit accurate inferences about the degree to 
which the reported software was personalised and how. Finally,​ ​the broad 
conceptualisation of technology-supported personalised learning employed 
resulted in the identification and analysis of a diverse range of heterogeneous 
studies of varying rigour which may have implications for the interpretation of 
findings.  
Actions to mitigate the potential impact of these issues included undertaking 
pilot searches, examining the reference lists of included studies for other 
relevant work (‘snowballing’ — a process that revealed several commonly 
cited studies had already been identified thus demonstrating a degree of 
saturation) and maintaining frequent contact between researchers involved. 
While the findings of the RER are inherently limited by the quality of evidence 
available, the application of the quality/relevance assessment helped to 
mitigate the risk of low-quality or irrelevant research significantly impacting 
conclusions.  

2.4. Theme identification 


In the next section we present the findings of the RER. RQ1 contextualises the evidence available by outlining the characteristics of research on technology-supported personalised learning in LMICs, including how (and with what impact) this has been implemented. This contextual question provides the basis for informing the thematic outcomes in RQ2, which established four themes (and sub-themes):


1. Improving access to education and adapting to the diverse needs of learners
   ○ Enabling access to quality educational materials
   ○ Adapting to learners’ needs by ‘teaching at the right level’
   ○ Extending learning in new ways
   ○ Closing educational gaps for the most marginalised
2. The role of teachers and appropriate professional development
   ○ The central role of teachers and teacher professional development
   ○ Addressing constraints on teaching and learning
3. Pedagogical and motivational affordances
   ○ Peer interaction, scaffolding & productivity
   ○ Learner motivation
4. Potential challenges and barriers in implementation
   ○ Cost
   ○ Infrastructure, scalability and sustainability

   


3. Findings 
RQ1. How has technology-supported personalised learning been implemented in low- and middle-income countries?
See Annex C for a summary of information extracted from included studies.  

Where has research been undertaken?  


Evidence on technology-supported personalised learning is continually 
developing across LMICs. Identified research has assessed the 
implementation of technology-supported personalised learning in Asia (n=12), 
Africa (n=6) and Latin America (n=6).  

This RER synthesises a total of 24 studies⁴ from 12 countries during the period 2007 to 2020: India (n=5), Pakistan (n=1), Nigeria (n=4), Kenya (n=2), Chile (n=1), Ecuador (n=1), El Salvador (n=1), Cambodia (n=1), and rural China (n=6). Three additional countries are also reported in two comparative studies: Chile, Mexico and Ecuador were compared in the same experimental study by Casas and colleagues (2014); Brazil, Mexico, and Costa Rica were also compared in the same case study by Ogan and colleagues (2012).
Research addressing technology-supported personalised learning is current, and the publication dates of retrieved studies show that work in the field is ongoing: 2007 (n=1), 2008 (n=1), 2010 (n=1), 2011 (n=1), 2012 (n=2), 2013 (n=3), 2014 (n=2), 2015 (n=3), 2016 (n=2), 2017 (n=1), 2018 (n=2), 2019 (n=3), 2020 (n=2).
In addition, a range of research methods have been employed across different 
countries. Randomised Controlled Trials (RCTs) were the most common (n=12) 
and were conducted in rural China (n=6), India (n=4), Cambodia (n=1), and El 
Salvador (n=1). Quasi-experiments (n=8) were carried out in Nigeria (n=4), India 
(n=1), and the Latin American countries of Chile, Mexico, and Ecuador. There 
were four case studies: two from Kenya, one from Venezuela, and one study which compared Brazil, Mexico, and Costa Rica. Note that this classification of ‘case study’
was applied to studies designed to evaluate the development and 
implementation of specific personalised learning technologies in LMIC 
contexts. The four case studies collected both quantitative data (student learning outcomes) and qualitative data (teacher interviews) to assess the efficacy of personalised software (Andallaza et al., 2012; Mutahi, 2015, 2017; Ogan, 2012).

⁴ Two interesting studies did not meet the formal inclusion criteria for RQ1 given their focus on the teacher and not students (Stott & Case, 2014; Zualkernan et al., 2013). Their reported findings are, however, considered in response to RQ2.

Which learners have been involved in the researched 


interventions? 
Studies involved learners attending primary (n=15) and secondary schools (n=9). The sample sizes of the studies are, overall, considered to be fairly large (minimum sample = 18, maximum sample = 21,936). For instance, an RCT in
India by Muralidharan and colleagues (2019) sampled 619 participants, a 
quasi-experimental study sampled 734 learners across three Latin American 
countries (Chile, Ecuador and Mexico; Casas et al., 2014), and a case study by 
Andallaza and colleagues (2012) involved 143 learners from Venezuela.  

What approaches to technology-supported 


personalised learning are reported? 
A range of digital technologies are reported to deliver educational content to 
students in order to maximise opportunities for learning cognitive (test scores 
or learning outcomes) or non-cognitive skills (social skills, computer 
proficiency).⁵ These have mostly targeted instruction in single subjects:
mathematics (n=15), science (n=3), English (n=1), multiple subjects (n=4), and 
one study addressing social skills.  
The introduction to this RER pointed out how there is no agreed definition of 
technology-supported personalised learning. This is reflected in the varied 
terminology used by included studies. Common terminologies used to 
describe research related to ‘technology-supported personalised learning’ 
include:  
● Computer-assisted learning​ e.g. Bai et al. (2018), Banerjee et al. (2007) 
● Computer-aided Learning​ e.g. Muralidharan et al. (2019) 
● Computer-aided Instruction (CAI)​ e.g. Carrillo et al. (2011); Ito et al. (2019) 
● Intelligent Tutoring Systems (ITS)​ e.g. Andallaza et al. (2012) 
● Cognitive Tutoring Systems (CTS)​ e.g. Ogan et al. (2012) 

⁵ Refer to Annex C (Data Description Spreadsheet), which includes a list of the personalised
technology used in each study. In this context, ‘cognitive skills’ generally refer to assessment 
of learning outcomes using tests, and non-cognitive skills include social skills (e.g. Ige, 2019), 
computer proficiency skills (e.g. Mo et al., 2013), and affective skills (e.g. Andallaza et al., 2012). 
An interesting observation is the emphasis on assessing cognitive outcomes although 
learning is of course inextricably linked to non-cognitive skills like students’ needs, 
preferences, socio-emotional development, etc.  


The studies which reported using either computer-assisted learning (n=9),
computer-aided learning (n=3), or CAI (n=5) appear to use slightly different 
terms to describe a similar goal. While not all studies provide operational 
definitions for these terms, two common definitions were observed. 
Computer-assisted learning is characterised as a type of computer-aided 
learning which uses computerised instruction, drills and exercises, 
simulations, and instructional games (Gambari et al., 2016; Lai et al., 2013, 2015), 
or involves the use of a computer program that offers remedial learning 
materials in the form of interesting interfaces and games with the aim of 
improving educational outcomes and interest in learning (Bai et al., 2018; Mo 
et al., 2013).  
In contrast, the studies which reported using ITS (n=3) and CTS (n=4) placed 
greater emphasis on the affordances the technology provided to the learner. 
These described how: responses to learner inputs (monitoring and feedback) 
were provided, content was adjusted to match the level of the learner, and a 
high volume of user data can be captured as feedback to the learner and 
teacher. Specifically, ITS are defined as “​computer applications that are 
capable of providing individualised instruction to learners through the use of 
artificial intelligence, thereby supporting the learner and facilitating the 
learning process​” (Andallaza et al., 2012, p.1). CTS are defined as a type of ITS 
that is capable of assessing skill mastery as a student solves problems, and 
provides context-sensitive hints, error feedback, and adaptive problem 
selection (Ogan et al., 2012). Such adaptive software is specifically designed to facilitate self-paced learning through tailoring content to levels of learning (which can free teachers to act as classroom facilitators rather than teaching directly; Ogan et al., 2012).
There appears to be a link between the level of personalisation afforded by the technology and the reported approach to personalised learning. Three levels of personalisation afforded by educational technology were distinguished: ‘fewer personalisation affordances’ (n=8 studies), ‘medium personalisation affordances’ (n=6 studies), and ‘greater personalisation affordances’ (n=10 studies). The classifications ‘fewer personalisation affordances’ and ‘medium personalisation affordances’ can broadly be applied to studies reporting personalised learning using approaches like computer-assisted learning, computer-aided learning and CAI. By contrast, studies investigating technology-supported personalised learning using ITS, CTS, or other highly personalised technological software can be described as featuring ‘greater personalisation affordances’.
Software featuring fewer personalisation affordances may not use highly 
sophisticated intelligent software. Generally embedded in their design, 
however, is the explicit alignment of the software content to the local country’s national curriculum, in addition to some level of personalisation that provides feedback to the learner to support monitoring of learning and progress. Technologies with medium personalisation affordances go beyond aligning the content of the personalised software to the curriculum and also try to align use of the software with ongoing class instruction. They also target the level of the learner by presenting concepts according to task difficulty and facilitating interactive user feedback. Technologies involving greater personalisation affordances were: highly data driven;⁶ had the potential for interaction (or responsive engagement) between the technology and the learner; and involved educational content that was contextualised to the local context of the research.
In the present RER, the classifications of ‘fewer’, ‘medium’ and ‘greater’ personalisation affordances are intended to indicate differences in the extent to which personalisation is realised. Hence, levels of personalisation
may fall on a continuum of being highly responsive to the user (e.g., 
scaffolding learning and providing hints to difficult questions), to less 
responsive (e.g., by providing activities like exercises for drill and practice, 
viewing videos linked to questions, and limited feedback such as indicating 
that user responses are correct or incorrect).  
A further interesting finding is that studies implementing technology-supported approaches to personalised learning used the technology as either a supplementary (n=14), integrative (n=3) or substitute (n=2) approach. Further, studies have compared these approaches: supplementary/integrative (n=1) and supplementary/substitution (n=1), in addition to attending to software evaluation (albeit involving an analysis of learning outcome data; n=3).
Supplementary approaches provide additional opportunities for students to practise instructional content outside of regular classroom instruction. Such
studies typically use additional learning opportunities to provide remedial 
support through independent practice using a learning software (e.g. 
Banerjee et al., 2007; Buchel et al., 2020). These have been trialled with 
software featuring fewer-, medium- and greater-personalisation affordances 
with content designed to target the different levels of the learner. Variations 
exist, however, in the extent and quality of engagement and feedback 
between the learner and the software. Supplementary approaches to 
personalisation thus complement the quality of instruction available to 
students. Students can therefore use such technology independently or with teacher guidance (Buchel et al., 2020).

⁶ Examples include data drawn from interfaces and sensors that capture fine-grained user interactions (Mutahi et al., 2015), or that provide visual feedback on student progress using logs generated during a session (the Aplusix ITS in Andallaza et al., 2012).
Integrative approaches​ use the technology during instruction to facilitate 
teaching and learning. In this approach, the teacher and technology co-exist, 
where it is the teacher’s role to facilitate and reinforce the learning process. 
They are designed not as supplementary, standalone systems but take into 
account the teacher, student and classroom interactions (Mutahi, 2015). For 
instance, the teacher uses technology to complement their lesson 
instructions by including time for students to use technology (Gambari, 
2016b). During this time the teacher may use the feedback data generated to 
adjust teaching and re-teach concepts.  
Substitute approaches investigate the possibility of using personalised technology in lieu of teaching, i.e. where instruction is delivered solely through technology. There is little evidence of technology-supported personalised
learning successfully replacing certified teachers or regular teaching. Gambari 
and colleagues (2015) compared an individualised computer-assisted 
instructional program to two other non-computer assisted instructional 
programs. The researchers found no significant differences in learning 
outcomes among the three groups, implying that neither approach had an 
advantage.  
Two studies designed interventions that compared these approaches with each other (Linden, 2008; Gambari et al., 2016a).⁷
In Table 2, an overview of the link between fewer-, medium- and greater-personalisation affordances and the ways in which technology-supported personalised learning has been implemented is outlined. It is worth recalling that the studies using software with greater personalisation affordances (ITS and CTS) have been the least researched. Further work is required to make affirmative conclusions about the use of any of these approaches.

⁷ Linden (2008) evaluated a computer-assisted learning programme designed to reinforce Indian students’ understanding of material presented in class and found this was a poor substitute for the teacher-delivered curriculum and was no better than a complement (supplement) programme delivered using an out-of-school model. Gambari and colleagues’ (2016a) study in Nigeria found that an integrative approach – integrating an interactive computer program into chemistry instruction – was no more effective than using conventional teaching methods or a substitute approach (using a computer tutorial instructional package).


Table 2. Summarising reported technology-supported personalised learning approaches by the nature of their implementation.

                                   Fewer            Medium           Greater
                                   personalisation  personalisation  personalisation
                                   affordances      affordances      affordances
                                   (n=8)            (n=6)            (n=10)
Supplementary (n=14)               3                6                5
Substitute (n=2)                   2                0                0
Integrative (n=3)                  1                0                2
Supplementary/integrative (n=1)    1                0                0
Supplementary/substitution (n=1)   1                0                0
Software evaluation⁸ (n=3)         0                0                3

How does technology-supported personalisation 


relate to learning outcomes? 
Studies report diverse but broadly positive relationships between technology-supported personalised learning and learning outcomes (Table 3).⁹ It is striking how a relatively limited amount of qualitative or mixed
methods research has been undertaken (although as discussed in the 
Limitations section, this lack of representation may be due to studies being 
inadvertently filtered out or missed).  

Table 3. Summarising reported impact on students’ learning (by research method).

                    Studies   Positive outcomes   Mixed outcomes¹⁰   Negative outcomes
RCTs                12        10                  2                  0
Quasi-experiments   8         4                   0                  4
Case study¹¹        4         3                   0                  1
Total               24        17                  2                  5

⁸ These studies also attended to an analysis of learning outcomes (n=2).
⁹ The learning outcomes are summarised to provide a broad overview. Ideally, a meta-analysis that compares effect sizes is a more appropriate way of determining the common effect across different studies and will be the next step towards extending this RER.
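As a pointer to what the meta-analysis mentioned in footnote 9 would involve (this is standard meta-analytic methodology rather than a calculation performed in this RER), a fixed-effect synthesis pools each study’s reported effect size by weighting it by the inverse of its variance:

$$\hat{\theta} = \frac{\sum_{i=1}^{k} w_i\, \hat{\theta}_i}{\sum_{i=1}^{k} w_i}, \qquad w_i = \frac{1}{\operatorname{Var}(\hat{\theta}_i)}$$

where $\hat{\theta}_i$ is the effect size (e.g. a standardised mean difference) reported by study $i$ and $k$ is the number of included studies.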


 

Of the studies featuring fewer personalisation affordances (Table 2, n=8), five report that the intervention had a negative impact on learning and three report a positive impact. These three studies (all ‘supplementary’ approaches) were designed to provide remedial instruction that was tightly aligned to the curriculum, teacher instruction and learner feedback.
Similarly, the studies classified as featuring medium personalisation affordances (n=6) all used a supplementary approach that had a positive impact on learning. Moreover, it appears that the effort to contextualise the contents of the software so that it aligns with the national curriculum, classroom lessons or the level of the learner can have a profound impact regardless of the sophistication of the technology.
In terms of the impact on learning for studies classified as featuring greater personalisation affordances (n=10): five used a supplementary approach, all of which had positive impacts on students’ learning; two used an integrative approach that also had positive impacts on students’ learning; and three were software evaluations that reported varying results in terms of impact on learning outcomes.

¹⁰ The studies categorised as mixed outcomes generally found a positive effect on student learning from using the software. However, the effects were small over and above traditional pencil-and-paper learning (Ma et al., 2020), and the personalised approach was a poor substitute for the teacher-delivered curriculum in comparison to a complementary programme which showed statistically significant gains for the weakest and oldest students in the class (Linden, 2008).
¹¹ The case studies were software evaluation studies which trialled newly developed personalised learning software with teachers and/or students to garner feedback on the usability of the tool and users’ perceptions. Andallaza and colleagues (2012) collected quantitative data by observing students’ affective states while using the software to determine if the software facilitated the development of affective skills. Mutahi and colleagues (2015, 2017) analysed qualitative data via teacher interviews to get feedback on the usability of the software, together with quantitative software usage data. Ogan and colleagues (2012) present a qualitative case study featuring teacher interviews.

RQ2. What key themes are reported in the literature that may inform a response to the Covid-19 pandemic?
Building on RQ1, four interconnected themes identified in the literature are 
now considered. As outlined in RQ1, technology-supported personalised 
learning has been implemented in three main ways (as a supplementary, 
integrative or substitute approach). The reported synthesis is intentionally — 
and necessarily (given the constraints of the RER timeframe and the broad 
definition of technology-supported personalised learning) — ‘high level’ as it 
does not differentiate between the distinct ways in which technology has 
been used to support personalised learning. Further, the impact of cultural 
and social differences between different contexts, and the fact that the 
majority of research relates to mathematics and science education, must be 
considered when interpreting results from the reviewed studies. Despite 
these challenges, themes identified are intended to provide an accessible 
summary of existing evidence so that educators, policymakers and donors 
might make informed decisions about the potential role of 
technology-supported personalised learning as a response to the Covid-19 
pandemic.¹²

Theme 1: Improving access and adapting to the diverse needs of learners
Enabling access to quality educational materials  
Technology-supported personalised learning appears to offer an accessible 
means by which students can access instructional materials capable of 
enhancing learning. Thus, such technology can address severe teacher 
shortages (Ito et al., 2019) and the need for out-of-school learning (e.g. to 
support homework; Kumar & Mehra, 2018). Established technology-supported 
personalised learning programs such as Mindspark offer a means to deliver 
educational content in a variety of settings (in schools, in after-school centres, 
or through self-guided study). Such solutions are being deployed across 
increasingly diverse platforms (including computers, tablets and 
smartphones; Muralidharan et al., 2019), and can be used offline as well as 
online (Bai et al., 2018; Ma et al., 2020).  

¹² Note, findings from two additional studies that focus primarily on the role of the teacher have also been incorporated into the thematic analysis given they provide insights
complementing reported themes (Stott & Case, 2014; Zualkernan et al., 2013). Also included are 
findings reported in two other highly relevant studies undertaken in Latin America, originally 
published in Spanish, which were identified following the automated search (Perara & Aboal, 
2017a, 2017b). 


In this context, ‘quality educational materials’ may be evaluated on two levels: (1) technological content carefully developed to be aligned with the curriculum and instruction at the level of instructional units (e.g. Carrillo et al., 2011; Ito et al., 2019), and (2) lessons being delivered to students (e.g. Mo et al., 2014). As
discussed in RQ1, so far much of the evidence points to positive gains when 
technology-supported personalised learning supplements classroom 
instruction (Lai et al., 2013; Mo et al., 2014). See Theme 4 for further discussion 
on potential barriers to equitable EdTech access that may be particularly 
relevant given the Covid-19 context.  
Adapting to learners’ needs by ‘teaching at the right level’ 
Somewhat unsurprisingly, the adaptive nature of technology-supported 
personalised learning is a key emergent theme. For instance, the way this can 
enable students to learn at their own pace and according to their current 
proficiency (Ito et al., 2019), including collaboratively (Ogan et al., 2012). 
Allowing students to work at their own speed using personalised software 
pitched at their level can avoid potential negative status effects of them being 
labelled as being in a ‘weaker’ track, while the dynamic updating of content 
mitigates the risk of premature permanent tracking of ‘late bloomers’ 
(Muralidharan et al., 2019). Even more important is ensuring that the 
educational content is pitched at the learner’s level of proficiency. Here, the 
technology is used to differentiate instruction in a way that meets the goal of 
remediation (Banerjee et al., 2007).  
While there are several mechanisms by which computer-aided learning can 
improve teaching and learning, a particularly attractive feature is its ability to 
deliver individually customised content for Teaching at the Right Level (TaRL) 
for all students, regardless of the extent of heterogeneity in learning levels 
within a classroom (Muralidharan et al., 2019). This can help to directly address 
one of the main reasons for the general inability to meet desired learning 
outcomes in LMICs: the inability to meet the heterogeneous learning needs of 
a large student population with constrained educational resources (Kumar & 
Mehra, 2018).  
Consider the following example reporting the use of a mathematics 
intervention in urban India. Addressed to all children but adapted to each 
child’s current level of achievement, a technology-supported personalised 
learning initiative allowed each learner to be individually and appropriately 
stimulated (Banerjee et al., 2007). Specifically designed to address constraints 
on effective pedagogy in LMICs, such software may feature the use of an 
extensive item-level database of test questions and student responses to 
benchmark the initial learning level of every student; the material being 
delivered can then be dynamically personalised to match the level and rate of 
progress made by each individual student (Muralidharan et al., 2019). In
addition to allowing for variation in academic content presented, other 
potential benefits include allowing different entry points and differentiated 
instruction without the need to reorganise peers in the classroom (including 
preserving the age-cohort-based social grouping of students; Muralidharan et 
al., 2019). 
Extending learning in new ways 
In addition to this capacity to support TaRL, technology-supported 
personalised learning appears to offer the potential to promote learning in 
other ways beyond those previously possible. A randomised controlled trial in 
Salvadoran primary schools, for instance, reveals not only how 
computer-assisted personalised learning produces substantial learning gains, 
but may actually outperform traditional modes of instruction (Buchel et al., 
2020). Such a relative advantage seems to be driven by a mismatch between 
teacher preparation and the complexity of the concepts they have to teach: 
under traditional teaching models, it seems questionable that children are 
able to master what their teachers fail to understand. However, 
technology-supported personalised learning may allow learners to make 
progress beyond their teachers' content knowledge. Such approaches may 
thus help to teach or remediate critical deficiencies in both students’ and 
teachers’ understandings (Ogan et al., 2012). Researchers including Gambari 
and colleagues (2016a) have explored using personalised technology as an 
integrative or blended model where it is used as part of instruction in 
mathematics and science to address challenges such as a lack of instructional 
materials and to facilitate the teaching of constructs that are abstract and 
difficult to understand. While the researchers did not find using computer-simulated instruction during lessons to be more effective than traditional instruction, the study points to a need for research to disentangle the contribution of delivering pedagogical content through the teacher versus through the technology.
Closing educational gaps for the most marginalised  
Consistent with the promise of technology-supported personalised learning 
to customise instruction for each student, integrating a novel approach to 
implementing grade level appropriate material into existing teaching practice 
can substantially increase learning for students of all baseline learning levels 
(Muralidharan et al., 2019). Of particular significance during the current 
context of mass school shutdowns, given many learners will likely require 
additional support to get to the ‘right level’ upon returning to school, is a 
growing collection of evidence that indicates how technology-supported 
personalised learning may help most in closing educational gaps for 
marginalised learners. This is evident in examples of studies done in India,
rural China and Latin American countries that deliberately target 
disadvantaged students from low-income backgrounds or aim to address 
issues relating to quality education (e.g. Carrillo et al., 2011; Mo et al., 2013).
Many parents of the most marginalised learners have neither the skills nor the 
money to provide remedial tutoring, while many teachers often do not have 
time to give students the individual attention they need. The ability of 
personalised technology to teach all students equally effectively, for instance 
as a complementary input to using existing computer resources, has been 
reported as offering the potential to narrow the urban-rural achievement gap 
and help disadvantaged populations (Bai et al., 2018). Indeed, students from 
disadvantaged family backgrounds (Lai et al., 2013), or who have less educated 
parents (Lai et al., 2015), may benefit more from such programmes. In settings 
where students are more likely to be substantially behind grade level, or 
where there is substantial heterogeneity, the effects of adaptive technology 
might be larger because technology can personalise education (Ma et al., 
2020). As a result, the relative impact of learning gains may be much greater 
for lower-attaining students (Muralidharan et al., 2019), although arguably 
such learners may be the most likely to have limited access to required 
technology.  
Positive effects have also been observed with regard to gender, which is 
indicative of the promising use of computer simulation and tutorial 
instructional strategies to bridge the academic gaps that might exist between 
male and female secondary science students (Gambari et al., 2015). Note, 
however, that other research has reported no similar positive effect for girls, 
nor indeed for high-performing students irrespective of their gender (Ma et al., 
2020). This is something also reported by Kumar and Mehra (2018), who found that, while students with low and medium mathematics attainment benefited significantly from the personalised homework, higher-attaining students did not benefit to the same degree. This might have been because the algorithm offered
too many easy questions that could be suboptimal for the learning needs of 
some high ability students. Other potential explanations include 
high-attaining students already knowing how to learn effectively (and hence 
are always more likely to do well), as well as the ‘gap’ being much smaller in 
terms of how much they can improve. 
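To make this attainment ceiling concrete, the sketch below implements a deliberately simple difficulty-adaptive question selector of the kind discussed above. It is an illustrative assumption only: the item bank, thresholds and update rule are hypothetical and do not reproduce the algorithms used in Mindspark or in Kumar and Mehra’s (2018) homework system.

```python
# Minimal sketch of a difficulty-adaptive question selector (illustrative only;
# NOT the algorithm used in any of the studies cited above).
import random

# Hypothetical item bank: difficulty runs from 1 (easiest) to 10 (hardest).
ITEM_BANK = {d: [f"q{d}_{i}" for i in range(20)] for d in range(1, 11)}

def next_question(estimated_level: float) -> str:
    """Serve an item slightly above the learner's estimated level."""
    target = min(10, max(1, round(estimated_level) + 1))
    return random.choice(ITEM_BANK[target])

def update_level(estimated_level: float, correct: bool) -> float:
    """Nudge the level estimate up or down after each response."""
    step = 0.5 if correct else -0.5
    return min(10.0, max(1.0, estimated_level + step))

# A lower-attaining learner keeps receiving items just above their current
# level, whereas a high-attaining learner quickly reaches the top of the bank
# and is then served only difficulty-10 items, mirroring the ceiling effect
# whereby higher-attaining students gain less from such systems.
level = 8.0
for _ in range(5):
    question = next_question(level)
    level = update_level(level, correct=True)
    print(question, level)
```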

Theme 2: The role of teachers and appropriate teacher professional development
The central role of teachers and teacher professional development 
While the exact ways in which technology-supported personalised learning is 
implemented vary, evidence on the role of teachers in such implementation is overwhelmingly consistent: introducing personalised learning technology should not be interpreted as diminishing the importance of the teacher. For instance, Buchel and colleagues (2020) found that while students benefited from additional mathematics instruction, the learning gains were greater when that instruction was delivered using personalised learning technology alongside an experienced teacher rather than a supervisor who offered no pedagogical support. It is possible that
the availability of the teacher to provide immediate feedback is 
complemented by the potential of the technology to deliver individualised 
materials (at the pace and level of the learner) which has benefits for the 
progress of the whole class. 
Overall, the majority of research on technology-supported personalised learning in LMICs has trialled supplementary approaches, in which students use the personalised technology outside of class instruction and without input from the teacher (see RQ1). Importantly, it appears that studies reporting success
typically rely on the teacher or a knowledgeable expert to ensure the quality 
of the software’s instructional content and the alignment between class 
teaching and further practice for students. The few studies that have 
compared substitute and complementary approaches to using personalised 
technology have consistently reported no advantages when the technology 
replaces the teacher (Gambari et al., 2016a, 2016b; Linden, 2008).  
Thus, reported research should not be interpreted as supporting a reduced 
emphasis of the role of teachers in education. Rather, since the delivery of 
education involves tasks that vary for individual students and situations, and 
requires complex contextually aware communication, technology should be 
viewed as a complement (rather than substitute) to teachers (Muralidharan et 
al., 2019). This is, of course, a common message emerging from EdTech 
research across recent decades and it is no less applicable here. Where a 
technology-supported personalised learning system is reported to have been 
used, learners have themselves recognised the role of the teacher as a helpful 
guide in the learning process (87% of 388 students; Casas et al., 2014).  
Using technology in this way can include deploying it to perform routine tasks 
to free up teachers to spend more time on aspects of education where they 
have comparative advantages over technology (e.g. supporting group learning strategies that can help develop social and other non-cognitive skills; Perera & Aboal, 2017a). Personalised approaches using cognitive tutoring
systems that provide self-contained lessons can help to mitigate common
barriers to using educational software (such as the preparation time teachers 
require; Ogan et al., 2012). In cases where teachers cannot be in class, such 
technology could potentially assist substitute teachers or aides and 
supplement existing lessons, thereby facilitating a dynamic interaction 
between the teacher, system and learner by tracking student engagement 
and learning (Mutahi et al., 2015). How personalised technology can provide 
analytics or support data-analysis-intensive tasks (Muralidharan et al., 2019) is 
also likely to be an important focus of future research, particularly in those 
contexts where it is not possible for teachers to be physically present with 
students. As also highlighted in Theme 1, student progress may be hampered 
by limited teacher knowledge; hence, investing in the skills of teachers 
through offering professional development programmes is important (Buchel 
et al., 2020; Mo et al., 2014). When integrating technology-supported 
personalised learning approaches, teachers should be trained on the effective 
pedagogical use of the technology (through seminars, workshops and 
conferences; Gambari et al., 2016a).  
Additionally, there appears to be some limited evidence indicating the 
effectiveness of electronic tutoring as a tool for promoting conceptual change 
among in-service teachers themselves. Quantitative data collected from 1,049 
South African science teachers who attended 54 in-service teacher workshops 
suggest that individual use of the software can be effective in developing new 
knowledge, especially for those who already have relatively high levels of prior 
knowledge (Stott & Case, 2014).  
Addressing constraints on teaching and learning 
Provided they are operational and available, reported personalised
technological interventions appear to be well received by teachers (who 
broadly agree that they offer efficient and effective learning accompaniments; 
e.g. Mutahi et al., 2017). Teachers’ intention to use such systems, however, is 
strongly dependent on how well the system is aligned with their teaching 
practices, students’ learning habits, and whether the content on the platform 
is made available in a language that can be understood by students 
(Zualkernan et al., 2013). Teachers must also reconcile their usual
one-size-fits-all delivery model, in line with the order in which their curriculum 
expects them to teach concepts, with the notion of different pathways for 
different students. 
In addition to enabling ‘teaching at the right level’ (see Theme 1), personalised 
learning software may help in addressing other constraints on teaching and 
learning. For instance, in the case of the Mindspark software, the high quality 
of content, combined with effective delivery and interface, can help 
circumvent constraints of teacher human capital and motivation. Algorithms 
for analysing patterns of student errors and providing differentiated feedback, 
and follow-up content that is administered in real time, also enable more 
relevant and more frequent feedback (Muralidharan et al., 2019). As a result, 
promoting the targeted use of personalised learning technology may be an 
attractive option for governments and NGOs operating in settings with low 
teacher quality. This is because learning software can empower teachers to 
improve the quality of their teaching, particularly when they themselves 
struggle with particular concepts they have to teach (Buchel et al., 2020). 
Other ways in which technology-supported personalised learning may support teaching include out-of-school uses (e.g. through easy-to-implement personalised homework; Kumar & Mehra, 2018) and the provision of extensive information on student performance to better guide teacher effort in the classroom without adding to teacher workload (Muralidharan et al., 2019).

Theme 3: Pedagogical and motivational affordances  


There is a close link between the affordances provided by technology and the 
manner in which it is implemented. Complementing the previous discussion 
in Themes 1 and 2, in this subsection other potential affordances of 
technology-supported personalised learning are considered.  
Peer interaction, feedback and scaffolding 
While the idea of personalised learning may on the surface appear to relate to 
a more ‘solitary’ understanding of education, some evidence points to the 
potential benefits of personalised learning for collaborative working. Peer 
interaction can be promoted directly through personalised technologies or 
enabled offline as students use the technology to acquire core knowledge and 
skills that allows them to contribute to group-based work taking place outside 
of the technology itself.  
For instance, in Ogan and colleagues’ (2012) study on the use of mathematics 
tutoring software in middle schools in Latin America, students collaborated 
extensively while using a technology primarily designed for individual use; the 
pace of work was often interdependent, and work often occurred at 
classmates’ computers in addition to their own. Further, the authors observed 
that the greater the (group) use in the class, the greater the advantage that 
the students obtained. Such findings have led to calls for research to explore 
how personalised technology may be used within classrooms to promote 
conceptual change through scaffolding and peer tutoring (Araya & Van der 
Molen, 2013), and active learner participation and classroom dialogue (Stott & 
Case, 2014). The way that technology-supported personalised learning can 
enable comparison and competition between peers has also been suggested 
as a contributing factor to positive learning gains (Brunskill et al., 2010; Bai et 
al., 2018). Consideration has also been given to how students’ social skills 
might be fostered (Ige, 2019).  
While the features of technology-supported learning initiatives differ 
according to many factors, including the intended audience and deployment 
location, a case study on how interactive adaptive tutor software (Wayang 
Outpost) has been used to support mathematics learners (Grades 5-12) in 
Pakistan is useful in demonstrating how such technology can be designed to 
support pedagogy by: 
● Modelling (introduces the topic via worked examples, making steps 
explicit, and working through a problem aloud);  
● Providing practice with coaching (offering multimedia feedback and 
hints to sculpt performance to match or resemble that of an expert);
● Scaffolding (putting into place strategies and methods to support 
student learning);  
● Providing affective support (via characters that reflect on emotions, encourage students to persevere, and demystify misconceptions about mathematics problem solving);
● Encouraging reflection (self-referenced progress charts allow students 
to look back and analyse their performance) at key moments of loss or 
boredom (Zualkernan et al., 2013).
Such technology features have been reported to improve students’ learning 
efficiency and productivity (Ito et al., 2019) and enable teachers to spend more 
time on supporting group-based learning strategies that may help build social 
and other non-cognitive skills (Muralidharan et al., 2019).  
Impact on learner motivation 
Technology-supported personalised learning appears to be well received by 
most learners and has a broadly motivational impact as well as improving 
subject learning. For example, after the implementation of a cognitive tutoring strategy for mathematics learners in Latin America, a high percentage of students in the intervention group (n=388) reported increased motivation toward learning maths (67%), felt more certain about their abilities to solve maths problems (68%), and viewed the technology as a useful tool that substantially helped their learning process (81%; Casas et al., 2014). Other
evidence corroborates this conclusion. This includes a study showing that 
secondary school students in Nigeria performed better on chemistry achievement and motivation tests when compared to those taught without computer simulations (Gambari et al., 2016a). Positive effects on student
interest in mathematics have also been found (whereas there was no effect on 
maths interest from extra time learning maths; Ma et al., 2020). Indeed, this 
‘interest-oriented stimulation’ is regarded by some researchers as one of the 
main sources of improvement among students (Bai et al., 2018), although this 
may in part be due to a novelty effect.  
A more general positive impact on student motivation as a result of 
technology-supported personalised learning is also reported. This includes the 
adaptive and/or gamified capabilities of technology increasing the probability 
that students will remain engaged and challenged (Brunskill et al., 2010; Ma et 
al., 2020), in a way that can significantly increase their interest in learning (Lai 
et al., 2015) and aspirations for their future education level (Bai et al., 2018; Ito 
et al., 2019). Trials of emotionally intelligent personalised mathematics 
software that provides encouragement and support while students learn 
algebra indicate the creative potential of technology-supported personalised 
learning to simulate interactions similar to those provided by a teacher
(Andallaza et al., 2012). Other research also reveals a strong positive correlation 
between performance and engagement (Mutahi et al., 2017). Questions 
remain, however, about whether such motivational benefits manifest across 
different age and subject groups. For instance, Ito and colleagues (2019) 
reported only a very slight change in motivation and self-esteem in younger 
learners following the introduction of a computer-aided instruction 
programme. Other issues must also be considered, including the problem of 
questions that do not challenge those at higher attainment levels (Kumar & 
Mehra, 2018) or how to prevent learners from ‘gaming’ a system to get better 
results (Mutahi et al., 2017).  

Theme 4: Potential challenges and barriers in implementation
Cost 
As outlined above, due to the constraints of the RER process and scope, we do 
not differentiate between the distinct ways in which technology has been 
used to support personalised learning (i.e. whether this is implemented as a 
supplementary, integrative or substitute approach; see RQ1). Such 
heterogeneity presents a challenge to drawing firm conclusions about the 
costs associated with technology-supported personalised learning initiatives. 
Our findings in this regard are, therefore, tentative and further research is 
recommended to unpack such factors. Nonetheless, this initial exploration 
indicates that implementing technology-supported personalised learning 
need not be prohibitively expensive, even if it may be somewhat more expensive than non-technology based solutions.
Banerjee and colleagues (2007) reported the cost of a non-technology based 
tutor-led programme for developing primary school literacy and numeracy 
skills at US$2.25 per student per year, with technology-supported 
programmes costing $15.18 per student per year (including the cost of 
computers and assuming a five-year depreciation cycle). In terms of cost per given improvement in test scores, scaling up the non-technology-based programme would therefore be much more cost-effective, as it brings about a similar increase in test scores at a much lower cost. Other research has
concluded that the cost-effectiveness of one personalised-learning technology is broadly on par with other interventions to improve
student performance in LMICs (e.g. a girls scholarship program, cash 
incentives for teachers and new textbooks), though less cost-effective than 
remedial education and teacher training programmes (Linden, 2008). In an 
experiment by Ma and colleagues (2020), however, the researchers found that 
the marginal costs of paper workbooks are unsurprisingly lower than those 
associated with technology and lead to roughly similar effects on academic 
performance. Importantly, workbooks also avoid the high fixed and maintenance costs of computers, internet connections, and the extra space needed to securely house such equipment.
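To illustrate how such cost comparisons are typically made, the short sketch below computes a cost-effectiveness ratio (cost per student per standard deviation of test-score gain) for a non-technology and a technology-supported programme. The per-student annual costs are those reported by Banerjee et al. (2007) above; the effect sizes are hypothetical placeholders rather than figures taken from any study.

```python
# Minimal sketch of a cost-effectiveness comparison.
# Costs per student per year are those quoted above (Banerjee et al., 2007);
# the effect sizes are HYPOTHETICAL placeholders, not reported results.

def cost_per_sd_gain(cost_per_student: float, effect_size_sd: float) -> float:
    """Cost (US$) per student for one standard deviation of test-score gain."""
    return cost_per_student / effect_size_sd

tutor_led = cost_per_sd_gain(cost_per_student=2.25, effect_size_sd=0.3)    # assumed 0.3 SD gain
tech_based = cost_per_sd_gain(cost_per_student=15.18, effect_size_sd=0.3)  # assumed 0.3 SD gain

print(f"Tutor-led programme:        ${tutor_led:.2f} per SD gained")
print(f"Technology-supported (CAL): ${tech_based:.2f} per SD gained")

# With similar effect sizes, the lower-cost programme is the more
# cost-effective option to scale, which is the point made above.
```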
Such findings have prompted interest in how lower cost (and less 
resource-intensive) technology-supported personalised learning initiatives 
may be implemented in LMIC contexts — for instance, adaptive multi-user
software that splits screen resources and pushes different questions to 
individual input devices (Brunskill et al., 2010). Beyond an upfront investment, 
such software can be provided at low cost or even open access, which 
improves its scalability potential. Another approach includes 
computer-generated personalised homework, which is reported to be both 
somewhat effective (showing a 4.16% improvement in exam scores in a study 
involving 240 students) and inexpensive, as associated costs can be spread
over a large number of students when applied on a large scale (e.g. less than 
$1.00 per student; Kumar & Mehra, 2018).  
In summary, additional work is needed to explore the cost implications 
associated with technology-supported personalised learning initiatives. This is 
a complex matter that involves more than the cost of software development or the purchase of a device. Models of technology-supported
personalised learning that charge fees may limit the ability of low-income 
students to access them (Muralidharan et al., 2019). Donated (up-to-date) 
hardware (Banerjee et al., 2007), ‘online’ programmes (e.g. Open Educational Resources or Massive Open Online Courses) and government-led initiatives may all play a role in enabling greater access to personalised and adaptive
learning technology (Muralidharan et al., 2019).  
Infrastructure, scalability and sustainability 
In a similar manner, further research is needed on other factors in the broader EdTech ecosystem, including the potential to scale and sustain technology-supported personalised learning initiatives.
Significant resource constraints and challenges (e.g. intermittent network connectivity and lack of battery power) have been reported in the deployment
of technology-supported personalised learning programmes, and this should 
be a consideration when developing systems for resource-constrained regions 
or countries (Mutahi et al., 2017). Weak technology infrastructure, poor 
equipment maintenance, poorly prepared technical support personnel, high 
frequency of electric supply problems, and unstable connections to the 
internet have all been reported to present problems; in addition, such 
technical difficulties may be more pronounced in students’ homes (Araya & 
Van der Molen, 2013). ‘Start-up’ costs associated with the development and 
maintenance of adaptive software have also been flagged as a potential concern, indicating that more research is needed on the trade-offs between adaptive and non-adaptive software (Ma et al., 2020). Beyond issues of technological deployment (such as the lack of local servers and networks because of poor internet bandwidth, and the lack of technical assistance for setting up computer labs), the potential impact of changing political priorities and of teachers’ attitudes (owing to a lack of confidence and engrained practices, particularly among more established teachers) on scalability and sustainability must also be considered (Casas et al., 2014).
While ‘traditional’ software-based technology-supported personalised 
learning programmes may sometimes be particularly difficult and costly to 
implement (compared to other EdTech uses that potentially do not require as 
high a learner-to-device ratio), solutions that bypass some of these problems 
have been proposed (e.g. ‘online’ computer-assisted learning; Bai et al., 2018). 
Such an approach is reported to eliminate the need to manually install and maintain software, while enabling learners to log in ‘anywhere and
anytime’. Additional features, such as the integration of social functions (Bai et 
al., 2018), may open up new avenues for learning. Other personalised 
approaches, such as computer-generated personalised homework (Kumar & 
Mehra, 2018), have also been reported as relatively easy to implement with
minimal need for external monitoring. Moreover, one thing is clear from the 
literature: access to technology alone is insufficient (Ito et al., 2019). 

4. Recommendations 
Personalised learning in LMICs, as both a concept and a practice, remains in 
its infancy. In general, there is still much to learn about the potential benefits 
of personalised learning, including how learning environments that can adapt 
to the unique needs and strengths of students and allow them to have 
greater ownership of their learning may enable more meaningful and 
effective education (Gro, 2017). Nonetheless, this RER demonstrates that there 
is a growing base of evidence on the impact of technology-supported 
personalised learning to support school-age learners in LMIC contexts.  
Following a systematic search of the literature since 2006, 24 studies in 12 
countries were identified. On the whole, an encouraging and positive impact 
on learning outcomes is reported. As previously discussed, the limitations of 
the RER, the heterogeneity of included studies, and the fact that the majority of
included research reports on the use of technology-supported personalised 
learning approaches in a school (or school-like) context must be considered 
when drawing conclusions. Despite these challenges, recommendations can 
be made to inform educational decision makers, including donors and those 
in government and NGOs, about the potential to use technology-supported 
personalised learning as a response to the current pandemic in LMICs:  
■ Technology-supported personalised learning appears to offer 
significant promise to improve learning outcomes, including 
potentially ‘out-of-class’ and ‘out-of-school’ learning. Such approaches have been used successfully to provide remedial instruction in mathematics and science. Further research is needed, however, to support these claims
and it is important to note that most existing research conducted 
‘out-of-school’ has been in classroom-type settings with support from 
facilitators. It is also unclear how long any learning gains persist over 
time. 
■ The adaptive nature of technology-supported personalised learning 
to ‘teach at the right level’ is key as it enables students to learn at 
their own pace and according to their current proficiency.​ It can 
deliver individually customised resources and activities for all students 
regardless of the extent of heterogeneity in learning levels in the class. 
Importantly, these adaptive features appear to make a difference to 
learning, while technology with fewer personalised affordances does not 
seem to positively impact learning in the same way. Of particular 
significance in the context of mass school shutdowns, given that many 
learners are likely to require additional support upon returning to school, is that technology-supported personalised learning may help most in closing educational gaps for marginalised learners.
■ Technology-supported personalised learning may be most beneficial 
in closing educational gaps for lower-attaining students, potentially 
including those returning to school after an absence.​ Much of the 
evidence points to it being an effective avenue for delivering remedial 
instruction. Questions remain, however, about whether the approach is 
as effective for higher-attaining learners. Moreover, ‘personalised 
learning’ does not necessarily mean ‘individualised learning’; it can 
include group-level adaptation and some research points to the 
beneficial nature of student collaboration in this context (as in many 
others). Indeed, technology-supported personalised learning can also 
open up a range of other important pedagogical and motivational 
affordances (e.g. relating to feedback and the scaffolding of learning). 
■ Any introduction of personalised learning technology should not be 
interpreted as decreasing the importance of the teacher, but rather 
enhancing it.​ Technology-supported personalised learning approaches 
appear to have promise in helping to teach or remediate deficiencies in 
student understanding as well as in potentially helping teachers 
improve their subject and conceptual knowledge. This is particularly 
important to note when considering low-resource contexts where 
teaching quality may be low. Such approaches have potential to 
function as a medium for continuous learning beyond classroom 
instruction.  
■ Implications for cost and infrastructure are unclear, but using 
existing hardware solutions is likely to help to reduce costs and 
increase access.​ While significantly more research is needed into the 
costs associated with technology-supported personalised learning, a 
number of studies report that such an approach need not necessarily be 
prohibitively expensive. Whether the ‘added value’ of 
technology-supported approaches is sufficient to merit the additional 
expenditure remains to be determined. Using existing hardware 
solutions (e.g. mobile devices or desktop computers in those areas 
where these are readily available) can clearly help to reduce associated 
costs and enable greater numbers of students to access personalised 
learning through technology. In settings without sufficient 
infrastructure, it is likely that implementation costs will be high. 
Further robust quantitative, qualitative and/or secondary research is needed 
to investigate the various complex and nuanced factors associated with 
technology-supported personalised learning presented in the RER. In addition 
to addressing questions relating to cost effectiveness, a particularly important
consideration for future research is to understand which approach to the use 
of technology in personalising student learning will have the greatest impact 
on learning outcomes (including how this varies according to countries, 
culture and context). Integrated approaches to design, research and 
development (e.g. design-based research), that feature close collaboration 
with practitioners and learners as an integral part of the research process in 
order to solve ‘real-world’ educational problems, may be particularly fruitful. 
Such approaches can help to engender ‘buy in’ and avoid situations where 
personalisation technologies developed in higher-income countries are 
‘parachuted’ into LMICs (Zualkernan, 2016). Other avenues of research could 
include: rigorous comparison of EdTech personalised adaptive learning and 
non-EdTech personalised learning approaches; greater consideration of 
differences in the use of personalised technologies in urban and rural settings; 
nuanced investigations into learning outcomes (e.g. broken down by gender 
and level of achievement over time); how the role of the teacher may change in
the presence of personalised technology; and consideration of the 
motivational affordances of technology-supported personalised learning from 
both teacher and learner perspectives (particularly in contexts where a 
teacher may not be physically present with students).  
One important area noticeably absent from the analysis relates to the ethics of 
technology-supported personalised learning. There are, of course, many 
assumptions that underpin personalised technologies that warrant scrutiny. 
This includes whether there is a risk of perpetuating a narrow idea of what it 
means to ‘succeed’ academically (e.g. due to an overt focus on ‘traditional’ 
learning outcomes such as test scores); whether personalised learning risks 
promoting individualistic learning aspirations; whether valuing more ‘closed’ 
tasks over ‘open’ ones may be to the detriment of deeper learning 
experiences; and in what ways personalised data collection impinges upon 
students’ privacy.  
It is also worth noting that the majority of research to date has been
undertaken in a school context. Many of the most disadvantaged learners will 
not have regular access to schooling in the traditional sense (much like in the 
present situation given the Covid-19 pandemic). Future technology-supported 
personalised learning initiatives should potentially look, therefore, to 
specifically target such learners, in particular lower-attaining students who are 
left behind in ‘business-as-usual’ instruction (Muralidharan et al., 2019).  
 
 

5. Annex A: Bibliography  
N.b. Items marked with an asterisk (*) are those included in the final set of 24 studies.

*Andallaza, T. C. S., Rodrigo, M. M. T., Lagud, M. C. V., Jimenez, R. J. M., & Sugay, J. O. (2012). Modeling the Affective States of Students Using an Intelligent Tutoring System for Algebra. Proceedings of The Third International Workshop on Empathic Computing (IWEC). p. 4.

*Araya, R., & Van der Molen, J. (2013). Impact of a blended ICT adoption model on Chilean vulnerable schools correlates with amount of on online practice. In: Proceedings of the Workshops at the 16th International Conference on Artificial Intelligence in Education (AIED), Memphis, USA, 9–13 July 2013. http://ceur-ws.org/Vol-1009/aied2013ws_volume6.pdf#page=43

Arksey, H., & O'Malley, L. (2005). Scoping studies: Towards a methodological framework. International Journal of Social Research Methodology, 8(1), 19–32.

Azevedo, J. P. W. D., Hasan, A., Goldemberg, D., Iqbal, S. A., & Geven, K. M. (2020). Simulating the Potential Impacts of COVID-19 School Closures on Schooling and Learning Outcomes: A Set of Global Estimates (English). Policy Research Working Paper No. WPS 9284; COVID-19 (Coronavirus). Washington, D.C.: World Bank Group. http://documents.worldbank.org/curated/en/329961592483383689/Simulating-the-Potential-Impacts-of-COVID-19-School-Closures-on-Schooling-and-Learning-Outcomes-A-Set-of-Global-Estimates

*Bai, Y., Tang, B., Wang, B., Auden, E., & Mandell, B. (2018). Impact of Online Computer Assisted Learning on Education: Evidence from a Randomized Controlled Trial in China. REAP Working Paper, 51.

*Banerjee, A. V., Cole, S., Duflo, E., & Linden, L. (2007). Remedying Education: Evidence from Two Randomized Experiments in India. The Quarterly Journal of Economics, 122(3), 1235–1264. https://doi-org.ezp.lib.cam.ac.uk/10.1162/qjec.122.3.1235

Beetham, H. (2005). "Personalisation in the curriculum: A view from learning theory". In S. de Freitas & C. Yapp (eds.), Personalizing learning in the 21st century, pp. 17–24. Network Educational Press.

*Brunskill, E., Garg, S., Tseng, C., Pal, J., & Findlater, L. (2010). Evaluating an Adaptive Multi-User Educational Tool for Low-Resource Environments. Proceedings of the IEEE/ACM International Conference on Information and Communication Technologies and Development. http://www.cs.cmu.edu/afs/cs.cmu.edu/Web/People/ebrun/ictd_brunskill2010.pdf
*Buchel, K., Jakob, M., Kuhnhanss, C., Steffen, D., & Brunetti, A. (2020). The Relative Effectiveness of Teachers and Learning Software. Evidence from a Field Experiment in El Salvador. Working Paper No. 2006, Department of Economics, University of Bern. URL: www.vwi.unibe.ch (last access: 10.03.2020).

*Casas, I., Imbrogno, J., Ochoa, S. F., & Vergara, A. (2014). Adapting a cognitive tutoring strategy for mathematics in Latin America. In Fifth International Workshop on Culturally-Aware Tutoring Systems (CATS2014) (p. 27).

*Carrillo, P. E., Onofa, M., & Ponce, J. (2011). Information Technology and Student Achievement: Evidence from a Randomized Experiment in Ecuador. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.1818756

Cuban, L. (2018). A Continuum of Personalized Learning (Second Draft). https://larrycuban.wordpress.com/2018/09/27/second-draft-a-continuum-of-personalized-learning/

Education from disruption to recovery (n.d.). UNESCO. Retrieved from https://en.unesco.org/covid19/educationresponse

Escueta, M., Quan, V., Nickow, A. J., & Oreopoulos, P. (2017). Education technology: An evidence-based review. National Bureau of Economic Research.

FitzGerald, E., Jones, A., Kucirkova, N., & Scanlon, E. (2018). A literature synthesis of personalised technology-enhanced learning: What works and why. Research in Learning Technology, 26. https://doi.org/10.25304/rlt.v26.2095

Garritty, C., Gartlehner, G., Kamel, C., King, V., Nussbaumer-Streit, B., Stevens, A., Hamel, C., & Affengruber, L. (2020). Cochrane Rapid Reviews: Interim Guidance from the Cochrane Rapid Reviews Methods Group. Cochrane.

*Gambari, I. A., Gbodi, B. E., Olakanmi, E. U., & Abalaka, E. N. (2016a). Promoting Intrinsic and Extrinsic Motivation among Chemistry Students using Computer-Assisted Instruction. Contemporary Educational Technology, 7(1), 25–46. https://doi.org/10/ggv6gc

*Gambari, A. I., Shittu, A. T., Falode, O. C., & Adegunna, A. D. (2016b). Effects of computer-self interactive package (CSIP) on students' performance, achievement level and attitude toward mathematics at secondary school in Nigeria. Al-hikma Journal of Education, 3(1), 14.

*Gambari, I. A., Yusuf, M. O., & Thomas, D. A. (2015). Effects of Computer-Assisted STAD, LTM and ICI Cooperative Learning Strategies on Nigerian Secondary School Students' Achievement, Gender and Motivation in Physics. Journal of Education and Practice, 6(19), 16–28.

Gough, D. (2007). Weight of evidence: a framework for the appraisal of the quality and relevance of evidence. Research Papers in Education, 22(2), 213–228.

Gro, J. S. (2017). The State of the Field & Future Directions (p. 47). www.curriculumredesign.org

Holmes, W., Anastopoulou, S., Schaumburg, H., & Mavrikis, M. (2018). Technology-enhanced Personalised Learning: Untangling the Evidence. Robert Bosch Stiftung GmbH. http://www.studie-personalisiertes-lernen.de/en/

*Ige, O. A. (2019). Impact of Computer-Assisted Instructional Strategy on Schoolchildren's Social Skills. Journal of Social Studies Education Research, 10(4), 490–505.

*Ito, H., Kasai, K., & Nakamuro, M. (2019). Does Computer-aided Instruction Improve Children's Cognitive and Non-cognitive Skills?: Evidence from Cambodia. Discussion Papers 19040, Research Institute of Economy, Trade and Industry (RIETI). https://ideas.repec.org/p/eti/dpaper/19040.html

Kishore, D., & Shah, D. (2019). Using technology to facilitate educational attainment: Reviewing the past and looking to the future. Pathways for Prosperity Commission Background Paper Series, No. 23, p. 47.

*Kumar, A., & Mehra, A. (2018). Remedying Education with Personalized Homework: Evidence from a Randomized Field Experiment in India. Social Science Research Network (SSRN Scholarly Paper ID 2756059). https://doi.org/10.2139/ssrn.2756059

*Lai, F., Luo, R., Zhang, L., Huang, X., & Rozelle, S. (2015). Does computer-assisted learning improve learning outcomes? Evidence from a randomized experiment in migrant schools in Beijing. Economics of Education Review, 47, 34–48. https://doi.org/10/f7m2vj

*Lai, F., Zhang, L., Hu, X., Qu, Q., Shi, Y., Qiao, Y., Boswell, M., & Rozelle, S. (2013). Computer assisted learning as extracurricular tutor? Evidence from a randomised experiment in rural boarding schools in Shaanxi. Journal of Development Effectiveness, 5(2), 208–231. https://doi.org/10/ggdcnf

Levac, D., Colquhoun, H., & O'Brien, K. K. (2010). Scoping studies: Advancing the methodology. Implementation Science, 5(1), 69.

*Linden, L. L. (2008). Complement or Substitute? The Effect of Technology on Student Achievement in India. JPAL Working Paper.

*Ma, Y., Fairlie, R., Loyalka, P., & Rozelle, S. (2020). Isolating the "Tech" from EdTech: Experimental Evidence on Computer Assisted Learning in China. National Bureau of Economic Research (No. w26953). https://doi.org/10.3386/w26953
*Mo, D., Swinnen, J., Zhang, L., Yi, H., Qu, Q., Boswell, M., & Rozelle, S. (2013). Can One-to-One Computing Narrow the Digital Divide and the Educational Gap in China? The Case of Beijing Migrant Schools. World Development, 46, 14–29. https://doi.org/10.1016/j.worlddev.2012.12.019

*Mo, D., Zhang, L., Luo, R., Qu, Q., Huang, W., Wang, J., Qiao, Y., Boswell, M., & Rozelle, S. (2014). Integrating computer-assisted learning into a regular curriculum: Evidence from a randomised experiment in rural schools in Shaanxi. Journal of Development Effectiveness, 6(3), 300–323. https://doi.org/10/gf5f39

*Muralidharan, K., Singh, A., & Ganimian, A. J. (2019). Disrupting Education? Experimental Evidence on Technology-Aided Instruction in India. American Economic Review, 109(4), 1426–60. DOI: 10.1257/aer.20171112

*Mutahi, J., Bent, O., Kinai, A., Weldemariam, K., Sengupta, B., & Contractor, D. (2015). Seamless blended learning using the Cognitive Learning Companion: A systemic view. IBM Journal of Research and Development, 59(6), 8:1–8:13. https://doi.org/10/ggxwf9

*Mutahi, J., Kinai, A., Bore, N., Diriye, A., & Weldemariam, K. (2017). Studying engagement and performance with learning technology in an African classroom. Proceedings of the Seventh International Learning Analytics & Knowledge Conference, 148–152. https://doi.org/10/ggvw56

No evidence to back the idea of learning styles (2017, March). The Guardian. https://www.theguardian.com/education/2017/mar/12/no-evidence-to-back-idea-of-learning-styles (Accessed 26/06/20)

Office of Educational Technology. (2017). Reimagining the Role of Technology in Education: 2017 National Education Technology Plan Update. U.S. Department of Education. http://tech.ed.gov

*Ogan, A., Walker, E., Baker, R. S. J. D., Rebolledo Mendez, G., Jimenez Castro, M., Laurentino, T., & de Carvalho, A. (2012). Collaboration in cognitive tutor use in Latin America: Field study and design recommendations. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1381–1390.

Perera, M., & Aboal, D. (2017a). Evaluación del impacto de la plataforma adaptativa de matemática en los resultados de los aprendizajes. Centro de Investigaciones Económicas, Montevideo.
 
Perera, M., & Aboal, D. (2017b). Diferencias por género y contexto socioeconómico del impacto de la Plataforma Adaptativa de Matemática 1. Centro de Investigaciones Económicas, Montevideo.
 

Ryan, R. (2013). Cochrane Consumers and Communication Review Group: data synthesis and analysis. http://cccrg.cochrane.org

Stott, A., & Case, J. M. (2014). Electronic Tutoring as a Tool for Promoting Conceptual Change: A Case Study of In-service Science Teacher Workshops. African Journal of Research in Mathematics, Science and Technology Education, 18(2), 139–150. https://doi.org/10/ggvw59

Thomas, J., & Harden, A. (2008). Methods for the thematic synthesis of qualitative research in systematic reviews. BMC Medical Research Methodology, 8(1), 45.

UNESCO (2019). Education from disruption to recovery. Retrieved from https://en.unesco.org/covid19/educationresponse

Zualkernan, I., Arroyo, I., & Woolf, B. (2013). Towards Localization of Automated Tutors for Developing Countries. AIED Workshops. http://ceur-ws.org/Vol-1009/aied2013ws_volume6.pdf#page=6

Zualkernan, I. A. (2016). Personalized Learning for the Developing World. In The Future of Ubiquitous Learning (pp. 241–258). Springer.

   


6. Annex B: Search terms  


 

Source | Search terms | Records returned | After screening
Google Scholar (GS) | "Personalised Adaptive Learning" | 132 | 2
GS | "Personalized Adaptive Learning" | 619 | 3
GS | "Personalised technology-enhanced learning" | 34 | 4
GS | "Personalized technology-enhanced learning" | 76 | 12
GS | "Technology-enhanced personalised learning" | 18 | 6
GS | "Technology-enhanced personalized learning" | 30 | 18
GS | "Personalised TEL" | 13 | 5
GS | "Personalized TEL" | 11 | 3
GS | "Personalised learning environment" | 593 | 20
GS | "Personalized learning environment" | 3490 | 5
GS | "Teaching at the right level" | 266 | 5
GS | "Combined Activities for Maximized Learning" | 15 | 1
(Each of the "AND" searches below was combined with:) ... ("Edtech" OR "Education technology" OR "digital learning" OR "eLearning" OR school) AND ("africa" OR "LMIC" OR "developing world" OR "developing country*" OR "ICT4D" OR "global south")
GS | "Personalised education" AND | 160 | 6
GS | "Personalized education" AND | 626 | 6
GS | "Personalised learning" AND | 1810 | 6
GS | "Personalized learning" AND | 3660 | 5
GS | "adaptive learning" AND | 6910 | 1
GS | "adapting learning" AND | 396 | 5
GS | "Differentiated learning" AND | 1310 | 8
GS | "Computer-assisted instruction" AND | 6160 | 27
GS | "Computer-assisted learning" AND | 8130 | 10
GS | "Computer-aided learning" AND | 1530 | 3
GS | "Intelligent tutoring system" AND | 765 | 5
GS | "Exploratory learning environments" AND | 33 | 0
GS | "Adaptive Educational Hypermedia" AND | 112 | 2
GS | "Adaptive hypermedia" AND | 414 | 1
GS | "Personalised Adaptive Learning" AND | 7 | 3
GS | "Personalized Adaptive Learning" AND | 43 | 3
SPUD (TE or (TT and PP)) | Teaching at the Right Level | 2 | 2
SPUD (TE or (TT and PP)) | TARL | 43 | 0
SPUD (TE or (TT and PP)) | personalised | 534 | 4
SPUD (GC.HM or GR or GD) | personalized | 255 | 2
SPUD (GC.HM or GR or GD) | adaptive learning | 42 | 3
SPUD (GC.HM or GR or GD) | intelligent tutoring system | 76 | 8
SPUD (GC.HM or GR or GD) | computer assisted learning | 20 | 4
TOTAL | | 38,335 | 198
 

7. Annex C: Data description spreadsheet  


 

Available here 

 
 
