

To cite this article: Xiaoshu Xu, Xiaoshen Zhu & Fai Man Chan (2020): System design of Pintrich's SRL in a supervised-PLE platform: a pilot test in higher education, Interactive Learning Environments, DOI: 10.1080/10494820.2020.1802296

To link to this article: https://doi.org/10.1080/10494820.2020.1802296

Published online: 12 Aug 2020.

INTERACTIVE LEARNING ENVIRONMENTS
https://doi.org/10.1080/10494820.2020.1802296

System design of Pintrich’s SRL in a supervised-PLE platform: a pilot test in higher education

Xiaoshu Xu^a, Xiaoshen Zhu^a and Fai Man Chan^b

^a School of Foreign Studies, Wenzhou University, Wenzhou, People’s Republic of China; ^b Macau Polytechnic Institute, Macau, People’s Republic of China

ABSTRACT
The Personal Learning Environment (PLE) represents a shift of the learning paradigm towards learner-centered pedagogy, where users become masters of their own learning. PLEs are best used by learners with Self-regulated Learning (SRL) abilities, and previous research showed that learners felt lost or socially isolated in PLEs due to their limited SRL abilities. To increase the accessibility of PLEs, this research used IDEF0 to analyze the system design of Pintrich’s SRL in a PLE platform. A supervised-PLE platform was developed with the International English Language Testing System (IELTS) as its subject matter and was pilot-tested for two years in one of the largest comprehensive universities in East China. The Distance Education Learning Environments Survey (DELES) was carried out among 285 learners to test their perceptions of the PLE-IELTS platform, and the MOOC engagement scale (MES) survey was used among 96 learners to diagnose their engagement on the platform. The DELES results showed that the majority of respondents were positive towards the PLE platform. The MES results indicated active engagement on the platform, with cognitive engagement scoring the highest and social engagement the lowest. Future research on the system design of SRL in the PLE platform is suggested.

ARTICLE HISTORY
Received 12 January 2019
Accepted 24 July 2020

KEYWORDS
System design; IDEF0; supervised-PLE; Pintrich’s SRL; higher education

1. Introduction
The ongoing transition from the Learning Management System (LMS) towards Web 2.0-based personalized learning systems emphasizes learners’ needs and preferences. Individual technologies that embody the spirit of the Personal Learning Environment (PLE) have emerged in recent years, such as wikis, weblogs, newsreaders, communication tools, and calendaring software. PLEs have gradually drawn significant attention from educational organizations due to their potential educational and cost benefits (Monova-Zjeleva, 2005; Tseng et al., 2008).
However, can PLEs be combined with formal education? The concept of the PLE is listed in the 2011 Horizon Report as an emerging technology likely to have a large impact on teaching and learning within education around the globe, with a time-to-adoption of four to five years (Johnson et al., 2011). Some research has already revealed that PLEs could help integrate formal and informal learning in the higher education context (McGloughlin & Lee, 2010).
Can students take advantage of the benefits that PLEs have to offer for formal education?
Learners might not be able to learn independently as expected because of the complexity of
the web-based learning system (Mcloughlin & Luca, 2002).

CONTACT Xiaoshu Xu lisaxu@wzu.edu.cn School of Foreign Studies, Wenzhou University, Wenzhou 325035, People’s Republic of China. Supplemental data for this article can be accessed at https://doi.org/10.1080/10494820.2020.1802296. © 2020 Informa UK Limited, trading as Taylor & Francis Group.

Learning in these hybrid learning environments typically involves the use of numerous self-regulatory processes such as planning,
reflection, and metacognitive monitoring and regulation (Azevedo, 2005, 2007, 2008, 2009;
Greene & Azevedo, 2009, 2010; Moos & Azevedo, 2008; Veenman, 2007). Bartolomé and Steffens (2011) point out that it is vital for learning within such learner-centered environments to be accompanied by strategies that promote Self-regulated Learning (SRL), so that learners remain attentive, motivated, and engaged in learning tasks. SRL is an important factor in determining learners’ success in online learning environments (Artino, 2008; Dabbagh & Kitsantas, 2005; Puzziferro, 2008).
Unfortunately, the typical learner does not engage in these complex, adaptive cognitive and metacognitive processes when learning with Student-Centered Learning Environments (Azevedo & Witherspoon, 2009; Biswas et al., 2010). SRL is challenging for many students in a technology-mediated learning environment, especially an online one, where they may lack immediate support and feel lost or socially isolated (Cho et al., 2010; Sun & Rueda, 2012). As authors such as Clark et al. (2009) point out, learners need support, guidance and training to make the best use of social media; otherwise, the potential and effectiveness of PLEs as technology-mediated learning environments will be limited.
Is it possible to develop a PLE platform that is accessible to learners without special training in SRL? Theoretically, SRL improves with practice, requires an “enabling environment”, and can be supported by ICT tools (Nussbaumer et al., 2014). A PLE, as a self-regulated and evolving environment of tools, services and resources, can help foster learners’ skills in managing, monitoring, and reflecting on their own learning (McGloughlin & Lee, 2010). A PLE platform developed under the guidance of an SRL model (SRL-guided PLE) is assumed to help regulate learners’ activities.
Is there any existing language teaching-learning system in formal education that can promote self-regulated learning? The Learning Management System (LMS) has been widely adopted worldwide in formal education. However, an LMS treats the learning environment as a predefined element of instruction created and controlled by an educational authority. Learners have little control over the self-organization of different learning activities within an LMS due to its predefined settings. Thus, an SRL-guided supervised-PLE platform is worth investigating.

2. Literature review
2.1. Definition of PLE
A PLE is a user-customized environment based on an understanding of learning as an ongoing process with numerous resource providers. Most scholars consider PLEs as personal systems/environments or as collections of tools and external services. Definitions of the PLE can be classified into two perspectives: (1) the knowledge management perspective and (2) the technical perspective.

From the knowledge management perspective, Harmelen (2006) claims that PLEs are systems that support learners in setting their learning goals, managing their learning, and communicating with others while achieving those goals. Attwell (2007) regards the PLE as a virtual learning ecological system, not simply a technical term but a display of a technology-supported individualized learning style. Rubin (2010) and McGloughlin and Lee (2010) believe that PLEs empower learners to control their own learning by prompting them to select tools and resources to create, organize and package learning content (Sanders, 1992). Rubin adds that PLEs emphasize individual responsibility for organizing learning. PLEs are built on externally hosted (in-the-cloud) Web 2.0 tools and services designed to help learners aggregate and share resources, participate in collective knowledge generation, and manage their own meaning making (Dabbagh & Reo, 2011), rather than integrating different services into a centralized system.
From the technical perspective, the PLE can be viewed as a concept whereby learners use technology to choose the appropriate tools and resources to facilitate learning. Stephen Downes (2005) believes that the PLE is “a personal learning center, where content is reused and remixed according to the learner’s own needs and interests. It was a collection of interoperating applications – an environment rather than a system”. FitzGerald (2006) defines the PLE as “a collection of free, distributed, web-based tools, usually centered around a blog, linked together and aggregating content using RSS (Really Simple Syndication) feeds and simple HTML scripts”.
Both perspectives agree that the PLE has reflexive functionalities such as goal setting, control of learning resources and results, and support for the internalization/externalization of learning processes and results. The authors of this project agreed that the PLE is neither a specific software application nor a one-size-fits-all learning environment, but a concept of individually designed, learner-centric, Web 2.0-based environments. In this study, the PLE is supervised, which means that non-tracked learning is integrated into learning analysis using a highly elaborated and intertwined learning analytics system. Learners can monitor and regulate their learning on the PLE platform with help from peers and instructors as well as from the system.

2.2. Definition of self-regulated learning


Self-regulated learning (SRL) theory emerged in the mid-1980s, aiming to understand the cognitive, metacognitive, behavioral, motivational, and emotional/affective aspects of learning. SRL is distinguished by whether the learner displays personal initiative, perseverance, and adaptive skill in pursuing it. The theory has become one of the most important areas of research within educational psychology and is regarded as a vital factor for learning success even in the digital era (Nussbaumer et al., 2015; Zimmerman & Schunk, 2008). The field of SRL comprises various theoretical perspectives that make different assumptions and focus on different constructs, processes, and phases (Azevedo et al., 2010; Dunlosky & Lipko, 2007; Pintrich et al., 2000; Zimmerman, 2008).
In the literature, there are six widely known SRL models with consolidated theoretical and empirical backgrounds, including Zimmerman’s Social Cognitive Model of Self-regulation (1990, 1998), Boekaerts’ Model of Adaptable Learning (1992), Winne’s Four-stage Model of Self-regulated Learning (Winne & Hadwin, 1998), and Pintrich’s General Framework for SRL (2000). Pintrich organized SRL research using a taxonomy focusing on the phases and areas of self-regulation. His work addresses for the first time the relationship between SRL and motivation (Pintrich et al., 1993). Bringing online learning into full play demands high self-regulation (Azevedo et al., 2004) and motivation from learners. This requirement also applies to the PLE, which shares the basic features of online learning.
According to Pintrich’s (2000) model (see Table 1), SRL comprises four phases: (1) forethought, planning and activation; (2) monitoring; (3) control; and (4) reaction and reflection. The areas in which self-regulation can occur fall into four broad categories: cognition, motivation/affect, behavior and context. By crossing phases and areas, Pintrich presents a four-by-four grid offering a comprehensive picture that includes a significant number of SRL processes (e.g. prior content knowledge activation, efficacy judgements, self-observation of behavior).

To sum up, self-regulated learning can be regarded as a skill in setting goals and attaining them; students must be driven to self-regulate their learning and direct their own behavior (Kitsantas & Dabbagh, 2010), and given advice on how to choose and use appropriate learning strategies (Azevedo et al., 2004).

2.3. Existing SRL with learning environment


In terms of contemporary research on SRL with Student Centered Learning Environments (SCLEs), at the macro level, most studies have drawn on Winne and colleagues’ (Butler & Winne, 1995; Winne, 2001; Winne & Hadwin, 1998) Information Processing Theory (IPT) of SRL. Their models focus on the underlying cognitive and metacognitive processes, the accuracy of metacognitive judgments, and the control processes used to achieve particular learning goals (Hacker et al., 2009).
Table 1. Phases and areas of Pintrich’s self-regulated learning model.

Phase 1: Forethought, planning, and activation
- Cognition: target goal setting; prior content knowledge activation; metacognitive knowledge activation
- Motivation/affect: goal orientation adoption; efficacy judgements; ease-of-learning judgements (EOLs); perceptions of task difficulty; task value activation; interest activation
- Behavior: [time and effort planning]; [planning for self-observation of behavior]
- Context: [perception of task]; [perception of context]

Phase 2: Monitoring
- Cognition: metacognitive awareness and monitoring of cognition (FOKs, JOLs)
- Motivation/affect: awareness and monitoring of motivation and affect
- Behavior: awareness and monitoring of effort, time use, need for help; self-observation of behavior
- Context: monitoring changing task and context conditions

Phase 3: Control
- Cognition: selection and adaptation of cognitive strategies for learning, thinking
- Motivation/affect: selection and adaptation of strategies for managing motivation and affect
- Behavior: increase/decrease effort; persist, give up; help-seeking behavior
- Context: change or renegotiate task; change or leave context

Phase 4: Reaction and reflection
- Cognition: cognitive judgements; attributions
- Motivation/affect: affective reactions; attributions
- Behavior: choice behavior
- Context: evaluation of task; evaluation of context

At the micro level, Azevedo et al. follow Winne’s model to provide a detailed analysis of the cognitive and metacognitive processes used by learners of all ages when using SCLEs, including hypermedia, simulations, intelligent tutoring systems, and multi-agent learning environments (Azevedo & Witherspoon, 2009; Greene & Azevedo, 2009).
Azevedo et al. (2012) describe the assumptions and components of a leading information-processing model of SRL, provide examples and definitions of key metacognitive monitoring processes and regulatory skills used when learning with SCLEs, and discuss implications for the future of student-centered SCLEs that focus on metacognition and SRL.
Kinnebrew et al. (2010) develop a learning-by-teaching environment, the Betty’s Brain project, which adopts Pintrich’s (2000) self-regulation model and utilizes learning-by-teaching and social constructivist learning frameworks to help students learn science and mathematics topics in a self-directed and open-ended setting. Its teaching interactions and agent feedback support students’ engagement and promote the development and use of educationally productive cognitive and metacognitive processes. The authors use hidden Markov models to analyze students’ activity sequences, reportedly with good results.
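As a simplified illustration of this kind of sequence analysis, the sketch below estimates transition probabilities between observed learner actions using a plain first-order Markov chain rather than a hidden one; the activity labels and log data are invented for the example.

```python
from collections import Counter, defaultdict

def estimate_transitions(sequences):
    """Estimate first-order Markov transition probabilities
    from observed activity sequences (lists of action labels)."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    # Normalize each row of counts into a probability distribution.
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}

# Hypothetical student activity logs (labels are invented).
logs = [
    ["read", "quiz", "read", "note", "quiz"],
    ["read", "note", "quiz", "quiz"],
]
probs = estimate_transitions(logs)
```

A hidden Markov model would additionally infer latent learning states behind these observed actions; the transition-counting step above is the shared starting point.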
As for SRL with PLEs, Hiebert (2006) draws heavily on ELGG and e-portfolios in his Collecting-Reflecting-Connecting-Publishing PLE Model, which manages past, present and future learning through a set of self-directed learning tools and generic activities. Nussbaumer et al. (2014) develop a framework that integrates guidance and reflection support for SRL in a PLE platform by implementing a learner model, SRL widgets, and monitoring and analytic tools; however, the platform was mainly used for research purposes. Informed by the Technology Acceptance Model and Innovation Diffusion Theory, Chatterjee et al. (2011) develop a framework for the adoption and diffusion of PLEs in commercial organizations, with ten factors grouped into four high-level categories. Some scholars develop self-monitoring tools and tools that reflect learning behavior in PLEs. For example, Schmitz et al. introduce CAMera, a tool for monitoring and reporting on learning behavior by collecting usage metadata from diverse application programs.
To sum up, the above studies explained the correlation of metacognition and SRL as important components of effective learning in SCLEs and the ways they were embodied in the learning environments. However, most previous research focuses on fostering a certain aspect of learners’ SRL from a Technology Enhanced Learning (TEL) perspective by designing a system/model or a tool/software. This project aims to develop a system design of SRL in a PLE, using IDEF0 analysis to generate a model with a hierarchical series of diagrams, text, and glossary cross-referenced to each other. The system design of SRL could be used by system designers as a reference for developing PLE functions and features; meanwhile, instructors could use it to design their own PLE-based courses.

2.4. Existing language teaching-learning systems


There have been various review studies on mobile and ubiquitous learning (Saleh & Bhat, 2015). These new strategies encourage the use of personal devices (tablets, smartphones, laptops) in language learning classes, integrating the idea of BYOD (Bring-Your-Own-Device). It is claimed that m-learning has a positive effect on learners’ engagement (Hargis et al., 2014) and can develop learners’ habits of self-learning and reflection (Wong & Looi, 2010). However, most mobile-assisted language learning (MALL) applications lack theoretical and methodological fundamentals (Moreno & Vermeulen, 2015), and m-learning itself lacks standardization and comparability (Viberg & Grönlund, 2013).

Meanwhile, Learning Management System (LMS) software is extensively used in many higher education language courses, for instance Edmodo, Moodle and Blackboard. However, some LMS software, such as Moodle, has a great number of implemented learning modules, which makes the platform complex to configure; some software, such as Blackboard, requires paid licenses; and other software, such as Socrative and Kahoot!, provides only a single functionality of the language teaching-learning process.
In sum, existing language learning system research focuses on software, mobile devices or technologies; however, none of these alternatives complies with every requirement. They either need to integrate several software tools or can implement only limited educational tasks. This research therefore aims to develop a system design of SRL in the PLE platform using IDEF0.

3. Research design
3.1. Research problem
Based on the above analysis, this study aims to tackle the following problem: how to model the process of an SRL system in a PLE platform to enhance platform accessibility?

To tackle this problem, the study made a system design of Pintrich’s (2000) SRL in the PLE platform using IDEF0 and constructed a PLE-IELTS platform to pilot-test it in one of the largest comprehensive universities in East China.

3.2. Research methodology


The research methodology comprised five steps: (1) situation analysis and problem definition; (2) system design of Pintrich’s SRL in a PLE platform using IDEF0; (3) application of the SRL system design in the PLE-IELTS platform; (4) application of the PLE-IELTS platform; (5) evaluation of the PLE-IELTS platform.

3.3. Situation analysis and problem definition


The IELTS course at the subject university in China started in 2015 as a one-semester elective course for sophomores. Since 2015, the university has attached importance to internationalization, and many short-term and long-term projects with international universities and companies have boomed, most of which require an IELTS score.
However, a handful of problems emerged during the application of the project. The major challenges were: firstly, a good number of learners selected the course for the sake of credits rather than interest or need, which led to weak engagement; secondly, learners’ English proficiency and majors were diverse within one class (an average of 8 different majors per class); thirdly, the schedule for the required two-week internship differed among majors.
After a comprehensive situational analysis, it was realized that these obstacles could all be tackled by the construction of a Personal Learning Environment. In PLEs, learners make decisions and have a choice over the content, the sequence of learning steps and, most importantly, the learning tools and the use of these tools to support individual learning (Buchem et al., 2014). Thus, their intrinsic and extrinsic motivation could be raised, which can greatly enhance their learning outcomes (Steffens, 2006; Fruhmann et al., 2010). To guarantee a more accessible PLE platform as discussed above, the construction of an SRL-guided PLE-IELTS platform was put on the agenda.

3.4. System design of SRL in a PLE platform using IDEF0


In Pintrich’s taxonomy model, there are four general phases and four different areas of self-regulation (see Table 1). To apply this model in the PLE, an activity to-do list was designed according to Pintrich’s (2004) phases and areas for self-regulated learning (see Table 2). It should be noted that although the four phases in the table represent a time-ordered sequence that learners would go through as they perform a task, in practice the phases are not hierarchically or linearly structured (Pintrich, 2000).
Process modeling and analysis are needed to identify the major functions required to properly implement SRL in the PLE platform. This research applied IDEF0 (Integration DEFinition level 0), a functional modeling methodology for the analysis and development of systems. IDEF0 describes any process as a series of linked activities, each with inputs and outputs. External or internal factors control each activity, and each activity requires one or more mechanisms or resources (Fülscher & Powell, 1999).
Figure 1 indicates how IDEF0 is used to depict inputs, outputs, controls, and mechanisms.

Inputs are data or objects that are consumed or transformed by an activity.


Outputs are data or objects that are the direct result of an activity.
Controls are data or objects that specify conditions that must exist for an activity to produce correct
outputs.

Table 2. SRL activity to-do lists.

Phase 1: Forethought, planning & activation
- Cognition: 1.1.1 Set target goal; 1.1.2 Activate prior content knowledge; 1.1.3 Activate metacognitive knowledge
- Motivation/affect: 1.2.1 Adopt goal orientation; 1.2.2 Make self-efficacy judgement; 1.2.3 Understand task difficulty & activate task value and interest
- Behavior: 1.3.1 Plan time & effort; 1.3.2 Plan for self-observation of behavior
- Context: 1.4.1 Interpret the task & context

Phase 2: Monitoring
- Cognition: 2.1.1 Maintain metacognitive awareness and monitoring of cognition
- Motivation/affect: 2.2.1 Detect & adapt motivation
- Behavior: 2.3.1 Monitor effort, time, behavior, and help seeking
- Context: 2.4.1 Monitor the task and context

Phase 3: Control & regulation
- Cognition: 3.1.1 Select & use cognitive strategies
- Motivation/affect: 3.2.1 Select & adapt motivation-managing strategies
- Behavior: 3.3.1 Increase/decrease effort; 3.3.2 Persist/give up; 3.3.3 Seek help
- Context: 3.4.1 Structure the environment to facilitate goal & task completion

Phase 4: Reaction & reflection
- Cognition: 4.1.1 Make adaptive attributions for the performance
- Motivation/affect: 4.2.1 Reflect on the emotion & reasons for the outcome
- Behavior: 4.3.1 Reflect on actual behavior
- Context: 4.4.1 Evaluate the environment and tasks

Source: Pintrich (2004). A conceptual framework for assessing motivation and self-regulated learning in college students. Educational Psychology Review, 16, 385-407.
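A platform implementing this design needs the to-do list in machine-readable form. The sketch below is one hypothetical way to encode a subset of Table 2 as a nested mapping from phase and area to numbered activities; the structure is an assumption, and only the labels come from the table.

```python
# Sketch: the SRL activity to-do list (Table 2) as a nested dict,
# keyed by phase, then by regulation area. Only a subset is shown.
SRL_ACTIVITIES = {
    "forethought": {
        "cognition": ["1.1.1 Set target goal",
                      "1.1.2 Activate prior content knowledge",
                      "1.1.3 Activate metacognitive knowledge"],
        "behavior": ["1.3.1 Plan time & effort",
                     "1.3.2 Plan for self-observation of behavior"],
    },
    "monitoring": {
        "cognition": ["2.1.1 Maintain metacognitive awareness and "
                      "monitoring of cognition"],
        "context": ["2.4.1 Monitor the task and context"],
    },
}

def activities_for(phase, area):
    """Look up the to-do items for one cell of the phase-by-area grid;
    an empty list means the cell is not populated."""
    return SRL_ACTIVITIES.get(phase, {}).get(area, [])
```

A lookup such as `activities_for("monitoring", "context")` then drives which prompts the platform presents at each point in the learning cycle.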
Figure 1. Integration Definition for Function Modelling (IDEF0) box format.

Mechanisms (or resources) support the successful completion of an activity but are not changed in
any way by the activity.
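The input-control-output-mechanism (ICOM) convention described above can be captured in a small data structure. The following sketch is a hypothetical encoding, not part of any standard IDEF0 tooling; the example instance is based loosely on the forethink activity of Node 1 described later, and its mechanism entry is an assumption.

```python
from dataclasses import dataclass, field

@dataclass
class Idef0Activity:
    """One IDEF0 box: an activity together with its ICOM arrows."""
    name: str
    inputs: list = field(default_factory=list)      # consumed or transformed
    controls: list = field(default_factory=list)    # conditions that must hold
    outputs: list = field(default_factory=list)     # direct results
    mechanisms: list = field(default_factory=list)  # supporting resources

# Hypothetical instance modeled on Node 1's first activity.
forethink = Idef0Activity(
    name="Forethink",
    inputs=["course pretest results"],
    controls=["learning outcomes", "PLE principle"],
    outputs=["learning goals", "self-efficacy judgement report",
             "task statement report"],
    mechanisms=["PLE platform guidance"],  # assumed resource
)
```

Chaining such activities, where one box's outputs appear among another's inputs, reproduces the linked-activity structure that the IDEF0 diagrams below express graphically.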
The application of IDEF0 to a system shows the interrelationships between different procedures, so that the impact of changing or updating any procedure on the remaining ones is clarified. IDEF0 generates a model consisting of a hierarchical series of diagrams, text, and glossary cross-referenced to each other (Perera & Liyanage, 2001). The following paragraphs describe the proposed system design of SRL in a supervised-PLE platform from the perspective of system users (especially instructors); meanwhile, system designers could also use it as a user requirement manual.
(1) The top-level diagram

The top-level context diagram (see Figure 2) was titled “Implement SRL in the PLE platform”. The purpose of the analysis was to facilitate the implementation of SRL in the PLE platform. The diagram was decomposed into the next-level diagram A0, which shows the basic four-process cycle of self-regulated learning (see Figure 3). This proposed SRL learning structure helps users to design courses on PLEs through guided activities and tasks along the dimensions of input, control and mechanism, where each activity is clearly decomposed into sub-activities.
(2) The four nodes of the SRL diagram

(1) Node 1: Initiate learning in the course design

Node 1 was decomposed into three main activities (see Figure 4). The first activity was forethink, where the system guides learners in setting learning goals and producing a self-assessment, a self-efficacy judgement report and a task statement report, based on course pretest results and constrained by the learning outcomes as well as the PLE principle.

The second activity was plan, whose inputs were the self-efficacy judgement report and task statement report generated by the first activity, together with the pretest results. The output was weekly schedules.

The third activity was action, which took the pretest result as input and generated revised learning goals and learners’ present knowledge level as supplementary input for the second activity.
(2) Node 2: Monitor learning in the course design

The decomposition of Node 2 is shown in Figure 5; it comprises two main activities. The first activity was perceive learning status, where the system guides learners to know their learning mastery level and their motivation strategy alternatives based on inputs such as the task list, learning gap and weekly schedule. This activity was constrained by the learning goals set in Node 1 and supported by the judgement of learning rubrics.

The second activity was monitor learning process, where learners gain a clear understanding of their weekly schedule status, task list status, and problem list based on inputs such as the task list, learning profile and weekly schedule. The whole process was monitored by the learning milestone and supported by task management software/tools.

Figure 2. Implement SRL in the PLE platform.
(3) Node 3: Control learning in the course design

Node 3 comprised two main activities, shown in Figure 6. The first activity was select cognitive & metacognitive strategies, where the system assists learners in producing selected cognitive strategies, motivation strategies, and a help-seeking item list based on inputs such as the weekly schedule, task list and problem list. The process was controlled by the learning goals and assisted by decision support tools and surveys/questionnaires as mechanisms.

The second activity was task management, where all the inputs of the first activity became the basis for helping learners update the weekly schedule and task list and generate learning modules. The process was controlled by the learning goals as well as the learning milestone, and supported by task management software/tools and the PLE/LMS system.

Figure 3. Apply the system design of SRL in course design.

Figure 4. Initiate learning in the course design.
(4) Node 4: Reflect learning in the course design

Node 4 was decomposed into two main activities (see Figure 7). The first activity was reflecting on learning process and emotion, where the instructors help learners produce a causal attribution report, an emotion attribution report, and an emotion adjustment list, constrained by the course outcome. The process drew on inputs such as the learning profile, learning assessment report and help-seeking list, and was supported by the causal attribution rubrics.

The second activity was evaluating the environment and task, where learners were guided to produce a learning schedule and goal adjustment list and a help-seeking adjustment list, monitored by the course outcome. The process was supported by the environment evaluation scale and task evaluation scale, with the task list and learning module as its inputs.

Figure 5. Monitor learning in the course design.

Figure 6. Control learning in the course design.

It should be noted that the IDEF0 diagrams presented above were further decomposed into lower levels, revealing greater detail about the process of realizing SRL in PLEs. Figure 8 shows the node diagram of the IDEF0 model, covering the functions and activities of the whole system design of SRL.

3.5. Construction of the SRL-guided PLE-IELTS platform

1. Infrastructure of the platform

The infrastructure of the PLE-IELTS platform was mainly based on the open-source learning management system Moodle and the content management system Joomla, which provided seamless integration with rich extensions and plugins. This empowered the extensibility and flexibility of the platform, connecting it to the cutting-edge Internet environment, such as different social media, cloud services and Internet application resources. The integration was achieved through different communication protocols such as Learning Tools Interoperability (LTI), SCORM and xAPI. Internet application resources such as Dropbox, Google Docs, Quizlet, TED-Ed, Vimeo, YouTube, Turnitin, BigBlueButton, Zoom, Panopto, Trello and Slack were also applied.

Figure 7. Reflect learning in the course design.

Figure 8. IDEF0 node diagram for system design of SRL.
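xAPI, one of the protocols listed above, records learning activity as actor-verb-object “statements”. The sketch below builds a minimal statement; the learner address and activity identifiers are placeholders rather than the platform’s actual configuration, while the verb URI is a standard ADL vocabulary entry.

```python
import json

def make_xapi_statement(actor_email, verb_id, verb_name,
                        object_id, object_name):
    """Build a minimal xAPI statement dict (actor, verb, object)."""
    return {
        "actor": {"objectType": "Agent",
                  "mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id,
                 "display": {"en-US": verb_name}},
        "object": {"id": object_id,
                   "definition": {"name": {"en-US": object_name}}},
    }

# Hypothetical record of a learner completing one activity.
stmt = make_xapi_statement(
    "learner@example.com",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "http://example.com/ielts/listening-week1", "IELTS Listening, Week 1",
)
print(json.dumps(stmt, indent=2))
```

In a full deployment such statements would be POSTed to a Learning Record Store, which is what lets the supervised platform integrate non-tracked learning into its analytics.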
2. Content construction of the platform

The content structure of the PLE-IELTS platform was built based on the 16-week course design (see Figure 4). In this research, the content comprised consumable materials and learner-generated materials. The former were packaged at different IELTS levels; in this project, the levels were IELTS 4.5-5, 5-6, and 6-7, together with an ICT learning package. The content took the form of videos, audios, documents and pictures (the multimedia materials were uploaded to Baidu Cloud, and the links were posted on the website). The latter comprised learners’ tasks and forum and blog posts.

3.6. Application of the platform


The whole project ran for two years (four rounds), with each round lasting one semester (16 weeks), from March 2017 to January 2019. A total of 410 participants joined the project. The functions and features of the SRL-guided PLE-IELTS platform were built according to the system design described in Section 3.4. Before applying the platform, it was important to make sure that learners were aware of the concept of the PLE and knew how to use the platform.

3.7. Learners’ perception of and engagement on the SRL-guided PLE-IELTS platform


Learners’ perception of the PLE-IELTS platform was evaluated by the Distance Education Learning
Environments Survey (DELES). Meanwhile, Deng et al.’s (2019) MOOC engagement scale (MES) was
applied to diagnose learners’ engagement on the platform.
1. Learners’ perceptions survey of the PLE-IELTS platform
(1) The Subjects
Altogether, 410 surveys on satisfaction with the PLE-IELTS platform were sent out on the platform during the four semesters. 285 respondents who attended the online program completed the questionnaires, representing a valid response rate of 69.5%.
(2) Instruments
As mentioned above, this research used the Distance Education Learning Environments Survey (DELES), which has been utilized in the United States to compare social work students' perceptions of their learning environments in blended (hybrid) classes. The survey modified the DELES slightly by converting the responses into 5-point Likert scale statements, where 5 (Strongly Agree) represented the maximum score of the scale and 1 (Strongly Disagree) represented the minimum score. Moreover, an open question asking for suggestions to improve the platform was included. The questionnaires, administered online, showed high reliability, with a coefficient of 0.987.
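The reported coefficient of 0.987 is consistent with an internal-consistency estimate such as Cronbach's alpha, although the article does not name the statistic. As an illustration (the data below are synthetic, not the survey's), alpha can be computed directly from the item-by-respondent matrix:

```python
def cronbach_alpha(rows):
    """Cronbach's alpha. rows: one list of k item scores per respondent."""
    k = len(rows[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([r[i] for r in rows]) for i in range(k)]   # per-item variance
    total_var = var([sum(r) for r in rows])                     # variance of sums
    return k / (k - 1) * (1 - sum(item_vars) / total_var)


# Three respondents answering three items perfectly consistently -> alpha near 1.0.
alpha = cronbach_alpha([[5, 5, 5], [3, 3, 3], [1, 1, 1]])
```

Values this close to 1 usually indicate that the items measure a single construct very consistently (or that some items are nearly redundant).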
(3) Demographics
The average age of the participants was 20, and their subject areas covered more than a dozen disciplines, ranging from Arts to Science.
(4) Results and discussion
Mean scores on a 5-point scale represented participants' attitudes towards the platform. Table 3 illustrates the distribution of mean scores. As Table 3 indicates, the participants' overall attitudes towards the supervised PLE-IELTS platform were positive (all item means ≥ 3.84). The lowest-scoring items (3.84-3.86) were statements such as "I study real cases related to the learning process", "I can relate what I have learned to life outside the learning environment", and "I involve real people and facts in my learning activities", which indicated that the participants wanted more connection between real-life experience and learning on the platform. Besides, another low-scoring item (3.86) was "The instructor would help me figure out problems in the learning process", which indicated that participants needed more guidance in the personalized virtual learning environment. The researchers speculated that these concerns arose because of:

1. Participants' lack of motivation in cooperative learning: in forum discussions, for example, a good number of them posted only once or twice each week, just enough to fulfill the basic task requirement.
2. Participants' missing knowledge and skills of dialogue: some of them did not even know how to start working together in a blended learning environment; for example, they rarely engaged in solving knowledge-related conflicts, reflecting on different points of view with peers, or asking questions about others' thinking.
3. Participants' limited ICT literacy: most participants showed low interest in searching for different technologies to support their learning processes, as Harris et al. (2009) pointed out.

The items with the highest scores (4-4.13) were "I share information with other learners", "I discuss my ideas with other students", "The work of the group is part of my activities", "I can pursue topics of interest", "I decide my own study", and "I study in my own way". These indicated, firstly, that learners enjoyed the self-regulated activities and recognized the value of the tasks; besides, group work such as the forum offered opportunities for knowledge sharing and social engagement, which helped build up their sense of belonging, relatedness, and achievement, as well as their social responsibility. This was consistent with the assumption of most self-regulation models that self-regulatory activities are directly linked to learners' achievement and performance (Pintrich, 2000).
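The item-level screening used above, picking out the lowest- and highest-scoring DELES items, is straightforward to reproduce. A small sketch using a subset of the item means from Table 3 (the mapping of item codes to statement wordings is an assumption):

```python
# Selected DELES item means taken from Table 3.
item_means = {"A6": 4.13, "B3": 4.06, "C1": 3.85, "D1": 3.84, "D2": 3.86, "F1": 4.04}

# Weakest three items, sorted by mean score ascending.
lowest = sorted(item_means, key=item_means.get)[:3]

# Items whose mean reaches 4.0 or above.
highest = [i for i, m in item_means.items() if m >= 4.0]

print(lowest)   # -> ['D1', 'C1', 'D2']
```

Run over all 34 items, the same two lines recover exactly the low (3.84-3.86) and high (4.0-4.13) groups discussed in the text.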
2. Learners’ engagement on the PLE-IELTS platform
It is increasingly acknowledged that student engagement plays a pivotal role in successful learning and teaching (Henrie et al., 2015; Trowler, 2010). This paper focused on engagement within the context of supervised Personal Learning Environments (PLEs), which required a validated, multidimensional scale. Deng et al.'s (2019) 12-item MOOC engagement scale (MES) was applied in this research; it consisted of four dimensions: behavioral, cognitive, emotional, and social engagement.
(1) The Subjects
Altogether, 98 surveys on learner engagement were sent out on the PLE-IELTS platform during the fourth semester. 96 respondents who attended the online program completed the questionnaires, representing a valid response rate of 98%.
Table 3. Descriptive statistics for the attitudes towards the PLE-IELTS platform.
N Minimum Maximum Mean Std. deviation Variance
A1 258 1 5 3.94 1.025 1.051
A2 257 1 5 3.86 1.019 1.038
A3 257 1 5 3.95 1.016 1.033
A4 258 1 5 3.93 1.039 1.08
A5 258 1 5 3.98 0.998 0.996
A6 257 1 5 4.13 0.958 0.917
A7 257 1 5 3.97 1.013 1.027
A8 257 1 5 3.95 1.043 1.087
B1 250 1 5 3.92 0.958 0.917
B2 250 1 5 3.94 1.008 1.017
B3 250 1 5 4.06 0.949 0.9
B4 250 1 5 4.03 0.954 0.911
B5 250 1 5 3.99 0.946 0.895
B6 250 1 5 4.05 0.93 0.865
C1 247 1 5 3.85 0.963 0.927
C2 247 1 5 4.01 0.956 0.914
C3 247 1 5 3.87 0.96 0.921
C4 247 1 5 3.9 0.989 0.978
C5 247 1 5 3.88 0.94 0.885
C6 247 1 5 3.96 0.932 0.868
C7 247 1 5 3.93 0.968 0.938
D1 244 1 5 3.84 0.946 0.895
D2 244 1 5 3.86 0.896 0.804
D3 244 1 5 3.97 0.925 0.855
D4 244 1 5 3.93 0.922 0.851
D5 244 1 5 3.9 0.952 0.907
E1 244 1 5 3.91 0.918 0.843
E2 244 1 5 3.95 0.922 0.85
E3 244 1 5 3.92 0.894 0.8
F1 244 1 5 4.04 0.913 0.834
F2 244 1 5 3.94 0.901 0.812
F3 244 1 5 3.96 0.955 0.912
F4 244 1 5 3.98 0.927 0.86
F5 244 1 5 4 0.943 0.889
Valid N (listwise) 244

(2) Instruments
This research used Deng et al.'s (2019) MOOC engagement scale (MES), which contained 12 items intended to measure students' levels of behavioral engagement (e.g. setting a regular time each week to work on the MOOC), cognitive engagement (e.g. searching for further information when encountering something puzzling), emotional engagement (e.g. feeling inspired to expand knowledge in the MOOC), and social engagement (e.g. contributing regularly to course discussions). We made a minor revision, changing the word "MOOC" to "PLE-IELTS platform", to fit the present context.
The participants were required to respond on a 6-point Likert scale, where 6 (Strongly Agree) represented the maximum score of the scale and 1 (Strongly Disagree) represented the minimum score. The questionnaires, administered online, showed high reliability, with a coefficient of 0.931.
(3) Demographics
All the participants were in year two (from semester 2018-2019-1), and their subject areas were mainly Finance and Engineering.
(4) Results and discussion
The MES survey had four dimensions, namely behavioral, cognitive, emotional, and social engagement, and each dimension had three items. Mean scores on a 6-point scale represented participants' engagement on the platform, as shown in Table 4 below.
As Table 4 indicates, the participants' overall attitudes towards their engagement on the supervised PLE-IELTS platform were positive (mean score = 4.1 > 3). Among the four dimensions, the lowest scores were for social engagement (average mean score = 3.88) and emotional engagement (average mean score = 4.08), while the highest score belonged to cognitive engagement (average mean score = 4.3). Besides, the statement "I often responded to other learners' questions" in social engagement received the lowest item score (mean score = 3.52); in contrast, "I set aside a regular time each week to work on the PLE-IELTS platform" in behavioral engagement received the highest.

Table 4. Descriptive statistics for the engagement on the PLE-IELTS platform.

N Minimum Maximum Mean Std. deviation Variance
B1 regular time 96 2 6 4.45 1.141 1.303
B2 took notes 96 2 6 4.16 1.04 1.081
B3 revisited notes 96 2 6 3.89 1.055 1.113
C1 searched for information 96 2 6 4.36 0.975 0.95
C2 went over it again 96 1 6 4.25 1.095 1.2
C3 watched it again 96 2 6 4.28 1.073 1.152
E1 inspired 96 1 6 3.97 0.967 0.936
E2 interesting 96 2 6 4.12 0.997 0.995
E3 enjoyed watching 96 2 6 4.14 1.042 1.087
S1 responded 96 1 6 3.52 1.105 1.221
S2 contributed 96 1 6 4.25 0.973 0.947
S3 shared learning materials 96 1 6 3.88 1.163 1.353
Behavioral engagement 96 2 6 4.1632 0.94899 0.901
Cognitive engagement 96 2 6 4.2986 0.93594 0.876
Emotional engagement 96 1.67 6 4.0764 0.86041 0.74
Social engagement 96 2 6 3.8819 0.89701 0.805
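Each MES dimension score in Table 4 is, up to rounding (the dimension rows in the table were presumably computed from the unrounded raw responses), the mean of its three item scores. A sketch of that aggregation using the rounded item means from the table:

```python
# Item means per MES dimension, copied from Table 4.
items = {
    "behavioral": [4.45, 4.16, 3.89],  # B1-B3
    "cognitive":  [4.36, 4.25, 4.28],  # C1-C3
    "emotional":  [3.97, 4.12, 4.14],  # E1-E3
    "social":     [3.52, 4.25, 3.88],  # S1-S3
}

# Dimension score = mean of the three item means, rounded to two decimals.
dims = {d: round(sum(v) / len(v), 2) for d, v in items.items()}

print(dims)  # cognitive highest, social lowest
```

The small differences from the table's dimension rows (e.g. 4.17 here versus 4.1632 for behavioral engagement) come from averaging already-rounded item means rather than the raw data.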
The results revealed that the participants believed they were well engaged cognitively on the PLE-IELTS platform, including searching for further information when encountering something puzzling, and going over a concept or watching a video again until it was understood. Extensive learning materials, recorded videos, and other links or apps on the platform were required to facilitate learners' full cognitive engagement on the PLE platform. In terms of social and emotional engagement, more inspiring and interesting learning tasks or materials were needed in future course content design, and learning peers as well as learning communities could be further strengthened in the platform design, for example through the discussion board and peer-evaluation tasks. Moreover, the SRL system design could be improved with respect to emotional strategies and the control of, and reflection on, help-seeking strategies.

4. Conclusion
Under the guidance of Pintrich's (2000) General Framework for SRL, this research developed a system design of SRL in the PLE platform and pilot-tested it by constructing a supervised PLE-IELTS platform. The system design of SRL used IDEF0, which generated a model with a hierarchical series of diagrams, text, and a glossary cross-referenced to each other. The system design supported a variety of SRL processes, including goal setting, prior-knowledge activation, evaluation of metacognitive strategies, and content and task evaluation. Meanwhile, it scaffolded metacognitive processes, including judgements of learning and feelings of knowing, and monitored learners' progress towards goals.
After a year's construction of the platform, a two-year project (four rounds in total) was carried out in one of the largest comprehensive universities in East China. At the end of each round, a learner perception survey was carried out to evaluate the environment of the supervised PLE. The survey results of 285 respondents revealed that the participants were generally positive towards the PLE-IELTS platform and especially enjoyed being able to control their own learning, which was consistent with Ferrer-Cascales and his co-researchers' (2011) finding that Active Learning and Autonomy were the most influential factors in distance education students' satisfaction in Spain. Moreover, to investigate learners' engagement with the SRL-guided PLE-IELTS platform, Deng et al.'s (2019) MOOC engagement scale (MES) was applied. The survey results of 96 participants indicated that their overall attitudes towards engagement on the supervised PLE-IELTS platform were positive, especially regarding cognitive engagement.

5. Limitations and suggestions for future research


This paper presented the use of IDEF0 to comprehensively analyze the system design of SRL in the PLE platform. The IDEF0 analysis helped identify the interrelationships between the different procedures of SRL and clarify the impact of changing and/or updating any of these procedures on the remaining ones. However, the IDEF0 models for SRL may need to be redrawn in the future to add or remove information for certain processes after a period of practical implementation of the PLE platform. Meanwhile, some features of the SRL-guided PLE platform were yet to be implemented and others were still under testing; for instance, identifying and interpreting learners' learning strategies by tracing their interactions with the platform, and detecting the influence of platform feedback on their learning activities. In addition, the IELTS course involved only a limited sample population; thus, the study risks being a special case and the methodology being restricted to IELTS courses. Last, further research on the effectiveness of the platform could be carried out in addition to the Distance Education Learning Environments Survey (DELES) and the MOOC engagement scale (MES) survey.
In future research, the system design of SRL in the PLE platform could be applied in fields other than language teaching. Meanwhile, it is suggested that an ICT maturity model survey be carried out beforehand among teachers, institutions, and learners to secure the platform's application and sustainability. Training in reading the IDEF0 diagrams is also needed for platform users before application.

Disclosure statement
No potential conflict of interest was reported by the author(s).

Funding
This work was supported by the project "Research and Practice on Cultivating Application-Oriented Talents in New Technologies from the Perspective of Industry-Education Integration" [grant number YJC880053] and the project "Exploration and Practice of Output-Based College English Teaching in the Context of Internationalization" [grant number JW2018120403].

ORCID
Xiaoshu Xu http://orcid.org/0000-0002-0667-4511

References
Artino, A. R. (2008). Motivational beliefs and perceptions of instructional quality: Predicting satisfaction with online train-
ing. Journal of Computer Assisted Learning, 24(3), 260–270. https://doi.org/10.1111/j.1365-2729.2007.00258.x
Attwell, G. (2007). The personal learning environments-the future of eLearning? eLearning Papers 2(1). ISSN 1887-1542.
http://www.elearningeuropa.info/files/media/media11561.pdf
Azevedo, R., Cromley, J. G., & Selbert, D. (2004). Does adaptive scaffolding facilitate students’ ability to regulate their
learning with hypermedia? Contemporary Educational Psychology, 29(3), 344–370. https://doi.org/10.1016/j.
cedpsych.2003.09.002
Azevedo, R., Johnson, A., Chauncey, A., & Burkett, C. (2010). Self-regulated learning with meta tutor: Advancing the
science of learning with meta cognitive tools. In M. Khine, & I. Saleh (Eds.), New science of learning: Computers, cogni-
tion, and collaboration in education (pp. 225–247). Springer.
Azevedo, R. (2005). Computer environments as metacognitive tools for enhancing learning. Educational Psychologist, 40
(4), 193–197. https://doi.org/10.1207/s15326985ep4004_1
Azevedo, R. (2007). Understanding the complex nature of self-regulated learning processes in learning with computer-
based learning environments: An introduction. Metacognition and Learning, 2(2/3), 57–65. https://doi.org/10.1007/
s11409-007-9018-5
Azevedo, R. (2008). The role of self-regulation in learning about science with hypermedia. In D. Robinson, & G. Schraw
(Eds.), Recent innovations in educational technology that facilitate student learning (pp. 127–156). Information Age.
Azevedo, R. (2009). Theoretical, conceptual, methodological, and instructional issues in research on metacognition and
self-regulated learning: A discussion. Metacognition and Learning, 4(1), 87–95. https://doi.org/10.1007/s11409-009-
9035-7.
Azevedo, R., & Witherspoon, A. M. (2009). Self-regulated learning with hypermedia. In D. J. Hacker, J. Dunlosky, & A. C.
Graesser (Eds.), Handbook of metacognition in education (pp. 319–339). Routledge.
Azevedo, R., Behnagh, R., Duffy, M., Harley, J., & Trevors, G. (2012). Metacognition and self-regulated learning in student-centered learning environments. Theoretical Foundations of Student-Centered Learning Environments, 171–197.
Bartolomé, A., & Steffens, K. (2011). Technologies for self-regulated learning. Self-regulated learning in technology enhanced
learning environments. Sense Publishers, 21-23.
Biswas, G., Jeong, H., Kinnebrew, J., Sulcer, B., & Roscoe, R. (2010). Measuring self-regulated learning skills through social
interactions in a teachable agent environment. Research and Practice in Technology-Enhanced Learning, 5(2), 123–152.
https://doi.org/10.1142/S1793206810000839
Boekaerts, M. (1992). The adaptable learning process: Initiating and maintaining behavioral change. Applied Psychology,
41(4), 377–397. https://doi.org/10.1111/j.1464-0597.1992.tb00713.x
Buchem, I., Tur, G., & Hoelterhof, T. (2014). Learner control in personal learning environments: A cross-cultural study.
Learning and Diversity in the Cities of the Future, 15, 14–53.
Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational
Research, 65(3), 245–281. https://doi.org/10.3102/00346543065003245
Chatterjee, A., Law, E., Owen, G., Velasco, K., & Mikroyannidis, A. (2011). A framework for the adoption and diffusion of per-
sonal learning environments in commercial organizations: An exploratory study in the learning and development sector in
the UK. In: PLE Conference 2011, 11–13 Jul 2011, Southampton, UK.
Cho, M.-H., Shen, D., & Laffey, J. (2010). Relationships between self-regulation and social experiences in asynchronous
online learning environments. Journal of Interactive Learning Research, 21, 297–316.
Clark, W., Logan, K., Luckin, R., Mee, A., & Oliver, M. (2009). Beyond Web 2.0: Mapping the technology landscapes of young
learners. Journal of Computer Assisted Learning, 25(1), 56–69. https://doi.org/10.1111/j.1365-2729.2008.00305.x
Dabbagh, N., & Kitsantas, A. (2005). Using web-based pedagogical tools as scaffolds for self-regulated learning.
Instructional Science, 33(5–6), 513–540. https://doi.org/10.1007/s11251-005-1278-3
Dabbagh, N., & Reo, R. (2011). Impact of web 2.0 on higher education. In D. W. Surry, T. Stefurak, & R. Gray (Eds.),
Technology integration in higher education: Social and organizational aspects (pp. 174–187). IGI Global.
Deng, R., Benckendorff, P., & Gannaway, D. (2019). Learner engagement in MOOCs: Scale development and validation.
British Journal of Educational Technology, 51, 245–262. https://doi.org/10.1111/bjet.12810.
Downes, S. (2005). E-learning 2.0, E-learn magazine. http://www.elearnmag.org/subpage.cfm?section=articles&article=
29-1
Dunlosky, J., & Lipko, A. R. (2007). Metacomprehension. Current Directions in Psychological Science, 16(4), 228–232. https://
doi.org/10.1111/j.1467-8721.2007.00509.x
FitzGerald, S. (2006). Creating your personal learning environment. http://seanfitz.wikispaces.com/creatingyourple
Fruhmann, K., Nussbaumer, A., & Albert, D. (2010). A psycho-pedagogical framework for self-regulated learning in a respon-
sive open learning environment. In International conference eLearning Baltics Science (eLBa Science 2010), Rostock,
Germany.
Fülscher, J., & Powell, S. G. (1999). Anatomy of a process mapping workshop. Business Process Management Journal, 5(3),
208–238. https://doi.org/10.1108/14637159910283029
Greene, J. A., & Azevedo, R. (2009). A macro-level analysis of SRL processes and their relations to the acquisition of a soph-
isticated mental model of a complex system. Contemporary Educational Psychology, 34(1), 18–29. https://doi.org/10.
1016/j.cedpsych.2008.05.006
Greene, J. A., & Azevedo, R. (2010). The measurement of learners’ self-regulated cognitive and metacognitive processes
while using computer-based learning environments. Educational Psychologist, 45(4), 203–209. https://doi.org/10.1080/
00461520.2010.515935
Hacker, D., Dunlosky, J., & Graesser, A. (2009). Handbook of metacognition in education. Erlbaum.
Hargis, J., Cavanaugh, C., Kamali, T., & Soto, M. (2014). A federal higher education iPad mobile learning initiative:
Triangulation of data to determine early effectiveness. Innovative Higher Education, 39(1), 45–57. https://doi.org/10.
1007/s10755-013-9259-y
Harmelen, V. M. (2006). Personal learning environments. In R. Kinshuk, P. Koper, P. Kommers, D. Kirschner, W. Didderen, &
Sampson (Eds.), Proceedings of the sixth international conference on advanced learning technologies (pp. 815–816). IEEE
Computer Society.
Harris, J., Mishra, P., & Koehler, M. (2009). Teachers’ technological pedagogical content knowledge and learning activity
types. Journal of Research on Technology in Education, 41(4), 393. https://doi.org/10.1080/15391523.2009.10782536
Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-mediated learning: A
review. Computers & Education, 90, 36–53. https://doi.org/10.1016/j.compedu.2015.09.005
Hiebert, J. (2006). Personal learning environment model. Retrieved April 13 2006 from http://headspacej.blogspot.com/
2006/02/personal-learning-environment-model.html
Johnson, L., Adams, S., & Haywood, K. (2011). The NMC horizon report: 2011 K-12 edition. New Media Consortium. http://
www.nmc.org
Kinnebrew, J., Biswas, G., & Sulcer, B. (2010). Measuring self-regulated learning skills through social interactions in a teach-
able agent environment. AAAI Fall Symposium on Cognitive and Metacognitive Educational Systems (MCES), Arlington,
VA.
Kitsantas, A., & Dabbagh, N. (2010). Learning to learn with integrative learning technologies (ILT): A practical guide for aca-
demic success. Information Age.
McLoughlin, C., & Lee, M. J. W. (2010). Personalized and self-regulated learning in the web 2.0 era: International exemplars of innovative pedagogy using social software. Australasian Journal of Educational Technology, 26(1), 28–43. https://doi.org/10.14742/ajet.1100
McLoughlin, C., & Luca, J. (2002). A learner-centered approach to developing team skills through web-based learning and assessment. British Journal of Educational Technology, 33(5), 571–582. https://doi.org/10.1111/1467-8535.00292
Monova-Zheleva, M. (2005). Adaptive learning in web-based educational environments. Cybernetics and Information Technologies, 5(1), 44–55.
Moos, D. C., & Azevedo, R. (2008). Exploring the fluctuation of motivation and use of self-regulatory processes during
learning with hypermedia. Instructional Science, 36(3), 203–231. https://doi.org/10.1007/s11251-007-9028-3
Moreno, A. I., & Vermeulen, A. (2015). Using VISP (videos for speaking), a mobile app based on audio description, to
promote English language learning among Spanish students: A case study. Procedia - Social and Behavioral
Sciences, 178, 132–138. https://doi.org/10.1016/j.sbspro.2015.03.169
Nussbaumer, A., Dahn, I., Kroop, S., Mikroyannidis, A., & Albert, D. (2015). Supporting self-regulated learning. In S. Kroop, A.
Mikroyannidis, & M. Wolpers (Eds.), Responsive open learning environments (pp. 17–48). Springer.https://doi.org/10.
1007/978-3-319-02399-1_2.
Nussbaumer, A., Kravcik, M., Renzel, D., Klamma, R., Berthold, M., & Albert, D. (2014). A Framework for Facilitating Self-
Regulation in Responsive Open Learning Environments. Computers and Society, 1–41. arXiv:1407.5891.
Perera, T., & Liyanage, K. (2001). IDEF based methodology for rapid data collection. Integrated Manufacturing Systems, 12
(3), 187–194. https://doi.org/10.1108/09576060110391147
Pintrich, P. R., Marx, R. W., & Boyle, R. A. (1993). Beyond cold conceptual change: The role of motivational beliefs and class-
room contextual factors in the process of conceptual change. Review of Educational Research, 63(2), 167–199. https://
doi.org/10.3102/00346543063002167
Pintrich, P. R. (2000). The role of goal orientation in self-regulated learning. In M. Boekaerts, P. R. Pintrich, & M.
Zeidner (Eds.), Handbook of self-regulation (pp. 451–502). Academic Press. https://doi.org/10.1016/B978-012109890-
2/50043-3.
Pintrich, P. R. (2004). A conceptual framework for assessing motivation and self-regulated learning in college students.
Educational Psychology Review, 16(4), 385–407. https://doi.org/10.1007/s10648-004-0006-x
Pintrich, P. R., Wolters, C., & Baxter, G. (2000). Assessing metacognition and self-regulated learning. In G. Schraw & J.
Impara (Eds.), Issues in the measurement of metacognition (pp. 43–97).
Puzziferro, M. (2008). Online technologies self-efficacy and self-regulated learning as predictors of final grade and satis-
faction in college-level online courses. American Journal of Distance Education, 22(2), 72–89. https://doi.org/10.1080/
08923640802039024
Rubin, N. (2010). Creating a user-centric learning environment with Campus Pack personal learning paces: PLS Webinar,
Learning Objects Community. http://community.learningobjects.com/Users/Nancy.Rubin/Creating_a_User-entric_
Learning
Saleh, S. A., & Bhat, S. A. (2015). Mobile learning: A systematic review. International Journal of Computer Applications, 114
(11), 1–5. https://doi.org/10.5120/20019-1406
Sanders, J. R. (1992). Evaluating school programs: An educator’s guide. Corwin Press. 112-230.
Steffens, K. (2006). Self-regulated learning in technology-enhanced learning environments: Lessons of a European peer
review. European Journal of Education, 41(3/4), 353–379. https://doi.org/10.1111/j.1465-3435.2006.00271.x
Sun, J. C.-Y., & Rueda, R. (2012). Situational interest, computer self-efficacy and self-regulation: Their impact on student
engagement in distance education. British Journal of Educational Technology, 43(2), 191–204. https://doi.org/10.1111/j.
1467-8535.2010.01157.x
Trowler, V. (2010). Student engagement literature review. The Higher Education Academy, 11(1), 1–15.
Tseng, K. H., Chiang, F. K., & Hsu, W. H. (2008). Interactive processes and learning attitudes in a web-based problem-based
learning (PBL) platform. Computers in Human Behavior, 24(3), 940–955. https://doi.org/10.1016/j.chb.2007.02.023
Veenman, M. (2007). The assessment and instruction of self-regulation in computer-based environments: A discussion.
Metacognition and Learning, 2(2-3), 177–183. https://doi.org/10.1007/s11409-007-9017-6
Viberg, O., & Grönlund, A. (2013). Cross-cultural analysis of users’ attitudes toward the use of mobile devices in second
and foreign language learning in higher education: A case from Sweden and China. Computers & Education, 69,
169–180. https://doi.org/10.1016/j.compedu.2013.07.014
Winne, P. H., & Hadwin, A. F. (1998). Studying as self-regulated learning. In D. J. Hacker, J. Dunlosky, & A. Graesser (Eds.),
Metacognition in educational theory and practice (pp. 277–304). Lawrence Erlbaum.
Winne, P. H. (2001). Self-regulated learning viewed from models of information processing. In B. Zimmerman, & D. Schunk
(Eds.), Self-regulated learning and academic achievement: Theoretical perspectives (pp. 153–189). Erlbaum.
Wong, L. H., & Looi, C. K. (2010). Mobile-assisted vocabulary learning in real-life setting for primary school students: Two
case studies. in Proc. 6th IEEE Int. Conf. WMUTE, Apr. 2010, pp. 88–95.
Zimmerman, B. (2008). Investigating self-regulation and motivation: Historical background, methodological develop-
ments, and future prospects. American Educational Research Journal, 45(1), 166–183. https://doi.org/10.3102/
0002831207312909
Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: An overview. Educational Psychologist, 25(1), 3–17. https://doi.org/10.1207/s15326985ep2501_2
Zimmerman, B. J. (1998). Academic studying and the development of personal skill: A self-regulatory perspective. Educational Psychologist, 33(2-3), 73–86. https://doi.org/10.1080/00461520.1998.9653292
Zimmerman, B. J., & Schunk, D. H. (2008). Motivation: An essential dimension of self-regulated learning. In D. H. Schunk, & B. J. Zimmerman (Eds.), Motivation and self-regulated learning: Theory, research, and applications (pp. 1–30). Routledge.
