
Interactive Learning Environments

ISSN: (Print) (Online) Journal homepage: https://www.tandfonline.com/loi/nile20

Enhancing EFL reading and writing through AI-powered tools: design, implementation, and evaluation of an online course

Jo-Chi Hsiao & Jason S. Chang

To cite this article: Jo-Chi Hsiao & Jason S. Chang (2023): Enhancing EFL reading and writing
through AI-powered tools: design, implementation, and evaluation of an online course,
Interactive Learning Environments, DOI: 10.1080/10494820.2023.2207187

To link to this article: https://doi.org/10.1080/10494820.2023.2207187

Published online: 03 May 2023.



Enhancing EFL reading and writing through AI-powered tools: design, implementation, and evaluation of an online course
Jo-Chi Hsiao (a,b) and Jason S. Chang (c)

(a) National Hsinchu Senior High School, Hsinchu City, Taiwan, ROC; (b) Institute of Education, National Yang Ming Chiao Tung University, Hsinchu City, Taiwan, ROC; (c) Department of Computer Science, National Tsing Hua University, Hsinchu City, Taiwan, ROC

ABSTRACT
During the Covid-19 pandemic, global teachers gained extensive experience with teaching online courses. To design quality online courses in the post-pandemic era, the impact of the latest technology, such as artificial intelligence (AI), must be considered. Investigating how teachers incorporate AI-powered tools and how students perceive learning experiences with the tools is crucial for informing effective online course design. In this case study, we presented an 18-week online English course for 43 senior high school students in Taiwan. Designed according to learning theories, this course aimed to cultivate autonomous EFL learners, who learned to read and write with the support of three AI-powered tools (i.e. Linggle Write, Linggle Read, and Linggle Search). Data were collected from the instructors and students in both qualitative and quantitative formats. Results showed that a learning loop was created to connect online learning with offline practice; students had better optimal experience in student-centered presentations than in teacher-centered lectures; and language proficiency predicted semester grade and assignment quantity. This study has both theoretical and practical value; it serves as an example of how to design a quality synchronous online course on AI tools for EFL reading and writing based on theoretical approaches.

ARTICLE HISTORY
Received 7 March 2023
Accepted 20 April 2023

KEYWORDS
Quality synchronous online course design; artificial intelligence; EFL reading and writing; post COVID-19 pandemic era; learning theories; teaching research

CONTACT Jo-Chi Hsiao, jochihsiao@nycu.edu.tw, National Hsinchu Senior High School, Hsinchu City, Taiwan, ROC; Institute of Education, National Yang Ming Chiao Tung University, Hsinchu City, Taiwan, ROC

© 2023 Informa UK Limited, trading as Taylor & Francis Group

1. Introduction
1.1. Research gaps
The Covid-19 pandemic forced global K-12 teachers to shift to online courses, resulting in a wealth of
new experiences and challenges (An et al., 2021). As we move into the post-pandemic era, it is crucial
that we explore how to design quality online courses for K-12 students to avoid the learning losses that have been documented (Engzell et al., 2021) and that could recur in future educational disruptions. So far, most research endeavors have focused on online courses in higher education (Baldwin et al., 2018); even when researchers have studied online courses for K-12 learners, these courses were mostly hybrid, switching between offline and online instruction over a period of time (Zhao et al., 2020). Little is known about how to design synchronous online courses for K-12 learners.
Recently, artificial intelligence (AI) has been increasingly impacting education in many ways. In the field of foreign language learning, the application of AI-powered tools is a subsector of computer-assisted
language learning (CALL), which has been known as ICALL (Intelligent CALL) (Kannan & Munday,
2018; Pokrivčáková, 2019). These tools are designed based on natural language processing and
machine learning. To date, many AI-powered tools for English reading and writing (e.g. AI writing assistants, machine translation, and chatbots) have been developed, marketed, and used in language classrooms, and researchers have claimed that AI-powered tools can foster independent learners with self-regulation and autonomy (Pokrivčáková, 2019). This positive learning
outcome matches one of the curriculum goals (i.e. “introducing learners to methods with which
they can learn English effectively on their own … ”) described in the Curriculum Guidelines of 12-
Year Basic Education issued by the Ministry of Education (2018) in Taiwan. However, few studies
have investigated how to integrate these tools into a cohesive course design, how students perceive
different types of learning activities (e.g. student-centered presentations versus teacher-centered
lectures), and how they perform in such a course. Therefore, it is imperative to conduct further
research that takes a holistic approach to course design, considering the interplay between AI-
powered tools, other course elements, and student learning outcomes.
By examining the intersection of the two research gaps (the design of online courses in K-12 edu-
cation and AI-powered tools in authentic pedagogy for English language teaching and learning), this
paper aimed to design, implement, and evaluate a synchronous online course on AI-powered
reading and writing tools for senior high school students learning how to read and write English.
Specifically, this course was developed to nurture autonomous EFL learners, who practiced using
the tools in real life and reflected upon their user experiences. Taken together, this study can
provide insight into how AI-powered tools can be effectively incorporated into online courses in
K-12 settings to support the learning outcomes of English for senior high school students.

1.2. Research questions


To design an effective synchronous online course for EFL learners, we drew on common principles and guidelines, such as setting clear teaching/learning goals, understanding student backgrounds, deciding on in-class activities, and evaluating learning performance (Celce-Murcia et al., 2014). We
were aware that in the literature, there was a dispute about the effectiveness of teacher-centered
and student-centered teaching approaches (Emaliana, 2017). Since both have their own issues, we
decided to go beyond the dichotomy and take a more integrated conception of teaching and learn-
ing (Mascolo, 2009) by organizing both teacher-centered lectures and student-centered presenta-
tions and by viewing them as social activities in which students participate. We were also aware
that in the literature, students’ background variables (e.g. proficiency, residence, gender, etc.)
have been widely investigated to explore the extent to which they affect learning performance in
English (Roever et al., 2014). Taken together, three research questions were posed:
RQ1: What did a synchronous online course in which AI-powered reading and writing tools were taught look
like?

RQ2: In this synchronous online course, did students perceive two main learning activities (i.e. student-centered
presentations and teacher-centered lectures) differently?

RQ3: In this synchronous online course, what were the learner background variables (i.e. grade level, gender,
language proficiency, and school location) that predicted learning performance?

For RQ1, the syllabus, lesson plans, teaching materials, students' notes, reflective writings, peer evaluation sheets, and videos of oral presentations were carefully examined using content analysis. In addition, experts' evaluation of the course design was collected. For RQ2, students' responses to the flow questionnaires for student-centered presentations and teacher-centered lectures were analyzed via paired sample t-tests. For RQ3, regression analysis was conducted to determine to what extent learner background variables predicted their learning performance in class (indicated by semester grade and assignment quantity).

2. Literature review
We grounded our study on four learning-relevant theories: the Community of Inquiry framework (Garrison et al., 1999), ICAP theory (Chi, 2009), scaffolding theory (Wood et al., 1976), and flow theory (Csikszentmihalyi, 1990).

2.1. Community of inquiry framework


The Community of Inquiry (CoI) framework was proposed by Garrison et al. (1999) to describe three
overlapping elements for successful online learning: social presence, cognitive presence, and teach-
ing presence. Social presence means learners are able to identify with the learning community, com-
municate in a safe environment, and build relationships with teachers and peers. Cognitive presence
refers to the extent to which learners can construct knowledge using critical thinking skills. Teaching
presence includes the design of course contents, the facilitation of discourse, and direct teaching
(Anderson et al., 2001). Garrison et al. (1999) emphasized that the interaction between the three
elements creates meaningful online educational experiences. This framework has been verified in
qualitative and quantitative studies (Akyol & Garrison, 2008; Anderson et al., 2001; Arbaugh et al.,
2008; Garrison & Arbaugh, 2007), so we considered it an appropriate guide to draw upon to
design the online-only English language learning course that orchestrated social presence, cognitive
presence, and teaching presence.

2.2. ICAP theory


ICAP theory (Chi, 2009) explains how cognitively engaged learners are when they engage in
learning activities with various materials. ICAP stands for Interactive, Constructive, Active, and
Passive modes of learning. In the interactive mode, learners collaborate to generate knowledge
that goes beyond the materials. They elaborate, justify, and challenge ideas in groups. In the con-
structive mode, learners generate knowledge based on the given materials. They explain, invent,
and argue for/against the ideas in the learning materials. In the active mode, learners link the new information to their prior knowledge but go no further than memorizing the given information; they point at, underline, and copy the material. As for the passive mode, learners absorb the information
without engaging in overt activities. They read a text, watch a video, or listen to a lecture. ICAP
has been supported by several studies (Chi, 2009; Chi & Wylie, 2014; Fonseca & Chi, 2011;
Menekse et al., 2013), and we found it useful in balancing teacher-centered and student-centered
instructional approaches, as both promote student engagement in different but equally impor-
tant ways (Wu & Huang, 2007).

2.3. Scaffolding theory


Wood et al. (1976) used the metaphor of scaffolding to describe the language development of
children under the guidance of their parents. Later, this metaphor was applied to the context
of language learning in classrooms where teachers assist learners to cross the Zone of Proximal
Development (Vygotsky, 1978) and develop new knowledge and skills (Hammond & Gibbons,
2005). Ribbe and Bezanilla (2013) defined three principles of scaffolding and argued that they
could promote learner autonomy in online learning: allowing learners to determine learning
goals and contents, promoting learner-reflection by emphasizing self-monitoring and self-evalu-
ation, and designing authentic learning tasks so that learners can apply learned knowledge
and skills in the real world. We decided to follow these principles in designing the online English language learning course with the aim of fostering autonomous EFL learners who can
use AI-powered tools to learn English in their regular English classes at school and in their
daily lives.

2.4. Flow theory


Csikszentmihalyi’s (1990) flow theory explains the mental state of deep engagement that pro-
fessionals experience when they work on challenging but not overwhelming activities. Many
researchers have drawn on this theory to examine students’ optimal experience in school (Cheng
et al., 2017; Shernoff et al., 2014). The flow experience was viewed as a holistic integration of nine
indicators: challenge-skill balance, clear goals, immediate feedback, concentration on the task,
action-awareness merging, loss of self-consciousness, sense of control, time distortion, and regard-
ing the activity as intrinsically rewarding (Beard & Hoy, 2010; Cheng et al., 2017). In the present study,
we compared students’ flow experience in student-centered presentations and teacher-centered
lectures to determine whether they elicited different levels of student engagement in the online-
only English language learning course.

3. Methods
3.1. Genesis of the synchronous online course
The online course, entitled “Let’s Learn to Use AI-Powered Tools to Become Autonomous English
Learners," was one of the elective courses included in a collaborative national education
program sponsored by the Taipei City Government. Every semester, around 40 synchronous
online courses taught by professors at universities or teachers at senior high schools were offered
on an online platform called Taipei CooC-Cloud to senior high school students aged 15–17 in
Taiwan. Students from different regions, schools, and grade levels enrolled in courses that
matched their interests. For 18 weeks, they virtually attended the synchronous live class session
(100 minutes per week) and completed course requirements to earn credits.
We, an expert in English language teaching/learning and educational psychology as well as an
expert in computational linguistics, collaborated to design and implement the course. We
decided to use Google Meet to have real-time meetings with students, and Google Classroom to dis-
tribute assignments and provide written feedback to them, because students were familiar with
these two platforms which were widely used during school closures in the Covid-19 pandemic
quarantine.

3.2. AI-powered tool design


Although EFL senior high school students in Taiwan spend a lot of time learning how to read and write in regular English classes, they still encounter difficulties when reading and writing English in real life. Therefore, we chose three AI-powered tools developed by the research team led by the second author: Linggle Write, Linggle Read, and Linggle Search (see Table 1).

3.2.1. AI-powered writing tool


Linggle Write (w.linggle.com) was designed to provide corrective feedback in response to a given document submitted by a learner, handling so-called Grammatical Error Correction (GEC). This tool also displays a proficiency level based on the Common European Framework of Reference for Languages (CEFR). Most commercial grammar checkers (e.g. Grammarly and Ginger) can detect errors due to misuse of function words, incorrect word forms (spelling, punctuation, tense, plurality, and the like), or incorrect word order, while very few systems (e.g. InstaText) are capable of handling errors of word choice. However, wrong choices of verbs, adjectives, and nouns rank among the most common errors (second, seventh, and ninth, respectively) according to the Cambridge Learner Corpus First Certificate in English (CLC-FCE). Therefore, it is imperative that a grammar checker examine error-prone content words and collocations (e.g. "big accident" and "killed thousands of people") and provide corrective feedback (e.g. "serious accident" and "slaughtered thousands of people").

Table 1. Summary of the AI-powered tools.

Linggle Write
  Tools and English learning: Students are instructed to submit their writing to this tool. The system then responds with a CEFR (a) level, a score (b), and suggestions for revising grammatical and word-choice errors.
  AI features: Bidirectional Encoder Representations from Transformers (BERT) and Text-To-Text Transfer Transformer (T5).

Linggle Read
  Tools and English learning: Students are instructed to use this tool to read articles written in English. A simple click on an unfamiliar word triggers the system to show its meaning in context, word usage, and relevant example sentences retrieved from the Cambridge dictionary. The words can be highlighted according to their word level defined by CEFR (a) or CEEC (c).
  AI features: Word sense disambiguation, automatic speech synthesis, and collocation extraction.

Linggle Search
  Tools and English learning: Students are instructed to use this tool to query how words are used in context. Frequencies and example sentences of different usages are presented to users.
  AI features: Word embedding (vectorized representation of words).

Notes: (a) CEFR = The Common European Framework of Reference for Languages; (b) score = following the scoring system of the English writing section in the General Scholastic Ability Test; (c) CEEC = College Entrance Examination Center.

Existing GEC systems such as Grammarly (grammarly.com) and Google Docs (docs.google.com) typically use crowd-sourced learner corpora with error annotation (e.g. the Lang-8 dataset) and train deep learning models to replicate the behavior of human editors. However, the best feedback may not be present in the output of these models, because many error types may be under-represented (or even not recognized as errors) in existing learner corpora. These errors could
be corrected more effectively if a GEC system also made use of unannotated corpora of native
speaker texts in self-supervised machine learning. Recently, self-supervised learning using a huge
amount of naturally occurring text has contributed to advances of such large language models
(LLMs) as BERT, T5, and GPT, drastically improving solutions to almost all natural language tasks.
Riding on this wave of NLP renaissance, we designed Linggle Write by combining the best of both
worlds of supervised and self-supervised learning using T5 and BERT. As a result, Linggle Write is
capable of handling common grammatical errors as well as lexical errors. With this newly developed
GEC model, we designed a very simple and user-friendly interface. The user interface of Linggle Write
has a working area for writing and an information area for displaying corrective feedback.
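
To make the approach concrete, below is a minimal Python sketch (using the Hugging Face transformers library) of how a T5-style sequence-to-sequence model can be asked to rewrite a learner sentence. It illustrates the general GEC recipe only and is not Linggle Write's actual implementation; the checkpoint name, the "gec:" task prefix, and the decoding settings are assumptions.

# Hedged sketch of seq2seq grammatical error correction with a T5-style model.
# Not Linggle Write's code: "t5-base" is a placeholder checkpoint, and the
# "gec: " prefix is an assumed convention for a model fine-tuned on GEC data.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_NAME = "t5-base"  # substitute a checkpoint fine-tuned for error correction

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def correct(sentence: str) -> str:
    """Return the model's highest-scoring rewrite of one learner sentence."""
    inputs = tokenizer("gec: " + sentence, return_tensors="pt", truncation=True)
    outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# With a GEC-tuned checkpoint, a collocation error such as "big accident" could
# be rewritten as "serious accident", as in the example discussed above.
print(correct("There was a big accident that killed thousands of people."))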

3.2.2. AI-powered reading tool


Linggle Read (r.linggle.com) was designed to support learners in reading the content of their own
choice. This tool can automatically construct a word list with a simple gloss for the target proficiency
level out of the user-provided content. Most commercial assistive readers (AR) can display simple
word glosses on demand. However, very few systems can explain an ambiguous word (e.g.
“barrel”) in context (e.g. “We obtain our whisky barrels from various distilleries in the USA and Scot-
land.”). It is imperative that an assistive reader disambiguate word sense according to context and
display the context-sensitive meanings (e.g. “a large wooden or plastic container”). Additionally,
Linggle Read also turns user-provided text into an audiobook to support listening to learn.
Existing AR systems typically use handcrafted glossaries to accompany curated content. However, this approach is not feasible for assisting readers with unknown words in self-selected content.

Inevitably, there will be ambiguous words whose intended meaning cannot be explained with a hand-crafted glossary. We cope with this problem using pretrained, self-supervised models such as BERT to predict word senses in context. We designed Linggle Read by fine-tuning the BERT model to predict the topic (e.g. CONTAINER) of the target word and thereby derive its intended meaning (e.g. "a large wooden or plastic container"). As a result, Linggle Read is capable of determining word sense in context reasonably well, based on the online version of the Cambridge Learners' Dictionary (CLD). Along with word senses, the CLD also provides CEFR proficiency levels, which help to determine whether ambiguous words indeed belong to a particular target level.
For intensive reading, Linggle Read also provides reference information on collocations and grammar patterns on demand. Linggle Read displays the most common collocations and grammar patterns to avoid overwhelming the readers. Furthermore, Linggle Read displays information by frequency, whereas existing English collocation dictionaries display a barrage of information alphabetically. Frequency-based and selective listing allows users to stop at any point in the list and always get the most useful information.
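
As a rough illustration of the word-sense idea (not Linggle Read's fine-tuned topic classifier), the Python sketch below scores a few candidate dictionary glosses against the sentence context with off-the-shelf sentence embeddings and picks the closest one; the model name and the three-gloss list for "barrel" are assumptions made for the example.

# Hedged sketch: choose a context-appropriate sense of "barrel" by comparing
# sentence embeddings of the context and of candidate glosses. The glosses and
# the embedding model are illustrative assumptions, not Linggle Read internals.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

context = "We obtain our whisky barrels from various distilleries in the USA."
senses = {
    "CONTAINER": "a large wooden or plastic container",
    "GUN PART": "the long metal tube of a gun that the bullet travels through",
    "UNIT": "a unit for measuring oil, equal to about 159 litres",
}

context_vec = model.encode(context, convert_to_tensor=True)
gloss_vecs = model.encode(list(senses.values()), convert_to_tensor=True)
scores = util.cos_sim(context_vec, gloss_vecs)[0]  # cosine similarity per gloss

best = int(scores.argmax())
topic, gloss = list(senses.items())[best]
print(f"{topic}: {gloss}")  # expected to favour the CONTAINER sense here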

3.2.3. AI-powered search and reference tool


Linggle Search (linggle.com) was designed for easy access to phrasal and collocation information in a
very large corpus, which is particularly useful for self-editing. Linggle Search is a Web-scale linguistics
search engine, capable of retrieving common recurring word sequences (i.e. lexical bundles) in
response to a given query. The query might contain keywords, wildcards, wildcard part-of-speech tags, synonyms, and additional regular expression (RE) operators. In our approach, we incorporate inverted file indexing, part-of-speech information from the BNC, and word2vec (a vectorized word representation scheme) to support such a powerful query language for the retrieval of phrasal, syntactic, and semantic information for language reference.
The Web site provides a short tutorial on the meta symbols and functions. These symbols include the slash symbol (alternative search), the dollar sign ("$" denotes an affix), the underscore ("_" denotes a single word), the star sign ("*" denotes zero or more words), the tilde ("∼" denotes a synonym), the question mark ("?" denotes the presence or absence of a keyword), and part-of-speech tags (such as "n.," "v.," and "adj.," which denote a noun, verb, and adjective, respectively). These meta symbols can be interspersed between keywords or precede a word (question mark and tilde) to express not a single phrase but a potentially infinite set of phrases, known in mathematics and programming as a regular expression.
The method for searching and retrieving involves parsing the query and transforming it into several keyword retrieval commands. Word sequences (lexical bundles) are retrieved with counts, and these bundles are further filtered by matching the query as a regular expression for fuzzy search. Finally,
we display the results of matched phrases according to frequency counts. Additionally, Linggle
Search provides examples for retrieved phrases by gleaning short, easy-to-understand sentences
from very large Web corpora such as C4 (Colossal Cleaned version of Common Crawl’s web crawl
corpus (https://commoncrawl.org)). We use a classifier to select dictionary-like example sentences
to accompany retrieved phrases.
The overarching principle of Linggle Search is to allow users to use a set of simple meta-symbols
in conjunction with keywords to express information needs for self-editing. The search results are
displayed phrase by phrase first, and then the users can drill down to examples of each retrieved
phrase. The current implementation of Linggle is user-friendly, versatile, and intuitive with a top-
down information design.
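
The short Python sketch below conveys the flavor of the query-to-retrieval pipeline: it supports only the "_" and "*" meta-symbols, compiles a query into a regular expression, and filters a toy n-gram table by frequency. The n-gram counts are invented for illustration; the real system runs a much richer query language over a Web-scale indexed corpus.

# Hedged sketch of "query -> regular expression -> filter lexical bundles".
# Only "_" (exactly one word) and "*" (zero or more words) are handled; the
# toy n-gram counts below are invented for illustration.
import re

def query_to_regex(query: str) -> re.Pattern:
    pattern, pending_star = "", False
    for token in query.split():
        if token == "*":
            pending_star = True          # remember a gap of zero or more words
            continue
        piece = r"\w+" if token == "_" else re.escape(token)
        if pattern:                      # separator before every token but the first
            pattern += r"(?:\s+\w+)*\s+" if pending_star else r"\s+"
        pattern += piece
        pending_star = False
    return re.compile("^" + pattern + "$", re.IGNORECASE)

ngrams = {                               # invented counts, not real corpus data
    "killed thousands of people": 1520,
    "killed hundreds of people": 980,
    "killed many people": 640,
    "killed people": 210,
}

rx = query_to_regex("killed * people")
for phrase, count in sorted(ngrams.items(), key=lambda kv: -kv[1]):
    if rx.match(phrase):
        print(f"{count:>6}  {phrase}")   # most frequent matching bundles first

A leading or trailing "*" is not handled in this toy version; the point is only that even a tiny grammar of meta-symbols compiles naturally to regular-expression matching over retrieved bundles.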

3.3. Participants
The participants were 43 students from 10 senior high schools. They took the course in either the
autumn of 2021 or the spring of 2022. More than half of them were female students (about 65%).
Around 58% were 10th-grade students, 21% were 11th-grade students, and another 21% were
12th-grade students. About 44% were from northern Taiwan, 35% from central Taiwan, and 21%
from southern Taiwan. Judging by their academic performance, around 42% were high achievers
(grade point average for the previous semester in English above 80) and 10% were low achievers
(grade point average for the previous semester in English below 60).

3.4. Data collection and analysis


We collected rich data from the instructors and students. Data collected from the instructors
included the syllabus, lesson plans, and class materials. Data collected from the students
included background information (i.e. grade level, gender, language proficiency, and school
location), which students provided at the beginning of the semester; assignments (see Appendix
2 for examples; three notes taken in teacher-centered lectures following the Cornell notebook
format, three reflective writings on user experience of AI-powered tools, three slides based
on the content of the reflective writings, and three peer-evaluation sheets completed in the
phase of student-centered presentations); Flow questionnaire (9-point Likert scale, see Appendix
1; Cronbach’s alpha = .87; variance explained = 46.51%; adapted from Cheng et al. (2017), whose
participants were students in Taiwan as well) which students responded to every week after
class; three video recordings of student oral presentations; and the semester grades determined
by the instructors based on the quality of the Cornell notes, reflective writings, slides, and oral
presentations (grade = 60%), final group presentation on three tools (30%), and participation in
class (10%).
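
For readers who want to reproduce the reliability check on their own data, the following Python sketch computes Cronbach's alpha from an item-response matrix; the file name and column layout (one row per weekly response, one column per questionnaire item) are assumptions, as the study reports only the resulting coefficient (.87).

# Hedged sketch: Cronbach's alpha, i.e. k/(k-1) * (1 - sum of item variances /
# variance of the total score), for a Likert-scale questionnaire.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """items: one row per respondent/response, one column per Likert item."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

responses = pd.read_csv("flow_responses.csv")  # hypothetical 13-item data file
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")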
To analyze the qualitative data (i.e. the syllabus, lesson plans, class materials, assignments, and
video recordings), we used a two-cycle content analysis (Saldaña, 2015). At stage 1, descriptive
codes were proposed and decided. At stage 2, the codes were classified according to the four learn-
ing theories (i.e. CoI framework, ICAP theory, scaffolding theory, and flow theory). The two steps were
repeated if one coder requested to change the initial codes. Several iterations were necessary to
reach the final consensus. As for quantitative data, we conducted paired sample t-tests and
regressions using SPSS 21. To incorporate nominal variables into regression analysis, we created
dummy variables for learner background variables.
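
As a hedged Python counterpart to the SPSS analyses (offered for illustration only), the sketch below runs a paired sample t-test on per-student flow ratings and an OLS regression of semester grade on dummy-coded background variables with statsmodels; the CSV file and column names are hypothetical.

# Hedged sketch of the two quantitative analyses; the study itself used SPSS 21.
# "course_data.csv" and its column names are hypothetical.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

df = pd.read_csv("course_data.csv")  # one row per student (hypothetical layout)

# Paired sample t-test: flow in student-centered presentations vs. lectures.
t, p = stats.ttest_rel(df["flow_presentation"], df["flow_lecture"])
print(f"paired t-test: t = {t:.2f}, p = {p:.3f}")

# Multiple regression with dummy coding: C() expands each nominal background
# variable into dummy variables before fitting ordinary least squares.
model = smf.ols(
    "semester_grade ~ C(grade_level) + C(gender) + C(proficiency) + C(location)",
    data=df,
).fit()
print(model.summary())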
To determine the quality of this synchronous online course, a panel of experts (three professors
and three English teachers) was invited to evaluate the course design. They were given the syllabus,
direct access to its Google Classroom, and an evaluation sheet with 22 criteria on a 6-point Likert
scale. These criteria have been widely used in various instruments designed to evaluate online
courses (Baldwin et al., 2018).

4. Results
4.1. RQ1: What did a synchronous online course in which AI-powered reading and writing tools were taught look like?
Figure 1 shows the conceptual framework of the synchronous online course, along with the support-
ing theories and the time points when the flow questionnaire was distributed. Since the ultimate
goal of the course was to cultivate autonomous EFL learners, the center of the figure was labeled
“Autonomous EFL learners.” The bidirectional arrows indicate the constant interactions between
individual learners and the learning activities which were divided into online learning in class (i.e.
Teacher-centered lectures of AI-powered tools and Student-centered oral presentations) and
offline learning after class (i.e. Practice using AI-powered tools in real world and Reflective
writing). A learning loop was used to represent the dynamic connections between online and
offline learning which lasted for a semester.
Figure 1. Course design to nurture autonomous EFL learners.

The learning loop began with teacher-centered lectures on AI-powered tools. In this phase, teaching presence based on the CoI framework dominated the class. The teacher relied on direct teaching to introduce, demonstrate, and explain the AI-powered tools. At the same time, students engaged in
passive and active learning modes based on ICAP theory by listening, reading, copying, and
taking notes in the format of the Cornell notebook. Immediately after the class, their optimal experi-
ence was measured by the Flow questionnaire. Next, students practiced using the tools they learned
in real life. They read self-selected passages using Linggle Read (AI-powered reading tool), they
looked up words they were learning in regular English classes with Linggle Search (AI-powered
search and reference tool), and they edited their writing with Linggle Write (AI-powered writing
tool). In this phase, students applied the knowledge and skills of the tools to the authentic
context of the English classroom. They then reflected on their user experiences with the tools and
evaluated how useful and beneficial the tools were from users’ perspective. This way, they compared
their learning experiences before and after using the tools for learning. Using the writing template
provided by the teacher (scaffolding), students had to write a 100-word paragraph with a clear topic
sentence, details about past and present learning experiences, and a concluding sentence. At the
same time, they converted the reflective writing into slides (here, another slide template was
given to the students; scaffolding) and rehearsed for the 2-minute oral presentation in class.
Finally, students attended the class with their written reflections and slides. First, they exchanged ideas in breakout rooms, and then volunteers were invited to speak to the class. In this phase, stu-
dents built interpersonal relationships with peers and teachers (social presence based on CoI frame-
work and interactive mode based on ICAP theory) by asking/answering questions and by giving/
taking feedback. They also constructed knowledge about English presentations by discussing and
critically evaluating their classmates’ presentations (cognitive presence based on CoI framework
and constructive mode based on ICAP theory). Again, the flow questionnaire was distributed after
class to measure students’ optimal experience. Figure 2 presents an example of one learning loop
that learners underwent.
This course was highly valued by the experts, who gave it an overall rating of 5.5 on the 6-point scale. It scored particularly highly (i.e. 6) on the following eight criteria: "Objectives are available," "Technology is used
to promote learner engagement/facilitate learning,” “Student-to-student interaction is supported,”
“Assessments align with objectives,” “Course activities promote achievement of objectives,” “Self-
assessment options are provided,” “Assessments occur frequently throughout course,” and “Infor-
mation is chunked.”

Figure 2. Example of one learning loop.

4.2. RQ2: In this synchronous online course, did students perceive two main learning
activities (i.e. student-centered presentations and teacher-centered lectures) differently?
Figure 3 shows the significant paired sample t-test results. Compared with teacher-centered lectures,
in student-centered presentations, students found it easier to avoid mind wandering (Flow 1), to
establish a relationship with teachers/classmates (Flow 2), to follow the lesson (Flow 5), to engage
in the activity to the point where they lost track of time (Flow 6), to avoid being distracted or
feeling bored (Flow 7), and to receive feedback on their performance in class (Flow 8).
Figure 3. Significant results of paired sample t-tests comparing flow ratings between teacher-centered lectures and student-centered presentations.

To clarify how students perceived teacher-centered lectures, we analyzed the weekly reflections written after the lectures. Table 2 categorizes the issues that might have negatively affected student engagement. As Table 2 shows, students found it challenging to engage in teacher-centered lectures for the following reasons: they were late for class; the class was interrupted by technical problems; there was too much course content to digest; and they somehow felt tired.

4.3. RQ3: In this synchronous online course, what were the learner background variables
(i.e. grade level, gender, language proficiency, and school location) that predicted
learning performance?
Table 3 shows the means, standard deviations, and Pearson correlation matrix for learner back-
ground variables (i.e. grade level, gender, language proficiency, and school location), semester
grade, and assignment quantity. Positive correlations were found between language proficiency
and semester grade (r(43) = .42, p = .005), language proficiency and assignment quantity (r(43) = .50,
p = .001), and semester grade and assignment quantity (r(43) = .69, p < .001). Negative correlations
were found between grade level and gender (r(43) = −.45, p = .003) and between gender and semester grade (r(43) = −.44, p = .003), indicating that female students tended to be in higher grade levels and to receive higher semester grades.
Using multiple linear regression, we tested whether grade level, gender, language proficiency,
and school location significantly predicted learning performance, as indicated by semester grade
and assignment quantity (i.e. how many assignments were completed, including notes taken in teacher-centered lectures and evaluation sheets written in student-centered presentations). The results are summarized in
Table 4. The overall regression for semester grade was statistically significant (Adj. R2 = .26, F(4, 38) = 4.65, p = .004). Gender (β = −.34, p = .029) and language proficiency (β = .37, p = .011) were found to significantly predict semester grade. In contrast, grade level and school location were not significant predictors of semester grade (β = .09, p = .585; β = .02, p = .895, respectively).
The overall regression for assignment quantity was statistically significant (Adj. R2 = .20, F(4, 38) = 3.70, p = .012). Language proficiency (β = .49, p = .001) was the only significant predictor of assignment quantity. Grade level, gender, and school location did not significantly predict assignment quantity (β = −.16, p = .329; β = −.15, p = .338; β = −.01, p = .931, respectively).

5. Discussion and conclusion


The purpose of this study was to design, implement, and evaluate a synchronous online course whose goal was to teach senior high school students how to use AI-powered reading and writing tools to become autonomous EFL learners. Three major findings are summarized and discussed below.
First, this course was designed based on four learning theories and was highly valued by experts.
There was an obvious learning loop that connected online and offline learning. This finding is par-
ticularly significant, because we extended the learning theories that only focus on teaching and
learning in class. In our study, we continued guiding and scaffolding the students after class by
designing assignments that had reachable goals, clear instructions, and various examples. Most
importantly, these assignments are both person- and group-relevant. By incorporating AI-powered tools into real-world scenarios, students can see the practical application and benefits of these tools in their lives; by presenting their experiences with these tools to the class, they are engaged in interpersonal interaction, which results in the highest level of engagement according to ICAP theory (Chi, 2009) and immediate feedback that fosters optimal experience according to flow theory (Csikszentmihalyi, 1990).

Table 2. Categories of issues in teacher-centered lectures and selected quotes.

Late for the class
  Definition: Students were late to class due to technical problems or other reasons. By the time they were online, the class had already started.
  Example quote: "I was late because my homeroom teacher was talking to me during the break. When I was finally online, I did not know what the teacher was talking about. That's why I took few notes." (11001AD a)

Internet disconnection
  Definition: Students' or teachers' Wi-Fi connection was interrupted. Students could not hear teachers clearly.
  Example quote: "I could not hear the teacher clearly. Was that my problem or teachers'?" (11002XR a)

Overwhelming course contents
  Definition: There was switching between slides, websites, and supplemental materials. Students could not distinguish what was good to know and what was needed to know.
  Example quote: "There were many details and I did not know what was important for me to know." (11001PJ a)

Fatigue
  Definition: Students described feeling tired or lacking energy.
  Example quote: "I was tired. I am sorry I did not feel like taking notes." (11002LG a)

a The seven-digit value indicates the participant ID.

Table 3. Means, standard deviation, and Pearson correlation matrix of learner background variables, semester grade, and
assignment quantity.
M SD 1 2 3 4 5
1. Grade level .63 .82
2. Gender .35 .48 −.45**
3. Language proficiency 1.33 .64 .10 −.14
4. School location .77 .78 .35* −.16 −.18
5. Semester grade 81.65 15.50 .28 −.44** .42** .04
6. Assignment quantity 8.74 5.41 −.05 −.15 .50** −.13 .69***
Note: N = 43. ***p < .001, **p < .01, *p < .05. Dummy coding was applied to nominal variables with levels: 1. Grade level: 0 = 10th
grade, 1 = 11th grade, 2 = 12th grade; 2. Gender: 0 = female, 1 = male; 3. Language proficiency: 0 = low achievers, 1 = mid
achievers, 2 = high achievers; 4. School location: 0 = Northern Taiwan, 1 = Central Taiwan, 2 = Southern Taiwan.

Second, students were found to have better flow experiences in student-centered presentations
than in teacher-centered lectures. Among the six significant indicators of experiencing flow (i.e.
avoiding mind wandering, establishing relationships, being able to follow the class, forgetting
about time, not being disturbed by online distractions, and receiving immediate feedback), the
finding that students were able to make connections with teachers/classmates indicates that
social presence (Garrison et al., 1999) was successfully achieved in this investigated course. As
many studies on online learning have suggested, interpersonal relationships are crucial for the
success of educational experiences (Kostenius & Alerby, 2020; Ozaydın Ozkara & Cakir, 2018).
When it comes to teacher-centered lectures, students faced problems including tardiness, Internet
disconnection, overwhelming course contents, and feelings of fatigue. These difficulties should be
seriously considered and appropriately addressed to ensure the effectiveness of teaching presence
(Anderson et al., 2001), the important transfer from social presence to cognitive presence (Garrison &
Cleveland-Innes, 2005). According to the CoI framework (Garrison et al., 1999), it is only through the interaction among the three presences that online educational experiences become meaningful.
Interestingly, students rarely mentioned similar issues in student-centered presentations. The
reasons, however, were not mentioned in the data collected from the students.
Third, language proficiency was found to be a significant predictor of both semester grade and
assignment quantity. This finding is in accordance with previous studies (Liang et al., 2023), and it
reminds teachers of the importance of offering additional and customized scaffolding for students
whose language proficiency lags behind that of most students (Haruehansawasin & Kiattikomol, 2018). They needed special attention and support from teachers to complete the assignments and pass the course.

Table 4. Estimates for the impact of grade level, gender, proficiency, and school location on semester grade and assignment
quantity.
Unstandardized coefficient Standardized coefficient
B SE β t
Semester grade
(Constant) 72.44 6.26 11.57***
Grade level 1.64 2.99 .09 .55
Gender −10.95 4.81 −.34 −2.28*
Language proficiency 8.83 3.32 .37 2.66*
School location .38 2.89 .02 .13
Assignment quantity
(Constant) 4.57 2.26 2.02#
Grade level −1.07 1.08 −.16 −.99
Gender −1.69 1.74 −.15 −.97
Language proficiency 4.15 1.20 .49 3.46**
School location −.09 1.04 −.01 −.09
Note: #p = .05, *p < .05, **p < .01, ***p < .001.

In the literature, differentiated instruction, supported by theoretical frameworks and empirical evidence, has been promoted for the promising outcomes it can bring to students with diverse intelligences and proficiency levels (Subban, 2006). By tailoring instruction to meet learners' various abilities, interests, and needs, this approach has been reported to increase engagement, motivation, and academic success (Subban, 2006). However, the extent to which differentiated instruction facilitates the synchronous online learning process still warrants additional research.
Taken together, our results suggest the following implications for synchronous course design in this
post-pandemic era. First, basing the design of a brand-new course on learning theories is beneficial,
because it helps ensure that the course is effective in promoting meaningful learning and achieving
intended outcomes. Second, the learning of any new tool (AI-powered tools in our case) requires a short and clear explanation of the rationale, a step-by-step demonstration, practice in and after class, reflection on and sharing of user experiences with the tool, and timely, direct feedback from teachers and peers. Third, striking a balance between teacher-centered and student-centered teaching
approaches is recommended. An integrated view of organizing both teacher-centered and student-
centered activities (Mascolo, 2009) seems preferable to achieve meaningful online learning experiences
with adequate teaching presence (e.g. clear goals and scaffolding), cognitive presence (e.g. discussion
on challenging questions), and social presence (e.g. teacher-student and student-student interaction)
(Garrison et al., 1999), and to foster flow experiences in which optimal and immersive engagement
occurs throughout the learning activities (Csikszentmihalyi, 1990). Finally, it is necessary to recognize, acknowledge, and address the difficulties students face in synchronous online courses, especially in teacher-centered lectures, such as overwhelming course contents and feelings of fatigue. Teachers could split
course contents into smaller parts, provide more scaffolding and support, ask for genuine feedback
from students, and reach out to those whose English language proficiency lags behind.
The present study has limitations that should be considered. First, as a case study, the sample size was not large enough to guarantee statistical power. In addition, no control group was included to compare course design and learning performance. Future studies could increase the sample size and
collect data from the control group to compare students’ perceptions and learning outcomes (Chen
& Wang, 2018). Second, students’ language proficiency was measured by their grade in the previous
semester of regular English classes. Future studies could measure this variable through standardized
tests with better validity and reliability. In addition, future studies could further investigate how stu-
dents with different language proficiency perceive the course differently.

Disclosure statement
No potential conflict of interest was reported by the author(s).

Funding
This work was supported by Ministry of Science and Technology [grant number NSTC 111-2221-E-007-107-].

Notes on contributors
Dr. Jo-Chi Hsiao is an Adjunct Assistant Professor at the Institute of Education at National Yang Ming Chiao Tung Uni-
versity and an experienced English teacher at National Hsinchu Senior High School. Her research focuses on teacher
professional development, TESOL, and Computer Assisted Language Learning.
Dr. Jason S. Chang received the PhD degree in computer science from New York University. He is a professor of com-
puter science at National Tsing Hua University, Taiwan. His research interests span across the fields of natural language
processing, computer-assisted language learning, information retrieval, and machine translation.

ORCID
Jo-Chi Hsiao http://orcid.org/0000-0001-7378-043X
Jason S. Chang http://orcid.org/0000-0002-8227-7382

References
Akyol, Z., & Garrison, D. R. (2008). The development of a community of inquiry over time in an online course:
Understanding the progression and integration of social, cognitive and teaching presence. Journal of
Asynchronous Learning Networks, 12, 3–22. https://doi.org/10.24059/olj.v12i3.66
An, Y., Kaplan-Rakowski, R., Yang, J., Conan, J., Kinard, W., & Daughrity, L. (2021). Examining K-12 teachers’ feelings,
experiences, and perspectives regarding online teaching during the early stage of the COVID-19 pandemic.
Educational Technology Research and Development, 69(5), 2589–2613. https://doi.org/10.1007/s11423-021-10008-5
Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conference
context. Journal of Asynchronous Learning Networks, 5(2), 1–17. https://doi.org/10.24059/olj.v5i2.1875
Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a
community of inquiry instrument: Testing a measure of the community of inquiry framework using a multi-insti-
tutional sample. The Internet and Higher Education, 11(3-4), 133–136. https://doi.org/10.1016/j.iheduc.2008.06.003
Baldwin, S., Ching, Y. H., & Hsu, Y. C. (2018). Online course design in higher education: A review of national and statewide
evaluation instruments. TechTrends, 62(1), 46–57. https://doi.org/10.1007/s11528-017-0215-z
Beard, K. S., & Hoy, W. K. (2010). The nature, meaning, and measure of teacher flow in elementary schools: A test of rival
hypotheses. Educational Administration Quarterly, 46(3), 426–458. https://doi.org/10.1177/0013161X10375294
Celce-Murcia, M., Brinton, D. M., & Snow, M. A. (2014). Teaching English as a second or foreign language (4th ed.).
Newbury House.
Chen, C.-M., & Wang, J.-Y. (2018). Effects of online synchronous instruction with an attention monitoring and alarm
mechanism on sustained attention and learning performance. Interactive Learning Environments, 26(4), 427–443.
https://doi.org/10.1080/10494820.2017.1341938
Cheng, C. Y., Chen, S. Y., & Lin, S. S. J. (2017). Episodic and individual effects of elementary students’ optimal experience:
An HLM study. The Journal of Educational Research, 110(6), 653–664. https://doi.org/10.1080/00220671.2016.1172551
Chi, M. T. H. (2009). Active-constructive-interactive: A conceptual framework for differentiating learning activities. Topics
in Cognitive Science, 1(1), 73–105. https://doi.org/10.1111/j.1756-8765.2008.01005.x
Chi, M. T. H., & Wylie, R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes.
Educational Psychologist, 49(4), 219–243. https://doi.org/10.1080/00461520.2014.965823
Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. Harper. https://doi.org/10.1080/00222216.1992.
11969876
Emaliana, I. (2017). Teacher-centered or student-centered learning approach to promote learning? Jurnal Sosial
Humaniora, 10(2), 59–70. https://doi.org/10.12962/j24433527.v10i2.2161
Engzell, P., Frey, A., & Verhagen, M. D. (2021). Learning loss due to school closures during the COVID-19 pandemic.
Proceedings of the National Academy of Sciences, 118(17), e2022376118. https://doi.org/10.1073/pnas.2022376118
Fonseca, B., & Chi, M. T. H. (2011). Instruction based on self-explanation. In R. E. Mayer, & P. A. Alexander (Eds.), The hand-
book of research on learning and instruction (pp. 296–321). Routledge. ISBN 9781138831766.
Garrison, D. R., & Arbaugh, J. B. (2007). Researching the community of inquiry framework: Review, issues, and future
directions. The Internet and Higher Education, 10(3), 157–172. https://doi.org/10.1016/j.iheduc.2007.04.001
Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is not enough.
American Journal of Distance Education, 19(3), 133–148. https://doi.org/10.1207/s15389286ajde1903_2
Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87–105. https://doi.org/10.1016/S1096-7516(00)00016-6

Hammond, J., & Gibbons, P. (2005). Putting scaffolding to work: The contribution of scaffolding in articulating ESL edu-
cation. Prospect, 20(1), 6–30.
Haruehansawasin, S., & Kiattikomol, P. (2018). Scaffolding in problem-based learning for low-achieving learners. The
Journal of Educational Research, 111(3), 363–370. https://doi.org/10.1080/00220671.2017.1287045
Kannan, J., & Munday, P. (2018). New trends in second language learning and teaching through the lens of ICT, net-
worked learning, and artificial intelligence. Círculo de Lingüística Aplicada a la Comunicación, 76, 13–30. https://doi.
org/10.5209/CLAC.62495
Kostenius, C., & Alerby, E. (2020). Room for interpersonal relationships in online educational spaces–A philosophical dis-
cussion. International Journal of Qualitative Studies on Health and Well-Being, 15(sup1), 1689603. https://doi.org/10.
1080/17482631.2019.1689603
Liang, G., Jiang, C., Ping, Q., & Jiang, X. (2023). Academic performance prediction associated with synchronous online
interactive learning behaviors based on the machine learning approach. Interactive Learning Environments, 1–16.
https://doi.org/10.1080/10494820.2023.2167836
Mascolo, M. F. (2009). Beyond student-centered and teacher-centered pedagogy: Teaching and learning as guided par-
ticipation. Pedagogy and the Human Sciences, 1(1), 3–27. https://scholarworks.merrimack.edu/phs/vol1/iss1/6
Menekse, M., Stump, G. S., Krause, S., & Chi, M. T. H. (2013). Differentiated overt learning activities for effective instruction
in engineering classrooms. Journal of Engineering Education, 102(3), 346–374. https://doi.org/10.1002/jee.20021
Ministry of Education. (2018). Curriculum guidelines for the 12-year basic education: The domain of language arts: English.
Ministry of Education.
Ozaydın Ozkara, B., & Cakir, H. (2018). Participation in online courses from the students’ perspective. Interactive Learning
Environments, 26(7), 924–942. https://doi.org/10.1080/10494820.2017.1421562
Pokrivčáková, S. (2019). Preparing teachers for the application of AI-powered technologies in foreign language edu-
cation. Journal of Language and Cultural Education, 7(3), 135–153. https://doi.org/10.2478/jolace-2019-0025
Ribbe, E., & Bezanilla, M. J. (2013). Scaffolding learner autonomy in online university courses. Digital Education Review, 24, 98–112. https://doi.org/10.1344/DER.2013.24.98-112
Roever, C., Wang, S., & Brophy, S. (2014). Learner background factors and learning of second language pragmatics.
International Review of Applied Linguistics in Language Teaching, 52(4), 377–401. https://doi.org/10.1515/iral-2014-0016
Saldaña, J. (2015). The coding manual for qualitative researchers. Sage.
Shernoff, D. J., Csikszentmihalyi, M., Schneider, B., & Shernoff, E. S. (2014). Student engagement in high school class-
rooms from the perspective of flow theory. In M. Csikszentmihalyi (Ed.), Applications of flow in human development
and education (pp. 475–494). Springer.
Subban, P. (2006). Differentiated instruction: A research basis. International Education Journal, 7(7), 935–947.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Harvard University Press.
Wood, D., Bruner, J., & Ross, G. (1976). The role of tutoring in problem solving. Journal of Child Psychology and Psychiatry,
17(2), 89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x
Wu, H. K., & Huang, Y. L. (2007). Ninth grade student engagement in teacher-centered and student-centered technol-
ogy-enhanced learning environments. Science Education, 91(5), 727–749. https://doi.org/10.1002/sce.20216
Zhao, N., Zhou, X., Liu, B., & Liu, W. (2020). Guiding teaching strategies with the education platform during the COVID-19
epidemic: Taking Guiyang No. 1 Middle School teaching practice as an example. Science Insights Education Frontiers, 5
(2), 531–539. https://doi.org/10.15354/sief.20.rp005

Appendix 1: Flow questionnaire


1. When I was in class just now, sometimes I was thinking about things in life (school). (-) *
2. I tried to relate myself to what teachers/classmates said in class. *
3. When I was in class just now, I took notes and recorded the key points of the course in order to enhance my
understanding.
4. I was focused in class.
5. I could follow teachers, knowing what I had to do. *
6. When I was in class just now, I was immersed in the course and didn’t notice that time had already passed. *
7. When I was in class just now, I browsed other non-course websites (or replied to Line/Messenger messages). (-) *
8. I received feedback from teachers/classmates, so I knew how good my performance was. *
9. Online classes are meaningful to me.
10. Online classes make me feel nervous and uneasy. (-)
11. Online classes make my body exhausted. (-)
12. Overall, I was satisfied with today’s class.
13. Overall, I learned something new in today’s class.
Notes. 1. (-) means reverse scoring. 2. * means significant difference was found between student-centered presentations
and teacher-centered lectures.

Appendix 2: Examples of student assignments


1. Cornell notes

(11001BD-CN-02)

2. Reflective writing
Linggle Read is very useful, before I met this website I always have to copy the word, and paste it on the Google translate, it’s
tired and sometime even incorrect, but now I know it, I can just paste it on the Linggle Read, it’ll show me every meaning of
the word (you just have to touch it). And also, if you want learn more about this word, you can use the Wikipedia mode, it’ll
show you immediately. And the favorite part is that it separate the word from different level, it was really cool, because it let
we know our level clearly, so I think it change my habit of reading. (11002ZY -RW-02)

3. Slides

(11001JQ-S-03)

4. Peer evaluation sheet

(11002KR-PE-01)
Notes. All of the student assignments were allocated a code. Take 11001BD-CN-02 for example. The first seven-digit
value indicates the participant ID; CN = Cornell notes; RW = Reflective writing; S = Slides; PE = Peer evaluation (sheet).
The final number is the ID for AI-powered tools. 01 = Linggle Write; 02 = Linggle Read; 03 = Linggle Search.
