You are on page 1of 11

Computers and Education: Artificial Intelligence 5 (2023) 100155

Contents lists available at ScienceDirect

Computers and Education: Artificial Intelligence


journal homepage: www.sciencedirect.com/journal/computers-and-education-artificial-intelligence

Developing middle school students’ understanding of machine learning in


an African school
Ismaila Temitayo Sanusi a, *, Solomon Sunday Oyelere b, Henriikka Vartiainen c,
Jarkko Suhonen a, Markku Tukiainen a
a
School of Computing, University of Eastern Finland, P.O.Box 111, 80101, Joensuu, Finland
b
Department of Computer Science, Electrical and Space Engineering, Luleå University of Technology, SE-931 87, Skellefteå, Sweden
c
School of Applied Educational Science and Teacher Education, PO Box 111, 80101, Joensuu, Finland

A R T I C L E I N F O A B S T R A C T

Keywords: Researchers’ efforts to build a knowledge base of how middle school students learn about machine learning (ML)
Machine learning education is limited, particularly, considering the African context. Hence, we conducted an experimental classroom study
Data (N = 32) within the context of extracurricular activities in a Nigerian middle school to discern how students
Ethics
engaged with ML activities. Furthermore, we explored whether participation in our intervention program elicit
Middle school
changes in students’ ML comprehension, and perceptions. Using multiple qualitative data collection techniques
Nigeria
including interviews, pre-post open-ended surveys and written assessments, we uncover evidence that indicated
evolution of students’ ML understanding, ethical awareness, and societal implication of ML. In addition, our
findings showed that a middle school student can learn and understand ML, even when one had no prior
knowledge or interest in science related careers. The findings have implication for pedagogical design of AI
instruction in middle school context. We discuss the implication of our results for researchers and relevant
stakeholders, highlight the limitations and chart future work paths.

1. Introduction consensus among relevant stakeholders about the significance of ML


lessons for young learners (Long & Magerko, 2020; Touretzky et al.,
The increasing pervasiveness and permeation of artificial intelli­ 2019; Vartiainen et al., 2021), integrating ML into middle school edu­
gence (AI) & machine learning (ML) in every sphere of our lives has cation is crucial. The focus on middle schoolers is germane as re­
necessitated the educational ecosystem to take cognizance of the searchers’ efforts to build a knowledge base of how middle school
compelling need to design a curriculum including tools and approaches students learn about ML is limited (Sanusi et al., 2022). However,
that makes children use ML efficiently, become more future-ready, integrating ML concepts in the formal school system requires weaving of
improve their critical thinking skills (Xia et al., 2022) and raise their relevant content into a curriculum, hence the need to create learning
desire to learn ML. While AI encompasses different subsets which in­ activities appropriate for the target population.
cludes ML, the vast majority of advancements in AI today are due to Indeed, a few curricula activities have been developed to support
machine learning models (Rodríguez-García et al., 2021). Owing to the learners to engage with ML in the middle school context. For instance,
claim that the current development in AI is mainly carried out by ML Lee et al. (2021) created AI curriculum activities grounded in child
algorithms, it is rational to assume that understanding how ML works is development literature, ethics, and career development. Lee and col­
key to AI literacy. Preparing students for ML will help create a genera­ leagues implemented the curriculum online with middle students and
tion of active creators and designers of future ML-powered technologies. reported a significant increase in students’ conceptual understanding of
According to Touretzky and Gardner-McCune (2022), ML is more than AI. Sabuncuoglu (2020) developed an AI curriculum consisting of three
technology; it can help students better appreciate the complexity of main modules which include how computers interact, how computers
humanity. With the benefits of teaching ML highlighted in literature and see and how computers hear. Sabuncuoglu’s workshops with students

* Corresponding author.
E-mail addresses: ismaila.sanusi@uef.fi (I.T. Sanusi), solomon.oyelere@ltu.se (S.S. Oyelere), henriikka.vartiainen@uef.fi (H. Vartiainen), jarkko.suhonen@uef.fi
(J. Suhonen), markku.tukiainen@uef.fi (M. Tukiainen).

https://doi.org/10.1016/j.caeai.2023.100155
Received 24 April 2023; Received in revised form 6 July 2023; Accepted 8 July 2023
Available online 17 July 2023
2666-920X/© 2023 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-
nc-nd/4.0/).
I.T. Sanusi et al. Computers and Education: Artificial Intelligence 5 (2023) 100155

showed how the curriculum could be applied in an engaging and fruitful agreement that basic knowledge of AI, affective component and societal
way. Williams et al. (2022) developed three middle school AI literacy implication of AI are critical for developing AI competencies (Ayanwale
curricula, called Creative AI, Dancing with AI, and How to Train Your et al., 2022; Kong et al., 2022, 2023; Lee et al., 2021; Xia et al., 2022). In
Robot. The authors evaluation of the curricula indicated that students their framework for AI, Kong and Zhang (2021) propose a
developed a critical lens to better grasp how AI systems work and how three-dimensional conceptual framework of AI literacy as a) cognitive,
they impact society. In addition to curriculum development, different b) affective, and c) socio-cultural dimensions. The cognitive dimension
visual tools and hands-on activities have been employed at the middle of AI literacy refers to educating people about AI basics and developing
school level to teach ML concepts (Mahipal et al., 2023; Sakulkuea­ their skills; the affective dimension involves equipping individual to
kulsuk et al., 2018). While existing works have demonstrated an respond appropriately with the ubiquitous use of AI in their lives or
important start to how students can be supported to learn ML, more workplaces; and the socio-cultural component focuses on the ethical
efforts are required to particularly extend ML knowledge propagation to implication of AI use in the society.
middle schoolers in African settings, a region with dearth of studies in As shown in Fig. 1, the three dimensions were adapted in this study
relation to this initiative. to include applicable concepts in the scope of extracurricular activities
Investigating how students in an African setting learn ML is neces­ designed to promote ML ideas in a public middle school. The partici­
sary to understand how the uniqueness of the social context can be pants a) learned about some ML techniques, specifically supervised
leveraged to best introduce ML to them (Sanusi, 2021; Gwagwa, learning which include classification algorithms such as decision trees,
Kachidza, Siminyu, & Smith, 2021). Gwagwa et al. (2021) argued that if k-nearest neighbor (k-NN), convolutional neural networks (CNN), and
there is “absence of significant AI research and development in Africa, expanded the scope to include algorithm bias (cognitive dimension); b)
the AI applications deployed in Africa tend to originate from outside the engaged with ML platforms such as Google’s Teachable Machine (GTM,
continent and will thus lack contextual relevance.” In order for youths in Carney et al., 2020) and Doodleit (Mahipal et al., 2023) among other
the developing context to keep pace with the global technological unplugged activities to learn to confidently interact with ML as well as to
advancement, it is imperative that “these sets of demography be be inspired to formulate new solutions and understand the impact of ML
equipped with AI skills required to prepare them for workplaces where for the social good (affective dimension); and were c) prompted to think
human-AI teaming is the norm” (Oyelere et al., 2022). A few studies ethically through discussions and videos of some negative and positive
have demonstrated an important start to how ML can be demystified in examples of facial recognition systems as well as public cameras (so­
classrooms in both formal and informal (DSN, n.d.; Sanusi et al., 2023) cio-cultural dimension).
settings. However, with ML concepts been postulated as difficult to
teach in the compulsory level of education owing to its abstract features 2.2. Learning ML in middle school
(Kim et al., 2023; Ottenbreit-Leftwich et al., 2022), students cannot be
expected to develop a full understanding of the concepts within the short While there is a call for more effort and initiatives regarding ML in
intervention (45 min) of the identified studies. Hence, the need for more middle school (Sanusi et al., 2022), few studies have laid the foundation
study with robust design and implementation of curriculum activities to for understanding what and how students learn ML within middle school
understand how students learn and how their attitude shifts towards education. Ali et al. (2021) and DiPaola et al. (2021) introduced the
learning ML over time. Creative AI curriculum which was used to teach how generative
To provide evidence for policy makers, researchers, and relevant adversarial networks work including deepfakes and its societal impli­
stakeholders on the feasibility of ML inclusion in the formal school cation. Lee et al. ‘s (2021) DAILy curriculum was built on creative AI
system in Nigeria, we embarked on an intervention study. Adapting an with much focus on ethics and career development. Leveraging on
existing ethics infused ML curriculum, called AI MyData (Anonymous, middle school students interests in dance and art, Jordan et al. (2021)
2023), we engaged middle school students in learning ML during a created Dancing with AI which comprise of AI-powered block-based
3-week, 6-sessions intervention program. Student engagement has been coding tools and learning modules which allows students to “design,
identified as a valuable means to foster and increase student learning build, and reflect on interactive, movement-based, multimedia experi­
(Renninger & Bachrach, 2015; Tarantino et al., 2013). Fredricks et al. ences through a user-friendly gestural interface.” Williams et al. (2021)
(2004) described engagement as a multidimensional construct that en­ designed How to Train Your Robot curriculum with learning activities
compasses behavior, emotion, and cognition. It was further stressed that centered around machine learning, autonomous robotics, as well as
engagement can result to commitment or investment which is key to speech and image recognition. Sabuncuoglu (2020) also developed an AI
enhancing students learning. We believe that our program will help to
introduce students to the joys of science at an early age and pique their
curiosity to be future ML experts or researchers. Drawing on prior
studies, the purpose of this study was to contribute to our understanding
of the evolution of middle schoolers’ comprehension and perceptions as
they engage in learning ML within the context of extracurricular activ­
ities in a public middle school.
This study addressed the following research questions (RQs):

RQ1. How does middle school students engage with the ML activities
in an African school?
RQ2. Does participation in our intervention program elicit changes
in students’ ML comprehension and perceptions in an African
school?

2. Conceptual framework

2.1. The three dimensions of ML literacy

Even though researchers and educators still grapple with AI learning


objectives, standards, and frameworks (Kim et al., 2023), there is an Fig. 1. The adapted three dimensions of ML literacy (Kong & Zhang, 2021).

2
I.T. Sanusi et al. Computers and Education: Artificial Intelligence 5 (2023) 100155

curriculum with specific focus on three areas including how computers and secondary school in Nigeria (Nwite, 2019). Relatedly, “AI IN AF­
interact, how computers see & how computers hear. It is worthy of note RICA,” an organization domiciled in South Africa create platforms to
that all the curriculum highlighted above in a way incorporate ethics in equip the youth to design an inclusive future with AI through work­
their learning activities which shows its significance in raising ethically shops, events, and collaborative programs with academic, public, and
aware citizens of future creators of AI. private institutions (Aiinafrica, n.d.).
In addition to the curricula initiatives for middle schoolers, we Besides NGOs intervention, researchers and authors of the present
recognize other efforts being made by researchers to promote ML. One of study have begun to provide scientific evidence on how teachers and
such efforts include development of technology platforms and tools to students can be supported to teach and learn ML respectively. The effort
ensure novices grasp the basics of ML and how ML works (Mahipal et al., based on our previous works can be summarized into three categories of
2023; Rodríguez-García et al., 2021). Another attempt to popularize ML teacher conception, student perception and learner experience in a non-
among youth has been using manipulatives and hands-on activities (Ma formal setting. We identified three papers on teacher conception. Of the
et al., 2023; Sakulkueakulsuk et al., 2018). As shown above, all the three papers, one explored the initial conceptions of some African
initiatives on AI and middle school education were reports of experi­ teachers about teaching ML in schools while others focus on examining
ences from Global North with a high proportion from the United States. factors influencing teachers’ intention and readiness to teach AI. Three
This suggests that more research focusing on youth in the Global South papers were found that reported the student perception of AI which
should be pursued to get a clearer picture of how students can be sup­ specifically centers around the role of students’ competencies in
ported to learn ML across contexts and make democratization of learning considering cultural competence and teamwork among stu­
teaching and learning of AI indeed a global initiative. Since it has been dents. One paper was found to have explored young learners experience
established in literature (e.g., Sanusi et al., 2022) that additional ML with ML in a non-formal setting. Our knowledge of the past research
resources are needed for middle schools, considering the suggests that initiatives regarding instruction design, curriculum
under-researched population, especially Africa, is important. framework or student motivation on ML/AI education were not
considered. This suggests the need to pursue these to uncover how to
2.3. Initiatives on learning ML with young learners in African context make ML learning effective and engaging to meet learning needs. This
study followed Merrill’s (2002) instructional design principles. Merrill
Different countries have begun to research into ML curricula among (2002) established five instructional design principles that can be
other initiatives to promote ML within the Europe and USA as well as applied when designing any program or practice to achieve effective and
HongKong among other developed regions. However, more curricula efficient instruction. This include making the content relatable and
activities need to be developed especially in the Africa context with solving real-world problem (problem-centered), activating existing
paucity of initiative towards ML in the compulsory level of education knowledge as foundation for new knowledge (activation), demon­
(Sanusi, 2021b). While different African countries may have different strating knowledge to learners (demonstration), applying new knowl­
needs for ML because African societies are intrinsically multiethnic and edge by the learners (application), and integrating new knowledge into
multicultural, the use of the emerging technology is inevitable. For the learner’s world (Integration).
instance, a recent comprehensive review of ML research trend in Africa
found that ML technology can help address some of Africa’s most 3. Methodology
pervasive problems, such as improving education, providing quality
healthcare services among other sustainability development challenges 3.1. Study design and procedure
(Ezugwu et al., 2023). As a result, it is imperative to build human ca­
pabilities to be able to harness the use of this technology. In an attempt Drawing from previous literature with similar goals (Lee et al.,
to build these capabilities, a proactive approach is preparing youth at 2021), we utilized a pretest-posttest design approach to evaluate our
their early age to begin to learn the fundamentals of how ML operate. intervention program on ML education for middle school students. Be­
Utilizing this approach helps to build a foundation for future learning as sides the pretest-posttest phases, to triangulate our findings, there were
several literature outside Africa, especially in North America, and written assessments and interviews (focus group discussions, FGD)
Europe has shown that learning ML at young age help children to during the learning phase of the program. The study was carried out in a
develop useful mental model (Mahipal et al., 2023) and motivate them regular school system within the context of extracurricular activities.
to be AI and ML applications developer and researcher (Touretzky et al., The whole program includes six 1-h sessions on ML lessons in the first
2019). Based on the potential benefit attributed to teaching ML in three weeks of second term in January 2023.
schools, as suggested in numerous studies (Vartiainen et al., 2021; Xia Participation was voluntary and participants were assured of ano­
et al., 2022), we hold the view that learning ML should be integrated nymity. Their parents’ consent to participate in the study was sought
into the curriculum in African schools or incorporated across the cur­ after informing them about the purpose of the research, the activities,
riculum to orientate them towards the development of future technol­ and the kind of data to be collected. We subsequently got the partici­
ogies for their countries as well as prepare them for global citizenship. pants assent at the onboarding phase; a day before the start of the pro­
The discourse about learning ML within the compulsory level of gram having been apprised of the research goal, as well as the topics that
education in African context is ill-informed (Miao & Holmes, 2022; will be covered during the study, the ML tools that will be adopted and
Sanusi et al., 2022d). However, building a knowledge base of how the data gathering techniques. The policy statement regarding enroll­
middle school students learn about ML in the developing context will be ment of children in research in Nigeria (National Health Research Ethics
valuable for African scholars, curriculum designers and policy makers to Committee of Nigeria NHREC, 2016) served as guidance in relation to
develop and strengthen ML education in schools. Although ML is not ethical consideration.
currently introduced as a new subject area or embedded across the
curriculum in African schools (Miao & Holmes, 2022), initiatives are 3.2. Context and participants
emerging among non-governmental organizations (NGO) and re­
searchers to equip school-aged students with ML knowledge. One of This study was carried out at a public middle school in southwestern
such organizations include Data Scientists Network in Nigeria who Nigeria during its extracurricular activities period. The extracurricular
design series of programs such as boot camps and competitions to spur activities are structured such that teachers and each student belong to a
children’s interest and prepare them for opportunities in Artificial In­ particular club (e.g., Jet club, Drama Club, Public Speaking club, etc.)
telligence (AI) and Data Science (DSN, n.d.). DSN, in bid to empower based on their interest. These clubs assemble for an hour bi-weekly to
children with present and future skills also unveils AI books for primary engage in activities set aside for the day/week. For instance, in Jet club

3
I.T. Sanusi et al. Computers and Education: Artificial Intelligence 5 (2023) 100155

which centers around discussion and engagement with hands-on activ­


ities related to STEAM, a meeting could focus on “how do we come to
know that a chemical reaction has taken place?” using intuitive exam­
ples. The club also engage its members in science fairs and excursions to
promote their interest in science, engineering, and technology. All these
clubs were ultimately designed to encourage and prepare students for
STEAM careers. Since the clubs are organized along the lines of science
and art, we reach out to all groups to accommodate participants of
varying interest and career aspirations.
Initially, 47 students expressed interest and signed up for the study
across grade levels within the middle grade bands. Given that we
planned to work with a sizable number 0f students with varying grades,
interest and background, this study reported only the demographics and
activities involving 32 students we selected. The selected students were
eight candidates each across four clubs considering their grade levels
and sex. Ten of the participants were in 7th grade, ten were in 8th grade
while twelve were in 9th grade. In terms of gender representation, 20
were boys and 12 were girls. They have their ages ranging between 11
and 14 and about fifty percent of them are interested in a STEM-related
career in future. Consistent with the design of our program which did
not require knowledge of programming, AI, or ML, all the participants
reported that they did not have background on any of the three concepts.
The students take computer studies since it is a compulsory subject in the
Nigerian school system.
Fig. 2. The ML lessons covered in the program.

3.3. Materials
neural network (CNN), Algorithm bias, and Ethics of ML. Our peda­
This study took a cue from Eguchi et al.’s (2021) work who adapted gogical strategy across the sessions includes a fair mix of activity-based
an AI ethics middle school curriculum implemented with students in the learning, group work, instruction, and discussion.
USA (Payne, 2019) for Japanese classrooms. We deployed AI MyData In session 1, we began with the definition of AI and discussed the four
(Anonymous, 2023) with slight modification (see 3.3.2 for the changes) main things AI can do (perceive, learn, decide & interact) with illus­
to further calibrate the curriculum and some of its activities to ensure trations. We further prompted the participants to mention what they use
contextual relevance. The adapted curriculum was initially piloted with or see on a daily that has AI in them. What is AI and what is not AI
sixth to ninth graders in the United States. Since the present study focus formed part of our discussion using different examples. Further, we
on similar set of demography (6th to 9th grade students), we assume it introduced Google’s Teachable Machine (GTM) and demonstrated how
could be suitable for them even though they reside in different region. it works with the students – allowing them to critique if it recognizes two
The curriculum’s name was coined based on the premise that without persons differently. Building on the GTM demonstration, we presented a
“data,” there is no AI/ML. The activities in the adopted curriculum fish recognition activity where students were to work with the hypoth­
include data science and ML concepts weaved with ethical orientation. esis “is GTM able to recognize venomous fish and harmless fish?.” In
We however focused primarily on the ML-related concepts and its session 2, we taught the students decision trees with an unplugged ac­
ethical implication in this study. Detailed below are the pre-test, the tivity approach – the use of pasta kits (Lee & Martin, 2018). Having
learning, and the post-test phases. introduced the concept of decision trees through a PowerPoint presen­
tation and how to execute the activity, we gave the students tasks to
3.3.1. Pretest build a decision tree to classify five different types of pasta handed out to
The pre-test provided us the opportunity to assess the participants’ them. After the task, we handed out a mystery pasta and had the stu­
entry behavior before unveiling the activities in the intervention pro­ dents classify based on the decision tree they earlier built. We had
gram. In this phase, we also gathered the participants demographic in­ conversations about the mystery to know if their tree classifies it with a
formation which includes gender, age, grade level, and career closely related pasta shape. The discussion led to conversations about
aspiration. Four open-ended items adapted from prior study were used the tree having bias or not. Session 3 focused on k-nearest neighbor
to assess the students’ prior knowledge. The items centers on how the activity. The paper activity originally used in the study of Ma et al.
participants understand ML, how ML is important to them, how they (2023) applied k-NN to find the species of three mystery penguins. Since
would like to use ML in their daily life and discerning their concerns duck is predominant and easily recognized by the students in the study
with ethical implications of ML. To determine whether participation in area, we had them identify three species of duck (Muscovy duck,
our intervention program elicit changes in students’ ML comprehension, Campbell duck and Pekin Duck) in Nigeria. The activity was introduced
we launched the pre-test to serve as a baseline. The pretest was after explaining that k-NN is one of the simplest classification methods,
completed a day before the learning phase. when k-NN is to be applied and the illustration of when k = 1 and K = 3.
Session 4 centered on how features are extracted from images in CNN.
3.3.2. Learning phase The aim of the lesson was to learn about how CNN performs image
The learning phase was organized as six sessions with different ML recognition. The session commenced with the introduction to image
topics and learning activities per session. Each of the sessions lasted for recognition. We then describe CNN after which all the students had a
an hour period. The adopted AI MyData curriculum with slight modifi­ hands-on practice with DoodleIt, a simple sketch recognizer for learning
cation to its activities to further calibrate it and as well ensure contextual CNN (Mahipal et al., 2023). We further introduced an unplugged
relevance. Learning the concepts in Fig. 2 contribute to ML literacy as paper-based activity about how computer sees different types of animals
depicted in Fig. 1 (see 2.1). As shown in Fig. 2, we specifically taught the to reinforce learning of the CNN concept. In session 5, algorithm bias
students six ML-related concepts which includes: Introduction to AI & was the topic of discussion. The session started off with definition of
classification, Decision trees, k-nearest neighbors (k-NN), Convolutional algorithm bias and gender shades video (Buolamwini, 2018) showing

4
I.T. Sanusi et al. Computers and Education: Artificial Intelligence 5 (2023) 100155

bias in face recognition technology. Since the issue of bias in AI/ML capture and summarize their thoughts. We also refer to their feedback as
technology is still pervasive and embedded, we consider it as a topic to quotes to buttress our assertions. Per the interviews, the transcribed data
consider. We further introduced an activity using the pose model provided us with information on students’ comprehension of ML con­
segment of GTM (with their bodily expressions) to depict bias by varying cepts. We used quotes and excerpts from the participants to show how
samples uploaded in the classes to be trained. In the final session, we they were thinking about ML.
taught and discussed ethics of ML. We explained the idea of ethics and
why ethics of ML is important. We further showed videos of some 4. Result
non-desirable and positive examples of facial recognition in the society
to prompt the students to think about the ethical and societal implica­ 4.1. Pretest results
tions of ML. The students worked with either computer-based applica­
tions or pencil and paper/manipulatives or both in most of the sessions. In the pretest responses, 80% (N = 32) of the students recognized
The students were assigned to a group with a total of 8 groups with each only smartphones as the common things they know that have ML in
group comprising four students. Throughout the program the partici­ them. Only two of the participants specifically mentioned face ID
pants mostly performed tasks and worked in groups. Our program is application and automated teller machine. About 60% (N = 23) who
designed to understand how students behave, how they feel, and how responded to the question “is AI important to you?” believes that AI is
they think about ML which is the overall aim of engagement (Fredricks important to them because of income and learning of relevant skills as
et al., 2004). reflected in their statements: Yes, one can gain relevant skills that can
provide income; It helps to understand changes in our environment.
3.3.3. Posttest Regarding how they would like to use ML technology in their daily life,
The posttest was carried out immediately after the intervention students were interested in using the technology in planning their time,
where the participants answered the paper-based test. Considering that for future career, and providing services for organizations. Finally, when
our intention is to measure students’ conceptual understanding of ML they were asked about the concerns they have with respect to ethics and
and the effect of the intervention vis-à-vis their prior knowledge, we social effect of ML, about 12% (N = 4) had concerns which were coined
adopted the items used in the pretest phase. However, we added two as “negative things about the internet,” “less use of ML applications in our
questions to allow participants to reflect on what they found most society.”
interesting and/or most confusing in the learning materials to assist us in
further curating those materials.
4.2. Evidence of engagement with learning activities
3.3.4. Data collection and analysis
Literature has shown that to enhance the credibility of what is found, Here, we detailed some of the evidence that portrayed the partici­
triangulation of evidence should be adopted (Lincoln & Guba, 1985). As pants’ engagement with the learning activities across the sessions. The
this paper focuses on discerning participants’ comprehension and per­ findings presented are based on the transcribed data of the interviews
ceptions as they engage in ML lessons, multiple data collection tech­ and written assessment/completed tasks during the activities.
niques were considered to gauge how they process the learning
materials including their experiences. Specifically, we elicit information 4.2.1. Session 1
through pre-post open-ended surveys, FGD, and written assessments. Per The students’ engagement with GTM activity provided us with a
the focus group interviews which were conducted in the last 10 min of proof of how they began to learn about classification algorithms. Having
each session, we used semi-structured questions which bothers on the demoed the functionality of GTM with the students to determine
topics/activity for each session. For instance, one of the cognitive whether the learning tool recognized two individuals differently, they
prompts for the first activity was “Does varying the size of your input embarked on the main activity for the session – fish recognition activity.
data impacts accuracy? What else could impact accuracy? Why?.” The hypothesis for the activity was “Is GTM able to recognize venomous
Another example of a question on the algorithm bias lesson was “How fish and harmless fish? The students in groups were handed 12 different
could we make our training data better to avoid bias?.” The written printed cards with 4 each being venomous, harmless and test cards. The
assessments and deliverable tasks were also topic specific as shown in e. test cards (scorpionfish, bluefish, stargazer fish & croaker fish), a mix of
g.,Figs. 3, 4 & 5 to assess the comprehension of the ML concepts. venomous and harmless fish were unlabeled since they are meant to test
Following the approach adopted by earlier studies (Ma et al., 2023; out the output of the trained labelled model. Prior to the lesson, the
Mahipal et al., 2023), we showcased some of the assessment including participants are not aware of the different types of harmless fish and
completed tasks and provided a narrative on how it suggests the par­ venomous fish. Getting to use these ideas to teach classification based on
ticipants understand our learning materials. While our analysis relied on the categories the fishes belong opens up the possibility of teaching ML
a qualitative content analysis (Elo & Kyngäs, 2008), we specifically within the context of life sciences.
adopted the approach of Mahipal et al. (2023). We analyzed the Before the participants explored the GTM by training both categories
pretest-posttest by reading through the responses of the participants to of fishes, they had earlier predicted the two fishes that were either
venomous or harmless in a paper activity. With the learning tool, the

Fig. 3. Decision tree created by the participants using pasta.

5
I.T. Sanusi et al. Computers and Education: Artificial Intelligence 5 (2023) 100155

Fig. 4. Worksheet completed by the participants on classification of ducks with k-NN algorithm.

Fig. 5. Completed tasks used to assess the way participants comprehend how computers recognize images.

students’ grouped both examples of the fishes into categories they accuracy when they were prompted to ascertain if GTM can recognize
wanted the computer to learn and trained their model. They then tested themselves and their colleagues differently. The students also believe
the model to see whether it correctly classified the new examples of the using a variety of sample data with different features could also impact
unlabeled fishes. The students were excited to see that the computer accuracy because the model will only provide outputs similar to or based
recognized venomous and harmless fish differently. However, in most of on what it was trained on.
the groups, the computer fails to correctly identify if one of the fishes
Researcher: Does varying the size of your input data impacts
(croaker) was venomous or harmless with a high confidence rating. This
accuracy?
contention led to focus group discussions where participants reflect on
what transpired in the session. FGD3: Yes, more data in different classes may affect the accuracy of
We were able to gain insight into how the students understood the output. It recognizes our test data with high confidence rating
classification during conversation about the GTM activity. About 90 when we demonstrate if a computer can differentiate the faces of four
percent of the students stated that when they compare the GTM output of us.
with their own conclusions from the unplugged activity, the model
Researcher: What else could impact accuracy? Why?
agrees with them. As shown in the excerpts of the FGD below, the stu­
dents indicated that they recognized venomous and harmless fish FGD2: The use of a variety of data samples can affect. I also think
differently based on their features. The conversations further revealed about the features of those data. We thought of these examples
that they assume computers also recognize the two kinds of fish differ­ because you must train your model first to see the effect as output.
ently based on their physical attributes.
Researcher: Does the model agree with you when you compare 4.2.2. Session 2
GTM’s output with your own conclusions from the unplugged The decision trees built by the participants and their reflection at the
activity? end of the session indicates how they understood the concept. As shown
in Fig. 3, the students were able to classify different types of pasta
FGD1: Yes, most (3 out of 4) of the fishes were recognized correctly.
handed to them. The session started with introducing the concept of
It is unsure if croaker is either venomous or harmless and most of the
decision trees through a 5 questions game with animals, where the
time identify it as venomous. The confidence rating is below 50
students organize different questions into different trees based on their
percent.
characteristics. We further introduce the Pasta land activity with a few
Researcher: Why does the computer classify croaker as a venomous example questions about the pasta to get the kids started. The partici­
fish? pants worked as a group to create pasta decision trees with 5 pasta types
handed to them and then share the model. We provided a new pasta for
FGD2: It is possible the computer identifies croaker as a venomous
the students to test the testing pasta on their own model and discuss the
fish due to the similar characteristics it shares with other venomous
outcomes.
fishes. They have rough body types unlike harmless fish.
Vignette of completed tasks (Fig. 3) indicated how the decision tree
Further, we got a glimpse of how students think of input data and its method was used for classification. In both (a) and (b), the students
impact on the output of the model. As shown in some passages of dia­ created decision trees based on the data features, in this case pasta
logue between the researcher(s) and participants below. More than half features. Both images adopted different attributes of the pastas to build
of the students opined that having more data may impact accuracy when their flowchart-like tree structure. This different approach of thinking
testing out the model. They conceived this idea of more data, more about the data categorization showed that the students were making

6
I.T. Sanusi et al. Computers and Education: Artificial Intelligence 5 (2023) 100155

their own choices based on their understanding that the decision tree works. For instance, the students in (a) likened the duck image to either
depicts splitting the source set into subsets based on attributes. The chicken, goose, or turkey with the explanation that they share compa­
students thought about the activity further indicate how they think rable characteristics. Also, in (b), the students even choose only goose
about decision tree concept. A student mentioned that I classified the with the justification that it is closer to duck in terms of attributes. The
pasta along the line of physical appearance and another stated that the responses provided for the completed assessment suggests that the stu­
pasta was categorized according to the similarities they share, when asked dents understood the basic concept of CNN.
how they were asked how the answers were arrived at. It is further
evident that the students understand that there are not many ways to the 4.2.5. Session 5
classification other than considering the characteristics of those The algorithm bias activity which focused on using the pose feature
elements. of GTM provided us some pointers to how students construed algorithm
bias. The activity was conducted by varying the pose samples of different
4.2.3. Session 3 classes and testing out the model. Their tested model was found to be
Fig. 4 showcases the tasks performed by students where they applied biased most times towards the samples with more datasets. The con­
k-NN to identify three mystery species of ducks. Having described k-NN versations with students after the activity revealed that they recognize
as a supervised learning classifier which uses proximity to make classi­ that ML depends on the size of the data used to teach it. Below is an
fications, we demonstrated through the species of duck that it classifies excerpt from two of the groups.
the data point on how its neighbor is classified. We also mentioned that k
Researcher: What happened after testing your algorithm? Why?
represents the number of the nearest neighbors used to classify new data
points. In the task completed by the students, we asked them to classify Group 2: It recognizes our poses that have more data. I tripled one
when k = 1 and k = 3. As shown in Fig. 4, the completed tasks revealed category of the samples during the training.
the students realized the rationale behind the k-NN algorithm.
Group 4: There is bias towards the classes that has much data of
In (a) above, the student was able to show that the quality of the
poses. We have few poses captured in one class and several poses
predictions depends on the distance measure, likewise in (b) who could
recorded in another class.
apply the k-NN idea to identify the mystery ducks. The figures a & b
exemplifies the kind of feedback retrieved from the participants about The response from the students suggests their understanding of al­
the distance-based algorithm. From the cognitive interviews feedback, gorithm bias in that they recognize that bias can emerge from decisions
we deduced that while students identified when k = 1 differently, they relating to how data is collected and selected to train the algorithm. To
all understood that the k-NN algorithm considers the closest training make the students aware that many factors contribute to a computer
data point to the point they want to make a prediction for. We concluded system yielding unfair outcomes, we initiated a discussion around the
that students realized the rationale behind k-NN algorithm based on the subject prompted through a video on racial discrimination in face
follow-up conversations we had with them. For instance, when the recognition technology.
students that completed tasks in Fig. 4 (a) & (b) were asked how they With a video by Buolamwini (2018) on facial recognition systems that
arrived at the answers, (a) mentioned that “I measured the spaces between did not recognize darker, female faces as well as lighter, male faces, we
the different types of duck and unknown duck to determine the closest” and further reinforce the idea of “training data matters,”. The 5-min long
(b) similar stated that “I was looking for the duck species that are within the video was followed by discussion on why the problem identified in the
shortest distance to the mystery ducks.” The students reported that they video should be of concern and ways it could be addressed. When the
can now recognize the three species of ducks they see around almost students were asked what they make of the video. Students’ answers
every day by their name. In response to “where else do you think k-NN were that recognition technology does not work for people of colors as
can be applied to,” the students mostly mentioned different species of exemplified by a student response that “the technology does not work for
animals and objects. However, two students gave descriptions that people with dark faces.” Regarding why the issues raised in the video
suggest recommender systems. A student said facebook suggests friends should be a topic of interest, most of the students responded that the
and people we may know and another mentioned that an online shopping technology would work for selected people which is different from the
platform could suggest items close to what was initially searched or bought.” intended function of the algorithm if not addressed. For instance, a
This response is an indication that students can learn ML concepts and student mentioned that “facial recognition technology will only work for
infer how it is applied in the real-world through use of practical exam­ few sets of people if the topic is not discussed” and another student stated
ples and approaches. that “mistakes of identifying people wrongly may occur”. In response to
how the racial discrimination pointed out in the video can be managed,
4.2.4. Session 4 the students recognized that use of variety of data considering different
With a computational tool, called Doodleit and an accompanying skin types among other qualities is key.
paper activity, we introduced CNN concepts to the students. How the The students’ responses indicated that when the technology doesn’t
tool operates is that whenever a student sketches on the area dedicated work the same way for all people, it can be misleading and causes
for drawing, it visualizes the filter and features maps including the humans to make inaccurate decisions. Their feedback on how to address
output neurons in percentages based on the six classes of images (cat, bias suggests that datasets should be better curated. The students were
sheep, triangle, cake, door & apple) with which it was trained. The able to establish that one of the major causes of bias in ML is insufficient
students interfaced with the Doodleit in their respective groups and training data.
were fascinated with the interactive nature of the tool. However, they
have concerns with how the image drawn is being misrepresented and 4.2.6. Session 6
attributed to a totally different image in the output neuron. For example, We were able to raise the consciousness of the participants on ethics
a sketch of a fish, dog or a random image could be shown as a cat. We of ML through videos of scenarios that showed the pros and cons of the
showed the students some examples of datasets with which each of the use of facial recognition technology. Further, we discussed some ethical
six images were trained and we mentioned to them that the image drawn dilemmas as a prompt to understand how they think about the ethical
was identified by feature extraction. They understood that data has so implication of ML around social media and facial recognition
much influence in the recognition accuracy and that the computer rec­ technology.
ognizes images based on similarity in their features as shown in the Our discussion was guided by three main questions adapted from
completed assessment in Fig. 5. Skirpan et al. (2018): 1) Do you believe facebook/instagram/tiktok is
As shown in Fig. 5, it shows the students got the idea of how CNN good or bad for society?, and 2) Do you believe face recognition

7
I.T. Sanusi et al. Computers and Education: Artificial Intelligence 5 (2023) 100155

technology is good or bad for society?. students can be engaged in learning ML concepts. The students’ inter­
In response to question 1, all the students believe that social media action with the computer-based tools and concrete materials across the
platforms can either be good or bad for society depending how and what learning phase showed that employing such approaches is effective for
individuals use it for. While most of them speak highly of the chances it promoting children’s participation in ML activities. This is consistent
provides for communication and networking, few reflect on revealing with past research (Lee et al., 2021; Ma et al., 2023; Williams et al.,
personal information. 2022) which adopted similar approaches with middle school students in
developed contexts. Having subjects from the developing context
With respect to question 2, almost all the participants saw the
demonstrate their understanding of ML concepts through engagement
technology from a positive perspective with appreciation for how the
with ML platforms, tangibles and manipulatives indicate that students
technology can aid in policing. They specifically wonder if the
regardless of their geographical location can learn ML using such
technology is in use by Nigerian law enforcement agencies and how
approaches.
it should be adopted to track down suspected criminals, solve crimes
As it is widely acknowledged that engagement plays a critical role in
and find missing people. We got an indication that the students have
learning (Greene, 2015; D’Mello et al., 2017), advancing the scientific
begun to have ethical orientation based on the dialogue that ensued
study of how students learn ML requires investigating students’
among some FGDs:
engagement with ML activities. This study’s findings align with existing
FDG4: M1 – … ….. if facial recognition technology is being used by evidence that indicates when students are engaged in their own
police for example, it means our locations will be monitored. F2: I learning, it comes with multifarious benefits including increased moti­
also think the technology may be inaccurate, as we all see when vation and achievement (Aguilar et al., 2022; Sinatra et al., 2015). Our
Google’s Teachable Machine recognizes harmless fish as venomous result suggests the students were cognitively, emotionally, and behav­
fish. iorally engaged based on the three components of engagement proposed
by Fredricks et al. (2004). The result that demonstrates cognitive
FDG7: F1 - … ….. so, with technology, will the government or other
engagement pertains to the students’ investment in the learning task,
people be aware of what I do in private? M1: Yes, it looks like
specifically as it relates to understanding of content and learning ma­
someone will be watching whatever we do which I don’t think is fine.
terials. As detailed in the result (section 4.2), we could deduce increase
The excerpts above suggest that students have started to think ethi­ in learning gains based on the completed tasks, conversations, and re­
cally in relation to use of ML applications which suggest they will be sponses to cognitive interviews across the program sessions. Further in
more attentive to such issues in future. They were also able to identify the analysis, the feelings, and attitudes of the students about the ML
why ethics is important – ensuring ML initiatives maintain human dig­ activities illustrates emotional engagement (Renninger & Bachrach,
nity and do not in any way cause harm to people. All the participants 2015). Specific examples include the expression of interest by the stu­
indicated they would like more lessons on ethics because it will help dents to take more lessons on ethics of ML, interest in exploring GTM
them understand better the effect of ML on society. more as well as other ML tools utilized in the program and general
satisfaction about most of the program activities. According to King
4.3. Posttest results (2020), “a student who arrives to class prepared to learn and engages in
the learning process through on-task behavior and class participation is
In contrast to the findings from the pretest results, students exhibiting the broad indicators of behavioral engagement.” As obtain­
mentioned several things they know that have ML in them which in­ able in the study of Gomes et al. (2023), examples of actions that re­
cludes social media, google, face ID and voice assistants, image recog­ flected behavioral engagement were completion of exercises, tasks, and
nition apps. Almost all the students (26 out of 32) found ML more written assessments including interaction with peers and instructors.
important with reasons related to preparation for future learning and its There is also an indication of social-behavioral engagement (Pekrun &
use for social good. For example: It will help me to learn machine language Linnenbrink-Garcia, 2012) considering how the students collaboratively
in the future; It will help to solve peoples and societal challenges. In response participated in the ML activities with their peers. An instance is inter­
to how they would like to use technology in their daily life, they action among the participants during the group work activities to
mentioned crime detection, to improve security and assist the physically complete a given task. Such as in Fig. 3 where students deliberated on
challenged. Most of the students had concerns about the societal effect of pasta classifications in a bid to understand the decision tree concepts.
ML after the program. Their concerns can be summarized into algorithm The quality of the social interactions during the session on ethical
bias, misuse of data and infringement of personal privacy as shown in implication of ML using facial recognition technology as a point of dis­
some of their responses: My concern is that if you did not have enough data, cussion also illustrate the occurrence of social-behavioral engagement
you will not get accurate results; our data can be used without permission. (Linnenbrink-Garcia & Pekrun, 2011).
Almost all the students reported they found GTM activity more inter­
esting and decision tree lesson less interesting which suggest that the 5.1.2. Does students’ participation in our intervention program elicit
activity could be more calibrated. changes in students’ ML comprehension and perceptions?
We analyzed the participants’ responses to our pretest-posttest
5. Discussion questions to determine whether the students’ participation in our pro­
gram elicit qualitative changes. The findings show an obvious difference
5.1. Reflection of findings and practical implication from how the students conceptualize ML before and after the interven­
tion study. The participants now realize that ML is encountered by them
Here, we briefly reflect on our findings with respect to the two daily with the use of certain devices or services. Our findings revealed
research questions guiding this study and highlight the implication for that even though there is unequal access to new technology in Nigeria
practice. due to social inequalities in income distribution (Sanusi et al., 2023),
students are not unaware of some of these emerging technologies. They
5.1.1. How does students engage with the ML activities? interact with smart devices, but they do not know they contain ML ap­
Our study supports the ample evidence that indicated technology plications. It is also interesting to note that students in their posttest
incorporation in classrooms as an instructional tool enhances student views after the program showed they recognize the use of ML for social
learning and educational outcomes (Devlin et al., 2013). The impact, whereby ML applications are developed to tackle social, envi­
grade-appropriate materials which encompasses hands-on, and collab­ ronmental and health challenges in the society. Against the imagination
orative approaches to learning adopted in this study suggest how that ML is only focused on by big tech giants and experts, the young

8
I.T. Sanusi et al. Computers and Education: Artificial Intelligence 5 (2023) 100155

[…] from how the students conceptualize ML before and after the intervention study. The participants now realize that they encounter ML daily when they use certain devices or services. Our findings revealed that, even though there is unequal access to new technology in Nigeria due to social inequalities in income distribution (Sanusi et al., 2023), students are nevertheless aware of some of these emerging technologies: they interact with smart devices, but they do not know that these devices contain ML applications. It is also interesting to note that, in their posttest views after the program, students recognized the use of ML for social impact, whereby ML applications are developed to tackle social, environmental and health challenges in society. Contrary to the assumption that ML is the exclusive concern of big tech giants and experts, the young students were able to see that this technology impacts their everyday lives and that they are key stakeholders in ensuring that we build ethical ML into the applications, products, and services of the future. Based on the pretest-posttest responses, we could infer that our intervention, that is, the students' engagement with the ML activities, led to the changes observed. This finding is consistent with past and related research exploring the pre- and post-intervention effects of an instructional AI program in a middle school context (Lee et al., 2021). The outcome of this study indicates that teaching ML algorithm concepts to middle schoolers is possible in Nigerian schools, provided that learning resources are available and relevant approaches are utilized.

Our findings have implications for practice. First, since the results of this study corroborate the contention that learning cannot have the desired impact on learners unless the instruction is engaging and effective (Chiu, 2021), educators should design ML activities that give students opportunities to engage cognitively, emotionally, socially, and behaviorally. Second, the use of both plugged and unplugged approaches should be encouraged to effectively support students' learning of ML concepts. Unplugged learning has been found to increase young students' mastery of several technology-related concepts (Hermans & Aivaloglou, 2017; Wohl et al., 2015). Likewise, past research has shown that plugged activities get students engaged and can foster computer literacy learning, including programming concepts (Brackmann et al., 2017; Sáez-López et al., 2016). To address the limitations that may arise from using either approach alone, it is recommended to combine computer-based activities with activities for which no digital equipment is needed (Brackmann et al., 2017; Sigayret et al., 2022). Third, the collaborative learning approach exemplified in this study could be adopted by teachers. While Gillies (2014) describes collaborative learning as a teaching tool that encourages students to think critically and solve problems, research has also shown that collaborative learning promotes students' STEM interest (Casad & Jawaharlal, 2012; Mosley et al., 2016). Adopting a collaborative approach is all the more important in the Nigerian context, since past research suggested that there is a lack of collaborative skills among students in Nigerian schools (Akor et al., 2019; Sanusi et al., 2022c). Fourth, based on the premise that introducing materials, tools and activities that are familiar to the student leads to effective learning (Russell et al., 2012), educators should contextualize students' learning experiences using items or objects that are of interest in their specific context. This is particularly relevant as prior research indicated that contextualized resources are effective for AI implementation in schools (Eguchi et al., 2021; Oyelere et al., 2022). In the present study, we introduced a context-bound activity by adapting a k-NN penguin activity to the duck, which is common in the study area, and we used images of mammals and birds familiar to the participants to illustrate the idea of recognition in a CNN (a minimal sketch of such an activity is given after this paragraph). We believe that the use of relatable materials could encourage students to learn about ML. Fifth, school administrators could popularize ML concepts in their schools by creating AI or ML clubs, which would serve as platforms to promote the learning and design of ML applications. While educators are expected to engage learners with a series of approaches, they also require professional learning opportunities to execute effective instruction; therefore, professional development programs are needed to equip teachers with the technological, pedagogical and content knowledge and the skills to integrate ML into their teaching (Tang et al., 2022). Finally, this study's findings show policymakers the feasibility of teaching ML in classrooms, so that resources can be made available to support the integration of ML in schools.
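To make the contextualization point above concrete, the following sketch shows the kind of plugged activity we have in mind: a k-NN classifier that labels a bird as a duck or a chicken from two simple measurements. It is a minimal illustration only; the measurements, class labels, and the use of Python with scikit-learn are assumptions for this sketch and not the exact materials used in the intervention.

```python
# Illustrative sketch of a context-bound k-NN activity (hypothetical data).
from sklearn.neighbors import KNeighborsClassifier

# Each bird is described by [body mass in kg, beak length in cm].
training_birds = [
    [2.8, 6.0], [3.1, 5.5], [2.5, 5.8],   # ducks
    [1.6, 3.0], [1.4, 2.8], [1.8, 3.2],   # chickens
]
labels = ["duck", "duck", "duck", "chicken", "chicken", "chicken"]

# With k = 3, a new bird takes the majority label of its 3 nearest neighbours.
model = KNeighborsClassifier(n_neighbors=3)
model.fit(training_birds, labels)

# Classify a bird the students measured themselves.
print(model.predict([[2.6, 5.2]]))  # -> ['duck']
```

The same idea can be run unplugged by plotting the birds on graph paper and letting students find the nearest neighbours by hand, which mirrors the combination of plugged and unplugged approaches recommended above.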
Additionally, this study provides some suggestions for developing middle schoolers' ML literacy in Africa or similar contexts. Based on the results of this study, we recommend that educators and practitioners contextualize the activities according to the needs of each African society, since the continent is home to diverse populations, with each country having its own tribes, languages, and cultural differences. This study also suggests that the basic idea behind ML can be understood by middle school students in Nigeria, which is an indication that children in other African countries may be able to process the ML learning materials. This claim should be validated by future research involving children from other African countries. This study further showed that out-of-school time can be used to promote the concept to students; to this end, policymakers, education leaders, practitioners, and non-governmental organizations on the continent can provide support for educators and students to learn through out-of-school learning venues. Pursuing such initiatives is important, as evidence has shown, especially in the western world, that afterschool and summer learning programs are an essential tool to support and expand K-12 computer science education (Dunton & Rodriguez, 2018; Ma et al., 2023).

In summary, considering the three dimensions of ML literacy adapted in this study, we analyzed the triangulated data from participants for pointers showing their (a) understanding of basic ML concepts (cognitive dimension), (b) use of and interaction with ML applications (affective dimension), and (c) recognition of the ethical implications of ML applications (socio-cultural dimension). We addressed all these dimensions by demystifying the concepts of ML with bite-sized examples, facilitating interaction with ML platforms, and initiating discussion around the use of facial recognition technology.
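For the image-recognition strand of these activities (recognizing familiar mammals and birds, and the discussion of facial recognition), a very small convolutional network can convey what happens behind visual tools such as Google's Teachable Machine. The sketch below is an assumption-laden illustration rather than the tool used in the study: the framework (Keras), the image size, and the class names are ours.

```python
# Illustrative sketch of image recognition with a tiny CNN (hypothetical setup).
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(64, 64, 3)),          # small colour photos of local animals
    layers.Conv2D(16, 3, activation="relu"),  # filters that respond to edges and textures
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(2, activation="softmax"),    # two classes, e.g. "duck" vs "goat"
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# images: float array of shape (n, 64, 64, 3) scaled to [0, 1];
# labels: integer array (0 = "goat", 1 = "duck") collected by the class.
# model.fit(images, labels, epochs=5)
```

In the classroom, learners would typically interact with such a model through a visual interface rather than code; the sketch is only meant to show what such an interface does internally.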

5.2. Limitations and future work

We noted four limitations of this project, all relating to the study design. First, the participants were recruited from a government-owned school in an urban city in Nigeria. Future research needs to examine other regions of the country with different ML needs, including privately owned schools and rural areas, to address equity issues and the dichotomy between school location and ownership (Sanusi & Olaleye, 2022). Second, the program detailed in this paper was conducted as an extracurricular activity within the formal school system, which makes it optional learning. Future research could conduct the study as part of regular classroom activities or adopt a longitudinal research design. Third, we did not analyze our results based on the participants' interests. This may be an interesting future research direction, since our participants were recruited from clubs organized along the lines of sciences, languages, and art, with varying career interests and aspirations. Fourth, the sole focus on students' feedback, without teachers' data, is a limitation of this study. Supporting data from teachers would have provided us with more evidence on how they believe their students regard ML. However, the series of data collection approaches we used offered enough evidence to conclude that students understood basic ML concepts and their ethical implications. Lastly, this study attempts to ensure the credibility and transferability of the findings to fulfil the trustworthiness requirement of qualitative research (Shenton, 2004). This was done by triangulating the data and methods and by reporting the data collection process in detail. However, we did not consider investigator triangulation or participant checking during the analytical phase of the research (Twining et al., 2017), which could further enhance the trustworthiness of our work.

We have four future work paths. First, we plan to build a no-code machine learning platform for young learners in an African context, with a bias towards local nature, culture, accents, and artefacts. The application interface will accommodate interaction among teachers, students, and parents, alongside other modules on the learning platform. Developing tools with more functionalities that support collaborative teamwork and sharing is important, as earlier studies (Gresse von Wangenheim et al., 2021) observed that existing tools do not support teachers in monitoring their students' learning process. Second, we plan to further revise the content in this study and create an ML curriculum that follows an interdisciplinary approach. We will particularly customize the topics addressed to motivate the students by engaging them with problems and approaches that are of interest in their specific local context, e.g., the use of relatable examples and contextual games as proposed for students in Nigeria by Oyelere et al. (2022). Developing an ML virtual tool and curriculum for the emerging context is supported by past research (Eguchi et al., 2021; Oyelere et al., 2022) that exemplified the importance of contextualizing resources and learning experiences for the targeted population. Third, as it is widely observed in the literature that professional learning opportunities for ML are lacking, especially in Africa (Ayanwale et al., 2022; Ayanwale & Sanusi, 2023; Sanusi et al., 2022b), we plan to create teacher training initiatives through workshops and short courses to prepare educators adequately for the implementation of the contents in classrooms. Teachers' centrality in promoting ML ideas in school has been established (Sanusi et al., 2022e), hence the need to prepare them and build their belief in its relevance for students through the provision of ML knowledge as well as relevant pedagogical and technological content. Lastly, we aim to carry out a semester-long or academic year-long study to evaluate the students' learning outcomes, their understanding and application of ML ideas, and their attitudes towards ML. We anticipate that our resources will be scaled to students and teachers across schools and learning contexts in different regions for wider access and acceptability. In addition, how policymakers are thinking about ML education for young learners is a potential research question. The policymakers' point of view would indicate how they regard the subject and how they could be convinced, since adopting a new subject as curricular material requires an analysis of the state's policy and future needs (Sanusi et al., 2022c).

6. Conclusion

There is a dearth of research in developing countries, and specifically in Africa, on ML education within the compulsory level of education. Consequently, it is necessary to explore ways of implementing instructional units that give insight into how best to approach this increasingly popular concept in schools. This paper provided a qualitative insight into how middle school students comprehend ML while engaging with ML activities using plugged and unplugged approaches. Our analysis revealed that participation in our intervention program elicited changes in students' ML comprehension, perceptions, and practices. The multiple data collection techniques provided evidence that a middle school student can learn and understand ML, even without prior knowledge of, or interest in, science-related careers. The study also demonstrated that citizens who are ethically aware of the use of ML applications can be raised, as the participants were able to weigh in on ML ethical issues and dilemmas. This study contributes to knowledge on middle schoolers' learning of ML concepts using relatable materials and approaches that future studies can leverage in engaging similar populations, particularly in developing regions.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

List of Acronyms

AI Artificial intelligence
ML Machine learning
k-NN k-nearest neighbor
CNN Convolutional neural networks
NGO Non-governmental organization
DSN Data Scientists Network
NHREC National Health Research Ethics Committee of Nigeria
FGD Focus group discussion
STEAM Science, Technology, Engineering, Art, and Mathematics
GTM Google's Teachable Machine

References

Aguilar, S. J., Galperin, H., Baek, C., & Gonzalez, E. (2022). Live instruction predicts engagement in K–12 remote learning. Educational Researcher, 51(1), 81–84.
Aiinafrica. (n.d.). AI IN AFRICA. Retrieved January 31, 2023, from https://aiinafrica.org/.
Akor, T. S., Bin Subari, K., Binti Jambari, H., Bin Noordin, M. K., & Onyilo, I. R. (2019). Engineering and related programs' teaching methods in Nigeria. International Journal of Recent Technology and Engineering, 8(2), 1279–1282.
Ali, S., DiPaola, D., Lee, I., Hong, J., & Breazeal, C. (2021). Exploring generative models with middle school students. In CHI conference on human factors in computing systems (CHI '21).
Anonymous. (2023). AI MyData: Fostering middle school students' engagement with machine learning through an ethics-infused AI curriculum. In redacted. Submitted.
Ayanwale, M. A., & Sanusi, I. T. (2023). Perceptions of STEM vs. Non-STEM teachers toward teaching artificial intelligence. In 2023 IEEE AFRICON conference. IEEE (Accepted).
Ayanwale, M. A., Sanusi, I. T., Adelana, P., Aruleba, K., & Oyelere, S. S. (2022). Teachers' readiness and intention to teach artificial intelligence in schools. Computers in Education: Artificial Intelligence, 1–11. https://doi.org/10.1016/j.caeai.2022.100099
Brackmann, C. P., Román-González, M., Robles, G., Moreno-León, J., Casali, A., & Barone, D. (2017). Development of computational thinking skills through unplugged activities in primary school. In Proceedings of the 12th workshop on primary and secondary computing education (pp. 65–72).
Buolamwini, J. (2018). Gender shades. Retrieved from https://www.youtube.com/watch?v=TWWsW1w-BVo.
Carney, M., Webster, B., Alvarado, I., Phillips, K., Howell, N., Griffith, J., Jongejan, J., Pitaru, A., & Chen, A. (2020). Teachable machine: Approachable web-based tool for exploring machine learning classification. CHI 2020, April 25–30, 2020, Honolulu, HI, USA.
Casad, B. J., & Jawaharlal, M. (2012). Learning through guided discovery: An engaging approach to K-12 STEM education. In 2012 ASEE annual conference & exposition (pp. 25–886).
Chiu, T. K. (2021). Student engagement in K-12 online learning amid COVID-19: A qualitative approach from a self-determination theory perspective. Interactive Learning Environments, 1–14.
D'Mello, S., Dieterle, E., & Duckworth, A. (2017). Advanced, analytic, automated (AAA) measurement of engagement during learning. Educational Psychologist, 52(2), 104–123.
Devlin, T. J., Feldhaus, C. R., & Bentrem, K. M. (2013). The evolving classroom: A study of traditional and technology-based instruction in a STEM classroom. Journal of Technology Education, 25(1), 34–54.
DiPaola, D., Ali, S., & Breazeal, C. (2021). What are GANs?: Introducing generative adversarial networks to middle school students. In Proceedings of the 11th symposium on education advances in artificial intelligence (EAAI '21).
DSN. (n.d.). Data Scientists Network, AI for Kids and Teens. Retrieved January 31, 2023, from https://www.datasciencenigeria.org/ai-for-kids-and-teens/.
Dunton, S. T., & Rodriguez, S. (2018). Examining the role of informal education in K-12 computing pathways & CS education reform efforts. In Proceedings of the 49th ACM technical symposium on computer science education, 1064–1064.
Eguchi, A., Okada, H., & Muto, Y. (2021). Contextualizing AI education for K-12 students to enhance their learning of AI literacy through culturally responsive approaches. KI-Künstliche Intelligenz, 35(2), 153–161.
Elo, S., & Kyngäs, H. (2008). The qualitative content analysis process. Journal of Advanced Nursing, 62(1), 107–115.
Ezugwu, A. E., Oyelade, O. N., Ikotun, A. M., Agushaka, J. O., & Ho, Y. S. (2023). Machine learning research trends in Africa: A 30 years overview with bibliometric analysis review. Archives of Computational Methods in Engineering, 1–31.
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109.
Gillies, R. (2014). Cooperative learning: Developments in research. International Journal of Educational Psychology, 3(2), 125–140.
Gomes, S., Costa, L., Martinho, C., Dias, J., Xexéo, G., & Moura Santos, A. (2023). Modeling students' behavioral engagement through different in-class behavior styles. International Journal of STEM Education, 10(1), 1–15.
Greene, B. A. (2015). Measuring cognitive engagement with self-report scales: Reflections from over 20 years of research. Educational Psychologist, 50(1), 14–30.
Gresse von Wangenheim, C., Hauck, J. C., Pacheco, F. S., & Bertonceli Bueno, M. F. (2021). Visual tools for teaching machine learning in K-12: A ten-year systematic mapping. Education and Information Technologies, 26(5), 5733–5778.
Gwagwa, A., Kachidza, P., Siminyu, K., & Smith, M. (2021). Responsible artificial intelligence in sub-Saharan Africa: Landscape and general state of play. AI4D Africa.
Hermans, F., & Aivaloglou, E. (2017). To scratch or not to scratch? A controlled experiment comparing plugged first and unplugged first programming lessons. In Proceedings of the 12th workshop on primary and secondary computing education (pp. 49–56).
Jordan, B., Devasia, N., Hong, J., Williams, R., & Breazeal, C. (2021). PoseBlocks: A toolkit for creating (and dancing) with AI. In Proceedings of the 11th symposium on education advances in artificial intelligence (EAAI '21).
Kim, K., Kwon, K., Ottenbreit-Leftwich, A., Bae, H., & Glazewski, K. (2023). Exploring middle school students' common naive conceptions of Artificial Intelligence concepts, and the evolution of these ideas. Education and Information Technologies, 1–28.
King, K. (2020). Interventions to enhance behavioral engagement. Student Engagement: Effective Academic, Behavioral, Cognitive, and Affective Interventions at School, 133–156.
Kong, S. C., Cheung, W. M. Y., & Tsang, O. (2022). Evaluating an artificial intelligence literacy programme for empowering and developing concepts, literacy and ethical awareness in senior secondary students. Education and Information Technologies, 1–22.
Kong, S. C., Cheung, W. M. Y., & Zhang, G. (2023). Evaluating an artificial intelligence literacy programme for developing university students' conceptual understanding, literacy, empowerment, and ethical awareness. Educational Technology & Society, 26(1), 16–30.
Kong, S.-C., & Zhang, G. (2021). A conceptual framework for designing artificial intelligence literacy programmes for educated citizens. In S.-C. Kong, Q. Wang, R. Huang, Y. Li, & T.-C. Hsu (Eds.), Conference proceedings (English paper) of the 25th global Chinese conference on computers in education (GCCCE 2021) (pp. 11–15). The Education University of Hong Kong.
Lee, I., Ali, S., Zhang, H., DiPaola, D., & Breazeal, C. (2021). Developing middle school students' AI literacy. In Proceedings of the 52nd ACM technical symposium on computer science education (pp. 191–197).
Lee, I., & Martin, F. (2018). Investigating bias in machine learning applications. In CSTA conference.
Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Sage.
Linnenbrink-Garcia, L., & Pekrun, R. (2011). Students' emotions and academic engagement: Introduction to the special issue. Contemporary Educational Psychology, 36, 1–3.
Long, D., & Magerko, B. (2020). What is AI literacy? Competencies and design considerations. In Proceedings of the 2020 CHI conference on human factors in computing systems (pp. 1–16).
Mahipal, V., Ghosh, S., Sanusi, I. T., Ma, R., Gonzales, J. E., & Martin, F. G. (2023). DoodleIt: A novel tool and approach for teaching how CNNs perform image recognition. In Proceedings of the 25th australasian computing education conference (pp. 31–38).
Ma, R., Sanusi, I. T., Mahipal, V., Gonzales, J. E., & Martin, F. G. (2023, March). Developing machine learning algorithm literacy with novel plugged and unplugged approaches. In Proceedings of the 54th ACM technical symposium on computer science education (Vol. 1, pp. 298–304).
Merrill, M. D. (2002). First principles of instruction. Educational Technology Research & Development, 50, 43–59.
Miao, F., & Holmes, W. (2022). International forum on AI and education: Ensuring AI as a common good to transform education, 7–8 December; synthesis report.
Mosley, P., Ardito, G., & Scollins, L. (2016). Robotic cooperative learning promotes student STEM interest. American Journal of Engineering Education, 7(2), 117–128.
National Health Research Ethics Committee of Nigeria, NHREC. (2016). Policy statement regarding enrollment of children in research in Nigeria. http://nhrec.net/nhrec/Final%20NHREC%20Policy%20Statement%20on%20Enrollment%20of%20Children%20in%20Research.pdf. (Accessed 13 January 2023).
Nwite, S. (2019). Data science Nigeria unveils AI book for primary and secondary schools. https://www.tekedia.com/data-science-nigeria-unveils-ai-book-for-primary-and-secondary-schools/. (Accessed 31 January 2023).
Ottenbreit-Leftwich, A., Glazewski, K., Jeon, M., Jantaraweragul, K., Hmelo-Silver, C. E., Scribner, A., … Lester, J. (2022). Lessons learned for AI education with elementary students and teachers. International Journal of Artificial Intelligence in Education, 1–23.
Oyelere, S. S., Sanusi, I. T., Agbo, F. J., Oyelere, A. S., Omidiora, J. O., Adewumi, A. E., & Ogbebor, C. (2022). Artificial intelligence in African schools: Towards a contextualized approach. In 2022 IEEE global engineering education conference (EDUCON) (pp. 1577–1582). IEEE.
Pekrun, R., & Linnenbrink-Garcia, L. (2012). Academic emotions and student engagement. In S. Christenson, A. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 259–282). New York, NY: Springer.
Renninger, K. A., & Bachrach, J. E. (2015). Studying triggers for interest and engagement using observational methods. Educational Psychologist, 50(1), 58–69.
Rodríguez-García, J. D., Moreno-León, J., Román-González, M., & Robles, G. (2021). Evaluation of an online intervention to teach artificial intelligence with LearningML to 10-16-year-old students. In Proceedings of the 52nd ACM technical symposium on computer science education (pp. 177–183).
Russell, I., Coleman, S., & Markov, Z. (2012). A contextualized project-based approach for improving student engagement and learning in AI courses. In Proceedings of second computer science education research conference (pp. 9–15).
Sabuncuoglu, A. (2020). Designing one year curriculum to teach artificial intelligence for middle school. In Proceedings of the 2020 ACM conference on innovation and technology in computer science education (pp. 96–102).
Sáez-López, J. M., Román-González, M., & Vázquez-Cano, E. (2016). Visual programming languages integrated across the curriculum in elementary school: A two year case study using "scratch" in five schools. Computers & Education, 97, 129–141.
Sakulkueakulsuk, B., Witoon, S., Ngarmkajornwiwat, P., Pataranutaporn, P., Surareungchai, W., Pataranutaporn, P., & Subsoontorn, P. (2018). Kids making AI: Integrating machine learning, gamification, and social context in STEM education. In 2018 IEEE international conference on teaching, assessment, and learning for engineering (TALE) (pp. 1005–1010). IEEE.
Sanusi, I. T. (2021). Intercontinental evidence on learners' differentials in sense-making of machine learning in schools. In Proceedings of the 21st koli calling international conference on computing education research (pp. 1–2).
Sanusi, I. T. (2021b). Teaching machine learning in K-12 education. In Proceedings of the 17th ACM conference on international computing education research (pp. 395–397).
Sanusi, I. T., Jormanainen, I., Oyelere, S. S., Mahipal, V., & Martin, F. (2022e). Promoting machine learning concept to young learners in a national science fair. In Proceedings of the 22nd koli calling international conference on computing education research (pp. 1–2).
Sanusi, I. T., & Olaleye, S. A. (2022). An insight into cultural competence and ethics in K-12 artificial intelligence education. In 2022 IEEE global engineering education conference (EDUCON) (pp. 790–794). IEEE.
Sanusi, I. T., Olaleye, S. A., Agbo, F. J., & Chiu, T. K. (2022d). The role of learners' competencies in artificial intelligence education. Computers in Education: Artificial Intelligence, 3, Article 100098.
Sanusi, I. T., Olaleye, S. A., Oyelere, S. S., & Dixon, R. A. (2022c). Investigating learners' competencies for artificial intelligence education in an African K-12 setting. Computers and Education Open, 3, Article 100083.
Sanusi, I. T., Oyelere, S. S., & Omidiora, J. O. (2022b). Exploring teachers' preconceptions of teaching machine learning in high school: A preliminary insight from Africa. Computers and Education Open. https://doi.org/10.1016/j.caeo.2021.100072
Sanusi, I. T., Oyelere, S. S., Vartiainen, H., Suhonen, J., & Tukiainen, M. (2022). A systematic review of teaching and learning machine learning in K-12 education. Education and Information Technologies, 1–31.
Sanusi, I. T., Sunday, K., Oyelere, S. S., Suhonen, J., Vartiainen, H., & Tukiainen, M. (2023). Learning machine learning with young children: Exploring informal settings in an African context. Computer Science Education. https://doi.org/10.1080/08993408.2023.2175559
Shenton, A. K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22(2), 63–75.
Sigayret, K., Tricot, A., & Blanc, N. (2022). Unplugged or plugged-in programming learning: A comparative experimental study. Computers & Education, 184, Article 104505.
Sinatra, G. M., Heddy, B. C., & Lombardi, D. (2015). The challenges of defining and measuring student engagement in science. Educational Psychologist, 50(1), 1–13.
Skirpan, M., Beard, N., Bhaduri, S., Fiesler, C., & Yeh, T. (2018). Ethics education in context: A case study of novel ethics activities for the CS classroom. In Proceedings of the 49th ACM technical symposium on computer science education (pp. 940–945).
Tang, J., Zhou, X., Wan, X., Daley, M., & Bai, Z. (2022). ML4STEM professional development program: Enriching K-12 STEM teaching with machine learning. International Journal of Artificial Intelligence in Education, 1–40.
Tarantino, K., McDonough, J., & Hua, M. (2013). Effects of student engagement with social media on student learning: A review of literature. The Journal of Technology in Student Affairs, 1(8), 1–8.
Touretzky, D. S., & Gardner-McCune, C. (2022). Artificial intelligence thinking in K-12. Computational Thinking Education in K-12: Artificial Intelligence Literacy and Physical Computing, 153–180.
Touretzky, D., Gardner-McCune, C., Cynthia Breazeal, F. M., & Deborah, S. (2019). A year in K-12 AI education. AI Magazine, 40(4), 88–90.
Twining, P., Heller, R. S., Nussbaum, M., & Tsai, C. C. (2017). Some guidance on conducting and reporting qualitative studies. Computers & Education, 106, A1–A9.
Vartiainen, H., Toivonen, T., Jormanainen, I., Kahila, J., Tedre, M., & Valtonen, T. (2021). Machine learning for middle schoolers: Learning through data-driven design. International Journal of Child-Computer Interaction, 29, Article 100281.
Williams, R., Ali, S., Devasia, N., DiPaola, D., Hong, J., Kaputsos, S. P., & Breazeal, C. (2022). AI+ ethics curricula for middle school youth: Lessons learned from three project-based curricula. International Journal of Artificial Intelligence in Education, 1–59.
Williams, R., Kaputsos, S., & Breazeal, C. (2021). Teacher perspectives on how to train your Robot: A middle school AI and ethics curriculum. In Proceedings of the 11th symposium on education advances in artificial intelligence (EAAI '21).
Wohl, B., Porter, B., & Clinch, S. (2015). Teaching computer science to 5-7 year-olds: An initial study with scratch, cubelets and unplugged computing. In Proceedings of the workshop in primary and secondary computing education (pp. 55–60).
Xia, Q., Chiu, T. K., Lee, M., Sanusi, I. T., Dai, Y., & Chai, C. S. (2022). A self-determination theory (SDT) design approach for inclusive and diverse artificial intelligence (AI) education. Computers & Education, Article 104582.