
114 IEEE REVISTA IBEROAMERICANA DE TECNOLOGIAS DEL APRENDIZAJE, VOL. 18, NO. 1, FEBRUARY 2023

Development of an Interactive Mobile Robot for Playful Learning and Language Teaching

Manuel Alejandro Ojeda-Misses

Abstract— This paper presents a mobile robot based on the Kinect v2 for the development of interactive games and for language-learning purposes. The game interface seeks to take advantage of human-computer interaction in a playful way through gesture recognition. This work combines robotics and games, games and language learning, and recent developments in robotics and language learning. The hardware used to build the mobile robot is described, as well as the ludic game for language teaching through a human-robot interface. The performance of the platform is assessed by means of real-time experiments carried out by students, using a non-linear control law for trajectory tracking during various practices. The experiments prove the platform's potential for supporting ludic games; the game was tested successfully, and the results and future perspectives are then discussed.

Index Terms— Playful game, mobile robot, non-linear control, trajectory tracking, language learning, Kinect v2.

Manuscript received 14 January 2021; revised 23 March 2021, 15 July 2021, and 1 May 2022; accepted 20 December 2022. Date of publication 28 February 2023; date of current version 18 April 2023.
The author is with the Division of Informatics Engineering, TECNM/TESSFP, San Felipe del Progreso 50640, Mexico (e-mail: manuel.om@sfelipeprogreso.tecnm.mx).
A Spanish version of this article is available as supplementary material at https://doi.org/10.1109/RITA.2023.3250582.
Digital Object Identifier 10.1109/RITA.2023.3250582

I. INTRODUCTION

THE accelerated development and constant transformations that society has experienced in recent decades are due to the multiple technological and scientific advances applied to work, personal, and educational environments, in an increasingly demanding context. The incorporation of technologies into our daily lives has turned out to be a complement that addresses and facilitates tasks previously unthinkable or too complex to carry out by ourselves. Many of them are performed by technological devices with a certain intelligence, or by robots that, without the slightest human appearance, facilitate tasks in industry, at home, at school, and elsewhere.

In recent years, robotics applications have been implemented in schools as tools that support learning and teaching. This represents a revolutionary and progressive advance that enables and improves the performance of multiple tasks through a reprogrammable machine, while treating education as research.

Robotics has become a discipline that allows learning to be approached under the principles of twenty-first-century learning [1], [2], [3] and the new multimodal media literacies [1], [2], [3], [4].

The origins of robotics applied to playful games go back to the late eighteenth century, when Wolfgang von Kempelen developed an autonomous machine capable of challenging a human player in chess. Today, robots and mobile robots continue to generate interest as playful devices. Examples are developed and proposed by Ohashi et al. [5], Galán et al. [6], and Su et al. [7], among others.

Robot-based tools enable interaction and thereby play an important role in the development of learning, which motivates the construction of experimental robot-based infrastructure. An alternative for learning has been developed through tools that use computer programs to build applications [8], but these do not meet the needs associated with an experimental platform. Applications based on simulations are not enough for a case study, since it is necessary to carry out a practical analysis in which learners complement the theoretical aspects and interact directly with experiences that help them understand abstract concepts and put them into practice. Therefore, one solution is the development of platforms for educational purposes that are easy to build and available to learners through an ergonomic design, allowing quantifiable experiments to be obtained and performed through interactions with computer applications.

II. CONTRIBUTIONS OF THE INTERACTIVE MOBILE ROBOT

In the case of tools for learning a language, there are some works [8], [9] based on applications for language learning [10] and educational robots. These robots are developed for games; for example, Steels presented a robot in [11], [12] that combines artificial intelligence and language games through sequences of verbal interaction between two agents located in a given environment. Other applications for language learning with robots have been conceived with the robot Nao [13], [14]. These research works are applications of educational robotics that bet on unidirectional learning practices and are underpinned by a behaviorist approach, insofar as human-robot interaction consists only of repeating words.

In contrast, this article pursues the development of a programmable interactive mobile robot intended for recreational applications for learning purposes, consisting of a computer, sensors, actuators and/or cameras, and a control system.

This robot uses a human-machine interface that allows specific communication with the user and can make use of expressive sounds, symbols, and figures, among others. The robot is able to perceive both the characteristics of the environment and the movements made by the user. The main purpose of the robot is to promote more effective learning by taking into account variables such as the learning environment, group
1932-8540 © 2023 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission.
See https://www.ieee.org/publications/rights/index.html for more information.

Authorized licensed use limited to: Escuela Superior de Ingeneria Mecanica. Downloaded on April 26,2023 at 19:07:52 UTC from IEEE Xplore. Restrictions apply.

participation, frequent interaction, feedback, and connections with the real-world context [2], [3], through applications grounded in cognitivism, constructivism, and interactionism [15]. The objective is to conceive, design, and develop playful applications that promote learning through the use of a robot performing specific tasks, movements, recognition, and data processing by means of control systems, a computer, and a programming language. This contributes to complementing the learning/teaching process through technologies applied in formal, semi-formal, or informal learning situations, by introducing new dynamics and attitudes, resorting to and promoting multimodal media literacy and twenty-first-century skills.

In sum, it has been proven that robots have been applied under various approaches; in particular, mobile robots have experienced an important boom. They are used as guide robots [16], [17], rehabilitation support robots [18], [19], [20], robots for navigation [21] and simultaneous mapping [19], [22], multi-agent robots [23], and/or robots applied to games [5], [6], [7], [8], [9], [10]. This enables the development of an interactive mobile robot [5], [6], [7], [8], [9], [10], [16], [17], [18], [19], [20], [21], [22], [23] for the playful learning of a language, where learners not only acquire theoretical concepts but also apply them in practical applications through an interactive mobile robot that lets them relate abstract concepts to their own experience through the infrastructure, equipment, and movement of the robot.

For the development of the interactive mobile robot, it is essential to condition and program sensors capable of measuring the variables of interest – related to the environment, the user, and/or the robot – as well as actuators capable of setting the limbs, links, or wheels of the device in motion. Some devices commonly used for this purpose are joysticks [24], haptic devices [9], cameras [25], [26], smartphones [26], and smart watches [9], [25], [26], [27].

Among the devices that provide useful data for the development and implementation of human-machine interfaces in real time, the Kinect v2 stands out, as it allows the parameters of the human body to be measured. The Kinect v2 integrates three types of sensors: a color camera, an infrared sensor, and a set of microphones. Compared to other devices, it is of special interest for its accessibility and low cost [28], [29]. The Kinect v2 has been used in various bodily and/or artistic disciplines: martial arts [30], yoga [31], dance [32], and music [33], among others [28], [29], [30], [31], [32], [33].

This research combines robotics, game science, and language didactics; priority has been given to proposals for the use of the Kinect v2 sensor related to robotics and games. In the field of robotics, the Kinect v2 has been used in mobile robots or manipulator robots to build applications and interfaces. Most applications have been implemented for mobile robots, while interfaces have been developed only for manipulator robots.

The applications using Kinect are the RGBD device validation joint system to measure the balance of patients in therapy [34]; the application for the automatic capture of body postures for learning-analysis purposes [35]; and the compact mobile robot under ROS for navigation and localization in unstructured environments [34], [35], [36].

The most relevant human-robot interfaces are the gesture recognition system developed by Baron et al. [37] to move a robotic arm; the interface created by Cicirelli et al. [38] to recognize static postures of the human body; the interface designed by Dardan et al. [39] for gesture recognition thanks to a Lego-like robot that identifies the actions performed by a group of children while dancing; the software development libraries for Kinect v2 (SDK) chosen by Cueva et al. [40] to control a seven-degree-of-freedom industrial robot; and, finally, the interface generated by Benabdallah et al. [41] to move an industrial robot, based on LabVIEW.

During the literature review, it was not possible to identify papers that integrated robotics, game science, and language didactics. However, among the games programmed with the Kinect v2 that coincide to a greater or lesser degree with the ludopedagogical approach, we can mention, among others, the proposal of Fernández and von Lucken [42], whose objective is to promote a playful approach to mathematics for preschool children, as well as the games designed by Qingtang et al. [43] and by Grammatikopoulou et al. [44], in which the user directs an avatar through body movements. The work of Iturrate et al. [45] combines the Kinect v2 with video games and remote mobile robots, within the framework of an interactive environment where participants are invited to cooperate or compete with each other by solving general-culture challenges [45].

In this article, a mobile robot that includes a Kinect v2 sensor is presented. In particular, a playful application is presented for learning a set of eight parts of the body in French through the mobile robot, which aims to combine in a balanced way the contributions of robotics, game science, and language didactics.

The product obtained is a proposal for a playful game based on gesture recognition for learning body parts. The application is developed to teach words in French, but it can be implemented for other languages.

The interface leverages nonverbal communication for active and meaningful learning. With a view to promoting autonomy, it includes a menu that allows the user to independently discover the operation of the robot, practice the vocabulary to be acquired and, finally, play. The playful game aims to be used in the language-learning process, from the acquisition, conceptualization, and/or systematization of vocabulary related to body parts.

The main elements that make up the mobile robot are described, the non-linear control used for trajectory tracking is presented, and control results are reported for the playful application called De la tête aux pieds (from head to toe), designed to promote vocabulary learning for French learners; the results obtained are discussed and future prospects are presented.

III. INTERACTIVE MOBILE ROBOT COMPONENTS

The interactive mobile robot has a differential configuration with a support wheel. The prototype, shown in Figure 1,


includes a mechanical structure (1), a Kinect v2 sensor (2), a laptop (3), a RoboClaw direct-current (DC) motor controller (4), a Matlab-Simulink environment (5) [46] with a real-time QUARC core [47] and Visual Studio (5) [48], two direct-current motors (6) with incremental optical decoders (7), a horn (8), and a lithium battery (9).

Fig. 1. Connecting mobile robot components [46], [47], [48].

A. Mechanical Structure

A simple construction was used for the mechanical structure of the mobile robot, accommodating both the Kinect sensor and the control computer in a differential configuration. This structure allows the other components to be placed at the bottom: the RoboClaw controller, the DC motors, the incremental optical decoders, and the battery. Pneumatic wheels were used because they give better traction on uneven surfaces and contribute to the damping of the equipment, while the rigid wheel allows the robot to reorient itself towards any angle and position.

B. Kinect v2

The Kinect v2 sensor allows the tracking of 25 joints of the human body and detects three states of the hands (open hand, closed hand, and lasso). The Kinect v2 uses an infrared sensor to estimate distance, calculating the time it takes for the emitted beam to travel out and return. User identification and parameter measurement are performed using the Kinect v2 software development kit (SDK) tools [49], which provide a representation of the body's joints and their positions in three-dimensional space [50].

C. Computer

A Dell Precision M4700 computer is used to run the programs necessary for operation. Matlab2012b-Simulink-QuaRC 2.3, Visual Studio V.12 (C#), and the Kinect Software Development Kit (SDK v2) are installed. In general, the processor runs two programs: a Kinect graphical interface and the real-time control implemented in the Matlab2012b-Simulink-QuaRC environment.

D. RoboClaw Controller

The RoboClaw [51] was designed specifically for this purpose. This controller provides a reading interface for the incremental optical decoders, speed estimation, internal speed loops, configuration of minimum and maximum voltages, configuration of currents, and error handling, thus facilitating the implementation. Of course, there are other alternatives, such as the Raspberry Pi or Arduino. However, these require an external H-bridge, in addition to a USB-to-USART TTL 3.3 V communication device in the case of the Raspberry Pi. Arduino is an interesting option for projects that do not require high performance, but it is inadequate when timing requirements are demanding, as is the case with real-time systems.

E. Matlab-Simulink and Visual Studio

Matlab-Simulink is used to implement the block diagram of the control strategy; in addition, TCP/IP communication connects it with the interface programmed in the C# language using Visual Studio under the .NET environment and the Kinect SDK. These tools make it possible to fully exploit the power of the Kinect SDK. Body recognition is carried out through two functions included in the SDK, Body Tracking and Body Frame Ready. These are used to measure specific parameters of body parts, such as lengths, angles, and distances, among others. These parameters are used in Matlab-Simulink to move and control the mobile robot in real time.

F. Electric Motors

The motors are the main actuators of the mobile robot, as they carry out its locomotion. The robot has two 12 V direct-current motors with planetary gear trains, each with a wheel coupled to its axis [52].

G. Incremental Optical Decoders

These measure and feed back the angular position. Each motor has a built-in incremental decoder of 5000 cycles per revolution, with two channels in quadrature; the connections are made through a standard 5-terminal connector providing a ground connection, an index, an A channel, a 5 V supply, and a B channel [53].

H. Lithium Battery

The electric battery energizes the mobile robot and allows it to operate. The lithium battery used is Zippy brand and offers a voltage of 14.5 V at 3200 mAh [54].

The system architecture is shown in Figure 2.

I. Kinematic Modeling and Control Strategy

The motion kinematics of the mobile robot is represented as:

ẋ(t) = v(t) cos(θ)
ẏ(t) = v(t) sin(θ)
θ̇(t) = ω(t)        (1)

where (ẋ, ẏ) are the velocities with respect to the position in the plane, the position vector (x, y, θ) is measured based on


Fig. 2. Connection architecture of mobile robot components.

the Kinect v2 reference frame, v represents the linear velocity, and ω represents the angular velocity of the robot. The linear and angular velocities are defined as:

v(t) = r(ωd + ωi) / 2
ω(t) = r(ωd − ωi) / L        (2)

where ωd represents the angular velocity of the right wheel, ωi represents the angular velocity of the left wheel, r is the wheel radius, and L is the distance between the wheels.

The trajectory-tracking problem is addressed with a control strategy u = [ud ui]ᵀ such that the error satisfies:

e = lim t→∞ [p(t) − pref(t)] = 0.        (3)

Using the dynamics of the error (3) and a change of coordinates, we have:

⎡e1⎤   ⎡ cos(θ)   sin(θ)   0⎤ ⎡xref − x⎤
⎢e2⎥ = ⎢−sin(θ)   cos(θ)   0⎥ ⎢yref − y⎥        (4)
⎣e3⎦   ⎣   0        0      1⎦ ⎣θref − θ⎦

where e1, e2 are the position errors in the reference frame of the mobile robot, e3 is the orientation error, and qr = [xref, yref, θref]ᵀ is the reference path. The controller for trajectory tracking is given by:

ud = −k1(vref, ωref) e1
ui = −k2 vref (sin(e3)/e3) e2 − k3(vref, ωref) e3        (5)

where k2 is a positive constant and k1(·), k3(·) are strictly positive continuous functions on R × R − (0, 0). The function is defined as:

k1(vref, ωref) = k3(vref, ωref) = 2ζ √(σ vref² + ωref²)        (6)

where k2 = σ; it is important to mention that ζ and σ are real, positive gains. The control strategies are applied to the angular velocities of each robot motor. Now, it is essential to check the stability of the system under the control strategy. For this, consider the dynamics of the error and the stable equilibrium point of the system given by e = 0. The stability of the controller is validated with the following Lyapunov candidate function:

V(e) = (k2/2)(e1² + e2²) + e3²/2        (7)

Deriving (7) with respect to time t along the trajectories of the system gives:

V̇(e) = k2 e1 ė1 + k2 e2 ė2 + e3 ė3        (8)

V̇(e) = k2 e1 (u1 + ωref e2) + k2 e2 (vref sin(e3) − ωref e1) + e3 u2        (9)

obtaining

V̇(e) = k2 e1 u1 + k2 e2 vref sin(e3) + e3 u2        (10)

Substituting (5) into (10) and simplifying:

V̇(e) = −k1 k2 e1² − k3 e3²        (11)

Therefore, V̇ is negative semidefinite, which ensures that the system is asymptotically stable.

IV. GRAPHICAL INTERFACE

The operation of the interactive mobile robot is achieved through an interactive human-mobile robot interface, inspired by a finite-state-machine model. By giving the user the ability to choose between various functions, it allows transitions using hand gestures as input. These gestures are defined according to two positions (up/down) and two states (open hand/closed hand).

The user moves the robot from one state to another through the acquisition and recognition of human-body parameters by the Kinect v2. Each action of the mobile robot is associated with a specific gesture. Thanks to them, the user can drive the robot from a source state to a destination state, passing through other primary and secondary states described below.

Fig. 3. Main graphical interface window in Visual Studio 2012.

For its part, the graphical interface (Figure 3) is characterized by being interactive, multifunctional, and easy to use. Thanks to the Kinect v2, it allows interaction between the user and the mobile robot through sounds, images, movements, and gestures. It thus offers interesting potential to capture the attention of learners and promote more active and integral participation.

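The finite-state-machine behavior described above can be sketched as follows. This is an illustrative Python sketch, not the authors' C# implementation: the state names come from the interface description, but the exact gesture-to-transition mapping and the "exit" gesture are assumptions made for illustration.

```python
# Illustrative finite-state machine for the human-robot interface.
# State names follow Section IV; the gesture encoding (hand up/down,
# open/closed) and the transition table are assumptions, not the
# authors' exact C# logic.

from typing import NamedTuple

class Gesture(NamedTuple):
    hand_up: bool    # hand above the shoulder-height reference point
    hand_open: bool  # True = open hand (green circle), False = closed (red)

# From the menu, each of the four gestures (2 positions x 2 hand states)
# selects one of the primary states; only one state is active at a time.
MENU_TRANSITIONS = {
    Gesture(hand_up=True, hand_open=True): "Learn to control me",
    Gesture(hand_up=True, hand_open=False): "I follow you!",
    Gesture(hand_up=False, hand_open=True): "Position me",
    Gesture(hand_up=False, hand_open=False): "Let's play",
}

# Assumed gesture for leaving any primary state back to the menu.
EXIT_GESTURE = Gesture(hand_up=False, hand_open=False)

class InterfaceFSM:
    def __init__(self) -> None:
        self.state = "Menu"

    def on_gesture(self, g: Gesture) -> str:
        if self.state == "Menu":
            # Unrecognized gestures leave the menu state unchanged.
            self.state = MENU_TRANSITIONS.get(g, "Menu")
        elif g == EXIT_GESTURE:
            # The user may enter and exit states as many times as desired.
            self.state = "Menu"
        return self.state
```

In use, each frame of recognized Kinect data would be reduced to a `Gesture` and fed to `on_gesture`, so the robot's active function always reflects the last valid gesture.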

The interface is defined as multifunctional, since the user can select among five different states; only one state is active at a time, but it is possible to enter and exit each of them as many times as desired. Finally, its operation is easy, since the user controls the robot through two positions of the arms and two states of the hands.

The graphical interface includes the status bar (1), the animation window (2), the detected-gesture window (3), the window with the available gesture repertoire (4), the detection status of the Kinect v2 (5) and, finally, the battery power-level bar (6). Each of these elements is briefly explained below.

The status bar describes the current state of the robot. Among the functions implemented are: "Learn to control me", "Menu", "I follow you!", "Position me", and "Let's play".

The animation window displays the body representation derived from the Kinect v2 information. The detection of 25 preset joints allows various parameters to be measured, such as the distance at which the user stands, the condition of the hands, and the angles of the arms and legs. To position and move the mobile robot, the interface employs the following hand states: open hand (green circle), closed hand (red circle), and hand above or below the upper chest point at shoulder height.

The detected-gesture window presents the image of the user's position and gesture as they are being identified at any given time. It allows the active gesture to be visualized and, therefore, confirms the relevance of the gesture that has been selected.

The window with the repertoire of available gestures associates the graphical representation of the various states with their respective names. It indicates to the user the states accessible at any time and serves as a reminder of the preset gesture for each state.

The state detected by the Kinect v2 is represented by a sequence of characters that symbolize a given emotion, like emoticons. The character sequence (a) represents a face with its eyes open and indicates that the Kinect v2 is properly detecting the user. The character sequence (b) represents a face with its eyes closed, which means that the Kinect v2 is not detecting any user.

The power-level bar indicates the state of charge of the battery in real time.

Once the interface and its elements have been presented, the playful game developed for the interactive mobile robot presented in this article is described below.

V. PLAYFUL GAME DE LA TÊTE AUX PIEDS

The development of the game De la tête aux pieds required the design of a specific architecture, which includes a robot and an interactive interface. The human-machine communication protocol establishes the connection to send and receive data, while the control law allows the robot to make the necessary movements.

The playful game developed with the robot is intended for learning through a directed or semi-autonomous dynamic, with the linguistic objective of sensitizing, conceptualizing, and/or systematizing the basic vocabulary relating to eight parts of the human body (see Figure 4), as well as promoting the corresponding grammar and phonetics. In other words, users can be both learners approaching that vocabulary for the first time and learners looking to organize or review prior lexical knowledge.

Fig. 4. Parts of the human body included in De la tête aux pieds.

The playful game is easily adaptable to any language, because it is enough to replace the audio and image files with the corresponding files in the desired language. The language level can also be adjusted by enriching the lexical repertoire or replacing it with a more advanced list. It is also possible to program adaptations to other lexical fields, as long as each lexical unit corresponds to a specific position in space (for example, identifying figures projected on a screen or locating places on a map). Subject to future developments, De la tête aux pieds has already enabled a number of tests, which are mentioned below. As mentioned above, the functionality of the mobile robot within the primary state "Let's Play" is based on three secondary states: "Review", "Train", and "Play De la tête aux pieds". It aims to be used for learning the French vocabulary of human body parts.

Figure 4 illustrates the parts of the human body included so far in the robot's repertoire. In the future, it is planned to expand the list of available vocabulary.

This first dynamic provides a sound, visual, and written approach to the vocabulary corpus. The interface announces the name of each part of the body and, at the same time, displays on the screen the corresponding image, accompanied by the name in question, always preceded by the definite article (le, la, or l'), which, in the first two cases, makes it possible to determine the gender of the word.

The purpose is to allow the user to start with an intuitive learning process, listening to the words and touching the corresponding body part. In this way, the learner discovers or reviews lexical (name of the body part), grammatical (word gender), and phonetic (pronunciation) elements, identifying the parts of the body on his or her own body, as shown in Figure 5.

The screen includes three sections: on the left appears the title of the game mode, followed by a box with the representation of the user's body. In the case of Figure 5, we see the player touching his own head, in response to the previously broadcast audio (la tête [latεt]). The Kinect detection status and battery status are listed below. In the center, the human figure model is shown to identify the position, name, and spelling of the mentioned body part. On the right, the user


to the starting point; In addition, the lower right corner of the


screen indicates the number of accumulated hits. With each hit,
the mobile robot advances towards the player. A video with
the explanation and operation of the robot is added under the
following link:
https://www.youtube.com/watch?v=8NygBIo3rE8.
To ensure an optimal integration of robotics in the edu-
cational field, it is necessary to investigate the needs of
the personnel in charge of teaching processes. According
to the needs of therobot, the robotic prototype must have
the following characteristics: a mechanical structure; easy
programming conducive to the design of new recreational
applications for learning; the possibility of moving in lan-
Fig. 5. Interface with the “Train” status of the playful robot. guage classrooms, media libraries and / or self-access centers
with the aim of making different specific trajectories. As for
learning theories, the interactive mobile robot is based on the
principles of cognitionism, constructivism and interactionism.
From cognoscitivism the idea is taken up according to which
learning results from the internal mental processes of the indi-
vidual, since the actions of the human being are not determined
by the properties of the phenomena of the environment, but
by the interpretation that the subject makes of them. From
constructivism, the postulates are adopted according to which
learning is dependent on the quantity and quality of organi-
zational structures in a person, and the learning environment
can have multiple effects on the constitution of reality [2], [3]
Fig. 6. Interface counting success with the status Move to mobile robot. and knowledge. Finally, interactionism takes advantage of the
idea of not establishing relations of mechanical determinism,
is offered the option “Exit” and reminded of the corresponding but of considering the individual as an active subject: the
gesture. environment and the person interact and complement each
The second and third dynamics of the state “Let’s Play” are other. All three theories are focused on an active subject who
“Train” and “Play De la tête aux pieds”. Both states are based builds his own knowledge, focusing on tasks that usually have
on touching the part of the body enunciated by the mobile relevance and usefulness in the real world. Indeed, multimodal
robot using the secondary state “Receive indication”, whose media literacy underlines the importance of providing learners
function is to play an audio randomly. The game continues with tools for reception, but also for interpretation and the
creation of meaning, conveyed today in increasingly diverse and complex ways, while the development of twenty-first-century skills requires significant learning activities based on the use of recent technologies to expand the capacity to create, share, and learn knowledge [1].

until five correct answers are obtained from the user. Each time the user gets a hit, the robot emits a specific sound signal, displays an applause icon on the screen, and advances a preset distance. Conversely, if the user makes a mistake or takes more than five seconds to answer, the robot emits an error sound signal and does not move forward. When the player gets five hits, he wins: the robot emits a fanfare sound signal to congratulate him, and the game then restarts automatically. However, if the user accumulates three errors, a different sound signal is played, indicating that the player has lost the game; in this case, too, the game restarts automatically. In the future, the different sound signals mentioned will be replaced by voice messages in the language being learned.

What distinguishes the state “Train” from the state “Play De la tête aux pieds” is, as mentioned above, the introduction of an additional difficulty, caused by the lag between the part stated and the part to be played. In the “Train” state, by contrast, it is possible to use the game without any competitiveness; the learner can make as many attempts as he wishes (an endless game).

The mobile robot is capable of carrying out the hit count (Figure 6), according to two modalities: the central bar, red, increases in length as the robot advances in space with respect

VI. TEACHING EXPERIENCES WITH THE ROBOT

The teaching experiences allowed us to identify clues to improve the technical, playful, and linguistic performance of the robot, which are considered for piloting in formal and informal language learning situations, in the French classroom or in the media library.

Preliminary tests of the game De la tête aux pieds with the mobile robot and the interactive interface were successfully carried out in an enclosed place, similar to the intended use environment, whose lighting levels did not significantly affect the recognition and acquisition of body data. The robot works optimally on a floor whose surface is regular. The trajectory is a straight line; the results shown in Figure 7 were obtained with the gains ζ = 2.5, σ = 11.2, k2 = 3, the linear and angular velocities v_ref = 0.5 m/s and ω_ref = rad/s, and the parameters r = 0.095 m and L = 0.42 m.
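The reported velocities and geometric parameters can be related through standard differential-drive kinematics. The following is an illustration only, not the paper's controller (which follows the nonlinear design of [55]); the names `wheel_speeds`, `R_WHEEL`, and `AXLE` are assumptions:

```python
R_WHEEL = 0.095  # wheel radius r [m], as reported in the experiments
AXLE = 0.42      # wheel separation L [m], as reported in the experiments

def wheel_speeds(v, omega, r=R_WHEEL, L=AXLE):
    """Map unicycle commands (v [m/s], omega [rad/s]) to (left, right)
    wheel angular velocities [rad/s] via differential-drive kinematics."""
    w_right = (v + omega * L / 2.0) / r
    w_left = (v - omega * L / 2.0) / r
    return w_left, w_right
```

For the straight-line trajectory used in the tests (ω = 0), both wheels receive the same reference: roughly 5.26 rad/s at v_ref = 0.5 m/s.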

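The win/lose rules described above (five hits to win, three errors to lose, a five-second answer window, and an endless “Train” mode) amount to a small state machine. A minimal sketch, with hypothetical names (`GameSession`, `ANSWER_TIMEOUT_S`) that are not from the paper's implementation:

```python
ANSWER_TIMEOUT_S = 5.0  # answers slower than this count as errors
HITS_TO_WIN = 5
ERRORS_TO_LOSE = 3

class GameSession:
    def __init__(self, endless=False):
        self.endless = endless  # "Train" state: no win/lose condition
        self.hits = 0
        self.errors = 0
        self.events = []        # sound/feedback cues emitted so far

    def submit_answer(self, correct, response_time_s):
        """Register one answer and return the resulting game status."""
        if correct and response_time_s <= ANSWER_TIMEOUT_S:
            self.hits += 1
            self.events.append("applause")  # robot also advances a preset distance
        else:
            self.errors += 1
            self.events.append("error")     # robot stays in place
        if self.endless:
            return "playing"
        if self.hits >= HITS_TO_WIN:
            self.events.append("fanfare")
            return "won"                    # game then restarts automatically
        if self.errors >= ERRORS_TO_LOSE:
            self.events.append("lost_signal")
            return "lost"                   # game restarts automatically as well
        return "playing"
```

In the “Play” state a session ends at five hits or three errors; constructing the session with `endless=True` reproduces the non-competitive “Train” behavior.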
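Recognizing which body part the player designates can be reduced to comparing the tracked hand joint against the other joint positions reported by a skeleton tracker such as the Kinect v2. This sketch is illustrative only, not the paper's method: the joint names and the 0.15 m touch radius are assumptions:

```python
import math

TOUCH_RADIUS_M = 0.15  # assumed proximity threshold for a "touch"

def closest_touched_part(joints, hand="hand_right"):
    """Return the body part whose 3-D joint position is nearest the hand
    and within TOUCH_RADIUS_M, or None if no part is close enough.

    `joints` maps joint names to (x, y, z) positions in meters, as a
    skeleton tracker would report them.
    """
    hand_pos = joints[hand]
    best, best_d = None, TOUCH_RADIUS_M
    for name, pos in joints.items():
        if name.startswith("hand"):
            continue  # do not match the hand against itself
        d = math.dist(hand_pos, pos)
        if d < best_d:
            best, best_d = name, d
    return best
```

The returned part name would then be compared with the part announced by the game to decide between the “applause” and “error” feedback.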

Fig. 7. Graphs of the evaluation of the degree of satisfaction with the theoretical and practical aspects, from zero (very bad) to five (very good).

The experimental tests were carried out with French learners at the master’s and doctoral levels of CINVESTAV-Mexico (Spanish speakers from 25 to 45 years old, with a basic level of French). The version presented here was implemented for this article and for the doctoral thesis [56]. The aim is to provide a platform for learning body parts in French. The classes consisted of theory and practice organized into four activities:

1. Familiarization and conceptualization. Knowledge of grammar, lexicon, and use of articles for each part of the body; writing and reading.

2. Awareness, pronunciation, and sound. Phonetics, identification of body parts, and knowing how to say them correctly using simple and compound articles and vowels.

3. Systematization and structuring. Use of body parts in sentences and phrases in the French language, carrying out activities (sentences, drawings, etc.).

4. Practice. Realization of various dynamics involving the parts of the body through the use of the computer and the interactive mobile robot.

The practical sessions were carried out from July to December 2019. They included activities 1, 2, 3, and 4, from conceptualization to the use of the interactive mobile robot, where learners made use of the robot and the game De la tête aux pieds. At the end of the practices, the learners were evaluated to analyze whether the body parts had been learned in French.

At the end of the experimental work, the following competences were analyzed, considering the following phases:

-Foundation: the referential (theoretical-conceptual) framework, which enables learners to understand the reality or situation of the object of study in order to define an intervention process or design a model.

-Planning: based on the diagnosis, in this phase the design, programming, and dynamics of the robot are carried out so that the learners, with advice from the teacher, can use it. It involves planning an intervention process, the design of a model, among others: the type of project, the activities to be carried out, the resources required (that is, the robot can be reprogrammed by the teacher and adjusted to other languages), and the work schedule.

-Execution: consists of the experimental tests of the project carried out by the learners with advice from the teacher; it is the phase of greatest duration and involves the performance of the generic and specific competences to be developed.

-Evaluation: the final phase, which applies a value judgment in the work, professional, social, and learning context. This is done through the recognition of achievements and aspects to improve; evaluation is promoted for continuous improvement, metacognition, and the development of critical and reflective thinking in learners.

Also, in the evaluation phase the learners presented the development of the practices through the use of the interactive mobile robot, where the robot is able to evaluate the performance of the learner through its dynamics (that is, it can evaluate them and determine when the game has been won and/or lost), and they answered a series of questions, which were important because we wanted to know whether the topics covered during the practice were novel to them. Most learners reported familiarity with the theoretical part and the grammar of the body parts used.

Finally, we asked the learners to assess their overall degree of satisfaction with the theoretical and practical aspects provided during the practice. The learners evaluated both aspects very favorably (from zero, very bad, to five, very good), obtaining more than 80% acceptance, with 68 of the 83 French learners (see Figure 7).

This point is relevant to identify that the competences associated with pedagogical play are related to the participant’s disposition of mind towards the activity rather than to the playing material or the rules governing the activity.

Considering the learning needs, it is contemplated to include in the “Review” modality the possibility of choosing between displaying or hiding the written name of the word enunciated, so that the spelling does not interfere with sound perception. In the case of “le poignet”, for example, a beginner Spanish speaker might tend to read [lepoignet] instead of reproducing the correct pronunciation [l∂pwaηε]. Another case arises with the word breast, whose article in Spanish is masculine (“el pecho”), whereas French requires the feminine article (“la poitrine”), which can generate a very common grammatical error during the learning of the language.

Finally, it is important to highlight that the works presented in [8], [9], and [10] are used as tools for learning a language; however, they are only computer-based applications. The use of the interactive mobile robot, in contrast, allowed quantifiable results to be obtained through a scientific method, because it makes it possible to approach the learning of body parts as a transaction [57]. During the learning of a language, an interactive approach is essential, and practice with a mobile robot that is able to identify the parts of the body and the user’s movements, broadcast audio, make movements, and support an innovative game provides it. Through the constructivist and interactionist method, learners can build new ideas, contexts, sentences, and vocabulary from their past experiences and new knowledge. In addition, language teachers can combine theory with practice using this prototype to help explain theoretical concepts and stimulate hands-on learning, and they can reprogram


the robot. The points mentioned are relevant insofar as many of the benefits of the platform are associated with fulfilling the competences by making available to the learner a novel platform with real-time performance that considers the teaching and learning needs.

VII. CONCLUSION

This article presented an interactive mobile robot designed for the development of playful applications with the Kinect v2 and used to implement interactive games for language learning purposes. The playful game uses an interface that seeks to take advantage of the interaction between the player and the interactive mobile robot in a playful way through gesture recognition. This work combines robotics and games, games and language learning, and some recent developments on robotics and language learning; one of the main contributions of the project is the uniqueness of its multidisciplinary approach, which seeks a deliberate balance between robotics, game studies, and language. The hardware used to build the mobile robot was described, as well as the playful game for learning and teaching languages through a human-robot interface. Platform performance was evaluated by real-time experiments using a nonlinear control based on trajectory tracking. The experiments demonstrate the potential of the platform to support playful games; the results were discussed and future prospects identified. The interactive interface and playful application have been successfully tested with a group of students in CINVESTAV’s language laboratory, and new development tracks have been identified.

Finally, with respect to the works presented in [5] and [10], this work has the following advantages: first, it presents a multimodal interface that allows verbal and non-verbal communication with the user; second, it uses an interactive mobile robot driven by control strategies, which in turn allows various trajectories to be programmed and executed in a classroom; third, it makes use of the Kinect v2 sensor, which allows the identification of the user’s movements and the recognition of body parts; fourth, it allows the development of a playful game applied to the learning of body parts; finally, the interactive mobile robot is flexible, since it allows the programming of other dynamics, movements, games, and topics during language learning. It starts from the idea that playful tools can help motivate learning, as long as one keeps in mind the complexity of both the learning process and the playful metaphor: not everything called play produces the expected effects of a game.

The type of learning promoted by the interactive robot is active and cognitive, thanks to playful activities that require bi- and multidirectional interaction, rather than passive observation or an explanation-repetition sequence, as is the case with the identified uses of the Nao robot, inspired by a behaviorist approach [58]. In sum, the interactive mobile robot follows two general educational principles: learning as an active process, and complete, authentic, and real learning according to Piaget [59], in line with the action-oriented approach in language teaching [13] and the development of general and communication competences.

In addition, a constructivist approach supported by interactionism invites us to consider the teacher as a guide and mentor, giving the student the necessary freedom to explore the technological environment and build their knowledge, while providing support when they have doubts or face a problem. From the interactionist standpoint, student interaction plays a very important role, especially since recent technologies have generalized the use of sensors capable of measuring various variables of interest relative to the environment, the user, and/or the robot.

The current version, in French, is easily adaptable to other languages. It is also possible to contemplate work around different contents or different rules, resorting to individual and/or group dynamics. Language teachers, however, must be trained to select and program the robot’s parameters. Finally, as future work, it is contemplated to improve the appearance of the robot through a costume and a personality to make it friendlier for teachers and learners.

REFERENCES

[1] K. Ananiadou and M. Claro, “21st century skills and competences for new millennium learners in OECD countries,” OECD Education Working Papers, no. 41, 2009.
[2] New Vision for Education: Fostering Social and Emotional Learning Through Technology, World Economic Forum, Boston, MA, USA, 2016.
[3] New Vision for Education: Unlocking the Potential of Technology, World Economic Forum, Boston, MA, USA, 2015.
[4] N. Lacelle, J.-F. Boutin, and M. Lebrun, “La littératie médiatique multimodale appliquée en contexte numérique,” in LMM@: Outils Conceptuels et Didactiques. Montreal, QC, Canada: PUQ, 2017.
[5] O. Ohashi, E. Ochiai, and Y. Kato, “A remote control method for mobile robots using game engines,” in Proc. 28th Int. Conf. Adv. Inf. Netw. Appl. Workshops, Victoria, BC, Canada, May 2014, pp. 79–84.
[6] J. G. Munévar, E. L. R. Sánchez, and H. M. Mosquero, “La robótica aplicada a la lúdica,” Tecnura, vol. 15, no. 30, pp. 52–63, 2011.
[7] K. L. Su, S. V. Shiau, J. H. Guo, and C.-W. Shiau, “Mobile robot based online Chinese chess game,” in Proc. 4th Int. Conf. Innov. Comput., Inf. Control (ICICIC), Taiwan, Dec. 2009, pp. 528–531.
[8] N. Yanes and I. Bououd, “Using gamification and serious games for English language learning,” in Proc. Int. Conf. Comput. Inf. Sci. (ICCIS), vol. 3, Apr. 2019, pp. 7301–7312.
[9] X. Hou, P. Fang, and Y. Qu, “Dynamic kinesthetic boundary for haptic teleoperation of unicycle type ground mobile robots,” in Proc. 36th Chin. Control Conf. (CCC), vol. 2, Jul. 2017, pp. 6246–6251.
[10] P. Kitichaiwat, M. Thongsuk, and S. Ngamsuriyaroj, “Melody touch: A game for learning English from songs,” in Proc. 3rd ICT Int. Student Project Conf. (ICT-ISPC), Mar. 2014, pp. 13–16.
[11] L. Steels and M. Hild, Language Grounding in Robots. New York, NY, USA: Springer, 2012.
[12] L. Steels, “Language games for autonomous robots,” IEEE Intell. Syst., vol. 16, no. 5, pp. 16–22, Sep. 2001.
[13] M. Ishida, A. Khalifa, T. Kato, and S. Yamamoto, “Features of learner corpus collected with joining-in type robot assisted language learning system,” in Proc. Conf. Oriental Chapter Int. Committee Coordination Standardization Speech Databases Assessment Techn. (O-COCOSDA), Oct. 2016, pp. 128–131.
[14] R. van den Berghe, J. Verhagen, O. Oudgenoeg-Paz, S. van der Ven, and P. Leseman, “Social robots for language learning: A review,” Rev. Educ. Res., vol. 89, no. 2, pp. 259–295, Apr. 2019.
[15] G. C. Da Silva and D. A. Signoret, Temas Sobre la Adquisición de una Segunda Lengua. Mexico City, Mexico: Trillas, 2005.
[16] W. Burgard et al., “Experiences with an interactive museum tour-guide robot,” Artif. Intell., vol. 114, nos. 1–2, pp. 3–55, 1999.
[17] S. Thrun et al., “MINERVA: A second-generation museum tour-guide robot,” in Proc. IEEE Int. Conf. Robot. Automat., Detroit, MI, USA, vol. 3, May 1999, pp. 1999–2005.


[18] S. T. Hansen, T. Bak, and C. Risager, “An adaptive game algorithm for an autonomous, mobile robot—A real world study with elderly users,” in Proc. 21st IEEE Int. Symp. Robot Hum. Interact. Commun. (RO-MAN), Paris, France, Sep. 2012, pp. 125–130.
[19] S. T. Hansen, D. M. Rasmussen, and T. Bak, “Field study of a physical game for older adults based on an autonomous, mobile robot,” in Proc. Int. Conf. Collaboration Technol. Syst. (CTS), Denver, CO, USA, May 2012, pp. 125–130.
[20] L. V. Calderita, P. Bustos, C. S. Mejías, F. Fernández, R. Viciana, and A. Bandera, “Socially interactive robotic assistant for motor rehabilitation therapies with pediatric patients,” Revista Iberoamericana de Automática e Informática Ind., vol. 12, no. 1, pp. 99–110, 2015.
[21] T. Zafar, M. U. Khan, A. Nawaz, and K. F. Ahmad, “Smart phone interface for robust control of mobile robots,” in Proc. IEEE Int. Conf. Auto. Robot Syst. Competitions (ICARSC), Espinho, Portugal, May 2014, pp. 42–46.
[22] K. Koide, J. Miura, M. Yokozuka, S. Oishi, and A. Banno, “Interactive 3D graph SLAM for map correction,” IEEE Robot. Autom. Lett., vol. 6, no. 1, pp. 40–47, Jan. 2021.
[23] C. G. Cena, R. Saltarén, J. L. Blázquez, and R. Aracil, “Desarrollo de una interfaz de usuario para el sistema robótico multiagente SMART,” Revista Iberoamericana de Automática e Informática Ind., vol. 7, no. 4, pp. 17–27, 2010.
[24] S. Kruzic, J. Music, and I. Stancic, “Influence of human-computer interface elements on performance of teleoperated mobile robot,” in Proc. 40th Int. Conv. Inf. Commun. Technol., Electron. Microelectron. (MIPRO), Opatija, Croatia, May 2017, pp. 1015–1020.
[25] J. Wu, C. Lv, L. Zhao, R. Li, and G. Wang, “Design and implementation of an omnidirectional mobile robot platform with unified I/O interfaces,” in Proc. IEEE Int. Conf. Mechatronics Autom. (ICMA), Aug. 2017, pp. 410–415.
[26] S. Waldherr, R. Romero, and S. Thrun, “A gesture based interface for human–robot interaction,” Auton. Robots, vol. 9, no. 2, pp. 151–173, 2000.
[27] V. Villani et al., “Interacting with a mobile robot with a natural infrastructure-less interface,” in Proc. 20th IFAC World Congr., 2017, vol. 50, no. 1, pp. 12753–12758.
[28] C. Escolano and J. Minguez, “Sistema de teleoperación multi-robot basada en interfaz cerebro-computador,” Revista Iberoamericana de Automática e Informática Ind., vol. 8, no. 2, pp. 16–23, 2011.
[29] J. L. Sirvent, J. M. Azorín, E. Iáñez, A. Úbeda, and E. Fernández, “Non-invasive brain interface based on evoked potentials for the control of a robot arm,” Revista Iberoamericana de Automática e Informática Ind., vol. 8, no. 2, pp. 103–111, 2011.
[30] N. Alharbi, Y. Liang, and D. Wu, “A data preprocessing technique for gesture recognition based on extended-Kalman-filter,” in Proc. IEEE/ACM Int. Conf. Connected Health: Appl., Syst. Eng. Technol. (CHASE), Philadelphia, PA, USA, Jul. 2017, pp. 77–83.
[31] M. U. Islam, H. Mahmud, F. B. Ashraf, I. Hossain, and M. K. Hasan, “Yoga posture recognition by detecting human joint points in real time using Microsoft Kinect,” in Proc. IEEE Region 10 Humanitarian Technol. Conf. (R10-HTC), Dec. 2017, pp. 668–673.
[32] A. Masurelle, S. Essid, and G. Richard, “Multimodal classification of dance movements using body joint trajectories and step sounds,” in Proc. 14th Int. Workshop Image Anal. Multimedia Interact. Services (WIAMIS), Jul. 2013, pp. 1–4.
[33] P. Payeur et al., “Human gesture quantification: An evaluation tool for somatic training and piano performance,” in Proc. IEEE Int. Symp. Haptic, Audio Vis. Environments Games (HAVE), Richardson, TX, USA, Oct. 2014, pp. 100–105.
[34] I. Ayed, B. Moyà-Alcover, P. Martínez-Bueso, J. Varona, A. Ghazel, and A. Jaume-i-Capó, “Validation of RGBD devices to therapeutically measure balance: The functional scope test with Microsoft Kinect,” Revista Iberoamericana de Automática e Informática Ind., vol. 14, no. 1, pp. 115–120, 2017.
[35] R. Munoz, T. Schumacher Barcelos, R. Villarroel, R. Guinez, and E. Merino, “Body posture visualizer to support multimodal learning analytics,” IEEE Latin Amer. Trans., vol. 16, no. 11, pp. 2706–2715, Nov. 2018.
[36] A. Araújo, D. Portugal, M. S. Couceiro, J. Sales, and R. P. Rocha, “Development of a compact mobile robot integrated in the ROS middleware,” Revista Iberoamericana de Automática e Informática Ind., vol. 11, no. 3, pp. 315–326, 2014.
[37] G. Baron, P. Czekalski, M. Golenia, and K. Tokarz, “Gesture and voice driven mobile tribot robot using Kinect sensor,” in Proc. Int. Symp. Electrodynamic Mechatronic Syst. (SELM), Opole-Zawiercie, Poland, May 2013, pp. 33–34.
[38] G. Cicirelli, C. Attolico, C. Guaragnella, and T. D’Orazio, “A Kinect-based gesture recognition approach for a natural human robot interface,” Int. J. Adv. Robotic Syst., vol. 12, no. 3, pp. 1–11, 2015.
[39] D. Maraj, A. Maraj, and A. Hajzeraj, “Application interface for gesture recognition with Kinect sensor,” in Proc. IEEE Int. Conf. Knowl. Eng. Appl. (ICKEA), Sep. 2016, pp. 98–102.
[40] F. C. C. Wilson, S. H. M. Torres, and M. J. Kern, “7 DOF industrial robot controlled by hand gestures using Microsoft Kinect v2,” in Proc. IEEE 3rd Colombian Conf. Autom. Control (CCAC), Cartagena, Colombia, Oct. 2017, pp. 1–6.
[41] I. Benabdallah, Y. Bouteraa, R. Boucetta, and C. Rekik, “Kinect-based computed torque control for lynxmotion robotic arm,” in Proc. 7th Int. Conf. Modeling, Identificat. Control (ICMIC), Dec. 2015, pp. 1–6.
[42] R. Fernandez and C. von Lucken, “Using the Kinect sensor with open source tools for the development of educational games for kids in pre-school age,” in Proc. Latin Amer. Comput. Conf. (CLEI), Oct. 2015, pp. 1–12.
[43] L. Qingtang, W. Yang, W. Linjing, H. Jingxiu, and W. Peng, “Design and implementation of a serious game based on Kinect,” in Proc. Int. Conf. Educ. Innov. Through Technol. (EITT), Wuhan, China, Oct. 2015, pp. 13–18.
[44] A. Grammatikopoulou, S. Laraba, O. Sahbenderoglu, K. Dimitropoulos, and N. Grammalidis, “An adaptive framework for the creation of body-motion-based games,” in Proc. 9th Int. Conf. Virtual Worlds Games Serious Appl. (VS-Games), Athens, Greece, Sep. 2017, pp. 209–216.
[45] I. Iturrate et al., “A mobile robot platform for open learning based on serious games and remote laboratories,” in Proc. 1st Int. Conf. Portuguese Soc. Eng. Educ. (CISPEE), Porto, Portugal, Oct./Nov. 2013, pp. 1–7.
[46] The MathWorks. (2012). MATLAB-Simulink/Version 8.0 (R2012b). Natick, MA, USA. [Online]. Available: https://www.mathworks.com
[47] Quanser Consulting. (2011). QuaRC/Version 2.3.603. Markham, ON, Canada. [Online]. Available: https://www.quanser.com
[48] Microsoft Corporation. (2013). Visual Studio/Version 12. Redmond, WA, USA. [Online]. Available: https://www.microsoft.com
[49] Microsoft Corporation. (2013). Kinect for Windows Software Development Kit (SDK) 2.0. Redmond, WA, USA. [Online]. Available: https://www.microsoft.com
[50] C. Giorio and M. Fascinari, Kinect in Motion: Audio and Visual Tracking by Example. Birmingham, U.K.: Packt, 2013.
[51] Basic Micro Motion Control. (2015). RoboClaw 2×30A Motor Controller. Temecula, CA, USA. [Online]. Available: https://www.basicmicro.com
[52] RobotZone, ServoCity. 313 RPM HD Premium Planetary Gear Motor. [Online]. Available: https://www.servocity.com/313-rpm-hdpremium-planetary-gear-motor
[53] U.S. Digital, Vancouver, WA, USA. E2 Optical Kit Encoder. [Online]. Available: https://cdn.usdigital.com/assets/datasheets/E2_datasheet.pdf
[54] HobbyKing. Zippy Compact 4500 mAh 4S 35C LiPo Pack. [Online]. Available: https://hobbyking.com/en_us/zippy-compact-4500mah-4s-35c-lipo-pack.pdf
[55] C. C. de Wit, H. Khennouf, C. Samson, and O. J. Sordalen, “Nonlinear control design for mobile robots,” in Recent Trends in Mobile Robots (World Scientific Series in Robotics and Intelligent Systems), no. 11, Y. F. Zheng, Ed. 1994, pp. 121–156, doi: 10.1142/9789814354301_0005.
[56] M. A. O. Misses, “Ludibot: A ludoeducational robot designed from mobile robotics, game sciences and the didactics of languages and cultures,” CINVESTAV, Mexico City, Mexico, Tech. Rep. CINVESTAV 2246 and the Automatic Control 177, 2020. [Online]. Available: https://repositorio.cinvestav.mx/handle/cinvestav/2246
[57] J. P. Miller, The Holistic Curriculum, 2nd ed. Toronto, ON, Canada: OISE Press, 2007.
[58] Common European Framework of Reference for Languages, Ministry of Education, Culture and Sport, Instituto Cervantes, Madrid, Spain, 2002.
[59] J. Piaget, Psychology and Epistemology. Mexico City, Mexico: Ariel, 1979.

Manuel Alejandro Ojeda-Misses received the degree in mechatronics engineering from UPIITA-IPN, Mexico, in 2013, and the M.Sc. and Ph.D. degrees in automatic control from CINVESTAV in 2015 and 2020, respectively. He is currently an Assistant Professor with the Division of Informatics Engineering, TESSFP/TECNM. His research interests include software and hardware development, ludoeducational robotics, ludic games, mechatronic systems, automatic control, and applied electronics.
