Abstract— This paper presents a mobile robot for the development of interactive games and for language learning purposes based on the Kinect v2. The game uses an interface that seeks to take advantage of human-computer interaction in a playful way through gesture recognition. This work combines robotics and games, games and language learning, and some recent developments in robotics and language learning. The hardware used to build the mobile robot is described, as well as the ludic game for language teaching through a human-robot interface. The performance of the platform is assessed by means of real-time experiments carried out by students, using a non-linear control based on trajectory tracking during various practices. The experiments prove the platform's potential for supporting the ludic game; the results are then discussed and future perspectives are mentioned.

Index Terms— Playful game, mobile robot, non-linear control, trajectory tracking, language learning, Kinect v2.

I. INTRODUCTION

a human player in chess. Today, robots and mobile robots continue to generate interest as playful devices. Examples are developed and proposed by Ohashi et al. [5], Galán et al. [6], and Su et al. [7], among others.

Robot-based tools enable interaction by playing an important role in the development of learning. This allows the development of experimental robot-based infrastructure. An alternative for learning has also been developed through tools that use computer programs for the realization of applications [8], but these do not generate the needs associated with an experimental platform. However, applications based on simulations are not enough for a case study, since it is necessary to carry out a practical analysis, where the learners complement the theoretical aspects and are allowed a direct interaction with experiences to understand abstract concepts and put them into practice. Therefore, one solution is the
Authorized licensed use limited to: Escuela Superior de Ingeneria Mecanica. Downloaded on April 26,2023 at 19:07:52 UTC from IEEE Xplore. Restrictions apply.
OJEDA-MISSES: DEVELOPMENT OF AN INTERACTIVE MOBILE ROBOT FOR PLAYFUL LEARNING AND LANGUAGE TEACHING 115
participation, frequent interaction, feedback and connections with the real-world context [2], [3], through applications based on cognitivism, constructivism and interactionism [15], whose objective is to conceive, design and develop playful applications that promote learning through the use of a robot to perform specific tasks, movements, recognition and data processing by means of control systems, computers and programming languages. This contributes to complementing the learning/teaching process thanks to the use of technologies applied in formal, semi-formal or informal learning situations, through the introduction of new dynamics and attitudes, resorting to and promoting the use of multimodal media literacy and twenty-first-century skills.

In sum, it has been proven that robots have been applied in various approaches; in particular, mobile robots have seen an important rise. They are used as guide robots [16], [17], rehabilitation support robots [18], [19], [20], robots for navigation [21] and simultaneous mapping [19], [22], multi-agent robots [23] and/or robots applied to games [5], [6], [7], [8], [9], [10]. This allows the development of an interactive mobile robot [5], [6], [7], [8], [9], [10], [16], [17], [18], [19], [20], [21], [22], [23] for the playful learning of a language, where learners not only acquire theoretical concepts but also apply them in practical applications through an interactive mobile robot that allows them to relate abstract concepts to their own experience through the infrastructure, equipment and movement of the robot.

For the development of the interactive mobile robot, it is essential to condition and program sensors capable of measuring the variables of interest – related to the environment, the user and/or the robot – as well as actuators capable of setting the limbs, links or wheels of the device in motion. Some devices commonly used for this purpose are joysticks [24], haptic devices [9], cameras [25], [26], smartphones [26] and smart watches [9], [25], [26], [27].

Among the devices that provide useful data for the development and implementation of real-time human-machine interfaces, the Kinect v2 stands out. To measure the parameters of the human body, the Kinect v2 integrates three types of sensors: a color camera, an infrared sensor and a set of microphones. Compared to other devices, it is of special interest for its accessibility and low cost [28], [29]. The Kinect v2 has been used in various body and/or artistic disciplines: martial arts [30], yoga [31], dance [32] or music [33], among others [28], [29], [30], [31], [32], [33].

This research combines robotics, game science and language didactics; priority has been given to proposals for the use of the Kinect v2 sensor related to robotics and games. In the field of robotics, the Kinect v2 has been used in mobile robots or manipulator robots to realize applications and interfaces. Most applications have been implemented for mobile robots, while interfaces have been developed only for manipulator robots.

The applications using Kinect are the RGBD device validation joint system to measure the balance of patients in therapy [34]; the application for the automatic capture of body postures for learning analysis purposes [35]; and the compact mobile robot under ROS for navigation and localization in unstructured environments [34], [35], [36].

The most relevant human-robot interfaces are the gesture recognition system developed by Baron et al. [37] to move a robotic arm; the interface created by Cicirelli et al. [38] to recognize static postures of the human body; the interface designed by Dardan et al. [39] for gesture recognition thanks to a Lego-like robot that identifies the actions performed by a group of children while dancing; the software development libraries for the Kinect v2 (SDK) chosen by Cueva et al. [40] to control a seven-degree-of-freedom industrial robot; and, finally, the interface generated by Benabdallah et al. [41] to move an industrial robot, based on LabVIEW.

During the literature review, it was not possible to identify papers that integrated robotics, game science, and language didactics. However, among the games programmed with the Kinect v2 that coincide to a greater or lesser degree with the ludopedagogical approach, we can mention, among others, the proposal of Fernández and von Lucken [42], whose objective is to promote a playful approach to mathematics for preschool children, as well as the games designed by Qingtang et al. [43] and by Grammatikopoulou et al. [44], in which the user directs an avatar through body movements. The work of Iturrate et al. [45] combines the Kinect v2 with video games and remote mobile robots, within the framework of an interactive environment where participants are invited to cooperate or compete with each other by solving general-culture challenges.

This article presents a mobile robot that includes a Kinect v2 sensor; in particular, a playful application is presented for learning a set of eight parts of the body in French through the mobile robot, which aims to combine in a balanced way the contributions of robotics, game science and language didactics.

The product obtained is a proposal of a playful game based on gesture recognition for learning body parts. The application is developed to learn words in French, but it can be implemented for other languages.

The interface leverages nonverbal communication for active and meaningful learning. With a view to promoting autonomy, it includes a menu that allows the user to independently discover the operation of the robot, practice the vocabulary to be acquired and, finally, play. The playful game is intended for use in the language-learning process, from the acquisition, conceptualization and/or systematization of vocabulary related to body parts.

The main elements that make up the mobile robot are described, the non-linear control used for trajectory tracking is presented, and control results are reported for the playful application called De la tête aux pieds (from head to toe), designed to promote vocabulary learning for French learners; the results obtained are discussed and future prospects are presented.

III. INTERACTIVE MOBILE ROBOT COMPONENTS

The interactive mobile robot has a differential configuration with a support wheel. The prototype, shown in Figure 1,
116 IEEE REVISTA IBEROAMERICANA DE TECNOLOGIAS DEL APRENDIZAJE, VOL. 18, NO. 1, FEBRUARY 2023
Using the equation of the dynamics of the error (3) and a change of coordinates, we have:

$$\begin{bmatrix} e_1 \\ e_2 \\ e_3 \end{bmatrix} =
\begin{bmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} x_{ref} - x \\ y_{ref} - y \\ \theta_{ref} - \theta \end{bmatrix} \tag{4}$$

where $e_1$, $e_2$ are the position errors in the reference frame of the mobile robot, $e_3$ is the orientation error and $q_r = [x_{ref}, y_{ref}, \theta_{ref}]^T$ is the reference path. The controller for trajectory tracking is given by:

$$u_d = -k_1(v_{ref}, \omega_{ref})\, e_1$$
$$u_i = -k_2\, v_{ref}\, \frac{\sin(e_3)}{e_3}\, e_2 - k_3(v_{ref}, \omega_{ref})\, e_3 \tag{5}$$

where $k_2$ is a positive constant and $k_1(\cdot)$, $k_3(\cdot)$ are strictly positive continuous functions on $\mathbb{R} \times \mathbb{R} - (0, 0)$. The functions are defined as:

$$k_1(v_{ref}, \omega_{ref}) = k_3(v_{ref}, \omega_{ref}) = 2\zeta\sigma\sqrt{v_{ref}^2 + \omega_{ref}^2} \tag{6}$$

where $k_2 = \sigma$; it is important to mention that $\zeta$, $\sigma$ are real, positive gains. The control strategies are applied to the angular velocities of each robot motor. Now, it is essential to check the stability of the system under the control strategy. For this, the dynamics of the error and a stable equilibrium point for the

Therefore, $\dot V$ is negative definite and ensures that the system is asymptotically stable.

IV. GRAPHICAL INTERFACE

The operation of the interactive mobile robot is achieved through an interactive human-mobile robot interface. The interactive interface of the mobile robot is inspired by a finite state machine model. By giving the user the ability to choose between various functions, it allows transitions using hand gestures as input. These gestures are defined according to two positions (up/down) and two states (open hand/closed hand).

The user moves the robot from one state to another through the acquisition and recognition of the parameters of the human body performed by the Kinect v2. Each action of the mobile robot is associated with a specific gesture. Thanks to these gestures, the user is able to drive the robot from a source state to a destination state, passing through other primary and secondary states that are mentioned later.

For its part, the graphical interface (Figure 3) is characterized by being interactive, multifunctional, and easy to use. Thanks to the Kinect v2, it allows interaction between the user and the mobile robot through sounds, images, movements and gestures. It thus offers an interesting potential to capture the attention of learners and promote a more active and integral participation.
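As a concrete illustration, the tracking controller of Eqs. (4)-(6) can be sketched in code. This is a minimal sketch, not the paper's implementation: the gain values for ζ and σ, the wheel radius and the track width are illustrative placeholders, and how u_d and u_i are combined with the reference feedforward before being sent to the motors is not shown in this excerpt.

```python
import math

def tracking_control(pose, ref_pose, v_ref, w_ref, zeta=0.9, sigma=2.0):
    """Non-linear trajectory-tracking law of Eqs. (4)-(6).
    pose = (x, y, theta), ref_pose = (x_ref, y_ref, theta_ref);
    zeta and sigma are illustrative gain values, k2 = sigma."""
    x, y, th = pose
    xr, yr, thr = ref_pose
    # Eq. (4): rotate the world-frame error into the robot frame.
    e1 = math.cos(th) * (xr - x) + math.sin(th) * (yr - y)
    e2 = -math.sin(th) * (xr - x) + math.cos(th) * (yr - y)
    e3 = thr - th
    # Eq. (6): gains scheduled on the reference velocities.
    k1 = k3 = 2.0 * zeta * sigma * math.sqrt(v_ref**2 + w_ref**2)
    k2 = sigma
    # sin(e3)/e3 tends to 1 as e3 -> 0, so guard the division.
    sinc_e3 = math.sin(e3) / e3 if abs(e3) > 1e-9 else 1.0
    # Eq. (5): linear (u_d) and angular (u_i) feedback terms.
    u_d = -k1 * e1
    u_i = -k2 * v_ref * sinc_e3 * e2 - k3 * e3
    return u_d, u_i

def wheel_speeds(v, w, r=0.05, half_track=0.15):
    """Map a (linear, angular) velocity command to per-motor angular
    velocities of a differential drive; r and half_track are assumed
    dimensions, not taken from the paper."""
    return (v - w * half_track) / r, (v + w * half_track) / r
```

Since the text states that the control strategies act on the angular velocities of each motor, `wheel_speeds` shows the usual differential-drive mapping from a body-velocity command to left and right motor speeds.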
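The finite-state-machine interface of Section IV can be sketched as a transition table keyed by the current state and the recognized gesture. Only the gesture alphabet (two hand positions, up/down, crossed with two hand states, open/closed) comes from the text; the state names and the particular gesture-to-state mapping below are hypothetical, since the paper's primary and secondary states are detailed later.

```python
# Transition table: (current_state, gesture) -> next_state.
# A gesture pairs a hand position ("up"/"down") with a hand
# state ("open"/"closed"); the states named here are illustrative.
TRANSITIONS = {
    ("menu", ("up", "open")): "discover",
    ("menu", ("up", "closed")): "practice",
    ("menu", ("down", "open")): "play",
    ("discover", ("down", "closed")): "menu",
    ("practice", ("down", "closed")): "menu",
    ("play", ("down", "closed")): "menu",
}

def next_state(state, gesture):
    """Advance the interface if the gesture is a valid transition
    from the current state; otherwise remain in the same state."""
    return TRANSITIONS.get((state, gesture), state)

state = "menu"
state = next_state(state, ("up", "open"))      # -> "discover"
state = next_state(state, ("down", "closed"))  # back to "menu"
```

Unrecognized or invalid gestures simply leave the machine where it is, which matches the idea that each robot action is bound to one specific gesture.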
the robot. The points mentioned are relevant insofar as many of the benefits of the platform are associated with fulfilling the competencies, by making available to the learner a novel platform with real-time performance that considers the teaching and learning needs.

VII. CONCLUSION

This article presented an interactive mobile robot designed for the development of playful applications with the Kinect v2 and used to implement interactive games for language learning purposes. The playful game uses an interface that seeks to take advantage of the interaction between the player and the interactive mobile robot in a playful way through gesture recognition. This work combines robotics and games, games and language learning, and some recent developments on robotics and language learning; one of the main contributions of the project is the uniqueness of its multidisciplinary approach, which seeks a deliberate balance between robotics, game studies and language. The hardware used to build the mobile robot was described, as well as the playful game for learning and teaching languages through a human-robot interface. Platform performance was evaluated through real-time experiments using a nonlinear control based on trajectory tracking. The experiments demonstrate the potential of the platform to support playful games; the results are discussed and future prospects are mentioned below. The interactive interface and playful application have been successfully tested with a group of students in CINVESTAV's language lab, and new development tracks have been identified.

Finally, it can be mentioned that, with respect to the works presented in [5], [10], this work has the following advantages: first, it presents a multimodal interface that allows the use of verbal and non-verbal communication with the user; it allows the use of an interactive mobile robot powered by control strategies, which in turn allows the realization and programming of various trajectories in a classroom; it makes use of the Kinect v2 sensor, which allows the identification of the user's movements and the recognition of body parts; it allows the development of a playful game applied to the learning of body parts; finally, the interactive mobile robot is considered flexible, since it allows the programming of other dynamics, movements and games, and the learning of other topics during the learning of a language. This starts from the idea that playful tools can help motivate learning, as long as one keeps in mind the complexity of both the learning process and the playful metaphor: not everything called play produces the expected effects of a game.

The type of learning promoted by the interactive robot is active and cognitive, thanks to playful activities that require bi- and multidirectional interaction, rather than passive observation or an explanation-repetition sequence, as is the case with the identified uses of the Nao robot, inspired by a behaviorist approach [58]. In sum, the interactive mobile robot follows two general educational principles: learning as an active process, and complete, authentic and real learning according to Piaget [59]. It therefore draws on the action-oriented approach in language teaching [13] and on the development of general and communication competences.

In addition, a constructivist approach supported by interactionism invites one to consider the teacher as a guide and mentor, giving the student the necessary freedom to explore the technological environment and build their knowledge, providing support in case they have doubts or face a problem. From the approach of interactionism, student interaction plays a very important role, especially since recent technologies have generalized the use of sensors capable of measuring various variables of interest, relative to the environment, the user and/or the robot.

The current version, in French, is easily adaptable to other languages. It is also possible to contemplate work around different contents or according to different rules, resorting to individual and/or group dynamics. However, language teachers must still be educated and trained to select and program the robot's parameters. Finally, as future work, it is contemplated to improve the appearance of the robot through some disguise or personality to make it friendlier for teachers and learners.

REFERENCES

[1] K. Ananiadou and M. Claro, "21st century skills and competences for new millennium learners in OECD countries," OECD Education Working Papers 41, 2009.
[2] New Vision for Education: Fostering Social and Emotional Learning Through Technology, World Economic Forum, Boston, MA, USA, 2016.
[3] New Vision for Education: Unlocking the Potential of Technology, World Economic Forum, Boston, MA, USA, 2015.
[4] N. Lacelle, J.-F. Boutin, and M. Lebrun, "La littératie médiatique multimodale appliquée en contexte numérique," in LMM@: Outils Conceptuels et Didactiques. Montreal, QC, Canada: PUQ, 2017.
[5] O. Ohashi, E. Ochiai, and Y. Kato, "A remote control method for mobile robots using game engines," in Proc. 28th Int. Conf. Adv. Inf. Netw. Appl. Workshops, Victoria, BC, Canada, May 2014, pp. 79–84.
[6] J. G. Munévar, E. L. R. Sánchez, and H. M. Mosquero, "La robótica aplicada a la lúdica," Tecnura, vol. 15, no. 30, pp. 52–63, 2011.
[7] K. L. Su, S. V. Shiau, J. H. Guo, and C.-W. Shiau, "Mobile robot based online Chinese chess game," in Proc. 4th Int. Conf. Innov. Comput., Inf. Control (ICICIC), Taiwan, Dec. 2009, pp. 528–531.
[8] N. Yanes and I. Bououd, "Using gamification and serious games for English language learning," in Proc. Int. Conf. Comput. Inf. Sci. (ICCIS), vol. 3, Apr. 2019, pp. 7301–7312.
[9] X. Hou, P. Fang, and Y. Qu, "Dynamic kinesthetic boundary for haptic teleoperation of unicycle type ground mobile robots," in Proc. 36th Chin. Control Conf. (CCC), vol. 2, Jul. 2017, pp. 6246–6251.
[10] P. Kitichaiwat, M. Thongsuk, and S. Ngamsuriyaroj, "Melody touch: A game for learning English from songs," in Proc. 3rd ICT Int. Student Project Conf. (ICT-ISPC), Mar. 2014, pp. 13–16.
[11] L. Steels and M. Hild, Language Grounding in Robots. New York, NY, USA: Springer, 2012.
[12] L. Steels, "Language games for autonomous robots," IEEE Intell. Syst., vol. 16, no. 5, pp. 16–22, Sep. 2001.
[13] M. Ishida, A. Khalifa, T. Kato, and S. Yamamoto, "Features of learner corpus collected with joining-in type robot assisted language learning system," in Proc. Conf. Oriental Chapter Int. Committee Coordination Standardization Speech Databases Assessment Techn. (O-COCOSDA), Oct. 2016, pp. 128–131.
[14] R. van den Berghe, J. Verhagen, O. Oudgenoeg-Paz, S. van der Ven, and P. Leseman, "Social robots for language learning: A review," Rev. Educ. Res., vol. 89, no. 2, pp. 259–295, Apr. 2019.
[15] G. C. Da Silva and D. A. Signoret, Temas Sobre la Adquisición de una Segunda Lengua. Mexico City, Mexico: Trillas, 2005.
[16] W. Burgard et al., "Experiences with an interactive museum tour-guide robot," Artif. Intell., vol. 114, nos. 1–2, pp. 3–55, 1999.
[17] S. Thrun et al., "MINERVA: A second-generation museum tour-guide robot," in Proc. IEEE Int. Conf. Robot. Automat., Detroit, MI, USA, vol. 3, May 1999, pp. 1999–2005.
[18] S. T. Hansen, T. Bak, and C. Risager, "An adaptive game algorithm for an autonomous, mobile robot—A real world study with elderly users," in Proc. IEEE 21st Int. Symp. Robot Hum. Interact. Commun. (RO-MAN), Paris, France, Sep. 2012, pp. 125–130.
[19] S. T. Hansen, D. M. Rasmussen, and T. Bak, "Field study of a physical game for older adults based on an autonomous, mobile robot," in Proc. Int. Conf. Collaboration Technol. Syst. (CTS), Denver, CO, USA, May 2012, pp. 125–130.
[20] L. V. Calderita, P. Bustos, C. S. Mejías, F. Fernández, R. Viciana, and A. Bandera, "Socially interactive robotic assistant for motor rehabilitation therapies with pediatric patients," Revista Iberoamericana de Automática e Informática Ind., vol. 12, no. 1, pp. 99–110, 2015.
[21] T. Zafar, M. U. Khan, A. Nawaz, and K. F. Ahmad, "Smart phone interface for robust control of mobile robots," in Proc. IEEE Int. Conf. Auto. Robot Syst. Competitions (ICARSC), Espinho, Portugal, May 2014, pp. 42–46.
[22] K. Koide, J. Miura, M. Yokozuka, S. Oishi, and A. Banno, "Interactive 3D graph SLAM for map correction," IEEE Robot. Autom. Lett., vol. 6, no. 1, pp. 40–47, Jan. 2021.
[23] C. G. Cena, R. Saltarén, J. L. Blázquez, and R. Aracil, "Desarrollo de una interfaz de usuario para el sistema robótico multiagente SMART," Revista Iberoamericana de Automática e Informática Ind., vol. 7, no. 4, pp. 17–27, 2010.
[24] S. Kruzic, J. Music, and I. Stancic, "Influence of human-computer interface elements on performance of teleoperated mobile robot," in Proc. 40th Int. Conv. Inf. Commun. Technol., Electron. Microelectron. (MIPRO), Opatija, Croatia, May 2017, pp. 1015–1020.
[25] J. Wu, C. Lv, L. Zhao, R. Li, and G. Wang, "Design and implementation of an omnidirectional mobile robot platform with unified I/O interfaces," in Proc. IEEE Int. Conf. Mechatronics Autom. (ICMA), Aug. 2017, pp. 410–415.
[26] S. Waldherr, R. Romero, and S. Thrun, "A gesture based interface for human–robot interaction," Auton. Robots, vol. 9, no. 2, pp. 151–173, 2000.
[27] V. Villani et al., "Interacting with a mobile robot with a natural infrastructure-less interface," in Proc. 20th IFAC World Congr., 2017, vol. 50, no. 1, pp. 12753–12758.
[28] C. Escolano and J. Minguez, "Sistema de teleoperación multi-robot basada en interfaz cerebro-computador," Revista Iberoamericana de Automática e Informática Ind., vol. 8, no. 2, pp. 16–23, 2011.
[29] J. L. Sirvent, J. M. Azorín, E. Iáñez, A. Úbeda, and E. Fernández, "Non-invasive brain interface based on evoked potentials for the control of a robot arm," Revista Iberoamericana de Automática e Informática Ind., vol. 8, no. 2, pp. 103–111, 2011.
[30] N. Alharbi, Y. Liang, and D. Wu, "A data preprocessing technique for gesture recognition based on extended-Kalman-filter," in Proc. IEEE/ACM Int. Conf. Connected Health: Appl., Syst. Eng. Technol. (CHASE), Philadelphia, PA, USA, Jul. 2017, pp. 77–83.
[31] M. U. Islam, H. Mahmud, F. B. Ashraf, I. Hossain, and M. K. Hasan, "Yoga posture recognition by detecting human joint points in real time using Microsoft Kinect," in Proc. IEEE Region 10 Humanitarian Technol. Conf. (R10-HTC), Dec. 2017, pp. 668–673.
[32] A. Masurelle, S. Essid, and G. Richard, "Multimodal classification of dance movements using body joint trajectories and step sounds," in Proc. 14th Int. Workshop Image Anal. Multimedia Interact. Services (WIAMIS), Jul. 2013, pp. 1–4.
[33] P. Payeur et al., "Human gesture quantification: An evaluation tool for somatic training and piano performance," in Proc. IEEE Int. Symp. Haptic, Audio Vis. Environments Games (HAVE), Richardson, TX, USA, Oct. 2014, pp. 100–105.
[34] I. Ayed, B. Moyà-Alcover, P. Martínez-Bueso, J. Varona, A. Ghazel, and A. Jaume-i-Capó, "Validation of RGBD devices to therapeutically measure balance: The functional scope test with Microsoft Kinect," Revista Iberoamericana de Automática e Informática Ind., vol. 14, no. 1, pp. 115–120, 2017.
[35] R. Munoz, T. Schumacher Barcelos, R. Villarroel, R. Guinez, and E. Merino, "Body posture visualizer to support multimodal learning analytics," IEEE Latin Amer. Trans., vol. 16, no. 11, pp. 2706–2715, Nov. 2018.
[36] A. Araújo, D. Portugal, M. S. Couceiro, J. Sales, and R. P. Rocha, "Development of a compact mobile robot integrated in the ROS middleware," Revista Iberoamericana de Automática e Informática Ind., vol. 11, no. 3, pp. 315–326, 2014.
[37] G. Baron, P. Czekalski, M. Golenia, and K. Tokarz, "Gesture and voice driven mobile tribot robot using Kinect sensor," in Proc. Int. Symp. Electrodynamic Mechatronic Syst. (SELM), Opole-Zawiercie, Poland, May 2013, pp. 33–34.
[38] G. Cicirelli, C. Attolico, C. Guaragnella, and T. D'Orazio, "A Kinect-based gesture recognition approach for a natural human robot interface," Int. J. Adv. Robotic Syst., vol. 12, no. 3, pp. 1–11, 2015.
[39] D. Maraj, A. Maraj, and A. Hajzeraj, "Application interface for gesture recognition with Kinect sensor," in Proc. IEEE Int. Conf. Knowl. Eng. Appl. (ICKEA), Sep. 2016, pp. 98–102.
[40] F. C. C. Wilson, S. H. M. Torres, and M. J. Kern, "7 DOF industrial robot controlled by hand gestures using Microsoft Kinect v2," in Proc. IEEE 3rd Colombian Conf. Autom. Control (CCAC), Cartagena, Colombia, Oct. 2017, pp. 1–6.
[41] I. Benabdallah, Y. Bouteraa, R. Boucetta, and C. Rekik, "Kinect-based computed torque control for lynxmotion robotic arm," in Proc. 7th Int. Conf. Modeling, Identificat. Control (ICMIC), Dec. 2015, pp. 1–6.
[42] R. Fernandez and C. von Lucken, "Using the Kinect sensor with open source tools for the development of educational games for kids in pre-school age," in Proc. Latin Amer. Comput. Conf. (CLEI), Oct. 2015, pp. 1–12.
[43] L. Qingtang, W. Yang, W. Linjing, H. Jingxiu, and W. Peng, "Design and implementation of a serious game based on Kinect," in Proc. Int. Conf. Educ. Innov. Through Technol. (EITT), Wuhan, China, Oct. 2015, pp. 13–18.
[44] A. Grammatikopoulou, S. Laraba, O. Sahbenderoglu, K. Dimitropoulos, and N. Grammalidis, "An adaptive framework for the creation of bodymotion-based games," in Proc. 9th Int. Conf. Virtual Worlds Games Serious Appl. (VS-Games), Athens, Greece, Sep. 2017, pp. 209–216.
[45] I. Iturrate et al., "A mobile robot platform for open learning based on serious games and remote laboratories," in Proc. 1st Int. Conf. Portuguese Soc. Eng. Educ. (CISPEE), Porto, Portugal, Oct./Nov. 2013, pp. 1–7.
[46] The MathWorks. (2012). MATLAB-Simulink, Version 8.0 (R2012b). Natick, MA, USA. [Online]. Available: https://www.mathworks.com
[47] Quanser Consulting. (2011). QUARC, Version 2.3.603. Markham, ON, Canada. [Online]. Available: https://www.quanser.com
[48] Microsoft Corporation. (2013). Visual Studio, Version 12. Redmond, WA, USA. [Online]. Available: https://www.microsoft.com
[49] Microsoft Corporation. (2013). Kinect for Windows Software Development Kit (SDK) 2.0. Redmond, WA, USA. [Online]. Available: https://www.microsoft.com
[50] C. Giorio and M. Fascinari, Kinect in Motion: Audio and Visual Tracking by Example. Birmingham, U.K.: Packt, 2013.
[51] Basic Micro Motion Control. (2015). RoboClaw 2×30A Motor Controller. Temecula, CA, USA. [Online]. Available: https://www.basicmicro.com
[52] Robotzone, ServoCity. 313 RPM HD Premium Planetary Gear Motor. [Online]. Available: https://www.servocity.com/313-rpm-hdpremium-planetary-gear-motor
[53] U.S. Digital, Vancouver, WA, USA. E2 Optical Kit Encoder, pp. 1–10. [Online]. Available: https://cdn.usdigital.com/assets/datasheets/E2_datasheet.pdf
[54] HobbyKing. Zippy Compact 4500mAh 4S 35C LiPo Pack. [Online]. Available: https://hobbyking.com/en_us/zippy-compact-4500mah-4s-35c-lipo-pack.pdf
[55] C. C. de Wit, H. Khennouf, C. Samson, and O. J. Sordalen, "Nonlinear control design for mobile robots," in Recent Trends in Mobile Robots (World Scientific Series in Robotics and Intelligent Systems), no. 11, Y. F. Zheng, Ed. 1994, pp. 121–156, doi: 10.1142/9789814354301_0005.
[56] M. A. O. Misses, "Ludibot: A ludoeducational robot designed from mobile robotics, game sciences and the didactics of languages and cultures," CINVESTAV, Mexico City, Mexico, Tech. Rep., 2020. [Online]. Available: https://repositorio.cinvestav.mx/handle/cinvestav/2246
[57] J. P. Miller, The Holistic Curriculum, 2nd ed. Toronto, ON, Canada: OISE Press, 2007.
[58] Common European Framework of Reference for Languages, Ministry of Education, Culture and Sport, Instituto Cervantes, Madrid, Spain, 2002.
[59] J. Piaget, Psychology and Epistemology. Mexico City, Mexico: Ariel, 1979.

Manuel Alejandro Ojeda-Misses received the degree in mechatronics engineering from UPIITA-IPN, Mexico, in 2013, and the M.Sc. and Ph.D. degrees in automatic control from CINVESTAV in 2015 and 2020, respectively. He is currently an Assistant Professor with the Division of Informatics Engineering, TESSFP/TECNM. His research interests include software and hardware development, ludoeducational robotics, ludic games, mechatronic systems, automatic control, and applied electronics.