
Control of Home Devices based on Hand Gestures

Pomboza-Junez Gonzalo
Department of Languages and Systems
National University of Chimborazo
Riobamba, Ecuador
gpomboza@correo.ugr.es

Holgado-Terriza Juan A.
Department of Software Engineering
University of Granada
Granada, Spain
jholgado@ugr.es

Abstract—Human-machine interfaces are constantly evolving. Within this group, interfaces based on the natural gestures of users allow these movements to be converted into commands for a computer system without contact with any surface. This paper describes the architectural model of a system that applies such interfaces to the control of home automation systems. An armband for gestural capture, MYO®, is selected in this case. This device measures the electrical activity produced by the movement of the forearm and hand muscles in order to detect the hand gesture, and it is also able to capture the orientation and rotation of the movement. Finally, a study of its application is performed.

Index Terms—Gesture recognition, hand gesture, gesture interface, human computer interaction, natural user interfaces.
I. INTRODUCTION

Automating a home involves the incorporation of a set of devices into a centralized home automation system that aims to improve energy management, security, welfare and communication [1]. The interaction between users and their home devices is becoming an essential concern in the design of new consumer devices and, consequently, in obtaining their public acceptance. In fact, the key task is enhancing the user experience and the simplicity of handling. Home devices have evolved from simple remote controls to home automation switchboards, which guarantee connectivity with mobile devices such as smartphones or tablets. Hence, the current trend in the development of new innovative home devices promotes the ability to identify and control any everyday device, managing the local or remote interconnection with other users [2], [3], [4], [5].

The interaction between the user and the devices through natural gestures is relatively new. A gesture is defined as a mental concept of an idea associated with an action, response or requirement that the user realizes with the intention of achieving a result [6]. Gestures provide non-verbal communication between partners without using words or sounds. However, the vocabulary is more limited than in verbal communication, and hence a sequence of gestures has to be performed to express more complex ideas. Examples of complex languages based on gestures are the Sign Languages in their American (ASL) or British (BSL) versions, among others, which are directed at the expression of thought using the hands and their positions [7].

The identification of a gesture by a computer requires, firstly, the capture of the data that represent the gesture, and later the processing of these data in order to interpret and identify the gesture [8]. In this sense, there has been extensive research on capturing, processing and interpreting gestures [9], [10], [11], whether performed with the hands, eyes, face or even the whole body. In turn, new innovative devices are launched in the market to support the capture, processing and identification of gestures [12], [13], [14]. Hence, gesture-based interfaces can become the most direct way of human-machine interaction [15].

Support for a natural user interface for gesture recognition (NUI-G) is not commonly added to home devices because it increases the complexity of the system. However, users require simpler and easier mechanisms that improve their experience in controlling home devices. For this reason, some manufacturers such as Samsung have built gestural recognition systems into TV sets, giving an alternative to the common remote controls. However, they have not defined a standard gesture language for the control of any device, and these systems are limited to specific models of TV sets, with restrictions in distance and location. Similarly, Microsoft's Kinect performs gesture recognition using the whole body and has improved the user experience in many applications [12]. The open problem is how to control different devices using the same gesture recognition system.

Some other related works explore the use of gestures for controlling consumer devices, but they focus on gestures performed by moving the hand that holds the mobile device [16] or by touching the smartphone screen with one or more fingers [17], [18]. The natural gestures used in this work are more intuitive, because they can be carried out without any physical restrictions, and hence our set of gestures can be easily memorized [19]. The proposed system could also be considered for projects studying the use of devices such as remote controls for computers, home and office [20].

In this paper, we propose a new Hand Gesture Control System (HGCS) able to control several home devices with similar behavior using the same NUI-G. The HGCS allows us to: 1) transparently manage autonomous home devices in a unified way; 2) define a simple gestural language to express actions on home devices; 3) maintain a gestural library that can be configured and expanded by the user; and 4) provide a practical application to test the reliability and robustness of the hand gestural interface and to determine the performance and responsiveness of the system.

This paper is structured in the following sections. Section II describes the system architecture and its components. Details of these components are presented in Section III. Section IV shows how the gestures are selected. Section V describes a proposed prototype with the gestural-based interface. Section VI includes the results, and finally, Section VII exposes the conclusions and future work.
II. SYSTEM ARCHITECTURE

The Hand Gesture Control System (HGCS) proposed in this work is structured into a three-tier architecture in which each tier is a layer responsible for a specific logical functionality, located on a separate physical computing device. All tiers are interconnected using a client-server scheme. Figure 1 shows the system architecture with the three tiers: the Hand Gesture Device (HGD), the Hand Gesture Controller (HGC) and finally the set of Home Devices (HD).

Fig. 1. Diagram of the System Architecture.

The HGD is the gestural interface tier, handled by a device capable of capturing the signals that conform a possible gesture at a rate fast enough to generate valid commands for home devices.

The HGC is the gestural processing and identification tier, managed by an embedded device. It is responsible for processing the signals obtained from the HGD. After applying a gesture recognition process, the HGC identifies a valid gesture by comparing it with a gesture library. Once the gesture is identified, the corresponding command is associated, and the HGC prepares the set of commands to be executed by the HD devices.

The HD is the home device tier and includes the set of home devices (lighting, ventilation, television, etc.) that receive and execute commands from the HGC.

The execution of commands on a home device can be performed directly by the HGC through a wireless communication link (WiFi, Bluetooth, IR). Depending on the wireless support of the home device, the HGC can send the request to the HD directly using the proprietary protocol of the home device. Conversely, for connectionless home devices, an Arduino microcontroller with a wireless link (Bluetooth is preferable) is attached to the HD directly or through a wired network, providing the user a customized control of the HD.

The control gestures to be designed must be comfortable [21] and of low complexity [8], as well as reliable.
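To make the tier responsibilities concrete, the following minimal sketch models the client-server flow of Fig. 1 in Java: the HGD tier produces gesture events, the HGC tier resolves them against a registry of home devices, and each HD executes the forwarded command. All type names, method names and gesture labels here are illustrative assumptions, not part of the paper's implementation.

import java.util.HashMap;
import java.util.Map;

// Hypothetical three-tier model: the HGD emits gesture events, the HGC
// recognizes them and dispatches commands, and the HDs execute them.
interface HomeDevice {                            // HD tier
    void execute(String command);
}

class HandGestureController {                     // HGC tier
    private final Map<String, HomeDevice> devices = new HashMap<>();

    void register(String name, HomeDevice device) {
        devices.put(name, device);
    }

    // Called by the HGD tier for every recognized gesture label.
    void onGesture(String deviceName, String gesture) {
        HomeDevice target = devices.get(deviceName);
        if (target != null) {
            target.execute(gesture);              // forward as a command
        }
    }
}

public class ArchitectureDemo {
    public static void main(String[] args) {
        HandGestureController hgc = new HandGestureController();
        hgc.register("lamp", cmd -> System.out.println("lamp <- " + cmd));
        hgc.onGesture("lamp", "ON");              // simulated HGD event
    }
}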
III. SYSTEM IMPLEMENTATION

A. Hand Gesture Device (HGD).

We have selected the MYO® armband [14] as the hand gesture device for the HGCS for several reasons. Firstly, it belongs to a new generation of active NUI-G interfaces that detect hand gestures by measuring muscle activity directly. In addition, the device is able to read hand gestures at a high frequency of about 200 Hz, giving detection rates high enough to be responsive to the user. Finally, hand gesture data can be sent to other systems over Bluetooth links, the preferred option in the design of home and consumer devices (Fig. 2).

Fig. 2. A schematic description of a MYO armband (HGD).

The armband is equipped with a low-consumption ARM Cortex-M4 microprocessor. It is composed of a set of eight electromyography (EMG) sensors positioned radially around the circumference of the band and an inertial measurement unit (IMU) sensor (see Figure 2). The EMG sensors are able to measure the electric signals originating from muscles at ultra-low signal resolution; furthermore, their configuration on parallel bars ensures measurement consistency. The IMU sensor, in turn, provides three-axis gyroscope, three-axis accelerometer and three-axis magnetometer measurements, obtaining the reference position using Euler angles and the position coordinates by quaternions, with the angles calculated in radians [14].
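As a rough illustration of how 200 Hz, eight-channel EMG frames might be buffered and summarized before gesture classification, the sketch below computes a mean-absolute-value feature per channel over a short sliding window. The callback name, window length and feature choice are assumptions for illustration; the actual MYO SDK interface is not described in the paper.

// Illustrative only: the real MYO SDK interface differs. Assumes the
// armband delivers 8-channel EMG frames at roughly 200 Hz.
public class EmgWindow {
    private static final int CHANNELS = 8;
    private static final int WINDOW = 40;         // 40 frames ~ 200 ms at 200 Hz
    private final int[][] frames = new int[WINDOW][CHANNELS];
    private int count = 0;

    // Called once per EMG frame (one sample per electrode).
    public void onEmgFrame(int[] frame) {
        frames[count % WINDOW] = frame.clone();
        count++;
    }

    // Mean absolute value per channel, a common EMG activation feature.
    public double[] meanAbsoluteValue() {
        double[] mav = new double[CHANNELS];
        int n = Math.min(count, WINDOW);
        for (int i = 0; i < n; i++)
            for (int c = 0; c < CHANNELS; c++)
                mav[c] += Math.abs(frames[i][c]);
        for (int c = 0; c < CHANNELS; c++)
            mav[c] /= Math.max(n, 1);
        return mav;
    }
}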
B. Hand Gesture Controller (HGC).

The selected HGC is portable and can be placed discreetly in the home. In the present work we have chosen two types of possible HGCs. The first one is an embedded system based on a Raspberry Pi 2 with a Broadcom BCM2836 SoC (system-on-a-chip) that includes a 900 MHz quad-core ARM Cortex-A7, a Broadcom VideoCore IV GPU at 250 MHz and 1 GB of RAM. This device is bootable and automatically loads the necessary software.

The second one is a smartphone with a Qualcomm MSM8926 SoC that includes a 1.2 GHz quad-core ARM Cortex-A7, an Adreno 305 GPU at 450 MHz and 1 GB of RAM. In addition, the smartphone includes 8 GB of flash memory, Bluetooth 4.0 support, and Android 4.4 as the operating system. In this case it is only necessary to install an application to activate the HGC, which is executed as an Android Service in the background. This device is connected to the HGD through a Bluetooth protocol.
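Since the smartphone HGC runs as a background Android Service, a minimal skeleton of such a service could look as follows. The class name is hypothetical, and the Bluetooth link to the armband is deliberately omitted, since the paper describes it only as a proprietary protocol.

import android.app.Service;
import android.content.Intent;
import android.os.IBinder;

// Minimal skeleton of a background HGC service, as described above.
public class HgcService extends Service {
    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // Here the service would open the Bluetooth link to the HGD and
        // start forwarding recognized gestures to the home devices.
        return START_STICKY;   // keep the service running in the background
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;           // started service, no binding needed
    }
}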
C. Home Devices (HD).

The HD tier is responsible for interacting with the home environment and executes all commands received from the HGC. Such devices are specifically smart home automation systems of different types, such as smoke detectors, fire detectors, dimmers, regulators, air conditioning units, door opening-closing devices, and so on. In short, they are devices that can be controlled in a simple way based on the following set of commands:

• On / Off. An HD device is activated or deactivated when it receives an ON or OFF command. Activation implies running the HD services. This applies to all devices in the home.

• Up / Down. This refers to the degree of intensity with which a specific functionality of an HD device is used. In a HiFi home device, the action command would increase or reduce the volume, while in an HVAC (heating, ventilation, and air conditioning) system, the action command would raise or lower the temperature setpoint to lower or raise the flow of air that must be cooled.

• Change mode / channel. This allows switching between the operating modes of an HD. In an HVAC system, it can change from hot to cold; in a TV set it can change from one frequency to another. The change from one mode (+) to another (-) is always performed sequentially.

According to the above, the same commands can be used to control various HD devices, which increases system interoperability, as illustrated in the sketch below.
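A possible Java encoding of this shared command set is sketched next; the enum and method names are assumptions made for illustration, not taken from the paper's prototype.

// Illustrative encoding of the shared command set for all HD devices.
enum HomeCommand {
    ON, OFF,             // activate or deactivate the device
    UP, DOWN,            // adjust intensity (volume, temperature, ...)
    MODE_NEXT, MODE_PREV // switch operating mode or channel sequentially
}

public class CommandDemo {
    // Because every HD accepts the same commands, one dispatcher suffices.
    static String describe(HomeCommand c) {
        switch (c) {
            case UP:   return "increase intensity";
            case DOWN: return "decrease intensity";
            default:   return c.name();
        }
    }

    public static void main(String[] args) {
        System.out.println(describe(HomeCommand.UP));
    }
}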
IV. IDENTIFICATION AND SELECTION OF GESTURES

The processing, identification and selection of gestures are important issues for the correct operation of the gestural interface. The first decision was determining the most appropriate gestures for controlling the system. Starting from a gesture library, we carried out a study with real users to find which gestures are easy to reproduce. The study was composed of two parts: first, the participants placed the armband on the forearm and had to execute all the gestures of the library at least five times; second, the participants filled in a report giving their impressions of each gesture according to its comfort level (zero when the gesture is very comfortable and five when it is painful), following Rempel et al. [21].

The sample used for the study is composed of 17 participants, all young adults (8 women and 9 men) aged between 18 and 30 years (mean 23.5, standard deviation 3.66). All of them have higher education and own smartphones or tablets. Eleven participants had experience with console games and eight knew mobile applications using gestures.

Figure 3 shows the results obtained in the study. The participants considered open hand (66%) a very comfortable gesture, while comfortable gestures are close hand (78%), right turn (63%), left turn (68%), palm inwards (52%) and palm out (47%). The other gestures of the library are more difficult to reproduce.

Fig. 3. Comfort levels that the users indicate for each gesture.

The most comfortable gestures (those with the highest percentages) are selected for the most frequent commands; in our case, open hand is the start command (reserved) and close hand is the stop command. The next most comfortable gestures are assigned to commands that can be repeated continuously, grouped into ACTION commands. Conversely, the less comfortable gestures are used to identify the home device and are grouped into DEVICE commands.

Each gesture in the library is associated with a command that the HGC can potentially execute. The command alone is not sufficient to determine the execution of a specific functionality, because the system has to manage different home devices. We therefore define the concept of activity, which allows complex actions to be expressed. Formally, an activity is a command triplet that can be represented as (START, DEVICE, ACTION), where:

- START represents the command that initiates the construction of an activity. If the user does not continue to reproduce other gestures, the construction of the activity is cancelled.
- DEVICE identifies the specific HD on which we are going to execute an ACTION command.
- ACTION comprises the gestures that run a specific action on a specific HD according to Table I. A Chinese nomenclature for counting to ten with one hand has been used, although more combinations can be generated.

The gestures have been categorized by their level of comfort in Table I.

TABLE I. USED GESTURES AND PROPERTIES

For example, the triplet (OPEN-HAND, TWO, PALM-OUT), when applied to a TV set, can indicate that the user changes the channel to the next one.

The activity (or command triplet) is implemented by a hierarchical structure that we call the Tree Gesture Control (TGC) (Fig. 4). The TGC can be navigated to complete the activity. The use of each gesture depends on the system configuration performed by the user. Once an activity is built, the HGC sends a command to the specific HD to execute the corresponding ACTION command. However, a special unary activity, close hand, can be used to stop the execution of an activity immediately. For instance, a command for opening or closing a door is carried out as long as a stop command is not detected. A state-machine sketch of this activity construction is given below.
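The (START, DEVICE, ACTION) construction can be read as a small state machine that advances one step per recognized gesture. The following sketch, with illustrative gesture labels, shows how an activity could be accumulated gesture by gesture and revoked at any point by the STOP gesture; it is a simplified reading of the TGC, not the paper's actual implementation.

// Sketch of building an activity as the (START, DEVICE, ACTION) triplet.
public class ActivityBuilder {
    enum State { IDLE, AWAIT_DEVICE, AWAIT_ACTION }
    private State state = State.IDLE;
    private String device;

    // Returns a completed "device:action" activity, or null while pending.
    public String onGesture(String gesture) {
        if (gesture.equals("CLOSE_HAND")) {        // STOP revokes anything
            state = State.IDLE;
            return null;
        }
        switch (state) {
            case IDLE:
                if (gesture.equals("OPEN_HAND"))   // START opens an activity
                    state = State.AWAIT_DEVICE;
                return null;                       // all other gestures ignored
            case AWAIT_DEVICE:
                device = gesture;                  // e.g. "TWO" selects a device
                state = State.AWAIT_ACTION;
                return null;
            default:                               // AWAIT_ACTION
                state = State.IDLE;
                return device + ":" + gesture;     // e.g. "TWO:PALM_OUT"
        }
    }

    public static void main(String[] args) {
        ActivityBuilder b = new ActivityBuilder();
        b.onGesture("OPEN_HAND");
        b.onGesture("TWO");
        System.out.println(b.onGesture("PALM_OUT")); // prints TWO:PALM_OUT
    }
}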

V. CASE STUDY

To test the applicability of the HGCS, a prototype was designed using the second type of HGC, based on a smartphone. We implemented an Android application that includes a library for accessing the MYO driver, since the armband is connected to the smartphone using a proprietary data protocol over Bluetooth. Table II shows the activities accepted by each HD device of our prototype: telephone control, TV control and door control.

TABLE II. DESCRIPTION OF ACTIONS GESTURAL CONTROL

The Android application (HGC) receives the data read by the HGD device; the data are then processed to identify a gesture by comparing it with the gestures in the gesture library. Next, the activity is built using the TGC (Fig. 4) and executed on the selected device.

Fig. 4. Gesture control using the Tree Gesture Control (TGC).

The requirement to build an activity prior to the execution of an action on an HD device has important consequences. False positives are avoided because the open hand command is the only gesture that can start an activity; while the START command is not detected, the HGC ignores the readings of the HGD device. In case an activity is not correctly built, it can be revoked by means of the STOP command.

The user knows that an activity has executed correctly when the home device runs the corresponding action. In this prototype, a light-vibratory stimulus mechanism was added to the system to confirm to the user the correct capture of the gestures. We can associate a vibratory pulse of the HGD or a light signal in the HGC with each gesture; the number of vibrations or light blinks depends on the gesture. The selected control gestures meet the definition of Gestures of Effective Control, which reflects the system's ability to read and identify the gesture and the complexity involved in performing it.
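As an illustration of this confirmation mechanism, the sketch below maps each gesture to a distinct pulse count. The specific counts and gesture names are assumptions; the paper states only that the number of vibrations or blinks depends on the gesture.

import java.util.Map;

// Sketch of the light/vibration confirmation: each gesture maps to a
// distinct number of pulses or blinks.
public class FeedbackDemo {
    static final Map<String, Integer> PULSES = Map.of(
            "OPEN_HAND", 1, "CLOSE_HAND", 2, "PALM_OUT", 3);

    static void confirm(String gesture) {
        int n = PULSES.getOrDefault(gesture, 0);
        for (int i = 0; i < n; i++)
            System.out.println("pulse"); // stand-in for a vibration or blink
    }

    public static void main(String[] args) {
        confirm("CLOSE_HAND"); // two pulses confirm the STOP gesture
    }
}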
VI. RESULTS AND DISCUSSION

The tests of the HGCS were performed with ten participants. However, the same gesture was not always recognized immediately for some participants. This was due to the fact that the armband has to be placed correctly on the forearm to obtain an accurate reading, and the default calibration was used for all readings.

During testing, five recognition attempts were made for each gesture to determine the efficacy of the HGD device, and the following percentages were obtained: a) open hand (96%), b) palm inwards (76%), c) palm out (68%), d) close hand (94%), e) right turn (96%), and f) left turn (97%). These percentages indicate that gestures a, d, e and f have a reliable reading (over 90%) when the device is placed directly on the user's forearm, while for gestures b and c it was necessary to perform a calibration or relocate the device. In some cases it was also necessary to make these gestures with greater emphasis to achieve recognition.
VII. CONCLUSIONS AND FUTURE WORK

In this paper, we have presented the design and implementation of a home automation system controlled by hand gestures. Experimental results showed the feasibility of using gestures as a remote control interface for home automation devices. The set of gestures used in the gestural interface was selected after a thorough comfort study with a group of participants. The gestures may also be easily transferable to other application domains due to their universal character. The use of the HGCS was very intuitive, since a brief explanation was enough for users to learn it, and the system proved functional and adaptable. The construction of activities by the controller (HGC) using the hierarchical TGC supports the execution of actions on multiple home devices. It seems to be a significant step toward future explorations in this field of research. In conclusion, the gestural interface allows complete freedom of movement and breaks the barrier of touch-based user interfaces.

As future work, we want to enlarge the gesture library to customize the system according to a user profile: comfort is not the same for every user, and the system can accommodate different usage. We also want to extend the application to industrial control, where the gestural interface must be more precise.
REFERENCES

[1] V. Miori and D. Russo, "Domotic evolution towards the IoT," in Proc. 2014 IEEE 28th Int. Conf. Advanced Information Networking and Applications Workshops (WAINA 2014), pp. 809–814, 2014.
[2] N. Sriskanthan, F. Tan, and A. Karande, "Bluetooth based home automation system," Microprocess. Microsyst., vol. 26, no. 6, pp. 281–289, 2002.
[3] S. Nupur, W. Payal, and P. Kajal, "Bluetooth based device automation system using cellphone," Int. J. Comput. Appl. Inf. Technol., vol. 7, no. 1, pp. 136–141, 2014.
[4] A. Elshafee and K. A. Hamed, "Design and implementation of a WiFi based home automation system," World Acad. Sci. Eng. Technol., vol. 6, no. 8, pp. 1856–1862, 2012.
[5] P. Bhagyalakshmi and N. L. Aravinda, "Raspberry Pi and WiFi based home automation," Int. J. Eng. Res. Appl., pp. 57–60, Jan. 2015.
[6] V. I. Pavlovic, R. Sharma, and T. S. Huang, "Visual interpretation of hand gestures for human-computer interaction: a review," IEEE Trans. Pattern Anal. Mach. Intell., vol. 19, no. 7, pp. 677–695, 1997.
[7] C. Oz and M. C. Leu, "American Sign Language word recognition with a sensory glove using artificial neural networks," Eng. Appl. Artif. Intell., vol. 24, pp. 1204–1213, 2011.
[8] S. Mitra and T. Acharya, "Gesture recognition: a survey," IEEE Trans. Syst., Man, Cybern. C, Appl. Rev., vol. 37, no. 3, pp. 311–324, 2007.
[9] T. Zimmerman, J. Lanier, and C. Blanchard, "A hand gesture interface device," ACM SIGCHI Bull., vol. 17, pp. 189–192, 1987.
[10] H. Eglowstein, "Reach out and touch your data: three input devices, ranging from your hand to your computer," BYTE, vol. 15, pp. 283–290, 1990.
[11] H. Setiawan and I. Setyawan, "Hand gesture recognition using optimized neural networks," IEEE, Salatiga, Indonesia, 2013.
[12] A. Sanna, F. Lamberti, G. Paravati, and F. Manuri, "A Kinect-based natural interface for quadrotor control," Entertain. Comput., vol. 4, no. 3, pp. 179–186, Aug. 2013.
[13] D. Plemmons and P. Mandel, "Introduction to motion control," Leap Motion Developer Portal, 2014. [Online]. Available: https://developer.leapmotion.com/articles/intro-to-motion-control. [Accessed: 15-Feb-2015].
[14] Thalmic Labs, "MYO," Gesture Control Armband, 2014. [Online]. Available: https://www.thalmic.com/en/myo/. [Accessed: 24-Feb-2015].
[15] S. Yang, P. Premaratne, and P. Vial, "Hand gesture recognition: an overview," Proceedings IEEE, pp. 63–69, 2013.
[16] C. Kühnel, T. Westermann, F. Hemmert, S. Kratz, A. Müller, and S. Möller, "I'm home: defining and evaluating a gesture set for smart-home control," Int. J. Hum. Comput. Stud., vol. 69, no. 11, pp. 693–704, 2011.
[17] S. Kim, S. Park, and J. Hong, "GUI screen-sharing smart remote control for smart TV user interface," in Proc. Int. Conf. ICT Convergence, pp. 711–713, 2013.
[18] B. Poppinga, A. S. Shirazi, N. Henze, W. Heuten, and S. Boll, "Understanding shortcut gestures on mobile touch devices," in Proc. MobileHCI 2014, 2014.
[19] M. Nielsen, M. Störring, T. B. Moeslund, and E. Granum, "A procedure for developing intuitive and ergonomic gesture interfaces for HCI," in Gesture-Based Communication in Human-Computer Interaction, pp. 409–420, 2004.
[20] B. A. Myers, "Using handhelds for wireless remote control of PCs and appliances," Interact. Comput., vol. 17, no. 3, pp. 251–264, 2005.
[21] D. Rempel, M. J. Camilleri, and D. L. Lee, "The design of hand gestures for human-computer interaction: lessons from sign language interpreters," Int. J. Hum. Comput. Stud., vol. 72, no. 10–11, pp. 728–735, Oct. 2014.
