
IOT Hand Gestures Translator for Deaf and Mute People

1,2Norhazlin Khairudin, 1Amiratul Huda Ahmad Puad, 1Yusnira Husaini, 1Norhafizah Burham
1Faculty of Electrical Engineering
2Electronic Architecture and Applications Research Group
3Integrated Microelectronic System Application Research Group
Universiti Teknologi MARA, 40450 Shah Alam, Selangor
norhazlin380@uitm.edu.my

Abstract— Disabled people live in our society, and their disability may take the form of deafness or muteness. Statistics show that the number of hearing and speech disabilities is high compared with other disabilities, and people who are deaf and mute have difficulty communicating with the people around them. A hand gesture translator system has therefore been developed to translate sign language into text and voice on a mobile phone. The system consists of flex sensors and an accelerometer sensor fitted on a glove to detect the degree of bending of each finger and the movement of the wrist, respectively. A microcontroller obtains the signals from all the sensors and converts the hand gesture into text and voice through an app on the mobile phone.

Keywords—gestures; flex sensor; accelerometer sensor; microcontroller; Thunkable app

I. INTRODUCTION

Communication means sending, receiving, or exchanging information by speaking, writing, or through some other medium, and it is essential for interacting with other people. Deaf people have impaired hearing, while mute people are unable to speak. Deafness and muteness are usually caused by a condition present from birth, or they may result from an accident. The number of hearing and speech disabilities is high based on statistics released in 2016 [1]. Mute or deaf people can only communicate using hand gestures, known as sign language. Sign language is a nonverbal form of language that uses gestures and particular movements of the hands to convey thoughts [2]. It is a natural, full-fledged language with its own lexicon and grammar. Sign language is used worldwide, with one-handed and two-handed forms depending on the locality; several sign languages are spoken in the world, and each is at the core of its local deaf-mute culture. In addition, sign language can express words, numbers, and the alphabet. The development of communication technology helps deaf and mute people communicate with other people in their daily lives. To address this problem, a hand gesture translator for mute and deaf people has been developed. The flex sensors and the accelerometer sensor play an important role in this project, while the microcontroller acts as the brain of the system and controls the device. The glove is fitted with flex sensors and an accelerometer sensor to detect the angle of finger deflection and the wrist movement. The hand gesture data are recorded, decoded, and translated by the microcontroller into text and voice form via an app on the mobile phone [3].

People with hearing problems have trouble conversing in sign language with hearing people because sign language is used almost exclusively by the deaf and mute community, and awareness of their predicament in communication has grown. The hand gesture glove translator has therefore been introduced as an alternative way for them to communicate with others.

In previous research, in February 2017, Prof. Shyam D. Bawankar and Sprush S. Naredi reviewed the related work in "Review on Design and Development of Hand Talk for Deaf and Mute People in Haptic Environment". In that project, flex sensors capture sign language data from deaf and mute people, ATmega8 and ATmega16 microcontrollers control all operations, an APR 960 voice chip stores the voice output, and an LCD provides the display [4]. In June 2014, Priyanka R. Potdar and Dr. D. M. Yadav from the University of Pune, Maharashtra, India developed a project on an "Innovative Approach for Gesture to Voice Conversion". A smart glove is used as the input device: a normal cloth driving glove fitted with five flex sensors along the length of each finger and the thumb [2]. In May 2014, V. Padmanabhan and M. Sornalatha reported in the International Journal of Scientific & Engineering Research a project on a "Hand gesture recognition and voice conversion system for dumb people". The system is based on a motion sensor fixed on the hand, and every motion has a meaning kept in a template database that is fed into the microcontroller. For every action, the motion sensor is accelerated and sends a signal to the microcontroller, which matches the motion against the database and produces the speech signal [5]. In addition, in March 2018, Yuhui Zhu and Shuo Jiang completed a project on "Wrist-worn Hand Gesture Recognition Based on Barometric Pressure Sensing", which tests hand gestures comprising six wrist gestures, five single-finger flexions, and ten Chinese number gestures [6]. Finally, Sruthi Upendran designed an "American Sign Language Interpreter System for Deaf and Dumb Individuals", which translates static ASL alphabet hand gestures into textual output [7]. In this project, by contrast, we focus on developing an IoT system that uses flex sensors and an accelerometer sensor interfaced to a microcontroller to translate hand gestures into text and voice form. To validate the newly designed system, we use the words EAT, HELLO, PLEASE, SLEEP, DRINK, HUNGRY, GOODBYE, and THANK YOU.
II. METHODOLOGY

Software and hardware development are the two main parts involved in accomplishing this project.

A. Hardware development
1) Block diagram of the system

Fig. 1. Block diagram of the hand gesture translator

Figure 1 shows the block diagram of the system. The input of the system consists of five flex sensors, one fitted on each finger of the glove. A flex sensor is a variable resistor whose resistance increases as its body is deflected, and it is used to measure the deflection of the fingers during signing. The accelerometer sensor, on the other hand, is a device that measures proper acceleration, i.e., the change in velocity, and it is used to detect the movement of the hand during signing.

The signals received from the sensors are sent to the microcontroller and then, through the Bluetooth module, to the mobile phone. The data are interpreted in the microcontroller through the signal conditioning circuit (the analog-to-digital converter). All the flex sensors and the accelerometer sensor are assigned to analog pins of the microcontroller, and the data obtained from these pins must be translated and encoded. The translation process involves mapping the degree of finger deflection and hand movement to analog readings, as explained in the experimental works. The output is presented as text and voice on the mobile phone through the Thunkable app.
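For illustration only, the sketch below shows a minimal way to acquire the raw readings described in this stage, using the analog pin assignments given later in the schematic (A0–A4 for the flex sensors, A8–A10 for the accelerometer). The baud rate and sampling period are assumptions for this example and are not taken from the original firmware.

```cpp
// Minimal acquisition sketch (illustrative): reads the five flex sensors
// (A0-A4) and the three accelerometer axes (A10, A9, A8) and prints the
// raw 10-bit ADC values so that the mapping from bending/movement to
// analog readings can be characterized.

const int FLEX_PINS[5]  = {A0, A1, A2, A3, A4};   // little, ring, middle, index, thumb
const int ACCEL_PINS[3] = {A10, A9, A8};          // x, y, z axes

void setup() {
  Serial.begin(9600);            // USB serial for characterization (assumed rate)
}

void loop() {
  // Read each flex sensor through the Arduino Mega 10-bit ADC (0-1023)
  for (int i = 0; i < 5; i++) {
    Serial.print(analogRead(FLEX_PINS[i]));
    Serial.print('\t');
  }
  // Read the three accelerometer axes
  for (int i = 0; i < 3; i++) {
    Serial.print(analogRead(ACCEL_PINS[i]));
    Serial.print('\t');
  }
  Serial.println();
  delay(100);                    // sampling period (assumed)
}
```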
2) Flowchart of the system

Fig. 2. Flowchart of the hand gesture process

Figure 2 shows the flowchart of the hand gesture process. First, the device is set to "Active mode", meaning the system is switched on by connecting the power supply and Bluetooth so that it can sense the signals from the input sensors. The user then forms the hand gesture. The degree of bending and the movement are detected by the sensors fitted on the glove, and the data obtained are compared with the data stored in the microcontroller. If the received data match the stored data, they are sent to the mobile app through the Bluetooth interface already connected to the mobile phone. The Thunkable app then displays the text and generates the sound if the data match the programmed gesture. Figure 3 shows the hand gesture movements that have been programmed into the microcontroller.
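The matching step of this flowchart can be sketched as follows. This is not the exact firmware: it assumes the stored gestures are kept as per-finger ADC deltas (the example values are the EAT and HELLO deltas characterized later in the experimental results), the matching tolerance and baud rate are assumptions, the accelerometer check is omitted for brevity, and the Bluetooth module is assumed to be on Serial1 (TX18/RX19 of the Arduino Mega) as described in the schematic.

```cpp
// Illustrative matching sketch: compares measured flex-sensor deltas with
// stored gesture templates and, on a match, sends the word to the mobile
// app through the Bluetooth module on Serial1 (TX18/RX19).

const int FLEX_PINS[5] = {A0, A1, A2, A3, A4};
const int BASELINE[5]  = {775, 773, 782, 852, 836};   // "no gesture" ADC values

const int NUM_GESTURES = 2;                           // subset for illustration
const char* WORDS[NUM_GESTURES] = {"EAT", "HELLO"};
const int TEMPLATES[NUM_GESTURES][5] = {
  { 21,  51,  55,  14, -34},                          // EAT   (per-finger deltas)
  {-13, -15, -17, -31,  68}                           // HELLO
};
const int TOLERANCE = 15;                             // allowed deviation per finger (assumed)

void setup() {
  Serial1.begin(9600);                                // Bluetooth module (assumed rate)
}

void loop() {
  int delta[5];
  for (int i = 0; i < 5; i++) {
    delta[i] = analogRead(FLEX_PINS[i]) - BASELINE[i];
  }

  // Send the word only if every finger is within the tolerance of a template
  for (int g = 0; g < NUM_GESTURES; g++) {
    bool match = true;
    for (int f = 0; f < 5; f++) {
      if (abs(delta[f] - TEMPLATES[g][f]) > TOLERANCE) { match = false; break; }
    }
    if (match) {
      Serial1.println(WORDS[g]);   // Thunkable app displays the text and plays the sound
      delay(1000);                 // simple hold-off between repeated matches (assumed)
      break;
    }
  }
  delay(100);
}
```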
Fig. 3. The hand gesture movements

3) Schematic diagram of the system

Fig. 4. Schematic diagram of the system

Figure 4 shows the schematic circuit of the IoT hand gesture project. The flex sensors and the accelerometer sensor are the main inputs of the project and are used to read the deflection of the fingers and the movement of the hand. One side of each flex sensor is connected to the Vcc pin and the other side is connected to an Arduino analog pin (A0, A1, A2, A3, or A4); the same pin is also connected to a resistor whose other side is connected to the Arduino GND. The accelerometer is connected to the 5 V pin and GND, and its three axes are connected to pins A8, A9, and A10. The Arduino Mega acts as the microcontroller: it receives the input signals from the flex sensors and the accelerometer sensor and compares them with the stored data so that the gesture can be interpreted into text and voice form in the mobile app. The Bluetooth module is connected to the 3.3 V pin, GND, and TX18 and RX19 on the Arduino board, and it is the communication interface between the microcontroller and the mobile phone for transferring the data. Thunkable is the mobile app used to receive the data sent from the Bluetooth module.
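As a hedged illustration of the wiring just described (flex sensor between Vcc and the analog pin, fixed resistor between the analog pin and GND), the snippet below converts a raw ADC count into the junction voltage and an estimate of the flex-sensor resistance from the standard divider rearrangement for that topology. The fixed resistor value R_FIXED is not given in the paper and is only an assumption here; the actual measured values are those reported in the experimental results.

```cpp
// Illustrative conversion for the voltage divider described in the schematic.
// R_FIXED is an assumed value, not taken from the paper.

const float VCC     = 5.0;        // Arduino Mega supply voltage
const float ADC_MAX = 1023.0;     // 10-bit ADC full scale
const float R_FIXED = 10000.0;    // fixed divider resistor in ohms (assumed)

// Junction voltage seen by the analog pin for a given ADC count
float adcToVoltage(int adc) {
  return adc * VCC / ADC_MAX;
}

// Flex-sensor resistance for this topology: Vout = Vcc * Rfixed / (Rflex + Rfixed)
float flexResistance(int adc) {
  float vout = adcToVoltage(adc);
  if (vout <= 0.0) return -1.0;   // avoid division by zero at ADC = 0
  return R_FIXED * (VCC - vout) / vout;
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  int adc = analogRead(A0);       // little-finger flex sensor (Table II)
  Serial.print(adcToVoltage(adc), 2);
  Serial.print(" V\t");
  Serial.print(flexResistance(adc), 0);
  Serial.println(" ohm");
  delay(200);
}
```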
III. RESULTS AND DISCUSSIONS

A. Experimental Results
1) Flex Sensor

The flex sensor element is resistive carbon and acts as a variable resistor, combined with a voltage divider circuit and a buffer as shown in Figure 5. The sensor produces a change in resistance when its substrate is bent, and the resistance increases as the degree of deflection increases.

Fig. 5. The equivalent circuit of the flex sensor

Fig. 6. Various degrees of bending of the flex sensor

TABLE I. ADC VALUE, RESISTANCE AND VOLTAGE BASED ON THE DEGREE OF BENDING OF THE FLEX SENSOR

Degree of bending   ADC Value   Resistance (Ω)   Voltage (V)
0º                  0           8.41 k           3.72
10º                 10          8.72 k           3.76
20º                 18          8.82 k           3.82
30º                 30          9.01 k           3.90
40º                 39          9.25 k           3.95
50º                 52          9.39 k           3.98
60º                 56          9.49 k           4.02
70º                 71          9.57 k           4.08
80º                 77          9.66 k           4.13
90º                 92          9.72 k           4.20

TABLE II. FINGER ASSIGNED TO EACH FLEX SENSOR PIN ON THE ARDUINO

Flex Sensor   Type of finger
A0            Little finger
A1            Ring finger
A2            Middle finger
A3            Index finger
A4            Thumb
TABLE III. INITIAL ADC VALUE OF EACH FLEX SENSOR

Flex Sensor Pin   Degree   ADC Value
A0                0º       775
A1                0º       773
A2                0º       782
A3                0º       852
A4                0º       836

The various degrees of bending of the flex sensor are shown in Figure 6. Table I lists the ADC value, resistance, and output voltage for degrees of bending from 0º to 90º; the ADC value, resistance, and output voltage all change in proportion to the degree of deflection. Table II shows the microcontroller pin to which each flex sensor is connected and the corresponding finger, while Table III states the initial ADC value of every sensor when the substrate is at the 0º point.

Fig. 7. Hand gestures: a) No gesture, b) Eat, c) Hello, d) Please, e) Sleep, f) Drink, g) Thank You, h) Hungry, and i) Goodbye

Fig. 8. Flex sensor ADC readings for all tested words

TABLE IV. FLEX SENSOR RESULTS FOR ALL TESTED GESTURES

Pin   NO GESTURE   EAT   HELLO   PLEASE   SLEEP   DRINK   HUNGRY   GOODBYE   THANK YOU
A0    0            21    -13     3        -15     83      -10      66        23
A1    0            51    -15     2        -16     110     -7       108       32
A2    0            55    -17     9        -18     122     -8       134       42
A3    0            14    -31     3        -36     73      -14      74        20
A4    0            -34   68      -27      -46     69      21       12        17

From the results obtained, each flex sensor placed on a finger (little, ring, middle, index, and thumb) gives a different analog reading, and the reading changes with the degree of bending of the flex sensor. The zero analog reading of each flex sensor in the "no gesture" condition is an important initial setup, needed to avoid incorrect readings later. Figure 7 shows the sign language gestures used in this experiment. From the analysis, the ADC readings for the words THANK YOU, HELLO, and SLEEP become negative because the flex sensors capture fingers that are bent slightly backward, and the resistance change of a flex sensor depends on the degree of bending. The words HUNGRY, GOODBYE, and DRINK show positive ADC readings because the further each finger bends forward, the more the resistance output increases.
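The sign of these deltas can be interpreted directly in the firmware. The short helper below is only a sketch of that idea, intended to be dropped into the acquisition loop shown earlier: the baselines are the 0º ADC values from Table III, while the threshold is an arbitrary value assumed for illustration.

```cpp
// Illustrative interpretation of the signed flex-sensor deltas discussed above:
// positive deltas indicate a finger bent forward, negative deltas a finger
// bent slightly backward.

const int BASELINE[5] = {775, 773, 782, 852, 836};  // A0-A4 at 0 degrees (Table III)
const int THRESHOLD   = 10;                         // ignore small fluctuations (assumed)

// Returns -1 for a backward bend, 0 for roughly straight, +1 for a forward bend
int bendDirection(int fingerIndex, int adcReading) {
  int delta = adcReading - BASELINE[fingerIndex];
  if (delta >  THRESHOLD) return  1;
  if (delta < -THRESHOLD) return -1;
  return 0;
}
```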

2) Accelerometer Sensor

Fig. 9. Representation of the 3-axis ADXL335 accelerometer

The ADXL335 accelerometer sensor is a simple breakout board that allows quick evaluation of its performance. Since the hand gestures involve movement of the wrist and hand, these movements also need to be considered, and the accelerometer is used to identify them. The accelerometer provides three analog outputs, the x, y, and z axes, which are used to identify the roll, pitch, and yaw movements of the wrist. Table V shows the microcontroller pin assigned to each axis of the accelerometer.

TABLE V. AXIS PINS OF THE ACCELEROMETER SENSOR CONNECTED TO THE ARDUINO

Pin   Axis
A10   x
A9    y
A8    z
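For illustration, the sketch below reads the three ADXL335 outputs on the pins of Table V and converts them into approximate acceleration in g so that the dominant axes of a wrist movement can be observed, as in the readings that follow. The zero-g ADC offsets and the counts-per-g scale are assumptions and would need to be calibrated for the actual breakout board and supply voltage.

```cpp
// Illustrative conversion of the ADXL335 analog outputs into approximate g.
// ZERO_G and COUNTS_PER_G are assumed calibration values, not from the paper.

const int AXIS_PINS[3]   = {A10, A9, A8};     // x, y, z (Table V)
const int ZERO_G[3]      = {512, 512, 512};   // assumed zero-g ADC offsets
const float COUNTS_PER_G = 68.0;              // assumed scale (calibrate per board)

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Print x, y, z acceleration so the dominant movement axis of a gesture
  // can be identified, as discussed for the tested words below.
  for (int i = 0; i < 3; i++) {
    int raw = analogRead(AXIS_PINS[i]);
    float g = (raw - ZERO_G[i]) / COUNTS_PER_G;
    Serial.print(g, 2);
    Serial.print(i < 2 ? '\t' : '\n');
  }
  delay(100);
}
```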


Fig. 10. Accelerometer readings for the word "EAT"

Fig. 11. Accelerometer readings for the word "SLEEP"

Fig. 12. Accelerometer readings for the word "THANK YOU"

Fig. 13. Accelerometer readings for the word "HUNGRY"

Fig. 14. Accelerometer readings for the word "GOODBYE"

Fig. 15. Accelerometer readings for the word "HELLO"

Fig. 16. Accelerometer readings for the word "DRINK"

Fig. 17. Accelerometer readings for the word "PLEASE"

From the recorded accelerometer readings, Figure 10 shows that the gesture for EAT produces large movement only in the y axis compared with the x and z axes, while Figure 11 shows movement in the y and z axes for the word SLEEP. The word THANK YOU does not show any z-axis movement; the captured data show movement in the x and y axes, and the same axes move for the word HUNGRY, as shown in Figures 12 and 13. Besides that, the words GOODBYE, HELLO, DRINK, and PLEASE show little movement in the x, y, and z axes because no wrist movement is needed, as illustrated in Figures 14, 15, 16, and 17.

This project displays the text and voice form on a mobile phone as a medium for ordinary people to understand sign language. The flex sensors detect the deflection of each finger, and the resistance of a flex sensor is proportional to its ADC value as well as the output voltage. Five flex sensors are used to detect the degree of bending of the little, ring, middle, index, and thumb fingers. The accelerometer, with its three axes (x, y, and z), is used to capture the movement of the wrist and hand. All the sensors are fitted on the glove to detect the degree of finger bending and the wrist movement, respectively. There are some challenges in obtaining accurate readings from the sensitive flex sensors.

IV. CONCLUSION
The development of this hand gesture translator using flex sensors and an accelerometer has been successfully completed to help deaf and mute people. The flex sensors and the accelerometer fitted on the glove have been properly characterized, as their readings need to be displayed by the translator on the mobile application. In future work, the mobile application should be developed more extensively.
REFERENCES
[1] "Malaysian experience in supported employment and job coach service programme," 2016.
[2] P. R. Potdar, "Innovative approach for gesture to voice conversion: Review," vol. 3, no. 6, pp. 459–462, 2014.
[3] C. P. Fiel, C. C. Castro, N. Pitalúa-Díaz, and D. S. Jiménez, "Design of translator glove for deaf-mute alphabet," no. EEIC, pp. 485–488, 2013.
[4] P. S. D. Bawankar, S. S. Naredi, V. V. Tamhan, and S. S. Itware, "Review on design & development of hand talk for deaf and mute people in haptic environment," no. 1, pp. 1–4, 2017.
[5] V. Padmanabhan and M. Sornalatha, "Hand gesture recognition and voice conversion system for dumb people," Int. J. Sci. Eng. Res., vol. 5, no. 5, pp. 427–431, 2014.
[6] Y. Zhu, S. Jiang, and P. B. Shull, "Wrist-worn hand gesture recognition based on barometric pressure sensing," pp. 4–7, Mar. 2018.
[7] S. Upendran, "American sign language interpreter system for deaf and dumb individuals," pp. 1477–1481, 2014.
