
TECHNICAL ANSWERS TO REAL WORLD PROBLEMS

SIGN LANGUAGE DETECTION SYSTEM USING ARDUINO UNO MICROCONTROLLER

Submitted to: Dr. Nithish Kumar V

Submitted By:
Chinmayee (reg no: 19BEC0311)
Adithya Neelakantan (reg no: 19BEC0647)
Akshay Jaithalia (reg no: 19BEC0704)
Shanmathi (reg no: 19BEC0761)
Aditya Krishna Nayak (reg no: 19BEC0781)
Abstract:
People with hearing and speech disabilities have lost a major portion of their voice to their disability. American Sign Language (ASL) is one way of giving them that voice back, and we aim to aid in that process. We intend to make a glove fitted with sensors that can sense the wearer's hand motions and output the letters being signed; these letters then form meaningful words, which are displayed on a screen. With this work, we intend to take a basic step towards bridging this communication gap using Sign Language Recognition.
We aim to recognize the basic symbols that represent the 26 letters of the alphabet, as defined in the American Sign Language script, and display them with the help of a glove. We use an accelerometer to measure the tilt of the palm and five flex sensors, one for each finger of the glove. These sensors measure the bend of the fingers and thumb, and based on the bend-angle values the Arduino Uno microcontroller determines which set of values represents each symbol and sends the corresponding output, which is displayed on the Arduino IDE serial monitor.

Fig1. American Sign Language


Problem Statement:
Sign Language Recognition is a major breakthrough for helping deaf-mute people and has been researched for many years. Unfortunately, every line of research has its own limitations, and this method of gesture communication still cannot be used commercially. Some of the research has succeeded in recognizing sign language, but the resulting systems are too expensive to be commercialized.
Knowledge of sign language is still very limited among the majority of the population, so it is often a challenge for deaf and mute communities to convey their messages. There is a need for a method or an application that can recognize sign language gestures so that communication is possible even with someone who has not learned American Sign Language.
While sign language is very important to deaf-mute people, both for communicating with abled people and among themselves, it still receives little attention from the general public. We tend to ignore the importance of sign language unless we have loved ones or family members who are deaf-mute. One solution for communicating with deaf-mute people is to use the services of sign language interpreters, but such interpreters can be expensive. A cheap solution is required so that deaf-mute people can communicate without anything being lost in translation.
Our project aims to bridge this gap by making a glove that can help deaf and mute people communicate with others, since knowing sign language is not common to all. Moreover, this can be extended to creating automatic editors, where a person can write simply by using hand gestures.
As a solution, a sign language recognition system is implemented using a glove that links each letter of the alphabet to a certain hand gesture. A custom-designed glove is developed using an Arduino to recognize the hand gestures. The prototype utilizes flex sensors, which measure the amount of deflection, i.e., the degree of bending, and the corresponding output is displayed.
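
To make the sensing idea concrete, a minimal single-sensor reading sketch of the kind used throughout this project is shown below. It reads one flex sensor wired as a voltage divider on A0 and maps the raw analog value to an approximate bend angle; the 770-950 raw endpoints follow the mapping used in our initial-stage code (Appendix [2]), while the 0-90° output range is an illustrative assumption that would be calibrated per sensor.

// Minimal flex-sensor reading sketch (illustrative). Assumes the sensor is
// wired as a voltage divider on A0; RAW_STRAIGHT, RAW_BENT and the 0-90 degree
// range are assumptions to be calibrated for the actual sensor.
const int FLEX_PIN = A0;
const int RAW_STRAIGHT = 770;  // raw analogRead() value with the finger straight (assumed)
const int RAW_BENT = 950;      // raw analogRead() value at full bend (assumed)

void setup()
{
  Serial.begin(9600);
}

void loop()
{
  int raw = analogRead(FLEX_PIN);
  int angle = map(raw, RAW_STRAIGHT, RAW_BENT, 0, 90);  // approximate bend angle in degrees
  angle = constrain(angle, 0, 90);
  Serial.print("bend angle = ");
  Serial.println(angle);
  delay(100);
}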

Literature Review:

1. Halder, A. and Tayade, A., 2021. Real-time Vernacular Sign Language Recognition using MediaPipe and Machine Learning. ISSN 2582, p. 7421.
Methodology: MediaPipe-based hand tracking combined with a machine-learning classifier; the proposed algorithm enables dynamic character recognition.
Pros: Fast response time; can work in a resource-constrained environment.
Cons: A dataset needs to be generated; accuracy can be improved.

2. Saquib, N. and Rahman, A., 2020, May. Application of machine learning techniques for real-time sign language detection using wearable sensors. In Proceedings of the 11th ACM Multimedia Systems Conference (pp. 178-189).
Methodology: Detects fingerspelling in American Sign Language and Bengali Sign Language using gloves with sensors that capture both static and dynamic symbols accurately. The constructed glove collects a significant amount of data, which is fed to an ML model trained to classify the captured gestures.
Pros: Generates a dataset containing 100 data points for each letter; the user has greater mobility compared to digital image processing; a glove with only the required sensors is cheaper than the DIP method.
Cons: As the number of sensors increases, the cost of the glove also increases; the specific sensors used in commercial gloves are not easily accessible.

3. Shukor, A.Z., Miskon, M.F., Jamaluddin, M.H., bin Ali, F., Asyraf, M.F. and bin Bahar, M.B., 2015. A new data glove approach for Malaysian sign language detection. Procedia Computer Science, 76, pp. 60-67.
Methodology: Utilizes a glove that acquires data on the bending of the fingers and the 3D orientation of the hand. To improve accuracy, an accelerometer is used to detect changes in acceleration.
Pros: Suitable for perceiving fingerspelling and sign motions; acquiring data is simple, requires little computational power, and continuous interpretation is easier to achieve.
Cons: Hand gloves can get quite costly, and accuracy increases with the number of sensors; if the number of sensors is reduced, important information about the hand posture is lost.

4. Kushwah, M.S., Sharma, M., Jain, K. and Chopra, A., 2017. Sign language interpretation using pseudo gloves. In Proceedings of the International Conference on Intelligent Communication, Control and Devices.
Methodology: Flex sensors connected to an ATmega16 are used to detect the sign being shown. The information is processed and then sent to a Bluetooth device using a Bluetooth module; the values are also shown on an LCD.
Pros: Fast detection, as the exact values given by the flex sensors are converted directly into the values predefined by the user.
Cons: Readings are matched to the right data set using a binary search, which can lead to inaccurate data; in the case of small errors, no output is produced.

5. Das, A., Yadav, L., Singhal, M., Sachan, R., Goyal, H., Taparia, K., Gulati, R., Singh, A. and Trivedi, G., 2016, December. Smart glove for Sign Language Communications. In 2016 International Conference on Accessibility to Digital World (ICADW) (pp. 27-31). IEEE.
Methodology: Uses flex sensors and gyroscopes to detect the movement of the fingers and the wrist. The data is processed by an Intel Galileo Gen 2 microcontroller and shown on an LCD screen.
Pros: The high processing rate of the microcontroller makes the process faster.
Cons: High cost of the Galileo Gen 2; an accuracy of only 85% is achieved.

Proposed Methodology, Results and Discussion:

Firstly, we tried building the full glove in hardware, but that was out of the question, given that some of the sensors we received were faulty and we did not have a reliable supply of materials; we therefore turned to the next best thing: simulation.
We tried executing our project code on Proteus, but we were not able to get the correct modules required to make our flex sensors work properly, even though our code was correct. This led us to use Tinkercad for the execution, which took shape in two stages. We first made a circuit for one finger, just to see whether the current variations were visible and whether the project was feasible to execute, given the time crunch.
We were able to complete the project successfully on Tinkercad, an online simulation platform on which realistic circuits can be built and usable, economical applications can be designed using concepts such as IoT and microcontrollers. We used it to implement our glove prototype, building on our earlier one-finger execution. The five fingers resemble a typical right hand, with the flex sensor for the thumb placed a little lower.

Our initial approach, using Proteus, which had some limitations


The components involved in the above approach

1) Accelerometer
This was not an inbuilt library and had to be designed within Proteus software.

2) Flex Sensor
This component was also not available within Proteus, and the library had to be imported from EngineeringProjects.com. However, the imported sensor model had an error and was therefore not functional within the simulation.

3) Arduino Uno R3
This component was imported from EngineeringProjects.com, and worked well with the
built-in LCD Display library.

4) LCD display
This was a built-in library within Proteus and had no issues.
Final Approach, Using Tinkercad

The execution of the Tinkercad simulation had 2 stages, as mentioned below:

1. Initial Execution:
We first tried executing our initial code [2] with one flex sensor, as we wanted to make sure that our circuitry made sense and was sending authentic current values for each finger.

As we can see, for a specific bend angle of the flex sensor (in this case, 96°), a specific amount of current is shown (1.80 mA). This tells us that our circuitry makes sense and that we are on the right track.
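
The current value shown above follows from the series (voltage-divider) connection of the flex sensor: as its resistance changes with the bend, so does the current through the branch. The short sketch below computes that current from an analog reading; the supply voltage and fixed-resistor value are illustrative assumptions rather than the exact Tinkercad component values, so the numbers it prints will not match the 1.80 mA above exactly.

// Illustrative sketch of the divider relationship behind the one-finger test.
// VCC and R_FIXED are assumed values, not the exact Tinkercad components;
// the analog pin is assumed to measure the node across the fixed resistor.
const float VCC = 5.0;         // supply voltage in volts (assumed)
const float R_FIXED = 2200.0;  // series resistor in ohms (assumed)
const int FLEX_PIN = A0;

void setup()
{
  Serial.begin(9600);
}

void loop()
{
  int adc = analogRead(FLEX_PIN);                // 0..1023
  float vOut = adc * VCC / 1023.0;               // voltage across the fixed resistor
  float current_mA = (vOut / R_FIXED) * 1000.0;  // the same current flows through the flex sensor
  Serial.print("current (mA) = ");
  Serial.println(current_mA, 2);
  delay(500);
}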

2. Final Setup of the Glove:


The final setup for the execution of the project includes five flex sensors along with an accelerometer to sense the speed and angle of movements. The scope of this project can be further expanded by building a database of the current values and tilt angles that correspond to each letter; the reference for the letters is given in Fig1. A sketch of such a lookup table is given below.
Implementing the project on an online platform came in handy, as we were able to experiment to our heart's content and did not have to worry about faulty circuit parts or the whole issue of having to solder circuit elements.
Starting from the previous one-finger circuit, we built it further by adding four more flex sensors, one for each remaining finger. This completes the structural aspect of the project. The code for the final execution (Stage 2) is given in the Appendix [3].
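
The lookup table mentioned above could, for example, be organised as one row of per-finger ranges per letter, with the first row that matches all five readings being printed. The sketch below shows this idea; the two example rows and their ranges are placeholders loosely based on the 'A' and 'B' thresholds in Appendix [1], not calibrated values for a real glove.

// Table-driven matching sketch (illustrative). Each entry holds the minimum and
// maximum raw flex reading per finger for one letter; the ranges below are
// placeholders, not calibrated values.
struct SignEntry {
  char letter;
  int lo[5];  // minimum raw reading per finger
  int hi[5];  // maximum raw reading per finger
};

SignEntry signs[] = {
  {'A', {70, 77, 70, 70, 70}, {82, 95, 86, 86, 86}},  // placeholder ranges
  {'B', { 0,  0,  0,  0,  0}, {10, 10, 12, 12, 12}},  // placeholder ranges
};

const int flexPins[5] = {A0, A1, A2, A3, A4};

// Return the first letter whose ranges match all five current readings, or '?'.
char matchSign() {
  for (unsigned int s = 0; s < sizeof(signs) / sizeof(signs[0]); s++) {
    bool ok = true;
    for (int f = 0; f < 5; f++) {
      int v = analogRead(flexPins[f]);
      if (v < signs[s].lo[f] || v > signs[s].hi[f]) { ok = false; break; }
    }
    if (ok) return signs[s].letter;
  }
  return '?';
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.println(matchSign());
  delay(1000);
}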

Shortcomings: Because we built the project in an online simulation, we were not able to add another sensor on the wrist that would sense the movement of the wrist and fingers. For example, for both the letters R and P, we need either to move our fingers around each other (as in the case of R) or to move our wrist to a specific angle (as in the case of P) for the letter to be signed. The execution of the project for an example text (T-A-R-P) is as follows:
American Sign Language for T-A-R-P using our model
As we can see, our code was able to identify the movement of the flex sensors and send the corresponding current values. Using this glove in real life would be of great help, as the difference in current is unique to the letter being signed and can be used to display the letter in question, thereby helping differently abled people communicate with people who are not well versed in American Sign Language. The code for the Arduino Uno is given below in [3].
Conclusion
Sign language is a very important tool for the hearing impaired. This system will reduce the communication barrier by working as an automated translator, converting sign language into a form that hearing people can understand, with the flex sensors making this medium more reliable and helpful. Although awareness of sign language has increased over the past few years, it is not as prevalent as one would hope.

For such reasons, gloves like this will be extremely helpful, not only in providing a fluent mode of communication between hearing people and deaf or mute people but also in connecting with the hearing impaired. The flex sensors show different potential differences when bent, and we were able to successfully classify letters from ASL. This glove can also be used as a teaching tool at centres for teaching ASL; in such cases a teacher does not have to be around to correct the learner, and a student learning ASL will know which letter he or she is depicting. This system works as a translating device not only for mute people but also for biomedical applications for speech-impaired and paralyzed patients, as well as for intelligent home and industrial applications.

Future Scope and Work

Further integration of various services could help generate employment for deaf and mute people. This project can be improved further using machine learning and natural language processing. If extended, the controller could also provide home automation at one's fingertips and, if paired with a fitness sensor, help monitor the health of the individual. More sensors can be embedded to recognize full sign language with greater precision and accuracy. The system can also be designed to translate words from one language to another. It should be noted that, as the number of signs increases, more combinations of sensor ranges and values need to be added to the microcontroller.

In the future, an underlying framework would be useful for differentiating between more sets of alphabets and gestures. Further, this prototype can be made wireless to increase the range of communication and the area of application. A future plan is to use dynamic hand gesture recognition in video sequences for virtual reality, and to design a system capable of vocalizing gestures and movements that can be worn by paralyzed patients.
References

[1] Kiratey Patil, Gayatri Pendharkar, Prof. G. N. Gaikwad, "American Sign Language Detection", International Journal of Scientific and Research Publications, Volume 4, Issue 11, November 2014.
[2] Reinhard Gentner, Joseph Classen, "Development and evaluation of a low-cost sensor glove for assessment of human finger movements", Journal of Neuroscience Methods, Volume 178, Issue 1, 30 March 2009, Pages 138-147.
[3] Yasmeen Raushan, Abhishek Shirpurkar, Vrushabh Mudholkar, Shamal Walke, Tejas Makde, Pratik Wahane, "Sign Language Detection for Deaf and Dumb using Flex Sensors", Volume 04, Issue 03, March 2017.
[4] Mehdi, S.A., Khan, Y.N., "Sign language recognition using sensor gloves", International Journal of Scientific and Research Publications, Volume 4, Issue 11, November 2014, ISSN 2250-3153.
[5] Ahmad Zaki Shukor, Muhammad Fahmi Miskon, Muhammad Herman Jamaluddin, Fariz bin Ali Ibrahim, Mohd Fareed Asyraf, Mohd Bazli bin Bahar, "A New Data Glove Approach for Malaysian Sign Language Detection", Procedia Computer Science, Volume 76, 2015, Pages 60-67.
[6] Mahesh Kumar N. B., "Conversion of Sign Language into Text", International Journal of Applied Engineering Research, ISSN 0973-4562, Volume 13, Number 9 (2018), pp. 7154-7161.
[7] Prateek S.G., Jagadeesh J., Siddarth R., Smitha Y., P. G. Sunitha Hiremath, Neha Tarannum Pendari, "Dynamic Tool for American Sign Language Finger Spelling Interpreter", 2018 International Conference on Advances in Computing, Communication Control and Networking (ICACCCN).
[8] Huy B.D. Nguyen, Hung Ngoc Do, "Deep Learning for American Sign Language Fingerspelling Recognition System", 2019 26th International Conference on Telecommunications (ICT).
[9] Shivashankara S., Srinath S., "American Sign Language Recognition System: An Optimal Approach", International Journal of Image, Graphics and Signal Processing, 10(8), 2018.
Appendix

[1] Arduino Code imported into the Proteus File


// ------- PIN CONFIGURATION ----------------
// A0-A2 : flex sensors
// A4, A5 : X pin and Y pin of the accelerometer
#include <SoftwareSerial.h>

// variable initialization
int xpin = A4;
int ypin = A5;
int FLEX_PIN1 = A0;
int FLEX_PIN2 = A1;
int FLEX_PIN3 = A2;

void setup()
{
  Serial.begin(9600);
  pinMode(6, INPUT);  // digital pin 6 is read as an extra input (used to distinguish V from U)
}

void loop()
{
  // reading sensor values
  float flex1 = analogRead(FLEX_PIN1);
  float flex2 = analogRead(FLEX_PIN2);
  float flex3 = analogRead(FLEX_PIN3);
  float xadc = analogRead(xpin);
  float yadc = analogRead(ypin);

  // threshold conditions on the flex sensor (and accelerometer) values;
  // the matching letter is printed on the serial monitor
  if ((flex1 >= 70 && flex1 <= 82) && (flex2 >= 77 && flex2 <= 95) && (flex3 >= 70 && flex3 <= 86))
    Serial.print('A');
  if ((flex1 >= 0 && flex1 <= 10) && (flex2 >= 0 && flex2 <= 10) && (flex3 >= 0 && flex3 <= 12))
    Serial.print('B');
  if ((flex1 >= 40 && flex1 <= 72) && (flex2 >= 50 && flex2 <= 90) && (flex3 >= 51 && flex3 <= 75))
    Serial.print('C');
  if ((flex1 >= 50 && flex1 <= 72) && (flex2 >= 45 && flex2 <= 90) && (flex3 >= 35 && flex3 <= 75)
      && !((xadc >= 412 && xadc <= 418) && (yadc >= 340 && yadc <= 360)))
    Serial.print('D');
  if ((flex1 >= 68 && flex1 <= 88) && (flex2 >= 68 && flex2 <= 90) && (flex3 >= 50 && flex3 <= 80))
    Serial.print('E');
  if ((flex1 >= 0 && flex1 <= 10) && (flex2 >= 0 && flex2 <= 10) && (flex3 >= 0 && flex3 <= 10))
    Serial.print('F');
  if ((flex1 >= 75 && flex1 <= 90) && (flex2 >= 75 && flex2 <= 90) && (flex3 >= 65 && flex3 <= 90)
      && ((xadc >= 400 && xadc <= 420) && (yadc >= 340 && yadc <= 360)))
    Serial.print('G');
  if ((flex1 >= 70 && flex1 <= 85) && (flex2 >= 75 && flex2 <= 90) && (flex3 >= 0 && flex3 <= 10)
      && !((xadc >= 410 && xadc <= 420) && (yadc >= 368 && yadc <= 380)))
    Serial.print('H');
  if ((flex1 >= 0 && flex1 <= 10) && (flex2 >= 50 && flex2 <= 70) && (flex3 >= 50 && flex3 <= 70)
      && ((xadc >= 410 && xadc <= 420) && (yadc >= 330 && yadc <= 370)))
    Serial.print('I');
  if ((flex1 >= 0 && flex1 <= 10) && (flex2 >= 50 && flex2 <= 70) && (flex3 >= 50)
      && (!(xadc >= 410 && xadc <= 420) && (yadc >= 355 && yadc <= 370)))
    Serial.print('J');
  if ((flex1 >= 60 && flex1 <= 75) && (flex2 >= 60 && flex2 <= 85) && (flex3 >= 0)
      && ((xadc >= 404 && xadc <= 415) && (yadc >= 368 && yadc <= 380)))
    Serial.print('K');
  if ((flex1 >= 75 && flex1 <= 90) && (flex2 >= 75 && flex2 <= 90) && (flex3 >= 70 && flex3 <= 90)
      && ((xadc >= 390 && xadc <= 405) && (yadc >= 360 && yadc <= 380))
      && !((xadc >= 270 && xadc <= 300) && (yadc >= 360 && yadc <= 390)))
    Serial.print('L');
  if ((flex1 >= 40 && flex1 <= 61) && (flex2 >= 72 && flex2 <= 84) && (flex3 >= 45 && flex3 <= 65))
    Serial.print('M');
  if ((flex1 >= 54 && flex1 <= 70) && (flex2 >= 50 && flex2 <= 61) && (flex3 >= 48 && flex3 <= 66)
      && ((xadc >= 400 && xadc <= 435) && (yadc >= 350 && yadc <= 390)))
    Serial.print('N');
  if ((flex1 >= 68 && flex1 <= 88) && (flex2 >= 68 && flex2 <= 90) && (flex3 >= 50 && flex3 <= 80))
    Serial.print('O');
  if ((flex1 >= 60 && flex1 <= 75) && (flex2 >= 60 && flex2 <= 85) && (flex3 >= 0 && flex3 <= 10)
      && ((xadc >= 270 && xadc <= 290) && (yadc >= 360 && yadc <= 380)))
    Serial.print('P');
  if ((flex1 >= 75 && flex1 <= 90) && (flex2 >= 75 && flex2 <= 90) && (flex3 >= 65 && flex3 <= 90)
      && ((xadc >= 270 && xadc <= 300) && (yadc >= 360 && yadc <= 390)))
    Serial.print('Q');
  if ((flex1 >= 40 && flex1 <= 72) && (flex2 >= 45 && flex2 <= 90) && (flex3 >= 20 && flex3 <= 45))
    Serial.print('R');
  if ((flex1 >= 70 && flex1 <= 90) && (flex2 >= 80 && flex2 <= 90) && (flex3 >= 80 && flex3 <= 90))
    Serial.print('S');
  if ((flex1 >= 40 && flex1 <= 61) && (flex2 >= 72 && flex2 <= 84) && (flex3 >= 45 && flex3 <= 65))
    Serial.print('T');
  if ((flex1 >= 70 && flex1 <= 90) && (flex2 >= 80 && flex2 <= 90) && (flex3 >= 0 && flex3 <= 10))
    Serial.print('U');
  if ((flex1 >= 70 && flex1 <= 90) && (flex2 >= 80 && flex2 <= 90) && (flex3 >= 0 && flex3 <= 10)
      && digitalRead(6) == HIGH)
    Serial.print('V');
  if ((flex1 >= 70 && flex1 <= 90) && (flex2 >= 0 && flex2 <= 10) && (flex3 >= 0 && flex3 <= 10))
    Serial.print('W');
  if ((flex1 >= 50 && flex1 <= 72) && (flex2 >= 45 && flex2 <= 90) && (flex3 >= 35))
    Serial.print('X');
  if ((flex1 >= 0 && flex1 <= 10) && (flex2 >= 70 && flex2 <= 90) && (flex3 >= 60 && flex3 <= 80))
    Serial.print('Y');
  if ((flex1 >= 50 && flex1 <= 72) && (flex2 >= 45 && flex2 <= 90) && (flex3 >= 35 && flex3 <= 75)
      && ((xadc >= 412 && xadc <= 418) && (yadc >= 340 && yadc <= 360)))
    Serial.print('Z');

  delay(1000);
}
[2] Initial Execution Stage Code
#include <Servo.h>
Servo myServo;

#define flexPin A0

void setup()
{
  myServo.attach(3);
  Serial.begin(9600);
}

void loop()
{
  int flexValue;
  int servoPosition;

  // read the flex sensor and map the raw value to a servo angle
  flexValue = analogRead(flexPin);
  servoPosition = map(flexValue, 770, 950, 0, 180);
  servoPosition = constrain(servoPosition, 0, 180);
  myServo.write(servoPosition);

  Serial.print("sensor = ");
  Serial.println(flexValue);
  Serial.print("servo = ");
  Serial.println(servoPosition);

  delay(20);
}
[3] Executable Code for Arduino
#include <stdint.h>
#include <Servo.h>

// The minimum and maximum values of the flex sensors,
// determined by simply reading analog values when the flex sensor
// was at rest, and fully flexed.
#define FLEX_MIN 384
#define FLEX_MAX 783

// The minimum and maximum turning values of the servo.
#define SERVO_MIN 0
#define SERVO_MAX 180

// Macro used to get the number of elements in an array.
#define LENGTH(arr) (sizeof(arr) / sizeof(arr[0]))

uint8_t glove[5] = {A0, A1, A2, A3, A4}; // Glove input pins.
uint8_t robot[5] = {11, 10, 9, 6, 5};    // Servo output pins (PWM).

// Create 5 servo objects to represent our fingers.
Servo finger[5];

uint8_t pin;
int16_t pos; // The new position to write to the servo.

void setup() {
  Serial.begin(9600);

  for (pin = 0; pin < LENGTH(glove); pin++) {
    pinMode(glove[pin], INPUT_PULLUP);
  }

  for (pin = 0; pin < LENGTH(robot); pin++) {
    finger[pin].attach(robot[pin]);
  }
}

void loop() {
  for (pin = 0; pin < LENGTH(glove); pin++) {
    pos = analogRead(glove[pin]);
    pos = map(pos, FLEX_MIN, FLEX_MAX, SERVO_MIN, SERVO_MAX);

    Serial.print(pos); Serial.print(" ");

    // Write the new position of the current finger to the servo.
    finger[pin].write(pos);
  }

  Serial.println("");
  // Wait a little while to avoid unnecessary jitter.
  delay(20);
}
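
As a usage note, the FLEX_MIN / FLEX_MAX endpoints above are found by simply watching the raw readings at rest and at full bend, as the comment in the code says. A minimal helper for that calibration step might look like the sketch below; A0 is just the first glove input, and any of the five pins could be streamed the same way.

// Calibration helper (illustrative): stream the raw reading from one flex
// sensor so the at-rest and fully-flexed values can be noted down and used
// as FLEX_MIN / FLEX_MAX.
void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.println(analogRead(A0));  // note the value at rest and at full bend
  delay(200);
}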
