Submitted By:
Chinmayee (reg no: 19BEC0311)
Adithya Neelakantan (reg no: 19BEC0647)
Akshay Jaithalia (reg no: 19BEC0704)
Shanmathi (reg no: 19BEC0761)
Aditya Krishna Nayak (reg no: 19BEC0781)
Abstract:
Hearing- and speech-impaired people have lost a major portion of their voice because of their disability. American Sign Language (ASL) is one way to help give them back that lost voice, and we aim to aid in that process. We intend to make a glove fitted with sensors capable of sensing one's hand motions and producing, as output, the letters being signed; these letters then form sensible words, which are displayed on a screen. With this work, we intend to take a basic step toward bridging this communication gap using sign language recognition. We aim to recognize the basic hand symbols that represent the 26 letters of the alphabet under the American Sign Language script and display them with the help of a glove. We use an accelerometer to measure the tilt of the palm and five flex sensors, one for each finger of the glove. The flex sensors measure the bend of the fingers and thumb, and from the bend-angle values the Arduino Uno microcontroller determines which set of values represents each symbol and transmits the corresponding output, which is displayed on the Arduino IDE serial monitor.
Literature Review:
Proposed Methodology
Results and Discussion:
Firstly, we tried building the full glove in hardware, but that was out of the question: some of the sensors we received were faulty, and we did not have a reliable supply of materials, so we turned to the next best thing, simulation.
We tried executing our project code in Proteus, but we were not able to get correct working modules for our flex sensors, even though our code was right. This led us to use Tinkercad for the execution, where the work took shape in two stages. We first made a circuit for one finger, to check that the current variations were visible and that the project was feasible to execute given the time crunch.
We were able to complete this project successfully using Tinkercad, an online simulation platform where realistic circuits can be built and practical, economical applications developed using concepts such as IoT and microcontrollers. Building on our single-finger execution, we used it to implement a glove prototype. The five flex sensors are laid out like a typical right hand, with the sensor for the thumb placed a little lower than the others.
1) Accelerometer
This was not an inbuilt library and had to be designed within Proteus software.
2) Flex Sensor
This component was also not available within Proteus, and its library had to be imported from EngineeringProjects.com.
However, the sensor had some kind of error and was not functional within the simulation.
3) Arduino Uno R3
This component was imported from EngineeringProjects.com, and worked well with the
built-in LCD Display library.
4) LCD display
This was a built-in library within Proteus and had no issues.
Final Approach, Using Tinkercad
1. Initial Execution:
We first tried executing our initial code [2] with one flex sensor, as we wanted to make sure our circuitry made sense and was reporting genuine current values per finger.
As we can see, for a specific bend angle of the flex sensor (in this case, 96°), a specific current is shown (1.80 mA). This tells us that our circuitry makes sense and that we are on the right track.
Shortcomings: Because we built the project in an online simulation, we were not able to add another sensor at the wrist that would sense the combined movement of the wrist and fingers. For example, for the letters R and P we need either to move two fingers around each other (as for R) or to move the wrist to a specific angle (as for P) for the letter to be signed. The execution of the project for an example text (T-A-R-P) is as follows:
American Sign Language for T-A-R-P using our model
As we can see, our code is able to identify the movement of the flex sensors and report the current values appropriately. Used in real life, this sensing would be of great help: the difference in current is unique to the letter being signed and can be used to display the letter in question, thereby helping differently abled people communicate with people who are not well versed in American Sign Language. The code for the Arduino Uno is given below in [3].
Conclusion
Sign language is a very important tool for the hearing impaired. This system will reduce the communication barrier by working as an automated translator, converting sign language for the understanding of hearing people using flex sensors that make the medium more reliable and helpful. Although awareness of sign language has increased over the past few years, it is not as prevalent as one would hope.
For such reasons, gloves like this will be extremely helpful, not only providing a fluent mode of communication between hearing and deaf or mute people but also helping others connect with the hearing impaired. The flex sensors show different potential differences when bent, and we were able to successfully classify letters from ASL. The glove can also be used as a teaching tool at centers that teach ASL: a teacher would not have to be present to correct a student, because a student learning ASL would know which letter he or she is depicting. This system can work as a translating device not only for mute people but also in biomedical applications for speech-impaired and paralyzed patients, and in smart-home and industrial applications.
Further integration with various services can help generate employment for deaf and mute people. This project can be improved using machine learning and natural language processing. If extended, the controller could provide home automation at one's fingertips, or be paired with a fitness sensor to help monitor the health of the individual. More sensors can be embedded to recognize the full sign language with greater precision and accuracy, and the system could also be designed to translate words from one language to another. It should be noted that as the number of signs increases, more combinations of sensor ranges/values need to be added to the microcontroller.
References
[1] Kiratey Patil, Gayatri Pendharkar, G. N. Gaikwad, "American Sign Language Detection", International Journal of Scientific and Research Publications, Volume 4, Issue 11, November 2014.
[2] Reinhard Gentner, Joseph Classen, "Development and evaluation of a low-cost sensor glove for assessment of human finger movements", Journal of Neuroscience Methods, Volume 178, Issue 1, 30 March 2009, Pages 138-147.
[3] Yasmeen Raushan, Abhishek Shirpurkar, Vrushabh Mudholkar, Shamal Walke, Tejas Makde, Pratik Wahane, "Sign Language Detection for Deaf and Dumb Using Flex Sensors", Volume 04, Issue 03, March 2017.
[4] S. A. Mehdi (FAST National University of Computer & Emerging Sciences, Lahore, Pakistan), Y. N. Khan, "Sign language recognition using sensor gloves", International Journal of Scientific and Research Publications, Volume 4, Issue 11, November 2014, ISSN 2250-3153.
[5] Ahmad Zaki Shukor, Muhammad Fahmi Miskon, Muhammad Herman Jamaluddin, Fariz bin Ali Ibrahim, Mohd Fareed Asyraf, Mohd Bazli bin Bahar, "A New Data Glove Approach for Malaysian Sign Language Detection", Procedia Computer Science, Volume 76, 2015, Pages 60-67.
[6] Mahesh Kumar N B, "Conversion of Sign Language into Text", International Journal of Applied Engineering Research, ISSN 0973-4562, Volume 13, Number 9, 2018, Pages 7154-7161.
[7] Prateek S. G., Jagadeesh J., Siddarth R., Smitha Y., P. G. Sunitha Hiremath, Neha Tarannum Pendari, "Dynamic Tool for American Sign Language Finger Spelling Interpreter", 2018 International Conference on Advances in Computing, Communication Control and Networking (ICACCCT).
[8] Huy B. D. Nguyen, Hung Ngoc Do, "Deep Learning for American Sign Language Fingerspelling Recognition System", 2019 26th International Conference on Telecommunications (ICT).
[9] Shivashankara S., Srinath S., "American Sign Language Recognition System: An Optimal Approach", International Journal of Image, Graphics and Signal Processing, 10(8), 2018.
Appendix
#include <Servo.h>

#define flexPin A0   // Flex sensor divider output on analog pin A0

Servo myServo;       // Servo that mirrors the finger's bend

void setup()
{
  myServo.attach(3); // Servo signal wire on PWM pin 3
  Serial.begin(9600);
}

void loop()
{
  int flexValue = analogRead(flexPin);                  // 0-1023 raw reading
  int servoPosition = map(flexValue, 0, 1023, 0, 180);  // scale to 0-180 degrees
  myServo.write(servoPosition);
  Serial.print("sensor = ");
  Serial.println(flexValue);
  Serial.print("servo = ");
  Serial.println(servoPosition);
  delay(20); // small pause to reduce jitter
}
[3] Executable Code for Arduino
#include <stdint.h>
#include <Servo.h>

#define LENGTH(a) (sizeof(a) / sizeof((a)[0]))

uint8_t glove[5] = {A0, A1, A2, A3, A4}; // Glove input pins (one per finger).
uint8_t robot[5] = {11, 10, 9, 6, 5};    // Servo output pins (PWM).
Servo servos[5];                         // One servo per finger.
uint8_t pin;
int16_t pos; // The new position to write to the servo.

void setup() {
  Serial.begin(9600);
  for (pin = 0; pin < LENGTH(robot); pin++)
    servos[pin].attach(robot[pin]);
}

void loop() {
  for (pin = 0; pin < LENGTH(glove); pin++) {
    pos = map(analogRead(glove[pin]), 0, 1023, 0, 180); // scale reading to degrees
    servos[pin].write(pos);
    Serial.print(pos);
    Serial.print(' ');
  }
  Serial.println("");
  // Wait a little while to avoid unnecessary jitter.
  delay(20);
}