
Hand Gesture Techniques For Determining Sign Languages

Adit Khurana, Student, Department of Computer Science and Engineering, Velammal College of Engineering and Technology, Madurai, India. swedhaece08@gmail.com

Ankit Khurana, Student, Department of Computer Science and Engineering, Velammal College of Engineering and Technology, Madurai, India. swedhaece08@gmail.com

C. Swedheetha, Assistant Professor, Department of Computer Science and Engineering, Velammal College of Engineering and Technology, Madurai, India. swedhaece08@gmail.com

Abstract- Hand gesture recognition plays a vital role in bridging communication gaps for individuals with hearing impairments. This paper explores the application of computer vision and machine learning techniques to recognize sign language gestures. By analyzing and classifying the intricate movements of hand and finger configurations, these systems facilitate real-time translation of sign language into text or speech. The potential impact on accessibility and inclusivity is substantial, empowering the deaf and hard-of-hearing community to communicate effortlessly with the broader society. This abstract encapsulates the significance of hand gesture recognition in breaking down linguistic barriers and fostering a more inclusive world.

Keywords: ML, Disability, Python, OpenCV, Gesture, Detection

Figure [1]: Recognition of Hand Using Camera Lens


I. INTRODUCTION

Sign languages are complex and rich forms of communication primarily used by deaf and hard-of-hearing individuals to convey thoughts, emotions, and ideas. These languages employ a combination of hand gestures, facial expressions, body movements, and sometimes even specific lip patterns. One of the fundamental aspects of sign languages is the use of hand gestures, which play a central role in conveying meaning [1]. The aim of utilizing hand gesture techniques for determining sign languages is multifaceted, encompassing various objectives and purposes aimed at enhancing communication, inclusivity, and understanding within the deaf and hard-of-hearing communities and the broader society [2]. Computer vision plays a crucial role in hand gesture techniques for determining sign languages by enabling the recognition of hand gestures, ultimately facilitating communication between deaf and hard-of-hearing individuals and the broader community [3]. Hand gesture techniques for determining sign languages possess several characteristics that set them apart from other forms of communication and make them effective for conveying meaning within the deaf and hard-of-hearing communities [4]. Creating a system for hand gesture techniques for determining sign languages is a complex and ongoing endeavor that involves various components, including technology, linguistics, and cultural sensitivity [5]. Detecting and interpreting hand gestures for sign languages presents several challenges, from both a technical and a practical perspective; some of the key challenges are variability in signs, background noise, lighting conditions, and complex hand poses [1]. Preventing the exploitation of hand gesture techniques for determining sign languages is crucial to ensure that these technologies are used ethically and responsibly [6]. Thus, various means should be adopted to avoid irrelevant use of this system [7].

2. SYSTEM COMPONENTS

Hand gesture recognition techniques for determining sign languages typically involve several system components, including hardware and software components. These components work together to capture, process, and interpret hand gestures to understand sign language. Some of the system components are a gesture input device, data acquisition and preprocessing, hand tracking, and a gesture database; a minimal sketch of how these components might fit together is given below.

Figure [3]: Types of Hand Gestures for Sign Languages
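As a rough sketch of how these components might fit together (OpenCV is assumed to be available, and the gesture labels, preprocessing choices, and stub classifier are illustrative assumptions rather than components specified in this paper):

```python
# Illustrative pipeline sketch: capture -> preprocess -> classify -> look up.
# OpenCV and a webcam are assumed; the classifier is a stub that a real
# system would replace with a trained model.
import cv2

GESTURE_DATABASE = {0: "HELLO", 1: "THANK YOU", 2: "YES"}  # hypothetical labels

def preprocess(frame):
    """Data acquisition and preprocessing: grayscale, resize, normalize."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.resize(gray, (64, 64)) / 255.0

def classify(processed):
    """Placeholder classifier; a trained model would go here."""
    return 0  # always predicts class 0 in this stub

cap = cv2.VideoCapture(0)              # gesture input device (webcam)
ok, frame = cap.read()
if ok:
    label = classify(preprocess(frame))
    print("Recognized sign:", GESTURE_DATABASE.get(label, "UNKNOWN"))
cap.release()
```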
4. TYPES OF HAND GESTURE TECHNIQUES

Hand gesture techniques are divided into two subdivisions, explained as follows:

1. Recognition Hand Gesture Using Instrumented Gloves
2. Recognition Hand Gesture Using Computer Vision

Figure [4]: Recognition Hand Gesture Using Computer Vision

4.1 Recognition Hand Gesture Using Instrumented Gloves
Hand gesture recognition is a technology that
allows human-computer interaction through the
interpretation of hand movements and gestures. This
capability has gained significant attention in recent years
due to its potential applications in various fields,
including virtual reality, augmented reality, robotics, and
human-computer interfaces. An instrumented glove is a
wearable device equipped with an array of sensors
strategically placed on the fingers and palm. These
sensors capture data related to hand movements, such as finger flexion and extension, and hand orientation. By analyzing the data collected from these sensors, sophisticated algorithms can interpret the gestures made by the wearer and translate them into meaningful commands or interactions with digital environments or devices.
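As a toy illustration of this idea (not a description of any particular glove), the sketch below matches a vector of flex-sensor readings against stored gesture templates by nearest-neighbor distance; the sensor scaling, template values, and gesture labels are all assumptions.

```python
# Toy sketch: classify a glove reading (five finger-flexion sensors, each
# normalized to 0 = straight, 1 = fully bent) by nearest-neighbor distance
# to stored gesture templates. All values and labels are illustrative.
import math

TEMPLATES = {
    "FIST":      [1.0, 1.0, 1.0, 1.0, 1.0],
    "OPEN_HAND": [0.0, 0.0, 0.0, 0.0, 0.0],
    "POINT":     [0.0, 1.0, 1.0, 1.0, 1.0],  # index finger extended
}

def recognize(reading):
    """Return the template label closest to the sensor reading."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda name: dist(reading, TEMPLATES[name]))

# A reading with the index finger extended maps to "POINT".
print(recognize([0.1, 0.9, 0.95, 0.9, 1.0]))
```

Orientation or acceleration values could be appended to the same feature vector in the same way.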

Figure [5]: Recognition Hand Gesture Using Instrumented Gloves
4.2 Recognition Hand Gesture Using Computer Vision

Hand gesture recognition using camera lenses is an application of computer vision technology that enables machines to interpret and respond to hand movements and gestures captured through traditional camera lenses. This approach uses the power of image processing and pattern recognition to understand and interpret gestures, opening up a wide range of applications in human-computer interaction, automation, and augmented reality.
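One classical way to realize this with OpenCV (an illustrative choice, not necessarily the approach used in the works surveyed below) is skin-color segmentation followed by contour extraction of the hand region; the HSV thresholds and file names are rough assumptions.

```python
# Classical computer-vision sketch: segment skin-colored pixels in a frame,
# then keep the largest contour as the hand region. HSV range and image
# path are rough illustrative assumptions; OpenCV 4.x API is assumed.
import cv2
import numpy as np

frame = cv2.imread("hand_frame.png")               # hypothetical input frame
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

lower_skin = np.array([0, 40, 60], dtype=np.uint8)
upper_skin = np.array([25, 255, 255], dtype=np.uint8)
mask = cv2.inRange(hsv, lower_skin, upper_skin)    # binary skin mask
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

# OpenCV 4.x: findContours returns (contours, hierarchy).
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
if contours:
    hand = max(contours, key=cv2.contourArea)      # assume largest blob = hand
    x, y, w, h = cv2.boundingRect(hand)
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imwrite("hand_detected.png", frame)
```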

Table [1]: Comparison of related work

Reference | Metrics | Advantage | Disadvantage
[1] | Gaming | Motion tracking abilities | Limited accuracy
[2] | Security | Data is kept private | Poor performance when load is high
[3] | Gaming | Motion tracking abilities | Limited accuracy
[4] | Literature | Indian Sign Language is described | Not suited for foreign sign languages
[5] | Efficiency | Deep learning modules are explained | Limited accuracy
[6] | Testing | Testing of sign language is done | Payload is high
[7] | Accuracy | Tells about hand gesture techniques | Not used in determining sign languages
[8] | Image processing | Can identify more hand gestures with more accuracy | More cameras are used
[9] | Robustness | Result is robust | Some patterns were not identified
[10] | Efficiency | Deep learning modules are explained | Limited accuracy
[11] | Literature | Indian languages are easily identified | No foreign languages are recognized
[12] | Scalability | Can be extended to various domains | More complex architecture
[13] | Resilience | Robust result | Does not focus on deaf patients
[14] | Constancy | Indian languages are recognized | Foreign language recognition is minimal, except English
[15] | Accuracy | Sign language idea is given in terms of gestures | More complex to develop
[16] | Optimization | Deaf and dumb people can understand better | Limited accuracy for other dialects
[17] | Fidelity | Indian and American sign languages are recognized | Foreign languages are not recognized
[18] | Speed | An optimal solution within budget | Setup and working are more complex
[19] | Preservation | Image preservation is done with precision on different gestures | Limited accuracy
[20] | Responsiveness | Can identify more hand gestures | More cameras are used
5. ALGORITHMS USED BY THE HAND GESTURE DEVICE

Hand gesture recognition for sign languages typically involves the use of computer vision and machine learning algorithms. The choice of algorithm can vary depending on the specific requirements of the application, but some commonly used algorithms and techniques are the following.
5.1 Convolutional Neural Networks (CNNs): This algorithm is taken from the literature review "Dynamic Hand Gesture Recognition" prepared by Deepali N. Kakade and Dr. J. S. Chitode [8]. The main idea is to learn the spatial appearance of the hand, such as its shape and finger configuration, directly from image frames and classify it into individual signs.
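A minimal sketch of such a CNN classifier is shown below, using Keras; the input size (64x64 grayscale crops) and the number of sign classes are illustrative assumptions, not values taken from the cited review.

```python
# Minimal CNN sketch for classifying fixed-size hand-gesture images.
# TensorFlow/Keras is assumed; layer sizes and NUM_CLASSES are placeholders.
from tensorflow.keras import layers, models

NUM_CLASSES = 26  # e.g., one class per fingerspelled letter (assumption)

model = models.Sequential([
    layers.Input(shape=(64, 64, 1)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=10, validation_split=0.1)
```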
5.2 Recurrent Neural Networks (RNNs): This algorithm is taken from the same literature review by Deepali N. Kakade and Dr. J. S. Chitode [8]. The main idea is to model sequences of hand poses over time, making RNNs suitable for gesture recognition tasks that require tracking the movement of the hands.
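A minimal sketch of a recurrent classifier over sequences of per-frame hand features, again in Keras; the sequence length, feature size, and class count are assumptions for illustration.

```python
# Minimal LSTM sketch for classifying sequences of hand keypoints over time.
# Keras is assumed; 30 frames of 42 values (e.g., 21 2-D landmarks per frame)
# and 26 classes are illustrative assumptions.
from tensorflow.keras import layers, models

NUM_CLASSES = 26
SEQ_LEN, FEATURES = 30, 42

model = models.Sequential([
    layers.Input(shape=(SEQ_LEN, FEATURES)),
    layers.LSTM(64, return_sequences=True),
    layers.LSTM(32),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(keypoint_sequences, labels, epochs=20)
```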
5.3 Hidden Markov Models (HMMs): This algorithm is taken from the literature review "Dynamic Hand Gesture Recognition" prepared by Deepali N. Kakade and Dr. J. S. Chitode [8]. HMMs can represent the probabilistic transitions between different hand gestures and can be combined with other techniques for better accuracy.
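The sketch below illustrates the core HMM computation: scoring an observation sequence with the scaled forward algorithm. In a recognizer, one such model would typically be trained per sign and the sign with the highest likelihood chosen; the state, symbol, and probability values here are toy assumptions.

```python
# Minimal HMM sketch: score a discrete observation sequence against one
# gesture model with the scaled forward algorithm. Matrices are toy values.
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of an observation sequence under an HMM (pi, A, B)."""
    alpha = pi * B[:, obs[0]]                # initialization
    log_like = np.log(alpha.sum())
    alpha /= alpha.sum()                     # scaling avoids underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]        # recursion
        log_like += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_like

# Toy 2-state model over 3 quantized hand-shape symbols (illustrative).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
print(forward_log_likelihood([0, 1, 2, 2], pi, A, B))
```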
5.4 3D Gesture Recognition: When dealing with depth information, as provided by depth cameras like Microsoft Kinect or LiDAR sensors, 3D gesture recognition techniques are employed. Algorithms such as Random Forests, Decision Trees, or even deep learning architectures adapted for 3D data can be used to recognize hand gestures in three dimensions.
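A minimal sketch of the Random Forest option on flattened 3D hand-joint features, using scikit-learn; the synthetic data simply stands in for real depth-camera landmarks.

```python
# Minimal sketch: classify 3-D hand keypoints with a random forest.
# scikit-learn is assumed; the synthetic data stands in for real
# depth-camera landmarks (e.g., 21 joints x 3 coordinates per sample).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 63))        # 200 samples of flattened 3-D joints
y = rng.integers(0, 5, size=200)      # 5 illustrative gesture classes

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```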

5.5 Data Augmentation: Augmentation techniques like data mirroring, rotation, and scaling can help increase the robustness of the recognition system by providing variations of the same gesture during training.
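A minimal sketch of the three augmentations named above, applied to a single gesture image with OpenCV; the file name and the rotation and scale amounts are placeholders.

```python
# Minimal sketch of mirror, rotate, and scale augmentations applied to one
# gesture image with OpenCV. The input path is a hypothetical placeholder.
import cv2

img = cv2.imread("gesture_sample.png")          # hypothetical sample image
h, w = img.shape[:2]

mirrored = cv2.flip(img, 1)                     # horizontal mirror
M_rot = cv2.getRotationMatrix2D((w / 2, h / 2), 15, 1.0)
rotated = cv2.warpAffine(img, M_rot, (w, h))    # rotate by 15 degrees
scaled = cv2.resize(img, None, fx=1.2, fy=1.2)  # scale up by 20%

for name, aug in [("mirror", mirrored), ("rotate", rotated), ("scale", scaled)]:
    cv2.imwrite(f"gesture_aug_{name}.png", aug)
```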

5.6 Post-processing Techniques: Post-processing techniques, including filtering, smoothing, and hand tracking, can be applied to refine the detected hand gestures and improve overall accuracy.
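One simple post-processing sketch: smoothing the stream of per-frame predictions with a sliding majority vote so that single-frame misclassifications do not reach the output; the window size is an assumption.

```python
# Post-processing sketch: smooth per-frame predictions with a sliding
# majority vote so brief misclassifications do not flicker into the output.
from collections import Counter

def majority_smooth(labels, window=5):
    """Replace each predicted label by the most common label in its window."""
    half = window // 2
    smoothed = []
    for i in range(len(labels)):
        chunk = labels[max(0, i - half):i + half + 1]
        smoothed.append(Counter(chunk).most_common(1)[0][0])
    return smoothed

# A one-frame glitch ("B") is filtered out of the prediction stream.
print(majority_smooth(["A", "A", "B", "A", "A", "C", "C", "C"]))
```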

The similarity between a detected gesture image A and a stored template B can be measured with the two-dimensional correlation coefficient

r = \frac{\sum_{m}\sum_{n}\left(A_{mn}-\bar{A}\right)\left(B_{mn}-\bar{B}\right)}{\sqrt{\left(\sum_{m}\sum_{n}\left(A_{mn}-\bar{A}\right)^{2}\right)\left(\sum_{m}\sum_{n}\left(B_{mn}-\bar{B}\right)^{2}\right)}}

where \bar{A} = \mathrm{mean2}(A) and \bar{B} = \mathrm{mean2}(B). Recognition performance is summarized by

Precision = TP / (TP + FP),    Recall = TP / (TP + FN).
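The sketch below computes both quantities directly from their definitions; the template, the noisy copy, and the TP/FP/FN counts are toy values.

```python
# Sketch of the two formulas above: the 2-D correlation coefficient between
# two equally sized images, and precision/recall from raw counts.
import numpy as np

def corr2(A, B):
    """Two-dimensional correlation coefficient r between arrays A and B."""
    A = A - A.mean()
    B = B - B.mean()
    return (A * B).sum() / np.sqrt((A ** 2).sum() * (B ** 2).sum())

def precision_recall(tp, fp, fn):
    return tp / (tp + fp), tp / (tp + fn)

# Toy example: a template compared with a noisy copy of itself.
rng = np.random.default_rng(0)
template = rng.random((32, 32))
noisy = template + 0.05 * rng.normal(size=(32, 32))
print("r =", corr2(template, noisy))
print("precision, recall =", precision_recall(tp=90, fp=10, fn=20))
```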
6. APPLICATIONS

Hand gesture recognition for sign language detection has numerous practical applications that can benefit individuals with hearing impairments and improve communication between the deaf and hearing communities. Here are some notable applications:

Sign Language Translation: Hand gesture recognition can be used to translate sign language into spoken or written language, allowing for real-time communication between deaf individuals and those who do not know sign language.

Education and Training: Hand gesture recognition systems can assist in teaching sign language to both deaf and hearing individuals. Interactive applications can provide feedback on the correctness of signs and help learners improve their signing skills.

Communication Aids: Hand gesture recognition technology can be used to develop communication aids for non-verbal individuals who rely on sign language to express their thoughts and needs.

Emergency Services: Hand gesture recognition systems can be integrated into emergency services to ensure that deaf individuals can effectively communicate their needs during emergencies or when seeking help.

Navigation and Transportation: Gesture recognition can be integrated into navigation systems to help deaf individuals navigate transportation, airports, and other complex environments more effectively.

Gaming and Entertainment: Gesture recognition can enhance the gaming experience by allowing players to control games using sign language gestures. It can also be used in interactive storytelling and virtual reality applications.

Research and Linguistics: Sign language recognition technology can aid linguists and researchers in studying sign languages, their dialects, and their evolution over time.

Figure [6]: Flowchart

7. DRAWBACKS

While hand gesture recognition techniques have made significant progress in determining sign languages, there are several drawbacks and challenges associated with these methods:

1. Variability in Gestures: Sign languages can have a wide range of gestures, and there can be variation in how individual signs are performed by different signers.

2. Ambiguity: Some signs in sign languages can be highly ambiguous. The same hand gesture may have different meanings depending on the surrounding signs, facial expressions, and body movements. Recognizing these subtle differences can be difficult for algorithms.

3. Limited Datasets: Building accurate hand gesture recognition models requires large and diverse datasets. Collecting such datasets for sign languages can be challenging, especially for less widely spoken sign languages. Limited data can lead to overfitting and reduced model performance.

4. Real-world Variations: Real-world scenarios introduce additional challenges, such as varying lighting conditions, background clutter, occlusions, and different camera angles. These factors can negatively impact the performance of hand gesture recognition systems.

Hardware Requirements: Many hand gesture recognition systems rely on specialized hardware like depth sensors or multiple cameras, which can be expensive and not widely available. This limits the accessibility of such systems, particularly in resource-constrained environments.

Training and Adaptation: Training gesture recognition models for sign languages requires expertise in both computer vision and linguistics. Moreover, adapting these models to new signs or variations within a sign language can be labor-intensive.

Ethical Considerations: There are ethical considerations related to the use of hand gesture recognition in sign languages. Ensuring that recognition technology respects the cultural and linguistic nuances of sign languages and does not perpetuate biases is crucial.
8. REFERENCES

[1] K. R. Zolliker and E. D. Schein, "American Sign Language Recognition with the Kinect", 2012 IEEE International Conference on Robotics and Automation (ICRA).
[2] Tharindu Fernando, Simon Denman, Sridha Sridharan, and Clinton Fookes, "Real-Time Sign Language Recognition Using a Convolutional Neural Network", 2016 IEEE Winter Conference on Applications of Computer Vision (WACV).
[3] Oscar Koller, Ivan Laptev, and Josef Sivic, "Sign Language Recognition with Microsoft Kinect and Hidden Markov Models", 2015 IEEE International Conference on Computer Vision (ICCV). This research focuses on sign language recognition using the Microsoft Kinect sensor and Hidden Markov Models (HMMs) and explores how HMMs can be applied to recognize sign language gestures.
[4] Chandra Shekhar Yadav, Amit Mishra, and Geetika, "Hand Gesture Recognition for Sign Language Translation", 2015 International Conference on Computing, Communication and Automation (ICCCA). This paper discusses a system for recognizing hand gestures in Indian Sign Language (ISL) using a combination of techniques like contour analysis and shape recognition.
[5] Jaya Susan Jacob and Vennila Saminathan, "Sign Language Recognition Using Deep Learning Models: A Review", 2018 International Conference on Information and Computer Technologies (ICICT). This review paper provides an overview of various deep learning models and techniques applied to sign language recognition, including convolutional neural networks (CNNs) and recurrent neural networks (RNNs).
[6] Arpan Pal, Rohit Ranchal, and Dhiraj Joshi, "Real-Time American Sign Language Recognition Using Convolutional Neural Networks", 2019 International Conference on Machine Learning and Data Science (ICMLDS). This paper presents a real-time American Sign Language recognition system based on CNNs and discusses the training and testing of the model for sign language gestures.
[7] Li Li and Weigang Zhan, "A Survey on Hand Gesture Recognition for Human-Computer Interaction", IEEE Access, 2018. While this survey does not specifically focus on sign language, it provides an overview of hand gesture recognition techniques, which are fundamental to sign language recognition systems.
[8] Deepali N. Kakade and Prof. Dr. J. S. Chitode, "Dynamic Hand Gesture: Literature Review".
[9] K. Singh and P. Kaur, "Recent Trends in Deep Learning Based Sign Language Recognition", 2021. This paper discusses recent trends and advancements in deep learning-based sign language recognition.
[10] Sunitha K. A., Anitha Saraswathi P., Aarthi M., Jayapriya K., Lingam Sunny, "Deaf Mute Communication Interpreter - A Review", International Journal of Applied Engineering Research, Volume 11, pp. 290-296, 2016.
[11] Mathavan Suresh Anand, Nagarajan Mohan Kumar, Angappan Kumaresan, "An Efficient Framework for Indian Sign Language Recognition Using Wavelet Transform", Circuits and Systems, Volume 7, pp. 1874-1883, 2016.
[12] Mandeep Kaur Ahuja, Amardeep Singh, "Hand Gesture Recognition Using PCA", International Journal of Computer Science Engineering and Technology (IJCSET), Volume 5, Issue 7, pp. 267-27, July 2015.
[13] Sagar P. More, Prof. Abdul Sattar, "Hand Gesture Recognition System for Dumb People", International Journal of Science and Research (IJSR).
[14] Chandandeep Kaur, Nivit Gill, "An Automated System for Indian Sign Language Recognition", International Journal of Advanced Research in Computer Science and Software Engineering.
[15] Pratibha Pandey, Vinay Jain, "Hand Gesture Recognition for Sign Language Recognition: A Review", International Journal of Science, Engineering Technology Research (IJSETR), Volume 4, Issue 3, March 2015.
[16] Nakul Nagpal, Dr. Arun Mitra, Dr. Pankaj Agrawal, "Design Issue and Proposed Implementation of Communication Aid for Deaf & Dumb People", International Journal on Recent and Innovation Trends in Computing and Communication, Volume 3, Issue 5, pp. 147-149.
[17] Neelam K. Gilorkar, Manisha M. Ingle, "Real Time Detection And Recognition Of Indian And American Sign Language Using Sift", International Journal of Electronics and Communication Engineering & Technology (IJECET), Volume 5, Issue 5, pp. 11-18, May 2014.
[18] Neelam K. Gilorkar, Manisha M. Ingle, "A Review on Feature Extraction for Indian and American Sign Language", International Journal of Computer Science and Information Technologies, Volume 5 (1).
[19] Ashish Sethi, Hemanth, Kuldeep Kumar, Bhaskara Rao, Krishnan R., "Sign Pro - An Application Suite for Deaf and Dumb", IJCSET, Volume 2, Issue 5, pp. 1203-1206, May 2012; Priyanka Sharma, "Offline Signature Verification Using Surf Feature Extraction and Neural Networks Approach", International Journal of Computer Science and Information Technologies, Volume 5 (3).
[20] Pragati Garg, Naveen Aggarwal and Sanjeev Sofat, "Vision Based Hand Gesture Recognition", World Academy of Science, Engineering and Technology, 49, 2009.
