B.E. Project Phase-I Report
on
SPEAKING SYSTEM FOR MUTE PEOPLE USING IOT
by
Candidates Name (Exam Seat No.)
Candidates Name (Exam Seat No.)
Candidates Name (Exam Seat No.)
Candidates Name (Exam Seat No.)
Under the guidance of
Guide Name
Department of Information Technology
Smt. Kashibai Navale College of Engineering, Pune-41
Accredited by NBA
SAVITRIBAI PHULE PUNE UNIVERSITY
2020-2021

Sinhgad Technical Education Society
Department of Information Technology
Smt. Kashibai Navale College of Engineering, Pune-411041

Date:

CERTIFICATE

This is to certify that Candidates Name (Exam Seat No.), Candidates Name (Exam Seat No.), Candidates Name (Exam Seat No.) and Candidates Name (Exam Seat No.) of class B.E. IT have successfully completed their Project Phase-I work on "SPEAKING SYSTEM FOR MUTE PEOPLE USING IOT" at Smt. Kashibai Navale College of Engineering, Pune, in partial fulfillment of the Graduate Degree course in B.E.
Abstract
Chapter 1 Introduction

Motivation
It is very difficult for mute people to convey their message to regular people. Since regular people are not trained in hand sign language, communication becomes very difficult. In an emergency, or when a mute person is travelling or among new people, communicating with nearby people and conveying a message becomes especially difficult. Here we propose a smart speaking system that helps mute people convey their message to regular people using hand motions and gestures.

Justification of problem
The proposed system addresses this problem by recognizing hand motions and gestures and converting them into a message that regular people can understand without knowing sign language.
Need for the new system
Mute people cannot speak, and normal people do not know the sign language used for communication among mute people. This system will be useful in solving this problem.

Advances/additions over the previous system
A smart wearable hand device serving as a sign interpretation system with a built-in SVM classifier is implemented in [1], but that system is aimed at blind people; here we create a system for mute people as well.

Presently available systems for the same
- Smart Wearable Hand Device for Sign Language Interpretation System with Sensors Fusion
- Hand Gesture Movement Tracking System for Human Computer Interaction
- Vision-Based Sign Language Translation Device
Organization of the report
The report is divided into six chapters. Chapter 1 introduces the subject, clarifies the motivation behind taking up this project, the need for this system, how the system addresses the formulated problem, advances over previous systems, and presently available systems for the same problem. Chapter 2 discusses different papers related to the proposed system; a table summarizes the outline of the referred papers and their advantages over previous systems. Chapter 3 discusses the problem and the proposed methodology; since this project uses SVM, SVM is discussed thoroughly. Chapter 4 analyses the hardware and software required for the proposed system. Chapter 5 covers the detailed design of the proposed system, including the system architecture and different UML diagrams such as the use case, component and sequence diagrams. Chapter 6 draws conclusions on the proposed system.
The system has been implemented in the MATLAB environment using the MATLAB Image Processing Toolbox. The system can recognize and track hand movement and can replace the mouse, moving the cursor and performing the mouse-click function. In general, the system can detect and track hand movement so that it can be used as a real-time user interface.
Yellapu Madhuri et al. [5] present a mobile vision-based sign language translation device for automatic translation of Indian sign language into English speech, to assist hearing- and/or speech-impaired people in communicating with hearing people. The system is an interactive application program developed using LabVIEW software and incorporated into a mobile phone. It is able to recognize one-handed sign representations of the alphabets (A-Z) and numbers (0-9). Siddharth S. Rautaray and Anupam Agrawal design a system for gestural interaction between a user and a computer in a dynamic environment in their paper [6]. Their gesture recognition system uses image processing techniques for detection, segmentation, tracking and recognition of hand gestures, converting them into meaningful commands. The proposed interface can be applied to different applications such as image browsers and games. G. Simion et al. [7] studied advances in the field of vision-based hand gesture recognition from both hardware and software points of view, reviewing major trends and the recent evolution. While providing a non-exhaustive inventory of the huge amount of past research in the field, the paper reviews in more detail part-based approaches, particularly those embedded in the compositional framework, an emerging dominant trend in computer vision.
Getting daily information from the internet has become part of most people's daily habits. To reduce the steps needed to receive this information, such as complex mouse or keyboard actions, paper [8] proposes a system designed for easily getting daily information without mouse and keyboard actions. N. Subhash Chandra, T. Venu and P. Srikanth developed a simple and fast motion-image-based algorithm [9]. Gesture recognition deals with the goal of interpreting human gestures via mathematical algorithms; in general, it is suitable for controlling home appliances using hand gestures. The system proposed in [10] by Lee et al. is divided into three modules: a processing module, a sensor module, and a communication module. The sensor module, consisting of three BNO055 absolute-orientation sensors, is placed on the thumb, the index finger, and the back of the hand.
Sign language helps deaf and dumb people communicate with other people, but not all people understand sign language.

Proposed Algorithm/Methodology
Support vector machines (SVMs) are powerful yet flexible supervised machine learning algorithms used for both classification and regression, though they are generally used for classification problems. SVMs were first introduced in the 1960s and later refined in the 1990s.
SVMs have a unique way of implementation compared to other machine learning algorithms. Lately they have become extremely popular because of their ability to handle multiple continuous and categorical variables. An SVM model is basically a representation of different classes separated by a hyperplane in multidimensional space. The hyperplane is generated in an iterative manner by the SVM so that the error is minimized. The goal of SVM is to divide the datasets into classes by finding a maximum marginal hyperplane (MMH).

The following are important concepts in SVM:
- Support vectors: the data points closest to the hyperplane are called support vectors. The separating line is defined with the help of these data points.
- Hyperplane: as shown in the diagram above, it is the decision plane or space that divides a set of objects having different classes.
- Margin: the gap between the two lines through the closest data points of different classes. It can be calculated as the perpendicular distance from the line to the support vectors. A large margin is considered a good margin and a small margin is considered a bad margin.

The main goal of SVM is to divide the datasets into classes by finding a maximum marginal hyperplane (MMH), which is done in the following two steps: first, SVM generates hyperplanes iteratively that segregate the classes in the best way; then, it chooses the hyperplane that separates the classes correctly.
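As a minimal sketch (not this project's actual implementation), the two-step MMH search above can be exercised with scikit-learn's SVC on hypothetical toy data; the sample points and labels below are our own illustrative values. The fitted model exposes the support vectors, and for a linear kernel the margin width can be recovered as 2/||w||:

```python
import numpy as np
from sklearn.svm import SVC

# Two linearly separable toy clusters (illustrative stand-in data).
X = np.array([[1.0, 1.0], [1.5, 2.0], [2.0, 1.5],   # class 0
              [5.0, 5.0], [5.5, 6.0], [6.0, 5.5]])  # class 1
y = np.array([0, 0, 0, 1, 1, 1])

# A linear-kernel SVM iteratively finds the maximum marginal hyperplane.
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

# Support vectors: the training points closest to the hyperplane.
print("support vectors:\n", clf.support_vectors_)

# Margin width = 2 / ||w|| for the separating hyperplane w.x + b = 0.
w = clf.coef_[0]
margin = 2.0 / np.linalg.norm(w)
print("margin width:", margin)

print(clf.predict([[1.2, 1.1], [5.8, 5.9]]))  # -> [0 1]
```

Because the two clusters are far apart, the margin comes out large, which the text above describes as a good margin.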
In practice, the SVM algorithm is implemented with a kernel that transforms the input data space into the required form. In simple words, a kernel converts non-separable problems into separable problems by adding more dimensions to the data. This makes SVM more powerful, flexible and accurate. The following are some of the types of kernels used by SVM.

Linear kernel: it can be used as a dot product between any two observations. The formula of the linear kernel is

K(x, x_i) = sum(x * x_i)

From the above formula, we can see that the product between two vectors x and x_i is the sum of the multiplication of each pair of input values.

Polynomial kernel: it is a more generalized form of the linear kernel and can distinguish curved or nonlinear input spaces.
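The two kernels above can be checked numerically. The snippet below is an illustrative sketch: the function names and sample vectors are our own, and the polynomial degree and constant term are assumed values:

```python
import numpy as np

def linear_kernel(x, xi):
    # K(x, x_i) = sum(x * x_i): the dot product of the two observations.
    return np.dot(x, xi)

def polynomial_kernel(x, xi, degree=3, coef0=1.0):
    # K(x, x_i) = (x . x_i + coef0)^degree: generalizes the linear kernel
    # and can separate curved (nonlinear) class boundaries.
    return (np.dot(x, xi) + coef0) ** degree

x  = np.array([1.0, 2.0, 3.0])
xi = np.array([4.0, 5.0, 6.0])

print(linear_kernel(x, xi))      # 1*4 + 2*5 + 3*6 = 32.0
print(polynomial_kernel(x, xi))  # (32 + 1)^3 = 35937.0
```

Note that the polynomial kernel with degree 1 and coef0 = 0 reduces exactly to the linear kernel.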
4.1 H/W Requirements
Processor - GHz
RAM - 4 GB
Hard Disk - 50 GB
Keyboard - Standard Windows Keyboard
Mouse - Two or Three Button Mouse
Monitor - SVGA

4.2 S/W Requirements
Operating System - Windows 7
Coding Language - Python
Database - MySQL
The app also plays audio of the sentences for better results.

5.2 Use-Case Diagram
5.3 Sequence Diagram
5.3.1 User Sequence Diagram
5.3.2 Admin Sequence Diagram
5.4 Activity Diagram
5.5 Class Diagram
5.6 Deployment Diagram
5.7 Component Diagram
5.8 DFD Level-0 Diagram (if applicable)
Chapter 6 Conclusion
This system eliminates the barrier in communication between the mute community and normal people. It also provides communication between dumb and blind people, and it is useful for speech-impaired and paralysed patients who cannot speak properly. The project proposes a translational device for deaf-mute people using glove technology. Further, the device will be an apt tool for the deaf-mute community to learn gestures and words easily. It is also portable.
Chapter 7 References
- B. G. Lee and S. M. Lee, "Smart Wearable Hand Device for Sign Language Interpretation System with Sensors Fusion," IEEE, 2017.
- Jian Wu and Lu Sun, "A Wearable System for Recognizing American Sign Language in Real-time Using IMU and Surface EMG Sensors."
- Siddharth S. Rautaray and Anupam Agrawal, "Real Time Hand Gesture Recognition System for Dynamic Applications," International Journal of UbiComp (IJU), Vol. 3, No. 1, January 2012, DOI: 10.5121/iju.2012.3103.
- L. Lamberti and F. Camastra, "Real-time hand gesture recognition using a color glove," presented at Int. Conf. Image Analysis and Processing, Ravenna, Italy, Sep. 2011.
- Y. Iwai, K. Watanabe, Y. Yagi and M. Yachida, "Gesture recognition using colored gloves," in Proc. 13th Int. Conf. Pattern Recognition, Vienna, Austria, Aug. 25-29, 1996.