
A Mini Project Report on

Hand Gesture Controls for


Laptops and PCs.

Submitted in partial fulfillment of the


requirements for the award of the
degree of

Bachelor of Engineering

in

Computer Engineering

By

1. TANMAY KHULE (24)


2. RIDDHI MORE (33)
3. ANUJ PATHARE (35)
4. KAUSHIK RANADE (44)

Under the Guidance of


PROF. RANJANA SINGH

Department of Computer Engineering


Watumull Institute of Electronics Engineering and Computer Technology
CHM College Campus, Ulhasnagar,
Maharashtra 421003
University of Mumbai (AY 2021-22)
Approval Sheet

This Mini Project Report entitled “Hand Gesture Controls for Laptops and
PCs” submitted by “TANMAY KHULE” (24), “RIDDHI MORE” (33),
“ANUJ PATHARE” (35), “KAUSHIK RANADE” (44) is approved for the
partial fulfillment of the requirements for the award of the degree of Bachelor of
Engineering in Computer Engineering from the University of Mumbai.

Prof. Ranjana Singh


(Guide)

Head, Department of Computer Engineering

Place:
Date:
CERTIFICATE

This is to certify that the mini project entitled “Hand Gesture Controls
for Laptops and PCs” submitted by “TANMAY KHULE” (24), “RIDDHI
MORE” (33), “ANUJ PATHARE” (35), “KAUSHIK RANADE” (44) for the
partial fulfillment of the requirements for the award of the degree of Bachelor of
Engineering in Computer Engineering, to the University of Mumbai, is a
bonafide work carried out during the academic year 2021-2022.

Guide Name & Signature Examiners:

1.

2.

Head, Department of Computer Engineering                    Principal

Place:
Date:
DECLARATION

We declare that this written submission represents our ideas in our own
words and where others’ ideas or words have been included, we have
adequately cited and referenced the original sources. We also declare that we
have adhered to all principles of academic honesty and integrity and have not
misrepresented or fabricated or falsified any idea/data/fact/source in our
submission. We understand that any violation of the above will be cause for
disciplinary action by the Institute and can also evoke penal action from the
sources which have thus not been properly cited or from whom proper
permission has not been taken when needed.

———————————————

(Signature)

———————————————

TANMAY KHULE (24)


RIDDHI MORE (33)
ANUJ PATHARE (35)
KAUSHIK RANADE (44)

Date:
Abstract
Hand gestures offer a convenient, contact-free way for users to
interact with computers. This report presents “Hancon”, a desktop
application through which users can control their laptops and PCs
using static hand gestures, performing operations such as save,
print, close and restart without a keyboard or mouse. The system
detects the user’s hand in the webcam feed, extracts 21 keypoints
describing the orientation of the hand in the frame, and feeds
these keypoints to a neural network which classifies the gesture
into one of four predefined classes. The predictions are then
mapped to the corresponding computer commands, and the entire
pipeline is bundled into a desktop application built with Tkinter.
Gesture-based control of this kind can benefit shared
workstations, presentations, and users for whom classical input
devices such as the keyboard, mouse and touchscreen are limiting.
Contents
1 Introduction
1.1 Problem Definition
1.2 Objectives
1.3 Scope
2 Existing System/Project
3 Technology Stack
4 Benefits and Applications
4.1 Benefits for society
4.2 Benefits for the environment
4.3 Applications
5 Project Design
5.1 Proposed System
5.2 Data Flow Diagram
6 Project Implementation
7 Results
8 Conclusion
9 Annexure
10 Bibliography
List of Figures
Figure 1: Keypoint extraction and classification
Figure 2: Mapping Hand Gestures to Computer commands
Figure 3: Proposed System
Figure 4: Data Flow Diagram
Figure 5: Hand Signs from ASL data set
Figure 6: Confusion matrix
Figure 7: Gantt chart

1. Introduction
Hand gestures are a very convenient way for users to interact with a computer.
Performing hand gestures, without the use of any external devices, allows users to
easily control their computers. “Hancon” is an application through which users can
do exactly that: with the respective hand gestures, they can perform operations such
as save, exit and print, or shut down and restart their machines.

1.1 Problem Definition

There are many applications where hand gestures can be used for interacting
with systems, such as video games, controlling UAVs, medical equipment, etc.
These hand gestures can also be used by handicapped people to interact with the
system. Classical interaction tools like the keyboard, mouse and touchscreen may
limit the way we use the system, because physical contact is necessary in order to
interact with it. Gestures can provide the same functionality without physical
contact with the interfacing devices. The problem lies in understanding these
gestures, as the same gesture, performed by different people for the same task, may
look different. This problem can be overcome with deep learning approaches.
Convolutional neural networks (CNNs) are proving to be a powerful tool for such
recognition systems. The main drawback of deep learning approaches is that they
may work poorly in real-world recognition, and high computing power is required
in order to process gestures.
1.2 Objectives

i. To create a model which would identify static and dynamic hand gestures.
ii. To create a platform which would map these gestures into executable
commands.

1.3 Scope

Our current system can recognize the previously mentioned static hand
gestures. We intend to expand this system by incorporating dynamic hand gestures,
which will make it more versatile and user-friendly. Furthermore, the existing
system can be improved, and there is still room for performance enhancements. To
improve the overall user experience, we can build on the gesture recognition
module to accept user-defined gestures and tasks. Our main future goal is to reduce
our reliance on third-party APIs and make the system more self-sufficient.
2. Existing System/Project
In computer science and language technology, gesture recognition is an
important topic which interprets human gestures through computer vision
algorithms. Various bodily motions can originate a gesture, but the most common
forms of gesture origination come from the face and hands. The entire procedure of
tracking gestures, representing them and converting them into some purposeful
command is known as gesture recognition. Various technologies have been used for
the design and implementation of such devices, but contact-based and vision-based
technologies are the two main types used for robust, accurate and reliable hand
gesture recognition systems. Contact-based devices like accelerometers, multi-touch
screens and data gloves rely on physical interaction by the user, who is required to
learn their usage, whereas vision-based devices like cameras have to deal with a
prominent variety of gestures. Gesture recognition involves handling many degrees
of freedom (DOF), variable 2D appearances, different silhouette scales (i.e. spatial
resolution) and a temporal dimension (i.e. gesture speed variability). Vision-based
gesture recognition is further classified into two main categories: 3D model-based
methods and appearance-based methods. 3D model-based methods describe the
hand shape and are the main choice for hand gesture modeling in which volumetric
analysis is done. In appearance-based models, the appearance of the arm and hand
movements is directly linked from visual images to specific gestures; a large
number of models belong to this group.
Existing projects use a variety of solutions, including image segmentation and
other image preprocessing steps. Image capturing is also done using special
hardware such as the Kinect camera. Many solutions use a trained CNN to classify
the signs; all such solutions require preprocessing, since CNNs are prone to
overfitting on the data and hence need to be generalized, requiring a large dataset.
3. Technology Stack
● Google MediaPipe API
● Keras
● TensorFlow
● scikit-learn
● Tkinter
● Python libraries:
   ● pandas
   ● NumPy
   ● os
   ● Pillow
   ● cv2 (OpenCV)
   ● keyboard
   ● pickle
   ● PyInstaller
4. Benefits and Applications
4.1 Benefits for society
Gesture-based interfaces allow users to interact with the computer without
the need for a keyboard or mouse. The user can control the computer with hand
gestures alone, and there is no need to wear any gear for controlling the computer.
In places such as offices and computer labs, where a single computer is used by
multiple users, gesture-based commands can be useful in maintaining the hygiene of
the system. While giving a presentation, the user can make hand gestures to the
computer for controlling the presentation, maintaining the flow of the presentation.

4.2 Benefits for the environment


Since users can control the system using only hand gestures, without the use
of any external devices such as kinetic sensors, inertial sensors, magnetic sensors,
gyro sensors, electromyography, force-sensitive resistors, and other types of
sensors, the manufacturing of such devices can be reduced by a considerable
amount. Models which can predict hand gestures with high accuracy can be
implemented in the system, eliminating the need for a keyboard or mouse. Thus,
hand gesture enabled controls can prove to be a boon to the environment if
implemented correctly and to their full potential.

4.3 Applications
● Entertainment: Computer games are a particularly technologically promising
and commercially rewarding arena for innovative interfaces due to the
entertaining nature of the interaction.
● Medical Systems and Assistive Technologies: Gestures can be used to
control the distribution of resources in hospitals, interact with medical
instrumentation, control visualization displays, and help handicapped users
as part of their rehabilitation therapy.
● Crisis management and disaster relief: Command-and-control systems help
manage public response to natural disasters (such as tornados, floods,
wildfires, and epidemic diseases) and to human-caused disasters (such as
terrorist attacks and toxic spills).
● Commercial Sector: Gestures can be used by groups to present their ideas
without disturbing the flow of presentation.
● Virtual Environment Control: A hand-gesture based control system can be
integrated with an automation system to remotely carry out the operations
provided by the automation system.
● Human-robot interaction: Hand-gesture recognition is a critical aspect of
fixed and mobile robots, in which gestures can be used to command the robot
to perform a certain set of operations.
5. Project Design
5.1 Proposed System

Figure 1: Keypoint extraction and classification

The first step of our system is to detect the user’s hand. After the hand is
detected, we extract certain landmarks from it: a total of 21 keypoints, which
describe the orientation of the hand in the frame. These keypoints are then fed to the
neural network, which uses their relative positions to identify the relationships
between the points and hence predict the respective gesture class.
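
As a minimal sketch of this extraction step, assuming the MediaPipe Hands
solution from our technology stack (the single-hand limit and the confidence
threshold are illustrative choices, not fixed parameters of our system):

import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

cap = cv2.VideoCapture(0)  # open the default webcam
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    ok, frame = cap.read()
    if ok:
        # MediaPipe expects RGB input; OpenCV captures frames in BGR
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            hand = results.multi_hand_landmarks[0]
            # 21 (x, y) pairs describing the hand's orientation in the frame
            keypoints = [(lm.x, lm.y) for lm in hand.landmark]
cap.release()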
Figure 2: Mapping Hand Gestures to Computer commands

After the landmark extraction is done and the keypoints are fed to the neural
network, the predictions are obtained. The hand gestures are then mapped to
computer commands. We have defined hand gestures for save, print, close and
restart; the hand gestures for the respective operations are shown in Figure 2. As
soon as the user delineates a hand gesture, the respective operation is performed.
The final stage is to create a working desktop application. We use Tkinter
to design and create a desktop application, which is built by bundling all
the modules together.
Figure 3: Proposed System

5.2 Data Flow Diagram

Figure 4: Data Flow Diagram


6. Project Implementation
We have implemented our project in 4 stages:

Stage 1) Using a proper Dataset consisting of Hand Signs.

Stage 2) Designing and training a Machine Learning model to predict hand gestures.

Stage 3) Mapping the hand gestures to computer operations.

Stage 4) Creating a basic GUI application with provision to run commands like
save, print, close, restart.

Stage 1:

Figure 5: Hand Signs from ASL data set

In stage 1, we create a dataset for the hand gestures. This is a crucial task, since
even a single incorrect image can harm the entire training phase of the model,
causing poor performance.
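
The following is an illustrative sketch only, assuming one labelled row of
keypoint coordinates is stored per image; the project's actual dataset format
may differ.

import csv

def append_sample(csv_path, label, keypoints):
    """Append one labelled sample; keypoints is a list of 21 (x, y) tuples."""
    row = [label] + [coord for point in keypoints for coord in point]
    with open(csv_path, "a", newline="") as f:
        csv.writer(f).writerow(row)

# e.g. append_sample("keypoints.csv", 0, keypoints) after each captured frame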
Stage 2:
In stage 2, we train a neural network using Keras and TensorFlow. The input
for this model is a pandas dataframe which includes the keypoints extracted by the
keypoint extraction module. The output is the probability of a specific label among
the 4 predefined classes. Once the model is ready, the final step is to integrate it with
the system to run the respective commands.
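
A minimal sketch of such a model, assuming 42 input features (21 landmarks
with 2 coordinates each) and 4 output classes; the file name, layer sizes and
training settings are illustrative assumptions, not the project's exact values.

import pandas as pd
from tensorflow import keras

df = pd.read_csv("keypoints.csv")        # hypothetical dataset file
X = df.drop(columns=["label"]).values    # 42 coordinate columns
y = df["label"].values                   # integer class ids 0..3

model = keras.Sequential([
    keras.layers.Input(shape=(42,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(4, activation="softmax"),  # probability per class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=50, validation_split=0.2)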

Stage 3:
In stage 3, we take the predictions given to us by the neural network and
map them to the respective computer operations.
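
A hedged sketch of this mapping, using the keyboard library from our stack;
the keyboard shortcuts, the class-id ordering and the confidence threshold are
assumptions made for illustration.

import os
import numpy as np
import keyboard  # sends key combinations to the active window

# illustrative class-id ordering; the real mapping depends on training labels
COMMANDS = {0: "save", 1: "print", 2: "close", 3: "restart"}

def execute(probabilities, threshold=0.8):
    class_id = int(np.argmax(probabilities))
    if probabilities[class_id] < threshold:
        return  # ignore uncertain predictions
    command = COMMANDS[class_id]
    if command == "save":
        keyboard.send("ctrl+s")
    elif command == "print":
        keyboard.send("ctrl+p")
    elif command == "close":
        keyboard.send("alt+f4")
    elif command == "restart":
        os.system("shutdown /r /t 60")  # Windows restart with a grace period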

Stage 4:
In stage 4, we create a basic GUI (Graphical User Interface) application using
the Tkinter library in Python. The GUI primarily consists of a welcome screen
which redirects the user to the main program screen. The main screen has two
elements: a video feed which opens the webcam, and a text area which displays the
current command being detected and executed.
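
A minimal sketch of the main screen, assuming Tkinter with Pillow's ImageTk
for displaying frames; the widget layout and refresh interval are illustrative.

import tkinter as tk
import cv2
from PIL import Image, ImageTk

root = tk.Tk()
root.title("Hancon")

video_label = tk.Label(root)                 # holds the webcam frames
video_label.pack()
command_label = tk.Label(root, text="Detected command: none")
command_label.pack()

cap = cv2.VideoCapture(0)

def update_frame():
    ok, frame = cap.read()
    if ok:
        image = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        photo = ImageTk.PhotoImage(image)
        video_label.configure(image=photo)
        video_label.image = photo            # keep a reference to avoid GC
    root.after(30, update_frame)             # refresh roughly every 30 ms

update_frame()
root.mainloop()
cap.release()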
7. Results

Figure 6: Confusion matrix

Figure 6 shows the confusion matrix obtained after predicting the hand
gestures. It allows us to visualize the performance of the model and shows
how confused the classifier is, i.e., how many times the classifier assigns
samples of one class to a different class.
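
As a short sketch, assuming scikit-learn and a held-out test split (X_test,
y_test) alongside the trained model from the stage 2 sketch:

import numpy as np
from sklearn.metrics import confusion_matrix

# rows are true classes, columns are predicted classes
y_pred = np.argmax(model.predict(X_test), axis=1)
print(confusion_matrix(y_test, y_pred))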
8. Conclusion
Firstly, the user’s hand is detected and the keypoints are extracted from it.
The neural network then classifies the data into one of the 4 classes. The
predictions are mapped to the respective computer commands, and the commands
are thus executed successfully.
9. Annexure
9.1 Gantt Chart

Figure 7: Gantt chart


10. Bibliography
[1] Ram Pratap Sharma, Gyanendra K. Verma, Human Computer Interaction
using Hand Gesture, Procedia Computer Science, Volume 54, (2015)
(https://www.sciencedirect.com/science/article/pii/S187705091501409X)
[2] Mediapipe, Google, Available: https://google.github.io/mediapipe/
[3] IBM, IBM Machine Learning Professional Certificate, Available:
https://www.coursera.org/professional-certificates/ibm-machine-learning
[4] B. A. Myers, A Brief History of Human Computer Interaction Technology,
ACM Interactions, vol. 5(2), pp. 44-54, (1998)
(https://dl.acm.org/doi/abs/10.1145/274430.274436)
[5] Z. Xu, et al., Hand Gesture Recognition and Virtual Game Control Based on
3D Accelerometer and EMG Sensors, In Proceedings of IUI’09, pp. 401-406,
(2009) (https://dl.acm.org/doi/abs/10.1145/1502650.1502708)
[6] N. H. Dardas and E. M. Petriu, "Hand gesture detection and recognition using
principal component analysis," 2011 IEEE International Conference on
Computational Intelligence for Measurement Systems and Applications
(CIMSA) Proceedings, 2011, pp. 1-6, doi: 10.1109/CIMSA.2011.6059935.
[7] N. H. Dardas and N. D. Georganas, "Real-Time Hand Gesture Detection and
Recognition Using Bag-of-Features and Support Vector Machine
Techniques," in IEEE Transactions on Instrumentation and Measurement,
vol. 60, no. 11, pp. 3592-3607, Nov. 2011, doi: 10.1109/TIM.2011.2161140.
[8] Aashni Haria, Archanasri Subramanian, Nivedhitha Asokkumar, Shristi
Poddar, Jyothi S Nayak, Hand Gesture Recognition for Human Computer
Interaction, Procedia Computer Science, Volume 115, 2017, Pages 367-374
ACKNOWLEDGEMENT

We have great pleasure in presenting the mini project report on


Hand Gesture Controls for Laptops and PCs. We take this
opportunity to express our sincere thanks towards our guide, Prof.
Ranjana Singh, Department of Computer Engineering, WIEECT,
Ulhasnagar, for providing the technical guidelines and suggestions
regarding the line of work. We would like to express our gratitude for
her constant encouragement, support and guidance throughout the
development of the project.
We thank Prof. Ranjana Singh for her encouragement during
progress meetings and for providing guidelines to write this report.
We also thank the entire staff of WIEECT for their invaluable help
rendered during the course of this work. We wish to express our deep
gratitude towards all our colleagues of WIEECT for their
encouragement.

Student Name: TANMAY KHULE


Roll No: 24

Student Name: RIDDHI MORE


Roll No: 33

Student Name: ANUJ PATHARE


Roll No: 35

Student Name: KAUSHIK RANADE


Roll No: 44
