Project Report

DEPARTMENT: INFORMATION TECHNOLOGY


DATE: 29/12/2020
SEMESTER: 3
SUBMITTED BY: 1. Vedant Dhote (SYITA131)
2. Vedang Dadape (SYITA122)
3. Sarvesh Dakare (SYITA123)
4. Tejas Chougale (SYITA121)
Abstract:
Gesture-based interaction systems are becoming very popular both at the workplace and at home. This work intends to develop a system that can recognize hand gestures and use them as input commands to interact with a PC or laptop. One of the key areas that needs attention while developing such systems is the code implementation stage; we use Python for the implementation. We feel that if we successfully meet our goals, we will have contributed towards the future of natural gesture-based interfaces, if only in a minimal way.
I. INTRODUCTION
Commercial devices such as the Leap Motion controller enable us to control certain functions of a computer or laptop by simply waving a hand in front of it. This is fun to use, but such devices are priced very high. In this work, we therefore build our own gesture-controlled laptop/computer by combining the power of Arduino and Python. We use two ultrasonic sensors to determine the position of the hand and control a media player (VLC) based on that position. VLC is used here for demonstration; once the work is understood, any favourite application can be controlled in a similar way by changing a few lines of code. The concept behind this work is very simple: two ultrasonic sensors are placed on top of the monitor, the Arduino reads the distance between the monitor and the hand, and certain actions are performed based on this distance value. To perform the actions on the computer, we use the Python PyAutoGUI library. The commands from the Arduino are sent to the computer through the serial port; this data is read by a Python program running on the computer, and an action is performed based on the data read. The Arduino can be connected to the PC/laptop both for powering the module and for serial communication. The incoming time-domain signals are buffered, and a Fourier transform is applied to them. The result of this operation is a set of magnitude vectors spread equally over the spectral width. After each FFT vector is computed, it is further processed to determine the bandwidth of the signals, the speed of the gestures, and motion detection. The detected motions are then converted to PC commands.
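The buffering and Fourier-transform step described above can be sketched as follows. This is a minimal illustration only: the window length, sample rate, and function names are assumptions, not taken from the report's implementation.

```python
import numpy as np

def fft_magnitudes(samples, sample_rate):
    """Compute magnitude vectors spread equally over the spectral width.

    `samples` is one buffered window of the time-domain echo signal.
    The Hanning window and window size are illustrative choices.
    """
    windowed = samples * np.hanning(len(samples))   # reduce spectral leakage
    spectrum = np.fft.rfft(windowed)                # one-sided FFT
    magnitudes = np.abs(spectrum)                   # magnitude vector
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs, magnitudes

# Example: a 40 kHz tone (the HC-SR04 operating frequency) sampled at
# 200 kHz produces a spectral peak near 40 kHz.
t = np.arange(512) / 200_000.0
freqs, mags = fft_magnitudes(np.sin(2 * np.pi * 40_000 * t), 200_000)
peak_hz = freqs[np.argmax(mags)]
```

From such magnitude vectors, the shift and spread of the peak over successive windows indicate the speed and presence of motion.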
A. Proposed Work
This paper introduces a technique in which the sensor determines a distance and a particular function is performed accordingly. Recognition methods for the gestures are proposed, and actions are then recognized using the sensors. The sensor device is attached to the computer at the top of the screen for quick operation. Much research has been done in this field, but that work relates to hand recognition, real-time finger recognition, and recognition of alphabet characters [1]. Real-time human-computer interaction using hand gestures is also used for much functionality [2], such as video control, music players, gaming [2], controlling the functions of a PDF reader, etc. All these interactions use real-time gesture recognition techniques. A gesture control solution always requires a physical device that follows and recognizes body language or movements, so that the computer can interpret them [3]. Using an ultrasonic sensor, the distance of the hand can be found and acts as an input; according to the distance of the hand, a particular function is performed.

II. SYSTEM MODEL


A. Principle Behind Our Proposed Scheme
The principle behind the Arduino-based hand gesture control of a computer is very simple. Two ultrasonic sensors are used with the Arduino; the hand is placed in front of a sensor, and the distance between the hand and the sensor is calculated. Using this information, certain actions on the computer can be performed. The position of the ultrasonic sensors is very significant: the two sensors are placed on top of the laptop screen, one at either end. The distance readings from the Arduino are collected by a Python program, and a library called PyAutoGUI converts the data into keyboard actions.
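The Python side of this pipeline can be sketched as a mapping from serial command strings to keyboard actions. The command names and the serial protocol below are assumptions for illustration; the report does not specify the exact strings the Arduino sends.

```python
# Map command strings (assumed to be sent by the Arduino, one per line,
# over the serial port) to the key sequences PyAutoGUI would emit.
COMMAND_KEYS = {
    "scroll_down": ["pagedown"],
    "scroll_up": ["pageup"],
    "next_tab": ["ctrl", "tab"],
    "prev_tab": ["ctrl", "shift", "tab"],
    "play_pause": ["space"],
    "switch_task": ["alt", "tab"],
}

def keys_for(command):
    """Return the key sequence for a serial command, or None if unknown."""
    return COMMAND_KEYS.get(command.strip().lower())

# In the full program, the loop would look roughly like this
# (port name "COM3" is an assumption; requires pyserial and pyautogui):
#   import serial, pyautogui
#   port = serial.Serial("COM3", 9600)
#   while True:
#       keys = keys_for(port.readline().decode())
#       if keys:
#           pyautogui.hotkey(*keys)
```

Keeping the mapping in a dictionary makes it easy to "control our favourite application in our favourite way" by editing a few lines, as noted in the introduction.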
B. Circuit Diagram
The circuit diagram of the Arduino part of the model is shown in Figure 1. It consists of an Arduino UNO board and two ultrasonic sensors. All of these components can be powered from the laptop's USB port.
Figure.1. Circuit diagram of system

C. Problem Formulation
In this paper, we have implemented a simple Arduino-based hand-gesture-controlled computer with which we can control a few functions, such as scrolling up and down in web pages, playing or pausing a video, moving to the previous or next video, and switching between tasks.

III. IMPLEMENTATION

A. Components Required
• Arduino UNO x 1
• Ultrasonic Sensors (HC-SR04) x 2
• USB Cable (for Arduino)
• A few connecting wires
• A laptop with an internet connection

B. Detailed Modeling
The design of the circuit is quite simple, but the setup of the components is very significant. The Trigger and Echo pins of the first ultrasonic sensor, which is placed on the left of the screen, are connected to pins 11 and 10 of the Arduino. For the second ultrasonic sensor, the Trigger and Echo pins are connected to pins 6 and 5 of the Arduino [4]. For the placement of the sensors, both ultrasonic sensors are mounted on top of the laptop screen, one at the left end and the other at the right. Double-sided tape can be used to hold the sensors onto the screen.
Figure.2. View of work model

Now, the Arduino is placed on the back of the laptop screen. Then, connect the wires from
Arduino to Trigger and Echo Pins of the individual sensors. Now, we are ready for
programming the Arduino. Figure 2 shows the view of the model.

C. Programming Arduino to Detect Gestures


The following are the five different hand gestures or actions that we have programmed for demonstration purposes.
Gesture 1: Hold the hand in front of the right ultrasonic sensor at a distance between 15 cm and 35 cm for a short duration, then move it away from the sensor. This gesture scrolls down the web page.
Gesture 2: Hold the hand in front of the right ultrasonic sensor at a distance between 15 cm and 35 cm for a short duration, then move it towards the sensor. This gesture scrolls up the web page.
Gesture 3: Swipe the hand in front of the right ultrasonic sensor. This gesture moves to the next tab.
Gesture 4: Swipe the hand in front of the left ultrasonic sensor. This gesture moves to the previous tab or plays/pauses the video.
Gesture 5: Swipe the hand across both sensors (left sensor first). This action switches between tasks.
The Arduino program has been written based on these gestures.
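The decision logic behind these five gestures can be sketched in Python over consecutive distance samples. The thresholds and the two-sample scheme below are illustrative assumptions; the actual Arduino sketch is not reproduced in the report.

```python
IN_RANGE = (15.0, 35.0)   # the 15-35 cm band used by Gestures 1 and 2

def in_band(d, band=IN_RANGE):
    """True when a distance sample (cm) lies in the hold band."""
    return band[0] <= d <= band[1]

def classify(right_prev, right_now, left_prev, left_now):
    """Classify one gesture from two consecutive distance samples (cm)
    per sensor. Swipe detection (near then far) is a toy heuristic."""
    swipe_right = right_prev < 10 and right_now > 50   # quick pass-by
    swipe_left = left_prev < 10 and left_now > 50
    if swipe_left and swipe_right:
        return "switch_task"        # Gesture 5: both sensors swept
    if in_band(right_prev) and right_now > IN_RANGE[1]:
        return "scroll_down"        # Gesture 1: hold, then move away
    if in_band(right_prev) and right_now < IN_RANGE[0]:
        return "scroll_up"          # Gesture 2: hold, then move closer
    if swipe_right:
        return "next_tab"           # Gesture 3: right sensor swept
    if swipe_left:
        return "prev_or_playpause"  # Gesture 4: left sensor swept
    return None                     # no gesture recognized
```

In the real system, this classification runs on the Arduino over many samples with debouncing; the structure of the conditions is the same.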

D. Working Principle
Gesture control is based on determining the hand position using the ultrasonic sensors. A microcontroller is essential for processing the raw data; for this we use the Arduino UNO board. Via the USB connection, the microcontroller transfers the distance value it processes and calculates from the sensor readings. The data sent by the sensors is processed by the software on the PC, where all calculations are performed and the data is matched against predefined conditions (gesture resolution). In this model, two ultrasonic sensors are used to detect the hand position and are connected to the Arduino board as shown in the figure. An ultrasonic sensor continuously emits sound pulses, which are reflected back from the user's hand, as shown in the figure. The microcontroller measures the time between sending a pulse and detecting the reflected sound wave, and calculates the distance from it [4].

This model has the following hardware components: two ultrasonic sensors (HC-SR04) and an Arduino UNO board. The system software includes the Arduino IDE and Python. To run this model, the Python code must be run first; it matches gestures against the predefined conditions and prints the result to the Python output shell.
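The distance calculation mentioned above follows directly from the echo timing: the pulse travels to the hand and back, so the one-way distance is half of the travel time multiplied by the speed of sound. A small sketch (function name and sample values are illustrative):

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s in air at room temperature

def echo_to_distance_cm(echo_us):
    """Convert an HC-SR04 echo pulse width (microseconds) to distance (cm).

    The echo pulse covers the round trip to the hand and back,
    hence the division by 2 for the one-way distance.
    """
    return echo_us * SPEED_OF_SOUND_CM_PER_US / 2.0

# An echo of about 1166 microseconds corresponds to a hand ~20 cm away,
# i.e. within the 15-35 cm band used by the hold gestures.
distance = echo_to_distance_cm(1166)
```

The same arithmetic runs on the Arduino; the speed of sound varies slightly with temperature, which sets a practical limit on absolute accuracy.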
Figure.3.a Ultrasonic sensor working diagram

IV. RESULT AND DISCUSSION

Gesture recognition using ultrasonic waves is found to be accurate and reliable. The testing methodology comprised movements of a single hand or multiple hands. Single-hand movement was detected accurately; when there are multiple hands, the movement is not detected accurately. The detection did not take the background area into account, and noise in the human audible range did not affect the detection. In our work, we have implemented Arduino-based hand gesture control of a computer, where a few hand gestures made in front of the computer perform defined tasks without using a mouse or keyboard.

Figure. A playing video


This type of gesture-based control of computers already exists; a company called Leap Motion has been implementing such technology in computers. Such hand gesture control of computers can be used for VR (Virtual Reality), AR (Augmented Reality), 3D design, reading sign language, etc. The figures show examples of a playing and a paused video.
V. APPLICATION

Gesture recognition is useful for processing information from human beings that is not conveyed through speech or other methods. This technology is useful in the following areas:
a. Immersive gaming technology: Gestures may be used to control interactions with the gaming console and give a more interactive and immersive experience.
b. Control through facial gestures: The technology can be extended to applications requiring even more precision, such as recognizing facial gestures. This is helpful in situations where users cannot use other input interfaces such as a mouse, a keyboard, or even hand gestures, and would additionally be useful in applications such as mood sensing.
c. Alternative computer interface: Robust gesture recognition can be used to accomplish common tasks traditionally performed with current input devices such as the mouse or keyboard. Gestures, along with other methods such as speech recognition, can be made to control electronic appliances and gadgets completely, or with little need to type or touch.
d. Remote control: Using gesture recognition, it is possible to use the hand alone as a remote control for various devices. The signal must indicate not only the desired response but also which device is to be controlled.
e. Home appliance control: It is possible to extend gesture recognition technology to control household appliances.

VI. CONCLUSION AND FUTURE SCOPE

This report presents one solution among many for operating a computer using hand gestures. It is one of the easiest ways of interaction between human and computer, and a cost-effective model based only on an Arduino UNO and ultrasonic sensors. Python integrates seamlessly with the Arduino UNO, allowing different processing and control methods for creating new gesture control solutions. Additional gesture recognition opportunities exist in medical applications where, for health and safety reasons, a nurse or doctor may not be able to touch a display or trackpad but still needs to control a system. In other cases, the medical professional may not be within reach of the display yet still needs to manipulate the content being shown on it [1]. Appropriate gestures, such as hand swipes or using a finger as a virtual mouse, are a safer and faster way to control the device.