
ISSN : 2230-7109 (Online) | ISSN : 2230-9543 (Print) — IJECT Vol. 4, Issue Spl-2, Jan-March 2013

Hand Gesture Control Robot Vehicle


Riyaz Mansuri, Sandesh Vakale, Ashish Shinde, Tanveer Patel
Electronics and Telecommunication, University of Mumbai
Mahatma Education Society's Pillai's Institute of Information Technology, Engineering, Media Studies and Research
Dr. K.M. Vasudevan Pillai's Campus, 10, Sector 16, New Panvel, Navi Mumbai, India
Abstract
In this paper, we introduce a hand-gesture-based control interface
for navigating a car-robot. A 2-axis accelerometer is used to
record the user's hand trajectories. The trajectory data is transmitted
wirelessly via an RF module. The received trajectories are then
classified into one of four commands for navigating the car-robot,
plus two control commands for the claw/robotic arm.

I. Introduction
This robot is a prototype, but its applications are vast. It uses
a tilt sensor (an accelerometer) to move forward, backward, left,
or right. Many of us have played games on Android phones in which
tilting the phone makes the game character act; the same technology
is used here. A tilt sensor is mounted on a remote, and the remote
and the robot communicate over a wireless link. When you tilt the
remote forward, the robot moves forward; when you tilt it backward,
the robot moves back, and likewise for right and left. The robot
can be controlled easily within a range of about 100 meters.

II. History
New intelligent algorithms could help robots quickly recognize
and respond to human gestures. Researchers have previously created
computer programs that recognize human gestures quickly and
accurately while requiring very little training.
Interface improvements, more than anything else, have triggered
explosive growth in robotics. Teitelman developed the first
trainable gesture recognizer in 1964. It was quite common for
light-pen-based systems to include some gesture recognition, for
example the AMBIT/G system (1968, ARPA-funded). A gesture-based
text editor using proof-reading symbols was developed at CMU
by Michael Coleman in 1969. Gesture recognition has been used
in commercial CAD systems since the 1970s.
Furthermore, the research that will lead to the user interfaces
of tomorrow's computers is happening at universities and a few
corporate research labs.

III. Mechanical Design

Fig. 1:

Fig. 2:

IV. Programming Flow Chart


Fig. 4:
V. Components Used

A. PIC18F4525
It is a 40-pin IC whose pin description is given below.

Fig. 5:

B. Accelerometer
An accelerometer is an electromechanical device that measures
acceleration forces. These forces may be static, like the constant
force of gravity pulling at your feet, or dynamic, caused by
moving or vibrating the accelerometer.

Fig. 3: Tilt Sensor and its Circuit
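The paper does not reproduce the sensing code, so the following is a minimal C sketch of how a 2-axis tilt reading could be classified into the four navigation gestures. The adc_read() routine, its channel numbers, and the threshold values are illustrative assumptions, not taken from the paper.

#include <stdint.h>

/* Hypothetical HAL call: returns one 10-bit ADC sample for the given channel. */
extern uint16_t adc_read(uint8_t channel);

#define ADC_LEVEL 512 /* assumed mid-scale reading when the hand is level */
#define DEADBAND   80 /* assumed tilt threshold; smaller tilts are treated as noise */

typedef enum { GESTURE_NONE, GESTURE_FORWARD, GESTURE_REVERSE,
               GESTURE_LEFT, GESTURE_RIGHT } gesture_t;

/* Classify the current hand tilt into one of the four navigation commands. */
gesture_t read_gesture(void)
{
    int16_t x = (int16_t)adc_read(0) - ADC_LEVEL; /* X axis: left/right tilt */
    int16_t y = (int16_t)adc_read(1) - ADC_LEVEL; /* Y axis: forward/reverse tilt */
    int16_t ax = (x < 0) ? -x : x;
    int16_t ay = (y < 0) ? -y : y;

    /* The dominant axis, if tilted beyond the deadband, decides the command. */
    if (ay >= ax) {
        if (y >  DEADBAND) return GESTURE_FORWARD;
        if (y < -DEADBAND) return GESTURE_REVERSE;
    } else {
        if (x >  DEADBAND) return GESTURE_RIGHT;
        if (x < -DEADBAND) return GESTURE_LEFT;
    }
    return GESTURE_NONE;
}

The deadband matters in practice: an accelerometer also reports hand tremor and vibration, and without a threshold the robot would jitter between commands.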
C. Motor Driver L293D/L298
The motor driver is an H-bridge:
• It is an electronic circuit that enables a voltage to be applied
across a load in either direction.
• It gives a circuit full control over a standard DC motor: with
an H-bridge, a microcontroller, logic chip, or remote control can
electronically command the motor to go forward, reverse, brake,
and coast.
• H-bridges are available as integrated circuits, or they can be
built from discrete components.
• A "double pole double throw" relay can generally achieve the
same electrical functionality as an H-bridge, but an H-bridge is
preferable where a smaller physical size, high-speed switching,
or a low driving voltage is needed, or where the wearing out of
mechanical parts is undesirable.
The term "H-bridge" is derived from the typical graphical
representation of such a circuit, which is built with four switches,
either solid-state (e.g., L293/L298) or mechanical (e.g., relays).
The current provided by the MCU is of the order of 5 mA, while
that required by a motor is around 500 mA. Hence the motor cannot
be controlled directly by the MCU, and an interface is needed
between the MCU and the motor. The motor driver does not amplify
the MCU's current; it acts as a switch.

Fig. 7: Bot Motor Controller

Fig. 8: Motor Driver Mechanism
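To make the four switch states concrete, here is a hedged C sketch of driving one motor through an L293D-style H-bridge; gpio_write() and the pin numbers are illustrative assumptions. Only the truth table is fixed by the driver itself: unequal inputs rotate the motor, equal inputs with the enable high brake it, and a low enable lets it coast.

#include <stdbool.h>
#include <stdint.h>

/* Hypothetical HAL call: drive a GPIO pin high or low. */
extern void gpio_write(uint8_t pin, bool level);

/* Assumed wiring of one L293D channel; actual pins depend on the board. */
#define PIN_IN1 0 /* L293D input 1 */
#define PIN_IN2 1 /* L293D input 2 */
#define PIN_EN  2 /* L293D enable for this channel */

typedef enum { MOTOR_FORWARD, MOTOR_REVERSE, MOTOR_BRAKE, MOTOR_COAST } motor_cmd_t;

/* Set the H-bridge switch state for one motor. */
void motor_drive(motor_cmd_t cmd)
{
    switch (cmd) {
    case MOTOR_FORWARD: /* current flows one way through the winding */
        gpio_write(PIN_IN1, true);  gpio_write(PIN_IN2, false); gpio_write(PIN_EN, true);
        break;
    case MOTOR_REVERSE: /* current flows the opposite way */
        gpio_write(PIN_IN1, false); gpio_write(PIN_IN2, true);  gpio_write(PIN_EN, true);
        break;
    case MOTOR_BRAKE:   /* both motor terminals at the same level: fast stop */
        gpio_write(PIN_IN1, false); gpio_write(PIN_IN2, false); gpio_write(PIN_EN, true);
        break;
    case MOTOR_COAST:   /* outputs disabled: the motor free-runs */
        gpio_write(PIN_EN, false);
        break;
    }
}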

D. RF Module

Fig. 9:
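The paper does not specify the RF module's framing, so the sketch below simply assumes the command byte is pushed over a UART feeding a generic transmitter. uart_write_byte() is a hypothetical HAL call, and the sync byte and checksum are illustrative additions that let the receiver reject corrupted frames.

#include <stdint.h>

/* Hypothetical HAL call: send one byte to the UART feeding the RF transmitter. */
extern void uart_write_byte(uint8_t b);

#define FRAME_START 0xA5 /* illustrative sync byte marking the start of a frame */

/* Send one gesture command as a tiny 3-byte frame: sync, payload, checksum. */
void rf_send_command(uint8_t command)
{
    uart_write_byte(FRAME_START);
    uart_write_byte(command);
    uart_write_byte((uint8_t)(FRAME_START + command)); /* simple additive checksum */
}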


E. Robotic Arm

Fig. 10:

F. Power Supply
Battery: 6 V, 4 A. The 7805 has a 5 volt output, while the 7812
produces 12 volts.

Fig. 6: Voltage Regulator 7805/7812

Working

Fig. 11:

• The purpose is to sense the coordinates of the position of the
hand.
• The ADC converts the output of the sensor to digital form.
• The microcontroller performs the required function as
programmed.
• The sensor outputs are fed to the MCU through a comparator.
• The MCU produces the control pulses; the motor driver drives
the motors according to those control pulses.

Fig. 12: Block Diagram
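As a hedged, end-to-end illustration of the block diagram, this C sketch shows how the receiving MCU could map incoming command bytes onto the motor-driver states described earlier. uart_read_byte() and the per-motor drive helpers are assumed counterparts of the previous sketches, and the command encoding is illustrative, not taken from the paper.

#include <stdint.h>

typedef enum { MOTOR_FORWARD, MOTOR_REVERSE, MOTOR_BRAKE, MOTOR_COAST } motor_cmd_t;

extern uint8_t uart_read_byte(void);            /* assumed blocking HAL call */
extern void motor_drive_left(motor_cmd_t cmd);  /* per-motor variants of the  */
extern void motor_drive_right(motor_cmd_t cmd); /* motor_drive() sketch above */

/* Illustrative command encoding, shared with the transmitter sketch. */
enum { CMD_STOP = 0, CMD_FORWARD, CMD_REVERSE, CMD_LEFT, CMD_RIGHT };

/* Receiver loop: each valid command byte becomes a pair of motor states. */
void control_loop(void)
{
    for (;;) {
        switch (uart_read_byte()) {
        case CMD_FORWARD: motor_drive_left(MOTOR_FORWARD); motor_drive_right(MOTOR_FORWARD); break;
        case CMD_REVERSE: motor_drive_left(MOTOR_REVERSE); motor_drive_right(MOTOR_REVERSE); break;
        case CMD_LEFT:    motor_drive_left(MOTOR_REVERSE); motor_drive_right(MOTOR_FORWARD); break; /* skid turn */
        case CMD_RIGHT:   motor_drive_left(MOTOR_FORWARD); motor_drive_right(MOTOR_REVERSE); break;
        case CMD_STOP:
        default:          motor_drive_left(MOTOR_BRAKE);   motor_drive_right(MOTOR_BRAKE);   break;
        }
    }
}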
VI. Application
• Gesture-driven systems are simpler and easier to operate; any
person with basic knowledge can operate them.
• Such systems tend to penetrate deeper into everyday use rather
than being confined to research labs.
• They will open a new era in home automation and remotely
operated utility devices in households; even the physically
disabled can benefit.
• With proper wireless connectivity, gesture recognition can
provide a simple and reliable solution to household activities.
• Complex and realistic physical interactions in the animation
industry.
• Gesture recognition can be seen as a way for computers to begin
to understand human body language, thus building a richer bridge
between machines and humans than primitive text user interfaces
or even GUIs (graphical user interfaces), which still limit the
majority of input to the keyboard and mouse.
• Gesture recognition enables humans to interface with the machine
(HMI) and interact naturally without any mechanical devices.

Fig. 13:

VII. Deficiency
• The user has a bulky device on his hand which obstructs normal
hand movement.
• Since there is no force feedback, the user cannot feel what he
is working on; a feedback system can be added in future versions.
• Fine movement is difficult to achieve when working with bigger
objects or when the machines being controlled are large.

VIII. Future Scope


• Wireless modules consume very little power and are best suited
for wireless, battery-driven devices.
• Advanced robotic arms designed like the human hand itself can
easily be controlled using hand gestures alone.
• Proposed utility in the fields of construction, hazardous waste
disposal, and medical science.
• A combination of heads-up displays, wired gloves, haptic-tactile
sensors, and omnidirectional treadmills may produce a feeling of
being in a physical place within simulated environments.
• VR simulation may prove to be crucial for military, law
enforcement, and medical surgery applications.

IX. Acknowledgment
The timely completion of this report is due mainly to the interest
and persuasion of our project guide and the HOD of the EXTC
Department, who supported and guided us. Last but not least, the
teamwork, team effort, and team spirit of all the group members
made this report successful.
