
Controlling Automotive Using Hand

Gesture Control System


By
Name of Student: Sonia Anam
Registration Number: 2014-GCUF-06934
Name of Student: Anmol Anwar
Registration Number: 2014-GCUF-06931

Project submitted in partial fulfillment of the


requirements for the degree of

BACHELOR OF SCIENCE
IN
COMPUTER SCIENCE
Distance Learning Education

DEPARTMENT OF COMPUTER SCIENCE


GOVERNMENT COLLEGE UNIVERSITY, FAISALABAD

October, 2018
DECLARATION

The work reported in this project was carried out by Sonia Anam
(Roll # 15134) and Anmol Anwar (Roll # 15131) under the supervision of
Mr. Umer Sarwar, Department of Computer Science, Government College
University, Faisalabad, Pakistan.
We hereby declare that "Controlling Automotive Using Hand Gesture
Control System" and the contents of this project are the product of our own research.
We further declare that this work has not been submitted for the award of any other
degree / diploma. The University may take action if the information provided is
found inaccurate at any stage.

Signature of the Student: ______________

Registration No. 2014-GCUF-06934

Signature of the Student: ______________

Registration No. 2014-GCUF-06931

CERTIFICATE BY THE PROJECT SUPERVISOR

This is to certify that Sonia Anam (Roll No. 15134) and Anmol Anwar (Roll No.
15131) have successfully completed the final project titled Controlling
Automotive Using Hand Gesture Control System at the Department of
Computer Science, Government College University Faisalabad, in partial fulfillment
of the requirements of the degree of BS (Computer Science).

Internal Examiner

Name: _____________

Signature: _____________

External Examiner

Name: _____________

Signature: _____________

Chairperson

Department of Computer Science


Government College University Faisalabad

Acknowledgement

Firstly, we are thankful to ALLAH ALMIGHTY, who blessed us with the knowledge
to complete this project. We truly acknowledge the cooperation and help of
Mr. Umer Sarwar, teacher and Project Coordinator of the Department of
Computer Science, Government College University Faisalabad, for his guidance
throughout this project. He has been a constant source of support throughout its
course, and he helped in requirement gathering and specification.

SONIA ANAM

2014-GCUF-06934

ANMOL ANWAR

2014-GCUF-06931

Table of Contents
Table of Contents ........................................................................................................... 5
CHAPTER #1 ........................................................................................................ 9
1 Introduction .......................................................................................................... 14
1.1 Background and Motivation:......................................................................... 15
1.2 Review of literature ....................................................................................... 16
1.3 Statement of purpose ..................................................................................... 16
1.4 TECHNICAL REQUIREMENTS ................................................................ 16
1.4.1 A. Microcontroller ................................................................................. 17
1.4.2 B. Mechanical Components ................................................................... 17
1.4.3 C. Bluetooth Module .............................................................................. 17
1.4.4 D. Android Smartphone ......................................................................... 17
1.5 SYSTEM DESCRIPTION ............................................................................ 17
1.6 Block Diagram of the system ........................................................................ 18
1.7 Transmission of data from Android application to Motor driver .................. 18
1.8 Receiving the data ......................................................................................... 19
1.9 Gesture Recognition ...................................................................................... 19
1.10 Movement of Motors and Wheels ............................................................. 21
1.11 FUTURE SCOPE AND CONCLUSION .................................................. 22
CHAPTER #2 ...................................................................................................... 24
2 Design .................................................................................................................. 25
2.1 Block Diagram .............................................................................................. 25
2.2 Physical Design Diagram .............................................................................. 26
2.2.1 Physical Design Diagram of Glove........................................................ 26
2.2.2 Physical Design Diagram of Car............................................................ 27
2.3 Block Design ................................................................................................. 27
2.3.1 Power Module A .................................................................................... 27
2.3.2 Battery A ................................................................................................ 27
2.3.3 Linear Voltage Regulator A ................................................................... 28
2.3.4 Sensor Module A ................................................................................... 28
2.3.5 Flex sensor ............................................................................................. 29
2.3.6 Accelerometer ........................................................................................ 30
2.3.7 Control Module A .................................................................................. 32
2.3.8 Microcontroller A .................................................................................. 32
2.3.9 Communication Module ........................................................................ 33
2.3.10 Transmitter Device................................................................................. 34
2.3.11 Receiver Device ..................................................................................... 35
2.3.12 Power Module B .................................................................................... 35

2.3.13 Battery B ................................................................................................ 35
2.3.14 Linear Voltage Regulator B ................................................................... 36
2.3.15 Sensor Module B.................................................................................... 36
2.3.16 Ranging Sensor ...................................................................................... 37
2.3.17 Control Module B .................................................................................. 37
2.3.18 Microcontroller B................................................................................... 37
2.3.19 Motion Module ...................................................................................... 38
2.3.20 Motors .................................................................................................... 38
2.3.21 Power MOSFET (RFP12N10L) ............................................................ 39
2.3.22 Wheels.................................................................................................... 40
2.3.23 Circuit Schematics ................................................................................. 41
2.3.24 Control Unit Flowchart .......................................................................... 41
2.4 Tolerance Analysis ........................................................................................ 43
2.5 Safety and Ethics ........................................................................................... 45
CHAPTER #3 ........................................................................................................ 1
3 UML Diagrams ...................................................................................................... 1
3.1 ER Diagram ..................................................................................................... 1
3.2 Class Diagram ................................................................................................. 2
3.3 Use case Diagram ............................................................................................ 3
3.3.1 Gesture Recognition................................................................................. 3
3.3.2 Processing Narrative ................................................................................ 3
3.3.3 Dynamic Behaviour ................................................................................. 4
3.4 Sequence Diagram........................................................................................... 4
3.4.1 Recognition Gesture................................................................................. 5
3.4.2 Interface Description ................................................................................ 5
CHAPTER #4 ........................................................................................................ 6
4 Hardware Used & Technical Specifications ........................................................ 48
4.1 Required Components ................................................................................... 48
4.2 Arduino UNO ................................................................................................ 48
4.2.1 Sender .................................................................................................... 48
4.2.2 Receiver: ................................................................................................ 49
4.3 DC Motors ..................................................................................................... 49
4.4 Accelerometer: .............................................................................................. 50
4.4.1 Connect to the Power: ............................................................................ 50
4.4.2 Connect the X,Y and Z Signal Outputs: ................................................ 50
4.4.3 Using the Voltage Reference: ................................................................ 51
4.4.4 Schematic: .............................................................................................. 52

6
4.5 NRF Pair:....................................................................................................... 52
4.5.1 Pin Description of Accelerometer:......................................................... 53
4.6 Motor Driver ................................................................................................. 54
4.7 Circuit Diagram and Explanation .................................................................. 54
4.8 Working......................................................................................................... 54
4.9 Implementation.............................................................................................. 56
4.9.1 Initial automotive movement ................................................................. 56
4.9.2 Initial gesture implementation ............................................................... 57
4.9.3 Final design ............................................................................................ 58
4.10 Controlling Speed of Automotive.............................................................. 58
4.11 Complete Outcome Structure of Hand Gesture: ........................................ 59
4.12 Complete Outcome Structure of Automotive: ........................................... 59
CHAPTER #5 ...................................................................................................... 60
5 Results and Analysis ............................................................................................ 61
5.1 How does it work and recognize the gesture? ............................................... 61
5.2 Make the power supply ................................................................................. 61
5.3 Start Making the Transmitter (Remote) ........................................................ 62
5.4 Make the Receiver:........................................................................................ 64
5.4.1 HT12E PIN Description:........................................................................ 64
5.4.2 PIN Description HT12D ........................................................................ 65
5.4.3 PIN Description of L293D ..................................................................... 65
5.5 Choose the right motor .................................................................................. 66
5.6 Choose the right RPM for motor ................................................................... 66
5.7 DEBUGGING (OPTIONAL, if there is problem with circuit):.................... 66
5.7.1 L293D IC ............................................................................................... 66
5.7.2 POWER SUPPLY .................................................................................. 66
5.8 What gestures will the robot recognize? ....................................................... 67
5.9 Testing and Evaluation .................................................................................. 70
5.10 Automotive ................................................................................................ 70
5.11 Gesture Design .......................................................................................... 70
5.12 Wired Implementation: .............................................................................. 70
5.13 Wireless Implementation ........................................................................... 71
5.14 Range of wireless....................................................................................... 72
5.15 Battery ....................................................................................................... 72
Chapter#6 ............................................................................................................. 73
6 Conclusion ........................................................................................................... 74
6.1 Further Work ................................................................................................. 74

6.2 Summary ....................................................................................................... 75
Chapter# 7 ............................................................................................................ 76
7 Coding .................................................................................................................. 77
7.1 Sender code ................................................................................................... 77
7.2 Receiver code ................................................................................................ 79

Project Abstract:
Nowadays, the use of robots has increased greatly. This report presents an algorithm to
control a robotic car using hand gesture recognition. The system is a form of human-computer
interaction (HCI): the user controls the robotic car with gestures of his/her palm.
A webcam captures the hand movement of the user, and image processing detects the
gesture performed. Command signals are generated from these gestures and passed to
the robotic car, which moves in the direction specified by the user. In this way, we have
created a system in which the user controls the robotic car wirelessly by hand gesture,
on the Windows 7 platform, using Java and embedded C technologies.

CHAPTER #1
INTRODUCTION

1 Introduction
Humans interact with the physical world by means of the five senses. Gestures,
however, have been an important means of communication in the physical world from
ancient times, even before the invention of any language. In this era of machines
taking control of every complex task, interaction with machines has become more
important than ever. Robots are classified into two types: autonomous robots, such as
line-sensing or edge-sensing robots, and remote-controlled robots, such as gesture-
controlled robots. Since this project deals with a gesture-controlled robot, the primary
focus will be on remotely controlled robots only. Undoubtedly, the output and
functioning of machines will be more intuitive if they are operated using human
gestures.
A gesture is a form of non-verbal communication using visible body movements or
actions to convey messages. There are several ways to capture a human gesture that a
machine can understand: the gesture can be captured with a camera or a data glove, or
via Bluetooth, infrared, acoustic, tactile, optical, or motion-sensing technologies.
Embedded systems designed for specific control functions can be optimized to reduce
the size and cost of the device and to increase reliability and performance. With the
advent of smartphones and other modern technologies, operating machines has
become more flexible. Smartphones are equipped with a built-in accelerometer, which
can be used for gesture recognition and similar tasks. Moreover, the Android OS is
gaining significant popularity in the smartphone world due to its open architecture,
and the Android platform is being used in the development of numerous cell-phone
applications. Researchers have shown interest in gesture recognition and have built
several robots and devices that are controlled by human gestures, and there is constant
development in the field of gesture-controlled devices. Apart from hand gesture
recognition, emotional gesture recognition from the face is also done in some cases.
Two types of gestures are used in gesture recognition: online gestures, in which direct
manipulations such as rotation and scaling are performed, and offline gestures, in
which processing is done only after the user interacts with the object. Gesture
technologies are applied in several fields, such as augmented reality, socially assistive
robots, recognition of sign languages, emotion detection from facial expressions,
virtual mice or keyboards, and remote control. There are various modes of
communication between the microcontroller of the robot and the smartphone; the
popularly used means are RF, Bluetooth, and Wi-Fi. Using RF limits the distance
from which the robot can be controlled, while using Wi-Fi increases the overall setup
cost. The robot has therefore been built with Bluetooth, which lies between RF and
Wi-Fi in both range and cost.
In this project, an Arduino microcontroller is incorporated in the robot for the main
computation and the communication between all the modules. A motor driver handles
the computation and functioning of the motors that turn the wheels essential for the
movement of the robot. Last but not least, a Bluetooth module in the robot receives
the data from the smartphone; this data is processed in the Arduino to detect the
direction of movement of the user's hand and move the robot accordingly. The prime
aim of the design is that as the user moves his hand in some direction, the robot moves
in the same direction as well; in other words, the robot is controlled solely by the hand
movements and gestures of the user. The goal of this project is to develop a method to
control and program a robot with gestures while assuring a high level of abstraction,
cheap and minimal hardware, and simplified robot programming.

1.1 Background and Motivation:


The emergence of robots can be traced back to the 1990s with Helpmate robots and
Robot-Caddy. Since then, there has been exponential development in the field of
robotics, and controlling robots through human gestures has been a topic of research
for a long time. With the implementation of gestures to control robots, several
methodologies have been used to perform the action.

Some of the related work is described in this section.

A. Light-based Gesture Recognition
Light or illumination tracking, and controlling robots with light sensors, is done in
many cases. Such robots are autonomous in nature. Generally, some light sensors are
associated with the robot; the sensors send out rays of light and track whether they are
absorbed by the surface or reflected back. On this basis, line-sensing robots can be
made to follow a black or a white path autonomously.

B. Vision-based Gesture Recognition
Several robots are designed to be controlled by vision-based gestures. In such robots
there are generally cameras as sensors, which also act as an interface to control the
robot through manipulators. The input gesture can be a pattern, a hand movement,
color tracking, face recognition, finger tracking, or a template. Vision-based
recognition is also used in ball tracking and robot-football games, where robots play
the traditional game of football by tracking the movement of the ball. Though it has
paved the way for advanced robot operations, it is affected by factors such as
illumination, foggy weather, background lights, etc.

C. Motion-based Gesture Recognition
Motion can also be used to control a robot, generally by incorporating an
accelerometer so that the robot is controlled wirelessly; this can also be done with
other sensors. This method is beneficial over the others in that it allows natural
interaction with machines, without the interference that affects mechanical devices
[3]. One important development in this field was made by Sauvik Das et al. in 2010,
who designed a spying device yielding the location and activities of the user without
his/her knowledge.

D. Sixth Sense Technology
Sixth Sense technology began in 1990 with Steve Mann, who implemented a
wearable computing device via a neck-worn or head-mounted projector coupled with
a camera. Later, following his idea, Pranav Mistry, then a young research scientist at
MIT, came up with new applications of this technology. Pranav Mistry coined the
name "Sixth Sense Technology", and it has since also been named Wear Ur World
(WUW). This technology applies all of the techniques mentioned above to design
applications that give an intuitive output with a connection to the internet.

1.2 Review of literature
Stefan Waldherr, Roseli Romero, and Sebastian Thrun describe a gesture interface for
the control of a mobile robot equipped with a manipulator. The interface uses a camera
to track a person and recognize gestures involving arm motion. A fast, adaptive
tracking algorithm enables the robot to track and follow a person reliably through
office environments with changing lighting conditions. Two alternative methods for
gesture recognition are compared: a template-based approach and a neural-network
approach. Both are combined with the Viterbi algorithm for the recognition of gestures
defined through arm motion (in addition to static arm poses). Results are reported in
the context of an interactive clean-up task, where a person guides the robot to specific
locations that need to be cleaned and instructs the robot to pick up trash.

1.3 Statement of purpose


The aim of this project is to build, through research and implementation, a remote-
controlled automotive which can be operated by gestures, using command data obtained
from a visual input device. The aim can be divided as follows:
 Designing a gesture recognition and tracking system which identifies and
interprets gestures and translates them into motion commands to be followed by a
car.
 Controlling a car, mounted with a microcontroller, using the commands interpreted
from the gesture recognition data.
 Successfully establishing a reliable communication link between the gesture
tracking system and the car, thereby integrating them into one cohesive
system.

1.4 TECHNICAL REQUIREMENTS

In this work, three fundamental components are required, as shown in
Figure 1.

Figure 1: Technical Components used in the work

1.4.1 A. Microcontroller
The first requirement for the design of the robot is the microcontroller. In this work,
an Arduino based on the ATmega328 has been used. It is an open-source electronics
prototyping platform with 14 digital I/O pins, 6 analog inputs, a 16 MHz crystal
oscillator, a USB connection, a power jack, an ICSP header, and a reset button. In
Figure 1, it is marked as 1.

1.4.2 B. Mechanical Components


A motor driver is an additional requirement in this work, necessary to make the
output more intuitive. DC motors are used to rotate the four wheels because of their
effectiveness when designing a robot with a high gear ratio; the motors are shown in
the figure as 4. The motor shield used here is the Adafruit Motor Shield V2, marked
as 2 in the figure, though a commercially manufactured motor driver is used to
complete the work.

1.4.3 C. Bluetooth Module


Bluetooth is a global standard for short-range wireless connectivity, and it is an
essential component in this work: the module connects the microcontroller and the
Android smartphone for data exchange. The module used here is the HC-05, an
easy-to-use Bluetooth SPP (Serial Port Profile) module with typical -80 dBm
sensitivity, up to +4 dBm RF power, low-power 1.8 V operation, and several software
features that facilitate connectivity [9]. In Figure 1, it is denoted by 3.

1.4.4 D. Android Smartphone


This is the remote component that senses the user's hand movements and sends the
corresponding determinant value to the Arduino microcontroller via the Bluetooth
module. The Android application designed for this work calculates the determinant.

1.5 SYSTEM DESCRIPTION


Gesture recognition is the main aim of this work. The user holds an Android-operated
smartphone and moves or rotates his hand in any direction. The accelerometer within
the phone is calibrated to generate a maximum and a minimum value for the
movement of the hand in three-dimensional co-ordinates, depending upon the external
environmental conditions. The Android application senses the accelerometer
calibration and generates the maximum and minimum values from it. Depending upon
the values obtained, it sends a determinant value to the microcontroller over
Bluetooth. The Bluetooth module receives the data and transmits it to the Arduino,
which checks the determinant value and moves the robot accordingly. The whole
process runs in an infinite loop, so it continues as long as power is supplied. The
output depends directly on the accelerometer inputs, which in turn depend on the
gestures of the user's hand. The steps stated above are broadly described in this
section; the system works through the following steps:

1.6 Block Diagram of the system

Figure 2: Block diagram of the system

1.7 Transmission of data from Android application to Motor driver


The input to the application is the direction of movement of the user's hand, given by
the accelerometer. This value is analog in nature; it is digitally coded by the Android
application before being sent to the Arduino via the HC-05 Bluetooth module. The
signal goes to the digital pins of the Arduino board, which has an inbuilt 8-bit AD/DA
converter. The Arduino processes the received data and, based on it, transmits the
appropriate signal to the motor driver to rotate the motors so that the robot moves in
the direction of movement of the user's hand.

Figure 3: Transmission of data
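As a rough illustration of the 8-bit digital coding step described above, the sketch below quantizes one analog accelerometer axis into a single byte before transmission. The [-1 g, +1 g] input range and the function name are assumptions for illustration, not the project's actual encoding.

```cpp
// Sketch only: quantize an analog accelerometer axis reading (assumed to lie
// in [-1.0, +1.0], in units of g) to an 8-bit value, standing in for the
// 8-bit digital coding performed before the Bluetooth transmission.
unsigned char quantizeAxis(float g) {
    if (g < -1.0f) g = -1.0f;      // clamp readings outside the assumed range
    if (g > +1.0f) g = +1.0f;
    return (unsigned char)((g + 1.0f) * 127.5f);  // map [-1, +1] to [0, 255]
}
```

The receiving side would invert the same mapping to recover an approximate tilt value.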

1.8 Receiving the data
The data is received from the Android smartphone via the HC-05 Bluetooth module
on the digital pins of the Arduino microcontroller, where it is processed. The
processed data is then passed to the Adafruit motor shield, which drives the motors
based on it.

Figure 4: Receiving the data

1.9 Gesture Recognition


Android smartphones are equipped with inbuilt accelerometers. The application
designed in this work retrieves the accelerometer value and sends a determinant value
to the microcontroller via Bluetooth. As the user moves his hand, the accelerometer
reading changes and is retrieved by the application. There are two values, a maximum
and a minimum, which together specify the range for each function of the robot. If the
value retrieved by the application lies within a specified range, the corresponding
determinant is generated. This determinant is sent to the microcontroller, which
processes it to recognize the corresponding gesture and sends signals to move the
robot accordingly.
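The range check described above can be sketched as a small lookup: each robot function owns a [minimum, maximum] window, and a reading falling inside a window yields that function's determinant. The determinant codes and window values below are invented for illustration; the actual values live in the Android application.

```cpp
// Hypothetical determinant codes; the real codes are defined by the app.
enum Determinant { DET_STOP = 0, DET_FORWARD = 1, DET_BACKWARD = 2 };

struct Window {
    float minVal, maxVal;   // accelerometer range for this gesture
    Determinant det;        // determinant emitted when the reading fits
};

// Return the determinant of the first window containing the reading,
// or DET_STOP when no window matches.
Determinant toDeterminant(float reading, const Window* windows, int count) {
    for (int i = 0; i < count; ++i)
        if (reading >= windows[i].minVal && reading <= windows[i].maxVal)
            return windows[i].det;
    return DET_STOP;
}
```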

Figure 5: Gestures for movement of the robot

Figure 5 shows the gestures that control the movement of the robot. When the user
tilts his hand forward, the gesture is recognized as forward movement, and the robot
moves in the forward direction. The angle of the tilt, or the difference between the
angle of tilt of the user's hand and the threshold value of the forward-movement
gesture, determines the speed of the robot. When the user tilts his hand to the right,
the gesture is recognized as a right turn, and the robot moves in the right direction.
When the user tilts his hand to the left, the gesture is recognized as a left turn, and the
robot moves in the left direction. The angle of tilt of the user's hand determines
whether a left or right turn is a normal turn or a sharp turn; a sharp turn is one in
which the car changes direction without slowing down before turning. When the user
tilts his hand backwards, the gesture is recognized as the move-backward gesture, and
the robot moves in the backward direction. If the user's hand is somewhere between
two gestures, i.e., the accelerometer value lies between the thresholds of two
directions (forward and left turn, left turn and backwards, backwards and right turn,
forward and right turn), then the robot moves in that diagonal direction.
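The mapping from tilt to movement, including the diagonal cases, can be sketched as below. Pitch and roll stand in for the two accelerometer axes, and the single shared threshold is an assumption; the project's actual thresholds are calibrated per gesture in the Android application.

```cpp
// Sketch: classify a two-axis tilt into one of eight directions plus stop.
enum Direction { STOP, FORWARD, BACKWARD, LEFT, RIGHT,
                 FORWARD_LEFT, FORWARD_RIGHT, BACKWARD_LEFT, BACKWARD_RIGHT };

Direction classifyTilt(float pitch, float roll, float threshold) {
    bool fwd   = pitch >  threshold;   // hand tilted forward
    bool back  = pitch < -threshold;   // hand tilted backward
    bool left  = roll  < -threshold;   // hand tilted left
    bool right = roll  >  threshold;   // hand tilted right
    if (fwd  && left)  return FORWARD_LEFT;    // between two gestures:
    if (fwd  && right) return FORWARD_RIGHT;   // diagonal movement
    if (back && left)  return BACKWARD_LEFT;
    if (back && right) return BACKWARD_RIGHT;
    if (fwd)  return FORWARD;
    if (back) return BACKWARD;
    if (left) return LEFT;
    if (right) return RIGHT;
    return STOP;                               // hand roughly level
}
```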

1.10 Movement of Motors and Wheels


There are four DC motors used in the design of this robot, one for each wheel. The
motors are controlled by the Adafruit motor shield, which is stacked on top of the
Arduino; each stacked shield can run 4 DC motors. Installing the Adafruit Motor
Shield library gives the flexibility of driving the motors just by calling pre-defined
functions such as motor1.setSpeed(value), which sets the speed of motor1, or
motor1.run(FORWARD), which makes motor1 rotate forward [10]. These functions
are called from the program burnt into the Arduino microcontroller, and the resulting
signal is sent to the motor shield that runs the motors.
The wheels are connected to the motors: of the 4 DC motors, two drive the left wheels
and two drive the right wheels. When the signal received by the motor shield is to
move forward, all four motors rotate forward, turning all four wheels in the forward
direction, and the robot moves forward. When the signal is to turn the robot in the
forward-left direction, the left-side motors are rotated backwards while the right-side
motors are rotated forwards, making the robot turn forward-left. When the signal is to
turn the robot in the forward-right direction, the right-side motors are rotated
backwards while the left-side motors are rotated forwards, making the robot turn
forward-right. When the signal is to move backward, both pairs of motors are rotated
backwards, so the robot moves backwards. When the signal is to stop, all the motors
are made stationary and the robot stops. A similar methodology is used to rotate the
robot in the backward directions: to turn backward-left, the left-side motors are
rotated forwards while the right-side motors are rotated backwards, and to turn
backward-right, the right-side motors are rotated forwards while the left-side motors
are rotated backwards.

Figure 6: Movement of the Motors and Wheels
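The signal-to-wheel mapping above can be sketched as a lookup. The enum mirrors the Adafruit library's FORWARD/BACKWARD/RELEASE commands, but the names and structure here are illustrative assumptions, not the actual project code.

```cpp
#include <string>

// Per-side motor action, mirroring the Adafruit library's FORWARD/BACKWARD/
// RELEASE run commands (names here are our own).
enum Rotation { ROT_FORWARD, ROT_BACKWARD, ROT_RELEASE };

struct WheelCommand { Rotation left; Rotation right; };  // left/right motor pairs

// Maps a drive command to the rotation of the left and right motor pairs,
// following the scheme described in the text above.
WheelCommand wheelsFor(const std::string& cmd) {
    if (cmd == "forward")        return {ROT_FORWARD,  ROT_FORWARD};
    if (cmd == "backward")       return {ROT_BACKWARD, ROT_BACKWARD};
    if (cmd == "forward-left")   return {ROT_BACKWARD, ROT_FORWARD};
    if (cmd == "forward-right")  return {ROT_FORWARD,  ROT_BACKWARD};
    if (cmd == "backward-left")  return {ROT_FORWARD,  ROT_BACKWARD};
    if (cmd == "backward-right") return {ROT_BACKWARD, ROT_FORWARD};
    return {ROT_RELEASE, ROT_RELEASE};                   // stop: all motors idle
}
```

In the actual sketch each returned rotation would be applied with the shield's run() call on the corresponding motor pair.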

1.11 FUTURE SCOPE AND CONCLUSION


The gesture controlled robot designed in this work has many future possibilities.
The robot can be used for surveillance purposes. The design can be applied to a
wheelchair, so that the wheelchair is driven by the movements of the rider's hand.
Wi-Fi can be used for communication instead of Bluetooth to access it from a
greater distance. Edge sensors can be incorporated to prevent the robot from
falling off a surface. A camera can be installed to record and send data to a
nearby computer or cell phone. The same approach can be implemented in a watch or
in home appliances such as a room heater. Modern Arduino boards support intranet
as well as Internet connections, which can be exploited further: this robotic car
could be enhanced for military surveillance, where it is sent into enemy camps and
its activities tracked via the Internet. With a mind full of creativity, the
possibilities are endless. In this project, the design and implementation of a
gesture controlled robot is presented, developed using an Arduino microcontroller
and an Android smartphone. An algorithm has been provided and its working detailed
thoroughly. Since the possibilities for improvement are endless, updating the
system has been kept as future work. The built device is cheap and easy to carry
from one place to another, and the addition of some extra sensors or a camera will
make it more productive. The limitation of the hardware being tied to one system
has been reduced to a great extent. As a final thought, the system allows the user
to control it in a way that narrows the gap between the physical world and the
digital world, with a more intuitive output.

CHAPTER #2
DESIGN

2 Design
 The car must respond quickly and accurately to every given gesture. When the
user gives a command through the glove, the car will be able to respond within 1
second.

 The glove must be relatively comfortable to wear and to make gestures in. The
size of the glove will be between 8.5 and 9 inches. With all the electrical
components installed on the glove, it should still fit most people's hands.

 Communication between the car and the glove must be easy to set up during use.
There are 2 separate switches, one on the glove and one on the car. When both are
turned on, the communication starts.

2.1 Block Diagram

Figure 1. Block Diagram

2.2 Physical Design Diagram

2.2.1 Physical Design Diagram of Glove

Figure 2. Physical Design Diagram of the Glove

2.2.2 Physical Design Diagram of Car

Figure 3. Physical Design Diagram of the Car

2.3 Block Design

2.3.1 Power Module A


This module serves to provide power to all the electrical components on the glove,
including three flex sensors, one accelerometer, a microcontroller, and a wireless
communication transmitter. The module will provide stable 5V and 3.3V voltages at
all times. It consists of a battery and a linear voltage regulator.

2.3.2 Battery A
A 12V Energizer A23 battery will be used here to provide power. The battery itself
only weighs 0.8 ounces, with a dimension of 2.9x1.8x0.5 inches. Adding up the
power consumption and current flow of each individual component on the glove, the
maximum total current will be 18.97mA and the maximum power will be 67.43mW.
Therefore, one battery will be sufficient to provide enough power for 3-4 hours.

Requirements and Verifications
1. Must be able to provide a voltage of 7-12V under continuous operation (the
minimum input voltage for the regulator is 7V).
a. Before each use of the glove, use a voltmeter to measure the voltage level of
the batteries.
b. If the voltage level drops close to 7V, the battery is about to die; change to
a new battery.
2. Must stay under 60°C under all conditions.
a. Feel the temperature by hand; if fingers cannot stay on the battery for over
1 second, take the battery out and let it cool down.

2.3.3 Linear Voltage Regulator A


A linear voltage regulator is used here to regulate the voltage level to 5V. After
some comparison, the STMicroelectronics L7805CV meets all of our requirements. It
accepts a wide input voltage range, from 7V to 35V, and its output current is 1A.
It can operate normally at ambient temperatures up to 125°C, which makes it well
suited to our project. On the glove, as the battery discharges, its voltage drops
below the 12V level; with the help of this linear voltage regulator, the
microcontroller and sensors will always have a stable 5V input voltage.

Power Consumption:
The operating voltage is 4.8-5.2V, and the current is 5mA-1A. Therefore, the power
will be 0.024-5.2W.
Requirements and Verifications
1. Provides 5V ±5% from a 12V source at 1A output current.
a. Use a constant-current circuit to draw 1A from this linear voltage regulator.
b. Use an oscilloscope to monitor the output voltage, ensuring it is within the
5V ±5% range.
2. Maintains a temperature under 125°C.
a. While doing the step above, use an IR thermometer to ensure the IC temperature
stays under 125°C.

2.3.4 Sensor Module A


This module contains the two types of sensors on the glove that are needed for
hand gesture detection. The first type is the flex sensor, which senses the curl
of the fingers. The second type is the accelerometer, which senses the
orientation of the hand. The sensor output data from this module is transferred
to Control Module A for further usage.

2.3.5 Flex sensor
Three flex sensors will be used to sense the curl of the fingers (this gesture
controls the drive/stop state as well as the speed of the car). They will be
attached to the back of the three longest fingers. A flex sensor is essentially a
variable resistance, ranging from 30kOhm to 130kOhm, and the circuit must keep
the voltage going into the microcontroller ADC pin between 1.389V and 3.125V.
Below is a circuit diagram for feeding the voltage related to the bend of a flex
sensor into a microcontroller (denoted MP in the diagram).

Figure 4. Circuit for Feeding Flex Sensor Measurement into a Microcontroller

For component testing, we built the above circuit and used Arduino board to read in
the analog data for several bending degrees for each of the three flex sensors we have.
Below is a table of the bending degree-vs-ADC readings.

ADC Readings
Bending Degree Flex Sensor A Flex Sensor B Flex Sensor C
0 degrees 600 630 618
30 degrees 470 470 466
60 degrees 400 408 398
90 degrees 350 356 354

Requirements and Verifications
1. The input voltage to the microcontroller pin (shown in the circuit diagram
above) must vary from 1.389V to 3.125V.
a. As above, connect a voltmeter across the 50kOhm resistor; with the flex sensor
stretched flat, ensure the voltage is around 3.125V.
b. Gradually bend the flex sensor and ensure the voltage gradually decreases.
c. When the flex sensor is bent to its limit, ensure the voltage is around
1.389V.

2.3.6 Accelerometer
An accelerometer will be used to sense the orientation/tilt angle of the glove,
which controls the turning of the car. The model we will use is the ADXL335
3-axis accelerometer. It has ultra-low power consumption (typically 350 μA) and a
very small size. Its power supply range is 1.8-3.6V. The accelerometer has three
analog output pins that will be connected to the analog input pins of the
ATmega328 chip; this is how the sensor data is transferred to the microcontroller
for further processing. The three analog outputs of the accelerometer, denoted
Ax,out, Ay,out, and Az,out, can be converted to three tilt angles (angles with
respect to the three sensing axes) using standard inclination formulas.

Figure 5. Accelerometer on a Flat Surface    Figure 6. Accelerometer at a Tilted Orientation

For component testing, we connected the accelerometer module to an Arduino board,
displayed the ADC readings of the three axes on the Serial Monitor, and recorded
the ADC readings for numerous tilting degrees of each axis. Below is a table of
tilt angle vs. digital reading data for all three axes (0° is when the axis is
perpendicular to the earth's gravity, +90° is when the positive axis aligns with
gravity, and -90° is when the negative axis aligns with gravity):

Power consumption:
Typical operating power: 350 μA at a 3V supply (1.05mW); maximum: 375 μA at 3.6V
(1.35mW); minimum: 200 μA at 2V (0.4mW).

Requirements and Verifications
1. The ADXL335 must be able to sense and transfer its output at a minimum
frequency of 100 Hz.
a. Connect the ADXL335 to an Arduino Uno.
b. Set it up properly and write code to receive real-time output data from the
accelerometer.
c. Use the timing functionality in Arduino to calculate the frequency of the
ADXL335 output data, and make sure it is above 100 Hz.
2. Its analog voltage output must change at least 100mV per 15 degrees of
rotation around any axis.
a. Connect the ADXL335 to an oscilloscope.
b. Manually rotate the chip around any axis in steps of 15 degrees.
c. Record the corresponding axis's output voltage value and make sure it changes
by at least 100mV.
d. Repeat for a total rotation of ±90 degrees for all three sensing axes.

2.3.7 Control Module A


This control module on the glove will continuously take in analog sensing data from
the Sensor Module A, convert it to digital signals, and send properly assembled data
packages to the transmitter device in the communication module. This module
consists of a microcontroller.

2.3.8 Microcontroller A
We will use the Atmega328 microcontroller for the control module on the glove.
The SPI serial port on the Atmega328 will handle the SPI communication with the
transmitter device. The 6-channel 10-bit A/D converter will convert the analog
inputs from the sensor modules to digital signals. After bootloading the
Atmega328 chip, we will build an Arduino circuit that enables us to program the
microcontroller through the Arduino Software (IDE) and send the program to the
memory on the Atmega328 via USB cable.

Figure 7. Pin map of Atmega328

Power Consumption:
With an 8MHz (internal) oscillator, at an operating voltage of 3.43V and a
current of 3.6mA, the power is about 12.3mW; at 3.3V and 6.6mA, it is about
21.8mW.

Requirements and Verifications
1. The Atmega328 chip must be bootloaded before use.
Before pulling the Atmega chip out of the Arduino board, make sure it has been
programmed several times.
2. The Vcc rating is 1.8V to 5V, maximum 6V; the voltage on the A/D pin with
respect to ground can range from -0.5V to Vcc+0.5V.
Before connecting any component to the microcontroller, measure its voltage with
a multimeter to check that it is in the acceptable range.

2.3.9 Communication Module


This module implements the wireless communication between the glove side and the
car side. It consists of a transmitter device on the glove and a receiver device
on the car. The transmitter device continuously receives the real-time data sent
from Control Module A on the glove and transmits it wirelessly to the receiver
device on the car. Control Module B continuously reads the data received by the
receiver device and performs further processing.

2.3.10 Transmitter Device
We will use an NRF24L01 transceiver module to wirelessly transmit real-time data
from the glove side to the car side. It will pair with another identical
NRF24L01 transceiver module on the car (set as receiver) to form the
communication link. The transceiver interfaces with the ATmega328
microcontroller through digital SPI communication. The maximum SPI speed is
8 Mbps and the on-air data rate is 1 or 2 Mbps, both of which are fast enough.
The range of operation is up to 100 m, which is enough for our project. During
normal operation, the microcontroller continuously sends real-time data to the
transceiver, which transmits it to the receiver on the car. An RF24 library for
the transceiver is available online, which makes programming the wireless data
transfer less difficult. The operating voltage of the NRF24L01 is 1.9-3.6V. The
current consumption is very low: only 9.0mA at an output power of -6dBm and
11.3mA at 0dBm. Another advantage of this transceiver is its small physical
size.
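As a sketch of what the data packages might look like: the NRF24L01's payload is at most 32 bytes, so one package can easily carry all six sensor readings. The struct layout and names below are our illustrative assumptions, not the project's actual packet format.

```cpp
#include <cstdint>

// Illustrative data package sent from the glove to the car: three flex-sensor
// readings and three accelerometer axes, each a 10-bit ADC value stored in a
// uint16_t. The whole struct must fit in one NRF24L01 payload (32 bytes max).
struct GlovePacket {
    uint16_t flex[3];    // flex sensor ADC readings
    uint16_t accel[3];   // accelerometer X/Y/Z ADC readings
};

static_assert(sizeof(GlovePacket) <= 32, "must fit one NRF24L01 payload");
```

With the RF24 library, such a struct would typically be sent with radio.write(&pkt, sizeof(pkt)) on the transmitter and read with radio.read(&pkt, sizeof(pkt)) on the receiver.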

Power consumption:
The operating voltage is 1.9-3.6V. The supply current for one channel is 11.8mA
at 1000 kbps and 12.3mA at 2000 kbps. Thus the power consumption range is
22.42-44.28 mW.
Requirements and Verifications
1. It must be able to receive the custom data packages from the microcontroller
and transmit them to the receiver NRF24L01 15m away, at a minimum rate of 100
packages per second.
a. Connect one NRF24L01 transceiver to an ATmega328 microcontroller.
b. Set the transceiver as transmitter.
c. Connect another NRF24L01 to another ATmega328, placed 15m away from the
transmitter.
d. Set it as receiver.
e. Through the ATmega328 programming interface, send custom data packages to the
transmitter at a rate of 120 packages/second.
f. Through the ATmega328 programming interface at the receiving end, record the
number of packages received in one second. Make sure at least 100 packages are
received by the receiver and sent to the microcontroller.
2. The VCC pin on the transceiver module should connect to a 3.3V supply.
a. Before connecting VCC, use a multimeter to measure the voltage to make sure
it is 3.3V.

2.3.11 Receiver Device
For the receiver device, we will use another NRF24L01 transceiver module, set as
receiver. It will continuously receive the real-time data sent by the
transmitter NRF24L01 and then send it to microcontroller B on the car. The
communication between the receiver and microcontroller B is the same as that
between the transmitter and microcontroller A, except that the direction of the
data transfer is reversed. The specs of the receiver are also the same as those
of the transmitter, since they are two identical NRF24L01 modules.

Requirements and Verifications
1. The receiver must receive data packages and send them to the microcontroller
with at least 95% accuracy (compared with the original data packages at the
transmitting end).
a. Do the same steps as in the verification for the transmitter, except this
time send 100 packages only once from the transmitter end.
b. Through the ATmega328 programming interface at the receiving end, record the
received packages and make sure at least 95 of them match the original packages.
2. The VCC pin on the transceiver module should connect to a 3.3V supply.
a. Before connecting VCC, use a multimeter to measure the voltage to make sure
it is 3.3V.

Power consumption:
The operating voltage is 1.9-3.6V. The supply current for one channel is 11.8mA
at 1000 kbps and 12.3mA at 2000 kbps. Thus, the power consumption range is
22.42-44.28 mW.

2.3.12 Power Module B


This module provides power to all the electrical components installed on the
car. The components include two 12V motors, a microcontroller, and a wireless
communication receiver device. This power module provides continuous and stable
12V to the two motors and 5V to the microcontroller and the receiver.

2.3.13 Battery B
Two 12V Energizer A23 batteries will be used, directly meeting the voltage
requirement of the two 12V motors. Each battery weighs only 0.8 ounces, with
dimensions of 2.9x1.8x0.5 inches, and has enough power to continuously support
the components on the car. Even as the batteries lose power and their voltage
level drops, they will still be able to operate the two motors correctly.

Requirements and Verifications
1. Must be able to provide a voltage of 7-12V under continuous operation (the
minimum input voltage for the regulator is 7V).
a. Before each use, use a voltmeter to measure the voltage level of the
batteries.
b. If the voltage level drops close to 7V, the battery is about to die; change
to a new battery.
2. Must stay under 60°C under all conditions.
a. Feel the temperature by hand; if fingers cannot stay on the battery for over
1 second, take the battery out and let it cool down.

2.3.14 Linear Voltage Regulator B


As on the glove side, a linear voltage regulator is used here to regulate the
voltage level to 5V. After some comparison, the STMicroelectronics L7805CV meets
all of our requirements. It accepts a wide input voltage range, from 7V to 35V,
and its output current is 1A. It can operate normally at ambient temperatures up
to 125°C, which makes it well suited to our project. On the car, as the
batteries discharge, their voltage drops below the 12V level; with the help of
this linear voltage regulator, the microcontroller and receiver will always have
a stable 5V input voltage.

Power Consumption:
The operating voltage is 4.8-5.2V, and the current is 5mA-1A. Therefore, the power
will be 0.024-5.2W.
Requirements and Verifications
1. Provides 5V ±5% from a 12V source at 1A output current.
a. Use a constant-current circuit to draw 1A from this linear voltage regulator.
b. Use an oscilloscope to monitor the output voltage, ensuring it is within the
5V ±5% range.
2. Maintains a temperature under 125°C.
a. While doing the step above, use an IR thermometer to ensure the IC
temperature stays under 125°C.

2.3.15 Sensor Module B


This module generates the distance sensing data needed for the car’s anti-collision
feature. It consists of a ranging sensor that will continuously output the distance
sensing data. The data is transferred to the Control Module B for further processing.

2.3.16 Ranging Sensor
We will use the HC-SR04 ultrasonic ranging sensor module for distance
measurement. It will be placed at the front of the car, facing forward. It can
provide stable and accurate distance measurements from 2cm to 400cm, with a
ranging accuracy of up to 3mm. The working frequency of the ultrasonic burst is
40kHz and the measuring angle is 15 degrees. All these specs suit our collision
detection task. The sensor module will be connected to the pins of
microcontroller B and will send distance measurements to the microcontroller for
further usage. For component testing, we connected the ultrasonic sensor to an
Arduino Uno board, continuously read in the sensing data, and displayed it on
the Serial Monitor. We used a hand as the object in front of the sensor and
tested the sensing accuracy at various distances. Below is a table of the actual
distance vs. the sensor output.

Actual (cm)  1  2  3  4  5  6  7  8  9  10  11  12  13  14  15  16  17  18  19  20
Sensed (cm)  3  2  2  3  4  6  7  7  8  10  11  12  13  13  15  16  17  17  18  20

Power consumption:
The operating voltage is 5V and the rated current is 15mA. Therefore, the power is 75
mW.

Requirements and Verifications
1. The sensor must provide distance measurements of an object placed 30cm-2m in
front of the car with at least 90% accuracy.
a. Connect the sensor to an ATmega328 board and set it up properly.
b. Place an object in front of the sensor, starting from 30cm away and ending at
2m, in steps of 10cm.
c. At each step, record the distance measured by the sensor.
d. Calculate the accuracy and make sure it is above 90% for all steps.

2.3.17 Control Module B


This control module on the car uses SPI communication to read, via the
communication module on the car, the information received from the glove; it
processes that information and gives control signals to the motion module, which
is responsible for the movement of the car.

2.3.18 Microcontroller B
We will use the Atmega328 microcontroller for the control module on the car. The
SPI serial port on the Atmega328 will handle the SPI communication with the
receiver device. Port D (PD[7:0]), an 8-bit bi-directional I/O port with
internal pull-up resistors, will be responsible for the PWM output for the
motors on the car. After bootloading the Atmega328 chip, we will build an
Arduino circuit that enables us to program the microcontroller through the
Arduino Software (IDE) and send the program to the memory on the Atmega328 via
USB cable.

Power Consumption:
With an 8MHz (internal) oscillator, at an operating voltage of 3.43V and a
current of 3.6mA, the power is about 12.3mW; at 3.3V and 6.6mA, it is about
21.8mW.

Requirements and Verifications
1. The Atmega328 chip must be bootloaded before use.
Before pulling the Atmega chip out of the Arduino board, make sure it has been
programmed several times.
2. The Vcc rating is 1.8V to 5V, maximum 6V.
Before connecting any component to the microcontroller, measure its voltage with
a multimeter to check that it is in the acceptable range.

2.3.19 Motion Module


This module takes in the control signals from the control unit (microcontroller
B) and makes the car move in the desired way. The module contains two motors, a
power MOSFET, and the wheels. The motions of the car include turning left/right
and increasing/decreasing speed. In particular, the turning of the car is
achieved by spinning the two front wheels at different speeds.

2.3.20 Motors
Two PAN14 motors will be used to drive our car. The PAN14 is a cheap but
high-performance motor; one PAN14 motor weighs only 39g. It has a rated voltage
of 12V, and under its rated load (4.9mN*m) it draws a rated load current of
0.563A. The motors control the speed of the wheels: a PWM signal generated by
the microcontroller is sent to the RFP12N10L power MOSFET, which is connected in
series with a motor, and the speed of the motors can be adjusted by changing the
duty cycle of the PWM signal. The car turns left and right by spinning the two
motors at different speeds. The motors are fed the control signals from
microcontroller B and make the individual wheels rotate accordingly.

Figure 8. Physical Diagram of Motor

Requirements and Verifications
1. Can operate with VDC 12V ±1V.
2. Motor current does not exceed 1.02A under any operating condition.
a. Connect the motor to an Energizer A23 battery (12V); the battery does not
have to be full.
b. Install a wheel on the motor.
c. Use a voltmeter to monitor the battery voltage and a current probe to monitor
the motor current.
d. Ensure both values are within the range.

2.3.21 Power MOSFET (RFP12N10L)


In our design, the speed of the motor is controlled by the PWM power supply: the
higher the duty cycle of the PWM, the faster the motor speed. Our ATmega328
microcontroller can output a PWM signal with a 5V high level, but our motor has
an operating voltage of 12V. We therefore designed a MOSFET circuit that uses
the microcontroller's PWM signal to switch the motor on and off, using an
RFP12N10L n-channel MOSFET as the switch. At the gate, we connect the PWM signal
from the microcontroller, with a 10 kΩ resistor to ground. A 220 Ω resistor
representing the motor sits between the 12V supply and the drain, and the source
is connected to ground. Below is the circuit schematic of the power MOSFET
circuit.

Figure 9. Power MOSFET Circuit Schematic

Requirements and Verifications
1. The RFP12N10L must have a voltage drop of less than 0.5V.
a. Connect the RFP12N10L power MOSFET to the motor and an Arduino Uno.
b. Connect a voltmeter across the MOSFET.
c. Run code that changes the duty cycle by 20% every 2 seconds.
d. Monitor the voltage across the MOSFET, ensuring it is within the range.
2. It must react effectively when the PWM duty cycle changes.
a. Keep the connections from step 1 and keep the code running.
b. Watch the motor to see whether it reacts quickly to every PWM duty cycle
change.

2.3.22 Wheels
There will be three wheels on the car. The two front wheels will be individually
powered by the two motors. The third wheel is a dummy wheel with no motor
connection, placed at the center of the rear. It can rotate freely through all
angles, and its main purpose is to make the turning of the car smoother.

Requirements and Verifications
1. Wheels must stay stably mounted on the motors at full motor speed.
a. Install wheels on both motors and put the car upside down.
b. Connect the motors to the 12V battery and ensure they run at full speed. Let
them run for 60 seconds.
c. Check whether the wheels are still tightly installed on the motors.

2.3.23 Circuit Schematics

Figure 9. Accelerometer Circuit on the Glove Schematic

Figure 10. Control Circuit on the Glove Schematic

2.3.24 Control Unit Flowchart

Figure 11. Algorithm of the microcontroller on the glove

Figure 12. Algorithm of the microcontroller on the car

2.4 Tolerance Analysis


For the tolerance analysis, we will focus on the ADXL335 accelerometer in the sensor
module A. We chose this component for two main reasons. First, its proper
measurements of the hand orientation are essential for the correct turning of the car.
Second, compared with other sensors in the project, the accelerometer has higher
precision requirements and thus has more stringent tolerance.

The important feature of the accelerometer is its measurement sensitivity, namely the
voltage change per given amount of acceleration change. As can be seen in the plots
below, the typical sensitivity is 0.303 V/g for x-axis, 0.306 V/g for y-axis, and 0.300
V/g for z-axis.

Figure 13. X-Axis Sensitivity at 25℃, Vs = 3V Figure 14. Y-Axis Sensitivity at 25℃, Vs = 3V

Figure 15. Z-Axis Sensitivity at 25℃, Vs = 3V

For our task of hand inclination measurement, the static acceleration due to
gravity will vary between +g and -g, corresponding to hand rotation between +90
degrees and -90 degrees. We need to consider how much the sensitivities can
deviate from their expected values while still allowing the microcontroller to
obtain an accurate hand orientation measurement. The ATmega328P microcontroller
has a 10-bit ADC, which gives 1024 levels of resolution; it can therefore
resolve about 5V/1024 ≈ 0.005V. If we want to be able to distinguish a change of
15 degrees of tilt angle (a 2g/12 = g/6 change in acceleration), the minimum
acceptable sensitivity of any axis is 0.005V per g/6, which is 0.03V/g. From the
large number of samples in the plots above, even with potential deviations, all
three axes' sensitivities are still much larger than this minimum requirement.
This large margin also allows external electrical noise and voltage-level
fluctuations to exist without affecting the measuring accuracy.

2.5 Safety and Ethics
There are several safety issues arising from our project, both during the design
process and in its applications. During the design period, the first big safety
aspect is the use of batteries, which power all the other electrical components.
Well-designed circuitry for connecting the batteries, careful storage and
handling, and the prevention of overheating during extended use are all
extremely important when dealing with batteries. Second, there are many safety
concerns related to building electrical components, including soldering safety,
circuit testing with the power on, and the safe use of external devices; to
minimize the risks, we must comply with the standard safety rules. Third, there
is some mechanical work involved in this project, which carries its own safety
issues, and again we must comply with the standard safety rules. Finally, a
remote-controlled mobile car also poses safety issues: we cannot guarantee that
the car will be controlled as intended at all times. As a result, we need to
make sure the car will not crash into anything or anybody and cause a hazard
during testing. This can be done by choosing an empty field or by putting
protective barriers around the field. All of these safety analyses and
precaution/prevention efforts align with the IEEE Code of Ethics, #1: "to hold
paramount the safety..." [6].

On the application side of our project, there is also a noticeable potential
safety concern. Users might sometimes give bad control commands through
inappropriate usage or even malicious intention. In addition, a hacker might be
able to break into a user's control system and make the mobile device do
ill-intended things. In such cases, the mobile device could cause hazards such
as hitting people or damaging infrastructure. This is against the IEEE Code of
Ethics, #9: "to avoid injuring others, their property, reputation, or employment
by false or malicious action" [6]. One precaution is to build an automatic stop
function so that the device halts when it is too close to certain types of
objects.

CHAPTER #3
UML Diagrams

3 UML Diagrams
Design Description
The CAUHGCS system consists of two components working together: the Engine and
the Interface. First, the Engine tracks a hand gesture and sends it to the
Interface. The Interface translates the gesture into the appropriate command
and, if it is valid, performs the corresponding menu or submenu transition or
the corresponding action. The ER diagram of the cooperation of these two parts,
Engine and Interface, can be seen in Figure 1.

3.1 ER Diagram

Figure 1 - Entity Relationship Diagram of System Entities

The CAUHGCS system consists of two main data structures, named Gesture and Menu.
These data structures are created, modified, or used by the two modules of the
system, the Engine and the Interface. The Engine part of the CAUHGCS system
tracks a hand movement or gesture and creates a Gesture object for it. The
Engine sends that Gesture object to the Interface, and the current menu or
submenu, reflected on the screen by the Interface module, is changed according
to that gesture.

3.2 Class Diagram

Figure 2 - Class Diagrams and Attributes of Main Data Structures of CAUHGCS

The Interface module receives the Gesture object from the Engine module and
decides the action corresponding to the gesture. Only one main data structure is
used in the Interface module: the Current Menu object, which simply holds a
pointer to the current menu shown on the screen.

To make this easier to understand, the data flow diagram of an action that
invokes the Call function is shown in Figure 3.

Figure 3 - Data Flow Diagram

3.3 Use case Diagram

Figure 4 - Use Case Diagram of the System

3.3.1 Gesture Recognition


Gesture Recognition is the component that recognizes incoming hand movements.

3.3.2 Processing Narrative


The two responsibilities of the Gesture Recognition component are to recognize
gestures and to create gesture objects and send them to the Navigation component
of the Interface module for further processing.

3.3.3 Dynamic Behaviour
As seen in the figure below, Gesture Recognition receives the signal and creates
the proper gesture if the user makes a valid hand movement. Otherwise, i.e., if
the user makes a meaningless hand movement that does not exist in the hand
gesture data set, no gesture is created.

Figure 5 - Use Case Diagram of Gesture Recognition

3.4 Sequence Diagram

Figure 6 - Sequence Diagram of Gesture Recognition

3.4.1 Recognition Gesture
Recognition is the component that recognizes the hand movements for the system.

3.4.2 Interface Description


Frames that come from the gesture are the input to this component, and the output is a
signal sent to the Navigation component. It also triggers the gesture recognition component
of the engine module.

CHAPTER #4
Hardware Used
&
Technical Specifications

4 Hardware Used & Technical Specifications
Robots play an important role in automation across sectors such as
construction, military, medical, and manufacturing. After building some basic robots
such as a line-follower robot and a computer-controlled robot, we developed
this accelerometer-based gesture-controlled robot using an Arduino Uno. In this project
we use hand motion to drive the robot, for which we use an accelerometer that
measures acceleration.

4.1 Required Components


1. Arduino UNO
2. DC Motors
3. Accelerometer
4. HT12D
5. HT12E
6. NRF Pair
7. Motor Driver L293D
8. 12 Volt Battery
9. Battery Connector
10. USB cable

4.2 Arduino UNO


4.2.1 Sender

4.2.2 Receiver:

4.3 DC Motors

4.4 Accelerometer:
The accelerometer draws very little current, so it can be plugged into your board and
powered directly from the board's I/O pins. To do this, you'll use three
of the analog input pins as digital I/O pins: for power, for ground, and for the
self-test pin. You'll use the other three analog inputs to read
the accelerometer's analog outputs.

4.4.1 Connect to the Power:

 Connect the GND pin to GND on the Arduino.


 Connect the VIN pin to the 5V pin on the Arduino.
 (For 3.3V microcontrollers, connect the pin marked 3Vo to the 3.3V
supply.)

4.4.2 Connect the X,Y and Z Signal Outputs:


Connect X, Y and Z to the analog pins as shown below:

4.4.3 Using the Voltage Reference:
For the best possible accuracy and precision, you can use the output of the
accelerometer board's voltage regulator as the analog reference for the Arduino.
Connect the 3Vo pin on the accelerometer board to the AREF pin on the Arduino.

4.4.4 Schematic:

4.5 NRF Pair:

A gesture-controlled robot is controlled by hand rather than by buttons or a joystick;
one only needs to move the hand to control the robot. A transmitting device held in
your hand contains the RF transmitter and the accelerometer. It transmits commands
to the robot so that it can perform the required task: moving forward, reversing,
turning left, turning right, and stopping. All these tasks are performed using hand
gestures.

The most important component here is the accelerometer. The accelerometer is a 3-axis
acceleration measurement device with a ±3g range. It is built from a polysilicon
surface sensor and a signal-conditioning circuit to measure acceleration. The output
of this device is analog in nature and proportional to the acceleration. The device
measures the static acceleration of gravity when we tilt it and gives a result in the
form of motion or vibration.

According to the ADXL335 datasheet, a polysilicon surface-micromachined structure is
placed on top of a silicon wafer. Polysilicon springs suspend the structure over the
surface of the wafer and provide resistance against acceleration forces. Deflection of
the structure is measured using a differential capacitor that incorporates independent
fixed plates and plates attached to the moving mass. The fixed plates are driven by
180° out-of-phase square waves.

Acceleration deflects the moving mass and unbalances the differential capacitor,
resulting in a sensor output whose amplitude is proportional to acceleration. Phase-
sensitive demodulation techniques are then used to determine the magnitude and
direction of the acceleration.
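Since the ADXL335 output is analog and proportional to acceleration, a 10-bit ADC reading can be converted back to units of g. The sketch below assumes a 3.3 V ratiometric supply (zero-g output near 1.65 V, sensitivity near 330 mV/g, the datasheet's typical values) and an ADC referenced to the same 3.3 V; actual values vary from part to part:

```cpp
#include <cmath>

// Converts a 10-bit ADC reading of one ADXL335 axis into acceleration in g.
// Assumes a 3.3 V ratiometric supply: zero-g output ~1.65 V and sensitivity
// ~330 mV/g (typical datasheet values, used here for illustration).
double adcToG(int adc, double vref = 3.3) {
    double volts = adc * vref / 1023.0;  // 10-bit ADC full scale
    double zeroG = vref / 2.0;           // ratiometric mid-scale at 0 g
    double sensPerG = 0.330;             // 330 mV per g
    return (volts - zeroG) / sensPerG;
}
```

A mid-scale reading (about 512) therefore corresponds to roughly 0 g, and about one third of full scale above mid-scale corresponds to +1 g.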

4.5.1 Pin Description of Accelerometer:


1. Vcc — A 5 V supply should be connected to this pin.
2. X-OUT — This pin gives an analog output in the X direction.
3. Y-OUT — This pin gives an analog output in the Y direction.
4. Z-OUT — This pin gives an analog output in the Z direction.
5. GND — Ground.
6. ST — This pin is used to set the sensitivity of the sensor.

4.6 Motor Driver

4.7 Circuit Diagram and Explanation


Gesture Controlled Robot is divided into two sections:
1. Transmitter part
2. Receiver part
In the transmitter part, an accelerometer and an RF transmitter unit are used. As
already discussed, the accelerometer gives an analog output, so we need to convert
this analog data into digital form. For this purpose we used a 4-channel comparator
circuit in place of an ADC. By setting a reference voltage we get a digital signal,
then apply this signal to the HT12E encoder to encode the data, converting it into
serial form, and then transmit this data into the environment using the RF transmitter.

At the receiver end we use an RF receiver to receive the data, which is then applied
to the HT12D decoder. This decoder IC converts the received serial data to parallel
form, which is then read by the Arduino. According to the received data, we drive the
robot using two DC motors in the forward, reverse, left, right, and stop directions.

4.8 Working
The gesture-controlled robot moves according to hand movement, as we place the
transmitter in our hand. When we tilt the hand forward, the robot starts moving
forward and continues until the next command is given.
When we tilt the hand backward, the robot changes its state and starts moving
backward until another command is given.
When we tilt the hand to the left, the robot turns left until the next command.
When we tilt the hand to the right, the robot turns right.
To stop the robot, we keep the hand level and stable.
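This tilt-to-command mapping can be expressed as a small decision function. The thresholds (370/320 on X, 300/360 on Y) are taken from the sender sketch in Chapter 7; the letter codes here are illustrative stand-ins for the numeric codes actually transmitted:

```cpp
// Maps raw X/Y accelerometer readings to a command character, using the
// same thresholds as the sender sketch in Chapter 7. The letters are
// illustrative labels: L/R for left/right tilt on X, F/B for forward/
// backward tilt on Y, S for a level (stable) hand.
char classify(int xval, int yval) {
    if (xval > 370) return 'L';
    if (xval < 320) return 'R';
    if (yval < 300) return 'F';
    if (yval > 360) return 'B';
    return 'S';  // hand held level: stop
}
```

Readings in the dead band between the thresholds keep the robot stopped, which is why holding the hand stable halts the car.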

Circuit Diagram for Transmitter Section

Circuit Diagram for Receiver Section

Circuit for this hand gesture controlled robot is quite simple. As shown in above
schematic diagrams, a RF pair is used for communication and connected with arduino.
Motor driver is connected to arduino to run the robot. Motor driver’s input pin 2, 7, 10
and 15 is connected to arduino digital pin number 6, 5, 4 and 3 respectively. Here we
have used two DC motors to drive robot in which one motor is connected at output
pin of motor driver 3 and 6 and another motor is connected at 11 and 14. A 12volt
Battery is also used to power the motor driver for driving motors.

4.9 Implementation

4.9.1 Initial automotive movement


An Arduino Uno was connected to the modified automotive being used for the
project. As an important part of the project, the first implementation task was to
integrate the car with the microcontroller and establish meaningful communication
between the two. The car's motor drivers can be controlled using 6 pins, 3 for each set
of wheels allowing independent control of each side of the car. Individual control of
each wheel certainly would have provided more control but for this project dual wheel
control was sufficient. The ribbon wire attached to the car consisted of 20 pins. The
first task was to establish which 8 pins out of the 20 are connected to the motor driver.
These pins were then connected to the relevant Arduino pins using small wires.

The following code snippet was run to initiate a forward movement by the car.

digitalWrite(phaseRight, HIGH);
digitalWrite(breakRight, HIGH);
digitalWrite(enableRight, HIGH);
digitalWrite(phaseLeft, HIGH);
digitalWrite(breakLeft, HIGH);
digitalWrite(enableLeft, HIGH);

The next step was to perform backward movement of the car. This was achieved by
setting the phaseRight and phaseLeft values to LOW. Turning one side of the
wheels in the forward direction and the other side in the backward direction allowed
the car to spin on the spot. For movements in the right or left directions, the wheels
on the respective side were made to move forward while the wheels on the opposite side
were kept still. At this stage, the Arduino was powered via a USB connection with the
computer, and serial communication was started using the following code snippet.

// A serial communication link at baud rate 9600 is started.


Serial.begin(9600);

4.9.2 Initial gesture implementation


First a very simple design was implemented. The aim, during this phase, was to test
the Serial communication between the car and Processing. The screen canvas was
divided into two halves horizontally and the position of the mouse cursor was used to
define the command that was sent to the car. If the mouse position was in the top half
of the screen then a command to move in the forward direction was sent to the car via
the serial port and if the mouse was positioned in the lower half of the screen the
command to stop the motion was sent to the car. The commands were sent in the form
of characters. 'F' was sent for forward and 'S' was sent for stop. A unique character
was also used as an event trigger to alert the Arduino to read the command.
After it was established that the communication worked, in the next phase of
development, a slightly more complicated gesture design was implemented. Mouse
position tracking was replaced by hand tracking. A “wave” was used as the focus
gesture to begin hand tracking. The position of the hand on screen was used as an
indicator of the command that is to be sent to the car. The graphical display and the
characters used for commands were kept the same from the previous iteration. A
scene can contain multiple hands but only one active hand was allowed at a time, to
avoid confusion in the commands being sent to the car. Since this design could be
used to send only two kinds of commands to the car, a more comprehensive gesture
design was required for the project.
For the next phase of gesture development, skeleton tracking was attempted. A “Psi”
pose, shown in Figure, was used as the focus gesture. The following code snippet
shows a few of the different body parts that were tracked, and it was used to draw the
skeleton on screen for the convenience of the user.
The hand, elbow, shoulder, hip, foot, and knee joints on both sides of the body were
tracked. The angles between the forearm and upper arm, on both the right and left
sides, were recorded and used to decide the command to be sent to the car.

Five different commands were decided based on the values of the angles being
tracked. 'F' was sent for forward motion of car, 'B' for backward motion, 'R' for
motion towards the right direction, 'L' for motion towards the left direction and a
default value was considered stop. Similar to the previous gesture design, a unique
character was used as event trigger for alerting the Arduino.

However, it was seen that the skeleton tracking gestures were not user-friendly and
resulted in fatigue in the arms frequently. Therefore, the gesture design was changed
again and a more comprehensive design, based on the hand tracking design discussed
in the previous chapter, was implemented.

4.9.3 Final design
As previously mentioned, this design is based on a simple idea with a control pad kind
of GUI. First, the environment was set up; the essential variables were declared and
the required objects were created. The following code fragment shows a part of the
initial setup.

After the initial setup, the GUI was created using a newly defined control pad object
which had associated parameters like height and width of screen, number of buttons
on control pad, size of each button and shape of buttons etc. Three states were defined
for the rectangular buttons at a point in time. The buttons could either be selected,
active but not selected or rendered useless owing to an expired session. Selected
buttons were represented with blue colour, active but unselected buttons were
represented with green colour and an inactive control pad was drawn in red colour.
The code fragment given below was used to draw the GUI for the system.

An if-else construct was used to determine the command to be sent to the car based on
the position of the active hand at any point in time. This selected command was
transmitted to the Arduino via Serial communication. To aid the experience of the
user descriptive text suggesting the commands being given was displayed on the
screen. Frequent callbacks were also made in the code for the session and for
XnVPointControl, the NITE object used for hand tracking.

4.10 Controlling Speed of Automotive


Since the automotive was originally built for racing, its default speed is very high.
Therefore, it was essential to reduce the speed to a reasonable value for proper use of
the car in small rooms without collisions. To reduce the speed, Pulse Width
Modulation (PWM) was implemented in the Arduino code by introducing stepping
delays in the motion of the car.

Initially, the introduction of the delay led to reduced speed, but the motion obtained
was jerky in nature. After trials with multiple delay values, a reasonably smooth
motion was achieved while sustaining the required slow speed.

However, these delays were only introduced for the forward and backward
movements as a smooth spinning motion could not be attained with slow speed. The
values finally used were forward/backward motion for 60ms followed by stalling of
the car for 80ms.

At the end of the project, a working implementation of the car and the gesture design
was obtained. The following picture shows the car at the end of the implementation:

4.11 Complete Outcome Structure of Hand Gesture:

4.12 Complete Outcome Structure of Automotive:

CHAPTER #5
Results and Analysis

5 Results and Analysis

5.1 How does it work and recognize the gesture?


Here the brain of the robot is an Arduino Uno (ATmega328P), which is programmed with
the control code. The gestures/motions made by the hand are recognized by an
acceleration-measuring device called an accelerometer (ADXL335).

The accelerometer reads the X, Y, and Z coordinates when we make gestures by hand
and sends them to the Arduino (here we do not need the Z axis; we need only the two
coordinates X and Y, so the Z coordinate is ignored). The Arduino checks the
coordinate values and sends a 4-bit code to the encoder IC.
The encoder passes the data to the RF transmitter, and the transmitted data is
received by the RF receiver. The receiver sends the 4-bit code to the decoder IC, and
the decoder passes it to the motor driver IC. The motor driver then turns the two
motors in the required direction.

5.2 Make the power supply


First we will start with the power supply circuit. We need two power supply circuits:
one for the transmitter and one for the receiver. The receiver circuit needs to be
powered from a 12 V supply (since a 12 V motor is used), and the transmitter circuit
can be powered from a 9 V battery.

You can see the circuit for the receiver power supply on the right. Using this diagram,
wire up the supply circuit. You can also add an LED via a 1k resistor to indicate the
state of the power supply.

 IC 7805, which regulates the 12 V supply down to 5 V (if you can't get a 12 V
supply, you can use a 9 V supply)
 0.1 µF and 470 µF capacitors
 1k resistor for the status LED

NOTE: Use a heat sink for the 7805. Because we are dropping 7 V (12 − 5), a lot of
heat is produced, enough to damage the regulator, so a heat sink is recommended.

5.3 Start Making the Transmitter (Remote)


The transmitter section consists of an accelerometer that detects the hand gesture
and sends the data to the Arduino. The Arduino then sends data to the encoder IC in
accordance with the data received from the accelerometer, and the data is transmitted
to the receiver. Wire up as per the circuit below:

This is just an illustration of the transmitter:

5.4 Make the Receiver:

The receiver circuit consists of two ICs (HT12D decoder, L293D motor driver) and an
RF receiver module.

Wire the circuit as per the receiver schematic above. There are two LEDs on the
receiver board: one lights up when power is supplied to the receiver, and the other
when power is supplied to the transmitter circuit. The LED near the HT12D IC should
light up when power is given at the transmitter, indicating a valid transmission (VT);
if it does not, there is something wrong with your connections or your RF TX/RX
module.

5.4.1 HT12E PIN Description:

Pin (1–8) — 8-bit address pins [ A0,A1,A2,A3,A4,A5,A6,A7 ]

Pin 9 — Ground [ GND ]

Pin (10,11,12,13) — 4-bit data input pins [ AD0,AD1,AD2,AD3 ]

Pin 14 — Transmission enable, active low [ TE ]

Pin 15 — Oscillator output [ OSC2 ]

Pin 16 — Oscillator input [ OSC1 ]

Pin 17 — Serial data output [ DOUT ]

Pin 18 — Supply voltage, 5 V (2.4 V–12 V) [ Vcc ]

A0–A7 — These are the 8-bit address pins.

GND — This pin should be connected to the negative of the power supply.

TE — This is the transmission enable pin (active low).

OSC1, OSC2 — These are the oscillator input and output pins; they are connected to
each other through an external resistor.

Output — This is the serial data output pin; the encoded data signal is given out
from this pin.

Vcc — The Vcc pin is connected to the positive power supply and is used to power
the IC.

AD0–AD3 — These are the 4-bit data input pins.

5.4.2 PIN Description HT12D

VDD and VSS: These pins are used to power the IC; they connect to the positive and
negative of the power supply respectively.

DIN: This pin is the serial data input and can be connected to an RF receiver output.

A0 – A7: These are the address inputs. The status of these pins should match the
status of the address pins on the HT12E (used in the transmitter) in order to receive
the data. These pins can be connected to VSS or left open.

D8 – D11: These are the data output pins. The status of these pins can be VSS or VDD
depending upon the serial data received through pin DIN.

VT: stands for Valid Transmission. This output pin goes HIGH when valid data is
available at the D8 – D11 data output pins.

OSC1 and OSC2: These pins are used to connect the external resistor for the internal
oscillator of the HT12D. OSC1 is the oscillator input pin and OSC2 is the oscillator
output pin.

L293D (Motor Driver): The L293D is a motor driver IC that allows a motor to be
driven in both directions. The L293D is a 16-pin IC, with eight pins on each side
dedicated to controlling a motor, and it can control a set of two DC motors at the
same time in either direction. With one L293D we can control two DC motors; there are
2 INPUT pins, 2 OUTPUT pins, and 1 ENABLE pin for each motor. The L293D consists of
two H-bridges. The H-bridge is the simplest circuit for controlling a low-current-rated
motor.
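The behaviour of one L293D channel can be summarized as a small truth-table function: the two input pins choose the direction and the enable pin gates the output. Which input level counts as "forward" is an arbitrary labelling for illustration:

```cpp
// One L293D channel. Returns +1 (forward), -1 (reverse), or 0 (no drive).
// IN1 != IN2 drives the motor; EN low or both inputs equal means no drive.
// Treating IN1 = HIGH as "forward" is an arbitrary choice here.
int hBridgeState(bool in1, bool in2, bool enable) {
    if (!enable || in1 == in2) return 0;  // disabled, or both inputs equal
    return in1 ? +1 : -1;
}
```

This matches how the receiver sketch in Chapter 7 drives each motor: opposite levels on the two input pins pick a direction, and equal levels stop the motor.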

5.4.3 PIN Description of L293D

Pin 1 — Enable pin for motor 1 [ Enable 1 ]

Pin 2 — Input pin 1 for motor 1 [ Input 1 ]
Pin 3 — Output pin 1 for motor 1 [ Output 1 ]
Pin 4,5,12,13 — Ground [ GND ]
Pin 6 — Output pin 2 for motor 1 [ Output 2 ]
Pin 7 — Input pin 2 for motor 1 [ Input 2 ]
Pin 8 — Power supply for motors (9–12 V) [ Vcc2 ]
Pin 9 — Enable pin for motor 2 [ Enable 2 ]
Pin 10 — Input pin 1 for motor 2 [ Input 3 ]
Pin 11 — Output pin 1 for motor 2 [ Output 3 ]
Pin 14 — Output pin 2 for motor 2 [ Output 4 ]
Pin 15 — Input pin 2 for motor 2 [ Input 4 ]
Pin 16 — Supply voltage, 5 V [ Vcc1 ]

5.5 Choose the right motor


Choosing a motor is very important, and it depends entirely on the type of robot (car)
you are making. If you are making a smaller one, use a 6 V BO motor; if you are
making a larger one that needs to carry a heavy load, use a 12 V DC motor.

5.6 Choose the right RPM for the motor


We used a 12 V 300 RPM motor. RPM, which stands for revolutions per minute, is the
number of times the shaft of a DC motor completes a full spin cycle per minute. A full
spin cycle is when the shaft turns a full 360°. The number of 360° turns, or
revolutions, a motor makes in a minute is its RPM value. You should be very careful
while choosing the RPM: don't choose motors with a very high RPM, because the robot
will be difficult to control, and remember that SPEED IS INVERSELY PROPORTIONAL TO
TORQUE.

5.7 DEBUGGING (OPTIONAL, if there is a problem with the circuit):


In this section we discuss debugging the circuit. First of all, don't panic; keep
calm. For debugging we will split the circuit into different parts, starting with the
L293D IC.

5.7.1 L293D IC
Place the IC on a breadboard, give 5 V and GND to the IC, and then give 12 V to
pin 8. Connect the enable pins of the motors to 5 V. Now give power to the input of
one motor and check the output pins with a multimeter. If it shows nothing, then there
is a problem with your motor driver.

5.7.2 POWER SUPPLY


Most of the problems that arise in the power supply circuit are due to short circuits.
To check, power off the circuit and use a multimeter to verify whether there is any
connection between negative and positive.
To debug the decoder and encoder ICs, connect pin 17 of the HT12E to pin 14 of the
HT12D, connect push buttons at pins 10, 11, 12, 13 of the HT12E, and connect 4 LEDs
at pins 10, 11, 12, 13 of the decoder (connect as per the decoder and encoder
debugging circuit). The LEDs should light up when the switches are pressed. If your
robot is still not working, then there may be a problem with the RF module, so try
replacing the module.

5.8 What gestures will the robot recognize?
This robot is designed to recognize five gestures: forward, backward, left,
right, and stop. You will get a better idea if you check the photos of the gestures
given below.

Stop flat

Right flat

Left flat

Forward flat

Backward flat

Enjoy driving the automotive.

5.9 Testing and Evaluation
As the project moved towards completion, the individual parts were tested before
integrating them into the rest of the system. At the end of each iteration,
integration tests were performed, and at the end of the project, full system tests
were performed. Testing was therefore a continuous process during the implementation,
and it aided in the early identification of problems and bugs.

This chapter outlines the set of tests that were carried out during development in order
to gauge the performance of the system.

5.10 Automotive
The car was first tested without an Arduino attached to it. It was seen that the car,
by default, moves in the forward direction at its maximum possible speed.

Next, an Arduino Uno was connected to the car via the ribbon wire attached to the
modified chip of the car. All possible motions were then tried individually on the car.
It was then seen that the car behaved erratically and did not follow the given
commands for all motions successfully at all times. A discrepancy was found in the
reactions to the same command and this fault was accounted to loose connections in
the driver chip of the car. Therefore, as a solution, the faulty car was replaced by a
properly functioning model.

The new car successfully followed all commands and performed motions in the
forward, backward, diagonally forward and diagonally backward directions in a
consistent manner. It also performed spinning on the spot.

5.11 Gesture Design


To test the performance of the gesture recognition system, 5 people were made to try
the different gesture designs and feedback was collected from them. The reactions
received suggested that the skeleton tracking system in its current form was not very
user-friendly. It resulted in fatigue and there was a lack of clear boundaries between
different gestures. However, the hand tracking gesture system was received with
much appreciation owing to the simplicity of design and intuitiveness. Improvements
were made on the spacing and size of the buttons on the control pad based on the
accuracy reached by users when using the system. The GUI was made more
comprehensible with the use of different primary colors to suggest different states of a
button, and the inclusion of supportive text to suggest the command respective to each
button on the control pad.

5.12 Wired Implementation:


After individually testing the car and gesture systems, they were integrated together
with Arduino receiving power via a USB connection from the computer. Integration
tests were performed to check the performance. It was seen that there was a delay
between the recognition of the gesture and the respective movement of the car.

This lag was attributed to the delay that was induced to control the speed of the car
through PWM. Initially, for forward motion of the car, a delay of 60 ms was induced
after a forward high signal was given, followed by a brake signal for 80 ms, in a
loop. Therefore, each time the gesture command changed, a very noticeable delay of
60–80 ms or more was felt before the car could perform the motion for the new
command.

A solution to this problem was found by reducing the values of the induced delays
such that the speed is controlled and the interim delay time between motions is also
made unnoticeable. The final delay values settled on 15ms of forward motion
followed by stalling of the car for 5ms.

A considerable amount of lag was also reduced by removing the unnecessary debug
information which was being printed on the Serial monitor of the Arduino IDE during
the development procedure.

5.13 Wireless Implementation


The completion and testing of a successful wired implementation was followed by an
attempt to make the car wireless. After configuring the NRF wireless modules as
coordinator and end device, the modules were installed on the Arduino side using an
Arduino shield and on the computer side using a USB explorer board. The Serial
library was still being used for communication between the Arduino and the gesture
code in Processing.

After the lag had been eliminated from the wired implementation of the system, a lag
was reintroduced with the use of the wireless modules. In order to eliminate this
delay, the following attempts were made:
The NRF receives a continuous data stream via serial communication and therefore does
not wait long before packetizing and transmitting the data. The 'packetization
timeout' (RO) is a measure of the amount of time the module will wait before
transmitting the data. Since, in this software design, only two characters need to be
transmitted at once, the RO value can be kept small to reduce the transmission delay.
At a baud rate of 9600 bps, approximately 1 ms is required to transmit one character.
Therefore, the default value of RO was reduced from 3 to 1 to reduce that delay.

However, this did not prove to be a solution, and a delay was still being experienced.
A few more possible solutions were tried next: setting a slower or faster baud rate
ranging from 4800 bps to 115200 bps, changing the configuration of the wireless
modules from coordinator and end device to coordinator and router, and changing the
operation mode of the wireless modules from the simpler transparent mode (AT) to the
more reliable but more complicated API mode. These measures were also implemented
without much success, and the delay remained.

Finally, the delay was traced to the 6 V battery that was powering the Arduino, the
NRF end device connected to the Arduino, and the car. This voltage was not enough for
the smooth functioning of the three devices together; hence, a new 9 V battery was
introduced to power the Arduino and the accelerometer module while the 6 V battery
was kept for the car. This resulted in the successful elimination of the delay.

5.14 Range of wireless
This test was performed to measure the range of the wireless modules. Since there were
no provisions for testing the modules over very long distances outdoors, the testing
was done mostly indoors in large rooms. The NRF modules were observed to have a
comfortable range of more than 1 km indoors.

5.15 Battery
In this test, the car was run using a fully powered battery until the battery was
exhausted. It was seen that the battery lasted for 28 minutes before it reached a state
of exhaustion.

CHAPTER #6
Conclusion

6 Conclusion
Over the past few years, gesture recognition has become increasingly popular due to
its frequent appearance in the entertainment and gaming markets. Now, it is rapidly
becoming a commonplace technology, allowing humans and machines to interface
more easily in the home, work and automobile environments. It has become easy to
imagine a person sitting on a couch, controlling the lights and TV with a wave of a
hand. This and other capabilities are being realized as gesture recognition
technologies enable natural interactions with electronics that surround us.

Gesture recognition has varied applications ranging from sign language recognition,
such as PSLT app (Portable Sign Language Translator) which recognizes and
translates sign language into text, to medical rehabilitation and virtual reality. With
the extensive research and documentation available on gesture recognition it is easy to
overlook the challenges associated with the use of the technology. One prominent
problem associated with gesture recognition is the existence of numerous underlying
assumptions which might be true in a controlled environment but cannot be
generalized to real-world scenarios. Commonly, the ambient lighting conditions and
stationary backgrounds with high contrast are assumed but in reality this is not the
case. Also, the subjective nature of the use of this technology is often ignored and
generalizations are presented based on work by a few researchers.

Realizing the nascent stage of this technology, it is evident that gesture recognition
technology has come a long way especially for vision based methods but it remains
that extensive further research into areas of gesture classification and representation is
much needed to realize the full potential of this technology.

6.1 Further Work


If this project was to be continued further the following implementations could be
attempted:
 Proximity sensor could be successfully implemented resulting in a collision-free
and more efficient motion of the car.
 A wireless camera could be mounted on to the car and the images it captured
could be sent as feedback to the Computer wirelessly.
 The gesture design could be improved, based on user feedback and instead of
just one design, a variety could be provided to the users based on their
preference.
 A more comprehensive testing regime could be conducted to enable further
improvement in the functioning of the system.

6.2 Summary
Overall, the project was successful, and the end result was a fully and consistently
functioning automotive controlled by hand gestures. In the process of designing the
gesture recognition and tracking system, extensive research was undertaken. In
conclusion, the requirements outlined at the beginning of the project were realized
and the objectives were fulfilled.

CHAPTER #7
Coding

7 Coding
7.1 Sender code

#include <SPI.h>
#include <nRF24L01.h>
#include <RF24.h>

RF24 radio(7, 8);

const byte rxAddr[6] = "00001";


const int xpin = A2; // x-axis of the accelerometer
const int ypin = A1; // y-axis

// baseline accelerometer readings, captured once in setup()
int sx = 0, sy = 0;

void setup()
{
radio.begin();
radio.setRetries(15, 15);
radio.openWritingPipe(rxAddr);

radio.stopListening();

// start serial for debugging and record baseline accelerometer readings


Serial.begin(9600);
sx=analogRead(xpin);
sy=analogRead(ypin);

Serial.println();
}

void loop()
{

int potValue =0;

int xval =analogRead(xpin);

int yval=analogRead(ypin);

if (xval > 370)
{
potValue = 1;
}
else if (xval < 320)
{
potValue = 2;
}
else if (yval < 300)
{
potValue = 3;
}
else if (yval > 360)
{
potValue = 4;
}

// send the current command code on every pass through loop()
radio.write(&potValue, sizeof(potValue));

delay(100);
}

7.2 Receiver code
#include <SPI.h>
#include <nRF24L01.h>
#include <RF24.h>

RF24 radio(7, 8);

const byte rxAddr[6] = "00001";

void setup()
{
Serial.begin(9600);

// motor driver pins must be configured as outputs before digitalWrite()
pinMode(9, OUTPUT);
pinMode(10, OUTPUT);
pinMode(5, OUTPUT);
pinMode(6, OUTPUT);

radio.begin();
radio.openReadingPipe(0, rxAddr);

radio.startListening();
}
int angleV;
void loop()
{

if (radio.available())
{
angleV = 0;
radio.read(&angleV, sizeof(angleV));

if(angleV==0){
stopm();

}else if(angleV==1){
leftm();

}else if(angleV==2){

rightm();
}
else if(angleV==3){
forword();

}
else if(angleV==4){
backword();

}
else{

stopm();
}

Serial.println(angleV);

}
}
void backword()
{
digitalWrite(9,HIGH) ;
digitalWrite(10,LOW) ;
digitalWrite(5,HIGH) ;
digitalWrite(6,LOW) ;

}
void forword()
{

digitalWrite(9,LOW) ;
digitalWrite(10,HIGH) ;
digitalWrite(5,LOW) ;
digitalWrite(6,HIGH) ;

}
void rightm()
{
digitalWrite(9,LOW) ;
digitalWrite(10,LOW) ;
digitalWrite(5,LOW);
digitalWrite(6,HIGH) ;

}
void leftm()
{

digitalWrite(9,LOW) ;
digitalWrite(10,HIGH) ;
digitalWrite(5,LOW);
digitalWrite(6,LOW) ;

}
void stopm()
{

digitalWrite(9,LOW) ;
digitalWrite(10,LOW) ;
digitalWrite(5,LOW);
digitalWrite(6,LOW) ;
}