BACHELOR OF SCIENCE
IN
COMPUTER SCIENCE
Distance Learning Education
October, 2018
DECLARATION
The work reported in this project was carried out by Sonia Anam
(Roll No. 15134) and Anmol Anwar (Roll No. 15131) under the supervision of
Mr. Umer Sarwar, Department of Computer Science, Government College
University, Faisalabad, Pakistan.
We hereby declare that "Controlling Automotive Using Hand Gesture
Control System" and the contents of this project are the product of our own research.
We further declare that this work has not been submitted for the award of any other
degree / diploma. The University may take action if the information provided is
found inaccurate at any stage.
CERTIFICATE BY THE PROJECT SUPERVISOR
This is to certify that Sonia Anam (Roll No. 15134) and Anmol Anwar (Roll No.
15131) have successfully completed the final project titled Controlling
Automotive Using Hand Gesture Control System at the Department of
Computer Science, Government College University Faisalabad, in partial
fulfillment of the requirements for the degree of BS (Computer Science).
Internal Examiner
Name: _____________
Signature: _____________
External Examiner
Name: _____________
Signature: _____________
Chairperson
Acknowledgement
SONIA ANAM
2014-GCUF-06934
ANMOL ANWAR
2014-GCUF-06931
Table of Contents
Table of Contents ........................................................................................................... 5
CHAPTER #1 ........................................................................................................ 9
1 Introduction .......................................................................................................... 14
1.1 Background and Motivation:......................................................................... 15
1.2 Review of literature ....................................................................................... 16
1.3 Statement of purpose ..................................................................................... 16
1.4 TECHNICAL REQUIREMENTS ................................................................ 16
1.4.1 A. Microcontroller ................................................................................. 17
1.4.2 B. Mechanical Components ................................................................... 17
1.4.3 C. Bluetooth Module .............................................................................. 17
1.4.4 D. Android Smartphone ......................................................................... 17
1.5 SYSTEM DESCRIPTION ............................................................................ 17
1.6 Block Diagram of the system ........................................................................ 18
1.7 Transmission of data from Android application to Motor driver .................. 18
1.8 Receiving the data ......................................................................................... 19
1.9 Gesture Recognition ...................................................................................... 19
1.10 Movement of Motors and Wheels ............................................................. 21
1.11 FUTURE SCOPE AND CONCLUSION .................................................. 22
CHAPTER #2 ...................................................................................................... 24
2 Design .................................................................................................................. 25
2.1 Block Diagram .............................................................................................. 25
2.2 Physical Design Diagram .............................................................................. 26
2.2.1 Physical Design Diagram of Glove........................................................ 26
2.2.2 Physical Design Diagram of Car............................................................ 27
2.3 Block Design ................................................................................................. 27
2.3.1 Power Module A .................................................................................... 27
2.3.2 Battery A ................................................................................................ 27
2.3.3 Linear Voltage Regulator A ................................................................... 28
2.3.4 Sensor Module A ................................................................................... 28
2.3.5 Flex sensor ............................................................................................. 29
2.3.6 Accelerometer ........................................................................................ 30
2.3.7 Control Module A .................................................................................. 32
2.3.8 Microcontroller A .................................................................................. 32
2.3.9 Communication Module ........................................................................ 33
2.3.10 Transmitter Device................................................................................. 34
2.3.11 Receiver Device ..................................................................................... 35
2.3.12 Power Module B .................................................................................... 35
2.3.13 Battery B ................................................................................................ 35
2.3.14 Linear Voltage Regulator B ................................................................... 36
2.3.15 Sensor Module B.................................................................................... 36
2.3.16 Ranging Sensor ...................................................................................... 37
2.3.17 Control Module B .................................................................................. 37
2.3.18 Microcontroller B................................................................................... 37
2.3.19 Motion Module ...................................................................................... 38
2.3.20 Motors .................................................................................................... 38
2.3.21 Power MOSFET (RFP12N10L) ............................................................ 39
2.3.22 Wheels.................................................................................................... 40
2.3.23 Circuit Schematics ................................................................................. 41
2.3.24 Control Unit Flowchart .......................................................................... 41
2.4 Tolerance Analysis ........................................................................................ 43
2.5 Safety and Ethics ........................................................................................... 45
CHAPTER #3 ........................................................................................................ 1
3 UML Diagrams ...................................................................................................... 1
3.1 ER Diagram ..................................................................................................... 1
3.2 Class Diagram ................................................................................................. 2
3.3 Use case Diagram ............................................................................................ 3
3.3.1 Gesture Recognition................................................................................. 3
3.3.2 Processing Narrative ................................................................................ 3
3.3.3 Dynamic Behaviour ................................................................................. 4
3.4 Sequence Diagram........................................................................................... 4
3.4.1 Recognition Gesture................................................................................. 5
3.4.2 Interface Description ................................................................................ 5
CHAPTER #4 ........................................................................................................ 6
4 Hardware Used & Technical Specifications ........................................................ 48
4.1 Required Components ................................................................................... 48
4.2 Arduino UNO ................................................................................................ 48
4.2.1 Sender .................................................................................................... 48
4.2.2 Receiver: ................................................................................................ 49
4.3 DC Motors ..................................................................................................... 49
4.4 Accelerometer: .............................................................................................. 50
4.4.1 Connect to the Power: ............................................................................ 50
4.4.2 Connect the X,Y and Z Signal Outputs: ................................................ 50
4.4.3 Using the Voltage Reference: ................................................................ 51
4.4.4 Schematic: .............................................................................................. 52
4.5 NRF Pair:....................................................................................................... 52
4.5.1 Pin Description of Accelerometer:......................................................... 53
4.6 Motor Driver ................................................................................................. 54
4.7 Circuit Diagram and Explanation .................................................................. 54
4.8 Working......................................................................................................... 54
4.9 Implementation.............................................................................................. 56
4.9.1 Initial automotive movement ................................................................. 56
4.9.2 Initial gesture implementation ............................................................... 57
4.9.3 Final design ............................................................................................ 58
4.10 Controlling Speed of Automotive.............................................................. 58
4.11 Complete Outcome Structure of Hand Gesture: ........................................ 59
4.12 Complete Outcome Structure of Automotive: ........................................... 59
CHAPTER #5 ...................................................................................................... 60
5 Results and Analysis ............................................................................................ 61
5.1 How does it work and recognize the gesture? ............................................... 61
5.2 Make the power supply ................................................................................. 61
5.3 Start Making the Transmitter (Remote) ........................................................ 62
5.4 Make the Receiver:........................................................................................ 64
5.4.1 HT12E PIN Description:........................................................................ 64
5.4.2 PIN Description HT12D ........................................................................ 65
5.4.3 PIN Description of L293D ..................................................................... 65
5.5 Choose the right motor .................................................................................. 66
5.6 Choose the right RPM for motor ................................................... 66
5.7 DEBUGGING (OPTIONAL, if there is problem with circuit):.................... 66
5.7.1 L293D IC ............................................................................................... 66
5.7.2 POWER SUPPLY .................................................................................. 66
5.8 What gestures will the robot recognize? ....................................................... 67
5.9 Testing and Evaluation .................................................................................. 70
5.10 Automotive ................................................................................................ 70
5.11 Gesture Design .......................................................................................... 70
5.12 Wired Implementation: .............................................................................. 70
5.13 Wireless Implementation ........................................................................... 71
5.14 Range of wireless....................................................................................... 72
5.15 Battery ....................................................................................................... 72
Chapter#6 ............................................................................................................. 73
6 Conclusion ........................................................................................................... 74
6.1 Further Work ................................................................................................. 74
6.2 Summary ....................................................................................................... 75
Chapter# 7 ............................................................................................................ 76
7 Coding .................................................................................................................. 77
7.1 Sender code ................................................................................................... 77
7.2 Receiver code ................................................................................................ 79
Project Abstract:
The use of robots has increased greatly in recent years. This report presents an algorithm
to control a robotic car using hand gesture recognition, a form of human-computer
interaction (HCI). The user controls the robotic car with gestures of his or her palm.
A webcam is used to capture the user's hand movement, and image processing is used to
detect the gesture performed. Command signals are generated from these gestures and
passed to the robotic car, which moves in the direction specified by the user. In this
way we have created a system in which the user controls the robotic car wirelessly by
hand gesture on the Windows 7 platform, using Java and embedded C technologies.
CHAPTER #1
INTRODUCTION
1 Introduction
Humans interact with the physical world by means of the five senses. However,
gestures have been an important means of communication in the physical world since
ancient times, even before the invention of any language. In this era of machines
taking control of complex work, interaction with machines has become more
important than ever. Robots are classified into two types: autonomous robots, such as
line-sensing or edge-sensing robots, and remote-controlled robots, such as gesture-
controlled robots. Since this paper deals with a gesture-controlled robot, the primary
focus will be on remotely controlled robots only. Undoubtedly, the output and the
functioning of machines will be more intuitive if they can be operated using human
gestures.
A gesture is a form of non-verbal communication using visible body movements or
actions to convey messages. There are several ways to capture a human gesture that a
machine would be able to understand. The gesture can be captured using a camera or
a data glove. Gestures can also be captured via Bluetooth or infrared waves, or by
acoustic, tactile, optical, or motion technological means. Embedded systems designed
for specific control functions can be optimized to reduce the size and cost of the
device and increase its reliability and performance. With the advent of Smartphones
and other modern technologies, operating machines has become more flexible.
Smartphones are equipped with built-in accelerometers which may be used for
gesture recognition and similar tasks. Moreover, the Android OS is gaining
significant popularity in the world of Smartphones due to its open architecture, and
the Android platform is being used in the development of numerous applications for
cell phones. Researchers have shown interest in gesture recognition and have built
several robots and devices that are controlled by human gestures. There is constant
development in the field of gesture-controlled devices. Apart from hand gesture
recognition, emotional gesture recognition from the face is also done in some cases.
There are two types of gestures used in gesture recognition: online gestures and
offline gestures. In online gestures, direct manipulations like rotation and scaling are
done. In offline gestures, the processing is done only after the user interacts with the
object. Gesture technologies are applied in several fields, such as augmented reality,
socially assistive robots, recognition of sign languages, emotion detection from facial
expressions, virtual mice or keyboards, and remote control. There are various modes
of communication between the microcontroller of the robot and the Smartphone.
However, the popularly used means of communication are RF, Bluetooth, and Wi-Fi.
Using RF limits the distance from which the robot can be controlled, while using
Wi-Fi increases the overall cost of the setup. So, the robot has been built with
Bluetooth, which is intermediate between RF and Wi-Fi in both range and cost.
In this paper, the Arduino microcontroller is incorporated in the robot for the main
computation and the main communication between all the modules. Then there is a
motor driver that handles the functioning of the motors that turn the wheels essential
for the movement of the robot. Last, but not least, a Bluetooth module is incorporated
in the robot; it serves as the means of receiving the data from the Smartphone, which
is processed in the Arduino to detect the direction of movement of the user's hand and
move the robot accordingly. The prime aim of the design is that as the user moves his
hand in some direction, the robot moves in the same direction as well. In other words,
the robot is solely controlled by the hand movements and gestures of the user. The
goal of this paper is to develop a method to control and program a robot with gestures
while assuring a high level of abstraction, cheap and minimal hardware, and
simplified robot programming.
Some of the related works are described in this section:
A. Light-based Gesture Recognition
Light or illumination tracking and controlling robots with light sensors is done in
many cases. Such robots are autonomous in nature. Generally, there are some light
sensors associated with the robot. The sensors send out rays of light and track them as
they get absorbed by the surface or reflected back. Using this, the robot can be a
line-sensing robot, where it is made to follow a black or a white path autonomously.
D. Sixth Sense Technology
Sixth Sense technology began in the 1990s with Steve Mann, who implemented a
wearable computing device via a neck-worn or head-mounted projector coupled with
a camera. Later, following his idea, Pranav Mistry, at that time a young research
scientist at MIT, came up with new applications of this technology. Pranav Mistry
came up with the name 'Sixth Sense Technology'; the system has also been named
Wear Ur World (WUW). This technology applies all of the techniques mentioned
above, designing applications that give an intuitive output with an internet
connection.
1.2 Review of literature
Stefan Waldherr, Roseli Romero, and Sebastian Thrun describe a gesture interface for
the control of a mobile robot equipped with a manipulator. The interface uses a
camera to track a person and recognize gestures involving arm motion. A fast,
adaptive tracking algorithm enables the robot to track and follow a person reliably
through office environments with changing lighting conditions. Two alternative
methods for gesture recognition are compared: a template-based approach and a
neural network approach. Both are combined with the Viterbi algorithm for the
recognition of gestures defined through arm motion (in addition to static arm poses).
Results are reported in the context of an interactive clean-up task, where a person
guides the robot to specific locations that need to be cleaned and instructs the robot to
pick up trash.
In this work, three fundamental components are required, as shown in Figure 1.
1.4.1 A. Microcontroller
The first requirement for the design of the robot is the microcontroller. In this work,
an Arduino based on the ATmega328 has been used. It is an open-source electronics
prototyping platform with 14 digital I/O pins, 6 analog inputs, a 16 MHz crystal
oscillator, a USB connection, a power jack, an ICSP header, and a reset button. In
Figure 1, it is marked as 1.
1.6 Block Diagram of the system
1.8 Receiving the data
The data is received from the Android Smartphone via the HC-05 Bluetooth module
on the digital pins of the Arduino microcontroller. It is then processed in the Arduino,
and the processed data is passed to the Adafruit motor shield, which drives the motors
based on it.
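As a rough illustration of this step, the sketch below maps a received command byte to a motor action. The byte values are hypothetical: the report does not list the actual bytes the Android application sends.

```cpp
#include <string>

// Hypothetical one-byte command protocol for the Bluetooth link. The actual
// bytes sent by the Android application are an assumption for illustration.
std::string decodeCommand(char c) {
    switch (c) {
        case 'F': return "forward";
        case 'B': return "backward";
        case 'L': return "left";
        case 'R': return "right";
        case 'S': return "stop";
        default:  return "ignore";  // unknown bytes are discarded
    }
}
```

On the Arduino itself, each decoded action would be translated into the corresponding motor-shield call.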
Figure 5: Gestures for movement of the robot
Figure 5 shows the gestures to control the movement of the robot. When the user tilts
his hand forward, the gesture is recognized as the forward movement, and the robot
moves in the forward direction. The angle of the tilt, or the difference between the
angle of tilt of the user's hand and the threshold value of the forward-movement
gesture, determines the speed of the robot. When the user tilts his hand in the right
direction, the gesture is recognized as the right turn, and the robot moves in the right
direction. When the user tilts his hand in the left direction, the gesture is recognized
as the left turn, and the robot moves in the left direction. The angle of the tilt of the
user's hand determines whether the left or right turn is a normal turn or a sharp turn.
A sharp turn is one in which the car changes direction without slowing down before
turning. When the user tilts his hand backwards, the gesture is recognized as the
move-backward gesture, and the robot moves in the backward direction. If the user's
hand is somewhere between two gestures, i.e., the accelerometer value is between the
thresholds of two directions (forward and left turn, left turn and backwards,
backwards and right turn, forward and right turn), then the robot moves in that
diagonal direction.
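The direction logic above, including the diagonal cases, can be sketched as follows. The 15-degree threshold and the pitch/roll convention are assumed values for illustration; the report does not state the exact threshold it uses.

```cpp
#include <string>

// Sketch of the tilt-to-direction logic described above, including the
// diagonal cases. The 15-degree threshold is an assumed value.
std::string classifyTilt(double pitchDeg, double rollDeg) {
    const double T = 15.0;  // assumed gesture threshold
    bool fwd   = pitchDeg >  T;
    bool back  = pitchDeg < -T;
    bool right = rollDeg  >  T;
    bool left  = rollDeg  < -T;
    if (fwd && right)  return "forward-right";
    if (fwd && left)   return "forward-left";
    if (back && right) return "backward-right";
    if (back && left)  return "backward-left";
    if (fwd)   return "forward";
    if (back)  return "backward";
    if (right) return "right";
    if (left)  return "left";
    return "stop";  // hand roughly level: no movement
}
```

A speed value proportional to how far the tilt exceeds the threshold would implement the speed behaviour described above.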
Figure 6: Movement of the Motors and Wheels
thought, the system will allow the user to control it in a way that reduces the gap
between the physical world and the digital world with an output more intuitive.
CHAPTER #2
DESIGN
2 Design
The car must respond quickly and accurately to every given gesture. When the
user gives a command through the glove, the car should respond within 1
second.
The glove must be relatively comfortable to wear and to make gestures in. The
size of the glove will be between 8.5 and 9 inches. With all the electrical
components installed on the glove, it should still fit most people's hands.
Communication between the car and the glove must be easy to set up. There
are two separate switches, one on the glove and one on the car; when both are
turned on, the communication starts.
2.2 Physical Design Diagram
2.2.2 Physical Design Diagram of Car
2.3.2 Battery A
A 12V Energizer A23 battery will be used to provide power. The battery weighs only
0.8 ounces, with dimensions of 2.9x1.8x0.5 inches. Adding up the power consumption
and current draw of each individual component on the glove, the maximum total
current is 18.97mA and the maximum power is 67.43mW. Therefore, one battery will
be sufficient to provide power for 3-4 hours.
Requirements:
1. Must be able to provide a voltage of 7-12V under continuous operation
(minimum input voltage for the regulator is 7V).
2. Must stay under 60C under all conditions.

Verifications:
1. a. Before using the glove each time, use a voltmeter to measure the voltage
level of the batteries.
   b. If the voltage level drops close to 7V, meaning the battery is about to
be dead, change to a new battery.
2. a. Use a hand to feel the temperature; if fingers cannot stay on the battery
for over 1 second, take the battery out and let it cool down.
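As a sanity check on the 3-4 hour claim, the runtime can be estimated from battery capacity and load. The roughly 55 mAh capacity of an A23 cell used below is a typical datasheet value and is an assumption, not a figure from this report.

```cpp
// Rough runtime estimate: capacity divided by load. The ~55 mAh capacity
// of an A23 cell is a typical datasheet value and is an assumption here.
double runtimeHours(double capacity_mAh, double load_mA) {
    return capacity_mAh / load_mA;
}
```

With the 18.97 mA maximum load stated above, 55 mAh gives roughly 2.9 hours, of the same order as the 3-4 hour figure (the maximum load is not drawn continuously, which would extend the runtime).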
Power Consumption:
The operating voltage is 4.8-5.2V, and the current is 5mA-1A. Therefore, the power
will be 0.024-5.2W.
Requirements:
1. Provides 5V +/-5% from a 12V source at 1A output current.
2. Maintains temperature under 125C.

Verifications:
1. a. Use a constant-current circuit to draw 1A from this linear voltage
regulator.
   b. Use an oscilloscope to monitor the output voltage, ensuring it is within
the 5V +/-5% range.
2. a. While doing the step above, use an IR thermometer to ensure the IC
temperature stays under 125C.
2.3.5 Flex sensor
Three flex sensors will be used to sense the curl of the fingers (this gesture controls
the drive/stop as well as the speed of the car). They will be attached to the backs of
the three longest fingers. A flex sensor is essentially a variable resistance from
30kOhm to 130kOhm; the circuit must keep the voltage going into the
microcontroller ADC pin between 1.389V and 3.125V. Below is a circuit diagram of
feeding the voltage related to the bend of a flex sensor into a microcontroller
(denoted as MP in the diagram).
For component testing, we built the above circuit and used an Arduino board to read
the analog data at several bending degrees for each of the three flex sensors. Below is
a table of bending degree vs. ADC readings.
Below is a table of the bending degree-vs-ADC readings.
ABC Reading
Bending Degree Flex Sensor A Flex Sensor B Flex Sensor C
0 degrees 600 630 618
30 degrees 470 470 466
60 degrees 400 408 398
90 degrees 350 356 354
Requirement:
The input voltage range to the microcontroller pins (shown in the circuit diagram
above) must vary from 1.389V to 3.125V.

Verifications:
a. As above, connect a voltmeter across the 50kOhm resistor (flex sensor is
stretched now); ensure the voltage is around 3.125V.
b. Gradually bend the flex sensor; ensure the voltage gradually decreases.
c. When the flex sensor is bent to its limit, ensure the voltage is around 1.389V.
2.3.6 Accelerometer
An accelerometer will be used to sense the orientation/tilt angle of the glove, which
controls the turning of the car. The model we will use is the ADXL335 3-axis
accelerometer. It has ultra-low power consumption (typically 350 μA) and a very
small size. The power supply range is 1.8-3.6V. The accelerometer has three analog
output pins that will be connected to the analog input pins of the ATmega328 chip.
This is how the sensor data will be transferred to the microcontroller for further
processing. The three analog outputs of the accelerometer, denoted Ax,out, Ay,out,
and Az,out, can be converted to three tilt angles (with respect to the three sensing
axes) using the formulas below:
Figure 5. Accelerometer on a flat surface
Figure 6. Accelerometer at a tilted orientation
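The tilt formulas referred to above did not survive extraction into this text. The standard gravity-vector relations for a three-axis accelerometer, assumed here to be what the report intended, can be sketched as:

```cpp
#include <cmath>

// Standard gravity-vector tilt relations for a 3-axis accelerometer,
// assumed to be the formulas the report intended. Inputs are accelerations
// in g; outputs are tilt angles in degrees about the X and Y sensing axes.
const double RAD_TO_DEG = 180.0 / 3.14159265358979323846;

double tiltX(double ax, double ay, double az) {
    return std::atan2(ax, std::sqrt(ay * ay + az * az)) * RAD_TO_DEG;
}
double tiltY(double ax, double ay, double az) {
    return std::atan2(ay, std::sqrt(ax * ax + az * az)) * RAD_TO_DEG;
}
```

On a flat surface (ax = ay = 0, az = 1 g) both tilt angles are zero, consistent with Figure 5.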
Power consumption:
Typical operating power: 350 μA at a 3V supply (1.05mW); maximum: 375 μA at
3.6V (1.35mW); minimum: 200 μA at 2V (0.4mW).
Requirements:
1. The ADXL335 must be able to sense and transfer its output at a minimum
frequency of 100 Hz.
2. Its analog voltage output must change at least 100mV per 15 degrees of
rotation around any axis.

Verifications:
1. a. Connect the ADXL335 to an Arduino Uno.
   b. Set up properly and write code to receive real-time output data from the
accelerometer.
   c. Use the timing functionality in Arduino to calculate the frequency of the
ADXL335 output data, and make sure it is above 100 Hz.
2. a. Connect the ADXL335 to an oscilloscope.
   b. Manually rotate the chip around any axis in steps of 15 degrees.
   c. Record the corresponding axis's output voltage and make sure it changes
by at least 100mV.
   d. Repeat for a total rotation of ±90 degrees for all three sensing axes.
2.3.8 Microcontroller A
We will use the ATmega328 microcontroller for the control module on the glove. The
SPI serial port on the ATmega328 will handle SPI communication with the transmitter
device. The 6-channel 10-bit A/D converter will convert the analog inputs from the
sensor modules to digital signals. After bootloading the ATmega328 chip, we will
build an Arduino circuit that enables us to program the microcontroller through the
Arduino Software (IDE) and send the program to the memory on the ATmega328 via
USB cable.
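The 10-bit A/D converter maps the 0-Vref input range onto integer codes 0-1023. A minimal sketch of the inverse conversion, assuming a 5V reference as on a USB-powered Arduino:

```cpp
// The 10-bit A/D converter maps 0..Vref onto codes 0..1023. Converting a
// raw reading back to volts (the 5 V reference is an assumption, matching
// a USB-powered Arduino board):
double adcToVolts(int reading, double vref) {
    return reading * vref / 1023.0;
}
```

For example, the unbent flex-sensor reading of 600 in the table earlier corresponds to about 2.93V under this assumption.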
Figure 7. Pin map of Atmega328
Power Consumption:
If the 8MHz internal oscillator is used, the operating voltage is 3.43V and the current
is 3.6mA; therefore, the power will be about 12.3mW.
If a 16MHz external oscillator is used, the operating voltage is 3.3V and the current is
6.6mA; therefore, the power will be about 21.8mW.
Requirements:
1. The ATmega328 chip must be bootloaded before use.
2. Vcc has ratings of 1.8V to 5V, maximum 6V. The voltage on an A/D pin with
respect to ground can range from -0.5V to Vcc+0.5V.

Verifications:
1. Before pulling the ATmega chip out of the Arduino board, make sure it has
been programmed several times.
2. Before connecting any component to the microcontroller, measure its voltage
with a multimeter to check that it is in the acceptable range.
2.3.10 Transmitter Device
We will use an NRF24L01 transceiver module to wirelessly transmit real-time data
from the glove side to the car side. It will pair with another identical NRF24L01
transceiver module on the car (set as receiver) to form the communication link. The
transceiver interfaces with the ATmega328 microcontroller through digital SPI
communication. The maximum SPI speed is 8 Mbps and the on-air data rate is 1 or
2 Mbps, both of which are fast enough. The range of operation is up to 100 m, which
is enough for our project. During normal operation, the microcontroller will
continuously send real-time data to the transceiver for it to transmit to the receiver on
the car. There is an RF24 library for the transceiver available online, which will make
programming the wireless data transfer less difficult. The operating voltage of the
NRF24L01 is 1.9-3.6V. The current consumption is very low: only 9.0mA at an
output power of -6dBm and 11.3mA at an output power of 0dBm. Another advantage
of this transceiver is its small physical size.
Power consumption:
The operating voltage is 1.9-3.6V. The supply current for one channel at 1000 kbps is
11.8mA, and at 2000 kbps is 12.3mA. Thus the power consumption range is
22.42-44.28 mW.
Requirements:
1. It must be able to receive the custom data packages from the microcontroller
and transmit them to the receiver NRF24L01 15m away, at a minimum rate of
100 packages per second.
2. The VCC pin on the transceiver module should connect to a 3.3V supply.

Verifications:
1. a. Connect one NRF24L01 transceiver to an ATmega328 microcontroller.
   b. Set the transceiver as transmitter.
   c. Connect another NRF24L01 to another ATmega328, placed 15m away from
the transmitter.
   d. Set it as receiver.
   e. Through the ATmega328 programming interface, send custom data packages
to the transmitter at a rate of 120 packages per second.
   f. Through the ATmega328 programming interface at the receiving end, record
the number of packages received in one second. Make sure at least 100
packages are received by the receiver and sent to the microcontroller.
2. a. Before connecting VCC, use a multimeter to measure the voltage to make
sure it is 3.3V.
34
2.3.11 Receiver Device
For the receiver device, we will use another NRF24L01 transceiver module by setting
it as receiver. It will continuously receive the real-time data sent by the transmitter
NRF24L01, and then send it to the microcontroller B on the car. The communication
between the receiver and microcontroller B is the same as that between the transmitter
and microcontroller A, except that the direction of the data transfer is reversed. The
specs of the receiver are also the same as those of the transmitter, since they are two
identical NRF24L01 modules.
Requirements Verification
1. Requirement: The receiver must receive data packages and send them to the
microcontroller with minimum 95% accuracy (compared with the original data
packages at the transmitting end).
Verification:
a. Do the same steps as in the verification for the transmitter, except this time,
send 100 packages only once from the transmitter end.
b. Through the ATmega328 programming interface at the receiving end, record
the received packages and make sure at least 95 of them match the original
packages.
2. Requirement: The VCC pin on the transceiver module should connect to a 3.3 V
supply.
Verification:
a. Before connecting the VCC, use a multimeter to measure the voltage to make
sure it is 3.3 V.
Power consumption:
The operating voltage is 1.9-3.6 V. The supply current is 11.8 mA at 1 Mbps and
12.3 mA at 2 Mbps. Thus, the power consumption range is 22.42-44.28 mW.
2.3.13 Battery B
Two 12V Energizer A23 batteries directly meet the voltage requirement of the two
12V motors. Each battery weighs only 0.8 ounces, with dimensions of 2.9x1.8x0.5
inches, and has enough power to continuously support the components on the car. Even
as the batteries lose power and their voltage level drops, they will still be able to
correctly operate the two motors on the car.
Requirements Verification
1. Requirement: Must be able to provide a voltage of 7-12V under continuous
operation (minimum input voltage for the regulator is 7V).
Verification:
a. Before using the glove every time, use a voltmeter to measure the voltage
level of the batteries.
b. If the voltage level is dropping close to 7V, meaning the battery is nearly
dead, change to a new battery.
2. Requirement: Must stay under 60°C under all conditions.
Verification:
a. Use a hand to feel the temperature; if fingers cannot stay on the battery over
1 second, take the battery out and let it cool down.
Power Consumption:
The operating voltage is 4.8-5.2V, and the current is 5mA-1A. Therefore, the power
will be 0.024-5.2W.
Requirements Verification
1. Requirement: Provides 5V +/- 5% from a 12V source at 1A output current.
Verification:
a. Use a constant-current circuit to draw 1A from this linear voltage regulator.
b. Use an oscilloscope to monitor the output voltage, ensuring it is within the
5V +/- 5% range.
2. Requirement: Maintain temperature under 125°C.
Verification:
a. When doing the step above, use an IR thermometer to ensure the IC
temperature stays under 125°C.
2.3.16 Ranging Sensor
We will use HC-SR04 Ultrasonic Ranging Sensor Module for distance measurement.
It will be placed at the front of the car and will face forward. It can provide stable and
accurate distance measurements from 2cm to 400cm. The ranging accuracy is up to
3mm. The working frequency is 40 kHz. The measuring angle is 15 degrees. All these
specs suit our collision detection task. The sensor module will be connected with the
pins of the microcontroller B and send distance measurements to the microcontroller
for further usage. For component testing, we connected the ultrasonic sensor to an
Arduino Uno board, continuously read in the sensing data and displayed it on the
Serial Monitor. We used a hand as the object in front of the sensor and tested the
sensing accuracy at various distances. Below is a table of the actual distance vs. the
sensor output.
Actual (cm): 1  2  3  4  5  6  7  8  9  10  11  12  13  14  15  16  17  18  19  20
Sensed (cm): 3  2  2  3  4  6  7  7  8  10  11  12  13  13  15  16  17  17  18  20
Power consumption:
The operating voltage is 5V and the rated current is 15mA. Therefore, the power is 75
mW.
Requirements Verification
1. Requirement: The sensor must provide the distance measurements of an object
placed in a range of 30cm-2m away in front of the car with at least 90% accuracy.
Verification:
a. Connect the sensor with an ATmega328 board and set it up properly.
b. Place an object in front of the sensor, starting from 30cm away and ending at
2m, with steps of 10cm.
c. At each step, record the distance measurement by the sensor.
d. Calculate the accuracy and make sure it is above 90% for all the steps.
2.3.18 Microcontroller B
We will use the ATmega328 microcontroller for the control module on the car. The SPI
serial port on the ATmega328 will handle the SPI communication with the receiver
device. Port D (PD[7:0]), an 8-bit bi-directional I/O port with internal pull-up
resistors, will be responsible for the PWM output for the motors on the car. After
bootloading the ATmega328 chip, we will build an Arduino circuit which enables us
to program the microcontroller through the Arduino Software (IDE) and send the
program to the memory on the ATmega328 via USB cable.
Power Consumption:
If an 8MHz (internal) oscillator is used at an operating voltage of 3.43V and a current of
3.6mA, the power will be 12.3 mW.
If an 8MHz (internal) oscillator is used at an operating voltage of 3.3V and a current of
6.6mA, the power will be 21.8 mW.
Requirements Verifications
1. Requirement: The ATmega328 chip must be bootloaded before use.
Verification: Before pulling the ATmega chip out of the Arduino board, make sure
it has been programmed several times.
2. Requirement: The Vcc has ratings of 1.8V to 5V, maximum 6V.
Verification: Before connecting any component to the microcontroller, measure its
voltage using the multimeter to check that it is in the acceptable range.
2.3.20 Motors
Two PAN14 motors will be used to drive our car. The PAN14 is one of the cheapest
high-performance motors. One PAN14 motor weighs only 39g. It has a rated
voltage of 12V. Under its rated load (4.9 mN·m), it draws a rated load current of 0.563A.
The motors will control the speed of the wheels. A PWM signal generated by the
microcontroller is sent to the RFP12N10L power MOSFET, which is connected with
a motor in series. The speed of the motors can be adjusted by changing the duty
cycle of the PWM signal. The car will turn left and right with the two motors spinning at
different speeds. They are fed control signals from microcontroller B and make the
individual wheels rotate accordingly.
Figure 8. Physical Diagram of Motor
Requirements Verification
1. Can operate with VDC 12V+/-1V 1, 2.
2. Motor current doesn’t go over 1.02A a. Connect motor with an Energizer
under any operating condition A23 battery(12V). Battery doesn’t
have to be full
b. Install wheel on the motor
c. Use a voltmeter to monitor the
battery voltage; use a current probe
to monitor the motor current
d. Ensure both of the values are within
the range
Figure 9. Power MOSFET Circuit Schematic
Requirements Verification
1. Requirement: The RFP12N10L must have a voltage drop of less than 0.5V.
Verification:
a. Connect the RFP12N10L power MOSFET with the motor and an Arduino Uno.
b. Connect a voltmeter across the MOSFET.
c. Run code changing the duty cycle by 20% every 2 seconds.
d. Monitor the voltage across the MOSFET, ensuring it is within the range.
2. Requirement: It must react effectively when the PWM duty cycle changes.
Verification:
a. Keep the connection from Step 1 and keep the code running.
b. Watch the motor to see if it reacts quickly to every PWM duty cycle change.
2.3.22 Wheels
There will be three wheels on the car. The two front wheels will be individually powered
by two motors. The third wheel is a dummy wheel without any motor connection,
placed at the center of the rear. It can rotate freely through all degrees, and its main
purpose is to make the turning of the car smoother.
Requirements Verification
1. Requirement: Wheels can stay stably on the motor at full motor speed.
Verification:
a. Install wheels on both motors and put the car upside down.
b. Connect the motors to a 12V battery and ensure they operate at full speed.
Let them run for 60 seconds.
c. Check to see if the wheels are still installed tightly on the motors.
2.3.23 Circuit Schematics
Figure 11. Algorithm of the microcontroller on the glove
Figure 12. Algorithm of the microcontroller on the car
The important feature of the accelerometer is its measurement sensitivity, namely the
voltage change per given amount of acceleration change. As can be seen in the plots
below, the typical sensitivity is 0.303 V/g for x-axis, 0.306 V/g for y-axis, and 0.300
V/g for z-axis.
Figure 13. X-Axis Sensitivity at 25℃, Vs = 3V
Figure 14. Y-Axis Sensitivity at 25℃, Vs = 3V
For our task of hand inclination measurement, the static acceleration due to gravity
will change between +g to -g, corresponding to the hand rotation between +90
degrees to -90 degrees. We need to consider how much the sensitivities can deviate
from the expected values so that the microcontroller will still be able to obtain the
accurate hand orientation measurement. The ATmega328P microcontroller has a 10-bit
ADC, which provides 1024 resolution levels. As a result, it can resolve voltage steps of
5V/1024 = 0.005V. If we want to be able to distinguish a change of 15
degrees in tilt angle (a 2g/12 = g/6 change in acceleration), the minimum sensitivity of any
axis is 0.005V per g/6, which is 0.03V/g. From the large number of samples in the
plots above, even with potential deviations, all three axes' sensitivities are still much
larger than the minimum requirement. This large margin also allows other external
electrical noise and voltage level fluctuations to exist without affecting the measuring
accuracy.
2.5 Safety and Ethics
There are several safety issues arising from our project, both during the design
process and in the applications. During the design period, the first big safety aspect is
the usage of batteries. We rely on batteries to power all other electrical components.
Well-designed circuitry for connecting the batteries, careful storage and handling of
the batteries, and the prevention of battery overheating during extended usage are all
extremely important to safety when dealing with batteries. Second, there are many
safety concerns related to building electrical components. These concerns include:
soldering safety, circuit testing with power on, and safety of usage of external devices.
To minimize the risks, we must comply with the standard safety rules. Third, there’s
some mechanical work involved in this project. There are safety issues of conducting
mechanical work as well, and we must again comply with the standard safety rules.
Finally, remote controlled mobile car also poses safety issues. We can’t guarantee the
car will be controlled as intended all the time. As a result, we need to make sure the
car will not crash with anything or anybody to cause any hazard when testing. This
can be done by choosing an empty field, or putting protective barriers on the field. All
these safety analysis and precaution/prevention efforts align with the IEEE Code of
Ethics, #1: “to hold paramount the safety...” [6]. On the application side of our
project, there is also a noticeable potential safety concern. Users sometimes might
give bad control commands due to inappropriate usage or even malicious intention. In
addition, a hacker might be able to hack into a user’s control system and make the
mobile device do ill-intended things. In such cases, the mobile device would cause
hazard, such as hitting people or damaging infrastructure. This is against the IEEE
Code of Ethics, #9: “to avoid injuring others, their property, reputation, or employment
by false or malicious action” [6]. One precaution is to build an automatic stop function
so that the device will halt when it’s too close to certain types of objects.
CHAPTER #3
UML Diagrams
3 UML Diagrams
Design Description
The CAUHGCS system consists of two different components working together: the Engine
and the Interface. First, the Engine part tracks a hand gesture
and sends it to the Interface. The Interface translates the gesture to the appropriate command, and
if it is valid, the corresponding menu or submenu transition or the corresponding action is
performed. The ER diagram of the cooperation of these two parts, Engine and Interface,
can be seen in Figure 1.
3.1 ER Diagram
The CAUHGCS system consists of two main data structures named Gesture and Menu. These
data structures are created, modified, or used by the two modules of the system, Engine and
Interface. The Engine part of the CAUHGCS system tracks a hand movement or gesture and creates a
Gesture object for it. The Engine sends that Gesture object to the Interface, and the
current menu or submenu, reflected on the screen by the Interface module, is changed
according to that gesture.
3.2 Class Diagram
The Interface module is the module that gets the Gesture object from the Engine module and decides
the action corresponding to the gesture. Only one main data structure is used in the
Interface module: the Current Menu object. The Current Menu object only holds a pointer to the
current menu shown on the screen.
To make this clearer, the data flow diagram of an action that invokes the Call function
is shown in Figure 3.
3.3.3 Dynamic Behaviour
As seen in the figure below, Gesture Recognition gets the signal and creates the proper gestures
if the user makes valid hand movements. Otherwise, when the user makes meaningless hand
movements that do not exist in the hand gesture data set, no gesture is created.
3.4.1 Recognition Gesture
Recognition is the component that basically recognizes the hand movements for the system.
CHAPTER #4
Hardware Used
&
Technical Specifications
4 Hardware Used & Technical Specifications
Robots are playing an important role in automation across sectors like
construction, military, medical, and manufacturing. After making some basic robots,
such as a line follower robot and a computer controlled robot, we have developed
this accelerometer-based gesture controlled robot using an Arduino Uno. In this project
we have used hand motion to drive the robot. For this purpose we have used an
accelerometer, which measures acceleration.
4.2.2 Receiver:
4.3 DC Motors
4.4 Accelerometer:
The accelerometer uses very little current, so it can be plugged into your board and
run directly off of the output from the digital output pins. To do this, you'll use three
of the analog input pins as digital I/O pins, for power and ground to the
accelerometer, and for the self-test pin. You'll use the other three analog inputs to read
the accelerometer's analog outputs.
4.4.3 Using the Voltage Reference:
For the best possible accuracy and precision, you can use the output of the
accelerometer board's voltage regulator as the analog reference for the Arduino.
Connect the 3Vo pin on the accelerometer board to the AREF pin on the Arduino.
4.4.4 Schematic:
A gesture controlled robot is controlled using the hand in place of any other method
such as buttons or a joystick. Here one only needs to move the hand to control the robot. A
transmitting device worn on the hand contains the RF transmitter and the accelerometer.
It will transmit commands to the robot so that it can perform the required tasks:
moving forward, reversing, turning left, turning right, and stopping. All these tasks are
performed using hand gestures.
Here the most important component is the accelerometer. The accelerometer is a 3-axis
acceleration measurement device with a ±3g range. It is made using a
polysilicon surface sensor and a signal conditioning circuit to measure acceleration. The
output of this device is analog in nature and proportional to the acceleration. The
device measures the static acceleration of gravity when we tilt it and gives a result
in the form of motion or vibration.
According to the ADXL335 datasheet, a polysilicon surface-micromachined structure is
placed on top of a silicon wafer. Polysilicon springs suspend the structure over the
surface of the wafer and provide resistance against acceleration forces. Deflection of
the structure is measured using a differential capacitor which incorporates independent
fixed plates and plates attached to the moving mass. The fixed plates are driven by
180° out-of-phase square waves.
Acceleration deflects the moving mass and unbalances the differential capacitor
resulting in a sensor output whose amplitude is proportional to acceleration. Phase-
sensitive demodulation techniques are then used to determine the magnitude and
direction of the acceleration.
4.6 Motor Driver
At the receiver end we have used an RF receiver to receive the data, which is then applied to the
HT12D decoder. This decoder IC converts the received serial data to parallel, which is then
read by the Arduino. According to the received data, we drive the robot using two DC
motors forward, in reverse, left, right, and to a stop.
4.8 Working
The gesture controlled robot moves according to hand movement, as we place the transmitter
on our hand. When we tilt the hand forward, the robot starts moving forward and
continues moving forward until the next command is given.
When we tilt the hand backward, the robot changes its state and starts moving in the
backward direction until another command is given.
When we tilt it to the left side, the robot turns left until the next command.
When we tilt the hand to the right side, the robot turns right.
And to stop the robot, we keep the hand stable.
Circuit Diagram for Transmitter Section
Circuit Diagram for Receiver Section
The circuit for this hand gesture controlled robot is quite simple. As shown in the above
schematic diagrams, an RF pair is used for communication and connected to the Arduino.
The motor driver is connected to the Arduino to run the robot. The motor driver's input pins 2, 7, 10
and 15 are connected to Arduino digital pins 6, 5, 4 and 3 respectively. Here we
have used two DC motors to drive the robot: one motor is connected to motor driver output
pins 3 and 6, and the other motor is connected to pins 11 and 14. A 12 V
battery is also used to power the motor driver for driving the motors.
4.9 Implementation
The following code snippet was run to initiate a forward movement by the car.
digitalWrite(phaseRight, HIGH);
digitalWrite(breakRight, HIGH);
digitalWrite(enableRight, HIGH);
digitalWrite(phaseLeft, HIGH);
digitalWrite(breakLeft, HIGH);
digitalWrite(enableLeft, HIGH);
The next step was to perform backward movement of the car. This was achieved by
setting the phaseRight and phaseLeft values to LOW. Turning one side of the
wheels in the forward direction and the other side in the backward direction allowed
the car to spin on the spot. For movements in the right or left directions, the wheels on the
respective side were made to move forward while the wheels on the opposite side
were kept still. At this stage, the Arduino was powered via a USB connection with the
computer and Serial communication was started using the following code snippet.
Five different commands were decided based on the values of the angles being
tracked: 'F' was sent for forward motion of the car, 'B' for backward motion, 'R' for
motion in the right direction, 'L' for motion in the left direction, and a
default value was considered stop. Similar to the previous gesture design, a unique
character was used as an event trigger for alerting the Arduino.
However, it was seen that the skeleton tracking gestures were not user-friendly and
resulted in fatigue in the arms frequently. Therefore, the gesture design was changed
again and a more comprehensive design, based on the hand tracking design discussed
in the previous chapter, was implemented.
4.9.3 Final design
As previously mentioned, this design is based on a simple idea with a control pad kind
of GUI. First, the environment was set up; the essential variables were declared and
the required objects were created. The following code fragment shows a part of the
initial setup.
After the initial setup, the GUI was created using a newly defined control pad object
which had associated parameters like height and width of screen, number of buttons
on the control pad, size of each button, shape of buttons, etc. Three states were defined
for the rectangular buttons at any point in time: the buttons could either be selected,
active but not selected, or rendered useless owing to an expired session. Selected
buttons were represented in blue, active but unselected buttons were
represented in green, and an inactive control pad was drawn in red.
The code fragment given below was used to draw the GUI for the system.
An if-else construct was used to determine the command to be sent to the car based on
the position of the active hand at any point in time. This selected command was
transmitted to the Arduino via Serial communication. To aid the user experience,
descriptive text suggesting the commands being given was displayed on the
screen. Frequent callbacks to the session and to XnVPointControl, the NITE object
used for hand tracking, were also made in the code.
Initially, the introduction of the delay led to reduced speed, but the motion obtained
was jerky in nature. After trials with multiple delay values, a reasonably smooth
motion was acquired while sustaining the required slow speed.
However, these delays were only introduced for the forward and backward
movements as a smooth spinning motion could not be attained with slow speed. The
values finally used were 60ms of forward/backward motion followed by stalling of
the car for 80ms.
At the end of the project, a working implementation of the car and the gesture design
were obtained. The following picture shows the car at the end of the implementation:
4.11 Complete Outcome Structure of Hand Gesture:
CHAPTER #5
Result and Analysis
5 Results and Analysis
You can see the circuit for the receiver power supply on the right. Using this diagram,
wire up the supply circuit. You can also add an LED via a 1k resistor to indicate the
state of the power supply.
- IC 7805, which regulates the 12V supply to 5V (if you can't get a 12V supply
you can use a 9V supply)
- 0.1uF and 470uF capacitors
- 1k resistor for the status LED
NOTE: Use a heat sink for the 7805 because we are dropping 7V (12-5), so a lot of
heat will be produced, enough to damage the regulator; usage of a heat sink is
recommended.
This is just an illustration of the transmitter:
5.4 Make the Receiver:
Wire the circuit as per the above receiver schematic. There are 2 LEDs on the receiver
board: one lights up when power is supplied to the receiver, and the other
when power is supplied to the transmitter circuit. The LED near the IC HT12D
should light up, indicating a valid transmission (VT), when power is given at the
transmitter; if not, there is something wrong with your connection or your RF
TX-RX module.
GND — This pin should also be connected to the negative of the power supply.
Osc 1,2 — These pins are the oscillator input and output pins. They are connected
to each other with an external resistor.
Output — This is an output pin. The data signal is given out from this pin.
Vcc — The Vcc pin is connected to the positive power supply; it is used to power the IC.
VDD and VSS: These pins are used to provide power to the IC, from the positive and
negative of the power supply respectively.
DIN: This pin is the serial data input and can be connected to an RF receiver output.
A0 – A7: These are the address inputs. The status of these pins should match the
status of the address pins on the HT12E (used in the transmitter) to receive the data.
These pins can be connected to VSS or left open.
D8 – D11: These are the data output pins. The status of these pins can be VSS or VDD
depending upon the serial data received through pin DIN.
VT: stands for Valid Transmission. This output pin will be HIGH when valid data is
available at the D8 – D11 data output pins.
OSC1 and OSC2: These pins are used to connect an external resistor for the internal
oscillator of the HT12D. OSC1 is the oscillator input pin and OSC2 is the oscillator
output pin.
L293D (Motor Driver): The L293D is a motor driver IC that allows a motor to be driven
in both directions. The L293D is a 16-pin IC with eight pins on each side dedicated to
controlling a motor; it can control a set of two DC motors at the same time in either
direction. With one L293D we can control 2 DC motors; there are 2 INPUT pins,
2 OUTPUT pins and 1 ENABLE pin for each motor. The L293D consists of two H-bridges.
The H-bridge is the simplest circuit for controlling a low current rated motor.
5.7.1 L293D IC
Place the IC on a breadboard, give 5V and GND to the IC, and then give 12V to
pin 8. Connect the enable pins of the motors to 5V. Now, give power to the input of
one motor and check the output pins with a multimeter. If it shows nothing, then there
is a problem with your motor driver.
5.8 What gestures will the robot recognize?
This robot is designed to recognize five gestures: forward, backward, left,
right and stop. You will get a better idea if you check the photos of the gestures
given below.
Figures: hand gestures for Stop, Right, Left, Forward and Backward (flat-hand positions).
5.9 Testing and Evaluation
As the project moved towards completion, the individual parts were tested before
integrating them into the rest of the system. At the end of each iteration, integration
tests were performed, and at the end of the project, full system tests were performed.
Therefore, testing was a continuous process during the implementation, and it aided in
the early identification of any problems or bugs.
This chapter outlines the set of tests that were carried out during development in order
to gauge the performance of the system.
5.10 Automotive
The car was first tested without an Arduino attached to it. It was seen that the car,
by default, performed moves in the forward direction at its maximum possible speed.
Next, an Arduino Uno was connected to the car via the ribbon wire attached to the
modified chip of the car. All possible motions were then tried individually on the car.
It was then seen that the car behaved erratically and did not follow the given
commands for all motions successfully at all times. A discrepancy was found in the
reactions to the same command, and this fault was attributed to loose connections in
the driver chip of the car. Therefore, as a solution, the faulty car was replaced by a
properly functioning model.
The new car successfully followed all commands and performed motions in the
forward, backward, diagonally forward and diagonally backward directions in a
consistent manner. It also performed spinning on the spot.
This lag was attributed to the delay that was induced to control the speed of the car
through PWM. Initially, for forward motion of the car, a delay of 60ms was induced
after a forward high signal was given, followed by a break signal for 80ms.
This ran in a loop. Therefore, each time a change in the gesture command was
made, a very noticeable delay of 60-80ms or more was felt before the
respective motion for the new command could be performed by the car.
A solution to this problem was found by reducing the values of the induced delays
such that the speed is controlled and the interim delay time between motions is also
made unnoticeable. The final delay values settled on 15ms of forward motion
followed by stalling of the car for 5ms.
A considerable amount of lag was also reduced by removing the unnecessary debug
information which was being printed on the Serial monitor of the Arduino IDE during
the development procedure.
After having eliminated the lag from the wired implementation of the system earlier, a
reintroduction of a lag or delay was seen with the use of the wireless modules. In
order to eliminate this delay, the following attempts were made:
The NRF receives a continuous data stream via Serial communication and therefore does
not wait long before packetizing and transmitting the data. The 'packetization timeout'
(RO) is a measure of the amount of time the module will wait before transmitting the
data. Since, in this software design, only two characters need to be transmitted at once,
the RO value can be kept small to reduce the transmission delay time. At a baud rate
of 9600 bps, approximately 1ms is required for transmission of one character.
Therefore, the default value of RO was reduced from 3 to 1 to reduce that delay.
However, this did not prove to be a solution and a delay was still being experienced.
A few more possible solutions were tried next, including setting a slower or
faster baud rate ranging from 4800 bps to 115200 bps, changing the configuration of
the wireless modules from coordinator and end device to coordinator and router, and
changing the operation mode of the wireless modules from the simpler transparent
mode (AT) to the more reliable but complicated API mode. These measures were also
implemented without much success, and the delay remained.
Finally, the delay was attributed to the 6 V battery which was powering the
Arduino, the NRF end device connected to the Arduino, and the car. This voltage was
not enough for the smooth functioning of the three devices together; hence, a new
9 V battery was introduced to power the Arduino and the accelerometer module while
the 6 V battery was kept for the car. This resulted in the successful elimination of the delay.
5.14 Range of wireless
This test was performed to gauge the range of the wireless modules. Since there were
no provisions to allow testing of the modules over very long distances outdoors, the
testing was done mostly indoors in large rooms. It was seen that the NRF modules had
a comfortable range of more than 1 km indoors.
5.15 Battery
In this test, the car was run using a fully charged battery until the battery was
exhausted. It was seen that the battery lasted for 28 minutes before it reached a state
of exhaustion.
Chapter#6
Conclusion
6 Conclusion
Over the past few years, gesture recognition has become increasingly popular due to
its frequent appearance in the entertainment and gaming markets. Now, it is rapidly
becoming a commonplace technology, allowing humans and machines to interface
more easily in the home, work and automobile environments. It has become easy to
imagine a person sitting on a couch, controlling the lights and TV with a wave of a
hand. This and other capabilities are being realized as gesture recognition
technologies enable natural interactions with electronics that surround us.
Gesture recognition has varied applications ranging from sign language recognition,
such as PSLT app (Portable Sign Language Translator) which recognizes and
translates sign language into text, to medical rehabilitation and virtual reality. With
the extensive research and documentation available on gesture recognition it is easy to
overlook the challenges associated with the use of the technology. One prominent
problem associated with gesture recognition is the existence of numerous underlying
assumptions which might be true in a controlled environment but cannot be
generalized to real-world scenarios. Commonly, ambient lighting conditions and
stationary backgrounds with high contrast are assumed, but in reality this is not the
case. Also, the subjective nature of the use of this technology is often ignored and
generalizations are presented based on work by a few researchers.
Although this technology is still at a nascent stage, it is evident that gesture
recognition has come a long way, especially for vision based methods, but it remains
that extensive further research into areas of gesture classification and representation
is much needed to realize the full potential of this technology.
6.2 Summary
Overall, the project was successful and the end result was a fully and consistently
functioning automotive, controlled by hand gestures. In the process of designing a
gesture recognition and tracking system, extensive research was undertaken. In
conclusion, the requirements outlined at the beginning of the project were realized
and the objectives were fulfilled.
Chapter# 7
Coding
7 Coding
7.1 Sender code
#include <SPI.h>
#include <nRF24L01.h>
#include <RF24.h>
// the setup function runs once when you press reset or power the board
RF24 radio(9, 10);              // CE and CSN pins (assumed wiring)
const byte rxAddr[6] = "00001"; // pipe address shared with the receiver
const int xpin = A0;            // accelerometer X-axis output (assumed)
const int ypin = A1;            // accelerometer Y-axis output (assumed)
int potValue = 0;               // command: 0 stop, 1 left, 2 right, 3 forward, 4 backward
void setup()
{
Serial.begin(9600);
radio.begin();
radio.setRetries(15, 15);
radio.openWritingPipe(rxAddr);
radio.stopListening();
}
void loop()
{
int xval=analogRead(xpin);
int yval=analogRead(ypin);
potValue=0;
if(xval>370)
{
potValue=1;
}
else if(xval<320)
{
potValue=2;
}
else if(yval<300)
{
potValue=3;
}
else if(yval>360)
{
potValue=4;
}
radio.write(&potValue, sizeof(potValue));
delay(100);
}
7.2 Receiver code
#include <SPI.h>
#include <nRF24L01.h>
#include <RF24.h>
RF24 radio(7, 8);               // CE and CSN pins (assumed wiring)
const byte rxAddr[6] = "00001"; // pipe address shared with the transmitter
void setup()
{
Serial.begin(9600);
while (!Serial);
pinMode(5, OUTPUT);             // motor driver inputs
pinMode(6, OUTPUT);
pinMode(9, OUTPUT);
pinMode(10, OUTPUT);
radio.begin();
radio.openReadingPipe(0, rxAddr);
radio.startListening();
}
int angleV;
void loop()
{
if (radio.available())
{
angleV = 0;
radio.read(&angleV, sizeof(angleV));
if(angleV==0){
stopm();
}else if(angleV==1){
leftm();
}else if(angleV==2){
rightm();
}
else if(angleV==3){
forword();
}
else if(angleV==4){
backword();
}
else{
stopm();
}
Serial.println(angleV);
}
}
void backword()
{
digitalWrite(9,HIGH) ;
digitalWrite(10,LOW) ;
digitalWrite(5,HIGH) ;
digitalWrite(6,LOW) ;
}
void forword()
{
digitalWrite(9,LOW) ;
digitalWrite(10,HIGH) ;
digitalWrite(5,LOW) ;
digitalWrite(6,HIGH) ;
}
void rightm()
{
digitalWrite(9,LOW) ;
digitalWrite(10,LOW) ;
digitalWrite(5,LOW);
digitalWrite(6,HIGH) ;
}
void leftm()
{
digitalWrite(9,LOW) ;
digitalWrite(10,HIGH) ;
digitalWrite(5,LOW);
digitalWrite(6,LOW) ;
}
void stopm()
{
digitalWrite(9,LOW) ;
digitalWrite(10,LOW) ;
digitalWrite(5,LOW);
digitalWrite(6,LOW) ;
}