STALK-E
Object following robot
TUGAY KÖYLÜOGLU
ELIN LINDBERGH
Referat
STALK-E
Carrying one's own goods in a department store is a natural part of most people's lives. Such a task can, however, be difficult for people who, due to for example age or disability, are in great need of assistance. A shopping cart that follows its user around a department store could solve this problem. In this project, a robot that follows its user by means of colour recognition has been constructed. Two different methods of measuring distance were compared, and the Pixy camera's ability to find objects under different conditions was investigated.
Some of the robot's components were designed in Solid Edge ST8 and then 3D-printed. The Arduino Uno was programmed in the Arduino IDE and controls the Pixy camera, motors, ultrasonic sensor and bluetooth module. Since the project was limited in time and funding, several areas have been left for future development.
The robot is switched on and off with a smartphone app connected to it via bluetooth. When the system is on, the camera searches for an object with a predefined colour code. Once the camera has found an object, the robot starts following it. To keep a constant distance between robot and object, the distance is measured.
The results show that the Pixy camera works for detecting objects, but that it is not sufficiently reliable due to its sensitivity to changes in lighting. For measuring distance, the ultrasonic sensor is preferred over the Pixy camera.
Acknowledgements
Contents

Abstract
Referat
Acknowledgements
Contents
List of Figures

1 Introduction
   1.1 Background
   1.2 Purpose
   1.3 Scope
   1.4 Method
      1.4.1 Experiments

2 Theory
   2.1 Colours and colour recognition
   2.2 Components
      2.2.1 Camera
      2.2.2 Arduino
      2.2.3 H-bridge
      2.2.4 Ultrasonic sensor
      2.2.5 Bluetooth module
      2.2.6 Motors
   2.3 Robot maneuvering
      2.3.1 Feedback control

3 Demonstrator
   3.1 Problem formulation
   3.2 Software
   3.3 Electronics
   3.4 Hardware
   3.5 Results
      3.5.1 Exposure of camera
      3.5.2 Deviation angle
      3.5.3 Distance measurements

4 Discussion and Conclusion
   4.1 Discussion
   4.2 Conclusion

5 Future work

Bibliography

Appendices

A Code
List of Figures

3.1 Flowchart that describes the behavior of the system, created in draw.io.
3.2 User interface [19].
3.3 Schematic of electronics, created in Fritzing.
3.4 Finished construction of the vehicle.
3.5 The tracked object.
3.6 The maximum distance between the robot and object at different exposure values.
3.7 The maximum angle the object can deviate from center line.
3.8 The error between the measured and the actual value.
Nomenclature
µs microsecond
Bd Baud rate
CAD Computer-Aided Design
DC Direct Current
GND Ground
HSB Hue-Saturation-Brightness
I/O Input/Output
IDE Integrated Development Environment
ISO International Organization for Standardization
kB kilobyte
kHz kilohertz
KTH Royal Institute of Technology
MB Megabyte
PD Proportional Derivative
PWM Pulse Width Modulation
RAM Random Access Memory
RGB Red-Green-Blue
rpm Revolutions Per Minute
TX/RX Transmitter/Receiver
USB Universal Serial Bus
V, A Volt, Ampere
VCC Positive-voltage supply
Chapter 1
Introduction
1.1 Background
A task as simple as carrying goods in a department store is part of everyone's daily
life. However, such an activity might not be easy for people who are in great
need of assistance, due to e.g. age or disability. Building a cart that can
recognize, track and follow a certain individual, while carrying a certain amount
of load, could be very beneficial for the many who are unable to perform such a
simple task on their own and depend on assistance from others.
1.2 Purpose
This thesis addresses the results of a project to achieve the Bachelor degree in
Mechatronics at the Royal Institute of Technology (KTH). The purpose is to build
a tracking robot that follows its user and to answer the following research questions:
• What is the optimal exposure value of the camera in order for the robot to be
able to maintain a predefined distance to the object?
• What is the maximum angle the tracked object can deviate from the path,
without the robot losing its ability to track it?
• The distance between the robot and the tracked object can be measured either
with an ultrasonic sensor or by calculating it, using the size of the object.
Which method gives the most accurate estimation of the distance?
1.3 Scope
Due to limitations in time and budget, the project is restricted in a number of ways:
• The robot is built as a small object-tracking vehicle and cannot be used as a
person-following cart in a real store. Because of its size, the vehicle cannot
be loaded.
• The robot is expected to work under ideal conditions. For instance, the tracking
method is based on colour recognition and will not work if the surroundings
have the same colour as the tracked object, or if there is more than one object
with the chosen colour. The lighting in the room must be ideal for the
camera to receive enough light for object detection.
• A tracking robot has to avoid collisions with both moving and stationary
obstacles. The robot in this project only detects obstacles that appear in
front of it. When an obstacle appears, the robot brakes to avoid a collision.
If the robot reverses, it may still collide with obstacles behind it.
1.4 Method
For the system to be able to perform the object tracking task, the following steps
must be taken. Firstly, the camera needs to detect and recognize the object. In
order for the camera to not lose the detected object, it must communicate with
the servo motor in the pan mechanism. The information for steering and keeping a
constant distance to the tracked object must be provided to the DC-motors by the
Pixy camera and the ultrasonic sensor. The main construction consists of Arduino
Uno, the Pixy camera, bluetooth module, ultrasonic sensor, pan mechanism with
one servo motor and a movable base with four DC-motors. The pan mechanism
was constructed as a CAD-model and later 3D-printed with plastic material.
Solid Edge ST8 was used for the construction of CAD-models, an Ultimaker 2 for
3D-printing, and the Arduino Software, Integrated Development Environment (IDE),
for programming the Arduino Uno [1]. Initially, the system was programmed to test
each individual component; the parts were later integrated into a fully functional
and communicative system. When connecting the Pixy camera to a computer, PixyMon
[2] was used to see the picture from the camera.
1.4.1 Experiments
To optimize the exposure value of the camera, the maximum distance between the
robot and the object was measured at different exposure values. This test was
performed in four different rooms, with four different light conditions. During this
test, the camera was connected to a computer and PixyMon was used to determine
when the camera could detect the object.
The second experiment was designed to calculate the distance between the robot
and the object. There are two possible ways to do this. One way is by using an
ultrasonic sensor and the other by using the camera and the size of the object, which
uses linear interpolation for measurement. These two methods were compared in
order to identify which method estimates the distance more accurately. The
actual distance was measured using a folding rule.
The third experiment was designed to determine how much the tracked object
can deviate from the path, without the camera losing it. Firstly, the object was
placed in front of the stationary vehicle along the center line. Secondly, without the
camera losing track of it, the object was placed as far as possible from the center
line, perpendicularly to it. The test was repeated at different distances from the
camera and on both sides of the center line.
Chapter 2
Theory
but there is also a way of using temporal information, computed from a sequence
of frames. This usually takes the form of frame differencing, where changing
regions in consecutive frames are highlighted. RGB or HSB colour spaces are then
used for colour recognition when the frames are analyzed [5].
2.2 Components
2.2.1 Camera
The camera used in this project is a Pixy camera, created by Charmed Labs and
launched through a Kickstarter campaign in 2014. The idea behind Pixy was
to create a small, fast and low-cost camera sensor for use in smaller projects.
Pixy identifies objects using a hue-based colour filtering algorithm and is able to
detect objects at 50 Hz. It has a horizontal angle of view of 75°. The filtering
algorithm calculates the hue and saturation of every RGB pixel, of which there are
640x400 in every frame. The hue stays mostly unchanged under differences in
lighting and exposure, so the filtering algorithm remains robust when light and
exposure change. The Pixy camera can be programmed to remember up to seven
different objects and can also recognize objects via its ability to keep different colour
codes in its memory. Using colour code recognition increases its colour-coded
repertoire from seven up to thousands [2]. The camera connects to the Arduino via
an I/O connector [7] and only returns the necessary information, which is the X-Y
coordinates of the object. Connecting the Pixy camera to a computer over USB
makes it possible to see what the camera sees in the PixyMon application.
The Pixy camera uses 5V for power input, has 264kB of RAM and 1MB of Flash
memory [2].
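Pixy's exact filtering implementation is proprietary, but the standard RGB-to-hue/saturation conversion of the HSB model it builds on can be sketched as follows (the function name is ours, for illustration only):

```cpp
#include <algorithm>
#include <cmath>

// Standard RGB -> hue/saturation conversion (HSB model).
// r, g, b in [0, 255]; hue is returned in degrees [0, 360),
// saturation in [0, 1].
void rgbToHueSat(int r, int g, int b, double& hue, double& sat) {
    double rf = r / 255.0, gf = g / 255.0, bf = b / 255.0;
    double mx = std::max({rf, gf, bf});
    double mn = std::min({rf, gf, bf});
    double delta = mx - mn;

    if (delta == 0.0) {
        hue = 0.0; // achromatic pixel: hue is undefined, set to 0
    } else if (mx == rf) {
        hue = std::fmod(60.0 * ((gf - bf) / delta) + 360.0, 360.0);
    } else if (mx == gf) {
        hue = 60.0 * ((bf - rf) / delta) + 120.0;
    } else {
        hue = 60.0 * ((rf - gf) / delta) + 240.0;
    }
    sat = (mx == 0.0) ? 0.0 : delta / mx;
}
```

Because the hue of, say, the red band stays nearly constant under brightness changes, filtering on hue is more robust than filtering on raw RGB values.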
When calculating the distance to the object using the camera, linear interpolation
was used. At two different distances, d1 and d2, the Pixy camera measured the
height, h1 and h2, of the tracked object. With the assumption that the height h
varies linearly with the distance, any distance can be calculated:

distance_camera = d1 + (d2 − d1) · (h − h1) / (h2 − h1)    (2.1)
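As a sketch of how equation (2.1) maps to code, the following uses the calibration pairs found in the code in Appendix A (object height 21 at 30 cm and height 7 at 75 cm); the function name is ours:

```cpp
// Linear interpolation between two calibration points (d1, h1) and (d2, h2),
// as in equation (2.1). Calibration values taken from the code in Appendix A.
double distanceFromHeight(double h) {
    const double d1 = 30.0, h1 = 21.0; // object height 21 measured at 30 cm
    const double d2 = 75.0, h2 = 7.0;  // object height 7 measured at 75 cm
    return d1 + (d2 - d1) * (h - h1) / (h2 - h1);
}
```

Note that h2 < h1: the object appears smaller the farther away it is, so the interpolation runs along a decreasing line.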
The exposure value of the camera, which is a dimensionless parameter, can vary
between 0 and 255. It can be changed in PixyMon, where it is called brightness.
The exposure depends on three different parameters in a camera: the aperture,
the shutter speed and the ISO value. The aperture determines how much light
can pass through the lens, the shutter speed determines for how long the
light can pass, and the ISO value is a measure of the image sensor's light sensitivity
[8]. If the balance between these parameters is wrong, the image will be either
overexposed, i.e. too light, or underexposed, i.e. too dark. In PixyMon, these
parameters are merged into one brightness parameter. A high brightness value will
give an overexposed image, and vice versa.
To determine the maximum angle the object can deviate from the path (see
figure 2.3), the angle θ between the center line and the object was calculated as
θ = arctan(x/d), where x is the perpendicular distance of the object from the
center line and d is the distance along it.
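A minimal sketch of this geometry, assuming the angle is obtained as θ = arctan(x/d) from the perpendicular offset x and the distance d along the center line (the function name is ours):

```cpp
#include <cmath>

// Deviation angle (in degrees) of an object offset x cm perpendicular to the
// center line, measured at distance d cm along the center line.
double deviationAngleDeg(double x, double d) {
    const double pi = std::acos(-1.0);
    return std::atan(x / d) * 180.0 / pi;
}
```

For example, an object offset as far from the center line as it is distant along it sits at a 45° deviation.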
2.2.2 Arduino
The microcontroller used in this project is an Arduino Uno. Arduino is an open
source platform, easy to program with the Arduino Software (IDE) [9]. The Arduino
Uno is based on the ATmega328P microcontroller and has 14 digital input/output
pins, of which 6 can be used as PWM outputs. It can be powered either by an
external power supply or via USB.
2.2.3 H-bridge
With an H-bridge, the direction of the current through a motor can be changed. An
H-bridge consists of four switches, and the direction of the current is determined by
which switches are open, see figure 2.4. If S2 and S3 are open while S1 and S4
are closed, a current will run through the motor in the positive direction. If the
states are reversed, the current will run in the negative direction and the motor
will change its direction of rotation [10].
To control the DC-motors placed on each wheel of the vehicle, the motor driver
HG7881/L9110 is used. The module consists of two H-bridges, which means it can
control two motors independently. It operates at a voltage between 2.5V and 12V
[11].
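The switch logic of figure 2.4 can be captured in a small truth-table function. The following is an illustrative sketch (function name and return encoding are ours, not from the project code): +1 means current in the positive direction, -1 the negative direction, and 0 covers all remaining combinations, including the short-circuit ones.

```cpp
// H-bridge switch logic (cf. figure 2.4): s1..s4 true = switch closed.
// Returns +1 when current flows through the motor in the positive direction
// (S1 and S4 closed, S2 and S3 open), -1 in the negative direction
// (S2 and S3 closed, S1 and S4 open), and 0 otherwise. Closing S1+S2 or
// S3+S4 together would short the supply, so those cases also return 0.
int currentDirection(bool s1, bool s2, bool s3, bool s4) {
    if (s1 && s4 && !s2 && !s3) return +1;
    if (s2 && s3 && !s1 && !s4) return -1;
    return 0;
}
```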
2.2.6 Motors
The vehicle has two types of motors. For the pan mechanism of the Pixy
camera, a Tower Pro SG90 servo motor is used. This servo has an operating voltage
of 4.8V and a torque of 1.8 kgcm. It can rotate 180° (90° in each direction) and
has an operating speed of 0.1 s/60°. Its three inputs/outputs are VCC, ground and
PWM-control [14].
Each wheel has a DAGU DG02S DC-motor that operates at 3V and a maximum
current of 170 mA. It has a gear ratio of 48:1 and a no-load speed of 65 rpm [15].
u(t) = K_P · e(t) + K_D · de(t)/dt,    (2.4)

where K_P and K_D are gain parameters [16].
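In discrete time, the derivative in equation (2.4) is typically approximated by the difference between consecutive errors, which is also how the pan-control code in Appendix A works. A minimal sketch (the struct name and gain values are illustrative, not from the project):

```cpp
// Discrete PD controller: u = Kp*e + Kd*(e - e_prev)/dt, cf. equation (2.4).
// The first call treats the previous error as 0; the thesis code instead uses
// a sentinel value to skip the derivative term on the first iteration.
struct PD {
    double Kp, Kd;
    double prevError = 0.0;

    double update(double error, double dt) {
        double u = Kp * error + Kd * (error - prevError) / dt;
        prevError = error;
        return u;
    }
};
```

With Kp = 2 and Kd = 1, an error stepping from 0 to 4 produces a large corrective output at first (proportional plus derivative kick), which then settles as the error stops changing.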
Chapter 3
Demonstrator
• The robot should only follow the object when an on-signal is received from
the smartphone. If the bluetooth module receives an off-signal, the robot has
to stop.
3.2 Software
The programming was done in the Arduino IDE and can be divided into sections in the
same way as the problem formulation above. The code for the object tracking is
based on open source code [17], as is the code for the distance measurements
with the ultrasonic sensor [18]. Parts of this open source code were merged
with code written by the authors. The flowchart in figure 3.1 briefly describes the
behavior of the system.
Figure 3.1. Flowchart that describes the behavior of the system, created in draw.io.
mode is used in this project. When the switch button is pressed, the robot either
starts or stops. The user interface is shown in figure 3.2. The green button shows
that the robot is on. When the button is pressed, it turns red and the robot stops.
3.3 Electronics
The electrical components the robot consists of are an Arduino Uno, a Pixy camera,
a servo motor, four DC-motors, a bluetooth module and an ultrasonic sensor. The
connections can be seen in figure 3.3.
3.4 Hardware
The vehicle was built with a kit containing wheels and motors from DAGU, and a
pan mechanism for the camera and ultrasonic sensor, which was modeled in Solid
Edge ST8 and then 3D-printed on an Ultimaker 2. The components are assembled
on platforms that hold every part together. The assembled vehicle is shown in
figure 3.4.
The object that is tracked and followed is a red and green coloured band. The
band wraps around the ankle of the person the robot follows. The object can be
seen in figure 3.5.
3.5 Results
The following figures were created in MATLAB version R2016b. Figure 3.6 presents
the maximum distance to the object at different exposure values. The four curves
come from tests in four different light conditions. The yellow curve is the result
from a dark room, with only some daylight from a window. The red curve shows
the result from a room fully lit by lamps, the green curve from a light room with both
lamps and daylight, and the blue curve from a medium-light room with daylight.
When the light conditions change, the signature colour code has to be re-defined
in order for the camera to find it. Apart from this, the camera can handle different
lights.
Figure 3.6. The maximum distance between the robot and object at different
exposure values. [Plot: exposure value (80-260) on the horizontal axis against
maximum distance (30-70 cm) on the vertical axis.]
Figure 3.7 shows the maximum angle the object can deviate from the center line
without the robot losing it. The blue curve shows the angle when the object is
placed on the left side of the center line, and the red curve is the result on the right
side. The yellow curve is the sum of both curves and results in the total horizontal
angle of view.
Figure 3.7. The maximum angle the object can deviate from center line.
[Plot: distance along center line (20-90 cm) on the horizontal axis against
angle (20-70°) on the vertical axis.]
In figure 3.8 the two distance measurement methods, with the camera and with the
ultrasonic sensor, are compared. The distance is measured from the robot to the
object. The graph presents the error between the measured and the actual value
as a percentage.
Figure 3.8. The error between the measured and the actual value.
[Plot: actual distance (15-40 cm) on the horizontal axis against
error (0-35%) on the vertical axis.]
Chapter 4
Discussion and Conclusion
4.1 Discussion
The purpose of the project was to build a robot that follows its user and to answer
three research questions.
The results in figure 3.6 show that the maximum distance between the robot
and the object depends more on the light in the room than on the exposure
value. However, at exposure values lower than 90, the Pixy camera is not able
to detect the object. At exposure values higher than 200, the maximum distance
decreases in most cases, as a result of an overexposed image. In this project, the
robot is supposed to keep a constant distance of 30 cm to the tracked object.
With an exposure value between 100 and 200, the results show that it is possible
to maintain this distance.
As can be seen from the results in figure 3.7, the horizontal field of view remains
constant at 75° between 35 and 65 cm. Beyond 65 cm, the angle decreases. This is
because the hypotenuse grows longer as the angle increases, and eventually exceeds
the camera's maximum detection distance. The CMUcam5 documentation advertises
the camera's angle of view as 75°, while the results show that the angle lies between
63° and 82°.
The results presented in figure 3.8 show that the ultrasonic sensor generally
gives a smaller distance measurement error than the Pixy camera. In a certain
range, between 23 and 35 cm, the error from the Pixy camera is less than 5%.
Outside this range, the error increases, which indicates that the distance is not
linearly related to the size of the tracked object. As a consequence, there might be
a better method to use than linear interpolation.
There are also other factors that make the Pixy camera method less reliable
than the ultrasonic sensor. When calculating the distance, the height of the object
is measured. In order for the camera to measure this correctly, two requirements
must be met. Firstly, the object must be horizontal. Secondly, the camera must
detect the entire object. This might work under ideal conditions, but the method
is not preferable for a person following robot. The ultrasonic sensor uses the time
it takes for a signal to come back after being transmitted and is not dependent on
light conditions. This method is therefore preferred in this project.
4.2 Conclusion
The Pixy camera's ability to detect the object depends more on the light conditions
than on the exposure value. However, to avoid over- or underexposed pictures, the
optimal exposure value is between 100 and 200. The maximum angle the tracked
object can deviate from the camera's center line is 40°. The ultrasonic sensor is
preferable to the Pixy camera when measuring the distance between the robot and
the object.
Chapter 5
Future work
The object tracking robot can be improved in many ways. One important area
is collision avoidance. In this project, only one ultrasonic sensor is used, and
when an obstacle appears in front of the robot, it stops. To make the robot more
user-friendly, it could be developed to drive around any blocking object. It could
also use ultrasonic sensors on its sides and back in order to detect objects all
around, most importantly on its back for reverse driving.
To improve object detection, a better camera could be used. A camera that is
less sensitive to brightness and has more pixels would be a better design choice,
but also a financial problem for projects with smaller budgets.
In order to make the robot more useful in a department store, it needs to be
larger, have stronger motors and carry a basket. The basket could have a weight
sensor so that the robot knows how much more weight it can handle and can
regulate its speed accordingly.
Bibliography
[13] Guangzhou HC Information Technology, "Product data sheet," April 2011, [Accessed: 2017-05-03]. [Online]. Available: https://www.olimex.com/Products/Components/RF/BLUETOOTH-SERIAL-HC-06/resources/hc06.pdf
[14] Tower Pro, “Sg90 9g micro servo,” June 2015, [Accessed: 2017-05-09]. [Online].
Available: http://www.micropik.com/PDF/SG90Servo.pdf
[16] T. Glad and L. Ljung, Reglerteknik Grundläggande Teori, 4th ed. Studentlit-
teratur AB, 2006.
[17] B. Earl, "Pixy pet robot - The code," October 2016, [Accessed: 2017-04-20]. [Online]. Available: https://learn.adafruit.com/pixy-pet-robot-color-vision-follower-using-pixycam/the-code
[18] S. Tautvidas, "Distance sensing with ultrasonic sensor and Arduino," August 2012, [Accessed: 2017-04-20]. [Online]. Available: https://www.tautvidas.com/blog/2012/08/distance-sensing-with-ultrasonic-sensor-and-arduino/
[19] Giumig Apps, “Arduino bluetooth controller,” (version 1.3) 2016, [Mobile Ap-
plication Software], Downloaded from https://play.google.com/store.
[20] Google, “Google Play,” May 2017, [Accessed: 2017-05-05]. [Online]. Available:
https://play.google.com/store
Appendix A
Code
#include <SPI.h>
#include <Pixy.h>
#include <SoftwareSerial.h>

#define X_CENTER        160L
#define RCS_MIN_POS     0L
#define RCS_MAX_POS     1000L
#define RCS_CENTER_POS  ((RCS_MAX_POS - RCS_MIN_POS) / 2)

Pixy pixy; // Declaring camera object

// Setup components
// DC-motors
const int RightA = 3;
const int RightB = 4;
const int LeftA = 5;
const int LeftB = 6;
byte speed = 255 * 2 / 3;

// Ultrasonic sensor
const int trigPin = 10;
const int echoPin = 8;

// Bluetooth module
SoftwareSerial BTSerial(0, 1);
char data = '0';

// Setup
void setup() {
  Serial.begin(9600);
  pinMode(LeftA, OUTPUT);
  pinMode(LeftB, OUTPUT);
  pinMode(RightA, OUTPUT);
  pinMode(RightB, OUTPUT);
  pixy.init(); // Initialize Pixy

  // Bluetooth module
  BTSerial.begin(9600);
}
// Main loop
void loop() {
  if (BTSerial.available() > 0) // If bluetooth data is available to read
  {
    data = BTSerial.read(); // Read data
  }
  if (data == '1') // ON
  {
    pixyfunc();
  }
  else if (data == '0') // OFF
  {
    drive(0, 0);
  }
}
// Class for pan-mechanism
class ServoLoop
{
public:
  ServoLoop(int32_t proportionalGain, int32_t derivativeGain);
  void update(int32_t error);
  int32_t m_pos;
  int32_t m_prevError;
  int32_t m_proportionalGain;
  int32_t m_derivativeGain;
};

// ServoLoop constructor
ServoLoop::ServoLoop(int32_t proportionalGain, int32_t derivativeGain)
{
  m_pos = RCS_CENTER_POS;
  m_proportionalGain = proportionalGain;
  m_derivativeGain = derivativeGain;
  m_prevError = 0x80000000L;
}

// ServoLoop update
// Calculates the output based on the measured error and the current state.
void ServoLoop::update(int32_t error)
{
  long int velocity;
  char buf[32];
  if (m_prevError != 0x80000000)
  {
    // Using PD-control to pan camera
    velocity = (error * m_proportionalGain +
                (error - m_prevError) * m_derivativeGain) >> 10;
    m_pos += velocity;
    if (m_pos > RCS_MAX_POS)
    {
      m_pos = RCS_MAX_POS;
    }
    else if (m_pos < RCS_MIN_POS)
    {
      m_pos = RCS_MIN_POS;
    }
  }
  m_prevError = error;
}
// If blocks are detected, track and follow them.
if (blocks)
{
  int trackedBlock = TrackBlock(blocks);
  DriveToObject(trackedBlock);
}
// If blocks are not in sight, stop the motors and scan for blocks.
else
{
  drive(0, 0);
  ScanForBlocks();
}
}
int oldX, oldSignature;

// The vehicle will follow the biggest block detected.
for (int i = 0; i < blockCount; i++)
{
  if ((oldSignature == 0) || (pixy.blocks[i].signature == oldSignature))
  {
    long newSize = pixy.blocks[i].height * pixy.blocks[i].width;
    if (newSize > maxSize)
    {
      trackedBlock = i;
      maxSize = newSize;
      // Measures height of biggest block:
      objectheight = pixy.blocks[i].height;
    }
  }
}

// Using objectheight to calculate distance to object:
calculatedistance(objectheight);

// How far off center is the object?:
int32_t panError = X_CENTER - pixy.blocks[trackedBlock].x;
panLoop.update(panError);
pixy.setServos(panLoop.m_pos, panLoop.m_pos); // Turn camera

oldX = pixy.blocks[trackedBlock].x;
oldSignature = pixy.blocks[trackedBlock].signature;
return trackedBlock; // Returns which block to follow
}
// Drive towards object
void DriveToObject(int trackedBlock)
{
  // How far from the center line is the camera?:
  int32_t followError = RCS_CENTER_POS - panLoop.m_pos;
  int forwardSpeed;
  long distance = 0;
  distance = ultrasonic(); // Measures distance with ultrasonic sensor
  if (distance > 30)
  {
    forwardSpeed = speed;
  }
  else if (distance > 15 && distance <= 30)
  { // If distance is between 15 and 30 cm, don't move
    forwardSpeed = 0;
    followError = 0;
  }
  else if (distance <= 15)
  { // If distance is less than 15 cm, go backwards without turning
    forwardSpeed = speed;
    followError = 0;
  }
  int Kd = 2 * forwardSpeed; // Derivative gain scales with the forward speed

  // PD-control for steering
  int32_t differential = (followError + (followError * forwardSpeed) +
                          Kd * (followError - prev_followError)) >> 8;

  // Depending on the differential, set speed on left and right wheels
  int speedleft = forwardSpeed - differential / 2;
  int speedright = forwardSpeed + differential / 2;
  if (speedleft < 0)
  {
    speedleft = 0;
  }
  else if (speedleft > 255)
  {
    speedleft = 255;
  }
  else if (speedright < 0)
  {
    speedright = 0;
  }
  else if (speedright > 255)
  {
    speedright = 255;
  }
  // Drive forward or reverse?
  if (distance > 15) // If distance is more than 15 cm, go forward
  {
    drive(speedright, speedleft);
  }
  else // If distance is less than 15 cm, go backwards
  {
    reverse(speedleft, speedright);
  }
  prev_followError = followError;
}
// Scans to detect object
int scanIncrement = (RCS_MAX_POS - RCS_MIN_POS) / 150;
uint32_t lastMove = 0;

void ScanForBlocks()
{
  if (millis() - lastMove > 20)
  {
    lastMove = millis();
    panLoop.m_pos += scanIncrement;
    if ((panLoop.m_pos >= RCS_MAX_POS) || (panLoop.m_pos <= RCS_MIN_POS))
    {
      scanIncrement = -scanIncrement;
    }
  }
}

// Measures distance to object with the ultrasonic sensor
long ultrasonic() {
  long duration, cm;
  // Send a signal
  pinMode(trigPin, OUTPUT);
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  // Read the signal
  pinMode(echoPin, INPUT);
  duration = pulseIn(echoPin, HIGH);
  cm = microsectocm(duration);
  Serial.print("Ultrasonic distance: ");
  Serial.print(cm);
  Serial.println(" cm");
  return cm;
}
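The helper microsectocm referenced above does not appear in the listing. A conventional implementation, assuming the usual approximation that sound travels about 29 µs per centimeter and that the echo covers the distance twice, might look like:

```cpp
// Convert an ultrasonic echo time to centimeters. Sound travels roughly
// 29 microseconds per centimeter; the pulse travels out and back, so the
// duration is halved. This helper is a sketch of the missing function,
// not the original project code.
long microsectocm(long microseconds) {
    return microseconds / 29 / 2;
}
```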
// Calculating distance with Pixy using linear interpolation
void calculatedistance(float objectheight)
{
  float d1 = 30;
  float d2 = 75;
  float h1 = 21;
  float h2 = 7;
  float dist = d1 + (d2 - d1) * ((objectheight - h1) / (h2 - h1));
  Serial.print("Calculated distance from Pixy: ");
  Serial.print(dist);
  Serial.print(" cm ");
}

// Drives forward
void drive(int speedleft, int speedright)
{
  analogWrite(LeftA, 0);
  analogWrite(LeftB, speedleft);
  analogWrite(RightA, 0);
  analogWrite(RightB, speedright);
}

// Drives backwards
void reverse(int speedleft, int speedright)
{
  analogWrite(LeftA, speedleft);
  analogWrite(LeftB, 0);
  analogWrite(RightA, speedright);
  analogWrite(RightB, 0);
}
TRITA MMK 2017:13 MDAB 631
www.kth.se