
A Vision-based Line Following Method for

Micro Air Vehicles

Degree Thesis
submitted to the Faculty of the
Escola Tècnica d’Enginyeria de Telecomunicació de Barcelona
Universitat Politècnica de Catalunya
by
Pol Majó Casero

In partial fulfillment
of the requirements for the degree in
Telecommunication Technology and Services Engineering

Advisor: Andreas Kugi


Co-Advisor: Minh-Nhat Vu
Barcelona, 20/06/2021
Contents

List of Figures
List of Tables

1 Introduction
1.1 Statement of purpose (objectives)
1.2 Requirements and specifications
1.3 Methods and procedures
1.4 Work plan
1.4.1 Tasks
1.4.2 Gantt diagram
1.4.3 Milestones
1.5 Incidents

2 State of the art and fundamentals
2.1 History of drones
2.2 Applications
2.3 Modelling and control of a quadcopter
2.3.1 Basic idea of the quadcopter controller
2.3.2 Torques and forces
2.3.3 Translational directions
2.3.4 Rotational directions

3 Hardware description
3.1 Peripheral devices
3.2 Parrot Mambo Fly
3.2.1 Sensors
3.2.2 Specifications
3.2.3 Setup and connection with computer

4 Software description
4.1 Programmes used and prior knowledge
4.1.1 Required toolboxes
4.2 Overview of Simulink Support Package for Parrot Minidrones

5 Methodology / project development
5.1 First line tracking algorithm
5.1.1 Image Processing System block design
5.1.2 Path Planning block design
5.2 Second line tracking algorithm
5.2.1 Image Processing System block design
5.2.2 Path Planning block design
5.3 Third line tracking algorithm
5.3.1 Image Processing System block design
5.3.2 Path Planning block design

6 Simulation test results

7 Experimental test results
7.1 Set-up
7.2 Results

8 Budget

9 Conclusions

10 Future Work

References

Appendices

A Quadcopter Flight Simulation Model

B MATLAB Functions
B.1 createMask simulation
B.2 filtered image
B.3 corner edges
B.4 createMask real scenario


List of Figures

1 Minidrone and a colored path.
2 Arena details.
3 Successful landings in the circle. In the center of the circle or by touching it.
4 Unsuccessful landings in the circle. Landing out of the circle or upside down.
5 Gantt chart of the project.
6 Leonardo da Vinci sketch of a drone.
7 First quadcopter design by the brothers Jacques and Louis Bréguet.
8 Two pilotless aircraft developed during the World Wars.
9 Famous drones from the 2000s.
10 Plant of the controller. The inputs are the interactions with each rotor. The output is the physical position of the drone.
11 Simple control structure of the quadcopter.
12 Forces on the drone with different positions.
13 Rotations and torques of the propellers.
14 Up/Down. The size of the circles indicates how large the thrusts are.
15 Left/Right. The size of the circles indicates how large the rotational speed of each pair of rotors is.
16 Forward/Backward. The size of the circles indicates how large the moments generated by the rotors are.
17 Roll movements.
18 Pitch movements.
19 Yaw movements (torques in blue).
20 Detecting drone and updating firmware.
21 Minidrone Bluetooth connection.
22 Flight visualization.
23 Flight control system.
24 120x160 image obtained from the drone.
25 First algorithm sub-matrices.
26 First image processing system.
27 First path planning system.
28 First path planning algorithm stateflow.
29 First algorithm minidrone movements.
30 Second image processing system.
31 Second algorithm sub-matrices.
32 Second path planning system.
33 Second path planning algorithm stateflow.
34 Turning points of the track.
35 Third image processing system.
36 Rotating and squaring the image.
37 Image Processing subsystem.
38 Graphical representation of the outputs.
39 Third path planning block.
40 Third path planning algorithm stateflow.
41 Third algorithm yaw movements.
42 Circle detection.
43 Simulation tracks.
44 Real scenario track.
45 Applying the mask in the real scenario.
46 Quadcopter flight simulation model.
47 Airframe.
48 Environment.
49 Sensors.
50 Flag.
51 Commands.
52 createMask simulation MATLAB function.
53 filtered image MATLAB function.
54 corner edges MATLAB function.
55 createMask real scenario MATLAB function.

List of Tables

1 Milestones / deliverables.
2 Parrot Mambo Fly specifications [1].
3 Required toolboxes.
4 Sizes and thresholds of the first tracking algorithm sub-matrices.
5 Sizes and thresholds of the second tracking algorithm sub-matrices.
6 Simulation results.
7 Components cost.
8 Hours dedicated and estimated cost.

Abbreviations
3D Three-Dimensional
BT Bluetooth
BW Black and White
CCW Counterclockwise
CSR Cambridge Silicon Radio
CW Clockwise
DSP Digital Signal Processing
DVI Digital Visual Interface
FAA Federal Aviation Administration
FPS Frames Per Second
GPS Global Positioning System
HDMI High Definition Multimedia Interface
IMU Inertial Measurement Unit
LED Light-Emitting Diode
LiPo Lithium Polymer
LPR License-Plate Reader
RC Radio-Controlled
RGB Red Green Blue
RPA Remotely Piloted Aircraft
SME Small and Medium-sized Enterprises
UAS Unmanned Aircraft System
UAV Unmanned Aerial Vehicle
USB Universal Serial Bus

Abstract
Due to the technological advances of the last few years and the large number of applications that can be carried out with UAVs, many companies have recognised a unique market opportunity in incorporating drones into everyday tasks and daily life.
Although we are seeing more and more drones of all kinds applied to different tasks, such as monitoring wildlife or delivering packages to homes, we have only seen the tip of the iceberg of a technology whose potential has yet to be discovered. In this work, a software development project on a minidrone is proposed, experimenting with the autonomous flight of a quadcopter using computer vision. In this project, the Parrot Mambo Fly must take off, follow the colored line marked on the ground, and finally land on a circular marker. Both 3D simulations and real experiments are carried out to test the performance of the proposed algorithms.

Revision history and approval record

Revision  Date        Purpose
0         08/03/2021  Document creation
1         02/06/2021  Document revision
2         07/06/2021  Document revision
3         08/06/2021  Document revision
4         09/06/2021  Document revision
5         14/06/2021  Document revision
6         18/06/2021  Document revision

DOCUMENT DISTRIBUTION LIST

Name          e-mail
Pol Majó      polmajo@gmail.com
Andreas Kugi  kugi@acin.tuwien.ac.at
Minh-Nhat Vu  VU@acin.tuwien.ac.at

Written by:                         Reviewed and approved by:
Date: 08/03/2021                    Date: 20/06/2021
Name: Pol Majó                      Name: Andreas Kugi
Position: Project Author            Position: Project Supervisor

1 Introduction

1.1 Statement of purpose (objectives)


This project, A Vision-based Line Following Method for Micro Air Vehicles, which is
carried out at TU Wien, has two main objectives:
• First, color tracking algorithms are developed in simulation using MATLAB/Simulink.
To do so, different computer vision approaches will be applied to the image of the
drone’s down-facing camera.
• Second, the proposed algorithms will be tested on the minidrone in a real setup scenario.
In order to meet the thesis objectives, the following aspects are investigated:
• Deploy computer vision tools for the line-following algorithm.
• Design an efficient algorithm for planning the path in 3D space (smoothness, time of flight).
• Understand the structure of the Simulink Support Package for Parrot Minidrones.
• Perform the flight with the Parrot Mambo Fly in the real setup scenario.

Figure 1: Minidrone and a colored path.

1.2 Requirements and specifications
Project Requirements:
• The vision-based line following method and 3D simulation are recommended to be
developed in MATLAB/Simulink.
• The developed algorithms will be tested in a real setup scenario with the Parrot
Mambo Fly.

Project specifications:
• The drone should follow a track laid on the arena and land on the circular marker
in the shortest time, see Fig. 3. In Fig. 4, the landing on the circular marker is
unsuccessful.
• The track will be divided into multiple sections. In addition, the track will have only
straight lines and no smooth curves.
• The arena is a 4x4 meter space enclosed by nets on all sides.
• The lines will be 10 cm in width.
• The landing circular marker will have a diameter of 20 cm.
• The angle between any two track sections may be between 10 and 350 degrees.
• The track may have between 1 and 10 connected line segments. The initial position of the drone will always be at the start of the line. However, the front part of the drone may not always face the direction of the first line on the track.
• The distance from the end of the track to the center of the circle will be 25 cm, see
Fig. 2.
• The background on which the track will be laid may not be a single colour and will
have texture.

Figure 2: Arena details.

1.3 Methods and procedures
This project originated as a global competition organized by MathWorks, in which each team is scored on the algorithm it designs: landing, flight time, and successful simulation are the factors that influence the score. In my case, I am doing the project alone and without any competition.
As a starting point, TU Wien provided me with the Parrot Mambo Fly and the Simulink Support Package for Parrot Minidrones, with which the simulation environment is created.

Figure 3: Successful landings in the circle. In the center of the circle or by touching it.

Figure 4: Unsuccessful landings in the circle. Landing out of the circle or upside down.

1.4 Work plan

1.4.1 Tasks
• Phase 1: Literature review on the colored line detection using computer vision.
• Phase 2: Literature review on the mathematical model of quadcopter motion.
• Phase 3: Get started with Simulink Support Package for Parrot Minidrones.
• Phase 4: Develop the colored line following method using computer vision.
• Phase 5: Develop three different simple path planning algorithms for Parrot Minidrone
in Simulink.
• Phase 6: Perform simulations and experiments by combining the path planning
algorithm with the existing controller in the Simulink package.
• Phase 7: Summarize results and write thesis.

1.4.2 Gantt diagram

(Gantt chart: Phases 1 to 7 spanning February to June 2021.)

Figure 5: Gantt chart of the project.

1.4.3 Milestones

WP Task Short line Milestone / deliverable Date


01 1 Literature review line detection - -
02 2 Literature review quadcopter - -
03 3 Get started with Parrot Minidrones - -
04 4 Colored line following method Following method algorithm 16.03.2021
05 5 First path algorithm drone First path planning 26.03.2021
06 6 Second path algorithm drone Second path planning 09.04.2021
07 7 Third path algorithm drone Third path planning 30.04.2021
08 8 Perform simulation and experiment Results 31.05.2021
09 9 Write thesis Thesis 21.06.2021

Table 1: Milestones / deliverables.

1.5 Incidents

There were several incidents during the development of the project. One of them concerned the minidrone hardware: the drone had to be rebooted and its internal memory cleaned every time after it was turned off. Otherwise, the drone connected to the computer successfully, but it did not fly, even when the ‘take off’ button was pressed.
The following steps are necessary for cleaning the drone’s internal memory:
1. Connect the Parrot minidrone to the host computer using Bluetooth.
2. Open the command prompt on the host computer, and connect to the drone using Telnet: telnet 192.168.3.1
3. After the successful Telnet connection, go to the /data/edu directory: cd /data/edu
4. Delete all the text files, MAT files, and shared object files: rm -f *.txt *.mat *.so

In addition, a special Bluetooth device, a Cambridge Silicon Radio (CSR) 4.0 dongle, is needed in order to make the connection between the drone and the computer, because the built-in Bluetooth of the Windows 10 operating system is not compatible with the Parrot Mambo Fly.

2 State of the art and fundamentals

2.1 History of drones


Drones are formally known as UAVs or UASes. A drone is a flying robot that can be remotely controlled or fly autonomously through a flight controller in its embedded system, working in conjunction with onboard sensors and GPS [2].
For many years, it was believed that the first concept of a drone was seen in Austria, which used unmanned aerial balloons stuffed with explosives to attack Venice in 1849. In 2016, however, a newly discovered sketch by Leonardo da Vinci at the Torre do Tombo National Archive in Lisbon, showing a self-operating quadcopter, proved that the idea of autonomous flight has existed for much longer [3].

Figure 6: Leonardo da Vinci sketch of a drone.

In 1907, the precursor of the current quadcopter structure was designed by the brothers Jacques and Louis Bréguet. Although it achieved the first ascent of a vertical-flight aircraft with a pilot, the design was more an idea than a reality: it only reached an altitude of 0.6 meters, and four men were needed to stabilize the structure. However, it demonstrated that the concept of the quadcopter would work for flight.

Figure 7: First quadcopter design by the brothers Jacques and Louis Bréguet.

Moving forward in time, during World War I and World War II the technique and quality of aircraft and drones improved remarkably to meet wartime demands, and the first pilotless aircraft was developed in 1916. Later, in 1935, Great Britain developed the radio-controlled unmanned aircraft "Queen Bee".

(a) First pilotless aircraft (b) Queen Bee

Figure 8: Two pilotless aircraft developed during the World Wars.

After the end of World War II and during the Cold War, technological progress continued, allowing the size of drones to be reduced progressively over the years. For instance, the first camera-equipped drones were used in the Vietnam War. Moreover, RC planes were sold to civilian customers [4].
From 1990 to 2010, different versions of UAVs were introduced, such as the famous "Predator" drone in 2000, which was used in Afghanistan in the search for Osama bin Laden, and small-sized drones such as "Raven", "Wasp" or "Puma", developed by AeroVironment Inc. Also, in 2006, the FAA officially issued the first commercial drone permit.

(a) Predator (b) Raven

Figure 9: Famous drones from the 2000s.

2.2 Applications
Although originally built for military purposes, drones have seen rapid growth and advancement. Nowadays, drones are being used by individual entrepreneurs, SMEs, and large companies to accomplish various other tasks. Drone applications can be grouped into four different sectors: military, commercial, personal, and future technology [5].
In addition, progress in the field of sensors and cameras in recent years has made it possible to equip remotely piloted aircraft (RPAs) with more advanced technology for different applications.

• Aerial photography for journalism and film:


Drones are being used to capture images and video scenes that would otherwise require expensive helicopters or cranes. These autonomous flying devices are also used in sports photography, to collect footage and information in live broadcasts [6].
• Express shipping and delivery:
The rapid urbanisation in many areas has increased pollution, congested roads and
decreased efficiency. Furthermore, rural areas tend to have less infrastructure and
may be more difficult to reach by traditional delivery methods. For that reason, com-
panies such as Amazon, UPS and DHL are in favor of drone delivery, which would
reduce the aforementioned problems. However, the application is not yet mature, and there is still a lot of room for improvement [7].
• Gathering information or supplying essentials for disaster management
and rescue operations:
The drones are able to quickly reach the area where disasters have occurred. They
collect information about what happened, the damage that has been caused, and
whether there are victims to rescue, before rescue crews move to the location. High-
tech cameras, sensors, and radars help drones perform these tasks. Also, drones can
enter and explore dangerous environments, because of their agility [8].
• Geographic mapping of inaccessible terrain and locations:
Drones can acquire very high-resolution data and they can reach locations such
as coastlines, mountaintops, and islands. In addition, in dynamic natural systems,
repeated drone missions over time can provide essential information on processes
and timing, and can inform future management decisions [9].
• Building safety inspections:
There are many buildings and structures that are difficult to access. Using a drone
can help to minimize costs associated with accessing these areas, as scaffolding and
aerial work platforms would not be necessary. They can also keep individuals safely
away from sources of danger during the inspection process. Moreover, they can
obtain useful data on the state of the building thanks to the different sensors and
cameras of the RPA [10].
• Precision crop monitoring:
Farmers and agriculturists are always looking for cheap and effective methods to
regularly monitor their crops. The infrared sensors in drones can be tuned to detect

crop health. This allows farmers to react and improve crop conditions locally by taking the necessary actions. Also, through time-lapse drone photography, a farmer might find that part of his or her crop is not being properly irrigated.
Another example: in 2015, the FAA approved the first drone weighing more than 55 pounds to carry tanks of fertilizers and pesticides in order to spray crops. Drones are capable of spraying crops with far more precision than a traditional tractor, reducing costs and avoiding workers' contact with the pesticide [11].
• Wildlife monitoring:
RPAs have served as a deterrent to poachers, as they provide unprecedented pro-
tection to animals, like elephants, rhinos, and big cats. With their thermal cameras
and sensors, drones have the ability to operate during the night. This enables them
to monitor and research wildlife and provides insight into their patterns, behavior,
habitat, and reproduction. This monitoring method using drones is less invasive and
harmful compared to the traditional method [12].
• Law enforcement and border control surveillance:
Drones are also used for maintaining the law. They help with the surveillance of
large crowds and ensure public safety. They assist in monitoring criminal and illegal
activities, because they are versatile, nearly undetectable, and relatively inexpensive.
In fact, fire investigations, smugglers of migrants, and illegal transportation of drugs
via coastlines, are monitored by the border patrol with the help of drones. Also, some
drones are equipped with License-Plate Readers (LPRs), which use cameras to track
vehicles on city streets. These types of RPAs equipped with LPRs are able to track
a single car for kilometers [13].
• Storm tracking and forecasting hurricanes and tornadoes:
Since they are cheap and unmanned, drones can be sent into hurricanes and tornadoes, so that scientists and weather forecasters acquire new insights into their behavior and trajectory. Specialized sensors can be used to measure weather parameters, collect data, and prevent mishaps. Not only do drones help provide real-time weather data, they can also help scientists make weather predictions. Traditional weather collection systems are thus becoming obsolete, especially when it comes to tracking and predicting the occurrence of storms [14].
• Entertainment:
Although many UAVs are designed for specific tasks, many other drones are used
for entertainment. Capturing personal videos and images, or for drone fights and
races, are a few examples.

2.3 Modelling and control of a quadcopter

2.3.1 Basic idea of the quadcopter controller


As shown in Fig. 10, the quadcopter used in this thesis has four rotors, each of which interacts directly with the drone, varying its direction and altitude. The task is to design a controller that acts on each rotor to maneuver the drone in 3D space. In appendix A, the airframe is the block that acts as the plant in the simulation.
In the designed algorithms, the inputs to the plant are the updated position and orientation commands of the drone, which act directly on each rotor; the resulting output is the physical movement of the drone.

Figure 10: Plant of the controller. The inputs are the interactions with each rotor. The
output is the physical position of the drone.
In order to observe the system state, the Parrot Mambo Fly has different sensors that provide information about the position of the drone. In Fig. 11, a simple control structure for the Parrot Mambo Fly is illustrated [15]. Onboard sensors observe the current state of the system, and the difference between the desired set point and the current system state is fed into a flight controller. In appendix A, the sensors model block acts as the feedback path, and the flight control system block as the controller in the simulation.

Figure 11: Simple control structure of the quadcopter.
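To make the loop concrete, the following is a minimal MATLAB sketch of the Fig. 11 idea for the altitude channel only; the gains and signal names are illustrative assumptions, not the controller shipped with the support package.

    function u = altitudeControlStep(zRef, z, zDot)
    % One evaluation of a simple PD law: the difference between the desired
    % set point zRef and the measured altitude z (from the ultrasound and
    % pressure sensors) is turned into a thrust correction.
    Kp = 0.8;                 % proportional gain (assumed value)
    Kd = 0.3;                 % derivative gain (assumed value)
    err = zRef - z;           % set-point error
    u   = Kp*err - Kd*zDot;   % correction added to the hover thrust
    end

For instance, altitudeControlStep(1.1, 0.9, 0) returns a positive correction that pushes the drone up toward the 1.1 m set point.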

2.3.2 Torques and forces
The quadcopter consists of 4 motors. Two of the rotors rotate in the clockwise (CW) direction and the remaining two in the counterclockwise (CCW) direction; CW and CCW motors are placed next to each other [16].
Each rotating propeller produces two kinds of forces. In Fig. 12, one force is the thrust, which is an upward force. As a quadcopter has four propellers, the total thrust of a quadcopter is the sum of the thrust generated by each propeller. When the rotors spin, they push down on the air and the air pushes back up on the rotors. When the rotors spin fast the drone lifts up into the sky, and when the rotors slow down the drone descends towards the ground. Each of the four propellers produces a thrust $F_i$ in the direction perpendicular to its plane of rotation, proportional to the square of the propeller's angular velocity $\omega_i$: $F_i = K_f \omega_i^2$, $i = 1, \dots, 4$ [17].
To rise above the ground, the drone needs a net upward force: the motors generate a total thrust greater than the weight of the quadcopter, making it rise upwards. When the quadcopter hovers steadily in the air, it must be in equilibrium, see Fig. 12a: the forces are balanced, and the total thrust $F_1 + F_2 + F_3 + F_4$ produced by the propellers equals the weight of the quadcopter.

(a) Horizontal position. (b) Different position.

Figure 12: Forces on the drone with different positions.

Apart from the upward force, a rotating propeller also generates an opposing torque. For instance, a rotor spinning in the CW direction produces a torque that causes the body of the drone to spin in the CCW direction. This reaction torque is also proportional to the square of the angular velocity: $M_i = K_m \omega_i^2$ [18], see Fig. 13. The two opposite rotation directions balance out and keep the drone steady, as the sum of all the torques produced must equal 0.

Figure 13: Rotations and torques of the propellers.

By controlling the speeds of the rotors, the different translational and rotational movements of the quadcopter can be achieved.
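Combining the thrust and torque relations, steady hover can be summarized by the following balance (a minimal statement, assuming rotors 1 and 3 spin CW and rotors 2 and 4 CCW; $m$ denotes the quadcopter mass and $g$ the gravitational acceleration):

$$K_f\left(\omega_1^2 + \omega_2^2 + \omega_3^2 + \omega_4^2\right) = mg, \qquad K_m\left(\omega_1^2 - \omega_2^2 + \omega_3^2 - \omega_4^2\right) = 0.$$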

2.3.3 Translational directions
There are three translational directions:
• Up/Down: The up and down movement depends on the thrust of the four motors at the same time. If the total thrust of the 4 rotors is greater than the weight of the drone, it will go up, see Fig. 14a; if it is smaller, it will go down, see Fig. 14b.

(a) Up. (b) Down.

Figure 14: Up/Down. The size of the circles indicates how large the thrusts are.

• Left/Right: The left and right movements depend on the difference in rotational speed between the two motors on the right and the two motors on the left. If the rotational speed of the motors on the right is higher, the quadcopter will go to the left, see Fig. 15a, and vice versa, see Fig. 15b.

(a) Left. (b) Right.

Figure 15: Left/Right. The size of the circles indicates how large the rotational speed of each pair of rotors is.

• Forward/Backward: If the two motors in front rotate faster than the two motors
in the back, the quadcopter will go backward, see Fig. 16b, and vice versa, see Fig.
16a.

(a) Forward. (b) Backward.

Figure 16: Forward/Backward. The size of the circles indicates how large the moments generated by the rotors are.

2.3.4 Rotational directions
There are three rotational directions:
• Roll: The roll movement works the same way as the left and right movements, but refers to rotating the quadcopter clockwise, in Fig. 17a, or counterclockwise, in Fig. 17b, about the axis pointing through the quadcopter's head.

(a) Roll right. (b) Roll left.

Figure 17: Roll movements.

• Pitch: The pitch movements work the same way as the roll movement but viewed from the side of the quadcopter. Thus, the pitch movement occurs when the two front motors, in Fig. 18a, or the two rear motors, in Fig. 18b, create more thrust than the other two rotors.

(a) Pitch backward. (b) Pitch forward.

Figure 18: Pitch movements.

• Yaw: The yaw movement will occur when the two rotors that spin in the same direction produce more torque than the other two rotors spinning in the opposite direction, as shown in Fig. 19.

(a) Yaw right. (b) Yaw left.

Figure 19: Yaw movements (torques in blue).

3 Hardware description

3.1 Peripheral devices


• Micro-USB cable, used for charging the minidrone and for uploading the firmware from MathWorks.
• USB BT Wireless Dongle [19] (Bluetooth low energy) and its drivers, used to connect the minidrone to the computer and try the developed Simulink model in the real scenario.
• USB type C [20] hub LC-Power with 3 USB type A [21] ports, used to be able to connect all the hardware to the computer via USB type C.
• Extra screen for a larger view of Simulink.
• Cable adapter HDMI to DVI, used to connect the computer to the extra screen.
• For the real scenario, 10 cm wide red lines made of tape were used, as well as a 20 cm diameter red circle made of the same material as the lines.

3.2 Parrot Mambo Fly


The Parrot Mambo Fly minidrone is stable, safe, small, and easy to control. It is equipped
with an ultrasound sensor, a down-facing camera, a pressure sensor, and an IMU sensor.

3.2.1 Sensors
• Ultrasound: The ultrasound sensor, which is located under the drone, is used to measure the distance of the minidrone from the ground. Specifically, it measures the time the sound waves take to leave the sensor, bounce off the ground, and return; this round-trip time determines the distance from the current sensor position to the ground (see the expression after this list). Distances of up to 4 meters can be measured. In addition, the sensor is used for the vertical stabilization of the drone.
• Down-facing camera: The down-facing camera, which is located under the drone,
is used to estimate the minidrone motion using optical flow. It is useful for the
horizontal stabilization and to compute the speed of the Parrot Mambo Fly. The
camera has a frame rate of 60 FPS.
• Pressure: This sensor is used for vertical stabilisation, to estimate the altitude: the higher the altitude of the drone, the lower the measured pressure, and vice versa.
• IMU: This sensor combines a 3-axis accelerometer and a 3-axis gyroscope. It is used for evaluating the acceleration, the angular rate, and obstacle contact. In addition, it provides measurements of the orientation and heading of the drone.
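As a simple statement of the time-of-flight principle used by the ultrasound sensor above, with $c$ the speed of sound in air (approximately 343 m/s) and $\Delta t$ the measured round-trip time of the pulse, the distance to the ground is

$$d = \frac{c\,\Delta t}{2}.$$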

3.2.2 Specifications
When the quadcopter is on, its two front lights indicate its status through their colour and whether or not they blink.
• Steady orange: Starting up state.
• Steady green: Ready to fly state.
• Steady red: Error state.
• Blinking green: No Bluetooth connection.
• Blinking red: Running out of power.
The LiPo battery is inserted in the rear of the minidrone, where there is also a LED that informs about the state of charge of the battery: the LED is green if the charge is complete and red if it is not.

Battery LiPo 550 mAh


Battery life 9 minutes
Charging time 30 minutes with a 2.1 A charger
Camera 60 FPS
Bluetooth range 20 m with smartphone
100 m with Parrot Flypad
Weight 63 g without bumpers
Dimensions 180 x 180 x 40 mm with bumpers
Operating systems iOS 7 or higher
Android 4.3 or higher
Software Development Kit Linux OS

Table 2: Parrot Mambo Fly specifications [1].

3.2.3 Setup and connection with computer


This section explains how to set up and communicate with the Parrot Mambo Fly using
Windows 10 via Bluetooth.
1. Connect the Parrot Mambo Fly to the computer via USB, and then open the Add-
On Manager. Then, click on the gear next to the Simulink Support Package for
Parrot Minidrones toolbox.
2. Once the Hardware Setup window has been opened, the minidrone has to be detected
and then the firmware update can be started. To do this, disconnect the drone from
the computer and press the button Next. The firmware will be updated when both
front lights of the drone change from flashing orange to flashing green.

(a) Drone detected successfully. (b) Update firmware instructions.

Figure 20: Detecting drone and updating firmware.

3. When the lights are flashing green, it is time to connect the drone to the computer via Bluetooth, using the USB BT Wireless Dongle. Inside My Bluetooth Devices, press the Add Devices button and select the Mambo 823458 device that has a joystick icon, as there are two devices with the same name.
4. If the connection was successful, right-click on the connected Parrot Mambo in
the Bluetooth Devices, then click on Open Services and connect the Personal Area
Networking.

(a) Joystick icon to be selected. (b) Personal Area Networking connection.

Figure 21: Minidrone Bluetooth connection.

5. Once the connection of the Personal Area Networking is done, the minidrone is
finally connected and ready to use.

4 Software description
4.1 Programmes used and prior knowledge
The program used to carry out the project is MATLAB/Simulink R2021a [22].
MATLAB [23] is a numeric computing environment with its own programming language, while Simulink [24] is a MATLAB-based graphical programming environment for modeling, simulating, and analyzing multi-domain dynamical systems. Stateflow [25] is a control logic tool used to model state machines and flow charts within a Simulink model.

4.1.1 Required toolboxes

Toolbox Version
DSP System Toolbox [26] 9.12
Signal Processing Toolbox [27] 8.6
Simulink Support Package for Parrot Minidrones [28] 21.1.0
Stateflow [29] 10.4
Simulink 3D Animation [30] 9.2
Image Processing Toolbox [31] 11.3
Control System Toolbox [32] 10.10
Computer Vision Toolbox [33] 10.0
Aerospace Toolbox [34] 4.0
Aerospace Blockset [35] 5.0

Table 3: Required toolboxes.

• DSP System Toolbox: Provides algorithms, apps, and scopes for designing, sim-
ulating, and analyzing signal processing systems.
• Signal Processing Toolbox: Provides functions and apps to analyze, preprocess,
and extract features from uniformly and nonuniformly sampled signals.
• Simulink Support Package for Parrot Minidrones: Allows to build and deploy
flight control algorithms to Parrot minidrones.
• Stateflow: Provides a graphical language that includes state transition diagrams,
flow charts, state transition tables, and truth tables.
• Simulink 3D Animation: Links Simulink models and MATLAB algorithms to
3D graphics objects in virtual reality scenes.
• Image Processing Toolbox: Provides a comprehensive set of reference-standard
algorithms and workflow apps for image processing, analysis, visualization, and al-
gorithm development.
• Control System Toolbox: Provides algorithms and apps for systematically ana-
lyzing, designing, and tuning linear control systems.

• Computer Vision Toolbox: Provides algorithms, functions, and apps for design-
ing and testing computer vision, 3D vision, and video processing systems.
• Aerospace Toolbox: Provides standards-based tools and functions for
analyzing the motion, mission, and environment of aerospace vehicles.
• Aerospace Blockset: Provides Simulink reference examples and blocks for mod-
eling, simulating, and analyzing high-fidelity aircraft and spacecraft platforms.

4.2 Overview of Simulink Support Package for Parrot Minidrones


The main toolbox used in this project is the Simulink Support Package for Parrot Minidrones; a compact documentation of the toolbox is shown in appendix A.
This toolbox lets us build and deploy flight control algorithms on Parrot minidrones. It provides a simulation model that takes into account the airframe, a flight visualization, the flight commands, and a flight control system. The Image Processing System and the Path Planning blocks must be designed and implemented inside the Flight Control System.

(a) Isometric camera. (b) Quadcopter camera.

Figure 22: Flight visualization.

Of the seven subsystems, four are considered variant subsystems, because they allow us to choose between different options inside each subsystem:
• Airframe: Linear airframe or nonlinear airframe.
• Environment: Constant or variable.
• Sensors: Dynamics or feedthrough.
• Flight Commands: Signal builder, joystick, .mat data or spreadsheet data.
Within all the possibilities, the subsystems selected by default are the nonlinear airframe
block, the constant environment block, the dynamics sensors block, and the signal builder
command block.

The other three, which are not variant subsystems, are the aforementioned flight control system block, the flight visualization block, and the flag block, which stops the simulation if any unexpected error occurs during the flight.
In the image processing system block, see the green block in Fig. 23, the track and the circle have to be detected; the output of this block is the input of the path planning block. In the path planning block, the movements of the minidrone are controlled to follow the detected line and, finally, land on the circle.

Figure 23: Flight control system.

The algorithms designed in Section 5 can access the onboard sensors, such as the ultrasonic, accelerometer, gyroscope, and air pressure sensors, as well as the downward-facing camera. In addition, the 3D simulation track was redesigned multiple times to test the robustness of the algorithms.

5 Methodology / project development

5.1 First line tracking algorithm


The aim of this first line tracking algorithm is to divide the obtained image into different sub-matrices and then calculate the number of white pixels in each sub-matrix. Based on this information, the decision to go straight, turn right, turn left, or land is made.

5.1.1 Image Processing System block design


The image data obtained from the Parrot Mambo Fly is converted into an RGB image. Then, a mask created with the MATLAB Color Thresholder App (appendix B.1) is applied to the RGB image. After the mask, a BW image is obtained, where the white pixels are the line and the black pixels are the background.

(a) RGB image. (b) Masked image.

Figure 24: 120x160 image obtained from the drone.
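As an illustration, the following is a minimal MATLAB sketch in the style of the createMask function generated by the Color Thresholder App; the function actually used in simulation is listed in appendix B.1, and the HSV thresholds below are placeholders rather than the values used in this thesis.

    function BW = createMaskSketch(RGB)
    % Binarize an RGB frame: 1 (white) where the coloured line is,
    % 0 (black) elsewhere. All thresholds are illustrative placeholders.
    I   = rgb2hsv(RGB);                    % work in the HSV colour space
    hue = I(:,:,1); sat = I(:,:,2); val = I(:,:,3);
    BW  = (hue >= 0.95 | hue <= 0.05) ...  % red hue band (wraps around 0)
        & (sat >= 0.40) & (val >= 0.30);   % reject washed-out/dark pixels
    end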

Once the image is masked, it is divided into different sub-matrices to determine which region of the image has more white pixels. Each region of the image returns a '1' if its number of white pixels is greater than a defined threshold, and a '0' if it is smaller. Each threshold is selected depending on the size of the sub-matrix applied to the original image.
The sub-matrices can be separated into those in charge of detecting the circle to land on and those in charge of detecting changes of direction of the line.

Sub-matrix Rows Columns Threshold


Left 1 to 120 1 to 55 150
Right 1 to 120 105 to 160 150
Curve 40 70 to 90 10
Up 2 70 to 90 15
Middle 35 to 80 40 to 120 1100
Bottom 100 to 120 50 to 110 210

Table 4: Sizes and thresholds of the first tracking algorithm sub-matrices.

The values of rows and columns in Tab. 4 define the extent of each sub-matrix, while the threshold is the minimum number of white pixels in a sub-matrix for its output to be '1'.

(a) Sub-matrices detecting changes of direction. (b) Sub-matrices detecting the circle.

Figure 25: First algorithm sub-matrices.

Thus, the outputs of this algorithm are the Left, Right, Curve, Up, and Circle detections. Using logical operators, the output Circle will be '1' if Up is '0' and Middle and Bottom are '1'.

Figure 26: First image processing system.
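Written out as code, this logic amounts to counting white pixels per region. The following minimal MATLAB sketch uses the limits and thresholds of Tab. 4; the function and field names are illustrative.

    function det = firstAlgorithmDetections(BW)
    % BW: 120x160 logical image produced by the colour mask (Fig. 24b).
    det.left   = nnz(BW(1:120,   1:55))    > 150;   % line on the left side
    det.right  = nnz(BW(1:120,   105:160)) > 150;   % line on the right side
    det.curve  = nnz(BW(40,      70:90))   > 10;    % line still ahead at row 40
    det.up     = nnz(BW(2,       70:90))   > 15;    % line reaches the top row
    middle     = nnz(BW(35:80,   40:120))  > 1100;
    bottom     = nnz(BW(100:120, 50:110))  > 210;
    det.circle = middle & bottom & ~det.up;         % blob with no line above it
    end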

5.1.2 Path Planning block design
The inputs of the Path Planning block are the Left, Right, Curve, Up, and Circle detections from the colour tracking block, the estimated state values, and the reference commands. The outputs are the updated commands, i.e. the position and the orientation of the minidrone, which are calculated inside the chart block, see Fig. 28.

Figure 27: First path planning system.

The estimated yaw value and the inputs from the Image Processing System block are
the inputs of a chart, which updates the value of the position and the orientation of the
minidrone.

Figure 28: The first path planning algorithm stateflow.

The algorithm works as follows. First, the altitude of the drone is commanded to 1.1 meters, while the other coordinates stay at 0 meters. After 3 seconds, the drone either goes forward or turns to the left/right until the downward camera is facing the line; the decision is made depending on the Right and Left values from the color-tracking algorithm. When the minidrone is moving forward, its speed decreases if the Up detection equals '0', because the algorithm recognises that a turning point is coming. When the minidrone is on the edge of a turn, i.e. Curve equals '0', it yaws to the left or to the right without moving forward. This turning action finishes when the drone detects the line in front of the quadcopter again. At the end of the track, the circle is detected and a landing command is activated in the drone.
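For reference, the chart of Fig. 28 can be sketched as a plain MATLAB state machine. The state names, step sizes, and sign conventions below are illustrative assumptions, not the exact Stateflow implementation.

    function [cmd, state] = firstPathPlanStep(state, det, cmd, t)
    % One update step. det holds the logical detections of Fig. 26;
    % cmd holds the commanded position (x, y, z) and yaw angle.
    switch state
        case 'TakeOff'
            cmd.z = -1.1;                  % 1.1 m altitude (z assumed down)
            if t > 3, state = 'Forward'; end
        case 'Forward'
            step = 0.0004;                 % forward increment (assumed)
            if ~det.up, step = step/2; end % slow down: a turn is coming
            cmd.x = cmd.x + step*cos(cmd.yaw);
            cmd.y = cmd.y + step*sin(cmd.yaw);
            if det.circle
                state = 'Land';
            elseif ~det.curve              % edge of a turn: yaw in place
                if det.right, state = 'YawRight'; else, state = 'YawLeft'; end
            end
        case 'YawRight'
            cmd.yaw = cmd.yaw + 0.02;      % rotate until the line is ahead
            if det.curve, state = 'Forward'; end
        case 'YawLeft'
            cmd.yaw = cmd.yaw - 0.02;
            if det.curve, state = 'Forward'; end
        case 'Land'
            cmd.z = 0;                     % descend onto the circle
    end
    end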

(a) Land (quadcopter camera). (b) Yawing right (down-facing camera).

Figure 29: First algorithm minidrone movements.

5.2 Second line tracking algorithm
The second line tracking algorithm has some minor improvements over the first one, which make the minidrone land on the circle in less time and with smoother movements in the turns.

5.2.1 Image Processing System block design


In this algorithm, see Fig. 30, different values are used for the sub-matrices compared to the first algorithm. Furthermore, LineLeft and LineUp detections are added to the algorithm, in order to detect the initial orientation of the line with respect to the drone and to detect the circle at the end of the track.

Figure 30: Second image processing system.

As with the sub-matrices of the first image processing system, the sub-matrices of this
second block can be divided into those in charge of detecting the line, and those in charge
of detecting the circle.

Sub-matrix Rows Columns Threshold


Left 1 to 20 1 to 55 150
Right 1 to 20 105 to 160 150
Up 1 to 30 75 to 85 150
Middle 35 to 90 30 to 130 1100
Bottom 100 to 120 1 to 160 200
LineLeft 1 to 120 1 to 50 150
LineUp 1 to 30 1 to 160 150

Table 5: Sizes and thresholds of the second tracking algorithm sub-matrices.

(a) Sub-matrices detecting changes of direction. (b) Sub-matrices detecting the circle.

Figure 31: Second algorithm sub-matrices.

The input of this system is the image obtained from the downward minidrone camera, which is binarized using the mask, obtaining a 120x160 pixel BW image. The outputs are the Left, Right, Up, LineLeft, and Circle detections. Using logical operators, Circle will be '1' if Middle and Bottom are '1' and LineUp is '0'.
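In code form, the circle output of this block reduces to the following pixel counts (region limits and thresholds from Tab. 5; the function and variable names are illustrative):

    function circle = secondAlgorithmCircle(BW)
    % BW: 120x160 logical image produced by the colour mask.
    middle = nnz(BW(35:90,   30:130)) > 1100;
    bottom = nnz(BW(100:120, 1:160))  > 200;
    lineUp = nnz(BW(1:30,    1:160))  > 150;
    circle = middle & bottom & ~lineUp;    % blob present, no line above it
    end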

5.2.2 Path Planning block design


The output of this block is the updated position and orientation of the drone, which are calculated inside a chart. The inputs are the reference commands, the outputs of the track detection block, and the estimated state values.
A chart block is used to update the position of the drone, i.e. xout, yout and zout, and its orientation, i.e. yawOut, depending on which inputs of the Vision-based Data are '1' or '0'. Vision-based Data is an input bus, see Fig. 32, that carries all the sub-matrix detections from the image processing system block to the path planning block.

Figure 32: Second path planning system.

The stateflow works in the same way as the one in Fig. 28, with some changes. First, as the sub-matrices Left and Right are in the corners, see Fig. 31, the initial orientation of the drone facing the line is obtained using LineLeft and Up. Then, once the minidrone is facing the line, it moves forward until the Right or Left detection equals '1', at which point the drone starts yawing to the left or to the right. In this algorithm, the drone keeps moving forward at a lower speed while it is turning, which produces a smoother movement than in the previous algorithm, as shown in Fig. 34. For the landing, if a circle is detected and the Right and Left detections equal '0', the quadcopter lands and the program finishes.

Figure 33: Second path planning algorithm stateflow.

(a) YawRight state. (b) YawLeft state.

Figure 34: Turning points of the track.

5.3 Third line tracking algorithm
As the first two algorithms work with sub-matrices, and a real scenario contains errors and unexpected variations, this third line tracking algorithm has been designed to be more robust than the previous ones.

5.3.1 Image Processing System block design


First, see Fig. 35, the image data obtained from the minidrone is masked to obtain a BW image, where the white pixels are the line and the circle and the black pixels are the background. Then, this binarized image is filtered (appendix B.2) in order to remove white pixels that appear due to noise.

Figure 35: Third image processing system.

After the filter, the image is shrunk to 118x118 pixels, so that all sides have the same dimension. Then it is rotated 90°, to work more easily with the Blob Analysis block [36], as shown in Fig. 36b. This rotation helps to obtain an orientation between 90° and -90°, rather than between 180° and 0°.
The Blob Analysis block calculates statistics for labeled regions in a binary image. It returns quantities such as the centroid, blob count, orientation, perimeter, and area, among others.

(a) Image before the rotate and sub-matrix blocks. (b) Image after the rotate and sub-matrix blocks.

Figure 36: Rotating and squaring the image.

Once the image is rotated, the image processing is divided into two branches, see Fig. 35. One branch takes care of detecting the circle and the orientation of the line. The other branch calculates the midpoint where the line crosses each of the 4 sides of the image.
In the second branch, the Edge Detection block is applied first, in order to then obtain the corners of the line on each side of the image, as shown in Fig. 54. Then, in the Subsystem block, the midpoint of the corners is calculated for each side of the image. This gives the exact point where the line is located on each side.

(a) turnFront and turnBack subsystems.

(b) turnUpper and turnLower subsystems.

Figure 37: Image Processing subsystem.

The else-if block of each subsystem assigns the maximum value 113 or the minimum value 1 to an output when only one edge of the line is detected on that side (the corner edges MATLAB function returns values between 1 and 113), meaning that the line is located in a corner of the image. If this were not done and the line were in a corner, one corner edge value could be 110 and the other one -1 (not detected), giving an average of 54.5, which would not correspond to the real position of the line.
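A minimal MATLAB sketch of that rule for one side is given below; the half-width split at 57 and the function name are assumptions made for illustration.

    function mid = sideMidpointSketch(e1, e2)
    % e1, e2: corner positions on one image side, in 1..113, or -1 when
    % that corner is not detected (corner edges function, appendix B.3).
    if e1 > 0 && e2 > 0
        mid = (e1 + e2)/2;    % both corners seen: true midpoint of the line
    elseif e1 > 0 || e2 > 0
        % one corner missing: the line sits in an image corner, so clamp
        % to 1 or 113 instead of averaging with the missing -1 value
        if max(e1, e2) > 57, mid = 113; else, mid = 1; end
    else
        mid = -1;             % the line does not touch this side at all
    end
    end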
The detection of the circle is based on a circularity measure that equals 1 (or approximately 1) when the detected blob is close to a circle. Using the area $A$ and perimeter $P$ parameters of the Blob Analysis block, for an ideal circle of radius $r$:

$$\frac{4\pi A}{P^2} = \frac{4\pi (\pi r^2)}{(2\pi r)^2} = \frac{4\pi^2 r^2}{4\pi^2 r^2} = 1$$
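In code form, with A and P taken from the corresponding Blob Analysis outputs (the example values below simply confirm the identity):

    A = pi*20^2;                        % example: ideal circle of radius 20 px
    P = 2*pi*20;
    circularity = 4*pi*A / P^2;         % evaluates to 1 for a perfect circle
    circleFound = circularity > 0.85;   % landing threshold used in Sec. 5.3.2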

The input of the Image Processing System block is the image obtained from the downward camera of the drone. The outputs, see Fig. 35, are orientation, which is converted from radians to degrees, and circle, from the Blob Analysis block, as well as turnBack, turnFront, turnUpper, and turnLower, from the Edge Detection branch. As the dimension of the image after the corner edges MATLAB function is 113x113 pixels, turnBack, turnFront, turnUpper, and turnLower take a value between 1 and 113 if the line touches the corresponding side, or -1 if it does not.

Figure 38: Graphical representation of the outputs.

5.3.2 Path Planning block design
The inputs of the Path Planning block, see Fig. 39, are the Circle state, Orientation, turnFront, turnBack, turnUpper, and turnLower from the Image Processing System block, and the reference commands. The estimated state values are not used in this algorithm, as the simulation results are the same using either the estimated states or the inputs. The outputs are the updated reference commands, i.e. the orientation and the position of the drone.

Figure 39: Third path planning block.

In order to update the position and the orientation of the drone, a stateflow is created, see Fig. 40, with turnBack, turnFront, turnUpper, turnLower, Orientation, and the Circle state as inputs. The outputs are yawOut, in radians, and xout, yout, and zout, in meters. This stateflow has 9 states that control the 5 movements of the drone, i.e. take off, turn right, turn left, move forward, and land.
First comes the TakeOff state, where all the variables are initialized and the drone altitude is commanded to 1.1 meters. After 3 seconds, if the drone is facing the line, the Forward state takes over and the drone moves forward at a specific speed. However, if the drone is not facing the line, it turns to the right or to the left until it is. Then, if a turning point is detected, the minidrone turns to the right or to the left while still moving forward, in order to make smoother movements, until the line is again at the center of the image. Also, during the flight, if the orientation between the line and the minidrone is larger than 3° or smaller than -3° and no turning point is detected, the drone also turns to the right or to the left; this helps to center the quadcopter over the middle of the line. Finally, if the input circle is greater than 0.85, the minidrone lands.
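The ±3° correction can be sketched in MATLAB as follows; the one-degree corrective step and the variable names are illustrative assumptions.

    % orientationDeg: line orientation from the Blob Analysis block (degrees);
    % turningPoint:   true while a turn of the track is being negotiated.
    orientationDeg = 5; turningPoint = false; yawOut = 0;   % example inputs
    if abs(orientationDeg) > 3 && ~turningPoint
        % small yaw step toward the line to re-center the quadcopter
        yawOut = yawOut + deg2rad(sign(orientationDeg));
    end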

Figure 40: Third path planning algorithm stateflow.

In Fig. 41, the YawLeftAcute and YawLeftObtuse states can be distinguished. The YawLeftAcute and YawRightAcute states help the drone correct its position when the deviation between the line and the orientation of the drone is larger than 3° or smaller than -3°.

(a) YawLeftAcute state. (b) YawLeftObtuse state.

Figure 41: Third algorithm yaw movements.

For the landing, if the input circle is greater than 0.85, as can be seen in Fig. 42a, the landing starts and the drone lands in the middle of the circle, Fig. 42b. If the value of 0.85 were changed to 0.95 or 1, the drone would not land until the red line disappeared from the image and only the circle remained in it.

(a) Circle detection scope. (b) Land in the circle.

Figure 42: Circle detection.

6 Simulation test results
All three algorithms were tested on the same two tracks with different minidrone speeds, in order to compare the time taken to complete the track and their proper functioning.

(a) Track 1. (b) Track 2.

Figure 43: Simulation tracks.

Although algorithm 2 is the one that completes the tracks fastest, algorithm 3 is more consistent and stays in the center of the line most of the time. In addition, it is less prone to detecting noise and is, therefore, more robust.

Algorithm  Track  Speed   Time
First      1      0.0003  221.00 s
First      1      0.0004  185.00 s
First      1      0.0005  169.47 s
First      2      0.0003  143.04 s
First      2      0.0004  131.40 s
First      2      0.0005  122.40 s
Second     1      0.0003  188.46 s
Second     1      0.0004  148.85 s
Second     1      0.0005  123.46 s
Second     2      0.0003  105.46 s
Second     2      0.0004   85.62 s
Second     2      0.0005   69.26 s
Third      1      0.0003  183.42 s
Third      1      0.0004  150.42 s
Third      1      0.0005  129.24 s
Third      2      0.0003  118.26 s
Third      2      0.0004   97.02 s
Third      2      0.0005   82.20 s

Table 6: Simulation results.

7 Experimental test results
7.1 Set-up
Once the drone works correctly in the 3D simulation, it is time to test it in a real scenario. In the real scenario, the ground is not uniform, the 10 cm wide lines are red, as is the 20 cm diameter circle, and the light is not constant along the entire route.
For the experimental test, track 1 of the 3D simulation was reconstructed. In addition, as the optical flow uses the down-facing camera and the arena contains no objects other than the line, some tape strips had to be added to the arena. This helps the drone calculate its position and movement more easily, taking the tape strips as reference points. Without the tape strips, the drone tended to make unexpected movements, because it did not know its exact position.
Lamps were also added along the track, illuminating it from the side, to provide a constant-light scenario and help the minidrone detect the entire track; in the real experiment, the drone is susceptible to changing its behaviour under different illuminations.

Figure 44: Real scenario track.

In appendix B.4, it can be seen how the thresholds of the initial mask were changed in order to detect the track in the real scenario: we are no longer working with the simulation, and, therefore, the colour of the line is not the same. In Fig. 45, only the red colour of the line and the circle in the real scenario is detected, using the MATLAB Color Thresholder App.

(a) Real scenario track after the mask. (b) Real scenario track in binary after the mask.

Figure 45: Applying the mask in the real scenario.
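The actual thresholds are shown in Annex B.4; as an indication of the shape of such a function, the sketch below follows the style of a Color Thresholder export for red, whose hue wraps around 0. All threshold values here are illustrative placeholders, not the ones used in the experiment.

function [BW, maskedRGBImage] = createMaskSketch(RGB)
% Illustrative red mask in the style of a Color Thresholder export.
I = rgb2hsv(RGB);
% Red wraps around hue = 0, so accept hues near both ends of the range.
hueMask = (I(:,:,1) >= 0.95) | (I(:,:,1) <= 0.05);
satMask = I(:,:,2) >= 0.40;     % reject washed-out (grey) pixels
valMask = I(:,:,3) >= 0.20;     % reject very dark pixels
BW = hueMask & satMask & valMask;
maskedRGBImage = RGB;
maskedRGBImage(repmat(~BW, [1 1 3])) = 0;  % zero out non-red pixels
end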

7.2 Results
Once the real scenario has been set up and the mask has been changed, the flight control system can be tested with the Parrot Mambo Fly.
Based on the results of the simulation tests, the third algorithm is the one tested in the real scenario, because it is the most consistent and robust algorithm.
In the experimental tests, the colour detection algorithm works as expected, and the track and the circle are detected without noise pixels, provided the illumination is almost the same across the whole arena.
On the other hand, the line tracking algorithm also works as expected, but sometimes undesired behaviours occur:
• As the ultrasound sensor of the Parrot Mambo Fly is very sensitive to variations, it sometimes does not detect the minidrone's real altitude of 1.1 meters. In that case, the drone begins to ascend without limit, either because of its own movement or because the waves from the ultrasound sensor bounce off nearby objects.
• The limited computational speed of low-cost hardware such as the Parrot Mambo Fly makes it impossible to process every frame from the drone's down-facing camera. Therefore, some of the quadcopter's unexpected movements are due to the image from the down-facing camera not being updated and processed as fast as the minidrone is moving.

8 Budget
Components list
Parrot Mambo Fly              69,90€
USB BT Wireless Dongle         7,04€
USB Type-C hub LC-Power       28,30€
10 cm wide red tape            8,97€
Cable adapter HDMI to DVI      4,72€
Eizo monitor                 208,99€
MathWorks license           2.000,00€

TOTAL                       2.327,92€

Table 7: Components cost.

Dedication time
Task Hours Cost
Phase 1 26h 520€
Phase 2 26h 520€
Phase 3 28h 560€
Phase 4 46h 920€
Phase 5 134h 2.680€
Phase 6 84h 1.680€
Phase 7 60h 1.200€

TOTAL 8.080€

Table 8: Hours dedicated and estimated cost.

Taking into account the costs calculated in the tables above (the tasks of each phase can be seen in Section 1.4.1) and an estimated junior-engineer salary of 20 €/h (404 h × 20 €/h = 8.080 €), the total cost of the project is 2.327,92 € + 8.080 € = 10.407,92 €.
It should also be noted that many of the materials used during the project, as well as the MathWorks licence, already belonged to TU Wien. Also, as this is a bachelor thesis, the dedication cost is calculated only to give an idea of the hours worked and of the cost that would have been involved if the project had been carried out by a company or a self-employed engineer.

9 Conclusions
In the literature, there are numerous different vision-based line following methods for
micro air vehicles. In this project, three line tracking algorithms were selected and suc-
cessfully tested.
The first and second algorithms are based on the analysis of pixels in specific regions of the image from the drone's down-facing camera. If the number of white pixels detected in these regions exceeds a given threshold, a turn-right, turn-left, move-forward, or land movement is made, as sketched below.
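The core of this region-based test can be condensed into a few lines. The sketch below is a reconstruction of the idea only: the region layout, the pixel threshold, and the fallback action are all assumed rather than taken from the actual models.

function cmd = regionCommand(bw)
% Sketch of the region-counting idea behind the first two algorithms:
% count white pixels in fixed sub-regions of the binary camera image.
[h, w] = size(bw);
thr = 200;                                  % hypothetical pixel count
if nnz(bw(:, 1:floor(w/3))) > thr           % line leaves to the left
    cmd = "TurnLeft";
elseif nnz(bw(:, ceil(2*w/3):end)) > thr    % line leaves to the right
    cmd = "TurnRight";
elseif nnz(bw(1:floor(h/3), :)) > thr       % line continues ahead
    cmd = "MoveForward";
else
    cmd = "Land";                           % assumed fallback action
end
end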
The second algorithm is an improved version of the first one: it achieves smoother movement and requires less time to reach the end of the track.
The third algorithm, on the other hand, calculates the exact location of the line, deciding whether the drone has to turn right or left based on the midpoint of the end of the line in the image. It also detects whether there is a circle in the image, which determines whether the drone has to land, and checks whether the orientation of the line with respect to the drone is appropriate.
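Both measurements can be reconstructed from the binary image. The following sketch is one way to do so, assuming the line exits at the top edge of the image and using regionprops' Orientation (the angle of the blob's major axis with respect to the horizontal) as the line angle; how that angle maps onto the drone's heading error is an assumption.

function [midX, angDeg] = lineMeasurements(bw)
% Sketch: midpoint of the line where it crosses the top image edge,
% plus the line's principal orientation. bw is the binary track mask.
topPixels = find(bw(1, :));                 % line pixels on the top row
if isempty(topPixels)
    midX = NaN;                             % line does not reach the edge
else
    midX = mean(topPixels) / size(bw, 2);   % normalised to [0, 1]
end
stats = regionprops(bw, 'Orientation');
if isempty(stats)
    angDeg = 0;
else
    angDeg = stats(1).Orientation;          % degrees from horizontal axis
end
end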
In the simulation, the three algorithms were tested at different minidrone speeds. The fastest algorithm to finish the tracks is the second one. The most robust is the third one, because it makes almost no unnecessary movements and the line stays beneath the drone at all times.
In the real experiment, the scenario has to be set up carefully, with preferably constant lighting, no nearby objects, and reference points to help the optical flow sensor. The third algorithm proved to work as expected, and the errors were minimised.
Finally, it can be said that the project has been a success, meeting the initial requirements with three working tracking algorithms.

10 Future Work
Drones have evolved considerably over the years. They started as tools for military operations and have since expanded into a broad range of civil applications.
Line tracking algorithms will be life-changing with regard to the future applications of unmanned aerial vehicles, since they greatly enhance the precision of navigation. Despite the high-level functionality of today's drones, there is still room for improvement; for instance, obstacle detection must be optimized.
I am positive that the implementation of such techniques will reduce the number of drone pilots needed, since drones will increasingly be able to fly autonomously.
Drones have gone from being an idea to an increasingly widespread and advanced reality. I wonder how they will advance and what they will look like in the coming decades.

Appendices
A Quadcopter Flight Simulation Model

Figure 46: Quadcopter flight simulation model.

(a) Linear Airframe.

(b) Nonlinear Airframe.

Figure 47: Airframe.

(a) Constant Environment.

(b) Variable Environment.

Figure 48: Environment.

(a) Dynamics Sensors.

(b) Feedthrough Sensors.

Figure 49: Sensors.

Figure 50: Flag.

(a) Signal Builder.

(b) Joystick.

(c) .mat Data.

(d) Spreadsheet Data.

Figure 51: Commands.

B MATLAB Functions
B.1 createMask simulation

Figure 52: createMask simulation MATLAB function.

B.2 filtered image

Figure 53: filtered image MATLAB function.

B.3 corner edges

Figure 54: corner edges MATLAB function.

B.4 createMask real scenario

Figure 55: createMask real scenario MATLAB function.
