Implementation of Autonomous Visual Detection,
Tracking and Landing for AR.Drone 2.0 Quadcopter
Victor Massague Respall, Sami Sellami, Ilya Afanasyev
Innopolis University, Innopolis, 420500, Russia
{v.respall, s.sellami, i.afanasyev}@innopolis.ru

Abstract—This paper presents the design and implementation of a system where an autonomous aerial vehicle (UAV) has to detect, track and land in the vicinity of a moving Unmanned Ground Vehicle (UGV). The quadcopter needs to identify and track the target using camera-based perception. A color-based detection algorithm is employed to detect the UGV position, and Kernelized Correlation Filters (KCF) are taken for its tracking; to double check the UGV presence in an image frame, the ORB (Oriented FAST and Rotated BRIEF) feature detector is used (suppressing the false positive cases). Then the UGV position detected in the image frame is used in a PD feedback control loop in the quadcopter's controller that enables the actual tracking of the moving UGV in the horizontal plane, while another PID controller stabilizes the UAV altitude. The proposed approach was initially tested in the Gazebo simulator, and then in a real environment with Parrot AR.Drone 2.0 and Plato robots (as UAV and UGV correspondingly). The results demonstrate that the proposed system can detect and track the UGV in real-time mode without artificial markers and provide robust UAV landing.

I. INTRODUCTION

Nowadays, plenty of investigations have been carried out in the field of Unmanned Aerial Vehicles (UAVs), and there has been considerable interest in multi-rotor drones. Their most known form is the quadcopter, as quadcopters are an excellent option for various applications, such as ground recognition, access to difficult or dangerous areas, load delivery, data collection, recreation, etc. [1], [2]. Clearly, one useful application would be the ability to track and land autonomously on an object in a dynamically changing environment [3], [4], while landing on a moving platform is of particular interest for many practical tasks [5]. This can be valuable for many areas, such as terrain exploration with ground and aerial vehicles [6], rescue [7] and military missions [8], or simply for commercial purposes, such as the delivery of goods [9], [10].

All of these tasks require a detection, tracking and landing system for different situations and environments where the target or platform can be in motion. Landing on a moving target is a challenging task due to the a priori unknown motion and the nonlinear quadcopter dynamics. Moreover, drones can operate in GPS-denied heterogeneous areas, which requires a higher level of UAV autonomy based on computer vision tracking and landing. An additional problem is the limited UAV flight time, which is usually about 20 minutes [11]. However, the problem of UAV tracking and landing time optimization is beyond the scope of this article.

This study addresses the problem of building the detection, tracking and landing system for an autonomous quadcopter by developing a controller that manages the tasks of following and landing in the vicinity of a moving UGV using the onboard camera and computer vision algorithms. The task was solved in two stages:
• Designing the feedback controller for the UAV;
• Implementing the detection, tracking and landing system using the drone's onboard camera.
With this paper we contribute a solution for the online tracking of a moving platform without any artificial markers, and for the landing within the vicinity of this platform using a low-cost quadcopter. The main limitation of the study is the low-altitude indoor testing environment, imposed for safety reasons, which could be overcome in future investigations.

The paper is organized as follows: Section II describes the papers related to this research; Section III explains our methodology; Section IV presents the robots used; and Section V provides the results of simulation and experiments. Finally, we conclude in Section VI.

II. RELATED WORKS

The development of an object tracking controller for a quadcopter with low-cost components of an onboard vision system was introduced in [12]. A low-frequency monocular vision methodology was used in closed-loop control to track an object of known color, while parallel PID controllers for UAV bearing, relative height and range were implemented with feedback from the object displacement and size detected in the image frame. The noise occurring during image measurements was reduced using a Kalman filter. In a similar study, an autopilot for autonomous landing of a helicopter on a rocking ship was developed, considering the problem as a non-linear tracking control problem for an underactuated system with a two time-scale controller [13]. The research [14] provided the first demonstration of an autonomous landing of a quadrocopter on a moving platform, using computer vision algorithms and multi-sensor fusion to localize the UAV, detect and evaluate the movement of the moving platform, and plan the path for full autonomous navigation, using only onboard sensing and computing without relying on any external infrastructure.

An early study and implementation of a real-time vision-based landing for an autonomous helicopter was presented in the research [15]. The landing method was integrated with helicopter visual detection and navigation algorithms from an arbitrary initial position and orientation. The authors applied vision for accurate target detection and recognition, as well as vision and GPS for navigation, demonstrating successful flight test results in field conditions. Another work [16] implements autonomous landing with AR.Drone 2.0, using an image flow from the drone's camera that was rectified by camera calibration. To calculate the position and orientation relation between the drone and the landing pad, image frames are processed by an AR marker recognition package, the marker being used as the landing pad. To control the drone's velocity and stabilize its hovering over the center of the landing pad, a PID controller is used, and the landing control is carried out by connecting the drone with a notebook via WiFi.
Experiments with an external motion capture system were conducted by [17], who achieved a maximum deviation of 10 cm from the desired position. Similarly, [18] used a field monitored by stationary cameras and presented an assistant and training system for controlling RC helicopters during flight. The idea of using LEDs as landmarks is not completely new: [19] describes how a miniature helicopter can be visually tracked in six degrees of freedom with only three onboard LEDs and a single on-ground camera. Remote cameras can also be found in former projects: [20] presented an optical tracking system using the camera data of two Wii remotes. A similar approach using Wii remotes was used in [21], where the system provides small errors and allows for safe, autonomous indoor flights.

Visual methods of tracking and landing are often used for multirotor drones with computer vision algorithms and visual markers like ArUco, AprilTags, Alvar, etc. [3], [22]. The robustness and invariance of these visual markers to rotation and occlusion, which have made them popular for robotics applications, are discussed in various studies [23]. Although visual markers have proven to be reliable visual landmarks for detecting and tracking targets, they are not the only possible solution to the problem of UAV following and landing. In this study, we apply a color-based detection algorithm to detect the position of the UGV and the ORB (Oriented FAST and Rotated BRIEF) feature detector to double check for the UGV presence in a bounding box within the image frame (to exclude false positive cases).
III. METHODOLOGY

In this section we describe every step of the detection and tracking methodology. Because we need to calculate the exact UGV positions relative to the quadcopter using image processing from the UAV camera, and suppress possible distortions, the AR.Drone 2.0 bottom camera is calibrated. This procedure is repeated every time the drone is powered up. In the next step, the drone is ready to take off and start the process of UGV detection in every current frame of the video stream.

The algorithm is performed in two phases: the first one is UGV detection and tracking, and the second one deals with the drone's motion control. The detection process is executed for each frame until the UGV is detected. If the UGV is successfully localized in the image, then the tracking algorithm acquires the UGV position in the image coordinate frame. In the second phase, the UGV position is used to calculate the command inputs required to perform the following. A Kalman filter is used to reduce the measurement noise (see the sketch after the figure caption below). The process is repeated in a control loop until the predefined landing signal is sent to the UAV. The block-scheme of the proposed methodology for detection, tracking and landing is shown in Fig. 1.

Fig. 1. The block-scheme of the proposed methodology for UGV detection and tracking
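The paper does not detail the filter design, but the smoothing step can be illustrated with OpenCV's cv2.KalmanFilter. The sketch below is a minimal constant-velocity smoother for the detected pixel position; the state layout, frame rate and noise covariances are our own illustrative assumptions, not values from the experiments.

```python
import numpy as np
import cv2

# Constant-velocity Kalman filter smoothing the UGV pixel position (u, v).
# State: [u, v, du, dv]; measurement: the raw detection (u, v).
kf = cv2.KalmanFilter(4, 2)
dt = 1.0 / 30.0  # assumed frame period (30 FPS camera)

kf.transitionMatrix = np.array([[1, 0, dt, 0],
                                [0, 1, 0, dt],
                                [0, 0, 1,  0],
                                [0, 0, 0,  1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = 1e-3 * np.eye(4, dtype=np.float32)     # tuning guess
kf.measurementNoiseCov = 5.0 * np.eye(2, dtype=np.float32)  # pixel-noise guess

def smooth_position(u, v):
    """Predict, correct with the raw detection, return the filtered (u, v)."""
    kf.predict()
    est = kf.correct(np.array([[u], [v]], dtype=np.float32))
    return float(est[0, 0]), float(est[1, 0])
```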
A. Detection and Tracking

To obtain the current UGV position, considering it is within the onboard camera's field of view, one approach would be to detect the UGV in every frame. However, this technique is not computationally efficient, since it searches for the object in the whole image frame without using the previous information about its last position. Instead, we would like to detect the object once and then use an object tracker to handle every subsequent frame, leading to a faster and more efficient object tracking pipeline. This is exactly what many trackers
do: they optimize the search using different information acquired over time. A good tracking algorithm will use all the information about the object position, while a detection algorithm always starts the search from the beginning. Examples of successful trackers are Boosting (based on AdaBoost [24]), Multiple Instance Learning (MIL, [25]), and Tracking, Learning and Detection (TLD, [26]).

For our project, a fast and precise tracker is required that can work in real-time mode in order to keep the ground vehicle position in the current frame. It should also be robust to tracking failures, which means the ability to recover from a lost position while tracking. All these characteristics are included in the Kernelized Correlation Filters (KCF) tracker [27].
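OpenCV ships a KCF implementation (it is the one named in the system setup in Section IV). As a minimal sketch of the detect-once-then-track pattern, hedged by the fact that the video source and initial bounding box below are placeholders and the factory function name varies between OpenCV builds:

```python
import cv2

cap = cv2.VideoCapture("flight.mp4")      # placeholder video source
ok, frame = cap.read()

# Initial bounding box (x, y, width, height) from the color-based detector.
bbox = (200, 150, 80, 60)                 # illustrative values

# Depending on the OpenCV build, the factory is cv2.TrackerKCF_create()
# or cv2.legacy.TrackerKCF_create().
tracker = cv2.TrackerKCF_create()
tracker.init(frame, bbox)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    found, bbox = tracker.update(frame)   # cheap update, no full re-detection
    if not found:
        # Tracking failure: fall back to the color-based detector to
        # re-acquire the UGV, as the proposed pipeline does.
        pass
```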
The basic idea of correlation filter tracking is the estimation of an optimal image filter such that filtering the input image produces a desired response. The desired response typically has a Gaussian shape centered at the target location, so the response decreases with the distance from the target.

The filter is trained from translated (shifted) instances of the target patch. At test time, the filter response is evaluated, and the location of its maximum gives the new target position. The filter is trained online and updated sequentially with every frame in order to adapt the tracker to moderate changes of the target position. The major advantage of the correlation filter tracker is its computational cost, since the filtering is performed efficiently in the Fourier domain [27]. Therefore the tracker runs successfully in real time, even at several hundred frames per second (FPS).
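The Fourier-domain trick can be seen in the simpler linear, single-channel case (a MOSSE-style filter rather than the kernelized filter of [27]): both training and evaluation reduce to element-wise operations on FFTs, which is what makes such frame rates possible. A sketch:

```python
import numpy as np

def train_filter(patch, sigma=2.0, lam=1e-2):
    """Closed-form linear correlation filter from one target patch."""
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Desired Gaussian-shaped response, peaked at the patch center.
    g = np.exp(-((xs - w // 2) ** 2 + (ys - h // 2) ** 2) / (2 * sigma ** 2))
    F = np.fft.fft2(patch)
    G = np.fft.fft2(g)
    # Ridge-regression solution: element-wise division in the Fourier domain.
    return (G * np.conj(F)) / (F * np.conj(F) + lam)

def locate(H, patch):
    """Correlation response of a new patch; the argmax is the new position."""
    response = np.real(np.fft.ifft2(H * np.fft.fft2(patch)))
    return np.unravel_index(np.argmax(response), response.shape)
```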
The UGV and UAV used for this task are shown in Fig. 2.

Fig. 2. The differential drive robot called Plato (left) used as a moving target platform, and the AR.Drone 2.0 quadcopter (right)

For the detection part, Gaussian filtering is applied for smoothing and removing possible image noise caused by the low image quality coming from the constantly moving camera. Then we take advantage of the UGV color and detect the platform by its yellow tint. We used the HSV color space to extract the yellow color, considering possible illumination changes. This step determines whether the robot is in the frame of the drone's camera. After the detection phase, the algorithm builds a bounding box around the UGV that the tracker will use in subsequent frames.
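A minimal OpenCV sketch of this detection step follows; the HSV thresholds for the yellow tint are illustrative values, not the ones used in the experiments.

```python
import cv2
import numpy as np

def detect_yellow_ugv(frame_bgr):
    """Return a bounding box (x, y, w, h) around the largest yellow blob,
    or None if no yellow region is found."""
    # Gaussian smoothing against noise from the moving, low-quality camera.
    blurred = cv2.GaussianBlur(frame_bgr, (5, 5), 0)
    hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV)
    # Illustrative yellow range (OpenCV hue spans 0-180); tune for lighting.
    mask = cv2.inRange(hsv, np.array([20, 100, 100]), np.array([35, 255, 255]))
    # OpenCV 4 signature: returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return cv2.boundingRect(max(contours, key=cv2.contourArea))
```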
As an extra check, the ORB feature detector is further used to verify that the bounding box found with the color detection actually contains the UGV (suppression of the false positive cases). It should be noted that the quality of the images from the drone's bottom camera during flight is not good enough to rely on the ORB features as the main detector; therefore, we used it as the additional one.
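One way this double check could be realized is by counting ORB keypoint matches between a stored reference view of the UGV and the candidate box; the template file name and the acceptance threshold below are assumptions for illustration, not the paper's exact procedure.

```python
import cv2

orb = cv2.ORB_create(nfeatures=500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

# Reference view of the Plato robot, captured beforehand (assumed file).
template = cv2.imread("plato_template.png", cv2.IMREAD_GRAYSCALE)
kp_t, des_t = orb.detectAndCompute(template, None)

def box_contains_ugv(frame_gray, box, min_matches=15):
    """Suppress false positives: require enough ORB matches inside the box."""
    x, y, w, h = box
    roi = frame_gray[y:y + h, x:x + w]
    kp_r, des_r = orb.detectAndCompute(roi, None)
    if des_r is None:
        return False
    matches = matcher.match(des_t, des_r)
    return len(matches) >= min_matches   # threshold is an illustrative guess
```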
B. PID Controller

As the name suggests, the PID controller consists of three elements, Proportional, Integral and Derivative, and can be represented by the formula:

u(t) = K_p e(t) + K_i \int e(t)\,dt + K_d \frac{de(t)}{dt}    (1)

where K_p, K_i and K_d are the proportional, integral and derivative gains, and e(t) is the error signal, i.e., the difference between the desired and measured values.

The basic operating principle of PID control is that it takes the measured value from an output sensor, compares it with the desired value, and then accordingly sends commands to a plant (i.e., a system) on which the desired actions are performed to optimize its behaviour. The block diagram is shown in Fig. 3.

Fig. 3. The schematic diagram of the PID controller

The PID controller coefficients make different contributions to the damping of the transient process, the stability, and the response rate of the system. The proportional term simply multiplies the error by a constant gain, aggressively trying to match the output to the input. The higher the proportional gain, the shorter the rise time; but increasing the proportional gain results in more oscillations around the steady-state value. Increasing the derivative coefficient damps the system oscillations, which is valuable for developing stable systems; a further increase, however, can over-damp the system, driving up its rise time to the steady-state value. The integral coefficient is responsible for the accumulation of input and output errors over time, improving the steady-state error of the system [28].
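Equation (1) maps directly onto a few lines of code. A minimal discrete-time sketch follows; the simple time-step handling and the absence of anti-windup are simplifications of ours, not the paper's controller.

```python
class PID:
    """Minimal discrete-time PID controller implementing Eq. (1)."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt                        # integral term
        derivative = 0.0
        if self.prev_error is not None:
            derivative = (error - self.prev_error) / dt    # derivative term
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# A PD controller, as used for the x-y plane below, is the case ki = 0.
```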
For the UAV control, we applied two controllers: a PD controller for the quadcopter position in the horizontal plane, and a PID controller for the altitude. This choice is primarily motivated by the continuous change of the trajectory in the x-y plane, whereas the altitude should remain constant over time.

The error compensation in the x-y quadcopter trajectory (Fig. 4) is performed with a PD controller, whose proportional and derivative actions cause the error to converge to zero. Adding the integral term causes offset accumulation over time and a deviation in the final response, so we decided to exclude it, considering that the drone already demonstrated a robust behavior. Thus, the control equations for UAV stabilization in the horizontal plane are the following:

u_x(t) = K_p e_x(t) + K_d \frac{d}{dt} e_x(t)    (2)

u_y(t) = K_p e_y(t) + K_d \frac{d}{dt} e_y(t)    (3)

The control of the drone's flight altitude is executed with a PID controller, where there is also an integral term responsible for bringing the error to zero. The UAV command scheme allows controlling the linear velocities in each direction, therefore the input signal will be proportional to the above-mentioned velocities [29]:

u_z(t) = K_{pz} e_z(t) + K_{iz} \int_0^t e_z(t)\,dt + K_{dz} \frac{d}{dt} e_z(t)    (4)

where e_z(t) = z^* - z is the difference between the desired altitude z^* and the actual altitude z (given by the sensor measurements).
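Combining Eqs. (2)-(4), one control step per frame might look like the sketch below, reusing the PID class sketched above; the image resolution, set-points and gains are illustrative assumptions.

```python
# Per-frame control step: the pixel error drives the horizontal PD
# controllers (Eqs. 2-3), the altitude error drives the PID (Eq. 4).
# Assumes the PID class from the previous sketch.
pd_x = PID(kp=0.4, ki=0.0, kd=0.1)    # illustrative gains
pd_y = PID(kp=0.4, ki=0.0, kd=0.1)
pid_z = PID(kp=0.8, ki=0.05, kd=0.2)

IMG_W, IMG_H = 640, 360    # assumed bottom-camera resolution
Z_DESIRED = 1.5            # assumed target altitude, meters

def control_step(ugv_u, ugv_v, altitude, dt):
    """Return (vx, vy, vz) velocity commands from the tracked UGV position."""
    ex = (ugv_u - IMG_W / 2.0) / (IMG_W / 2.0)   # normalized image errors
    ey = (ugv_v - IMG_H / 2.0) / (IMG_H / 2.0)
    ez = Z_DESIRED - altitude
    return pd_x.update(ex, dt), pd_y.update(ey, dt), pid_z.update(ez, dt)
```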
Fig. 4. Local frame representation of the UAV

We use different PID coefficients for the altitude and the x-y position control: the altitude, being a sensor measurement, is quite different in scale from the desired x-y position, which is obtained from the computer vision algorithm used to track the UGV. These coefficients were tuned in real time while controlling the UAV; by rigorously analyzing the stability of the copter and the data obtained from the IMU, we changed the scale of the coefficients each time the quadcopter exhibited a weak or overly strong response. All coefficients are adapted for indoor flight.

IV. SYSTEM SETUP

For the execution of the tracking algorithm and the feedback control loop, together with the communication with the hardware, we used the following system configuration, containing an AR.Drone 2.0 quadcopter, the differential drive mobile robot called Plato, and a laptop with Ubuntu 16.04 (Xenial Xerus) as the operating system. The AR.Drone 2.0 quadcopter is equipped with an onboard 720p 30 fps HD front camera, an ARM Cortex A8 32-bit 1 GHz processor with an 800 MHz TMS320DMC64x DSP video processor, a gyroscope, an accelerometer, a magnetometer, a pressure sensor, an altitude ultrasound sensor and a vertical QVGA 60 FPS camera to measure the ground speed.

The Plato mobile platform is equipped with an onboard Jetson TX1 computer, bumpers, sonars, a Troyka IMU module, and wheel encoders that allow calculating the wheel odometry. The robot's onboard computer runs Ubuntu 16.04 with the Robot Operating System - ROS Kinetic Kame [30]. The hardware and software system configuration is presented in Table I.

TABLE I
THE SPECIFICATION OF THE COMPUTATIONAL SYSTEM

System            Configuration
Laptop            MacBook Pro (15-inch, 2017)
Operating System  macOS Mojave
RAM               16 GB
CPU               2.8 GHz Intel Core i7

The processing of the frame images captured by the quadcopter's bottom camera is performed using a laptop with ROS Kinetic Kame packages. Moreover, the Ardrone Autonomy package [31] from the ROS library is used for communicating between the quadcopter and the laptop, computing the UGV position, reading the sensor data from the AR.Drone, and sending the corresponding commands (calculated by a pilot controller) in order to track the UGV. In addition, the OpenCV (Open Source Computer Vision) Library [32] is applied for detecting and tracking the UGV using the Kernelized Correlation Filters (KCF) algorithm [27]. The main advantage of this algorithm is that it uses very few, easy-to-set parameters.
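In ROS terms, this wiring is a single node that subscribes to the drone's bottom-camera images and publishes velocity commands. A skeleton is sketched below; the topic names follow the ardrone_autonomy conventions but should be checked against the installed driver.

```python
#!/usr/bin/env python
import rospy
from cv_bridge import CvBridge
from geometry_msgs.msg import Twist
from sensor_msgs.msg import Image

class UgvTrackerNode:
    def __init__(self):
        self.bridge = CvBridge()
        # Topic names follow ardrone_autonomy conventions (assumed here).
        self.cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('/ardrone/bottom/image_raw', Image, self.on_image)

    def on_image(self, msg):
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        # ... run detection/tracking on `frame`, then the PD/PID step ...
        cmd = Twist()
        # cmd.linear.x, cmd.linear.y, cmd.linear.z <- controller outputs
        self.cmd_pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('ugv_tracker')
    UgvTrackerNode()
    rospy.spin()
```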
V. SIMULATION AND EXPERIMENTS

A. Simulation of UGV detection and tracking by a quadcopter

The first phase of experiments was performed using the Gazebo simulator, which allowed studying the quadcopter behaviour while detecting and tracking the UGV. Fig. 5 shows a screenshot of the simulation performed. The task was to develop a robust and accurate algorithm for detection and tracking, using the proposed methodology and a modified version of the Turtlebot as the UGV prototype. The experiments with the Turtlebot in the Gazebo simulation show the robustness of the ground platform detection and tracking by the quadcopter's camera in different conditions and environments. After developing a robust system in the simulation, we transferred the developed algorithm implementation to the real hardware experiment.

Fig. 5. Gazebo-based simulation of UGV detection and tracking by a quadcopter. The figure shows the initial copter position after taking off and the live streaming from the quadcopter's bottom camera

B. Experiments with Plato mobile robot detection and tracking by AR.Drone 2.0 quadcopter

We hereby discuss the results of the experiments with the teleoperated Plato mobile robot, which was detected and tracked with the proposed algorithm implemented on the AR.Drone 2.0 quadcopter, as demonstrated in Fig. 8 and in the video available at [33]. The system is able to recover from a tracking failure (Fig. 6) and renew the UGV detection if the platform appears in the camera's field of view again (Fig. 7).

Fig. 6. Video streaming acquired from the AR.Drone 2.0 bottom camera reporting a tracking failure due to the absence of the UGV in the frame

Fig. 7. Video streaming acquired from the AR.Drone 2.0 bottom camera showing the UGV detection (shown in the red rectangle, where the blue point represents the position sent to the controller)

Fig. 8. The photo of AR.Drone 2.0 following the Plato mobile robot, and the live video streaming from the copter (in the bottom-right part) with the red bounding box around the object of interest [33]
Performing the experiments, we faced some problems due to unpredictable and uncontrollable factors such as noise in the camera images and hardware failures. In addition, because of the AR.Drone's "auto-hovering" feature, which uses the bottom camera for auto-stabilization, the quadcopter was not available for full control; this is why we disabled this feature for the further tests.

The test results for the feedback controller led to quite different behavior compared to the Gazebo simulation. In fact, the UAV was initially not able to accurately follow the desired trajectories and altitude. To solve this problem we tuned the PID controller parameters for the various control schemes (altitude and translation control), and finally obtained a robust behavior in adequacy with our specifications.

Finally, the landing system (Fig. 9), which is realized by a simple decrease of the drone's altitude while tracking the UGV platform, was developed for landing on the moving mobile robot. Using the altitude sensor makes the landing completely autonomous. The problem comes at a certain altitude, where the system is not able to detect the UGV landing platform anymore because of the shrinking of the camera's field of view. Moreover, the UGV platform being quite small, it was impossible to perform a safe landing on it, so we decided to land in the vicinity of the vehicle. We defined a circular area of radius 1.2 meters around the ground vehicle where the quadcopter should perform a safe landing, and after numerous tests we managed to create a robust and desired behaviour in the landing process.
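The landing behaviour thus reduces to: descend while tracking, and trigger the final landing once low enough and inside the 1.2 m circle. A sketch with assumed thresholds and an assumed hold-on-loss policy, neither of which is specified in the paper:

```python
DESCENT_RATE = -0.2    # m/s while tracking; illustrative value
LAND_ALTITUDE = 0.4    # m; assumed height at which the UGV leaves the FoV
LANDING_RADIUS = 1.2   # m; the safe-landing circle around the ground vehicle

def landing_step(tracking_ok, altitude, horizontal_dist):
    """Return (vz, land_now): descend while tracking the UGV, and issue the
    final landing command when low and within the 1.2 m circle."""
    if altitude <= LAND_ALTITUDE and horizontal_dist <= LANDING_RADIUS:
        return 0.0, True            # send the landing command
    if tracking_ok:
        return DESCENT_RATE, False  # keep descending over the target
    return 0.0, False               # hold altitude until the UGV is re-detected
```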
Fig. 9. AR.Drone 2.0 landed in the vicinity of the UGV after tracking it for a short period of time

VI. CONCLUSIONS

This paper presents the design and implementation of a system where a moving Unmanned Ground Vehicle (UGV) has to be detected and tracked by an autonomous aerial vehicle (UAV) using camera-based perception and without special artificial markers; the UAV then has to land in the UGV vicinity when a predefined signal is sent to the quadcopter. We proposed and developed a two-step methodology. In the first step, the detection and tracking of the ground vehicle are executed using a color-based algorithm and the Kernelized Correlation Filters (KCF) tracker, respectively. In the second step, a PID controller for robust tracking is employed using the UGV position detected in the first step. The process is repeated until a specified landing command is sent to the quadcopter. The UGV position detected in the image frame is used in the feedback control loop of the quadcopter's PD controller, enabling the actual tracking of the moving UGV in the horizontal plane, while another PID controller is employed to stabilize the UAV altitude. To double check the UGV presence in a camera frame, an ORB feature detector is used (to suppress the false positive cases).

The proposed approach was initially tested in the Gazebo simulator, and then in a real environment with Parrot AR.Drone 2.0 and Plato robots (as UAV and UGV correspondingly). The quadcopter operated quite well in the tracking procedure, and the experimental results demonstrate that the proposed system can detect and track the UGV in real-time mode without artificial markers and provide robust UAV landing in the UGV vicinity.

To improve the proposed approach and compare its performance with other methods, more tests with different detection algorithms and controllers should be provided. Regarding the shrinking of the UAV field of view (FoV) while landing, one possible solution would be to modify the camera lens to obtain a fish-eye structure that increases the FoV in all directions; however, it would introduce image distortions and the need for relevant corrections.

REFERENCES

[1] S. A. Wich, "Drones and conservation," Drones and Aerial Observation: New Technologies for Property Rights, Human Rights, and Global Development, 2015.
[2] A. Sabirova, M. Rassabin, R. Fedorenko, and I. Afanasyev, "Ground profile recovery from aerial 3d lidar-based maps," in 2019 24th Conference of Open Innovations Association (FRUCT), April 2019, pp. 367–374.
[3] P. H. Nguyen, K. W. Kim, Y. W. Lee, and K. R. Park, "Remote marker-based tracking for uav landing using visible-light camera sensor," Sensors, vol. 17, no. 9, p. 1987, 2017.
[4] D. Borreguero, O. Velasco, and J. Valente, "Experimental design of a mobile landing platform to assist aerial surveys in fluvial environments," Applied Sciences, vol. 9, no. 1, p. 38, 2019.
[5] T. Yang, Q. Ren, F. Zhang, B. Xie, H. Ren, J. Li, and Y. Zhang, "Hybrid camera array-based uav auto-landing on moving ugv in gps-denied environment," Remote Sensing, vol. 10, no. 11, p. 1829, 2018.
[6] P. Tokekar, J. Vander Hook, D. Mulla, and V. Isler, "Sensor planning for a symbiotic uav and ugv system for precision agriculture," IEEE Transactions on Robotics, vol. 32, no. 6, pp. 1498–1511, 2016.
[7] T. Tomic, K. Schmid, P. Lutz, A. Domel, M. Kassecker, E. Mair, I. L. Grixa, F. Ruess, M. Suppa, and D. Burschka, "Toward a fully autonomous uav: Research platform for indoor and outdoor urban search and rescue," IEEE Robotics & Automation Magazine, vol. 19, no. 3, pp. 46–56, 2012.
[8] T. Samad, J. S. Bay, and D. Godbole, "Network-centric systems for military operations in urban terrain: The role of uavs," Proceedings of the IEEE, vol. 95, no. 1, pp. 92–107, 2007.
[9] E. T. Bokeno, T. M. Bort, S. S. Burns, M. Rucidlo, W. Wei, D. L. Wires et al., "Package delivery by means of an automated multi-copter uas/uav dispatched from a conventional delivery vehicle," Jul. 14 2016, US Patent App. 14/989,870.
[10] K. L. Koster, "Delivery platform for unmanned aerial vehicles," Jun. 25 2015, US Patent App. 14/560,821.
[11] B. Saha, E. Koshimoto, C. C. Quach, E. F. Hogge, T. H. Strom, B. L. Hill, S. L. Vazquez, and K. Goebel, "Battery health management system for electric uavs," in 2011 Aerospace Conference. IEEE, 2011, pp. 1–9.
[12] A. Kendall, N. N. Salvapantula, and K. Stol, "On-board object tracking control of a quadcopter with monocular vision," 05 2014, pp. 404–411.
[13] S.-R. Oh, K. Pathak, S. K. Agrawal, H. R. Pota, and M. Garrett, "Autonomous helicopter landing on a moving platform using a tether," in Robotics and Automation, 2005. ICRA 2005. Proceedings of the 2005 IEEE International Conference on. IEEE, 2005, pp. 3960–3965.
[14] D. Falanga, A. Zanchettin, A. Simovic, J. Delmerico, and D. Scaramuzza, "Vision-based autonomous quadrotor landing on a moving platform," in 2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), Oct 2017, pp. 200–207.
[15] S. Saripalli, J. F. Montgomery, and G. S. Sukhatme, "Visually guided landing of an unmanned aerial vehicle," IEEE Transactions on Robotics and Automation, vol. 19, no. 3, pp. 371–380, 2003.
[16] T. Zhao and H. Jiang, "Landing system for ar.drone 2.0 using onboard camera and ros," in Guidance, Navigation and Control Conference (CGNCC), 2016 IEEE Chinese. IEEE, 2016, pp. 1098–1102.
[17] D. Gurdan, J. Stumpf, M. Achtelik, K.-M. Doth, G. Hirzinger, and D. Rus, "Energy-efficient autonomous four-rotor flying robot controlled at 1 khz," in IEEE ICRA, 2007, pp. 361–366.
[18] K. Watanabe, Y. Iwatani, K. Nonaka, and K. Hashimoto, "A visual-servo-based assistant system for unmanned helicopter control," in Intelligent Robots and Systems, 2008. IROS 2008. IEEE/RSJ International Conference on. IEEE, 2008, pp. 822–827.
[19] L. C. Mak and T. Furukawa, "A 6 dof visual tracking system for a miniature helicopter," in 2nd International Conference on Sensing Technology (ICST). Citeseer, 2007, pp. 32–37.
[20] S. Hay, J. Newman, and R. Harle, "Optical tracking using commodity hardware," in Proc. 7th IEEE/ACM International Symposium on Mixed and Augmented Reality. IEEE Computer Society, 2008, pp. 159–160.
[21] K. E. Wenzel, A. Masselli, and A. Zell, "Automatic take off, tracking and landing of a miniature uav on a moving carrier vehicle," Journal of Intelligent & Robotic Systems, vol. 61, no. 1-4, pp. 221–238, 2011.
[22] M. Gavrilenkov, I. Danilov, and I. Afanasyev, "Ros-based visual autonomous landing method for a multirotor drone," MDPI Robotics, 2019.
[23] K. Shabalina, A. Sagitov, L. Sabirova, H. Li, and E. Magid, "Artag, apriltag and caltag fiducial systems comparison in a presence of partial rotation: Manual and automated approaches," in ICINCO. Springer, 2017, pp. 536–558.
[24] H. Grabner, M. Grabner, and H. Bischof, "Real-time tracking via on-line boosting," in BMVC, vol. 1, no. 5, 2006, p. 6.
[25] B. Babenko, M.-H. Yang, and S. Belongie, "Visual tracking with online multiple instance learning," in 2009 IEEE Conference on Computer Vision and Pattern Recognition. IEEE, 2009, pp. 983–990.
[26] Z. Kalal, K. Mikolajczyk, and J. Matas, "Tracking-learning-detection," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 34, no. 7, pp. 1409–1422, 2012.
[27] J. F. Henriques, R. Caseiro, P. Martins, and J. Batista, "High-speed tracking with kernelized correlation filters," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 37, no. 3, pp. 583–596, 2015.
[28] K. J. Åström and T. Hägglund, PID Controllers: Theory, Design, and Tuning. Instrument Society of America, Research Triangle Park, NC, 1995, vol. 2.
[29] R. C. G. Szafranski, Different Approaches of PID Control UAV Type Quadrotor. Silesian University of Technology, Akademicka St 16, Gliwice, Poland, 2011.
[30] M. Quigley, K. Conley, B. Gerkey, J. Faust, T. Foote, J. Leibs, R. Wheeler, and A. Y. Ng, "Ros: an open-source robot operating system," in ICRA Workshop on Open Source Software, vol. 3, no. 3.2. Kobe, Japan, 2009, p. 5.
[31] M. Monajjemi et al., "ardrone autonomy: A ros driver for ardrone 1.0 & 2.0," 2012.
[32] G. Bradski, "The OpenCV Library," Dr. Dobb's Journal of Software Tools, 2000.
[33] "Experiments with plato mobile robot detection and tracking by ar.drone 2.0 quadcopter," https://youtu.be/9nc1TGTTwhY, accessed: 2019-06-15.
