Article · March 2015


Quadcopter with camera visual tracking

Gregor Tonic, University of Ljubljana, Ljubljana, Slovenia. Email: gregor.tonic@globalook.com
Marko Sandar, University of Ljubljana, Ljubljana, Slovenia. Email: marko.sandar@globalook.com
Philip K. Kataay, University of Ljubljana, Ljubljana, Slovenia. Email: philip.kataay@globalook.com

Abstract—With the rising popularity of quadcopters and similar flying devices, more and more possibilities are opening up for various computer vision applications. Quadcopters often feature advanced high-resolution cameras and can stream images on-the-fly over wireless networks to a stationary computer system. In this paper we present a method for autonomous tracking and following of an object with a camera-equipped quadcopter. Tracking is done with the popular object tracker CamShift and is fast enough for the quadcopter to follow the object in real time. The advanced quadcopter Walkera Scout X4 was used to evaluate and confirm that the algorithm works well under various light and atmospheric conditions.

Keywords—Quadcopter with camera, tracking, drone

I. INTRODUCTION

Nowadays, in the field of mobile robotics, quadcopters have become very popular. Quadcopters are flying devices similar to helicopters, but with four propellers instead of just one. Such devices can feature multiple sensors, such as GPS, a height sensor, and an accelerometer, but also more advanced devices such as a video camera or a wifi hotspot. They are used for a range of applications and can be found in the military, sporting events, and the entertainment industry. In this paper we present an application of tracking objects in space.

A. Related work

Object tracking belongs to the field of computer vision, specifically video analysis. A lot of research has been conducted in this field. Video analysis is also used in a number of applications in video surveillance, video editing, object tracking in sports, medicine, games, and virtual reality.

Algorithms for object tracking in video give information about the location of the object in every image of a sequence. Usually this information is given in pixels, together with the size of the region that covers the object in the image. In connection with computer vision, autonomous navigation in space with quadcopters is especially interesting. The navigation is also supported by GPS and other sensors present on the device. Using a laser and localization algorithms, a map can be built that supports the quadcopter in its autonomous movements.

II. OBJECT TRACKING

For our tracking purposes we used the simple, yet fairly robust tracker CamShift. The CamShift algorithm (Continuously Adaptive Mean Shift) was derived from the similar MeanShift algorithm. Initially it was intended for face tracking, but it can also be used for any other object.

The CamShift algorithm first locates the center of the region using MeanShift and then calculates the rotation and size of the object. MeanShift is an iterative algorithm which finds the region in the image that best matches our reference image. The search is based on the similarity of color histograms of regions in the image. The HSV color space is used, specifically the H (hue) component. For every image element, the probability that the element belongs to the tracked region is calculated, and a probability distribution map is created. MeanShift then locates the maximum on the map and thus the center of the region. To prevent the algorithm from using too bright and too dark pixels, border values are determined for the H component of the image. The algorithm then dynamically adjusts the color distribution of the target region across sequential images. After CamShift locates the region in the image that best fits the reference, it remembers the color distribution of that region and uses it when searching the next image. As the similarity measure d(K1, K2) between histograms, the Bhattacharyya distance is used:

    d(K1, K2) = ( 1 − Σ_{n=1}^{N} √( K1(n) · K2(n) ) )^{1/2}.

The inner square root represents the scalar product of the histograms K1 and K2, represented as vectors. The variable N represents the number of bins in the histograms. The sum is equal to the cosine of the angle between the vectors K1 and K2.

III. OBJECT FOLLOWING

In this section we describe the procedure our quadcopter uses to autonomously regulate its flight according to the location of the object in the image. As a 3D object in space translates to a 2D object in the camera, we have to use the received information to estimate the object's position in space.

A. Camera calibration

The first part of the translation requires understanding how an image is formed from 3D space. From the intrinsic parameters of the camera K, the rotation matrix R, and the translation vector t, we obtain the model matrix that is used to perform the projection

    P = K · [R | t],

where the calibration matrix consists of the elements fx, fy, cx, and cy. The calibration process is intended to obtain the intrinsic parameters of the camera. These include the focal length, the position of the image center, the margin of the elements, and the distortion of the image. We see that the focal length and the image center are needed to produce the calibration matrix.
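As an illustrative sketch (the numeric intrinsics fx, fy, cx, cy and the 3D point below are hypothetical values, not the calibrated parameters of our camera), the projection P = K · [R|t] can be computed with NumPy as follows:

```python
import numpy as np

# Hypothetical intrinsics: fx, fy are focal lengths in pixels,
# (cx, cy) is the principal point (image center).
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

R = np.eye(3)                         # camera not rotated
t = np.zeros((3, 1))                  # camera at the origin

P = K @ np.hstack((R, t))             # projection matrix P = K * [R | t], shape (3, 4)

X = np.array([0.5, 0.25, 2.0, 1.0])   # homogeneous 3D point (x, y, z, 1)
x = P @ X                             # homogeneous image coordinates
u, v = x[0] / x[2], x[1] / x[2]       # divide by depth to get pixel coordinates
print(u, v)                           # -> 520.0 340.0
```

The division by the third homogeneous coordinate is what maps a point at depth z to its pixel position; doubling the depth halves its offset from the image center.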
B. Real-time estimation of object position

Once we have a calibrated system, the easiest part is the adjustment of yaw and pitch to keep the object centered and to maintain a reasonable distance from the object, obtained from an estimation of the object's size in the image.

On the horizontal axis we keep the object centered by adjusting the tilt (yaw rotation) y′. This can simply be passed as a motion parameter to the quadcopter's motion processor and is calculated as

    1 − y′ = (x − w/2) / (w/2),

where x represents the current x-coordinate of the object in the image and w represents the full width of the image.

To achieve greater stability of the quadcopter during flight, we use hysteresis, which means we only change the position of the object in the image if x lies between the chosen xmin and xmax.

Our assumption here was that the object height never changes and that the quadcopter always stays on the same vertical plane, which is not always the case, since speeding up also produces some pitch. The principle of vertical adjustment is very similar to the yaw adjustment. We use hysteresis here as well, with the difference that it is set on a different axis and positioned slightly below the center of the image. We have to use such positioning because changes of the object in space are not visible in the image if the center of the camera lies at the same height as the center of the object.

IV. EXPERIMENT

We evaluated our method on the modern, full-featured quadcopter Walkera Scout X4, displayed in Fig. 1.

Fig. 1. Walkera Scout X4 quadcopter

We presented it with various trackable objects and observed the speed and accuracy of its response. It turned out it was able to follow most of the objects regardless of light conditions and the speed of movement.

REFERENCES

[1] E. Altug, J. Ostrowski, R. Mahony, "Control of a quadrotor helicopter using visual feedback," in Proceedings of the IEEE International Conference on Robotics and Automation, IEEE, 2002, pp. 72-77.
[2] K. LaFleur, et al., "Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain-computer interface," Journal of Neural Engineering 10.4 (2013): 046003.
[3] A. Bachrach, A. de Winter, R. He, G. Hemann, S. Prentice, N. Roy, "RANGE - Robust Autonomous Navigation in GPS-denied Environments," IEEE International Conference on Robotics and Automation, May 3-8, 2010, Anchorage, Alaska, USA, pp. 1096-1097.
[4] L. Derafa, A. Ouldali, T. Madani, A. Benallegue, "Four Rotors Helicopter Yaw and Altitude Stabilization," Proceedings of the World Congress on Engineering 2007, vol. I, WCE 2007, July 2-4, 2007, London, U.K.
[5] M. Achtelik, et al., "Visual tracking and control of a quadcopter using a stereo camera system and inertial sensors," Mechatronics and Automation, 2009. ICMA 2009. International Conference on. IEEE, 2009.
[6] T. Luukkonen, "Modelling and control of quadcopter," Independent research project in applied mathematics, Espoo (2011).
[7] Quadcopter with camera, Walkera Scout X4, accessed April 2015.
[8] M. Y. Chen, et al., "Designing a spatially aware and autonomous quadcopter," Systems and Information Engineering Design Symposium (SIEDS), 2013 IEEE. IEEE, 2013.
[9] M. A. Olivares-Mendez, et al., "Quadcopter see and avoid using a fuzzy controller," Proceedings of the 10th International FLINS Conference on Uncertainty Modeling in Knowledge Engineering and Decision Making (FLINS 2012). World Scientific, 2012.
[10] G. R. Bradski, "Computer Vision Face Tracking For Use in a Perceptual User Interface," Intel Technology Journal Q2 '98, Microcomputer Research Lab, Santa Clara, CA, Intel Corporation.
[11] J. G. Allen, R. Y. D. Xu, J. S. Jin, "Object tracking using CamShift algorithm and multiple quantized feature spaces," Proceedings of the Pan-Sydney Area Workshop on Visual Information Processing. Australian Computer Society, Inc., 2004.
[12] H. Chu, et al., "Object tracking algorithm based on CamShift algorithm combined with difference in frame," Automation and Logistics, 2007 IEEE International Conference on. IEEE, 2007.
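For illustration, the yaw relation and the hysteresis check described in Section III-B can be sketched as follows (the image width and band limits below are hypothetical, not the values used in our experiments; the formula is taken literally from the relation 1 − y′ = (x − w/2)/(w/2)):

```python
def yaw_offset(x, w):
    """Normalized yaw parameter y' from the relation 1 - y' = (x - w/2) / (w/2)."""
    return 1.0 - (x - w / 2.0) / (w / 2.0)

def within_band(x, x_min, x_max):
    """Hysteresis band check: the command is only updated for x in [x_min, x_max]."""
    return x_min <= x <= x_max

w = 640                              # hypothetical image width in pixels
print(yaw_offset(480, w))            # object right of center -> 0.5
print(within_band(330, 300, 340))    # inside the chosen band -> True
```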
