
UAV Assisted Landing on Moving UGV

Yves Bergeon*, Radek Doskočil, Václav Křivánek, Jean Motsch* and Alexandr Štefek
* École de Saint-Cyr, Guer Cedex, France, e-mail: {firstname.lastname}@st-cyr.terre-net.defense.gouv.fr
University of Defence, Brno, Czech Republic, e-mail: {firstname.lastname}@unob.cz

Abstract - The paper presents a method to assist the landing of an Unmanned Aerial Vehicle (UAV) on a moving Unmanned Ground Vehicle (UGV). The measurement of the relative position of the UAV and the UGV is carried out by a cheap USB camera. The paper lists the difficulties such a system has to take into account and develops ideas on how to overcome them. The solution of the key problems is implemented and tested by experiments.

Keywords - unmanned vehicles; optical sensors; assisted landing

I. INTRODUCTION

Nowadays the use of unmanned vehicles is quite common. Small and miniature Unmanned Aerial Vehicles (UAVs) have recently become very popular, especially multicopters for intelligence, surveillance and reconnaissance tasks. The real problem of such a deployment is mainly energy: common platforms provide about 10 - 15 minutes in the air, which limits the UAV flying range to distances of about two kilometres. Of course it is possible to attach another battery pack, but at the expense of the ability to carry other equipment, which reduces the payload.

On the other hand, the unmanned ground vehicle (UGV) has no such problem with its energy source. A UGV can use a combustion engine as a propulsion unit, so UGVs are not limited in their range of use. Moreover, a UGV can carry really heavy equipment without any real problem. Conversely, the ability of a UGV to explore a larger area in a short time is limited.

This difference suggests cooperation on a mission. While the UGV plays the role of transport and base, the UAV focuses on the main mission aim. The standard action sequence of a mission begins with the UGV moving near the goal with the UAV onboard. The UAV then takes off, completes its mission and returns to the UGV (landing on it) at the end of the mission. The UGV then moves on to the next goal. As the general mission structure is very complex, this paper focuses only on the landing phase. This phase (manoeuvre) is very difficult for the UAV operator to control remotely after a stressful mission [1].

There are many systems that could be used for this task (a GPS receiver, for example). The main limit is the accuracy and fluctuation of the measured data. Better results are given by a laser range-finder or an ultrasonic sensor; however, this is not a full-fledged solution, and information about the UGV orientation is still missing [2]. Therefore, an onboard camera seems to be a very convenient sensor [3, 4, 5].

The next issue is the layout of the UGV-UAV couple. The main sensor could be on board the UGV, monitoring the motion of the UAV. This could be a problem, because a huge space around the UGV has to be considered; a tracking system is one solution. Conversely, equipping the UAV with a camera when it is not needed for the mission reduces the available payload.

One of the primary reasons for the current interest in small UAVs is that they offer an inexpensive platform to carry visual (VIS) and infrared (IR) cameras. Almost all small and miniature air vehicles that are currently deployed carry either a VIS or an IR camera. While the camera's primary use is to relay information to a user, it makes sense to attempt to also use the camera for the purpose of navigation, guidance and control. Further motivation comes from the fact that birds and flying insects use vision as their primary guidance sensor [6].

Critical to the creation and realization of UAV assisted landing (generally for small UAVs or miniature air vehicles, Micro Aerial Vehicles, MAVs) is the development of small, lightweight solid-state sensors. Based on Micro-Electro-Mechanical Systems (MEMS) technology, small but accurate sensors such as accelerometers, angular rate sensors and pressure sensors have enabled the development of increasingly smaller and more capable autonomous aircraft, as well as of specific functions such as assisted landing. Coupled with the development of small global positioning system (GPS) receivers, computationally capable microcontrollers and more powerful batteries, the capabilities of UAVs have gone from being purely radio controlled by pilots on the ground to highly autonomous systems in less than 25 years [7].

The onboard sensors typically used for guidance, navigation and control of UAVs are:

- accelerometers,
- rate gyros,
- pressure sensors (absolute and differential),
- magnetometers (digital compass),
- GPS [8].

But there are also payload sensors, such as VIS or IR cameras, which have great potential for navigation (or specifically for assisted landing) [9].

An essential topic is the errors of the sensors and their effect on guidance, navigation and control of UAVs. It is not the topic of this paper; if needed, the reader may find information about the errors in [7].

A. A Few Notes on the Sensors

In MAV applications, three accelerometers are commonly used. The accelerometers are mounted near the centre of mass,

The work presented in this paper has been supported by the Ministry of
Defence of the Czech Republic (Project DZRO K208).
with the sensitive axis of one accelerometer aligned with each of the body axes. Accelerometers measure the specific force in the body frame of the MAV, i.e. the difference between the acceleration of the MAV and the gravitational acceleration.

MEMS rate gyros typically operate on the principle of the Coriolis acceleration. When measuring the angular velocity with rate gyros, errors appear, and it is necessary to know some values (a gain and a bias term of the rate gyro) to correct the measurements. To ensure accurate measurements, the value of this gain should be determined through experimental calibration. The bias term depends strongly on temperature and should be calibrated prior to each flight. For low-cost MEMS gyros, drift in this bias term can be significant, and care must be taken to zero the gyro bias periodically during flight.

Pressure sensors provide indications of the altitude of the MAV and the airspeed of the MAV. To measure altitude, we will use an absolute pressure sensor. To measure airspeed, we will use a differential pressure sensor.

The earth's magnetic field continues to provide navigation information for a variety of vehicles, including UAVs. A compass measures the direction of the magnetic field locally and provides an indication of the heading relative to magnetic north. To know true north, we have to calculate a declination angle by using models (e.g. NGDC) for the given latitude and longitude. Magnetometers and digital compasses suffer from the sensitivity of the sensor to electromagnetic interference, so careful placement of the sensor on the aircraft is essential to avoid interference from electric motors, servos, power wiring and power lines.

Modern digital compasses (containing magnetometers, signal conditioning and a microcontroller) use three-axis magnetometers to measure the strength of the magnetic field along three orthogonal axes. In UAV applications, these measurement axes are usually aligned with the body axes of the aircraft, although only two sensing axes are required to measure the magnetic heading if the aircraft is in horizontal flight. A third axis is necessary if the aircraft pitches or rolls out of the horizontal plane.

The global positioning system (GPS) provides 3D position and velocity information for objects on or near the earth's surface. The accuracy of a GPS position measurement is affected by the accuracy of the satellite pseudorange measurements and by the geometry of the satellites from which the pseudorange measurements are taken.

II. BASIC PRINCIPLES OF LANDING

For simplicity, a UGV with a differential drive was taken into account. This type of drive allows easy orientation, as the drive is able to change rotation without moving (in place). A flying multicopter can change orientation too. Because the measurement is done on the UGV, the idea is to control the displacement by the UGV [10, 11].

A. Coordinate Definition

Let's assume that the UGV defines the coordinate origin. Then, according to Fig. 1, the UAV has coordinates [x, -y, z]. The UAV board coordinate system is rotated by the angle φ.

[Figure 1. Basic coordinate relations between UGV and UAV]

The UGV differential drive has to follow those differences and correct them in real time to allow safe landing on the UGV.

B. UGV Motion

Eq. 1 and Eq. 2 show the differential equations of the UGV drive:

    ω(R + l/2) = Vl
    ω(R - l/2) = Vr    (1)

    R = (l/2) (Vl + Vr) / (Vl - Vr)
    ω = (Vl - Vr) / l    (2)

    [x']   [(1/2)cos φ   (1/2)cos φ]   [Vl]
    [y'] = [(1/2)sin φ   (1/2)sin φ] . [Vr]    (3)
    [φ']   [   1/l          -1/l   ]

where Vl and Vr are the wheel velocities along the ground, l is the distance between the wheels, x and y are the coordinates of the robot chassis in the fixed coordinate system, and the prime denotes the time derivative.

C. Projective Transformation on Sensor

For the UAV observation an optical sensor (a camera) is used. The image sensed by the camera can be calculated with a projective transformation. In general 3D space the transformation can have the form

    [xs, ys] = [xf/z, yf/z]    (4)

where [xs, ys] are the coordinates of a point in the camera image, [x, y, z] are the original coordinates according to Fig. 1 and f is the adapted focal length. This form can be expressed as a projective matrix (see Eq. 5) [12, 13]:

    [xp ]   [1  0   0   0]   [x]
    [yp ] = [0  1   0   0] . [y]
    [zp ]   [0  0   1   0]   [z]    (5)
    [z/f]   [0  0  1/f  0]   [1]
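To make the projection concrete, here is a minimal Python sketch (illustrative only; the function names are ours, not from the paper's implementation) that computes the image coordinates both directly, as in Eq. 4, and via the 4x4 projective matrix of Eq. 5, assuming a point in front of the camera (z != 0):

```python
def project_point(p, f):
    """Direct pinhole projection of Eq. 4: (x, y, z) -> (x*f/z, y*f/z)."""
    x, y, z = p
    return (x * f / z, y * f / z)

def project_homogeneous(p, f):
    """The same projection written with the 4x4 projective matrix of Eq. 5."""
    x, y, z = p
    m = [[1, 0, 0,       0],
         [0, 1, 0,       0],
         [0, 0, 1,       0],
         [0, 0, 1.0 / f, 0]]
    v = [x, y, z, 1]
    xp, yp, zp, w = [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]
    # The last homogeneous coordinate is w = z/f; dividing by it
    # recovers the image coordinates of Eq. 4.
    return (xp / w, yp / w)
```

Both forms agree; for example, a point 2 m in front of a camera with f = 0.5 at (0.4, 0.1, 2.0) projects to (0.1, 0.025).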

III. SENSING THE UAV

For a first approach to the landing assistance system, let's imagine that active lights are mounted on the UAV (Fig. 2). These active lights help to recognize the position (distance and orientation) of the UAV. This helps the UGV to set up its orientation and position in cooperation with the UAV [14].

[Figure 2. Schema of LED source of UAV and camera of UGV]

Firstly, the active lights must be recognized. The lights themselves are blinking, which simplifies their identification in the images of a video stream: the difference between two consecutive images can reveal the blinking lights. As the difference is an easy operation, it can be computed even on lightweight computers such as microcontrollers. Fig. 3 depicts this difference calculation.

Even if the difference can be easily computed, finding the position of a single light is still a problem with a time-consuming solution.

[Figure 3. Use of captured image difference. A) Tested system with all lights switched off. B) Active light as a result of image processing. C) Tested system with the left upper LED on]

IV. POSITION ADAPTATION

While there is a difference between the UAV and UGV positions, the UGV is navigated to eliminate this difference. According to Eq. 2, the UGV has an initial position and orientation. The desired position is defined too. Eqs. 1-3 allow us to define the robot path on a circle segment. Moreover, the circle segment is fully defined by the starting position, the starting orientation and the final position. To derive the wheel velocities, the radius of the circle segment is needed. The situation is depicted in Fig. 4.

[Figure 4. Navigation of the UGV into the desired position (A: starting point, B: target point, S: circle centre, r: radius)]
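The frame-difference detection of blinking lights described in Section III can be sketched in a few lines of Python (an illustrative fragment of ours, not the paper's onboard implementation; frames are modelled as plain lists of grayscale rows and the threshold value is a hypothetical choice):

```python
def frame_difference(prev, curr, threshold=40):
    """Absolute difference of two grayscale frames; pixels that changed
    by more than `threshold` are kept as candidates for the blinking
    LED, everything else is zeroed (cf. Fig. 3B)."""
    return [[abs(c - p) if abs(c - p) > threshold else 0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

def light_centroid(diff):
    """Centroid of the non-zero difference pixels: a cheap estimate of
    the LED position, feasible even on a microcontroller."""
    total = sx = sy = 0
    for yi, row in enumerate(diff):
        for xi, v in enumerate(row):
            total += v
            sx += v * xi
            sy += v * yi
    if total == 0:
        return None      # LED not visible between this frame pair
    return (sx / total, sy / total)
```

A centroid over the thresholded difference is only one cheap way to turn the binary result into a light position; noisy outdoor frames would call for more robust blob detection.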
A is the starting point and B the target point. The starting orientation is φ; the robot path is defined by a circle with centre S and radius r.

The tangent of the circle at the point A = [ax, ay] can be defined by

    [x, y] = [k cos φ + ax, k sin φ + ay]    (6)

and the axis of the line AB by the parametric equation

    [x, y] = [l(by - ay) + (ax + bx)/2, -l(bx - ax) + (ay + by)/2]    (7)

where l is here the line parameter, not the wheel distance. The centre S of the circle has coordinates which satisfy both equations. If k is evaluated, the centre S has distance from A

    d = √((k cos φ)² + (k sin φ)²) = k    (8)

So the parameter k is directly the distance from the point A, and the radius r of the circle is then k too. Deriving from the previous equations, k can be evaluated as

    k = ((by - ay)² + (bx - ax)²) / (2 ((bx - ax) sin φ - (by - ay) cos φ))    (9)

or in a simpler form

    k = (1/2) ‖AB‖² / (AB × G)    (10)

where G = (cos φ, sin φ) and AB × G denotes the 2D cross product (bx - ax) sin φ - (by - ay) cos φ. According to Eq. 1 and Eq. 2, the velocities of the wheels can be expressed as

    Vl = Vr (2R + l) / (2R - l)    (11)

where R = k and l is the distance between the wheels. With the use of the state space equation, the control can be expressed as

    [x']   [(1/2)cos φ   (1/2)cos φ]   [((2R + l)/(2R - l)) Vr]
    [y'] = [(1/2)sin φ   (1/2)sin φ] . [          Vr          ]    (12)
    [φ']   [   1/l          -1/l   ]

If the control reaches the situation when the robot is already oriented towards the point B, this orientation must be maintained. It can be detected by the condition

    (bx - ax) sin φ - (by - ay) cos φ = 0    (13)

V. IMPLEMENTATION

The implementation needs camera picture taking and pre-processing to generate XUAV and YUAV, which will be the inputs to the control system (see Fig. 5). The control system can then be implemented as a simple numeric method.

The code, expressed in the C# language, is depicted in Fig. 6. The algorithm is fully based on Eq. 9 and Eq. 10. As the robot can be oriented towards the destination point, there are conditions which test this orientation and keep the velocities Vl and Vr equal.

VI. CONCLUSION

The action area of a UAV is very small because of its limited energy autonomy (and in an open-air environment the UAV is under the influence of wind gusts, which reduce the autonomy further). The method we proposed allows an easy cooperation between a UAV and a UGV to increase the scope of the UAV during its mission. The simplicity of our method (a difference of images to find the blinking infrared LED) allows us to have only a small computer system onboard.

In this article the relation between the UAV motion and the UGV adaptation to allow landing was described. The adaptation depends on sensor information manipulation and on the UGV's ability to control its position. A model of the sensor system was presented and the analysis of the UGV motion was carried out.

Figure 5. Structure of the proposed system (X, Y and φ are outputs)
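The circle-segment geometry of Section IV, which the C# routine in Fig. 6 implements, can be cross-checked with a short Python sketch (ours, for illustration; the function and variable names are hypothetical):

```python
import math

def arc_radius(a, b, phi):
    """Eq. 9/10: signed radius k of the circle that passes through A
    and B and is tangent at A to the heading phi.
    Returns None when A is already oriented towards B (Eq. 13 holds),
    in which case the robot should drive straight."""
    abx, aby = b[0] - a[0], b[1] - a[1]
    # Denominator of Eq. 9: the cross product AB x G, G = (cos phi, sin phi).
    cross = abx * math.sin(phi) - aby * math.cos(phi)
    if abs(cross) < 1e-9:
        return None
    return 0.5 * (abx * abx + aby * aby) / cross

def wheel_velocities(vr, k, l):
    """Eq. 11 with R = k: left wheel velocity needed to follow the arc,
    given the right wheel velocity vr and wheel distance l."""
    vl = vr * (2 * k + l) / (2 * k - l)
    return vl, vr
```

For example, from A = (0, 0) with heading φ = 0 to B = (0, -2), the connecting arc has radius k = 1; with the target straight ahead at B = (2, 0), Eq. 13 is satisfied and `arc_radius` returns None.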


private void CalculateVelocity(int basicVelocity)
{
    // Default: rotate in place (Vl = Vr) until a valid arc is found.
    Vr = -basicVelocity;
    Vl = Vr;
    double abx = destinationX - x;
    double aby = destinationY - y;
    // Stop when the destination is closer than one unit.
    if ((abx * abx + aby * aby) < 1)
        Vr = 0;
    Vl = Vr;
    double sin = Math.Sin(phi);
    double cos = Math.Cos(phi);
    // Denominator of Eq. 9: the cross product AB x G.
    double temp = (abx * sin - aby * cos);
    if (Math.Abs(temp) > 1e-3)
    {
        // Eq. 9: radius k of the circle segment.
        double k = 0.5 * (aby * aby + abx * abx) / temp;
        radius = (float)k;
        if (Math.Abs(2 * k - l) > 1e-3)
            Vl = Vr * (2 * k + l) / (2 * k - l);   // Eq. 11 with R = k
    }
}

Figure 6. Calculation of velocities for both motors

VII. FUTURE EVOLVING

Precision landing is a challenge for our team. We are considering the use of appropriate modern guidance methods that appear to be effective guidance strategies against the moving UGV [15]. The problem of navigation has been treated using several fundamental methodologies. The classical approach is to apply proportional navigation; the modern approach to guidance is based upon optimal control theory (e.g. linear-quadratic one-sided optimization) and differential games (e.g. linear-quadratic differential games) [16].

REFERENCES

[1] R. Doskocil, J. Hosek, V. Krivanek, A. Stefek, Y. T. Bergeon, Stereo Vision for Teleoperated Robot. In Proceedings of the 16th International Conference on Mechatronics - Mechatronika 2014, Brno: University of Technology, 2014, pp. 511-518. ISBN 978-80-214-4817-9.
[2] R. Vitek, L. Jedlicka, Effect of the Accuracy of the Target Range Measurement on the Accuracy of Fire. Advances in Military Technology, 5(2), pp. 69-83, 2010.
[3] L. Qiang, Z. Wenhui, L. Jilin, Realization of Odometry System Using Monocular Vision. In International Conference on Computational Intelligence and Security, pp. 1841-1844. 10.1109/ICCIAS.2006.295383, 2006.
[4] D. Scaramuzza, F. Fraundorfer, Visual Odometry, Part I and II. IEEE Robotics & Automation Magazine, pp. 80-92, Dec. 2011 and pp. 78-90, Jun. 2012.
[5] E. Stella, F. P. Lovergine, L. Caponetti, A. Distante, Mobile Robot Navigation Using Vision and Odometry. In Intelligent Vehicles '94 Symposium, pp. 417-422. 10.1109/ISIC.1995.525115, 1994.
[6] J. H. Evers, Biological Inspiration for Agile Autonomous Air Vehicles. In Symposium on Platform Innovation and System Integration for Unmanned Air, Land, and Sea Vehicles, NATO Research and Technology Organization AVT-146, paper No. 15, Florence, Italy, 2007.
[7] R. W. Beard, T. W. McLain, Small Unmanned Aircraft: Theory and Practice. Princeton University Press, 2012.
[8] J. Borenstein, H. Everett, L. Feng, Where am I? Sensors and Methods for Mobile Robot Positioning. Technical report, The University of Michigan, 1996.
[9] F. Chenavier, J. L. Crowley, Position Estimation for a Mobile Robot Using Vision and Odometry. In IEEE International Conference on Robotics and Automation, volume 3, pp. 2588-2593. 10.1109/ROBOT.1992.220052, 1992.
[10] P. Kutilek, J. Hozman, Determining the Position of Head and Shoulders in Neurological Practice with the Use of Cameras. Acta Polytechnica, Volume 51, Issue 3, pp. 32-38, 2011.
[11] J. Hejda, P. Kutilek, J. Hozman, R. Cerny, System for Precise Measurement of Head and Shoulders Position. IFMBE Proceedings, Volume 41, pp. 1555-1558, 2014.
[12] A. de Bourdonnaye, R. Doskocil, V. Krivanek, A. Stefek, Practical Experience with Distance Measurement Based on the Single Visual Camera. Advances in Military Technology, 2012, vol. 8, no. 1. ISSN 1802-2308.
[13] Y. T. Bergeon, Calculation of the Distance Covered by a Robot Thanks to Image Analysis with a Two-robot Team. In ICMT'11 International Conference on Military Technologies, volume 35, pp. 849-854. Brno, 2011.
[14] R. Doskocil, J. Fischer, V. Krivanek, A. Stefek, Measurement of Distance by Single Visual Camera at Robot Sensor Systems. In IEEE Conference 15th Mechatronika, pp. 143-149, Praha, 2012.
[15] S. N. Balakrishnan, A. Tsourdos, B. A. White, Advances in Missile Guidance, Control, and Estimation. CRC Press, Taylor and Francis Group, Great Britain, 2013.
[16] J. Z. Ben-Asher, I. Yaesh, Advances in Missile Guidance Theory. Vol. 180, Progress in Astronautics and Aeronautics. American Institute of Aeronautics and Astronautics, 1998.
