
Estimation of UAV position with use of thermal infrared images

Wanessa da Silva
National Institute for Space Research, São José dos Campos, SP, Brazil
Email: wanessa.colaborador@ieav.cta.br

Elcio Hideiti Shiguemori
Institute for Advanced Studies, São José dos Campos, SP, Brazil
Email: elcio@ieav.cta.br

Nandamudi Lankalapalli Vijaykumar
National Institute for Space Research, São José dos Campos, SP, Brazil
Email: vijay.nl@inpe.br

Haroldo Fraga de Campos Velho
National Institute for Space Research, São José dos Campos, SP, Brazil
Email: haroldo@lac.inpe.br

Abstract: The use of Unmanned Aerial Vehicles has increased and become indispensable for many applications where human intervention is exhausting, dangerous or expensive. With this increase in UAV employment, autonomous navigation has been the subject of several studies. For this purpose, several systems have been used, among them image processing, which is an alternative to the Global Positioning System. The employment of images in an autonomous navigation system poses challenges, among them night flight. In this context, this article presents a study to estimate the UAV's geographical position with the use of infrared images. From the infrared image, a search is made in a georeferenced satellite image in the visible band. To automatically register the aerial and satellite images, edge information extracted by Artificial Neural Networks is used. The artificial neural network is automatically configured with the Multiple Particle Collision Algorithm. Furthermore, the estimation of the UAV's position is obtained by calculating a correlation index. The results are promising for employment in night autonomous navigation.

I. INTRODUCTION

The use of Unmanned Aerial Vehicles (UAVs) has enhanced many applications. UAVs with onboard cameras can provide important information for different scenarios: monitoring forest fires [1], search and rescue [2], environmental monitoring [3], and border surveillance [4], among others. Some research with UAVs is focused on autonomous navigation without a GNSS (Global Navigation Satellite System) signal, such as GPS (Global Positioning System) or GLONASS (GLObal NAvigation Satellite System) [5], [6], [7], [8], [9]. This research is motivated by the fact that the UAV is susceptible to signal blocking or signal interference [6], [10], [9], [11]. Some technological alternatives have been proposed [7], [12], [13], [14], [15], one of which is based on image processing.

UAV positioning is one application of onboard image processing. Some authors ([16], [6], and [17]) have used UAV and satellite images to perform image registration. In these papers, image edges are employed in the following ways:

- different operators were applied for edge extraction: the Sobel algorithm [16], the Canny algorithm [17], and artificial neural networks (Multi-Layer Perceptron MLP, Radial Basis Function RBF, and Cellular Neural Network CNN) [6];
- different spectral bands were used for the images: the visible band [16], [6], [17] and synthetic aperture radar (SAR) images [17].

In this paper, UAV infrared images are used to develop a positioning algorithm. The UAV images are combined with satellite images in the visible electromagnetic spectrum, thus enabling the aircraft to fly at night or in low-light conditions.

II. UAV AUTONOMOUS NAVIGATION

The approach widely used for UAV navigation is the combination of an Inertial Navigation System (INS) and a GNSS such as GPS or GLONASS. The main problems of this methodology are:

- GNSS can suffer blockage or signal interference; and
- INS has an error that increases over time and requires a mechanism to correct this error.

One alternative to the previous approach is the use of image processing associated with the INS to correct the error. This approach follows the steps below:

- Step 1: real-time capture of an image of the overflown terrain and matching with a georeferenced satellite image;
- Step 2: the result of the matching between the images (first step) is the aircraft position. The INS error can be corrected by this result; and
- Step 3: after the INS error correction, the UAV position is also corrected.

III. AUTONOMOUS NAVIGATION BASED ON THERMAL IMAGES

A. Infrared Thermal Images

Infrared thermal images have the following properties:

- they are usually in grayscale;
- black corresponds to cold objects and white to hot objects; and
- the grayscale indicates temperature variations.
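As a rough illustration of Steps 1 to 3 in Section II, the sketch below blends a drifting INS estimate with the position recovered by image matching. The function name `correct_ins_position` and the blending `gain` are assumptions made for illustration only; the paper does not specify the fusion rule.

```python
import numpy as np

def correct_ins_position(ins_position, matched_position, gain=1.0):
    """Step 3: pull the drifting INS estimate toward the position
    recovered by image matching (Steps 1 and 2).

    gain=1.0 fully replaces the INS estimate; smaller values blend the
    two sources. This blend is a placeholder, not the paper's method."""
    ins_position = np.asarray(ins_position, dtype=float)
    matched_position = np.asarray(matched_position, dtype=float)
    return ins_position + gain * (matched_position - ins_position)

# Hypothetical numbers: the INS has drifted 30 m east of the position
# recovered by matching the onboard image against the satellite image.
ins_estimate = (1030.0, 500.0)   # metres
image_match = (1000.0, 500.0)    # metres, result of Steps 1 and 2
corrected = correct_ins_position(ins_estimate, image_match)
```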
1) Thermal Infrared Sensor: Electromagnetic radiation consists of the interaction of electric and magnetic fields that propagate in vacuum at a speed of 299,792 km/s and are generated, e.g., by thermal excitation. Electromagnetic energy refers to its interaction with the atmosphere and with the targets or objects; it can be scattered or absorbed by the atmosphere [18].

In the thermal infrared region, the electromagnetic radiation covers the wavelength range of 3-14 µm, as shown in Figure 1, differing from the visible radiation. The most important spectral bands for thermal infrared airborne sensors are within the range 3-5 µm, which is especially useful for monitoring hot targets such as forest fires, and the region 8-14 µm, used to monitor vegetation, soil and rock [19].

Fig. 1. Electromagnetic spectrum, infrared radiation is highlighted.

A thermal infrared sensor is a device able to collect, detect and transduce the thermal infrared radiation emitted by the target. These thermal imagers are divided basically into two groups: Infrared Linescanner (IRLS) and Forward Looking Infrared (FLIR). In this work the infrared sensor used was a FLIR, adapted to image the target at nadir. The sensor has the characteristics listed in Table I.

TABLE I. MAIN FEATURES OF THE SENSOR EMPLOYED

Spectral range: 3.4-4.8 µm
Detector material: InSb
Image size: 640 x 512 pixels
Frame rate: 25 Hz

B. Artificial Neural Networks in Edge Detection

Artificial neural networks can be used to detect edges in place of traditional methods. Compared to conventional methods such as Sobel, Roberts and Canny, ANNs can adapt to the images they are applied to and have the advantage of reducing the effect of noise in the image [20], [21]. Artificial Neural Networks are computational techniques inspired by the functioning of the biological neuron. The main similarities are:

- knowledge is acquired through an iterative learning process, called a training algorithm; and
- the connections between the basic units of operation store knowledge.

In this study, two ANNs are used in the edge detection process: MLP and RBF.

1) Radial Basis Function Neural Network: RBFs are trainable models that approximate functions, for example to implement an input/output mapping (interpolation). The performance of an RBF depends on the number of centers and on the form of the radial basis functions [22]. The RBF architecture has three layers:

- the first layer (input layer) is composed of non-computational sensory neurons;
- a single hidden layer of computational neurons, whose activation function is a radial basis function that applies a non-linear transformation to the input data; and
- the third layer is the output layer, also formed by computational neurons, with a linear activation function that applies a linear transformation to the data [23].

In this study, the hidden layer activation function is the Gaussian function (1):

    φ(v) = exp(−v² / (2σ²))    (1)

where v = ||x − µ||, µ is the center of the radial function and σ is the width of the radial function. The value of the standard deviation σ of the Gaussian function (1) is obtained by:

    σ = d_max / √(2 N_centers)    (2)

where d_max is the maximum distance between the centers and N_centers is the number of centers. Training an RBF consists of determining the parameters µ (centers) and σ (standard deviation), the radial function, and the weights (connection values) between the hidden layer and the output layer [23]. To determine these parameters, this paper employs the self-organized center selection algorithm [23].

2) Multilayer Perceptron Neural Network: The MLP can be seen as a non-linear classifier; it maps a set of input data to a linearly separable space [23]. The MLP has the following characteristics: the network is fully connected; it stores knowledge in the connections between neurons, called synaptic weights or just weights; it has an input layer consisting of non-computational neurons; it is composed of one or more hidden layers formed by computational neurons (which have an activation function); and it has an output layer composed of computational neurons, whose output is the response of the MLP. In this article, the error back-propagation algorithm is used for training the MLP. It is based on adjusting the weights to minimize the difference between the desired response and the output of the network. The error back-propagation algorithm consists of two steps [23]:

- propagation: the synaptic weights of the MLP are fixed in this phase. The network produces an output whose value is subtracted from the desired response, producing an error signal [23]; and
- backpropagation: the error calculated in the propagation phase is back-propagated through the MLP, layer after layer, to correct the synaptic weights according to a correction rule [23]. The synaptic weights are adjusted to make the ANN response similar to the desired response.
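The Gaussian hidden-layer activation of Equations (1) and (2) can be sketched as follows. This is a minimal illustration, not the paper's implementation, and the center set in the example is arbitrary:

```python
import numpy as np

def rbf_hidden_layer(x, centers):
    """Gaussian activations of Equation (1) for one input vector x.

    The shared width sigma follows Equation (2):
    sigma = d_max / sqrt(2 * N_centers), where d_max is the maximum
    distance between centers (assumes at least two distinct centers)."""
    centers = np.asarray(centers, dtype=float)
    n_centers = len(centers)
    d_max = max(np.linalg.norm(a - b) for a in centers for b in centers)
    sigma = d_max / np.sqrt(2.0 * n_centers)
    v = np.linalg.norm(centers - np.asarray(x, dtype=float), axis=1)  # v = ||x - mu||
    return np.exp(-(v ** 2) / (2.0 * sigma ** 2))

# Two centers one unit apart: d_max = 1, so sigma = 1/2. An input that
# sits exactly on a center gives that unit v = 0 and activation 1.
acts = rbf_hidden_layer([0.0, 0.0], [[0.0, 0.0], [1.0, 0.0]])
```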

3) Multiple Particle Collision Algorithm: ANNs have been successfully applied in several areas of knowledge, but it is still a challenge to determine an optimal architecture for an ANN [23]. Several strategies for the automatic configuration of an ANN have been proposed. The main parameters that need to be adjusted are: the number of neurons in each layer; the number of hidden layers; the activation function; the learning rate; and the momentum rate. One approach to automatically configure an ANN is the Multiple Particle Collision Algorithm (MPCA) [24].

MPCA, an extension of the Particle Collision Algorithm (PCA) proposed by [25], is a meta-heuristic introduced by [26] that seeks the optimum of an objective function. The problem of finding an optimal architecture for an MLP can be formulated as finding the parameters that optimize an objective function; thus, the research developed in [24] adopted the objective function for an ANN defined by [27]. With this modification, MPCA can be used to find the optimal architecture of the MLP.

In this research, the architecture of the MLP has been determined automatically by the MPCA.

4) Training and Activation of the Artificial Neural Networks: In this research, the RBF and MLP are employed as edge detection algorithms. Ten patterns are considered for training; each pattern corresponds to a 3 x 3 binary matrix, with 8 edge patterns and 2 non-edge patterns. Figure 2 illustrates the ten patterns and the desired response for each pattern.

Fig. 2. Edge and non-edge patterns employed in training.

For the activation phase, the image whose edges are to be extracted is covered by a 3 x 3 mask, from left to right and top to bottom, pixel by pixel. Each element of this window is an input to the ANN, which presents the response:

- if the ANN response is smaller than a threshold, the presented pattern represents an edge, and the central pixel of the mask receives 1;
- if the output of the ANN is greater than or equal to the threshold, it is a non-edge pattern, and the central pixel of the mask receives 0.

Figure 3 shows the ANN activation process for detecting edges in images.

Fig. 3. ANN activation process for detecting edges.

5) Estimation of the UAV Position: Using the edge information, the geographic position of the UAV is estimated by computing the correlation index between the aerial image captured by the UAV and the georeferenced satellite image. The highest value of the resulting matrix indicates the position of the UAV in the satellite image. The correlation coefficient is given by Equation (3):

    c(s, t) = Σ_x Σ_y f(x, y) w(x − s, y − t)    (3)

where c(s, t) is the correlation matrix, with indices s = 0, 1, ..., M − 1 and t = 0, 1, ..., N − 1; M and N are the dimensions of the matrix f containing the satellite image edges; and the matrix w, with dimensions J x K (J ≤ M and K ≤ N), contains the aerial image edges.

Thus, at the highest correlation c between the matrices w and f, the central point of the aerial image w coincides with the point on the satellite image f that corresponds to the location of the UAV. As the satellite image is georeferenced, the coordinates of this point are known and the UAV's position can be estimated.

C. Methodology

Figure 4 illustrates the methodology used in this study.
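The sliding-mask activation described in Section III-B (a 3 x 3 window scanned pixel by pixel, with the ANN response thresholded into edge or non-edge) can be sketched as below. Since the trained network's weights are not given in the paper, `toy_net` is a stand-in response function used only to exercise the scanning logic, not the paper's MLP or RBF:

```python
import numpy as np

def detect_edges(image, net_response, threshold):
    """Scan a 3x3 mask over the image, left to right and top to bottom.
    The nine window values feed the network; a response below the
    threshold marks the central pixel as edge (1), otherwise 0.
    Border pixels, where the mask does not fit, stay 0."""
    img = np.asarray(image, dtype=float)
    out = np.zeros(img.shape, dtype=np.uint8)
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            window = img[i - 1:i + 2, j - 1:j + 2].ravel()  # 9 ANN inputs
            if net_response(window) < threshold:
                out[i, j] = 1
    return out

# Stand-in for the trained MLP/RBF (NOT the paper's network): responds
# low on high-contrast windows, so they are flagged as edges.
toy_net = lambda w: 1.0 / (1.0 + np.ptp(w))

# Vertical step edge: left half dark (0), right half bright (255).
img = np.zeros((5, 6))
img[:, 3:] = 255.0
edges = detect_edges(img, toy_net, threshold=0.5)
```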
Fig. 4. Methodology.

The methodology presented in Figure 4 is divided into two phases: processing the georeferenced satellite image and processing the aerial image obtained by the UAV.

The satellite image processing follows these steps:

- selection of the Quickbird satellite image: a subimage containing the entire UAV flight, considering the area around the position flown by the UAV as estimated by the INS;
- grayscale conversion and Gaussian filtering to remove noise from the image;
- edge extraction.

The processing of the images captured by the UAV follows these steps:

- selection of the image obtained by the UAV;
- grayscale conversion and correction of scale and rotation. The information required for the rotation correction relative to the satellite image may be obtained from sensors on the UAV, and the scale correction can be made with altimeter information. In this study, the scale was corrected in preprocessing with the UAV altimeter information; the rotation, however, is handled during image registration, because no UAV compass information was available;
- image cropping around the location of the UAV and Gaussian filtering to remove noise from the image;
- edge detection.

After edge extraction in the satellite and aerial images, the correlation index between the georeferenced satellite image and the image captured by the UAV is computed by Equation (3). Then, the position of the UAV is estimated.

D. Images Obtained by the UAV

In the experiments, images from an aerial videography campaign held in November 2014 in Brazil were used. The flight was conducted at a constant altitude, and the images were captured in nadir view. The sensor used was the FLIR infrared sensor.

The Quickbird satellite image was used as a reference. Table II presents its spatial, spectral and radiometric resolutions. The chosen image, obtained in August 2014, covers the same region imaged by the UAV.

TABLE II. RESOLUTION AND COMPOSITION OF SATELLITE BANDS (QUICKBIRD)

Panchromatic spatial resolution: 0.61 m
Multispectral spatial resolution: 2.44 m
Panchromatic spectral resolution: 0.45-0.90 µm
Band 1 - Blue, spectral resolution: 0.45-0.52 µm
Band 2 - Green, spectral resolution: 0.52-0.60 µm
Band 3 - Red, spectral resolution: 0.63-0.69 µm
Radiometric resolution: 11 bits (2048 gray levels)

The calculation of the correlation coefficient used images of 500 x 800 pixels for the satellite image and 117 x 117 pixels for the aerial image. The aerial image has an odd size, so its central pixel marks the position of the UAV. Figure 5 illustrates both images used for obtaining the results.

Fig. 5. Selected region. (A) Infrared aerial image. (B) Satellite image in the visible bands.

IV. RESULTS

Figure 6 illustrates the result of the edge extraction process using ANN on the same region of the satellite image in the visible band and of the aerial image in the infrared band.
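The correlation search of Equation (3), applied above to the 500 x 800 satellite and 117 x 117 aerial edge images, can be sketched directly in NumPy: slide the aerial edge matrix w over the satellite edge matrix f, take the offset with the highest correlation, and read off the central pixel. The toy edge maps below are illustrative stand-ins for the paper's binary edge images:

```python
import numpy as np

def correlate_position(f, w):
    """Equation (3): raw cross-correlation of the satellite edge matrix
    f (M x N) with the aerial edge matrix w (J x K, J <= M, K <= N).
    Returns the pixel of f under the centre of w at the best offset."""
    f = np.asarray(f, dtype=float)
    w = np.asarray(w, dtype=float)
    J, K = w.shape
    M, N = f.shape
    c = np.empty((M - J + 1, N - K + 1))
    for s in range(c.shape[0]):
        for t in range(c.shape[1]):
            c[s, t] = np.sum(f[s:s + J, t:t + K] * w)
    s, t = np.unravel_index(np.argmax(c), c.shape)
    return s + J // 2, t + K // 2  # central pixel = estimated position

# Toy binary edge maps standing in for the 500 x 800 satellite image
# and the 117 x 117 aerial image used in the paper.
f = np.zeros((20, 30))
f[8:11, 12:15] = 1.0        # edge pattern centred at pixel (9, 13)
w = np.zeros((5, 5))
w[1:4, 1:4] = 1.0           # same pattern centred in the aerial image
position = correlate_position(f, w)
```

Since the satellite image is georeferenced, the returned pixel can be mapped to geographic coordinates, which is what yields the UAV position estimate.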
Fig. 6. Edge extraction obtained by the MLP whose architecture was determined by MPCA. (A) Result for the aerial image. (B) Result for the satellite image.

Figure 7 illustrates the correlation process and the estimation of the UAV position.

Fig. 7. Estimation of the UAV position using the correlation between the aerial image and the satellite image. (A) Aerial image in the infrared band. (B) Satellite image, with the aerial image highlighted in black, the position of the UAV in blue and the position estimated by the correlation in red.

The real position of the UAV in the satellite image was determined visually in order to calculate the error in meters. For the error calculation, the Euclidean distance between the real position and the estimated point is employed. The result of this calculation is a value in pixels that is multiplied by the spatial resolution of the Quickbird panchromatic band, which is 0.61 m.

TABLE III. UAV POSITION ESTIMATION ERROR

                            Canny     MLP      RBF      MLP (MPCA)
Estimation error (meters)   2.4739    1.9132   1.8974   1.8974

V. CONCLUSION

Based on the obtained results, it can be seen that it is possible to estimate the position of the UAV using images in the infrared band. It was also observed that aerial images in the infrared band can be compared with satellite images in the visible band when the edges are extracted by ANNs. Furthermore, for images obtained by different sensors, the ANNs exhibit a lower position estimation error than the Canny algorithm.

The study developed here is applicable to autonomous navigation systems for UAVs. This approach will allow the UAV system to estimate its position on night flights, without the use of a GNSS signal for localization, by using satellite images in the visible band.

REFERENCES

[1] G. F. S. M. Dantas, "Enxame de VANTs para a detecção de incêndios florestais," Recife, p. 66, 2014.
[2] A. N. Chaves, "Proposta de modelo de veículos aéreos não tripulados (VANTs) cooperativos aplicados à operação de busca," São Paulo, p. 149, 2013.
[3] G. A. Longhitano, "VANTs para sensoriamento remoto: aplicabilidade na avaliação e monitoramento de impactos ambientais causados por acidentes com carga perigosa," São Paulo, p. 163, 2010.
[4] R. O. Andrade, "O voo do falcão," Pesquisa FAPESP, vol. 211, pp. 64-69, 2013.
[5] S. Rathinam, P. Almeida, Z. W. Kim, S. Jackson, A. Tinka, W. Grossman, and R. Sengupta, "Autonomous searching and tracking of a river using an UAV," Proceedings of the 2007 American Control Conference, July 2007, pp. 359-364.
[6] G. A. M. Goltz, E. H. Shiguemori, and H. F. C. Velho, "Position estimation of UAV by image processing with neural networks," X Congresso Brasileiro de Inteligência Computacional (CBIC), 2011, pp. 1-6.
[7] V. Tchernykh, M. Beck, and K. Janschek, "Optical flow navigation for an outdoor UAV using a wide angle mono camera and DEM matching," 4th IFAC Symposium on Mechatronic Systems, 2006, pp. 1-6.
[8] M. A. P. Domiciano, E. H. Shiguemori, and D. L. A. V., "Automatic estimation of altitude of the reference points for aerial autonomous navigation using aerial photographs and characteristic points," 10th World Congress on Computational Mechanics, 2012, pp. 1-8.
[9] A. Cesetti, E. Frontoni, A. Mancini, P. Zingaretti, and S. Longhi, "A vision-based guidance system for UAV navigation and safe landing using natural landmarks," Journal of Intelligent and Robotic Systems, vol. 57, pp. 233-253, 2010.
[10] C. R. M. Souza, M. H. C. Dias, and J. C. A. Santos, "Análise da vulnerabilidade de receptores GPS comerciais sob ação de interferência intencional," VII SIGE: Simpósio de Guerra Eletrônica, 2006, pp. 1-10.
[11] F. Caballero, L. Merino, J. Ferruz, and A. Ollero, "Improving vision-based planar motion estimation for unmanned aerial vehicles through online mosaicing," IEEE International Conference on Robotics and Automation, 2006, pp. 2860-2865.
[12] F. Caballero, L. Merino, J. Ferruz, and A. Ollero, "Vision-based odometry and SLAM for medium and high altitude flying UAVs," Journal of Intelligent and Robotic Systems, vol. 1, pp. 1-9, 2009.
[13] O. Amidi, T. Kanade, and K. Fujita, "A visual odometer for autonomous helicopter flight," Fifth International Conference on Intelligent Autonomous Systems (IAS-5), 1998, pp. 185-193.
[14] P. I. Corke, P. Sikka, and J. M. Roberts, "Height estimation for an autonomous helicopter," Experimental Robotics VII, 2000, pp. 101-110.
[15] S. Saripally and G. Sukhatme, "Landing on a moving target using an autonomous helicopter," International Conference on Field and Service Robotics, 2003, pp. 1-7.
[16] D. Conte and P. Doherty, "An integrated UAV navigation system based on aerial image matching," 2008, pp. 1-10.
[17] Z. Sjanic, "Navigation and SAR auto-focusing in a sensor fusion framework," Linköping, Sweden, p. 97, 2011.
[18] A. L. N. Coelho, "Sensoriamento remoto infravermelho termal: contribuições para o estudo do clima urbano," XIII Simpósio Nacional de Geografia Urbana, 2013, pp. 1-18.
[19] J. R. Jensen, Remote Sensing of the Environment: An Earth Resource Perspective, 2nd ed. Upper Saddle River, NJ: Pearson Prentice Hall, 2007.
[20] A. J. Pinho and L. B. Almeida, "Edge detection filters based on artificial neural networks," IEEE Computer Society Press, 1995, pp. 159-164.
[21] H. Wang, X. Ye, and W. Gu, "Training a neural network for moment based image edge detection," Journal of Zhejiang University SCIENCE, vol. 1, pp. 398-401, 2000.
[22] L. S. Coelho, "Rede neural com função de base radial com treinamento usando filtro de Kalman e entropia cruzada aplicada à identificação de sistemas," VIII Simpósio Brasileiro de Automação Inteligente (SBAI), 2007, pp. 1-6.
[23] S. Haykin, Neural Networks: A Comprehensive Foundation, 2nd ed. Upper Saddle River, NJ, USA: Prentice Hall, 1999.
[24] J. A. Anochi, H. F. C. Velho, H. C. M. Furtado, and E. F. P. Luz, "Self-configuring two types neural networks by MPCA," 2nd International Symposium on Uncertainty Quantification and Stochastic Modeling, 2014.
[25] W. F. Sacco, D. C. Knupp, and A. J. S. Neto, "Estimation of radiative properties with the particle collision algorithm," Inverse Problems, Design and Optimization Symposium, 2007.
[26] E. F. P. Luz, J. C. Becceneri, and H. F. C. Velho, "A new multi-particle collision algorithm for optimization in a high performance environment," Journal of Computational Interdisciplinary Sciences, 2008.
[27] E. H. Shiguemori, H. F. C. Velho, J. D. S. Silva, and J. C. Carvalho, "Neural network based models in the inversion of temperature vertical profiles from radiation data," Proceedings of the Inverse Problems, Design and Optimization (IPDO-2004) Symposium, 2004, pp. 543-556.
