Abstract—The use of Unmanned Aerial Vehicles has increased and become indispensable for many applications where human intervention is exhausting, dangerous or expensive. With this increase in UAV employment, autonomous navigation has been the subject of several studies. For this purpose, several systems have been used, among them image processing, which is an alternative to the Global Positioning System. The employment of images in an autonomous navigation system has challenges, among them night flight. In this context, this article presents a study to estimate the UAV's geographical position with the use of infrared images. From this image, a search is made in a georeferenced satellite image in the visible band. To automatically register the aerial and satellite images, edge information extracted by Artificial Neural Networks is used. The artificial neural network is automatically configured with the use of the Multiple Particle Collision Algorithm. Furthermore, the estimation of the UAV's position is obtained by calculating the correlation index. The results are promising for employment in night autonomous navigation.

I. INTRODUCTION

The use of Unmanned Aerial Vehicles (UAVs) has enhanced many applications. UAVs with onboard cameras can provide important information in different scenarios: monitoring forest fires [1], search and rescue [2], environmental monitoring [3], border surveillance [4], among others. Some research with UAVs is focused on autonomous navigation without a GNSS (Global Navigation Satellite System) signal such as GPS (Global Positioning System) or GLONASS (GLObal NAvigation Satellite System) [5], [6], [7], [8], [9]. This research is motivated by the fact that the UAV is susceptible to signal blocking or signal interference [6], [10], [9], [11]. Some technological alternatives have been proposed [7], [12], [13], [14], [15], one of which is based on image processing.

UAV positioning is one application of onboard image processing. Some authors ([16], [6], and [17]) have used UAV and satellite images to perform image registration. In these papers, image edges are employed in the following ways:

- different operators for edge extraction were applied: the Sobel algorithm [16], the Canny algorithm [17], and artificial neural networks (Multi-Layer Perceptron, MLP; Radial Basis Function, RBF; and Cellular Neural Network, CNN) [6];
- different spectral bands were used for the images: visible band [16], [6], [17], and synthetic aperture radar (SAR) images [17].

In this paper, UAV infrared images are used to develop a positioning algorithm. The UAV images are combined with satellite images in the visible electromagnetic spectrum, thus enabling the aircraft to fly at night or in low light.

II. UAVS AUTONOMOUS NAVIGATION

The approach widely used for UAV navigation is the combination of the Inertial Navigation System (INS) and a GNSS such as GPS or GLONASS. The main problems of this methodology are:

- the GNSS can present blockage or signal interference; and
- the INS has an error that increases over time and requires a mechanism to correct this error.

One alternative to the previous approach is the use of image processing associated with the INS to correct the error. This approach follows the steps below:

- Step 1: real-time capture of an image of the overflown terrain and matching with a georeferenced satellite image;
- Step 2: the result of the matching between the images (first step) is the aircraft position. The INS error can be corrected by this result; and
- Step 3: after the INS error correction, the UAV position is corrected as well.

III. AUTONOMOUS NAVIGATION BASED ON THERMAL IMAGES

A. Infrared Thermal Images

Infrared thermal images have the following properties:

- they are usually in grayscale;
- black corresponds to cold objects and white to hot objects; and
- the grayscale indicates temperature variations.
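To make the matching of Step 1 and the correlation index mentioned in the abstract concrete, the search for an aerial patch inside a georeferenced satellite image can be sketched with a brute-force normalized cross-correlation. This is a minimal illustration under assumed inputs (small NumPy arrays), not the authors' implementation; in the paper the matching is performed on edge maps extracted by ANNs, not raw intensities.

```python
import numpy as np

def normalized_cross_correlation(patch, window):
    """Correlation index between two equally sized grayscale arrays."""
    p = patch - patch.mean()
    w = window - window.mean()
    denom = np.sqrt((p * p).sum() * (w * w).sum())
    return 0.0 if denom == 0 else float((p * w).sum() / denom)

def locate_patch(aerial, satellite):
    """Slide the aerial patch over the satellite image and return the
    (row, col) offset with the highest correlation index."""
    ph, pw = aerial.shape
    sh, sw = satellite.shape
    best, best_pos = -2.0, (0, 0)
    for r in range(sh - ph + 1):
        for c in range(sw - pw + 1):
            score = normalized_cross_correlation(
                aerial, satellite[r:r + ph, c:c + pw])
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best

# Synthetic demonstration: embed a known patch and recover its position.
rng = np.random.default_rng(0)
satellite = rng.random((60, 60))
aerial = satellite[20:30, 35:45].copy()  # "aerial" view of a known area
pos, score = locate_patch(aerial, satellite)
print(pos)  # (20, 35)
```

The recovered offset, combined with the georeference of the satellite image, is what allows the INS error to be corrected in Step 2.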
1) Thermal Infrared Sensor: Electromagnetic radiation consists of the interaction of electric and magnetic fields that propagate in vacuum at a speed of 299,792 km/s and are generated, e.g., by means of thermal excitation. Electromagnetic energy refers to its interaction with the atmosphere and with the targets or objects; it can be scattered or absorbed by the atmosphere [18].

In the thermal infrared region, the electromagnetic radiation covers a wavelength range of 3–14 μm, as shown in Figure 1, being different from visible radiation. The spectral bands most important for thermal infrared airborne sensors are within the range 3–5 μm, which is especially useful for monitoring hot targets such as forest fires, and the region 8–14 μm, used to monitor vegetation, soil and rock [19].

A thermal infrared sensor is a device able to collect, detect and transduce the thermal infrared radiation emitted by the target. These thermal imagers are basically divided into two groups: Infrared Linescanner (IRLS) and Forward Looking Infrared (FLIR). In this work the infrared sensor used was a FLIR, adapted to image the target at nadir. The sensor used has the characteristics shown in Table I.

TABLE I. MAIN FEATURES OF THE SENSOR EMPLOYED

    Spectral range           3.4–4.8 μm
    Material                 InSb
    Image size               640 x 512
    Frame generation rate    25 Hz

B. Artificial Neural Networks in Edge Detection

Artificial neural networks can be used to detect edges in place of traditional methods. Compared to conventional methods such as Sobel, Roberts and Canny, ANNs can adapt to the images employed and have the advantage of reducing the effect of noise in the image [20], [21]. Artificial Neural Networks are computational techniques inspired by the functioning of the biological neuron. The main similarities are:

- knowledge is acquired through an iterative learning process, called a training algorithm; and
- the connections between the basic units of operation store knowledge.

In this study, two ANNs are used in the edge detection process: MLP and RBF.

1) Radial Basis Function Neural Network: RBFs are models that approximate functions and can be trained, for example, to implement a mapping (interpolation) of input/output. The performance of an RBF depends on the number of centers and on the form of the radial basis functions [22]. The architecture of the RBF has three layers:

- the first layer (input layer) is composed of non-computational sensory neurons;
- a single hidden layer of computational neurons, whose activation function is a radial basis function that applies a non-linear transformation to the input data; and
- the third layer is the output layer, also formed by computational neurons, with a linear activation function that applies a linear transformation to the data [23].

In this study, the hidden layer activation function is the Gaussian function (1):

    φ(v) = exp(−v² / (2σ²))                (1)

    σ = dmax / √(2 · ncenters)             (2)

where dmax is the maximum distance between the centers and ncenters is the number of centers. The training of an RBF consists of determining the parameters (the centers and the desired standard deviation σ), the radial function and, also, the weights (values of the connections) between the hidden layer and the output layer [23]. To determine these parameters, in this paper the self-organized center selection algorithm is employed [23].

2) Multilayer Perceptron Neural Network: The MLP can be seen as a non-linear classifier: it maps a set of input data to a linearly separable space [23]. The MLP has the following characteristics: the network is fully connected; it stores knowledge in the connections between neurons, called synaptic weights or just weights; it has an input layer consisting of non-computational neurons; it is composed of one or more hidden layers formed by computational neurons (which have an activation function); and there is an output layer composed of computational neurons, whose output is the response of the MLP. In this article, the error back-propagation algorithm is used for training the MLP neural network. It is based on the adjustment of the weights to minimize the difference between the desired response and the output of the network. The error back-propagation algorithm consists of two steps [23]:

- propagation: the synaptic weights of the MLP are fixed in this phase of the algorithm. An output is produced by the network and its value is subtracted from the desired response, producing an error signal [23]; and
- backpropagation: the error calculated in the propagation phase is back-propagated through the MLP, layer after layer, to correct the synaptic weights according to a correction rule [23]. The synaptic weights are adjusted to make the ANN response similar to the desired response.
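The Gaussian activation (1) and the width rule (2) for the RBF hidden layer can be written out directly. The function names and the tiny set of centers below are illustrative assumptions, not part of the paper:

```python
import numpy as np

def rbf_width(centers):
    """Equation (2): sigma = d_max / sqrt(2 * n_centers), where d_max is
    the maximum pairwise distance between the chosen centers."""
    n = len(centers)
    d_max = max(
        float(np.linalg.norm(centers[i] - centers[j]))
        for i in range(n) for j in range(i + 1, n)
    )
    return d_max / np.sqrt(2 * n)

def gaussian_activation(x, center, sigma):
    """Equation (1): phi(v) = exp(-v^2 / (2 sigma^2)), with v = ||x - c||."""
    v = float(np.linalg.norm(x - center))
    return np.exp(-v * v / (2.0 * sigma * sigma))

# Hidden-layer outputs for one input vector and three centers.
centers = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([0.0, 2.0])]
sigma = rbf_width(centers)   # d_max = sqrt(5), n = 3 -> sigma ~ 0.913
x = np.array([0.0, 0.0])
phis = [gaussian_activation(x, c, sigma) for c in centers]
```

With the centers fixed this way, only the linear output weights remain to be fitted, which is what makes RBF training comparatively cheap.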
Fig. 6. Edge extraction obtained by the MLP, whose architecture was determined by the MPCA. (A) Result for the aerial image. (B) Result for the satellite image.

Figure 7 illustrates the correlation process and the estimation of the UAV position.

Fig. 7. Estimation of the UAV position with the use of the correlation between the aerial and satellite images. (A) Aerial image in the infrared band. (B) Satellite image, with the aerial image highlighted in black, the position of the UAV in blue and the position estimated by the correlation in red.

The real position of the UAV in the satellite image was determined visually in order to calculate the error in meters. For the error calculation, the Euclidean distance between the real position and the estimated point is employed. The result of this calculation is a value in pixels, which is multiplied by the spatial resolution of the panchromatic band of the Quickbird satellite, which is 0.6 m.

TABLE III. UAV POSITION ESTIMATION ERROR

                                 Canny     MLP      RBF      MLP (MPCA)
    Estimation error (meters)    2.4739    1.9132   1.8974   1.8974

V. CONCLUSION

Based on the obtained results, it can be seen that it is possible to estimate the position of the UAV with the use of images in the infrared band. It was also observed that aerial images in the infrared band can be compared with satellite images in the visible band, with edges extracted by ANNs. Based on the results, for images obtained by different sensors, the ANNs exhibit a lower position estimation error than the Canny algorithm.

REFERENCES

[1] G. F. S. M. Dantas, "Enxame de VANTs para a detecção de incêndios florestais," Recife, p. 66, 2014.
[2] A. N. Chaves, "Proposta de modelo de veículos aéreos não tripulados (VANTs) cooperativos aplicados à operação de busca," São Paulo, p. 149, 2013.
[3] G. A. Longhitano, "VANTs para sensoriamento remoto: aplicabilidade na avaliação e monitoramento de impactos ambientais causados por acidentes com carga perigosa," São Paulo, p. 163, 2010.
[4] R. O. Andrade, "O voo do falcão," Pesquisa FAPESP, vol. 211, pp. 64–69, 2013.
[5] S. Rathinam, P. Almeida, Z. W. Kim, S. Jackson, A. Tinka, W. Grossman, and R. Sengupta, "Autonomous searching and tracking of a river using an UAV," Proceedings of the 2007 American Control Conference, July 2007, pp. 359–364.
[6] G. A. M. Goltz, E. H. Shiguemori, and H. F. C. Velho, "Position estimation of UAV by image processing with neural networks," X Congresso Brasileiro de Inteligência Computacional - CBIC, 2011, pp. 1–6.
[7] V. Tchernykh, M. Beck, and K. Janschek, "Optical flow navigation for an outdoor UAV using a wide angle mono camera and DEM matching," 4th IFAC Symposium on Mechatronic Systems, 2006, pp. 1–6.
[8] M. A. P. Domiciano, E. H. Shiguemori, and D. L. A. V., "Automatic estimation of altitude of the references points for aerial autonomous navigation using aerial photographs and characteristic points," 10th World Congress on Computational Mechanics, 2012, pp. 1–8.
[9] A. Cesetti, E. Frontoni, A. Mancini, P. Zingaretti, and S. Longhi, "A vision-based guidance system for UAV navigation and safe landing using natural landmarks," Journal of Intelligent and Robotic Systems, vol. 57, pp. 233–253, 2010.
[10] C. R. M. Souza, M. H. C. Dias, and J. C. A. Santos, "Análise da vulnerabilidade de receptores GPS comerciais sob ação de interferência intencional," VII SIGE: Simpósio de Guerra Eletrônica, 2006, pp. 1–10.
[11] F. Caballero, L. Merino, J. Ferruz, and A. Ollero, "Improving vision-based planar motion estimation for unmanned aerial vehicles through online mosaicing," IEEE International Conference on Robotics and Automation, 2006, pp. 2860–2865.
[12] F. Caballero, L. Merino, J. Ferruz, and A. Ollero, "Vision-based odometry and SLAM for medium and high altitude flying UAVs," Journal of Intelligent and Robotic Systems, vol. 1, pp. 1–9, 2009.
[13] O. Amidi, T. Kanade, and K. Fujita, "A visual odometer for autonomous helicopter flight," Fifth International Conference on Intelligent Autonomous Systems (IAS-5), 1998, pp. 185–193.
[14] P. I. Corke, P. Sikka, and J. M. Roberts, "Height estimation for an autonomous helicopter," Experimental Robotics VII, 2000, pp. 101–110.
[15] S. Saripalli and G. Sukhatme, "Landing on a moving target using an autonomous helicopter," International Conference on Field and Service Robotics, 2003, pp. 1–7.
[16] D. Conte and P. Doherty, "An integrated UAV navigation system based on aerial image matching," 2008, pp. 1–10.
[17] Z. Sjanic, "Navigation and SAR auto-focusing in a sensor fusion framework," Linköping, Sweden, p. 97, 2011.
[18] A. L. N. Coelho, "Sensoriamento remoto infravermelho termal: contribuições para o estudo do clima urbano," XIII Simpósio Nacional de Geografia Urbana, 2013, pp. 1–18.
[19] J. R. Jensen, Remote Sensing of the Environment: An Earth Resource Perspective, 2nd ed. Upper Saddle River, NJ: Pearson Prentice Hall, 2007.
[20] A. J. Pinho and L. B. Almeida, "Edge detection filters based on artificial neural networks," IEEE Computer Society Press, 1995, pp. 159–164.
[21] H. Wang, X. Ye, and W. Gu, "Training a neural network for moment based image edge detection," Journal of Zhejiang University SCIENCE, vol. 1, pp. 398–401, 2000.
[22] L. S. Coelho, "Rede neural com função de base radial com treinamento usando filtro de Kalman e entropia cruzada aplicada à identificação de sistemas," VIII Simpósio Brasileiro de Automação Inteligente - SBAI, 2007, pp. 1–6.
[23] S. Haykin, Neural Networks: A Comprehensive Foundation, 2nd ed. Upper Saddle River, NJ, USA: Prentice Hall, 1999.
[24] J. A. Anochi, H. F. C. Velho, H. C. M. Furtado, and E. F. P. Luz, "Self-configuring two types neural networks by MPCA," 2nd International Symposium on Uncertainty Quantification and Stochastic Modeling, 2014.
[25] W. F. Sacco, D. C. Knupp, and A. J. S. Neto, "Estimation of radiative properties with the particle collision algorithm," Inverse Problems, Design and Optimization Symposium, 2007.
[26] E. F. P. Luz, J. C. Becceneri, and H. F. C. Velho, "A new multi-particle collision algorithm for optimization in a high performance environment," Journal of Computational Interdisciplinary Sciences, 2008.
[27] E. H. Shiguemori, H. F. C. Velho, J. D. S. Silva, and J. C. Carvalho, "Neural network based models in the inversion of temperature vertical profiles from radiation data," Proceedings of the Inverse Problems, Design and Optimization (IPDO-2004) Symposium, 2004, pp. 543–556.