Abstract—Sense and Avoid is gaining importance in the integration of unmanned aerial vehicles (UAVs) into civil airspace. In order to be seen by other air traffic participants operating under visual flight rules (VFR), own visibility must be ensured. This includes the observance of legal minimum cloud distances. An airborne electro-optical cloud detection and avoidance system could cope with this task on unmanned platforms. In this paper we propose an approach for cloud distance estimation using computer vision algorithms. In addition to the functional architecture, results from a simulation environment are shown and the demands for future developments are addressed.

Keywords—UAV Sense and Avoid, cloud distance estimation

I. INTRODUCTION

The integration of unmanned aircraft into civil airspace remains a challenge. While systems such as the Traffic Collision Avoidance System (TCAS) are used to avoid collisions in controlled airspaces, in which air traffic participants mainly operate according to instrument flight rules (IFR), pilots flying in uncontrolled airspaces according to visual flight rules (VFR) are responsible for avoiding other aircraft and for ensuring their own visibility. Depending on the airspace used, certain minimum cloud distances must therefore be respected. Our application, unmanned air cargo transportation, is expected to take place primarily in the lower airspace. Thus, a Sense & Avoid system that is able to detect and avoid clouds is essential on-board equipment. Apart from ensuring own visibility, there are further reasons not to enter clouds. On the one hand, there is an increased risk of turbulence in the vicinity of clouds caused by convection. On the other hand, at low temperatures, especially near clouds, icing can occur and affect the aerodynamic structure.

The focus of our previous investigations, described in [1], was on the segmentation of clouds in a simulation environment. As a second important step, the results of cloud segmentation are used to perform cloud distance estimation, which is examined in the following paper. First, state-of-the-art cloud detection research projects are presented, followed by an explanation of the 3D reconstruction method used. The procedure for estimating cloud distance is then explained step by step, beginning with the cloud segmentation methods shown in [1]. Moreover, results are shown and compared with ground truth data from the simulation. Finally, a possibility is shown how cloud ground truth data can be collected in real flight images to validate the determined cloud distance.

The KoKo II project was funded by the German federal aviation research programme (LuFo V, 3rd call).

II. STATE OF THE ART

A. Cloud detection on-board UAV

Cloud detection as part of UAV Sense & Avoid is an upcoming topic in the integration of unmanned systems into European airspace. However, research on cloud detection has mainly focused on ground-based or space-based applications for current weather determination in meteorology. As shown in the previous chapter, an on-board cloud distance estimation system is indispensable in airspaces with traffic under visual flight rules.

An early publication dealing with UAV-based cloud detection has already shown that it is possible to determine the distance of clouds at flight altitude using electro-optical systems. In their work the authors use feature point ranging in combination with an image-based classifier that classifies local image patches as ground, cloud or sky [2]. In comparison to [2], we first try to separate cloud objects from the background using colour-based segmentation and subsequently determine the distance in these areas with the presented method.

Previous research work at our Institute of Flight Systems has already focused on determining the degree of coverage and estimating vertical cloud distance using a two-view triangulation approach in order to avoid dangers during ascent and descent due to icing and turbulence. However, in our recent project the preservation of visibility has the highest priority. Therefore, we investigate whether this method using 3D reconstruction from two views is also suitable for applications with a perspective in flight direction for short-term avoidance of clouds at flight altitude.

B. Monocular cloud distance estimation using two-view triangulation

The advantage of monocular approaches compared to stereoscopic ones is that the baseline necessary for triangulation is less limited by the aircraft's design. Using monocular two-view triangulation, only one camera is needed; the images from different perspectives are collected at different times. However, in case of rapid changes of the cloud structure or shifts due to wind, the estimation result will be negatively affected. Therefore, the time interval between the images needs to be chosen according to the flight speed [3].

The purpose of cloud detection as part of a Sense & Avoid system requires applying the cloud distance estimation approach in the perspective of the flight direction, which brings along the challenge of optical flow divergence besides varying backgrounds (sky, ground and horizon). When approaching a cloud object, image structures appear to grow radially around the focus of expansion, producing diverging optical flow vectors (Fig. 1).
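The depth recovered by such a two-view triangulation under pure forward motion can be sketched numerically. The following minimal example uses a simple pinhole model and purely illustrative numbers (focal length, feature position, flight speed and image interval are assumptions, not values from the paper); the baseline follows from flight speed and time interval as discussed above.

```python
# Depth from two views under pure forward motion (monocular two-view
# triangulation, pinhole model u = f * X / Z). All numbers illustrative.

def depth_from_divergence(u1, u2, baseline):
    """Depth (at the first view) of a feature seen at image offsets u1, u2
    [px] before/after flying `baseline` metres towards it."""
    if abs(u2 - u1) < 1e-9:
        # no apparent motion: the feature lies on the focus of expansion
        raise ValueError("depth unobservable at the focus of expansion")
    return baseline * u2 / (u2 - u1)

f = 800.0            # focal length [px] (assumed)
X, Z = 50.0, 2000.0  # lateral offset and depth of a cloud feature [m] (assumed)
v, dt = 40.0, 1.0    # flight speed [m/s] and image interval [s]
b = v * dt           # baseline chosen from flight speed, cf. [3]

u1 = f * X / Z       # projection in the first image
u2 = f * X / (Z - b) # projection after flying b metres forward

print(depth_from_divergence(u1, u2, b))  # recovers Z ~ 2000 m
```

The division by (u2 - u1) makes the degeneracy near the focus of expansion explicit: the closer a feature sits to the expansion centre, the smaller its apparent displacement and the noisier the estimated depth.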
Fig. 1. Apparent growth of a circle when approaching a sphere (left) and corresponding radial optical flow vectors (right) [4]

III. PROPOSED CLOUD DISTANCE ESTIMATION APPROACH

Fig. 2. Processing pipeline of the proposed approach: cloud segmentation, feature detection, …

With the camera intrinsic matrix K, the projection matrix P is given by

                    [ fx   s   cx ]
    P = K [R | t] = [  0   fy  cy ] [R | t]
                    [  0   0    1 ]
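A minimal numerical sketch of this projection may make the notation concrete. The intrinsic parameters below are illustrative assumptions (not calibration data from the paper), and the extrinsics are chosen as identity rotation with zero translation so that the camera frame equals the world frame.

```python
# Projecting a homogeneous world point with P = K [R | t].
# All parameter values are illustrative, not calibration data.

def matmul(A, B):
    """Plain-Python matrix product of nested lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

fx = fy = 800.0        # focal lengths [px] (assumed)
cx, cy = 640.0, 360.0  # principal point [px] (assumed)
s = 0.0                # skew, typically zero for modern sensors

K = [[fx,  s,  cx],
     [0.0, fy, cy],
     [0.0, 0.0, 1.0]]
Rt = [[1.0, 0.0, 0.0, 0.0],   # identity rotation, zero translation:
      [0.0, 1.0, 0.0, 0.0],   # camera frame coincides with world frame
      [0.0, 0.0, 1.0, 0.0]]
P = matmul(K, Rt)

Xw = [[50.0], [10.0], [2000.0], [1.0]]  # homogeneous cloud feature [m]
x = matmul(P, Xw)                       # homogeneous pixel coordinates
u, v = x[0][0] / x[2][0], x[1][0] / x[2][0]
print(u, v)  # -> 660.0 364.0
```

Dividing by the third homogeneous coordinate yields the pixel position; the same matrices, with the true rotation and translation between the two views, feed the triangulation step.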
Authorized licensed use limited to: UNIVERSIDAD DE OVIEDO. Downloaded on December 28,2020 at 12:28:08 UTC from IEEE Xplore. Restrictions apply.
Fig. 3. Systematically placed clouds without simulated terrain and atmospheric effects

Fig. 4. Segmentation result using brightness threshold
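The brightness-threshold segmentation behind Fig. 4 can be sketched on a toy grayscale patch. The threshold and pixel values below are invented for illustration; they are not the parameters used in the simulation.

```python
# Brightness-threshold cloud segmentation (sketch): bright pixels are
# treated as cloud, dark pixels as background. Values are made up.

def segment_clouds(gray, thresh=200):
    """Return a binary cloud mask for a grayscale image (nested lists)."""
    return [[1 if px >= thresh else 0 for px in row] for row in gray]

gray = [
    [ 30,  40, 220, 240],   # dark sky | bright cloud region
    [ 35, 210, 250, 230],
    [ 25,  45,  60, 215],
    [ 20,  30,  40,  50],   # background only
]
mask = segment_clouds(gray)
print(mask)  # six pixels classified as cloud
```

In the actual pipeline the mask restricts feature detection to segmented cloud areas; the sensitivity of this threshold directly affects the later distance estimation, as discussed in the results.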
Fig. 11. Second input image with detected (green) and tracked (blue) features

Fig. 11 shows the second input image containing detected features (green) and tracked features (blue). Fig. 12 illustrates the estimated averaged cloud distance (orange) and the actual distance to the cloud centre (green) of each segmented cloud object. The 2D map display (Fig. 13) shows the true bounding boxes and centre positions of the clouds in black, the UAV position in orange, and the cloud feature positions as a "+" symbol in three colours depending on altitude. The flight altitude of the UAV is 1000 m in this scenario.

Fig. 13. Visualization of estimated cloud feature positions: cloud ground truth (black), features below (green), above (blue) and within (red) the legal vertical cloud distance of 1000 ft

The cloud centres of the two upper clouds are located at an altitude of 3000 m and have a maximum vertical extension of 1000 m. The estimated distance result of these clouds is close to the actual distance from the cloud centre (Fig. 12).
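The colour coding of Fig. 13 amounts to comparing each feature's estimated altitude with the band spanned by the legal vertical cloud distance around the UAV's flight altitude. A sketch with invented sample altitudes (the classification thresholds follow the 1000 ft / 1000 m scenario described in the text):

```python
# Colour classification of triangulated cloud features as in Fig. 13:
# green below, red within, blue above the legal vertical cloud distance
# of 1000 ft around the UAV flight altitude. Sample altitudes invented.

FT = 0.3048                  # metres per foot
MIN_VERTICAL = 1000.0 * FT   # legal vertical cloud distance [m] (~305 m)

def classify(feature_alt_m, uav_alt_m=1000.0):
    """Colour label for a feature altitude relative to the UAV."""
    if feature_alt_m < uav_alt_m - MIN_VERTICAL:
        return "green"   # safely below the minimum vertical distance
    if feature_alt_m > uav_alt_m + MIN_VERTICAL:
        return "blue"    # safely above the minimum vertical distance
    return "red"         # within the protected vertical band

alts = [600.0, 1000.0, 3000.0]      # example feature altitudes [m]
print([classify(a) for a in alts])  # -> ['green', 'red', 'blue']
```

Note that 1000 m minus 1000 ft is roughly 695 m, which is consistent with the observation below that green-marked features are those estimated below about 700 m.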
In Fig. 13 it can be seen that some absolute positions of the cloud features are within the range of the bounding box, even if some positions were estimated too far out. Both clouds are placed well above the minimum vertical distance, so all associated features should be marked blue, which is the case (Fig. 13). The two lower and front clouds are estimated too far away (Fig. 12). The distance values should tend to be slightly less than the distance to the cloud centre, since the features whose distance is determined lie on the cloud surface. However, the cloud on the lower left is estimated to be much too far away. If the feature positions of this cloud are analysed, some features are scattered upwards along a line (Fig. 13). The features that are very far away from the cloud bounding box are all marked green, meaning they are estimated to be below about 700 m. Some tests have shown that this behaviour only occurs in clouds with the ground as background. The more sensitively the cloud segmentation was chosen, the more the distance estimation deteriorated towards too large values. It turns out that although clouds have prominent structures in the edge areas, which are more likely to be detected by feature detectors than the more homogeneous areas in the middle of the cloud [2] (Fig. 11), these edge areas also tend to be more transparent, so that features on the ground are also detected in this area, which means that the determined distance for the entire cloud object is increased. This behaviour can be seen in the lower left cloud. Some of the triangulated features do not belong to the cloud, but to the ground behind the cloud. Since the height of the features is also determined, ground features could be eliminated if the terrain height at their position is known. An alternative would be to shrink segmented cloud objects artificially, for example by using the morphological operation erosion or by rescaling the segmented cloud contours, so that transparent cloud edge areas are not segmented. The cloud centre of the cloud on the lower left is placed at 600 m height and has a vertical extension of 300 m, so most of the features should be marked green and the rest red. The cloud centre of the cloud on the lower right is placed at flight altitude with a vertical extension of 1000 m, which means features would have to be classified in green, red and blue. Both clouds show the expected colour distribution of the features.

V. CONCLUSION AND FUTURE WORK

The presented results have shown that an estimation of cloud distance in the perspective of the flight direction for short-term cloud avoidance is possible with the proposed approach. Issues with distance estimation arise near the focus of expansion and in transparent cloud areas, where the distance to features on the ground is determined. The distance information obtained is valuable in several respects. On the one hand, avoidance manoeuvres can be initiated when approaching a cloud object. On the other hand, the determined dimension of the cloud can be used to classify the cloud type and to draw conclusions about convective activity in the cloud area. However, the reliability of the distance estimation approach strongly depends on the background and the terrain characteristics. In addition, more complex cloud situations and hazy visibility conditions make the separation of individual cloud objects, for example by colour segmentation, increasingly challenging. In these cases, it is suggested to triangulate features in the large cloud areas and then cluster them into groups of similar distance. It should be mentioned that flying under or over a cloud layer is preferable to avoiding individual clouds in high cloud cover. Based on the gained knowledge, the following improvements are proposed:

- Elimination of erroneously detected ground objects by known terrain height and estimated altitude of the features, or adjustment of the cloud segmentation
- Investigations to improve the determination of distances in the area of the focus of expansion
- Clustering of cloud features into groups of similar distance for the separation of clouds at high coverage
- Utilization of multiple perspectives, meaning more than two input images, for distance estimation

Apart from improving the presented method, we want to evaluate different depth determination methods of image processing in order to apply them in the future in a context-sensitive adaptive system during flight. This adaptive system will be extended by further cloud segmentation methods based on deep learning, which are described in [10] and [11], for example. In addition, the investigation of real-time cloud detection algorithms in real flight plays a central role in our future research. Compared to simulation, the collection of ground truth data in the atmosphere is more complicated. One possibility of determining the true cloud positions is briefly explained in the following. The satellite data portal Copernicus Open Access Hub of the European Space Agency (ESA) provides images taken by the satellites Sentinel-2A and Sentinel-2B. Every five days one of the satellites flies over a large area south of Munich and records this area. Exactly at this moment we try to record airborne clouds and determine the distance to them. Afterwards, the actual cloud position provided by the satellite data can be compared with the cloud position determined by computer vision methods. For these flights we mount an action camera, which records the GPS position, on the wing of a general aviation aircraft.

REFERENCES

[1] A. Dudek, F. Funk, M. Russ, and P. Stuetz, "Cloud Detection System for UAV Sense and Avoid: First Results of Cloud Segmentation in a Simulation Environment," in IEEE International Workshop on Metrology for AeroSpace, 2019.
[2] H. Nguyen et al., "EO/IR Due Regard Capability for UAS Based on Intelligent Cloud Detection and Avoidance," in AIAA Infotech@Aerospace, 2010.
[3] F. Funk and P. Stütz, "A Passive Cloud Detection System for UAV: Concept and First Results," in International Symposium on Enhanced Solutions for Aircraft and Vehicle Surveillance Applications (ESAVS), 2016.
[4] D. A. Forsyth and J. Ponce, Computer Vision - A Modern Approach, 2nd ed. Edinburgh: Pearson Education, p. 345, 2012.
[5] J. Shi and C. Tomasi, "Good features to track," in IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1994.
[6] F. Funk and P. Stütz, "A passive cloud detection system for UAV: Analysis of issues, impacts and solutions," in 5th IEEE International Workshop on Metrology for AeroSpace (MetroAeroSpace), pp. 236-241, 2018.
[7] J.-Y. Bouguet, "Pyramidal Implementation of the Lucas Kanade Feature Tracker - Description of the algorithm," Intel Corporation - Microprocessor Research Labs, 1999.
[8] M. A. Fischler and R. C. Bolles, "Random sample consensus," Communications of the ACM, vol. 24, no. 6, pp. 381-395, 1981.
[9] R. Szeliski, Computer Vision: Algorithms and Applications. Springer-Verlag, London, pp. 46-47, 2011.
[10] M. Shi, F. Xie, Y. Zi, and J. Yin, "Cloud detection of remote sensing images by deep learning," in IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 701-704, 2016.
[11] S. Dev, A. Nautiyal, Y. H. Lee, and S. Winkler, "CloudSegNet: A Deep Network for Nychthemeron Cloud Image Segmentation," IEEE Geoscience and Remote Sensing Letters, vol. 16, no. 12, pp. 1814-1818, 2019.