Abstract

Remotely Piloted Aircraft (RPA) are presently under continuous development at a rapid pace. Unmanned Aerial Vehicles (UAVs), or more broadly Unmanned Aerial Systems (UAS), are platforms considered under the RPA paradigm. Simultaneously, the development of sensors and instruments to be installed onboard such platforms is growing exponentially. These two factors together have led to the increasing use of these platforms and sensors for remote sensing applications with new potential. Thus, the overall goal of this paper is to provide a panoramic overview of the current status of remote sensing applications based on unmanned aerial platforms equipped with a set of specific sensors and instruments. First, some examples of typical platforms used in remote sensing are provided. Second, a description is given of onboard sensors and technologies specifically intended to capture data for remote sensing applications. Third, multi-UAVs in collaboration, coordination, and cooperation in remote sensing are considered. Finally, a collection of applications in several areas is presented, where the combination of unmanned platforms and sensors, together with methods, algorithms, and procedures, provides the overview across very different remote sensing applications. This paper presents an overview of different areas, each independent from the others, so that the reader does not need to read the full paper when a specific application is of interest.

Introduction

Remote sensing refers to the technique of capturing information at a distance (remotely) by specific instruments (sensors). Traditionally, remote sensing has been associated with satellites or manned aircraft with a set of airborne sensors. In the last decade, the increasing developments and improvements in unmanned platforms, together with the development of sensing technologies installed onboard such platforms, provide excellent opportunities for remote sensing applications. Indeed, they can offer high versatility and flexibility compared to airborne systems or satellites, and can operate rapidly without planned scheduling. In remote sensing operations with high human risk, lives can be safeguarded. Additionally, they can fly at low altitudes and slowly, with the ability to acquire high spatial and temporal resolution data, representing important advantages over the conventional platforms that have been broadly used over the years.

Watts et al. (2012), Dalamagkidis et al. (2012), and Anderson and Gaston (2013) provided a classification and use of platforms where an important issue that determines this classification is the altitude they can fly, ranging from a few meters up to 9,000 m or more. Micro- and nano-air vehicles can fly at low altitudes with limited flight duration because of their battery or energy system's capabilities. There are vehicles with the ability to fly at medium and high altitudes with flight durations ranging from minutes to hours, i.e., from five minutes to 30 hours. The horizontal range of the different platforms is also limited by the power of the communications system, which should ensure contact with a ground station, again ranging from meters to kilometers. Communications using satellite links can also be used, expanding the operational range. There are several different categorizations for unmanned aerial platforms depending on the criterion applied (Nonami et al., 2010). Perhaps the most extensive and current classifications can be found in Blyenburgh (2014), with annual revisions.

An autonomous platform, or a platform remotely controlled through a remote station, together with a communication system including the corresponding protocol, constitutes what is known as an Unmanned Aircraft System (UAS) (Gertler, 2012). According to Yan et al. (2009) and Gupta et al. (2013), UAS are considered as the full system, including the aircraft, the remote control station, and all of the ground support elements, communication links, air traffic control, and launching and recovery systems, as may be required (this is the opinion of the Civil Aviation Authority (CAA, 2015)). Unmanned Aerial Vehicles (UAVs) are included in the category of UAS, i.e., they can fly autonomously, although they can also be remotely controlled (The UAV, 2015). From the standpoint of remote sensing, the equipment of UAS is required for capturing information, which is later conveniently handled (processed, analyzed, or stored), but the term "UAV" is the one commonly used in remote sensing. Therefore, in this paper, we will refer to UAVs from the perspective of remote sensing operations, including drones, gliders, (quad-, hexa-, octo-) copters, helicopters, balloon-launched gliders, airships, or stratospheric balloon systems and, more broadly, any unmanned vehicle with the ability to fly auto-controlled using onboard processors, remotely controlled with human supervision from a ground station (Remotely Piloted Aircraft; RPA), or coordinated through another aerial vehicle. Certainly, from a strict point of view, all these systems should be considered RPA systems, because they need human supervision; full autonomy is not generally yet achieved. Nevertheless, as mentioned earlier, throughout this paper we will refer to them as UAVs. This overview is focused on remote sensing applications based on small UAVs of different categories flying at relatively low altitudes with different take-off and landing systems, including Vertical-Take-Off-and-Landing (VTOL), where UAVs operate in different scenarios and situations. The potential use of UAVs …

Photogrammetric Engineering & Remote Sensing
Vol. 81, No. 4, April 2015, pp. 281–329.
0099-1112/15/281–329
© 2015 American Society for Photogrammetry and Remote Sensing
doi: 10.14358/PERS.81.4.281

Department of Software Engineering and Artificial Intelligence, Faculty of Informatics, University Complutense of Madrid, Madrid 28040, Spain (pajares@ucm.es).

Figure 1. RPA system with the remote control system (Image courtesy of ISCAR-UCM Group, Madrid, Spain).
UAVs and Sensors: Onboard Capabilities and Technologies

The conjunction of unmanned platforms equipped with onboard sensors allows for the realization of remote sensing missions with applications in different areas. The Unmanned Aerial Platforms Section displays typical platforms used for such purposes. The Sensors and Technologies Section describes different sensor-based technologies specifically designed for remote sensing tasks.

Unmanned Aerial Platforms
Figure 1 displays a quad-copter on the ground, with its remote radio control system, used in collaborative missions together with USVs (Courtesy of ISCAR-UCM Group (2015), Madrid, Spain). Figures 2a and 2b display two multi-rotors flying: a quad-copter (courtesy of CartoUAV, La Coruña, Spain) (CartoUAV, 2015) and a hexa-copter (courtesy of AirRobot GmbH, Arnsberg, Germany) (AirRobot, 2015), respectively, used for weed patch detection in agriculture in the context of the RHEA (2015) project. Figure 2c displays a quad-rotor equipped with a multipurpose visible camera (courtesy of eDroniX, Madrid, Spain) (eDronix, 2015). Figure 3 displays two fixed-wing UAVs, the Cropsight and Viewer (courtesy of QuantaLab-IAS-CSIC, Cordoba, Spain), equipped with multispectral and hyperspectral, including thermal, sensors and used in airborne campaigns for biomass analysis, based on chlorophyll or carotenoid content, in vineyard, citrus, peach, and olive orchards, always related to photosynthesis (Zarco-Tejada et al., 2013a and 2013b).

Figure 4. (a) Helicopter HERO equipped with GPS, visual and infrared cameras on the pan and tilt unit, and the required hardware, and (b) Sensor system detail (Images courtesy of J.R. Martínez-de-Dios and A. Ollero; Robotics, Vision, and Control Group, University of Seville, Seville, Spain).

Table 1. Sensors Onboard UAVs: Auxiliary and Specific

Auxiliary:
• GPS
• IMU
• Gyroscopes
• Accelerometers
• Altimeters
• Video stabilizer
• Image transmitter
• Communication antennas (VHF, UHF)
• Communication modems

Specific:
• Video cameras (visible spectrum): EOS, stereoscopic, omnidirectional, fish-eye lens
• Thermal cameras
• Infrared cameras
• FLIR
• LIDAR (laser scanner)
• Multi-/Hyperspectral (HyperUAS)
• Irradiance
• Radar/SAR
• Radiometer (multi-frequency)
• Infrared spectroscopy
• Electronic nose
• VCSEL
• WMS
• Gas/smoke detector
• Ultraviolet spectrometer
• Multi-gas detector
• Sonar
• Smartphone
• Particle counters (optical, condensation)
• Photometer, aethalometer
• Aerosol sampling
• Probes (temperature, humidity, pressure)
• Cloud droplet spectrometer
• Pyranometer
• Electrostatic collector
• Radiation gauge
• Magnetic sensor
• Ultraviolet flame detector

Figure 4a displays the helicopter HERO equipped with GPS and the sensor system consisting of visual and infrared cameras installed on a pan and tilt unit; the hardware enclosure is also displayed. Figure 4b displays the structure and detail of the sensor system (Images courtesy of J.R. Martínez-de-Dios and A. Ollero; Robotics, Vision, and Control Group, University of Seville, Seville, Spain). The HERO platform has been used for early fire detection (Martínez-de-Dios et al., 2007).

From the point of view of sensors onboard UAVs, payload and logistic requirements are two important issues to be considered to ensure the success of remote sensing missions. The smaller the platform, the more limited the payload, directly affecting the types of sensors that can be transported and thus the attributes of the remote sensing application. However, an advantage is that small platforms require fewer logistics, unlike larger platforms.

Payload limits onboard UAVs represent a handicap in the use of sensors. Under this constraint, new challenges appear: the sensors must be adapted to the platform or vice versa. Sensors onboard the platform should not be a serious impediment to maneuverability. In this regard, several research subjects have been opened, where recent advances in MEMS are currently in continuous progress from the point of view of systems engineering.

As reported in Dziubana et al. (2012) and previously in Everaerts (2008), UAVs can be equipped with different sensors that can exceed twenty in number. Some of them are used to capture data with the exclusive aim of controlling the …
Plate 2 displays data captured at 40 m above ground level (AGL) from an UAV. The airborne laser scanning (ALS) data was captured with an Airborne Laser Terrain Mapper (ALTM) Gemini laser scanner with a pulse rate frequency of 70 kHz and an on-ground laser footprint of 0.2 m. The plot is in a forestry plantation, close to Geeveston in southeast Tasmania, Australia.

Multispectral and Hyperspectral Sensors
Multispectral and hyperspectral sensors have been widely used in UAV-based applications for multiple purposes. The difference between these sensors and others is the number of spectral bands and the wavelength range covered, including the visible spectrum. As hyperspectral sensors are based on line scanning through the movement of the UAV, they require sufficient stabilization to build coherent images. Sometimes these systems require geometric correction using specific features and ground control points (Jensen et al., 2009 and 2011). Multispectral sensors are non-scanning and, in general, provide lower image resolutions compared to hyperspectral sensors. Ren et al. (2013) presented a strategy for spectral recalibration (spectral response function, central wavelength, and bandwidth) using man-made ground targets. The CCD-based camera with four channels (blue: 420 nm–520 nm; green: 520 nm–600 nm; red: 630 nm–690 nm; NIR: 760 nm–900 nm) was mounted onboard an UAV with the targets on the ground surface.

Multispectral and hyperspectral sensors are often used together with other sensors with proven high performance to increase the remote sensing capabilities of the UAV. Colomina and Molina (2014) provided two lists of representative multi- and hyperspectral sensors.

Achteren et al. (2007) described the MEDUSA multispectral instrument, ranging from 400 nm to 650 nm, with a weight of 2 kg and two frame sensors (panchromatic and RGB), designed to be installed in a high-altitude, long-endurance UAV.

Jensen et al. (2008) used two multispectral cameras, covering the visible and NIR spectral bands, installed onboard a fixed-wing UAV with a wingspan of 122 cm and weight of 454 g, for georeferencing.

Bendig et al. (2012) equipped a mini octo-copter (<5 kg and payload between 0.2 to 1.5 kg) with a multispectral system consisting of a multiple camera array (MCA) sensor with a total weight of about 720 g and a mechanical trigger. It contains four arrays with spectral filters of 550, 671, 800, and 950 nm, corresponding to the green and red visible bands and two bands of NIR. It was used together with a thermal system previously described.

Honkavaara et al. (2013) used and described a multispectral camera developed by the Technical Research Center of Finland based on a Fabry-Perot interferometer with the capability of selecting different spectral bands with wavelengths ranging from 400 nm to 1000 nm. The full system is also equipped with irradiance sensors to measure different levels of this magnitude, together with a GPS. The above interferometer was previously described in Saari et al. (2011) and Mäkynen et al. (2011). Different tests conducted by Nackaerts et al. (2010) and Honkavaara et al. (2012) demonstrated its performance for UAVs, including the processing for radiometric corrections (Honkavaara et al., 2012) and considering irradiance values (Hakala et al., 2013). This system was also used in Pölönen et al. (2012) for precision agriculture and in Kaivosoja et al. (2013) for building raster maps for a precision fertilizer application.

Mäkeläinen et al. (2013) described the use of a 2D frame camera operating in the RGB and NIR for orthomosaicking and DEM production. It is built with CMOS-based technology and based on the Fabry-Perot interferometer.

Kelcey and Lucieer (2012a and 2012b) used a six-band multispectral sensor, which is improved based on radiometric and spatial correction techniques in order to achieve noise reduction (based on dark offset imagery), sensor-based modification of incoming radiance (based on spatially/spectrally dependent correction factors), and lens distortion correction (through the Brown-Conrady model). These corrections improved the quality of the raw multispectral imagery, facilitating subsequent quantitative image analysis.

Duan et al. (2013) evaluated the in-flight performance, in terms of signal-to-noise ratio, of a new hyperspectral sensor …
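The corrections described by Kelcey and Lucieer (dark-offset subtraction and Brown-Conrady lens distortion) can be sketched in a few lines. The distortion coefficients below are invented for illustration only, not values from their calibration.

```python
import numpy as np

def brown_conrady_distort(x, y, k1, k2, p1, p2):
    """Map undistorted normalized image coordinates (x, y) to distorted
    coordinates with the Brown-Conrady model: radial terms k1, k2 and
    tangential (decentering) terms p1, p2."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d

def dark_offset_correct(raw, dark):
    """Subtract a per-pixel dark-offset frame (sensor noise floor)
    from a raw band image, clipping the result at zero."""
    return np.clip(raw.astype(float) - dark.astype(float), 0.0, None)

# Hypothetical coefficients, for illustration only
xd, yd = brown_conrady_distort(0.1, 0.2, k1=-0.25, k2=0.05, p1=0.001, p2=-0.0005)
```

In practice the coefficients come from a laboratory calibration of the specific lens, and the inverse mapping (undistortion) is solved iteratively.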
Plate 4. Four strips obtained from a CIR system: left and right strips represent RGB images; the left central strip is a DSM, with the associated colorbar representing heights; the right central strip represents the RGB plus the IR channel. (Image courtesy of QuantaLab-IAS-CSIC, Cordoba, Spain)
… on thin-metal oxide technology, for humanitarian demining, which is sensitive to different volatile compounds. This device is installed onboard a blimp built with a hull filled with helium. It is 4.5 m long, with a 1.2 m diameter and 6 m³ of volume, with a payload of approximately 3 kg.

Gas emissions come from different activities. Indeed, greenhouse farming is an emerging activity in agriculture with gas emissions, and methane is a gas flowing in gasification plants and refineries. Regarding gas detection with UAVs, Khan et al. (2012a and 2012b) proposed a VCSEL (vertical cavity surface emitting laser) sensor for measuring H2O, CO2, and CH4 in greenhouses based on Wavelength Modulation Spectroscopy (WMS); the device is installed onboard a helicopter with a payload capacity of up to 5 kg. The UAV is 133 cm long and 41 cm high, has a main rotor diameter of 156 cm, with a flying mass excluding payload of approximately 4.7 kg. Malaver et al. (2015) integrated a solar-powered UAV (3 kg payload) with wireless sensor networks (WSN) to measure concentrations of CH4 and CO2 in greenhouses. The UAV was equipped with a gas sensing system based on nanostructured metal oxide and non-dispersive infrared sensors.

Volcanic gases were measured at La Fossa crater, Vulcano Island (Italy) by McGonigle et al. (2008) with an UAV flying through the plume and equipped with an ultraviolet spectrometer for the SO2 flux and a multi-gas sensor system consisting of a non-dispersive CO2-H2O infrared spectrometer, an electrochemical SO2-H2S sensor, and a H2 semiconductor sensor. Different CO2/SO2 ratios are measured. This multi-gas sensor technology was also installed onboard an UAV for analyzing volcanic gas compositions of Shinmoedake, Kirishima …
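Plume gas ratios such as the CO2/SO2 ratios mentioned above are commonly derived by regressing co-located concentration readings against each other: the slope gives the ratio, while the intercept absorbs the ambient background level. A minimal sketch on synthetic (made-up) readings:

```python
import numpy as np

def gas_ratio(co2_ppm, so2_ppm):
    """Estimate the CO2/SO2 ratio as the slope of a least-squares line
    through co-located concentration readings; the intercept absorbs
    the ambient (background) CO2 level."""
    slope, intercept = np.polyfit(so2_ppm, co2_ppm, 1)
    return slope, intercept

# Synthetic plume transect: ~400 ppm background CO2, true ratio of 20
so2 = np.array([0.0, 0.5, 1.0, 2.0, 3.5])
co2 = 400.0 + 20.0 * so2
ratio, background = gas_ratio(co2, so2)
```

Real sensor streams additionally need time-lag alignment between the two instruments before the regression, which this sketch omits.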
rescue missions in marine environments (Sánchez-Benítez et al., 2011). Figure 6b displays the same landing platform with the quad-rotor, from the CartoUAV (2015) company, onboard, with an enlarged level of detail. The use of the landing platform can be extended to different scenarios and environments.

Applications

The number of applications where UAVs become a useful tool seems almost unlimited and is continually growing. We have considered different applications grouped under major topics. Then, when appropriate, we break them down into further subtopics. Obviously, this does not mean they are exclusive and surely, other applications not covered here could also be relevant.

The European Commission (2007) identified a set of current and potential UAV procurements, including: (a) government (police, civil security, border security, coastguard); (b) fire-fighting (forest fires, emergency rescue, other major incidents); (c) energy sector (oil and gas industry distribution infrastructure, electricity grids, distribution networks); (d) agriculture, forestry, and fisheries (environmental monitoring, crop dusting, optimizing use of resources); (e) earth observation and remote sensing (climate monitoring, aerial photography, mapping and surveying, seismic events, major incidents, pollution monitoring); and (f) communications and broadcasting (Very High Altitude-Long Endurance (VHALE) platforms as proxy-satellites and Medium Altitude-Long Endurance (MALE), or Small and Mini-Unmanned Aerial Systems (S/MUAS), as short-term, local communications coverage). Nowadays, many developments have followed these lines with successful results. The overview of the current state carried out in this paper focuses on a set of applications classified under the specific topics displayed in Table 3 for various areas. This division is established considering the main application, taking into account that some type of overlapping among them may occur. Indeed, when a disaster occurs, humanitarian localization and rescue are tasks to be executed immediately. Photogrammetry (mosaics, ortho-rectification) is an application of great relevance in agriculture, cultural heritage, and archeology or urban environments, among others.
… applications where UAVs are used for spraying chemicals on crops, controlled by a WSN deployed on the crop field.

Samseemoung et al. (2012) designed a low altitude remote sensing (LARS) helicopter, with 6 kg weight and payload capacity of 5 kg, equipped with a commercial true color camera (RGB) and a color-infrared digital camera (G-R-NIR), for monitoring crop growth and weed infestation in a soybean plantation, flying at altitudes up to 15 m. Also, a LARS system based on a helicopter, with a weight of 6 kg and payload of 5 kg, equipped with a multispectral imaging system, was proposed in Swain et al. (2010) and Swain and Zaman (2012) to determine the rice crop coverage with the aim of predicting rice yield for planning and expectation. Bendig et al. (2013a and 2013b) obtained crop surface models in rice fields based on stereo images with the aim of analyzing crop growth and health status. The platform is an octo-copter with a payload of 1 kg equipped with a true color RGB sensor with a weight of 400 g. In the context of corn yield prediction, Geipel et al. (2014) used a hexa-copter, equipped with standard navigation sensors (IMU, GNSS), to acquire RGB imagery, which was later ortho-rectified with production of DEMs and maps leading to the computation of vegetation indices. Rice paddies were characterized in Uto et al. (2013) based on a miniature hyperspectral system onboard an UAV.

Torres-Sanchez et al. (2013a, 2013b, and 2014), Peña-Barragán et al. (2012a and 2012b), and Peña et al. (2013) used a quad-copter, with a payload of 1.25 kg, with a lightweight 700 g CMOS multispectral sensor with six individual digital channels and sometimes a commercial high-resolution RGB true color camera. Both cameras can be installed separately onboard for deriving vegetation indices for crop and weed detection, generating weed coverage maps with the aim of site-specific treatments in maize crops. The images, stored on SD and CF cards, are preprocessed for correct channel alignment suitable for accurate ortho-rectification and mosaicking purposes before the map generation. A set of georeferenced ground control points (GCPs) is used for such purpose. Small changes in flight altitudes can produce important differences in the ortho-image resolutions. A study about accuracy in wheat fields infested by broad-leaved and grass weeds was carried out in Gómez-Candón et al. (2014), where precision maps for farm applications are built using a quad-copter equipped with a device providing CIR images. Different studies were conducted in Peña et al. (2015) quantifying the efficacy and limitations of remote images acquired with an UAV for detection and discrimination of weeds as affected by the spectral, spatial, and temporal resolutions. Crop pest monitoring, for prevention and for cure when infestation has occurred, was studied and addressed in Yue et al. (2012).

Figure 7a shows an UAV quad-rotor, from the CartoUAV (2015) company, flying over a maize crop field to detect weed patches in order to design site-specific herbicide treatments. The images are conveniently mosaicked and segmented for crop row identification and crop versus weed discrimination, with the aim of building density maps of crop and weed coverage. Figure 7b shows a map obtained from UAV images showing three levels of weed infestation (low, moderate, and high, in an ascending greyscale), crop rows (in grey), and weed-free zones (in white). The image displayed in Figure 7b is adapted from Peña et al. (2013).

Honkavaara et al. (2013) used a multispectral system (see the Multispectral and Hyperspectral Subsection) for biomass estimation. Different vegetation densities in wheat and barley crops were obtained according to the amounts of seeds and fertilizers applied, which allowed for determination of the effect of these quantities on the health and growth stage of crops. Different experiments were also conducted by Jannoura et al. (2015) to monitor crop biomass based on RGB images captured by a true color camera onboard a hexa-copter.

Rabatel et al. (2014a) proposed various methods to obtain simultaneously visible and NIR bands for agricultural applications, including weed monitoring. Red (R) and NIR data were obtained from a uniquely modified still camera, which was achieved by removing the blocking internal NIR filter in the camera, inserting a red long-pass filter in front of the lens, and obtaining Red and NIR as linear combinations of the raw channel data. In a second system, in the context of the RHEA project and its associated second conference (RHEA, 2015), R and NIR bands are obtained by a couple of compact still cameras, one of them modified as before. A specific image registration procedure was developed for such purpose (Rabatel and Labbé, 2014b). Aerial images of wheat were acquired by a camera onboard the UAV from the AirRobot (2015) company, Figure 2b, as part of the activities in the RHEA project with the aim of producing georeferenced maps for follow-up, site-specific treatments in wheat fields. Hunt et al. (2010 and 2011) replaced the internal hot-mirror filter with a red-light-blocking filter to get data in the near-infrared, green, and blue bands, i.e., CIR bands. Leaf area and green normalized difference vegetation indices were correlated in fertilized wheat crops.

Aerial reflectance measurements were conducted in Link-Dolezal et al. (2010 and 2012) in winter wheat for crop monitoring purposes based on georeferenced images. The UAV, with …
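The channel-unmixing idea of Rabatel et al. (recovering Red and NIR as linear combinations of the raw channels of a modified camera), followed by a standard vegetation index, can be sketched as below. The 2×2 mixing matrix is hypothetical, not their calibrated values.

```python
import numpy as np

# Hypothetical channel-mixing matrix: each raw channel is a blend of
# the true Red and NIR radiance (values invented for illustration).
MIX = np.array([[0.9, 0.3],    # raw channel 1 = 0.9*R + 0.3*NIR
                [0.2, 1.0]])   # raw channel 2 = 0.2*R + 1.0*NIR
UNMIX = np.linalg.inv(MIX)

def recover_r_nir(raw1, raw2):
    """Invert the linear channel mixing to recover Red and NIR bands."""
    stacked = np.stack([raw1, raw2], axis=-1)
    r_nir = stacked @ UNMIX.T
    return r_nir[..., 0], r_nir[..., 1]

def ndvi(red, nir, eps=1e-9):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red + eps)

# Forward-simulate two pixels, then recover the bands
red_true = np.array([0.1, 0.2])
nir_true = np.array([0.5, 0.4])
raw1 = 0.9 * red_true + 0.3 * nir_true
raw2 = 0.2 * red_true + 1.0 * nir_true
red, nir = recover_r_nir(raw1, raw2)
```

In a real workflow the mixing coefficients are estimated from reference targets of known reflectance, and the NDVI map is then computed pixel-wise on the ortho-rectified mosaic.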
Figure 9. Human Machine Interface in the GCS: visible and infrared images, dynamic parameters, trajectory, and tele-operation (Image courtesy of J.R. Martínez-de-Dios and A. Ollero; Robotics, Vision, and Control Group, University of Seville, Seville, Spain).
extent of the affected area. A set of UAVs in cooperation was the proposed system. Also, simulated experiments were carried out in Smídl and Hofman (2013) to model the tracking of a plume of air contaminated by a nuclear leak based on UAVs; the proposed model also considers weather forecasting for the purpose of replacing stationary sensor networks.

Tracking from UAVs commonly relies on similarity measurements between consecutive frames in the video stream. Skoglar et al. (2012) proposed a method to track several vehicles on roads based on a vision sensor with gimbal capabilities. All targets are monitored with simultaneous active search, also identifying new targets; the vision sensor is oriented towards the field of interest.

Xiao et al. (2008) and Miller et al. (2008) proposed several techniques for tracking persons in video sequences from UAVs. The first work used video cameras for tracking ground vehicles and the second one used infrared images.

A target tracking approach based on a monocular camera (pinhole model), for determining ranges from UAVs to objects, was proposed in Choi and Kim (2014). A guidance law is also proposed for such purpose.

Different transformations and approaches have been proposed for target detection and tracking. SIFT and variants, such as the Mean SIFT, were proposed with the aim of matching objects in successive frames (Fang et al., 2011; Chao-Jian and San-Xue, 2011). Gleason et al. (2011) described a method based on Mean SIFT for vehicle detection and tracking in rural areas with the aim of detecting potential threats in oil and gas pipelines, with extension to other applications in these kinds of environments. Automatic aerial surveillance systems for buried gas and oil pipelines based on UAVs were considered in Zaréa et al. (2014).

Fang et al. (2011) proposed particle filtering based on the mean-shift algorithm over sequences captured with an UAV. Rodríguez-Canosa et al. (2012) described a compensated optical-flow-based approach between consecutive frames for tracking objects, where low-frequency vibrations caused by the UAV are conveniently balanced. Lin and Saripalli (2012) applied a Hough transform-based approach for road detection and tracking in …

Post-disaster surveillance with measurements after radioactive leaks is possible for monitoring the disaster. Data (temperature, humidity, pressure, wind velocity) are obtained with UAVs at the Turrialba Volcano, Costa Rica. Along this line are the works described in Mondragón et al. (2015) for volcano inspections and hydrothermal alterations at the Poas and Irazu mountains (Costa Rica) with a multi-rotor UAV equipped with thermal and visible cameras. McGonigle et al. (2008) conducted different experiments to measure volcanic gases at La Fossa crater, Vulcano Island, Italy, with a multi-gas sensor (see the Chemical Sensors subsection). This device was used in Shinohara (2013) for analyzing gas emissions at Shinmoedake, Kirishima Volcano, Japan.

Amici et al. (2013) described the inspection of Le Salinelle, an Italian mud volcano on the lower southwest flank of the Etna Volcano. The UAV is configured as a hexa-copter with a 1.7 kg payload, equipped with a lightweight thermal system of 67 g with spectral response in the range of 2 µm to 14 µm, and a PAL video camera with a weight of 600 g.

Soils
Soil erosion analysis, hazard monitoring, reflectance properties, and 3D modeling are motivating applications where UAVs can play an important role.

Frankenberger et al. (2008) evaluated the feasibility of using low-altitude (100 m) photogrammetry to assess ephemeral gully erosion in agricultural fields after rainfall events. DEMs were built for analysis from surface images acquired with a commercial camera onboard an UAV and checked against ground-based systems, such as terrestrial lidar.

Hazard monitoring, for rockslides in Randa (Wallis, Swiss Alps), was addressed in Eisenbeiss (2009). The images were acquired with a still video camera onboard an unmanned helicopter.
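The mean-shift style trackers cited above iterate a search window toward the weighted centroid of a likelihood image (for example, a color-histogram back-projection of the target). A minimal, library-free sketch on a synthetic weight map, where the window half-size and iteration limit are arbitrary choices:

```python
import numpy as np

def mean_shift(weights, cx, cy, half=3, iters=20):
    """Shift a (2*half+1)^2 window toward the weighted centroid of
    `weights` until convergence; returns the final window center."""
    h, w = weights.shape
    for _ in range(iters):
        x0, x1 = max(cx - half, 0), min(cx + half + 1, w)
        y0, y1 = max(cy - half, 0), min(cy + half + 1, h)
        win = weights[y0:y1, x0:x1]
        total = win.sum()
        if total == 0:
            break                      # no target mass in the window
        ys, xs = np.mgrid[y0:y1, x0:x1]
        nx = int(round((xs * win).sum() / total))
        ny = int(round((ys * win).sum() / total))
        if (nx, ny) == (cx, cy):
            break                      # converged
        cx, cy = nx, ny
    return cx, cy

# Synthetic target: a bright blob centered at (x=12, y=8) in a 20x20 map
weights = np.zeros((20, 20))
weights[7:10, 11:14] = 1.0             # rows 7-9 (y), cols 11-13 (x)
cx, cy = mean_shift(weights, cx=9, cy=6)
```

In a tracker, the weight map is recomputed for every new frame and the previous frame's converged center seeds the next search.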
(least-squares) were evaluated in Lingua et al. (2009) for DSM generation. The images were acquired with an off-the-shelf visible light camera.

Progress related to machine learning-based techniques for DSM generation has reinforced its use for images acquired from UAVs (Rosnell et al., 2011).

Virtual reality is another issue closely related to photogrammetry. In this way, Linkugel and Schilling (2013) proposed a simulation system where a micro-UAV is used for computing 3D measures for virtual reality purposes. The mathematical model was described, including all aerodynamic parameters of the UAV, toward the definition of the geometric modeling based on different sensors.

Digital elevation models and 3D mapping with DSM or DTM production, together with mosaicking with geo- and ortho-rectification, are two main topics inside photogrammetry; both are considered separately here, although they are closely related.

3D Mapping, Digital Surface, Elevation, and Terrain Models
Nex and Remondino (2014) and Remondino et al. (2011) provided a review with new insights and proposals for different photogrammetry-based applications, including 3D digital terrain or 3D textured models. Photogrammetric approaches, including topographic maps with slopes, have been described in Tahar et al. (2011 and 2012), oriented to landslide applications.

Hugenholtz et al. (2013) evaluated the accuracy in DTM production using a fixed-wing UAV that weighs less than 6.2 kg, equipped with an off-the-shelf CCD-based visible camera.

Different works have been proposed for 3D model generation. In this regard, point cloud generation is a task of interest for 3D mapping accuracy; a procedure for such a purpose was proposed in Rosnell and Honkavaara (2012) with two RGB-based digital still CCD cameras. Two quad-copters were used, which were able to carry 300 g and 1.2 kg payloads, equipped with cameras weighing 180 g and 448 g, respectively.

Neitzel et al. (2011) used an octo-copter (1.2 kg net weight, or TOW 2 kg with camera) for 3D mapping of landfills with the aim of determining their volume and quantity based on point cloud computation. 3D building models are obtained in Jizhou et al. (2004) using a fixed-wing UAV equipped with a CCD-based camera. They captured oblique images to obtain relevant parts of buildings instead of using a pair of images as usual.

Harwin and Lucieer (2012a and 2012b) applied multi-view stereovision (MVS) techniques to obtain 3D structure from overlapping imagery captured from multiple angles. An octo-copter with an approximate payload limit of 1 kg is the UAV used. It was equipped with a stabilized camera mount to carry different sensors, including a commercial digital camera. A very dense point cloud was produced with sufficient accuracy. Accuracy is a central issue in photogrammetry, as reported in Küng et al. (2011a) and Vallet et al. (2011), where different experiments have been carried out with light UAVs, weighing less than 500 g with a maximum payload of 125 g. Different methods and strategies for point cloud generation from digital images captured with UAVs flying at relatively low altitudes were also addressed in Siebert and Teizer (2014) for 3D mapping in earthwork projects.

Regarding digital surface models, a laser scanner, two CCD-based digital cameras (with weights of 500 g each), and two infrared devices (NIR sensitivity, 500 g) were integrated together with an IMU and GPS in Nagai et al. (2009) for such a purpose. A 3D shape is obtained by the laser scanner as point cloud data, texture information is acquired by the digital cameras, and vegetation indexes are acquired by the IR cameras simultaneously. The UAV is a helicopter with a weight of 330 kg and payload of 100 kg, with two main rotors (4.8 m diameter) and two tail rotors (0.8 m diameter).

High-resolution surface models are possible by using UAVs flying at low altitudes. Mancini et al. (2013) developed a method based on SfM to build such models in unstructured coastal environments. An electric hexa-copter was the UAV used, with a 1 m diameter and total weight of approximately 5 kg, equipped with a digital camera. Walker (2012) also addressed the topic of coastal management applications.

In Delacourt et al. (2009), DEMs and orthorectified images, acquired from a helicopter, are built with high spatial resolutions (<5 cm). The system was tested on the beach of Porsmillin (French Brittany) for the quantification of morpho-sedimentary changes of the coastal fringe, including cross-shore and long-shore sediment transport.

Imagery change detection techniques for topographical reconstruction were applied in Xuan (2011), where DSMs were built from the remotely sensed data based on different techniques, including triangulation, DEM generation, ortho-imaging, and mosaicking. Two types of UAVs were used: (a) …
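A DSM of the kind discussed in this section is often rasterized from a point cloud by keeping the highest return per grid cell. A minimal sketch, where the cell size and no-data handling are arbitrary choices:

```python
import numpy as np

def dsm_from_points(points, cell=1.0, nodata=np.nan):
    """Rasterize an (N, 3) array of x, y, z points into a DSM grid,
    keeping the maximum z (highest surface return) per cell."""
    xy = np.floor(points[:, :2] / cell).astype(int)
    xy -= xy.min(axis=0)               # shift so indices start at 0
    ncols, nrows = xy.max(axis=0) + 1
    dsm = np.full((nrows, ncols), nodata)
    for (col, row), z in zip(xy, points[:, 2]):
        if np.isnan(dsm[row, col]) or z > dsm[row, col]:
            dsm[row, col] = z
    return dsm

pts = np.array([[0.2, 0.3, 1.0],       # ground return
                [0.7, 0.4, 5.0],       # canopy return, same cell
                [1.5, 0.5, 2.0]])      # neighboring cell
dsm = dsm_from_points(pts, cell=1.0)
```

A DTM would instead keep the minimum z per cell after ground filtering; the difference between the two grids yields canopy or building heights.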