Remote Sensing of Crops

Published by: UAVs Australia on Oct 12, 2010
Copyright: Attribution Non-commercial
REMOTE SENSING OF VEGETATION FROM UAV PLATFORMS USING LIGHTWEIGHT MULTISPECTRAL AND THERMAL IMAGING SENSORS

J.A.J. Berni a, P.J. Zarco-Tejada a,*, L. Suárez a, V. González-Dugo a, E. Fereres a,b

a Quantalab, Instituto de Agricultura Sostenible (IAS), Consejo Superior de Investigaciones Científicas (CSIC), 14004 Cordoba, Spain - (berni, pzarco, lsuarez, vgonzalez)@ias.csic.es
b Dept. of Agronomy, University of Cordoba, Campus Univ. Rabanales, Cordoba, Spain - ag1fecae@uco.es

Inter-Commission WG I/V

KEYWORDS: Multispectral, narrowband, radiative transfer modeling, remote sensing, stress detection, thermal, unmanned aerial system (UAS), unmanned aerial vehicles (UAVs)
ABSTRACT:
Current high spatial resolution satellite sensors lack the spectral resolution required for many quantitative remote sensing applications and, given the limited spectral resolution, they only allow the calculation of a limited number of vegetation indices and remote sensing products. Additionally, if a short revisit time is required for management applications, the cost of high resolution imagery becomes a limiting factor, whereas sensors with shorter revisit times lack the spatial resolution that is critical, particularly for heterogeneous covers. Combining high spatial resolution and quick turnaround times is essential to generate useful remote sensing products for vegetation monitoring applied to agriculture and the environment. Alternatives based on manned airborne platforms provide high spatial resolution and potentially short revisit time, but their use is limited by their high operational complexity and costs.

Remote sensing sensors placed on unmanned aerial vehicles (UAVs) represent an option to fill this gap, providing low-cost approaches to meet the critical requirements of spatial, spectral, and temporal resolutions. However, miniaturized electro-optical sensors onboard UAVs require radiometric and geometric calibration in order to allow the extraction of quantitative results and accurate georeferencing. This paper describes how to generate quantitative remote sensing products using rotary-wing and fixed-wing UAVs equipped with commercial off-the-shelf (COTS) thermal and narrowband multispectral imaging sensors. The paper also focuses on the radiometric calibration, atmospheric correction and photogrammetric methods required to obtain accurate remote sensing products that are useful for vegetation monitoring.

During the summers of 2007 and 2008, UAV platforms were flown over agricultural fields, obtaining thermal imagery in the 7.5–13 μm region (40 cm spatial resolution) and narrow-band multispectral imagery in the 400–800 nm spectral region (20 cm spatial resolution). Surface reflectance and temperature imagery were obtained after atmospheric corrections with MODTRAN, connected to photogrammetric methods to solve the thermal path length. Biophysical parameters were estimated using vegetation indices, namely the normalized difference vegetation index, the transformed chlorophyll absorption in reflectance index/optimized soil-adjusted vegetation index, and the photochemical reflectance index (PRI), coupled with the PROSPECT, SAILH and FLIGHT models. Based on these parameters, image products of leaf area index, chlorophyll content (Cab), and water stress detection (based on PRI and on canopy temperature) were produced and successfully validated. GPS/INS data from the autonomous navigation system were used in the aerotriangulation to georeference the collected imagery, requiring only a minimum number of ground control points. This paper demonstrates that results obtained with a low-cost UAV system for vegetation monitoring applications yield estimations comparable to, if not better than, those obtained with more traditional manned airborne sensors.

* Corresponding author.
1. INTRODUCTION
The recent interest in unmanned aerial vehicles (UAVs) for vegetation monitoring has been motivated by the benefits of these platforms compared to full-size airborne operation, namely the combination of high spatial resolution and quick turnaround times together with lower operational costs and complexity. These features are of special interest in agriculture, where a short revisit time is required for management applications and high spatial resolution is mandatory in heterogeneous covers such as woody crops. In recent years, UAV-based research has been conducted for vegetation monitoring in both agricultural and environmental applications. Large fixed-wing UAVs have been tested for agricultural applications such as coffee crops (Herwitz et al., 2004) and vineyards (Johnson et al., 2003). Rotary-wing UAVs have also been used for vegetation monitoring in different crops (Sugiura et al., 2005; Berni et al., 2009). Most of these applications have been made possible by the miniaturization of commercial multispectral and thermal cameras, which nevertheless require radiometric and geometric calibration, together with atmospheric correction and photogrammetric techniques, in order to provide image products similar to those available from traditional airborne sensors. The radiometric quality of the images is critical to enable the application of quantitative remote sensing methodologies for a successful estimation of biophysical parameters from remote sensing imagery.

This paper describes the aerial platforms and sensors developed for multispectral and thermal image collection, and also focuses on the calibration, postprocessing techniques and further validation of remote sensing products obtained by coupling the images to radiative transfer models.
2. SYSTEM DESCRIPTION

2.1 Aerial Platforms
Two aerial platforms have been developed based on modified model aircraft:

a) Quanta-H is a rotary-wing UAV, a modified model helicopter with a 1.9 m main rotor diameter and a 29 cm³ gas engine (figure 1). This vehicle is capable of carrying 7 kg of payload with an endurance of 20 minutes, flying at a maximum speed of 30 km/h. The main advantage of this platform is its vertical take-off and landing capability. The endurance and speed are a limitation, but they make it ideal for small experimental plots.

Figure 1. Quanta-H rotary-wing UAV with the multispectral camera.

b) Quanta-G is a fixed-wing UAV with a 3.2 m wingspan equipped with a 58 cm³ gas engine (figure 2). The endurance is 30 minutes flying at 90 km/h, and the available payload is 5.5 kg. Despite the larger coverage due to the higher speed and endurance, the main disadvantage is that it requires a runway for take-off and landing.

Figure 2. Quanta-G fixed-wing UAV.

Both UAV platforms are equipped with an autopilot (Model AP04, UAV Navigation, Spain) which allows fully autonomous navigation following a user-defined flight plan. The autopilot integrates an attitude and heading reference system (AHRS) based on the combination of 3-axis accelerometers and rate gyros, a magnetometer, static/dynamic pressure sensors and GPS (a full description is given in Berni et al., 2009). Differential GPS corrections are retrieved from the closest GPS reference station using the NTRIP protocol and uploaded to the UAV through the radio link so that they can be applied in real time.

The flight plan is created using a GIS-based application that generates a photogrammetric flight plan from a user-defined region of interest, flight direction, altitude and camera focal length.
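The core geometry behind such a photogrammetric flight plan can be sketched with standard formulas. This is a minimal illustration, not the actual GIS application; the function name, parameters, and the 60%/30% overlap values are assumptions for the example.

```python
# Hypothetical sketch of photogrammetric flight-plan geometry: image
# footprint, spacing between flight lines and spacing between exposures.
def flight_plan(altitude_m, focal_mm, sensor_w_mm, sensor_h_mm,
                sidelap=0.3, overlap=0.6):
    """Return footprint (m) and line/exposure spacing for a nadir camera."""
    scale = altitude_m / (focal_mm / 1000.0)       # image scale factor H/f
    footprint_w = scale * (sensor_w_mm / 1000.0)   # across-track footprint (m)
    footprint_h = scale * (sensor_h_mm / 1000.0)   # along-track footprint (m)
    line_spacing = footprint_w * (1.0 - sidelap)   # distance between lines (m)
    base = footprint_h * (1.0 - overlap)           # distance between exposures (m)
    return footprint_w, footprint_h, line_spacing, base

# Using the MCA-6 image size from Table 1 (6.66 mm x 5.32 mm, f = 8.49 mm)
# at an illustrative 330 m flight altitude:
w, h, spacing, base = flight_plan(330.0, 8.49, 6.66, 5.32)
```

In practice the same geometry, rotated into the user-defined flight direction, yields the waypoint list uploaded to the autopilot.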
2.2 Image Sensors
Two different types of sensors can be installed on the aerial platforms. The multispectral camera (MCA-6, Tetracam Inc., USA) consists of 6 individual sensors and optics with interchangeable optical filters. The sensor features are presented in Table 1. The filters are selected depending on the vegetation indices (VI) that are required. Figure 3 shows the band centers and widths of some of the narrow-band filters used in this study.

Array elements: 1280 x 1024
Pixel size: 5.2 μm x 5.2 μm
Image size: 6.66 mm x 5.32 mm
Focal length: 8.49 mm
Output: 10-bit raw data
S/N ratio: 54 dB
Fixed pattern noise: < 0.03% V peak-to-peak
Dark current: 28 mV/s
Dynamic range: 60 dB
Total weight: 2.7 kg

Table 1. Technical facts of the multispectral sensors.
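As an illustration of how the filter set maps to indices, two of the vegetation indices named in the abstract can be computed from these bands: NDVI from the 670 and 800 nm bands and PRI from the 530 and 570 nm bands. The formulations below are the standard ones; the reflectance values are made up for the example, not data from this study.

```python
# Standard vegetation index formulations from band reflectances.
def ndvi(r670, r800):
    """Normalized difference vegetation index (red vs. near-infrared)."""
    return (r800 - r670) / (r800 + r670)

def pri(r530, r570):
    """Photochemical reflectance index (xanthophyll-sensitive band vs. reference)."""
    return (r530 - r570) / (r530 + r570)

# Illustrative healthy-vegetation reflectances: low red, high NIR.
v = ndvi(r670=0.05, r800=0.45)
```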
Figure 3. Spectral characteristics of some of the filters used on the multispectral camera (transmission, %, versus wavelength, nm; band centers at 490, 530, 550, 570, 670, 700, 750 and 800 nm).

The thermal camera installed is a Thermovision A40M (FLIR, USA) equipped with a 40° field-of-view lens. The image sensor is a Focal Plane Array (FPA) based on uncooled microbolometers with a resolution of 320 x 240 pixels and a spectral response in the range 7.5–13 μm (Table 2). The camera delivers digital raw images at 16 bits of at-sensor calibrated radiance with a dynamic range of 233 K – 393 K. The raw images are stored onboard on a PC104 embedded computer (Cool Little Runner 2, LiPPERT, Germany).
Array elements: 320 x 240
Pixel size: 38 μm x 38 μm
Spectral response: 7.5–13 μm
Sensitivity: 0.08 K at 303 K
Dynamic range: 233 K – 393 K
Total weight: 1.7 kg

Table 2. Technical facts of the thermal sensor.
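The ground sample distance (GSD) of both cameras follows from the specifications in Tables 1 and 2 using standard photogrammetric relations. The sketch below is illustrative; the 150 m altitude is taken from the thermal validation flight described later and is otherwise an assumption.

```python
import math

# GSD for a camera specified by pixel pitch and focal length (MCA-6, Table 1).
def gsd_from_focal(pixel_um, focal_mm, altitude_m):
    return (pixel_um * 1e-6) * altitude_m / (focal_mm * 1e-3)

# GSD for a camera specified by total field of view and pixel count
# (A40M with the 40-degree lens, Table 2).
def gsd_from_fov(fov_deg, n_pixels, altitude_m):
    swath = 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)
    return swath / n_pixels

mca6_gsd = gsd_from_focal(5.2, 8.49, 150.0)   # multispectral GSD at 150 m
a40m_gsd = gsd_from_fov(40.0, 320, 150.0)     # thermal GSD at 150 m
```

Because GSD scales linearly with altitude, the same relations give the flight altitudes needed for the 20 cm and 40 cm resolutions quoted in the abstract.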
3. INSTRUMENT CALIBRATION

3.1 Radiometric calibration and atmospheric correction
All the cameras have been calibrated in the laboratory following different approaches depending on the type of sensor. In the case of the multispectral camera, a uniform light source consisting of a 50 cm integrating sphere (Model USS-2000C, Labsphere, USA) is used to calibrate the camera to radiance. In order to convert the radiance images to reflectance, the SMARTS irradiance model (Gueymard, 2001) is used to simulate ground surface irradiance. A sun photometer (Microtops, Solar Inc., USA) is used to retrieve the aerosol optical depth at 6 different wavelengths and the total water content, which are the inputs to the irradiance model.
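The radiance-to-reflectance conversion itself reduces to one line per band. This is a minimal sketch assuming a Lambertian surface, with illustrative values rather than data from the paper; the units of radiance and irradiance must be consistent.

```python
import math

# For a Lambertian surface, reflectance = pi * L / E, where L is the
# calibrated at-sensor band radiance (W m^-2 sr^-1 nm^-1) and E is the
# SMARTS-simulated ground irradiance in the same band (W m^-2 nm^-1).
def radiance_to_reflectance(radiance, irradiance):
    return math.pi * radiance / irradiance

# Illustrative values only, not measurements from this study:
rho = radiance_to_reflectance(radiance=0.143, irradiance=1.0)
```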
Figure 4. Field validation of the reflectance, comparing airborne reflectance against field ASD reflectance for flights at 9:00, 10:00 and 11:00 GMT (y = 0.9937x + 0.0028, R² = 0.99, RMSE = 1.17%). The plot shows 90 points from 3 flights over 5 targets for 6 spectral bands.

In the case of the thermal camera, laboratory calibration was conducted using a calibration blackbody source (RAYBB400, Raytek, CA, USA). In order to apply an atmospheric correction, the MODTRAN radiative transfer code (Berk et al., 1999) is used to model the atmospheric transmissivity and the longwave thermal path radiance. Since only vegetation temperature is retrieved, the surface emissivity is set to 0.98, an accepted value for natural vegetation. Local atmospheric conditions such as air temperature, relative humidity and barometric pressure are measured at the time of flight with a portable weather station (Model WXT510, Vaisala, Finland) and used as input to the MODTRAN model. Both the path transmittance and the thermal path radiance are simulated at different sensor altitudes and integrated over the spectral response range of the thermal camera. The validation was conducted by flying at 3 different times during one diurnal course and over three different surfaces (see Berni et al., 2009), and showed that the errors from not accounting for atmospheric effects are higher than 3 K, and are reduced to below 1 K after the atmospheric correction (figure 5).
Figure 5. Comparison between ground truth surface temperature (IRT measured) and the temperature obtained from the thermal camera at 150 m flight altitude, before and after applying the atmospheric correction (RMSE raw = 3.44 K; RMSE calibrated = 0.87 K).
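The correction step described above can be sketched as follows. This is a simplified, hedged illustration: MODTRAN supplies the band-averaged path transmissivity and upwelling path radiance (the numeric arguments below are placeholders), and for brevity Planck's law is inverted at a single effective wavelength (10.25 μm) rather than integrated over the 7.5–13 μm sensor response as done in the paper.

```python
import math

C1 = 1.19104e8   # W um^4 m^-2 sr^-1, first radiation constant (2*h*c^2)
C2 = 1.43877e4   # um K, second radiation constant (h*c/k)

def planck_radiance(T_k, wl_um):
    """Blackbody spectral radiance at wavelength wl_um (um) and temperature T_k (K)."""
    return C1 / (wl_um**5 * (math.exp(C2 / (wl_um * T_k)) - 1.0))

def inverse_planck(L, wl_um):
    """Brightness temperature (K) from spectral radiance L."""
    return C2 / (wl_um * math.log(C1 / (wl_um**5 * L) + 1.0))

def surface_temperature(T_sensor_k, tau, L_path, emissivity=0.98, wl_um=10.25):
    """Remove path transmissivity and path radiance, then invert Planck's law."""
    L_sensor = planck_radiance(T_sensor_k, wl_um)
    L_surface = (L_sensor - L_path) / (tau * emissivity)
    return inverse_planck(L_surface, wl_um)
```

With tau = 1 and zero path radiance the correction is the identity, which is a useful sanity check; under a real atmosphere the corrected surface temperature comes out warmer than the raw at-sensor brightness temperature, consistent with the >3 K raw errors reported above.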
3.2 Geometric calibration

The geometric calibration of the cameras has been performed using Bouguet's camera calibration toolbox (Bouguet, 2001). The methodology consists of placing a calibration checkerboard pattern at a fixed location and acquiring several images from different positions and orientations. The grid corner coordinates were extracted semi-automatically from the images, and the intrinsic parameters and exterior orientation (EO) were calculated. In the case of the thermal camera, a calibration pattern was built using resistive wires, which produce a bright pattern when electricity circulates through them and raises their temperature. The results of the calibration are the internal parameters of the camera (focal length and principal point coordinates) and a distortion map (figure 6) that can be used to estimate the distortion coefficients for any photogrammetric software.

Figure 6. Distortion map generated with the geometric calibration procedure.
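The distortion model estimated by Bouguet's toolbox combines radial (k1, k2) and tangential (p1, p2) terms. The sketch below applies that standard model to ideal normalized image coordinates; the coefficient values in the example are purely illustrative, not the calibrated ones from this study.

```python
# Radial (k1, k2) plus tangential (p1, p2) lens distortion model, as
# estimated by Bouguet's camera calibration toolbox.
def distort(x, y, k1, k2, p1, p2):
    """Map ideal normalized image coordinates to distorted coordinates."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    yd = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return xd, yd

# Illustrative coefficients: barrel distortion pushes off-center points outward.
xd, yd = distort(0.5, 0.0, k1=0.1, k2=0.0, p1=0.0, p2=0.0)
```

Evaluating this model on a grid of image coordinates reproduces a distortion map like the one in figure 6, which is how the coefficients transfer to other photogrammetric software.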
