Computers and Electronics in Agriculture

ARTICLE INFO

Keywords: Object-based image analysis; Planting rows; GIS; Skip rate; Decision making

ABSTRACT

The use of unmanned aerial vehicles (UAVs) as remote sensing platforms has tremendous potential for describing detailed site-specific features of crops, especially in early post-emergence, which was not possible previously with satellite images. This article describes an object-based image analysis (OBIA) procedure for UAV images, designed to map and extract information about skips in sugarcane planting rows. The procedure consists of three consecutive phases: (1) identification of sugarcane planting rows, (2) identification of the existent sugarcane within the crop rows, and (3) skip extraction and creation of field-extent crop maps. Results based on experimental fields achieved skip rates of between 2.29% and 10.66%, indicating a planting operation with excellent and good quality, respectively. The relationship of estimated versus observed skip length had a coefficient of determination of 0.97, which was confirmed by the value of the enhanced Willmott concordance coefficient of 0.92, indicating good agreement. The OBIA procedure allowed a high level of automation and adaptability, and it provided useful information for decision making, agricultural monitoring, and the reduction of operational costs.
⁎ Corresponding author.
E-mail addresses: carlos_hws@hotmail.com (C.H.W.d. Souza), rubens.lamparelli@gmail.com (R.A.C. Lamparelli), jansle.rocha@feagri.unicamp.br (J.V. Rocha), graziano@g.unicamp.br (P.S.G. Magalhães).
1 CNPq Researcher (process: 307362/2014-0).
http://dx.doi.org/10.1016/j.compag.2017.10.006
Received 3 March 2016; Received in revised form 18 August 2017; Accepted 6 October 2017
0168-1699/ © 2017 Elsevier B.V. All rights reserved.
C.H.W.d. Souza et al. Computers and Electronics in Agriculture 143 (2017) 49–56
Fig. 1. UAV orthoimage of the study fields, together with pictures of the sugarcane at the time of flight.
are generally conducted from the field borders and thus, heterogeneity across the entire crop field might affect the accuracy of such estimates (Bocca et al., 2015).

Recently, the use of remote sensing technologies for agricultural monitoring has been gaining ground, because they can provide spatially and temporally distributed information objectively and quickly over a variety of scales (Zhang et al., 2002; Ahamed et al., 2011; Mulla, 2013). The development of new technologies such as unmanned aerial vehicles (UAVs) as platforms for the acquisition of remote sensing imagery allows some of the limitations of orbital and airborne platforms that hinder crop monitoring in real time to be overcome, e.g., the suitability of revisit times, avoidance of cloud cover, costs, complexity of operation, and limitation of spatial resolution (Berni et al., 2009; Everaerts, 2009; Zhang and Kovacs, 2012; Colomina and Molina, 2014). These characteristics make UAV platforms suitable for a number of applications, including crop monitoring (Hunt et al., 2005, 2010; Torres-Sánchez et al., 2014, 2015; Comba et al., 2015), weed detection (Torres-Sánchez et al., 2013; Peña et al., 2015), water stress assessment (Berni et al., 2009; Zarco-Tejada et al., 2012), disease detection (Garcia-Ruiz et al., 2013), and yield estimation (Swain et al., 2010), i.e., applications where time-critical management is required.

UAVs can provide imagery with very high spatial resolution of only a few centimeters and they allow images to be acquired at optimal moments for the desired purposes, which makes them ideal for distinguishing crop plants during their first stages of development (Hengl, 2006; López-Granados, 2011; Torres-Sánchez et al., 2015a). However, very-high-resolution images require powerful image analysis procedures because, unlike lower-resolution images, single pixels might no longer capture the characteristics of the classification targets. Additionally, these images show higher intra-class spectral variability and, subsequently, a reduction in the degree of statistical separability among the classes compared with conventional pixel-based classification methods (Yu et al., 2006; Castillejo-González et al., 2014; Peña et al., 2015; Torres-Sánchez et al., 2015a). To overcome this limitation and to attain a high level of automation and adaptability, object-based image analysis (OBIA) has been used successfully with high-resolution satellite imagery (Novack et al., 2010; de Castro et al., 2013; Castillejo-González et al., 2014) and UAV imagery (Laliberte and Rango, 2011; Peña et al., 2013, 2015; Diaz-Varela et al., 2014; Qin, 2014; Torres-Sánchez et al., 2015a,b). The OBIA approach first identifies spatially and spectrally homogeneous units called "objects", which are created by grouping adjacent pixels following a segmentation process, and then uses the created "objects" as the basic elements for analysis. Thus, it is possible to create automated and auto-adaptive classification methods by combining the spectral, contextual, morphological, and hierarchical information of these elements (Blaschke, 2010).

This article presents an innovative procedure for the creation of skip maps in sugarcane fields that combines high-resolution images from a commercial UAV and the OBIA approach to generate useful decision-support data. Based on UAV images, this procedure first performs the identification of the sugarcane crop rows. It then identifies the existent sugarcane within the crop rows and finally, performs skip extraction and the creation of field-extent crop maps.

2. Materials and methods

2.1. Study area

The fields used in this study are located near Euclides da Cunha Paulista in São Paulo state, Brazil (22°26′21″S, 52°35′46″W). They have
Fig. 2. Flowchart illustrating each stage of the object-based image analysis (white boxes) + geographic information system (GIS) (gray boxes) approach proposed to map skips of
sugarcane areas using unmanned aerial vehicle (UAV) images. First step: Identification of crop rows; Second step: Sugarcane classification; Third step: skip extraction and creation of the
maps. (NDVI = normalized difference vegetation index.) (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)
two predominant classes of soils: Rhodic Hapludox (Typic Hapludox) and Quartzarenic Neosol (Typic Quartzipsamment), and they present a moderate slope (< 12%) that allows mechanical green harvest. The region has an average ground elevation of approximately 400 m a.s.l. and a humid subtropical climate (Cfa) according to the Köppen classification. The average annual temperature is 22 °C, and the average annual rainfall is 1200 mm, with much more precipitation in summer than in winter. The landscape of the region comprises some environmental conservation units with a predominance of sugarcane cultivation and livestock production.

The crop variety planted in the area was RB-86-7515. The first specimens were planted in rows 1.5 m apart over an area of approximately 100 ha in September 2014 and they were harvested in October 2015. UAV imagery was acquired on November 3, 2014, when the sugarcane was at the tillering stage, approximately 60 days after planting (DAP) (Fig. 1). The ideal time for image acquisition is when the sugarcane has grown sufficiently to fill the rows but without covering the inter-row gaps, i.e., 60–90 DAP. However, this period might vary according to the crop variety, number of ratoons, and amount of rainfall; therefore, the determination of the ideal time will depend on a visual inspection of the field.

2.2. UAV system and image acquisition

The UAV system used is based on an eBee Ag drone designed and developed by senseFly (senseFly Ltd, Cheseaux-Lausanne, Switzerland). It is built of expanded polypropylene and it weighs about 0.69 kg, including the payload, which means it can be hand-launched. The UAV is battery-powered and it can fly for 50 min at 40–90 km h−1 at altitudes of up to 900 m, within wind speeds of up to 45 km h−1. Its usual flight height is 250 m, which provides a spatial resolution of 0.10 m over an area of up to 12 km2 in a single flight. The flight is controlled automatically using real-time positioning based on a GPS system that is accurate to within a few meters. The attitude of the plane is monitored by accelerometers. The drone is supplied as standard with two advanced software packages: eMotion 2 (senseFly, Cheseaux-Lausanne, Switzerland) for flight planning and control, and Postflight Terra 3D (Pix4D SA, Lausanne, Switzerland), which is photogrammetry software designed for post-flight image processing and analysis.

The coupled sensor is a small consumer camera based on a Canon Powershot S110 compact camera with modified color filters. Instead of the standard RGB system, the S110-NIR model uses spectral filters for green, red, and infrared wavelengths centered at 550, 625, and 850 nm, respectively. This separation is performed by a Bayer color filter directly on the sensor (Shortis et al., 2005). Thus, this sensor allows the implementation of some vegetation indices derived from different combinations of the three channels, such as the normalized difference vegetation index (NDVI) (Rouse et al., 1974). The camera has a 24-mm focal length and the CMOS sensor (7.6 × 5.7 mm) stores the acquired images on a compact flash card with up to 12-bit radiometric resolution. The images were acquired using a shutter speed of 1/1200 s, in order to prevent them from being affected by vibrations and motion. For a particular mode, the camera adapts the aperture and ISO speed according to the light conditions for every single shot to ensure the best image exposure; these ranged from 2.0 to 4.0 and from 100 to 200, respectively.

The UAV imagery was acquired at flight altitudes of 150–200 m. Thus, for our objective, the pixel size of approximately 0.10 m was sufficient, with 75% forward lap and 70% side lap. The eBee Ag UAS allows for direct georeferencing of the images using an inertial measurement unit (IMU) in combination with an on-board Global Navigation Satellite System (GNSS). Therefore, the ground control points can be omitted, which speeds up data collection and processing, allowing for a fully automatic processing chain (Bendig et al., 2014; Turner et al., 2014). The Postflight Terra 3D software (Pix4D SA, Lausanne, Switzerland) was used for orthorectification and the mosaicking of the 161 images acquired. The mean absolute geolocation variances obtained in each direction were 0.0016 m in X, −0.0008 m in Y, and 0.0042 m in Z. No radiometric correction was performed.

2.3. Skip mapping

The procedure to analyze the UAV images and develop the OBIA procedure for skip mapping (Fig. 2) was developed using the commercial software eCognition Developer 8 (Trimble GeoSpatial, Munich, Germany). Similarly, the extraction of the vectorial data features and the production of the final maps used ArcGis software (ESRI, Redlands, CA, USA).

The rule set of the algorithm for failure mapping of the sugarcane ran semi-automatically and it comprised three principal consecutive steps: (1) identification of crop rows, (2) classification of sugarcane in the rows, and (3) skip extraction and the creation of the maps.

2.4. First step: identification of crop rows

This step is necessary only if information on the crop rows is unavailable. For example, if an auto-steering guidance system (Autopilot) was used at planting, the recorded planting rows could be used for this purpose. Once this has been achieved, only the second and third steps are necessary, for example, if the requirement is to study the increase of skips in successive ratoons, which might increase after each successive ratoon (Matsuoka and Stolf, 2012).

For the identification of crop rows, the NDVI was used to increase the differences between the vegetation and bare soil. NDVI was selected because it is able to achieve greater spectral separability compared with other vegetation indices (Torres-Sánchez et al., 2013). Then, a filter was applied to create a layer in which each pixel represented the difference between the kernel max or min value and the center value (Pixel Min/Max Filter algorithm). In order to enhance the difference between the crop rows and inter-row gaps in the image, the algorithm parameter "difference-brightest-to-center" and a kernel size of 15 × 1 were used.

A sub-rule set called "rows extraction" was created to identify the rows. First, a temporary layer called "Rows" (Layer Arithmetic algorithm) was created, which was used to add lines with different directions and widths. Then, a loop process was used to search for lines in different directions (from 0° to 180° at intervals of 5°). The "Update Variable" algorithm updated the angle parameter (angle: += 5°) and the "Line Extraction" algorithm was used to classify the pixels according to their line signal strength in the different directions and to create a thematic output layer. This process was replicated three times by presenting different values of the line width parameter (3, 6, and 12 pixels) to the "Line Extraction" algorithm. The values of the other parameters remained the same in each of the three replicates. Furthermore, at the end of each sub-process, each thematic output layer was added into the layer "Rows" by the "Layer Arithmetic" algorithm ("output layer + Rows"). This way, at the end of the rule set, the layer "Rows" represented the sum of these three sub-processes ("Rows Identification" result; Fig. 2).

Subsequently, the layer "Rows" was segmented into homogeneous multi-pixel objects using the "Contrast Filter Segmentation" algorithm, which uses pixel filters to detect and create objects based on contrast and gradient data.

Thereafter, small objects that represented noise were removed using the "Remove Objects" algorithm, with a threshold condition of the number of pixels being < 100. Then, the remaining crop row objects were assigned to a class called "Crop Rows" by the "Assign Class" algorithm, and the main line of each "Crop Rows" class object was exported as a shapefile vector. Finally, the exported crop row lines were entered into ArcGis to check and correct for possible errors. The red boxes in Fig. 2 show two examples of errors that are common in the process, which are caused mainly by long skips, late germination, or sugarcane covering the inter-row gaps.
Fig. 3. Skip maps for the sugarcane fields (F1 and F2) within the study area.
Fig. 4. Skip maps for the sugarcane fields (F3 and F4) within the study area.
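The agreement statistics used in the validation (mean error, root mean square error, and the refined Willmott concordance index dr of Willmott et al., 2012) can be reproduced with a short numpy sketch. This follows the published formulas (with the scaling constant c = 2) rather than any code from the authors; the function name `validation_stats` is ours.

```python
import numpy as np

def validation_stats(observed, estimated):
    """ME, RMSE, and the refined Willmott index of agreement dr (Willmott et al., 2012)."""
    o = np.asarray(observed, dtype=np.float64)
    p = np.asarray(estimated, dtype=np.float64)
    me = float(np.mean(p - o))                    # mean error (bias of the estimates)
    rmse = float(np.sqrt(np.mean((p - o) ** 2)))  # root mean square error
    a = np.sum(np.abs(p - o))                     # summed absolute error
    b = 2.0 * np.sum(np.abs(o - o.mean()))        # scaled observed deviation (c = 2)
    dr = 1.0 - a / b if a <= b else b / a - 1.0   # refined index, bounded in [-1, 1]
    return me, rmse, dr
```

An estimator that matches the observations exactly returns dr = 1, and dr falls toward −1 as the summed absolute error grows relative to twice the summed absolute deviation of the observations from their mean.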
[Scatter plot: estimated vs. observed skip length (m), with a 1:1 line; ME = −0.33, RMSE = 1.29, dr = 0.92]
Fig. 5. Comparison between the skip length observed in the field and that estimated by the procedure (n = number of samples; ME = mean error; RMSE = root mean square error; dr = enhanced Willmott concordance coefficient).

of disease. Ray et al. (2009) found that the higher the flood time, the greater the loss of sugarcane yield. Glaz and Lingle (2012) reported that the leaves, stalks, roots, and biomass were all affected negatively by increased flood time in recently planted sugarcane fields. Thus, those skips related to late germination at such flood points could not be identified on the acquired images at 60 DAP (Fig. 1). Subsequent field checks at these flood points revealed higher skip incidence, although fewer in number and with shorter lengths than obtained from the maps. The sugarcane plants at these points presented reduced size and fewer leaves and stalks per unit area compared with the remainder of the field. Therefore, even if it is accepted that the method overestimates skip numbers at such flood points, it will provide evidence of where yield loss will be higher.

Finally, the maps also provide some information about the quality of the planting operation, which can help improve the mechanical operations. For example, in Field 2 (Fig. 3), two crop rows with huge skips were detected, which were caused by bad mechanical operations at planting. Thus, the frequency, length, and distribution of skips can help identify the sources of the problems.

3.2. Validation

The relationship between the estimated and observed skip length was highly satisfactory, with a value of R2 = 0.97, which was confirmed by the value of dr = 0.92, indicating good agreement (Fig. 5). From the mean error, we conclude that the procedure underestimated skip length by an average of 0.33 m, with a root mean square error of 1.29 m. These validation results indicate that the use of high-resolution images from UAV systems during the early growth season of sugarcane can provide accurate information about skip incidence, which could help decision-making procedures.

4. Conclusions

The use of UAV images allows the creation of skip maps of sugarcane fields. The method presented in this study proved efficient in the estimation of skip length when compared with information derived in situ. Such information is useful for decision making, agricultural monitoring, and the reduction of operational costs, and it can help maintain the longevity and productivity of the crop over successive cycles. The use of UAV technology optimized the surveying of skips in fields, which was previously unfeasible with satellite imagery and is currently realized by field investigation.

The OBIA method resolves complex problems related to the UAV images through the creation of process trees, which permits a high level of automation and adaptability.

The time of image acquisition is important when mapping skips. The method presented here was influenced by the presence of long skips, late germination, or sugarcane covering the inter-row gaps. The acquisition of good-quality imagery at the ideal time will reduce errors in the construction of the crop rows. Future studies will monitor successive ratoons to determine the rate of skip increase, which will allow estimation of the optimal time for field renewal.

Acknowledgments

This study was supported by the São Paulo Research Foundation FAPESP (Fundação de Amparo à Pesquisa do Estado de São Paulo) and Odebrecht Agro-Industrial (Process Number 12/50048-7). The first author was supported by a PhD scholarship from CAPES (Coordenação de Aperfeiçoamento de Pessoal de Nível Superior). We also thank Prof. Mariana Abrantes Giannotti, coordinator of the Geoprocessing Laboratory (LABGEO) of the Polytechnic School of the University of São Paulo, for allowing the use of the eCognition software.

References

Abdel-Rahman, E.M., Ahmed, F.B., 2008. The application of remote sensing techniques to sugarcane (Saccharum spp. hybrid) production: a review of the literature. Int. J. Remote Sens. 29, 3753–3767. http://dx.doi.org/10.1080/01431160701874603.
Ahamed, T., Tian, L., Zhang, Y., Ting, K.C., 2011. A review of remote sensing methods for biomass feedstock production. Biomass Bioenerg. 35, 2455–2469. http://dx.doi.org/10.1016/j.biombioe.2011.02.028.
Bégué, A., Lebourgeois, V., Bappel, E., Todoroff, P., Pellegrino, A., Baillarin, F., Siegmund, B., 2010. Spatio-temporal variability of sugarcane fields and recommendations for yield forecast using NDVI. Int. J. Remote Sens. 31, 5391–5407. http://dx.doi.org/10.1080/01431160903349057.
Bendig, J., Bolten, A., Bennertz, S., Broscheit, J., Eichfuss, S., Bareth, G., 2014. Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging. Remote Sens. 6, 10395–10412. http://dx.doi.org/10.3390/rs61110395.
Berni, J., Zarco-Tejada, P.J., Suarez, L., Fereres, E., 2009. Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Trans. Geosci. Remote Sens. 47, 722–738. http://dx.doi.org/10.1109/TGRS.2008.2010457.
Blaschke, T., 2010. Object based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. 65, 2–16. http://dx.doi.org/10.1016/j.isprsjprs.2009.06.004.
Bocca, F.F., Rodrigues, L.H.A., Arraes, N.A.M., 2015. When do I want to know and why? Different demands on sugarcane yield predictions. Agric. Syst. 135, 48–56. http://dx.doi.org/10.1016/j.agsy.2014.11.008.
Castillejo-González, I.L., Peña-Barragán, J.M., Jurado-Expósito, M., Mesas-Carrascosa, F.J., López-Granados, F., 2014. Evaluation of pixel- and object-based approaches for mapping wild oat (Avena sterilis) weed patches in wheat fields using QuickBird imagery for site-specific management. Eur. J. Agron. 59, 57–66. http://dx.doi.org/10.1016/j.eja.2014.05.009.
Colomina, I., Molina, P., 2014. Unmanned aerial systems for photogrammetry and remote sensing: a review. ISPRS J. Photogramm. Remote Sens. 92, 79–97. http://dx.doi.org/10.1016/j.isprsjprs.2014.02.013.
Comba, L., Gay, P., Primicerio, J., Ricauda Aimonino, D., 2015. Vineyard detection from unmanned aerial systems images. Comput. Electron. Agric. 114, 78–87. http://dx.doi.org/10.1016/j.compag.2015.03.011.
de Castro, A.I., López-Granados, F., Jurado-Expósito, M., 2013. Broad-scale cruciferous weed patch classification in winter wheat using QuickBird imagery for in-season site-specific control. Precis. Agric. 14, 392–413. http://dx.doi.org/10.1007/s11119-013-9304-y.
Diaz-Varela, R.A., Zarco-Tejada, P.J., Angileri, V., Loudjani, P., 2014. Automatic identification of agricultural terraces through object-oriented analysis of very high resolution DSMs and multispectral imagery obtained from an unmanned aerial vehicle. J. Environ. Manage. 134, 117–126. http://dx.doi.org/10.1016/j.jenvman.2014.01.006.
Everaerts, J., 2009. New Platforms - Unconventional Platforms (Unmanned Aircraft Systems) for Remote Sensing, Frankfurt.
Garcia-Ruiz, F., Sankaran, S., Maja, J.M., Lee, W.S., Rasmussen, J., Ehsani, R., 2013. Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Comput. Electron. Agric. 91, 106–115. http://dx.doi.org/10.1016/j.compag.2012.12.002.
Glaz, B., Lingle, S.E., 2012. Flood duration and time of flood onset effects on recently planted sugarcane. Agron. J. 104, 575–583. http://dx.doi.org/10.2134/agronj2011.0351.
Hengl, T., 2006. Finding the right pixel size. Comput. Geosci. 32, 1283–1298. http://dx.doi.org/10.1016/j.cageo.2005.11.008.
Hunt Jr., E.R., Hively, W.D., Fujikawa, S.J., Linden, D.S., Daughtry, C.S.T., McCarty, G.W., 2010. Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring. Remote Sens. 2, 290–305. http://dx.doi.org/10.3390/rs2010290.
Hunt, E.R., Cavigelli, M., Daughtry, C.S.T., Mcmurtrey, J.E., Walthall, C.L., 2005. Evaluation of digital photography from model aircraft for remote sensing of crop biomass and nitrogen status. Precis. Agric. 6, 359–378. http://dx.doi.org/10.1007/s11119-005-2324-5.
Keerthipala, A., Dharmawardene, N., 2000. Determination of optimal replanting cycles for sugarcane production in Sri Lanka. Sugar Tech 2, 9–19. http://dx.doi.org/10.1007/BF02945751.
Laliberte, A.S., Rango, A., 2011. Image processing and classification procedures for analysis of sub-decimeter imagery acquired with an unmanned aircraft over arid rangelands. GISci. Remote Sens. 48, 4–23. http://dx.doi.org/10.2747/1548-1603.48.1.4.
Lebourgeois, V., Begue, A., Degenne, P., Bappel, E., 2010. Improving harvest and planting monitoring for smallholders with geospatial technology: the Reunion Island experience. Int. Sugar J. 109, 109–119.
López-Granados, F., 2011. Weed detection for site-specific weed management: mapping and real-time approaches. Weed Res. 51, 1–11. http://dx.doi.org/10.1111/j.1365-3180.2010.00829.x.
Martinelli, L.A., Garrett, R., Ferraz, S., Naylor, R., 2011. Sugar and ethanol production as a rural development strategy in Brazil: evidence from the state of São Paulo. Agric. Syst. 104, 419–428. http://dx.doi.org/10.1016/j.agsy.2011.01.006.
Martinelli, L.A., Filoso, S., 2008. Expansion of sugarcane ethanol production in Brazil: environmental and social challenges. Ecol. Appl. 18, 885–898.
Matsuoka, S., Stolf, R., 2012. Chapter 5 - Sugarcane tillering and ratooning: key factors for a profitable cropping. In: Goncalves, J.F., Correia, K.D. (Eds.), Sugarcane: Production, Cultivation and Uses. Nova Science Publishers, pp. 137–157.
Mulla, D.J., 2013. Twenty five years of remote sensing in precision agriculture: key advances and remaining knowledge gaps. Biosyst. Eng. 114, 358–371. http://dx.doi.org/10.1016/j.biosystemseng.2012.08.009.
Nassar, A.M., Rudorff, B.F.T., Antoniazzi, L.B., Aguiar, D.A. de, Bacchi, M.R.P., Adami, M., 2008. Chapter 3 - Prospects of the sugarcane expansion in Brazil: impacts on direct and indirect land use changes. In: Sugarcane Ethanol: Contributions to Climate Change Mitigation and the Environment. Wageningen Academic Publishers, The Netherlands, pp. 63–93.
Novack, T., Kux, H.J.H., Feitosa, R.Q., Costa, G.A., 2010. Per block urban land use interpretation using optical VHR data and the knowledge-based system Interimage. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 38, 6p.
Peña, J.M., Torres-Sánchez, J., de Castro, A.I., Kelly, M., López-Granados, F., 2013. Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images. PLoS One 8, 1–11. http://dx.doi.org/10.1371/journal.pone.0077151.
Peña, J.M., Torres-Sánchez, J., Serrano-Pérez, A., de Castro, A., López-Granados, F., 2015. Quantifying efficacy and limits of unmanned aerial vehicle (UAV) technology for weed seedling detection as affected by sensor resolution. Sensors 15, 5609–5626. http://dx.doi.org/10.3390/s150305609.
Qin, R., 2014. An object-based hierarchical method for change detection using unmanned aerial vehicle images. Remote Sens. 7911–7932. http://dx.doi.org/10.3390/rs6097911.
Ray, J.D., Sinclair, T.R., Glaz, B., 2009. Sugarcane response to high water tables and intermittent flooding. J. Crop Improv. 24, 12–27. http://dx.doi.org/10.1080/15427520903304269.
Rouse, J.W., Haas, R.H., Schell, J.A., Deering, D.W., 1974. Monitoring vegetation systems in the Great Plains with ERTS. In: NASA Goddard Space Flight Center 3d ERTS-1 Symposium, United States, pp. 309–317.
Rudorff, B.F.T., de Aguiar, D.A., da Silva, W.F., Sugawara, L.M., Adami, M., Moreira, M.A., 2010. Studies on the rapid expansion of sugarcane for ethanol production in São Paulo State (Brazil) using Landsat data. Remote Sens. 2, 1057–1076. http://dx.doi.org/10.3390/rs2041057.
Shortis, M.R., Seager, J.W., Harvey, E.S., Robson, S., 2005. Influence of Bayer filters on the quality of photogrammetric measurement. Electron. Imaging 5665, 164–171. http://dx.doi.org/10.1117/12.588217.
Spekken, M., de Bruin, S., 2012. Optimized routing on agricultural fields by minimizing maneuvering and servicing time. Precis. Agric. 14, 224–244. http://dx.doi.org/10.1007/s11119-012-9290-5.
Spekken, M., Molin, J.P., Romanelli, T.L., 2015. Cost of boundary manoeuvres in sugarcane production. Biosyst. Eng. 129, 112–126. http://dx.doi.org/10.1016/j.biosystemseng.2014.09.007.
Stolf, R., 1986. Methodology for gap evaluation on sugarcane lines. STAB 4, 12–20.
Swain, K.C., Thomson, S.J., Jayasuriya, H.P.W., 2010. Adoption of an unmanned helicopter for low-altitude remote sensing to estimate yield and total biomass of a rice crop. Trans. ASABE 53, 21–27.
Torres-Sánchez, J., López-Granados, F., De Castro, A.I., Peña-Barragán, J.M., 2013. Configuration and specifications of an unmanned aerial vehicle (UAV) for early site specific weed management. PLoS One 8, 1–15. http://dx.doi.org/10.1371/journal.pone.0058210.
Torres-Sánchez, J., López-Granados, F., Peña, J.M., 2015a. An automatic object-based method for optimal thresholding in UAV images: application for vegetation detection in herbaceous crops. Comput. Electron. Agric. 114, 43–52. http://dx.doi.org/10.1016/j.compag.2015.03.019.
Torres-Sánchez, J., López-Granados, F., Serrano, N., Arquero, O., Peña, J.M., 2015b. High-throughput 3-D monitoring of agricultural-tree plantations with unmanned aerial vehicle (UAV) technology. PLoS One 10, 1–20. http://dx.doi.org/10.1371/journal.pone.0130479.
Torres-Sánchez, J., Peña, J.M., de Castro, A.I., López-Granados, F., 2014. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 103, 104–113. http://dx.doi.org/10.1016/j.compag.2014.02.009.
Turner, D., Lucieer, A., Wallace, L., 2014. Direct georeferencing of ultrahigh-resolution UAV imagery. IEEE Trans. Geosci. Remote Sens. 52, 2738–2745. http://dx.doi.org/10.1109/TGRS.2013.2265295.
Willmott, C.J., Robeson, S.M., Matsuura, K., 2012. A refined index of model performance. Int. J. Climatol. 32, 2088–2094. http://dx.doi.org/10.1002/joc.2419.
Xavier, A.C., Rudorff, B.F.T., Shimabukuro, Y.E., Berka, L.M.S., Moreira, M.A., 2006. Multi-temporal analysis of MODIS data to classify sugarcane crop. Int. J. Remote Sens. 27, 755–768. http://dx.doi.org/10.1080/01431160500296735.
Yu, Q., Gong, P., Clinton, N., Biging, G., Kelly, M., Schirokauer, D., 2006. Object based detailed vegetation classification with airborne high spatial resolution remote sensing imagery. Photogramm. Eng. Remote Sensing 72, 799–811. http://dx.doi.org/10.14358/PERS.72.7.799.
Zarco-Tejada, P.J., González-Dugo, V., Berni, J.A.J., 2012. Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sens. Environ. 117, 322–337. http://dx.doi.org/10.1016/j.rse.2011.10.007.
Zhang, C., Kovacs, J.M., 2012. The application of small unmanned aerial systems for precision agriculture: a review. Precis. Agric. 13, 693–712. http://dx.doi.org/10.1007/s11119-012-9274-5.
Zhang, N., Wang, M., Wang, N., 2002. Precision agriculture—a worldwide overview. Comput. Electron. Agric. 36, 113–132. http://dx.doi.org/10.1016/S0168-1699(02)00096-0.