

Article
Identification and Classification of Maize Drought
Stress Using Deep Convolutional Neural Network
Jiangyong An 1 , Wanyi Li 2 , Maosong Li 1, *, Sanrong Cui 1 and Huanran Yue 1
1 Key Laboratory of Agricultural Remote Sensing, Ministry of Agriculture/Institute of Agricultural Resources
and Regional Planning, Chinese Academy of Agricultural Sciences, Beijing 100081, China;
an_jiangyong@163.com (J.A.); sanrongcui@163.com (S.C.); herainyoe@hotmail.com (H.Y.)
2 Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China; wanyi.li@ia.ac.cn
* Correspondence: limaosong@caas.cn

Received: 11 January 2019; Accepted: 14 February 2019; Published: 18 February 2019 

Abstract: Drought stress seriously affects crop growth, development, and grain production.
Existing machine learning methods have achieved great progress in drought stress detection and
diagnosis. However, such methods rely on a hand-crafted feature extraction process, and their
accuracy leaves considerable room for improvement. In this paper, we propose the use of a deep convolutional neural
network (DCNN) to identify and classify maize drought stress. Field drought stress experiments
were conducted in 2014. The experiment was divided into three treatments: optimum moisture, light
drought, and moderate drought stress. Maize images were obtained every two hours throughout
the whole day by digital cameras. In order to compare the accuracy of DCNN, a comparative
experiment was conducted using traditional machine learning on the same dataset. The experimental
results demonstrated an impressive performance of the proposed method. For the total dataset,
the accuracy of the identification and classification of drought stress was 98.14% and 95.95%,
respectively. High accuracy was also achieved on the sub-datasets of the seedling and jointing
stages. The identification and classification accuracy levels of the color images were higher than those
of the gray images. Furthermore, the comparison experiments on the same dataset demonstrated
that DCNN achieved a better performance than the traditional machine learning method (Gradient
Boosting Decision Tree GBDT). Overall, our proposed deep learning-based approach is a very
promising method for field maize drought identification and classification based on digital images.

Keywords: drought identification; drought classification; phenotype; drought stress; maize; deep
convolutional neural network; traditional machine learning

1. Introduction
Maize is one of the main food crops, and drought stress significantly affects its growth and
decreases its yield at all developmental stages [1,2]. Different drought stress levels have different
effects on maize growth and yield [3]. Meanwhile, different drought stress levels of maize require
different amounts of irrigation [4]. Therefore, early detection and accurate drought stress monitoring
is of great significance for maize precision irrigation, water consumption reduction, and to ensure a
high and stable yield of maize [5].
Traditional drought stress severity monitoring is based on soil moisture sensors [6], which are inefficient, provide only indirect point measurements, and cover a limited spatial area [7,8]. Maize plants develop different physiological
mechanisms to mitigate the impact of drought stress, causing a series of phenotypic changes, such
as changes in leaf color and texture [9], leaf rolling, a decreased leaf extension rate, and plant
dwarfing [10,11]. Thus, maize phenotypic variation is a direct manifestation of maize drought stress.


Identifying and classifying maize drought stress based on phenotypic characteristics is therefore a rapid and non-destructive approach.
With the development of computer vision, machine learning and image processing techniques are being widely used in crop biotic and abiotic stress phenotyping research, and traditional machine learning
has proved to be one of the most flexible and powerful crop phenotype analysis techniques [12–14].
An artificial neural network (ANN) and image processing techniques were applied to identify and
classify phalaenopsis seedling diseases. Specifically, the color and texture features of images were
extracted and then used as inputs to the neural network to detect disease [15]. Zakaluk and Sri Ranjan
used digital camera-acquired tomato images under four soil water stress levels and built an artificial
neural network model with RGB images to determine the leaf water potential [16]. Support vector
machine (SVM) and the Gaussian processes classifier (GPC) were applied to automatically detect
regions in spinach canopies with different soil moisture levels based on thermal and digital images [17].
Stress detection based on traditional machine learning mainly includes image segmentation and feature
extraction. Then, the features are input into the machine learning algorithm for stress identification,
classification, quantification, and prediction [12]. Although traditional machine learning has achieved
good results in biotic and abiotic stress recognition, it often requires segmentation of the target image
and manual extraction of image features, and is therefore time-consuming and labor-intensive [18].
Furthermore, manually extracted features are easily affected by aspects of the background and
environment, such as soil, lighting, and wind; therefore, the accuracy of classification is limited [19].
In recent years, deep convolutional neural networks (DCNNs), which integrate image feature extraction and classification in a single pipeline, have made great breakthroughs in the field of image identification [20,21], so much so that CNNs have started to become mainstream in biotic and abiotic stress diagnosis and classification [22]. CNNs have been shown to substantially outperform traditional machine learning approaches on complex and abstract plant phenotyping tasks [19]. A deep
convolutional network was applied to sort haploid seeds, and it outperformed the traditional machine
learning method based on color and texture [23]. Uzal et al. reported that when the deep learning
model and SVM were used to estimate the number of seeds in soybean pods, the estimation accuracy
of deep learning was significantly higher than that of SVM [24]. In the monitoring and diagnosis
of crop stress, the deep learning method was shown to be more accurate and objective than expert
diagnosis [25]. Deep learning models were developed to detect and diagnose plant diseases based
on leaf images. The detection accuracy for 58 plant diseases was 99.53% [26]. In general, there are two ways to apply a deep learning model: "train from scratch" and "transfer learning". Training from scratch updates all the weights and parameters of the model during training. Transfer learning keeps the weights of the pre-trained convolutional layers fixed and only updates the weights and parameters of the classification layers. Transfer learning requires only a small dataset and modest hardware, and it often achieves better results on smaller datasets. Therefore, transfer learning from a pre-trained model is a good choice when only a small dataset is available [27].
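As a minimal illustration of the difference between the two modes (our sketch, not part of the original study; the input size, pooling head, and optimizer settings are assumptions), a pre-trained ResNet50 backbone can either be frozen, so that only a new classification head is trained, or left fully trainable:

```python
import tensorflow as tf

def build_model(num_classes=3, transfer_learning=True):
    """ResNet50 with a new classification head for the three drought levels."""
    base = tf.keras.applications.ResNet50(weights="imagenet", include_top=False,
                                          input_shape=(224, 224, 3))
    # Transfer learning: freeze the convolutional layers; otherwise all weights are updated.
    base.trainable = not transfer_learning
    x = tf.keras.layers.GlobalAveragePooling2D()(base.output)
    outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
    model = tf.keras.Model(base.input, outputs)
    model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01),
                  loss="sparse_categorical_crossentropy", metrics=["accuracy"])
    return model

model = build_model(transfer_learning=True)  # pass False to update all layers instead
```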
In this study, we used the DCNN method to identify and classify different drought stress levels
(optimum moisture, light drought stress, and moderate drought stress) of maize. Datasets were
collected from the field environment. Two DCNN models were applied to detect maize drought stress.
We also compared the identification accuracy of DCNN and traditional machine learning (GBDT) [9].
The main contributions of this paper are summarized as follows:

1. In order to rapidly, accurately, and nondestructively identify different drought stress levels in maize, we proposed the use of the DCNN method to identify and classify them. We compared transfer learning and training from scratch using the ResNet50 and ResNet152 models to identify maize drought stress. The results show that the identification accuracy of transfer learning is higher than that of training from scratch, and that transfer learning also saves a great deal of time.
2. We set up an image dataset for maize drought stress identification and classification.
Digital cameras were applied to obtain maize drought stress images in the field. Drought stress
was divided into three levels: optimum moisture, light, and moderate drought stress. Maize plant
images were captured every two hours. The size of the images was 640 × 480 pixels, and a total of 3461 images were captured in two stages (seedling and jointing) and were transformed into gray images.
3. We compared the accuracy of maize drought identification between the DCNN and a traditional machine learning approach based on manual feature extraction on the same dataset. The results showed that the accuracy of DCNN was significantly higher than that of the traditional machine learning method based on color and texture.

2. Materials and Methods

2.1. Field Experiment


The experiment was carried out in 2014 at the experimental base of the Cotton Research Institute of the Shanxi Academy of Agricultural Sciences (35.03° N, 110.58° E). The annual average temperature of the site is 14.5 °C, and the annual illumination is 2139 h. The experiment used a combination of a large electric movable rain-out shelter and a control pond to control the soil moisture content (Figure 1). The area of each plot was 2 m² (1 m × 2 m), and each plot was surrounded by cement. The soil type was loam, the soil depth was 1.5 m, the average bulk density of the soil was 1.39 g/cm³, and the field water capacity was 25.09%. In order to achieve accurate water control, the bottom of each plot was covered with thick polyethylene film to prevent water exchange with the deep soil layer. The maize variety used was Zhengdan 958, which is the most popular cultivated variety in China and has excellent agronomic traits. Its growth period is about 96 days, the plant height is 240 cm, and the spike height is about 100 cm. Maize was sown on June 18, 2014, with 2 rows and 6 plants per row. Except for the different irrigation treatments, all other farming activities were the same as those used in local high-yield fields. Figure 1 depicts the general situation of the test area.

Figure 1. The general situation of the test area.

The experiment was divided into two growth stages (seedling and jointing) and three different water treatments, that is, optimum moisture (OM), light drought stress (LD), and moderate drought stress (MD). Based on the water requirements at the different growth stages of maize, the water contents for the optimum moisture, light drought, and moderate drought conditions were 65–80% field capacity (FC), 50–65% FC, and 30–40% FC, respectively, at the seedling stage, and 65–80% FC, 50–60% FC, and 40–50% FC, respectively, at the jointing stage. A normal soil moisture supply during the early growth stage ensured that the maize sprouted normally. By July 3, the maize had grown to the three-leaf stage, and on July 30 the maize was in the huge bellbottom stage, so drought control was conducted from July 3 to July 30 at the seedling stage and from July 31 to August 26 at the jointing stage. The soil moisture sensor TDC210I was used to monitor the soil moisture. It was installed before planting: a 50 cm wide straight ditch was excavated, the sensor was inserted laterally into the side of the channel and pushed in tightly, and the ditch was then backfilled with soil. During backfilling, the soil was replaced according to its original layers, without changing the original soil structure and leaving no gaps. During drought control, the volumetric water content measured every five minutes by the soil moisture sensor was used to judge whether or not irrigation was needed. When the measured soil water content fell below the lower control limit, irrigation was applied up to the upper limit of the control range. When watering, a water meter was attached to the head of the water pipe, and water was applied according to the calculated irrigation amount; the water head was moved at a uniform speed according to the flow rate, so that the whole plot was watered evenly.
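The irrigation decision described above is a simple threshold rule; the following sketch (ours, with hypothetical numbers for one treatment, and assuming the control limits are expressed as fractions of field capacity) illustrates the logic:

```python
FIELD_CAPACITY = 25.09  # field water capacity of the plots (%)

def irrigation_target(measured_vwc, lower_frac, upper_frac, fc=FIELD_CAPACITY):
    """Return the water content to irrigate up to, or None if no irrigation is needed."""
    lower, upper = lower_frac * fc, upper_frac * fc
    return upper if measured_vwc < lower else None

# Example: a light-drought plot at the seedling stage is controlled between 50% and 65% FC.
print(irrigation_target(measured_vwc=11.0, lower_frac=0.50, upper_frac=0.65))
```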

2.2. Image Data Acquisition


Image acquisition was undertaken using a 720p outdoor network dome camera (WV-SW396AH, Panasonic). The camera was equipped with a 1.3-megapixel dual-speed MOS sensor, and its pan-tilt head was able to rotate 360 degrees. The camera was fixed 4.5 m above the ground.
Each treatment had three repetitions, namely A, B and C; each repetition had a different camera
shooting angle to increase the diversity of image data. During the maize growth process, the shooting
parameters needed to be adjusted to maintain the quality of the collected images. Shooting time
was from 6:00 to 18:00 at the seedling stage, and from 6:00 to 19:00 at the
jointing stage. Maize plant images were captured every two hours and the image resolution was
640 × 480 in JPG storage format. Due to the uneven distribution of soil moisture and the influence
of external environmental factors, a small number of image samples had different drought stress
maize plants and different phenotypic characteristics in a single image. The label of this kind of image
is determined by the drought stress level of most maize plants in the image sample. Label criteria
refer to the Agricultural Industry Standard of the People’s Republic of China “Technical Specification
for Field Investigation and Classification of Maize Disasters” [28] (Optimum moisture: plants grow
normally. Light drought stress: some leaves are rolled during the day and return to normal at night.
Moderate drought stress: most of the leaves are rolled during the day, a few of them are withered,
and some of them return to normal at night.). As a result of field management and irrigation treatment,
images for some of the time stages were not acquired. The numbers of maize images from the different
fields are shown in Table 1. After data acquisition, the RGB images were transformed into gray images
(Figure 2).
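The conversion of the RGB images to gray images can be reproduced with a few lines of Python; this is an illustrative sketch (the folder names and the use of OpenCV are our assumptions, not details given in the paper):

```python
import glob
import os

import cv2

src_dir, dst_dir = "maize_rgb", "maize_gray"   # hypothetical folder layout
os.makedirs(dst_dir, exist_ok=True)

for path in glob.glob(os.path.join(src_dir, "*.jpg")):
    bgr = cv2.imread(path)                           # OpenCV loads JPG images in BGR order
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)     # luminance-weighted grayscale conversion
    cv2.imwrite(os.path.join(dst_dir, os.path.basename(path)), gray)
```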

Table 1. The number of maize images.

Stage                  Treatment          Repeat A   Repeat B   Repeat C   Total
Seedling (n 1 = 1710)  Optimum moisture   197        187        188        572
                       Light drought      196        187        186        569
                       Moderate drought   195        188        186        569
Jointing (n 1 = 1751)  Optimum moisture   195        194        194        583
                       Light drought      198        195        196        589
                       Moderate drought   195        190        194        579

1 n represents the total number of maize images.

Figure 2. Examples of the two stages (seedling and jointing) of maize drought under the three treatments. (a–c) are the optimum moisture, light drought, and moderate drought maize color samples at the seedling stage, respectively; (d–f) are the corresponding gray images; (g–i) are the optimum moisture, light drought, and moderate drought maize color samples at the jointing stage, respectively; (j–l) are the corresponding gray images.
2.3. DCNN Model
In this study, two different DCNN models were applied to identify and classify drought stress in maize, namely ResNet50 and ResNet152. The numbers 50 and 152 represent the number of layers in the model. The ResNet network mainly consists of 3 × 3 convolution layers [29]. The key problem solved by ResNet is the "degradation problem" of deep models: as the network is deepened, ResNet adds residual (shortcut) structures between layers to alleviate the vanishing/exploding gradient problem. This type of DCNN can therefore be very deep and still achieve a high identification accuracy [30].

2.4. Training Process

In the experiment, we used the transfer learning and train from scratch methods to train the model. Model training consisted of three parts: the total dataset, the seedling sub-dataset, and the jointing sub-dataset. Each dataset was randomly divided, with 80% used as the training set and 20% as the test set. All images were converted to the tf.record format (a data format in TensorFlow). During training, all model hyperparameters were kept consistent and were set empirically: the learning rate was 0.01, the learning rate decay was exponential, the batch size was 32, and the optimizer was SGD. The convergence of the model was judged by the value of the loss function. After model training, we tested the drought recognition accuracy of the model. Figure 3 shows the flowchart of the experiment.

In the comparison experiments with traditional machine learning, we used the same dataset as in a previous study [9]. This dataset is a sub-dataset of the seedling stage and contains 656 images, including 219 optimum moisture images, 218 light drought stress images, and 219 moderate drought stress images. The difference is that, in the previous study, manual feature extraction was used, including color and texture features, and the traditional machine learning method (GBDT) with 6-fold cross validation was then applied. In our method, we used the DCNN model ResNet50 to extract the image features directly and then identified and classified the maize drought stress. Due to the small amount of data, we only trained for 100 epochs and used 100 images for testing.

Figure 3. Schematic of the proposed method.

The digital image processing and model training were done with Python 3.6, and all of the above experiments were conducted on the TensorFlow platform, which is a fast, open source framework for deep learning. The software was run on a PC with a Xeon W-2145 3.7 GHz CPU (central processing unit) and 64 GB RAM, with a GPU (graphics processing unit): NVIDIA GeForce GTX 1080Ti 11G.

Referring to previous work by our team [9], two accuracy measures were proposed to evaluate the effectiveness of the detection model: the accuracy of drought stress identification (DI) and the accuracy of drought stress classification (DC). These were defined as

DI = D_LM / (N_L + N_M) × 100%    (1)

DC = D_true / (N_O + N_L + N_M) × 100%    (2)
where N_O, N_L, and N_M are the number of optimum moisture, light drought, and moderate drought test samples, respectively; D_LM is the number of light drought and moderate drought stressed samples correctly detected to be under water stress; and D_true is the number of correctly identified samples among all of the test samples.
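For concreteness, the two measures can be computed from model predictions as in the following sketch (ours, not from the paper; the 0/1/2 encoding of the OM/LD/MD labels is an assumption):

```python
import numpy as np

def drought_metrics(y_true, y_pred):
    """DI and DC as defined in Equations (1) and (2); labels: 0 = OM, 1 = LD, 2 = MD."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    stressed = y_true > 0                              # the N_L + N_M drought-stressed samples
    d_lm = np.sum(stressed & (y_pred > 0))             # stressed samples detected as stressed
    di = d_lm / stressed.sum() * 100                   # Equation (1)
    dc = np.mean(y_true == y_pred) * 100               # Equation (2): correct over all samples
    return di, dc

# DI ignores confusion between the two drought levels, while DC does not.
print(drought_metrics([0, 1, 2, 2], [0, 2, 2, 0]))     # approximately (66.7, 50.0)
```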
3. Results

3.1. Comparison of Accuracy and Training Time of Different Models

The training loss functions of the different image types during the two stages are shown in Figure 4. With an increase in training epochs, the model gradually converged. However, on the total dataset, the number of epochs needed for convergence was relatively large. In this experiment, the training epochs were set to 7000 on the total dataset and 5000 on the seedling and jointing stage datasets. The accuracy and training time of each model under transfer learning and training from scratch were compared at these epochs (Table 2). The time consumption of transfer learning was significantly lower (by about two to three times) than that of training from scratch, and ResNet50 required less training time than ResNet152. On the seedling and jointing sub-datasets, ResNet50 took the least time for transfer learning (only about eight minutes), whereas it took about sixteen minutes to train from scratch; ResNet152 transfer learning took about nineteen minutes, while training from scratch took about forty-one minutes. The accuracy of transfer learning was higher than that of training from scratch for both models on all datasets. At the same time, both models achieved high accuracy, and the difference in accuracy between ResNet50 and ResNet152 was not significant. Considering the time consumption, we chose ResNet50, trained with the transfer learning method, as the model for the subsequent analyses.

Figure 4. Changes in cross entropy of color images: (a) total dataset; (b) seedling stage dataset; (c) jointing stage dataset.
Table 2. Comparison of the accuracy and training times of the different models.

Stage     Image Type   Training Mode   ResNet50                      ResNet152
                                       Time (m)   Accuracy (%)       Time (m)   Accuracy (%)
Seedling  Color        TL 1            7.59       98.23 ± 0.03       19.48      98.17 ± 0.14
                       TFS 2           16.35      95.67 ± 0.52       41.47      96.00 ± 0.25
          Gray         TL              7.58       96.58 ± 0.38       20.32      96.50 ± 0.43
                       TFS             16.54      91.33 ± 1.38       42.15      93.67 ± 0.14
Jointing  Color        TL              7.58       97.58 ± 0.14       19.09      97.75 ± 0.25
                       TFS             16.16      95.67 ± 1.01       43.48      95.08 ± 0.63
          Gray         TL              7.40       96.58 ± 0.14       19.50      96.58 ± 0.29
                       TFS             16.16      93.25 ± 1.39       41.20      94.00 ± 0.66
Total     Color        TL              11.28      96.00 ± 0.50       27.10      95.09 ± 0.30
                       TFS             23.55      95.99 ± 1.00       48.48      94.85 ± 1.31
          Gray         TL              10.42      94.95 ± 0.17       30.10      93.42 ± 0.25
                       TFS             23.35      91.33 ± 0.50       57.18      92.04 ± 1.46

1 TL means transfer learning; 2 TFS means train from scratch; m stands for minutes.

3.2. Identification and Classification of Drought Stress in Maize by DCNN
The classification confusion matrix is shown in Figure 5. It revealed that most maize images were classified correctly. Although misclassification occurred among all three drought stress treatments, adjacent moisture treatments were more prone to confusion: optimum moisture was easily misclassified as light drought; light drought was easily misclassified as optimum moisture and moderate drought; and moderate drought was easily misclassified as light drought. Based on the confusion matrix, the accuracy levels of maize drought identification and classification were calculated to be over 95% (Figure 6). Across the total dataset, no significant differences were observed between the color and gray images in maize drought stress identification, and the identification accuracy was 98.14%. However, regarding the classification accuracy of maize drought stress, the color images were more accurate than the gray images. These accuracy levels were slightly lower than those of the seedling and jointing sub-datasets. At the seedling stage, the identification and classification accuracy levels of the model were 98.92% and 98.33%, respectively, for the color images and 98.08% and 96.50%, respectively, for the gray images. At the jointing stage, the identification and classification accuracy levels of the color images were higher than those of the gray images: 99.00% and 97.58%, respectively, for the color images and 97.83% and 96.58%, respectively, for the gray images.

Figure 5. Confusion matrices of ResNet50. (a–c) are the color images of the total, seedling, and jointing stage datasets; (d–f) are the gray images of the total, seedling, and jointing stage datasets.

Figure 6. The accuracy of drought identification (DI) and drought classification (DC): (a) total dataset; (b) seedling stage dataset; (c) jointing stage dataset.
3.3. A Comparison between DCNN and Traditional Machine Learning

We compared DCNN with the traditional machine learning method (GBDT) previously published by our team, using the same dataset [9]. The test confusion matrix is shown in Table 3. Most of the data were correctly classified, with only a small portion misclassified. The accuracy comparison between ResNet50 and GBDT, shown in Table 4, demonstrated that ResNet50 can achieve an accuracy level significantly greater than that of GBDT. The color image accuracy levels of drought stress identification (DI) and drought stress classification (DC) reached up to 98% and 91%, which are 7.61% and 10.05% higher than those of GBDT, respectively. The gray images also achieved better accuracy: the accuracy levels of drought stress identification (DI) and drought stress classification (DC) reached up to 93% and 88%, respectively, which are clearly higher than those of the GBDT model. Meanwhile, in terms of time, it was also time-saving to use ResNet50 transfer learning to identify and classify maize drought: when training for 100 epochs, the color images and gray images only took about six seconds to complete this process.

Table 3. Confusion matrix of the two types of image. OM: optimum moisture, LD: light drought stress, and MD: moderate drought stress.

             Color Image          Gray Image
Treatment    OM    LD    MD       OM    LD    MD
OM           30    1     2        30    1     0
LD           1     29    0        5     25    0
MD           0     7     32       1     5     33

Table 4. Comparison between the DCNN model ResNet50 and the traditional machine learning model (Gradient Boosting Decision Tree, GBDT).

                      ResNet50
Model      GBDT       Color Image   Gray Image
DI (%)     90.39      98.00         93.00
DC (%)     80.95      91.00         88.00
Time (s)   -          6             6
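For readers who want a baseline of the kind compared against here, the following sketch (ours; the feature matrix is a random placeholder standing in for the hand-crafted color and texture features of [9]) shows a GBDT classifier with 6-fold cross validation in scikit-learn:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(656, 6))        # placeholder features, e.g., per-channel color statistics
y = rng.integers(0, 3, size=656)     # placeholder labels: 0 = OM, 1 = LD, 2 = MD

gbdt = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, max_depth=3)
scores = cross_val_score(gbdt, X, y, cv=6)      # 6-fold cross validation, as used in [9]
print("mean accuracy:", scores.mean())
```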

4. Discussion

4.1. Extraction of the Phenotypic Characteristics of Maize by DCNN


Maize crops are sensitive to drought stress. Plants have developed several physiological
mechanisms to mitigate the impact of drought stress and display a series of drought phenotypes [11,31].
The identification of maize drought by phenotypic characteristics occurs in real-time and is accurate,
and rapid. Generally, traditional machine learning and deep learning are common methods for crop
stress identification [12,22,32]. The difference between them is the use of the feature extraction method.
Traditional machine learning needs to segment the target image and extract features manually, while
DCNN automatically extracts image features by convolution layers [19]. In this study, we used
the transfer learning method of DCNN to identify and classify maize drought stress. The results
showed that DCNN had a good effect on the identification and classification of maize drought
stress. In the dataset as a whole, the identification and classification accuracy levels of the model
for maize color images were 98.14% and 95.95%, respectively. High accuracy was also achieved
on the sub-datasets of the seedling and jointing stages and the accuracy levels of identification and
classification were significantly higher than those of the traditional machine learning method (GBDT)
on the same dataset [9]. These results are in agreement with those previously reported by other stress
studies based on deep learning [26,33,34], likely because traditional machine learning extracts maize
phenotypic features manually, and only extracts color and texture features, leading to incomplete
extraction of phenotypic features. The phenotypic characteristics of maize under drought stress are
complex. In addition to color and texture, morphological characteristics, such as leaf rolling and
plant height, play an important role in drought stress identification [10]. DCNN can extract more
effective information through the use of many convolution layers. Thus, DCNN obtained a high
level of accuracy in identifying and classifying maize drought stress—significantly higher than that
of traditional machine learning. In recent years, there has been an increasing number of studies on
biotic and abiotic stress recognition by deep learning [22]. Juncheng et al. reported four cucumber leaf
diseases that were identified by DCNN and indicated that DCNN achieves high recognition accuracy
that was significantly higher than that obtained by the random forest and support vector machine
methods [35]. Balaji et al. also revealed that the recognition accuracy of the deep learning method was
higher than that of traditional machine learning based on color, texture, and morphology [23]. In our
study, maize images were captured from fields. The background of the maize images was complex and
included features such as soil, weeds, and other sundries, which were also influenced by the wind, light,
and shadows [36]. Thus, accurate segmentation of the field maize images was difficult. Under drought
stress, there were some abstract phenotypic characteristics, such as leaf rolling and leaf inclination [10], which are important phenotypic characteristics of drought stress in maize; however, the effective and complete segmentation and extraction of this information manually remains a challenge [19,37]. DCNN extracts image features through successive convolution layers and can therefore capture the phenotypic characteristics of the images comprehensively [38]. In general, the color, texture, and edge information of images is mainly extracted by the shallow convolution layers, while more abstract phenotypic information is extracted by the deeper convolution layers [39]. In our maize drought recognition model, the shallow convolution layers extract the color, edge, and morphology information of maize, while the deep convolution layers extract texture features and more abstract phenotypic features. Although deep abstract phenotypic features are difficult for humans to interpret, they play an important role in maize drought stress identification (Figure 7b). Meanwhile, it is time-saving to use deep learning: on the total dataset, transfer learning of ResNet50 and ResNet152 only took about ten and thirty minutes, respectively. Thus, when the amount of data is sufficient, the identification and classification accuracy of the DCNN method is generally better than that of traditional machine learning. It not only achieves better results, but also saves time during the image segmentation and data screening steps and reduces the running time of the model.

Figure 7. Visualization of the feature maps in the initial layers of the deep learning model. (a) is the original image; (b) is the visualization image, where the first row shows part of the feature maps of the first convolutional layer, the second row part of the feature maps of the second convolutional layer, and the third row part of the feature maps of the third convolutional layer.
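A feature-map visualization of this kind can be produced by reading out the activations of the early layers; the following sketch (ours, assuming a tf.keras ResNet50 backbone whose first convolutional layer is named "conv1_conv", and a hypothetical image path) illustrates the idea:

```python
import numpy as np
import tensorflow as tf

base = tf.keras.applications.ResNet50(weights="imagenet", include_top=False)
probe = tf.keras.Model(inputs=base.input,
                       outputs=base.get_layer("conv1_conv").output)

img = tf.keras.utils.load_img("maize_sample.jpg", target_size=(224, 224))  # hypothetical file
x = tf.keras.applications.resnet.preprocess_input(
    np.expand_dims(tf.keras.utils.img_to_array(img), axis=0))

feature_maps = probe.predict(x)   # shape (1, 112, 112, 64): 64 channels to plot as panels
print(feature_maps.shape)
```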
4.2. Comparison of the Accuracy of Color and Gray Images of Maize at Different Stages
In this study, we compared the accuracy levels of drought stress identification and classification for maize color and gray images. On the whole dataset, the drought stress identification and classification accuracy levels were higher for the color images than for the gray images. This is most likely because color information is one of the important characteristics of drought stress [40,41]: after graying, the color information of the image is lost, which makes the model more likely to be confused. Mohanty et al. also found that the accuracy of the color images was higher than that of the gray images in all of their experiments on biotic stress identification [42]. For drought stress, color information is an important indicator. Under optimum moisture conditions, the phenotypic characteristics of maize leaves did not change, so the leaf color was always green. However, under drought stress, the color and morphology of maize leaves changed. Under light drought stress, the growth of maize was relatively slow, the maize leaves lost their luster at noon, and the leaves were slightly rolled; under moderate drought stress, the plants grew slowly and were dwarfed, and the leaves lost their luster and rolled; however, they basically returned to normal in the evening. For different levels of drought stress, the recovery time of the leaves differed during the day [10]. When the color images of maize were transformed into gray images, the color information was lost; thus, the accuracy of gray image identification and classification was lower than that of the color images.
4.3. Misclassified Image Analysis

In this study, we analyzed the misclassified images, and a portion of the misclassified samples is shown in Figure 8. Figure 8a–c are seedling-stage maize samples, where Figure 8a shows optimum moisture misclassified as light drought stress, Figure 8b shows light drought stress misclassified as optimum moisture, and Figure 8c shows moderate drought stress misclassified as light drought stress. Figure 8d–f show jointing-stage maize samples, where Figure 8d shows optimum moisture misclassified as light drought stress, and Figure 8e,f show moderate drought stress misclassified as light drought stress. Analysis of the misclassified images showed that, although each maize image sample had a defined drought stress level label, a single image could contain maize plants under different degrees of drought stress and with different phenotypic characteristics. For example, in the same image, some leaves were rolled and others were not. This led to incorrect classification of the samples, likely due to the uneven distribution of soil moisture, even though the nominal drought level was the same across each 2 m² plot, which contained a total of 12 maize plants. Soil moisture distribution within the same plot was uneven because of the inaccuracy of the water supply and the different evaporation capacities at different locations. Maize plants are sensitive to drought stress, and the drought degree of maize plants at different locations in the same plot may differ, leading to different phenotypic characteristics in terms of leaf color, texture, and morphology, which can result in incorrect model classification. In addition, as a maize plant has many leaves, under light drought stress the new leaves are prone to rolling while the old leaves remain stretched. Therefore, capturing only part of the maize plant in an image may also lead to model identification and classification errors.

Figure 8. Misclassified samples: (a) optimum moisture misclassified as a light drought stress sample, (b) light drought stress misclassified as an optimum moisture sample, (c) moderate drought stress misclassified as a light drought stress sample; (d–f) jointing stage maize samples: (d) optimum moisture misclassified as a light drought stress sample, (e,f) moderate drought stress misclassified as light drought stress samples.

We acknowledge that the amount of data we acquired was relatively small for a deep learning model, and that we did not design and train our own network architectures. However, this did not affect the identification and classification accuracy of drought stress in maize by DCNN. In general, when datasets are small, the transfer learning method not only achieves good results, but also saves time and is effective in the field of image classification. Transfer learning is not very demanding in terms of data volume and hardware requirements and has good applicability. In addition, although a few maize images were misclassified due to the uneven distribution of soil moisture, this had little effect on maize drought identification and classification.

5. Conclusions
We proposed a novel end-to-end automated pipeline for drought stress classification in maize
based on digital images. DCNN had a good effect on the identification and classification of maize
drought stress, and it is feasible to identify drought stress by transfer learning in the case of small
datasets and a field environment. Our method does not require segmentation, regardless of the
background, so it saves a lot of time. In the dataset as a whole, the accuracy levels of identification
and classification of drought stress were 98.14% and 95.95%, respectively. High accuracy was also
achieved on the sub-datasets of the seedling and jointing stages. When the datasets are limited in the
research of maize drought identification, transfer learning is a good choice. Through a comparison of
the tests between DCNN and traditional machine learning on the same dataset, we concluded that
the DCNN method is significantly better than the traditional machine learning method. Through the
comparison of different types of images, the accuracy levels of the identification and classification
of maize color images were higher than those of the gray images, as color information is one of the
important phenotypic characteristics of drought stress.
These results suggest that the DCNN method not only saves time, but also achieves high accuracy in maize drought identification and classification. Based on digital images, DCNN-based detection of crop water stress could be applied in complex field environments. Future research will focus on finer-grained drought classification, with more refined soil water content treatments, and on identifying and classifying maize drought stress more accurately according to the degree and timing of leaf rolling and stretching. Furthermore, we propose developing the trained model into an application installed on mobile devices (unmanned aerial vehicles, smartphones, etc.) to provide real-time, accurate, and wide-ranging monitoring and early warning of maize drought stress.
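
As a rough illustration of such a deployment, the sketch below exports a fine-tuned classifier to TorchScript so that it could be bundled with a mobile application and used for on-device prediction; the checkpoint name, class labels, and input size are hypothetical, and the actual app architecture is beyond the scope of this work.

```python
# Hedged deployment sketch: package a trained classifier for on-device inference.
# The checkpoint path, class names, and input size are hypothetical.
import torch
import torch.nn as nn
from torchvision import models

CLASSES = ["optimum moisture", "light drought stress", "moderate drought stress"]

model = models.resnet50()
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))
model.load_state_dict(torch.load("maize_drought_dcnn.pth", map_location="cpu"))  # hypothetical checkpoint
model.eval()

# Trace the network into a TorchScript module that can be shipped with a mobile app.
example = torch.randn(1, 3, 224, 224)
scripted = torch.jit.trace(model, example)
scripted.save("maize_drought_dcnn.pt")

# At inference time the app would preprocess a camera frame into a 1x3x224x224 tensor
# and report the most probable drought class.
with torch.no_grad():
    probs = torch.softmax(scripted(example), dim=1)
print(CLASSES[int(probs.argmax())])
```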

Author Contributions: Conceptualization, M.L. and J.A.; Methodology, M.L.; Software, W.L. and J.A.; Formal
Analysis, J.A. and S.C.; Writing—Original Draft Preparation, J.A.; Writing—Review and Editing, J.A. and W.L.;
Supervision, M.L.; Project Administration, M.L.; Funding Acquisition, M.L.
Funding: This work was funded by the National Science and Technology Pillar Program during the Twelfth Five-Year Plan Period (2012BAD20B01) and the National Natural Science Foundation of China (No. 61771471).
Conflicts of Interest: The authors declare no conflict of interest. The manuscript has been approved by all authors for publication.


© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).
