A R T I C L E I N F O

Keywords:
Rice disease identification
Transfer learning
Convolutional neural networks
Image classification

A B S T R A C T

Rice is one of the most important crops in the world, and most people consume rice as a staple food, especially in Asian countries. Various rice plant diseases have a negative effect on crop yields. If proper detection measures are not taken, these diseases can spread and lead to a significant decline in agricultural production. In severe cases, they may even cause total crop failure, thus having a devastating impact on food security. Deep learning-based CNN methods have become the standard approach to most of the technical challenges related to image identification and classification. In this study, to enhance the learning capability for minute lesion features, we chose MobileNet-V2 pre-trained on ImageNet as the backbone network and added an attention mechanism to learn the importance of inter-channel relationships and spatial points for the input features. In the meantime, the loss function was optimized and transfer learning was performed twice for model training. The proposed procedure presents a superior performance relative to other state-of-the-art methods, achieving an average identification accuracy of 99.67% on the public dataset. Even under complicated background conditions, the average accuracy reaches 98.48% for identifying rice plant diseases. The experimental findings demonstrate the validity of the proposed procedure for efficient rice disease identification.
1. Introduction

The world's population is expected to increase by 2 billion persons in the next 30 years, from 7.7 billion currently to 9.7 billion in 2050, and may peak at nearly 11 billion around 2100 (www.un.org). The majority of the world's population regards rice as the primary grain, and it is the source of a substantial portion of total calories for over half the earth's population. Nevertheless, rice plants are quite susceptible to various diseases. The occurrence of diverse rice plant diseases has a negative impact on crop growth, and if these diseases are not detected in time, they may have a disastrous effect on food security (Mohanty et al., 2016). Early prediction and warning can suppress outbreaks of rice plant diseases and reduce the unnecessary usage of pesticides (Picon et al., 2019), and thus play a key role in ensuring effective rice yields. So far, however, the visual observation of experienced producers or plant specialists is still the primary approach to diagnosing rice plant diseases in many areas, especially in developing countries. When a particular rice disease spreads, plant protection specialists or technicians from the corresponding government agencies approach farmers to advise on the necessary actions. Although some crop disease outbreaks can be prevented by this means, there are not adequate plant disease specialists relative to the number of farmers in many areas. In addition, due to social or economic reasons, few people are engaged in plant protection in the first place. Hence, there is a significant need, of real practical importance, to construct a simple, fast, inexpensive, and accurate system to automatically identify rice plant diseases.

In recent years, a new way of identifying crop diseases has evolved with the improvement of digital cameras and computational capacity. Computer vision technology has become an attractive approach for continuous monitoring of crop diseases due to its low cost, visualization capability, and contact-free nature (Bai et al., 2018). Much attention has been paid to the research and application of image processing and machine learning (ML), where specific classifiers are usually employed to classify plants into healthy or diseased types (Li et al., 2013; Prajapati et al., 2017; Zhang et al., 2018). For example, Kahar et al. (2015) used an artificial neural network (ANN) to detect three varieties of rice plant diseases, including leaf blight, leaf blast, and sheath blight, and their classifier achieved an accuracy of 74.21%. Through extracting grey level co-occurrence matrix (GLCM) texture features, Gharge et al. (2016) trained a backpropagation neural network
The code (and data) in this article has been certified as Reproducible by the CodeOcean: https://codeocean.com. More information on the Reproducibility Badge
Initiative is available at https://www.elsevier.com/physical-sciences-and-engineering/computer-science/journals.
* Corresponding author.
E-mail addresses: chen2wo@126.com (J. Chen), dfzhang@xmu.edu.cn (D. Zhang), adnanzeb@stu.xmu.edu.cn (A. Zeb), artavil20@gmail.com (Y.A. Nanehkaran).
https://doi.org/10.1016/j.eswa.2020.114514
Received 11 August 2020; Received in revised form 16 December 2020; Accepted 17 December 2020
Available online 5 January 2021
0957-4174/© 2020 Elsevier Ltd. All rights reserved.
J. Chen et al. Expert Systems With Applications 169 (2021) 114514
unfilled seeds and sterile panicles; the lesion starts at the uppermost leaf sheath enclosing the young panicles, and it appears oblong or as an irregular spot with dark reddish-brown margins and a gray center, or brownish-gray throughout. Rice stripe blight can cause high yield losses when severe epidemics occur; it presents chlorotic to yellowish-white stripes, mottling, and necrotic streaks on the leaves. Bacterial leaf streak (BLS) causes wilting of seedlings and yellowing and drying of leaves, and it can be recognized by this wilting and yellowing. Rice brown spot, though historically ignored, is one of the most common and most damaging rice diseases; it appears as minute spots on the leaf blade and glumes, gray in color with a reddish-brown to dark brown lesion margin. Table 1 summarizes the key symptom features of the diverse rice diseases and the sample sizes. Partial sample images are displayed in Fig. 1.

2.2. Overall process

The overall process of our approach for the identification of rice plant diseases is presented below. Firstly, the rice plant disease images were collected from real-life agricultural fields and directly downloaded
from different online sources. The rice disease types of these samples were known in advance and labelled according to the domain experts' knowledge. Then, image pre-processing techniques, including image resizing, image filtering, and image sharpening, were applied to the sample images. The dimensions of the sample images were uniformly resized to 224 × 224 pixels to fit the model, image filtering was implemented to eliminate the noise of the digital images, and image sharpening was applied to individual blurred images to make them clean. Next, in addition to reserving some raw images for verifying the validity of the model, we utilized a data augmentation scheme to synthesize new images for balancing and diversifying the sample images.

feature map. Suppose that an intermediate feature map Fs ∈ R^(C×H×W) is input to the attention module; Fs contains the lower-stage feature map FL and the higher-stage feature map FH. Here the lower-stage feature map FL represents the local features of images, such as edges, lines, and corners, and the higher-stage feature map FH is the composition of lower-stage features. Thus, the CAM will infer a 1D channel attention map MC ∈ R^(C×1×1) and the SAM will generate a 2D spatial attention map Ms ∈ R^(1×H×W). The overall process of the attention mechanism can be summarized as follows:

Fs = FH + FL    (1)
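To make the channel-then-spatial computation concrete, the following NumPy sketch mimics a CBAM-style module under our own simplifying assumptions (random weights, a shared two-layer MLP with reduction ratio r, and a scalar 1 × 1 mixing step in place of the usual 7 × 7 convolution); it illustrates Eq. (1) and the maps MC and Ms, and is not the authors' exact implementation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def channel_attention(F, W1, W2):
    # Infer MC in R^(C x 1 x 1) from spatially pooled features (shared MLP)
    avg, mx = F.mean(axis=(1, 2)), F.max(axis=(1, 2))      # each (C,)
    mc = sigmoid(W2 @ np.maximum(W1 @ avg, 0.0) + W2 @ np.maximum(W1 @ mx, 0.0))
    return F * mc[:, None, None]

def spatial_attention(F, w_avg, w_max):
    # Infer Ms in R^(1 x H x W) from channel-pooled maps (1x1 mixing for brevity)
    ms = sigmoid(w_avg * F.mean(axis=0) + w_max * F.max(axis=0))  # (H, W)
    return F * ms[None, :, :]

C, H, W, r = 16, 8, 8, 4
rng = np.random.default_rng(0)
FL, FH = rng.normal(size=(C, H, W)), rng.normal(size=(C, H, W))
Fs = FH + FL                                  # Eq. (1): fuse lower/higher stages
W1, W2 = rng.normal(size=(C // r, C)), rng.normal(size=(C, C // r))
refined = Fs + spatial_attention(channel_attention(Fs, W1, W2), 0.5, 0.5)
```

The final addition implements the shortcut connection used to avoid information loss.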
DWC to perform a 1 × 1 convolution kernel operation, as calculated in Eqs. (4) and (5) respectively.

DWC(Wd, y)(i,j) = Σ_{h=0}^{H} Σ_{l=0}^{L} Wd(h,l) ⊙ y(i+h,j+l)    (4)

PWC(Wp, y)(i,j) = Σ_{k=0}^{K} Wk × y(i,j,k)    (5)

where L and H represent the width and height of the image respectively, (i, j) indexes the position in the image, ⊙ denotes element-wise multiplication, W is the weights of the filters, K is the number of channels, and y denotes the input image. Accordingly, the DWSC is conducted by the following procedure, as expressed in Eq. (6):

DWSC(Wp, Wd, y)(i,j) = PWC(i,j)(Wp, DWC(i,j)(Wd, y))    (6)

MobileNet-V2 (Sandler et al., 2018) introduces the inverted residual block and linear bottleneck structures based on MobileNet-V1. It focuses on the vanishing-gradient problem to which MobileNet-V1 is prone during training, and offers a clear improvement over MobileNet-V1. In the workflow of the inverted residual block, a 1 × 1 convolution kernel is first employed to increase the dimension of the low-dimensional features (excluding ReLU), and then a 1 × 1 convolution is utilized for dimension reduction of the features. In addition, MobileNet-V2 applies a linear bottleneck rather than ReLU to avoid damaging the features. Compared with other state-of-the-art methods, MobileNet-V2 has a smaller model size and fewer parameters, making it one of the most efficient deep learning architectures to date (Shen et al., 2019; Sandler et al., 2018). Therefore, in this work, we chose MobileNet-V2 as the backbone network to identify rice plant diseases. Fig. 3 shows the structural changes from MobileNet-V1 to MobileNet-V2.

2.3.3. Transfer learning

Transfer learning is a machine learning technique by which the knowledge learned from a label-rich source domain is used for training in another label-poor target domain (Pan & Yang, 2009). The principles of transfer learning are briefly described below. A domain D consists of two parts: a feature space χ and a marginal probability distribution P(X), where X = {x1, ⋯, xn} ∈ χ. The source domain DS and target domain DT are two basic concepts in transfer learning. The domain with a large amount of labelled data and knowledge is the source domain, which is used to train the basic model; the target domain is the unlabelled domain waiting for knowledge transfer. A task T is comprised of two parts: a label space Y and an objective predictive function f(·). Accordingly, TS is the task of the source domain, and TT is the task of the target domain. Based on the
definitions above, a domain is a pair D = {χ, P(X)}, and a task is defined as a pair T = {Y, P(Y|X)}. Transfer learning plays a vital role in deep CNNs because deep learning-based CNN algorithms need a large amount of labelled data to train models, while collecting massive labelled data in a field is undoubtedly a challenging task. Therefore, transfer learning has naturally become the preferred scheme and is increasingly used in practical applications, e.g. the common solution of using a network pre-trained on ImageNet (Russakovsky et al., 2015), where only the parameters of the newly extended layers are inferred from scratch while the bottom convolution layers are frozen. Fig. 4 depicts a typical schematic diagram of transfer learning.

2.4. Proposed approach

2.4.1. Mobile-Atten network

As mentioned earlier, MobileNets are a class of lightweight convolutional neural networks based on depthwise separable convolution and have shown outstanding capability in processing both large-scale and small-scale problems. Among them, MobileNet-V2 is one of the most efficient networks, with a small volume and few parameters. Moreover, the attention mechanism can make good use of both spatial attention and channel-wise attention to learn the importance of spatial points and inter-channel relationships for the input features, thereby improving the accuracy of the models. Thus, inspired by this performance, MobileNet-V2 paired with the attention mechanism was selected in our approach. Using the method of transfer learning, we transferred the common knowledge of MobileNet-V2 pre-trained on ImageNet and added the attention module to the pre-trained model to create a new network, which we termed Mobile-Atten, for identifying the crop disease types. To enhance the learning ability for minute lesion features, we modified the network structure of the conventional MobileNet-V2 and loaded the pre-trained weights from ImageNet (https://keras.io/api/applications/). The classification layer at the tail of MobileNet-V2 was abandoned, and we added the attention module behind the pre-trained model, followed by an additional convolutional layer of 1280 × 1 × 1 for high-dimensional feature extraction. Then, one BatchNormalization layer was added to make the network converge faster and more stably, and the shortcut connection approach was utilized in the attention module to avoid information loss and increase the expressive ability of the model. Next, the fully connected layer was replaced by a global pooling layer, and a new fully connected Softmax layer with the practical number of categories was added as the new classification layer of the network. Thus, the newly formed network, named Mobile-Atten, was used to identify the rice plant diseases. Note that the parameters of the auxiliary convolution layers are initialized following He et al. (2015), and the channel-first-then-spatial order is applied in the attention module. Fig. 5 depicts the network structure, and the relevant parameters are presented in Table 2.

2.4.2. Loss function

As is well known, the Cross-Entropy Loss function is the most widely used in deep CNNs, and it is defined in Eq. (7):

L = − Σ_{c=1}^{C} yc log(pc)    (7)

where C denotes the number of classes, yc is the indicator variable (yc = 0 or 1; if class c is the same as the category of the sample, yc = 1; otherwise, yc = 0), and pc is the predicted probability that the observed sample belongs to class c. On this basis, Lin et al. (2017) proposed the Focal Loss function (see Eq. (8)), because the prediction loss weights of the cross-entropy function are the same for positive and negative samples:

L(pc) = − αc (1 − pc)^γ log(pc)    (8)

where pc denotes the predicted probability, αc is the weighting factor of the Focal Loss function when the corresponding category is 1, and γ is a modulating factor (hyperparameter). Note that the Focal Loss function is primarily used for binary classification problems in target detection. The identification of crop diseases is a multi-classification problem, and thus the classical Focal Loss function is enhanced and used to replace the original Cross-Entropy loss function in our model, as computed in Eqs. (9)–(11):

Lmult = − Σ_{c=1}^{C} αc (1 − pc)^γ yc log(pc)    (9)

αc = count(xi ∈ c) / count(xi)    (10)

yc = 1 if c is the actual class, and yc = 0 otherwise    (11)

where C is the number of classes, and xi denotes the sample.
Table 2
The main parameters of the network.
Module (type) | Input shape | Expansion factor | Output channel no. | Repeated times | Stride
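Following Table 2 and the description in Section 2.4.1, the head construction can be approximated in Keras as below. This is a hedged sketch, not the authors' released code: the channel-gating block is a placeholder standing in for the full attention module, the class count (9) is an assumed value, and `weights=None` replaces the ImageNet weights used in the paper so that no download is needed here:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_mobile_atten(input_shape=(224, 224, 3), num_classes=9):
    # Backbone: MobileNet-V2 with the classification top removed
    base = tf.keras.applications.MobileNetV2(
        input_shape=input_shape, include_top=False,
        weights=None)  # the paper loads weights="imagenet"
    base.trainable = False  # first transfer-learning stage: freeze the backbone
    x = base.output  # (None, 7, 7, 1280)
    # Placeholder attention: channel-wise gating plus a shortcut connection
    gate = layers.Dense(x.shape[-1], activation="sigmoid")(
        layers.GlobalAveragePooling2D()(x))
    att = layers.Multiply()([x, layers.Reshape((1, 1, x.shape[-1]))(gate)])
    x = layers.Add()([x, att])
    # 1 x 1 convolution (1280 filters) + BatchNormalization, as in Table 2
    x = layers.Conv2D(1280, 1, activation="relu")(x)
    x = layers.BatchNormalization()(x)
    # Global pooling replaces the fully connected layer; Softmax classifier
    x = layers.GlobalAveragePooling2D()(x)
    out = layers.Dense(num_classes, activation="softmax")(x)
    return models.Model(base.input, out, name="mobile_atten_sketch")

model = build_mobile_atten()
```

In a second transfer-learning stage, `base.trainable` would be switched back on for fine-tuning at a small learning rate.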
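A NumPy sketch of the enhanced multi-class Focal Loss of Eqs. (9)–(11); the class weights follow Eq. (10), while γ = 2 and the example probabilities are our own illustrative choices, not the paper's tuned values:

```python
import numpy as np

def focal_loss_multiclass(probs, labels, alphas, gamma=2.0):
    """Eq. (9): L = -sum_c alpha_c * (1 - p_c)^gamma * y_c * log(p_c),
    averaged over the batch. probs: (N, C) softmax outputs; labels: (N,) ids."""
    n = probs.shape[0]
    p_true = probs[np.arange(n), labels]   # p_c where y_c = 1 (Eq. (11))
    a_true = alphas[labels]                # alpha_c for the actual class
    losses = -a_true * (1.0 - p_true) ** gamma * np.log(p_true)
    return losses.mean()

# alpha_c = count(x_i in c) / count(x_i), Eq. (10)
labels = np.array([0, 0, 1, 2, 2, 2])
alphas = np.bincount(labels, minlength=3) / labels.size
probs = np.array([[0.8, 0.1, 0.1],
                  [0.6, 0.3, 0.1],
                  [0.2, 0.7, 0.1],
                  [0.1, 0.2, 0.7],
                  [0.3, 0.3, 0.4],
                  [0.1, 0.1, 0.8]])
loss = focal_loss_multiclass(probs, labels, alphas)
```

The modulating term (1 − pc)^γ down-weights well-classified samples, so hard examples dominate the gradient.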
Table 3
The training accuracy and loss of different methods on the public dataset.
Pre-trained models | 10 epochs: Training accuracy (%), Test accuracy (%), Training loss | 30 epochs: Training accuracy (%), Test accuracy (%), Training loss, Test loss
EfficientNet-B0 (Tan & Le, 2019), and DenseNet-121 (Huang et al., 2017) as the baseline algorithms to perform the comparative analyses. Using the approach of transfer learning, the top layers of all the models were abandoned and a new fully-connected Softmax layer with the practical number of categories was used as the classification layer. In addition, to specifically study the rationality and effectiveness of the attention module used in the networks, we adopted the approach of training the pre-trained MobileNet-V2 model by twice transfer learning, which we termed MobileNetv2_2stage, to identify the plant disease images. In the same way, we also used the approach of training the Mobile-Atten network once instead of with the twice transfer learning solution, which we call MobileAtten_alpha, to study the improvement of the overall network performance by the proposed procedure. These CNN models were constructed and their parameters initialized by loading the pre-trained weights from ImageNet. A batch size of 64 was used to train the networks, with 30 epochs and a learning rate of lr = 10^-4. The diverse CNN models were thus trained and multiple experiments were conducted on the public dataset. Table 3 displays the training and test results.

From Table 3, it can be seen that the proposed procedure achieves an increased efficacy and delivers comparable results relative to other state-of-the-art methods. After training for 30 epochs, the proposed procedure reached a test accuracy of 98.84% on the public dataset. This is the top performance of all the algorithms except for DenseNet-121, which is a deep CNN with a relatively large volume that consumes more computational resources. In contrast, the proposed Mobile-Atten approach is memory efficient, and the model volume is only about 1/3 of that of DenseNet-121. It delivered a comparable effect and relatively high accuracy in the experiments, even though the optimum classifier is employed. The key explanation for this is that the Mobile-Atten model integrates the merits of both MobileNet-V2 and the channel and spatial attention module, which increases the capability of learning minute lesion features. Moreover, transfer learning is performed twice for model training, which allows the optimum weight parameters to be obtained for the newly generated network. By contrast, the other methods are individual networks; although their weights are initialized with pre-trained networks instead of being inferred from scratch and the fine-tuning approach is applied too, the optimum results are not achieved for these models. Therefore, using the model trained by the proposed procedure, new unseen images are chosen to perform the identification of plant disease types. Referring to the work of Hari et al. (2019), partial plant leaf images, such as apple, grape, and potato, are selected for testing, and the corresponding ROC (receiver operating characteristic) curve and confusion matrices of the identified results are depicted in Fig. 7.

Considering the statistics of accurate detections (true positives), misdetections (false negatives), true negatives, and false positives, we performed an exhaustive analysis of the results output from the different CNN models. Various metrics, such as Accuracy, Sensitivity (Recall), Specificity, Precision, and F1-Score, were used to measure the performance of plant disease identification, as written in Eqs. (14)–(18):

Accuracy = (TP + TN) / (TP + TN + FP + FN)    (14)

Sensitivity = TP / (TP + FN)    (15)
Table 4
The identification results of different plant disease types.
No. | Category names | Identified samples | Correct samples | Accuracy (%) | Sensitivity (%) | Specificity (%) | Precision (%) | F1-Score (%)
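The per-class indicators reported in Table 4 follow from the four basic counts; a small helper implementing Eqs. (14)–(18) (the Specificity, Precision, and F1-Score formulas are the standard definitions; the counts below are illustrative values, not the paper's):

```python
def classification_metrics(tp, tn, fp, fn):
    # Eqs. (14)-(18): Accuracy, Sensitivity (Recall), Specificity, Precision, F1
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)        # TP / (TP + FN)
    specificity = tn / (tn + fp)        # TN / (TN + FP)
    precision = tp / (tp + fp)          # TP / (TP + FP)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return {"accuracy": accuracy, "sensitivity": sensitivity,
            "specificity": specificity, "precision": precision, "f1": f1}

# illustrative counts for one disease class in a one-vs-rest evaluation
m = classification_metrics(tp=13, tn=250, fp=1, fn=1)
```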
Table 5
Comparing with the latest literature on the public PlantVillage dataset.
References | CNN models | No. of classes | No. of training samples | No. of test samples | Accuracies
Table 6
The training accuracy and loss of different methods on the local rice image dataset.
Pre-trained models | 10 epochs: Training accuracy (%), Test accuracy (%), Training loss | 30 epochs: Training accuracy (%), Test accuracy (%), Training loss, Test loss
was set to 224 × 224 × 3, and the corresponding 32 × 3 convolutional block followed by a 64 × 128 × 3 convolutional block was used in the discriminator module. In this manner, the assigned input dimension of 64 × 64 pixels was substituted with a 224 × 224 pixel input dimension. The hyperparameters were set as follows: an epoch count of 5 × 10^5, a batch size of 16, a learning rate of 1 × 10^-4, and the Adam optimizer. By doing this, we augmented the sample images; partial samples synthesized by the traditional approaches and by the enhanced DCGAN method are displayed in Figs. 8 and 9 respectively.

Subsequently, using the method proposed in Section 2.4, we performed the model training and validation on the rice image dataset, and all the augmented images were uniformly resized to the fixed dimension of 224 × 224 pixels to fit the model. Likewise, the five most-used CNNs, MobileNet, MobileNet-V2, NASNetMobile, EfficientNet-B0, and DenseNet-121, were selected as comparison models. In the meantime, the MobileNetv2_2stage and MobileAtten_alpha approaches were also added to the comparative experiments. Applying transfer learning, these models are created and loaded with pre-trained weights from ImageNet, and the top layers are truncated by defining a new fully-connected Softmax layer with the practical number of classes. Thus, the models of the diverse CNNs are trained and multiple experiments are conducted on our local dataset. Table 6 shows the training accuracy
Fig. 10. ROC curve and confusion matrix of rice disease identification.
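A ROC curve such as the one in Fig. 10(a) plots the true-positive rate against the false-positive rate as the decision threshold sweeps over the predicted scores; a minimal one-vs-rest NumPy sketch (the scores and labels below are invented examples, not the paper's data):

```python
import numpy as np

def roc_points(scores, labels):
    """TPR/FPR pairs for a binary (one-vs-rest) problem, with the threshold
    swept from high to low over the predicted scores."""
    order = np.argsort(-scores)              # descending by score
    labels = labels[order].astype(bool)
    tps = np.cumsum(labels)                  # true positives at each cut
    fps = np.cumsum(~labels)                 # false positives at each cut
    tpr = tps / labels.sum()
    fpr = fps / (~labels).sum()
    # prepend the (0, 0) corner so the curve starts at the origin
    return np.concatenate(([0.0], fpr)), np.concatenate(([0.0], tpr))

scores = np.array([0.9, 0.8, 0.7, 0.4, 0.3, 0.1])
labels = np.array([1, 1, 0, 1, 0, 0])
fpr, tpr = roc_points(scores, labels)
```

Repeating this per class against its softmax score column yields the family of curves shown in the figure.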
Table 7
The identified results of different rice disease classes.
ID | Rice disease types | Identified samples | Correct samples | Accuracy (%) | Sensitivity (%) | Specificity (%) | Precision (%) | F1-Score (%)
and loss of the different approaches.

It can be observed from Table 6 that the proposed procedure gains a significant increase in efficacy relative to the other state-of-the-art CNNs. When trained for 10 epochs and 30 epochs, the training accuracies of the proposed procedure reached 98.52% and 99.75% respectively. In particular, after 30 epochs of training, the test accuracy achieves 90.51%, which is the top performance of all the algorithms except for EfficientNet-B0 and DenseNet-121. However, the training log-loss of EfficientNet-B0 is excessively large, even reaching 19.3843 when training for 10 epochs. For DenseNet-121, as discussed earlier, its volume is around 3 times that of our method and it consumes more computational resources. By contrast, the proposed procedure is memory efficient and performs well, with relatively high accuracy and low log-loss. Thus, using the model trained by the proposed procedure, new unseen images are selected for the identification of rice plant diseases. Fig. 10 depicts the identification results, and an exhaustive indicator analysis is displayed in Table 7.

As can be seen in Fig. 10(a), the ROC curve presents a higher TPR (true-positive rate) with a lower FPR (false-positive rate) for most identified classes, which is also reflected in the confusion matrix of Fig. 10(b). The proposed procedure has successfully identified most of the test samples of the different rice disease types. For example, 13 "Rice stackburn" samples are accurately identified with only 1 false detection, and the identification accuracy is 98.49%. The correct number is 18 for the identification of the "Rice leaf smut" disease in 38 samples. Also, 15 images are accurately identified by the proposed procedure in 19 samples, and the identification accuracy reaches 99.09%. Thus, a total of 240 samples are accurately identified in 265 unseen images, and the average Accuracy, Sensitivity, and Specificity respectively achieve 98.48%, 90.56%, and 99.17%, as shown in Table 7. On the other hand, there are also some misidentifications, such as 3 samples of the "Rice white tip" disease type, caused by different plant diseases such as "rice white tip" and "rice leaf burn" occurring in the same plant. Besides, heterogeneous background conditions and irregular lighting strengths may also affect the identification results. Fig. 11 displays partial identified sample images.

As seen in Fig. 11, the top images are the original sample images, the middle images are the positioning images presented by the visualization technique of the class activation map, and the bottom images are the results identified by the proposed procedure. From Fig. 11, we can see that the identified categories of most samples are consistent with their actual categories, and most of the rice plant diseases have been accurately identified by the proposed procedure except for some individual samples. For example, in Fig. 11(a), the actual disease type of the sample is "Rice stackburn", which is correctly identified by the proposed procedure with a probability of 0.74. Similarly, Fig. 11(b, d) are all accurately identified, and the identification probabilities are respectively greater than 0.99 and 0.84. Conversely, there are individual misidentified samples due to seriously cluttered backgrounds and uneven lighting conditions, which affect the feature extraction of the disease spot images. Besides, different rice diseases occurring in the same plant may also affect the classification results, as shown in Fig. 11(c). In spite
of individual misidentifications, most of the rice plant diseases are accurately identified by the proposed procedure. The high accuracy achieved on the unseen images over a set of experiments indicates that the Mobile-Atten approach has a significant capability to identify rice plant diseases. Based on the experimental analysis, it can be concluded that the proposed procedure is successful in the identification of rice plant diseases, and it can also be extended to other fields such as computer-aided detection, online defect assessment, etc.

4. Conclusions

Identification and classification of various plant diseases by means of digital images is very necessary for improving the quality of plant products. Deep learning techniques, especially CNNs, can effectively and efficiently identify most of the visual symptoms related to plant diseases (Barbedo, 2019). In this paper, we proposed a novel network architecture called Mobile-Atten, with a small model size and relatively high accuracy, to perform the identification of rice disease types. MobileNet-V2 was chosen as the backbone model in our method. Using transfer learning, we transferred the common knowledge of MobileNet-V2 pre-trained on ImageNet and added the attention module to the pre-trained model to create a new network for identifying the types of rice plant diseases. Also, the loss function was optimized and the model training was performed by twice transfer learning. The experimental findings demonstrate the feasibility and validity of the proposed procedure, which successfully identifies diverse plant diseases on both the public dataset and the local dataset. In future development, we intend to deploy it on mobile devices to monitor and identify a wide range of rice plant diseases automatically. Meanwhile, we plan to apply it to more real-world applications.

CRediT authorship contribution statement

Junde Chen: Methodology, Software, Writing - original draft. Defu Zhang: Supervision. Adnan Zeb: . Yaser A. Nanehkaran: Data curation.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgements

The work is supported by the National Natural Science Foundation of China under grant No. 61672439 and the Fundamental Research Funds for the Central Universities No. 20720181004. The authors also would like to thank the editors and all the anonymous reviewers for their constructive advice.

References

Arsenovic, M., Karanovic, M., Sladojevic, S., Anderla, A., & Stefanovic, D. (2019). Solving current limitations of deep learning based approaches for plant disease detection. Symmetry, 11(7), 939.

Amara, J., Bouaziz, B., & Algergawy, A. (2017). A deep learning-based approach for banana leaf diseases classification. Datenbanksysteme für Business, Technologie und Web (BTW 2017)-Workshopband.

Anderson, P., He, X., Buehler, C., Teney, D., Johnson, M., Gould, S., & Zhang, L. (2018). Bottom-up and top-down attention for image captioning and visual question answering. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 6077–6086).

Bai, X., Cao, Z., Zhao, L., Zhang, J., Lv, C., Li, C., & Xie, J. (2018). Rice heading stage automatic observation by multi-classifier cascade based rice spike detection method. Agricultural and Forest Meteorology, 259, 260–270.
Barbedo, J. G. A. (2018). Factors influencing the use of deep learning for plant disease recognition. Biosystems Engineering, 172, 84–91. https://doi.org/10.1016/j.biosystemseng.2018.05.013

Arnal Barbedo, J. G. (2019). Plant disease identification from individual lesions and spots using deep learning. Biosystems Engineering, 180, 96–107. https://doi.org/10.1016/j.biosystemseng.2019.02.002

Brahimi, M., Boukhalfa, K., & Moussaoui, A. (2017). Deep Learning for tomato diseases: Classification and symptoms visualization. Applied Artificial Intelligence, 31(4), 299–315. https://doi.org/10.1080/08839514.2017.1315516

Dargan, S., Kumar, M., Ayyagari, M. R., & Kumar, G. (2019). A survey of deep learning and its applications: A new paradigm to machine learning. Archives of Computational Methods in Engineering, 1–22.

Durmuş, H., Güneş, E. O., & Kırcı, M. (2017, August). Disease detection on the leaves of the tomato plants by using deep learning. In 2017 6th International Conference on Agro-Geoinformatics (pp. 1–5). IEEE.

Elhassouny, A., & Smarandache, F. (2019, July). Smart mobile application to recognize tomato leaf diseases using Convolutional Neural Networks. In 2019 International Conference of Computer Science and Renewable Energies (ICCSRE) (pp. 1–4). IEEE.

Ferentinos, K. P. (2018). Deep learning models for plant disease detection and diagnosis. Computers and Electronics in Agriculture, 145, 311–318. https://doi.org/10.1016/j.compag.2018.01.009

Gandhi, R., Nimbalkar, S., Yelamanchili, N., & Ponkshe, S. (2018, May). Plant disease detection using CNNs and GANs as an augmentative approach. In 2018 IEEE International Conference on Innovative Research and Development (ICIRD) (pp. 1–5). IEEE.

Gharge, S., & Singh, P. (2016). Image processing for soybean disease classification and severity estimation. In Emerging research in computing, information, communication and applications (pp. 493–500). Springer, New Delhi.

Ghazi, M. M., Berrin, Y., & Erchan, A. (2017). Plant identification using deep neural networks via optimization of transfer learning parameters. Neurocomputing, 235, 228–235.

Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., ... & Bengio, Y. (2014). Generative adversarial nets. In Advances in neural information processing systems (pp. 2672–2680).

Hari, S. S., Sivakumar, M., Renuga, P., & Suriya, S. (2019, March). Detection of Plant Disease by Leaf Image Using Convolutional Neural Network. In 2019 International Conference on Vision Towards Emerging Trends in Communication and Networking (ViTECoN) (pp. 1–5). IEEE.

He, K., Zhang, X., Ren, S., & Sun, J. (2015). Delving deep into rectifiers: Surpassing human-level performance on imagenet classification. In Proceedings of the IEEE

Lu, Y., Yi, S., Zeng, N., Liu, Y., & Zhang, Y. (2017). Identification of rice diseases using deep convolutional neural networks. Neurocomputing, 267, 378–384. https://doi.org/10.1016/j.neucom.2017.06.023

Mohanty, S. P., Hughes, D. P., & Salathé, M. (2016). Using deep learning for image-based plant disease detection. Frontiers in Plant Science, 7, 1419.

Nan, Y., & Xi, W. (2019, November). Classification of Press Plate Image Based on Attention Mechanism. In 2019 2nd International Conference on Safety Produce Informatization (IICSPI) (pp. 129–132). IEEE.

Pan, S. J., & Yang, Q. (2009). A survey on transfer learning. IEEE Transactions on Knowledge and Data Engineering, 22(10), 1345–1359. https://doi.org/10.1109/TKDE.2009.191

Prajapati, H. B., Shah, J. P., & Dabhi, V. K. (2017). Detection and classification of rice plant diseases. Intelligent Decision Technologies, 11(3), 357–373. https://doi.org/10.3233/IDT-170301

Picon, A., Seitz, M., Alvarez-Gila, A., Mohnke, P., Ortiz-Barredo, A., & Echazarra, J. (2019). Crop conditional Convolutional Neural Networks for massive multi-crop plant disease classification over cell phone acquired images taken on real field conditions. Computers and Electronics in Agriculture, 167, 105093. https://doi.org/10.1016/j.compag.2019.105093

Radford, A., Metz, L., & Chintala, S. (2015). Unsupervised representation learning with deep convolutional generative adversarial networks. arXiv preprint arXiv:1511.06434.

Rahman, C. R., Arko, P. S., Ali, M. E., Iqbal Khan, M. A., Apon, S. H., Nowrin, F., & Wasif, A. (2020). Identification and recognition of rice diseases and pests using convolutional neural networks. Biosystems Engineering, 194, 112–120. https://doi.org/10.1016/j.biosystemseng.2020.03.020

Russakovsky, O., Deng, J., Su, H., Krause, J., Satheesh, S., Ma, S., ... & Berg, A. C. (2015). Imagenet large scale visual recognition challenge. International Journal of Computer Vision, 115(3), 211–252.

Sandler, M., Howard, A., Zhu, M., Zhmoginov, A., & Chen, L. C. (2018). Mobilenetv2: Inverted residuals and linear bottlenecks. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 4510–4520).

Sardogan, M., Tuncer, A., & Ozen, Y. (2018, September). Plant leaf disease detection and classification based on CNN with LVQ algorithm. In 2018 3rd International Conference on Computer Science and Engineering (UBMK) (pp. 382–385). IEEE.

Sharif, M., Khan, M. A., Iqbal, Z., Azam, M. F., Lali, M. I. U., & Javed, M. Y. (2018). Detection and classification of citrus diseases in agriculture based on optimized weighted segmentation and feature selection. Computers and Electronics in Agriculture, 150, 220–234. https://doi.org/10.1016/j.compag.2018.04.023

Shen, Y., Sun, H., Xu, X., & Zhou, J. (2019, July). Detection and Positioning of Surface
international conference on computer vision (pp. 1026–1034). Defects on Galvanized Sheet Based on Improved MobileNet v2. In 2019 Chinese
Howard, A. G., Zhu, M., Chen, B., Kalenichenko, D., Wang, W., Weyand, T., ... & Adam, Control Conference (CCC) (pp. 8450–8454). IEEE.
H. (2017). Mobilenets: Efficient convolutional neural networks for mobile vision Sifre, L., & Mallat, S. (2014). Rigid-motion scattering for image classification. Ph. D.
applications. arXiv preprint arXiv:1704.04861. thesis.
Huang, G., Liu, Z., Van Der Maaten, L., & Weinberger, K. Q. (2017). Densely connected Singh, A. K., Rubiya, A., & Raja, B. S. (2015). Classification of rice disease using digital
convolutional networks. In Proceedings of the IEEE conference on computer vision image processing and svm classifier. International Journal of Electrical and Electronics
and pattern recognition (pp. 4700–4708). Engineers, 7(1), 294–299.
Hughes, D., & Salathé, M. (2015). An open access repository of images on plant health to Song, K., Sun, X. Y., & Ji, J. W. (2007). Corn leaf disease recognition based on support
enable the development of mobile disease diagnostics. arXiv preprint arXiv: vector machine method. Transactions of the CSAE, 23(1), 155–157.
1511.08060. Tan, M., & Le, Q. V. (2019). Efficientnet: Rethinking model scaling for convolutional
Karthik, R., Hariharan, M., Anand, S., Mathikshara, P., Johnson, A., & Menaka, R. neural networks. arXiv preprint arXiv:1905.11946.
(2020). Attention embedded residual CNN for disease detection in tomato leaves. Tm, P., Pranathi, A., SaiAshritha, K., Chittaragi, N. B., & Koolagudi, S. G. (2018, August).
Applied Soft Computing, 86, Article 105933. Tomato leaf disease detection using convolutional neural networks. In 2018
Kahar, M. A., Mutalib, S., & Abdul-Rahman, S. H. U. Z. L. I. N. A. (2015). Early detection Eleventh International Conference on Contemporary Computing (IC3) (pp. 1–5).
and classification of paddy diseases with neural networks and fuzzy logic. IEEE.
In Proceedings of the 17th International Conference on Mathematical and Too, E. C., Yujian, L.i., Njuki, S., & Yingchun, L. (2019). A comparative study of fine-
Computational Methods in Science and Engineering, MACMESE (pp. 248–257). tuning deep learning models for plant disease identification. Computers and
Kessentini, Y., Besbes, M. D., Ammar, S., & Chabbouh, A. (2019). A two-stage deep neural Electronics in Agriculture, 161, 272–279. https://doi.org/10.1016/j.
network for multi-norm license plate detection and recognition. Expert Systems with compag.2018.03.032
Applications, 136, 159–170. https://doi.org/10.1016/j.eswa.2019.06.036 Wang, F., Jiang, M., Qian, C., Yang, S., Li, C., Zhang, H., ... & Tang, X. (2017). Residual
Khamparia, A., Saini, G., Gupta, D., Khanna, A., Tiwari, S., & de Albuquerque, V. H. C. attention network for image classification. In Proceedings of the IEEE conference on
(2020). Seasonal crops disease prediction and classification using deep convolutional computer vision and pattern recognition (pp. 3156–3164).
encoder network. Circuits, Systems, and Signal Processing, 39(2), 818–836. https:// Woo, S., Park, J., Lee, J. Y., & So Kweon, I. (2018). Cbam: Convolutional block attention
doi.org/10.1007/s00034-019-01041-0 module. In Proceedings of the European conference on computer vision (ECCV) (pp.
Kingma, D. P., & Ba, J. (2014). Adam: A method for stochastic optimization. arXiv 3–19).
preprint arXiv:1412.6980. Zhang, S., Zhang, C., Wang, Z., & Kong, W. (2018). Combining sparse representation and
Li, C. H., Zhang, J. B., Hu, X. P., & Zhao, G. F. (2013). Algorithm research of two- singular value decomposition for plant recognition. Applied Soft Computing, 67,
dimensional size measurement on parts based on machine vision. In Advanced 164–171. https://doi.org/10.1016/j.asoc.2018.02.052
Materials Research (Vol. 694, pp. 1945–1948). Trans Tech Publications Ltd. Zoph, B., Vasudevan, V., Shlens, J., & Le, Q. V. (2018). Learning transferable
Lin, T. Y., Goyal, P., Girshick, R., He, K., & Dollár, P. (2017). Focal loss for dense object architectures for scalable image recognition. In Proceedings of the IEEE conference
detection. In Proceedings of the IEEE international conference on computer vision on computer vision and pattern recognition (pp. 8697–8710).
(pp. 2980–2988).