
LEAF DISEASE DETECTION USING CNN ALGORITHM

Minor project-1 report submitted


in partial fulfillment of the requirement for award of the degree of

Bachelor of Technology
in
Computer Science & Engineering

By

THOTLA NARASIMHA REDDY (21UECM0245) (VTU19373)


THOTLA DHARMA REDDY (21UECM0244) (VTU19372)
NARAYANAPURAM RAVINDRA BABU (21UECM0170) (VTU19868)

Under the guidance of


Dr. S. LALITHA, B.Tech., M.E., Ph.D.,
ASSOCIATE PROFESSOR

DEPARTMENT OF COMPUTER SCIENCE & ENGINEERING


SCHOOL OF COMPUTING

VEL TECH RANGARAJAN DR. SAGUNTHALA R&D INSTITUTE OF


SCIENCE & TECHNOLOGY
(Deemed to be University Estd u/s 3 of UGC Act, 1956)
Accredited by NAAC with A++ Grade
CHENNAI 600 062, TAMIL NADU, INDIA

January, 2024
CERTIFICATE
It is certified that the work contained in the project report titled "LEAF DISEASE DETECTION USING CNN ALGORITHM" by THOTLA NARASIMHA REDDY (21UECM0245), THOTLA DHARMA REDDY (21UECM0244) and NARAYANAPURAM RAVINDRA BABU (21UECM0170) has been carried out under my supervision and that this work has not been submitted elsewhere for a degree.

Signature of Supervisor
Dr. S. Lalitha
Associate Professor
Computer Science & Engineering
School of Computing
Vel Tech Rangarajan Dr. Sagunthala R&D
Institute of Science & Technology
January, 2024

Signature of Head of the Department
Computer Science & Engineering
School of Computing
Vel Tech Rangarajan Dr. Sagunthala R&D
Institute of Science & Technology
January, 2024

Signature of the Dean
Dr. V. Srinivasa Rao
Professor & Dean
Computer Science & Engineering
School of Computing
Vel Tech Rangarajan Dr. Sagunthala R&D
Institute of Science & Technology
January, 2024

DECLARATION

We declare that this written submission represents our ideas in our own words and that, where others' ideas or words have been included, we have adequately cited and referenced the original sources. We also declare that we have adhered to all principles of academic honesty and integrity and have not misrepresented, fabricated or falsified any idea/data/fact/source in our submission. We understand that any violation of the above will be cause for disciplinary action by the Institute and can also evoke penal action from the sources which have thus not been properly cited or from whom proper permission has not been taken when needed.

Thotla Narasimha Reddy


Date: / /

Thotla Dharma Reddy


Date: / /

Narayanapuram Ravindra Babu


Date: / /

APPROVAL SHEET

This project report entitled "LEAF DISEASE DETECTION USING CNN ALGORITHM" by THOTLA NARASIMHA REDDY (21UECM0245), THOTLA DHARMA REDDY (21UECM0244) and NARAYANAPURAM RAVINDRA BABU (21UECM0170) is approved for the degree of B.Tech in Computer Science & Engineering.

Examiners Supervisor

Dr. S. LALITHA, B.Tech., M.E., Ph.D.,

Date: / /
Place:

ACKNOWLEDGEMENT

We express our deepest gratitude to our respected Founder Chancellor and President Col. Prof. Dr. R. RANGARAJAN, B.E. (EEE), B.E. (MECH), M.S. (AUTO), D.Sc., and to our Foundress President Dr. R. SAGUNTHALA RANGARAJAN, M.B.B.S., Chairperson Managing Trustee and Vice President.

We are grateful to our beloved Vice Chancellor Prof. S. SALIVAHANAN for providing us with an environment to complete our project successfully.

We record our indebtedness to our Professor & Dean, Department of Computer Science & Engineering, School of Computing, Dr. V. SRINIVASA RAO, M.Tech., Ph.D., for the immense care and encouragement shown towards us throughout the course of this project.

We are thankful to our Head, Department of Computer Science & Engineering, Dr. M. S. MURALI DHAR, M.E., Ph.D., for providing immense support in all our endeavors.

We also take this opportunity to express a deep sense of gratitude to our Internal Supervisor, Dr. S. LALITHA, B.Tech., M.E., Ph.D., for her cordial support, valuable information and guidance; she helped us complete this project through its various stages.

A special thanks to our Project Coordinators Mr. V. ASHOK KUMAR, M.Tech., Ms. C. SHYAMALA KUMARI, M.E., and Mr. SHARAD SHANDHI RAVI, M.Tech., for their valuable guidance and support throughout the course of the project.

We thank our department faculty, supporting staff and friends for their help and guidance in completing this project.

THOTLA NARASIMHA REDDY (21UECM0245)


THOTLA DHARMA REDDY (21UECM0244)
NARAYANAPURAM RAVINDRA BABU (21UECM0170)

ABSTRACT

Leaf diseases are a major threat to farmers, consumers, the environment and the global economy. In India alone, 35 percent of field crops are lost to pathogens and pests, causing losses to farmers. Indiscriminate use of pesticides is also a serious health concern, as many are toxic and biomagnified. These adverse effects can be avoided by early disease detection, crop surveillance and targeted treatments. Most diseases are diagnosed by agricultural experts by examining external symptoms. However, farmers have limited access to experts. This project is the first integrated and collaborative platform for automated disease diagnosis, tracking and forecasting. Farmers can instantly and accurately identify diseases and get solutions with a mobile app by photographing affected plant parts. Real-time diagnosis is enabled using the latest Artificial Intelligence (AI) algorithms for Cloud-based image processing. The AI model continuously learns from user-uploaded images and expert suggestions to enhance its accuracy. Farmers can also interact with local experts through the platform. For preventive measures, disease density maps with spread forecasting are rendered from a Cloud-based repository of geo-tagged images and micro-climatic factors. A web interface allows experts to perform disease analytics with geographical visualizations. In our experiments, the AI model (CNN) was trained with large disease datasets, created with plant images self-collected from many farms over 7 months. Test images were diagnosed using the automated CNN model and the results were validated by plant pathologists. Over 95 percent disease identification accuracy was achieved. The solution is a novel, scalable and accessible tool for disease management of diverse agricultural crop plants and can be deployed as a Cloud-based service for farmers and experts for ecologically sustainable crop production.

Keywords: Leaf Disease, Organic Farming, Image Classification, Computer Vision, Deep Learning, Feature Extraction, Training Dataset, Image Preprocessing, Neural Network Architecture, Disease Recognition

LIST OF FIGURES

4.1 Architecture diagram
4.2 Data Flow diagram
4.3 Use Case diagram
4.4 Class diagram
4.5 Sequence diagram
4.6 Collaboration diagram
4.7 Activity Diagram

5.1 Input of the leaf
5.2 Leaf disease detected
5.3 Unit Testing Result
5.4 Integration Testing Result
5.5 System Testing Result
5.6 Test Image

6.1 Output 1
6.2 Output 2

8.1 Plagiarism

9.1 Poster
LIST OF ACRONYMS AND ABBREVIATIONS

AI Artificial Intelligence
CNN Convolutional Neural Networks
IoT Internet of Things
NIR Near Infra-Red
NN Neural Networks
UGV Unmanned Ground Vehicle

TABLE OF CONTENTS

ABSTRACT

LIST OF FIGURES

LIST OF ACRONYMS AND ABBREVIATIONS

1 INTRODUCTION
  1.1 Introduction
  1.2 Aim of the project
  1.3 Project Domain
  1.4 Scope of the Project

2 LITERATURE REVIEW

3 PROJECT DESCRIPTION
  3.1 Existing System
  3.2 Proposed System
  3.3 Feasibility Study
      3.3.1 Economic Feasibility
      3.3.2 Technical Feasibility
      3.3.3 Social Feasibility
  3.4 System Specification
      3.4.1 Hardware Specification
      3.4.2 Software Specification
      3.4.3 Standards and Policies

4 METHODOLOGY
  4.1 General Architecture
  4.2 Design Phase
      4.2.1 Data Flow Diagram
      4.2.2 Use Case Diagram
      4.2.3 Class Diagram
      4.2.4 Sequence Diagram
      4.2.5 Collaboration Diagram
      4.2.6 Activity Diagram
  4.3 Algorithm & Pseudo Code
      4.3.1 Algorithm
      4.3.2 Pseudo Code
  4.4 Module Description
      4.4.1 Module 1: Disease Surveillance and Monitoring
      4.4.2 Module 2: Data Analytics and Modeling

5 IMPLEMENTATION AND TESTING
  5.1 Input and Output
      5.1.1 Input Design
      5.1.2 Output Design
  5.2 Testing
  5.3 Types of Testing
      5.3.1 Unit Testing
      5.3.2 Integration Testing
      5.3.3 System Testing
      5.3.4 Test Result

6 RESULTS AND DISCUSSIONS
  6.1 Efficiency of the Proposed System
  6.2 Comparison of Existing and Proposed System
  6.3 Sample Code

7 CONCLUSION AND FUTURE ENHANCEMENTS
  7.1 Conclusion
  7.2 Future Enhancements

8 PLAGIARISM REPORT

9 SOURCE CODE & POSTER PRESENTATION
  9.1 Source Code
  9.2 Poster Presentation

References
Chapter 1

INTRODUCTION

1.1 Introduction

Agriculture is fundamental to human survival. For densely populated developing countries like India, it is even more imperative to increase the productivity of crops, fruits and vegetables. Not only productivity but also the quality of produce needs to stay high for better public health. However, both the productivity and the quality of food are hampered by factors such as the spread of diseases that could have been prevented with early diagnosis. Many of these diseases are infectious, leading to total loss of crop yield. Given the vast geographical spread of agricultural lands, the low education levels of farmers coupled with limited awareness and lack of access to plant pathologists, human-assisted disease diagnosis is not effective and cannot keep up with the exorbitant requirements. To overcome the shortfall of human-assisted disease diagnosis,
it is imperative to build automation around crop disease diagnosis with technology
and introduce low cost and accurate machine assisted diagnosis easily accessible to
farmers. Some strides have been made in applying technologies such as robotics and
computer vision systems to solve myriad problems in the agricultural domain. The
potential of image processing has been explored to assist with precision agriculture
practices, weed and herbicide technologies, monitoring plant growth and plant
nutrition management. However, progress on automating plant disease diagnosis is
still rudimentary in spite of the fact that many plant diseases can be identified by
plant pathologists by visual inspection of physical symptoms such as detectable
change in color, wilting, appearance of spots and lesions etc. along with soil
and climatic conditions. Overall, the commercial level of investment in bridging agriculture and technology remains low compared with investments in more lucrative fields such as human health and education. Promising research efforts have not been productized due to challenges such as linking farmers to plant pathologists, the high cost of deployment and the scalability of solutions. Recent
developments in the fields of Mobile technology, Cloud computing and Artificial In-
telligence (AI) create a perfect opportunity for creating a scalable, low-cost solution for crop diseases that can be widely deployed. In developing countries such as India,
mobile phones with internet connectivity have become ubiquitous. Camera and GPS
enabled low cost mobile phones are widely available that can be leveraged by indi-
viduals to upload images with geolocation. Over widely available mobile networks,
they can communicate with more sophisticated Cloud based backend services which
can perform the compute heavy tasks, maintain a centralized database, and perform
data analytics. Another leap of technology in recent years is AI based image analysis
which has surpassed human eye capabilities and can accurately identify and classify
images. The underlying AI algorithms use Neural Networks (NN) which have
layers of neurons with a connectivity pattern inspired by the visual cortex. These
networks get “trained” on a large set of pre-classified “labeled” images to achieve
high accuracy of image classification on new, unseen images. Since 2012, when "AlexNet" won the ImageNet competition, deep Convolutional Neural Networks (CNNs) have consistently been the winning architecture for computer vision and image analysis. The breakthrough in the capabilities of CNNs has come with a combination of improved compute capabilities, large available image datasets and improved NN algorithms. Besides accuracy, AI has evolved and become more affordable and accessible with open-source platforms such as TensorFlow. Prior art related to our project includes initiatives to gather healthy and diseased crop images, image analysis using feature extraction, RGB images, spectral patterns and fluorescence imaging spectroscopy. Neural Networks have been used in the past
for plant disease identification but the approach was to identify texture features. Our
proposal takes advantage of the evolution of Mobile, Cloud and AI to develop an
end-to-end crop diagnosis solution that simulates the expertise (“intelligence”) of
plant pathologists and brings it to farmers. It also enables a collaborative approach
towards continually increasing the disease database and seeking expert advice when
needed for improved NN classification accuracy and tracking for outbreaks.
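
To make the idea of "training" a network on labeled leaf images concrete, the following is a minimal sketch in Keras (the library used later in this report); the layer sizes and the four-class output are illustrative assumptions, not the model actually deployed in this project.

from tensorflow.keras import layers, models

NUM_CLASSES = 4  # hypothetical number of disease categories

model = models.Sequential([
    layers.Input(shape=(224, 224, 3)),                # RGB leaf photo
    layers.Conv2D(32, 3, activation="relu"),          # learn local visual features
    layers.MaxPooling2D(),                            # downsample feature maps
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),  # class probabilities
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])

Training then amounts to fitting this model on a directory of labeled leaf images, exactly the training on pre-classified images described above.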

1.2 Aim of the project

The primary objective of a leaf disease detection project is to develop a robust


system that can swiftly and accurately identify diseases affecting crops. This en-
deavor involves leveraging advanced technologies such as image processing, ma-
chine learning, and data analytics to detect and diagnose crop diseases. The project
aims to achieve early detection of diseases, enabling timely interventions to mitigate
the spread and impact on crop yields. Key goals include ensuring high accuracy
in disease identification through the training of machine learning models on com-
prehensive datasets, automating the detection process using tools like drones for ef-
ficient coverage of large agricultural areas, and providing a user-friendly interface
for farmers to access real-time information about their crops. The project also as-
pires to address the diversity of crops and diseases by developing a system capable
of identifying various ailments, promoting cost-effective solutions, and integrating
seamlessly with existing agricultural practices. Educational components are incor-
porated to empower farmers with insights into identified diseases and recommended
treatments. Scalability, data security, and privacy considerations are vital to ensure
widespread adoption and trust among farmers, ultimately contributing to improved
crop health, increased yields, and sustainable agricultural practices on a global scale.

1.3 Project Domain

The project domain for leaf disease detection lies at the intersection of agriculture,
technology, and data science. In this domain, the focus is on developing advanced
systems and methodologies to detect and diagnose diseases affecting crops. Modern
technologies such as computer vision and machine learning play a pivotal role in
the implementation of this project. By utilizing image processing techniques, the
system can analyze visual data, such as images of crops captured by drones or other
imaging devices, to identify subtle signs and patterns indicative of various diseases.
Machine learning models are trained on extensive datasets containing both healthy
and diseased crop images, enabling the system to learn and accurately classify the
health status of crops.
Additionally, the project may involve the integration of sensor technologies, in-
cluding IoT (Internet of Things) devices, to monitor environmental conditions such
as humidity, temperature, and soil quality. These data points, when combined with visual information, contribute to a more comprehensive understanding of the factors
influencing crop health. The project’s domain extends beyond mere disease detec-
tion to encompass a holistic approach to precision agriculture, aiming to empower
farmers with timely information for effective decision-making. The overarching goal
is to enhance agricultural productivity, minimize yield losses, and contribute to sus-
tainable and efficient farming practices by leveraging cutting-edge technologies in
the realm of crop disease detection.

1.4 Scope of the Project

The scope of a leaf disease detection project is extensive, encompassing various


dimensions of agricultural technology and data science. At its core, the project aims
to address the critical challenge of timely disease identification in crops, thereby en-
hancing agricultural productivity and ensuring food security. The scope includes the
development of robust algorithms and machine learning models capable of process-
ing large datasets of crop images. These models can learn to differentiate between
healthy and diseased crops based on visual cues, enabling early detection and in-
tervention. The project may explore the integration of emerging technologies such
as drones and satellite imagery to facilitate scalable and efficient monitoring of vast
agricultural landscapes.
Furthermore, the scope extends to the implementation of a user-friendly inter-
face that allows farmers to interact with the system seamlessly. This interface could
take the form of a mobile application or a web platform, providing real-time in-
sights into the health of their crops and offering actionable recommendations for
disease management. Additionally, the project may involve the incorporation of ed-
ucational components to empower farmers with knowledge about prevalent diseases,
their causes, and potential preventive measures. By encompassing these facets, the
project’s scope aims not only to detect diseases but also to promote sustainable and
informed agricultural practices, contributing to the overall well-being of the agricul-
tural ecosystem.

Chapter 2

LITERATURE REVIEW

L. Saxena and L. Armstrong [5] observed that computer technologies have been shown to improve agricultural productivity in a number of ways. One technique which is emerging as a useful tool is image processing. Image processing has been used to assist with precision agriculture practices, weed and herbicide technologies, monitoring plant growth and plant nutrition management. Their paper highlights the future potential of image processing for different agricultural industry contexts.

A. Krizhevsky and I. Sutskever [1] used a deep convolutional neural network to classify 1.2 million high-resolution images. The network consists of five convolutional layers, some of which are followed by max-pooling layers, and three fully connected layers with a final 1000-way softmax. To reduce overfitting in the fully connected layers, they employed a recently developed regularization method called "dropout" that proved to be very effective.

D. L. Hernandez-Rabadan and F. Ramos-Quintana [3] used a methodology that integrates an unsupervised learning approach (a self-organizing map) and a supervised one (a Bayesian classifier) for segmenting diseased plants. During the testing phase, an input image is segmented by the Bayesian classifier and then converted into a binary image, wherein contours are extracted and analyzed to recover diseased areas that were incorrectly classified as non-vegetation.

S. Sankaran and A. Mishra [10] used visible-near infrared spectroscopy for detecting Huanglongbing in citrus orchards. Spectral reflectance data, spanning 350-2500 nm with 989 features, were collected using a spectroradiometer. Preprocessing normalized and averaged the data, reducing the features to 86. Three datasets, including first and second derivatives, were generated. Principal component analysis refined the features for classification, achieving accuracies of 95 percent on the second-derivatives dataset and 92 percent on the combined dataset using soft independent modeling of class analogies.

C. Szegedy et al. [2] noted that convolutional networks, central to modern computer vision, have seen significant advancements since 2014, particularly with very deep architectures gaining prominence. Despite the typical correlation between increased model size and improved performance, computational efficiency remains crucial for tasks like mobile vision. Their study explores scalable network approaches through factorized convolutions and rigorous regularization, achieving substantial gains on the ILSVRC 2012 classification challenge. The proposed methods demonstrate notable 21.2 percent top-1 and 5.6 percent top-5 error rates for single-frame evaluation, emphasizing efficient utilization of added computation with a network cost of 5 billion multiply-adds and fewer than 25 million parameters.

S. Khirade et al. [8] tackled the problem of plant disease detection using digital image processing techniques and a back-propagation neural network. The authors elaborated different techniques for the detection of plant disease using images of leaves. They implemented Otsu's thresholding followed by boundary detection and spot detection algorithms to segment the infected part of the leaf. After that, they extracted features such as color, texture, morphology and edges, which were then used for classification, i.e., to detect the plant disease.

S. C. Madiwalar and M. V. Wyawahare [7] analyzed different image processing approaches for plant disease detection in their research. The authors analyzed color and texture features for the detection of plant disease. They experimented with their algorithms on a dataset of 110 RGB images. The features extracted for classification were the mean and standard deviation of the red, green and blue channels, grey-level co-occurrence matrix features, and the mean and standard deviation of the image convolved with a Gabor filter. A support vector machine classifier was used for classification. The authors concluded that the mean and standard deviation features are effective for detecting normal leaves, whereas color features and Gabor filter features perform best for detecting anthracnose-affected leaves and leaf spot, respectively. They achieved the highest accuracy of 83.34 percent using all the extracted features.

P. Moghadam et al. [6] demonstrated the application of hyperspectral imaging to the plant disease detection task. The visible, near-infrared and short-wave infrared spectra were used in this research. The authors used the k-means clustering algorithm in the spectral domain for the segmentation of the leaf and proposed a novel grid-removal algorithm to remove the grid from hyperspectral images. They achieved an accuracy of 83 percent with vegetation indices in the spectral range and 93 percent accuracy with the full spectrum. Though the proposed method achieved higher accuracy, it requires a hyperspectral camera with 324 spectral bands, so the solution becomes too costly.

Sharath D. M. et al. [9] developed a bacterial blight detection system for the pomegranate plant using features such as color, mean, homogeneity, standard deviation, variance, correlation, entropy and edges. The authors implemented grab-cut segmentation for segmenting the region of interest in the image [4]. A Canny edge detector was used to extract the edges from the images. The authors successfully developed a system which can predict the infection level in the fruit.

G. Shrestha et al. [4] deployed a convolutional neural network to detect plant disease. The authors successfully classified 12 plant diseases with 88.80 percent accuracy. A dataset of 3,000 high-resolution RGB images was used for experimentation. The network has 3 blocks of convolution and pooling layers, which makes it computationally expensive. Also, the F1 score of the model is 0.12, which is very low because of the high number of false-negative predictions.

Chapter 3

PROJECT DESCRIPTION

3.1 Existing System

In India alone, 35 percent of field crops are lost to pathogens and pests, causing losses to farmers. Indiscriminate use of pesticides is also a serious health concern, as many are toxic and biomagnified. These adverse effects can be avoided by early disease detection, crop surveillance and targeted treatments. Most diseases are diagnosed by agricultural experts by examining external symptoms. However, farmers have limited access to experts.
Existing System Disadvantages:
1. Lower accuracy
2. Low efficiency

3.2 Proposed System

In this project, a convolutional neural network serves as the artificial intelligence component: it is trained on images of plant diseases, and when new images are uploaded the CNN predicts which plant disease is present in them. Cloud services are used to store the trained CNN model and the images; thus, AI predicts the plant disease and the Cloud stores the data. Although a smartphone could be used to upload images, designing an Android application would add extra cost and time, so the system is built as a Python web application. Using this web application, the CNN model is trained and users can upload images, to which the application applies the CNN model to predict diseases. If this web application is deployed on a real web server, it can extract each user's location from the request object and display those locations on a map.
Proposed System Advantages:
1. High accuracy
2. High efficiency
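
As a rough illustration of the geolocation idea above, a Flask view can read the client's address from the request object; mapping that address to coordinates for the map would need an external geolocation service, so the lookup below is only a labeled placeholder.

from flask import Flask, request

app = Flask(__name__)

@app.route("/upload", methods=["POST"])
def upload():
    # The client's IP address is available on the request object;
    # access_route also covers proxies that set X-Forwarded-For.
    client_ip = request.access_route[0] if request.access_route else request.remote_addr
    # Hypothetical helper: resolving an IP to (latitude, longitude)
    # would use an external geolocation service or database.
    # lat, lon = lookup_coordinates(client_ip)
    return f"Image received from {client_ip}"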

3.3 Feasibility Study

3.3.1 Economic Feasibility

The economic feasibility of deploying a leaf disease detection system using Con-
volutional Neural Network (CNN) algorithms involves a comprehensive evaluation
of various factors. Initial development costs encompass hardware and software ac-
quisitions, including development tools, along with salaries for data scientists and
machine learning engineers. Data-related expenses, such as acquiring and annotat-
ing a diverse dataset, are critical considerations. The computational resources and
time required for model training and validation, as well as deployment infrastruc-
ture costs, need to be accounted for. Ongoing maintenance, updates, and monitoring
carry additional expenses. Assessing the benefits, including potential crop yield im-
provement, reduced pesticide use, and market value, aids in estimating the return
on investment. Regulatory compliance and ethical considerations add complexity.
Scalability, both in terms of system performance and costs, should be evaluated, and
alternative solutions should be compared to determine the most economically viable
approach. This analysis provides a holistic perspective on the economic feasibility
of implementing a CNN-based leaf disease detection system in agriculture.

3.3.2 Technical Feasibility

The technical feasibility of implementing a leaf disease detection system through


Convolutional Neural Network (CNN) algorithms involves a multifaceted assess-
ment of various components. Initial considerations include the selection of a suit-
able CNN architecture, such as ResNet, based on the complexity of the leaf disease
detection task. The availability and quality of labeled datasets containing diverse
leaf images become pivotal, necessitating a comprehensive review. Computational
resources, particularly for model training, and the potential use of cloud services for
scalability, should be evaluated. Data preprocessing steps, encompassing resizing
and normalization, are critical to enhance model training. Further, the integration of
the proposed system with existing agricultural technologies demands attention to en-
sure compatibility. Real-time processing capabilities, scalability for future growth,
and the selection of appropriate software development tools and frameworks must
be thoroughly considered. Rigorous testing and validation procedures, including
cross-validation, are essential to guarantee the model’s accuracy and generalization.

Additionally, addressing security and privacy concerns becomes paramount, neces-
sitating the implementation of encryption and secure communication protocols. This
comprehensive examination of technical aspects is vital for determining the feasibil-
ity and successful implementation of a CNN-based leaf disease detection system in
agriculture.
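
As a small illustration of the resizing and normalization steps mentioned above, the sketch below prepares a single leaf image for a 224x224 CNN input; the file name is a placeholder.

import numpy as np
from tensorflow.keras.preprocessing.image import load_img, img_to_array

image = load_img("leaf.jpg", target_size=(224, 224))  # resize on load
array = img_to_array(image) / 255.0                   # scale pixels to [0, 1]
batch = np.expand_dims(array, axis=0)                 # add a batch dimension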

3.3.3 Social Feasibility

Assessing the social feasibility of implementing a leaf disease detection system


employing Convolutional Neural Network (CNN) algorithms entails a thorough ex-
amination of its alignment with societal values and norms. Central to this evaluation
is understanding the level of acceptance and awareness within the farming com-
munity regarding the integration of technology for disease detection. Initiatives to
enhance awareness and acceptance, such as outreach programs and educational re-
sources, play a pivotal role. User-friendliness is another critical factor, ensuring that
the system is accessible and comprehensible to farmers who may not possess ad-
vanced technological expertise. A focus on intuitive interfaces and straightforward
implementation contributes significantly to the system’s social acceptance. More-
over, recognizing and addressing any cultural or ethical considerations associated
with the deployment of technology in agriculture is essential for fostering positive
reception within the community. By actively considering these social dimensions,
stakeholders can enhance the likelihood of successful adoption and integration of
the leaf disease detection system in the agricultural context.

3.4 System Specification

3.4.1 Hardware Specification

• System : Pentium IV, 2.4 GHz
• Hard Disk : 40 GB
• Floppy Drive : 1.44 MB
• Monitor : 15-inch VGA Colour
• Mouse : Logitech
• RAM : 512 MB

3.4.2 Software Specification

• Operating System : Windows 8 or above
• Coding Language : Python 3.7

3.4.3 Standards and Policies

PyCharm
PyCharm, as an Integrated Development Environment (IDE) for Python, follows certain standards and policies to ensure a consistent and productive development experience for its users. PEP 8 Compliance: PyCharm aligns with PEP 8, the official style guide for Python code, promoting readability and consistency. Code Formatting: it offers customizable code formatting options to enforce consistent styles and indentation.

Chapter 4

METHODOLOGY

4.1 General Architecture

Figure 4.1: Architecture diagram

This diagram illustrates how a smartphone can be used to diagnose the health of
a leaf by capturing, scanning, and processing its image. The system compares the
leaf image with a database of healthy and diseased leaves and returns a result that
classifies the leaf as either healthy or diseased, with a green or red color code. The
system also provides a feedback loop that allows the user to adjust the output until
they are satisfied or stop the process.

4.2 Design Phase

4.2.1 Data Flow Diagram

Figure 4.2: Data Flow diagram

The diagram shows how a user interacts with a system that allows them to upload
plant images. The user first undergoes a check to verify their authorization. If the
user is authorized, they can proceed to upload a plant image and then log out of the
system. If the user is not authorized, they are labeled as an unauthorized user and the
process ends. The diagram uses different shapes and arrows to represent the steps
and the flow of data in the process.

4.2.2 Use Case Diagram

Figure 4.3: Use Case diagram

The diagram shows how a user interacts with a system for a plant image uploading
application. The user can register, login, upload a plant image, and logout. Each oval
represents a different use case or action that the user can perform. The lines connect-
ing the “User” icon to each oval indicate that the user has the option to execute these
actions.

4.2.3 Class Diagram

Figure 4.4: Class diagram

The class diagram models the main entities of the system, the user and the database, along with the operations that connect them: register, login, upload a plant image, and logout. Each class box lists the attributes and operations of the corresponding entity, and the association between the classes shows that the user's actions are carried out against the database.

4.2.4 Sequence Diagram

Figure 4.5: Sequence diagram

The sequence diagram shows how a user interacts with a database system to per-
form four actions: registering, logging in, uploading a plant image, and logging out.
Each action is represented by a horizontal arrow from the user column to the database
column, labeled with the corresponding function name. The sequence of actions is
from top to bottom, indicating the order in which they occur. The diagram illus-
trates the basic steps involved in using the database system and the functions that are
available to the user.

4.2.5 Collaboration diagram

Figure 4.6: Collaboration diagram

The collaboration diagram shows how a User interacts with a Database through
four steps: Register, Login, Upload plant Image, and Logout. The User is an object
that initiates the interaction, and the Database is an object that responds to the mes-
sages from the User. The diagram illustrates the flow of data and control between
the User and the Database, as well as the states of the objects during the interaction.
The diagram can help to understand the requirements and design of the system, as
well as to identify potential errors or improvements.

4.2.6 Activity Diagram

Figure 4.7: Activity Diagram

The activity diagram shows the steps involved in uploading plant images on a
platform. The process starts with the user registering and logging in to their account.
After that, they can upload as many plant images as they want by selecting the option
from the menu. The process ends when the user logs out of their account. The
diagram uses ovals to represent the user and the start and end points, and rectangles
to represent the actions.

4.3 Algorithm & Pseudo Code

4.3.1 Algorithm

1. Input: Image of the crop


2. Preprocess the image to enhance features and reduce noise
3. Extract relevant features from the preprocessed image
4. Load a pre-trained machine learning model for crop disease detection
5. Use the model to predict whether the crop is healthy or diseased based on the
extracted features
6. If the prediction indicates a diseased crop:
a. Provide information about the detected disease
b. Optionally, suggest treatment or preventive measures
7. If the prediction indicates a healthy crop:
a. Display a message confirming the crop’s health

4.3.2 Pseudo Code

input_image = capture_image()

preprocessed_image = preprocess_image(input_image)

features = extract_features(preprocessed_image)

# Step 4 of the algorithm: load a pre-trained model rather than training here
model = load_pretrained_model()

prediction = predict(model, features)

if prediction == "healthy":
    display_message("The crop appears to be healthy.")
else:
    display_message("The crop is likely affected by a disease. Further analysis needed.")

if prediction == "diseased":
    treatment_recommendations = recommend_treatment(prediction)
    display_recommendations(treatment_recommendations)

4.4 Module Description

4.4.1 Module 1: Disease Surveillance and Monitoring

Step 1: Implement a real-time monitoring system for early disease detection.
Step 2: Utilize advanced technologies like drones, satellite imagery, or on-field sensors.
Step 3: Establish a centralized database for recording and analyzing disease incidence (a small sketch follows this list).
Step 4: Integrate weather data to identify potential disease outbreaks based on environmental conditions.
Step 5: Ensure timely communication of surveillance results to farmers for prompt action.
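
As a hedged sketch of the centralized record-keeping in Step 3, the snippet below stores a geo-tagged disease observation in a local SQLite table; the schema and values are illustrative assumptions, not the project's actual database.

import sqlite3

conn = sqlite3.connect("surveillance.db")
conn.execute("""CREATE TABLE IF NOT EXISTS observations (
                    id INTEGER PRIMARY KEY AUTOINCREMENT,
                    disease TEXT, latitude REAL, longitude REAL,
                    observed_at TEXT)""")
conn.execute("INSERT INTO observations (disease, latitude, longitude, observed_at) "
             "VALUES (?, ?, ?, ?)",
             ("early blight", 13.0827, 80.2707, "2024-01-15"))
conn.commit()
conn.close()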

4.4.2 Module 2: Data Analytics and Modeling

Step 1: Collect and analyze historical data on crop diseases and environmental factors.
Step 2: Develop predictive models using machine learning algorithms for accurate disease forecasting (a sketch follows this list).
Step 3: Continuously update the models based on new data and emerging trends.
Step 4: Provide user-friendly interfaces for farmers to access and interpret model predictions.
Step 5: Integrate real-time data from surveillance systems to enhance model accuracy.
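
As a hedged illustration of the predictive-modeling step, the sketch below fits a simple classifier on made-up historical weather features to flag outbreak risk; the feature set, the data and the choice of scikit-learn's random forest are all assumptions for illustration.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical historical records: [temperature C, humidity %, rainfall mm]
X = np.array([[28, 85, 12], [31, 60, 0], [24, 90, 20], [33, 55, 2],
              [26, 88, 15], [30, 65, 1], [23, 92, 25], [32, 58, 0]])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])  # 1 = outbreak observed, 0 = none

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# Estimate outbreak probability for tomorrow's forecast conditions.
print(model.predict_proba([[27, 87, 10]]))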

Chapter 5

IMPLEMENTATION AND TESTING

5.1 Input and Output

5.1.1 Input Design

Figure 5.1: Input of the leaf

The input dataset includes several classes of leaf diseases caused by fungi, viruses, pests, bacteria, Phytophthora and nematodes, as well as healthy leaves.

5.1.2 Output Design

Figure 5.2: Leaf disease detected

The output shows that the leaf is diseased. Each prediction indicates the likelihood of the input leaf image belonging to a specific disease class, such as early blight, late blight, or others. The model's output serves as a diagnostic tool, aiding in the identification and classification of various leaf diseases. Users can interpret the predictions to understand the specific disease affecting the plant.
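
As a small sketch of how such per-class likelihoods are read off, assuming hypothetical class names and an example softmax output vector from the model:

import numpy as np

class_names = ["healthy", "early blight", "late blight"]  # hypothetical classes
probs = np.array([0.07, 0.85, 0.08])  # example softmax output

best = int(np.argmax(probs))  # index of the most likely class
print(f"Predicted: {class_names[best]} ({probs[best]:.0%} confidence)")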

5.2 Testing

5.3 Types of Testing

5.3.1 Unit testing

Input

Figure 5.3: Unit Testing Result

Unit testing involves the design of test cases that validate that the internal program logic is functioning properly and that program inputs produce valid outputs. All decision branches and internal code flow should be validated. It is the testing of individual software units of the application and is done after the completion of an individual unit, before integration. This is structural testing that relies on knowledge of the unit's construction and is invasive. Unit tests perform basic tests at the component level and test a specific business process, application, and/or system configuration. Unit tests ensure that each unique path of a business process performs accurately to the documented specifications and contains clearly defined inputs and expected results.

5.3.2 Integration testing

Figure 5.4: Integration Testing Result

Integration tests are designed to test integrated software components to determine whether they actually run as one program. Testing is event-driven and is more concerned with the basic outcome of screens or fields. Integration tests demonstrate that although the components were individually satisfactory, as shown by successful unit testing, the combination of components is correct and consistent. Integration testing is specifically aimed at exposing the problems that arise from the combination of components.

5.3.3 System testing

Figure 5.5: System Testing Result

System testing ensures that the entire integrated software system meets requirements. It tests a configuration to ensure known and predictable results. An example of system testing is the configuration-oriented system integration test. System testing is based on process descriptions and flows, emphasizing pre-driven process links and integration points.

5.3.4 Test Result

Figure 5.6: Test Image

The output shows that the test leaf is diseased. As before, each prediction indicates the likelihood of the input leaf image belonging to a specific disease class, such as early blight or late blight, confirming that the trained model behaves consistently on the test image.

Chapter 6

RESULTS AND DISCUSSIONS

6.1 Efficiency of the Proposed System

• Accuracy: The primary measure of efficiency is the accuracy of disease prediction using the Convolutional Neural Network (CNN). A higher accuracy indicates the system's effectiveness in correctly identifying plant diseases, contributing to better decision-making for farmers (a small evaluation sketch follows this list).
• Speed and Responsiveness: The efficiency of the system is also determined by how quickly it processes and predicts diseases upon image uploads. Faster response times enhance user experience and provide timely information for farmers to take necessary actions.
• Scalability: The ability of the system to handle an increasing number of users and a growing dataset is crucial. A scalable system ensures that as more images are uploaded and the user base expands, the performance and response time remain consistent.
• Resource Utilization: Efficient use of computational resources, including memory and processing power, is essential for a well-performing system. Optimized resource utilization contributes to cost-effectiveness and sustainability.
• User-Friendliness: The user interface of the web application plays a significant role in system efficiency. An intuitive and user-friendly design enhances overall efficiency by reducing the learning curve for farmers and encouraging regular usage.
• Robustness and Reliability: A reliable system operates consistently under various conditions, handling unexpected inputs and potential errors gracefully. The robustness of the proposed system is essential for providing dependable disease predictions.
• Integration with Cloud Services: The efficiency of the system depends on seamless integration with cloud services for storing the CNN model and images. A well-integrated cloud solution ensures data accessibility, reliability, and scalability.
• Geolocation Feature: If deployed on a real web server, the system's efficiency can be assessed based on how accurately it extracts and utilizes user locations from the request object. The geolocation feature's effectiveness enhances the system's value by providing geographical insights.
• Adaptability and Upgradability: An efficient system should be adaptable to changing requirements and capable of receiving upgrades. This ensures its longevity and relevance in the dynamic field of agriculture and technology.
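
To ground the accuracy measure, the sketch below evaluates a saved Keras model on a held-out test directory; the model filename and directory layout mirror the training script in Section 9.1 but are assumptions here.

from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing.image import ImageDataGenerator

test_datagen = ImageDataGenerator(rescale=1.0 / 255)
test_set = test_datagen.flow_from_directory(
    "agriculture/Test/", target_size=(224, 224),
    batch_size=32, class_mode="categorical", shuffle=False)

model = load_model("VGG-19.h5")
loss, accuracy = model.evaluate(test_set)  # returns [loss, categorical_accuracy]
print(f"Test accuracy: {accuracy:.2%}")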

6.2 Comparison of Existing and Proposed System

Technological Advancement: The proposed system introduces advanced technol-


ogy (CNN) for automated disease detection, surpassing the reliance on manual expert
diagnosis in the existing system.
Enhanced Accessibility: The proposed web application, despite being a cost-
effective alternative to an Android app, improves accessibility for farmers, enabling
them to easily upload images and receive disease predictions.
Cloud-Based Scalability: Storing data on cloud services in the proposed system en-
hances scalability and accessibility, overcoming potential limitations in the existing
system.
Geographical Insights: The proposed system introduces a geolocation feature, of-
fering valuable insights into the geographical distribution of plant diseases, which is
lacking in the existing system.

6.3 Sample Code

from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing.image import load_img, img_to_array
import numpy as np
from flask import Flask, request, render_template
from werkzeug.utils import secure_filename

import os, sys, glob, re

app = Flask(__name__)

model_path = "weeds.h5"

classes = {0: "Plants:-{About Plants}", 1: "weeds:-{about weeds}"}


def model_predict(image_path):
    print("Predicted")
    image = load_img(image_path, target_size=(224, 224))
    image = img_to_array(image)
    image = image / 255
    image = np.expand_dims(image, axis=0)
    model = load_model(model_path)
    result = np.argmax(model.predict(image))
    prediction = classes[result]

    if result == 0:
        print("plant.html")
        return "Plants", "plant.html"
    elif result == 1:
        print("weeds.html")
        return "weeds", "weeds.html"


@app.route('/', methods=['GET'])
def index():
    return render_template('index.html')


@app.route('/predict', methods=['GET', 'POST'])
def predict():
    print("Entered")
    if request.method == 'POST':
        print("Entered here")
        file = request.files['image']  # fetch the uploaded image
        filename = file.filename
        print("@@ Input posted =", filename)

        file_path = os.path.join('static/user_uploaded', filename)
        file.save(file_path)

        print("@@ Predicting class......")
        pred, output_page = model_predict(file_path)

        return render_template(output_page, pred_output=pred, user_image=file_path)


if __name__ == '__main__':
    app.run(debug=True, threaded=False)

Output

Figure 6.1: Output 1

The login page for "Leaf Disease Detection Using CNN Algorithm". Navigation links for "Home", "Login", and "Register Here" are displayed. Here, the user can log in to the application using a username and password. The "User Login Screen" section includes input fields for Username and Password, along with a Login button.

Figure 6.2: Output 2

The output shows that the leaf is diseased. Each prediction indicates the likelihood of the input leaf image belonging to a specific disease class, such as early blight, late blight, or others. The model's output serves as a diagnostic tool, aiding in the identification and classification of various leaf diseases. Users can interpret the predictions to understand the specific disease affecting the plant.

Chapter 7

CONCLUSION AND FUTURE ENHANCEMENTS

7.1 Conclusion

This project presents an automated, low-cost and easy-to-use end-to-end solution to one of the biggest challenges farmers face in the agricultural domain: precise, instant and early diagnosis of crop diseases and knowledge of disease outbreaks, which helps in quick decision-making about the measures to be adopted for disease control. The project innovates on known prior art with the application of deep Convolutional Neural Networks (CNNs) for disease classification, the introduction of a social collaborative platform for progressively improved accuracy, the usage of geocoded images for disease density maps, and an expert interface for analytics. The high-performing deep CNN model "Inception" enables real-time classification of diseases on the Cloud platform via a user-facing mobile app. The collaborative model enables continuous improvement in disease classification accuracy by automatically growing the Cloud-based training dataset with user-added images for retraining the CNN model. User-added images in the Cloud repository also enable the rendering of disease density maps based on collective disease classification data and the availability of geolocation information within the images. Overall, the results of the experiments demonstrate that the proposal has significant potential for practical deployment along multiple dimensions: the Cloud-based infrastructure is highly scalable, and the underlying algorithm works accurately even with a large number of disease categories, performs better with high-fidelity real-life training data, improves accuracy as the training dataset grows, is capable of detecting early symptoms of diseases and is able to successfully differentiate between diseases of the same family.

7.2 Future Enhancements

Future work involves expanding the model to include more parameters that can improve the correlation to the disease: an image database with supporting inputs from the farmer on soil and past fertilizer and pesticide treatments, along with publicly available environmental factors such as temperature, humidity and rainfall, to improve model accuracy and enable disease forecasting. We also wish to increase the number of crop diseases covered and reduce the need for expert intervention except for new types of diseases. For automatic acceptance of user-uploaded images into the training database with the best possible classification accuracy and the least possible human intervention, a simple technique of computing a threshold based on the mean of all classification scores can be used. A further application of this work could be to support automated, time-based monitoring of the disease density maps, which can be used to track the progress of a disease and trigger alarms. Predictive analytics can be used to send alerts to users about the possibility of disease outbreaks near their location.

Chapter 8

PLAGIARISM REPORT

Figure 8.1: Plagiarism

Chapter 9

SOURCE CODE & POSTER PRESENTATION

9.1 Source Code

#!/usr/bin/env python
# coding: utf-8

from tensorflow.keras.layers import Input, Lambda, Dense, Flatten
from tensorflow.keras.models import Model
from tensorflow.keras.preprocessing import image
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.models import Sequential
import numpy as np
from glob import glob
import matplotlib.pyplot as plt


IMAGE_SIZE = [224, 224]

train_path = "agriculture/Train/"

train_datagen = ImageDataGenerator(rescale=1. / 255, horizontal_flip=True,
                                   zoom_range=0.2, validation_split=0.15)

training_set = train_datagen.flow_from_directory(
    train_path, target_size=(224, 224), batch_size=32,
    class_mode='categorical', subset='training')

validation_set = train_datagen.flow_from_directory(
    train_path, target_size=(224, 224), batch_size=32,
    class_mode='categorical', shuffle=True, subset='validation')


from tensorflow.keras.applications import VGG19
from tensorflow.keras.layers import GlobalAveragePooling2D, Dropout

# Initialise the input shape with 3 (RGB) channels and ImageNet weights;
# include_top=False lets us attach our own custom output layers.
mv = VGG19(input_shape=IMAGE_SIZE + [3], weights='imagenet', include_top=False)

for layer in mv.layers:
    layer.trainable = False

# If you want to train on more folders (classes), change the number of
# output units below to match the number of folders you have.
x = Flatten()(mv.output)
prediction = Dense(2, activation='softmax')(x)

model = Model(inputs=mv.input, outputs=prediction)

model.summary()


import tensorflow as tf

class myCallback(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs={}):
        if logs.get('loss') <= 0.05:
            print("\nEnding training")
            self.model.stop_training = True

# initiating the myCallback function
callbacks = myCallback()

# Compile the model with the Adam optimizer, categorical cross-entropy loss
# and categorical accuracy as the metric.
from tensorflow.keras.optimizers import Adam
model.compile(optimizer=Adam(lr=0.0001), loss='categorical_crossentropy',
              metrics=['categorical_accuracy'])

history = model.fit(training_set,
                    validation_data=validation_set,
                    epochs=50,
                    verbose=1,
                    steps_per_epoch=len(training_set),
                    validation_steps=len(validation_set),
                    callbacks=[callbacks])

acc = history.history['categorical_accuracy']
val_acc = history.history['val_categorical_accuracy']

loss = history.history['loss']
val_loss = history.history['val_loss']

epochs = range(len(acc))

plt.plot(epochs, acc)
plt.plot(epochs, val_acc)
plt.title("Training and validation Accuracy")
plt.figure()

plt.plot(epochs, loss)
plt.plot(epochs, val_loss)
plt.title("Training and validation Loss")
plt.figure()

model.save("VGG-19.h5")


from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing import image
import numpy as np

# dimensions of our images
img_width, img_height = 224, 224

# load the model we saved
model = load_model('VGG-19.h5')

# predicting images
img = image.load_img('agriculture/Test/plants/agri03.jpeg',
                     target_size=(img_width, img_height))
x = image.img_to_array(img)
x = np.expand_dims(x, axis=0)

classes = model.predict(x)
print(classes)
9.2 Poster Presentation

Figure 9.1: Poster

References

[1] A. Krizhevsky and I. Sutskever, "ImageNet classification with deep convolutional neural networks," in Advances in Neural Information Processing Systems, 2012.
[2] C. Szegedy et al., "Rethinking the Inception architecture for computer vision," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2016, pp. 2818-2826.
[3] D. L. Hernandez-Rabadan, F. Ramos-Quintana and J. Guerrero Juk, "Integrating SOMs and a Bayesian classifier for segmenting diseased plants in uncontrolled environments," The Scientific World Journal, 2014.
[4] G. Shrestha, Deepsikha, M. Das and N. Dey, "Plant disease detection using CNN," 2020 IEEE Applied Signal Processing Conference, 2020, pp. 109-113.
[5] L. Saxena and L. Armstrong, "A survey of image processing techniques for agriculture," in Proceedings of the Asian Federation for Information Technology in Agriculture, 2014, pp. 401-413.
[6] P. Moghadam, D. Ward, E. Goan, S. Jayawardena, P. Sikka and E. Hernandez, "Plant disease detection using hyperspectral imaging," 2018 International Conference on Digital Image Computing: Techniques and Applications (DICTA), 2018, pp. 1-8.
[7] S. C. Madiwalar and M. V. Wyawahare, "Plant disease identification: A comparative study," 2017 International Conference on Data Management, Analytics and Innovation, 2017, pp. 13-18.
[8] S. D. Khirade and A. B. Patil, "Plant disease detection using image processing," 2015 International Conference on Computing Communication Control and Automation, 2015, pp. 768-771.
[9] Sharath D. M., Akhilesh, S. A. Kumar and P. C., "Image based plant disease detection in pomegranate plant for bacterial blight," 2019 International Conference on Communication and Signal Processing, 2019, pp. 0645-0649.
[10] S. Sankaran, A. Mishra, J. M. Maja and R. Ehsani, "Visible-near infrared spectroscopy for detection of Huanglongbing in citrus orchards," Computers and Electronics in Agriculture, vol. 77, 2011, pp. 127-134.
