LEAF DISEASE DETECTION USING CNN ALGORITHM
Bachelor of Technology
in
Computer Science & Engineering
By
THOTLA NARASIMHA REDDY (21UECM0245)
THOTLA DHARMA REDDY (21UECM0244)
NARAYANAPURAM RAVINDRA BABU (21UECM0170)
January, 2024
CERTIFICATE
It is certified that the work contained in the project report titled "LEAF DISEASE DETECTION USING CNN ALGORITHM" by "THOTLA NARASIMHA REDDY (21UECM0245), THOTLA DHARMA REDDY (21UECM0244), NARAYANAPURAM RAVINDRA BABU (21UECM0170)" has been carried out under my supervision and that this work has not been submitted elsewhere for a degree.
Signature of Supervisor
Dr. S. Lalitha
Associate Professor
Computer Science & Engineering
School of Computing
Vel Tech Rangarajan Dr. Sagunthala R&D
Institute of Science & Technology
January, 2024
DECLARATION
We declare that this written submission represents our ideas in our own words, and where others' ideas or words have been included, we have adequately cited and referenced the original sources. We
also declare that we have adhered to all principles of academic honesty and integrity and have not
misrepresented or fabricated or falsified any idea/data/fact/source in our submission. We understand
that any violation of the above will be cause for disciplinary action by the Institute and can also
evoke penal action from the sources which have thus not been properly cited or from whom proper
permission has not been taken when needed.
APPROVAL SHEET
This project report entitled "LEAF DISEASE DETECTION USING CNN ALGORITHM" by THOTLA NARASIMHA REDDY (21UECM0245), THOTLA DHARMA REDDY (21UECM0244), NARAYANAPURAM RAVINDRA BABU (21UECM0170) is approved for the degree of B.Tech in Computer Science & Engineering.
Examiners Supervisor
Date: / /
Place:
ACKNOWLEDGEMENT
We express our deepest gratitude to our respected Founder Chancellor and President Col. Prof. Dr. R. RANGARAJAN, B.E. (EEE), B.E. (MECH), M.S. (AUTO), D.Sc., and to our Foundress President Dr. R. SAGUNTHALA RANGARAJAN, M.B.B.S., Chairperson Managing Trustee and Vice President.
We are very grateful to our beloved Vice Chancellor Prof. S. SALIVAHANAN for providing us with an environment to complete our project successfully.
We record our indebtedness to our Professor & Dean, Department of Computer Science & Engineering, School of Computing, Dr. V. SRINIVASA RAO, M.Tech., Ph.D., for his immense care and encouragement towards us throughout the course of this project.
We are thankful to our Head, Department of Computer Science & Engineering, Dr. M. S. MURALI DHAR, M.E., Ph.D., for providing immense support in all our endeavors.
We also take this opportunity to express a deep sense of gratitude to our Internal Supervisor Dr. S. LALITHA, B.Tech., M.E., Ph.D., for her cordial support, valuable information and guidance; she helped us in completing this project through its various stages.
A special thanks to our Project Coordinators Mr. V. ASHOK KUMAR, M.Tech., Ms. C. SHYAMALA KUMARI, M.E., and Mr. SHARAD SHANDHI RAVI, M.Tech., for their valuable guidance and support throughout the course of the project.
We thank our department faculty, supporting staff and friends for their help and guidance to complete this project.
ABSTRACT
Leaf diseases are a major threat to farmers, consumers, the environment and the global
economy. In India alone, 35 percent of field crops are lost to pathogens and pests,
causing losses to farmers. Indiscriminate use of pesticides is also a serious health
concern as many are toxic and biomagnified. These adverse effects can be avoided
by early disease detection, crop surveillance and targeted treatments. Most diseases
are diagnosed by agricultural experts by examining external symptoms. However,
farmers have limited access to experts. This project is the first integrated and
collaborative platform for automated disease diagnosis, tracking and forecasting.
Farmers can instantly and accurately identify diseases and get solutions with a mobile
app by photographing affected plant parts. Real-time diagnosis is enabled using the
latest Artificial Intelligence (AI) algorithms for Cloud-based image processing. The AI
model continuously learns from user-uploaded images and expert suggestions to enhance
its accuracy. Farmers can also interact with local experts through the platform.
For preventive measures, disease density maps with spread forecasting are rendered
from a Cloud-based repository of geo-tagged images and micro-climatic factors. A
web interface allows experts to perform disease analytics with geographical
visualizations. In our experiments, the AI model (a CNN) was trained with large
disease datasets created from plant images self-collected from many farms over 7
months. Test images were diagnosed using the automated CNN model and the results were
validated by plant pathologists. Over 95 percent disease identification accuracy was
achieved. The solution is a novel, scalable and accessible tool for disease management
of diverse agricultural crop plants and can be deployed as a Cloud-based service
for farmers and experts for ecologically sustainable crop production.
Keywords:
LIST OF FIGURES
6.1 Output 1 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
6.2 Output 2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
8.1 Plagiarism . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
9.1 Poster . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
LIST OF ACRONYMS AND
ABBREVIATIONS
AI Artificial Intelligence
CNN Convolutional Neural Networks
IOT Internet of Things
NIR Near Infra-Red
NN Neural Networks
UGV Unmanned Ground Vehicle
TABLE OF CONTENTS
Page No.
ABSTRACT v
LIST OF FIGURES vi
1 INTRODUCTION 1
1.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Aim of the project . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.3 Project Domain . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.4 Scope of the Project . . . . . . . . . . . . . . . . . . . . . . . . . . 4
2 LITERATURE REVIEW 5
3 PROJECT DESCRIPTION 8
3.1 Existing System . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
3.2 Proposed System . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
3.3 Feasibility Study . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
3.3.1 Economic Feasibility . . . . . . . . . . . . . . . . . . . . . 9
3.3.2 Technical Feasibility . . . . . . . . . . . . . . . . . . . . . 9
3.3.3 Social Feasibility . . . . . . . . . . . . . . . . . . . . . . . 10
3.4 System Specification . . . . . . . . . . . . . . . . . . . . . . . . . 11
3.4.1 Hardware Specification . . . . . . . . . . . . . . . . . . . . 11
3.4.2 Software Specification . . . . . . . . . . . . . . . . . . . . 11
3.4.3 Standards and Policies . . . . . . . . . . . . . . . . . . . . 11
4 METHODOLOGY 12
4.1 General Architecture . . . . . . . . . . . . . . . . . . . . . . . . . 12
4.2 Design Phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
4.2.1 Data Flow Diagram . . . . . . . . . . . . . . . . . . . . . . 13
4.2.2 Use Case Diagram . . . . . . . . . . . . . . . . . . . . . . 14
4.2.3 Class Diagram . . . . . . . . . . . . . . . . . . . . . . . . 15
4.2.4 Sequence Diagram . . . . . . . . . . . . . . . . . . . . . . 16
4.2.5 Collaboration diagram . . . . . . . . . . . . . . . . . . . . 17
4.2.6 Activity Diagram . . . . . . . . . . . . . . . . . . . . . . . 18
4.3 Algorithm & Pseudo Code . . . . . . . . . . . . . . . . . . . . . . 19
4.3.1 Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
4.3.2 Pseudo Code . . . . . . . . . . . . . . . . . . . . . . . . . 19
4.4 Module Description . . . . . . . . . . . . . . . . . . . . . . . . . . 20
4.4.1 Module 1: Disease Surveillance and Monitoring . . . . . . 20
4.4.2 Module 2: Data Analytics and Modeling . . . . . . . . . . 20
8 PLAGIARISM REPORT 34
References 39
Chapter 1
INTRODUCTION
1.1 Introduction
for crop diseases that can be widely deployed. In developing countries such as India,
mobile phones with internet connectivity have become ubiquitous. Camera and GPS
enabled low cost mobile phones are widely available that can be leveraged by indi-
viduals to upload images with geolocation. Over widely available mobile networks,
they can communicate with more sophisticated Cloud based backend services which
can perform the compute heavy tasks, maintain a centralized database, and perform
data analytics. Another leap of technology in recent years is AI based image analysis
which has surpassed human eye capabilities and can accurately identify and classify
images. The underlying AI algorithms use Neural Networks (NN) which have
layers of neurons with a connectivity pattern inspired by the visual cortex. These
networks get “trained” on a large set of pre-classified “labeled” images to achieve
high accuracy of image classification on new unseen images. Since 2012 with
“AlexNet” winning the ImageNet competition, deep Convolutional Neural Networks
(CNNs) have consistently been the winning architecture for computer vision and
image analysis. The breakthrough in the capabilities of CNNs has come with a
combination of improved compute capabilities, large available image datasets
and improved NN algorithms. Besides accuracy, AI has become more affordable and
accessible with open-source platforms such as TensorFlow. Prior art related to
our project includes initiatives to gather healthy and diseased crop images,
image analysis using feature extraction, RGB images, spectral patterns and
fluorescence imaging spectroscopy. Neural Networks have been used in the past
for plant disease identification, but the approach was to identify texture features. Our
proposal takes advantage of the evolution of Mobile, Cloud and AI to develop an
end-to-end crop diagnosis solution that simulates the expertise (“intelligence”) of
plant pathologists and brings it to farmers. It also enables a collaborative approach
towards continually increasing the disease database and seeking expert advice when
needed for improved NN classification accuracy and tracking for outbreaks.
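The convolutional layers described above can be illustrated with a minimal NumPy sketch (not the project's actual model): a single hand-crafted kernel slides over a toy image patch and responds strongly where pixel intensities change, which is exactly the kind of low-level feature a trained CNN learns automatically from labeled images.

```python
import numpy as np

def conv2d(image, kernel):
    """Minimal 'valid'-mode 2-D convolution (cross-correlation, as in most
    deep-learning frameworks) - the core operation of a CNN layer."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 6x6 "leaf patch": left half dark tissue, right half a bright lesion.
patch = np.zeros((6, 6))
patch[:, 3:] = 1.0

# Vertical-edge kernel: responds where intensity changes left-to-right.
edge_kernel = np.array([[-1.0, 1.0],
                        [-1.0, 1.0]])

response = conv2d(patch, edge_kernel)  # peaks exactly along the lesion boundary
```

A real CNN stacks many such learned kernels with nonlinearities and pooling, but the sliding-window arithmetic is the same.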
1.2 Aim of the project
The project domain for leaf disease detection lies at the intersection of agriculture,
technology, and data science. In this domain, the focus is on developing advanced
systems and methodologies to detect and diagnose diseases affecting crops. Modern
technologies such as computer vision and machine learning play a pivotal role in
the implementation of this project. By utilizing image processing techniques, the
system can analyze visual data, such as images of crops captured by drones or other
imaging devices, to identify subtle signs and patterns indicative of various diseases.
Machine learning models are trained on extensive datasets containing both healthy
and diseased crop images, enabling the system to learn and accurately classify the
health status of crops.
Additionally, the project may involve the integration of sensor technologies, in-
cluding IoT (Internet of Things) devices, to monitor environmental conditions such
as humidity, temperature, and soil quality. These data points, when combined with
visual information, contribute to a more comprehensive understanding of the factors
influencing crop health. The project’s domain extends beyond mere disease detec-
tion to encompass a holistic approach to precision agriculture, aiming to empower
farmers with timely information for effective decision-making. The overarching goal
is to enhance agricultural productivity, minimize yield losses, and contribute to sus-
tainable and efficient farming practices by leveraging cutting-edge technologies in
the realm of crop disease detection.
Chapter 2
LITERATURE REVIEW
L. Saxena and L. Armstrong [5] showed that computer technologies can improve
agricultural productivity in a number of ways. One technique which is emerging
as a useful tool is image processing. Image processing has been used to assist
with precision agriculture practices, weed and herbicide technologies,
monitoring plant growth and plant nutrition management. The paper highlights the
future potential of image processing for different agricultural industry contexts.
C. Szegedy et al. [2] observed that convolutional networks, central to modern
computer vision, have seen significant advancements since 2014, particularly
with very deep architectures gaining prominence. Despite the typical correlation
between increased model size and improved performance, computational efficiency
remains crucial for tasks like mobile vision. The study explores scalable network
approaches through factorized convolutions and rigorous regularization, achieving
substantial gains on the ILSVRC 2012 classification challenge. The proposed methods
demonstrate a notable 21.2 percent top-1 and 5.6 percent top-5 error for single-frame
evaluation, emphasizing efficient utilization of the added computation with a
network cost of 5 billion multiply-adds and fewer than 25 million parameters.
S. Khirade et al. [8] tackled the problem of plant disease detection using
digital image processing techniques and a back-propagation neural network. The
authors elaborated different techniques for the detection of plant disease using
images of leaves. They implemented Otsu's thresholding followed by boundary
detection and a spot detection algorithm to segment the infected part of the leaf.
After that, they extracted features such as color, texture, morphology and edges
for the classification of plant disease, i.e., to detect the plant disease.
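The segmentation step described in this survey entry can be sketched in plain NumPy. The function below is a textbook implementation of Otsu's method (a between-class-variance search), not the authors' code, applied to a toy grayscale array with dark leaf tissue and a bright diseased spot:

```python
import numpy as np

def otsu_threshold(gray):
    """Textbook Otsu: pick the threshold that maximises the between-class
    variance of the two resulting pixel groups (0-255 grayscale input)."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    total = gray.size
    sum_total = float(np.dot(np.arange(256), hist))
    w0 = 0          # pixel count below/at threshold
    sum0 = 0.0      # intensity sum below/at threshold
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mean0, mean1 = sum0 / w0, (sum_total - sum0) / w1
        var_between = w0 * w1 * (mean0 - mean1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Toy "leaf" image: mostly dark tissue (40) with a small bright spot (200).
leaf = np.full((8, 8), 40)
leaf[2:4, 2:4] = 200

t = otsu_threshold(leaf)
spot_mask = leaf > t   # binary mask of the "infected" region
```

In the paper's pipeline this mask would feed the boundary/spot detection and the colour, texture and morphology feature extraction that follow.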
P. Moghadam et al. [6] demonstrated the application of hyperspectral imaging to
the plant disease detection task. Visible, near-infrared and short-wave infrared
spectra were used in this research. The authors used the k-means clustering
algorithm in the spectral domain for the segmentation of the leaf. They proposed
a novel grid removal algorithm to remove the grid from hyperspectral images. The
authors achieved an accuracy of 83 percent with vegetation indices in the spectral
range and 93 percent accuracy with the full spectrum. Though the proposed method
achieved higher accuracy, it requires a hyperspectral camera with 324 spectral
bands, so the solution becomes too costly.
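The k-means segmentation step can be illustrated with a tiny 1-D Lloyd's-algorithm sketch over hypothetical per-pixel vegetation-index values; the actual method clusters full spectral vectors from a 324-band camera, and the values below are invented for illustration:

```python
import numpy as np

def kmeans_2(values, iters=10):
    """Two-cluster Lloyd's algorithm on 1-D values, initialised
    deterministically at the data extremes."""
    centers = np.array([values.min(), values.max()], dtype=float)
    labels = np.zeros(len(values), dtype=int)
    for _ in range(iters):
        # Assign each value to its nearest center, then recompute centers.
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for c in (0, 1):
            if np.any(labels == c):
                centers[c] = values[labels == c].mean()
    return labels, centers

# Hypothetical vegetation-index values: low = soil/background, high = leaf.
pixels = np.array([0.10, 0.12, 0.11, 0.80, 0.82, 0.79])
labels, centers = kmeans_2(pixels)   # separates background from leaf pixels
```

The same assign-then-update loop generalises directly to multi-band spectral vectors by replacing the absolute difference with a Euclidean distance.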
Chapter 3
PROJECT DESCRIPTION
In India alone, 35 percent of field crops are lost to pathogens and pests, causing
losses to farmers. Indiscriminate use of pesticides is also a serious health concern
as many are toxic and biomagnified. These adverse effects can be avoided by early
disease detection, crop surveillance and targeted treatments. Most diseases are diag-
nosed by agricultural experts by examining external symptoms. However, farmers
have limited access to experts.
Disadvantages of the existing system:
1. Less accuracy
2. Low efficiency
3.3 Feasibility Study
3.3.1 Economic Feasibility
The economic feasibility of deploying a leaf disease detection system using Convolutional Neural Network (CNN) algorithms involves a comprehensive evaluation
of various factors. Initial development costs encompass hardware and software ac-
quisitions, including development tools, along with salaries for data scientists and
machine learning engineers. Data-related expenses, such as acquiring and annotat-
ing a diverse dataset, are critical considerations. The computational resources and
time required for model training and validation, as well as deployment infrastruc-
ture costs, need to be accounted for. Ongoing maintenance, updates, and monitoring
carry additional expenses. Assessing the benefits, including potential crop yield im-
provement, reduced pesticide use, and market value, aids in estimating the return
on investment. Regulatory compliance and ethical considerations add complexity.
Scalability, both in terms of system performance and costs, should be evaluated, and
alternative solutions should be compared to determine the most economically viable
approach. This analysis provides a holistic perspective on the economic feasibility
of implementing a CNN-based leaf disease detection system in agriculture.
Additionally, addressing security and privacy concerns becomes paramount, neces-
sitating the implementation of encryption and secure communication protocols. This
comprehensive examination of technical aspects is vital for determining the feasibil-
ity and successful implementation of a CNN-based leaf disease detection system in
agriculture.
3.4 System Specification
PyCharm
PyCharm, as an Integrated Development Environment (IDE) for Python, follows certain standards and policies to ensure a consistent and productive development experience for its users.
PEP 8 Compliance: PyCharm often aligns with PEP 8, the official style guide for Python code, promoting readability and consistency.
Code Formatting: PyCharm offers customizable code formatting options to enforce consistent styles and indents.
Chapter 4
METHODOLOGY
4.1 General Architecture
This diagram illustrates how a smartphone can be used to diagnose the health of
a leaf by capturing, scanning, and processing its image. The system compares the
leaf image with a database of healthy and diseased leaves and returns a result that
classifies the leaf as either healthy or diseased, with a green or red color code. The
system also provides a feedback loop that allows the user to adjust the output until
they are satisfied or stop the process.
4.2 Design Phase
4.2.1 Data Flow Diagram
The diagram shows how a user interacts with a system that allows them to upload
plant images. The user first undergoes a check to verify their authorization. If the
user is authorized, they can proceed to upload a plant image and then log out of the
system. If the user is not authorized, they are labeled as an unauthorized user and the
process ends. The diagram uses different shapes and arrows to represent the steps
and the flow of data in the process.
4.2.2 Use Case Diagram
The diagram shows how a user interacts with a system for a plant image uploading
application. The user can register, login, upload a plant image, and logout. Each oval
represents a different use case or action that the user can perform. The lines connect-
ing the “User” icon to each oval indicate that the user has the option to execute these
actions.
4.2.3 Class Diagram
The diagram shows how a user interacts with a system for a plant image uploading
application. The user can register, login, upload a plant image, and logout. Each oval
represents a different use case or action that the user can perform. The lines connect-
ing the “User” icon to each oval indicate that the user has the option to execute these
actions.
4.2.4 Sequence Diagram
The sequence diagram shows how a user interacts with a database system to per-
form four actions: registering, logging in, uploading a plant image, and logging out.
Each action is represented by a horizontal arrow from the user column to the database
column, labeled with the corresponding function name. The sequence of actions is
from top to bottom, indicating the order in which they occur. The diagram illus-
trates the basic steps involved in using the database system and the functions that are
available to the user.
4.2.5 Collaboration diagram
The collaboration diagram shows how a User interacts with a Database through
four steps: Register, Login, Upload plant Image, and Logout. The User is an object
that initiates the interaction, and the Database is an object that responds to the mes-
sages from the User. The diagram illustrates the flow of data and control between
the User and the Database, as well as the states of the objects during the interaction.
The diagram can help to understand the requirements and design of the system, as
well as to identify potential errors or improvements.
4.2.6 Activity Diagram
The activity diagram shows the steps involved in uploading plant images on a
platform. The process starts with the user registering and logging in to their account.
After that, they can upload as many plant images as they want by selecting the option
from the menu. The process ends when the user logs out of their account. The
diagram uses ovals to represent the user and the start and end points, and rectangles
to represent the actions.
4.3 Algorithm & Pseudo Code
4.3.1 Algorithm
prediction = predict(model, features)

if prediction == "healthy":
    display_message("The crop appears to be healthy.")
else:
    display_message("The crop is likely affected by a disease. Further analysis needed.")

if prediction == "diseased":
    treatment_recommendations = recommend_treatment(prediction)
    display_recommendations(treatment_recommendations)
4.4 Module Description
Step 1: Collect and analyze historical data on crop diseases and environmental factors.
Step 2: Develop predictive models using machine learning algorithms for accurate disease forecasting, and continuously update the models based on new data and emerging trends.
Step 3: Provide user-friendly interfaces for farmers to access and interpret model predictions.
Step 4: Integrate real-time data from surveillance systems to enhance model accuracy.
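As a toy illustration of the predictive-modeling step, a minimal "model" can be a logistic score over micro-climatic factors; the weights and function below are invented for illustration and are not fitted to any real data:

```python
import math

# Hypothetical coefficients for a minimal disease-risk score driven by two
# micro-climatic factors (illustrative only, not fitted to real data).
W_HUMIDITY, W_TEMP, BIAS = 0.08, 0.05, -7.0

def outbreak_risk(humidity_pct, temp_c):
    """Logistic risk score in (0, 1): warmer, more humid conditions
    push the score toward 1 (higher outbreak risk)."""
    z = W_HUMIDITY * humidity_pct + W_TEMP * temp_c + BIAS
    return 1.0 / (1.0 + math.exp(-z))

high = outbreak_risk(90, 28)   # humid, warm day -> high risk
low = outbreak_risk(30, 15)    # dry, cool day -> low risk
```

A production forecaster would fit such coefficients (or a richer model) to the historical disease and weather data gathered in step 1.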
Chapter 5
The input dataset includes several classes of leaf diseases caused by fungi, viruses, pests, bacteria, Phytophthora, and nematodes, as well as healthy leaves.
5.1.2 Output Design
The output obtained shows that the leaf is diseased. Each prediction indicates the
likelihood of the input leaf image belonging to a specific disease class, such as
early blight, late blight, or others. The model's output serves as a diagnostic
tool, aiding in the identification and classification of various leaf diseases.
Users can interpret the predictions to understand the specific disease affecting the plant.
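The per-class "likelihood" described above typically comes from a softmax over the network's final-layer scores. A NumPy sketch with hypothetical class names and logits (not the project's actual classes or values):

```python
import numpy as np

# Hypothetical disease classes and raw final-layer scores (logits).
classes = ["healthy", "early_blight", "late_blight"]
logits = np.array([0.2, 3.1, 0.7])

# Numerically stable softmax: shift by the max, exponentiate, normalise.
probs = np.exp(logits - logits.max())
probs = probs / probs.sum()

# The reported diagnosis is the class with the highest probability.
prediction = classes[int(np.argmax(probs))]
```

The full probability vector, not just the argmax, is what lets users judge how confident the diagnosis is.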
5.2 Testing
Unit testing involves the design of test cases that validate that the internal
program logic is functioning properly, and that program inputs produce valid
outputs. All decision branches and internal code flow should be validated. It is
the testing of individual software units of the application; it is done after the
completion of an individual unit before integration. This is structural testing
that relies on knowledge of the unit's construction and is invasive. Unit tests
perform basic tests at component level and test a specific business process,
application, and/or system configuration. Unit tests ensure that each unique path
of a business process performs accurately to the documented specifications and
contains clearly defined inputs and expected results.
5.3.2 Integration testing
5.3.3 System testing
System testing ensures that the entire integrated software system meets require-
ments. It tests a configuration to ensure known and predictable results. An example
of system testing is the configuration oriented system integration test. System testing
is based on process descriptions and flows, emphasizing pre-driven process links and
integration points.
5.3.4 Test Result
The output obtained shows that the leaf is diseased. Each prediction indicates the
likelihood of the input leaf image belonging to a specific disease class, such as
early blight, late blight, or others. The model's output serves as a diagnostic
tool, aiding in the identification and classification of various leaf diseases.
Users can interpret the predictions to understand the specific disease affecting the plant.
Chapter 6
request object. The geolocation feature’s effectiveness enhances the system’s value
by providing geographical insights.
• Adaptability and Upgradability: An efficient system should be adaptable to changing requirements and capable of receiving upgrades. This ensures its longevity and relevance in the dynamic field of agriculture and technology.
from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing.image import load_img, img_to_array
import numpy as np
from flask import Flask, request, render_template
from werkzeug.utils import secure_filename

import os, sys, glob, re

app = Flask(__name__)

model_path = "weeds.h5"

# (`classes`, the ordered list of class labels, is defined elsewhere in the source)

def model_predict(image_path):
    print("Predicted")
    # Preprocess: resize to the network's input size, scale to [0, 1],
    # and add a batch dimension before prediction.
    image = load_img(image_path, target_size=(224, 224))
    image = img_to_array(image)
    image = image / 255
    image = np.expand_dims(image, axis=0)
    model = load_model(model_path)
    result = np.argmax(model.predict(image))
    prediction = classes[result]

    if result == 0:
        print("plant.html")
        return "Plants", "plant.html"
    elif result == 1:
        print("weeds.html")
        return "weeds", "weeds.html"

@app.route('/', methods=['GET'])
def index():
    return render_template('index.html')

# (the upload route's decorator and opening lines are omitted in the source listing;
#  the body below saves the uploaded file and renders the prediction)
        file_path = os.path.join('static/user uploaded', filename)
        file.save(file_path)

        print("@@ Predicting class......")
        pred, output_page = model_predict(file_path)

        return render_template(output_page, pred_output=pred, user_image=file_path)

if __name__ == '__main__':
    app.run(debug=True, threaded=False)
Output
The login page for the "Leaf Disease Detection Using CNN Algorithm" application is
displayed. Navigation links for "Home", "Login", and "Register Here" are shown.
Here, the user can log in to the application using a username and password. The
"User Login Screen" section includes input fields for Username and Password, along
with a Login button.
Figure 6.2: Output 2
The output obtained shows that the leaf is diseased. Each prediction indicates the
likelihood of the input leaf image belonging to a specific disease class, such as
early blight, late blight, or others. The model's output serves as a diagnostic
tool, aiding in the identification and classification of various leaf diseases.
Users can interpret the predictions to understand the specific disease affecting the plant.
Chapter 7
7.1 Conclusion
This project presents an automated, low-cost and easy-to-use end-to-end solution
to one of the biggest challenges farmers face in the agricultural domain – precise,
instant and early diagnosis of crop diseases and knowledge of disease outbreaks –
which helps in quick decision making on the measures to be adopted for disease
control. The project innovates on known prior art with the application of deep
Convolutional Neural Networks (CNNs) for disease classification, the introduction
of a social collaborative platform for progressively improved accuracy, the usage
of geocoded images for disease density maps and an expert interface for analytics.
The high-performing deep CNN model "Inception" enables real-time classification of
diseases in the Cloud platform via a user-facing mobile app. The collaborative model
enables continuous improvement in disease classification accuracy by automatically
growing the Cloud-based training dataset with user-added images for retraining the
CNN model. User-added images in the Cloud repository also enable rendering of
disease density maps based on collective disease classification data and the
availability of geolocation information within the images. Overall, the results of
the experiments demonstrate that the proposal has significant potential for
practical deployment along multiple dimensions – the Cloud-based infrastructure is
highly scalable, and the underlying algorithm works accurately even with a large
number of disease categories, performs better with high-fidelity real-life training
data, improves accuracy as the training dataset grows, is capable of detecting
early symptoms of diseases and is able to successfully differentiate between
diseases of the same family.
7.2 Future Enhancements
Future work involves expanding the model to include more parameters which can
improve the correlation to the disease: an image database with supporting inputs
from the farmer on soil and past fertilizer and pesticide treatment, along with
publicly available environmental factors such as temperature, humidity and
rainfall, to improve model accuracy and enable disease forecasting. We also wish
to increase the number of crop diseases covered and reduce the need for expert
intervention except for new types of diseases. For automatic acceptance of
user-uploaded images into the training database with good classification accuracy
and the least possible human intervention, a simple technique of computing the
threshold based on a mean of all classification scores can be used. A further
application of this work could be to support automated time-based monitoring of
the disease density maps, which can be used to track the progress of a disease
and trigger alarms. Predictive analytics can be used to send alerts to users on
the possibility of disease outbreaks near their location.
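The mean-of-scores acceptance rule suggested above can be sketched as follows; the function and variable names are hypothetical, not taken from the project's code:

```python
import numpy as np

def auto_accept(top_score, past_scores):
    """Accept a user-uploaded image into the training set only when the
    model's top classification confidence beats the mean of past scores."""
    threshold = float(np.mean(past_scores))
    return top_score >= threshold

# Hypothetical history of top classification scores; mean = 0.86.
history = [0.91, 0.88, 0.95, 0.72, 0.84]

confident = auto_accept(0.93, history)    # above the running mean -> accept
uncertain = auto_accept(0.60, history)    # below the mean -> route to an expert
```

Rejected images would be queued for expert review rather than discarded, which also feeds the collaborative labeling loop described in the conclusion.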
Chapter 8
PLAGIARISM REPORT
Chapter 9
#!/usr/bin/env python
# coding: utf-8
# (cells that define mv, model, training_set/validation_set, epochs,
#  loss/val_loss and img_width/img_height are omitted in the source listing)

IMAGE_SIZE = [224, 224]

train_path = "agriculture/Train/"

from keras.preprocessing.image import ImageDataGenerator

train_datagen = ImageDataGenerator(rescale=1./255, horizontal_flip=True,
                                   zoom_range=0.2, validation_split=0.15)

from tensorflow.keras.applications import VGG19
from tensorflow.keras.layers import GlobalAveragePooling2D, Dropout

# We are initialising the input shape with 3 channels (RGB) and weights as
# imagenet; include_top=False lets us use our own custom inputs.

# Freeze the pretrained base so only the custom head is trained.
for layers in mv.layers:
    layers.trainable = False

model.summary()

import tensorflow as tf

class myCallback(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs={}):
        if logs.get('loss') <= 0.05:
            print("\nEnding training")
            self.model.stop_training = True

# initiating the myCallback function
callbacks = myCallback()

history = model.fit(training_set,
                    validation_data=validation_set,
                    epochs=50,
                    verbose=1,
                    steps_per_epoch=len(training_set),
                    validation_steps=len(validation_set),
                    callbacks=[callbacks])

acc = history.history['categorical_accuracy']
val_acc = history.history['val_categorical_accuracy']

import matplotlib.pyplot as plt
plt.plot(epochs, acc)
plt.plot(epochs, val_acc)
plt.title("Training and validation Accuracy")
plt.figure()

plt.plot(epochs, loss)
plt.plot(epochs, val_loss)
plt.title("Training and validation Loss")
plt.figure()

from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing import image
import numpy as np

# load the model we saved
model = load_model('VGG-19.h5')
# predicting images
img = image.load_img('agriculture/Test/plants/agri03.jpeg',
                     target_size=(img_width, img_height))
x = image.img_to_array(img)
x = np.expand_dims(x, axis=0)

classes = model.predict(x)
print(classes)
9.2 Poster Presentation
References