
A

Mini Project
On
Android Application on Plant Leaf Disease Detection using
Machine Learning
(Submitted in partial fulfillment of the requirements for the award of Degree)
BACHELOR OF TECHNOLOGY
In
COMPUTER SCIENCE AND ENGINEERING
By
B. KARTHIK (197R1A0570)
C. RAHUL RAO (197R1A0571)
U. SRIKANTH (197R1A05B2)

Under the Guidance of


Dr. G. Somasekhar
(Associate Professor)

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING

CMR TECHNICAL CAMPUS

UGC AUTONOMOUS
(Accredited by NAAC, NBA, Permanently Affiliated to JNTUH, Approved by AICTE, New Delhi)
Recognized Under Section 2(f) & 12(B) of the UGC Act, 1956, Kandlakoya (V),
Medchal Road, Hyderabad-501401.


2019-2023
DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING

CERTIFICATE
This is to certify that the project entitled “ANDROID APPLICATION ON
PLANT LEAF DISEASE DETECTION USING MACHINE LEARNING ” being
submitted by B. KARTHIK (197R1A0570) C. RAHUL RAO (197R1A0571) & U.
SRIKANTH (197R1A05B2) in partial fulfillment of the requirements for the award of the
degree of B.Tech in Computer Science and Engineering to the Jawaharlal Nehru
Technological University Hyderabad, is a record of bonafide work carried out by them
under our guidance and supervision during the year 2022-23.

The results embodied in this thesis have not been submitted to any other University
or Institute for the award of any degree or diploma.

Dr. G. Somasekhar Dr. A. Raji Reddy


(Associate Professor) DIRECTOR
INTERNAL GUIDE

Dr. K. Srujan Raju EXTERNAL EXAMINER


HOD

Submitted for viva voce Examination held on


ACKNOWLEDGEMENT

Apart from the efforts of us, the success of any project depends largely on the
encouragement and guidelines of many others. We take this opportunity to express our
gratitude to the people who have been instrumental in the successful completion of this
project.
We take this opportunity to express our profound gratitude and deep regard to
our guide Dr. G. Somasekhar, Associate Professor, for his exemplary guidance,
monitoring and constant encouragement throughout the project work. The blessing, help
and guidance given by him shall carry us a long way in the journey of life on which we
are about to embark.

We also take this opportunity to express a deep sense of gratitude to the


Project Review Committee (PRC) Dr. Punyaban Patel, Ms. K. Shilpa, Dr. M. Subha
Mastan Rao & J. Narasimharao for their cordial support, valuable information and
guidance, which helped us in completing this task through various stages.
We are also thankful to Dr. K. Srujan Raju, Head, Department of Computer
Science and Engineering for providing encouragement and support for completing this
project successfully.
We are obliged to Dr. A. Raji Reddy, Director for being cooperative
throughout the course of this project. We also express our sincere gratitude to Sri. Ch.
Gopal Reddy, Chairman for providing excellent infrastructure and a nice atmosphere
throughout the course of this project.
We acknowledge the guidance and support received from all the members of CMR
Technical Campus who contributed to the completion of this project. We are grateful
for their constant support and help.
Finally, we would like to take this opportunity to thank our family for their
constant encouragement, without which this assignment would not be completed. We
sincerely acknowledge and thank all those who gave support directly and indirectly in
the completion of this project.

B. KARTHIK (197R1A0570)
C. RAHUL RAO (197R1A0571)
U. SRIKANTH (197R1A05B2)
ABSTRACT

The standing of any country in the world depends on its agricultural
production, and India is largely dependent on agriculture. A wide variety of
plants can be cultivated for maximum productivity, but the yield depends on the
environment and is further affected by diseases of the crop. A farmer's yield
always depends on the health of the crop; if disease can be prevented, the
yield will likely increase. The disease that causes a crop to decline can often
be recognized from the leaves of the plant.

The method uses machine learning, an Artificial Intelligence technique,
to assess the condition of the leaf. The plant leaf provides the most important
data for distinguishing the disease of the plant. The Android app developed
here gives farmers the ability to identify plant leaf diseases from an image of
the leaf taken with the app's camera source. Detecting leaf diseases at an
early stage makes it possible to overcome and treat them appropriately, by
providing the farmer with details of the preventive measures to be taken. The
result is an Android mobile application which can automatically identify a
plant's diseases based on leaf appearance using computer vision and machine
learning techniques. The target users are those who want a free and quick
diagnosis of common diseases at any time of the day.

i
LIST OF FIGURES
FIGURE NO FIGURE NAME PAGE NO

Figure 4.1 Project Architecture 9

Figure 4.2 Use Case Diagram 10

Figure 4.3 Class Diagram 11

Figure 4.4 Sequence Diagram 12

Figure 4.5 Activity Diagram 13

ii
LIST OF SCREENSHOTS

SCREENSHOT NO. SCREENSHOT NAME PAGE NO.

Screenshot 6.1 Splash screen 27

Screenshot 6.2 Registration page 28

Screenshot 6.3 Login page 29

Screenshot 6.4 Main screen 30

Screenshot 6.5 Results display 31

Screenshot 6.6 Fertilizers list 32


iii
TABLE OF CONTENTS
ABSTRACT i
LIST OF FIGURES ii
LIST OF SCREENSHOTS iii
1. INTRODUCTION 1
1.1 PROJECT SCOPE 1
1.2 PROJECT PURPOSE 1
1.3 PROJECT FEATURES 1
2. LITERATURE SURVEY
3. SYSTEM ANALYSIS 2
3.1 PROBLEM DEFINITION 2
3.2 EXISTING SYSTEM 2
3.2.1 LIMITATIONS OF THE EXISTING SYSTEM 3
3.3 PROPOSED SYSTEM 3
3.3.1 ADVANTAGES OF PROPOSED SYSTEM 3
3.4 FEASIBILITY STUDY 4
3.4.1 ECONOMIC FEASIBILITY 4
3.4.2 TECHNICAL FEASIBILITY 5
3.4.3 SOCIAL FEASIBILITY 5
3.5 HARDWARE & SOFTWARE REQUIREMENTS 5
3.5.1 HARDWARE REQUIREMENTS 5
3.5.2 SOFTWARE REQUIREMENTS 6
4. ARCHITECTURE 7
4.1 PROJECT ARCHITECTURE 7
4.2 DESCRIPTION 7
4.3 USE CASE DIAGRAM 8
4.4 CLASS DIAGRAM 9
4.5 SEQUENCE DIAGRAM 10
4.6 ACTIVITY DIAGRAM 11
5. IMPLEMENTATION 12
5.1 SAMPLE CODE 12
6. RESULTS 16
7. TESTING 19
7.1 INTRODUCTION TO TESTING 19

7.2 TYPES OF TESTING


7.2.1 UNIT TESTING 19
7.2.2 INTEGRATION TESTING 20
7.2.3 FUNCTIONAL TESTING 20

7.3 TEST CASES 21


7.3.1 UPLOADING DATASET
7.3.2 CLASSIFICATION 21
8. CONCLUSION & FUTURE SCOPE 22
8.1 PROJECT CONCLUSION 22
8.2 FUTURE SCOPE 22
9. REFERENCES 23

9.1 REFERENCES 23
9.2 GITHUB LINK 23
ANDROID APPLICATION ON PLANT LEAF DISEASE DETECTION USING MACHINE LEARNING

1. INTRODUCTION

1.1 PROJECT SCOPE

Plant diseases cause major production and economic losses in the

agricultural industry, and disease management is a challenging task. Usually, a
disease or its symptoms are seen on the leaves of a plant. With the help of image
processing, various diseases can be detected automatically. Image processing
plays a crucial role in the detection of plant diseases since it provides the best
results and reduces human effort. Image processing can be used in the field of
agriculture for several applications, including detection of a diseased leaf, stem
or fruit, measuring the area affected by disease, and determining the color of
the affected area.

1.2 PROJECT PURPOSE


Android-based smartphones are used to capture images of the plant to
detect its disease, while a deep learning neural network algorithm is utilized to
distinguish the disease of the plant. The models were trained using
classification approaches that could identify the diseases at a certain rate and
accuracy depending on the number of images used. The deep learning neural
network classification therefore demonstrates identification of the common
plant diseases found.

1.3 PROJECT FEATURES


TensorFlow and machine vision have been widely used in monitoring
plants, harvesting, and other stages of plant growing. For example, Liu et al.
presented a review about the use of machine vision systems in identifying common
invertebrates on crops, such as butterflies, locusts, snails and slugs. TensorFlow is
usually combined with artificial intelligence techniques like neural networks to detect
mature fruits; in these cases, the accuracy ranges between 60% and 100% depending
on the type of fruit and other conditions.

CMRTC 1

Crop monitoring is another domain where machine vision has been


adopted (e.g., for production monitoring, the detection of diseases, or insect
invasion).

The early detection of plant diseases is necessary for effective control,


ensuring that the financial cost and the environmental impact of their treatment
will be minimized. If plant diseases are not treated in their early stages, the
production cost can increase significantly since the disease can propagate to
the whole crop. The traditional way is to hire professional agriculturists who
monitor the plants, but this may not always be possible for farms set in isolated
rural areas. Moreover, the cost may not be affordable for smaller farmers. Remote
monitoring and automation offered by precision agriculture solutions can reduce
the cost and offer efficient protection.

The initial detection of a disease can be based on machine vision and


TensorFlow, which will generate an alert if its symptoms are recognized. Molecular
analyses have a higher cost but may be carried out later if a plant disease has to be
formally confirmed. Plant disease diagnosis can be based on various symptoms,
as described in the literature.

Symptoms can often be grouped as: (a) underdevelopment of tissues or


organs (short internodes, underdeveloped roots, malformed leaves, lack of
chlorophyll, fruits and flowers that do not develop), (b) overdevelopment of plant
parts like tissues or organs, (c) necrosis of plant parts (wilts, shoot or leaf blights,
leaf spots, fruit rots) and (d) alterations like mosaic patterns and altered coloration
in leaves and flowers. The progression of the disease symptoms can vary
significantly, and biotic agents affect the speed of the symptom progression. There
are primary and secondary symptoms of a disease.

2. LITERATURE SURVEY

1. Liu H, Lee S.-H, Chahl J.-S. A review of recent sensing technologies to


detect invertebrates on crops. Precis. Agric. 2017, 18, 635–666. Biosecurity
surveillance has been highlighted as a key activity to discover non-native species at
the initial stage of invasion. It provides an opportunity for rapidly initiating
eradication measures and implementing responses to prevent spread and permanent
establishment, reducing costs and damage. In importing countries, three types of
biosecurity activities can be carried out: border surveillance targets the arrival stage
of a non-native species at points-of-entry for commodities; post-border surveillance
and containment target the establishment stage, but post-border surveillance is
carried out on a large spatial scale, whereas containment is carried out around
infested areas. In recent years, several surveillance approaches, such as baited
traps, sentinel trees, biosurveillance with sniffer dogs or predatory wasps,
electronic noses, acoustic detection, laser micrometers, citizen science, genetic
identification tools, and remote sensing, have been developed to complement
routine visual inspections and aid biosecurity capacity. Here, we review the
existing literature on these tools, highlight their strengths and weaknesses, and
identify the biosecurity surveillance categories and sites where each tool can be
used more efficiently. Finally, we show how these tools can be integrated in a
comprehensive biosecurity program and discuss steps to improve biosecurity.

2. Kurtulmus F, Lee W.S, Vardar A. Immature peach detection in color


images acquired in natural illumination conditions using statistical classifiers and
neural networks. Precis. Agric. 2014, 15, 57–79. A fast normalized cross
correlation (FNCC) based machine vision algorithm was proposed in this study to
develop a method for detecting and counting immature green citrus fruit using
outdoor color images toward the development of an early yield mapping system.
As a template matching method, FNCC was used to detect potential fruit areas in
the image, which was the very basis for subsequent false positive removal.
Multiple features, including color, shape and texture features, were combined in
this algorithm to remove false positives. Circular Hough transform (CHT) was
used to detect circles from images after background removal based on color
components.

After building disks centered at the centroids resulting from both FNCC and CHT,
the detection results were merged based on the size and Euclidean distance of the
intersection areas of the disks from these two methods. Finally, the number of
fruits was determined after false positive removal using texture features. For a
validation dataset of 59 images, 84.4% of the fruits were successfully detected,
which indicated the potential of the proposed method toward the development of
an early yield mapping system.
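The FNCC approach above rests on the normalized cross-correlation score between a template and an image patch. As an illustrative sketch (not the paper's implementation), the score for two equal-length patches can be computed like this; the class and array values are hypothetical:

```java
// Minimal sketch of the normalized cross-correlation score underlying
// FNCC template matching, for two flattened, equal-length patches.
public class NccDemo {
    // Returns the normalized cross-correlation of two patches, in [-1, 1].
    public static double ncc(double[] a, double[] b) {
        double meanA = 0, meanB = 0;
        for (int i = 0; i < a.length; i++) { meanA += a[i]; meanB += b[i]; }
        meanA /= a.length;
        meanB /= b.length;
        double num = 0, varA = 0, varB = 0;
        for (int i = 0; i < a.length; i++) {
            num  += (a[i] - meanA) * (b[i] - meanB);   // cross term
            varA += (a[i] - meanA) * (a[i] - meanA);   // self term of a
            varB += (b[i] - meanB) * (b[i] - meanB);   // self term of b
        }
        return num / Math.sqrt(varA * varB);
    }

    public static void main(String[] args) {
        double[] patch    = {10, 20, 30, 40};
        double[] template = {1, 2, 3, 4};   // same shape, different brightness/scale
        // NCC is invariant to linear brightness/contrast changes, so this is 1.0
        System.out.println(ncc(patch, template));
    }
}
```

Because the score normalizes out mean and contrast, a template still matches a brighter or darker copy of itself, which is why FNCC copes with natural illumination changes.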

3. Chaivivatrakul S, Dailey M.N. Texture-based fruit detection. Precis.


Agric. 2014, 15, 662–683. A technique based on texture analysis is proposed for
detecting green fruits on plants. The method involves interest point feature
extraction and descriptor computation, interest point classification using support
vector machines, candidate fruit point mapping, morphological closing and fruit
region extraction. In an empirical study using low-cost web camera sensors
suitable for use in mechanized systems, 24 combinations of interest point features
and interest point descriptors were evaluated on two fruit types (pineapple and
bitter melon). The method is highly accurate, with single-image detection rates of
85 % for pineapples and 100 % for bitter melons. The method is thus sufficiently
accurate for precise location and monitoring of textured fruit in the field. Future
work will explore a combination of detection and tracking for further improved
results.

3. SYSTEM ANALYSIS

System analysis is an important phase in the system development


process. The system is studied in minute detail and analyzed. The system
analyst plays the role of an interrogator, delving deep into the working
of the present system. In analysis, a detailed study is made of the operations
performed by the system and their relationships within and outside the system. A key
question considered here is, "what must be done to solve the problem?" The
system is viewed as a whole and the inputs to the system are identified. Once
analysis is completed, the analyst has a firm understanding of what is to be done.

3.1 PROBLEM DEFINITION

In today's world, agricultural land is more than just a feeding resource.
Considering the great benefits that agriculture has given the world, feeding a large
population still faces a number of challenges. Agriculture worldwide is affected by
many existential challenges; two massive ones are the rise of the human
population and the loss of arable land. To provide an adequate supply and
maximize output, modern agriculture has promoted the immoderate use of
materials such as pesticides, one of the causes of unhealthy agricultural
conditions, which in turn renders plants more susceptible to pathogen attacks and
makes plant diseases more difficult to manage. Screening plants as required
would need a group of experts, and if carried out on a large scale, the costs are
very high.

3.2 EXISTING SYSTEM

In the existing system, farmers suffer losses in their crops at the end of
the year, mainly because they do not have enough knowledge to estimate the
disease of the crop. The farmer fails to identify the crop's disease and treat it
in time, which makes the farmer more prone to loss.


3.2.1 DISADVANTAGES OF EXISTING SYSTEM

The disadvantages of the existing system are:

● Recognition rate is very low


● Classification is not possible
● Lack of segmentation
● Vast identification criteria
● Delay in prevention
● Huge loss occurs

3.3 PROPOSED SYSTEM


To reduce the loss percentage of the crops, we present an Android app
which distinguishes and identifies the symptoms of disease on a plant leaf. Our
app works on plants infected by many diseases caused by agents such as fungi
and viruses. It detects and classifies plant disease using machine learning
techniques: the actual type of disease is identified, and its preventive measures
and related recovery notes are displayed using the CNN algorithm. Finally, we
get all information regarding the disease, its symptoms, its preventive
mechanism, and recovery suggestions in very little time and at low cost.

3.3.1 ADVANTAGES OF THE PROPOSED SYSTEM

The advantages of the proposed system are:

● High prediction accuracy


● Robust working system
● High processing speed
● Quicker detection
● Prevention of spreading
● Dataset updation

3.4 FEASIBILITY STUDY

The feasibility of the project is analyzed in this phase, and a business


proposal is put forth with a very general plan for the project and some cost
estimates. During system analysis the feasibility study of the proposed system is
carried out, to ensure that the proposed system is not a burden to the
company. Three key considerations are involved in the feasibility analysis:

● Economic Feasibility

● Technical Feasibility

● Social Feasibility

3.4.1 ECONOMIC FEASIBILITY

This study is carried out to check the economic impact that the system will
have on the organization. The amount of funds that the company can pour into the
research and development of the system is limited, so the expenditures must be
justified. The developed system was well within the budget, and this was
achieved because most of the technologies used are freely available. Only the
customized products had to be purchased.

3.4.2 TECHNICAL FEASIBILITY

This study is carried out to check the technical feasibility, that is, the
technical requirements of the system. Any system developed must not place a
high demand on the available technical resources, as this would lead to high
demands being placed on the client. The developed system therefore has modest
requirements, as only minimal or no charges are required for implementing it.

3.4.3 BEHAVIORAL FEASIBILITY

This includes the following questions:


● Is there sufficient support for the users?
● Will the proposed system cause harm?

The project would be beneficial because it satisfies the objectives when


developed and installed. All behavioral aspects have been considered carefully,
and we conclude that the project is behaviorally feasible.

3.5 HARDWARE & SOFTWARE REQUIREMENTS

3.5.1 HARDWARE REQUIREMENTS:

Hardware interfaces specify the logical characteristics of each interface


between the software product and the hardware components of the system. The
following are some hardware requirements.

● GPU : NVIDIA GTX 1050
● Memory : 256 GB
● RAM : 8 GB

3.5.2 SOFTWARE REQUIREMENTS:

Software Requirements specifies the logical characteristics of each


interface and software components of the system. The following are some software
requirements,

● TensorFlow
● Java Development Kit
● Android Studio
● Android Virtual Device
4. ARCHITECTURE

4.1 PROJECT ARCHITECTURE


This project architecture shows the procedure followed for the detection
of disease, starting from the input image to the final result.

Figure 4.1: Project Architecture for Android Application on Plant Leaf


Disease Detection Using Machine Learning

4.2 DESCRIPTION

Input Data: Input data is generally in JPG or PNG format; the data is
fetched and mapped into the data frame from the source columns.
Separating Features: In this step we separate the features used to train the
model by giving the target value, i.e. 1/0, for the particular features.
Normalization: Normalization is a very important step when dealing with
large values in the features, as higher-bit integers cost more computational
power and time. To achieve efficiency in computation, we normalize the data
values.
Training and test data: Training data is passed to the CNN classifier to train the
model. Test data is used to check whether the trained model makes correct
predictions or not.
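The normalization step can be sketched in plain Java. The constants below mirror the IMAGE_MEAN = 128 and IMAGE_STD = 128.0f values that appear later in CameraRollActivity; the mapping itself is a standard (value − mean) / std rescaling, and the class name is illustrative:

```java
// Sketch of pixel normalization: 8-bit channel values (0..255) are shifted
// and scaled into roughly [-1, 1] before being fed to the classifier.
public class NormalizeDemo {
    static final int IMAGE_MEAN = 128;      // matches CameraRollActivity
    static final float IMAGE_STD = 128.0f;  // matches CameraRollActivity

    // Normalizes one colour channel value.
    static float normalize(int channel) {
        return (channel - IMAGE_MEAN) / IMAGE_STD;
    }

    public static void main(String[] args) {
        System.out.println(normalize(0));    // darkest pixel  -> -1.0
        System.out.println(normalize(128));  // mid grey       ->  0.0
        System.out.println(normalize(255));  // brightest      ->  0.9921875
    }
}
```

Keeping inputs in a small, centred range avoids large intermediate values and matches what the MobileNet-style model expects.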


CNN Classifier: The CNN classifier was chosen for this project because of the
efficiency and accuracy we observed compared to other classifiers.
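The training/test separation described above can be sketched as a simple shuffled split. The 80/20 ratio, file names, and class name here are illustrative assumptions; the report does not state the exact ratio used:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

// Illustrative shuffled train/test split of labelled image paths.
public class SplitDemo {
    // Splits items at the given ratio; the first part is the training set.
    static <T> List<List<T>> split(List<T> items, double trainRatio) {
        int cut = (int) (items.size() * trainRatio);
        List<List<T>> parts = new ArrayList<>();
        parts.add(new ArrayList<>(items.subList(0, cut)));
        parts.add(new ArrayList<>(items.subList(cut, items.size())));
        return parts;
    }

    public static void main(String[] args) {
        List<String> images = new ArrayList<>();
        for (int i = 0; i < 10; i++) images.add("leaf_" + i + ".jpg");
        Collections.shuffle(images, new Random(42)); // reproducible shuffle
        List<List<String>> parts = split(images, 0.8);
        // 8 images fit the CNN, 2 are held out to check its predictions
        System.out.println(parts.get(0).size() + " train / " + parts.get(1).size() + " test");
    }
}
```

Shuffling before splitting matters: if images were grouped by disease class on disk, an unshuffled split would leave some classes entirely out of the training set.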

4.3 USE CASE DIAGRAM

In the use case diagram we have basically two actors, the user and
the system. The user provides the data in the form of images, and the system
verifies the data in order to give results.

Figure 4.2: Use Case Diagram for Android Application on Plant Leaf Disease
Detection Using Machine Learning


4.4 CLASS DIAGRAM

A class diagram is a collection of classes and objects. It is a type of static


structure diagram that describes the structure of a system by showing the system's
classes, their attributes, operations (or methods), and the relationships among
objects.
Figure 4.3: Class Diagram for Android Application on Plant Leaf Disease
Detection Using Machine Learning


4.5 SEQUENCE DIAGRAM

A sequence diagram is an interaction diagram that details how operations are


carried out: what messages are sent and when. Sequence diagrams are organized
according to time; time progresses as you go down the page. The objects
involved in the operation are listed from left to right according to when they take
part in the message sequence.

Figure 4.4: Sequence Diagram for Android Application on Plant Leaf


Disease Detection Using Machine Learning


4.6 ACTIVITY DIAGRAM

An activity diagram is basically a flowchart representing the flow from one


activity to another. An activity can be described as an operation of the
system. The control flow is drawn from one operation to another; this flow can be
sequential, branched, or concurrent. Activity diagrams deal with all types of flow
control by using different elements such as fork, join, etc.

Figure 4.5: Activity Diagram for Android Application on Plant Leaf Disease Detection
Using Machine Learning


5. IMPLEMENTATION

5.1 SAMPLE CODE

5.1.1 MAIN ACTIVITY

package org.tensorflow.demo;

import android.content.Intent;
import android.os.Bundle;
import android.os.Handler;
import android.support.v7.app.AppCompatActivity;
import android.view.WindowManager;
import android.view.animation.Animation;
import android.view.animation.AnimationUtils;
import android.widget.ImageView;
import android.widget.TextView;

public class MainActivity extends AppCompatActivity {
    Animation topAnim, bottomAnim;
    ImageView image;
    TextView logo;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.activity_main);
        topAnim = AnimationUtils.loadAnimation(this, R.anim.top_animation);
        bottomAnim = AnimationUtils.loadAnimation(this, R.anim.bottom_animation);
        image = findViewById(R.id.imageView2);
        logo = findViewById(R.id.textView2);
        image.setAnimation(topAnim);
        logo.setAnimation(bottomAnim);
        // Show the animated splash screen for 3 seconds, then open the Login screen.
        int secondsDelayed = 1;
        new Handler().postDelayed(new Runnable() {
            public void run() {
                startActivity(new Intent(MainActivity.this, Login.class));
                finish();
            }
        }, secondsDelayed * 3000);
    }
}

5.1.2 LOGIN ACTIVITY

package org.tensorflow.demo;

import android.content.DialogInterface;
import android.content.Intent;
import android.os.Bundle;
import android.support.v7.app.AlertDialog;
import android.support.v7.app.AppCompatActivity;
import android.view.View;
import android.view.WindowManager;
import android.widget.EditText;
import android.widget.Toast;
import com.android.volley.AuthFailureError;
import com.android.volley.Request;
import com.android.volley.RequestQueue;
import com.android.volley.Response;
import com.android.volley.VolleyError;
import com.android.volley.toolbox.StringRequest;
import com.android.volley.toolbox.Volley;
import org.json.JSONException;
import org.json.JSONObject;
import java.util.HashMap;
import java.util.Map;

public class Login extends AppCompatActivity {
    EditText lui, lps;
    private static final String URL = "http://wizzie.tech/leaf/login.php";
    private static final String URLF = "http://wizzie.tech/leaf/fp.php";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.activity_login);
        lui = findViewById(R.id.lui);
        lps = findViewById(R.id.lps);
    }

    public void signup(View view) {
        startActivity(new Intent(Login.this, Register.class));
    }

    public void login(View view) {
        // POST the username/password to the server; on "success", open the camera roll.
        StringRequest stringRequest = new StringRequest(Request.Method.POST, URL,
                new Response.Listener<String>() {
                    @Override
                    public void onResponse(String response) {
                        try {
                            JSONObject jsonObject = new JSONObject(response);
                            if (jsonObject.getString("result").equals("success")) {
                                Toast.makeText(Login.this, "Login " + jsonObject.getString("result"),
                                        Toast.LENGTH_LONG).show();
                                startActivity(new Intent(Login.this, CameraRollActivity.class));
                            }
                        } catch (JSONException e) {
                            e.printStackTrace();
                        }
                    }
                },
                new Response.ErrorListener() {
                    @Override
                    public void onErrorResponse(VolleyError error) {
                        Toast.makeText(Login.this, error.toString(), Toast.LENGTH_LONG).show();
                    }
                }) {
            @Override
            protected Map<String, String> getParams() throws AuthFailureError {
                Map<String, String> params = new HashMap<String, String>();
                params.put("u", lui.getText().toString().trim());
                params.put("p", lps.getText().toString().trim());
                return params;
            }
        };
        RequestQueue requestQueue = Volley.newRequestQueue(this);
        requestQueue.add(stringRequest);
    }

    public void fp(View view) {
        // "Forgot password" dialog: asks for the email and shows the registered mobile.
        final EditText editText = new EditText(Login.this);
        final AlertDialog alertDialog = new AlertDialog.Builder(Login.this)
                .setTitle("Enter Your Email")
                .setView(editText)
                .setPositiveButton("ok", new DialogInterface.OnClickListener() {
                    @Override
                    public void onClick(DialogInterface dialog, int which) {
                        StringRequest stringRequest = new StringRequest(Request.Method.POST, URLF,
                                new Response.Listener<String>() {
                                    @Override
                                    public void onResponse(String response) {
                                        try {
                                            JSONObject jsonObject = new JSONObject(response);
                                            Toast.makeText(Login.this, "" + jsonObject.getString("mobile"),
                                                    Toast.LENGTH_LONG).show();
                                        } catch (JSONException e) {
                                            e.printStackTrace();
                                        }
                                    }
                                },
                                new Response.ErrorListener() {
                                    @Override
                                    public void onErrorResponse(VolleyError error) {
                                    }
                                }) {
                            @Override
                            protected Map<String, String> getParams() throws AuthFailureError {
                                Map<String, String> params = new HashMap<String, String>();
                                params.put("u", editText.getText().toString().trim());
                                return params;
                            }
                        };
                        RequestQueue requestQueue = Volley.newRequestQueue(Login.this);
                        requestQueue.add(stringRequest);
                    }
                })
                .setNegativeButton("cancel", new DialogInterface.OnClickListener() {
                    @Override
                    public void onClick(DialogInterface dialog, int which) {
                    }
                })
                .create();
        alertDialog.show();
    }
}
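Volley serializes the map returned by getParams() into an application/x-www-form-urlencoded request body. A minimal sketch of that encoding, assuming UTF-8 and stable key order (Volley's internal details may differ, and the class name is illustrative):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of what Volley does with the getParams() map: key/value pairs are
// URL-encoded and joined into an application/x-www-form-urlencoded body.
public class FormBodyDemo {
    static String encode(Map<String, String> params) {
        StringBuilder body = new StringBuilder();
        for (Map.Entry<String, String> e : params.entrySet()) {
            if (body.length() > 0) body.append('&');
            body.append(URLEncoder.encode(e.getKey(), StandardCharsets.UTF_8))
                .append('=')
                .append(URLEncoder.encode(e.getValue(), StandardCharsets.UTF_8));
        }
        return body.toString();
    }

    public static void main(String[] args) {
        // LinkedHashMap keeps the output order stable for the demo.
        Map<String, String> params = new LinkedHashMap<>();
        params.put("u", "farmer one");
        params.put("p", "secret");
        System.out.println(encode(params)); // prints u=farmer+one&p=secret
    }
}
```

Note that URLEncoder encodes spaces as "+", which is exactly the form-body convention the PHP endpoint decodes on the server side.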

5.1.3 REGISTER

package org.tensorflow.demo;

import android.content.Intent;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.view.View;
import android.view.WindowManager;
import android.widget.EditText;
import android.widget.Toast;
import com.android.volley.AuthFailureError;
import com.android.volley.Request;
import com.android.volley.RequestQueue;
import com.android.volley.Response;
import com.android.volley.VolleyError;
import com.android.volley.toolbox.StringRequest;
import com.android.volley.toolbox.Volley;
import org.json.JSONException;
import org.json.JSONObject;
import java.util.HashMap;
import java.util.Map;

public class Register extends AppCompatActivity {
    EditText name, id, mobile, email, password;
    private static final String URL = "http://wizzie.tech/leaf/register.php";
    String emailPattern = "[a-zA-Z0-9._-]+@[a-z]+\\.+[a-z]+";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.activity_register);
        name = findViewById(R.id.name);
        id = findViewById(R.id.id);
        email = findViewById(R.id.email);
        mobile = findViewById(R.id.mob);
        password = findViewById(R.id.ps);
    }

    public void reg(View view) {
        if (name.getText().toString().equals("")) {
            Toast.makeText(this, "Enter User name", Toast.LENGTH_SHORT).show();
        } else if (id.getText().toString().equals("")) {
            Toast.makeText(this, "Enter User ID", Toast.LENGTH_SHORT).show();
        } else if (mobile.getText().toString().equals("")) {
            Toast.makeText(this, "Enter Mobile Number", Toast.LENGTH_SHORT).show();
        } else if (password.getText().toString().equals("")) {
            Toast.makeText(this, "Enter Password", Toast.LENGTH_SHORT).show();
        } else if (email.getText().toString().matches(emailPattern)) {
            StringRequest stringRequest = new StringRequest(Request.Method.POST, URL,
                    new Response.Listener<String>() {
                        @Override
                        public void onResponse(String response) {
                            try {
                                JSONObject jsonObject = new JSONObject(response);
                                if (jsonObject.getString("result").equals("success")) {
                                    startActivity(new Intent(Register.this, Login.class));
                                    Toast.makeText(Register.this, "Registered Successfully",
                                            Toast.LENGTH_SHORT).show();
                                }
                            } catch (JSONException e) {
                                e.printStackTrace();
                            }
                        }
                    },
                    new Response.ErrorListener() {
                        @Override
                        public void onErrorResponse(VolleyError error) {
                            Toast.makeText(Register.this, error.toString(), Toast.LENGTH_LONG).show();
                        }
                    }) {
                @Override
                protected Map<String, String> getParams() throws AuthFailureError {
                    Map<String, String> params = new HashMap<String, String>();
                    params.put("u", name.getText().toString());
                    params.put("i", id.getText().toString());
                    params.put("e", email.getText().toString());
                    params.put("m", mobile.getText().toString());
                    params.put("p", password.getText().toString());
                    return params;
                }
            };
            RequestQueue requestQueue = Volley.newRequestQueue(this);
            requestQueue.add(stringRequest);
        } else {
            Toast.makeText(this, "Enter Correct Email", Toast.LENGTH_SHORT).show();
        }
    }
}
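The emailPattern used in Register is quite strict: it requires a dot in the domain and accepts only lowercase letters after the "@". A quick standalone check of the exact pattern from the activity above (the demo class name is illustrative):

```java
// Demonstrates what the Register activity's email regex accepts and rejects.
public class EmailPatternDemo {
    // Exact pattern from Register.emailPattern
    static final String EMAIL_PATTERN = "[a-zA-Z0-9._-]+@[a-z]+\\.+[a-z]+";

    public static void main(String[] args) {
        System.out.println("rahul@gmail.com".matches(EMAIL_PATTERN)); // true
        System.out.println("rahul@gmail".matches(EMAIL_PATTERN));     // false: no dot in domain
        System.out.println("rahul@Gmail.com".matches(EMAIL_PATTERN)); // false: uppercase domain
    }
}
```

Because String.matches() anchors the whole string, any address with an uppercase domain letter or a missing dot fails, which triggers the "Enter Correct Email" toast.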

5.1.4 PESTICIDE ACTIVITY

package org.tensorflow.demo;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.view.WindowManager

CMRTC 18
ANDROID APPLICATION ON PLANT LEAF DISEASE DETECTION USING MACHINE LEARNING
public class pesticidesActivity extends AppCompatActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.activity_pesticides);
}
}

5.1.5 CAMERA ROLL ACTIVITY

package org.tensorflow.demo;
import android.app.Activity;
import android.content.Intent;

import android.graphics.Bitmap;
import android.net.Uri;
import android.os.Bundle;
import android.provider.MediaStore;
import android.speech.tts.TextToSpeech;
import android.support.v7.app.AppCompatActivity;
import android.view.View;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.ImageView;
import android.widget.Toast;
import com.squareup.picasso.Picasso;
import java.io.IOException;
import java.util.List;
public class CameraRollActivity extends AppCompatActivity {
private static final int SELECT_IMAGE = 505;
private RecognitionScoreView resultView;
private Bitmap bitmap;
TextToSpeech textToSpeech;
// Classifier
private Classifier classifier;
private static final int INPUT_SIZE = 224;
private static final int IMAGE_MEAN = 128;
private static final float IMAGE_STD = 128.0f;
private static final String INPUT_NAME = "input";
private static final String OUTPUT_NAME = "final_result";
private static final String MODEL_FILE =
"file:///android_asset/optimized_mobilenet_plant_graph.pb";
private static final String LABEL_FILE = "file:///android_asset/plant_labels.txt";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.activity_camera_roll);
Button chooseImage = (Button) findViewById(R.id.choose_image);
resultView = (RecognitionScoreView) findViewById(R.id.results);
resultView.setVisibility(View.INVISIBLE);
classifier =
TensorFlowImageClassifier.create(
getAssets(),
MODEL_FILE,
LABEL_FILE,
INPUT_SIZE,
IMAGE_MEAN,
IMAGE_STD,
INPUT_NAME,
OUTPUT_NAME);
chooseImage.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
Intent intent = new Intent();
intent.setType("image/*");
intent.setAction(Intent.ACTION_GET_CONTENT);
startActivityForResult(Intent.createChooser(intent, "Select Picture"),
SELECT_IMAGE);
}
});
}
@Override protected void onResume() {
super.onResume();
}
@Override protected void onActivityResult(int requestCode, int resultCode, Intent
data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == SELECT_IMAGE) {
if (resultCode == Activity.RESULT_OK) {
if (data != null) {
Uri selectedImageURI = data.getData();
Picasso.with(this).load(selectedImageURI).noPlaceholder().centerCrop().fit()
.into((ImageView) this.findViewById(R.id.image));
try
{
bitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(),
data.getData());
bitmap = Bitmap.createScaledBitmap(bitmap, 224, 224, false);
} catch (IOException e)
{
e.printStackTrace();
}
classifyImage();
}

} else if (resultCode == Activity.RESULT_CANCELED) {
Toast.makeText(this, "Cancelled", Toast.LENGTH_SHORT).show();
}
}
}
private void classifyImage() {
resultView.setVisibility(View.VISIBLE);
final List<Classifier.Recognition> results = classifier.recognizeImage(bitmap);
resultView.setResults(results);
textToSpeech = new TextToSpeech(getApplicationContext(), new TextToSpeech.OnInitListener() {
@Override
public void onInit(int status) {
if (status != TextToSpeech.ERROR) {
// Speak the top recognition instead of the raw list's toString().
String spoken = results.isEmpty() ? "no disease detected" : results.get(0).getTitle();
textToSpeech.speak("Detected disease is " + spoken, TextToSpeech.QUEUE_FLUSH, null);
}
}
});
}
public void pest(View view) {
startActivity(new Intent(CameraRollActivity.this, pesticidesActivity.class));
}
public void signout(View view) {
startActivity(new Intent(CameraRollActivity.this, Login.class));
}
}

5.1.6 CLASSIFIER

package org.tensorflow.demo;
import android.graphics.Bitmap;
import android.graphics.RectF;
import java.util.List;
public interface Classifier {
public class Recognition {
private final String id;
private final String title;
private final Float confidence;
private RectF location;
public Recognition(
final String id, final String title, final Float confidence, final RectF location) {
this.id = id;
this.title = title;
this.confidence = confidence;
this.location = location;
}
public String getId() {
return id;
}

public String getTitle() {
return title;
}
public Float getConfidence() {
return confidence;
}
public RectF getLocation() {
return new RectF(location);
}
public void setLocation(RectF location) {
this.location = location;
}
@Override
public String toString() {
String resultString = "";
if (id != null) {
resultString += "[" + id + "] ";
}
if (title != null) {
resultString += title + " ";
}
if (confidence != null) {
resultString += String.format("(%.1f%%) ", confidence * 100.0f);
}
if (location != null) {
resultString += location + " ";
}
return resultString.trim();
}
}

List<Recognition> recognizeImage(Bitmap bitmap);


void enableStatLogging(final boolean debug);
String getStatString();
void close();
}

5.1.7 RECOGNITION SCORE VIEW

package org.tensorflow.demo;
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.util.AttributeSet;
import android.util.TypedValue;
import android.view.View;
import org.tensorflow.demo.Classifier.Recognition;
import java.util.List;

public class RecognitionScoreView extends View implements ResultsView {
private static final float TEXT_SIZE_DIP = 15;
private List<Recognition> results;
private final float textSizePx;
private final Paint fgPaint;
private final Paint bgPaint;
public RecognitionScoreView(final Context context, final AttributeSet set) {
super(context, set);
textSizePx =
TypedValue.applyDimension(
TypedValue.COMPLEX_UNIT_DIP, TEXT_SIZE_DIP,
getResources().getDisplayMetrics());
fgPaint = new Paint();
fgPaint.setTextSize(textSizePx);
bgPaint = new Paint();
bgPaint.setColor(0xcc4285f4);
}
@Override
public void setResults(final List<Recognition> results) {
this.results = results;
postInvalidate();
}
@Override
public void onDraw(final Canvas canvas) {
final int x = 10;
int y = (int) (fgPaint.getTextSize() * 1.5f);
canvas.drawPaint(bgPaint);
if (results != null) {
for (final Recognition recog : results) {
canvas.drawText(recog.getTitle() + ": " + recog.getConfidence(), x, y, fgPaint);
y += fgPaint.getTextSize() * 1.5f;
}
}
}
}

5.1.8 RESULTS VIEW

package org.tensorflow.demo;
import org.tensorflow.demo.Classifier.Recognition;
import java.util.List;
public interface ResultsView {
public void setResults(final List<Recognition> results);
}

5.1.9 TENSORFLOW IMAGE CLASSIFIER

package org.tensorflow.demo;
import android.annotation.SuppressLint;
import android.content.res.AssetManager;
import android.graphics.Bitmap;

import android.os.Trace;
import android.util.Log;
import org.tensorflow.Operation;
import org.tensorflow.contrib.android.TensorFlowInferenceInterface;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.PriorityQueue;
import java.util.Vector;
public class TensorFlowImageClassifier implements Classifier {
private static final int MAX_RESULTS = 3;
private static final float THRESHOLD = 0.1f;
private static final String TAG = "TensorFlowImageClassifier"; // tag for the Log.i calls below
private String inputName;
private String outputName;
private int inputSize;
private int imageMean;
private float imageStd;
private Vector<String> labels = new Vector<String>();
private int[] intValues;
private float[] floatValues;
private float[] outputs;
private String[] outputNames;
private boolean logStats = false;
private TensorFlowInferenceInterface inferenceInterface;
private TensorFlowImageClassifier() {}
@SuppressLint("LongLogTag")
public static Classifier create(
AssetManager assetManager,
String modelFilename,
String labelFilename,
int inputSize,
int imageMean,
float imageStd,
String inputName,
String outputName) {
TensorFlowImageClassifier c = new TensorFlowImageClassifier();
c.inputName = inputName;
c.outputName = outputName;
String actualFilename = labelFilename.split("file:///android_asset/")[1];
Log.i(TAG, "Reading labels from: " + actualFilename);
BufferedReader br = null;
try {
br = new BufferedReader(new InputStreamReader(assetManager.open(actualFilename)));
String line;
while ((line = br.readLine()) != null) {
c.labels.add(line);

}
br.close();
} catch (IOException e) {
throw new RuntimeException("Problem reading label file!", e);
}
c.inferenceInterface = new TensorFlowInferenceInterface(assetManager,
modelFilename);
final Operation operation = c.inferenceInterface.graphOperation(outputName);
final int numClasses = (int) operation.output(0).shape().size(1);
Log.i(TAG, "Read " + c.labels.size() + " labels, output layer size is " +
numClasses);
c.inputSize = inputSize;
c.imageMean = imageMean;
c.imageStd = imageStd;
c.outputNames = new String[] {outputName};
c.intValues = new int[inputSize * inputSize];
c.floatValues = new float[inputSize * inputSize * 3];
c.outputs = new float[numClasses];
return c;
}
@Override
public List<Recognition> recognizeImage(final Bitmap bitmap) {
Trace.beginSection("recognizeImage");
Trace.beginSection("preprocessBitmap");
bitmap.getPixels(intValues, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(),
bitmap.getHeight());
for (int i = 0; i < intValues.length; ++i) {
final int val = intValues[i];
floatValues[i * 3 + 0] = (((val >> 16) & 0xFF) - imageMean) / imageStd;
floatValues[i * 3 + 1] = (((val >> 8) & 0xFF) - imageMean) / imageStd;
floatValues[i * 3 + 2] = ((val & 0xFF) - imageMean) / imageStd;
}
Trace.endSection();
Trace.beginSection("feed");
inferenceInterface.feed(inputName, floatValues, 1, inputSize, inputSize, 3);
Trace.endSection();
Trace.beginSection("run");
inferenceInterface.run(outputNames, logStats);
Trace.endSection();
Trace.beginSection("fetch");
inferenceInterface.fetch(outputName, outputs);
Trace.endSection();
PriorityQueue<Recognition> pq =
new PriorityQueue<Recognition>(
3,
new Comparator<Recognition>() {
@Override
public int compare(Recognition lhs, Recognition rhs) {
return Float.compare(rhs.getConfidence(), lhs.getConfidence());
}
});
for (int i = 0; i < outputs.length; ++i) {

if (outputs[i] > THRESHOLD) {
pq.add(
new Recognition(
"" + i, labels.size() > i ? labels.get(i) : "unknown", outputs[i], null));
}
}
final ArrayList<Recognition> recognitions = new ArrayList<Recognition>();
int recognitionsSize = Math.min(pq.size(), MAX_RESULTS);
for (int i = 0; i < recognitionsSize; ++i) {
recognitions.add(pq.poll());
}
Trace.endSection();
return recognitions;
}
@Override
public void enableStatLogging(boolean logStats) {
this.logStats = logStats;
}
@Override
public String getStatString() {
return inferenceInterface.getStatString();
}
@Override
public void close() {
inferenceInterface.close();
}
}
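The selection logic at the end of `recognizeImage` — keep only scores above `THRESHOLD`, then take the `MAX_RESULTS` highest — can be exercised in isolation. A plain-Java sketch of the same max-heap idea, using hypothetical label and score arrays:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.PriorityQueue;

public class TopK {
    // Mirrors the classifier's selection: filter by threshold, then poll the
    // highest-confidence indices first from a comparator-ordered PriorityQueue.
    public static List<String> topLabels(String[] labels, final float[] scores,
                                         float threshold, int maxResults) {
        PriorityQueue<Integer> pq = new PriorityQueue<>(
                (a, b) -> Float.compare(scores[b], scores[a]));
        for (int i = 0; i < scores.length; i++) {
            if (scores[i] > threshold) pq.add(i);
        }
        List<String> out = new ArrayList<>();
        while (!pq.isEmpty() && out.size() < maxResults) {
            out.add(labels[pq.poll()]);
        }
        return out;
    }
}
```

With scores {0.05, 0.7, 0.25}, a threshold of 0.1 and maxResults of 2, only the second and third labels survive the threshold, ordered by confidence.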


6. RESULTS

Screenshot 6.1: Splash screen



Screenshot 6.2: Registration page



Screenshot 6.3: Login page



Screenshot 6.4: Main Screen

Screenshot 6.5: Results display

Screenshot 6.6: Fertilizers list


7. TESTING

7.1 INTRODUCTION TO TESTING

The purpose of testing is to discover errors. Testing is the process of trying to discover every conceivable fault or weakness in a work product. It provides a way to check the functionality of components, subassemblies, assemblies and/or a finished product. It is the process of exercising software with the intent of ensuring that the software system meets its requirements and user expectations and does not fail in an unacceptable manner. There are various types of tests; each test type addresses a specific testing requirement.

7.2 TYPES OF TESTING


7.2.1 UNIT TESTING

Unit testing involves the design of test cases that validate that the internal program logic is functioning properly and that program inputs produce valid outputs. All decision branches and internal code flow should be validated. It is the testing of individual software units of the application, done after the completion of each unit and before integration. This is structural testing that relies on knowledge of the unit's construction and is invasive. Unit tests perform basic tests at the component level and exercise a specific business process, application, and/or system configuration. Unit tests ensure that each unique path of a business process performs accurately to the documented specifications and contains clearly defined inputs and expected results.
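For example, the empty-field checks in the registration activity (Section 5.1.3) could be pulled into a plain method and unit-tested in isolation; a hypothetical sketch (the project itself does not include such tests):

```java
public class FieldValidator {
    // Returns the first validation error message, or null when all fields are filled.
    // Mirrors the order of the checks in the registration activity.
    public static String validate(String id, String mobile, String password) {
        if (id.isEmpty()) return "Enter User ID";
        if (mobile.isEmpty()) return "Enter Mobile Number";
        if (password.isEmpty()) return "Enter Password";
        return null;
    }
}
```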

7.2.2 INTEGRATION TESTING

Integration tests are designed to test integrated software components to determine if they actually run as one program. Integration tests demonstrate that although the components were individually satisfactory, as shown by successful unit testing, the combination of components is correct and consistent. Integration testing is specifically aimed at exposing the problems that arise from the combination of components.

7.2.3 FUNCTIONAL TESTING

Functional tests provide systematic demonstrations that the functions tested are available as specified by the business and technical requirements, system documentation, and user manuals.

Functional testing is centered on the following items:

Valid Input        : identified classes of valid input must be accepted.
Invalid Input      : identified classes of invalid input must be rejected.
Functions          : identified functions must be exercised.
Output             : identified classes of application outputs must be exercised.
Systems/Procedures : interfacing systems or procedures must be invoked.


Organization and preparation of functional tests is focused on requirements, key
functions, or special test cases.

7.3 TEST CASES

7.3.1 UPLOADING DATASET

Test Case Id | Test Case Name         | Purpose                   | Test Case                                    | Output
1            | User uploads image     | Use it for identification | The user uploads the diseased leaf image     | Uploaded successfully
2            | User uploads 2nd image | Use it for identification | The user uploads the non-diseased leaf image | Uploaded successfully

Table 7.1: Uploading dataset

7.3.2 CLASSIFICATION

Test Case Id | Test Case Name        | Purpose                                      | Input                          | Output
1            | Classification test 1 | To check if the classifier performs its task | A plant image is given         | Leaf is predicted
2            | Classification test 2 | To check if the classifier performs its task | A diseased leaf image is given | Predicted the leaf as having disease
3            | Classification test 3 | To check if the classifier performs its task | A diseased leaf image is given | Predicted the name of the disease

Table 7.2: Classification



8. CONCLUSION & FUTURE SCOPE

8.1 PROJECT CONCLUSION

This project set out to detect disease in a plant leaf using a combination of shape, texture and color feature extraction. The farmer submits a digital image of the diseased leaf of a plant; the image is read and processed automatically, and the results are shown. The aim of this project is to obtain reliable results that can identify leaves affected by the diseases most commonly found in plants. First, healthy and diseased images are collected and pre-processed. Next, attributes such as shape, color and texture are extracted from these images. Based on the classified type of disease, a message is displayed to the user.

8.2 FUTURE SCOPE

In future, other convolutional neural network architectures can be used by adding the corresponding modules directly to the project files. Because the proposed system was developed with extensibility in mind, the software can be developed further to include additional modules, and other databases can be connected by including them.


9. BIBLIOGRAPHY

9.1 REFERENCES
[1] Jiang Lu, Jie Hu, Guannan Zhao, Fenghua Mei, Changshui Zhang, "An in-field automatic wheat disease diagnosis system", Computers and Electronics in Agriculture 142 (2017) 369-379.

[2] Andreas Kamilaris, Francesc X. Prenafeta-Boldu, "Deep learning in agriculture: A survey", Computers and Electronics in Agriculture 147 (2018) 70-90.

[3] Konstantinos P. Ferentinos, "Deep learning models for plant disease detection and diagnosis", Computers and Electronics in Agriculture 145 (2018) 311-318.

[4] Kulkarni Anand H, Ashwin Patil RK, "Applying image processing techniques to detect plant diseases", Int J Mod Eng Res 2012;2(5):3661-4.

[5] Bashir Sabah, Sharma Navdeep, "Remote area plant disease detection using image processing", IOSR J Electron Commun Eng 2012;2(6):31-4. ISSN: 2278-2834.

[6] Rakesh Kaundal, Amar S Kapoor, Gajendra PS Raghava, "Machine learning techniques in disease forecasting: a case study on rice blast prediction", BMC Bioinformatics, 2006.

[7] Srdjan Sladojevic, Marko Arsenovic, Andras Anderla, Dubravko Culibrk, Darko Stefanovic, "Deep Neural Networks Based Recognition of Plant Diseases by Leaf Image Classification", Computational Intelligence and Neuroscience, Volume 2016, Article ID 3289801, 11 pages. http://dx.doi.org/10.1155/2016/3289801.

[8] J. Howse, OpenCV Computer Vision with Python, Packt Publishing, Birmingham, UK, 2013.

9.2 GITHUB LINK

LINK: https://github.com/KarthikBogelly/Minor_Project.git
