Mini Project
On
Android Application on Plant Leaf Disease Detection using
Machine Learning
(Submitted in partial fulfillment of the requirements for the award of Degree)
BACHELOR OF TECHNOLOGY
In
COMPUTER SCIENCE AND ENGINEERING
By
B. KARTHIK (197R1A0570)
C. RAHUL RAO (197R1A0571)
U. SRIKANTH (197R1A05B2)
UGC AUTONOMOUS
(Accredited by NAAC and NBA, Permanently Affiliated to JNTUH, Approved by AICTE, New Delhi)
Recognized under Section 2(f) & 12(B) of the UGC Act, 1956, Kandlakoya (V)
CERTIFICATE
This is to certify that the project entitled “ANDROID APPLICATION ON PLANT LEAF DISEASE DETECTION USING MACHINE LEARNING” being submitted by B. KARTHIK (197R1A0570), C. RAHUL RAO (197R1A0571) & U. SRIKANTH (197R1A05B2) in partial fulfillment of the requirements for the award of the degree of B.Tech in Computer Science and Engineering to the Jawaharlal Nehru Technological University Hyderabad, is a record of bonafide work carried out by them under our guidance and supervision during the year 2022-23.
The results embodied in this thesis have not been submitted to any other University
or Institute for the award of any degree or diploma.
Apart from our own efforts, the success of any project depends largely on the encouragement and guidance of many others. We take this opportunity to express our gratitude to the people who have been instrumental in the successful completion of this project.
We take this opportunity to express our profound gratitude and deep regard to our guide Dr. G. Somasekhar, Associate Professor, for his exemplary guidance, monitoring and constant encouragement throughout the project work. The blessing, help and guidance given by him shall carry us a long way in the journey of life on which we are about to embark.
B. KARTHIK (197R1A0570)
C. RAHUL RAO (197R1A0571)
U. SRIKANTH (197R1A05B2)
ABSTRACT
1. INTRODUCTION
CMRTC 1
2. LITERATURE SURVEY
After building disks centered on the centroids resulting from both FNCC and CHT, the detection results were merged based on the size and Euclidean distance of the intersection areas of the disks from these two methods. Finally, the number of fruits was determined after false-positive removal using texture features. For a validation dataset of 59 images, 84.4% of the fruits were successfully detected, which indicated the potential of the proposed method toward the development of an early yield mapping system.
3. SYSTEM ANALYSIS
In today's world, agricultural land is more than just a feeding resource; despite the great benefits that agriculture has given the world, feeding a large population poses a number of challenges. Agriculture worldwide is affected by many existential pressures, chief among them the growth of the human population and the loss of arable land. To provide an adequate food supply and maximize output, modern agriculture has promoted the immoderate use of inputs such as pesticides, which is one cause of unhealthy growing conditions; this in turn renders plants more susceptible to pathogen attack and makes plant diseases more difficult to manage. Screening every plant as required would demand a group of experts, and if carried out on a large scale the costs would be very high.
In the existing system, farmers suffer crop losses at the end of the year mainly because they do not have enough knowledge to diagnose the disease affecting the crop. A farmer who fails to identify the crop's disease and treat it in time is more prone to loss.
● Economic Feasibility
● Technical Feasibility
● Social Feasibility
This study is carried out to check the economic impact that the system will have on the organization. The amount of funds that the company can pour into the research and development of the system is limited, so the expenditures must be justified. The developed system is well within the budget, which was achieved because most of the technologies used are freely available; only the customized products had to be purchased.
This study is carried out to check the technical feasibility, that is, the technical requirements of the system. Any system developed must not place a high demand on the available technical resources, as this would in turn place high demands on the client. The developed system has modest requirements; only minimal or no charges are needed to implement it.
● TensorFlow
● Java Development Kit
● Android Studio
● Android Virtual Device
4. ARCHITECTURE
4.2 DESCRIPTION
Input Data: The input data is generally in JPG or PNG format; the images are fetched from the source and loaded into the data frame.
Separating Features: In this step we separate the features used to train the model, assigning a target value (1/0) to each feature.
Normalization: Normalization is an important step when dealing with large feature values, as higher-bit integers cost more computational power and time. To achieve efficient computation, we normalize the data values.
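As an illustrative sketch (not part of the app's source), the normalization step can be expressed in Java; the mean and standard-deviation constants below match the IMAGE_MEAN = 128 and IMAGE_STD = 128.0f values used in the implementation chapter, and map each 8-bit channel to roughly the range [-1, 1]:

```java
// Sketch: normalize one packed ARGB pixel into three float channels,
// using the mean/std constants adopted in this project's classifier.
public class NormalizeDemo {
    static final int IMAGE_MEAN = 128;
    static final float IMAGE_STD = 128.0f;

    // Extracts R, G, B from a packed ARGB int and normalizes each channel.
    static float[] normalizePixel(int pixel) {
        float r = (((pixel >> 16) & 0xFF) - IMAGE_MEAN) / IMAGE_STD;
        float g = (((pixel >> 8) & 0xFF) - IMAGE_MEAN) / IMAGE_STD;
        float b = ((pixel & 0xFF) - IMAGE_MEAN) / IMAGE_STD;
        return new float[] {r, g, b};
    }

    public static void main(String[] args) {
        // Black pixel (all channels 0) maps to exactly -1.0 per channel.
        float[] black = normalizePixel(0xFF000000);
        System.out.println(black[0] + " " + black[1] + " " + black[2]);
    }
}
```

The same per-channel arithmetic appears later in TensorFlowImageClassifier.recognizeImage when the bitmap pixels are converted to the float input tensor.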
Training and Test Data: The training data is passed to the CNN classifier to train the model. The test data is used to check whether the trained model makes correct predictions.
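A minimal sketch of such a split is shown below; the 80/20 ratio and the file names are assumptions for illustration, not values taken from the project:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

// Sketch: shuffle a list of image file names and split it into
// training and test sets (the 80/20 ratio here is an assumption).
public class TrainTestSplit {
    public static List<List<String>> split(List<String> images, double trainRatio, long seed) {
        List<String> shuffled = new ArrayList<>(images);
        Collections.shuffle(shuffled, new Random(seed)); // deterministic shuffle
        int cut = (int) (shuffled.size() * trainRatio);
        List<List<String>> out = new ArrayList<>();
        out.add(new ArrayList<>(shuffled.subList(0, cut)));               // training data
        out.add(new ArrayList<>(shuffled.subList(cut, shuffled.size()))); // test data
        return out;
    }

    public static void main(String[] args) {
        List<String> imgs = new ArrayList<>();
        for (int i = 0; i < 10; i++) imgs.add("leaf_" + i + ".jpg");
        List<List<String>> parts = split(imgs, 0.8, 42L);
        System.out.println(parts.get(0).size() + " train / " + parts.get(1).size() + " test");
    }
}
```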
CNN Classifier: The CNN classifier was chosen for this project because of the efficiency and accuracy we observed when comparing it with other classifiers.
In the use case diagram we have two actors, the user and the system: the user provides the data in the form of images, and the system verifies the data in order to give results.
Figure 4.2: Use Case Diagram for Android Application on Plant Leaf Disease
Detection Using Machine Learning
Figure 4.5: Activity Diagram for Android Application on Plant Leaf Disease Detection
Using Machine Learning
5. IMPLEMENTATION
5.1.1 MAIN ACTIVITY
package org.tensorflow.demo;
import android.content.Intent;
import android.os.Bundle;
import android.os.Handler;
import android.support.v7.app.AppCompatActivity;
import android.view.WindowManager;
import android.view.animation.Animation;
import android.view.animation.AnimationUtils;
import android.widget.ImageView;
import android.widget.TextView;
public class MainActivity extends AppCompatActivity {
Animation topAnim, bottomAnim;
ImageView image;
TextView logo;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.activity_main);
topAnim = AnimationUtils.loadAnimation(this,R.anim.top_animation);
bottomAnim = AnimationUtils.loadAnimation(this,R.anim.bottom_animation);
image = findViewById(R.id.imageView2);
logo = findViewById(R.id.textView2);
image.setAnimation(topAnim);
logo.setAnimation(bottomAnim);
int secondsDelayed = 1;
new Handler().postDelayed(new Runnable() {
public void run() {
startActivity(new Intent(MainActivity.this, Login.class));
finish();
}
}, secondsDelayed * 3000);
}
}
5.1.2 LOGIN
package org.tensorflow.demo;
import android.content.DialogInterface;
import android.content.Intent;
import android.os.Bundle;
import android.support.v7.app.AlertDialog;
import android.support.v7.app.AppCompatActivity;
import android.view.View;
import android.view.WindowManager;
import android.widget.EditText;
import android.widget.Toast;
import com.android.volley.AuthFailureError;
import com.android.volley.Request;
import com.android.volley.RequestQueue;
import com.android.volley.Response;
import com.android.volley.VolleyError;
import com.android.volley.toolbox.StringRequest;
import com.android.volley.toolbox.Volley;
import org.json.JSONException;
import org.json.JSONObject;
import java.util.HashMap;
import java.util.Map;
public class Login extends AppCompatActivity {
EditText lui, lps;
private static final String URL = "http://wizzie.tech/leaf/login.php";
private static final String URLF = "http://wizzie.tech/leaf/fp.php";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.activity_login);
lui = findViewById(R.id.lui);
lps = findViewById(R.id.lps);
}
public void signup(View view) {
startActivity(new Intent(Login.this,Register.class));
}
public void login(View view) {
StringRequest stringRequest = new StringRequest(Request.Method.POST, URL,
new Response.Listener<String>() {
@Override
public void onResponse(String response) {
try {
JSONObject jsonObject=new JSONObject(response);
if(jsonObject.getString("result").equals("success")){
Toast.makeText(Login.this, "Login "+jsonObject.getString("result"),
Toast.LENGTH_LONG).show();
startActivity(new Intent(Login.this,CameraRollActivity.class));
}
} catch (JSONException e) {
e.printStackTrace();
}
}},
new Response.ErrorListener() {
@Override
public void onErrorResponse(VolleyError error) {
Toast.makeText(Login.this, error.toString(), Toast.LENGTH_LONG).show();
}
})
{
@Override
protected Map<String,String> getParams() throws AuthFailureError {
@Override
public void onClick(DialogInterface dialog, int which) {
}
})
.create();
alertDialog.show();
}
}
5.1.3 REGISTER
package org.tensorflow.demo;
import android.content.Intent;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.view.View;
import android.view.WindowManager;
import android.widget.EditText;
import android.widget.Toast;
import com.android.volley.AuthFailureError;
import com.android.volley.Request;
import com.android.volley.RequestQueue;
import com.android.volley.Response;
import com.android.volley.VolleyError;
import com.android.volley.toolbox.StringRequest;
import com.android.volley.toolbox.Volley;
import org.json.JSONException;
import org.json.JSONObject;
import java.util.HashMap;
import java.util.Map;
public class Register extends AppCompatActivity {
EditText name, id, mobile, email, password;
private static final String URL = "http://wizzie.tech/leaf/register.php";
String emailPattern = "[a-zA-Z0-9._-]+@[a-z]+\\.+[a-z]+";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.activity_register);
name = findViewById(R.id.name);
id = findViewById(R.id.id);
email = findViewById(R.id.email);
mobile = findViewById(R.id.mob);
password = findViewById(R.id.ps);
}
public void reg(View view) {
if (name.getText().toString().equals("")) {
Toast.makeText(this, "Enter User name", Toast.LENGTH_SHORT).show();
} else if (id.getText().toString().equals("")) {
Toast.makeText(this, "Enter User ID", Toast.LENGTH_SHORT).show();
} else if (mobile.getText().toString().equals("")) {
Toast.makeText(this, "Enter Mobile Number", Toast.LENGTH_SHORT).show();
} else if (password.getText().toString().equals("")) {
Toast.makeText(this, "Enter Password", Toast.LENGTH_SHORT).show();
} else if (email.getText().toString().matches(emailPattern)) {
StringRequest stringRequest = new StringRequest(Request.Method.POST, URL,
new Response.Listener<String>() {
@Override
public void onResponse(String response) {
try {
JSONObject jsonObject = new JSONObject(response);
if (jsonObject.getString("result").equals("success")) {
startActivity(new Intent(Register.this, Login.class));
Toast.makeText(Register.this, "Registered Successfully",
Toast.LENGTH_SHORT).show();
}
} catch (JSONException e) {
e.printStackTrace();
}
}
},
new Response.ErrorListener() {
@Override
public void onErrorResponse(VolleyError error) {
Toast.makeText(Register.this, error.toString(), Toast.LENGTH_LONG).show();
}
}) {
@Override
protected Map<String, String> getParams() throws AuthFailureError {
Map<String, String> params = new HashMap<String, String>();
params.put("u", name.getText().toString());
params.put("i", id.getText().toString());
params.put("e", email.getText().toString());
params.put("m", mobile.getText().toString());
params.put("p", password.getText().toString());
return params;
}
};
RequestQueue requestQueue = Volley.newRequestQueue(this);
requestQueue.add(stringRequest);
} else {
Toast.makeText(this, "Enter Correct Email", Toast.LENGTH_SHORT).show();
}
}
}
5.1.4 PESTICIDES
package org.tensorflow.demo;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.view.WindowManager;
public class pesticidesActivity extends AppCompatActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.activity_pesticides);
}
}
5.1.5 CAMERA ROLL
package org.tensorflow.demo;
import android.app.Activity;
import android.content.Intent;
import android.graphics.Bitmap;
import android.net.Uri;
import android.os.Bundle;
import android.provider.MediaStore;
import android.speech.tts.TextToSpeech;
import android.support.v7.app.AppCompatActivity;
import android.view.View;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.ImageView;
import android.widget.Toast;
import com.squareup.picasso.Picasso;
import java.io.IOException;
import java.util.List;
public class CameraRollActivity extends AppCompatActivity {
private static final int SELECT_IMAGE = 505;
private RecognitionScoreView resultView;
private Bitmap bitmap;
TextToSpeech textToSpeech;
// Classifier
private Classifier classifier;
private static final int INPUT_SIZE = 224;
private static final int IMAGE_MEAN = 128;
private static final float IMAGE_STD = 128.0f;
private static final String INPUT_NAME = "input";
private static final String OUTPUT_NAME = "final_result";
private static final String MODEL_FILE =
"file:///android_asset/optimized_mobilenet_plant_graph.pb";
private static final String LABEL_FILE = "file:///android_asset/plant_labels.txt";
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
WindowManager.LayoutParams.FLAG_FULLSCREEN);
setContentView(R.layout.activity_camera_roll);
Button chooseImage = (Button) findViewById(R.id.choose_image);
resultView = (RecognitionScoreView) findViewById(R.id.results);
resultView.setVisibility(View.INVISIBLE);
classifier =
TensorFlowImageClassifier.create(
getAssets(),
MODEL_FILE,
LABEL_FILE,
INPUT_SIZE,
IMAGE_MEAN,
IMAGE_STD,
INPUT_NAME,
OUTPUT_NAME);
chooseImage.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
Intent intent = new Intent();
intent.setType("image/*");
intent.setAction(Intent.ACTION_GET_CONTENT);
startActivityForResult(Intent.createChooser(intent, "Select Picture"),
SELECT_IMAGE);
}
});
}
@Override protected void onResume() {
super.onResume();
}
@Override protected void onActivityResult(int requestCode, int resultCode, Intent
data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == SELECT_IMAGE) {
if (resultCode == Activity.RESULT_OK) {
if (data != null) {
Uri selectedImageURI = data.getData();
Picasso.with(this).load(selectedImageURI).noPlaceholder().centerCrop().fit()
.into((ImageView) this.findViewById(R.id.image));
try
{
bitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(),
data.getData());
bitmap = Bitmap.createScaledBitmap(bitmap, 224, 224, false);
} catch (IOException e)
{
e.printStackTrace();
}
classifyImage();
}
} else if (resultCode == Activity.RESULT_CANCELED) {
Toast.makeText(this, "Cancelled", Toast.LENGTH_SHORT).show();
}
}
}
private void classifyImage() {
resultView.setVisibility(View.VISIBLE);
final List<Classifier.Recognition> results = classifier.recognizeImage(bitmap);
resultView.setResults(results);
textToSpeech = new TextToSpeech(getApplicationContext(), new TextToSpeech.OnInitListener() {
@Override
public void onInit(int status) {
if (status != TextToSpeech.ERROR) {
textToSpeech.speak("Detected disease is " + results, TextToSpeech.QUEUE_FLUSH, null);
}
}
});
}
public void pest(View view) {
startActivity(new Intent(CameraRollActivity.this, pesticidesActivity.class));
}
public void signout(View view) {
startActivity(new Intent(CameraRollActivity.this, Login.class));
}
}
5.1.6 CLASSIFIER
package org.tensorflow.demo;
import android.graphics.Bitmap;
import android.graphics.RectF;
import java.util.List;
public interface Classifier {
public class Recognition {
private final String id;
private final String title;
private final Float confidence;
private RectF location;
public Recognition(
final String id, final String title, final Float confidence, final RectF location) {
this.id = id;
this.title = title;
this.confidence = confidence;
this.location = location;
}
public String getId() {
return id;
}
public String getTitle() {
return title;
}
public Float getConfidence() {
return confidence;
}
public RectF getLocation() {
return new RectF(location);
}
public void setLocation(RectF location) {
this.location = location;
}
@Override
public String toString() {
String resultString = "";
if (id != null) {
resultString += "[" + id + "] ";
}
if (title != null) {
resultString += title + " ";
}
if (confidence != null) {
resultString += String.format("(%.1f%%) ", confidence * 100.0f);
}
if (location != null) {
resultString += location + " ";
}
return resultString.trim();
}
}
List<Recognition> recognizeImage(Bitmap bitmap);
void enableStatLogging(boolean debug);
String getStatString();
void close();
}
5.1.7 RECOGNITION SCORE VIEW
package org.tensorflow.demo;
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.util.AttributeSet;
import android.util.TypedValue;
import android.view.View;
import org.tensorflow.demo.Classifier.Recognition;
import java.util.List;
public class RecognitionScoreView extends View implements ResultsView {
private static final float TEXT_SIZE_DIP = 15;
private List<Recognition> results;
private final float textSizePx;
private final Paint fgPaint;
private final Paint bgPaint;
public RecognitionScoreView(final Context context, final AttributeSet set) {
super(context, set);
textSizePx =
TypedValue.applyDimension(
TypedValue.COMPLEX_UNIT_DIP, TEXT_SIZE_DIP,
getResources().getDisplayMetrics());
fgPaint = new Paint();
fgPaint.setTextSize(textSizePx);
bgPaint = new Paint();
bgPaint.setColor(0xcc4285f4);
}
@Override
public void setResults(final List<Recognition> results) {
this.results = results;
postInvalidate();
}
@Override
public void onDraw(final Canvas canvas) {
final int x = 10;
int y = (int) (fgPaint.getTextSize() * 1.5f);
canvas.drawPaint(bgPaint);
if (results != null) {
for (final Recognition recog : results) {
canvas.drawText(recog.getTitle() + ": " + recog.getConfidence(), x, y, fgPaint);
y += fgPaint.getTextSize() * 1.5f;
}
}
}
}
5.1.8 RESULTS VIEW
package org.tensorflow.demo;
import org.tensorflow.demo.Classifier.Recognition;
import java.util.List;
public interface ResultsView {
public void setResults(final List<Recognition> results);
}
5.1.9 TENSORFLOW IMAGE CLASSIFIER
package org.tensorflow.demo;
import android.annotation.SuppressLint;
import android.content.res.AssetManager;
import android.graphics.Bitmap;
import android.os.Trace;
import android.util.Log;
import org.tensorflow.Operation;
import org.tensorflow.contrib.android.TensorFlowInferenceInterface;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.PriorityQueue;
import java.util.Vector;
public class TensorFlowImageClassifier implements Classifier {
private static final String TAG = "TensorFlowImageClassifier";
private static final int MAX_RESULTS = 3;
private static final float THRESHOLD = 0.1f;
private String inputName;
private String outputName;
private int inputSize;
private int imageMean;
private float imageStd;
private Vector<String> labels = new Vector<String>();
private int[] intValues;
private float[] floatValues;
private float[] outputs;
private String[] outputNames;
private boolean logStats = false;
private TensorFlowInferenceInterface inferenceInterface;
private TensorFlowImageClassifier() {}
@SuppressLint("LongLogTag")
public static Classifier create(
AssetManager assetManager,
String modelFilename,
String labelFilename,
int inputSize,
int imageMean,
float imageStd,
String inputName,
String outputName) {
TensorFlowImageClassifier c = new TensorFlowImageClassifier();
c.inputName = inputName;
c.outputName = outputName;
String actualFilename = labelFilename.split("file:///android_asset/")[1];
Log.i(TAG, "Reading labels from: " + actualFilename);
BufferedReader br = null;
try {
br = new BufferedReader(new
InputStreamReader(assetManager.open(actualFilename)));
String line;
while ((line = br.readLine()) != null) {
c.labels.add(line);
}
br.close();
} catch (IOException e) {
throw new RuntimeException("Problem reading label file!" , e);
}
c.inferenceInterface = new TensorFlowInferenceInterface(assetManager,
modelFilename);
final Operation operation = c.inferenceInterface.graphOperation(outputName);
final int numClasses = (int) operation.output(0).shape().size(1);
Log.i(TAG, "Read " + c.labels.size() + " labels, output layer size is " +
numClasses);
c.inputSize = inputSize;
c.imageMean = imageMean;
c.imageStd = imageStd;
c.outputNames = new String[] {outputName};
c.intValues = new int[inputSize * inputSize];
c.floatValues = new float[inputSize * inputSize * 3];
c.outputs = new float[numClasses];
return c;
}
@Override
public List<Recognition> recognizeImage(final Bitmap bitmap) {
Trace.beginSection("recognizeImage");
Trace.beginSection("preprocessBitmap");
bitmap.getPixels(intValues, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(),
bitmap.getHeight());
for (int i = 0; i < intValues.length; ++i) {
final int val = intValues[i];
floatValues[i * 3 + 0] = (((val >> 16) & 0xFF) - imageMean) / imageStd;
floatValues[i * 3 + 1] = (((val >> 8) & 0xFF) - imageMean) / imageStd;
floatValues[i * 3 + 2] = ((val & 0xFF) - imageMean) / imageStd;
}
Trace.endSection();
Trace.beginSection("feed");
inferenceInterface.feed(inputName, floatValues, 1, inputSize, inputSize, 3);
Trace.endSection();
Trace.beginSection("run");
inferenceInterface.run(outputNames, logStats);
Trace.endSection();
Trace.beginSection("fetch");
inferenceInterface.fetch(outputName, outputs);
Trace.endSection();
PriorityQueue<Recognition> pq =
new PriorityQueue<Recognition>(
3,
new Comparator<Recognition>() {
@Override
public int compare(Recognition lhs, Recognition rhs) {
return Float.compare(rhs.getConfidence(), lhs.getConfidence());
}
});
for (int i = 0; i < outputs.length; ++i) {
if (outputs[i] > THRESHOLD) {
pq.add(
new Recognition(
"" + i, labels.size() > i ? labels.get(i) : "unknown", outputs[i], null));
}
}
final ArrayList<Recognition> recognitions = new ArrayList<Recognition>();
int recognitionsSize = Math.min(pq.size(), MAX_RESULTS);
for (int i = 0; i < recognitionsSize; ++i) {
recognitions.add(pq.poll());
}
Trace.endSection();
return recognitions;
}
@Override
public void enableStatLogging(boolean logStats) {
this.logStats = logStats;
}
@Override
public String getStatString() {
return inferenceInterface.getStatString();
}
@Override
public void close() {
inferenceInterface.close();
}
}
6. RESULTS
Screenshot 5.5 : Results display
Screenshot 5.6: Fertilizers list
7. TESTING
Unit testing involves the design of test cases that validate that the internal program logic is functioning properly and that program inputs produce valid outputs. All decision branches and internal code flow should be validated. It is the testing of individual software units of the application, and it is done after the completion of an individual unit before integration. This is structural testing that relies on knowledge of the unit's construction and is invasive. Unit tests perform basic tests at the component level and test a specific business process, application and/or system configuration. Unit tests ensure that each unique path of a business process performs accurately to the documented specifications and contains clearly defined inputs and expected results.
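As an illustration, a unit-level check in this spirit (a sketch, not the project's actual test suite) can exercise the email-validation pattern from Register.java against valid and invalid input classes:

```java
// Sketch of a unit-style check for the email pattern used in Register.java.
// Run as a plain main(); a real project would use a framework such as JUnit.
public class EmailPatternTest {
    // Same regex as the emailPattern field in Register.java.
    static final String EMAIL_PATTERN = "[a-zA-Z0-9._-]+@[a-z]+\\.+[a-z]+";

    static boolean isValid(String email) {
        return email.matches(EMAIL_PATTERN);
    }

    public static void main(String[] args) {
        // Valid input class: a well-formed address must be accepted.
        if (!isValid("farmer01@gmail.com")) throw new AssertionError("expected valid");
        // Invalid input classes: malformed addresses must be rejected.
        if (isValid("no-at-sign.com")) throw new AssertionError("expected invalid");
        if (isValid("user@domain")) throw new AssertionError("expected invalid");
        System.out.println("All email pattern checks passed");
    }
}
```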
Valid Input: identified classes of valid input must be accepted.
Invalid Input: identified classes of invalid input must be rejected.
Functions: identified functions must be exercised.
Output: identified classes of application outputs must be exercised.
7.3.1 CLASSIFICATION
8. CONCLUSION
This project set out to detect disease in a leaf using a combination of shape, texture and color feature extraction. The farmer submits a digital image of the diseased leaf of a plant; the image is read and processed automatically, and the result is shown. The outcome of this project is to obtain relevant results that can spot diseased leaves for certain diseases that commonly affect plants. First, healthy and diseased images are collected and pre-processed. Then, attributes such as shape, color and texture are extracted from these images. Based on the classified type of disease, a text is displayed to the user.
9. BIBLIOGRAPHY
9.1 REFERENCES
[1] Jiang Lu, Jie Hu, Guannan Zhao, Fenghua Mei, Changshui Zhang, An in-field automatic wheat disease diagnosis system, Computers and Electronics in Agriculture 142 (2017) 369-379.
[3] Konstantinos P. Ferentinos, Deep learning models for plant disease detection and diagnosis, Computers and Electronics in Agriculture 145 (2018) 311-318.
[4] Kulkarni Anand H, Ashwin Patil RK, Applying image processing techniques to detect plant diseases, Int J Mod Eng Res 2012;2(5):3661-4.
[5] Bashir Sabah, Sharma Navdeep, Remote area plant disease detection using image processing.
9.2 GITHUB LINK
https://github.com/KarthikBogelly/Minor_Project.git