
2016 International Conference on Robotics: Current Trends and Future Challenges (RCTFC)

Design and Simulation of Crop Monitoring Robot for Green House

K. R. Aravind¹ and P. Raja²
School of Mechanical Engineering
SASTRA University
Thanjavur – 613401, Tamil Nadu, India
¹aravind@sastra.ac.in, ²raja@mech.sastra.edu

Abstract— Incidence of disease in horticultural crops is one of the important problems affecting the production of fruits, vegetables and flowers. Regular monitoring of crops for early diagnosis, followed by treatment with pesticides or removal of the affected crop, is one of the ways to minimize crop loss. Monitoring of crops by human labour is costly, time consuming, error prone due to insufficient knowledge of the diseases, and highly repetitive at different stages of crop growth. These needs have motivated the design of a line-following mobile robot with vision sensors that navigates across the field for disease identification in a green house. The robot has been designed and simulated in an open-source software tool known as the Virtual Robot Experimentation Platform (V-REP) with an experimental field of 40 crops. Programming for navigation is done in the Lua scripting language embedded in V-REP. Images are captured using a camera, and processing of the images for disease identification, together with its representation in a Graphical User Interface (GUI), is done by an algorithm in MATLAB R2011b which interacts with V-REP through socket communication. Texture-based analysis is used to identify the diseased crops among the healthy crops in the simulated field setup.

Keywords— Agricultural robots; crop monitoring robots; disease identification robots; image processing

I. INTRODUCTION

Green house based cultivation of horticultural crops is also prone to pests and diseases, which are lethal in some cases [1]. Farmers need to continuously monitor the crops for any change in the colour of the leaves, growth pattern, flowering, wilting, blight, spots, etc., in order to implement the necessary precautionary measures as early as possible and minimize the loss of yield. If the farmer is unable to identify the symptoms of the disease, a sample of the infected crop is sent to the laboratory for molecular analysis, which is a time-consuming way to determine the type and severity of the disease [2]. As continuous monitoring by humans is tedious and classification of disease by the farmer is difficult, automated monitoring of crops will be of significant benefit to farmers. The evolution of autonomous mobile robotics technology in recent times and the need for precision in agriculture have resulted in increased study of its application in agricultural operations. The agricultural environment is highly unstructured, with different crops exhibiting chaotic physical characteristics in terms of height, canopy size, etc. Implementing a robot in such an environment is quite a challenge. Recent improvements in sensor technologies have increased reliability and accuracy in robotic applications [3]. For example, navigation by the Global Positioning System (GPS) has become accurate to within centimeters. But this technology cannot be applied in greenhouse-based cultivation, as the closed structures block signal transmission partially or completely depending on the structural material [4]. An ultrasonic sensor has been used to estimate the acoustic density profile and symmetry of plants; this information can be used to develop a navigation strategy [5]. A robot using visual odometry, which estimates its rotational and translational positions relative to the world coordinate system, has been studied in greenhouse-based farming [6]. Machine vision based navigation has been promising, but it suffers from errors arising due to varying indoor lighting conditions [7].

Studies on the application of robots with a disease identification system are very limited. Disease identification in crops from visible-range images by image processing is quite a challenging task [8]. Many studies on disease identification alone, using sensing techniques such as colour, Near Infra Red (NIR), multispectral and hyperspectral cameras, have been done without interfacing them with a mobile robot [2]. A number of quantifiable parameters, such as the Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), Photochemical Reflectance Index (PRI), Visible Atmospheric Resistance Index (VARI), etc., have been studied for classifying and quantifying disease symptoms [2].

There are very few studies on robots integrated with a disease identification system. Pilli et al., 2014 [9] developed a robot named “eAGROBOT” for identification of diseases in groundnut and cotton crops at an early stage in an outdoor environment. The robot can identify bacterial blight and magnesium deficiency in cotton and yellow spot in groundnut using a neural network technique. Schor et al., 2016 [10] developed a robotic manipulator to identify diseases, namely powdery mildew and tomato spotted wilt virus, in green house

978-1-5090-3342-3/16/$31.00 ©2016 IEEE

Authorized licensed use limited to: SLIIT - Sri Lanka Institute of Information Technology. Downloaded on November 18,2020 at 00:39:55 UTC from IEEE Xplore. Restrictions apply.

based bell pepper cultivation using a multispectral camera. The manipulator was placed on a conveyor simulating the working of a mobile platform.

In this paper, a simple crop status and disease monitoring robot is proposed. It is suitable for monitoring small horticultural crops and seedlings in the nursery using a digital camera. For navigation the robot uses a simple line follower with vision sensors for detecting the line in the green house. The designed robot is simulated in an experimental field with crops placed at specific intervals, as shown in Fig. 1. The simulation is carried out using the Virtual Robot Experimentation Platform (V-REP), through which the feasibility of implementation and the requirements of the necessary systems and sensors are realized.

In order to design the robot for application to different horticultural crop varieties, the physical dimensions of the crop have to be taken into consideration. The chosen crop for the application of this robot should have a maximum height of 50 cm with a canopy diameter of 30 cm. The robotic prototype will be able to monitor many horticultural crops such as radish, onion, spinach, etc., and many other crops in the earlier stages of their growth, especially in crop nurseries. The growth of the crops has to be monitored continuously for site-specific application of pesticides and fertilizers. Crop growth can be assessed using various parameters such as height, canopy volume, vegetative index, etc. Crops infected with disease exhibit visible symptoms in most cases. Thus a monitoring system with a conventional camera is proposed, which can obtain a visible image of the crop for further analysis by image processing software.

Fig. 1. V-REP simulation of proposed design

II. SYSTEM REQUIREMENTS AND ARCHITECTURE

A. Simulation software
V-REP PRO EDU is an open-source simulation tool with an integrated development environment. It has an inbuilt Lua scripting language which can be interfaced with the robot to manipulate its behavior. The simulation environment supports many sensors, actuators, light sources, etc., and is equipped with a user interface. The sensitivity range of the sensors can be modified depending on the requirement.

B. Mechanical design
A conceptual design of the crop monitoring robot is shown in Fig. 2. The design has been done using AutoCAD software. The physical design is such that the robot passes through each crop in the field. The robot has two conventional wheels at the rear with two castor wheels at the front.

Fig. 2. Conceptual design of crop monitoring robot

C. Components
1) Vision sensors
The robot is equipped with three vision sensors for navigation, as shown in Fig. 3. The robot is basically a line follower which utilizes the intensity information of the images from the three vision sensors. The difference in the intensity values of the images obtained through the three vision sensors assists in navigating the robot. The strategy for navigation is explained in section III-A.

2) Camera
A simulated camera is attached centrally to the roof of the robot, facing downward and focusing on the crop. The captured images are analyzed for estimating the status of crop growth and symptoms of disease. The methodology for disease identification is explained in section III-B.

3) System architecture
The system architecture shown in Fig. 3 consists of three vision sensors, a conventional digital camera and two DC motors. Based on a comparison of the data from the three vision sensors, commands are sent to the left and right motors to navigate the robot along the black line.

Fig. 3. System architecture of crop monitoring robot
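The paper implements this sensor-to-motor mapping as a Lua script inside V-REP; the sketch below re-expresses it in Python purely for illustration. The function name, base speed and `boost` values are our own assumptions; the 0.3 intensity threshold is the one used in the navigation flow chart (Fig. 5).

```python
def steer(left_i, center_i, right_i, base_speed=1.0, boost=0.5, thresh=0.3):
    """Map three vision-sensor intensities (scaled 0..1) to wheel speeds.

    An intensity below `thresh` means that sensor sees the black line.
    The center sensor is only used for the halt condition (section III-A),
    so it does not influence steering here.
    Returns (left_motor_speed, right_motor_speed).
    """
    if left_i < thresh <= right_i:     # line drifted left: speed up right wheel
        return base_speed, base_speed + boost
    if right_i < thresh <= left_i:     # line drifted right: speed up left wheel
        return base_speed + boost, base_speed
    return base_speed, base_speed      # centred on the line: drive straight
```

Increasing the right wheel speed turns the robot toward a line that has drifted to its left, and vice versa, which matches the "increase right/left motor speed" branches of the flow chart.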


D. Experimental field
The experimental field, shown in Fig. 4, consists of 10 crops equally spaced 50 cm apart along each row of the simulation environment. Differently coloured textures have been added to 10 of the 40 crops, placed randomly in the field, to mirror the disease symptoms of real crops. Each column is separated by a distance of 80 cm, since each crop requires a lateral spacing of 20 cm and the robot needs room for stable turning. Further, each crop requires specific spacing to minimize competition for nutrients, to obtain maximum energy from the light source, and to reach maximum canopy size. The field has 4 columns and the total number of crops in the field is 40. Crops varying in height from 10 cm to 50 cm are placed across the field. The location of a crop at a specific position is indicated by a perpendicular black line of length 50 cm, which ensures the robot halts even when it is oriented at some angle around the z-axis. The line is placed 15 cm before the crop along the direction of motion of the robot.

Fig. 4. Experimental field set up

III. METHODOLOGY

A. Experimental operation
The experimental operation can basically be split into:
1. Forward movement
2. Halt
3. Turn movement
4. Image capturing operation

Robots with sophisticated sensors like LIDAR or other vision-based navigation sensors are costly and require complex post-processing to obtain information about environmental features and objects. Navigation by line following keeps the navigation strategy simple in the green house, so that more focus can be given to image processing for identification of crop status and diseases. The flow chart for the navigation strategy of the robot is shown in Fig. 5. The robot is placed at an initial position in the indoor experimental farm, which is a static environment. The robot is navigated by the line, which runs along all the rows and columns of the field.

Fig. 5. Flow chart for robot navigation (if the left sensor intensity is < 0.3, increase the right motor speed; if the right sensor intensity is < 0.3, increase the left motor speed; if the left and right sensor intensities are > 0.3, run both motors at the same speed; if the left, right and center sensor intensities are all < 0.3, set both motor speeds to zero for 2 sec, capture an image and increment the plant count; if the plant count reaches 40, stop; otherwise run both motors at the same speed for 2 sec)

A perpendicular line denotes the presence of a crop. The three vision sensors on the robot send information to the controller, which decides the robot's heading direction and speed based on its current orientation. The average intensity of the image is taken as the threshold for identification of the line, with intensities scaled between 0 and 1. When the average intensity of an image is less than 0.3, the line is detected. When the left and right vision sensor intensities are greater than 0.3, a constant velocity is given to both the left and right motors.
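The halt-and-capture cycle of the flow chart can be sketched as follows (the paper's implementation is a Lua script in V-REP; the helper names and the list-of-readings driver below are our own illustrative assumptions):

```python
LINE_THRESH = 0.3
TOTAL_PLANTS = 40

def crop_reached(left_i, center_i, right_i, thresh=LINE_THRESH):
    """All three sensors over the perpendicular marker line placed before a crop."""
    return left_i < thresh and center_i < thresh and right_i < thresh

def run_route(sensor_readings, capture):
    """Drive through a sequence of (left, center, right) intensity readings.

    At every marker line the robot halts for 2 sec, photographs the crop
    (modelled by calling `capture(plant_no)`), then moves on for 2 sec.
    Stops after 40 plants. Returns the number of plants photographed.
    """
    plants = 0
    for reading in sensor_readings:
        if crop_reached(*reading):
            plants += 1
            capture(plants)
            if plants == TOTAL_PLANTS:
                break
    return plants
```

The steering branches of the flow chart (single-sensor detections) are omitted here; this sketch covers only the halt, capture and termination logic.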


When the average intensities of all three sensors are less than 0.3, the image of the crop is captured using the camera attached to the robot. After the 2 sec halt, the robot moves in a straight-line path at constant speed for 2 sec, and then the entire process is repeated until it reaches the destination.

B. Disease identification
The stages in identifying disease in the crop using image processing are shown in Fig. 6. The captured image of each crop is obtained by a main function built using remote Application Programming Interface (API) functions in the MATLAB programming environment. A serial number is given to each crop's image in its file name and the image is stored in the database. The captured image, of resolution 900 x 1200 pixels, is automatically cropped to a 400 x 400 pixel image. This is done because the image columns from 1 to 300 and from 900 to 1200 contain intensity values of the floor, which increase the computing time and are not needed for further analysis. The image is then converted into a gray-scale image for extracting the crop from the background, as shown in Fig. 7. A statistical textural analysis is performed by estimating the mean and standard deviation of the matrix elements of the image. Based on these statistical parameters, the crop is classified as diseased or healthy and the result of the analysis is shown in a Graphical User Interface (GUI) with the location of each crop.

Fig. 6. Image processing steps in identifying the disease

Fig. 7. RGB to gray scale image

IV. RESULTS AND DISCUSSION

A. Navigation
The total time taken by the robot to travel the entire field of 40 crops is 6 min 25 sec. The robot travels at an average speed of 13.7 cm/sec during the forward operation and 12.8 cm/sec during the turn operation along the curved path. The length of each column is 6 m and the circumference of the curved path to the subsequent column is 3.5 m. The robot takes approximately 66 sec for a single column and approximately 40 sec for travel along the curved path and reorientation into the next column. Thus the total time taken for a single column including the curved path is approximately 106 sec. The absolute velocity of the robot on the curved path is highly uneven due to the constant adjustment of the orientation by the differential velocity between the two wheels.

B. Disease identification
The standard deviation and mean intensity values of the 40 crops against the white background, shown in Table I, were used to determine health status. The diseased crops clearly differed from the healthy ones, with standard deviations of 24.21 and above and mean intensities of 245.69 and below, except for crop no. 17, where the mean value alone lies above the threshold. This is because the mean intensity of that image is greater than 246, as the crop is partly yellow in colour, which has maximum intensity values in the red and green channels.

TABLE I. HEALTH STATUS DETERMINATION OF 40 CROPS

Crop no   Mean     Standard deviation   Status
1         244.60   34.26                Infected
2         248.81   18.39                Healthy
3         248.66   18.84                Healthy
4         248.86   18.21                Healthy
5         248.67   18.85                Healthy
6         242.34   42.38                Infected
7         248.61   19.11                Healthy
8         248.93   18.18                Healthy
9         244.57   34.25                Infected
10        246.73   20.64                Healthy
11        244.43   33.40                Infected
12        241.39   43.75                Infected
13        248.93   18.37                Healthy
14        242.53   39.75                Infected
15        248.87   18.46                Healthy
16        248.72   18.76                Healthy
17        247.03   24.21                Infected
18        248.63   18.98                Healthy
19        248.95   18.25                Healthy
20        248.70   18.70                Healthy
21        250.58   16.23                Healthy
22        249.60   17.83                Healthy
23        250.26   16.18                Healthy
24        249.94   17.16                Healthy
25        245.69   33.54                Infected
26        250.06   17.03                Healthy
27        250.12   16.22                Healthy
28        249.60   17.63                Healthy
29        250.07   16.30                Healthy
30        243.87   40.45                Infected
31        249.25   17.65                Healthy
32        248.93   18.44                Healthy
33        249.12   18.00                Healthy
34        241.34   43.56                Infected
35        249.09   17.99                Healthy
36        248.93   18.33                Healthy
37        249.04   17.96                Healthy
38        248.96   18.03                Healthy
39        248.96   18.29                Healthy
40        249.00   18.14                Healthy
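The paper performs this statistical-texture classification in MATLAB. As an illustrative Python sketch: the exact decision rule combining mean and standard deviation is not spelled out in the paper, so the single SD threshold below (which, per Table I, separates the infected crops at SD 24.21-43.75 from the healthy ones at SD 16.18-20.64) is our own assumption, and the pixel lists stand in for the cropped gray-scale images.

```python
from statistics import mean, pstdev

def texture_stats(gray_pixels):
    """Mean and population standard deviation of a gray-scale image,
    the two texture statistics used in the paper."""
    return mean(gray_pixels), pstdev(gray_pixels)

def classify(gray_pixels, sd_thresh=24.0):
    """Label a crop image: diseased textures raise the intensity spread
    against the mostly white background, so a large SD flags infection."""
    _, sd = texture_stats(gray_pixels)
    return "Infected" if sd >= sd_thresh else "Healthy"
```

A mostly uniform bright image (healthy crop on white floor) yields a small SD, while a mottled disease texture spreads the histogram and raises the SD past the threshold.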


Each diseased crop was modeled with a different random textural pattern and colour based on the disease symptoms most commonly observed in natural conditions. The results for the 40 crops are plotted in graphs that clearly show the thresholds for determining the health status of the crop, as shown in Fig. 8 and Fig. 9. The output of the analysis of the experimental field crops is presented to the user through the GUI shown in Fig. 10.

Fig. 8. Mean intensity value of 40 plants

Fig. 9. Standard deviation of intensity value of 40 crops

Fig. 10. GUI output of the field crop health status
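The class separation that Fig. 8 and Fig. 9 display can be checked directly from the tabulated statistics. In this sketch the values are transcribed from a representative subset of Table I; the variable names are our own.

```python
# (mean, standard deviation, status) for a subset of Table I
table_i = [
    (244.60, 34.26, "Infected"), (248.81, 18.39, "Healthy"),
    (247.03, 24.21, "Infected"), (246.73, 20.64, "Healthy"),
    (241.34, 43.56, "Infected"), (249.00, 18.14, "Healthy"),
]

infected_sds = [sd for _, sd, status in table_i if status == "Infected"]
healthy_sds = [sd for _, sd, status in table_i if status == "Healthy"]

# every infected SD lies above every healthy SD, so a single standard
# deviation threshold separates the two classes in this subset
gap_ok = min(infected_sds) > max(healthy_sds)
```

The smallest infected SD (24.21, crop no. 17) still exceeds the largest healthy SD (20.64, crop no. 10), which is the gap the threshold in Fig. 9 exploits.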


V. CONCLUSION

A successful simulation of a green house based robotic system for the identification of diseased crops has been performed. A model algorithm based on statistical textural analysis has been developed for this simulation study, which will be the benchmark for future work. The algorithm has yet to be validated over a wide variety of textural patterns and colours of the crop. In reality, a larger number of statistical parameters related to image features is required to determine the health status of the crop in a green house, as it will have more dynamic lighting of the environment, errors due to camera parameters, textural features of the crop, and other random errors. In order to implement the proposed idea and to develop an image processing algorithm that identifies disease in a real green house environment, a mechanical prototype of the robot has been developed. Further studies will be carried out as the other systems such as the camera, sensors, etc. are interfaced, and various disease identification parameters will be evaluated in order to categorize various diseases in real time.

REFERENCES

[1] R. Albajes, M. L. Gullino, J. C. van Lenteren, and Y. Elad, Integrated Pest and Disease Management in Greenhouse Crops. Kluwer Academic Publishers, USA.
[2] S. Sankaran, A. Mishra, R. Ehsani, and C. Davis, "A Review of Advanced Techniques for Detecting Plant Diseases," Comput. Electron. Agric., vol. 72, no. 1, pp. 1–13, June 2010.
[3] C. W. Zecha, J. Link, and W. Claupein, "Mobile Sensor Platforms: Categorisation and Research Applications in Precision Farming," J. Sens. Sens. Syst., vol. 2, pp. 51–72, May 2013.
[4] L. Comba, P. Gay, P. Piccarolo, and D. R. Aimonino, "Robotics and Automation for Crop Management: Trends and Perspective," International Conference Ragusa SHWA2010, pp. 471–478, September 2010.
[5] N. Harper and P. McKerrow, "Recognising Plants with Ultrasonic Sensing for Mobile Robot Navigation," Robot. Auton. Syst., vol. 34, no. 2–3, pp. 71–82, February 2001.
[6] P. J. Younse and T. F. Burks, "Greenhouse Robot Navigation Using KLT Feature Tracking for Visual Odometry," Agric. Eng. Int.: CIGR EJournal, vol. 9, Manuscript ATOE 07015, July 2007.
[7] H. T. Sogaard and I. Lund, "Application Accuracy of a Machine Vision-Controlled Robotic Micro-Dosing System," Biosyst. Eng., vol. 96, no. 3, pp. 315–322, January 2007.
[8] J. G. A. Barbedo, "A Review on the Main Challenges in Automatic Plant Disease Identification Based on Visible Range Images," Biosyst. Eng., vol. 144, pp. 52–60, April 2016.
[9] S. K. Pilli, B. Nallathambi, S. J. George, and V. Diwanji, "eAGROBOT - A Robot for Early Crop Disease Detection Using Image Processing," IEEE International Conference on Electronics and Communication Systems, pp. 1–6, February 2014.
[10] N. Schor, A. Bechar, T. Ignat, A. Dombrovsky, Y. Elad, and S. Berman, "Robotic Disease Detection in Greenhouses: Combined Detection of Powdery Mildew and Tomato Spotted Wilt Virus," IEEE Robot. Autom. Lett., vol. 1, no. 1, pp. 354–360, January 2016.
