Precision Agricultural Spraying-ICEENG
PRECISION AGRICULTURAL SPRAYING: AUTONOMOUS COMPUTER VISION FOR

SPRAYER NOZZLE CONTROL

Ahmed Osama Gouda Yousuf Adel Soliman Nofal Noha Essam Muhammed
Computer Systems department Computer Systems department Computer Systems department
Ain Shams University Ain Shams University Ain Shams University
Cairo, Egypt Cairo, Egypt Cairo, Egypt
20201700018@[Link] 20201701017@[Link] 20201700925@[Link]

Nada Ahmed Shehata Renad Haitham Mohammed Maria Medhat Magdy


Computer Systems department Computer Systems department Computer Systems department
Ain Shams University Ain Shams University Ain Shams University
Cairo, Egypt Cairo, Egypt Cairo, Egypt
20201700914@[Link] 20201700295@[Link] 20201701219@[Link]

Abstract—In modern agriculture, the precise application of pesticides is critical for enhancing crop yields while mitigating environmental impacts. This study presents an innovative system that leverages computer vision to control tractor-mounted sprayer nozzles for precision pesticide application. The system uses a Raspberry Pi for real-time image processing and an Arduino for controlling the sprayer nozzles, ensuring targeted spraying on crops. This minimizes pesticide wastage and environmental contamination. Key components include image segmentation algorithms, remote control mechanisms, and a user-friendly mobile interface. Evaluations demonstrate significant reductions in pesticide use and improvements in application accuracy. This research advances precision agriculture by providing a scalable, efficient, and eco-friendly solution for autonomous pesticide spraying.

Keywords—Precision Agriculture, Autonomous Systems, Computer Vision, Pesticide Spraying, Tractor-Mounted Sprayers, Crop Row Detection.

I. INTRODUCTION

Modern agriculture faces substantial challenges, particularly in the efficient application of pesticides. Pesticides are vital for protecting crops from pests and diseases, but conventional spraying methods often lead to inefficiencies, causing environmental harm and economic losses. Inefficient pesticide application results in excessive chemical use, which adversely affects soil and water quality. According to the Food and Agriculture Organization (FAO) of the United Nations, up to 50% of applied pesticides are lost or wasted due to non-precise methods. In some cases, the effective application rate can be as low as 0.1%, with the remaining 99.9% dispersing to unintended areas, contaminating surface and groundwater [1].

This research aims to develop a precision agricultural spraying system using an autonomous computer vision-based approach for tractor-mounted sprayer nozzle control. By improving the precision of pesticide application, this system seeks to address issues of inefficiency and environmental contamination, promoting sustainable agricultural practices. The increasing global population heightens the demand for agricultural production and pesticide use, yet traditional pesticide application methods are often wasteful, leading to economic inefficiencies and increased costs for farmers. Excess pesticides contaminate soil and water, posing risks to ecosystems and human health. Reducing pesticide waste is therefore vital for sustainable agriculture, motivating this project through the need for efficient resource utilization and technological innovation.

The concept of autonomous systems has become a popular abstraction in recent times. These systems operate using technologies like GPS, sensors that detect and control object movement, and others that measure various environmental factors. Smith & Jones (2020) note that such systems leverage advanced technologies like GPS to perform tasks with high accuracy and minimal human intervention [2]. Leveraging cutting-edge technology, such as computer vision and autonomous control, offers a groundbreaking alternative to traditional farming practices. Hence, the contributions of this research can be summarized as follows:

• Developing a computer vision model capable of detecting crop rows in images captured from agricultural fields. The algorithm accurately identifies crop lines and differentiates between healthy and disconnected crop sections.
• Integrating the computer vision model with a Raspberry Pi to process captured images and make real-time decisions on nozzle activation.
• Building an embedded system capable of controlling tractor-mounted sprayer nozzles based on crop row detection. A prototype tractor-like RC vehicle with mounted sprayers and a mounted camera simulates the real process.
• Introducing a remote-controlled (RC) mechanism coupled with a mobile application to enable precise control of farm machinery. A user-friendly interface allows users to control the embedded system, set the speed and direction of the tractor, and monitor the spraying process [3]. It allows farmers to operate tractors remotely, making farm operations more convenient and efficient [4].

The paper is organized as follows: section 2 reviews related work, whereas section 3 proposes the system. Section 4 describes the experiment setup and results. Finally, section 5 concludes the paper.

II. RELATED WORK

Computer vision techniques are critical for the automated extraction of information from images. Techniques such as image segmentation, object detection, and pattern recognition are crucial for identifying crop lines and distinguishing between healthy and diseased sections of crops. Several works have been proposed that utilize computer vision to detect crop lines in autonomous agricultural spraying systems. The authors in [4] present a modular system designed for precision agriculture, specifically to automate the spraying process. The system leverages computer vision and individual nozzle control to optimize pesticide application, utilizing affordable components such as Arduino boards, solenoid valves, pressure and flow sensors, smartphones, webcams, and Raspberry Pi. This approach aims to minimize pesticide usage and reduce environmental impact by targeting weeds directly with minimal pesticide application. The results highlight the system's effectiveness, showing a significant reduction in pesticide use while maintaining high accuracy.

In [6], the authors propose an integrated scheme that combines computer vision and multi-tasking processes to develop a small-scale smart agricultural machine capable of automatic weeding and variable-rate irrigation within cultivated fields. By employing image processing methods such as HSV color conversion and morphology operator procedures, the system effectively distinguishes between plants and weeds, enabling precise weeding. Additionally, it utilizes soil moisture data to drive a fuzzy logic controller, optimizing irrigation rates to conserve water. Experimental results demonstrate that the system achieves a classification rate of 90% or higher for plant and weed identification, facilitating efficient weeding and irrigation with minimal human intervention.

Line estimation based on Gaussian Mixture Model (GMM) clustering was proposed in [7], where the authors describe a modular system for precision agriculture designed to automate sprayers and optimize pesticide application through a robotic system utilizing computer vision and individual nozzle on/off control. The primary motivation is to reduce pesticide usage in crops, providing potential savings for farmers and addressing environmental protection and food safety concerns. The system is adaptable to any crop planted in rows, including onions, soybeans, corn, beans, and rice. Results indicate that the system can detect plantation lines and can be used to retrofit conventional boom sprayers. Regarding the autonomous hardware, it incorporates low-cost equipment such as Arduino boards, solenoid valves, pressure and flow sensors, smartphones, webcams, and Raspberry Pi. This development represents a significant step toward creating a kit capable of upgrading a conventional sprayer to a fully autonomous robotic sprayer at an affordable cost for small and medium-sized farms. The Raspberry Pi, a compact single-board computer with a robust processor and extensive software support, is widely used for real-time image processing. Conversely, the Arduino platform, a versatile and simple controller for motors and actuators, is ideal for this purpose.

III. PROPOSED SYSTEM

The autonomous nozzle control system aims to optimize pesticide application in crop fields. It consists of several interconnected components, as shown in figure 1.

Figure 1. Block Diagram of the Proposed Framework

Input Image Acquisition: The system begins with a 5MP 1080HD camera connected to a Raspberry Pi (Raspberry Pi Camera). This camera captures images of the crop field.
Computer Vision Processing: The Raspberry Pi processes the captured images using computer vision techniques. Preprocessing steps include noise reduction, contrast enhancement, and image scaling. The processed images serve as input for subsequent analysis. Two algorithms are implemented for detecting crop lines and compared: the first is the Progressive Probabilistic Hough Transform (PPHT), and the second is the U-Net machine learning model, a type of convolutional neural network (CNN) architecture.
Nozzle Mapping Algorithm: The system selects appropriate nozzles based on the processed data. These nozzles spray pesticides precisely on the targeted crop sections. This algorithm is performed by the Raspberry Pi after computer vision processing.
Remote Control Mechanism: A mobile application with a user-friendly interface is developed to allow users to control the system, set the speed and direction of the tractor, and monitor the spraying process.

The next subsections describe the phases of the system, including the computer vision processing, nozzle mapping algorithm, hardware spraying system, and remote-control mechanism.

A. Data preprocessing
Images are converted to a consistent format for analysis. Preprocessing techniques (noise reduction, contrast enhancement, and scaling) enhance image quality and standardize the dataset.

1) Image Resizing and Normalization
The input image is resized to a target size of (224, 224). This step ensures that the image dimensions conform to the required input size for subsequent processing. Pixel value normalization: the pixel values of the resized image are normalized to the range [0, 1] by dividing each pixel value by 255.0. The preprocessed image, denoted as the normalized image, is then used for further analysis.
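As a concrete illustration, this resizing and normalization step can be sketched in NumPy as follows. This is a minimal sketch using nearest-neighbour resizing; in practice a library routine with proper interpolation (e.g. OpenCV's resize) would be used, and the function name is ours, not from the paper.

```python
import numpy as np

def preprocess(img_uint8, size=(224, 224)):
    """Resize an 8-bit image to `size` (nearest-neighbour, for
    illustration) and normalize pixel values to [0, 1] via /255.0."""
    h, w = img_uint8.shape[:2]
    rows = np.arange(size[0]) * h // size[0]   # source row per output row
    cols = np.arange(size[1]) * w // size[1]   # source col per output col
    resized = img_uint8[rows][:, cols]
    return resized.astype(np.float32) / 255.0

img = np.full((480, 640, 3), 255, dtype=np.uint8)  # dummy white frame
out = preprocess(img)
print(out.shape, out.max())  # (224, 224, 3) 1.0
```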
2) Plant Segmentation
The initial step in our implementation involves segmenting the plants from the background in the images. The process begins with the input image in RGB format, from which we directly extract the individual color channels. Next, we define two thresholds for segmentation: k = 0.57 and t = 20/255. Two conditions are applied based on these thresholds: the first is G > k × (R + B), and the second is (R + B) > t. These conditions help differentiate the green vegetation (plants) from the background. Using a logical AND operation, we combine the two conditions to create a binary mask where plant pixels are identified. Finally, a binary image is created where plant pixels are set to 1 and background pixels are set to 0. The resulting binary image highlights the plant regions, effectively segmenting them from the background. This segmentation step is crucial for further processing stages, such as centroid calculation and application of the Progressive Probabilistic Hough Transform (PPHT). The identified plant regions allow for precise analysis and control of the autonomous nozzle system, improving the efficiency of pesticide application [8].
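To make the rule concrete, the two threshold conditions and their logical AND can be sketched in NumPy as follows (an illustrative sketch assuming an RGB image scaled to [0, 1]; the paper's exact implementation may differ in detail):

```python
import numpy as np

def segment_plants(rgb, k=0.57, t=20 / 255):
    """Binary plant mask: a pixel is plant iff G > k*(R+B) and (R+B) > t."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mask = (g > k * (r + b)) & ((r + b) > t)   # logical AND of both conditions
    return mask.astype(np.uint8)               # plant pixels = 1, background = 0

# One green (plant) pixel and one gray (soil) pixel
img = np.array([[[0.10, 0.80, 0.10], [0.50, 0.50, 0.50]]])
print(segment_plants(img))  # [[1 0]]
```

The gray pixel fails the first condition (0.5 is not greater than 0.57 × 1.0), so only the green pixel is labeled as plant.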
3) Morphological Operations
Morphological operations are implemented to enhance the binary plant segmentation:
• Erosion is performed using a 3x3 kernel to remove noise and small objects. The resulting eroded image is saved for visualization purposes.
• Dilation is applied to the eroded image using a 7x7 kernel. This operation enhances the structure of the remaining plant regions and fills gaps, yielding the final processed (dilated) image.

Figure 3 shows a preprocessed image sample from the dataset.
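The two operations can be illustrated with a small NumPy-only sketch (in practice OpenCV's erode/dilate would be used; border handling here is simplified, and the helper name is ours):

```python
import numpy as np

def morph(binary, ksize, op):
    """Binary erosion (op=np.min) or dilation (op=np.max) with a
    square all-ones kernel of side `ksize`."""
    pad = ksize // 2
    fill = 1 if op is np.min else 0  # pad so image borders behave neutrally
    padded = np.pad(binary, pad, constant_values=fill)
    out = np.zeros_like(binary)
    for i in range(binary.shape[0]):
        for j in range(binary.shape[1]):
            out[i, j] = op(padded[i:i + ksize, j:j + ksize])
    return out

img = np.zeros((7, 7), dtype=np.uint8)
img[1:6, 1:6] = 1                      # a 5x5 "plant" blob
eroded = morph(img, 3, np.min)         # 3x3 erosion strips the rim -> 3x3 core
dilated = morph(eroded, 7, np.max)     # 7x7 dilation regrows and fills gaps
print(eroded.sum(), dilated.sum())     # 9 49
```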
Figure 3. Preprocessing of an image sample

B. Progressive Probabilistic Hough Transform (PPHT)
Centroid Calculation: In the implementation of our Hough Transform-based algorithm (PPHT) for plant detection and analysis, a critical preprocessing step is the calculation of centroids for the identified plant regions. This step operates on the processed binary image derived from the initial plant detection phase. The function begins by converting the processed image into an unsigned 8-bit integer array (np.uint8) to facilitate contour extraction. Contour extraction is configured to retrieve only the external contours and employs a simple approximation method to describe contour shapes efficiently. For each identified contour, the function computes the centroid from the moments of the contour. The centroid coordinates (cX, cY) are computed as cX = M10/M00 and cY = M01/M00, where M00, M10, and M01 denote the zeroth, first horizontal, and first vertical moments of the contour, respectively. These moments provide integral measures describing the spatial distribution of pixel intensities within the contour region. To ensure computational robustness, the function checks whether M00, which represents the area of the contour, is non-zero before computing the centroid; when M00 is zero, indicating an empty contour, the centroid coordinates default to (0, 0). All computed centroids are accumulated into a list, which serves as the output of the function. This spatial information is crucial for subsequent stages of the PPHT algorithm, facilitating accurate localization and analysis of plant structures within digital images. Figure 4 shows the result of preprocessing of the input image followed by the PPHT algorithm.

Figure 4. Preprocessing and PPHT

C. U-Net Model
A U-Net model is employed for image segmentation to accurately identify crop lines and differentiate between healthy and disconnected crop sections. U-Net's architecture, with its encoder-decoder structure, is particularly effective for pixel-level classification tasks such as crop line detection [9].
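The centroid computation described for the PPHT stage (cX = M10/M00, cY = M01/M00, with a (0, 0) default for empty regions) can be sketched directly from the moment definitions. This is a NumPy sketch of the math on a single binary region; the paper's implementation works per contour via OpenCV.

```python
import numpy as np

def centroid(mask):
    """Centroid of a binary region via image moments:
    cX = M10/M00, cY = M01/M00, defaulting to (0, 0) when M00 == 0."""
    ys, xs = np.nonzero(mask)
    m00 = xs.size               # zeroth moment = region area
    if m00 == 0:
        return (0, 0)           # empty region -> default, as in the text
    return (int(xs.sum() / m00), int(ys.sum() / m00))

mask = np.zeros((10, 10), dtype=np.uint8)
mask[2:5, 4:8] = 1              # a small rectangular "plant" region
print(centroid(mask))           # (5, 3)
```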
Figure 5. The original U-Net architecture [8].

The U-Net architecture is a convolutional neural network designed for image segmentation tasks. It consists of a contracting path (encoder) and an expansive path (decoder), with each step in the contracting path corresponding to an equivalent step in the expansive path. This architecture allows for precise localization due to the symmetrical structure and the skip connections between corresponding layers in the encoder and decoder [9].

The model was trained with a learning rate of 0.01 over 20 epochs. It was evaluated using accuracy and precision to assess its effectiveness in identifying crop lines and optimizing pesticide application. The training accuracy reached 97%, as shown in Figure 6, where the model starts to converge. In testing, the accuracy was 94.6%, indicating the model's robustness in real-world scenarios.
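The encoder-decoder structure with skip connections can be illustrated at a purely shape level with a toy NumPy sketch. This is illustrative only: fixed random 1x1 channel mixes stand in for trained 3x3 convolutions, only one down/up level is shown, and all names are ours, not the paper's.

```python
import numpy as np

def conv_like(x, out_ch):
    """Stand-in for a conv block: 1x1 channel mixing with fixed random
    weights plus ReLU (a real U-Net uses trained 3x3 convolutions)."""
    w = np.random.default_rng(0).standard_normal((x.shape[-1], out_ch))
    return np.maximum(x @ w, 0.0)

def down(x):
    """2x2 max pooling (the contracting path)."""
    h, w, c = x.shape
    return x[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2, c).max(axis=(1, 3))

def up(x):
    """Nearest-neighbour 2x upsampling (the expansive path)."""
    return x.repeat(2, axis=0).repeat(2, axis=1)

def tiny_unet(x):
    e1 = conv_like(x, 8)                     # encoder level 1
    e2 = conv_like(down(e1), 16)             # encoder level 2, half resolution
    d1 = np.concatenate([up(e2), e1], -1)    # upsample + skip connection
    return conv_like(d1, 1)                  # 1-channel segmentation map

y = tiny_unet(np.zeros((64, 64, 3), dtype=np.float32))
print(y.shape)  # (64, 64, 1)
```

The skip concatenation is what lets the decoder recover the fine spatial detail lost in pooling, which is why the output segmentation map matches the input resolution.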
Figure 6. U-Net model training and evaluation

The computer vision models are integrated with the Raspberry Pi to process the captured images and make real-time decisions.

D. Nozzle Mapping Algorithm
Grid Division: After PPHT/U-Net processing, the resulting labeled image is sent to the mapping algorithm to determine which nozzles should be activated. The image is partitioned into grids, and each grid cell is checked for the presence of white pixels, which indicate the existence of crops. The algorithm operates based on a set threshold for grid signals and the movement distance of the vehicle; parts of the image where the threshold is less than 6 grids are ignored. It is assumed that each grid corresponds to the area covered by a single nozzle. The distance covered by each grid is calculated dynamically based on the total distance input. Given this operation, the grid size can be calculated as follows:

Grid Size = D / N

where D is the total distance (in cm) and N is the number of grids. Next, based on the presence or absence of crops in each grid, signals are generated to inform the nozzle spraying system. This approach ensures efficient and targeted spraying based on real-time image processing and signal analysis, optimizing coverage dynamically. Figure 7 illustrates the grid-based nozzle mapping algorithm.

Figure 7. Signal Generation

E. Hardware Spraying System
The autonomous pesticide spraying tractor integrates both an Arduino Uno and an ESP32 microcontroller to manage its physical components effectively. The system is structured as follows:

1) Arduino Uno Microcontroller:
The Arduino manages the activation of the sprayer nozzles. It receives the processed signals resulting from the nozzle mapping algorithm and ensures that the nozzles spray pesticides precisely on the targeted crop sections, minimizing waste and enhancing efficiency.

2) ESP32 Microcontroller:
Two types of actuators are used: two DC motors that drive the tractor's wheels, and four sprayer nozzles integrated to apply pesticides accurately based on the processed data and control signals. The nozzles themselves are switched by the Arduino, as described above.

3) Extra equipment:
Motor drivers (L298N), Li-ion batteries, a Bluetooth module (HC-05), and the ESP32.

Figure 8 shows the complete spraying hardware system.

Figure 8. Complete Spraying Hardware

F. Remote control mechanism

1) Design and Implementation:
A mobile application with a user-friendly interface is developed to allow users to control the system, set the speed and direction of the tractor, and monitor the spraying process.

2) Remote Operation:
The tractor can be operated remotely using mobile applications and RC mechanisms for enhanced convenience and efficiency. The Blynk IoT platform is used for controlling the DC motors, and MIT App Inventor is used for controlling the pumps. The hardware is connected to the cloud via the Blynk app, allowing the application to control and monitor the system in real time.
Figure 9 shows the remote-control application.

Figure 9. Control Application

IV. EXPERIMENTAL SETUP AND RESULTS

The proposed system was trained and evaluated using the custom CRDLDv1.0 dataset, which consists of 2000 field images specifically captured for this project. The dataset includes images of various crop rows and is divided into training, testing, and validation sets, ensuring comprehensive model evaluation across different conditions. By utilizing CRDLDv1.0, we ensured that the computer vision model was well-calibrated for detecting crop lines in real-time scenarios. It is divided into three groups:
Training Dataset: contains 1250 images with corresponding ground truth segmentations. Each image belongs to one of 50 data classes (25 images per class).
Testing Dataset: comprises 500 images with ground truth segmentations, distributed across the same 50 data classes (10 images per class).
Validation Dataset: includes 250 images along with their ground truth segmentations.
Figure 2 shows the categories of the dataset.

Figure 2. Data Categories in the CRDLDv1.0 dataset

The complete spraying system comprised a 4-wheeler trolley as application equipment, designed to be controlled remotely for movement. The boom consisted of four plastic nozzles for spraying. A 5MP camera is mounted at a height of 1.15 feet from the ground for acquiring field images in real-time.

A Raspberry Pi 4 Model B was connected to the camera and served as the main processing unit for running the computer vision model. The captured images were processed to detect crop lines and determine the areas needing spraying. This involved preprocessing steps like noise reduction, contrast enhancement, and image scaling to improve image quality. Computer vision algorithms are then used to detect the crop lines, and the nozzle mapping software determines which nozzles should be activated for spraying. Python, with libraries like OpenCV and NumPy, offers robust tools for image processing and segmentation; these scripts analyze the images captured by the Raspberry Pi, detect crop lines, and send control signals to the Arduino.

An Arduino Uno R3 microcontroller is used for controlling the pump and the solenoid valves mounted on each of the four nozzles. The Arduino receives processed data and control signals from the Raspberry Pi. A fixed displacement pump driven by an electric DC motor delivers the agrochemical at a rate of 80-120 liters/hour at the desired pressure. The application rate through the nozzles was varied by changing the duty cycles of the electric solenoid valves attached to each nozzle according to the feedback signal obtained from the vision system. A 100% duty cycle (i.e., a fully on signal) resulted in maximum flow rate, whereas lower duty cycles resulted in reduced flow rates accordingly. The system operated at approximately 10 pulses per second (10 Hz). Each nozzle was controlled individually via ON/OFF solenoid valves, allowing for more accurate application of agrochemicals. An electronic proportional control valve (EPV) mounted on the bypass line was used to maintain the desired pressure in the system. When the pressure exceeded the set limit due to the closure of any outlet/nozzle, the EPV regulated the excess flow back to the tank.
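Under the stated behaviour that a 100% duty cycle gives maximum flow and lower duty cycles give proportionally less, the per-pulse valve timing at 10 Hz can be sketched as follows. The helper name and the assumption of a strictly linear duty-to-flow mapping are ours, not the paper's.

```python
def valve_timing(duty_pct, freq_hz=10.0, max_flow_lph=120.0):
    """On-time and off-time per pulse for a solenoid valve switched at
    freq_hz, plus the flow rate under an assumed linear duty->flow map."""
    period_s = 1.0 / freq_hz                    # 0.1 s per pulse at 10 Hz
    on_s = period_s * duty_pct / 100.0          # open fraction of the pulse
    flow_lph = max_flow_lph * duty_pct / 100.0  # assumed linear scaling
    return on_s, period_s - on_s, flow_lph

print(valve_timing(100))  # fully on: (0.1, 0.0, 120.0)
```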
The experiment is conducted by implementing a field with 4 lines of plastic, with discontinuities, to simulate the crop lines. The Raspberry Pi camera captures a sequence of frames from the field and performs computer vision processing. Figure 10 shows (a) the experiment field, (b) the captured frames, and (c) the result of the identified plant lines using U-Net.

The processed images with detected lines are passed to the nozzle mapping algorithm for nozzle activation. Every nozzle is programmed to open once the detected signal is high on the corresponding Raspberry Pi GPIO pins. In this experiment, the field dimensions are 2.5 m length x 1.5 m width, and the chosen grid count is 10 grids; hence, the grid size is calculated and set to 25 cm/grid. The flow rate of the liquid is 100 ml/meter, the distance between each line and the next is 20 cm, the discontinuity length is 25 cm, and the vehicle speed is 10 cm/second. After spraying, the system takes another signal and continues the process. The result of the nozzle mapping algorithm on the frames captured in figure 10 is shown in figure 11.
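The grid-size computation and per-grid signal generation used in this experiment can be sketched as follows. This is a minimal sketch: the presence threshold of a single white pixel per grid is our simplification, and the function name is illustrative.

```python
import numpy as np

def nozzle_signals(binary, n_grids, min_white=1):
    """Split the labeled image into n_grids vertical strips and emit a
    1/0 spray signal per grid based on white (crop) pixel presence."""
    strips = np.array_split(binary, n_grids, axis=1)
    return [int(s.sum() >= min_white) for s in strips]

# Experiment numbers: 250 cm field length, 10 grids -> 25 cm per grid
grid_size_cm = 250 / 10
img = np.zeros((4, 100), dtype=np.uint8)
img[:, 0:30] = 1    # crop present in grids 0-2
img[:, 70:80] = 1   # crop again in grid 7, with a discontinuity between
print(grid_size_cm, nozzle_signals(img, 10))
# 25.0 [1, 1, 1, 0, 0, 0, 0, 1, 0, 0]
```

Each signal would then drive one nozzle's GPIO pin: high where crops are detected, low across the discontinuity.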

Figure 10. In-field experiment: (a) experiment field; (b) captured real-time frames using the Raspberry Pi camera; (c) processed image by the U-Net model

Figure 11. Grid view of the nozzle mapping algorithm

Finally, figure 12 shows the nozzle spray control corresponding to the result of the mapping algorithm in figure 11: nozzles 1 and 3 (counting from the right) are turned on; a discontinuity then occurs at nozzle 3, so it stops and later opens again.

Figure 12. Nozzle activation on/off

V. CONCLUSION

The integration of an autonomous nozzle system into a farm tractor marks a substantial advancement in agricultural technology, aiming to optimize pesticide usage through computer vision and automation by spraying targeted crop sections rather than entire fields.
