

International Conference on Digital Image Processing

Edge Link Detector Based Weed Classifier

Muhammad Hameed Siddiqi, Irshad Ahmad, Suziah Bt Sulaiman
Department of Computer and Information Sciences
Universiti Teknologi PETRONAS
Bandar Seri Iskandar, 31750 Tronoh, Perak, Malaysia
siddiqi_icp@yahoo.com, iakhalil2003@yahoo.com, suziah@petronas.com.my

Abstract— The identification and classification of weeds are of major technical and economic importance in the agricultural industry. To automate these activities using features such as shape, color and texture, a weed control system is feasible. The goal of this paper is to build a real-time, machine vision weed control system that can detect weed locations. The algorithm is developed to classify images into broad and narrow classes for real-time selective herbicide application. The developed algorithm, based on an Edge Link Detector, has been tested on weeds at various locations and has shown itself to be very effective in weed identification. Further, the results show very reliable performance on weeds under varying field conditions. The analysis of the results shows over 93% classification accuracy over 240 sample images (broad, narrow and no or little weeds), with 100 samples of broad weeds, 100 samples of narrow weeds and the remaining 40 of no or little weeds.

Keywords— real-time weed recognition; weed detection; image processing; Radon Transform; image classifier; weed segmentation.

I. INTRODUCTION

Weed control is a critical farm operation and can significantly affect crop yield. Herbicides play an important role in weed control, but their use is under criticism due to perceived excessive use and potentially harmful effects. Several studies suggest that patch spraying considerably reduces herbicide use. Manual scouting for patch spraying consumes considerable resources and is not a feasible option for most farm operations [24]. Many researchers have investigated patch spraying using remote sensing and machine vision, with varied success. Machine vision systems are suitable for plant-scale herbicide application, whereas remote sensing can be employed on a plot basis. Both of these systems essentially require image acquisition and image processing. Image sizes range in the order of megabytes, so processing takes 0.34 s to 7 s depending on image resolution, crop and weed type, algorithm used and hardware configuration [23, 19].

The machine vision based approach uses shape, texture, color and location based features, individually or jointly, to discriminate between weed and crop. The studies report varied results for these features and their combinations [10].

An imaging sensor is a key component of almost any weed detection and classification system, and there are various ways to use one. Individual plant classification has been successfully demonstrated with either spectral [25] or color imaging [6]. The spatial resolutions of spectral systems are typically not adequate for accurate individual plant or leaf detection. Then again, color-imaging methods with higher spatial resolution do not offer the important additional information that spectral data provides [26].

Caltrans sprays roadside plant material with herbicide to prevent the weeds from becoming a fire hazard during the summer. The first step in identifying weeds within an image involves classifying the pixels [22]. The pixels shall be classified using a point operation; the surrounding pixels will not bias a pixel's classification. The purpose of segmenting the image into plant and background pixels is to detect the amount of plant material within a specific area [22]. If the amount of plant material reaches a specific threshold, that area is targeted for herbicide spray application [22]. The spray threshold is limited by the fraction of background pixels that are misclassified as plant material. If the spray threshold is set too close to the background misclassification rate, then herbicide will be wasted spraying background. Therefore, a larger misclassification rate limits the smallest plant that can be detected without targeting the background for spray [22].

A system that could make use of the spatial distribution information in real time and apply only the necessary amounts of herbicide to the weed-infested area would be much more efficient and would minimize environmental damage. Therefore, a high spatial resolution, real-time weed infestation detection system seems to be the solution for site-specific weed management.

II. RELATED WORK

In agriculture, imaging devices provide useful information to detect in-field heterogeneities. Depending on what we want to see (a leaf, a plant) and to allow the features to be visualized, remote sensors can be embedded in different vehicles such as tractors, agricultural engines [2], aircraft [3, 2] or satellites. For sustainable agriculture, particular attention must be paid to herbicide treatments, as they are the major agricultural pollutants, and site-specific spraying of weeds in crop fields could avoid considerable

978-0-7695-3565-4/09 $25.00 © 2009 IEEE 255


DOI 10.1109/ICDIP.2009.64

Authorized licensed use limited to: UNIVERSITY TEKNOLOGY PETRONAS. Downloaded on August 29, 2009 at 06:58 from IEEE Xplore. Restrictions apply.
use of herbicides. Consequently, many researchers have developed different vision systems [5] to highlight weed plants in crops and to map the weeds in real time for site-specific spraying of infested areas [7, 8]. These systems can be based on optical sensors (photodiodes), which can be used to classify/discriminate between plants (narrow plants and broad plants) from their reflectance spectra. The best known are WeedSeeker and Spray Vision [8]; however, these systems cannot discriminate between crop and weeds. More recently, [9, 10] have developed a robot with two vision systems guided by crop rows, which aims to mechanically remove weeds in the inter-row. However, this detection method is devoted to crops sown by drilling methods, such as salad or sugar beet [1]. An off-line approach has also been investigated, where data are acquired in one pass and analyzed at the office. In a second pass, the weed mapping is used to control an agricultural engine in the crop field. As an example, [12] developed a multi-spectral imaging system embedded in a small aircraft, which over-flew a sunflower field crop. This discriminated between crop and inter-row weeds with spatial image processing based on Gabor filtering, which detected crop rows by their frequency. However, weeds within the crop row were not recognized.

For weed detection, many accurate methods have been developed, such as the wavelet transform to discriminate between crop and weed in perspective agronomic images [13], and the spectral reflectance of plants analyzed with artificial neural networks [15] or principal component analysis. Other researchers have investigated texture features [12] or biological morphology such as leaf shape recognition [13]. For real-time identification/classification of crop rows in images, many fast methods have been implemented [14]: some are based on the Hough Transform [16], some on the Fourier Transform [17], and some on Kalman filtering [18] or linear regression [19]. The Hough Transform is usually implemented for automatic guidance in crop fields [20, 21]. Consequently, there are now various vision systems available on autonomous weed control robots for mechanical weed removal.

The purpose of this paper is not an autonomous robot but to investigate a real-time machine vision system. The system contains a CCD camera mounted in front of the tractor at an angle of 45 degrees and at a distance of 4 meters from the ground, which is used for the classification of two types of weeds (broad and narrow) by using the proposed image processing technique, Edge Link Detection.

III. OBJECTIVES

Since in practice only two types of herbicides are used, one for broad-leaf weeds and one for narrow-leaf weeds (grass), the objectives of this paper are to:

• Develop a vision algorithm that can recognize the absence of weed and differentiate the presence of broad leaf weed and narrow leaf weed.
• Develop a real-time sensor that is capable of recognizing the presence and type of weeds and applying the right type of herbicide.

IV. MATERIALS AND METHOD

A. Hardware Design

The concept of the automated sprayer system (shown in figure 3) includes a camera, a Central Processing Unit (CPU) decision box and two DC pumps for spraying. The images were taken at a distance of 4 meters and at an angle of 45 degrees with the ground. Agricultural fields were selected for this type of study.

The software is developed in Matlab. A Graphical User Interface (GUI) is developed that shows the original image, the processed image and the result of the proposed algorithm. The image resolution was 240 pixel rows by 320 pixel columns.

B. Methodology

Edge detection is a technique used to reduce the amount of data and filter out unnecessary information, through a structuring element moving pixel by pixel, in order to determine the edges of an image. There are two basic methods used to determine the edges of the image.

Gradient: The gradient method detects the edges by looking for the maximum and minimum in the first derivative of the image. The operators/filters used for this method are the Sobel filter, Prewitt filter, Canny filter, Roberts filter, Cross filter and Smoothing filter.

Laplacian: The Laplacian method searches for zero crossings in the second derivative of the image to find edges. The mask used for this method is the Marr-Hildreth operator.

The color (RGB) images are converted to gray-scale images for easy and fast processing. Then the proposed algorithm (Edge Link Detection) using a 3x3 mask is applied to those gray-scale images to reduce the amount of data. For example, let 'A' be an image and 'B' a 3x3 morphological mask/filter that determines the edges of the image 'A' using the structuring element B, as shown in figure 2. The mathematical equations are defined as:

    A ⊗ B = A − (A * B) = A ∩ (A * B)^c
    {B} = {B1, B2, B3, ..., Bn}
    A ⊗ {B} = ((...((A ⊗ B1) ⊗ B2)...) ⊗ Bn)

The pseudo-code used for the proposed algorithm is given as:

    [Image taken by CCD camera]
    [CPU receives the image]
    [Apply the proposed algorithm and recognize the image]
    [Decision box turns the DC pump ON or OFF according to the decision of the CPU]
    [DC pump applies the right type of herbicide]

The pseudo-code is shown in the form of a flowchart in Figure 1. First the CCD camera captures the image and forwards it to the CPU; the CPU then processes the image using the morphological filter (structuring element) to remove the unnecessary information from the image, and the edges are found as shown in figure 4. Then the edges of
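The two families of detectors described in the methodology can be illustrated with a short sketch. The following Python code (a stand-in for the paper's Matlab implementation, not part of the original work) approximates the gradient method with a Sobel filter and the Laplacian method with zero crossings of a smoothed Laplacian; the threshold value and the synthetic test image are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def gradient_edges(gray, thresh=0.2):
    """Gradient method: edges are extrema of the first derivative,
    approximated here with Sobel filters along each axis."""
    gx = ndimage.sobel(gray, axis=1)   # horizontal derivative
    gy = ndimage.sobel(gray, axis=0)   # vertical derivative
    mag = np.hypot(gx, gy)             # gradient magnitude
    mag /= mag.max() if mag.max() > 0 else 1.0
    return mag > thresh                # binary edge map

def laplacian_edges(gray):
    """Laplacian method: edges are zero crossings of the second derivative
    (Gaussian smoothing first, in the spirit of Marr-Hildreth)."""
    lap = ndimage.laplace(ndimage.gaussian_filter(gray, 1.0))
    sign = lap > 0
    cross = np.zeros_like(sign)
    # A zero crossing occurs where the sign changes between neighbours.
    cross[:-1, :] |= sign[:-1, :] != sign[1:, :]
    cross[:, :-1] |= sign[:, :-1] != sign[:, 1:]
    return cross

# Tiny synthetic example: a bright square on a dark background.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0
edges = gradient_edges(img)
```

Either method reduces the image to a sparse binary edge map, which is the input the linking stage expects.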


that image are connected with each other through the proposed algorithm, as shown in the image of figure 4 (Edge Junction Linking). The CPU then takes the decision on the basis of the summation of the pixels of the linked edges, i.e. the Edge Junction Linking picture shown in figure 4. The simple equation for the summation of the pixels is given as:

    Sum = Σ_{i=0}^{M} Σ_{j=0}^{N} (i + j)        (1)

Figure. 1. The concept of a Real-Time Specific Weed Sprayer System (flowchart: Start → Read Color Image → Convert to Gray Scale → 3x3 Mask, to reduce the amount of data → Edge Filter, to remove unnecessary information and find the edges → Proposed Algorithm → Edge Junction Linking → Summation of Pixels → Decision (see Eqn. 1): if sum > threshold value, the sprayer controller sprays; if sum < threshold value, don't spray)

In the next step, a raster-scanning technique is used to detect and link edge points together into lists of coordinate pairs. Where an edge junction is encountered, the list is terminated and a separate list is generated for each of the branches, as shown in figure 4. Two lookup tables LT1 and LT2 are created, which store the edge junction and ending points respectively.

Figure. 2. Raster Scanning of Image (the origin of the morphological filter (structuring element) scans a 320-column by 240-row image containing broad and narrow weed regions)

Matlab code to create the lookup tables:

    % set up a look up table T1 to find Junctions.
    lut = makelut(@junction, 3);
    junctions = applylut(b, lut);
    [rj, cj] = find(junctions);

    % set up a look up table T2 to find Endings.
    lut = makelut(@ending, 3);
    ends = applylut(b, lut);
    [re, ce] = find(ends);

To test whether the center pixel within a 3x3 neighborhood is a junction/ending in LT1/LT2, the center pixel must be set, and the number of transitions/crossings between 0 and 1 as one traverses the perimeter of the 3x3 region must be 6 or 8 for a junction and 2 for an ending. Pixels in the 3x3 region are numbered as follows:

    1 4 7
    2 5 8
    3 6 9

Now the intensities of the resultant image are added and compared with the selected threshold value (T) for classification of weeds into broad and narrow.

V. RESULTS AND DISCUSSION

Fig. 4 shows the classification images of broad and narrow weeds. These images are processed by the proposed algorithm (Edge Link Detection). The algorithm gave reliable accuracy in detecting the presence or absence of weed cover. For areas where weeds are detected, results show over 93% classification accuracy over 240 sample images, within which 100 samples are from broad weeds, 100 samples from narrow weeds and the remaining 40 from no or little weeds. The percentage of each class is shown in Table 1.
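The junction/ending rule used to build the lookup tables (centre pixel set; 2 perimeter transitions for an ending, 6 or 8 for a junction) can be reproduced outside Matlab. The Python sketch below is a hand-rolled stand-in for `makelut`/`applylut`; the circular perimeter ordering and the small test skeleton are assumptions for illustration.

```python
import numpy as np

def perimeter_transitions(nbhd):
    """Count 0/1 transitions around the perimeter of a 3x3 binary
    neighbourhood, traversed in a closed circular order."""
    r = [nbhd[0, 0], nbhd[0, 1], nbhd[0, 2], nbhd[1, 2],
         nbhd[2, 2], nbhd[2, 1], nbhd[2, 0], nbhd[1, 0]]
    return sum(r[i] != r[(i + 1) % 8] for i in range(8))

def is_junction(nbhd):
    """Centre pixel set and 6 or 8 perimeter transitions (3-4 branches)."""
    return bool(nbhd[1, 1]) and perimeter_transitions(nbhd) in (6, 8)

def is_ending(nbhd):
    """Centre pixel set and exactly 2 perimeter transitions (1 branch)."""
    return bool(nbhd[1, 1]) and perimeter_transitions(nbhd) == 2

def find_points(skel, predicate):
    """Slide a 3x3 window over a thinned binary image, like applylut."""
    rows, cols = [], []
    for i in range(1, skel.shape[0] - 1):
        for j in range(1, skel.shape[1] - 1):
            if predicate(skel[i - 1:i + 2, j - 1:j + 2]):
                rows.append(i)
                cols.append(j)
    return rows, cols

# A small cross-shaped skeleton: one junction, four endings.
skel = np.zeros((7, 7), dtype=np.uint8)
skel[1:6, 3] = 1   # vertical stroke
skel[3, 1:6] = 1   # horizontal stroke
rj, cj = find_points(skel, is_junction)
```

On the cross-shaped skeleton, the only junction found is its centre pixel, and the four stroke tips are reported as endings, which matches the rule stated above.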


TABLE I. RESULTS OF THE WEEDS USING THE PROPOSED ALGORITHM

    Weeds Type          Results found correct %
    Broad Weeds         92%
    Narrow Weeds        95%
    No or Little Weeds  100%

In the above table, the percentage of each category has been determined by applying the proposed algorithm to the database of 240 images (containing 100 samples of broad, 100 samples of narrow and 40 samples of no or little weed), from which we determined the percentage of each category by adding the pixels of all the processed database images. The algorithm takes 320 ms to process an image.

Figure. 3. The Concept of a Weed Sprayer System

VI. CONCLUSION

A weed detection/classification system is developed and tested in the lab for selective spraying of weeds using a vision recognition system. In this paper, an algorithm based on the detection and linking of edge points in a morphologically thinned binary image is developed for weed classification and recognition. The system shows effective and reliable classification of images captured using a video camera.

VII. FUTURE WORK

In this paper, a weed image that has one dominant weed species can be classified reasonably accurately, but the case of more than one weed class cannot be accurately classified. Further research is needed to classify mixed weeds. One way is to break the image into smaller regions: with a smaller region, there is less possibility of finding more than one weed class in that small region. The algorithm is computationally complex and takes 13 to 20 seconds to process a 320 x 240 image. To make it suitable for a real-time system, further research is needed to reduce the processing time.

REFERENCES

[1] Bossu, J., Gée, Ch., Jones, G., Truchetet, F., 2008. "Wavelet transform to discriminate between crop and weed in perspective agronomic images". Comput. Electron. Agric., doi:10.1016/j.compag.2008.08.004.
[2] Tian, L., Reid, J.F., Hummel, J.W., 1999. "Development of a precision sprayer for site-specific weed management". T. ASAE 42 (4), 893–900.
[3] Sugiura, R., Noguchi, R., Ishii, K., 2005. "Remote sensing technology for vegetation monitoring using an unmanned helicopter". Biosyst. Eng. 90 (4), 369–379.
[4] Vioix, J.B., Douzals, J.P., Truchetet, F., Assemat, L., Guillemin, J.P., 2002. "Spatial and spectral method for weeds detection and localization". EURASIP JASP 7, 679–685.
[5] Slaughter, D.C., Giles, D.K., Downey, D., 2007. "Autonomous robotic weed control systems: a review". Comput. Electron. Agric. 61 (1), 63–78.
[6] Hemming, J., Rath, T., 2001. "Computer and vision-based weed identification under field conditions using controlled lighting". J. Agric. Eng. Res. 78, 233–243.
[7] Blasco, J., Aleixos, N., Roger, J.M., Rabatel, G., Molto, E., 2002. "Robotic weed control using machine vision". Biosyst. Eng. 83 (2), 149–157.
[8] Felton, W.L., McCloy, K.R., 1992. "Spot spraying". Agric. Eng. 11, 26–29.
[9] Åstrand, B., Baerveldt, A.J., 2002. "An agricultural mobile robot with vision-based perception for mechanical weed control". Auton. Robots 13, 21–35.
[10] Åstrand, B., Baerveldt, A.J., 2005. "A vision based row-following system for agricultural field machinery". Mechatronics 15, 251–269.
[11] Vioix, J.B., Douzals, J.P., Truchetet, F., 2004. "Aerial detection and localization of weed by using multispectral and spatial approaches". In: Proc. Conf. Agricultural Engineering (AgEng2004), Leuven, Belgium, pp. 298–299.
[12] Meyer, G., Metha, T., Kocher, M., Mortensen, D., Samal, A., 1998. "Textural imaging and discriminant analysis for distinguishing weeds for spot spraying". T. ASAE 41 (4), 1189–1197.
[13] Manh, A.G., Rabatel, G., Assemat, L., Aldon, M.J., 2001. "Weed leaf image segmentation by deformable templates". J. Agric. Eng. Res. 80 (2), 139–146.
[14] Moshou, D., Vrindts, E., De Ketelaere, B., De Baerdemaeker, J., Ramon, H., 2001. "A neural network based plant classifier". Comput. Electron. Agric. 31 (1), 5–16.
[15] Fontaine, V., Crowe, T.G., 2006. "Development of line-detection algorithms for local positioning in densely seeded crops". Can. Biosyst. Eng. 48 (7), 19–29.
[16] Leemans, V., Destain, M.F., 2006. "Application of the Hough Transform for seed row location using machine vision". Biosyst. Eng. 94 (3), 325–336.
[17] Vioix, J.B., Douzals, J.P., Truchetet, F., Assemat, L., Guillemin, J.P., 2002. "Spatial and spectral method for weeds detection and localization". EURASIP JASP 7, 679–685.
[18] Hague, T., Tillet, N.D., 2001. "A bandpass filter-based approach to crop row location and tracking". Mechatronics 11, 1–12.
[19] Søgaard, H.T., Olsen, H.J., 2003. "Determination of crop rows by image analysis without segmentation". Comput. Electron. Agric. 38, 141–158.


[20] Marchant, J., 1996. "Tracking of row structure in three crops using image analysis". Comput. Electron. Agric. 15, 161–179.
[21] Keicher, R., Seufert, H., 2000. "Automatic guidance for agricultural vehicles in Europe". Comput. Electron. Agric. 25, 169–194.
[22] Gliever, C., 2003. "Color Segmentation of Plant and Soil". EEC206 Project Report, March 21, 2003.
[23] Lee, W.S., Slaughter, D.C., Giles, D.K., 1999. "Robotic weed control system for tomatoes". Precision Agric. 1, 95–113.
[24] Mathanker, S.K., Weckler, P.R., Taylor, R.K., 2007. "Effective spatial resolution for weed detection". 2007 ASABE Annual International Meeting, Minneapolis Convention Center, Minneapolis, Minnesota, 17–20 June 2007.
[25] Vrindts, E., De Baerdemaeker, J., Ramon, H., 2002. "Weed detection using canopy reflection". Precision Agriculture 3 (1), 63–80.
[26] Komi, P.J., Jackson, M.R., Parkin, R.M., 2007. "Plant classification combining color and spectral cameras for weed control purposes". IEEE, 2007.

Figure. 4. Resultant Images of Edge Link Detector Based Weed Classifier (rows: Broad Weed images, Narrow Weed images, Little Weed images; columns: Gray Scale, Edge Detection, Edge Junction Linking)
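Putting the stages of Figure 4 together, the overall spray decision can be sketched end to end. The Python code below is only a toy approximation: a plain Sobel detector stands in for the proposed edge-link detector, and the two density thresholds are invented for illustration, not values from the paper.

```python
import numpy as np
from scipy import ndimage

def classify_weed(gray, t_broad=0.08, t_narrow=0.02):
    """Toy end-to-end decision in the spirit of the paper's pipeline:
    edge-detect, sum the edge pixels, and threshold the sum.
    Both thresholds are illustrative assumptions."""
    gx = ndimage.sobel(gray, axis=1)
    gy = ndimage.sobel(gray, axis=0)
    edges = np.hypot(gx, gy) > 0.5   # binary edge map
    density = edges.mean()           # normalised "summation of pixels"
    if density >= t_broad:
        return "broad"               # dense edge structure -> broad-leaf spray
    if density >= t_narrow:
        return "narrow"              # sparse edges -> narrow-leaf spray
    return "no spray"                # little or no weed cover

# Synthetic 240x320 frame matching the paper's resolution, mostly empty.
frame = np.zeros((240, 320))
print(classify_weed(frame))   # -> "no spray"
```

In the real system this decision would drive the DC pump ON/OFF line, applying the herbicide matching the predicted class.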

