To cite this article: Libin Zhang, Qinghua Yang, Yi Xun, Xiao Chen, Yongxin Ren, Ting Yuan, Yuzhi Tan & Wei Li (2007) Recognition of greenhouse cucumber fruit using computer vision, New Zealand Journal of Agricultural Research, 50:5, 1293-1298, DOI: 10.1080/00288230709510415
tomatoes were extracted. Xu et al. (2005) adopted a contrast colour index (R-B) model to recognise citrus fruit. The contrast colour index of citrus fruit was significantly different from that of leaves and branches. Experimental results showed that the recognition rate was high. Foglia & Giulio (2006) designed an agricultural robot for radicchio (leaf chicory) harvesting. An algorithm based on intelligent colour filtering and morphological image operations was provided to locate the radicchio plants. Most of the above research detected objects by the colour and brightness difference between fruits and other portions. For cucumber discrimination, these methods are not applicable because fruit colour is too similar to leaf and stem colour. Some investigators observed that the reflectances of fruit, leaf and stem were almost the same in visible light, but the reflectance of fruit was higher in the near-infrared (NIR) region. Fruit could therefore be identified by their spectral characteristics. Kondo et al. (1994) developed a visual sensor for a cucumber harvesting robot, which employed 550 and 850 nm wavelength interference filters to discriminate fruit from leaves and stems. Van Henten et al. (2002, 2006) detected cucumber fruit in the green environment using a different filter on each of two cameras: one camera was equipped with an 850 nm filter, the other with a filter in the 970 nm band. With filters, more demands were placed on camera quality, and the complexity of the machine vision system was also increased.

IMAGE ACQUISITION

The variety of cucumber tested in this study was 'JinYu 5', which was planted in the greenhouse. Fifty images of cucumber plants were obtained using a charge-coupled device (CCD) camera (JVC TK-C1381) in a stationary mode by means of backlighting between 9:00 h and 15:00 h on a sunny day. The colour data signals from the camera were transferred as a 24-bit RGB colour image to an image board (JOINHOPE OK_C20A). The effective resolution of the image was 640 (H) x 480 (V) pixels. Brightness, contrast, shutter speed, and aperture of the camera were kept constant most of the time during imaging. The camera was placed horizontally, and the distance between camera and fruit was between 0.40 and 0.60 m. Under backlighting conditions, very little light permeates the fruit, so the brightness of the cucumber surface is less than that of the leaf and stem in images, which is helpful for image analysis.

Fig. 1 Cucumber plant images in RGB space. A, Red-green-blue (RGB) image. B, R component image. C, G component image. D, B component image.

IMAGE SEGMENTATION

Colour feature extraction

Figure 1A shows an image of a cucumber plant including fruits, leaves, stems, sky and soil. The area of the fruits accounts for about 9% of the total area of the image. First, the colour image was decomposed into R, G and B component images, as shown in Fig. 1. By analysing the grey values of the component images, it was found that the maximum difference between cucumber and other portions was in the B component image. The grey values of most of the leafy region approached 0 and those of most of the sky region were close to 255, so the B component was selected as the feature for image segmentation.

In RGB colour space, the R, G and B components are strongly influenced by light. To reduce this effect, the image was transformed into HSI space, and the H, S and I component images were extracted, as shown in Fig. 2. In the S channel, the leaves behind the fruit were
Fig. 7 Fruit upper part determination (schematic showing the fruit centreline, a scanning window, initial point A, end point B, and the upper part of the fruit).
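The "third moment" used in recognition step (3) below is the third central moment of the grey-level distribution, a skewness-like texture measure. A minimal sketch of how the demarcation line of Fig. 7 might be computed from it is given here; the strip height, the top-to-bottom scan, and the reading of the median criterion are all assumptions, not the paper's stated procedure:

```python
import numpy as np

def third_moment(window):
    """Third central moment of the grey values in a window:
    mu3 = sum_i (z_i - m)^3 p(z_i), estimated directly from the pixels."""
    z = window.astype(np.float64).ravel()
    return np.mean((z - z.mean()) ** 3)

def demarcation_row(fruit_region, window_height=8):
    """Scan a segmented fruit region top-to-bottom in horizontal strips,
    compute the third moment of each strip, and place the demarcation
    line at the first strip whose moment reaches the median of all strip
    moments (an illustrative reading of the median criterion)."""
    rows = fruit_region.shape[0]
    moments = [third_moment(fruit_region[r:r + window_height])
               for r in range(0, rows - window_height + 1, window_height)]
    med = np.median(moments)
    for k, m in enumerate(moments):
        if m >= med:
            return k * window_height
    return rows  # degenerate case: no strip reached the median
```

The row returned would mark the boundary of the fruit's upper part, i.e. the grip position passed to the end-effector.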
(2) Two templates composed of horizontal and vertical white lines were defined. The non-fruit parts remaining in a discriminated image were removed by logical operations employing these templates. The fruits were usually recognised.

(3) By texture analysis, the parameter "third moment" was selected for determining a fruit's upper part. The approximate demarcation line could be obtained according to the median of the "third moment", and the portion above the line was considered as the upper part. This provided target grip position information for a robot end-effector.

ACKNOWLEDGMENTS

The authors gratefully acknowledge the financial support provided by the 863 project (2007AA04Z222).

REFERENCES

Annamalai P, Lee WS 2003. Citrus yield mapping system using machine vision. ASAE Paper No. 031002.

Annamalai P, Lee WS, Burks T 2004. Colour vision system for estimating citrus yield in real-time. ASAE Paper No. 043054.

Bulanon DM, Kataoka T, Ota Y et al. 2000. Estimating of apple fruit location using machine vision system for apple harvesting robot. Proceedings of the XIV Memorial CIGR World Congress. Pp. 574-579.

Bulanon DM, Kataoka T, Ota Y, Hiroma T 2001. A machine vision system for the apple harvesting robot. Agricultural Engineering International: the CIGR Journal of Scientific Research and Development III: 1-11.

Bulanon DM, Hiroma T, Ota Y et al. 2002. A segmentation algorithm for the automatic recognition of Fuji apples at harvest. Biosystems Engineering 83(4): 405-412.

Bulanon DM, Kataoka T, Okamoto H et al. 2004. Development of a real-time machine vision system for the apple harvesting robot. In: SICE 2004 Annual Conference, Sapporo, Japan, 4-6 August. Pp. 595-598.

Cai Jianrong, Zhao Jiewen 2005. Recognition of mature fruit in natural scene using computer vision. Transactions of the CSAM 36(2): 61-64 (in Chinese).

Chinchuluun R, Lee WS 2006. Citrus yield mapping system in natural outdoor scenes using the Watershed transform. ASAE Paper No. 063010.

Foglia MM, Giulio R 2006. Agricultural robot for radicchio harvesting. Journal of Field Robotics 23(6/7): 363-377.

Hayashi S, Ganno K, Ishii I, Tanaka I 2002. Robotic harvesting system for eggplants. Japan Agricultural Research Quarterly 36(3): 163-168.

Kondo N, Nakamura H, Monta M et al. 1994. Visual sensor for cucumber harvesting robot. In: Proceedings of the Food Processing Automation Conference III: 461-470.

Regunathan M, Lee WS 2005. Citrus yield mapping and size determination using machine vision and ultrasonic sensors. ASAE Paper No. 053017.

Van Henten EJ, Hemming J, van Tuijl BAJ et al. 2002. An autonomous robot for harvesting cucumbers in greenhouses. Autonomous Robots 13(3): 241-258.

Van Henten EJ, van Tuijl BAJ, Hoogaker GJ et al. 2006. An autonomous robot for de-leafing cucumber plants grown in a high-wire cultivation system. Biosystems Engineering 94(3): 317-323.

Xu Huirong, Ye Zunzhong, Ying Yibin 2005. Identification of citrus fruit in a tree canopy using colour information. Transactions of the CSAE 21(5): 98-101 (in Chinese).

Zhao J, Tow J, Katupitiya J 2005. On-tree fruit recognition using texture properties and colour data. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, 2-6 August. Pp. 263-268.