187-99Z - Article Text-491-1-2-20190504
Abstract
In 2018, Indonesian fruit exports increased by 24% over the previous year. The surge in demand for
tropical fruits from non-tropical countries is one of the contributing factors for this trend. Some of these
countries have strict quality requirements, and poor quality control of fruit is an obstacle to
achieving greater export yield. This is because some exporters still use manual sorting processes
performed by workers, so the quality standard varies with the individual perception of each
worker. Therefore, an intelligent system is needed that is capable of automatic sorting according to a
set standard. In this research, we propose a system that can classify fruit defects automatically. The Faster
R-CNN (FRCNN) architecture is proposed as a solution to detect the level of defect on the surface of the fruit.
We study three types of fruit: mango (sweet fragrant), lime, and pitaya. Each
fruit is divided into three categories: (i) super, (ii) middle, and (iii) defect. We exploit joint detection
and video tracking to count fruit and determine fruit quality in real time. The datasets were collected in the
field and then trained using the FRCNN framework on the TensorFlow platform. We demonstrate that
this system can classify fruit with an accuracy of 88% (mango), 83% (lime), and 99% (pitaya), with
an average computation cost of 0.0131 m/s. We can track and count fruit sequentially without using
additional sensors and check the defect rate on fruit using the video streaming camera more accurately
and with greater ease.
Keywords: Deep Learning, Faster R-CNN, Object Detection, TensorFlow, Tracking Detection.
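As a minimal sketch of the three-grade classification described above (not the authors' code), the output of a defect detector for one fruit can be mapped to the super / middle / defect categories by the fraction of the fruit's surface covered by detected defects. The 5% and 30% area thresholds below are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: map detector output for one fruit to the three quality grades
# used in the paper -- Super, Middle, Defect. Boxes are (x1, y1, x2, y2) tuples.
# The mid_thr/defect_thr values are illustrative assumptions.

def box_area(box):
    """Area of an axis-aligned box given as (x1, y1, x2, y2)."""
    x1, y1, x2, y2 = box
    return max(0.0, x2 - x1) * max(0.0, y2 - y1)

def grade_fruit(fruit_box, defect_boxes, mid_thr=0.05, defect_thr=0.30):
    """Grade one fruit from its bounding box and the defect boxes found on it."""
    fruit_area = box_area(fruit_box)
    if fruit_area == 0:
        return "Defect"
    defect_ratio = sum(box_area(b) for b in defect_boxes) / fruit_area
    if defect_ratio >= defect_thr:
        return "Defect"
    if defect_ratio >= mid_thr:
        return "Middle"
    return "Super"

print(grade_fruit((0, 0, 100, 100), []))                # Super (no defects)
print(grade_fruit((0, 0, 100, 100), [(0, 0, 10, 60)]))  # Middle (6% defect area)
print(grade_fruit((0, 0, 100, 100), [(0, 0, 50, 80)]))  # Defect (40% defect area)
```

In the full pipeline the defect boxes would come from the Faster R-CNN detector; the grading step itself is independent of the detector used.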
Fig 10. (a) Global steps/sec, a performance indicator over the 18,000 steps we processed for the training model. (b) Total loss, which trends toward zero over the 18,000 steps, showing improved performance.

The table shows screenshots of our experiment conducting the fruit detection process in real time. From the results above, we conclude that the object detection process is remarkably consistent, but there are shortcomings. When fruit is inserted individually, the detection results are in accordance with the actual label. However, when fruit are entered simultaneously, some fruits do not match the correct label, as shown in the last row of Table 5. This algorithm offers a solution for real-time fruit quality detection across many classes. In our future research, we aim to modify the feature extraction architecture so that it becomes capable of producing the best model and minimizing errors in fruit quality detection.

CONCLUSION

In this paper, we present a system that classifies fruit with an accuracy of 88% (mango), 83% (lime), and 99% (pitaya), with an average computation cost of 0.0131 m/s. We can track and count fruit sequentially without using additional sensors. Additionally, we can check the defect rate on fruit accurately using the video streaming camera. The datasets were collected in the field and then trained using the FRCNN framework on the TensorFlow platform. Faster R-CNN is one model that proved capable of solving complex computer vision problems and showed remarkable results at the beginning of the deep learning revolution. This algorithm provides a solution for real-time, multi-class fruit quality detection. In our future research, further experiments need to be done, including building an Android app integrated with an analytic system. We also used the video tracking system to count the quantity of fruit on the conveyor runway, so we can track and count fruit sequentially without additional sensors and check the defect rate on fruit using the video streaming camera more accurately and with greater ease.

ACKNOWLEDGMENTS

This research was funded by KEMENRISTEKDIKTI under the 2019 "Penelitian Tesis Pascasarjana" scheme with the title "Intelligent System Development for Automatic Classification of Fruit Defect". The authors would like to express their gratitude to Mr. Faisol, the head of a farmer group in Gresik district, for providing the opportunity to collect data directly from their garden.
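The tracking-and-counting step described in the conclusion can be sketched as a minimal centroid tracker: per-frame detection centroids are matched to existing tracks by nearest distance, and each newly created track is counted once. This is a simplified stand-in for the paper's video tracking system, not the authors' implementation; the `max_dist` threshold is an assumed parameter.

```python
# Hedged sketch of counting fruit on a conveyor from per-frame detections:
# assign each frame's centroids to the nearest existing track, and count
# every new track exactly once. Not the paper's implementation.
import math

class CentroidCounter:
    def __init__(self, max_dist=50.0):
        self.max_dist = max_dist   # max centroid movement between frames (assumed)
        self.tracks = {}           # track id -> last known centroid
        self.next_id = 0
        self.total = 0             # fruits counted so far

    def update(self, centroids):
        """Match this frame's detection centroids to tracks; return running total."""
        unmatched = dict(self.tracks)
        new_tracks = {}
        for c in centroids:
            # nearest previous track within max_dist, else start a new track
            best = min(unmatched, key=lambda t: math.dist(unmatched[t], c), default=None)
            if best is not None and math.dist(unmatched[best], c) <= self.max_dist:
                new_tracks[best] = c
                del unmatched[best]
            else:
                new_tracks[self.next_id] = c
                self.next_id += 1
                self.total += 1
        self.tracks = new_tracks   # tracks that vanished are dropped
        return self.total

counter = CentroidCounter()
counter.update([(10, 50)])            # fruit A enters -> total 1
counter.update([(40, 50), (10, 90)])  # A moves, fruit B enters -> total 2
counter.update([(70, 50), (40, 90)])  # both move, nothing new -> total 2
print(counter.total)  # 2
```

In practice the centroids would come from the Faster R-CNN detection boxes on each video frame, which is why no additional sensor is needed to count fruit passing on the runway.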