2021 IEEE 13th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment, and Management (HNICEM) | 978-1-6654-0167-8/21/$31.00 ©2021 IEEE | DOI: 10.1109/HNICEM54116.2021.9732038
Froilan N. Jimeno II
Electronics Engineering Program, College of Engineering Education, University of Mindanao, Davao City, Philippines
fjimeno@umindanao.edu.ph

Briely Jay A. Briz
Electronics Engineering Program, College of Engineering Education, University of Mindanao, Davao City, Philippines
bbriz@umindanao.edu.ph

Marvin Roy P. Artiaga
Electronics Engineering Program, College of Engineering Education, University of Mindanao, Davao City, Philippines
martiaga@umindanao.edu.ph

Randy E. Angelia
Computer Engineering Program, College of Engineering Education, University of Mindanao, Davao City, Philippines
randy_angelia@umindanao.edu.ph

Noel B. Limsangan
School of Electrical, Electronics and Computer Engineering, Mapua University, Manila, Philippines
nblinsangan@mapua.edu.ph
Abstract— In the Philippines, solid waste management is still a significant problem. Improper waste disposal causes serious health problems and environmental risks such as contamination of water systems, floods, ground and air pollution, and diseases. Unfortunately, most people mistakenly believe that not segregating waste is acceptable. This study aims to design and develop a Smart Waste Bin Segregation system using image processing, assisting waste segregation through waste identification and sorting built on machine learning capable of navigating the one-time path set by the user. In particular, it creates an intelligent waste bin segregation prototype using image processing with three classifications: biodegradable, non-biodegradable, and unknown, intended to segregate solid waste into its respective bins, with accuracy tests conducted using appropriate statistical tools. The device is designed for school use and may also be used in other establishments if more waste classes are trained, alleviating the waste segregation problem and helping build an eco-friendlier society without compromising health and hygiene. The proponents successfully materialized the device, and function tests yielded an overall accuracy of 97.33%.

Keywords— Aluminum Can, Plastic Bottles, Plastic Spoon and Fork, Plastic Cups, White Bond Paper, Faster R-CNN, Object Detection, Push-Pull Type Solenoid

I. INTRODUCTION

Appropriate waste management is among the country's leading problems nowadays. The considerable quantity of waste produced is disposed of through methods that adversely affect the environment due to improper waste management. About 35,580 tons of trash are generated each day in the Philippines, with rural and urban regions producing an average of 0.3 to 0.5 kg of waste [1]. In Davao City, the City Environmental and Natural Resources Office (CENRO) reported in 2018 an average waste generation of 1,012 tons daily. In 2016, the city's dumpsite reached its highest capacity of 900,000 tons. Severe health issues and environmental degradation arise from improper waste disposal due to the incorrect assumption of most people who do not segregate waste [2]. Solid waste management in the Philippines under Republic Act 9003 is implemented in public places such as schools. Schools generate mostly plastic wastes and single-use containers, making them an ideal place to conduct the study [3]. Hence, this study aims to develop a waste separation system for public and private areas to achieve appropriate waste disposal.

Sorting at the primary level is an integral part before disposing of the waste into landfills. More workforce and time are needed to sort waste the conventional way. Different methods and forms may be used to sort waste. An example is a Mobile Robot Delivery System that collects trash bins and garbage sorted out by an Automated Waste Sorter (AWS). In the AWS, a set of sensors controls the process together with a conveyor belt that corresponds to each waste material [4-7]. In other studies, collected waste is monitored using sensors in the bins, which signal the microcontroller when a bin is full of garbage. The garbage collector is notified via short message service (SMS) using Global System for Mobile Communications (GSM) technology. In another study, a system grants the user of the bin reward points for proper segregation of waste, and the quantity of the trash can be checked on a website, which also shows the reward points gained by the user [8]. Several sensors were used to classify waste, determining the difference between wet, dry, metallic, and other wastes for automated waste disposal. In that study, infrared, metal, ultrasonic, and proximity sensors were used to detect waste. The object is placed on the conveyor, and the motor driver controls the motor according to the output of the inductive sensor and
Authorized licensed use limited to: PES University Bengaluru. Downloaded on October 14,2023 at 08:09:37 UTC from IEEE Xplore. Restrictions apply.
A transistor (TIP120) is used as a bridge and acts as a switch for the solenoids. The TIP120 allows switching from a small voltage to a large one. The diode (1N4007) connected to the solenoid keeps the flow of current unidirectional, directing the current back to the solenoid until it dies out. Thus, the Arduino acts as a server that receives and sends data, while the computer acts as a client that sends requests and receives data.

Fig. 3 Block Diagram of Operating the Solenoid

The computer used had a Ryzen 5 3600 central processing unit, an Nvidia RTX 2060 graphics processing unit, an Aorus B450-M motherboard, 16 GB of RAM at 3200 MHz, a 1 TB hard drive for storage, a 250 GB solid-state drive, and a 650 W power supply. The image processing of the system utilized Faster R-CNN.

C. Methods and Procedures

The project gathered aluminum cans, plastic bottles, plastic cups, white bond papers, and plastic spoons and forks. The proponents chose these wastes since they are commonly found around a school campus. Next, multiple photographs of each sample were taken inside a box with a green background. Fig. 4 shows examples of the waste images. One thousand pictures were taken of the aluminum cans, and 695 each of the plastic bottles, plastic cups, white bond paper, and spoons and forks.

The images of the samples were saved in a single folder and used for image training. The application used to generate the code for image training was PyCharm (Python), which uses Faster R-CNN.

Microsoft created the Faster R-CNN model. It is a single, end-to-end, unified, deep convolutional network that predicts the positions of distinct objects. The first object detection model of this family is the region-based CNN (R-CNN), followed by Fast R-CNN and then Faster R-CNN. Traditionally, there are three significant steps in object detection. The first is creating region proposals, which are locations that possibly contain objects. Image descriptors are then used to extract fixed-length feature vectors that should sufficiently describe the object despite changes such as scale or translation. Finally, the region proposals are assigned to object classes based on the extracted feature vectors. Faster R-CNN utilizes a fully convolutional network called the Region Proposal Network (RPN), which generates region proposals and informs the detector of the object's location. In addition, the RPN implements attention in neural networks. Like the traditional object detection method, Faster R-CNN follows the three significant steps, but with additional features: the RPN generates region proposals, a feature vector is extracted from each proposal using the Region of Interest (RoI) pooling layer, the feature vectors are classified, and the detected objects with their bounding boxes and class scores are returned, as seen in [19, Fig. 5].
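The RPN's region proposals start from a fixed set of anchor boxes laid over each feature-map cell. As a minimal sketch of how such a set can be enumerated, the following generates anchors from three scales and three aspect ratios; the base size and scale values here are illustrative defaults, not parameters taken from this study.

```python
import numpy as np

def generate_anchors(base_size=16, scales=(8, 16, 32), ratios=(0.5, 1.0, 2.0)):
    """Generate anchor boxes (x1, y1, x2, y2) centered on one feature-map cell.

    Three scales x three aspect ratios yields nine anchors, mirroring the
    RPN design in Faster R-CNN. Sizes are illustrative, not the study's.
    """
    anchors = []
    for scale in scales:
        for ratio in ratios:
            # Keep the anchor area fixed per scale while varying its shape.
            area = (base_size * scale) ** 2
            w = np.sqrt(area / ratio)  # ratio is defined here as h / w
            h = w * ratio
            anchors.append([-w / 2, -h / 2, w / 2, h / 2])
    return np.array(anchors)

anchors = generate_anchors()
print(anchors.shape)  # (9, 4)
```

At inference time, each of these nine anchors is slid across every position of the convolutional feature map, and the RPN scores and refines them into the actual region proposals.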
Anchors of three scales and three aspect ratios are used, for a total of nine anchor boxes. These anchors are present at different scales and share features between the RPN and the Fast R-CNN detection network. From each region proposal, a feature vector with a length of 256 for the ZF net and 512 for the VGG-16 net is extracted and fed to two fully connected (FC) layers. The first FC layer, "cls," produces the objectness score for a given proposal, and the second, "reg," returns a vector delineating the region's bounding box.

The training loss begins high and gradually becomes lower as the training progresses, as shown in the loss graph in Fig. 7. The training of the classifier was successfully done. PyFirmata is a protocol that permits serial communication between an Arduino Uno and a Python script running on any computer, allowing any pin on the Arduino to be read and written. PyFirmata was used for the computer and Arduino to control all the pins on the Arduino Uno board. After the training, tests were run and the results were recorded. Fig. 8 shows images of running the application for testing.
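A minimal sketch of driving a solenoid through PyFirmata could look like the following. The label-to-pin mapping, the solenoid labels, and the serial port are assumptions for illustration; the paper does not specify which Arduino pins the push-pull solenoids are wired to.

```python
import time

# Hypothetical label-to-pin mapping; the actual wiring used by the
# prototype is not given in the paper.
SOLENOID_PINS = {
    "aluminum_can": 8,
    "plastic_bottle": 9,
    "plastic_cup": 10,
    "white_bond_paper": 11,
    "spoon_and_fork": 12,
}

def solenoid_pin(label):
    """Return the Arduino digital pin for a classified waste label,
    or None for the 'unknown' class (no bin is opened)."""
    return SOLENOID_PINS.get(label)

if __name__ == "__main__":
    # Hardware-dependent part: requires an Arduino Uno running Firmata.
    from pyfirmata import Arduino  # pip install pyfirmata

    board = Arduino("/dev/ttyACM0")  # serial port is an assumption
    pin = solenoid_pin("plastic_cup")
    if pin is not None:
        board.digital[pin].write(1)  # energize solenoid to open the lid
        time.sleep(1)
        board.digital[pin].write(0)  # release
    board.exit()
```

The pure mapping function is kept separate from the hardware calls so the classification side can be exercised without a board attached.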
III. RESULTS AND DISCUSSION

Fig. 10 shows the actual design, or physical structure, of the Smart Waste Bin Segregation. It is composed of a twenty-four-inch monitor and a hollow cube with dimensions of 1.67 ft x 1.63 ft x 0.67 ft. Inside the cube is a 1080p camera with autofocus and LED lights above. The surface of the cube is covered with a green cloth. Each bin has LED lights on top of the lid and a push-pull type solenoid. The system unit is found inside the waste bin.

As shown in Fig. 11, the confusion matrix displays the correct and incorrect predictions for the classification labels. The accuracy of the results shows how reliable the system is in predicting the five different classifications of waste and one unknown variety. As observed in the confusion matrix in Fig. 11, seven unknown samples and a plastic cup were misclassified as aluminum cans. This wrong classification was attributable to the likeness of the images to each other; the similarities are the shapes and colors of the objects in the photos. This conveys that the object inside the bounding box regression is not part of the background. The percentage is computed by multiplying the result by 100%. In the training process, the training loss and accuracy were evaluated to ensure the reliability of the results. The training loss, as shown in Fig. 7, gradually decreases over time.
Accuracy = (Number of Correctly Classified Samples / Total Number of Samples) × 100%    (1)
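The accuracy in (1) can be read directly off a confusion matrix, since its diagonal holds the correctly classified samples. A small sketch, using a toy 3-class matrix rather than the study's actual data:

```python
import numpy as np

def accuracy_percent(confusion):
    """Accuracy = correctly classified samples / total samples x 100%.

    The diagonal of the confusion matrix holds the correct predictions."""
    confusion = np.asarray(confusion)
    return np.trace(confusion) / confusion.sum() * 100.0

# Toy 3-class example (not the study's data): 73 of 75 samples correct.
cm = [[25, 0, 0],
      [1, 24, 0],
      [0, 1, 24]]
print(round(accuracy_percent(cm), 2))  # 97.33
```

Applying the same computation to the matrix in Fig. 11 yields the overall accuracy reported for the system.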
Fig. 11 Confusion Matrix Results

REFERENCES

[1] Alicia L. Castillo et al., "Status of Solid Waste Management in the Philippines," International Resources, University of Kitakyushu, Japan, 2013.
[2] K. Cortez, "Davao City's Garbage is Fast Piling Up – Group," DavaoToday, June 2019.
[3] V. R. K. Galarpe, "Review on the Impacts of Waste Disposal Sites in the Philippines," Science International, March 2017.
[4] Md. Safiqul et al., "Solid Waste Bin Detection and Classification using Dynamic Time Warping and MLP Classifier," Elsevier, Nov. 2013.
[5] M. K. Pushpa et al., "Microcontroller based Automatic Waste Segregator," International Journal of Innovative Research in Electrical, Electronics, Instrumentation and Control Engineering, vol. 3, issue 5, May 2015.
[6] F. G. Ang et al., "Automated Waste Sorter with Mobile Robot Delivery Waste System," Department of Electronics and Communications Engineering, De La Salle University, Philippines.
[7] V. J. Aleena, "Automatic Waste Segregator and Monitoring System," Journal of Microcontroller Engineering and Applications, vol. 3, issue 2, June 2017.
[8] N. Karuppiah, "Wastage Pay Waste Bin," International Journal of Engineering & Technology, 2018.
[9] Mahmudul Hasan Russel, "Development of Automatic Smart Waste Sorter Machine."
[10] J. Sanjai, "Automated Domestic Waste Segregator using Image Processing," International Research Journal of Engineering & Technology, vol. 6, issue 04, April 2019.
[11] R. S. Sandhya Devi, Vijaykumar V. R., and M. Muthumeena, "Waste Segregation using Deep Learning Algorithm," International Journal of Innovative Technology and Exploring Engineering, vol. 8, issue 02, December 2018.
[12] Saravana kannan G., Sasi kumar S., Ragavan R., and Balakrishnan M., "Automatic Garbage Separation Robot Using Image Processing Technique," International Journal of Scientific and Research Publications, vol. 6, issue 4, April 2016.
[13] Rani Rokade, "Smart Garbage Separation Robot with Image Processing Technique," International Journal of Engineering Research & Technology, Special Issue, 2018.
[14] V. Mangaiyarkarasi, "Automated Medical Waste Segregation Machine Using Arduino Controller," International Journal of Advanced Research Trends in Engineering and Technology, vol. 5, special issue 5, March 2018.
[15] Kesthara V., "Sensor Based Smart Dustbin for Waste Segregation and Status Alert," International Journal of Latest Technology in Engineering, Management & Applied Science, vol. VII, issue IV, April 2018.
[16] Minal Patil, "Implementation of Automated Waste Segregator at Household Level," International Journal of Innovative Research in Science, Engineering and Technology, vol. 6, issue 4, April 2017.
[17] Syeda Madiha Samreen, "Automatic Metal, Glass and Plastic Waste Sorter," International Journal for Research in Applied Science & Engineering Technology, vol. 5, issue VI, June 2017.
[18] Andres Torres-Garcia, Oscar Rodea-Oragon, Omar Longaria-Gandara, Francisco Sanchez-Garcia, and Luiz Enrique Gonzales-Jimenez, "Intelligent Waste Separator," Computación y Sistemas, vol. 19, no. 3, 2015.
[19] S. Ren, K. He, R. Girshick, and J. Sun, "Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 39, no. 6, pp. 1137-1149, June 2017, doi: 10.1109/TPAMI.2016.2577031.