
Cutting Blade Measurement of an Ultrasonic Cutting
Machine Using Multi-Step Detection
Tae-Gu Kim 1, Sheikh Faisal Ahmad 2, Byoung-Ju Yun 1,* and Hyun Deok Kim 1,*
1 School of Electronics Engineering, IT College, Kyungpook National University, 80, Daehak-ro, Buk-gu,
Daegu 41566, Korea
2 Institute of Advanced Convergence Technology, Kyungpook National University, Daegu 41566, Korea
* Correspondence: bjisyun@ee.knu.ac.kr (B.-J.Y.); hdkim@knu.ac.kr (H.D.K.); Tel.: +82-53-950-7329 (B.-J.Y.)

Received: 5 July 2019; Accepted: 10 August 2019; Published: 14 August 2019 

Abstract: An algorithm for the measurement of the cutting blade of an ultrasonic vibration cutting
machine used in an automation machine is proposed in this paper. The proposed algorithm, which is
based upon a multi-step detection method, is developed for the accurate measurement of the cutting
blade by exactly determining its rotation angle, length, and thickness. Instead of the commonly used
Otsu method, we propose a new curvature-based adaptive binarization method, which provides
more accurate details about the dimensions of the cutting blade. In the multi-step detection method,
a region of interest containing the cutting blade is first extracted from the acquired image and then
further processed to remove the noise, which increases the measurement reliability. An important
feature of the proposed process is the restoration of the cutting blade's tip data, which is lost during
the fine noise-filtering process. The rotation angle and length are measured using the minimum
rotated rectangle, while line fitting based upon the least square method is applied to increase the
reliability of the thickness measurement. Experimental results validate the superiority of the proposed
method over the conventional Otsu method.

Keywords: cutting blade detection; dimension measurement; adaptive binarization; multi-step region
of interest (ROI) detection; object detection

1. Introduction
With the development of ultrasonic cutting technology, ultrasonic vibration cutting machines are
used for the precise cutting of fabrics, rubber, thermoplastic foil, non-woven fabric, etc. In particular,
cutting machines using ultrasonic vibration are capable of cutting very precisely without exerting any
pressure on the objects, thereby avoiding any surface deformation. Ultrasonic vibration cutting also
features very high cutting speeds and reduced cleaning costs, which have increased its demand for
use in automation machines [1]. However, when the ultrasonic vibration cutting machine is used in
connection with an automatic machine (such as a robot arm), it is difficult to recognize the replacement
cycle or the degree of wear and tear of the cutting blade [2]. This results in inaccurate cutting of the
object and material damage. For this reason, the development of an automated inspection system
based upon machine vision is actively underway [3,4]. The machine vision system acquires images
using a camera and then quantitatively judges [5] the degree of abrasion and warping/breakage of the
cutting blade. In order to improve productivity and reliability, machine vision inspection equipment
is commonly used in actual practice. Recently, machine vision has been actively deployed in research
for the development of measurement systems for different applications. Measurement of the
dimensions of an O-ring [5], detecting defects in a thin film transistor liquid crystal display
(TFT-LCD) [6], measurement of refractory bricks [7], and measuring the size of an automobile brake
pad [8] are a few areas that have benefited from machine vision.

Appl. Sci. 2019, 9, 3338; doi:10.3390/app9163338 www.mdpi.com/journal/applsci


In general, various methods, like brightness information, shape information, pattern information,
etc., are used for object recognition in the field of image processing. Among these methods of object
detection, template matching and feature point detection are the techniques which have gained more
attraction [5]. However, template matching detects an object by determining the similarity of the image
with a reference image that is not adaptive to the change of the reference image, which consequently
deteriorates the results. On the other hand, the feature extraction method uses the shape and color
information of the object or the blob information of the object based on the binarization. The detection
method using the shape and color information of the object is sensitive to the change of color depending
on the illumination. If the shape of the object is not specified, it is difficult to define the feature of the
shape. In the object detection method based on blob detection, it is difficult to find the target object
when there are objects other than the object to be detected [9,10].

In this paper, we propose a machine vision-based image processing algorithm that measures the
thickness and length of the cutting blade to detect its bending and fracture, and also measures the
rotation angle of the cutting blade in order to check the operation of the robot-arm. The region of
interest (ROI) is extracted from the cutting blade area of the original image. The binarization based on
the curvature of the extracted image is performed by the proposed method. The background region
is extracted and removed from the binarization image, which is followed by the removal of various
noises included in the extracted image. In order to recover the information of the cutting blade's tip,
which was lost in the fine noise removal process, only the tip of the cutting blade is extracted again
from the background-removed image by assigning a new ROI only for the tip of the cutting blade.
The thickness, length, and rotation angle of the cutting blade are then measured for the restored image.

This paper is divided into two main sections. The first section contains information about the
"cutting blade measurement system" while the second section is dedicated to the proposed
"curvature-based binarization method", which is further followed by the experimental results
and discussion.
2. Cutting Blade Measurement System

2.1. System Configuration
The structure of the cutting blade measurement system used in the experiment is shown in
Figure 1a while the cutting blade used in the actual experiment is shown in Figure 1b. The cutting blade
measurement system is composed of a vision system, vision camera, back light unit (illumination), and
an ultrasonic cutting device carrying robot arm. The position of interest where the measurement of the
cutting blade has to be performed is in between the vision camera and the back light unit as shown
in Figure 1a. The camera and back light unit are placed in such a way that both face the measuring
position. When the cutting blade reaches the measurement position through the robot arm, the image
is acquired through the installed vision camera. The obtained image is then transmitted to the vision
system for further processing.

Figure 1. (a) Cutting blade measurement system; (b) cutting blade.

2.2. Detection Algorithm
The flowchart of the proposed algorithm for the detection of the cutting blade of the ultrasonic
cutting machine is shown in Figure 2.

Figure 2. Flowchart for the cutting blade detection algorithm. ROI: region of interest.

The first step of the algorithm is the extraction of the ROI (which contains the cutting blade)
from the original image, which was acquired using the vision camera. The second step is the detection
of the peak curvature of the gray values, which is used as the binarization standard and is deployed to
model the unwanted areas in the ROI as the background image; these are further removed by using
the morphological filter. This background-removed image contains various fine noises, which need
to be removed using noise filtering for the accurate detection of the cutting blade. However, in this
fine noise removal process, the data which contains the information about the tip of the cutting
blade is also removed. The third step is to define a new ROI in the binary images before and after
noise removal. This new ROI only encloses the area surrounding the tip of the cutting blade, which
is then restored by comparing these two images. Finally, the rotation angle and the length of the
cutting blade are measured by using the bounding box that is in contact with the outer line. In
addition, the thickness of the cutting blade is also measured from the detected image by using the
line fitting technique based on the least square method.
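As a concrete illustration of the final measurement step, the sketch below estimates a blade's rotation angle and length from a binary mask. The principal-axis approach is a simplified stand-in for the minimum-rotated-rectangle measurement described above, and `blade_angle_and_length` is a hypothetical helper, not the authors' implementation.

```python
import numpy as np

def blade_angle_and_length(mask):
    """Estimate rotation angle (degrees) and length (pixels) of a
    blade-like blob. The principal axis of the foreground pixels gives
    the orientation, and the extent of the pixels projected onto that
    axis gives the length."""
    ys, xs = np.nonzero(mask)
    pts = np.column_stack([xs, ys]).astype(float)
    pts -= pts.mean(axis=0)
    # major axis = eigenvector of the coordinate covariance matrix
    # belonging to the largest eigenvalue
    eigvals, eigvecs = np.linalg.eigh(np.cov(pts.T))
    axis = eigvecs[:, np.argmax(eigvals)]
    angle = np.degrees(np.arctan2(axis[1], axis[0])) % 180.0
    proj = pts @ axis
    return angle, proj.max() - proj.min()

# synthetic check: a horizontal bar 100 px long and 5 px thick
mask = np.zeros((50, 200), dtype=bool)
mask[20:25, 40:140] = True
angle, length = blade_angle_and_length(mask)
```

For a real blade image, the mask would come from the curvature-based binarization and noise-removal steps of this section.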

2.3. ROI Detection
The original image of the cutting blade contains various information which makes the accurate
detection of the cutting blade rather difficult. The problem is addressed by defining a specific ROI
which encloses only the cutting blade. The method of extracting the ROI is shown in Figure 3.



Figure 3. (a) Original image of the cutting blade; (b) binary image; (c) ROI image.

First, the S region of the original image is detected and then the pixel margin is added to extract
the R region, which is the required ROI as shown in Figure 3a. In order to detect the S region, the
binary image is obtained at the 128 gray value of the original image as shown in Figure 3b. By using
the binary value of 128, it is possible to quickly grasp the position of the cutting blade. However,

accurate measurement is not possible at this value. Using the line profile, the edge boundary of the
cutting blade where the value changes from 255 to 0 is detected and set as the starting point of the
cutting blade. Additionally, the boundary is defined as the end point where the gray value changes
from 0 to 255 while having the expected cutting-edge thickness.
Then, the average of the x-coordinates of the detected boundaries is calculated to detect the
region of interest, S. The region of interest, R, as shown in Figure 3c, is extracted from the original
image of Figure 3a by adding the pixel margin to the detected region, S.
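The line-profile search just described can be sketched as follows; `detect_s_region` and its margin handling are illustrative assumptions, not the authors' code.

```python
import numpy as np

def detect_s_region(gray, row, margin=10):
    """Threshold at gray value 128, walk one horizontal line profile,
    take the 255 -> 0 transition as the blade's starting point and the
    0 -> 255 transition as its end point; the pixel margin then widens
    the detected span S into the ROI, R."""
    profile = np.where(gray[row] >= 128, 255, 0)
    # transition indices on this line profile
    falls = np.nonzero((profile[:-1] == 255) & (profile[1:] == 0))[0] + 1
    rises = np.nonzero((profile[:-1] == 0) & (profile[1:] == 255))[0] + 1
    start, end = falls[0], rises[0]  # first dark run on this line
    return start, end, (max(start - margin, 0), min(end + margin, gray.shape[1]))

# synthetic backlit frame: bright background, dark blade in columns 60..79
gray = np.full((40, 120), 255, dtype=np.uint8)
gray[:, 60:80] = 0
start, end, (lo, hi) = detect_s_region(gray, row=20)
```

In the paper the transition x-coordinates are additionally averaged over several rows before the margin is applied; a single row is used here for brevity.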

3. Curvature-Based Binarization Method

3.1. Image Binarization


The gray image needs to be converted into the binary image for accurate determination of the
thickness and length of the cutting blade. The generally used Otsu method extracts the objects
with similar brightness values using the image histogram by finding a value that maximizes the
between-class variance [11]. This method works well under the assumption of a bimodal image
histogram with uniform illumination; however, it is limited by factors such as a small object size
and the presence of a large amount of noise requiring image restoration [12]. When the designated
image is composed of L gray-level pixels, where the number of pixels whose brightness value is i,
i ∈ {1, . . . , L} is f_i, the total number of pixels in the image is N, and the probability p_i for
such a pixel can be obtained as follows:
N = ∑_{i=1}^{L} f_i,  (1)

p_i = f_i / N.  (2)
On the basis of the pixels’ gray levels, the image can be divided into two classes, C0 and C1 , where
the pixels with gray levels [1, . . . , k] are in the class C0 and the pixels with gray levels [k + 1, . . . , L]
are in class C1 . If the probability distribution of each class is called ω0 and ω1 , respectively, while the
overall average of the original image is µT , the threshold value of Otsu can be obtained by finding k∗ ,
which maximizes the between-class variance, σ²_B(k), using Equations (3) and (4).
The ROIs of the captured image and the binary image by the Otsu method are shown in Figure 4a,b,
respectively. A comparison between the gray levels of the input image and that obtained from the
Otsu method along a smaller line segment at the same position in the ROI are provided in Figure 4c,
which shows that due to the specific limits of binarization standard and diffraction of light, the Otsu
method is unable to detect the exact thickness of the cutting blade:

σ²_B(k) = [µ_T ω(k) − µ(k)]² / (ω(k)[1 − ω(k)]),  (3)

σ²_B(k*) = max_{1≤k≤L} σ²_B(k).  (4)
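Equations (1)–(4) translate directly into a histogram-based threshold search. The sketch below is a straightforward transcription (with a guard against empty classes), not the paper's code.

```python
import numpy as np

def otsu_threshold(pixels, levels=256):
    """Pick k* that maximizes the between-class variance of Eq. (3)."""
    f = np.bincount(np.asarray(pixels).ravel(), minlength=levels).astype(float)
    p = f / f.sum()                        # Eq. (2): p_i = f_i / N
    omega = np.cumsum(p)                   # class probability omega(k)
    mu = np.cumsum(np.arange(levels) * p)  # first-order cumulative moment mu(k)
    mu_T = mu[-1]                          # overall mean of the image
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_T * omega - mu) ** 2 / (omega * (1.0 - omega))  # Eq. (3)
    return int(np.argmax(np.nan_to_num(sigma_b)))                     # Eq. (4)

# bimodal test image: dark blade pixels near 30, bright backlight near 220
rng = np.random.default_rng(0)
img = np.concatenate([rng.integers(20, 41, 2000), rng.integers(210, 231, 8000)])
k_star = otsu_threshold(img)
```

On such a cleanly bimodal histogram the maximizer lies between the two modes, which is exactly the situation in which, as the text notes, the Otsu assumption holds.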

It is hard to divide the target accurately by defining the standard of binarization based upon
the brightness value by using the Otsu method, which can be seen in the previous example.
This phenomenon occurs due to the diffraction of light. The problem can, however, be addressed if
the high curvature value position in the gray-level profile of the pixels is defined as the binarization
reference. In this paper, we propose the curvature-based binarization method for accurate detection of
the cutting blade used in the ultrasonic vibration cutting machine. The curvature value is calculated
from the horizontal line profile in the ROI of the gray image. The first order and the second order
derivatives of the gray value profile at the given reference position for the given pixel are expressed
as Equations (5) and (6), where P(n) is the gray level at the n-th position, P′(n) is the first derivative,
which represents the variation of the gray level, w is the increment along the x axis, and P″(n) is the
second order derivative of P(n):

P′(n) = [P(n + t) − P(n − t)] / w,  (5)

P″(n) = (P′(n))′.  (6)

The curvature, k(n), can be expressed in terms of P′(n) and P″(n) as given in Equation (7):

k(n) = P″(n) / [P′(n)² + 1]^{3/2}.  (7)
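Equations (5)–(7) can be applied to a one-dimensional gray-value line profile as below; the sigmoid test edge and the parameters t = 1, w = 2 are illustrative assumptions, not values from the paper.

```python
import numpy as np

def profile_curvature(profile, t=1, w=2.0):
    """Curvature k(n) of a gray-value line profile via Eqs. (5)-(7).
    np.roll wraps around at the ends, so the first/last few samples
    should be ignored by the caller."""
    p = np.asarray(profile, dtype=float)
    p1 = (np.roll(p, -t) - np.roll(p, t)) / w       # Eq. (5): P'(n)
    p2 = (np.roll(p1, -t) - np.roll(p1, t)) / w     # Eq. (6): P''(n)
    return p2 / (p1 ** 2 + 1.0) ** 1.5              # Eq. (7)

# smooth dark-to-bright edge; |k(n)| peaks at the two edge shoulders
x = np.arange(100)
profile = 255.0 / (1.0 + np.exp(-(x - 50) / 3.0))
k = profile_curvature(profile)
shoulders = np.sort(np.argsort(np.abs(k[5:-5]))[-2:] + 5)
```

The two extreme-curvature positions straddle the edge center, which is what makes them a less illumination-sensitive binarization reference than a fixed brightness threshold.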
Figure 4. (a) ROI of the acquired image; (b) ROI of the Otsu method-based binary image;
(c) comparison of the gray values of two images along the same line profile.


The result of the cutting blade detected by using the curvature-based image binarization method
is given in Figure 5. Figure 5a shows the ROI image of the cutting blade obtained in the previous
section. The line profile of the image of Figure 5a and the curvature value obtained using Equation
(7) and its feature points using the extreme value of curvature are shown in Figure 5b,c, respectively.
The binary image obtained using this method and a comparison of its gray value profile with the
original image are shown in Figure 5d,e, respectively.

Figure 5. (a) Acquired image; (b) gray values line profile; (c) curvature values line profile;
(d) proposed binary image; (e) comparison of the gray values of the acquired and binary images
along the defined line profile.

The original image of the cutting blade with the enlarged view of its tiny segment along a
reference line is shown in Figure 6a. It can clearly be seen that the image is eroded due to the influence
of the backlight illumination. The binary image obtained by the Otsu method with the enlarged view
of its tiny segment along a reference line is shown in Figure 6b. It can clearly be seen that the portion
of the cutting blade which had been eroded by illumination in the binarization process was removed,
which results in the reduced thickness of the cutting blade, and leads to the inaccurate results. The
binary image obtained by the method proposed in this paper with the enlarged view of its tiny
segment along a reference line is shown in Figure 6c. It can clearly be seen that the portion of the
cutting blade which had been eroded by illumination at the time of binarization was included, which
provides an accurate thickness of the cutting blade. The comparison provided in Figure 6 verifies the
accuracy of the proposed method in comparison to the Otsu method.

Figure 6. Images of the tiny segments of the cutting blade for thickness comparison for (a) the
acquired; (b) Otsu method; and (c) proposed method.

3.2.
3.2.Background
BackgroundExtraction
Extraction
InInthethebinary image,
binary image,the accurate extraction
the accurate extractionof the of cutting blade area
the cutting bladeis difficult
area is due to thedue
difficult existence
to the
ofexistence
fine noiseofaroundfine noise around the cutting blade and the mounting portion, which is used toblade.
the cutting blade and the mounting portion, which is used to fix the cutting fix the
Therefore,
cutting blade. for accurate
Therefore,detection of the cutting
for accurate detection blade, of the
themounted
cutting blade, part isthe
alsomounted
treated aspart
the is
background,
also treated
which is first extracted and then removed by using the morphological
as the background, which is first extracted and then removed by using the morphological filter. filter. The morphological filter
The
includes erosion, dilation, opening, and closing computations. The
morphological filter includes erosion, dilation, opening, and closing computations. The opening and opening and closing operations,
γ̃( f ) and φ̃( f ), can be given by using the Equation (8):
( ) ( )
closing operations, γ f and φ f , can be given by using the Equation (8):
(a) γ̃( f )((b)x) = δ(ε( f ))(x) (c)
γ (( ff)()(xx) )== εδ((δε( f())(
φ̃ f ) x)() x ) (8)
Figure 6. Images of the tiny segments of the cutting blade for ,
thickness comparison for (a) the (8)
φ ( f )( x ) = ε ( δ ( f ) )( x )
acquired;
where ε((b) f ) Otsu
and δmethod; and (c) proposed
( f ) represent erosion andmethod. dilation, respectively [13]. In this paper, the background
region is extracted
3.2. Background where
( )
ε f by usingδ the
Extraction
( )
f closing operation.
The cutting bladeand image obtainedrepresent
usingerosionthe proposedand dilation,
binarization respectively
method [13]. In this
is shown paper, 7a.
in Figure the
In background
Thethebackground region
binary image, image is extracted
theextracted by using
accuratebyextraction the
using theof closing
closing operation.
operation
the cutting blade is shown
area isindifficult
Figure 7b.dueThis image is
to the
also of
existence composed
fine noise of around
the mounted portionblade
the cutting of theand cutting blade. Theportion,
the mounting image ofwhichthe cutting
is usedblade obtained
to fix the by
cutting blade. Therefore, for accurate detection of the cutting blade, the mounted part is also treated
as the background, which is first extracted and then removed by using the morphological filter. The
morphological filter includes erosion, dilation, opening, and closing computations. The opening and
closing operations, γ f ( ) ( )
and φ f , can be given by using the Equation (8):
The background image extracted by using the closing operation is shown in Figure 7b. This image is also composed of the mounted portion of the cutting blade. The image of the cutting blade obtained by removing the background shown in Figure 7b from the image shown in Figure 7a is given in Figure 7c. Various fine noises exist in this image, which can lead to inaccurate measurement. An enlarged portion of the cutting blade near the mounted portion, which contains such fine noise, is shown in Figure 7d.

(a) (b) (c) (d)

Figure 7. Cutting blade detection using the morphological filter. (a) Binary image; (b) background region modeling; (c) background removed image; (d) enlarged view of the image with fine noise.
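As a minimal sketch of these operations (pure NumPy, with a toy binary image and a 3 × 3 structuring element chosen for illustration; this is not the authors' implementation), opening and closing are composed from erosion and dilation as in Equation (8), and subtracting the morphologically modeled background leaves only the thin blade, as in Figure 7c. Note that in this toy example the bulky mounted portion is modeled with an opening; which operation best models the background depends on the image polarity and structuring-element size.

```python
import numpy as np

def erode(f, k=3):
    """Binary erosion of f by a k x k square structuring element (zero padding)."""
    p = k // 2
    fp = np.pad(f, p, constant_values=0)
    win = np.lib.stride_tricks.sliding_window_view(fp, (k, k))
    return win.min(axis=(2, 3))

def dilate(f, k=3):
    """Binary dilation of f by a k x k square structuring element (zero padding)."""
    p = k // 2
    fp = np.pad(f, p, constant_values=0)
    win = np.lib.stride_tricks.sliding_window_view(fp, (k, k))
    return win.max(axis=(2, 3))

def opening(f, k=3):   # Equation (8): gamma(f) = dilation of the erosion of f
    return dilate(erode(f, k), k)

def closing(f, k=3):   # Equation (8): phi(f) = erosion of the dilation of f
    return erode(dilate(f, k), k)

# Toy binary image: a bulky "mounted portion" plus a one-pixel-wide "blade".
img = np.zeros((13, 13), np.uint8)
img[7:11, 2:10] = 1   # mounted portion (treated as background)
img[2:7, 6] = 1       # thin cutting blade

bg = opening(img)        # thin blade eroded away, bulky region survives
blade = img & (1 - bg)   # background-removed image (cf. Figure 7c)
```

With OpenCV, the equivalent steps are `cv2.morphologyEx(img, cv2.MORPH_OPEN, kernel)` (or `cv2.MORPH_CLOSE`) followed by an image subtraction.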
In order to remove these noises, the proposed method used the opening operation. The cutting blade image after removing the noise using the opening operation is shown in Figure 8a. Usually, the regions of the top and bottom parts (near the mounted portion and near the tip) of the cutting blade are more immune to noise. The enlarged views of the regions, N1 and N2, which enclose the mounted portion and the tip of the cutting blade, are shown in Figure 8b,c, respectively. However, in this noise removal process, some data is lost in the N2 region, which results in inaccurate information about the tip of the cutting blade. Some measures are needed to restore the lost tip data in order to secure the accurate cutting blade image.

(a) (b) (c)

Figure 8. (a) Fine noise removed image of the cutting blade; (b) enlarged view of the mounted portion; (c) enlarged view of the tip showing a loss of data.

3.3. Restoration of Lost Data

In this paper, we proposed a new technique that restores the data of the cutting blade's tip which was lost during the fine noise filtering by comparing the region, N2, in two images. For this purpose, a similar ROI is defined in the two images (the image before data loss, Figure 7c, and the image after data loss, Figure 8a); a comparison of these two ROIs provides the lost tip data, which is then restored. This process of lost tip data restoration is shown in Figure 9.

The ROI in the noise-removed image with the lost tip data is shown in Figure 9a. Two straight lines, a1a2 and b1b2, are drawn, which circumscribe the cutting blade portion in the defined ROI as shown in Figure 9b. By knowing the coordinates of the points a1(x1, y1) and a2(x2, y2), the line can be further extended by using Equation (9). The same treatment is done for the points, b1 and b2, and both lines are extended to the point c where they meet each other. The coordinates of the point c can be determined in terms of the points, a1, a2, b1 and b2, by using Equation (10):

y − y1 = ((y2 − y1)/(x2 − x1)) (x − x1), (9)

c = ( −(a2 − b2)/(a1 − b1), −a1 (a2 − b2)/(a1 − b1) + a2 ), (10)

where, in Equation (10), the two extended lines are written in the slope-intercept forms y = a1x + a2 and y = b1x + b2.
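As an illustrative sketch of this construction (the coordinates and helper names below are hypothetical, not values from the paper), each edge line is put in slope-intercept form as in Equation (9) and the two lines are intersected at the tip point c as in Equation (10):

```python
def line_through(p, q):
    """Slope m and intercept k of the line y = m*x + k through points p and q."""
    (x1, y1), (x2, y2) = p, q
    m = (y2 - y1) / (x2 - x1)
    return m, y1 - m * x1

def intersect(l1, l2):
    """Intersection c of y = m1*x + k1 and y = m2*x + k2 (cf. Equation (10))."""
    (m1, k1), (m2, k2) = l1, l2
    x = -(k1 - k2) / (m1 - m2)
    return x, m1 * x + k1

# Hypothetical samples on the two blade edges near the lost tip.
a1, a2 = (0.0, 0.0), (2.0, 4.0)   # left edge:  y = 2x
b1, b2 = (6.0, 0.0), (4.0, 4.0)   # right edge: y = -2x + 12
c = intersect(line_through(a1, a2), line_through(b1, b2))   # tip point c
```

With c recovered, the pixels inside the triangle a2b2c can be compared between the two images to restore the lost tip data.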
The area in the triangle a2b2c contains information about the lost data, which is to be restored by comparison of the pixels of the two images as shown in Figure 9c. The region, A, in Figure 9c shows the noise-removed image, which remains unchanged during the lost tip data restoration operation. The region, B, on the other hand, shows the restored data of the cutting blade tip in Figure 9c, which was lost during the noise removal process. The proposed technique results in accurate detection of the cutting blade without data loss.

(a) (b) (c)

Figure 9. Lost tip data restoration process. (a) ROI assignment to the morphological filtered image; (b) boundary mapping using straight lines; (c) restored tip data image.

3.4. Cutting Blade Rotation Angle and Length Measurement

The commonly used bounding box suffers from the disadvantage of not considering the rotation angle of the object, which leads to inaccurate measurement of the length and thickness of the cutting blade. The problem is addressed in this paper by using the minimum rotated rectangle method to measure the angle and length of the cutting blade.

A rotated rectangular bounding box for finding the least circumscribed rectangle is shown in Figure 10.
Figure 10. Formation of rectangle R′ by edge rotation.

The difference, ∆A, between the area of rectangle, R, and the area of rectangle, R′, can be determined by using the following method described by Freeman and Shapira [15]:

∆A = Σ_{i=1}^{4} S_i − Σ_{i=1}^{4} T_i. (11)

S_i and T_i can be given as follows:

S_i = (1/2) (q_{i,2} − q_{i+1,1} tan θ)^2 sin θ cos θ, (12)

T_i = (1/2) q_{i,1}^2 tan θ. (13)

By substituting Equations (12) and (13) in Equation (11) and simplifying:

∆A = sin^2 θ Σ_{i=1}^{4} q_{i,2} q_{i+1,1} + (1/2) sin θ cos θ ( Σ_{i=1}^{4} q_{i,1}^2 − Σ_{i=1}^{4} q_{i,2}^2 ), (14)

∆A = N_1 + N_2 ( Σ_{i=1}^{4} q_{i,1}^2 − Σ_{i=1}^{4} q_{i,2}^2 ), (15)

where N_1 and N_2 are both positive quantities. ∆A is the amount by which the area of R′ is less than that of R [15–17].
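The idea can be illustrated with a brute-force sketch (ours, not the authors' code: a production system would restrict the search to convex-hull edge orientations per Freeman and Shapira, or call an off-the-shelf routine such as OpenCV's `minAreaRect`). The smallest-area rotated rectangle directly yields the rotation angle, length, and thickness:

```python
import numpy as np

def min_rotated_rect(points, step=0.1):
    """Search rotation angles in [0, 90) deg for the minimum-area bounding
    rectangle. Returns (angle_deg, length, thickness), length >= thickness."""
    pts = np.asarray(points, dtype=float)
    best = None
    for ang in np.arange(0.0, 90.0, step):
        t = np.radians(ang)
        rot = np.array([[np.cos(t), -np.sin(t)],
                        [np.sin(t),  np.cos(t)]])
        r = pts @ rot.T              # rotate points by +ang
        w = np.ptp(r[:, 0])          # axis-aligned extent in x
        h = np.ptp(r[:, 1])          # axis-aligned extent in y
        if best is None or w * h < best[0]:
            best = (w * h, ang, max(w, h), min(w, h))
    return best[1], best[2], best[3]

# A 25 mm x 0.6 mm blade-like rectangle rotated by 10 degrees.
t = np.radians(10.0)
R = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
corners = np.array([[0, 0], [25, 0], [25, 0.6], [0, 0.6]]) @ R.T
angle, length, thickness = min_rotated_rect(corners)
```

An axis-aligned bounding box on the same corners would report an inflated thickness and a shortened length, which is exactly the failure mode of Figure 11b.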
The input image for the detection of the cutting blade area is shown in Figure 11a. A commonly
used rectangular bounding box for the detection of the cutting blade is shown in Figure 11b. The cutting
blade does not lie in the perpendicular plane; rather, it is rotated by some angle, which makes the
measurements obtained through the commonly used rectangle inaccurate and unreliable due to the
reduced length and increased thickness of the cutting blade. The measurements can be accurate only if
the rectangular bounding box is also rotated by a similar angle to that of the cutting blade. A rotating
rectangular bounding box that accurately measures the length of the cutting blade and the angle of
rotation by means of the least sized bounding box that considers the rotation angle of the cutting blade
is shown in Figure 11c.

(a) (b) (c)

Figure 11. Detection of rotated cutting blade. (a) Acquired image; (b) generalized bounding box used for image detection; (c) minimum rotated rectangle bounding box for image detection.

The thickness and length of the cutting blade can also be measured by using the minimum rotated rectangle method. However, if some fine noise is present along the edges of the cutting blade, the outermost boundary of the bounding box is unable to exclude such noise, which results in a slightly inaccurate thickness measurement. The problem is resolved by using the optimized line fitting algorithms for thickness measurement.

3.5. Cutting Blade Thickness Measurement


To minimize the impact of this fine noise, which is present along the edges of the rotated cutting blade, the thickness of the cutting blade was measured by dividing the left and right sides with respect to the center coordinates of the cutting blade and finding the optimized straight line for each of its edges, as shown in Figure 12.

Figure 12. Least square method-based line fitting image of the cutting blade for accurate thickness measurement.
The line fitting was performed by using the least square method, which is an approximation technique to solve the equations. For the n number of points with independent variables x_i and dependent variables y_i, with (1 ≤ i ≤ n), a model function, f(x, β), can be defined, where β is a vector with β = (β_0, β_1, ..., β_k); then the equation for the least square method can be given as follows:

N = Σ_i (y_i − f(x_i, β))^2. (16)

The distance, d, between a point (x_1, y_1) on the one straight line and another straight line, ax + by + c = 0, can be calculated using the following equation:

d = |a x_1 + b y_1 + c| / √(a^2 + b^2), (17)

where d corresponds to the thickness of the cutting blade.
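A compact sketch of this step (synthetic, noise-free edge samples with made-up coordinates; the real system fits detected edge pixels): each near-vertical edge is fitted as x = m·y + b in the least-squares sense of Equation (16), and Equation (17) then gives the distance between a point on one fitted edge and the other fitted edge line, i.e., the thickness.

```python
import numpy as np

# Synthetic samples along the two blade edges, parameterized as x(y)
# because the blade edges are nearly vertical. Values are illustrative.
ys = np.arange(0.0, 25.0, 1.0)
left_x  = 0.05 * ys + 10.0        # left edge:  x = 0.05*y + 10.0
right_x = 0.05 * ys + 10.6        # right edge: x = 0.05*y + 10.6

# Least-squares straight-line fits, minimizing Equation (16).
ml, bl = np.polyfit(ys, left_x, 1)
mr, br = np.polyfit(ys, right_x, 1)

# Rewrite the right edge as x - mr*y - br = 0 and apply Equation (17)
# to the point (bl, 0) on the left edge; d is the measured thickness.
a, b, c = 1.0, -mr, -br
d = abs(a * bl + b * 0.0 + c) / np.hypot(a, b)
```

Because the fit averages over all edge pixels, an isolated noisy pixel shifts the fitted line far less than it shifts the outermost boundary of a bounding box.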

3.6. Abnormalities’ Inspection in the Cutting Blade


Using the proposed algorithm, the rotation angle, thickness, and length of the cutting edge of the
ultrasonic cutting machine can be measured to identify any abnormality or the absence of the cutting
blade and robot arm movements. An alarm is triggered on the equipment for the inspector to take
action in case of an automatic equipment malfunction, which may prevent damage to the original
material or equipment. If the length or thickness of the measured cutting blade differs from the actual
value, it can determine whether the cutting blade is broken or bent. In the case of a robot arm, it
shall always be in the set position; however, if the turning angle of the cutting blade is out of range,
it is possible to determine that the robot arm is abnormal, or the cutting blade is in poor condition.
A cutting blade that was detected by using the proposed method is shown in Figure 13. This detected
image has severe left and right rotation angles as shown in Figure 13a,b, respectively. Thus, it was
confirmed that the proposed method can detect the cutting blade with large rotational angles in a
range of 0◦ to 90◦ .

(a) (b)
Figure 13. The large rotational images. (a) Left rotated; (b) right rotated.
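The decision logic described above can be summarized in a small sketch (the nominal dimensions, tolerances, and allowed angle range below are illustrative placeholders of ours, not values from the paper):

```python
# Nominal blade dimensions (mm) and illustrative tolerances.
NOMINAL_THICKNESS, NOMINAL_LENGTH = 0.6, 25.0
TOL_THICKNESS, TOL_LENGTH = 0.05, 0.5
ANGLE_RANGE = (-5.0, 5.0)   # deg; expected posture of the robot arm

def inspect(thickness, length, angle):
    """Return a list of detected abnormalities; an empty list means OK."""
    faults = []
    if (abs(thickness - NOMINAL_THICKNESS) > TOL_THICKNESS
            or abs(length - NOMINAL_LENGTH) > TOL_LENGTH):
        faults.append("blade broken or bent")
    if not ANGLE_RANGE[0] <= angle <= ANGLE_RANGE[1]:
        faults.append("robot arm abnormal or blade in poor condition")
    return faults
```

In the automation line, a non-empty result would trigger the equipment alarm for the inspector.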
4. Experimental Results and Discussion
The images of the cutting blade of the ultrasonic cutting machine which were used in this paper were acquired using a 5 MP mono camera made by Basler. The acquired images were gray images with 2448 × 2048 pixels. The images were processed by using the proposed method on a PC with an Intel Core i5 CPU and 16 GB RAM (Intel Corporation, Santa Clara, CA, USA).

A step-by-step comparison of the cutting-edge measurement algorithms of the commonly used Otsu method (O) and the proposed curvature-based binarization method (P) is given in Figure 14. The original image and the ROI image of the cutting blade are shown in Figure 14a,b, respectively. The binary images obtained by the existing Otsu method and the proposed method are shown in Figure 14c. The modeling of the mounting portion as the background area in the binary image and the detected cutting blade images after removing the mounted portion are shown in Figure 14d,e, respectively. The images after noise removal by using the morphological filter and the images after tip restoration (the tip data was lost during the noise removal process) as obtained by the two methods are shown in Figure 14f,g, respectively. The images for the detection of the length and rotation angle of the cutting blade by using the minimum rotated rectangle and the images for the detection of the thickness of the cutting blade by using the least square method-based line fitting technique for the two methods are shown in Figure 14h,i, respectively.

The performance comparison of the Otsu method and the proposed curvature-based binarization method is provided in Table 1. The comparison verifies that the results obtained by using the proposed curvature-based binarization method are more accurate and closer to the actual measurements in comparison to the Otsu method. The mean error in thickness measurement is 0.16 mm by the Otsu method while it is within 0.004 mm using the proposed method. Similarly, the mean error in length measurement is 0.3 mm by the Otsu method while it is within 0.02 mm using the proposed method.

(a) (b) (c)

(d) (e) (f)

(g) (h) (i)

Figure 14. Comparison of the cutting blade detection algorithms. (a) Acquired image; (b) extracted ROI; (c) binary images; (d) background modeling images; (e) background removed images; (f) noise filtered images; (g) ROI detection images for tip restoration; (h) minimum rotated rectangular bounding box images; (i) line fitting images.

Table 1. A performance comparison between the Otsu method and the proposed curvature-based binarization method for the cutting blade dimensions' measurement.

Original: Thickness 0.6 mm; Length 25 mm.

Real Angle   Measured Angle   Otsu Method [mm]         Proposed Method [mm]
                              Thickness   Length       Thickness   Length
0°           0°               0.4290      24.6400      0.6006      24.9550
2.20°        2.186°           0.4573      24.7071      0.6002      24.9889
3.40°        3.367°           0.4425      24.7378      0.6138      25.0209
−2.4°        −2.444°          0.4286      24.6843      0.6000      24.9585
Mean                          0.4393      24.6923      0.6036      24.9808
Standard deviation            0.00169     0.00018      0.00004     0.00094

5. Conclusions
In this paper, a machine vision-based measurement system was proposed for accurate measurement
of the rotated angle, length, and thickness of the cutting blade of an ultrasonic cutting machine.
A multi-step detection algorithm was proposed to designate a region of interest in the captured image
for the detection of the cutting blade. The binarization of the designated ROI of the image was
performed by proposing a curvature-based adaptive binarization method instead of the commonly
used Otsu method. The mounted portion of the cutting blade was also modeled as the background
area and was removed using morphological computation from the binary image for accurate detection
of the cutting blade. Then, the morphological filter was also used to remove the remaining fine noise
from the background-removed image. The tip data of the cutting blade, which was lost in the noise
removal process, was restored by defining the new ROI around the tip area of the cutting blade in the
images before and after the noise removal. Measurement error introduced due to the malfunctioning
of the robot arm was addressed by using the minimum rotated rectangle method for the measurement
of the rotated angle and length of the cutting blade while the least square line fitting method was used
to measure the thickness of the cutting blade accurately.
The measurement error in the thickness and length of the cutting blade with different rotation
angles of the proposed curvature-based binarization method was found to be considerably lower than
that of the Otsu method.

Author Contributions: T.-G.K., B.-J.Y. and H.D.K. conceived and designed the experiments; T.-G.K. performed
the experiments; T.-G.K., B.-J.Y. and H.D.K. analyzed the data; T.-G.K. and S.F.A. contributed analysis tools; T.-G.K.
and S.F.A. wrote the paper.
Acknowledgments: This research was financially supported by the industrial infrastructure program of Ministry
of Trade Industry and Energy Republic of Korea (Establishment of infra-structure for laser industry support).
This research was supported by Basic Science Research Program through the National Research Foundation of
Korea (NRF) funded by the Ministry of Education (NRF-2018R1D1A1B07040457).
Conflicts of Interest: The authors declare no conflict of interest.

References
1. Seo, J.S.; Lee, Y.J.; Kim, J.W.; Park, D.S. Design Improvement and Performance Evaluation of 20kHz Horn for
Ultrasonic Cutting. J. Korean Soc. Manuf. Process Eng. 2013, 12, 135–140.
2. Ji, S.; Yuan, J.; Wan, Y.; Zhang, X.; Bao, G.; Zhang, L.; Zhang, L. Method of monitoring wearing and breakage
states of cutting tools based on Mahalanobis distance features. J. Mater. Process. Technol. 2002, 129, 114–117.
[CrossRef]
3. Furferi, R.; Governi, L.; Puggelli, L.; Servi, M.; Volpe, Y. Machine Vision System for Counting Small Metal
Parts in Electro-Deposition Industry. Appl. Sci. 2019, 9, 2418. [CrossRef]
4. Wen, S.; Chen, Z.; Li, C. Vision-Based Surface Inspection System for Bearing Rollers Using Convolutional
Neural Networks. Appl. Sci. 2018, 8, 2565. [CrossRef]
5. Jung, Y.; Park, K.-H. O-ring Size Measurement Based on a Small Machine Vision Inspection Equipment.
J. Korea Ind. Inf. Syst. Res. 2014, 19, 41–52.
6. Lee, J.-J.; Lee, K.-H.; Chung, C.-D.; Park, K.-H.; Park, Y.-B.; Lee, B.-G. Pattern Elimination Method Based on
Perspective Transform for Defect Detection of TFT-LCD. J. Korea Multimed. Soc. 2012, 15, 784–793. [CrossRef]
7. He, J.; Shi, L.; Xiao, J.; Cheng, J.; Zhu, Y. Size Detection of Firebricks Based on Machine Vision Technology.
In Proceedings of the 2010 International Conference on Measuring Technology and Mechatronics Automation,
Changsha, China, 13–14 March 2010; pp. 394–397.
8. Xiang, R.; He, W.; Zhang, X.; Wang, D.; Shan, Y. Size measurement based on a two camera machine vision
system for the bayonets of automobile brake pads. J. Int. Meas. Confed. 2018, 122, 106–116. [CrossRef]
9. Choi, H.H.; Yun, B.J. Contrast Enhancement for the Captured Image by Estimating Local Illumination.
Opt. Rev. 2011, 18, 389–393. [CrossRef]
10. Lee, H.Y.; Kim, T.H.; Park, K.H. Target Extraction in Forward-Looking Infrared Images Using Fuzzy
Thresholding via Local Region Analysis. Opt. Rev. 2011, 18, 383–388. [CrossRef]

11. Otsu, N. A Threshold Selection Method from Gray Level Histograms. IEEE Trans. Syst. Man Cybern.
1979, 9, 62–66. [CrossRef]
12. Boccardi, S.; Carlomagno, G.M.; Simeoli, G.; Russo, P.; Meola, C. Evaluation of impact-affected areas of glass
fibre thermoplastic composites from thermographic images. Meas. Sci. Technol. 2016, 27, 075602. [CrossRef]
13. Kim, T.-H.; Kim, S.-W.; Ryu, C.-H.; Yun, B.-J.; Kim, J.-H.; Choi, B.-J.; Park, K.-H. ECG Signal Compression
using Feature Points based on Curvature. J. Korea Inst. Intell. Syst. 2010, 20, 624–630. [CrossRef]
14. Lamiroy, B.; Gaucher, O.; Fritz, L. Robust Circle Detection. In Proceedings of the 9th International Conference
on Document Analysis and Recognition, Parana, Brazil, 23–26 September 2007; pp. 526–530.
15. Freeman, H.; Shapira, R. Determining the Minimum-Area Encasing Rectangle for an Arbitrary Closed Curve.
Commun. ACM 1974, 18, 409–413. [CrossRef]
16. Fischler, M.A.; Bolles, R.C. Random Sample Consensus: A Paradigm for Model Fitting with Applications to
Image Analysis and Automated Cartography. Commun. ACM 1981, 24, 381–395. [CrossRef]
17. Lei, Z.; Gao, J.; Wang, Z. Size Measurement Technology of Seals. In Proceedings of the 2010 3rd International
Conference on Computer Science and Information Technology, Chengdu, China, 9–11 July 2010; pp. 222–225.

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).
