
sensors

Article
A Weld Joint Type Identification Method for Visual
Sensor Based on Image Features and SVM
Jiang Zeng, Guang-Zhong Cao * , Ye-Ping Peng and Su-Dan Huang
Shenzhen Key Laboratory of Electromagnetic Control, College of Mechatronics and Control Engineering,
Shenzhen University, Shenzhen 518060, China; zengjiang525@163.com (J.Z.); yeping.peng@szu.edu.cn (Y.-P.P.);
hsdsudan@gmail.com (S.-D.H.)
* Correspondence: gzcao@szu.edu.cn; Tel.: +86-0755-2653-4865

Received: 18 November 2019; Accepted: 6 January 2020; Published: 14 January 2020

Abstract: In the field of welding robotics, visual sensors, which are mainly composed of a camera
and a laser, have proven to be promising devices because of their high precision, good stability, and
high safety factor. In real welding environments, there are various kinds of weld joints due to the
diversity of the workpieces. The location algorithms for different weld joint types are different, and
the welding parameters applied in welding are also different. It is very inefficient to manually change
the image processing algorithm and welding parameters according to the weld joint type before each
welding task. Therefore, it will greatly improve the efficiency and automation of the welding system
if a visual sensor can automatically identify the weld joint before welding. However, there are few
studies regarding these problems and the accuracy and applicability of existing methods are not
strong. Therefore, a weld joint identification method for visual sensor based on image features and
support vector machine (SVM) is proposed in this paper. The deformation of the laser stripe around a weld joint
is taken as the recognition information. Two kinds of features are extracted as feature vectors to enrich
the identification information. Subsequently, based on the extracted feature vectors, the optimal
SVM model for weld joint type identification is established. A comparative study of proposed and
conventional strategies for weld joint identification is carried out via a contrast experiment and a
robustness testing experiment. The experimental results show that the identification accuracy rate
achieves 98.4%. The validity and robustness of the proposed method are verified.

Keywords: weld joint type identification; image feature extraction; visual sensor; support vector
machine (SVM)

1. Introduction
Robotic welding technology is an important indicator of the technical development of the
welding industry. Currently, the two most common
operating modes for welding robots, namely, the teaching mode and the off-line programming mode,
do not depend on sensor measurements during welding; the welding trajectories are set in advance
by workers, and the robot moves in accordance with the desired trajectory. These two modes are
suitable for use in a standardized, modular, strictly coordinated welding system [1–4]. However,
in actual welding operations, the welding environment might not be static. Therefore, these two
modes do not offer sufficient flexibility and robustness to handle such a complex and dynamic welding
environment. The teaching mode and off-line programming mode require a lot of time to teach and
preprogram each time the workpiece is replaced; therefore, with the rapid improvement in sensor
technology and integration technology, intelligent welding robots that can overcome the difficulties that
are encountered when operating welding robots in the teaching and off-line programming modes are
emerging. An intelligent welding robot uses external sensors to perceive its environment and detect

Sensors 2020, 20, 471; doi:10.3390/s20020471 www.mdpi.com/journal/sensors



the weld position. In the past, arc and contact sensors were two of the most widely applied types of
sensors in robot sensing models [5,6]. Visual sensing is widely used presently in the field of robotic
welding because of its high accuracy and non-contact nature [7–9]. Furthermore, it can provide abundant
information regarding the welding environment. Currently, most visual sensors are composed of a
camera, an auxiliary laser, a filter, and a partition. In general, the auxiliary laser is a line laser. Research
that is related to visual sensors typically focuses on image preprocessing [10–12], weld joint contour
extraction [13–15], and weld feature point extraction [16–18].
The recognition of the weld joint is the most important task of a weld visual sensor. In real
industrial environments, many types of weld joints are encountered, including fillet weld joints,
butt weld joints, and v-type weld joints, due to the diversity of the workpieces. Many scholars
have proposed corresponding single weld joint recognition methods for locating the position of the
weld joint [10–18]. For example, Fan et al. considered feature extraction for butt weld joints [17],
Zou et al. considered feature extraction for the recognition of lap weld joints [18], and Fang considered
feature extraction for fillet weld joints [19]. These methods need to recognize only one kind of weld
joint due to the single weld joint environment. However, in some welding environments, such as
bridge structure welding environments, there might be many kinds of weld joint types. Different
weld joints have different welding currents, welding voltages, welding torch swing methods, and
welding speeds. Therefore, the ability to recognize multiple types of weld joints is necessary in a
real welding environment. In recent years, some researchers have investigated multi-type weld joint
recognition. Li et al. used the spatial composition relationship of the line segment elements and the
junctions of weld joints for weld joint classification [20]. First, the relationship between the line segment
elements and the junctions is defined; then, weld joint recognition is performed in accordance with this
characteristic composition relationship. This method is feasible, but it has difficulty in recognizing
different weld joints with similar compositional characteristics. Qian et al. proposed a two-step method
for multi-type weld joint recognition [21]. First, the position of the laser curve is found and then the
weld type is recognized. This method identifies the weld joint type by considering the correlations
among feature points. However, while using this method, a series of experiments must be performed to
choose threshold values to distinguish different weld joints, which is complex and impractical. Li et al.
proposed a method that was based on the Hausdorff distance and template matching to recognize
different weld joint types [22]; its computational cost is 1.17 s and it lacks adaptability. Fan et al.
proposed a method for establishing a support vector machine (SVM) model by using the distances from
the weld joint ends to the bottom of the weld to form the feature vector [23]; it has a better recognition
accuracy and less computational cost than other methods, possibly because it uses SVM to build the
recognition model.
SVM is a machine learning algorithm that is widely used for classification. This algorithm is very
suitable for problems that involve small sample sizes and it has a relatively low computational cost.
Additionally, the SVM approach is also suitable for solving nonlinear, high-dimensional problems
because of its use of a kernel function. SVM was applied in [23]. However, for two similar weld joints, the
recognition results are poor because the feature vector only has one kind of feature and, thus, does not
have enough recognition information. Therefore, the development of a weld joint recognition system
that has high recognition accuracy and low computational cost and is suitable for multiple types of
weld joints is necessary.
In this paper, a weld joint type identification method for visual sensor based on image features
and SVM is proposed. Two different types of features are extracted from the weld joint image to
improve the recognition accuracy. SVM is used to build a model for weld joint type recognition to
reduce the computational cost and improve the accuracy of recognition, and five-fold cross-validation
is employed to find the optimal parameters of the SVM model.
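The SVM model construction and five-fold cross-validation described above can be sketched as follows. This is an illustrative outline, not the authors' exact pipeline: the feature dimensionality, number of classes, and parameter grid are hypothetical stand-ins, and the feature vectors are synthetic placeholders for the extracted weld joint image features.

```python
# Illustrative sketch: training an SVM classifier with an RBF kernel and
# selecting its parameters by five-fold cross-validation, as described above.
# Feature vectors and labels are synthetic stand-ins (hypothetical sizes).
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))        # 200 samples, 12-dimensional feature vectors
y = rng.integers(0, 5, size=200)      # 5 weld joint classes (hypothetical)

# Five-fold cross-validated grid search over the penalty C and kernel width gamma
search = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]},
    cv=5,
)
search.fit(X, y)
model = search.best_estimator_        # the "optimal SVM model" used for identification
```

In practice, the grid ranges and kernel choice would be tuned to the real extracted features rather than these placeholder values.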

The main contributions of this paper are three-fold, as follows:

(1) A weld joint type identification method that is based on the deformation information of a laser curve in the image is proposed to improve the welding system adaptability and automation degree.
(2) A weld joint type identification model that is based on an optimal SVM is constructed to identify various kinds of weld joints.
(3) A comprehensive study on the feature extraction, recognition, experimental comparison, and discussion of weld joint images is performed to handle the weld joint recognition task of arc welding systems.

The remaining sections of this paper are organized as follows. Section 2 introduces the visual tracking sensor used in this paper and the feature analysis of various welding joint images. In Section 3, the weld joint image feature extraction method used in this paper is proposed. In Section 4, an SVM-based weld joint recognition model is built, and the weld joint type recognition method proposed in this paper is introduced. Section 5 discusses the experiments. Finally, the conclusions of this paper are provided.
2. Visual Sensor and Image Features Analysis of Weld Joints

2.1. Visual Sensor

All of the weld joint images that were considered in this paper were obtained while using our independently developed visual sensor, as shown in Figure 1.

Figure 1. (Left) weld torch and visual sensor model; (Right) visual sensor.
The visual sensor is equipped with a complementary metal-oxide-semiconductor (CMOS) camera, the parameters of which are shown in Table 1. The main function of the laser is to project a laser curve on the workpiece for feature extraction and three-dimensional (3D) reconstruction. The partition is designed to isolate splash to reduce the noise in the image. The laser wavelength used in this paper was 650 nm and the laser power was 30 mW. The adjustable width of the fringes ranges from 0.5 mm to 2.5 mm. The penetration of the filters is 80%; the filters filter out splash and light. The focal length is 12 mm. The distance between the camera and the welding torch is 200 mm. In a real welding environment, the posture of the visual sensor is adjusted in real time to keep it perpendicular to the weld joint, so that the laser curve is vertical in the weld joint image; that is, the visual sensor is oriented the same way relative to the path, even for curved paths.

The visual sensor is used to capture the two-dimensional coordinates of a weld feature point in the pixel coordinate system and to map them to the robot base coordinate system through a series of transformation matrices. Figure 2 shows the coordinate transformation model.

Table 1. Camera parameters.

Parameter     Value
Resolution    640 × 480 (pixel)
Frame rate    40 fps
Sensor type   CMOS
Pixel size    6 µm × 6 µm

Figure 2. Coordinate transformation model.

OW XW YW ZW is the world coordinate system, OI XI YI is the pixel coordinate system, OC XC YC ZC is the camera coordinate system, OH XH YH ZH is the robot welding torch coordinate system, OR XR YR ZR is the robot base coordinate system, and OB XB YB ZB is the workpiece coordinate system. Te is the hand-eye matrix relating OC XC YC ZC to OH XH YH ZH, as obtained by hand-eye calibration [24]; T6 is the transformation matrix between OH XH YH ZH and OR XR YR ZR, as obtained from the robot controller; Min is the transformation matrix between OC XC YC ZC and OI XI YI, as obtained via camera calibration [25]; Π1 is the imaging plane; P is one of the projection points of the laser on the surface of the welding work-piece; and Pi (XI, YI) is the image point that corresponds to P in the pixel coordinate system. The laser plane can be represented by Ax + By + Cz = 1 and can be calculated by laser plane calibration [26]. Therefore, the 3D camera coordinate of P (XC, YC, ZC) can be represented by Pi, the camera intrinsic parameters, and the laser plane, as follows:

\[
\begin{cases}
X_c = \dfrac{k_y (X_I - X_{I0})}{A k_y (X_I - X_{I0}) + B k_x (Y_I - Y_{I0}) + C k_x k_y} \\[2mm]
Y_c = \dfrac{k_x (Y_I - Y_{I0})}{A k_y (X_I - X_{I0}) + B k_x (Y_I - Y_{I0}) + C k_x k_y} \\[2mm]
Z_c = \dfrac{k_x k_y}{A k_y (X_I - X_{I0}) + B k_x (Y_I - Y_{I0}) + C k_x k_y}
\end{cases}
\tag{1}
\]

where (XI, YI) are the coordinates of point Pi in the pixel coordinate system; (Xc, Yc, Zc) are the coordinates of point P in the camera coordinate system; the parameters kx = f·mx and ky = f·my represent the focal length in terms of pixels, where mx and my are the scale factors that relate pixels to distance and f is the focal length in terms of distance; and XI0 and YI0 represent the principal point, which would ideally be at the center of the image.
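Equation (1) can be sketched in code as follows. This is an illustrative outline; the intrinsic parameters and laser plane coefficients below are hypothetical values, not the calibrated ones from the paper.

```python
# Illustrative sketch of Equation (1): recovering the 3D camera coordinates of a
# laser point from its pixel coordinates, the camera intrinsics, and the
# calibrated laser plane Ax + By + Cz = 1. All numeric values are hypothetical.
def pixel_to_camera(XI, YI, kx, ky, XI0, YI0, A, B, C):
    """Map a pixel point (XI, YI) on the laser stripe to camera coordinates."""
    # Common denominator shared by Xc, Yc, Zc in Equation (1)
    D = A * ky * (XI - XI0) + B * kx * (YI - YI0) + C * kx * ky
    Xc = ky * (XI - XI0) / D
    Yc = kx * (YI - YI0) / D
    Zc = kx * ky / D
    return Xc, Yc, Zc

# Hypothetical intrinsics: focal length in pixels, principal point at image center
Xc, Yc, Zc = pixel_to_camera(400, 300, kx=2000.0, ky=2000.0,
                             XI0=320.0, YI0=240.0, A=0.001, B=0.0, C=0.005)
```

By construction, the recovered point satisfies the laser plane equation A·Xc + B·Yc + C·Zc = 1, which is a quick sanity check on any implementation.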
Finally, we can establish the mapping matrix between a point in the image coordinate system and the corresponding point in the robot base coordinate system through the coordinate transformation model, that is:
\[
\begin{bmatrix} X_R \\ Y_R \\ Z_R \\ 1 \end{bmatrix}
= T_6 T_e \begin{bmatrix} X_C \\ Y_C \\ Z_C \\ 1 \end{bmatrix}
\tag{2}
\]

2.2. Image Features Analysis of Weld Joints
In a real welding environment, based on the groove shape, the groove size, and the interval between and relative positioning of the two welding work-pieces, a weld joint can be classified as one of four types, i.e., lap weld joint, butt weld joint, fillet weld joint, and v-type weld joint, as shown in Figure 3. It should be noted that Figure 3b,c both show a butt weld joint; therefore, we use the term "splice weld joint" in Figure 3c to differentiate them.

Figure 4 shows the weld joint images that were obtained under laser projection while using our visual sensor. The size of the image is 640 × 480 pixels and the thickness of the laser curve in the image is approximately 5–8 pixels. The size of the v-type joint work-piece samples is 800 mm × 150 mm and that of the others is 600 mm × 150 mm. Figure 5 shows the corresponding grey value intensity distributions of Figure 4.

An analysis of the weld joint images and the grey value intensity distribution diagrams reveals that the various weld images exhibit the following characteristics:

(1) The distinguishing features of a weld joint image are only related to the laser stripe in the image. It is difficult to recognize a weld joint based on any other part of the image, except for the part near the laser stripe.
(2) The laser stripe in the weld joint image has the characteristics of discontinuity, grey value change, and deformation. For different weld joint types, the degree of deformation and the change of the grey value of the laser stripe in the image also differ.

According to the above analysis, the abrupt differences of the laser stripe at weld joints in the image can be used as features to describe and, thus, recognize different weld joints.

Figure 3. Weld joint types: (a) lap weld joint, (b) butt weld joint, (c) splice weld joint, (d) fillet weld joint, and (e) v-type weld joint.

Figure 4. Weld joint images with laser stripe: (a) lap weld joint, (b) butt weld joint, (c) splice weld joint, (d) fillet weld joint, and (e) v-type weld joint.

Figure 5. Grey value intensity of weld joint: (a) lap weld joint, (b) butt weld joint, (c) splice weld joint, (d) fillet weld joint, and (e) v-type weld joint.
3. Image Features Extraction of Weld Joints
Image features determine the recognition accuracy and applicability of a weld joint recognition system. In this paper, we take full advantage of the laser stripe deformation information in the image as the features for identifying the weld joint type. Figure 6 outlines the extraction process.

Figure 6. Features extraction process of a weld joint image.
3.1. Weld Joint Image Preprocessing

The laser curve in the image must be extracted before the feature vector can be extracted. Therefore, weld joint image preprocessing includes noise processing, laser curve extraction, region of interest (ROI) extraction, and laser curve fitting.
3.1.1. Noise Processing and Laser Curve Extraction

As shown in Figure 7, regarding the relative positions of the visual sensor, the welding torch, and the workpiece, the welding process is divided into four stages:
L1: The robot starts moving – the visual sensor arrives at the weld starting point.
The scanning path is set before the welding operation, and the visual sensor is used to find the starting point of the weld based on the imaging difference between the weld joint image and the background image. At this stage, the visual sensor has not yet captured the weld joint image and the welding torch has not started welding.

L2: The visual sensor arrives at the weld starting point – the welding torch arrives at the weld starting point.
Determine whether the visual sensor has captured the weld joint image. The visual sensor identifies the weld joint type once the weld joint image is captured. At this stage, the welding torch has not arrived at the starting point of the weld joint and does not perform welding. Therefore, the obtained image is free of splash and other interference.

L3: The welding torch arrives at the weld starting point – the visual sensor arrives at the weld ending point.
When the welding torch arrives at the weld starting point, the welding torch starts welding. The image processing method for weld joint location and the relevant welding parameters used in this stage are based on the identification result in L2. In this stage, there is a large amount of splash interference in the image captured by the camera.

L4: The visual sensor arrives at the weld ending point – the welding torch arrives at the weld ending point.
When the visual sensor captures the welding ending point, the welding scanning is completed. Once the welding torch reaches the welding ending point, welding is completed.
As described above, weld joint identification is performed in L2, and it is the basis of L3 and L4. At a real weld site, the weld joint type generally does not change during one tracking pass; thus, the weld joint type needs to be recognized only in the initial stage L2. In this paper, we consider the recognition of only the 40 weld joint images captured by the camera in the first second of the welding process. There is no spatter in the joint images at this time. Therefore, we only need to consider the reflection and scattering from the weld surface. This simplification can effectively reduce the difficulty of noise processing and improve the response speed of the system. In this paper, a hybrid filtering algorithm that consists of median filtering with a 5 × 5 square template and an open operation with a 3 × 7 rectangular template is used to remove noise from the weld joint images. Figure 8a,b show the results of image noise processing of Figure 4e. Subsequently, we apply a binary threshold operation to remove noise with grey values below 220. The laser stripe forms clusters in the image and the weld joint positions in each frame are different. Therefore, in this paper, we extract the laser curve in the image to determine the position of the laser stripe based on the grey centroid method. This process can be expressed as follows:
\[
\begin{cases}
p_i(x) = \dfrac{\sum_{h=j}^{k} g(h,i) \times h}{\sum_{h=j}^{k} g(h,i)} \\[2mm]
p_i(y) = i
\end{cases}
\quad (1 \le i \le 480)
\tag{3}
\]

where i is the ith row in the image, pi (x) and pi (y) are the coordinates of the points on the laser curve in
the image, g is the grey value of the current point on the laser stripe, whose value is 0 or 255, h is the
X coordinate of the current point on the laser stripe, and k and j are the X coordinates of the left and
right edge points, respectively, of the laser stripe in each row of pixels. Figure 8c shows the laser curve
extraction results.
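The grey centroid extraction of Equation (3) can be sketched as follows. This is an illustrative outline on a synthetic binarized image; the preceding median filtering, opening, and thresholding steps are assumed to have already been applied, and the stripe position is a stand-in for a real frame.

```python
# Illustrative sketch of Equation (3): extracting the laser curve from a
# binarized weld joint image by the grey centroid method. The synthetic
# 480 x 640 image stands in for a real frame after noise processing.
import numpy as np

binary = np.zeros((480, 640), np.float64)
binary[:, 318:325] = 255.0                 # synthetic vertical laser stripe, ~7 px wide

curve = []                                  # one (p_i(x), p_i(y)) point per image row
for i in range(binary.shape[0]):
    row = binary[i]
    if row.sum() > 0:                       # rows where the stripe is present
        h = np.arange(binary.shape[1])      # X coordinates of the row's pixels
        x = (row * h).sum() / row.sum()     # grey centroid: sum(g*h) / sum(g)
        curve.append((x, i))                # p_i(y) is simply the row index i
```

For the synthetic stripe spanning columns 318–324, each row's centroid falls at x = 321, the middle of the stripe.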

Figure 7. The relative position of the welding torch, visual sensor and the workpiece.
3.1.2. Extraction of the ROI
3.1.2. Extraction of the ROI
In this paper, the dimensions of the original weld joint images are 640 × 480; thus, processing the
whole Inweld
this paper, the dimensions
joint image would incur of the original
a high weld joint cost.
computational imagesIn are
fact,640
the× weld
480; thus,
joint processing the
feature points
whole weld joint image would incur a high computational cost. In fact, the weld joint
are only located in the laser curve in the image, and the feature vector is related to the deformation of feature points
are laser
the only curve
located atin
thethe laser
weld curve
joint. in the image,
Therefore, we canand theconsider
only feature the
vector is related
region near the tolaser
the deformation
curve in the
of the laser curve at the weld joint. Therefore, we can only consider the region near
image. For this purpose, we define an ROI near the laser curve in the image to reduce the required the laser curve in
the image. For this purpose, we define an ROI near the laser curve in the image to
number of pixel computations. Moreover, defining an ROI can effectively reduce the probability of reduce the required
number
noise of pixel
in the image, computations.
thus reducing Moreover, defining
the difficulty an ROI canthe
and improving effectively
accuracy reduce
of weldthe probability
joint extraction.of
noise in the image, thus reducing the difficulty and improving the accuracy of weld joint extraction.
We define a rectangular ROI based on the location of the laser curve in the image, and the parameters of the ROI can be determined as follows:

X = min(pi(x)) − w1
Y = min(pi(y))
W = max(pi(x)) − min(pi(x)) + w2        (1 ≤ i ≤ M)        (4)
H = max(pi(y)) − min(pi(y))

where X and Y represent the vertex of the top-left corner of the ROI, W is the width of the ROI, H is the height of the ROI, and w1 and w2 are the reserved margins of the rectangular ROI in the X direction. In this paper, we set w1 to 15 and w2 to 15 based on a large number of experiments. Figure 8d shows the extraction results.
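Equation (4) amounts to taking the bounding box of the extracted laser-stripe points and padding it in the X direction. A minimal Python sketch (the point list, margin values, and example stripe are illustrative assumptions, not the authors' implementation):

```python
def laser_roi(points, w1=15, w2=15):
    """Rectangular ROI around a laser stripe, following Equation (4).

    `points` is a list of (x, y) pixel coordinates of the extracted
    laser curve; w1 and w2 are the reserved margins in the X direction.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    X = min(xs) - w1              # top-left corner, X coordinate
    Y = min(ys)                   # top-left corner, Y coordinate
    W = max(xs) - min(xs) + w2    # ROI width
    H = max(ys) - min(ys)         # ROI height
    return X, Y, W, H

# Hypothetical stripe: vertical, with a lateral step halfway down
stripe = [(100, y) if y < 240 else (130, y) for y in range(480)]
print(laser_roi(stripe))  # -> (85, 0, 45, 479)
```

Only the stripe pixels are scanned afterwards, which is what keeps the per-frame cost low on 640 × 480 images.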
3.1.3. Laser Curve Fitting
During image preprocessing, the laser curve in the image might be interrupted, as shown in Figure 8d. Therefore, to ensure the accuracy of the subsequent feature vector extraction and fitting of the laser curve in the image, in this paper, the laser curve in the image is fitted by using Equation (5) to calculate the slope:

Si = (mi(y) − ni(y)) / (mi(x) − ni(x))        (i ≥ 1)        (5)

where Si represents the slope of the ith fit line and mi and ni denote the endpoints of the interrupted laser curve in the image. The disconnected curves can be determined by row scanning. Figure 8e shows the line fitting results.
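The gap bridging of Equation (5) can be sketched as follows; the row-scanning gap test and the example segment coordinates are assumptions for illustration:

```python
def segment_endpoints(rows):
    """Find endpoints (m, n) of a break in the laser curve. `rows` holds
    (x, y) stripe points sorted by y; a jump of more than one row marks
    a gap found by row scanning."""
    for a, b in zip(rows, rows[1:]):
        if b[1] - a[1] > 1:
            return a, b
    return None

def bridge_slope(m, n):
    """Slope of Equation (5) for the line joining endpoints m and n."""
    return (m[1] - n[1]) / (m[0] - n[0])

# Hypothetical stripe with rows 200..210 missing
rows = [(120, y) for y in range(180, 200)] + [(140, y) for y in range(211, 230)]
m, n = segment_endpoints(rows)
print(m, n)                # -> (120, 199) (140, 211)
print(bridge_slope(m, n))  # -> 0.6
```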

3.2. Extraction of Feature Vector
Section 3.1 describes the line fitting results. We already know that the weld joint type can be determined based on the abrupt differences of the laser curve at the weld joint in the image. Considering the image characteristics, we propose a method using the laser curve distribution in the image and the weld joint feature point intervals as the basis of a characteristic description to recognize different weld joint types.

Figure 8. (a,b) Image noise processing result of Figure 4e, (c) laser curve, (d) region of interest (ROI), and (e) laser curve fitting.

3.2.1. Slope Distribution of Laser Curve
At a weld joint, the laser curve will exhibit deformation, which will result in changes in the slope of the laser curve. Additionally, the slope distributions of the laser curve that are observed in images of different types of weld joints are different. Therefore, the slope distribution of the laser curve in the image can be used as an effective basis for weld joint type recognition. We define the slope of a point on the laser curve as:
Ki = [(pi+1(x) − pi−1(x))/2 + (pi+2(x) − pi−2(x))/2 + (pi+3(x) − pi−3(x))/2 + (pi+4(x) − pi−4(x))/2 + (pi+5(x) − pi−5(x))/2] / 5        (5 < i < 475)        (6)
where Ki represents the slope at point pi on the laser curve in the image and pi−5 to pi+5 denote the points adjacent to pi. The slopes along the laser curve for each joint type are mainly distributed in the range 0–5. Figure 9a,b show the lap weld joint image and the splice weld joint image. For statistical convenience, we define the slope distribution as follows:

Ki = 1        (0.6 < Ki ≤ 1.5)
Ki = 2        (1.5 < Ki ≤ 2.5)
Ki = 3        (2.5 < Ki ≤ 3.5)        (7)
Ki = 4        (3.5 < Ki ≤ 4.5)
Ki = 5        (4.5 < Ki)

Figure 10a,b show the final laser curve slope distributions of the lap weld joint and splice weld joint images. On this basis, the laser curve slope feature can be defined as in Equation (8):

V1 = num(Ki = 1)
V2 = num(Ki = 2)
V3 = num(Ki = 3)        (8)
V4 = num(Ki = 4)
V5 = num(Ki = 5)

where num() denotes a counting function used to compute the statistics of the slope distribution of the laser curve.

Figure 9. Slope distribution along the laser curve: (a) lap weld joint and (b) splice weld joint.

Figure 10. Final slope distribution along the laser curve: (a) lap weld joint and (b) splice weld joint.
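Equations (6)–(8) can be condensed into a short sketch. This follows the paper's formulas literally (each half-difference divided by 2, then averaged over the five offsets); the synthetic point list is hypothetical:

```python
def slope_at(points, i):
    """Raw slope of Equation (6): the average of the half-differences
    (p[i+k].x - p[i-k].x)/2 for k = 1..5, as the paper defines it."""
    return sum((points[i + k][0] - points[i - k][0]) / 2 for k in range(1, 6)) / 5

def quantize(k):
    """Bin a raw slope per Equation (7); values of 0.6 or less fall
    outside the paper's bins and map to 0 (not counted)."""
    if k <= 0.6:
        return 0
    for level, hi in ((1, 1.5), (2, 2.5), (3, 3.5), (4, 4.5)):
        if k <= hi:
            return level
    return 5

def slope_features(points):
    """V1..V5 of Equation (8): how many points fall in each slope bin."""
    levels = [quantize(slope_at(points, i)) for i in range(5, len(points) - 5)]
    return [levels.count(v) for v in range(1, 6)]

# Synthetic 45-degree segment: every interior point has raw slope 3.0
pts = [(i, i) for i in range(20)]
print(slope_features(pts))  # -> [0, 0, 10, 0, 0]
```

Note that, taken literally, Equation (6) scales a unit slope to a raw value of 3; the bins of Equation (7) are applied to this raw value.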

3.2.2. Interval of Weld Joint Feature Points


The laser curve slope distribution can be used as an important feature for recognizing the weld joint type; however, some weld joints that are very different in appearance have similar slope distributions, such as fillet weld joints and v-type weld joints. Therefore, it is necessary to select an additional feature as a condition to identify the type of weld.

The slope of the laser curve will jump twice in the weld joint image. As shown in Figure 10, the positions of the two jumps are marked as feature point 1 and feature point 2. There is a position difference between feature point 1 and feature point 2 in the X and Y directions due to the differences in relative position between the two welding workpieces. We define disx as the pixel difference in the X direction and disy as the pixel difference in the Y direction, as shown in Figure 11. These distances can be calculated as

V6 = disx = |feature point 1x − feature point 2x|
V7 = disy = |feature point 1y − feature point 2y|        (9)

where feature point 1x, feature point 1y, feature point 2x, and feature point 2y represent the X and Y coordinates of feature point 1 and feature point 2, respectively, in the pixel coordinate system.

As shown in Figure 11, the slope of the laser curve changes only at the edge of the weld joint. Therefore, in this paper, feature point 1 is defined as the first pixel point at which the slope changes by at least 1 and feature point 2 is defined as the last such pixel point, as follows:

feature point 1 = first pi,  if |slope(pi) − slope(pi+1)| ≥ 1
feature point 2 = last pi,   if |slope(pi) − slope(pi+1)| ≥ 1        (10)
Accordingly, we can obtain the distributions of disx and disy for various weld joint images. Figures 12–15 show the disx and disy of the lap weld joint and splice weld joint. On this basis, the feature vector can be expressed as V = [V1, V2, V3, V4, V5, V6, V7].
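The interval features of Equations (9) and (10) can be sketched as below. Here the per-point slopes are assumed to be precomputed (e.g., via Equation (6)); the synthetic stripe is illustrative only:

```python
def feature_points(points, slopes):
    """Feature points of Equation (10): the first and last points whose
    slope changes by at least 1 relative to the next point."""
    jumps = [i for i in range(len(slopes) - 1)
             if abs(slopes[i] - slopes[i + 1]) >= 1]
    return points[jumps[0]], points[jumps[-1]]

def interval_features(p1, p2):
    """V6 = disx and V7 = disy of Equation (9): absolute pixel
    differences of the two feature points in X and Y."""
    return abs(p1[0] - p2[0]), abs(p1[1] - p2[1])

# Hypothetical stripe: slope jumps at rows 2 and 4 (the joint edges)
points = [(100 + 3 * i, 10 * i) for i in range(8)]
slopes = [0, 0, 0, 2, 2, 0, 0, 0]
p1, p2 = feature_points(points, slopes)
print(p1, p2)                     # -> (106, 20) (112, 40)
print(interval_features(p1, p2))  # -> (6, 20)
```

Concatenating [V1..V5] with (V6, V7) yields the seven-dimensional feature vector V fed to the SVM.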

Figure 11. Feature points of weld joints: (a) lap weld joint, (b) butt weld joint, (c) splice weld joint, (d) fillet weld joint, and (e) v-type weld joint.
Figure 12. Distribution of disx for lap weld joints.

Figure 13. Distribution of disy for lap weld joints.

Figure 14. Distribution of disx for splice weld joints.

Figure 15. Distribution of disy for splice weld joints.

4. Weld Joint Type Recognition Using SVM
The feature vector has been extracted in Section 3. Now, a weld joint recognition model should be constructed. Therefore, in this paper, an SVM is adopted as the basis of the recognition model. An SVM is a maximum-interval classifier for binary classification that determines the decision plane f(x) = wTx + b based on the maximum distance between the plane and some edge data, which are also the support vectors. Suppose that there are two types of data, A and B, which are labelled with y values of {1, −1}, respectively. Afterwards, the decision plane is defined as follows:

if (wTx + b > 0) then (label: 1)
if (wTx + b < 0) then (label: −1)        (11)

One of the common methods of solving for f(x) = wTx + b is:

min (1/2)‖w‖²
s.t. yi(wTx + b) ≥ 1, i = 1, ..., n        (12)

Soft margins can be introduced to prevent over-fitting: the SVM is allowed to misclassify some samples, i.e., some samples may fail to satisfy yi(wTx + b) ≥ 1. However, the number of samples that fail to satisfy the constraint should be as small as possible. Therefore, the relaxation variable ξ is introduced, and Equation (12) can be written as:
min (1/2)‖w‖² + C Σ(i=1..n) ξi
s.t. ξi ≥ 0, yi(wTxi + b) ≥ 1 − ξi, i = 1, ..., n        (13)
In the case of linearly inseparable data, it is difficult to find a suitable hyperplane for classifying
the two types of data. The common solution to this problem that is adopted in the SVM algorithm is to
find a mapping function for mapping the data into higher dimensions; thus, data that are inseparable
in lower dimensions can become separable in a higher dimensional space, such that an appropriate
hyperplane can be found to classify them. A kernel function k(xi , xj ) is introduced to map two vectors
from the original space into the higher-dimensional space to solve the problem of high-dimensional
data and to calculate the inner product during optimization. The input to this kernel function consists of
two vectors, and its output is the inner product of the two vectors mapped into the higher-dimensional
space. Common kernel functions include linear kernels, polynomial kernels, Gaussian kernels, Laplace
kernels, and sigmoid kernels. In this paper, a Gaussian kernel (radial basis function, RBF) is adopted:
   
κ(xi, xj) = exp(−g‖xi − xj‖²)        (14)

where g is the bandwidth of the Gaussian kernel function. As g goes toward infinity, almost all of
the samples become support vectors, which can lead to overfitting, as all training samples will be
accurately classified. As g goes toward zero, the discriminant function of the SVM becomes a constant
function, and its classification ability for new samples becomes 0; in other words, it will classify all
the samples into the same class, which leads to under-fitting. Therefore, choosing an appropriate
bandwidth has great influence on the performance of the model, and it is necessary to find a balance
between correctly dividing the current data and ensuring suitability for a wider range of data to ensure
that the model has good practical value.
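The bandwidth behaviour described above is easy to see numerically with a sketch of Equation (14) (the example vectors are arbitrary):

```python
import math

def rbf(x, z, g):
    """Gaussian (RBF) kernel of Equation (14): exp(-g * ||x - z||^2)."""
    return math.exp(-g * sum((a - b) ** 2 for a, b in zip(x, z)))

a, b = [1.0, 2.0], [2.0, 4.0]   # ||a - b||^2 = 5
print(rbf(a, b, 2.0))    # ~ 4.5e-05: distinct points look dissimilar
print(rbf(a, b, 1e-9))   # ~ 1.0: kernel flattens toward a constant
```

Very large g drives every off-diagonal kernel value toward 0 (each sample similar only to itself, i.e., overfitting), while very small g drives them all toward 1 (a near-constant discriminant, i.e., under-fitting), matching the discussion above.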
SVMs are mainly used for binary classification; however, there are two main schemes that can be
used to adapt them for multi-class classification. The first is the one-to-many scheme, in which multiple
classifications are performed, each for identifying whether the samples belong to one particular
category. Therefore, for K classes of samples, it is necessary to construct K SVMs. Classification is
achieved by comparing the distances between each of the input samples and the hyperplane in each
SVM. The second scheme is the one-to-one scheme, which is based on SVMs for distinguishing between
two particular classes of samples. Therefore, K(K − 1)/2 SVMs are needed for the K sample classes.
The category to which each sample belongs is determined by voting to make the classification decision.
The samples are input into all SVMs, and the results of the individual decisions are then counted.
The category with the most votes for a given sample is determined to be the category of that sample.
In this paper, a total of 10 SVMs are designed for classifying five types of weld joints while using the
second scheme.
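The one-vs-one voting used here (10 SVMs for 5 classes) can be sketched with stand-in pairwise classifiers; the toy decision rule below is purely illustrative, not a trained SVM:

```python
from itertools import combinations
from collections import Counter

def one_vs_one_predict(sample, pair_classifiers, classes=(1, 2, 3, 4, 5)):
    """Vote among K(K-1)/2 pairwise classifiers; each entry of
    `pair_classifiers` maps a class pair (a, b) to a function returning
    the winning class for `sample`. The most-voted class is returned."""
    votes = Counter(pair_classifiers[pair](sample)
                    for pair in combinations(classes, 2))
    return votes.most_common(1)[0][0]

# Toy stand-ins: each pair picks the class index closest to the scalar
# sample value (a real system would call the trained pairwise SVMs).
clfs = {pair: (lambda s, p=pair: min(p, key=lambda c: abs(s - c)))
        for pair in combinations(range(1, 6), 2)}
print(len(clfs))                      # -> 10 pairwise classifiers
print(one_vs_one_predict(3.2, clfs))  # -> 3
```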
The selection of the parameter set plays an important role in determining the quality of an SVM
model. Additionally, multi-type weld joint recognition involves the selection of parameters for multiple
SVM models. In general, there are two methods that can be applied for choosing parameters:
(1) The parameter set for each SVM model (i.e., the model for distinguishing between each pair of weld types) is independently selected, and each model will have its own parameter settings. For example, with this approach, there will eventually be 10 sets of parameters since there are 10 SVM models for the problem considered in this paper.
(2) All of the models share one set of parameters, and the parameters that yield the highest overall performance are selected.
Each method has its advantages. A single parameter set might not be appropriate for all K(K − 1)/2 models. However, the overall accuracy is the ultimate consideration, and individual sets of model parameters may lead to over-fitting of the overall model. Therefore, the second strategy is adopted in this paper: the same set of parameters is used for all of the models, and the parameters are set based on the overall performance.

To optimize the parameters, the feature vectors were first normalized to facilitate fast convergence. Subsequently, the parameter set was selected by using the grid search method based on five-fold cross-validation. The training samples were divided into five groups. Four groups were treated as the training data each time, and the remaining group was used as the test data. Afterwards, the average five-fold cross-validation accuracy was obtained by averaging the results that were obtained in this way for each set of parameters. Figure 16 shows the results that were obtained for 110 sets of parameters via the cross-validation approach. These results show that when log2C is equal to −5 and log2g is equal to −15, i.e., C = 0.03125 and g = 0.00003, the accuracy determined via cross-validation is the lowest, with a value of 62.3834%. When log2C is equal to 13 and log2g is equal to 1, i.e., C = 8192 and g = 2, the accuracy determined via cross-validation is the highest, 98.7565%. Therefore, C = 8192 and g = 2 were used as the parameter settings for model training to establish the weld joint classification model.
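The five-fold grid search can be sketched as follows; the score surface is a hypothetical stand-in for the real cross-validated SVM accuracy, chosen so its peak matches the reported C = 8192, g = 2:

```python
import math

def five_fold_indices(n):
    """Split n sample indices into five groups; each group serves once
    as the validation set while the other four are used for training."""
    groups = [list(range(i, n, 5)) for i in range(5)]
    return [(sorted(set(range(n)) - set(g)), g) for g in groups]

def grid_search(cv_score, log2C_range, log2g_range):
    """Return the (C, g) pair with the highest cross-validation score,
    scanning an exponential grid as described for Figure 16."""
    return max(((2.0 ** c, 2.0 ** g) for c in log2C_range for g in log2g_range),
               key=lambda p: cv_score(*p))

# Hypothetical score surface peaking at C = 2^13, g = 2^1; a real
# cv_score would train the 10 pairwise SVMs on each of the five folds.
toy_score = lambda C, g: -((math.log2(C) - 13) ** 2 + (math.log2(g) - 1) ** 2)
print(grid_search(toy_score, range(-5, 14), range(-15, 2)))  # -> (8192.0, 2.0)
```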

Figure 16. Cross-validation results for the model parameters.
5. Experimental Results and Analysis

As shown in Figure 17, the welding system that was considered in this paper consisted of five parts: a welding robot, a visual sensor, a robot controller, a computer, and the auxiliary welding equipment. A visual sensor is installed on the end effector of the robot. For communication, there are cable connections between the auxiliary welding equipment and the robot controller, between the computer and the robot controller, and between the robot controller and the welding robot, and the visual sensor transmits image data to the computer through a USB connection.
The five weld joint types that were considered for recognition in this paper were lap weld joints, butt weld joints, splice weld joints, fillet weld joints, and v-type weld joints. Table 2 shows the parameters of each weld type.

Figure 17. System configuration.
Table 2. Weld joint parameters.

Type of Weld Joint | Dimensions of Workpiece Plates
Lap weld joint | 600 mm × 150 mm × 3 mm
Butt weld joint | 600 mm × 150 mm × 3 mm
Splice weld joint | 600 mm × 150 mm × 3 mm
Fillet weld joint | 600 mm × 150 mm × 3 mm
V-type weld joint | 800 mm × 150 mm × 20 mm
5.1. Experimental Results

For this analysis, 600 weld joint images, including 120 images for each type of joint, were selected as the training images. A total of 250 weld joint images were selected for testing: images 1–50 were images of lap weld joints, labelled with a value of 1; images 51–100 were images of butt weld joints, labelled as 2; images 101–150 were images of splice weld joints, labelled as 3; images 151–200 were images of fillet weld joints, labelled as 4; and images 201–250 were images of v-type weld joints, labelled as 5. All of the experiments reported in this paper were performed on a computer with an Intel i5-6500 CPU, with a main frequency of 3.2 GHz and 8 GB of RAM.

According to the final experimental results, the recognition accuracy reached 98.4%, and the computational cost of single-image recognition is 148.23 ms. Figure 18 shows the recognition results for various weld joint images.
Figure 18. Results of weld joint identification using the proposed method.
5.2. Comparison of Weld Joint Image Feature Extraction Methods

We compared the proposed feature extraction algorithm for weld images with the weld image feature extraction method presented in reference [23] to verify the effectiveness and superiority of the feature extraction method proposed in this paper. The cited paper presents a comparison with previously proposed weld joint feature extraction algorithms and shows that the selected method achieves better recognition accuracy with a lower computational cost; thus, it can be used as an object of comparison in this paper. We used both feature extraction algorithms to extract feature vectors with which to build SVM models. Table 3 shows the final recognition accuracies and computational costs of the two methods. Figure 19 shows the results of weld image recognition.

Figure 19. Results of weld joint identification using the method of reference [23].

Table 3. Recognition results obtained with different feature extraction algorithms.

Feature Extraction Algorithm    Accuracy Rate    Computational Cost
This paper                      98.4%            148.23 ms
Reference [23]                  89.2%            165.62 ms
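For concreteness, the kind of laser-curve feature compared here can be illustrated with a small sketch. The code below is a hypothetical illustration of a slope-distribution feature over a piecewise-linear laser stripe, not the paper's actual implementation; the stripe coordinates and bin count are made up.

```python
# Hypothetical slope-distribution feature for an extracted laser stripe:
# histogram the orientations of consecutive segments along the curve.
import math

def slope_histogram(points, n_bins=8):
    """Normalized histogram of segment orientations (degrees, -90..90)."""
    angles = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        angles.append(math.degrees(math.atan2(y1 - y0, x1 - x0)))
    hist = [0] * n_bins
    for a in angles:
        # map -90..90 degrees into n_bins equal-width bins
        idx = min(int((a + 90.0) / 180.0 * n_bins), n_bins - 1)
        hist[idx] += 1
    total = len(angles)
    return [h / total for h in hist]

# A V-groove-like stripe: flat, descending, ascending, flat again.
stripe = [(0, 0), (1, 0), (2, 0), (3, -2), (4, -4),
          (5, -2), (6, 0), (7, 0), (8, 0)]
print(slope_histogram(stripe))
```

A stripe crossing a groove concentrates mass in the off-horizontal bins, so joint types with different groove geometry produce visibly different histograms even when their absolute depths are similar.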
A comparison reveals that the recognition accuracy and computational cost of the weld image feature extraction algorithm that was proposed in this paper are better than those of the feature extraction method presented in reference [23]. In reference [23], the vertical distance from the groove to the surface of the weld joint is used to construct the feature vector. This feature vector is relatively simple. Consequently, that method is more suitable for weld joint types or weld grooves that exhibit large differences; weld joints that do not show significant groove differences are difficult to distinguish. The method of reference [23] misidentifies some lap weld joints, splice weld joints, and fillet weld joints, because the weld grooves of these three types of joints are relatively small, which makes it difficult to differentiate among them, as shown in Figure 19. By contrast, the method that was proposed in this paper not only utilized the characteristics of the weld joint, but also accounted for laser deformation, which made it suitable for a wider scope of applications and enabled it to achieve a higher accuracy rate.

5.3. Comparison of Classification Methods

Currently, the most commonly applied classification algorithms include logistic regression, the K-nearest-neighbour algorithm, the decision tree algorithm, Bayesian classification, the SVM algorithm, and neural networks. The SVM results were compared with the results of other classification methods to verify the effectiveness and applicability of the SVM method selected in this paper. Logistic regression is mainly used for binary classification; since the number of weld joint types considered here is greater than two, logistic regression is not a suitable classification algorithm for the problem of interest. Bayesian classification is based on the premise that the posterior probability can be obtained from the prior probability; it is not suitable for the problem of interest because it is difficult to find a suitable prior probability for the problem considered in this paper. Neural networks have high requirements in terms of the number of samples needed, making them unsuitable for classification with small sample sizes, and they require a high-performance GPU for training and testing; thus, they are not suitable for industrial welding sites. Therefore, the SVM model that was established in this paper was compared with the K-nearest-neighbour algorithm and the decision tree algorithm. The classification accuracy of each algorithm and the computational cost of single-image recognition were analysed based on the feature vectors that were extracted in this paper.
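A comparison of this shape can be set up in a few lines with scikit-learn. The sketch below is illustrative only: the five-class, five-dimensional feature vectors are synthetic stand-ins for the extracted weld-joint features, and the hyperparameters (RBF kernel, C=10, k=5) are assumptions, not the paper's settings.

```python
# Sketch of a three-way classifier comparison (accuracy and per-sample time)
# on synthetic stand-in feature vectors -- not the paper's dataset.
import time

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n_per_class, n_classes, n_features = 100, 5, 5
centers = rng.uniform(-5.0, 5.0, size=(n_classes, n_features))
X = np.vstack([c + 0.3 * rng.standard_normal((n_per_class, n_features))
               for c in centers])
y = np.repeat(np.arange(n_classes), n_per_class)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

results = {}
for name, clf in [("SVM (RBF)", SVC(kernel="rbf", C=10.0)),
                  ("KNN", KNeighborsClassifier(n_neighbors=5)),
                  ("Decision tree", DecisionTreeClassifier(random_state=0))]:
    clf.fit(X_tr, y_tr)
    t0 = time.perf_counter()
    results[name] = clf.score(X_te, y_te)  # accuracy on held-out samples
    per_sample_ms = (time.perf_counter() - t0) / len(X_te) * 1e3
    print(f"{name}: accuracy={results[name]:.3f}, ~{per_sample_ms:.4f} ms/sample")
```

On well-separated synthetic clusters all three classifiers score highly; the differences reported in Table 4 arise from the real, overlapping feature distributions of the weld joint images.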
The K-nearest-neighbour algorithm recognizes the weld joint type by calculating the Euclidean distances between the feature vector of the weld joint to be recognized and the feature vectors of all known joint images. In the decision tree algorithm, a decision tree is recursively generated by selecting features as criteria for node splitting; the traditional ID3 algorithm was used in this experiment. Table 4 shows the final experimental results.
Table 4. Comparison of recognition results.

Classification Algorithm    Accuracy Rate    Computational Cost
SVM                         98.40%           148.23 ms
K-nearest-neighbour         97.00%           984.92 ms
Decision tree               90.30%           138.35 ms
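The K-nearest-neighbour baseline in Table 4 amounts to computing Euclidean distances from the query feature vector to every stored vector and taking a majority vote, which also explains its much higher per-image cost. A minimal pure-Python sketch (the 2-D feature values and labels are toy data for illustration):

```python
# Minimal K-nearest-neighbour classification by Euclidean distance.
import math
from collections import Counter

def knn_predict(query, samples, k=3):
    """samples: list of (feature_vector, label); majority label of k nearest."""
    nearest = sorted(samples, key=lambda s: math.dist(query, s[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy "known joint images": 2-D feature vectors with joint-type labels.
known = [((0.1, 0.0), "butt"), ((0.2, 0.1), "butt"),
         ((1.0, 1.1), "lap"), ((0.9, 1.0), "lap"),
         ((2.0, 0.1), "fillet"), ((2.1, 0.0), "fillet")]
print(knn_predict((0.95, 1.05), known))  # prints "lap"
```

Because every query is compared against the entire training set, the cost grows linearly with the number of stored joint images, whereas a trained SVM only evaluates its support vectors.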

The experimental results show that the joint recognition system that is based on the proposed SVM model is superior to the K-nearest-neighbour algorithm in terms of both recognition accuracy and computational cost. By contrast, the time cost of the decision tree algorithm is lower than that of the SVM algorithm by 9.88 ms; however, its accuracy rate is only 90.3%. Therefore, based on comprehensive consideration of the computational cost and accuracy, we conclude that the SVM model that is presented in this paper is also superior to the decision tree algorithm.

5.4. Robustness Testing of the Proposed Weld Joint Recognition Method
We added a new weld joint type to the model to verify the robustness of the proposed weld recognition algorithm. The new weld joint type is the "filler layer weld joint". Figure 20 shows a laser curve image of such a filler layer weld joint. During the actual welding process, the welding voltage and current are lower than those in the bottom layer, because one layer has already been welded, and the swing range of the welding torch should be smaller. Thus, the weld characteristics meet the needs of weld joint recognition, as presented in this paper. With the addition of this weld joint type, the final recognition accuracy of the new weld joint recognition system is 98.1%. Compared with the previous performance, the accuracy of the proposed system is reduced by only 0.3%, showing that the proposed joint recognition system exhibits good robustness.

Figure 20. Laser curve image of a filler layer weld joint.

It is necessary to test the recognition results for different weld sizes to verify the generalizability of the proposed recognition algorithm. Therefore, along with the additional weld joint mentioned above, our system was applied to weld joint images of different sizes, as shown in Table 5. Ultimately, the recognition accuracy of the new model is 98.4% and the computational cost is 148.23 ms. In fact, the changes in weld size have no effect on the recognition accuracy.

Table 5. New weld joint parameters.

Type of Weld Joint    Dimensions of Work-Piece Plates
Lap weld joint        600 mm × 150 mm × 5 mm
Butt weld joint       600 mm × 150 mm × 5 mm
Splice weld joint     600 mm × 150 mm × 5 mm
Fillet weld joint     600 mm × 150 mm × 5 mm
V-type weld joint     800 mm × 150 mm × 25 mm

In this study, we recognize the welding joint type before welding begins. However, there might be multiple welding robots working together in an actual welding workshop. In this case, the other welding robots affect the welding joint image, and the arc light and splash generated by the other welding robots appear in the image. We input images with splash and arc light into our welding system to test the effectiveness of the proposed algorithm in the case of external noise interference. The results show that the proposed algorithm can effectively process an image with splash and arc light, as shown in Figure 21.
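The noise-suppression steps visible in Figure 21, a median filter followed by a morphological opening, can be sketched on a tiny binary image in pure Python. This is a simplified illustration of the two operations, not the sensor's actual processing pipeline; the stripe and splash positions are made up.

```python
# Median filter + morphological opening (erosion then dilation) on a
# binary image: splash speckles vanish, the laser stripe survives.
def filter3x3(img, reduce_fn, keep_border):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img] if keep_border else [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = reduce_fn(img[y + dy][x + dx]
                                  for dy in (-1, 0, 1) for dx in (-1, 0, 1))
    return out

def median3x3(img):
    # median of 9 binary values = 5th smallest
    return filter3x3(img, lambda win: sorted(win)[4], keep_border=True)

def opening(img):
    eroded = filter3x3(img, min, keep_border=False)   # erosion
    return filter3x3(eroded, max, keep_border=False)  # dilation

# Horizontal laser stripe (rows 3-5) plus isolated "splash" speckles.
img = [[0] * 12 for _ in range(9)]
for y in (3, 4, 5):
    for x in range(12):
        img[y][x] = 1
img[1][2] = img[7][9] = 1  # splash / arc-light noise
cleaned = opening(median3x3(img))
```

Isolated 1-pixel speckles are already removed by the median filter; the opening then smooths any remaining thin protrusions while the three-pixel-thick stripe is eroded to its core and dilated back to its original width.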

Figure 21. (a,e) The original image with splash and arc light; (b,f) the image after the median filter; (c,g) the image after the opening operation; (d,h) the processing result.

6. Conclusions

In this paper, we proposed an algorithm to address the low adaptability and automation of traditional weld joint feature extraction algorithms that are based on visual tracking sensors for determining welding parameter configurations in environments with multiple weld joint types. Based on images that were captured by a visual tracking sensor, a weld joint type recognition algorithm is proposed that considers the slope distribution at points along the laser curve and the distance between feature points of a weld joint to construct the feature vector for SVM classification. The following conclusions can be drawn from the experimental results:
(1) The proposed weld joint recognition system can accurately identify various weld joint types.
(2) An image feature extraction method is proposed to extract two kinds of feature information, which can increase the recognition information and improve the recognition accuracy.
(3) The weld joint image feature extraction algorithm that was proposed in this paper offers better recognition accuracy and a lower computational cost than algorithms from other papers.
(4) The weld joint recognition system that was proposed in this paper exhibits good robustness. After the addition of a new weld joint type, this method can still effectively recognize different types of joints with high recognition accuracy.
(5) In future work, we will attempt to achieve online recognition to allow weld joint types to be recognized during the actual welding process, which will require a stronger ability to deal with noise.

Author Contributions: J.Z. and G.-Z.C. conceived and designed the research; J.Z. analyzed the data and wrote the manuscript; Y.-P.P. and S.-D.H. revised the paper. All authors have read and agreed to the published version of the manuscript.
Funding: This research was funded by the National Natural Science Foundation of China under Grant NSFC
U1813212 and 51677120, in part by the Shenzhen government fund under Grant JCYJ20180305124348603.
Conflicts of Interest: The authors declare no conflict of interest.

References
1. Chen, S.B.; Lv, N. Research evolution on intelligentized technologies for arc welding process. J. Manuf.
Process. 2014, 16, 109–122. [CrossRef]
2. Wang, Z. An imaging and measurement system for robust reconstruction of weld pool during arc welding.
IEEE Trans. Ind. Electron. 2015, 62, 5109–5118. [CrossRef]
3. You, D.; Gao, X.; Katayama, S. WPD-PCA-based laser welding process monitoring and defects diagnosis by
using FNN and SVM. IEEE Trans. Ind. Electron. 2015, 62, 628–636. [CrossRef]
4. Maiolino, P.; Woolley, R.; Branson, D.; Benardos, P.; Popov, A.; Ratchev, S. Flexible robot sealant dispensing
cell using RGB-D sensor and off-line programming. Robot. Comput. Integr. Manuf. 2017, 48, 188–195.
[CrossRef]
5. Zhang, Z.F.; Chen, S.B. Real-time seam penetration identification in arc welding based on fusion of sound,
voltage and spectrum signals. J. Intell. Manuf. 2017, 28, 207–218. [CrossRef]
6. Xu, Y.L.; Zhong, J.Y.; Ding, M.Y.; Chen, S.B. The acquisition and processing of real-time information for height
tracking of robotic GTAW process by arc sensor. Int. J. Adv. Manuf. Technol. 2013, 65, 1031–1043. [CrossRef]
7. Lü, X.Q.; Gu, D.X.; Wang, Y.D.; Qu, Y.; Qin, C.; Huang, F.Z. Feature extraction of welding seam image based
on laser vision. IEEE Sens. J. 2018, 18, 4715–4724. [CrossRef]
8. Li, X.H.; Li, X.D.; Khyam, M.O.; Ge, S.S. Robust welding seam tracking and recognition. IEEE Sens. J. 2017,
17, 5609–5617. [CrossRef]
9. Rodríguez-Martín, M.; Rodríguez-Gonzálvez, P.; González-Aguilera, D.; Fernández-Hernández, A.J.
Feasibility study of a structured light system applied to welding inspection based on articulated coordinate
measure machine data. IEEE Sens. J. 2017, 17, 4217–4224. [CrossRef]
10. Fang, Z.J.; Xu, D.; Tan, M. Vision-based initial weld point positioning using the geometric relationship
between two seams. Int. J. Adv. Manuf. Technol. 2013, 66, 1535–1543. [CrossRef]
11. Kong, M.; Shi, F.H.; Chen, S.B.; Lin, T. Recognition of the initial position of weld based on the corner detection
for welding robot in global environment. In Robotic Welding; Springer: Berlin, Germany, 2007; pp. 249–255.
12. Zhu, Z.Y.; Lin, T.; Piao, Y.J.; Chen, S.B. Recognition of the initial position of weld based on the image pattern
match technology for welding robot. Int. J. Adv. Manuf. Technol. 2005, 26, 784–788. [CrossRef]
13. Zhou, L.; Lin, T.; Chen, S.B. Autonomous acquisition of seam coordinates for arc welding robot based on
visual servoing. J. Intell. Robot. Syst. 2006, 47, 239–255. [CrossRef]
14. Dinham, M.; Fang, G. Autonomous weld joint identification and localisation using eye-in-hand stereo vision
for robotic arc welding. Robot. Comput. Integr. Manuf. 2013, 29, 288–301. [CrossRef]
15. Sung, K.; Lee, H.; Choi, Y.S.; Rhee, S. Development of a multi-line laser vision sensor for joint tracking in
welding. Weld. J. 2009, 88, 79–85.
16. Lee, J.P.; Wu, Q.Q.; Park, M.H.; Park, C.K.; Kim, I.S. A study on optimal algorithms to find joint tracking in
GMA welding. Int. J. Eng. Sci. Innov. Technol. 2014, 3, 370–380.
17. Fang, J.F.; Jing, F.S.; Yang, L. A precise seam tracking method for narrow butt seams based on structured
light vision sensor. Opt. Laser Technol. 2019, 109, 616–626.
Sensors 2020, 20, 471 20 of 20

18. Zou, Y.B.; Chen, T. Laser vision seam tracking system based on image processing and continuous convolution
operator tracker. Opt. Lasers Eng. 2018, 105, 141–149. [CrossRef]
19. Fang, Z.; Xu, D. Image-based visual seam tracking system for fillet joint. In Proceedings of the 2009 IEEE
International Conference on Robotics and Biomimetics (ROBIO), Guilin, China, 13–19 December 2009;
pp. 1230–1235.
20. Li, X.D.; Li, X.H.; Ge, S.Z. Automatic welding seam tracking and identification. IEEE Trans. Ind. Electron.
2017, 64, 7261–7271. [CrossRef]
21. Qian, B.F.; Liu, N.S.; Liu, M.Y.; Lin, H.L. Automatic recognition to the type of weld seam by visual sensor
with structured light. Nanchang Univ. Eng. Technol. 2007, 29, 368–370.
22. Li, Y.; Xu, D.; Tan, M. Welding joints recognition based on Hausdorff distance. Chin. High Technol. Lett. 2006,
16, 1129–1133.
23. Fan, J.F.; Jing, F.S.; Fang, Z.J. Automatic recognition system of welding seam type based on SVM method.
Int. J. Adv. Manuf. Technol. 2017, 92, 989–999. [CrossRef]
24. Zeng, J.; Cao, G.Z.; Li, W.B.; Chen, B.C. An algorithm of hand-eye calibration for arc welding robot.
In Proceedings of the 2019 16th International Conference on Ubiquitous Robots (UR), Jeju, Korea, 24–27 June
2019; pp. 1–6.
25. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22,
1330–1334. [CrossRef]
26. Li, W.B.; Cao, G.Z.; Sun, J.D.; Liang, Y.; Huang, S.D. A calibration algorithm of the structured light vision for
the arc welding robot. In Proceedings of the 2017 14th International Conference on Ubiquitous Robots and
Ambient Intelligence (URAI), Jeju, Korea, 28 June–1 July 2017; pp. 481–483.

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).
