
J Intell Robot Syst (2006) 47: 239–255

DOI 10.1007/s10846-006-9078-9

Autonomous Acquisition of Seam Coordinates for Arc Welding Robot Based on Visual Servoing

L. Zhou & T. Lin & S. B. Chen

Received: 17 May 2006 / Accepted: 29 August 2006 / Published online: 25 October 2006
© Springer Science + Business Media B.V. 2006

Abstract Autonomous acquisition of seam coordinates is a key technology for developing advanced welding robots. This paper describes a position-based visual servo system for robotic seam tracking, which is able to autonomously acquire the seam coordinates of a planar butt joint in the robot base frame and plan the optimal camera angle before welding. A six-axis industrial robot is used in this system, which has an interface for communicating with the master computer. The developed visual sensor device, which allows the charge-coupled device (CCD) cameras to rotate about the torch, is briefly presented. A set of robust image-processing algorithms is proposed so that no special light source is required in this system. The feedback errors of this servo system are defined according to the characteristics of the seam image, and a robust tracking controller is developed. Both the image-processing program and the tracking control program run on the master computer. The experimental results on a straight line seam and a curved seam show that the accuracy of the seam coordinates acquired with this method is adequate for a high quality welding process.

Key words image processing · robot vision · seam tracking · visual servo · welding robot

1 Introduction

At present, more and more industrial robots are being used in the automation of the arc welding process to place the welding torch at the correct orientation above the seam to be welded. However, most of them work in the "teach and playback" mode. They must be taught the welding path in advance, as well as the welding parameters and work sequence, and they cannot adapt to fixturing inaccuracies of the workpiece, dimensional variations, in-process thermal distortions or variations in the joint geometry. Therefore, many issues need to be handled to develop an autonomous welding robot. It is obvious

L. Zhou (✉) · T. Lin · S. B. Chen
Institute of Welding Engineering, Shanghai Jiao Tong University, No.1954 Hua Shan Road, Shanghai 200030, People's Republic of China
e-mail: zhoulv@sjtu.edu.cn

that autonomous welding path planning (i.e., acquisition of the seam coordinates of a planar butt joint) is an important issue in this field.
The visual sensor has become a useful robotic sensor because it has non-contact sensing capability and can provide abundant information. However, during actual welding there are many optical disturbances, such as arc glare, welding spatter and smoke, which make it difficult to plan the welding path during welding. Moreover, if the seam path is planned during welding, the in-process errors will influence the accuracy of the weld bead. It is therefore reasonable to plan the welding path before welding. For the same workpiece, the welding path needs to be planned only once, which can largely replace the manual teaching operation.
There have been a number of studies on correcting the welding path and on controlling welding parameters by means of visual sensing. In some studies, the camera was used to directly view the weld pool and its vicinity to obtain control information such as the size or position of the pool, the width of the gap, etc. [1, 2]. In other studies, the camera was used to view the laser stripe projected by a laser diode to detect the seam position, gap size, offset, etc. [3–5].
However, studies on planning the welding path before welding are relatively few. Kuo and Wu [6] demonstrated a look-then-move approach. In their approach, the seam coordinates of a butt-plate seam were extracted in the image frame by processing the full seam image, and these coordinates were then transformed into the coordinate system of the numerically controlled (NC) machine. This approach works in an open-loop fashion, so its accuracy depends directly on the accuracy of the calibration and the manipulator. Moreover, in their research, extra illumination was needed and the processed seam images did not include a complicated scene. Besides, Chen et al. [7] adopted image-processing and stereo-vision computing methods to acquire the 3D coordinates of the welding seam. However, the accuracy of the coordinates acquired by their method was too low for them to be used as the welding path.
In this paper, in order to autonomously acquire the seam coordinates of a planar butt joint and to plan the camera angle, the welding seam is tracked using the visual servoing approach before welding. A tutorial introduction to visual servo control of robotic manipulators is given in [8]. This approach can increase the overall accuracy of the system by using a visual-feedback control loop. Furthermore, with the eye-in-hand configuration the camera views only the local region of the seam rather than the full seam, so the scene has no influence on the tracking results.
Visual servoing for object tracking has been studied in various applications, for example in edge trimming of fabric embroideries by laser, addressed by Amin-Nejad et al. [9]. In their study, a laser stripe was used to detect the embroidery's edge, and fabric embroideries were cut automatically. To achieve a high sampling rate, they used a parallel DSP processor to run the image processing. Moreover, in their visual servo system, both the position of the embroidery's edge and the position of the laser spot projected by the cutting tool were detected.
In our research, we directly view the welding seam using a camera, and this viewing arrangement is also convenient for viewing the weld pool in the future. Accordingly, the image-processing algorithms, the definitions of the feedback errors and the tracking controller were developed based on the characteristics of the seam image. Since a high tracking speed is not needed in this application, both the image-processing program and the tracking control program run on one PC using multithread programming. Since it is difficult to detect the torch position in the seam image, two independently fitted cubic polynomials are used to provide the torch position.

Figure 1 Diagram of the seam tracking visual servo system.

2 System Description

The constructed visual servo system, shown in Figure 1, mainly consists of three parts: the RH6 robot system, the sensor device and the master computer. The RH6 robot system is a six-axis industrial robot developed by the Shenyang Institute of Automation, Chinese Academy of Sciences (shown in Figure 2). The master computer is a Pentium III PC, which runs the image-processing and tracking control programs and acts as the man–machine interface of the system.

2.1 Visual Sensor Device

Figure 3 shows the prototype of the visual sensor device. It has two hollow shafts, and the outer shaft can rotate about the inner shaft. The welding torch used in the gas tungsten arc welding (GTAW) process passes through the inner shaft. Two CCD cameras are mounted on opposite sides of the outer shaft. Only one of them is used to capture the seam image in this seam tracking system. The other camera is used to capture the weld pool image in order to study welding quality control, which is not presented in this paper. The whole visual sensor device is fixed on the end of the arm of the robot.

Figure 2 RH6 robot system at a glance.

Figure 3 The visual sensor device.
With a servo-actuator, the cameras can rotate about the torch in order to place the camera at the optimal shooting position. Additionally, the visual sensor device can automatically load or unload light filters, using an electromagnetic air valve and two air cylinders, to suit two different shooting cases: during welding and before welding. The servo-actuator is under the control of the robot controller as an external axis, so we call this servo-actuator the 7th axis. The electromagnetic air valve is controlled by the robot controller through a robot I/O port.

2.2 Interface of Robot Controller

The master computer communicates with the robot controller over a controller area network (CAN) bus. The controller of the RH6 robot has an interface for accepting an incremental pose and an incremental 7th-axis angle from the master computer, and for sending the current pose of the robot body and the current 7th-axis angle to the master computer. In addition, the robot controller has an interface for accepting I/O commands from the master computer. The control cycle of the robot controller is 16 ms, and the shortest period of communication between the master computer and the robot controller is 16 ms.
The current pose is described in terms of a transform matrix from the tool frame to the base frame of the robot, and the incremental pose comprises three position increments, in terms of the x, y and z coordinates in the base frame, and three orientation increments, in terms of roll, pitch and yaw angles. Since the welding seam is planar, only the x and y increments are used and the others are zero when tracking the seam.
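To make this interface concrete, the sketch below models one incremental command as a small data structure. It is illustrative only: the field names, units and the idea of a single flat record are assumptions of this sketch, not the actual RH6 CAN message layout.

```python
from dataclasses import dataclass

# Hypothetical layout of one incremental command to the robot controller;
# field names and units are assumptions, not the real RH6 protocol.
@dataclass
class IncrementalCommand:
    dx: float = 0.0       # x position increment in the base frame (mm)
    dy: float = 0.0       # y position increment in the base frame (mm)
    dz: float = 0.0       # z increment (always zero for a planar seam)
    droll: float = 0.0    # orientation increments (always zero here)
    dpitch: float = 0.0
    dyaw: float = 0.0
    dtheta7: float = 0.0  # increment of the 7th (camera rotation) axis

# For a planar butt joint only dx, dy and dtheta7 are ever nonzero, and a
# command can be accepted at most once per 16 ms control cycle.
cmd = IncrementalCommand(dx=0.32, dy=0.65, dtheta7=0.4)
```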

2.3 Seam Tracking Scheme

The local seam image captured by the camera is transferred to the memory of the master computer by the frame grabber. The master computer cyclically processes the image and extracts the position error and the angle error of the local seam. With this information on the position and direction of the local seam, the master computer sends motion increments to the robot controller; the robot then moves the end-effector along the seam path and the 7th axis rotates the camera toward the seam's tangent direction. At the same time, the master computer reads the current pose of the robot and the 7th-axis angle from the robot controller, corrects these data with the current position error and angle error, and records them at constant time intervals to acquire the seam coordinates in the base frame and the required 7th-axis angle.
After a pass of seam tracking is finished, the master computer sends an I/O command to the robot controller to load the light filters. Then, under the control of the master computer, the robot moves the torch to the initial point of the seam and begins to weld along the welding path described by the acquired seam coordinates.
The error signals in the visual-feedback loop of this system are defined in Cartesian coordinates, and the control structure of the seam tracking system is hierarchical: the seam tracking controller, which runs on the master computer, sits above the underlying robot controller. The seam tracking controller is simplified by this control structure. According to the taxonomy of visual servo systems [8], the constructed seam tracking system is categorized as a position-based, dynamic look-and-move structure.
The master computer sends position increments to the robot controller because of the interface of the robot controller. Using position control rather than speed control for seam tracking will cause jerky motions if the update rate of the position increments is not high enough. At the seam tracking stage, the update rate of the position increments is 25 Hz, limited by the sampling rate of the CCD camera and the computing capacity of the master computer. The jerky motions are therefore imperceptible and have little influence on the acquired seam coordinates.
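The whole scheme can be condensed into a single control loop. The following Python sketch outlines it under the rates stated above (25 Hz visual update); `camera`, `robot`, `extract_errors`, `controller` and `target_pose` are hypothetical placeholders standing in for the components described in this paper, not a real API.

```python
import time

CYCLE = 0.04  # 25 Hz visual servo update rate, as stated in the text

def track_seam(robot, camera, recorder):
    """One pass of seam tracking before welding (simplified sketch;
    all callables are hypothetical placeholders)."""
    while not robot.end_of_seam():
        img = camera.grab()                    # local seam image
        d, alpha = extract_errors(img)         # feedback errors (Section 5.1)
        lx, ly, lw = controller(d, alpha)      # step sizes (Section 6)
        robot.send_increments(lx, ly, lw)      # incremental pose command
        x_c, y_c, theta_c = robot.read_pose()  # current pose and 7th axis
        # correct the recorded data with the current errors (Section 5.2)
        recorder.append(target_pose(x_c, y_c, theta_c, d, alpha))
        time.sleep(CYCLE)
    robot.set_io('LOAD_FILTERS', True)         # prepare for welding
```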

3 Calibrations

One CCD camera in the sensor device is used to capture the local image of the seam when tracking the seam. The orientation of the torch is perpendicular to the plane of the workpiece, which is fixed on a horizontal worktable. The vertical distance between the tip of the tungsten electrode and the workpiece is about 4 mm, which is appropriate for GTAW. Before tracking the seam, this orientation of the torch can easily be restored by using the pose interpolation method.

3.1 Calibration of the End-Effector Position

The point where the electrode axis intersects the plane of the workpiece in the image frame is called the torch point and is denoted by p. The position of p is described by coordinates in the image frame, with its origin at the top left corner of the image. In this frame, the horizontal axis is called the u axis and the vertical axis the v axis, according to convention.

Figure 4 The calibration data and the fitted curves.

The position of p alters when the camera rotates about the torch. The reasons for this are that the torch is not absolutely perpendicular to the plane of the workpiece and that there are some manufacturing and mounting errors.
It is difficult to detect the position of point p because p does not actually appear in the seam image. However, there is a nonlinear mapping between the 7th-axis angle θ and the position of p, so p can be described by θ. The seam tracking system then only needs to detect the seam position in the image; a system that observes only the target is referred to as an endpoint open-loop (EOL) system. Compared with an endpoint closed-loop (ECL) system, which observes both the target and the end-effector, an EOL system often requires the solution of a less demanding vision problem.
Two independent cubic polynomials are used to describe the mapping between the position of p and θ:

$$p_u(\theta) = c_{u0} + c_{u1}\theta + c_{u2}\theta^2 + c_{u3}\theta^3, \qquad p_v(\theta) = c_{v0} + c_{v1}\theta + c_{v2}\theta^2 + c_{v3}\theta^3 \tag{1}$$

where $p_u(\theta)$ and $p_v(\theta)$ represent the u-axis and v-axis coordinates of p as functions of θ, respectively. The parameters $c_{ui}$ and $c_{vi}$ (i = 0, 1, 2, 3) can be obtained with the polynomial fit tool in Origin, developed by OriginLab, USA. The calibration data and the fitted curves are shown in Figure 4. The position of p is updated using Equation (1) for the feature extraction of every seam image.
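Evaluating Equation (1) at runtime is a plain polynomial evaluation. A minimal sketch follows; the coefficient values are made-up placeholders, and the real ones come from the Origin fit of the calibration data in Figure 4.

```python
# Made-up placeholder coefficients; the real values come from the
# polynomial fit of the calibration data in Figure 4.
C_U = [320.0, 1.2, -0.01, 1e-4]   # c_u0 .. c_u3
C_V = [240.0, -0.8, 0.02, -2e-4]  # c_v0 .. c_v3

def torch_point(theta):
    """Torch point p = (p_u, p_v) in the image frame for a given
    7th-axis angle theta, per Equation (1)."""
    pu = sum(c * theta**i for i, c in enumerate(C_U))
    pv = sum(c * theta**i for i, c in enumerate(C_V))
    return pu, pv
```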

3.2 Calibration of the Angle in the Image Frame

The angle α, measured at the vertex p from a line parallel to the u-axis, is unequal to the corresponding angle α′ in the base frame, since the camera is not perpendicular to the workpiece. The relationship between them can also be described by a cubic polynomial (shown in Figure 5):

$$\alpha' = c_0 + c_1\alpha + c_2\alpha^2 + c_3\alpha^3 \tag{2}$$

Figure 5 The calibration of the angle in the image frame.

4 Image Processing

The main purpose of the image processing is to acquire the center line of the welding seam with high accuracy and reliability. The basic procedures of the proposed image-processing algorithms are filtering, thresholding and thinning (shown in Figure 6).

4.1 Median Filtering

A median filter is adopted to remove noise because it is effective at removing salt-and-pepper and impulse noise while retaining image details. A 3×3 sliding window is used, and the method of computing the median value of the nine pixels in the window centered on the current pixel is as follows:
1. Sort the nine pixels into ascending order by gray level.
2. Select the value of the fifth pixel as the new value of the current pixel.
The median filtering result for the processing region of Figure 6a is shown in Figure 6b.

Figure 6 Images of the processing procedures: (a) the whole image, where the area with white boundaries is the processing region; (b) median filtering; (c) thresholding; (d) thinning.
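As a sketch, the filter described above can be written directly with numpy: for each interior pixel the nine window values are sorted and the fifth is taken, exactly as in steps 1 and 2. (A production version would use an optimized routine such as `scipy.ndimage.median_filter`.)

```python
import numpy as np

def median_filter_3x3(img):
    """3x3 median filter: for each interior pixel, sort the nine pixels
    of its window and take the fifth (middle) value."""
    out = img.copy()
    h, w = img.shape
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            window = img[r - 1:r + 2, c - 1:c + 2].ravel()
            out[r, c] = np.sort(window)[4]  # fifth of the nine values
    return out
```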

4.2 Thresholding

Thresholding is a common method for converting a gray scale image into a binary image so that objects of interest are separated from the background. Given the features of this seam-tracking task, the gray levels of the pixels making up the welding seam in each image are smaller than a certain value, and the number of these seam pixels can be treated as a constant. The threshold of each image can therefore be chosen from a specified value of this number. The specified value is set based on the width of the welding seam and the width of the image-processing region. The algorithm is as follows:
1. Count the number of pixels $N_i$ per gray level, where the subscript i is the gray level value and i = 0, 1, 2, ..., 255.
2. Calculate the threshold T by:

$$T = \min\Big\{\, T : \sum_{i=0}^{T} N_i \geq N_{set} \Big\} \tag{3}$$

where $N_{set}$ is the specified value.


3. Segment the image by:

$$F_T[i,j] = \begin{cases} 0 & \text{if } F[i,j] \leq T \\ 255 & \text{otherwise} \end{cases} \tag{4}$$

where $F_T[i,j]$ is the thresholded image obtained by applying the threshold T to the original gray image $F[i,j]$.
The thresholding result for the processing region in Figure 6b is shown in Figure 6c.
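In code, Equation (3) is a cumulative-histogram lookup followed by the binarization of Equation (4). A sketch assuming an 8-bit grayscale numpy image; `n_set` is the specified value $N_{set}$ discussed above.

```python
import numpy as np

def seam_threshold(img, n_set):
    """Choose the smallest T whose cumulative histogram reaches n_set
    (Equation 3), then binarize per Equation (4)."""
    hist = np.bincount(img.ravel(), minlength=256)  # N_i per gray level
    T = int(np.searchsorted(np.cumsum(hist), n_set))
    binary = np.where(img <= T, 0, 255).astype(np.uint8)
    return T, binary
```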

4.3 Thinning

Thinning is an image-processing operation in which binary-valued image regions are reduced to lines that approximate their center lines. The purpose of thinning is to obtain the position information of the center line of the seam. There are numerous thinning algorithms, and a comprehensive survey of them is available in [10]. The method of [11] is used and is introduced as follows.
The black pixel p[2][2] is examined for deletion, and the pixels in a 5×5 window are labeled (see Figure 7). We use p[i][j] to denote both the pixel and its value, 0 or 1, when p[i][j] is white or black, respectively. The number of black pixels among the 8-neighbors of p[2][2] is denoted by N(p[2][2]). A sequence of the 8-neighbors of p[2][2] in counterclockwise order is p[1][2] p[1][1] p[2][1] p[3][1] p[3][2] p[3][3] p[2][3] p[1][3] p[1][2], and the number of transitions from a white point to a black point when this sequence is traversed is denoted by T(p[2][2]). Similarly, T(p[1][2]) denotes the number of transitions when the sequence p[0][2] p[0][1] p[1][1] p[2][1] p[2][2] p[2][3] p[1][3] p[0][3] p[0][2] is traversed; T(p[2][1]) denotes the number of transitions when the sequence p[1][1] p[1][0] p[2][0] p[3][0] p[3][1] p[3][2] p[2][2] p[1][2] p[1][1] is traversed.

Figure 7 Pixels in a 5×5 window.
Then there are four conditions that decide whether p[2][2] will be deleted. If p[2][2] satisfies all of the following conditions, it is deleted; otherwise, it is retained.

Condition 1: 2 ≤ N(p[2][2]) ≤ 6;
Condition 2: T(p[2][2]) = 1;
Condition 3: p[1][2] · p[2][1] · p[2][3] = 0 and T(p[1][2]) ≠ 1;
Condition 4: p[1][2] · p[2][1] · p[3][2] = 0 and T(p[2][1]) ≠ 1.
The thinning process is performed iteratively: on each iteration, every black pixel of the image is inspected against the above conditions. When no pixel can be deleted on an iteration, the image is thinned. The thinned image of Figure 6c is shown in Figure 6d.
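A sketch of the per-pixel deletion test follows, with black = 1 and white = 0 as in the text. The neighbor ordering is reconstructed from the counterclockwise sequences given above; its orientation relative to Figure 7 is an assumption of this sketch, and the logical connectives mirror the four conditions as stated.

```python
def transitions(seq):
    """White (0) -> black (1) transitions along a closed pixel sequence."""
    return sum(1 for a, b in zip(seq, seq[1:] + seq[:1]) if a == 0 and b == 1)

def neighbors_ccw(p, i, j):
    """8-neighbors of p[i][j] in counterclockwise order, starting at
    p[i-1][j]; the orientation relative to Figure 7 is an assumption."""
    return [p[i-1][j], p[i-1][j-1], p[i][j-1], p[i+1][j-1],
            p[i+1][j], p[i+1][j+1], p[i][j+1], p[i-1][j+1]]

def deletable(p):
    """Deletion test for the center pixel p[2][2] of a 5x5 window
    (black = 1, white = 0); connectives follow the paper's conditions."""
    n_ccw = neighbors_ccw(p, 2, 2)
    return (2 <= sum(n_ccw) <= 6 and                 # Condition 1: N(p[2][2])
            transitions(n_ccw) == 1 and              # Condition 2: T(p[2][2])
            p[1][2] * p[2][1] * p[2][3] == 0 and     # Condition 3
            transitions(neighbors_ccw(p, 1, 2)) != 1 and
            p[1][2] * p[2][1] * p[3][2] == 0 and     # Condition 4
            transitions(neighbors_ccw(p, 2, 1)) != 1)
```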

4.4 Removing Small Lines

The thresholding method mentioned above helps keep the seam image processing reliable because it avoids both an excess and a deficiency of seam pixels. However, when the illumination in the workshop is badly uneven, the shadows of the sensor device and the torch on the workpiece cause some shadow pixels; this case is even worse when the workpiece is made of highly reflective materials, such as some aluminum alloys.
In the thinned seam image, the line representing the welding seam should be the longest one. To make seam detection more robust, it is effective to remove all lines except the longest one after the thinning process. Removing small lines also gives a relatively larger tolerance to the value of $N_{set}$ in the thresholding algorithm and reduces the influence of seam width variation.
A line corresponds to a connected component in the digital image. To determine the sizes of the lines, component labeling is necessary; it finds all connected components in an image and assigns a unique label to all points in each component. The sequential component labeling algorithm using 8-connectivity is adopted; a similar algorithm using 4-connectivity is given in [12]. The size of every line is calculated as part of this algorithm.
After component labeling, the label of the component with the greatest size in the labeled image is found; then all components whose labels are not equal to this label are removed by changing the corresponding pixels in the original thinned image to 255. With these operations, the goal of removing all lines except the longest one is achieved.

Figure 8 The image processing results when the illumination is bad: (a) the whole image; (b) image after thresholding; (c) image after thinning; (d) image after removing small lines.
Figure 8 shows the robustness of the whole proposed image-processing algorithm when the illumination condition is bad. For a highly reflective workpiece, such as aluminum alloy 5056, the only requirement is that the plates close to the seam should be polished, which is normal practice in the welding process.
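A compact sketch of this step is given below. It substitutes scipy's connected-component labeling (with an 8-connected structuring element) for the sequential algorithm of [12]; the behavior, keeping only the largest black component, is the same.

```python
import numpy as np
from scipy import ndimage

def keep_longest_line(thinned):
    """Keep only the largest 8-connected component of black pixels
    (value 0) in a thinned binary image; all others become white (255)."""
    black = (thinned == 0)
    eight = np.ones((3, 3), dtype=int)          # 8-connectivity structure
    labels, n = ndimage.label(black, structure=eight)
    if n == 0:
        return thinned                          # nothing to remove
    sizes = np.bincount(labels.ravel())
    sizes[0] = 0                                # ignore the background label
    keep = sizes.argmax()                       # label of the longest line
    return np.where(labels == keep, 0, 255).astype(np.uint8)
```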

5 Feature Extraction

5.1 The Extraction of Feedback Errors

As shown in Figure 9, the distance between the torch point and the center line of the local seam in the v-axis direction is defined as the position error d; d is positive if the torch point is above the seam in the image frame, and negative otherwise. The angle of the center line of the local seam with the u-axis in the image frame is defined as the angle error α; α is positive if the seam goes upward from left to right, and negative otherwise. The position error and the angle error of the local seam are the feedback errors of this visual servo system.
The curved local seam can be approximated by a straight line, so the whole curved seam is well approximated by many individual straight lines as the robot moves the torch along the seam. In the coordinate system whose origin is at the bottom left corner of the image, with the positive vertical axis pointing up and the positive horizontal axis pointing right, the data points of the welding seam are extracted from the thinned image. A least-squares straight-line fit is then performed on these data points, and d and α are calculated in the image frame using the resulting line and the torch point coordinates.
Assuming the line equation after fitting is

$$v' = a u' + b \tag{5}$$

Figure 9 Sketch map of feedback errors definition and target pose calculation.

then the position error can be calculated as follows:

$$d = s_0\big((h - p_v) - (a p_u + b)\big) \tag{6}$$

where $p_u$ and $p_v$ represent the u-axis and v-axis coordinates of the current torch point p, respectively, calculated using Equation (1); h is the height of the image; and $s_0$ is a scale factor representing the real size of a pixel in the v-axis direction of the image frame.
The angle error is calculated directly from the slope of the line:

$$\alpha = \tan^{-1}(a) \tag{7}$$
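Putting Equations (5)-(7) together, the error extraction reduces to a least-squares line fit plus two formulas. A sketch using numpy's `polyfit` as the fitting tool; the argument layout is an assumption of this sketch.

```python
import numpy as np

def feedback_errors(seam_pts, pu, pv, h, s0):
    """Position error d and angle error alpha from the thinned seam.

    seam_pts: Nx2 array of (u', v') seam pixels in the bottom-left-origin
    frame; (pu, pv): torch point from Equation (1), top-left-origin frame;
    h: image height in pixels; s0: pixel size in the v direction (mm)."""
    u, v = seam_pts[:, 0], seam_pts[:, 1]
    a, b = np.polyfit(u, v, 1)            # least-squares line v' = a u' + b
    d = s0 * ((h - pv) - (a * pu + b))    # Equation (6)
    alpha = np.arctan(a)                  # Equation (7), radians
    return d, alpha
```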

5.2 Target Pose Calculation

In this block, the position of the target point (i.e., the desired welding point), $(x_r, y_r)$, and the target angle of the 7th axis, $\theta_r$, are calculated in the base frame. These data are recorded to acquire the welding path and the camera angle.
The camera rotation angle β (shown in Figure 9) is calculated by

$$\beta = s_1(\theta_c - \theta_o) \tag{8}$$

where $\theta_c$ is the current angle of the 7th axis, $\theta_o$ is the reference angle of the 7th axis at which the axial line of the camera and the y-axis of the base frame are in the same plane, and $s_1$ is the reciprocal of the drive ratio of the sensor.
According to the geometrical relationship shown in Figure 9, $(x_r, y_r)$ and $\theta_r$ are calculated by

$$x_r = x_c - d\cos\beta, \qquad y_r = y_c + d\sin\beta \tag{9}$$

$$\theta_r = \theta_c + \alpha'/s_1 \tag{10}$$

In the above equations, the current torch point coordinates $(x_c, y_c)$ and $\theta_c$ are obtained by reading the pose of the robot, and α′ is the corrected value of α from Equation (2).
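Equations (8)-(10) translate directly into code. A sketch follows, where `alpha_corr` is the corrected angle α′ from Equation (2) and the calibration constants `s1` and `theta_o` are assumed to be known; angles are in radians.

```python
import math

def target_pose(xc, yc, theta_c, d, alpha_corr, s1, theta_o):
    """Target welding point (xr, yr) and target 7th-axis angle theta_r
    per Equations (8)-(10)."""
    beta = s1 * (theta_c - theta_o)       # camera rotation angle, Eq. (8)
    xr = xc - d * math.cos(beta)          # Eq. (9)
    yr = yc + d * math.sin(beta)
    theta_r = theta_c + alpha_corr / s1   # Eq. (10)
    return xr, yr, theta_r
```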

6 Tracking Controller

The aim of the tracking controller is to move the torch along the seam at a constant velocity. To control the robot to track the seam (shown in Figure 10), the robot moves the torch along the tangent direction of the local seam with the speed $\vec{V}_f$ when the position error is zero, and the position error is compensated by moving the torch toward the seam in the v-axis direction with an additional speed $\vec{V}_b$ when the position error is nonzero. The speeds $\vec{V}_f$ and $\vec{V}_b$ are called the feedforward speed and the feedback speed, respectively. Visual feedforward control in addition to a feedback controller gives the best tracking controller [13].
The x-axis speed component $v_x$ of the robot is calculated by adding the x-axis component of the feedforward velocity vector and the x-axis component of the feedback velocity vector; the y-axis speed component $v_y$ is calculated in the same way. According to the geometric relationship shown in Figure 10, $v_x$ and $v_y$ are calculated by

$$v_x = v_f \sin(\alpha' + \beta) - v_b \cos\beta, \qquad v_y = v_f \cos(\alpha' + \beta) + v_b \sin\beta \tag{11}$$

where α′ + β is the angle of the vector $\vec{V}_f$ with its y-axis component $v_{yf}$, and β is the angle of the vector $\vec{V}_b$ with its x-axis component $v_{xb}$.
To keep the overall speed of the torch constant, its magnitude should always equal the magnitude of the feedforward speed, even when the position error is nonzero. The corrected x-axis speed component $v'_x$ and the corrected y-axis speed component $v'_y$ are calculated by changing the magnitude of the overall speed vector to $v_f$ while preserving its direction.
Assuming that the angle of the overall speed vector with its y-axis component is γ, $v'_x$ and $v'_y$ are calculated by

$$\gamma = \tan^{-1}(v_x/v_y), \qquad v'_x = v_f \sin\gamma, \qquad v'_y = v_f \cos\gamma \tag{12}$$

The robot accepts incremental position commands. The magnitude of an increment is called the step size. The relationship between the step size l and the speed v is

$$l = s_2 v \tag{13}$$

where $s_2$ is the time interval at which the computer sends motion increments to the robot controller. Therefore the x-axis step size $l_x$ and the y-axis step size $l_y$, which are sent to the robot controller, are calculated by

$$l_x = s_2 v'_x, \qquad l_y = s_2 v'_y \tag{14}$$

Figure 10 Sketch map of tracking control.

Combining Equations (11), (12) and (14), $l_x$ and $l_y$ can be calculated from $v_f$, $v_b$, α′ and β. The magnitude of the feedforward speed, $v_f$, which is the overall speed of the torch, is set in the program. The magnitude of the feedback speed, $v_b$, depends on the tracking controller. A proportional controller with a high proportional gain $k_{p1}$ would rapidly bring the error to zero, but at the expense of the smoothness of tracking, possibly driving the controller to instability. By adding some stiffness to the feedback controller through a derivative gain $k_{d1}$, sharp deviations from the tracking path are reduced and the controller's stability is increased. Therefore the PD controller given in Equation (15) is used here:

$$v_b = -\big(k_{p1}\, d(k) + k_{d1}\,(d(k) - d(k-1))\big) \tag{15}$$

For controlling the camera rotation, a simple PD feedback controller, Equation (16), is used to compensate for the angle error:

$$l_w = k_{p2}\,\alpha(k) + k_{d2}\,(\alpha(k) - \alpha(k-1)) \tag{16}$$

where $l_w$ is the angle step size, $k_{p2}$ is the proportional gain and $k_{d2}$ is the derivative gain.
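Taken together, Equations (11)-(16) reduce the controller to a few lines per cycle. The sketch below assumes the caller keeps the previous errors d(k−1) and α(k−1) and supplies the gains and speeds as tuning parameters.

```python
import math

def step_sizes(d, alpha, alpha_corr, beta, d_prev, a_prev,
               vf, kp1, kd1, kp2, kd2, s2):
    """Step sizes (lx, ly, lw) from the current errors.

    d, alpha: feedback errors (Section 5.1); alpha_corr: corrected angle
    from Eq. (2); beta: camera rotation angle from Eq. (8)."""
    vb = -(kp1 * d + kd1 * (d - d_prev))        # PD feedback speed, Eq. (15)
    vx = vf * math.sin(alpha_corr + beta) - vb * math.cos(beta)  # Eq. (11)
    vy = vf * math.cos(alpha_corr + beta) + vb * math.sin(beta)
    gamma = math.atan2(vx, vy)                  # angle from the y axis
    vx, vy = vf * math.sin(gamma), vf * math.cos(gamma)          # Eq. (12)
    lx, ly = s2 * vx, s2 * vy                   # position steps, Eq. (14)
    lw = kp2 * alpha + kd2 * (alpha - a_prev)   # camera angle step, Eq. (16)
    return lx, ly, lw
```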

Figure 11 Seam coordinates of the straight line seam.



Figure 12 The 7th axis angle of the straight line seam.
Figure 13 Position error in tracking the straight line seam.
Figure 14 Angle error in tracking of the straight line seam.

7 Experimental Results

Two types of welding seam, a straight line and a curved line, were chosen for seam tracking. The material of the workpiece is aluminum alloy 5056, and the seam width is about 0.5 mm. Before tracking, the torch is moved to the initial point, which is the seam point with the minimal y-axis coordinate value. The seam tracking was performed at a speed of about 20 mm/s.
For the straight line seam, the acquired seam coordinates are shown in Figure 11 and the acquired 7th-axis angle is shown in Figure 12. The position error and the angle error in tracking this seam are shown in Figures 13 and 14, respectively. After the initial stage, the position error is less than ±0.2 mm, except for individual points, and the angle error is less than ±4°. The same results were obtained regardless of the change in the angle of the seam.
The size of the curved line seam is shown in Figure 15. Figures 16, 17, 18 and 19 show the corresponding results. After the initial stage, most of the position error is less than ±0.3 mm (Figure 18) and the angle error is less than ±5° (Figure 19).
To further increase the accuracy of the acquired seam coordinates and the 7th-axis angle, the seam coordinates and the 7th-axis angles are corrected by the current position error and angle error, respectively. With this step, good accuracy is guaranteed for both the straight line seam and the curved line seam. As can be seen in Figures 11 and 16, the overall accuracy of the acquired seam coordinates is better than ±0.5 mm, which is adequate for a high quality welding process. At the same time, the accuracy of the acquired 7th-axis angle is better than ±5° (Figures 12 and 17), which is sufficient for the image-processing program to easily detect the seam and will facilitate the feature extraction of weld pool images in the future.

Figure 15 Size of the curved line seam.
Figure 16 Seam coordinates of the curve seam.
Figure 17 The 7th axis angle of the curve seam.
Figure 18 Position error in tracking the curve seam.
Figure 19 Angle error in tracking of the curve seam.

8 Conclusions

The developed visual servo system demonstrated a successful technology for automatic robotic seam tracking of planar butt joint seams, acquiring the seam coordinates and planning the optimal camera angle. In summary, the main features of the system include the following.
1. Reliable detection of the seam can be achieved by the proposed image-processing algorithms, and no special light source is required in this system.
2. The tracking controller was built on the defined feedback errors, and good accuracy in seam tracking can be obtained at a speed of about 20 mm/s.
3. For both the straight line seam and the curved line seam, the accuracy of the acquired seam coordinates is better than ±0.5 mm, which is adequate for a high quality welding process. At the same time, the accuracy of the acquired 7th-axis angle is better than ±5°, which is sufficient for the image-processing program to easily detect the seam and provides convenience for welding quality control in the future.

Acknowledgments This work is partially supported by the National Natural Science Foundation of China under Grant No. 50575144 and by the Shanghai Science and Technology Committee under Grant No. 021111116.
The authors wish to thank the anonymous reviewers for their valuable comments on the earlier draft of this paper.

References

1. Bae, K.Y.: An optical sensing system for seam tracking and weld pool control in gas metal arc welding
of steel pipe. J. Mater. Process. Technol. 120, 458–465 (2002)
2. Yamane, S., Kaneko, Y., Kitahara, N., Ohshima, K., Yamamoto, M.: Neural network and fuzzy control of
weld pool with welding robot. Ind. Appl. Soc. Annu. Meet.: Conf. Rec. 1993 IEEE 3, 2175–2180 (1993)
3. Kim, J.S.: A robust method for vision-based seam tracking in robotic arc welding. In: Proceedings of the IEEE International Symposium on Intelligent Control, pp. 363–368 (1995)
4. Smith, J.S., Lucas, J.: Vision-based seam tracker for butt-plate TIG welding. J. Phys. E: Sci. Instrum. 22(9),
739–744 (1989)
5. Luo, H.: Robotic welding, intelligence and automation, laser visual sensing and process control in
robotic arc welding of titanium alloys. LNCIS 299, 110–122 (2004)
6. Kuo, H.C., Wu, L.J.: An image tracking system for welded seams using fuzzy logic. J. Mater. Process.
Technol. 120(1–3), 169–185 (2002)
7. Chen, S.B., Chen, X.Z., Qiu, T., Li, J.Q.: Acquisition of weld seam dimensional position information for
arc welding robot based on vision computing. J. Intell. Robot. Syst. 43, 77–97 (2005)
8. Hutchinson, S.: Tutorial on visual servo control. IEEE Trans. Robot. Autom. 12(5), 651–670 (1996)
9. Amin-Nejad, S., Smith, J.S., Lucas, J.: A visual servoing system for edge trimming of fabric
embroideries by laser. Mechatronics 13(6), 533–551 (2003)
10. Lam, L.: Thinning methodologies – A comprehensive survey. IEEE Trans. Pattern Anal. Mach. Intell.
14(9), 869–885 (1992)
11. Yang, S.Y.: VC++ Programming on Image Processing. Tsinghua University Press, Beijing (2003)
12. Jain, R.: Machine Vision. McGraw-Hill, New York (1995)
13. Corke, P.I.: Controller design for high performance visual servoing. In: Proc. IFAC 12th World Congress, Sydney, pp. 9-395–9-398 (1993)
