
Welding robot positioning method based on machine vision and laser

ranging
Bangwang Hou1, Yiding Zhao2
1. School of Control Engineering, Northeastern University at Qinhuangdao, Qinhuangdao, 066000
E-mail: bangwanghou@163.com

Abstract: To solve the problem of accurately positioning the welding robot in automatic bar production, this paper proposes a welding robot positioning method based on machine vision and laser ranging. The traditional positioning method for welding robots mainly adopts hand-eye calibration; in essence, hand-eye calibration obtains the conversion relationship from the camera coordinate system to the robot end-effector coordinate system through complex matrix operations. Its calculation process is lengthy, and it is difficult to achieve the accuracy required for welding. Precise positioning of the welding robot comprises plane-coordinate positioning and depth positioning. First, based on the principle that, when the distance between the camera and the photographed object is fixed, the size of the camera image is proportional to the size of the object, the pixel coordinates obtained from image processing are transformed by machine vision into the plane coordinates of the target point to be welded. At the same time, in view of the distortion introduced by the camera during imaging, Zhang's calibration method is used to correct the distorted image. Finally, the proposed resistance detection circuit is used to determine the final welding position. Experimental results show that the positioning method is easy to implement and achieves high positioning accuracy; it fully meets the welding requirements and improves the efficiency of automatic label welding.
Key Words: positioning method; welding robot; laser ranging sensor; machine vision

1. INTRODUCTION

For the convenience of users and the traceability of bar quality, the label printed with the specific information of a bundle of bars often must be welded onto the end face of the bundle during bar production. Because the welding environment is complex and the efficiency of manual welding is relatively low, the welding robot must be able to sense the welding environment and adjust the welding position autonomously[1].

So far, the positioning of welding robots has mostly relied on fixed-track welding or hand-eye calibration. However, the specifications, diameter, and assembly position of each bundle of bars differ, and the high-temperature environment during welding also causes thermal-stress deformation of the workpiece; therefore, fixed-trajectory positioning cannot meet the quality and efficiency requirements of the welding. The hand-eye calibration approach, for its part, is complicated: the essence of hand-eye calibration[3] is to connect the "hand" of the robot end effector with the "eye" of the visual measurement system through complex matrix operations, so as to obtain the conversion relationship from the camera coordinate system to the robot end-effector coordinate system. Its calculation process is long, and it is difficult to achieve the precision required for welding. To this end, this paper proposes a welding robot positioning method based on machine vision and laser ranging. The method is not only simple to implement but also ensures positioning accuracy, and it has achieved good results in actual welding tests.

2. Calibration Based On The Teaching Process

2.1 Laser Sensor

The laser ranging sensor is a high-precision distance measuring instrument. Its working principle[5] is that the laser emits a beam onto the surface of the object and part of the light is reflected; the reflected light is imaged on the photosensitive surface through the imaging lens built into the sensor. The optical axis of the laser, the imaging optical axis, and the line on which the photodetector lies form a triangle, and from the change of the imaging position on the photosensitive surface the distance from the sensor to the measured object can be obtained[6]. This type of sensor offers strong real-time performance and high precision, so laser ranging sensors are widely used in robot motion control. The basic principle is shown in Figure 1:

[Figure 1: laser, object at distance d, and spot position X on the imager]
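For contrast with the proposed method, the transform chain that hand-eye calibration must recover can be sketched in a few lines. This is a minimal illustration only: all matrices and coordinates below are made-up example values (chosen as exact binary fractions), not data from this paper.

```python
# Illustrative sketch of the transform chain recovered by hand-eye calibration.
# All matrices and coordinates are hypothetical example values.

def matmul4(a, b):
    """Multiply two 4x4 homogeneous transforms stored as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply4(t, p):
    """Apply a 4x4 homogeneous transform to a point (x, y, z, 1)."""
    return [sum(t[i][j] * p[j] for j in range(4)) for i in range(4)]

# Pose of the end effector in the robot base frame (a pure translation here).
T_base_end = [[1, 0, 0, 0.5],
              [0, 1, 0, 0.25],
              [0, 0, 1, 0.5],
              [0, 0, 0, 1]]

# The unknown sought by hand-eye calibration: the camera frame expressed in
# the end-effector frame (camera offset 0.125 m along the flange z-axis).
T_end_cam = [[1, 0, 0, 0],
             [0, 1, 0, 0],
             [0, 0, 1, 0.125],
             [0, 0, 0, 1]]

# Only after T_end_cam is known can a camera observation be chained into
# robot base coordinates -- the matrix machinery this paper's method avoids.
T_base_cam = matmul4(T_base_end, T_end_cam)
p_cam = [0.0, 0.0, 0.25, 1.0]        # a point 0.25 m in front of the camera
print(apply4(T_base_cam, p_cam))     # → [0.5, 0.25, 0.875, 1.0]
```

Every welding cycle would have to pass through this chain, and any error in the estimated T_end_cam propagates directly into the weld position.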

978-1-7281-5855-6/20/$31.00 ©2020 IEEE

Authorized licensed use limited to: Southeast University. Downloaded on November 02,2022 at 05:09:49 UTC from IEEE Xplore. Restrictions apply.
Figure 1 Principle of laser ranging

The laser emits a beam at a certain angle β, and an object at distance d along the laser direction reflects the beam. The receiver is a CMOS imager, and the laser light reflected by the object is imaged through pinhole ("small-hole") imaging. The focal length is f, the vertical distance of the object from the lens plane is q, the distance between the laser and the focus is s, and the position of the reflected laser spot on the imager is X. Then

d = f·s / (X·sin β)    (1)

2.2 Resistance Detection Circuit Design

In order to feed back whether the end of the welding robot has reached the welding position, a dedicated detection circuit is designed to report the position state of the welding robot. The circuit is shown in Figure 2.

One side of the relay is connected to an I/O port of the welding robot; the normally closed contact of the relay is used so that, if the line is disconnected, the welding robot does not falsely report the position. The other side of the relay is connected to a 24 V DC power supply together with an ultra-high-voltage diode and a cement resistor. Since a large current is generated during electric welding, the ultra-high-voltage-resistant diode prevents the large current from damaging the circuit, and the cement resistor is selected as the voltage-dividing resistor, so the power supply is well protected and the problem of resistance heating during operation is solved. When the welding robot has not reached the position to be welded, the left circuit is closed and the I/O terminal is at a high level. When the welding robot carries the welding nail to the object to be welded, the right circuit is closed and the relay opens the left circuit; the specified I/O port goes to a low level, and the welding robot acquires the signal and stops its motion.

[Figure 2: detection circuit connecting the welding robot I/O, relay, 24 V DC supply, and rebar]

Figure 2 Detection resistance circuit

2.3 Image Scaling Scale

With the rapid development of computer image processing capability and the increasing cost performance of digital image processing equipment, the autonomy of robots has greatly improved, making it possible for welding robots to be highly maneuverable and flexible.

This paper only needs a monocular camera and a welding robot, forming a simple "eye in hand" configuration. According to the principle that, when the distance between the camera and the photographed object is held constant, the ratio between the camera image size and the object size does not change, only the laser ranging sensor is needed to ensure that the distance between the camera and the object to be welded is constant when the image is acquired. The scaling scale for this fixed distance can then be derived from similar triangles[7]. The principle is shown in Figure 3:

[Figure 3: lens optical center P, image points A, B, and object points C, D]

Figure 3 Principle of the scaling scale

Point P in the figure is the intersection of the longitudinal axis of the camera lens with the optical axis. K is the distance between the imaging surface of the camera and the optical center, which is constant for each camera, and L is the actual distance between the optical center of the camera and the object to be welded. Points C and D are any two points on the end face of the bar to be welded, and points A and B are the projections of C and D in the image. According to the principle of pinhole imaging and similar triangles[8]:

F = M / N = K / L    (2)

where M is the image (pixel) distance AB and N is the actual distance CD. It can be seen from this formula that when K and L remain unchanged, the scaling scale F between the physical object and the pixel size also remains unchanged.

In the actual calibration process, a two-dimensional coordinate system O−XY is first calibrated on the end face of the steel bar to be welded, with its XY directions and origin O specified. To reduce the error in solving the scaling scale F, n points can be selected on the end face of the bar, defined as P1, P2, …, Pn. The end face of the steel bar is shown in Figure 4:

[Figure 4]

Figure 4 Reinforcement end view
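The two relations used so far — the triangulation range of Eq. (1) and the fixed-distance scaling scale of Eq. (2) — can be sketched as follows. The numeric values are illustrative examples only, not calibration data from this paper.

```python
import math

# Illustrative sketch of Eq. (1) (laser triangulation) and Eq. (2) (scaling
# scale F). All numbers are made-up examples, not the paper's calibration data.

def laser_distance(s, f, x_img, beta):
    """Eq. (1): d = f*s / (X*sin(beta)) -- object distance from the spot
    position X on the imager, baseline s, focal length f, emit angle beta."""
    return f * s / (x_img * math.sin(beta))

def scaling_scale(pixel_dist, actual_dist):
    """Eq. (2): F = M/N = K/L -- pixels per physical unit, constant while
    the camera-to-object distance is held fixed by the laser sensor."""
    return pixel_dist / actual_dist

# Two calibration points 20 mm apart on the bar end face appear 58 px apart:
F = scaling_scale(58.0, 20.0)
print(F)                                   # → 2.9 (px per mm)

# Triangulation example with beta = 90 degrees:
print(laser_distance(s=20.0, f=8.0, x_img=4.0, beta=math.pi / 2))  # → 40.0
```

Once F is known, a pixel offset in the image can be divided by F to recover a physical offset on the bar end face, which is exactly how the plane coordinates of the weld point are obtained later.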

392 2020 Chinese Control And Decision Conference (CCDC 2020)


2.4 Teaching Calibration

The welding positioning system realized by this method only needs to perform calibration once before actual positioning, achieving "one calibration, multiple realizations".

The hardware of the method mainly consists of an industrial camera, a welding robot, a laser ranging sensor, a resistance detection circuit, and an industrial computer. The industrial camera and the welding robot form the "eye in hand" configuration for image acquisition; the laser ranging sensor serves as the depth-detection device; the resistance detection circuit feeds back the position of the welding robot; and the industrial computer serves as the host computer for image processing. The industrial camera, the laser ranging sensor, and the welding gun are installed at the end of the welding robot, with the axis of each device parallel to the axis of the welding robot flange. The quantities calibrated by the method are the image scaling scale and the world and pixel coordinates of the calibration origin. The teaching calibration algorithm is as follows:

[1] Use the teach pendant to move the welding robot so that the center of the bar to be welded appears in the center of the image. The laser ranging sensor records the distance L between the camera and the calibration bar. Calibrate a two-dimensional coordinate system according to the procedure in 2.3, and record the actual distance between each pair of the n points on the end face of the bar as d_mn:

d_mn = distance(P_m, P_n),  m = 1, 2, 3, …,  n = 1, 2, 3, …    (3)

[2] Calibrate the world coordinate system O′−X′Y′Z′ of the welding robot. The X′ and Y′ axes of the world coordinate system are parallel to the X and Y axes of the two-dimensional coordinate system O−XY defined on the end face of the bar, and the Z′ axis is perpendicular to the end face of the bar to be welded.

[3] Use the teach pendant to move the welding robot to the position where the laser ranging sensor outputs a high level, take a picture of the bar to be welded, and record the coordinate value (x′0, y′0, z′0) of the origin of the coordinate system O−XY in the world coordinate system.

[4] The bar image captured by the camera is corrected by Zhang Zhengyou's calibration method, and the coordinate value (X, Y) of the origin of O−XY in the corrected image is recorded. The image is binarized with the Otsu[10] algorithm, after which the contours of P1–Pn are drawn and their centroids extracted. The basic principle of the Otsu algorithm is as follows: for an image I(X, Y), the threshold separating foreground (target) from background is denoted T; the proportion of foreground pixels in the entire image is denoted ω0, with average gray level μ0; the proportion of background pixels is denoted ω1, with average gray level μ1; the total average gray level of the image is denoted μ; and the between-class variance is denoted g. Assuming the background of the image is dark and the image size is M × N, the number of pixels whose gray value is smaller than the threshold T is recorded as N0, and the number of pixels whose gray value is greater than T is recorded as N1[11]:

ω0 = N0 / (M × N)    (4)
ω1 = N1 / (M × N)    (5)
N0 + N1 = M × N    (6)
ω0 + ω1 = 1    (7)
μ = ω0·μ0 + ω1·μ1    (8)
g = ω0·(μ0 − μ)² + ω1·(μ1 − μ)²    (9)

Substituting equation (8) into equation (9) yields the equivalent formula:

g = ω0·ω1·(μ0 − μ1)²    (10)

A traversal over all candidate thresholds is used to find the T that maximizes the between-class variance; pixels below T are set to 0, and the others are set to Pmax, yielding a binary image.

The centroid coordinates of the n points can be expressed as:

P_N = (X_N, Y_N),  N = 1, 2, 3, …, n    (11)

The pixel distance L_mn between each pair of centroids is then computed:

L_mn = distance(P_m, P_n),  m = 1, 2, 3, …,  n = 1, 2, 3, …    (12)

[5] A comprehensive statistical error-removal method is used to solve the scaling ratio, giving the transformation model between the world coordinate scale and the image scale as the average ratio over all n(n − 1)/2 point pairs:

F = (2 / (n(n − 1))) · Σ_{m<n} L_mn / d_mn    (13)

2.5 Image Distortion Correction

In order to improve the accuracy of welding, the method corrects the image distortion introduced by the camera[12]. Camera imaging is the process of projecting points from the world coordinate system into the image coordinate system and then into the pixel coordinate system. Limited lens precision and manufacturing processes introduce distortion, meaning that a straight line in the world coordinate system is no longer a straight line after it is transformed into the pixel coordinate system[13]; this distortion must therefore be corrected. There are three types of camera distortion: radial distortion, tangential distortion, and thin prism distortion[14]. Radial distortion arises from the shape of the lens and has two typical manifestations, the pincushion effect and the barrel effect, in which image regions far from the center are increasingly distorted; tangential distortion is caused by the lens not being strictly parallel to the imaging plane; and thin prism distortion is caused by defects in the lens manufacturing process.
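Before turning to the distortion model, the binarization step of the teaching calibration — the exhaustive search for the threshold T maximizing the between-class variance g of Eqs. (4)–(10) — can be sketched in pure Python. The toy image below is illustrative, not data from the experiments.

```python
# Sketch of the Otsu threshold search of Eqs. (4)-(10): for every candidate T,
# compute the class fractions w0, w1 and class means mu0, mu1, and keep the T
# maximizing the between-class variance g = w0*w1*(mu0 - mu1)^2 (Eq. (10)).
# The toy image is a made-up bimodal example.

def otsu_threshold(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    best_t, best_g = 0, -1.0
    for t in range(1, levels):
        n0 = sum(hist[:t])                          # pixels with gray < T
        n1 = total - n0
        if n0 == 0 or n1 == 0:
            continue
        w0, w1 = n0 / total, n1 / total             # Eqs. (4)-(5)
        mu0 = sum(v * hist[v] for v in range(t)) / n0
        mu1 = sum(v * hist[v] for v in range(t, levels)) / n1
        g = w0 * w1 * (mu0 - mu1) ** 2              # Eq. (10)
        if g > best_g:
            best_t, best_g = t, g
    return best_t

# Bimodal toy image: dark background around 30, bright label around 200.
img = [30] * 80 + [35] * 20 + [200] * 40 + [210] * 10
t = otsu_threshold(img)
print(t)  # → 36, i.e. between the two modes
binary = [255 if p >= t else 0 for p in img]        # below T -> 0, else Pmax
```

The resulting binary image is what the contour and centroid extraction of step [4] operates on.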

The radial and tangential distortion of the camera is illustrated in Figure 5:

[Figure 5]

Figure 5 No distortion; barrel distortion; pincushion distortion

The nonlinear distortion model of the camera is as follows:

x̃ = k1·x(x² + y²) + (p1(3x² + y²) + 2p2·xy) + s1(x² + y²) + x    (14)
ỹ = k2·y(x² + y²) + (p2(3y² + x²) + 2p1·xy) + s2(x² + y²) + y    (15)

In the above formulas, (x̃, ỹ) is the coordinate of the projection point in the image coordinate system calculated from the pinhole imaging model, and (x, y) is the true coordinate of the projection point P in the image coordinate system. The first term on the right is the radial distortion, the second is the tangential distortion, and the third is the thin prism distortion. The coefficients k1, k2, p1, p2, s1, s2 are called the nonlinear distortion parameters[15]. In general, describing the nonlinear distortion of a camera only requires the first-order radial distortion term. If only radial distortion is considered, the equations above reduce to:

x̃ = x(1 + k1·r²)    (16)
ỹ = y(1 + k2·r²)    (17)

where r² = x² + y². These formulas indicate that the radial distortion in the x and y directions is proportional to the square of the radius: the farther from the center of the image, the greater the distortion. Therefore, to improve positioning accuracy, the object to be welded should be photographed as close to the middle of the image as possible.

Zhang Zhengyou's camera calibration method is used for distortion correction, taking the inner corner points of a checkerboard calibration plate as the calibration points[16]. For a point C on the checkerboard, let its homogeneous coordinate in the camera coordinate system be Q = (Xc, Yc, Zc, 1), let c be the projection of C in the physical image coordinate system, and let the homogeneous coordinate of the projection point in the image pixel coordinate system be q = (u, v, 1). Then the linear transformation of the point from the world coordinate system to the image pixel coordinate system is:

q = kA[R T]Q    (18)

In this formula, k is a nonzero constant; the matrices R and T are the rotation and translation matrices of the camera, respectively, and together form the camera's external parameters; the matrix A is the camera's internal parameter matrix. Introducing radial and tangential distortion at the same time improves the accuracy and applicability, and with this the camera calibration is completed[17]. The calibration parameters are summarized in the following table:

Attributes          Expression                        Degrees of freedom
Internal reference  K = [αx γ u0; 0 αy v0; 0 0 1]     5
Radial              k1, k2, k3                        3
Tangential          p1, p2                            2

Table 1 Linear camera model parameters

The experimental results for the internal parameters and distortion of the camera are as follows:

Camera internal reference    Calibration result
αx                           1347.110635037403
γ                            0.000000000000000
αy                           1349.546118761063
u0                           631.3726105578838
v0                           543.024959532527

Table 2 Camera internal reference

Distortion parameter    Calibration result
k1                      -0.1254066694630284
k2                      0.7985124029127819
k3                      -2.018501820080522886
p1                      0.0004373161610404791
p2                      0.002318151050675862

Table 3 Camera distortion parameters

First, the internal reference of the camera is analyzed. In the experiment, the focal length of the lens is 6 mm, the photosensitive plane is 1/2.3 inch, the resolution is 1280×1024, and the calculated pixel size is 4.8 μm × 4.8 μm. The errors of the effective focal length are -7.76% and -7.92%, and the errors of the principal point are 1.41% and -6.05%, respectively, which meets the calibration requirements. Images before and after correction are shown in Figure 6:

[Figure 6: a) Before correction  b) After correction]

Figure 6 Before and after correction

3. Welding Robot Positioning Method

The welding robot positioning method based on machine vision and laser ranging uses the calibration results to convert the image processing results into world coordinates recognizable by the welding robot in a simple and fast way, completing the final welding positioning. The proposed algorithm flow is:

[1] After receiving the welding instruction, the welding robot moves toward the object to be welded and stops exactly when the laser ranging sensor detects the distance L, then begins to acquire the image I(X, Y).

[2] The host computer processes the acquired image to obtain the pixel coordinate value (Ox, Oy) of the best solder joint in pixel coordinates.

[3] The pixel coordinates (Ox, Oy) of the position to be welded are converted as follows into the horizontal and vertical components of its world coordinates in the world coordinate system:

O′x = (Ox − x0) / F + x′0
O′y = (Oy − y0) / F + y′0    (19)

Presetting O′z to a constant safe value (a position that the bar to be welded will never reach when assembled), a preliminary world coordinate (O′x, O′y, O′z) is obtained.

[4] Send the preliminary world coordinates (O′x, O′y, O′z), expressed in the welding robot world coordinate system, to the welding robot and move the end of the welding robot to these coordinates; at this moment, the difference between the welding nail and the actual welding point lies only in the Z′ value. The robot is then set to detection mode along the Z′ axis until the welding nail touches the object to be welded; the touch indicates that the welding robot has reached the welding position and welding can begin.

In summary, welding robot positioning based on machine vision and laser ranging is completed.

4. System Test Analysis

In order to verify that the proposed method has general value and high accuracy, a six-axis welding robot was used to compare the proposed method with the hand-eye calibration method. The laser ranging sensor is a Panasonic HG-C1400, whose detection range is 200-600 mm with an error below 1 mm. A bar steel section was selected as the welding target, and OpenCV 3.4.1 was used to perform basic morphological processing on the image of the steel bar to obtain the pixel coordinates of the final welding point. The detection distance of the laser sensor was set to 200 mm, 250 mm, 300 mm, 350 mm, 400 mm, 450 mm, 500 mm, 550 mm, and 600 mm. With these parameter settings the welding robot was positioned and welded, moving at 50% of its maximum speed. The experimental results are shown in the following table:

Detection   Error in X          Error in Y          Total error
distance    Laser   Hand-eye    Laser   Hand-eye    Laser   Hand-eye
200         2.3     3.1         2.4     2.9         3.3     4.2
250         2.3     2.9         2.3     1.8         3.2     3.4
300         1.4     3.0         1.5     2.3         2.1     3.7
350         0.7     2.5         0.7     2.2         1.0     3.3
400         0.4     2.4         0.4     1.8         0.5     3.0
450         0.4     2.6         0.5     1.6         0.6     3.0
500         0.5     2.2         0.7     1.9         0.9     2.9
550         0.8     1.8         0.7     2.0         1.1     2.7
600         1.6     1.2         1.6     2.1         2.3     2.4

Table 4 Experimental error analysis table (unit: mm)

From the table above, the positioning accuracy of the proposed method is higher than that of hand-eye calibration. Moreover, hand-eye calibration produces an error in the z direction, whereas the proposed method with its detection circuit achieves zero error in the z direction, greatly improving the positioning accuracy.

5. Conclusion

In order to overcome the lack of universality of traditional fixed-track welding and the complexity of hand-eye calibration, this paper proposes a welding robot positioning method based on machine vision and laser ranging to improve the flexibility and universality of welding positioning. Experiments show that guiding the positioning of the welding robot through this system can fully meet the welding requirements in accuracy. The system for guiding welding robot positioning described in this paper has reference significance for the motion control of welding robots.

REFERENCES

[1] Xuemei Liu, Chengrong Qiu, Qingfei Zeng, Aiping Li. Kinematics analysis and trajectory planning of collaborative welding robot with multiple manipulators[J]. Procedia CIRP, 2019, 81.
[2] Xueqin Lü, Chao Chen, Peisong Wang, Lingzheng Meng. Status evaluation of mobile welding robot driven by fuel cell hybrid power system based on cloud model[J]. Energy Conversion and Management, 2019, 198.
[3] Sara Sharifzadeh, Istvan Biro, Peter Kinnell. Robust hand-eye calibration of 2D laser sensors using a single-plane calibration artefact[J]. Robotics and Computer-Integrated Manufacturing, 2020, 61.
[4] Liu Jinbo, Wu Jinshui, Li Xin. Robust and accurate hand-eye calibration method based on Schur matric decomposition[J]. Sensors (Basel, Switzerland), 2019, 19(20).
[5] Sara Sharifzadeh, Istvan Biro, Peter Kinnell. Robust hand-eye calibration of 2D laser sensors using a single-plane calibration artefact[J]. Robotics and Computer-Integrated Manufacturing, 2020, 61.
[6] E. Villagrossi, C. Cenati, N. Pedrocchi. Flexible robot-based cast iron deburring cell for small batch production using single-point laser sensor[J]. The International Journal of Advanced Manufacturing Technology, 2017, Vol.92(1-4), pp.1425-1438.
[7] Brian G. Rodricks, Kartik Venkataraman. First principles' imaging performance evaluation of CCD- and CMOS-based digital camera systems[J]. IS&T/SPIE Electronic Imaging, 2005, 122.
[8] Kojima M, Wegener A, Hockwin O. Imaging characteristics of three cameras using the Scheimpflug principle[J]. Ophthalmic Research, 1990, 22 Suppl 1.

[9] Cognex Corporation. Patent issued for system and method for performing vision system planar hand-eye calibration from straight line features (USPTO 10,380,764)[J]. Computers, Networks & Communications, 2019.
[10] Hui Du, Xiaobo Chen, Juntong Xi. An improved background segmentation algorithm for fringe projection profilometry based on Otsu method[J]. Optics Communications, 2019, 453.
[11] Suheir M. ElBayoumi Harb, Nor Ashidi Mat Isa, Samy A. Salamah. Improved image magnification algorithm based on Otsu thresholding[J]. Computers and Electrical Engineering, 2015, 46.
[12] Jun Jiang, Liangcai Zeng. An accurate and flexible technique for camera calibration[J]. Computing, 2019, Vol.101(12), pp.1971-1988.
[13] Zhang Yuan, Qiu Zhicheng, Zhang Xianmin. Calibration method for hand-eye system with rotation and translation couplings[J]. Applied Optics, 2019, 58(20).
[14] Barbara Biasuzzi, Kevin Pressard. Design and characterization of a single photoelectron calibration system for the NectarCAM camera of the medium-sized telescopes of the Cherenkov Telescope Array[J]. Nuclear Inst. and Methods in Physics Research, A, 2020, 950.
[15] Yue Ji, Jun Wu. Calibration method of light-field camera for photogrammetry application[J]. Measurement, 2019, 148.
[16] Chen Yichao, Huang Fu-Yu, Shi Feng-Ming, Liu Bing-Qi, Yu Hao. Plane chessboard-based calibration method for a LWIR ultra-wide-angle camera[J]. Applied Optics, 2019, 58(4).
[17] Sels Seppe, Ribbens Bart, Vanlanduit Steve, Penne Rudi. Camera calibration using Gray code[J]. Sensors (Basel, Switzerland), 2019, 19(2).
[18] Cheng Luo, Yansong Zhang. Effect of printing orientation on anisotropic properties in resistance spot welded 316L stainless steels via selective laser melting[J]. Materials Letters, 2019, 254.
[19] Wenjian Zheng, Yanming He, Jianguo Yang, Zengliang Gao. Influence of crystallographic orientation of epitaxial solidification on the initial instability during the solidification of welding pool[J]. Journal of Manufacturing Processes, 2019, 38.
[20] Yoshiki Mikami, Keisuke Sogabe, Satoru Nishikawa, Takahiro Sakai, Masahito Mochizuki. Anisotropic mechanical behavior of Ni-base alloy weld metal considering crystal orientation distribution[J]. Procedia Engineering, 2011, 10.
