
The International Journal of Advanced Manufacturing Technology

https://doi.org/10.1007/s00170-021-07416-5

ORIGINAL ARTICLE

A simple method for dimensional measurement of ring-shaped objects using image processing technique

Anh-Tuan Dang¹ · Quang-Cherng Hsu¹ · Tat-Tai Truong²

¹ Department of Mechanical Engineering, National Kaohsiung University of Science and Technology, 415 Chien Kung Road, Sanmin District, Kaohsiung 80778, Taiwan, Republic of China
² Department of Mechanical Engineering, Hung Yen University of Technology and Education, Dan Tien, Khoai Chau, Hung Yen 17817, Vietnam

* Corresponding author: Quang-Cherng Hsu, hsuqc@nkust.edu.tw
  Tat-Tai Truong: cadcamtai@gmail.com

Received: 5 December 2020 / Accepted: 3 June 2021

© The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2021

Abstract
Automated dimensional inspection is typically expensive because it requires high-precision measurement devices. To perform a precision measurement, the technician must be highly skilled and fully understand the operation of the equipment. This study proposes a method for reconstructing the two-dimensional profiles of ring-shaped objects using image processing. First, an industrial camera captures partial images of the object. Then, through several image processing procedures such as binarization, line detection, and contour recognition, the profiles in the images are detected and grouped. Next, a calibration model is introduced to calibrate and combine the contours from the partial images. This process results in a point cloud consisting of every point from the outer and inner contours of the object, which can be used directly for automatic measurement. To verify the proposed method, the data were compared with those acquired from the ATOS measurement system, revealing a favourable correlation.

Keywords: Image processing · Camera calibration · Ring-shaped object · 2D contour reconstruction

1 Introduction

With the development of modern high-precision manufacturing, applying new technologies to inspection and automatic measurement has become a trend attracting considerable scholarly attention [1, 2]. Inspection systems integrating machine vision and image processing are increasingly employed to effectively reduce human error and inspection time. With advances in high-resolution cameras and machine vision technologies, the quality of inspection operations has improved, enabling manufacturers to construct low-cost systems while maintaining adequate accuracy.

A number of image processing methods have been developed for the inspection of different product types. Lu et al. [3] developed a device that measures the deformation of liquid crystal display smartphone screens through the use of coaxial illumination and a semi-reflecting mirror. Chauhan and Surgenor [4] incorporated machine vision into an assembly machine for fault detection and classification. Molleda et al. [5] presented a profile measurement system in which laser range finders were used to reconstruct the two-dimensional (2D) sections of rails in rolling mills. Another study by Molleda [6] proposed methods to substantially improve the accuracy of the measurement system by including an automatic warning system for the detection of below-threshold inspection accuracy. Millara et al. [7] later suggested a new calibration method that can be applied to rail rolling systems in industrial settings with measurement errors below 0.12%.

The application of machine vision in non-contact measurement systems can reduce operation time and enhance accuracy. Derganc et al. [8] proposed a machine vision system for the inspection of bearings in electromechanical metres. By adopting the Hough transform and linear regression techniques, the system can inspect the eccentricity and the length of the needle extending out of the bearing. Xiang et al. [9] measured the dimensions of the bayonet of automobile brake pads by using a two-camera machine vision system.

Using the regions of interest in the captured images and a unique calibration algorithm, the edges of the component are detected; subsequently, the bayonet size is calculated. Gadelmawla [10] presented a computer vision system for the measurement of pinion and spur gear parameters. Through image processing algorithms such as image binarization, edge detection, edge thinning, and labelling, the outer diameter, root diameter, and the number of teeth of the gears can be calculated. Moru and Borro [11] developed a measurement system that uses a telecentric lens to determine the parameters of spur gears with diameters under 70 mm. By assessing the global uncertainty involved in the measurement process, the system effectively reduced the measurement error of the gear diameter to ±0.02 mm. In the study by Ye and Hsu [12], a machine vision device was introduced for the inspection of defects in the assembly of unidirectional bearings in automobiles. Both online and offline inspection methods were developed to count the number of needles in the bearings as well as to detect errors in part assembly. Hsu et al. [13] proposed a method for classifying a random mixture of metal objects, such as bolts, nuts, and washers, in a container. The application of machine vision under various lighting environments and a self-developed calibration method enables the system to automatically control a gripper moving parts to the desired positions.

In general, the accuracy of an inspection system is affected by the properties of the camera. For example, the higher the resolution of the camera, the greater the pixel density of the image, which improves the measurement accuracy. Peng et al. [14] constructed a computer vision algorithm for inspecting the diameter and thickness of metal O-rings, a common type of seal. Using a camera with a resolution of 4 μm per pixel and a given tolerance of ±0.1 mm for the 3-mm-thick O-rings, the system yielded a measurement accuracy of approximately 100%. Rejc et al. [15] developed an automated visual inspection method to measure the dimensions of a mechanical assembly known as the protector. Using a camera with a 4 μm pixel resolution, the system can measure the edges of the protector with an error within ±0.02 mm. Wang et al. [16] presented a 16-camera system for measuring the radii of three-dimensional (3D) multi-bend tubes. With a camera pixel resolution of approximately 4.65 μm, the system can achieve an accuracy within ±0.5 mm. Gadelmawla [17] introduced a vision system for the automatic measurement of various types of screw threads with a camera and a backlighting table. The results show that by using a charge-coupled device (CCD) camera with a 5 μm pixel resolution, the difference between the standard and measured values was ±5.4 μm, which provides good accuracy for the measurement. Ngo et al. [18] presented a method using a two-camera system to measure the diameters of holes drilled in various planes. Using a camera with a 0.19 mm pixel resolution, the system can obtain an accuracy of ±0.5 mm. Tan et al. [19] constructed an online measurement model for determining shaft diameters on a lathe using a laser linear light source. The maximum average measurement error was 0.019 mm when the shaft diameter was measured at a rotational speed of 1250 rpm.

The above literature review demonstrates an extensive effort in the construction of non-contact measurement systems through machine vision and image processing. These studies focused mainly on determining dimensions directly (e.g. diameter, length, and thickness) or calculating other dimensions based on the determined parameters (e.g. using the pitch diameter and the number of teeth to calculate the remaining characteristics of a gear, or examining the features of a thread on the basis of crest and root information). A few methods for measuring large objects have been proposed; however, the error rates remain quite high. The main reason is that the whole picture of the object is captured in a single, low-resolution image. Measuring large objects, especially large ring-shaped objects, is still a challenge for low-cost systems.

In the present study, a profile reconstruction system for ring-shaped objects was developed and experimental measurement was conducted. Comparison with the measurement results from an ATOS scanner revealed that despite its low cost and simplicity, the system could accurately measure the complex profile of a 192-mm-diameter ring-shaped object with an error under 0.063 mm. The three main contributions of this study are as follows: First, the paper introduces a method for reconstructing the 2D profiles of large ring-shaped objects using a simple single-camera system. Second, by capturing the inspection object in multiple partial images, the proposed system enhances the accuracy of the measurement results by increasing the resolution of the image up to 3.6 times. Third, the paper proposes an efficient and accurate calibration model to combine contours from discrete images of the object into the same world coordinate system (WCS) with small errors.

2 The inspection object and system setup

2.1 The inspection object

The inspection object in this study was the outer ring of a unidirectional bearing. The minimum and maximum diameters of the bearing are 165 and 192 mm, respectively. The outer and inner profiles are broached into complex shapes consisting of 16 teeth differing in width and angular orientation (Fig. 1). The geometrical tolerances for the linear and angular dimensions between the small teeth are ±0.25 mm and ±0.5°, respectively, and the tolerance of the maximum diameter ranges from +0.4 to +0.8 mm.

Fig. 1 The shape of the product with basic dimensions (unit: mm)

2.2 System setup

Because of its complex profiles, measuring the dimensions of the ring-shaped object with conventional equipment such as callipers or micrometres has limitations, and the use of modern equipment such as coordinate measuring machines or profile projectors requires highly skilled operators. A more favourable option involves constructing a virtual model of the object and conducting the measurement process in software. Because the object has a uniform height, constructing a 2D profile of the ring-shaped object instead of a complex 3D model is an approach that is not only simpler but also reduces the inspection time.

To estimate the difficulty of measuring the ring-shaped object, an image acquisition method was first examined. With the camera set perpendicular to the top surface of the object, images of the entire object were captured as shown in Fig. 2. This layout has several shortcomings. First, the image resolution cannot ensure sufficient tolerance: for a camera image size of 2560 × 1920 pixels, the pixel resolution over the 192-mm-diameter object would be 0.1 mm, meaning that a difference of 3 to 5 pixels during the image processing will exceed the measurement tolerance (0.5 mm, see Fig. 1). Second, the appearance of the inner surface in the captured images causes errors in the reconstruction of the inner contours. If the system uses a front light with a dark background, high-intensity light sources should be employed to take advantage of the reflection of the metal surface for contour recognition; however, the reflection causes blur at the border of the surface and influences the overall accuracy (Fig. 2b). If a light background is used, the intensity of the front light can be adjusted to a moderate level to prevent blurring, and the dark image of the object is then easy to distinguish from the light background (Fig. 2c). However, owing to the angle of view of an entocentric lens, the inner profile B is hard to observe, while profile A (located in a different plane from the outer profile) becomes more visible. This feeds incorrect information to the profile recognition process and affects the measurement results. The same situation occurs when the system uses backlighting; in addition, in this case, the image of the object is completely black (only the outer profile and profile A are visible).

To overcome the above-mentioned problems, the system could employ high-resolution cameras (to obtain a smaller effective pixel size) and telecentric lenses (to suppress image distortion). However, these changes would increase the system cost and are only suitable for measuring objects that are not very large, because the diameter of the telecentric lens cannot be smaller than the diameter of the inspection sample.

This paper presents a new method to reconstruct the 2D profiles of large ring-shaped objects.

Fig. 2 Capturing the object using a single-camera system. a Schematic of the system; captured image in the case of b dark background under high-intensity light and c light background under moderate-intensity light

Fig. 3 demonstrates the structure of the proposed system, which consists of an industrial camera, a backlighting source, and a rotary table. Instead of capturing the entire object in a single image (as in Fig. 2), the camera is positioned to one side to capture a small region of it (Fig. 3b). The object is set on the rotary table, which rotates 30° after each image is taken. This shooting method allows all of the ring-shaped contours to be captured while reducing the appearance of the inner surfaces in the images, thus preventing the inner-profile capture problem. The captured region of the new method is smaller than that in Fig. 2, meaning that the partial images have a higher resolution and hence the measurement accuracy is enhanced. For accurate edge detection and to reduce the influence of blurring, a backlighting source is chosen.

Fig. 3 Illustration of the proposed system: a the schematic diagram of the system, b the 3D model of the system
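The acquisition sequence itself is simple to express in code. The following C# sketch (C# being the implementation language stated later in this section) shows the 12-shot loop; `ICamera` and `IRotaryTable` are hypothetical wrappers standing in for the IDS uEye camera driver and the rotary-table controller, which the paper does not detail.

```csharp
using System.Collections.Generic;
using Emgu.CV;

// Hypothetical device wrappers; the real drivers (IDS uEye camera,
// rotary-table controller) are not described in the paper.
public interface ICamera { Mat Capture(); }
public interface IRotaryTable { void Rotate(double degrees); }

public static class Acquisition
{
    // One shot, then a 30° rotation, repeated 12 times to cover 360°.
    public static List<Mat> CapturePartialImages(ICamera camera, IRotaryTable table)
    {
        var partialImages = new List<Mat>();
        for (int i = 0; i < 12; i++)
        {
            partialImages.Add(camera.Capture()); // one 2560 x 1920 frame
            table.Rotate(30.0);                  // advance to the next region
        }
        return partialImages;
    }
}
```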

A uniform backlight light-emitting diode (LED) panel is placed below the camera's field of view, facilitating image segmentation and avoiding reflection from the top surface.

A calibration plate is used to transform the image coordinates into world coordinates as well as to combine the contours from separate images. The plate, made from an acrylic sheet with a thickness of 0.1 mm and a diameter of 250 mm, is fixed on top of the object during the measurement. The patterns on the plate, i.e. the calibration points and separation lines, were manufactured by JuanShan Inc. (Taipei, Taiwan) using an Orbotech LP7008 Photoplotter. For the calibration features, the machine uses laser etching techniques to achieve a maximum accuracy of 0.1 μm (corresponding to a resolution of 25,400 dpi). Fig. 4a presents the top view of the calibration plate, which consists of 12 separation lines and 276 points distributed across three concentric circles. Fig. 4b shows the placement of the object on the calibration plate. With an image size of 2560 × 1920 pixels and a captured field of view of 72 × 54 mm, the resolution of the captured images is approximately 0.028 mm/pixel.

The programming language Visual Studio C# was used for image processing based on the Emgu CV library. The camera was connected to the computer using the uEye Cockpit in the IDS Software Suite, and the program outputs were the point cloud data of the contours.

The initial settings for the measurement process were selected to optimize image quality. First, the focal length of the lens was adjusted to provide evenly sharp images. Second, the lens aperture and LED lighting were finely tuned to minimize blurring at the object edges. Finally, images were taken while ensuring that two separation lines constraining a region appeared in the field of view. Fig. 5 depicts the system setup and a sequence of 12 images corresponding to partial regions of the object.

2.3 Image processing procedure

The proposed image processing system involves a three-part procedure, described in Fig. 6. First, image masking is performed to remove the regions adjacent to the separation lines. Second, the calibration points in the masked image are detected to calculate the coefficients of the calibration model. Third, the inner and outer contours in the partial images are extracted and calibrated to the same 2D plane. The procedure is described in detail in the following sections.

In stage 1.1, the partial images are converted to binary images to reduce unwanted noise and improve the visibility of the desired information in the image. Two common methods, simple thresholding and Otsu's thresholding, are used at this stage. As shown in Fig. 7b, the separation lines become visible through simple thresholding; however, this method also makes the noise in the image observable, as well as the image of the inner surfaces. Otsu's thresholding, on the other hand, can completely remove the noise and the inner-surface images, but because the threshold is selected automatically, the separation lines are also obscured (Fig. 7c). Therefore, simple thresholding is applied first to enhance the visibility of the separation lines, and Otsu's thresholding is applied after the separation lines are detected to reduce the effect of noise in the images.
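A minimal sketch of this two-pass binarization, assuming the Emgu CV 4 API (the paper names the library but not the calls); the fixed threshold value of 128 is an assumption for illustration, as the paper does not report the value used:

```csharp
using Emgu.CV;
using Emgu.CV.CvEnum;

public static class Binarization
{
    // Pass 1: a fixed (simple) threshold keeps the separation lines visible.
    public static Mat SimpleThreshold(Mat gray, double thresh = 128) // 128 is assumed
    {
        var bin = new Mat();
        CvInvoke.Threshold(gray, bin, thresh, 255, ThresholdType.Binary);
        return bin;
    }

    // Pass 2 (after the lines are found): Otsu picks the threshold
    // automatically from the histogram, suppressing noise and the
    // image of the inner surfaces.
    public static Mat OtsuThreshold(Mat gray)
    {
        var bin = new Mat();
        CvInvoke.Threshold(gray, bin, 0, 255, ThresholdType.Binary | ThresholdType.Otsu);
        return bin;
    }
}
```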
In stage 1.2, the separation lines, which define the removable regions in the partial images, are identified (Fig. 8). First, the separation lines are detected using the Hough line transform function, through which the coordinates of the endpoints of these separation lines (P1, P2, P3, and P4 in Fig. 8a) are returned. Subsequently, the coordinates of C, the intersection point of these lines, are calculated, and the intersection points between these lines and the image borders (M, N, P, and Q) are determined (see Fig. 8b).
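As a sketch, again assuming the Emgu CV 4 API, the endpoint detection can be done with the probabilistic Hough transform; the numeric parameters (vote threshold, minimum length, maximum gap) are illustrative assumptions, not values from the paper:

```csharp
using System;
using Emgu.CV;
using Emgu.CV.Structure;

public static class LineDetection
{
    // Returns line segments whose endpoints correspond to
    // P1...P4 of Fig. 8a (e.g. lines[0].P1 and lines[0].P2).
    public static LineSegment2D[] DetectSeparationLines(Mat binaryImage)
    {
        return CvInvoke.HoughLinesP(
            binaryImage,
            rho: 1,                    // distance resolution: 1 pixel
            theta: Math.PI / 180.0,    // angle resolution: 1 degree
            threshold: 100,            // minimum votes per line (assumed)
            minLineLength: 400,        // separation lines are long (assumed)
            maxLineGap: 20);           // bridge small breaks (assumed)
    }
}
```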
Because point C is on line P1P2, the slopes of the two lines CP1 and CP2 are equal and can be determined using the formula:

$$\tan\varphi_1 = \frac{C_x - P_{1x}}{C_y - P_{1y}} = \frac{P_{2x} - P_{1x}}{P_{2y} - P_{1y}} \qquad (1)$$

Fig. 4 Dimensions and layout of the calibration plate. a Calibration plate with detailed dimensions (unit: mm), b placement of the object on the calibration plate

Fig. 5 The system structure and the captured images. a System setup; b a sequence of 12 captured images, and c user interface (UI) of the system

Multiplying both sides of Eq. (1) by the product of $(C_y - P_{1y})$ and $(P_{2y} - P_{1y})$ gives

$$P_{2y}C_x - P_{1y}C_x - P_{1x}P_{2y} + P_{1x}P_{1y} = P_{2x}C_y - P_{1x}C_y - P_{1y}P_{2x} + P_{1x}P_{1y} \qquad (2)$$

Moving the products that contain $C_x$ and $C_y$ to one side of the equation:

$$C_x\left(P_{2y} - P_{1y}\right) + C_y\left(P_{1x} - P_{2x}\right) = P_{1x}P_{2y} - P_{1y}P_{2x} \qquad (3)$$

Applying the same procedure to point C on line P3P4:

$$\tan\varphi_2 = \frac{C_x - P_{3x}}{C_y - P_{3y}} = \frac{P_{4x} - P_{3x}}{P_{4y} - P_{3y}} \qquad (4)$$

$$C_x\left(P_{4y} - P_{3y}\right) + C_y\left(P_{3x} - P_{4x}\right) = P_{3x}P_{4y} - P_{3y}P_{4x} \qquad (5)$$

Combining Eqs. (3) and (5) forms the system

$$\begin{cases} C_x\left(P_{2y} - P_{1y}\right) + C_y\left(P_{1x} - P_{2x}\right) = P_{1x}P_{2y} - P_{1y}P_{2x} \\ C_x\left(P_{4y} - P_{3y}\right) + C_y\left(P_{3x} - P_{4x}\right) = P_{3x}P_{4y} - P_{3y}P_{4x} \end{cases} \qquad (6)$$

Representing system (6) in matrix form:

$$\begin{bmatrix} P_{2y} - P_{1y} & P_{1x} - P_{2x} \\ P_{4y} - P_{3y} & P_{3x} - P_{4x} \end{bmatrix} \begin{bmatrix} C_x \\ C_y \end{bmatrix} = \begin{bmatrix} P_{1x}P_{2y} - P_{1y}P_{2x} \\ P_{3x}P_{4y} - P_{3y}P_{4x} \end{bmatrix} \qquad (7)$$

The coordinates of point C are the solution to the equation:

$$\begin{bmatrix} C_x \\ C_y \end{bmatrix} = \begin{bmatrix} P_{2y} - P_{1y} & P_{1x} - P_{2x} \\ P_{4y} - P_{3y} & P_{3x} - P_{4x} \end{bmatrix}^{-1} \begin{bmatrix} P_{1x}P_{2y} - P_{1y}P_{2x} \\ P_{3x}P_{4y} - P_{3y}P_{4x} \end{bmatrix} \qquad (8)$$

The coordinates of the points M, N, P, and Q are determined by applying the following equations:

$$\frac{C_x - P_x}{C_y - P_y} = \frac{C_x - Q_x}{C_y - Q_y} = \frac{P_{4x} - P_{3x}}{P_{4y} - P_{3y}} \qquad (9)$$

$$\frac{C_x - M_x}{C_y - M_y} = \frac{C_x - N_x}{C_y - N_y} = \frac{P_{2x} - P_{1x}}{P_{2y} - P_{1y}} \qquad (10)$$

With the points on the borders of the image, $N_y = Q_y = 1920$ and $M_y = P_y = 0$, their x-coordinates can be determined using the following equations:

$$M_x = C_x - \left(C_y - M_y\right)\frac{P_{2x} - P_{1x}}{P_{2y} - P_{1y}} \qquad (11)$$

$$N_x = C_x - \left(C_y - N_y\right)\frac{P_{2x} - P_{1x}}{P_{2y} - P_{1y}} \qquad (12)$$
Int J Adv Manuf Technol

$$P_x = C_x - \left(C_y - P_y\right)\frac{P_{4x} - P_{3x}}{P_{4y} - P_{3y}} \qquad (13)$$

$$Q_x = C_x - \left(C_y - Q_y\right)\frac{P_{4x} - P_{3x}}{P_{4y} - P_{3y}} \qquad (14)$$
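Eqs. (7)-(14) translate directly into a few lines of plain C#; the following sketch solves the 2 × 2 system for C and slides along each line to a horizontal image border. With the endpoints shown in Fig. 8a, P1(807, 1437), P2(899, 1914), P3(1807, 1503), and P4(1672, 1919), `Intersection` returns C ≈ (1187.8, 3411.2), matching Fig. 8b.

```csharp
using System.Drawing;

public static class SeparationGeometry
{
    // Eqs. (7)-(8): solve A * [Cx, Cy]^T = b for the intersection C.
    public static PointF Intersection(PointF p1, PointF p2, PointF p3, PointF p4)
    {
        double a11 = p2.Y - p1.Y, a12 = p1.X - p2.X;
        double a21 = p4.Y - p3.Y, a22 = p3.X - p4.X;
        double b1 = p1.X * p2.Y - p1.Y * p2.X;
        double b2 = p3.X * p4.Y - p3.Y * p4.X;
        double det = a11 * a22 - a12 * a21;   // non-zero unless the lines are parallel
        return new PointF(
            (float)((b1 * a22 - a12 * b2) / det),
            (float)((a11 * b2 - b1 * a21) / det));
    }

    // Eqs. (11)-(14): x-coordinate where the line through C with the
    // slope of segment (pa, pb) meets the horizontal border y = borderY
    // (borderY = 0 for the top border, 1920 for the bottom border).
    public static float BorderX(PointF c, PointF pa, PointF pb, float borderY)
    {
        return c.X - (c.Y - borderY) * (pb.X - pa.X) / (pb.Y - pa.Y);
    }
}
```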
Because the separation lines in the partial images can be detected automatically using the Hough line transform function, an over-rotation of the table by a few degrees does not negatively affect the results of this stage. Even if the table is manually rotated to an angle between 25° and 35°, as long as the initial setting is followed, the images of the separation lines remain detectable.

Fig. 6 Flowchart of the image processing procedure

In stage 1.3, the side regions constrained by the two lines MN and PQ are masked. Because of the properties of the lens, the features near the image border tend to be radially distorted, and the unstable number of calibration points appearing in these regions also makes it difficult to proceed with the calibration stage. Thus, to enhance the calibration accuracy, these regions must be removed. The small areas near the separation lines, namely the matching regions (Fig. 9a), are retained to allow for the inspection of overlapping during the combination of contours from adjacent images. The masked image obtained in this stage encompasses 23 calibration points and a part of the object, as shown in Fig. 9b.

In stage 2.1, to prepare for the calibration, the points in the masked image are detected and labelled. Since these points are oriented in concentric arcs with different angular distributions, assigning them their exact world coordinates must be done with a specific scheme. First, using the FindContours function from Emgu CV, the image centres of the blobs (i.e. the 23 calibration points and a part of the object) are detected. Subsequently, these blobs are grouped based on the distance between their centres and point C. By arranging the centres in ascending order according to their x-coordinates, the calibration points are numerically labelled (Fig. 10).

Fig. 7 Binarization of a partial image. a Original image, b image binarized through simple thresholding, and c image binarized through Otsu thresholding

Fig. 8 Two steps of the line detection stage (unit: pixel). a Detection of the endpoints of the separation lines; b calculation of the coordinates of the intersection points. (Detected endpoints in this example: P1(807, 1437), P2(899, 1914), P3(1807, 1503), P4(1672, 1919); computed intersection C(1187.76, 3411.17).)

The geometries of the blobs after the sorting scheme are listed in Table 1. Let Di be the distance between a blob's centre (denoted by xi and yi) and point C (whose coordinates were calculated in stage 1.2), as given by the following equation:

$$D_i = \sqrt{\left(x_i - x_C\right)^2 + \left(y_i - y_C\right)^2} \qquad (15)$$

The column 'Range of D' in Table 1 is the range of distances Di that contains the points of the same circle. The data in Table 1 indicate that the stage was completed through this simple sorting scheme. Given the extensive gap between the areas of the calibration points and the ring-shaped blob (2700 vs. 440,000 pixels), differentiating them is easy. Furthermore, given that the difference between the concentric arcs varies from 300 to 500 pixels and the distance between points on the same arc ranges between 100 and 200 pixels, mislabelling will not occur.
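A sketch of this detection-and-grouping scheme, assuming the Emgu CV 4 API; the area cut-off and the three distance bands are taken from the 'Range of D' column of Table 1:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Emgu.CV;
using Emgu.CV.CvEnum;
using Emgu.CV.Util;

public static class PointLabelling
{
    // Returns the three arcs of calibration points, each sorted by
    // ascending x (the numerical labelling order of Fig. 10).
    public static List<List<(double X, double Y)>> GroupCalibrationPoints(
        Mat binaryImage, double cX, double cY)
    {
        var centres = new List<(double X, double Y)>();
        using (var contours = new VectorOfVectorOfPoint())
        {
            CvInvoke.FindContours(binaryImage, contours, null,
                RetrType.External, ChainApproxMethod.ChainApproxSimple);
            for (int i = 0; i < contours.Size; i++)
            {
                var m = CvInvoke.Moments(contours[i]);
                // Calibration points are ~2,700 px²; the ring blob is ~440,000 px².
                if (m.M00 < 100 || m.M00 > 10000) continue;
                centres.Add((m.M10 / m.M00, m.M01 / m.M00)); // blob centre
            }
        }
        // Distance bands of the three concentric arcs (Table 1, Eq. (15)).
        var bands = new (double Lo, double Hi)[] { (1700, 2000), (2000, 2300), (2700, 3200) };
        return bands.Select(band =>
            centres.Where(p =>
            {
                double d = Math.Sqrt((p.X - cX) * (p.X - cX) + (p.Y - cY) * (p.Y - cY));
                return d >= band.Lo && d < band.Hi;
            })
            .OrderBy(p => p.X)
            .ToList()).ToList();
    }
}
```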

Fig. 9 Masking stage. a Covering the unnecessary regions of the image through masking; b the image after the masking procedure

Fig. 10 Detecting and labelling the calibration points. a Detecting the centres of the blobs, b numerically labelling the calibration points


Stage 2.2 is an essential step in the contour reconstruction procedure with regard to obtaining the world coordinates from the image coordinates.

The literature on image calibration focuses on estimating the relationships between actual coordinates and image coordinates. For example, the direct linear transformation method presented by Abdel-Aziz and Karara [20] approximates unknown parameters, and the calibration model developed by Deng et al. [21] combines the differential evolution and particle swarm optimization algorithms in the calibration of camera parameters. Because an entocentric lens is used in the system, the captured images are imperfect and exhibit some aberrations. Numerous methods have been suggested to approximate these distortions and enhance the calibration process, for example, the calibration technique based on images of a calibration plate captured from various positions [22], or the self-calibration method using a single image combined with geometric constraints, including vanishing points and ellipses [23].

The present study employs the quadratic function proposed by Hsu et al. [13] with the use of 12 coefficients as follows:

$$\begin{aligned} X &= a_1 x^2 + b_1 xy + c_1 y^2 + e_1 x + f_1 y + g_1 \\ Y &= a_2 x^2 + b_2 xy + c_2 y^2 + e_2 x + f_2 y + g_2 \end{aligned} \qquad (16)$$

where (x, y) are the coordinates of the points in the image, (X, Y) are the corresponding coordinates in the WCS, and a1, b1, c1, e1, f1, g1, a2, b2, c2, e2, f2, and g2 are the transformation coefficients of the equations. To obtain these transformation coefficients for each partial image, a regression analysis minimizing the mean squared error over the coordinates of the 23 calibration points detected in stage 2.1 was performed. The results are presented as the following matrix equations:

Table 1 Grouping and ordering the calibration points

Blob number   x (pixel)   y (pixel)   Distance to point C, Di (pixel)   Range of D (pixel)   Area (pixel)
1 897 1554 1879.85 1700~2000 2676.5


2 1099 1531 1881.84 2648.5
3 1302 1532 1882.18 2615.0
4 1504 1558 1880.18 2662.5
5 1703 1606 1876.83 2707.5
6 853 1281 2156.42 2000~2300 2629.5
7 1008 1261 2157.61 2580.0
8 1164 1253 2158.27 2562.0
9 1319 1257 2158.04 2558.5
10 1474 1273 2157.14 2572.5
11 1628 1301 2155.65 2612.0
12 1780 1341 2153.52 2651.0
Object’s image 1325 901 2513.65 2300~2700 440,873.5
13 713 439 3010.16 2700~3200 2715.0
14 845 421 3009.97 2702.5
15 977 409 3009.89 2679.0
16 1109 402 3009.77 2678.0
17 1241 402 3009.79 2671.0
18 1373 407 3009.81 2638.0
19 1505 418 3009.88 2659.0
20 1637 435 3009.99 2662.5
21 1768 457 3010.36 2694.5
22 1899 486 3010.46 2717.5
23 2028 520 3010.62 2736.0

$$\sum_{i=1}^{23}
\begin{bmatrix}
x_i^4 & x_i^3 y_i & x_i^2 y_i^2 & x_i^3 & x_i^2 y_i & x_i^2 \\
x_i^3 y_i & x_i^2 y_i^2 & x_i y_i^3 & x_i^2 y_i & x_i y_i^2 & x_i y_i \\
x_i^2 y_i^2 & x_i y_i^3 & y_i^4 & x_i y_i^2 & y_i^3 & y_i^2 \\
x_i^3 & x_i^2 y_i & x_i y_i^2 & x_i^2 & x_i y_i & x_i \\
x_i^2 y_i & x_i y_i^2 & y_i^3 & x_i y_i & y_i^2 & y_i \\
x_i^2 & x_i y_i & y_i^2 & x_i & y_i & 1
\end{bmatrix}
\begin{Bmatrix} a_1 \\ b_1 \\ c_1 \\ e_1 \\ f_1 \\ g_1 \end{Bmatrix}
= \sum_{i=1}^{23}
\begin{bmatrix} x_i^2 X_i \\ x_i y_i X_i \\ y_i^2 X_i \\ x_i X_i \\ y_i X_i \\ X_i \end{bmatrix} \qquad (17)$$

$$\sum_{i=1}^{23}
\begin{bmatrix}
x_i^4 & x_i^3 y_i & x_i^2 y_i^2 & x_i^3 & x_i^2 y_i & x_i^2 \\
x_i^3 y_i & x_i^2 y_i^2 & x_i y_i^3 & x_i^2 y_i & x_i y_i^2 & x_i y_i \\
x_i^2 y_i^2 & x_i y_i^3 & y_i^4 & x_i y_i^2 & y_i^3 & y_i^2 \\
x_i^3 & x_i^2 y_i & x_i y_i^2 & x_i^2 & x_i y_i & x_i \\
x_i^2 y_i & x_i y_i^2 & y_i^3 & x_i y_i & y_i^2 & y_i \\
x_i^2 & x_i y_i & y_i^2 & x_i & y_i & 1
\end{bmatrix}
\begin{Bmatrix} a_2 \\ b_2 \\ c_2 \\ e_2 \\ f_2 \\ g_2 \end{Bmatrix}
= \sum_{i=1}^{23}
\begin{bmatrix} x_i^2 Y_i \\ x_i y_i Y_i \\ y_i^2 Y_i \\ x_i Y_i \\ y_i Y_i \\ Y_i \end{bmatrix} \qquad (18)$$

where xi and yi are the image coordinates of the 23 calibration points, and Xi and Yi are their corresponding centres in the WCS.
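The regression reduces to accumulating the symmetric 6 × 6 normal matrix of Eq. (17) (or (18)) over the 23 points and solving a linear system. A plain C# sketch follows (the authors' own solver is not described; Gaussian elimination is one straightforward choice). It is called once with the world X-values to obtain a1..g1 and once with the Y-values to obtain a2..g2:

```csharp
using System;

public static class QuadraticCalibration
{
    // Monomial basis of Eq. (16): [x², xy, y², x, y, 1]
    private static double[] Basis(double x, double y) =>
        new[] { x * x, x * y, y * y, x, y, 1.0 };

    public static double[] Solve(double[] xs, double[] ys, double[] worlds)
    {
        var a = new double[6, 7];               // augmented matrix [A | b]
        for (int k = 0; k < xs.Length; k++)     // accumulate over the 23 points
        {
            double[] phi = Basis(xs[k], ys[k]);
            for (int r = 0; r < 6; r++)
            {
                for (int c = 0; c < 6; c++) a[r, c] += phi[r] * phi[c];
                a[r, 6] += phi[r] * worlds[k];  // right-hand side of Eq. (17)/(18)
            }
        }
        // Gaussian elimination with partial pivoting.
        for (int col = 0; col < 6; col++)
        {
            int piv = col;
            for (int r = col + 1; r < 6; r++)
                if (Math.Abs(a[r, col]) > Math.Abs(a[piv, col])) piv = r;
            for (int c = col; c < 7; c++)
                (a[col, c], a[piv, c]) = (a[piv, c], a[col, c]);
            for (int r = col + 1; r < 6; r++)
            {
                double f = a[r, col] / a[col, col];
                for (int c = col; c < 7; c++) a[r, c] -= f * a[col, c];
            }
        }
        var coef = new double[6];               // back substitution
        for (int r = 5; r >= 0; r--)
        {
            double s = a[r, 6];
            for (int c = r + 1; c < 6; c++) s -= a[r, c] * coef[c];
            coef[r] = s / a[r, r];
        }
        return coef;                            // [a, b, c, e, f, g]
    }
}
```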
Stage 3.1 locates the contour pixels of the object and separates them into the corresponding inner or outer profile. As demonstrated in stage 2.1, the FindContours function not only detects the centre E of the ring-shaped blob (which accounts for the largest area in Table 1) but also determines the coordinates of the pixels on the blob contours. By calculating the distance between each pixel and point C and comparing it to the distance between points C and E, these pixels are grouped and differentiated by colour as shown in Fig. 11b.
differentiated by colour as shown in Fig. 11b. μ¼ ð22Þ
In stage 3.2, the pixel contours obtained from stage 3.1 are m
calibrated into a 2D coordinate system. Through the applica- As shown in Fig. 13, the deviations ranged from 0 to 10
tion of the calibration equation in Eq. (16) with the 12 coeffi- μm, with the maximum error Lmax of 9.476 μm. Using the
cients obtained from stage 2.2, the coordinates of these points calculated Li from Eq. (19), the calibration errors, including
are projected and stored as point cloud data in a 2D plane. As the standard and average deviations (σ = 2.032 and μ = 4.269
shown in Fig. 11c, the image coordinates of the calibration μm, respectively), were obtained.
point centres are also projected into this plane. According to the empirical rule of the normal distribution
By using the same procedure for the remaining partial im- static, there is 68% of the calibration error will range from
ages, the inner and outer contours of the ring-shaped object are 2.236 to 6.311 μm (1 standard deviation), 95% of the error
calibrated into a single 2D point cloud. As displayed in Fig. will range from 0.204 to 8.333 μm (2 standard deviations),
12, the resulted point cloud is constructed from 12 partial and 99.7% of the error will range from 0 to 10.364 μm (3
images, with each image contributes a different set of calibra- standard deviations). Hence, the proposed calibration model
tion coefficients. It means that 12 sets of calibration coeffi- can yield good results since the error does not exceed 10 μm,
cients are used. in which 95% of the errors do not exceed 8 μm.
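Eqs. (19)-(21) reduce to a few lines of code; a sketch, with the paper's reported values noted as a consistency check:

```csharp
using System;
using System.Linq;

public static class CalibrationError
{
    // Per-point deviation L_i (Eq. (19)), mean μ (Eq. (21)) and sample
    // standard deviation σ (Eq. (20)) over the m = 276 calibration points.
    public static (double Mu, double Sigma, double Max) Evaluate(
        double[] X, double[] Y, double[] Xstar, double[] Ystar)
    {
        double[] L = X.Select((_, i) =>
            Math.Sqrt((X[i] - Xstar[i]) * (X[i] - Xstar[i]) +
                      (Y[i] - Ystar[i]) * (Y[i] - Ystar[i]))).ToArray();
        double mu = L.Average();
        double sigma = Math.Sqrt(L.Sum(l => (l - mu) * (l - mu)) / (L.Length - 1));
        return (mu, sigma, L.Max());
    }
}
// With the data of Fig. 13, this evaluates to μ ≈ 4.269 μm,
// σ ≈ 2.032 μm and Lmax ≈ 9.476 μm.
```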

Table 2 Dimensions obtained from the repeatability test

Dimension   Measurement 1 (d1)   Measurement 2 (d2)   Measurement 3 (d3)   Range Re   Average M
A1 (°) 55.399 55.426 55.420 0.027 55.415


A2 (°) 13.841 13.807 13.845 0.038 13.831
A3 (°) 13.844 13.866 13.826 0.040 13.845
A4 (°) 41.484 41.520 41.508 0.036 41.504
A5 (°) 41.513 41.530 41.523 0.017 41.522
A6 (°) 13.829 13.823 13.821 0.008 13.824
A7 (°) 13.879 13.862 13.868 0.017 13.870
A8 (°) 13.829 13.838 13.835 0.009 13.834
A9 (°) 13.842 13.824 13.832 0.018 13.833
A10 (°) 13.870 13.862 13.836 0.034 13.856
A11 (°) 13.875 13.840 13.852 0.035 13.856
A12 (°) 41.558 41.544 41.524 0.034 41.542
R1 (mm) 96.323 96.299 96.312 0.024 96.311
R2 (mm) 93.310 93.320 93.331 0.021 93.320
R3 (mm) 84.106 84.112 84.117 0.011 84.112
T1 (mm) 10.840 10.830 10.817 0.023 10.829
T2 (mm) 10.787 10.757 10.745 0.042 10.763
T3 (mm) 10.813 10.820 10.818 0.007 10.817
T4 (mm) 10.875 10.868 10.909 0.041 10.884
T5 (mm) 10.750 10.766 10.797 0.047 10.771
T6 (mm) 10.863 10.832 10.835 0.031 10.843
T7 (mm) 10.802 10.810 10.843 0.041 10.818
T8 (mm) 10.833 10.834 10.812 0.022 10.826
T9 (mm) 10.775 10.768 10.750 0.025 10.764
T10 (mm) 10.864 10.857 10.862 0.007 10.861
T11 (mm) 10.856 10.852 10.857 0.005 10.855
T12 (mm) 10.826 10.823 10.829 0.006 10.826
T13 (mm) 10.796 10.783 10.793 0.013 10.791

Fig. 11 Calibrating the contour pixels of the ring-shaped object. a Using the distance between point C and point E as the basis for grouping; b separating the contour pixels into groups, and c calibrating the pixels into the world coordinate system

Fig. 12 Extracting the object profiles and combining them into a point cloud

The calibration algorithm is also involved in connecting the contours from the partial images. When two adjacent point clouds are merged, their matching regions overlap. Fig. 14 illustrates the final point cloud of the ring-shaped object with the points from adjacent images differentiated by colour. The magnified images show that despite the complexity of the object profile, the point clouds from the overlapping regions match each other well, and the deviation may not be observed with the naked eye. Fig. 14b presents the straightening of the points in four overlapping regions. The difference between adjacent point clouds was smaller than 0.011 mm, and the thickness of the point cloud in the overlapping regions varied between 0.033 and 0.046 mm.

Fig. 13 Deviation distribution of the calibration points, as calculated using the proposed method. a Positions of the calibration points with the corresponding deviation values; b frequency of the calibration errors

Fig. 14 Distribution of the point cloud in the overlapping regions. a Point cloud of the ring-shaped object and the magnified overlapping regions; b straightening the points from the four overlapped arcs

3.2 The repeatability test results

Because the calibration plate is manually fixed on top of the object, the positions of the plate and the ring-shaped object in different measurement procedures are not alike. Therefore, partial images of the same object from different measurements are not identical. Fig. 15 depicts a tooth captured from two different measurements. In the first case (Fig. 15a), the tooth is positioned at the centre of the image, with the edges oriented vertically. In the second case (Fig. 15b), the object is rotated to the left, and the edges of the tooth are tilted. Magnified images show that the edges from the first case (A and B) are smooth, whereas those from the second (C and D) are rough.

For a more detailed observation, the point clouds calibrated from the two cases were set side by side for comparison. Fig. 16a and 16b present the results, with the outer contours of the images extracted separately. The tooth profile in Fig. 16c was constructed by rotating the point cloud of case 2. Comparison of the two point clouds of the tooth revealed that the contours of the same object in different measurements were not completely identical. As shown in Fig. 16c, the rough edges from case 2 give a larger deviation, with a profile thickness of up to 0.041 mm. The magnified regions indicate differences between the two cases, which can be explained by the camera distortion and edge-blurring problems. The subpixel effect may also affect the point cloud at the edges, as well as the thickness of the calibrated contours.

Fig. 15 Images of the same tooth with different orientation angles. a Case 1: the tooth is positioned at the centre of the image; b case 2: the tooth is placed near the side of the image

Fig. 16 Comparison of the calibration deviation at various tooth positions. a, b Point cloud data of the outer profiles extracted from Fig. 15; c comparison of the point clouds in cases 1 and 2 (unit: mm)

To evaluate the precision of the proposed system, a repeatability test was conducted. The experiment was performed three times on the same object. The table was manually rotated to magnify the effects of the binarization algorithm, the lighting, and the lens distortion. The ring-shaped object and the calibration plate were repositioned after each measurement to assess the effectiveness of the calibration algorithm. To measure the deviation between the extracted experimental point clouds, Geomagic Design X software was used. Fig. 17 shows 28 dimensions of the ring-shaped object, including the angles between the teeth (12 angular dimensions, A1 to A12), the widths of the small teeth (13 alignment dimensions, T1 to T13), and the radial dimensions (R1, R2, and R3). Table 2 presents the results of the measurement, including the point cloud dimensions, the measurement ranges Re, and the means M.

As shown in Table 2, the dimension ranges Re of the point cloud changed only slightly. The maximum range of the angular dimensions, 0.040°, did not depend on the magnitude of the angle being measured. For example, although angle A12 was three times larger than A1, they had the same range of 0.036°. Angles A3 and A6 had the same magnitude (≈13.8°), but their ranges were 0.040° (the largest value) and 0.008° (the smallest value), respectively. Regarding the linear dimensions, the greatest range was 0.047 mm (for the tooth width T5 = 10.8 mm) but was only 0.02 mm for the radius R1 (96.3 mm, which is nine times larger). These deviations are partially attributable to the minor errors that the calibration algorithm generated during the point cloud construction. Other explanations are the subpixel effect and the blurred edges, which caused instability in the measurement.

The running time for a single process was measured using an HP Z6 Workstation with an Intel Xeon Silver 4108 processor and 32 GB of RAM. The average processing time for each step is presented in Table 3.

To improve the quality of the system, the use of high-resolution cameras is recommended to create higher-quality images, thereby reducing the deviation of the point cloud. Increasing the number of captured images and decreasing the field of view can also be achieved by redesigning the calibration plate (with more partial regions) and using lenses with a smaller angle of view (to reduce the effect of distortion); however, these strategies also increase the system processing time.

3.3 Comparison with ATOS scan results

Because of the influence of the machining process, the dimensions of the product always have deviations. Therefore, it is necessary to use another measuring system to evaluate the performance of the proposed system.

In this study, the efficiency and accuracy of the proposed system in the measurement of the ring-shaped object were compared with those of an ATOS scanner for the following reasons: First, as a popular commercial vision system, ATOS scanners collect measurement data and reproduce 3D models of the object with high accuracy and reliability. Second, from the 3D point cloud obtained from an ATOS scanner, 2D point cloud profiles of the object can be extracted, facilitating comparison with those obtained from the proposed system.

An ATOS I scanner was used as a reference for the evaluation of system accuracy. The scanner yielded a 3D point cloud of the object that contained 438,450 points. These data were imported into Gom Inspect software, and three sections parallel to the top surface of the object, with offset distances of 2, 7, and 12 mm, respectively, were created (Fig. 18b). The measurement process was the same as that used for the 2D points of the original system; specifically, each section was measured three times, and the final dimension was the average of these values.

Fig. 17 Dimensions for measurement in the repeatability test

The point clouds from the two systems were merged with the standard profile, and six regions of the merged images were selected for an initial analysis of the similarities. Analyses of these regions are magnified and displayed in Fig. 19.

As shown in the magnified images of the selected regions (Fig. 19b), the point cloud produced by the proposed system was smaller than that constructed by the ATOS scanner. This is because the blurring of the object profiles makes them appear smaller in the captured images. The differences between these point clouds can be further examined by straightening the six magnified regions. With the distances between the points and the standard profile taken as the y-coordinates, the differences between the point clouds and the standard profile are generated as presented in Fig. 19c.

From Fig. 19c, parameters of the point cloud, including the measurement precision, the error range, and the density of the point cloud, were analysed. Compared with the standard profile, the contours from the ATOS scanner were smaller by about −0.10 to −0.05 mm, a deviation closer to zero than that of the point cloud created by the proposed system (in the range between −0.15 and −0.10 mm). Furthermore, the ATOS point cloud was smoother, with a contour thickness ranging from 0.003 to 0.013 mm, smaller than that of the point cloud constructed by the proposed system, whose maximum thickness reached 0.047 mm. The point spacing in the sections from the ATOS scanner varied between 0.008 and 0.5 mm, while the point spacing in the model from the proposed system was constant and equal to the pixel size (0.028 mm).

Table 3 Running time for a contour extraction process

Operation   Running time (s)

Image capturing 20.5


Calibration points detection and calibration coefficients calculation 8.6
Contour extraction 4.4
Calibration and store data to point clouds 29.8
Total time 63.3

Fig. 18 System comparison with the ATOS I scanner. a Scanning the ring-shaped object; b exporting 2D sections from the 3D scanned surface

Fig. 19 The selection of six regions from the merged point cloud of the two systems. a Six regions selected for inspection; b magnified images of the selected regions, and c error-to-length graphs comparing the distances between the point clouds from the two systems and the standard profile

To evaluate the system accuracy, dimensions from the scanned model were measured and compared with those from the proposed system. By averaging the measurement results from the three sections, the 28 dimensions (shown in Fig. 17) of the scanned model were calculated. These dimensions were used as references for comparison with the results of the three measurements from Table 2. The deviations of the measurement results obtained from the two models, calculated using the formula Δi = di − D, are presented in Table 4.

The results in Table 4 show that compared with the point clouds of the ATOS scanner, the deviations of those generated by the proposed system range within ±0.042° for the angular dimensions and from −0.082 to −0.015 mm for the tooth widths. Regarding the radius measurements, the deviations of R1 and R2 had a negative range (from −0.029 to −0.004 mm), whereas the deviation of R3 had a positive range (from +0.056 to +0.067 mm). These deviations can be explained as follows: First, because the image processing of the proposed system under threshold binarization is influenced by the lighting, pixels in the blurred regions tend to be diminished, which results in shrinkage of the outer contours (which carry the dimensions Ti, R1, and R2) and expansion of the inner contours (dimension R3) of the captured images (Fig. 20). The subpixel effect also affects the thickness of the point clouds in the three measurements (see Section 3.2) and increases the difference between the measurements from the two systems' point clouds.

Another explanation for these deviations, as mentioned above, is that the dimensions obtained from the three sections of the ATOS scanned model were measured separately and then averaged, whereas the dimensions measured by the proposed system were obtained only from the top surface.

Table 4 Comparison of results between the ATOS scanner and the proposed system

Dimension   Measurement no. 1 (d1)   Measurement no. 2 (d2)   Measurement no. 3 (d3)   ATOS scanner (D)   Δ1   Δ2   Δ3
A1 (°) 55.399 55.426 55.420 55.384 0.015 0.042 0.036


A2 (°) 13.841 13.827 13.845 13.852 −0.011 −0.025 −0.007
A3 (°) 13.844 13.866 13.826 13.861 −0.017 0.005 −0.035
A4 (°) 41.484 41.520 41.508 41.502 −0.018 0.018 0.006
A5 (°) 41.513 41.530 41.523 41.510 0.003 0.020 0.013
A6 (°) 13.829 13.823 13.821 13.832 −0.003 −0.009 −0.011
A7 (°) 13.879 13.862 13.868 13.858 0.021 0.004 0.010
A8 (°) 13.829 13.838 13.835 13.834 −0.005 0.004 0.001
A9 (°) 13.842 13.824 13.832 13.815 0.027 0.009 0.017
A10 (°) 13.870 13.862 13.836 13.841 0.029 0.021 −0.005
A11 (°) 13.875 13.840 13.852 13.882 −0.007 −0.042 −0.030
A12 (°) 41.558 41.544 41.524 41.532 0.026 0.012 −0.008
R1 (mm) 96.323 96.299 96.312 96.327 −0.004 −0.028 −0.015
R2 (mm) 93.310 93.320 93.331 93.339 −0.029 −0.019 −0.008
R3 (mm) 84.106 84.112 84.117 84.050 0.056 0.062 0.067
T1 (mm) 10.840 10.830 10.817 10.874 −0.034 −0.044 −0.057
T2 (mm) 10.787 10.757 10.745 10.802 −0.015 −0.045 −0.057
T3 (mm) 10.813 10.820 10.818 10.862 −0.049 −0.042 −0.044
T4 (mm) 10.875 10.868 10.890 10.943 −0.068 −0.075 −0.053
T5 (mm) 10.750 10.766 10.797 10.817 −0.067 −0.051 −0.020
T6 (mm) 10.833 10.832 10.835 10.913 −0.080 −0.081 −0.078
T7 (mm) 10.812 10.810 10.843 10.877 −0.065 −0.067 −0.034
T8 (mm) 10.833 10.834 10.812 10.884 −0.051 −0.050 −0.072
T9 (mm) 10.775 10.768 10.750 10.832 −0.057 −0.064 −0.082
T10 (mm) 10.864 10.857 10.862 10.893 −0.029 −0.036 −0.031
T11 (mm) 10.856 10.852 10.857 10.901 −0.045 −0.049 −0.044
T12 (mm) 10.826 10.823 10.829 10.861 −0.035 −0.038 −0.032
T13 (mm) 10.796 10.783 10.793 10.858 −0.062 −0.075 −0.065

If the side surface of the object has defects or its waviness tolerances are too high, the final calculation will be affected, resulting in large errors in the measurement comparison.

Fig. 20 The region where pixels were lost due to binarization thresholding (red colour)

To survey the surface properties of the object, the three sections of the scanned model were placed in the same plane for a dimensional comparison (Fig. 21). The magnified regions reveal that in some positions the differences between the three sections reach up to 0.133 mm, certifying the assumption about the high waviness of the sample's side surfaces. Consequently, when the measurement results of all three sections are used for the comparison in Table 4, these differences affect the averaged values and lead to an incorrect assessment of the proposed system's accuracy.

To re-evaluate the proposed system, a new comparison using only the dimensions from the top section of the scanned model was performed. Two other ring-shaped objects were added to the assessment. Each dimension was measured three times and the averaged values of the measurements were obtained. The results of the comparison are summarized in Table 5, including the nominal values D0, the dimensions obtained from the proposed system di, the dimensions of the top section DT, and the deviations between the measurements of the two point clouds (Δi = di − DT).

The results in Table 5 show that compared with the dimensions of the ATOS scanned model, the tooth width deviations range from −0.063 to −0.003 mm, smaller than the range from −0.082 to −0.015 mm in Table 4. For the dimensions of the outer circles R1 and R2, the maximum deviations in Table 5 are 0.024 mm and 0.022 mm, respectively, smaller than those in Table 4 (0.029 mm and 0.028 mm, respectively). The deviation range of dimension R3 is reduced 2.4-fold, from 0.067 mm (Table 4) to 0.028 mm (Table 5). These differences between the dimensions from the two tables prove the statement that the proposed system constructs profiles of only the top surface of large ring-shaped objects.

Compared with the standard measurements from ATOS, the deviations of the proposed system range from −0.063 to −0.003 mm for the small tooth widths and from −0.024 to 0.00 mm for the outer circles R1 and R2 (192- and 186-mm diameters). With the pixel size of the system's camera being 0.028 mm, the diminished amount is approximately 2.25 pixels, owing to the blurring and subpixel issues.

Fig. 21 Comparison of the three sections obtained from the ATOS scan (unit: mm)

Table 5 Dimensional comparisons between the point clouds from the two systems

Sample no.   Dimension   Nominal D0   Measurement #1 (d1)   #2 (d2)   #3 (d3)   ATOS scanner DT   Δ1   Δ2   Δ3
#1 A1 (°) 55.385 55.399 55.426 55.420 55.389 0.010 0.037 0.031


A2 (°) 13.103 13.841 13.827 13.845 13.854 −0.013 −0.027 −0.009
A3 (°) 13.103 13.844 13.866 13.826 13.858 −0.014 0.008 −0.032
A4 (°) 41.538 41.484 41.520 41.508 41.500 −0.016 0.020 0.008
A5 (°) 41.538 41.513 41.530 41.523 41.514 −0.001 0.016 0.009
A6 (°) 13.103 13.829 13.823 13.821 13.834 −0.005 −0.011 −0.013
A7 (°) 13.103 13.879 13.862 13.868 13.852 0.027 0.010 0.016
A8 (°) 13.103 13.829 13.838 13.835 13.844 −0.015 −0.006 −0.009
A9 (°) 13.103 13.842 13.824 13.832 13.829 0.013 −0.005 0.003
A10 (°) 13.103 13.870 13.862 13.836 13.857 0.013 0.005 −0.021
A11 (°) 13.103 13.875 13.840 13.852 13.854 0.021 −0.014 −0.002
A12 (°) 41.538 41.558 41.544 41.524 41.522 0.036 0.022 0.002
R1 (mm) 96.000 96.323 96.299 96.312 96.323 0.000 −0.024 −0.011
R2 (mm) 93.000 93.310 93.320 93.313 93.322 −0.012 −0.002 −0.009
R3 (mm) 84.000 84.106 84.112 84.117 84.092 0.014 0.020 0.025
T1 (mm) 10.800 10.840 10.830 10.817 10.844 −0.004 −0.014 −0.027
T2 (mm) 10.800 10.787 10.757 10.745 10.802 −0.015 −0.045 −0.057
T3 (mm) 10.800 10.813 10.820 10.818 10.856 −0.043 −0.036 −0.038
T4 (mm) 10.800 10.875 10.868 10.890 10.896 −0.021 −0.028 −0.006
T5 (mm) 10.800 10.750 10.766 10.797 10.806 −0.056 −0.040 −0.009
T6 (mm) 10.800 10.833 10.832 10.835 10.854 −0.021 −0.022 −0.019
T7 (mm) 10.800 10.802 10.810 10.843 10.862 −0.060 −0.052 −0.019
T8 (mm) 10.800 10.833 10.823 10.812 10.856 −0.023 −0.033 −0.044
T9 (mm) 10.800 10.775 10.768 10.750 10.811 −0.036 −0.043 −0.061
T10 (mm) 10.800 10.864 10.857 10.862 10.885 −0.021 −0.028 −0.023
T11 (mm) 10.800 10.856 10.852 10.857 10.865 −0.009 −0.013 −0.008
T12 (mm) 10.800 10.826 10.823 10.829 10.848 −0.022 −0.025 −0.019
T13 (mm) 10.800 10.796 10.783 10.793 10.824 −0.028 −0.041 −0.031
#2 A1 (°) 55.385 55.338 55.369 55.342 55.354 −0.016 0.015 −0.012
A2 (°) 13.103 13.825 13.817 13.841 13.852 −0.027 −0.035 −0.011
A3 (°) 13.103 13.878 13.843 13.865 13.861 0.017 −0.018 0.004
A4 (°) 41.538 41.574 41.602 41.608 41.592 −0.018 0.010 0.016
A5 (°) 41.538 41.534 41.506 41.516 41.510 0.024 −0.004 0.006
A6 (°) 13.103 13.809 13.821 13.841 13.832 −0.023 −0.011 0.009
A7 (°) 13.103 13.847 13.839 13.885 13.858 −0.011 −0.019 0.027
A8 (°) 13.103 13.849 13.827 13.855 13.834 0.015 −0.007 0.021
A9 (°) 13.103 13.781 13.794 13.813 13.810 −0.029 −0.016 0.003
A10 (°) 13.103 13.768 13.791 13.806 13.781 −0.013 0.010 0.025
A11 (°) 13.103 13.840 13.824 13.852 13.862 −0.022 −0.038 −0.010
A12 (°) 41.538 41.524 41.548 41.531 41.532 −0.008 0.016 −0.001
R1 (mm) 96.000 96.338 96.329 96.326 96.340 −0.002 −0.011 −0.014
R2 (mm) 93.000 93.320 93.324 93.319 93.332 −0.012 −0.008 −0.013
R3 (mm) 84.000 84.122 84.124 84.122 84.097 0.025 0.027 0.025
T1 (mm) 10.800 10.858 10.831 10.840 10.864 −0.006 −0.033 −0.024
T2 (mm) 10.800 10.757 10.787 10.770 10.797 −0.040 −0.010 −0.027
T3 (mm) 10.800 10.800 10.819 10.828 10.844 −0.044 −0.025 −0.016
T4 (mm) 10.800 10.818 10.783 10.824 10.832 −0.014 −0.049 −0.008
Table 5 (continued)
T5 (mm) 10.800 10.747 10.750 10.721 10.782 −0.035 −0.032 −0.061


T6 (mm) 10.800 10.863 10.835 10.845 10.897 −0.034 −0.062 −0.052
T7 (mm) 10.800 10.873 10.857 10.883 10.886 −0.013 −0.029 −0.003
T8 (mm) 10.800 10.812 10.832 10.800 10.855 −0.043 −0.023 −0.055
T9 (mm) 10.800 10.809 10.775 10.802 10.838 −0.029 −0.063 −0.036
T10 (mm) 10.800 10.802 10.822 10.831 10.856 −0.054 −0.034 −0.025
T11 (mm) 10.800 10.876 10.848 10.872 10.911 −0.035 −0.063 −0.039
T12 (mm) 10.800 10.878 10.853 10.837 10.891 −0.013 −0.038 −0.054
T13 (mm) 10.800 10.795 10.832 10.796 10.852 −0.057 −0.020 −0.056
#3 A1 (°) 55.385 55.326 55.366 55.347 55.340 −0.014 0.026 0.007
A2 (°) 13.103 13.848 13.852 13.866 13.833 0.015 0.019 0.033
A3 (°) 13.103 13.843 13.846 13.861 13.852 −0.009 −0.006 0.009
A4 (°) 41.538 41.508 41.481 41.502 41.490 0.018 −0.009 0.012
A5 (°) 41.538 41.513 41.523 41.548 41.536 −0.023 −0.013 0.012
A6 (°) 13.103 13.863 13.828 13.849 13.853 0.010 −0.025 −0.004
A7 (°) 13.103 13.862 13.885 13.893 13.892 −0.030 −0.007 0.001
A8 (°) 13.103 13.852 13.835 13.868 13.867 −0.015 −0.032 0.001
A9 (°) 13.103 13.832 13.842 13.821 13.850 −0.018 −0.008 −0.029
A10 (°) 13.103 13.863 13.870 13.895 13.871 −0.008 −0.001 0.024
A11 (°) 13.103 13.871 13.882 13.864 13.891 −0.020 −0.009 −0.027
A12 (°) 41.538 41.578 41.591 41.617 41.599 −0.021 −0.008 0.018
R1 (mm) 96.000 96.329 96.320 96.324 96.341 −0.012 −0.021 −0.017
R2 (mm) 93.000 93.402 93.399 93.401 93.421 −0.019 −0.022 −0.020
R3 (mm) 84.000 84.324 84.326 84.319 84.298 0.026 0.028 0.021
T1 (mm) 10.800 10.787 10.817 10.778 10.822 −0.035 −0.005 −0.044
T2 (mm) 10.800 10.735 10.703 10.725 10.742 −0.007 −0.039 −0.017
T3 (mm) 10.800 10.776 10.805 10.801 10.820 −0.044 −0.015 −0.019
T4 (mm) 10.800 10.769 10.789 10.757 10.818 −0.049 −0.029 −0.061
T5 (mm) 10.800 10.677 10.703 10.663 10.721 −0.044 −0.018 −0.058
T6 (mm) 10.800 10.715 10.742 10.752 10.762 −0.047 −0.020 −0.010
T7 (mm) 10.800 10.812 10.788 10.781 10.843 −0.031 −0.055 −0.062
T8 (mm) 10.800 10.766 10.782 10.751 10.790 −0.024 −0.008 −0.039
T9 (mm) 10.800 10.721 10.758 10.740 10.783 −0.062 −0.025 −0.043
T10 (mm) 10.800 10.769 10.797 10.800 10.817 −0.048 −0.020 −0.017
T11 (mm) 10.800 10.767 10.749 10.789 10.802 −0.035 −0.053 −0.013
T12 (mm) 10.800 10.833 10.804 10.813 10.852 −0.019 −0.048 −0.039
T13 (mm) 10.800 10.778 10.765 10.790 10.812 −0.034 −0.047 −0.022

dimension), the proposed system can be used in an industrial environment for checking the dimensions after broaching. With modifications such as a redesigned calibration plate, a telecentric lens, and collimated light sources, the accuracy of the system could be further improved and the measurement deviation reduced.

It can be seen that, despite its higher measurement deviation, the proposed system is still a cost-effective and easy-to-operate measurement system. The whole system costs approximately US$2,000 to install, whereas an ATOS I scanner is priced at around US$100,000 and, moreover, requires skilled technicians to operate. Furthermore, the ATOS scanner requires a warm-up period of approximately 10 to 20 min before each measurement, whereas the proposed system can begin measuring immediately, with a total processing time of about 65 s.
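The acceptance decision implicit in Table 5 reduces to computing Δi = di − DT for each of the three repeated measurements and comparing the largest magnitude against a tolerance. The short Python sketch below illustrates this bookkeeping; the example rows are copied from the first sample in Table 5, while the dictionary layout and the 0.065-mm tolerance are our illustrative assumptions rather than part of the original system:

```python
# Deviation check for Table 5: delta_i = d_i - D_T for each of the three
# repeated measurements d_1..d_3 against the ATOS reference D_T.

# Example rows copied from the first sample in Table 5; in practice these
# would be read from a measurement log (format assumed, not specified
# in the paper).
measurements = {
    "R1 (mm)": ([96.323, 96.299, 96.312], 96.323),
    "R2 (mm)": ([93.310, 93.320, 93.313], 93.322),
    "T1 (mm)": ([10.840, 10.830, 10.817], 10.844),
}

TOLERANCE = 0.065  # mm; illustrative acceptance threshold, not from the paper

for name, (runs, atos_ref) in measurements.items():
    deltas = [round(d - atos_ref, 3) for d in runs]
    worst = max(abs(d) for d in deltas)
    status = "OK" if worst <= TOLERANCE else "OUT OF TOLERANCE"
    print(f"{name}: deviations {deltas}, max |delta| = {worst:.3f} -> {status}")
```

Running the sketch on these rows reproduces the deviation columns of Table 5 and flags every dimension as within the assumed tolerance.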
4 Conclusions
In this paper, a new method for reconstructing the 2D profile of large-size ring-shaped objects using image processing was presented. By capturing partial images of the object, the system can increase the resolution of the captured images up to 3.6 times, improving the profile extraction accuracy. With the use of a calibration plate containing reference points and separation lines, profiles from partial images can be projected into the same plane with small errors. Analysis of the resulting point clouds obtained from the system revealed that, with a pixel size of 0.028 mm and a calibration error under 0.01 mm, the new system can reconstruct the complex profiles of large-size ring-shaped objects with high accuracy, as indicated by the thickness of the point-cloud profile (0.046 mm), a deviation range of ±0.04° for the angular dimensions, and a deviation of 0.062 mm for the linear dimension of the tooth width.

The precision and efficiency of the proposed system were evaluated through comparison with results from an ATOS I scanner. The differences in dimensions between the two systems revealed that lighting and image resolution had a considerable effect on the measurement. Considering its low fabrication cost, simple structure, and ease of operation within the acceptable range of deviation, the proposed system is industrially applicable to the 2D profile reconstruction of various types of ring-shaped objects. The process can be completely automated, facilitating the rapid construction of 2D point clouds and preventing errors caused by manual measurement.
ual measurement. 13. Hsu QC, Ngo NV, Ni RH (2019) Development of a faster classifi-
cation system for metal parts using machine vision under different
Acknowledgements The authors would like to acknowledge the financial lighting environments. Int J Adv Manuf Technol 100(9):3219–
support from the Ministry of Science and Technology, Taiwan. The au- 3235. https://doi.org/10.1007/s00170-018-2888-7
thors would like to thank Linesoon Industrial Co., Ltd. (Taiwan) for 14. Peng G, Zhang Z, Li W (2016) Computer vision algorithm for
providing the measurement specimens and for valuable discussion. measurement and inspection of O-rings. Measurement 94:828–
836. https://doi.org/10.1016/j.measurement.2016.09.012
Author contribution Anh-Tuan Dang: formal analysis, investigation, 15. Rejc J, Kovačič F, Trpin A, Turk I, Štrus M, Rejc D, Obid P, Munih
software, writing-original draft, visualization. Quang-Cherng Hsu: data M (2011) The mechanical assembly dimensional measurements
curation, funding acquisition, methodology, project administration, re- with the automated visual inspection system. Expert Syst Appl
sources. Tat-Tai Truong: review, visualization and editing. 38(8):10665–10675. https://doi.org/10.1016/j.eswa.2011.02.133
16. Wang X, Liu J, Liu S, Jin P, Wu T, Wang Z (2018) Accurate radius
Funding This research was partially funded by the Ministry of Science measurement of multi-bend tubes based on stereo vision.
and Technology, Taiwan, under grant number MOST 106-2622-E-151- Measurement 117:326–338. https://doi.org/10.1016/j.
017-CC3. measurement.2017.12.009
17. Gadelmawla ES (2017) Computer vision algorithms for measure-
ment and inspection of external screw threads. Measurement 100:
Declarations 36–49. https://doi.org/10.1016/j.measurement.2016.12.034
18. Ngo N-V, Hsu Q-C, Hsiao W-L, Yang C-J (2017) Development a
Conflict of interest The authors declare no competing interests. simple three-dimensional machine-vision measurement system for
Int J Adv Manuf Technol

i n - p r o c e s s m e c h a n i c a l p a r t s . A d v M e c h En g 9 ( 1 0 ) : optimization algorithm. Neurocomputing 174:456–465. https://doi.


1687814017717183. https://doi.org/10.1177/1687814017717183 org/10.1016/j.neucom.2015.03.119
19. Tan Q, Kou Y, Miao J, Liu S, Chai B (2021) A model of diameter 22. Zhang Z (2000) A flexible new technique for camera calibration.
measurement based on the machine vision. Symmetry 13(2):187. IEEE Trans Pattern Anal Mach Intell 22(11):1330–1334. https://
https://doi.org/10.3390/sym13020187 doi.org/10.1109/34.888718
20. Abdel-aziz YI, Karara H (1971) Direct linear transformation from 23. Liu D, Liu X, Wang M (2016) Camera self-calibration with lens
comparator coordinates into object space coordinates in close-range distortion from a single image. Photogramm Eng Remote Sens
photogrammetry. Photogramm Eng Remote Sens 81:103–107. 82(5):325–334. https://doi.org/10.1016/S0099-1112(16)82013-3
https://doi.org/10.14358/PERS.81.2.103
21. Deng L, Lu G, Shao Y, Fei M, Hu H (2016) A novel camera Publisher’s note Springer Nature remains neutral with regard to jurisdic-
calibration technique based on differential evolution particle swarm tional claims in published maps and institutional affiliations.

You might also like