Vision-based displacement measurement sensor using modified Taylor approximation approach

Bingyou Liu, Dashan Zhang, Jie Guo, Chang'an Zhu

Bingyou Liu, Dashan Zhang, Jie Guo, Chang'an Zhu, "Vision-based displacement measurement sensor using modified Taylor approximation approach," Opt. Eng. 55(11), 114103 (2016), doi: 10.1117/1.OE.55.11.114103.

Downloaded From: https://www.spiedigitallibrary.org/journals/Optical-Engineering on 16 Mar 2021


Terms of Use: https://www.spiedigitallibrary.org/terms-of-use
Optical Engineering 55(11), 114103 (November 2016)

Bingyou Liu,a,b Dashan Zhang,a Jie Guo,a,* and Chang'an Zhua

aUniversity of Science and Technology of China, Department of Precision Machinery and Precision Instrumentation, Huangshan Road No. 443, Shushan District, Hefei, Anhui 230027, China
bAnhui Polytechnic University, Key Lab of Electric and Control of Anhui Province, Beijingzhong Road No. 8, Jiujiang District, Wuhu, Anhui 241000, China

Abstract. The development of image sensors and optics lenses has contributed to the rapidly increasing use of vision-based methods as noncontact measurement methods in many areas. A high-speed camera system is developed to realize displacement measurement in real time. Conventional visual measurement algorithms are commonly subject to various shortcomings, such as complex processes, multiparameter adjustments, or integer-pixel accuracy. Inspired by the combination of a block-matching algorithm and simplified optical flow, a motion estimation algorithm that uses a modified Taylor approximation is proposed and applied to the vision sensor system. Simplifying integer-pixel searching with a rounding-iterative operation enables the modified algorithm to rapidly accomplish one displacement extraction within 1 ms and yield satisfactory subpixel accuracy. The performance of the vision sensor is evaluated through a simulation test and two experiments on a grating ruler motion platform and a steering wheel system of a forklift. Experimental results show that the developed vision sensor can extract accurate displacement signals and accomplish the vibration measurement of engineering structures. © The Authors. Published by SPIE under a Creative Commons Attribution 3.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI. [DOI: 10.1117/1.OE.55.11.114103]

Keywords: noncontact measurement; machine vision; high-speed vision sensor; subpixel accuracy; vibration analysis.
Paper 161320 received Aug. 24, 2016; accepted for publication Oct. 25, 2016; published online Nov. 14, 2016.

1 Introduction

Noncontact measurement techniques, such as speckle photography,1 hologram interferometry,2 and laser Doppler vibrometry,3 have been developed for years and are well applied in various fields. Compared to traditional measurement devices, such as accelerometers or linear displacement gauges, noncontact devices allow more flexible installation and provide intuitive exhibitions of the actual movements of the target without affecting its behavior. In environments where traditional sensors do not have clear access or cannot work effectively, e.g., remote measurement targets or fields with high temperature or strong magnetic fields, noncontact measurement devices have obvious advantages over conventional ones. However, most noncontact equipment requires high cost and strict construction structures, thus limiting the wide use of such systems in practical applications.

Technological developments in image sensors and optics lenses have contributed to the rapidly increasing use of vision-based measurement methods as noncontact measurement methods in numerous research and industrial areas, such as vibration analysis,4,5 condition monitoring,6–11 human motion,12,13 and underwater measurement.14 With a relatively lower cost and better structural flexibility, optical devices and cameras offer effective alternatives to noncontact equipment. Benefiting from the wide availability of affordable high-quality digital image sensors and high-performance computers, cheap high-resolution cameras have been increasingly used in many areas. Recently, vision-based techniques were successfully used to measure various structures and obtained satisfactory results.15–24 Quan et al.25 achieved three-dimensional displacement measurement based on two-dimensional (2-D) digital image correlation (DIC). Kim et al.26 proposed a vision-based monitoring system that uses DIC to evaluate the cable tensile force of a cable-stayed bridge. The same method was also applied in experimental mechanics for noncontact, full-field deformation measurement.27,28 Park et al.29 realized displacement measurement for high-rise building structures using the partitioning approach. Wahbeh et al.30 realized the measurement of displacements and rotations of the Vincent Thomas Bridge in California by using a highly accurate camera in conjunction with a laser tracking reference. Fukuda et al.31 proposed a camera-based sensor system in which a robust object search algorithm was used to measure the dynamic displacements of large-scale structures. Feng et al.32 developed a vision-based sensor that employed an up-sampled cross-correlation (UCC) algorithm for noncontact structural displacement measurement, which can accurately measure the displacements of bridges.33

The traditional camera system for displacement measurement is composed of commercial digital cameras and video-processing devices (normally computers). However, ordinary digital cameras often have low video-sampling rates, which limits the range of vibration frequencies they can resolve.34 To overcome this restriction, high-speed vision systems running at 1000 frames per second (fps) or even higher have been developed and applied in practice.35 In the present paper, a high-speed camera sensor system is developed that is composed of a zoom optical lens, a high-speed camera body with a CCD receiver, and a notebook computer. A USB 3.0 interface is used to ensure stable data transfer between the camera body and the computer. On the notebook computer, the captured video can be processed by software to realize the tracking of an object and to extract motion information in real time.

*Address all correspondence to: Jie Guo, E-mail: guojiegj@mail.ustc.edu.cn

Optical Engineering 114103-1 November 2016 • Vol. 55(11)

Liu et al.: Vision-based displacement measurement sensor using modified Taylor. . .

Similar to other measurement equipment, a vision-based measurement system is mainly concerned with measurement speed and accuracy, both of which significantly depend on the performance of the image-processing algorithm. Owing to their high sampling rate, high-speed sensors place strict requirements on the computing speed of motion-tracking algorithms to satisfy the demand of real-time signal processing. Conventional motion extraction algorithms based on template matching registration techniques [i.e., sum of absolute difference (SAD) or normalized cross-correlation (NCC)] are mostly complex and have a heavy computation load. Moreover, template matching techniques can only achieve integer-pixel resolution because the minimal unit in a video image is 1 pixel. Such accuracy is far from satisfactory in numerous practical applications, particularly those where the vibrations of small structures must be measured. Various methods have been proposed to refine measurement accuracy, including interpolation techniques and subpixel registration,36–38 most of which indeed improved accuracy but exhibited low computational efficiency to some degree.

Chan et al.39 proposed a subpixel motion estimation method that uses a combination of classical block matching and simplified optical flow. The method is tremendously faster than existing block-matching algorithms because no interpolation is needed. In the first step, a block-matching algorithm, such as three-step search (TSS) or cross-diamond search, is used to determine the integer-pixel displacement. The result is then refined to subpixel level through local approximation using a simplified optical flow. In this paper, we simplified Chan's algorithm by replacing block matching with a rounding-iterative Taylor approximation. With no subpixel interpolation needed during frame cutting in each iteration, this modified algorithm runs much faster than conventional iterative DIC optical flow. Given that the improvement brings no additional parameter that requires specification, the modified algorithm naturally executes with a high degree of automation. After several rounds of optimization, the computation time of one extraction in the modified algorithm is reduced to less than 1 ms. The modified algorithm is utilized in the high-speed camera system for its high efficiency and satisfactory subpixel accuracy. A simulation and two experiments under laboratory and realistic conditions are carried out for performance verification. The positive results demonstrate the accuracy and efficiency of the camera sensor system in measuring dynamic displacement.

The rest of the paper is organized as follows. Section 2 introduces the components and capability parameters of the high-speed vision sensor system. Section 3 presents the theory of the motion estimation algorithm without interpolation and the modified Taylor algorithm. Section 4 evaluates the performance of the modified algorithm with a simulation test. Section 5 presents two experiments for performance verification. Section 6 discusses the results and outlook.

2 High-Speed Vision Sensor System

Traditional camera systems for displacement measurement are commonly composed of commercial digital cameras and personal computers. However, commercial digital cameras usually have low frame rates (i.e., 100 fps), which limit their application to vibration frequencies below 50 Hz. In this paper, a high-speed sensor system composed of a notebook computer (Intel Core processor 2.9 GHz, 2.75 GB RAM) and a video camera with a telescopic lens is developed for displacement measurement, as shown in Fig. 1(a). The telescopic lens has a large zooming capability [Fig. 1(b)] that can reach the measurement requirement at different distances. The camera head uses a CCD sensor as the image receiver, which can capture 8-bit gray-scale images at a maximum of 1000 fps when the image resolution is set as 300 pixels × 300 pixels. A USB 3.0 interface is used to ensure stable data transfer between the camera and the computer. With its high sampling rate and computing efficiency, the image-processing software on the notebook computer can use the refined Taylor algorithm to track specific fast-moving objects and extract motion information in real time.

Fig. 1 High-speed vision sensor system: (a) experimental setups and (b) video camera and zoom optical lens.

A target panel preinstalled on the target is very helpful to ensure extraction accuracy during measurement. If the target panel is unavailable because of the limitation of the measurement environment, the distinct surface patterns of the structure, such as textures or edges, can be used as tracking templates. Then the camera system is ready to capture images from a remote location, and the displacement time history of the structure can be obtained by applying the displacement tracking algorithm to the digital video images.

3 Motion Extraction Algorithm

3.1 Subpixel Motion Estimation Without Interpolation

Figure 2 shows the subpixel motion estimation algorithm that combines a block-matching algorithm and simplified optical flow. Two consecutive frames, f(x, y) and g(x, y), with real displacement (Δx, Δy) are given. The real displacement can be divided into an integer part (Δx̄, Δȳ) and a subpixel part (δx, δy) as

$$\Delta x = \Delta\bar{x} + \delta x, \qquad \Delta y = \Delta\bar{y} + \delta y. \tag{1}$$

A block-matching algorithm is first applied to estimate the integer-pixel displacements Δx̄ and Δȳ. When the integer part is determined, the image block f(x, y) is shifted by Δx̄ pixels in the x-direction and Δȳ pixels in the y-direction. For the subpixel part, the Taylor series approximation is used to refine the search. The shifted image f(x + Δx̄, y + Δȳ) differs from the accurate location only by |δx| < 1 and |δy| < 1, which can be computed by using a one-step Taylor approximation.

Total displacement can be determined by combining the integer part and the subpixel part. Analytical error analysis39,40 is deduced in one dimension and can be generalized straightforwardly to the 2-D situation. The results imply that this two-step method can extract more accurate motion vectors than other block-matching algorithms.

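As a minimal illustration of the integer-pixel first step: the paper's pipeline uses fast searches such as TSS, but an exhaustive SAD scan (shown below, with edge padding and all names being this sketch's own assumptions, not the paper's code) is the simplest version to verify.

```python
import numpy as np

def sad_integer_displacement(f, g, search=4):
    # Exhaustive search for the integer shift (dx, dy) minimizing the
    # sum of absolute differences between f(x + dx, y + dy) and g(x, y).
    # Edge padding keeps every candidate window inside the template.
    h, w = f.shape
    fp = np.pad(f.astype(float), search, mode="edge")
    best, best_dx, best_dy = np.inf, 0, 0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = fp[search + dy:search + dy + h,
                         search + dx:search + dx + w]
            sad = np.abs(shifted - g).sum()
            if sad < best:
                best, best_dx, best_dy = sad, dx, dy
    return best_dx, best_dy
```

In the two-step scheme, this integer estimate (Δx̄, Δȳ) would then be refined to subpixel level by the Taylor-approximation step described next.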
With no requirement for any interpolation or motion-compensated frames, the algorithm is much faster than the conventional method.

Fig. 2 Flowchart of motion estimation without interpolation using a combination of block-matching algorithm and simplified optical flow.

3.2 Taylor Approximation with Rounding-Iterative Operation

In this part, an analytic model is built to illustrate the proposed modified algorithm used in the sensor system. Figure 3 illustrates the displacement extraction procedure from consecutive frames k and k + 1. A random subimage f(x, y) in frame k is selected as the matching template. With all its pixels moved by displacement vector p = (Δx, Δy)^T, the template image will become a new subimage g(x, y) in the next frame at the same position.

Fig. 3 Illustration of overall-shift template image.

With the assumption of brightness constancy or intensity conservation, the relationship between the template images f(x, y) and g(x, y) at the same position in frame k + 1 can be written as

$$g(x, y) = f(x + \Delta x, y + \Delta y). \tag{2}$$

Note that the assumption of surface radiance remaining fixed from one frame to the next rarely holds exactly. Nevertheless, as long as the scene is constrained to have no specularities, object rotations, or secondary illumination (shadows or intersurface reflection), the brightness constancy assumption works well in practice.40

Given that the displacement vector p = (Δx, Δy)^T is usually small (normally several pixels), Eq. (2) can be approximated using a first-order Taylor expansion with the higher-order terms ignored as follows:

$$g(x, y) = f(x + \Delta x, y + \Delta y) \approx f(x, y) + \Delta x \frac{\partial}{\partial x} f(x, y) + \Delta y \frac{\partial}{\partial y} f(x, y). \tag{3}$$

With two unknowns, Δx and Δy, in one equation, the linear least squares (LS) estimator minimizes the squared errors:

$$E(\vec{p}) = \sum_{x, y} \left[ g(x, y) - f(x, y) - \Delta x \frac{\partial}{\partial x} f(x, y) - \Delta y \frac{\partial}{\partial y} f(x, y) \right]^2. \tag{4}$$

As a linear LS problem, the minimum of E(p) can be found by setting its derivatives with respect to p to zero:

$$\frac{\partial E(\vec{p})}{\partial \Delta x} = 0, \qquad \frac{\partial E(\vec{p})}{\partial \Delta y} = 0. \tag{5}$$

Equation (5) can be written in matrix form

$$\nabla I \, \vec{p} = \Delta I, \tag{6}$$

in which ΔI is the difference matrix and ∇I denotes the gradient matrix

$$\Delta I = \begin{bmatrix} g(x_1, y_1) - f(x_1, y_1) \\ g(x_2, y_2) - f(x_2, y_2) \\ \vdots \\ g(x_n, y_n) - f(x_n, y_n) \end{bmatrix}, \qquad \nabla I = \begin{bmatrix} f_x(x_1, y_1) & f_y(x_1, y_1) \\ f_x(x_2, y_2) & f_y(x_2, y_2) \\ \vdots & \vdots \\ f_x(x_n, y_n) & f_y(x_n, y_n) \end{bmatrix}, \tag{7}$$

where n refers to the number of pixels in the selected template. In the sense of general LS, if ∇I^T ∇I is invertible (full rank), then the displacement vector can be expressed with an LS estimate as

$$\vec{p} = (\nabla I^T \nabla I)^{-1} \nabla I^T \Delta I. \tag{8}$$

With Eq. (8), displacement vectors between adjacent frames can be obtained precisely on the condition of a minute interframe displacement, because the validity requirement of the Taylor approximation is that |Δx| < 1 and |Δy| < 1. However, the interframe displacement may be larger than this in practical applications, in which case the vector p might be correct in direction but inaccurate in magnitude. Therefore, a rounding-iterative process is introduced to solve the problem and guarantee accuracy.

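Under NumPy's [row = y, col = x] indexing, the one-step LS solution of Eqs. (3)–(8) can be sketched as follows; the function name and the use of np.gradient for the central-difference partial derivatives are choices of this sketch, not the paper's implementation:

```python
import numpy as np

def taylor_ls_displacement(f, g):
    # Estimate (dx, dy) with g(x, y) ≈ f(x + dx, y + dy): first-order
    # Taylor linearization, Eq. (3), solved in the LS sense of Eq. (8).
    f = np.asarray(f, dtype=float)
    g = np.asarray(g, dtype=float)
    fy, fx = np.gradient(f)                            # central-difference partials
    grad = np.column_stack([fx.ravel(), fy.ravel()])   # gradient matrix, n x 2
    diff = (g - f).ravel()                             # difference matrix, n x 1
    p, *_ = np.linalg.lstsq(grad, diff, rcond=None)
    return float(p[0]), float(p[1])                    # valid while |dx|, |dy| < 1
```

np.linalg.lstsq solves the same normal equations as Eq. (8) in a numerically safer way; the estimate is reliable only while |Δx|, |Δy| < 1, which is exactly what the rounding iteration below addresses.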
For each calculation step p_j = (Δx_j, Δy_j)^T, the calculated Δx_j and Δy_j are rounded to the nearest integers until the termination condition ‖p_j‖ < 0.5 is satisfied.

Figure 4 shows the architecture of the proposed displacement extraction method with the reformative iteration process involved. The procedure for the proposed modified method can be summarized as follows:

Step 1: Cut f(x, y) and g(x, y) from consecutive frames at the same position;
Step 2: Compute the partial derivatives f_x and f_y of f(x, y) by the central difference;
Step 3: Compute the difference matrix ΔI and gradient matrix ∇I according to Eq. (7);
Step 4: Compute the displacement vector p_j = (Δx_j, Δy_j)^T according to Eq. (8). If ‖p_j‖ is less than 0.5, the algorithm proceeds to Step 5; if not, update f(x, y) to the new f[x + round(Δx_j), y + round(Δy_j)] and return to Step 2 (round denotes rounding to the nearest integer);
Step 5: Accumulate Δx_j and Δy_j to obtain the refined displacement vector p = (Δx, Δy)^T.

Fig. 4 Displacement extraction using Taylor approximation with the reformative iteration operation.

With this rounding-iterative modification, the integer-level motion estimation is also accomplished with optical flow instead of block matching. This modification of the original method is simple because it does not introduce any unnecessary computation or pixel interpolation. The rounding-off operation eliminates the subpixel interpolation computation in each frame cutting, which makes the algorithm much faster than conventional iterative DIC optical flow. The algorithm naturally executes with a high degree of automation because the improvement brings no additional parameter that requires specification. Although the rounding-off operation significantly decreases the time consumed for subpixel interpolation, the modified algorithm may, to some extent, sacrifice accuracy for its relatively loose termination condition. Fortunately, the proposed method yields stable, satisfactory subpixel results and executes with high efficiency in the following simulation and experiments. Thus, the algorithm can be used in high-speed camera systems to measure displacement in real time. The contrast simulation and experiments for validation are presented in the following sections.
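The five steps above can be sketched end to end as follows. This is only an illustrative sketch: the paper's real-time system is built on Qt and OpenCV, and the frame arrays, window coordinates (top, left, size), and function names here are this sketch's assumptions.

```python
import numpy as np

def one_step_ls(f, g):
    # One-step LS estimate of Eqs. (3)-(8): solve grad * p = g - f.
    fy, fx = np.gradient(f.astype(float))         # central differences (Step 2)
    grad = np.column_stack([fx.ravel(), fy.ravel()])
    diff = (g.astype(float) - f.astype(float)).ravel()
    p, *_ = np.linalg.lstsq(grad, diff, rcond=None)
    return float(p[0]), float(p[1])

def modified_taylor(frame0, frame1, top, left, size, max_iter=20):
    # Steps 1-5: cut g once at the template position, then repeatedly
    # estimate the displacement, round it to re-cut f without any
    # interpolation, and stop once the update drops below 0.5 pixel.
    # Bounds checking is omitted for brevity.
    g = frame1[top:top + size, left:left + size]
    tdx, tdy = 0, 0                               # accumulated integer shift
    for _ in range(max_iter):
        f = frame0[top + tdy:top + tdy + size,    # Step 1: cut at the
                   left + tdx:left + tdx + size]  # current rounded offset
        dx, dy = one_step_ls(f, g)                # Steps 2-4
        if np.hypot(dx, dy) < 0.5:                # termination condition
            return tdx + dx, tdy + dy             # Step 5: accumulate
        tdx += int(round(dx))
        tdy += int(round(dy))
    return float(tdx), float(tdy)                 # fallback: integer estimate
```

Because each re-cut uses only integer offsets, no subpixel interpolation of the frame is ever needed, which is the source of the method's speed advantage.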

4 Simulation Test

The performance of the proposed modified Taylor algorithm is first evaluated through a simulation test. The simulation gives a simple case with only one vignetting black circle (a diameter of 160 pixels) on a white ground, as shown in Fig. 5. The black circle is programmed to rotate with the following ellipse equation:

$$x(t) = 10 \cos(2\pi f t), \qquad y(t) = 6 \sin(2\pi f t). \tag{9}$$

The maximum displacements in the x- and y-directions are 10 and 6 pixels, respectively. The rotation frequency is set to 1 Hz, and the sampling frequency is 50 Hz. Four algorithms, namely, classical NCC, UCC, TSS with optical flow, and the proposed modified Taylor, are applied to extract the motion displacement of the moving circle. All code programming and computing works are accomplished with MATLAB R2015a.

Fig. 5 Simulation black circle on white ground and the selected tracking template.

During the testing, an 80 × 80 pixels region (within the red box) is selected as the tracking template for a stable tracking error.41 The UCC algorithm is an advanced subpixel image registration technique that allows resolution adjustment by changing the up-sampling factor.34 The up-sampling factors for the UCC algorithm are set as 1, 10, and 100 for subpixel levels of one integer pixel, 0.1 pixel, and 0.01 pixel, respectively. Meanwhile, the UCC algorithm cannot give accurate displacement results until the template size is large enough. The contrast test results under the same template condition are summarized in Table 1 and marked there.

Motion extraction results of the two integer-level methods, namely, NCC and UCC (usfac = 1), are shown in Figs. 6(a) and 6(b). These two algorithms only scan the template's best-matching region per pixel, and such a deficiency certainly leads to a step-type curve shape and reduces extraction accuracy. Results of subpixel-level motion extraction are shown in Figs. 6(c)–6(f). The figures show that the motion curves with the four subpixel-level algorithms are obviously smoother than the curves with NCC and UCC (usfac = 1).

Quantitative contrast results regarding tracking error and computation time are given in Table 1. The table indicates that with the improvement of subpixel resolution level from 1 to 0.01 pixel, the absolute average horizontal error of the UCC algorithm reduces from 0.2309 to 0.0548 pixel, and the absolute average vertical error reduces from 0.2378 to 0.0481 pixel. Meanwhile, the time consumed increases from 14.03 to 16.78 ms/frame.

Table 1 Errors and time consumption comparisons in the simulation test.

Algorithm            Max error x (pixel)  Max error y (pixel)  Abs. avg. error x (pixel)  Abs. avg. error y (pixel)  T_total (s)  T_avg (ms)
Classical NCC        0.6279               0.6231               0.2340                     0.2421                     4.46         17.77
UCC (usfac = 1)      0.5710               0.3742               0.2309                     0.2378                     2.45         9.89
UCC (usfac = 10)     0.1897               0.1520               0.0580                     0.0491                     3.51         14.03
UCC (usfac = 100)    0.1497               0.1220               0.0548                     0.0481                     4.20         16.78
TSS + optical flow   0.3383               0.3383               0.1151                     0.1126                     1.07         4.08
Modified Taylor      0.3327               0.2553               0.0817                     0.0681                     0.15         0.46
Modified Taylor^a    0.1590               0.1480               0.0531                     0.0463                     0.32         0.97

^a With the same template size as the UCC algorithm.

[Figure 6 shows the x- and y-displacement time histories (amplitude vs. time over 2 s) extracted by each algorithm, overlaid on the actual input.]

Fig. 6 Comparisons of displacement extraction results between the actual input and different algorithms. (a) Classical NCC algorithm, (b) UCC algorithm (usfac = 1), (c) UCC algorithm (usfac = 10), (d) UCC algorithm (usfac = 100), (e) TSS and optical flow, and (f) modified Taylor algorithm.
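For reference, the driving signal of Eq. (9) can be generated directly. The paper's simulation was written in MATLAB; this is an equivalent sketch in Python, with the parameter names being this sketch's own:

```python
import numpy as np

def circle_trajectory(f_hz=1.0, fs=50.0, duration=2.0):
    # Driving signal of Eq. (9): 10-pixel horizontal and 6-pixel
    # vertical amplitude at rotation frequency f_hz, sampled at fs.
    t = np.arange(0.0, duration, 1.0 / fs)
    x = 10.0 * np.cos(2.0 * np.pi * f_hz * t)
    y = 6.0 * np.sin(2.0 * np.pi * f_hz * t)
    return t, x, y
```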

When subjected to the requirement of template size, the UCC algorithm is not capable of giving dual attention to both accuracy and high efficiency. The error analysis of the combination of TSS and optical flow reveals that the combination method has a clear advantage in time consumed (4.08 ms/frame) with acceptable average errors (0.1151 pixel in the horizontal and 0.1126 pixel in the vertical). The proposed modified Taylor method is also observed to have the highest computation efficiency among all the listed methods. The average elapsed time of handling one frame is only 0.46 ms, with a relatively better error performance than the combined TSS and optical flow. Furthermore, with a large-size template, the modified Taylor even gives a similar error performance as UCC (usfac = 100) with an elapsed time per frame of just 1/17 of the latter.

Owing to the satisfactory performance in time efficiency and accuracy during displacement extraction in the simulation test, the modified Taylor algorithm was integrated into the real-time vision sensor system mentioned in Sec. 2. The software contains several modules. The high-speed camera module can control the parameters of the digital camera, such as contrast, brightness, and exposure time. The calibration part has the ability to compute the actual displacement of one pixel based on a target with an already known size. With the image-capturing part, the streaming image data can be acquired in real time and sent to the template tracking module, where the modified Taylor algorithm operates. The entire sensor system is implemented based on the Qt and OpenCV libraries and is capable of realizing the displacement measurement of actual structures.

5 Experimental Verification

5.1 Case 1: Experiment on a Grating Ruler Motion Platform

To evaluate the performance of the developed vision-based sensor system, an experimental verification was carried out on a laboratory platform with a conventional grating ruler, as shown in Fig. 7. Using the Moiré fringe technology of grating and photoelectric conversion, incremental grating displacement sensors widely act as high-accuracy displacement measurement tools with numerous advantages, such as stability, reliability, and high accuracy. The experimental installations are shown in Fig. 7(a). The grating ruler displacement sensor was installed on the moving table platform, with its reading moving synchronously with the junction plate in the horizontal direction. With this structure, the displacement of the target can be recorded simultaneously by the high-speed sensor system and the grating ruler for comparison. The sampling frequency of the grating ruler sensor used in the experiment is 20 Hz, and the grating pitch is 0.02 mm with a resolution of 1 μm.

Fig. 7 Experiment setup in grating ruler platform experiment. (a) Experimental device and (b) the cross target and selected template.

In the experiment, the vision-based high-speed camera system was experimentally evaluated against the grating ruler. As seen in Fig. 7(a), a circle target with a diameter of 20 mm was installed on the junction plate in advance. The target can be programmed to move with arbitrary amplitudes and frequencies in the horizontal direction. The video camera was placed at a stationary position 3 m away from the platform. The camera captured the moving target at a resolution of 160 × 160 pixels with 200 fps during the shooting process. To measure the displacement in real dimensions, the actual size of the preinstalled target panel in the video images was calculated. The results showed that 20 mm in real life corresponds to 104.7 pixels in the captured images, which means the pixel resolution would be 0.191 mm/pixel. A 50 × 50 pixels region on the target, as shown in Fig. 7(b), was chosen as the template matching image.

The guide screw was driven by a 10 s manual arbitrary input. As shown in Fig. 8, the horizontal displacement time history measured by the vision-based system was compared with that measured by the grating ruler sensor. The grating ruler data (green dashed line) matched well with the vision-based sensor data (blue dashed line). The integer-level tracking result is marked with a red solid line. The step-type result indicates that the integer-level algorithms can only acquire the integer-pixel motion of the target, which leads to a large measurement error.

Fig. 8 Results of displacement measurement on grating ruler moving platform.

Similar to the simulation test, the captured video was analyzed by the different motion extraction algorithms previously mentioned. Quantitative experimental results regarding the tracking error and computation time are given in Table 2. To further evaluate the error performance, the normalized root mean squared error (NRMSE) is introduced as

Table 2 Errors and time consumption comparisons in grating ruler assembled using an interference fit, and the mounting
motion platform experiment. plate is connected with the commutator pump using a lock-
Algorithm             Error_avg (mm)   NRMSE (%)   T_avg (ms)
Classical NCC              0.051           —           5.93
UCC (usfac = 1)            0.058           —           3.78
UCC (usfac = 10)           0.028          1.08        13.09
UCC (usfac = 100)          0.022          0.87        15.35
TSS + optical flow         0.025          1.01         2.54
Modified Taylor            0.020          0.75         0.22
Modified Taylor^a          0.019          0.73         0.45

^a With the same template size as the UCC algorithm.

NRMSE = \frac{\sqrt{\frac{1}{n} \sum_{i=1}^{n} (a_i - b_i)^2}}{b_{\max} - b_{\min}} \times 100\%,   (10)

where n denotes the number of frames, and a_i and b_i refer to the displacement data measured by the vision-based sensor and by the grating ruler, respectively. The results indicate that the vision sensor system with the modified Taylor algorithm has the lowest NRMSE of 0.75% and the fastest average computing time per frame of 0.22 ms. The absolute average measuring error of the modified Taylor algorithm was 0.020 mm. With a pixel resolution of 0.191 mm/pixel, the proposed sensor system achieved approximately 1/9-pixel accuracy in this experiment.
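For reference, the NRMSE metric of Eq. (10) takes only a few lines of NumPy. The arrays a and b follow the notation of the equation; the check data below are purely illustrative, not the experimental measurements:

```python
import numpy as np

def nrmse(a, b):
    """Normalized root-mean-square error of Eq. (10), in percent.

    a: displacements from the vision-based sensor (mm)
    b: reference displacements from the grating ruler (mm)
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    rmse = np.sqrt(np.mean((a - b) ** 2))
    return rmse / (b.max() - b.min()) * 100.0

# Illustrative check: a constant 0.01 mm error over a 2 mm range -> 0.5%
b = np.linspace(-1.0, 1.0, 100)   # reference motion (mm)
a = b + 0.01                      # sensor output with a 0.01 mm bias
print(round(nrmse(a, b), 2))      # -> 0.5
```

Because the error is normalized by the reference range, NRMSE allows a fair comparison between tests with different motion amplitudes.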

5.2 Case 2: Vibration Measurement of Steering Wheel System

To validate the effectiveness of the proposed sensor system in a practical environment, a vibration measurement experiment on a forklift's steering wheel system was conducted. The steering wheel system consists of several components, including the front panel, install panel, mounting plate, commutator pump, steering wheel, and tubular column. As shown in Fig. 9(a), the steering wheel and tubular column are connected by a locking device. The mounting plate and front panel are both welded to the install plate. Owing to design defects, resonance exists in the steering wheel system when the engine is working at idle speed (22.8 to 28.3 Hz). This resonance may lead to physical complaints and industrial accidents if drivers operate the forklift for a long time.

A finite element model (FEM) of the steering wheel system was built with Pro/Engineer, as shown in Fig. 9(b). The locking device between the commutator pump and the mounting plate was simplified into a bolted connection in the FEM modeling. Figure 9 also illustrates the grid results using a hexahedral mesh. Modal test results proved that a first-order natural frequency occurs at 22.3262 Hz. As shown in Fig. 10, the vibration mode of this frequency is a horizontal bending of the steering wheel. The FEM analysis confirmed the resonance speculation because this natural frequency is apparently close to the resonance frequency range.

Fig. 9 Model of the steering wheel system. (a) 3-D model of the steering wheel system and (b) hexahedral FEM mesh result.

Fig. 10 Horizontal steering wheel bending mode at 22.3262 Hz using FEM modeling.

The vision-based experimental setup on the forklift's steering wheel system is shown in Fig. 11. The high-speed camera sensor was installed on a special support to avoid additional interference. Measurement targets with a 10 mm × 10 mm size were marked on the upper surface of the steering wheel. The distance between the camera and the steering wheel was about 60 cm. The actual size of one pixel was 0.0375 mm/pixel, which was calculated by using the known physical size of the targets.
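The scale calibration just described reduces to a single ratio once the target's extent in pixels is measured. A minimal sketch follows; the pixel count 266.7 is hypothetical, chosen only so that the 10 mm target reproduces the reported 0.0375 mm/pixel scale:

```python
def mm_per_pixel(target_size_mm: float, target_size_px: float) -> float:
    """Physical size of one pixel, from a target of known physical size."""
    return target_size_mm / target_size_px

def to_millimeters(displacement_px: float, scale_mm_per_px: float) -> float:
    """Convert a displacement measured in pixels to physical units."""
    return displacement_px * scale_mm_per_px

# Hypothetical numbers: a 10 mm target spanning ~266.7 pixels in the image
# gives the 0.0375 mm/pixel scale reported in the experiment.
scale = mm_per_pixel(10.0, 266.7)
print(round(scale, 4))                        # -> 0.0375
print(round(to_millimeters(2.5, scale), 3))   # a 2.5-pixel motion -> ~0.094 mm
```

The same scale factor converts the subpixel outputs of the extraction algorithm directly into millimeters, so no additional calibration target is needed during the test.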


With the efficient modified Taylor algorithm, the vibration can be analyzed and displayed in real time.

Fig. 11 Vibration measurement experiment on a forklift's steering wheel system.

The horizontal and vertical vibration displacements, together with their Fourier spectra obtained after applying the modified Taylor algorithm to the vibration video, are shown in Fig. 12. The results show that the center of the steering wheel vibrates with an amplitude under 0.5 mm after excitation. Two obvious spectral peaks can be observed at 22.27 and 44.24 Hz in the Fourier spectra; these peaks can be considered the first-order master frequency of the steering wheel system and its double frequency. During the motion extraction process, the elapsed time for each frame was less than 0.4 ms, and more than 87% of the extractions were completed within 0.1 ms. The measured frequency is very close to the natural frequency obtained with the FEM analysis, with an acceptable error. Therefore, the same frequency can be obtained accurately from the proposed vision-based displacement sensor.

6 Conclusions

This study developed a vision-based high-speed sensor system for dynamic displacement measurement. The sensor system is composed of a high-speed camera head with a zoom optical lens and a notebook computer. To meet the requirement of real-time measurement, a motion extraction algorithm with high efficiency is used. With the combination of a block-matching algorithm and simplified optical flow, motion vectors between frames can be extracted accurately. The method is proven to be much faster than conventional algorithms because it needs no interpolation or motion compensation. However, this combination method still has room for improvement.

In our proposed algorithm, the integer-pixel searching is replaced with a rounding-iterative operation on the Taylor approximation. This simple modification does not add any unnecessary computation or pixel interpolation to the original method. Benefiting from requiring no additional parameter specification, the modified algorithm executes with high automation and runs even faster with better accuracy. Based on the assumption of brightness constancy (intensity conservation), the proposed algorithm obtains the displacement vector between frames in the least-squares sense and achieves fast automatic computation by iteratively updating the template's position. Without an image feature extraction process, the algorithm avoids the selection of thresholds and is completed through a simple matrix operation. The high efficiency, high precision, and good robustness of the proposed algorithm contribute to the applications of the high-speed camera sensor system.
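The scheme summarized above can be sketched in NumPy as follows. This is our reconstruction of the idea under the stated assumptions (brightness constancy, least-squares Taylor step, rounding-iterative window update); the function names and the synthetic test scene are ours, not the authors' implementation:

```python
import numpy as np

def flow_step(ref, cur):
    """One least-squares optical-flow estimate between two equal-size patches."""
    gy, gx = np.gradient(ref.astype(float))      # spatial gradients of the template
    it = cur.astype(float) - ref.astype(float)   # temporal difference
    A = np.stack([gx.ravel(), gy.ravel()], axis=1)
    d, *_ = np.linalg.lstsq(A, -it.ravel(), rcond=None)
    return d  # (dx, dy), subpixel displacement estimate

def estimate_displacement(frame, template, top_left, max_iter=5):
    """Rounding-iterative scheme: solve the Taylor/least-squares step, round the
    integer part to move the search window, keep the final subpixel remainder.
    Window bounds are not checked in this sketch."""
    x0, y0 = top_left
    h, w = template.shape
    total = np.zeros(2)
    d = np.zeros(2)
    for _ in range(max_iter):
        patch = frame[y0:y0 + h, x0:x0 + w]
        d = flow_step(template, patch)
        step = np.rint(d).astype(int)
        if step[0] == 0 and step[1] == 0:
            break                    # integer search converged; d is the subpixel part
        x0 += int(step[0]); y0 += int(step[1])
        total += step
    return total + d

# Demo on a smooth synthetic scene shifted by (3, 2) pixels:
yy, xx = np.mgrid[0:64, 0:64]
scene = np.exp(-((xx - 32.0) ** 2 + (yy - 32.0) ** 2) / 60.0)
moved = np.roll(np.roll(scene, 3, axis=1), 2, axis=0)
template = scene[20:44, 20:44]
dx, dy = estimate_displacement(moved, template, (20, 20))
print(round(dx, 1), round(dy, 1))    # recovers approximately (3, 2)
```

Because the window is moved by whole pixels only, no interpolation is ever performed; the subpixel accuracy comes entirely from the final least-squares solution.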

Fig. 12 Vibration displacements and the frequency spectra of the steering wheel system (horizontal and vertical directions; dominant spectral peaks at 22.27 and 44.24 Hz).
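The spectral peaks in Fig. 12 come from a standard discrete Fourier transform of the displacement history. A self-contained sketch with a synthetic signal is given below; the sampling rate, duration, and amplitudes are illustrative, not the experimental data:

```python
import numpy as np

fs = 500.0                        # illustrative sampling rate (Hz)
t = np.arange(0.0, 3.0, 1.0 / fs)
# Synthetic displacement (mm): fundamental plus a small double-frequency term
x = 0.11 * np.sin(2 * np.pi * 22.27 * t) + 0.005 * np.sin(2 * np.pi * 44.54 * t)

spec = np.abs(np.fft.rfft(x)) / len(x) * 2    # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
peak = freqs[np.argmax(spec)]
print(round(peak, 2))   # close to 22.27 Hz (within one frequency bin of 1/3 Hz)
```

With a 3-s record, the frequency resolution is 1/3 Hz, which is why the peak frequency read off the spectrum can differ slightly from the FEM value of 22.3262 Hz.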

A simulation on tracking the rotation motion of a black circle, as well as two experiments, on a grating ruler motion platform and on the vibration analysis of a steering wheel system, were conducted to verify the effectiveness of the modified algorithm and the developed sensor system. The results of displacement extraction using the modified algorithm are compared with the actual values and with the results of three other existing extraction algorithms. In the simulation test, a satisfactory agreement is observed between the real motion curve and the curve obtained through the modified algorithm. In the grating ruler motion platform experiment, the motion of the grating ruler platform is accurately measured using the developed sensor system. In a realistic environment, the performance of the vision sensor is further confirmed by the vibration analysis of the forklift's steering wheel system. In all the simulations and experiments, the modified algorithm shows superior computing efficiency: the average elapsed time for handling one frame can be reduced to less than 1 ms while maintaining an impressively small measurement error.

Although the brightness constancy assumption works well in practice, large variations in illumination intensity may still influence the measurement results and lead to large errors. Different from the multiframe improvement,39 the modified algorithm acquires the image basis by handling only one frame. This characteristic makes the method concise and highly efficient, but the differential operation may amplify image noise and cause undesired errors. The developed sensor system can only achieve real-time measurement at sampling frequencies below 500 Hz, which is limited by the camera module we can access. Future work will focus on improving the algorithm's robustness under large illumination changes and on developing a sensor system for high-frequency sampling above 500 Hz.

Acknowledgments

This work was supported by the Key Project of Natural Science of the Education Department of Anhui Province (No. KJ2015A316) and the Outstanding Young Talents at Home Visit the School Training Project (No. gxfxZD2016101).

References

1. Y. Zhang et al., “Application of the Fourier transform in electronic speckle photography,” Exp. Mech. 42, 409–416 (2002).
2. J. L. Valin et al., “Methodology for analysis of displacement using digital holography,” Opt. Laser Technol. 43, 99–111 (2005).
3. H. H. Nassif et al., “Comparison of laser Doppler vibrometer with contact sensors for monitoring bridge deflection and vibration,” NDT&E Int. 38, 213–218 (2005).
4. Y. Ji and C. Chang, “Nontarget stereo vision technique for spatiotemporal response measurement of line-like structure,” J. Eng. Mech. 134, 466–474 (2008).
5. D. L. B. R. Jurjo et al., “Experimental methodology for the dynamic analysis of slender structures based on digital image processing techniques,” Mech. Syst. Signal Process. 20, 1112–1133 (2006).
6. T. Wu et al., “Full-life dynamic identification of wear state based on on-line wear debris image features,” Mech. Syst. Signal Process. 42, 404–414 (2014).
7. Y. V. Filatov et al., “Noncontact measurement of angular position and angular movement by means of laser goniometer,” Opt. Eng. 54(5), 054103 (2015).
8. S. W. Park et al., “3D displacement measurement model for health monitoring of structures using a motion capture system,” Measurement 59, 352–362 (2015).
9. Y. Arai, “Development of in-plane and out-of-plane deformations simultaneous measurement method for the analysis of buckling,” Opt. Eng. 54(2), 024102 (2015).
10. P. J. Figueroa, N. J. Leite, and R. M. L. Barros, “Tracking soccer players aiming their kinematical motion analysis,” Comput. Vis. Image Understanding 101, 122–135 (2006).
11. R. Aharoni et al., “Real-time stand-off spatial detection and identification of gases and vapor using external-cavity quantum cascade laser open-path spectrometer,” Opt. Eng. 54(6), 067103 (2015).
12. F. Cheli et al., “Vision-based measuring system for rider’s pose estimation during motorcycle riding,” Mech. Syst. Signal Process. 38, 399–410 (2013).
13. Y. Shao, Y. Guo, and C. Gao, “Human action recognition using motion energy template,” Opt. Eng. 54(6), 063107 (2015).
14. F. C. Trigo et al., “Identification of a scaled-model riser dynamics through a combined computer vision and adaptive Kalman filter approach,” Mech. Syst. Signal Process. 43, 124–140 (2014).
15. J. Guo, “Dynamic displacement measurement of large scale structures based on the Lucas–Kanade template tracking algorithm,” Mech. Syst. Signal Process. 66, 425–436 (2015).
16. Y. Song et al., “Virtual visual sensors and their application in structural health monitoring,” Struct. Health Monit. 13, 251–264 (2014).
17. U. Yang et al., “Illumination-invariant color space and its application to skin-color detection,” Opt. Eng. 49(10), 107004 (2010).
18. J. J. Lee, H. N. Ho, and J. H. Lee, “A vision-based dynamic rotational angle measurement system for large civil structures,” Sensors 12, 7326–7336 (2012).
19. H. S. Park, “A new position measurement system using a motion-capture camera for wind tunnel tests,” Sensors 13, 12329–12344 (2013).
20. J. Sadek et al., “Development of a vision based deflection measurement system and its accuracy assessment,” Measurement 46, 1237–1249 (2013).
21. J. Morlier and G. Michon, “Virtual vibration measurement using KLT motion tracking algorithm,” J. Dyn. Syst. Meas. Control 132, 011003 (2010).
22. B. Ko and S. Kwak, “Survey of computer vision-based natural disaster warning systems,” Opt. Eng. 51(7), 070901 (2012).
23. H. Wang et al., “Vision-based vehicle detection and tracking algorithm design,” Opt. Eng. 48(2), 127201 (2009).
24. A. Jaume-i-Capo et al., “Automatic human body modeling for vision-based motion capture system using B-spline parameterization of the silhouette,” Opt. Eng. 51(2), 020501 (2012).
25. C. Quan et al., “Determination of three-dimensional displacement using two-dimensional digital image correlation,” Appl. Opt. 47, 583–593 (2008).
26. S. W. Kim et al., “Vision based monitoring system for evaluating cable tensile forces on a cable-stayed bridge,” Struct. Health Monit. 12, 440–456 (2013).
27. E. S. Bell, J. T. Peddle, and A. Goudreau, “Bridge condition assessment using digital image correlation and structural modeling,” in 6th Int. Conf. on Bridge Maintenance, Safety and Management, Stresa, Italy, pp. 330–337 (2012).
28. W. Tong, “Formulation of Lucas–Kanade digital image correlation algorithms for noncontact deformation measurements: a review,” Strain 49, 313–334 (2013).
29. J. W. Park et al., “Vision based displacement measurement method for high-rise building structures using partitioning approach,” NDT&E Int. 43, 642–647 (2010).
30. A. M. Wahbeh et al., “A vision-based approach for the direct measurement of displacements in vibrating systems,” Smart Mater. Struct. 12, 785–794 (2003).
31. Y. Fukuda et al., “Vision-based displacement sensor for monitoring dynamic response using robust object search algorithm,” IEEE Sensors J. 13, 4725–4732 (2013).
32. D. Feng et al., “A vision based sensor for noncontact structural displacement measurement,” Sensors 15, 16557–16575 (2015).
33. M. Guizar-Sicairos, S. T. Thurman, and J. R. Fienup, “Efficient subpixel image registration algorithms,” Opt. Lett. 33, 156–158 (2008).
34. J. J. Lee and M. Shinozuka, “Real-time displacement measurement of a flexible bridge using digital image processing techniques,” Exp. Mech. 46, 105–114 (2006).
35. D. You, X. Gao, and S. Katayama, “Monitoring of high power laser welding using high-speed photographing and image processing,” Mech. Syst. Signal Process. 49, 39–52 (2014).
36. P. Bing et al., “Performance of sub-pixel registration algorithms in digital image correlation,” Meas. Sci. Technol. 17, 1615–1621 (2006).
37. Z. Zhang and R. Wang, “Robust image super resolution method to handle localized motion outliers,” Opt. Eng. 48(7), 077005 (2009).
38. L. Li et al., “Subpixel flood inundation mapping from multispectral remotely sensed images based on discrete particle swarm optimization,” J. Photogramm. Remote Sens. 101, 10–21 (2015).
39. S. Chan et al., “Subpixel motion estimation without interpolation,” in 2010 IEEE Int. Conf. on Acoustics, Speech and Signal Processing, pp. 722–725 (2010).
40. D. Fleet and Y. Weiss, Handbook of Mathematical Models in Computer Vision, pp. 239–256, Springer, New York (2006).
41. X. Lei et al., “Vibration extraction based on fast NCC algorithm and high-speed camera,” Appl. Opt. 54, 8198–8206 (2015).

Bingyou Liu received his BS and MS degrees in detection technology and automation devices from Anhui Polytechnic University in 2003 and 2008, respectively. He is an associate professor at Anhui Polytechnic University. He has authored 15 journal papers. His current research interests include optoelectronic systems, intelligent control, and weak signal detection.

Dashan Zhang received his BS degree in mechanical engineering from Guizhou University in 2012. Currently, he is a PhD candidate in the Department of Precision Machinery and Precision Instrumentation at the University of Science and Technology of China. His research interests include image processing and optical measurement.

Jie Guo received his BS and PhD degrees in mechanical engineering from the University of Science and Technology of China, Hefei, China, in 2010 and 2015, respectively. Currently, he is a postdoctoral fellow with the Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China. His current research interests include machine vision, pattern recognition, machine learning, and fault diagnosis.

Chang’an Zhu received his BS degree from HeFei University of Technology in 1982, his MS degree from Xidian University in 1985, and his PhD from National University of Defense Technology in 1989. He is a professor at the University of Science and Technology of China. His current research interests include intelligent control, fault diagnosis technology, and advanced manufacturing technology.

Optical Engineering 114103-10 November 2016 • Vol. 55(11)

Downloaded From: https://www.spiedigitallibrary.org/journals/Optical-Engineering on 16 Mar 2021


Terms of Use: https://www.spiedigitallibrary.org/terms-of-use

You might also like