
Flexible Videogrammetric Technique for Three-Dimensional Structural Vibration Measurement

C. C. Chang, Associate Professor, Dept. of Civil Engineering, Hong Kong Univ. of Science and Technology, Clear Water Bay, Kowloon, Hong Kong. E-mail: cechang@ust.hk
Y. F. Ji, Ph.D. Candidate, Dept. of Civil Engineering, Hong Kong Univ. of Science and Technology, Clear Water Bay, Kowloon, Hong Kong. E-mail: toneyji@ust.hk

Abstract: A videogrammetric technique is proposed for measuring three-dimensional structural vibration response in the laboratory. The technique is based on the principles of close-range digital photogrammetry and computer vision. Two commercial-grade digital video cameras are used for image acquisition. To calibrate these cameras and to overcome potential lens distortion problems, an innovative two-step calibration process including individual and stereo calibration is proposed. These calibrations are efficiently done using a planar pattern arbitrarily shown at a few different orientations. This special characteristic makes it possible to perform an on-site calibration that provides flexibility in terms of using different camera settings to suit various application conditions. To validate the proposed technique, three tests, including sinusoidal motion of a point, a wind tunnel test of a cross-section bridge model, and a three-story building model under earthquake excitation, are performed. Results indicate that the proposed videogrammetric technique can provide fairly accurate displacement measurement for all three tests. The proposed technique is shown to be a good complement to traditional sensors for measuring two- or three-dimensional vibration response in the low-frequency range.

DOI: 10.1061/(ASCE)0733-9399(2007)133:6(656)

CE Database subject headings: Photogrammetry; Vibration; Dynamic response; Measurement; Imaging techniques.

Introduction

Measuring displacements of structures under dynamic excitations has been a difficult and challenging task. One common approach is to perform a double integration on acceleration data that can be easily measured using accelerometers. This integration process, however, is not readily automated as it requires selection of filters and baseline correction and use of judgment when anomalies exist in the records (Hudson 1979). Traditional displacement sensors, such as linear variable differential transformers (LVDTs), are accurate for measuring one-dimensional (1D) displacements. These sensors, however, require a stationary platform as the measurement reference. In many cases, this requirement is not desirable, especially when access to the structure is prohibited or hazardous. Recently, the global positioning system (GPS), which obtains the position of a receiver using a satellite ranging method, has emerged as a potential alternative displacement measurement technique, especially for large-scale structures (Çelebi and Sanli 2002). Current GPS technology is capable of providing 10-20 Hz sampling rates with an accuracy of ±1 cm horizontally and ±2 cm vertically. Note that the sensors mentioned above are all discrete sensors that provide displacement measurements at selected points on structures. When displacements at a large number of locations are desired, measuring them using these sensors can be prohibitively expensive if not impossible (Fu and Moosa 2002).

Photogrammetry is a measurement technique that uses photographs to establish the geometrical relationship between a three-dimensional (3D) object and its two-dimensional (2D) photographic images. When sequences of images are used to capture a spatial coordinate time history of the object, the technique can also be referred to as a videogrammetric technique. The photogrammetric technique can be classified as aerial or terrestrial depending on the types of photographs used in the analysis (Fraser 1998; Mikhail et al. 2001). Just as the name implies, the aerial photogrammetric technique uses photographs acquired from aircraft, satellites, or hot air balloons. On the other hand, the terrestrial photogrammetric technique, or the so-called close-range photogrammetric technique, uses photographs from ground-based cameras.

There have been a few civil engineering applications using close-range photogrammetric techniques. Bales (1985) reported on the use of a close-range photogrammetric technique for various bridge applications including estimation of crack sizes in a reinforced concrete bridge deck, deflection measurement of a rail bridge caused by thermal effects, and the effect of dead load on the girders of a three-span steel bridge under construction. It was concluded that an accuracy in the order of 3.2 mm can be achieved for measuring bridge deflections using a metric camera. Li and Yuan (1988) developed a 3D photogrammetric vision system consisting of TV cameras and 3D control points for measuring bridge deformation. A direct linear transform (DLT) method (Abdel-Aziz and Karara 1971) was employed for camera calibration. It was reported that the measurement accuracy could reach 0.32 mm, which was 1/2,000 of the field of view. Aw and Koo (1993) used three standard frame-transfer charge coupled device (CCD) cameras to determine 3D coordinates of premarked targets.


They were able to achieve an average accuracy of 2.24 mm by the bundle adjustment method. A two-video camera system was used by Whiteman et al. (2002) to measure vertical deflections of concrete beams during destructive testing. The cameras were calibrated using 16 images of a 3D retroreflective target field. Results showed that an accuracy of 0.25 mm could be achieved. Niederöst and Maas (1997) reported an application of a digital video camera to the measurement of deformation during the dehydration process of concrete parts over several months. The camera was calibrated for each epoch individually by photogrammetric self-calibration techniques. It was shown that a precision of 3 μm could be achieved over an object space with a largest dimension of 80 cm. The multiepoch deformation monitoring of a series of hot steel beams by digital close-range photogrammetry was conducted by Fraser and Riedel (2000). Three CCD cameras were used to measure both stable reference points and targets. Results from several beams showed that a dimensional tolerance of about 1 mm could be achieved. Fu and Moosa (2002) used monochrome images acquired from a CCD camera to measure the static deformation profile of a simply supported beam. A subpixel edge detection technique was used together with curve fitting to locate damage in the beam. Jáuregui et al. (2003) used a semimetric digital camera to measure vertical static deflection of bridges. A set of control points with precise 3D coordinates was used to establish the geometrical relationship between targets and their images.

For dynamic displacement measurement, Olaszek (1999) developed a method that incorporated the photogrammetric principle with the computer vision technique to investigate the dynamic characteristics of bridges. A sophisticated CCD camera system equipped with a telephoto lens was used to acquire sequential images that were digitized at 512 × 512 8-bit pixels. The accuracy of the method depended on factors such as system calibration, pattern form, and distortion of images caused by the optical and electronic systems. Results of two field tests revealed that the method was reliable with an accuracy of 0.1-1 mm for camera-pattern distances between 10 and 100 m and vibration frequencies under 5 Hz. Patsias and Staszewski (2002) used a videogrammetric technique to measure mode shapes of a cantilever beam. A wavelet edge detection technique was adopted to detect the presence of damage in the beam. A high-speed professional camera system with a maximum sampling rate of 600 frames per second (fps) was used to capture the beam vibration. The system, however, was limited to 2D planar vibration measurement since only one camera was used. Yoshida et al. (2003) used a videogrammetric technique to capture the 3D dynamic behavior of a membrane. Their measurement system consisted of three progressive CCD cameras with 1.3 million pixels and 30 fps. The cameras were calibrated using a calibration plate anchored on a linear guide system with accurate positioning capability. Chung et al. (2004) used digital image techniques for identifying nonlinear characteristics in systems. Applications showed that digital image processing could identify the Coulomb friction coefficient of a pendulum as well as the nonlinear behavior of a base-isolated model structure.

In the earlier development of photogrammetric techniques, researchers used high-precision industrial-grade metric cameras for image acquisition. These metric cameras are designed for accurate measurement and need to satisfy stringent optical and geometrical requirements. They are, however, quite expensive and require elaborate factory calibration. After calibration, the principal distance of these metric cameras cannot be changed, which limits the distance range of these cameras for application. With recent advances in electronic and optical technologies, commercial digital video cameras with high pixel resolution have become popular. These commercial digital cameras are low cost but are semimetric or even nonmetric and might suffer from a few problems such as lens distortion. To use these commercial cameras for photogrammetric applications, it is necessary to perform an accurate calibration to obtain their intrinsic and extrinsic parameters, preferably on-site prior to the application.

In this study, a videogrammetric technique is proposed for measuring 3D structural vibration response in the laboratory. The technique is based on the principles of close-range digital photogrammetry and computer vision. Two commercial-grade nonmetric digital video cameras are used for image acquisition. To calibrate these nonmetric cameras and overcome potential lens distortion problems, an innovative two-step calibration process including individual calibration and stereo calibration is proposed. These calibrations are efficiently done using a planar pattern arbitrarily shown at a few different orientations. This special characteristic makes it possible to perform an on-site calibration, which provides flexibility in terms of using different camera settings to suit various application conditions. To validate the proposed technique, three laboratory tests, including sinusoidal motion of a point, a wind tunnel test of a model bridge cross section, and a three-story building model under earthquake excitation, are performed.


Videogrammetric Measurement System

The proposed videogrammetric system consists of two commercial video cameras, a planar pattern for camera calibration, and a few targets to be attached on test objects. Steps to realize this system include a two-step camera calibration, target point tracking, and 3D point reconstruction. Details of these tasks are briefly discussed in the following.

Two-Step Camera Calibration

Individual Camera Calibration with Lens Distortion

Fig. 1. Pinhole camera model

Camera calibration is a process to determine a set of camera parameters that relate a 3D object to its 2D images. Camera parameters can be classified into two groups: intrinsic and extrinsic parameters. Intrinsic parameters define the geometric and optical characteristics of the camera, while extrinsic parameters describe the rotational and translational information of an image coordinate system with respect to a predefined global coordinate system. A general projective camera such as one equipped with a CCD sensor can be described using a pinhole model (see Fig. 1) as (Hartley and Zisserman 2003)

$\lambda \mathbf{m} = \mathbf{P}\mathbf{M}$  (1)

where $\mathbf{m} = (u, v, 1)^T$ = homogeneous vector of a 2D point in the image coordinate $uv$ system defined by pixels; $\mathbf{M} = (X, Y, Z, 1)^T$ = homogeneous vector of a 3D real point in the global coordinate $XYZ$ system; and $\lambda$ = scale factor. For a finite projective CCD camera, the $3 \times 4$ homogeneous camera projection matrix $\mathbf{P}$ contains the intrinsic parameters of the camera as well as the translational and rotational relations between the global coordinate $XYZ$ system and the camera coordinate $xyz$ system. This projection matrix can be expressed as

$\mathbf{P} = \mathbf{K}\mathbf{R}[\mathbf{I} \;\; \mathbf{t}]$  (2)

The camera calibration matrix $\mathbf{K}$ is of the form

$\mathbf{K} = \begin{bmatrix} \alpha_x & s & u_0 \\ 0 & \alpha_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$  (3)

where $\alpha_x$ and $\alpha_y$ = focal lengths of the camera in terms of pixel dimensions in the $u$- and $v$-directions, respectively; $s$ = skew parameter; and $u_0$ and $v_0$ = coordinates of the principal point in terms of pixel dimensions. In Eq. (2), $\mathbf{I}$ = $3 \times 3$ identity matrix; $\mathbf{R}$ = $3 \times 3$ rotation matrix; and $\mathbf{t}$ = $3 \times 1$ translation vector representing the orientation and the translation between the camera coordinate system and the global coordinate system, respectively. In total, there are five intrinsic parameters ($\alpha_x$, $\alpha_y$, $s$, $u_0$, and $v_0$) and six extrinsic parameters (three Euler rotation angles and three translation parameters). To determine these 11 parameters, it is necessary to provide at least six pairs of correspondences between nonplanar 3D points and their 2D images (Hartley and Zisserman 2003).
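As a concrete illustration of Eqs. (1)-(3), the short Python sketch below (not part of the paper; all numerical values and variable names are assumed purely for illustration) assembles an intrinsic matrix K, forms the projection matrix P = KR[I t], and projects a homogeneous 3D point to pixel coordinates.

```python
import numpy as np

# Assumed intrinsic parameters (pixels): focal lengths, skew, principal point
alpha_x, alpha_y, s, u0, v0 = 1200.0, 1200.0, 0.0, 640.0, 360.0
K = np.array([[alpha_x, s,       u0],
              [0.0,     alpha_y, v0],
              [0.0,     0.0,     1.0]])

# Assumed extrinsic parameters: a rotation about the y-axis and a translation (m)
theta = np.deg2rad(15.0)
R = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
              [ 0.0,           1.0, 0.0          ],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([[0.1], [0.0], [1.8]])

# Projection matrix P = K R [I | t], Eq. (2)
P = K @ R @ np.hstack([np.eye(3), t])

# Project a homogeneous 3D point M = (X, Y, Z, 1)^T, Eq. (1): lambda * m = P M
M = np.array([0.03, 0.03, 0.0, 1.0])
m = P @ M
u, v = m[0] / m[2], m[1] / m[2]   # divide out the scale factor lambda
print(f"pixel coordinates: ({u:.1f}, {v:.1f})")
```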
For metric cameras, off-site calibration methods using special equipment such as a multicollimator or goniometer are popular but time-consuming and expensive (Mikhail et al. 2001). For modern close-range photogrammetry using nonmetric cameras, on-site calibration becomes attractive since some comprehensive and easy-to-use analytic techniques have been developed (Gruen and Huang 2001). These techniques can be roughly classified into two categories: photogrammetric calibration and self-calibration. For photogrammetric calibration, a calibration object with precisely known geometry in the 3D space is required. For self-calibration, bundle triangulation is performed using a network of highly convergent overlapping images. In this study, a plane-based camera calibration method (Tsai 1987) that requires only 2D control information is adopted. This technique is more flexible than photogrammetric calibration and more robust than self-calibration (Zhang 1998). The technique uses a planar calibration pattern such as the one shown in Fig. 2(a). The pattern consists of 30 mm × 30 mm black and white squares printed from a laser printer with a resolution of 1,200 dots per inch.

Fig. 2. (a) Plane pattern used in camera calibration; (b) plane-based camera calibration

Assume that n images of the pattern shown at different orientations are acquired under a fixed focal length [Fig. 2(b)]. The rotation matrices and the translation vectors for these n images are expressed as $\mathbf{R}_i$ and $\mathbf{t}_i$, $i = 1, 2, \ldots, n$, respectively. On each of these images, l corner points are selected for calibration. Assuming the global coordinate system is fixed on and rotated with the pattern, these l corner points have the same coordinates on the n images, $\mathbf{M}_j = (X_j, Y_j, 0, 1)^T$, $j = 1, 2, \ldots, l$. Also, since the focal length of the camera is fixed during the image acquisition, the intrinsic matrix $\mathbf{K}$ is a constant matrix. The projected image coordinates of the jth corner point on the ith image from the pinhole model, $\mathbf{m}_{ij} = (u_{ij}, v_{ij}, 1)^T$, can be obtained from

$\lambda \mathbf{m}_{ij} = \mathbf{K}\mathbf{R}_i[\mathbf{I} \;\; \mathbf{t}_i]\mathbf{M}_j$  (4)

The camera parameters can then be obtained through the following optimization function:

$\text{Minimize} \; \sum_{i=1}^{n}\sum_{j=1}^{l} \|\hat{\mathbf{m}}_{ij} - \mathbf{m}_{ij}\|^2$  (5)

where $\hat{\mathbf{m}}_{ij}$ = actual observed image coordinates for the jth control point on the ith image. The objective of the optimization is to determine a total of 5 + 6n parameters (five intrinsic parameters and six extrinsic parameters for each image) that minimize the sum of distances between the projected image points $\mathbf{m}_{ij}$ and the actual observed image points $\hat{\mathbf{m}}_{ij}$. This optimization problem can be initialized at the algebraic solution suggested by Zhang (1998) for fast convergence.
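Plane-based calibration of this kind is implemented in common computer-vision libraries. The sketch below shows an assumed workflow (not the authors' code) using OpenCV, whose calibrateCamera routine follows the same planar-pattern approach: detected checkerboard corners play the role of the control points M_j, and the routine minimizes the reprojection error of Eq. (5), extended with distortion coefficients, over K, the distortion terms, and one (R_i, t_i) per view. The file names and pattern size are hypothetical.

```python
import glob
import numpy as np
import cv2  # OpenCV, assumed available

square = 30.0                      # 30 mm x 30 mm squares, as in the paper's pattern
pattern_size = (8, 6)              # interior corners per row and column (assumed)

# Planar control points M_j = (X_j, Y_j, 0)^T in the pattern coordinate system
grid = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
grid[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square

obj_points, img_points = [], []
for fname in glob.glob("calib_cam1_*.png"):    # hypothetical pattern images
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(grid)
    img_points.append(corners)

# Nonlinear minimization of the reprojection error (cf. Eqs. (5) and (7)),
# returning K, the distortion coefficients, and one (R_i, t_i) per view
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("mean reprojection error (pixel):", rms)
```

The same procedure would be repeated for the second camera before the stereo step described below.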


Compared with metric cameras, commercial cameras might suffer from a few problems that could affect their accuracy when used for photogrammetric applications. One notable problem is associated with lens distortion. Denote the distorted image coordinates of a point as $(\tilde{u}, \tilde{v}, 1)$. The corresponding ideal (or distortion-free) image coordinates $(u, v, 1)$ can be expressed as (Sturm and Maybank 1999)

$u = \tilde{u} + \bar{u}(k_1 r^2 + k_2 r^4) + k_3(r^2 + 2\bar{u}^2) + 2k_4 \bar{u}\bar{v}$
$v = \tilde{v} + \bar{v}(k_1 r^2 + k_2 r^4) + k_4(r^2 + 2\bar{v}^2) + 2k_3 \bar{u}\bar{v}$  (6a)

$\bar{u} = \tilde{u} - u_0, \qquad \bar{v} = \tilde{v} - v_0$  (6b)

$r^2 = \bar{u}^2 + \bar{v}^2$  (6c)

where $k_1$ and $k_2$ = coefficients of radial lens distortion; and $k_3$ and $k_4$ = coefficients of decentering lens distortion. To account for the lens distortion problem, the optimization function of Eq. (5) should be extended to include these distortion coefficients

$\text{Minimize} \; \sum_{i=1}^{n}\sum_{j=1}^{l} \|\hat{\mathbf{m}}_{ij} - \tilde{\mathbf{m}}_{ij}\|^2$  (7)

where $\tilde{\mathbf{m}}_{ij}$ = projection of point $\mathbf{M}_j$ on the ith image according to Eq. (4), followed by distortion according to Eq. (6).
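Written out as code, the correction of Eq. (6) is only a few lines. The sketch below is one reading of the model, assuming squared terms on ū and v̄ in the decentering part; the principal point and coefficient values are invented for illustration and are not from the paper.

```python
def undistort_point(u_t, v_t, u0, v0, k1, k2, k3, k4):
    """Map a distorted image point (u_t, v_t) to its ideal location following
    Eq. (6): radial terms k1, k2 and decentering terms k3, k4."""
    u_bar = u_t - u0
    v_bar = v_t - v0
    r2 = u_bar ** 2 + v_bar ** 2
    radial = k1 * r2 + k2 * r2 ** 2
    u = u_t + u_bar * radial + k3 * (r2 + 2.0 * u_bar ** 2) + 2.0 * k4 * u_bar * v_bar
    v = v_t + v_bar * radial + k4 * (r2 + 2.0 * v_bar ** 2) + 2.0 * k3 * u_bar * v_bar
    return u, v

# Assumed principal point and small distortion coefficients, for illustration only
print(undistort_point(700.0, 400.0, 640.0, 360.0,
                      k1=-2e-7, k2=1e-13, k3=1e-7, k4=-1e-7))
```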
Stereo Calibration

Fig. 3. Epipolar condition and imperfect projection for the two cameras

After the two cameras are calibrated individually, all the parameters of the two cameras should be simultaneously optimized based on the epipolar geometry principle (Hartley and Zisserman 2003), which is referred to herein as the stereo calibration. In the following, superscripts 1 and 2 are used to indicate those parameters associated with the first and the second cameras, respectively. As illustrated in Fig. 3(a), the cameras are indicated by their optical centers $C^1$ and $C^2$. According to the epipolar geometry principle, the two optical centers of the cameras, the point $\mathbf{M}$, and its image projection points $\mathbf{m}^1$ and $\mathbf{m}^2$ should lie on the so-called epipolar plane. This condition unfortunately cannot be guaranteed after individual camera calibration due to imperfection of projection. As a result, the two projective rays defined by the image points and the corresponding camera optical centers cannot converge and meet at point $\mathbf{M}$ in the 3D space, as shown in Fig. 3(b). This problem arises from errors in the estimated camera parameters. It is possible to minimize this geometric error, shown in Fig. 3(c), by performing another optimization for all the camera parameters (Hartley and Zisserman 2003)

$\text{Minimize} \; \sum_{i=1}^{n}\sum_{j=1}^{l}\sum_{s=1}^{2} \|\hat{\mathbf{m}}_{ij}^{s} - \tilde{\mathbf{m}}_{ij}^{s}\|^2$  (8)

In this optimization, the geometric relationship of the two cameras is established through a relative orientation matrix and a relative translation vector to reduce the number of parameters for optimization.
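Joint refinement of this kind is what stereo-calibration routines in standard libraries perform. The sketch below is an assumed OpenCV-based stand-in, not the authors' implementation: each camera's intrinsics and distortion are held fixed from the individual calibrations (a simplification of Eq. (8), which refines all parameters), and the relative rotation R and translation T between the two cameras are estimated by minimizing the reprojection error over both views. The resulting projection matrices are assembled for later triangulation.

```python
import cv2
import numpy as np

def stereo_calibrate(obj_points, img_points1, img_points2, K1, d1, K2, d2, image_size):
    """Jointly refine the two-camera geometry in the spirit of Eq. (8).

    obj_points: list of (N, 3) planar control points per view;
    img_points1/2: matching corner observations from cameras 1 and 2;
    K1, d1, K2, d2: intrinsics and distortion from the individual calibrations.
    """
    flags = cv2.CALIB_FIX_INTRINSIC   # keep the individually calibrated intrinsics
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 100, 1e-6)
    rms, K1, d1, K2, d2, R, T, E, F = cv2.stereoCalibrate(
        obj_points, img_points1, img_points2, K1, d1, K2, d2,
        image_size, flags=flags, criteria=criteria)

    # Projection matrices for triangulation: camera 1 at the origin, camera 2
    # displaced by the relative orientation (R) and translation (T)
    P1 = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K2 @ np.hstack([R, T])
    return rms, P1, P2, R, T
```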


Target Point Tracking

Fig. 4. Target point extraction

Fig. 5. Target point tracking by a correlation-based method

Measuring displacement at a selected location on an object is realized through the 3D tracking of a target point attached to this location. This target needs to be visible on every frame of the acquired image sequences. Fig. 4(a) shows a target that consists of four 30 mm × 30 mm black and white squares attached on a metal block that is then fixed on a shake table. The intersection point of the four squares is used as a target point whose trajectory will be tracked from the two sequences of images acquired from the two cameras. To automatically track this target point in the image sequences, a tracking algorithm consisting of the following three steps is used: (1) detecting candidate points from the image sequence using an image morphology technique (Gonzalez et al. 2004); (2) identifying most-likely points by a correspondence technique; and (3) locating the target point using the Harris corner detection technique (Harris and Stephens 1988). Fig. 4(c) shows the binary image skeleton of the zoomed-in target in Fig. 4(b) obtained using the image morphology technique. The intersections of the skeletons are identified as candidate points. On the first image of the sequence, a point that is visibly close to the target point is manually selected as a reference point. Most-likely points on the subsequent images are then identified using a correspondence technique. This technique calculates the following correlation coefficient $k_{ab}$ between the $r \times c$ pixel rectangular mask (termed Mask a in Fig. 5) centered at the reference point on the reference image and rectangular masks of the same size (termed Mask b in Fig. 5) centered at the candidate points on the subsequent images

$k_{ab} = \dfrac{\sum_{i=1}^{r}\sum_{j=1}^{c} [f_a(i,j) - \bar{f}_a][f_b(i,j) - \bar{f}_b]}{\sqrt{\sum_{i=1}^{r}\sum_{j=1}^{c} [f_a(i,j) - \bar{f}_a]^2 \times \sum_{i=1}^{r}\sum_{j=1}^{c} [f_b(i,j) - \bar{f}_b]^2}}$  (9)

where $f_a(i,j)$ and $f_b(i,j)$ = gray levels of pixel $(i,j)$ in Mask a and Mask b, respectively; and $\bar{f}_a$ and $\bar{f}_b$ = mean gray levels of Mask a and Mask b, respectively. Among the candidate points on a subsequent image, the most-likely point is the one that gives the largest correlation coefficient. After all most-likely points are identified from the image sequence, the Harris corner detection technique (Harris and Stephens 1988) is applied on the image sequences using these most-likely points as the initial conditions. The Harris corner detection technique can locate corner points in an image with subpixel accuracy as shown in Fig. 4(d). The corrected 3D coordinates are the ones that minimize this objective function; in the optimization described below, the coordinates obtained from linear triangulation can be used as the initial value.
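The matching step of Eq. (9) and the subsequent corner refinement can be sketched as follows. This is an assumed implementation, not the paper's code: the zero-mean normalized cross-correlation between Mask a and each candidate Mask b is Eq. (9) written directly in NumPy, and the winning candidate is then refined to subpixel accuracy with OpenCV's cornerSubPix, used here as a stand-in for the Harris-based corner localization.

```python
import numpy as np
import cv2

def correlation_coefficient(mask_a, mask_b):
    """Eq. (9): zero-mean normalized cross-correlation of two r x c gray-level masks."""
    fa = mask_a.astype(float) - mask_a.mean()
    fb = mask_b.astype(float) - mask_b.mean()
    denom = np.sqrt((fa ** 2).sum() * (fb ** 2).sum())
    return (fa * fb).sum() / denom if denom > 0 else 0.0

def track_target(frame, candidates, ref_patch, half=15):
    """Pick the candidate point whose surrounding mask best matches the reference
    mask (same size as ref_patch), then refine it to subpixel accuracy.
    frame is a grayscale image; candidates are (u, v) pixel positions."""
    best, best_k = None, -np.inf
    for (u, v) in candidates:                       # candidates from skeleton intersections
        patch = frame[v - half:v + half + 1, u - half:u + half + 1]
        if patch.shape != ref_patch.shape:
            continue
        k = correlation_coefficient(ref_patch, patch)
        if k > best_k:
            best, best_k = (u, v), k
    if best is None:
        return None
    # Subpixel refinement around the most-likely point (stand-in for the
    # Harris-based corner localization used in the paper)
    pt = np.array([[best]], dtype=np.float32)
    crit = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    cv2.cornerSubPix(frame, pt, (half, half), (-1, -1), crit)
    return pt[0, 0]
```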

Reconstruction of 3D Point

After all the camera parameters and the target points on the two image sequences are determined, the next task is to reconstruct the 3D coordinates of the target points from the two image sequences captured by the two cameras. This coordinate reconstruction is done using a nonlinear triangulation method (Hartley and Zisserman 2003). For the 3D point $\mathbf{M} = (X, Y, Z, 1)^T$, its undistorted image coordinates from Cameras 1 and 2 can be estimated from the observed distorted image coordinates using Eq. (6) and are expressed as $(u^1, v^1, 1)^T$ and $(u^2, v^2, 1)^T$, respectively. The reprojection from the two cameras can be expressed as the following equations:

$\lambda^1 [u^1, v^1, 1]^T = \mathbf{P}^1 \mathbf{M}$
$\lambda^2 [u^2, v^2, 1]^T = \mathbf{P}^2 \mathbf{M}$  (10)

Eq. (10) can be rewritten as the following equations:

$\mathbf{A}\mathbf{M} = \mathbf{0}$  (11a)

$\mathbf{A} = \begin{bmatrix} \mathbf{p}_1^1 - u^1\mathbf{p}_3^1 \\ \mathbf{p}_2^1 - v^1\mathbf{p}_3^1 \\ \mathbf{p}_1^2 - u^2\mathbf{p}_3^2 \\ \mathbf{p}_2^2 - v^2\mathbf{p}_3^2 \end{bmatrix}$  (11b)

where $\mathbf{p}_i^j$ denotes the ith row vector of the jth camera's projection matrix. This is a set of four homogeneous equations with only three unknowns. The 3D coordinates are the least-squares solution to these homogeneous equations. This estimate is based on linear triangulation, which does not take into account the geometric error associated with the epipolar constraint. To minimize the geometric error, the following nonlinear optimization should be performed:

$\text{Minimize} \; \sum_{i=1}^{2} \|\hat{\mathbf{m}}^i - \mathbf{m}^i\|^2 = \left(u^1 - \dfrac{\mathbf{p}_1^1\mathbf{M}}{\mathbf{p}_3^1\mathbf{M}}\right)^2 + \left(v^1 - \dfrac{\mathbf{p}_2^1\mathbf{M}}{\mathbf{p}_3^1\mathbf{M}}\right)^2 + \left(u^2 - \dfrac{\mathbf{p}_1^2\mathbf{M}}{\mathbf{p}_3^2\mathbf{M}}\right)^2 + \left(v^2 - \dfrac{\mathbf{p}_2^2\mathbf{M}}{\mathbf{p}_3^2\mathbf{M}}\right)^2$  (12)
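Under assumed inputs (the two 3 × 4 projection matrices from the stereo calibration and one matched, undistorted image point per camera), Eqs. (10)-(12) translate into the following sketch: the homogeneous system AM = 0 is solved by singular value decomposition, and the result is then refined by minimizing the reprojection error of Eq. (12) with a general-purpose least-squares routine. This is an illustrative transcription, not the authors' code.

```python
import numpy as np
from scipy.optimize import least_squares

def triangulate_linear(P1, P2, uv1, uv2):
    """Least-squares solution of the homogeneous system of Eq. (11)."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.vstack([P1[0] - u1 * P1[2],
                   P1[1] - v1 * P1[2],
                   P2[0] - u2 * P2[2],
                   P2[1] - v2 * P2[2]])
    _, _, Vt = np.linalg.svd(A)
    M = Vt[-1]
    return M / M[3]                       # homogeneous 3D point (X, Y, Z, 1)

def reprojection_residuals(xyz, P1, P2, uv1, uv2):
    """Residuals of Eq. (12) for a candidate 3D point."""
    M = np.append(xyz, 1.0)
    res = []
    for P, (u, v) in ((P1, uv1), (P2, uv2)):
        m = P @ M
        res += [u - m[0] / m[2], v - m[1] / m[2]]
    return res

def triangulate(P1, P2, uv1, uv2):
    """Linear triangulation followed by nonlinear refinement, Eq. (12)."""
    M0 = triangulate_linear(P1, P2, uv1, uv2)[:3]
    sol = least_squares(reprojection_residuals, M0, args=(P1, P2, uv1, uv2))
    return sol.x                           # refined (X, Y, Z)
```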



Experimental Studies

To evaluate the accuracy of the proposed videogrammetric technique, three dynamic experiments were performed. Two digital video cameras were used for image sequence acquisition. The cameras came with a 1/3 in. 1.18 million-pixel progressive CCD and could record high-definition images with a pixel resolution of 1,280 × 720 at 29.97 fps. They were equipped with 10× optical and 200× digital zoom, with an F1.8 lens whose focal length ranged between 5.2 and 52 mm. The two cameras were calibrated prior to each experiment using the planar pattern shown in Fig. 2(a). A laser pointer was used to synchronize the two cameras during the experiments. The two sequences of images recorded by the two cameras were manually synchronized at the images on which the laser dot was first observed.

Experiment 1: Measurement of Harmonic Motion

In this experiment, one target was fixed on a shake table that could be programmed to move in one or two directions with an arbitrary amplitude and frequency. The two cameras were placed at about 1.8 m away from the target and stood at about 1.5 m above the ground (see Fig. 6). The angle between the two cameras was about 30°. For individual and stereo calibration, the calibration pattern was placed in front of the target. Seven pairs of images were recorded by the two cameras under a fixed focal length. Each image pair corresponded to a different orientation of the pattern. Note that as long as the corner points on the pattern could be clearly recorded on the images, precise locations and rotation angles of the pattern were not important to the calibration. Three different focal lengths, 5.2, 15.6, and 20.8 mm, were used to illustrate the accuracy of the videogrammetric measurement. Note that different focal lengths result in different physical dimensions of the field of view, as illustrated in Fig. 7. As a result, the physical size corresponding to a pixel obtained under one focal length would be different from that of a pixel obtained under a different focal length.

Fig. 6. Experiment setup for sinusoidal motion

Fig. 7. Images acquired by Camera 1 using three different focal lengths

The two cameras were first individually calibrated before being subjected to the stereo calibration as outlined previously. Table 1 shows the average pixel errors obtained from Eq. (8) between the actual observed corner points on the images and the reprojected corner points calculated from the obtained camera parameters for the three focal lengths. It is seen that the means of the reprojection errors are all smaller than 1 pixel, which indicates that subpixel accuracy is obtained after the calibration. It is noted that both the mean and the standard deviation of the reprojection error increase as the focal length increases. This is because the pattern becomes larger on the image and covers more pixels as the focal length increases, as seen from Fig. 7. Table 1 also shows the average errors in the global coordinates between the actual corner points and the reconstructed corner points obtained from Eq. (12) for the three focal lengths. It is seen that the error decreases as the focal length increases due to the decrease of the physical size of a pixel. These results suggest that the focal length should be set as large as possible to increase the measurement accuracy.

Table 1. Reprojection and Reconstruction Errors after Camera Calibration

Focal length (mm)   Reprojection error (pixel)      Reconstruction error (mm)
                    Mean    Standard deviation      Mean    Standard deviation
5.2                 0.09    0.10                    0.45    0.25
15.6                0.12    0.16                    0.24    0.12
20.8                0.14    0.19                    0.15    0.07

After the calibration, the shake table was programmed to move along the X- and Y-directions, respectively, in sinusoidal motion of 20-mm amplitude. Five frequencies were tested, 0.5, 1, 2, 3, and 5 Hz, under the three different focal lengths. Fig. 8 shows the results obtained with a focal length of 20.8 mm under frequencies of 1, 2, and 5 Hz, respectively. It is seen that the proposed videogrammetric measurement technique is able to track the sinusoidal motion of the target quite closely. Also shown in Fig. 8 are the power spectral densities (PSDs) of these three measured results. The PSDs of the displacement time history output directly from the shake table are also plotted in Fig. 8 for comparison. Results show that the PSDs obtained from the videogrammetric technique match quite well with those obtained from the shake table output. It is seen that the vibration frequencies can be accurately determined from the peaks of the PSDs.

Fig. 8. Displacement time histories and power spectral densities for sinusoidal motion
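As a generic illustration of this kind of frequency check (not the authors' processing script), the PSD of a sampled displacement record can be estimated with Welch's method; the record below is a synthetic 20-mm, 2-Hz sinusoid with added noise, sampled at an assumed rate equal to the cameras' 29.97 fps.

```python
import numpy as np
from scipy.signal import welch

fs = 29.97                                    # camera frame rate (Hz)
t = np.arange(0, 30.0, 1.0 / fs)
# Stand-in for a measured displacement history: 20-mm, 2-Hz sinusoid plus noise
x = 20.0 * np.sin(2.0 * np.pi * 2.0 * t) + 0.4 * np.random.randn(t.size)

f, psd = welch(x, fs=fs, nperseg=256)         # power spectral density estimate
print("dominant frequency: %.2f Hz" % f[np.argmax(psd)])
```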


Table 2 further shows the means and the standard deviations of the videogrammetric measurement errors for the 1D sinusoidal motion. While the means of the measurement errors show the accuracy of the proposed technique, the standard deviations provide an indication of the minimally resolvable displacement. It is seen that the means of the measurement errors are rather constant and range between 0.12 and 0.29 mm for a vibration frequency of 0.5 Hz and between 0.26 and 0.31 mm for a vibration frequency of 5.0 Hz. The standard deviations of the measurement errors, however, increase from about 0.4 mm for 0.5 Hz to about 1.4 mm for 5.0 Hz. The increase of the standard deviations is possibly due to the following two reasons: (1) increasing vibration frequency implies a higher target speed, which results in more blurry images and larger measurement variation; and (2) increasing vibration frequency leads to more pronounced frame correspondence error from the image synchronization. These standard deviations provide an indication of the resolution of this videogrammetric measurement technique for the frequency range of 0.5-5.0 Hz.

Table 2. Means and Standard Deviations of Videogrammetric Measurement Errors for One-Dimensional 20-mm Sinusoidal Motion

Direction of motion   Focal length (mm)   0.5 Hz          1.0 Hz          2.0 Hz          3.0 Hz          5.0 Hz
                                          Mean    SD      Mean    SD      Mean    SD      Mean    SD      Mean    SD
X                     5.2                 -0.29   0.48    -0.26   0.56    0.25    0.62    -0.32   0.90    0.31    1.44
X                     15.6                -0.21   0.40    0.24    0.52    -0.26   0.63    0.30    0.84    0.32    1.30
X                     20.8                -0.14   0.38    -0.18   0.42    0.24    0.59    -0.26   0.71    0.30    1.26
Y                     5.2                 -0.28   0.38    -0.29   0.50    0.18    0.61    -0.21   0.94    0.29    1.47
Y                     15.6                -0.18   0.33    -0.25   0.46    -0.21   0.58    -0.26   0.61    0.27    1.24
Y                     20.8                -0.12   0.31    -0.19   0.36    -0.22   0.51    0.18    0.62    0.26    1.29
Note: SD = standard deviation; errors are in mm.

Next, the shake table was programmed to move in a 2D circular motion with an amplitude of 100 mm and a frequency of 1 Hz. Fig. 9 shows the results of the videogrammetric measurement as compared to the shake table output. It is seen that the videogrammetric technique is able to track this circular motion quite closely. The largest discrepancy is less than 1 mm on the X-Y plane and less than 0.5 mm along the vertical direction.

Fig. 9. Videogrammetric tracking of a 2D circular motion

The results obtained above were calculated using a Pentium-4 2.6 GHz personal computer with 1.5 GB RAM. The processing time for the two-step calibration of the two cameras using seven pairs of planar pattern images was about 4 min. The time for tracking a target point was about 0.6 s per image frame per camera. The time required for coordinate reconstruction of a 3D point was in the order of 1 ms. These computational time requirements indicate that the proposed videogrammetric technique cannot yet be used for real-time measurement.

Experiment 2: Measurement of Bridge Section Model in Wind Tunnel

Fig. 10(a) shows a wind tunnel setup for a bridge sectional model test. The sectional model consisted of two wooden box girders connected by cross bars. The model was rigidly connected to circular rotatable shafts. The shafts were then attached to rigid rectangular bars that were supported by four linear springs to provide the necessary stiffness for the section model.

Fig. 10. (a) Wind tunnel test for a bridge section model; (b) layout of targets and LVDTs


Two laser LVDTs were positioned at locations as shown in Fig. 10(b) to measure the displacement time histories during the test. Two targets were placed at the locations shown to validate the applicability of the videogrammetric technique for such a test. The cameras were placed at about 1.2 m from the targets and the angle between the two cameras was about 30°. The focal lengths of the two cameras were both set at 15.6 mm. The model was pulled by a string and released to vibrate under a mild and steady wind condition. As the supporting rectangular bars were rigid, the readings from the laser LVDTs could be linearly scaled to the deformations at the two target locations for comparison. Fig. 11 shows the comparison between the videogrammetric measurement and the LVDT measurement at the two target locations. The close match between the two measured results indicates that the videogrammetric technique is able to track the two targets quite accurately.

Fig. 11. Videogrammetric measured responses at Targets A and B

Experiment 3: Measurement of a Three-Story Model under Earthquake Excitation

In this test, the videogrammetric technique was used to measure displacement time histories at a few selected points on a three-story structural model under earthquake excitation. The model was made out of aluminum members with a uniform story height of 0.38 m and a total height of 1.21 m. Four targets, A, B, C, and D, were attached to the base and the three stories as shown in Fig. 12. Four laser LVDTs were set up to measure the displacement time histories for the shake table and the three stories. The setup of the two cameras was the same as that shown in Fig. 6. To excite the model, the NS component of the 1940 El Centro earthquake ground displacement record was input to the shake table. Fig. 13 shows the displacement responses for these four targets. It is again seen that the displacement time histories measured by the videogrammetric technique match closely with those measured by the LVDTs.

Fig. 12. Setup for a three-story structural model under earthquake excitation

Fig. 13. Displacement responses of the three-story model at Targets A and B

Concluding Remarks

This paper presents a noncontact dual-camera videogrammetric technique for 3D structural vibration response measurement. The technique is based on the principles of close-range digital photogrammetry and computer vision and consists of the following steps: a two-step camera calibration, target point tracking, and 3D target point reconstruction. The novelty of this technique is the implementation of a two-step camera calibration process that includes individual and stereo calibration. These calibrations are done efficiently using a planar pattern arbitrarily shown at a few different orientations. The calibrations can be done on site prior to the measurement, which provides great flexibility in terms of using different camera settings to suit various application conditions.


Three experiments are conducted to verify the accuracy of this technique. Results show that subpixel reprojection errors can be obtained after the camera calibration. When used to track 1D sinusoidal motion, the means of the measurement errors are shown to range between 0.1 and 0.3 mm for vibration frequencies under 5 Hz. The standard deviations of the measurement errors are about 0.4 mm for a 0.5 Hz frequency and increase to about 1.4 mm for a 5 Hz frequency. The increase of the standard deviations is possibly due to increasing image blurriness and more pronounced frame correspondence error associated with synchronization at higher vibration frequencies. For all three experiments conducted, the tracking of targets matches quite well with the results measured from the LVDTs. In this study, two commercial digital video cameras with a pixel resolution of 1,280 × 720 at 29.97 fps are used. It is expected that more accurate results can be obtained if cameras with a higher pixel resolution are used. Regarding the computational effort required for the technique, it is found that tracking a target point takes about 0.6 s per image frame per camera using a Pentium-4 2.6 GHz personal computer. This time requirement indicates that the technique cannot yet be used for real-time measurement. As such, the proposed videogrammetric technique is better regarded as a complement to traditional sensors, especially for measuring two- or three-dimensional vibration response in the low-frequency range.

Acknowledgments

This study is supported by the Hong Kong Research Grants Council Competitive Earmarked Research Grant HKUST6294/03E.

Notation

The following symbols are used in this paper:
f = pixel gray level;
I = identity matrix;
K = camera intrinsic matrix;
k = camera distortion coefficient;
M = homogeneous vector of 3D real point;
m = homogeneous vector of image point;
P = camera projection matrix;
R = rotation matrix;
s = skewness factor of camera intrinsic matrix;
t = translation vector;
u = pixel coordinate in the u-direction;
v = pixel coordinate in the v-direction;
X = coordinate of 3D real point in the X-direction;
Y = coordinate of 3D real point in the Y-direction;
Z = coordinate of 3D real point in the Z-direction;
α = scale factor of camera intrinsic matrix; and
λ = scale factor of pinhole model.

References

Abdel-Aziz, Y. I., and Karara, H. M. (1971). "Direct linear transformation from comparator coordinates into object space coordinates in close-range photogrammetry." Proc., ASP Symp. on Close-Range Photogrammetry, Univ. of Illinois at Urbana-Champaign, Urbana, Ill., 1-18.
Aw, Y. B., and Koo, T. K. (1993). "Phototriangulation using CCD camera in close-range environment." J. Surv. Eng., 119(2), 52-58.
Bales, F. B. (1985). "Close-range photogrammetry for bridge measurement." Transportation Research Record 950, Transportation Research Board, Washington, D.C., 39-44.
Çelebi, M., and Sanli, A. (2002). "GPS in pioneering dynamic monitoring of long-period structures." Earthquake Spectra, 18(1), 47-61.
Chung, H., Liang, J., Kushiyama, S., and Shinozuka, M. (2004). "Digital image processing for nonlinear system identification." Int. J. Non-Linear Mech., 39(5), 691-707.
Fraser, C. S. (1998). "Some thoughts on the emergence of digital close range photogrammetry." Photogramm. Rec., 16(91), 37-50.
Fraser, C. S., and Riedel, B. (2000). "Monitoring the thermal deformation of steel beams via vision metrology." ISPRS J. Photogramm. Remote Sens., 55(4), 268-276.
Fu, G., and Moosa, A. G. (2002). "An optical approach to structural displacement measurement and its application." J. Eng. Mech., 128(5), 511-520.
Gonzalez, R. C., Woods, R. E., and Eddins, S. L. (2004). Digital image processing using Matlab, Pearson Education, Upper Saddle River, N.J.
Gruen, A., and Huang, T. S. (2001). Calibration and orientation of cameras in computer vision, Springer, Berlin.
Harris, C., and Stephens, M. (1988). "A combined corner and edge detector." Proc., Fourth Alvey Vision Conf., Manchester, U.K., 147-151.
Hartley, R., and Zisserman, A. (2003). Multiple view geometry in computer vision, 2nd Ed., Cambridge University Press, Cambridge, U.K.
Hudson, D. E. (1979). "Reading and interpreting strong motion accelerograms." EERI engineering monographs on earthquake criteria, structural design, and strong motion records, Vol. 1, Earthquake Engineering Research Institute, Berkeley, Calif.
Jáuregui, D. V., White, K. R., Woodward, C. B., and Leitch, K. R. (2003). "Noncontact photogrammetric measurement of vertical bridge deflection." J. Bridge Eng., 8(3), 212-222.
Li, J. C., and Yuan, B. Z. (1988). "Using vision technique for bridge deformation detection." Proc., Int. Conf. on Acoustics, Speech and Signal Processing, New York, 912-915.
Mikhail, E. M., Bethel, J. S., and McGlone, J. C. (2001). Introduction to modern photogrammetry, Wiley, New York.
Niederöst, M., and Maas, H. G. (1997). "Automatic deformation measurement with a digital still video camera." Optical 3D measurement techniques IV, Zurich, Wichmann, Karlsruhe, Germany, 266-271.
Olaszek, P. (1999). "Investigation of the dynamic characteristic of bridge structures using a computer vision method." Measurement, 25(3), 227-236.
Patsias, S., and Staszewski, W. J. (2002). "Damage detection using optical measurements and wavelets." Struct. Health Monit., 1(1), 7-22.
Sturm, P. F., and Maybank, S. J. (1999). "On plane-based camera calibration: A general algorithm, singularities, applications." Proc., IEEE Conf. on Computer Vision and Pattern Recognition, Fort Collins, Colo., 432-437.
Tsai, R. Y. (1987). "A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses." IEEE J. Rob. Autom., 3(4), 323-344.
Whiteman, T., Lichti, D. D., and Chandler, I. (2002). "Measurement of deflections in concrete beams by close-range digital photogrammetry." Proc., Joint Int. Symp. on Geospatial Theory, Processing, and Applications (CD-ROM), Ottawa, Canada.
Yoshida, J., Abe, M., Kumano, S., and Fujino, Y. (2003). "Construction of a measurement system for the dynamic behaviors of membrane by using image processing." Proc., Structural Membrane Int. Conf. on Textile Composites and Inflatable Structures, Barcelona, Spain.
Zhang, Z. (1998). "A flexible new technique for camera calibration." Technical Rep. MSR-TR-98-71, Microsoft Research, Microsoft Corporation, Redmond, Wash.
