www.elsevier.com/locate/soildyn
Abstract
Common problems encountered in automatic digitization of strong motion accelerograms, recorded on film, are presented and discussed.
These include synchronization of the time scale for the three components of motion, non-uniform film speed, trace following in case of
scratches or trace crossings, distortions from high contrast preprocessing of the scanned image, and trace “rotation” resulting from rotated
position of the scanned film record. Procedures for correcting or eliminating these problems are suggested. The image processing hardware
has developed so much during the past 20 years, that at present it exceeds the technical requirements for processing strong motion
accelerograms. The problems described in this paper result from lack of training of the operators and lack of quality control in the process,
which still seems to be esoteric and highly specialized. This situation may have been caused by the low demand by the engineering profession
for high quality and large volume of strong motion data. © 1999 Elsevier Science Ltd. All rights reserved.
Keywords: Automatic digitization of accelerograms; Accelerogram data processing; Strong motion data; Accelerogram image processing
Fig. 3. The 1994 Northridge earthquake accelerogram at Sylmar Converter Station-East (free-field): (a) first 2 s; and (b) beginning of the vertical trace.
522 M.D. Trifunac et al. / Soil Dynamics and Earthquake Engineering 18 (1999) 519–530
example in inverse analyses of the earthquake source mechanism (“random” time delays in the digitized data affect the numerical stability of the inversion), in applications that require a linear combination of the recorded components of motion, such as computation of the radial and transverse components of motion (the delays affect the accuracy of the peak amplitudes in the rotated directions; [12]), in analyses of building response from the wave propagation viewpoint, or in the correction of accelerograms for cross-axis sensitivity and transducer misalignment (these corrections are meaningless unless the three components are synchronized; [6–8,13]).

The difficulty in selecting the first point to be digitized is due to the fact that the traces are weak immediately after trigger (while the light bulb warms up, during <0.1 s), and the position of the first digitized point changes with different choices of the threshold level. This is illustrated in Figs. 3 and 4. Part (a) of Fig. 3 shows the first 2 s of a film record, and part (b) shows an enlargement of the beginning of the V-trace. Fig. 4 (an illustration of an image in LeTV) shows the position of the first digitized point and the bitmap (raw digitized data) for different choices of the threshold level of gray (between 200 and 240). It is seen that for a lower threshold level the trace width increases and the trace beginning “moves” to the left, while the opposite is true when the threshold level is increased. This situation is further complicated by the fact that the trace “darkness” on the film, translated to trace “thickness” after scanning, is in general different for each trace on the same film, and depends not only on the time after trigger but also on the amplitude of the recorded motions (larger amplitude results in a “lighter”, i.e. “thinner”, trace for any given constant threshold level). The examples in Figs. 3 and 4 are from a record of the 1994 Northridge earthquake; gray threshold levels of 180–240 were common in digitization of the Northridge records.

Fig. 4. Illustration of a bitmap image of a trace beginning for different threshold levels (200, 215 and 235), and the corresponding outcomes of automatic digitization.

Because of the above, choosing one common threshold level for the entire accelerogram (which was the default procedure in the old versions of program LeTrace) will result in variations of the origin time by at least several pixels. Other parameters specified for the trace following, e.g. the minimum trace width (usually selected as 2–4 pixels for digitization at 600 dpi), will also contribute errors to the selection of the first point, and the overall uncertainty of the origin of the time axes can exceed 0.01–0.02 s (2–5 pixels for 600 dpi digitization resolution). One way to reduce these errors is intervention by the operator, who can choose the first point manually, but this is time consuming and subjective.
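The effect of the threshold level on the detected trace beginning can be illustrated with a short sketch (Python with NumPy; this is not the LeTrace code, and the gray-level convention, function names and synthetic trace are assumptions of the sketch). Matching the behavior described above, a pixel is counted as trace when its gray level is at or above the threshold (larger values are darker), so lowering the threshold accepts the weak pixels just after trigger and moves the detected beginning to the left:

```python
import numpy as np

def first_trace_column(gray, threshold, min_width=3):
    """Return the first scanner column whose run of trace pixels is at
    least min_width pixels tall -- a proxy for the first digitized point.
    Convention (an assumption of this sketch): larger gray value means
    a darker pixel, so a pixel belongs to the trace when gray >= threshold."""
    trace = gray >= threshold
    for col in range(gray.shape[1]):
        run = best = 0
        for pixel in trace[:, col]:
            run = run + 1 if pixel else 0
            best = max(best, run)
        if best >= min_width:          # wide enough to be trace, not noise
            return col
    return None

# Synthetic trace that darkens gradually after trigger (the light bulb
# warming up): gray level grows from 190 to 250 along the time axis.
gray = np.zeros((20, 100), dtype=np.uint8)
for col in range(100):
    gray[8:12, col] = min(250, 190 + 2 * col)

print(first_trace_column(gray, 200))   # lower threshold: beginning detected earlier
print(first_trace_column(gray, 240))   # higher threshold: beginning detected later
```

For scale: at 600 dpi and a film speed of 1 cm/s, one pixel corresponds to 2.54/600 ≈ 0.0042 cm, i.e. about 4 ms, so a 2–5 pixel shift in the detected beginning matches the 0.01–0.02 s uncertainty quoted above.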
Fig. 5. The vertical trace of the 1994 Northridge earthquake accelerogram at Sylmar Converter Station-Valve Group 1–6 (basement): (a) the beginning; (b) the
beginning enlarged by macro-photography, showing gradual build up of optical density; (c) the beginning enlarged by a repeated Xerox process and the “old”
digitized version of the record. The dashed line shows the “old” digitization.
Fig. 6. Beginning of the accelerogram of the 20 March, 1994, Northridge aftershock at Sylmar Converter Station, Valve Group 7: (a) ground floor; and (b) free-
field site. Two digitized versions, the “old” and the “new”, are shown (by a dashed and a solid line), and the time delay between the two.
To increase the efficiency of this step and to eliminate the operator's subjectivity, since 1994 we have used a new special-purpose algorithm which automatically determines the “best” starting point for each acceleration trace. This algorithm was tested on hundreds of accelerograms and was found to be successful (error less than one pixel) in about 95% of the cases. Difficult cases would still require operator intervention. An example of a circumstance that can complicate this task is when the gap between the end of the previous record and the onset of the current record is too short, or the traces overlap, so that they appear as continuous on the scanned image. When such traces are displaced (i.e. the trace starts with a large amplitude), this problem is eliminated. An illustration of such “overlap” is shown in
Fig. 5. The dashed line in part (c) is an example of an
inaccurately digitized trace (0.05 s was omitted in the begin-
ning).
Fig. 6 shows an example of raw data images with inaccurately (arbitrarily) digitized trace beginnings (the dashed line, referred to as the “old” digitization) and trace beginnings digitized with the help of the new features of the programs LeFilm, LeTV and LeTrace. In this example, the magnitude of the error for the “old” digitization is difficult to interpret because of the lack of apparent reasoning that guided the operator. For example, it is not clear why the “old” digitization of the L trace in Fig. 6(b) starts early with a “ramp” of ~0.05 s, in a manner not related to the raw data. Parts (a)
and (b) show, respectively, motions recorded at the ground
floor of a structure and in the free-field (approximately 25 m
away towards southwest), both recorded by the same multi-
channel recorder (CR-1), on the same film and with
common trigger time. Such records are invaluable for soil-
structure interaction studies, and for analyses of differential
motions between the two closely spaced points. Errors such as those seen in Fig. 6 (in the origin times of the different components) make the “old” digitization useless for detailed studies, and misleading for an unsuspecting user. “Conclusions” that could result from such data would be, e.g., that the high frequency strong motion amplitudes are not correlated at a separation distance of 25 m, that the two sites have different soil properties (i.e. different wave velocities), that
the wave train approached the two stations from a different direction (i.e. different phase velocities), and so on. Another example is shown in Fig. 7 (Los Angeles dam, right abutment

Fig. 7. Same as Fig. 6 but for the accelerograms at Los Angeles Dam, West Abutment. The difference between the two digitized versions in choosing the beginning is significant. In the “old” version, the beginning seems to be chosen arbitrarily.
Fig. 13. (a) Segment of the 20 March aftershock record at Sylmar Converter Station-East, free-field, scanned from film. (b) An enlarged portion of the vertical
trace after repeated Xeroxing, showing the associated distortions. The “old” and “new” versions of the digitized data are also shown. The differences may be due to
high contrast preprocessing of the scanned image in producing the “old” version.
vertical traces in Fig. 6(a) and (b) in which the film image was enlarged by successive Xeroxing). Fig. 13(b) shows a comparison of two independent digitizations of this segment, which differ significantly near the peaks. It appears that the “old” digitized version has distortions near the peaks of the same type as the repeatedly Xeroxed image shown below it. The “new” version, digitized by the LeAuto software (600 dpi resolution and 256 levels of gray), shows no such distortions. We do not know the exact reason for the distortion in the “old” version of this record, and we have never encountered such distortions in our digitization of strong motion records. We speculate that this record was digitized after excessive high-contrast enhancement of the bitmap image by software, or by a lithographic photo process prior to scanning. These types of distortions should have been detected in the quality control phase of the job.

3.3. Non-uniform film speed

The nominal film speed for typical strong motion accelerographs is 1 cm/s (SMA-1, CR-1). For the M02 accelerograph (New Zealand), which records on a 35 mm film, the actual film speed is 1.5 cm/s, which is equivalent to 3 cm/s for a 70 mm film (SMA-1). Increasing the film speed improves the resolution and accuracy of digitization of high frequency accelerations not only in time, but also in amplitude (see the discussion on the low-pass filtering of high frequency and large amplitude accelerations in Section 3.2). The old AR-240 accelerograph (which recorded the Pacoima Dam accelerogram during the 1971 San Fernando, California, earthquake) had a recording speed of 2 cm/s (equivalent to 0.5 cm/s on a 70 mm film; [14]).

Most instruments have one or two relays which produce a two-pulses-per-second (2PPS) signal, recorded along the top and bottom edges of the film or paper. About 20 years ago, one of these relays was converted to work with a local clock which produces a binary code of the Julian day, hour, minute and second every 10 s. At first, it was believed that absolute trigger time was not necessary for recorded strong ground motion [14], but since it was first introduced in the early 1970s [15], it has opened many new possibilities for advanced wave propagation studies in strong motion seismology.

The accuracy of the 2PPS signal is believed to be <1%, but it is rarely calibrated. Since the early 1970s, we have assumed that the time coordinate of analog records is scaled more accurately using the 2PPS pulses (generated electronically) than by the nominal speed of the film (driven mechanically), and have used these pulses to correct for minor variations in the film speed [1–3]. Exceptions are records for which the 2PPS and absolute clock relays malfunctioned simultaneously (e.g. [16]). Figs. 7, 9, 12(a) and 13 show differences in the digitized accelerations when the time is scaled using the digitized 2PPS signal (“new”) and when uniform film speed is assumed (“old”). The difference is manifested by time dependent delays between the two records.

Occasionally, the film speed may experience abrupt changes and stalls. This is caused by friction in the film driving mechanism, friction in the film cassette, or by faulty motors, and in general cannot be corrected uniquely. The duration of short stalls can be estimated by measuring the shortening of the distance between consecutive pulses of the 2PPS signal. Approximate corrections of the digitized data can then be performed by inserting “gaps” into the scanned bitmap image, and recreating manually the missing portion
Fig. 14. Illustration of a record imperfectly aligned with the scanner. There is an angle α between the coordinate system of the scanner (XSCAN–YSCAN) and the accelerogram coordinate system (xREC–yREC). The two fiducial marks can be used to evaluate α and correct for it.
of traces [17]. A description of processing an accelerogram with many stalls can be found in [16].
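The time-scale correction described above amounts to replacing the nominal film-speed scaling by a piecewise-linear map from scanner columns to time, anchored at the digitized 2PPS pulse positions. A minimal sketch follows (Python with NumPy; the function and variable names are illustrative and not part of the Le* software):

```python
import numpy as np

def pixel_to_time(x_pixels, pulse_columns, pulse_interval=0.5):
    """Map digitized x-coordinates (scanner columns) to time using the
    digitized 2PPS timing pulses rather than the nominal film speed.
    pulse_columns holds the column positions of consecutive pulses,
    each pulse_interval seconds apart (0.5 s for a 2PPS signal).
    Between pulses the film speed is taken as locally constant, so
    time follows by piecewise-linear interpolation."""
    pulse_times = pulse_interval * np.arange(len(pulse_columns))
    return np.interp(x_pixels, pulse_columns, pulse_times)

# Example: at 600 dpi and a nominal film speed of 1 cm/s, pulses 0.5 s
# apart are ~118 columns apart; here the spacing drifts slightly,
# simulating minor variations in the film speed.
pulses = np.array([0.0, 118.0, 232.0, 352.0, 470.0])
t = pixel_to_time(np.array([59.0, 292.0]), pulses)
print(t)  # 59 is halfway to the first pulse, so the first value is 0.25 s
```

A short stall shows up as consecutive pulses spaced closer together on the film; the interpolation then spreads the full 0.5 s over the shortened gap, consistent with the stall-duration estimate described above.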
3.4. Rotation of the digitized traces

Because the record is digitized directly from the scanned image, it is important that the film or paper original is well aligned with the scanner. Perfect alignment would mean that the time axis of the record is parallel to one of the scanner axes. In reality, however, this is difficult to achieve, and the two are off by some small but finite angle, α. The problem of imperfect alignment is one of the oldest problems recognized in digitization and processing of strong motion accelerograms [10]. By careful placement of the original onto the scanner or the digitizing table, angle α can be kept small, i.e. of the order of 1° (Fig. 14). One simple way to correct for this angle is by marking and digitizing at least one pair of fiducial points on one of the fixed traces, evaluating α from the position of these marks, and rotating back the digitized signals by the same angle. The accuracy of this correction is sufficient for typical acceleration records. It may be limited by the noise in the scanned image of the baseline only for very short records (less than 5–10 s long). While marking and digitizing fiducial points is essential for long records that require scanning multiple pages (the fiducial marks are used to match the independently digitized pages), it is often neglected by some operators for one-page records, and in those cases the correction for the angle α is not performed. The associated distortions for a real record are illustrated in Fig. 15, which shows a segment of the transverse acceleration trace of the record in Fig. 14. The “old” digitized trace appears to be rotated clockwise relative to the “new” trace by α ≈ 0.9°. This is most obvious near the positive and negative peaks. Assuming that the scanners used for the “old” and “new” digitized versions were both accurate, and knowing that the “new” version was corrected for the rotation α, the difference between the “old” and “new” versions could be explained as an error in the “old” version due to imperfect alignment of the original record with the scanner (α < 1°). If this record was photographically enlarged before scanning, the observed rotation could also be explained as distortion caused by imperfections of the lens, or by non-parallel planes of the negative and of the projected image.
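The correction for the angle α described above can be sketched as follows (a minimal Python illustration, not the actual LeFilm/LeTV code; the function name derotate and the choice of rotating about the first mark are assumptions of this sketch):

```python
import numpy as np

def derotate(points, mark_a, mark_b):
    """Correct digitized points for imperfect alignment of the record
    with the scanner.  mark_a and mark_b are the scanned (x, y)
    positions of two fiducial marks assumed to lie on the record's
    time axis; the angle alpha between the mark-to-mark direction and
    the scanner x-axis is the misalignment, and all points are rotated
    back by -alpha about mark_a."""
    ax, ay = mark_a
    bx, by = mark_b
    alpha = np.arctan2(by - ay, bx - ax)       # misalignment angle
    c, s = np.cos(-alpha), np.sin(-alpha)      # rotation back by -alpha
    rot = np.array([[c, -s], [s, c]])
    shifted = np.asarray(points, dtype=float) - (ax, ay)
    return shifted @ rot.T + (ax, ay)

# A record misaligned by 1 degree: a point 1000 pixels along the tilted
# time axis is lifted by ~17.5 pixels; derotation puts it back on y = 0.
alpha = np.radians(1.0)
tilted = (1000 * np.cos(alpha), 1000 * np.sin(alpha))
corrected = derotate([tilted], (0.0, 0.0), tilted)
print(corrected)
```

With only two fiducial marks, the estimate of α is limited by how accurately the mark centers can be located in the scanned image, which is why, as noted above, image noise matters most for very short records.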
version was digitized by the authors of this paper using the LeAuto software package). The problems discussed include errors in identifying the origin of the common time coordinates for the three recorded components of motion, smoothing of the high frequency peaks as well as of the amplitude variations between peaks (both in segments of the record with moderate and with large signal amplitudes), spurious peaks and noise in the digitized signal due to imperfections of the film image or specific circumstances (e.g. intersection of traces), possible distortions due to high contrast preprocessing of the scanned image, scaling of the time coordinate, and rotation of the digitized signal. The errors due to all of these problems can be either eliminated or significantly reduced, automatically by intelligent algorithms (performing accurate identification of trigger times for each trace individually and correction for common time scale fluctuations) or manually by operator intervention.

The most important factor determining the quality of the processed data is the experience of the operator and rigorous quality control of the outcome of the automatic trace following. An experienced operator can avoid most of these problems by carefully using the software. Finally, critical comments and suggestions by experienced operators are invaluable for further improvement of the automatic digitization software. The efficiency and reliability of this software can be improved only by close interaction between the programmer and an experienced operator.

The digitization of accelerograms recorded on film can be viewed as an estimation problem using noisy measurements, and the result is nonunique. Too high a threshold level results in the exclusion of recorded information from the estimation process, and too low a threshold level results in the inclusion of too much noise. We showed that systematic biases in the estimate due to this “noise” (e.g. spurious peaks and additional pulses, and smoothing of the sharp peaks) can be eliminated or at least abated by careful choice of the threshold level. Scanning the image with a 256-level gray scale is highly recommended, while digitization from a binary black-and-white bitmap image is discouraged.

A major improvement of the software for automatic trace following would be a self-learning algorithm with an adaptive threshold level. Such an algorithm has been incorporated in the editing phase of our procedures (LeTV). Within this phase, the algorithm is used to digitize a trace segment in a window defined manually by the operator. Much work remains to be done to include this algorithm in the fully automatic trace following procedure (LeTrace).

There is a common misperception that the accuracy of estimating the high frequencies in a record will increase significantly by scanning the film record at a higher resolution. This is true only up to a limit. For example, the “smoothing” of the amplitudes of the sharp peaks can be eliminated only by better focussing of the light beam (possible only up to a certain degree) or by increasing the film speed. Our experience shows that, at higher scanning resolution (600 dpi), the digitized signal detects more noise (high frequency errors) due to imperfections on the film, such as scratches and dust (these imperfections are “not seen” or are “smoothed out” by the larger pixels at lower scanning resolutions, e.g. 300 dpi).

We conclude that the governing factor for high quality digitized accelerograms is rigorous quality control and an experienced and conscientious operator. Intelligent software, as well as the increased hardware capabilities, significantly speeds up the process and reduces the labor cost. However, no software will take care of all of the difficulties. Continuous and systematic quality control by the operator, at all phases of the process, will always remain the determining factor for the quality of the end product.

Acknowledgements

The authors thank Ron Tognazzini and Craig Davis of the Los Angeles Department of Water and Power for making available the original films and the commercially processed data of the 1994 Northridge earthquake records, used to illustrate the problems discussed in this paper.

References

[1] Trifunac MD, Lee VW. Automatic digitization and processing of strong motion accelerograms: Parts I and II. Department of Civil Engineering Report No. 79-15, University of Southern California, Los Angeles, California, 1979.
[2] Lee VW, Trifunac MD. Automatic digitization and processing of accelerograms using PC. Department of Civil Engineering Report No. 90-03, University of Southern California, Los Angeles, California, 1990.
[3] Trifunac MD, Lee VW. Routine computer processing of strong motion accelerograms. Report EERL 73-03, California Institute of Technology, Pasadena, California, 1973.
[4] Novikova EI, Trifunac MD. Instrument correction for the coupled transducer–galvanometer system. Department of Civil Engineering Report No. 91-02, University of Southern California, Los Angeles, California, 1991.
[5] Novikova EI, Trifunac MD. Digital instrument response correction for the force balance accelerometer. Earthquake Spectra 1992;8(3):429–442.
[6] Todorovska MI, Novikova EI, Trifunac MD, Ivanović SS. Correction for misalignment and cross axis sensitivity of strong earthquake motion recorded by SMA-1 accelerographs. Department of Civil Engineering Report No. 95-06, University of Southern California, Los Angeles, California, 1995.
[7] Todorovska MI. Cross-axis sensitivity of accelerographs with pendulum like transducers—mathematical model and the inverse problem. Earthquake Engineering and Structural Dynamics 1998;27(10):1031–1051.
[8] Todorovska MI, Novikova EI, Trifunac MD, Ivanović SS. Advanced sensitivity calibration of the Los Angeles strong motion array. Earthquake Engineering and Structural Dynamics 1998;27(10):1053–1068.
[9] Lindvall–Richter–Benuska Associates. Processed LADWP power system strong motion records from the Northridge, California, Earthquake of 17 January, 1994. Report LRB 007-027, prepared for the Los Angeles Department of Water and Power, Los Angeles, California, 1995.
[10] Trifunac MD. Zero baseline correction of strong motion accelerograms. Bulletin of the Seismological Society of America 1971;61:1201–1211.
[11] Trifunac MD. A note on correction of strong motion accelerograms for instrument response. Bulletin of the Seismological Society of America 1972;62:401–409.
[12] Todorovska MI, Trifunac MD. Amplitudes, polarity and time of peaks of strong ground motion during the Northridge, California Earthquake. Soil Dynamics and Earthquake Engineering 1997;16(4):235–258.
[13] Wong HL, Trifunac MD. Effects of cross-axis sensitivity and misalignment on response of mechanical optical accelerographs. Bulletin of the Seismological Society of America 1977;67:929–956.
[14] Hudson DE. Ground motion measurements, Chapter 6. In: Wiegel RL, editor. Earthquake engineering. Englewood Cliffs, NJ: Prentice Hall, 1970.
[15] Dielman RJ, Hanks TC, Trifunac MD. An array of strong motion accelerographs in Bear Valley, California. Bulletin of the Seismological Society of America 1975;65:1–12.
[16] Trifunac MD, Todorovska MI, Lee VW. The Rinaldi strong motion accelerogram of the Northridge, California, Earthquake of 17 January, 1994. Earthquake Spectra 1998;14(1):225–239.
[17] Lee VW, Trifunac MD. Current developments in data processing of strong motion accelerograms. Department of Civil Engineering Report No. 84-01, University of Southern California, Los Angeles, California, 1984.