
source, depositional environment, compaction, and other regional patterns.

Even if some adjustments are minor, it is usually easier to normalize all gamma ray curves except those of the type well(s) than it is to adjust only those wells that need a large amount of adjustment.

Spontaneous potential (SP) curves

SP curves are normalized by conversion to a Volume of Shale curve. This involves editing "by hand" to remove "mechanical shifts," followed by definition of a shale baseline (a maximum) and definition of a clean sand line (a minimum). Care should be taken to allow for hydrocarbon suppression of the SP as much as possible, based on the gamma ray, resistivity, and other curves that reflect the volume of shale in the reservoir.

The raw SP curve (Figure 12) is not helpful. It shows the effects of both run changes and changes in formation water. In Figure 12, a shale baseline was defined for the SP curve, using the caliper and resistivity curves (not shown) as additional references. The straightened SP curve has the shale baseline set to zero. A clean sand baseline was defined for the straightened SP curve, again using the caliper and resistivity curves for reference. This was then used to create the Volume of Shale curve.

FIG. 12 Creation of a Volume of Shale curve from an SP curve.
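The conversion itself is a simple linear rescaling between the two lines. The sketch below illustrates the idea in Python; the example baseline values, the function name, and the use of numpy are assumptions for the illustration, not the author's software.

    import numpy as np

    def sp_to_vshale(sp, sp_shale=0.0, sp_clean=-80.0):
        """Convert a baseline-corrected SP curve to an apparent Volume of Shale.

        sp       : SP values (mV), already edited for mechanical shifts and with
                   the shale baseline straightened to sp_shale.
        sp_shale : SP value assigned to the shale baseline (Volume of Shale = 1).
        sp_clean : SP value assigned to the clean sand line (Volume of Shale = 0).
        """
        sp = np.asarray(sp, dtype=float)
        vsh = (sp - sp_clean) / (sp_shale - sp_clean)  # linear scaling between the two lines
        return np.clip(vsh, 0.0, 1.0)                  # confine the result to the 0-1 range

    # With a clean sand line at -80 mV and the shale baseline at 0 mV:
    # sp_to_vshale([-78.0, -40.0, -5.0]) -> approximately [0.03, 0.50, 0.94]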
Cook Inlet of Alaska is a place where SP normalization is vital; volcanic clasts frequently make the sandstones more radioactive than shales, rendering the gamma ray curve nearly useless for distinguishing sand from shale.

Resistivity curves

Resistivity curves are usually handled with standard borehole-correction procedures. They are not normalized at all unless there is some compelling reason to do so. If normalization is necessary, the zero is generally accepted, and only values of Wmax are needed. Unfortunately, a sufficiently consistent Wmax is almost impossible to define in most datasets.

Some of the early resistivity curves have negative values because of incorrect registry of the galvanometer to the film. The solution here is to shift the entire resistivity curve for that run so that the minimum value is compatible with nearby wells. However, this will produce its own inaccuracies if the galvanometer adjustment was not the same for the entire log run but was arbitrarily adjusted at some unknown depth.
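A minimal sketch of that bulk shift, assuming the target minimum has already been judged from nearby wells; the function name and the use of numpy are illustrative only.

    import numpy as np

    def shift_resistivity_run(res, target_min):
        """Shift an entire resistivity run by a constant so that its minimum value
        matches a minimum judged compatible with nearby wells.

        This assumes one galvanometer setting for the whole run; if the setting was
        changed at some unknown depth, no single shift can repair the curve.
        """
        res = np.asarray(res, dtype=float)
        return res + (target_min - np.nanmin(res))  # additive shift removes the false negative offset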

GUIDELINES

The purpose of curve normalization is to remove systematic inaccuracies, not to remove genuine lithologic anomalies or to introduce artifacts. These pernicious tendencies can be reduced if certain guidelines are followed.

In many wells, different depth intervals were logged independently at different times. The various logging runs frequently involved different instruments, with different calibration histories, different logging crews, and different service companies. This lack of standardization dictates that each logging run should usually be normalized separately.

Shier (1997) showed that there is a tendency for the size of the normalization corrections to be smaller with modern instruments. This is somewhat balanced by a marked increase in the introduction of new tools with different response characteristics. When first introduced, new tools may experience initial reliability problems. Normalization is needed for both old and new datasets.

Hole rugosity affects log response. Histograms, crossplots, and other data displays that do not allow the user to simultaneously assess the associated caliper curve may be misleading and lead to inaccurate normalizations. An alternative method is to define a bad-hole curve, which may be based on curves in addition to the caliper. The bad-hole curve can be used as a filter prior to selection of a type well or other normalization tasks.
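As an illustration of that filtering step, the sketch below builds a simple bad-hole flag from the caliper and bit size and blanks the affected samples before any histogram or type-well work; the enlargement threshold, curve names, and numpy usage are assumptions for the example rather than a recommendation from the paper.

    import numpy as np

    def bad_hole_flag(caliper, bit_size, max_enlargement=2.0):
        """Flag samples where the hole is enlarged beyond a chosen tolerance (inches).
        A fuller version could also fold in density correction or other curves."""
        return (np.asarray(caliper, dtype=float) - float(bit_size)) > max_enlargement

    def filter_curve(curve, caliper, bit_size):
        """Return the curve with bad-hole samples set to NaN so they drop out of
        histograms, crossplots, and normalizer picks."""
        curve = np.asarray(curve, dtype=float).copy()
        curve[bad_hole_flag(caliper, bit_size)] = np.nan
        return curve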

Reducing the noise but not the signal

Avoid very small adjustments. If it cannot be clearly demonstrated that an adjustment will improve the accuracy of the data, it should not be made. Set a minimum level for adjustments to be made for each of the curve types based on the consistency of the lithologies that are available. An exception is made for the gamma ray curve, the values for which are not related to an absolute standard but are simply relative to the other wells in the study.

Look for map patterns. As an initial test, map the adjustments that appear to be needed. If the polarity and magnitude of the adjustments are more or less random, the adjustments are likely to be legitimate. If there are nonrandom areas, check possible causes. Are they real geologic anomalies? Are they the result of the consistent use of a particular logging contractor in a small area? Are they clumps of older vintage logs and younger vintage logs from successive exploitation episodes?

Check for lithologic trends. Detect regional gradients by making trend surface maps of the unadjusted log data.
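A minimal sketch of such a test, fitting a first-order (planar) trend surface to per-well values by least squares in the spirit of Doveton and Bornemann (1981); the input arrays and the first-order choice are assumptions for the illustration.

    import numpy as np

    def first_order_trend_surface(x, y, z):
        """Fit z = a + b*x + c*y to well locations (x, y) and values z by least squares.
        Returns the coefficients and the residuals (local anomalies left after the trend)."""
        x, y, z = map(np.asarray, (x, y, z))
        A = np.column_stack([np.ones_like(x), x, y])
        coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
        return coeffs, z - A @ coeffs

    # Applied to unadjusted shale gamma ray picks, a strong planar term suggests a real
    # regional gradient; applied to proposed adjustments, clustered residuals of one sign
    # are worth investigating before the adjustments are accepted.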
Consider other information before finalizing normalization work. Don't pursue the normalization work in isolation from what is known about the geology and production tests. Methods and results need to be discussed with the project team. Any study that covers a large area and does not include the efforts of a geologist/stratigrapher usually contains unacceptable artifacts.

Do not overwork the data. There is little to be gained by attempting to wring the last tiny bit of accuracy from the data by slightly revising the normalization parameters time after time.

Some curves cannot be repaired. Most curves can be converted to reliable data. However, if normalization cannot give the curve the correct response characteristics, the curve should be discarded.

Improve the data if you can. If it is clear that a particular adjustment will reduce the noise level, make the adjustment even if it is also clear that an undesirable amount of noise will remain. Consider the case of a gamma ray curve penetrating mechanically competent 15 API unit limestones and caving 120 API unit shales. There is no caliper curve. Normalizing the data as if they had been borehole corrected will tend to correct for the hole enlargement in the shales in that well.

Cases without optimum lithologies

Study areas with nonideal lithologies are frequently encountered. In these cases, it may be necessary to pursue a "lesser of two evils" strategy. Consider the case where thin shaly, possibly arkosic sands are developed in a section that is 90% shale and there are no rocks in the study wells that have a gamma ray of less than 65 API units. There will be no problem using a gamma ray histogram to establish the maximum gamma ray normalizer from the pervasive shale facies. The problem lies in defining the minimum normalization rock type. Three less-than-perfect approaches are suggested below.

Wmin can be picked from the low-side shoulder of the histogram, as shown in Figure 4b. It can be argued that as long as the Wmax and Wmin values that are used form an envelope around the data to be analyzed, there should be no concern about the effect of the normalization parameters on nonexistent values less than 60 API units. This approach is based on the assumptions that each gamma ray curve has the same thin-bed resolution characteristics and that the lowest volume of shale in typical sandstones in each well is the same. If sandstones are actually better developed in one part of the area than in another, use of this procedure by itself will erase the very geologic changes that are most important! This source of error will be reduced if the trial normalization method is used and regional trends are considered.

Zero can be assumed correct. The analyst adopts zero as Rmin and Wmin for each well. As indicated in Figures 10 and 11, the amount of noise in the uncorrected gamma ray values decreases markedly at lower gamma ray values.

A hybrid approach uses the zero assumption initially. Then histograms or crossplots are used to check the results of the preliminary normalization for an anomalous low-side data pattern. For example, check the log headers for potassium chloride muds, look at the local stratigraphy more carefully, and adjust Wmin in anomalous wells. Experience suggests that this hybrid approach is the most efficient way to proceed.
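As an illustration of the first of these approaches, the sketch below reads a trial Wmax from the shale-dominated upper end of a well's gamma ray distribution and a trial Wmin from a low-side percentile standing in for the shoulder; the percentile choices and function name are assumptions for the example, and any automated pick would still be checked against the displays discussed above.

    import numpy as np

    def pick_gr_normalizers(gr, shale_pct=95.0, shoulder_pct=5.0):
        """Pick trial gamma ray normalizers for one well (ideally one logging run).

        Wmax is read from a high percentile standing in for the pervasive-shale mode;
        Wmin is read from a low percentile standing in for the low-side shoulder of
        the histogram.
        """
        gr = np.asarray(gr, dtype=float)
        gr = gr[np.isfinite(gr)]              # drop nulls and previously flagged bad-hole samples
        w_max = np.percentile(gr, shale_pct)
        w_min = np.percentile(gr, shoulder_pct)
        return w_min, w_max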
Another way to handle nonideal lithologies is to start with the most inherently reliable curves and work toward the least reliable. For example, in a thick clastic sequence that is only partly consolidated, it may be possible to normalize the gamma ray quite accurately even though the neutron response varies greatly. The procedure in this case would be to first normalize the gamma ray and then crossplot the normalized gamma ray versus the unnormalized neutron curve. Using trial and error, a Wmax is found that produces the desired crossplot pattern for each well. Similarly, the compensated sonic curve (which rarely needs adjustment) might be crossplotted against the density curve in a carbonate play.
Software considerations

Effective normalization software has different imperatives than software designed specifically for log analysis. It must be possible to view normalized and unnormalized curve data as histograms, crossplots, and depth plots on the screen simultaneously. The effect of any changes in the normalization parameters should be displayed almost instantaneously and easily compared with the data patterns of nearby wells, without requiring paper copies. All normalization information must be integrated with formation tops and available for mapping. The system must be capable of handling different normalizers for different log runs seamlessly.

All of the work discussed here was performed using a software system that can store normalization parameters and apply them to log data as required. The parameters themselves are available for statistical analysis or mapping. It has a complete capability for comparing crossplot, histogram, and depth-plot data displays between wells, and with regional norms. Data displays can be confined to any stratum that is delimited by formation tops or other stratigraphic markers. All operations except looking at the data and making decisions, including printing of standardized screen displays, can be batch-processed as needed.

CONCLUSIONS

Approximately 20% of neutron, density, and uncompensated sonic curves require normalization before a dataset is consistent enough to produce reliable results in batch-processed log analysis. Virtually all SP, gamma ray, and GNT-type neutron curves require normalization. Reasons for the need for normalization include miscalibrated instruments, instruments of different design, varying borehole effect, and other factors. Curve normalization is accomplished using a standard linear equation. The most critical part of the normalization process is determination of reasonable curve values at various points in a study area, taking into account any stratigraphic or compaction trends.
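For reference, the standard linear equation mentioned here has the familiar two-point form, written below in the Wmin/Wmax/Rmin/Rmax notation used in the preceding sections; the small function is an illustrative sketch, not the author's implementation.

    def normalize(value, w_min, w_max, r_min, r_max):
        """Two-point linear normalization: the values (Wmin, Wmax) picked in a well
        are mapped onto the corresponding reference values (Rmin, Rmax)."""
        return r_min + (value - w_min) * (r_max - r_min) / (w_max - w_min)

    # Example: a well whose shale reads 140 API units and whose picked Wmin is 30 API units,
    # mapped to reference values of 20 API units (clean) and 100 API units (shale):
    # normalize(140, 30, 140, 20, 100) -> 100.0 and normalize(30, 30, 140, 20, 100) -> 20.0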
Techniques that combine data from multiple wells into one histogram or crossplot tend to blur any real stratigraphic and/or fluid differences across the area. Also, these methods may not highlight incorrect data if they are present in the majority of the wells in the study.

Normalization is best regarded as a process that reduces systematic noise within the dataset. The noise contains some random elements, including the statistical nature of the nuclear tools and small changes in the rocks. A major challenge in normalization work is the recognition of which data inconsistencies are systematic (and correctable) and which are random (and better left alone). Not all curves can be handled in the same way. Each curve has its own optimum set of normalization techniques.

ACKNOWLEDGEMENTS

I thank W. David Kennedy for his extensive work with this paper and for his helpful suggestions.

REFERENCES

Doveton, J. H. and Bornemann, E., 1981, Log normalization by trend surface analysis: The Log Analyst, vol. 22, no. 4, p. 3–9.

Hilchie, D. W., 1979, Old (pre-1958) electrical log interpretation: published by the author, Boulder, Colorado, USA, 164 p.

Lang, W. J., 1980, SPWLA ad hoc calibration committee report: The Log Analyst, vol. 21, no. 2, p. 14–19.

Neinast, G. S. and Knox, C. C., 1973, Normalization of well log data, Paper I, in 14th Annual Logging Symposium Transactions: Society of Professional Well Log Analysts.

Patchett, J. G. and Coalson, E. B., 1979, The determination of porosity in sandstone and shaly sandstone, Part 1—Quality control: The Log Analyst, vol. 20, no. 6, p. 3–12.

Reimer, J. D., 1985, Density-neutron crossplot discrepancies through the Swan Hills Member, Swan Hills Unit #1, Alberta, in Transactions of the 10th Formation Evaluation Symposium: Canadian Well Log Society; also in Vol. 3 of the CWLS Reprint Volumes, Canadian Case Histories.

Schlumberger, 1969, Log Interpretation Charts: Schlumberger Ltd., 76 p.

Shier, D. E., 1997, A comparison of log response between logging companies and different vintages of tools: The Log Analyst, vol. 38, no. 3, p. 47–61.

Shier, D. E. and Sponable, D. M., 1997, Correcting drift in GNT neutron log data from West Texas: The Log Analyst, vol. 38, no. 3, p. 16–19.

ABOUT THE AUTHOR

Dr. Daniel E. Shier is a consulting geologist and petrophysicist. He was employed by Shell Oil as a micropaleontologist and stratigrapher from 1969 to 1974. During his tenure with Scientific Software from 1974 to 1981, he worked with sets of digital well logs that encompassed entire exploration plays and developed the normalization methods needed to obtain accurate mapping parameters for regional studies. He was associated with Energy Data Services from 1981 to 2001, where he developed and taught a workshop on log normalization. He received his BA degree in geology from Rice University and his PhD degree in geology from Florida State University. Dr. Shier is a member of SPWLA, AAPG, the Rocky Mountain Association of Geologists, and the Denver Well Log Society.
