
Technical Article

10 THINGS TO KNOW ABOUT WELL LOGS

In today’s fast-everything society, I am often asked for a short set of rules for checking the quality of well logs. Since you asked….

By Martin Storey, senior consultant, Well Data Quality Assurance Pty Ltd

How does one distill several decades of learning on rigs and in tape libraries, data rooms and offices into a few snappy points? As with many questions, the answer depends on the circumstances. The first rules must therefore have to do with framing the situation.

1 Distinguish between historical data and incoming data.

With historical (also known as legacy, old or pre-existing) data, what is available is generally all there is, and its quality control consists of verifying that it is what is expected for the particular data set.

With incoming (also referred to as new or future) data, the quality control should include verifying that all the technical specifications of the contract for the product are met and, if they are not, engaging with the supplier until they are. In practice, however, contractual specifications are rarely detailed or exhaustive, and the incoming data quality control is shared between the operations and petrophysics departments, without either side following a precise script.

2 Distinguish between original data and derived data.

Original (also known as raw, acquisition or received) data is the record from an operation done once and reflects the circumstances at that one particular time, never to be repeated exactly. Derived (often referred to as evaluated, processed or interpreted) data is produced from original data, either within the organization or by an external party, and could conceivably be recreated. Derived data almost always has an interpretative character, so it could have more than one version.

In real life, the distinction between original and derived data is not always clear-cut, particularly in the past decade when much original data has already been subjected to various forms of processing prior to its initial delivery.

Log data interpreters tend to use mainly original data, while other users of log data tend to use derived data. Original data is often voluminous and raw, while derived data should have been conditioned and validated, or generated from data that has been conditioned and validated, and is therefore presumed safer. However, different applications may require the logs to be conditioned differently.

Examples of sources for the different log data categories are shown in the table below, which is not exhaustive. The interrupted lines indicate that the distinctions are not absolute.

LOG CATEGORIES

ORIGINAL
  Historical: • Own records or databases • Data vendor • Provider of public domain data • Merger or acquisition
  Incoming: • Data acquisition company for own or joint-venture-project ongoing operations • Data processing company • Asset swap

DERIVED
  Historical: • Own records or databases • Processed data vendor • Provider of public domain interpretative data • Merger or acquisition
  Incoming: • Data acquisition company for own or joint-venture-project ongoing operations • Data processing company

As subtle as these distinctions may seem to some, they largely determine the activities required to control the quality and manage the non-quality of the log data. It is therefore important to keep data sets of different categories distinct and readily identifiable.

The original query has now morphed into four distinct questions, and the first two rules do not actually serve to accept or reject data sets. We are getting to that, but we are not yet done with fundamental principles.

3 Data must be managed for the short and the long term concurrently.

The point of this rule is that managing only for the immediate requirements and initial workflows is insufficient and ultimately damaging.
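The distinctions of rules 1 and 2 can be made operational by tagging every data set with its provenance and category at the point of registration. A minimal Python sketch (the class and field names are illustrative, not part of any standard):

```python
from dataclasses import dataclass

# Allowed values mirror the two distinctions: rule 1 (provenance)
# and rule 2 (category).
PROVENANCES = {"historical", "incoming"}
CATEGORIES = {"original", "derived"}

@dataclass(frozen=True)
class LogDataSet:
    """A log data set labelled so categories stay distinct and identifiable."""
    wellbore: str
    provenance: str   # historical vs incoming (rule 1)
    category: str     # original vs derived (rule 2)
    source: str       # e.g. "data vendor", "merger or acquisition"

    def __post_init__(self):
        # Reject unlabelled or mislabelled data sets on registration.
        if self.provenance not in PROVENANCES:
            raise ValueError(f"unknown provenance: {self.provenance!r}")
        if self.category not in CATEGORIES:
            raise ValueError(f"unknown category: {self.category!r}")

# Example: a legacy raw data set obtained through an acquisition.
ds = LogDataSet("Laurel-1ST1", "historical", "original", "merger or acquisition")
```

Keeping these two tags mandatory on every record is one simple way to ensure the categories never blur inside the database, whatever the delivery looked like.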

Foundations | Fall 2014 | 19



Log data is generally “hot” for a short period after it was acquired or generated, but the initial demands on the data are often quite light: for most operational decisions or rush evaluations, high-quality data is not required and the decisions can be made on the basis of preliminary data. The data becomes hot again at various and largely unpredictable times later on. Eventually, the demands on the data may become more and more specific, requiring higher and broader data quality.

People are often surprised to hear that old data can be as good as modern data, but it is, and the main impediment to the use of old data is not its lack of intrinsic value but its availability and incompleteness. The lifespan of data is longer than our own, and as temporary custodians, we must preserve its value for future exploitation.

Let us now get practical.

4 Original logs normally come in two forms—print and tape—and both are required.

» The print (or film or image) is frequently delivered as a digital raster or vector image file, such as a TIFF or PDF.
» The tape (or CD or digital data) is generally a digital file in Digital Log Interchange Standard (DLIS) or Log ASCII Standard (LAS) format, but historically other formats have been used, such as the Log Information Standard (LIS) and TIF.

Both are needed because each may contain information that is essential for the valid exploitation of the data and that is found nowhere else. The image below shows a barely legible section of a log print that contains critical information about the data and the depth reference. This information is not on the tape, and ignoring it could result in overlooking a prospective interval or perforating the wrong formation in a well.

5 Original data must be obtained and stored in the original format, and that format is usually DLIS or LIS, not LAS.

DLIS frequently contains contextual information found nowhere else, while LAS generally contains very little contextual information.

DLIS and LAS are among the most common digital formats for log data. DLIS and its predecessor, LIS, are binary formats, while LAS files are written in ASCII and can be read using any text reader. LAS was first introduced in the late 1980s by the Canadian Well Logging Society and was originally intended for exchanging basic digital log data on floppy disks between personal computers in a quick and easy-to-use format. It succeeded, and 25 years later it remains a convenient format for moving log data around.

The specifications of LAS have been enhanced, and LAS has the potential to contain almost any log data. In practice, however, LAS files are never complete and generally contain only the main curves. DLIS files are much more likely to contain the ancillary curves, acquisition parameters and contextual information required for the exploitation of the data. DLIS files can be large, with numerous curves that can disorient the casual user; however, these complete files are required in the record to ensure that the data can be evaluated and interpreted in the future.

If you must also receive data deliveries in LAS files, the consistency of the DLIS and LAS data sets needs to be checked systematically. The LAS set should be a subset of the DLIS set, and the curve names, data sampling, depths and data values should be exactly the same. As an empirical rule, any conversion from one format to another results in the loss of data required by the technical experts for exploitation or reprocessing. Because of the lack of a unique digital log data exchange standard, many data exporter software programs have produced corrupted files, which reside incognito in our archives.

6 For the exploitation of any log data, context is king and must be preserved.

Data without context can only be used by making arbitrary assumptions that increase the organization’s risk unnecessarily. According to a Society of Professional Well Log Analysts article, Current Status of Well Logging Data Deliverables and A Vision Forward, the first contextual feature of a log is its print header, including the reference metadata, the remarks section, the description of the logging tool string configuration and, hopefully, more. There is a lot of other contextual information relating to logs, including the operations sequence, the detailed drilling history, mud composition and tidal conditions, and any of these may have a critical impact on data exploitation.

This is not an anecdotal point. Good interpreters should verify that their conclusions are consistent with other relevant information. The more thoroughly that consistency can be verified, the more confidence there will be in the recommendations used to support the organization’s decisions.

In many countries, the combination of the original log prints, log tapes and the operator’s end of well report (sometimes called a final well report or a completion report), complete with attachments and appendices, can normally provide all the information required for the present and future exploitation of the original log data.

Example of essential information found exclusively on a log print. (IMAGE: AUSTRALIAN PUBLIC RECORD)

7 The completeness of the data sets should be assessed and captured using a data inventory.

Data sets vary in form and content due to their companies of origin, their locations, the time when the data was generated, the vintage of the tools and software used, the operator’s specific requirements, etc. There is therefore no standard delivery. Some data sets contain several different prints, log quality control prints, image

20 | Journal of the Professional Petroleum Data Management Association



prints, processing results, time-based data, raw and corrected data, special tool documentation and several successive deliverables. A good way to stay organized is for the main people involved—operations geologist, petrophysicist, reservoir engineer—to compile an inventory of the expected data beforehand. This is not always done, but it helps everyone clarify the content and timing of deliverables and plan their work accordingly.

Later, the inventory should be a central feature of the data management system, so that a prospective user can establish what a data set contains and whether its various components are available.

8 Validate the metadata first.

The validation of the prints’ metadata is essentially the confirmation that the header data is correct, clear and coherent.

Bad metadata is a major contributor to data loss, according to Guy Holmes in Search and Rescue: The Data Lost and Found Initiative. Not finding a data item when it is needed can result in severe opportunity loss and additional costs. Although it may not appear in the record, more than one well has been drilled to re-acquire data that could no longer be found. It is often said that data acquisition companies are very good at doing the hard stuff and not so good at doing the easy stuff. In fairness, metadata is often provided to data acquisition companies by the operating company. Getting the reference information right is basic quality assurance and everyone’s responsibility.

Well coordinates and elevations are notoriously unreliable on logs but, more disconcertingly, so are wellbore names, which are often the main handle of a data set. Data sets are surprisingly often registered against the wrong wellbore—Laurel-1 instead of Laurel-1ST1—or against the wrong well name—Hardy 1 instead of Hardy-1—rendering the data difficult to locate. Validating the metadata may be feasible with programmatic rules if this does not interfere with the data user’s day-to-day work, and it may be best driven by the data management team. The PPDM Association’s What Is A Well is a valuable reference for this.

9 Perform basic checks on the original data prints.

Once the metadata has been verified, the data itself should be checked. From the data management perspective, the main items to consider are:
» Prints should have a header and a tail so that it is clear that no section of the log print has been torn out or is otherwise missing.
» In between the header and the tail, in approximate top-to-bottom order, most prints should include:
• a full log header including remarks;
• a tool sketch with the sensor measure points;
• a well sketch;
• one or several log sections, preferably clearly labelled (ex. main log; repeat section; correlation pass; main log, 1:200 scale);
• a repeat section in the case of formation evaluation logs;
• parameter listings for each log section; and
• a tool calibration section.
» Modern logs should also contain:
• a job chronology;
• a depth information box;
• a survey listing; and
• quality-control plots.
» Some logs are presented on logarithmic scales. If the grid scales are not consistent with the data scales, serious interpretation errors can occur, so it is wise to check these, too.

10 Perform basic checks on the original data tape.

Many things can go wrong with log data, and there is no set list of rules to check or a certain way to assure quality. In spite of several competent people having been involved in the production of the log data, such as the logging engineer, a wellsite witness, line managers and a petrophysicist, errors often slip in. Moreover, each set delivered is liable to have new errors that were not in the previous delivery—perhaps a wholly or partially missing curve, missing vector data or a sampling interval set slightly incorrectly at 0.152 metres instead of 0.1524 metres.

Every tape delivered must therefore be verified. Aspects that can easily be checked by the data management group include the tape’s legibility using the organization’s normal readers and the presence of at least one file per log section. Some logs are acquired and put on the tape but not presented in print: for instance, logs recorded while running in the hole, or a gamma ray log recorded in casing up to the ground level.

Again, metadata is a frequent problem with tapes: is the tape correctly labelled, and is each file on the tape clearly identified? Both the logging date and the tape creation date must be specified on the label. When several versions of a tape exist, the latest one normally supersedes all previous ones, which should be destroyed to minimize future confusion—even if all are labelled “final.”

The checking of derived data is even less generic, although similar principles apply: the data must be fully identified and complete, it must contain all necessary contextual information or references, and it should preferably be stored in a future-proof format. Actual checks and processes will depend on the organization’s systems.

Well log data quality assurance is not straightforward because no one person in the chain of custody is at the same time fully informed, competent and enabled to do it all. From the perspective of data management, and with the aim to “provide the right data at the right time to the right people,” it is essential to work with all stakeholders to maximize the quality of the incoming data, as well as to minimize quality erosion during its life cycle.

In essence, most of the rules apply to other categories of well data, for instance, mud log data or even core analysis data. The likelihood that data will be exploited for its original purpose, as well as for initially unforeseen purposes, is directly related to its quality and availability: the better these are, the more easily the data can be exploited and the more value can be obtained from it, even decades after it was acquired.

Martin Storey started in the oil and gas industry as a logging engineer, then became a wellsite petroleum engineer and a petrophysicist. He is now an independent practitioner, consultant and trainer in petrophysics and data management based in Australia.

