In most cases, the map of the reservoir’s structural top is defined on the basis of a geophysical interpretation of 2D or 3D data. In this case, which is the most frequent, the geophysicist interprets significant horizons in a seismic block as a function of time. This generates a data set (x, y, t), forming the basis for the subsequent gridding phase; in other words, the generation of a surface representing the ‘time map’ of the horizon under consideration. This time map is then converted into a depth map, using the relevant laws governing the velocities of seismic waves, which are calculated according to the characteristics of the formations overlying the reservoir. There are various techniques for performing this conversion, some of which are highly sophisticated. The choice of the most suitable depends on geological complexity, and on the human, technological and financial resources available. In any case, the resulting map is calibrated against well data.

In some cases, the map of the structural top may be generated solely on the basis of available well data, with the help of data from the geological surface survey, if the reservoir lies in an area with outcrops of geological formations. This may happen where no seismic survey has been carried out, or when there are enough wells available to provide adequate coverage of the structure. In these instances, the improvement in quality of the structural top map resulting from a seismic interpretation is not sufficient to justify the extra work involved, which is due above all to the problem of calibrating a large number of wells.

The interpretation of the set of faults running through a reservoir has considerable impact on its production characteristics, and in particular on the most appropriate plan for its development. Given an equal volume of hydrocarbons in place, the number of wells required is higher for reservoirs characterized by faults which isolate independent or partially independent blocks from the point of view of fluid content. In the case of deep sea reservoirs (for example in the Gulf of Mexico, West Africa, etc.), the number of wells is often crucial in the evaluation of development plans. Consequently, an accurate assessment of faults and their characteristics may be a decisive factor.

The interpretation of the set of faults within a reservoir is generally based on four types of data that are subsequently integrated.

Inconsistencies in correlation. The presence of faults may sometimes be apparent from well data, indicated by inconsistencies in the correlation scheme. Typically, for example, the depth of a horizon in a well may turn out to be greater or smaller than expected, indicating the possible presence of a fault. In the past, when 3D seismic surveys were much less readily available than today, this technique allowed only the largest faults to be identified and located with a good degree of accuracy.

Well data. The presence of faults in a well can generally be ascertained through the analysis of the stratigraphic sequence. Missing geological sequences indicate the presence of normal faults, whereas repeated sequences indicate the presence of reverse faults.

Geophysical tests. Geophysical data represents the main source of information on the presence of faults since, unlike the two previous techniques, it also investigates parts of the reservoir which are distant from the wells. The presence of faults may be indicated by discontinuities in the seismic signal. This applies to both data from surface seismic surveys and data recorded in well seismics (VSP, crosswell seismics). Furthermore, this data can be interpreted both traditionally, by mapping a reflecting geological horizon, and by using seismic attributes (dip, azimuth, amplitude, etc.).

Dynamic well test data. The interpretation of dynamic well tests (see Chapter 4.4) may show the presence of faults in cases where the faults have an impact on fluid flow, and thus on pressure patterns over time.

An adequate integration of these types of information generally allows the set of faults running through the reservoir to be reconstructed with sufficient accuracy. However, in carrying out the integration, we should take into account a series of factors which may be crucial for the quality of the end result.

The first factor concerns the degree of detail aimed for in the interpretation. In most cases this depends more on the tools available than on the actual aims of the study. Geophysicists tend to include in their interpretation all those discontinuities which can be identified from the seismic survey, regardless of whether these have an impact on fluid flow. As a result, the reservoir engineer often has to simplify the map during the dynamic simulation phase, keeping only those faults which turn out to have a significant impact on the results of the simulation model. For example, faults shorter than the average size of the model cells can obviously be disregarded. For this reason, the degree of detail in a geophysical interpretation should match the overall requirements of the study, and be discussed and agreed with the other members of the study team.

Another factor is linked to the hydraulic transmissibility of the faults. In a reservoir study, we […]
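The well-data criterion described above (missing section for normal faults, repeated section for reverse faults) can be sketched in a few lines of Python. This is a minimal sketch: the unit names and sequences are purely illustrative, not data from any real well.

```python
# Classify a possible fault in a well by comparing the observed
# stratigraphic sequence with the expected (reference) sequence;
# unit names are illustrative.
def classify_fault(reference, observed):
    """'reverse' if units repeat, 'normal' if units are missing,
    None if the sequence is as expected."""
    if any(observed.count(u) > 1 for u in observed):
        return "reverse"   # repeated section -> reverse fault
    if any(u not in observed for u in reference):
        return "normal"    # missing section -> normal fault
    return None

reference = ["A", "B", "C", "D", "E"]
print(classify_fault(reference, ["A", "B", "D", "E"]))                 # normal
print(classify_fault(reference, ["A", "B", "C", "B", "C", "D", "E"]))  # reverse
```

In practice the comparison is made on marker picks rather than unit names, but the logic is the same: a gap or a repetition relative to the reference sequence is the diagnostic signature.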
The difficulties encountered at this stage of the reservoir study are mainly linked to the definition of the depositional environment of the reservoir. In some cases, when the sedimentary sequences present a significant lateral extension, the correlations between wells may be relatively simple. This is true, for example, for shelf areas, with both terrigenous and carbonate sedimentation, dominated by tidal phenomena. An extreme example of correlativity is represented by the distal facies of some deep sea turbidite sediments, as in various Adriatic fields, where we can correlate with certainty individual events just a few centimetres thick, even between wells several kilometres apart. However, such examples are exceptions to the rule. In most cases, the lateral extension of the sedimentary bodies is much lower, and in many cases, unfortunately, is less than the average distance between wells. This is true of most continental and transitional geological formations, such as alluvial, fluvial and deltaic sediments, where reconstructing the internal geometry of the reservoir may turn out to be extremely complicated, representing an often insurmountable challenge for the reservoir geologist. In these cases, as we will see below, integration of the various disciplines participating in the reservoir study may be crucial for improving the accuracy of the end result.

Correlation techniques
The basic data used for well-to-well correlations are the logs recorded in open hole or cased hole, and cores. These data are used to create stratigraphic sections and correlations, in terms of real depth or with respect to a reference level, through which we can generally identify the lines corresponding to significant geological variations. Fig. 2 depicts a classic example of a geological cross-section between two wells, showing the logs used for the correlation itself.

As already mentioned, there is often a high risk of generating spurious correlations, and the reservoir geologist must carefully choose the suitable methodologies to minimize possible errors. To this end, one of the best techniques is sequence stratigraphy. Sequence stratigraphy is a relatively new approach, whose official appearance can be dated to 1977 (Vail et al., 1977). This is a chronostratigraphic system, based on the hypothesis that the deposition of sedimentary bodies is governed by the combined effects of changes in sea-level (eustatic phenomena), sedimentation, subsidence and tectonics.

On this basis, we can identify sequences of different hierarchical order within a geological unit, separated by sequence boundaries which represent unconformities or maximum flooding surfaces. These surfaces are the most important reference levels (or markers) that a reservoir geologist may find in well profiles.

The correct identification of these units allows us to generate an extremely detailed chronostratigraphic framework. This is especially well-suited to reservoir studies, since chronostratigraphic units and fluid flow are usually closely linked. This link does not necessarily exist if we consider traditional lithostratigraphic units (for example by correlating the tops of arenaceous units).
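Correlating "with respect to a reference level" amounts to shifting each marker pick from measured depth to depth below a common datum before comparing wells. A minimal sketch, assuming vertical wells; the well names, rig-datum elevations and marker picks are purely illustrative:

```python
# Shift marker picks from measured depth (from the rig datum) to depth
# below a common reference level (here, mean sea level), assuming
# vertical wells; well names, elevations and picks are illustrative.
wells = {
    "W1": {"datum_elev": 25.0, "markers": {"Top_A": 2130.0, "Top_B": 2215.0}},
    "W2": {"datum_elev": 18.0, "markers": {"Top_A": 2160.0, "Top_B": 2248.0}},
}

def depth_below_datum(well, marker):
    w = wells[well]
    return w["markers"][marker] - w["datum_elev"]

for m in ("Top_A", "Top_B"):
    d1 = depth_below_datum("W1", m)
    d2 = depth_below_datum("W2", m)
    print(f"{m}: W1 {d1:.0f} m, W2 {d2:.0f} m, difference {d2 - d1:+.0f} m")
```

For deviated wells the same idea applies, but the measured depths must first be converted to true vertical depth along the well trajectory.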
Fig. 2. Example of a correlation between wells (courtesy of L. Cosentino).
Where it is not possible to apply sequence stratigraphy, or where this does not provide the desired results, we may resort to correlations based on the hydraulic properties of the sedimentary bodies. This approach aims to define flow units (or hydraulic units), which do not necessarily coincide with the geological units, but which can be considered homogeneous from a dynamic point of view. One of the classic methodologies for the definition of flow units is described in Amaefule et al. (1993).

Validation of the stratigraphic scheme
Once we have defined the reference correlation scheme, it is good practice to check its accuracy using other types of techniques and data which may provide useful information for this purpose.

Biostratigraphy and palynology. Available rock samples (cores or cuttings) are frequently analysed with the aim of studying micropalaeontological and/or palynological associations (spores and pollens). This data may in some cases help to confirm the stratigraphic scheme. However, it is important to check that chronostratigraphy and biostratigraphy are consistent, and, in the case of drilling debris (cuttings), to take into account the limited vertical resolution of the data.

Pressure data. Available static pressure data, and particularly those collected in wells with WFT (Wireline Formation Tester) instruments, provide extremely significant information on the continuity and connectivity of the various sedimentary bodies. In the absence of structural discontinuities (e.g. faults), the pressures measured in different wells in identical geological sequences should be similar. If this is not the case, there may be correlation problems.

Production data. Within a geological unit we should be able to observe a thermodynamic equilibrium, corresponding to specific characteristics of the fluids produced at the surface (gas-oil ratio and oil density). The presence of anomalies in these characteristics may be due to correlation problems. Obviously, in these cases we should first rule out problems with the well (e.g. defective cementing).

Drilling data. The Rate Of Penetration (ROP) may provide useful information on the stratigraphic sequence crossed. Different geological units often present varying resistance to the advancement of the bit. In these cases, data supplied by the drilling activity may be used to check the consistency of available correlations.

It is obvious that this list of techniques is not, and cannot be, exhaustive; every reservoir study possesses distinctive data and information which can be used during the various stages of the study. It is therefore the responsibility of the reservoir geologist to examine all the existing opportunities, and exploit these to the utmost.

Construction of a stratigraphic model
The stratigraphic horizons defined at the wells during the correlation phase are subsequently linked to one another by constructing surfaces which together form what we might call the stratigraphic model of the reservoir. This model consists of a series of thickness maps of the individual geological horizons located between the upper and lower boundary surfaces of the reservoir. These maps are usually created using appropriate computer mapping programmes. For stratigraphic modelling, too, the three-dimensional approach is the most commonly adopted by reservoir geologists today. In this case, after constructing the external framework of the reservoir according to the procedure described in the previous paragraph, we proceed to define the internal geometry; in other words, to create that set of surfaces between the top and the bottom of the reservoir which represent the boundaries between the geological sequences selected for correlation. Generally, as already stressed, these surfaces form boundaries between flow units which are independent of one another.

The specific procedure allowing us to construct this stratigraphic scheme obviously depends on the applications used. Generally speaking, it is possible to model all existing sedimentary geometries (conformity and erosion surfaces, pinch-out, onlap, toplap, downlap, etc.) and to achieve an accurate reproduction of the depositional scheme under consideration.

Fig. 3 shows an example of a three-dimensional stratigraphic model, where we can see the different depositional geometries of the various sedimentary units. Note especially the onlap type geometry of the lower unit in the area of structural high.

Fig. 3. Example of a 3D stratigraphic model (courtesy of L. Cosentino).
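The stacking of thickness (isopach) maps between the reservoir's bounding surfaces can be sketched as follows. The "grids" are reduced to 2×2 arrays of depth values for brevity, and all numbers are purely illustrative:

```python
# Build internal horizon surfaces by stacking layer thickness (isopach)
# maps on the reservoir top; 2x2 grids and all values are illustrative.
top = [[1500.0, 1510.0], [1505.0, 1520.0]]      # depth of the reservoir top
thickness = {                                    # per-layer isopach maps
    "Unit_1": [[10.0, 12.0], [11.0, 9.0]],
    "Unit_2": [[20.0, 18.0], [22.0, 25.0]],
}

def stack_surfaces(top, thickness):
    """Return the top plus the base surface of each successive unit."""
    surfaces = {"Top": [row[:] for row in top]}
    current = [row[:] for row in top]
    for name, iso in thickness.items():
        current = [[c + t for c, t in zip(crow, irow)]
                   for crow, irow in zip(current, iso)]
        surfaces[f"Base_{name}"] = current
    return surfaces

model = stack_surfaces(top, thickness)
print(model["Base_Unit_1"][0][0])   # 1500 + 10 = 1510.0
print(model["Base_Unit_2"][0][0])   # 1510 + 20 = 1530.0
```

Zero thickness at a node naturally reproduces a pinch-out; erosional or onlap geometries require truncating the stacked surfaces against an unconformity surface, which mapping packages handle with dedicated options.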
[…] log’ curve (log of the gamma rays emitted by the rock as a function of depth).

Usually, the classification into facies is obtained through a more complex process. This involves selecting the most suitable log curves, identifying a number of reference wells (i.e. those wells which have been cored, and have high-quality logs), and applying statistical algorithms such as cluster analysis, or more complex processes based on neural networks. In this way, a lithological column is generated for each reference well, where each depth interval is associated with a specific facies (log facies). This process is iterative, and aims to identify the optimal number of facies needed to describe the reservoir rock in the right degree of detail.

Next, these log facies are compared with available core data and characterized from a lithological and petrophysical point of view. Basically, each log facies is associated with typical lithological descriptions and values (mean and/or statistical distributions) for petrophysical parameters. The degree of detail and the accuracy of this characterization stage obviously depend on the number and quality of logs used. In the case of old wells, with a limited availability of logs (e.g. electrical logs of spontaneous potential and/or resistivity), the classification process is perfunctory and the characterization stage is limited to a basic lithological description, for example sands/silt/clays, with limited vertical resolution. By contrast, where logs of more recent generations are available (e.g. density/neutron, PEF, sonic and NMR), the facies emerging from the classification process can be characterized more completely. For example, each facies can be linked not only to the most obvious lithological characteristics, but also to precise petrophysical values such as porosity, permeability, capillary behaviour, compressibility, cementation factor, saturation exponent, etc.

During the final stage, the classification defined on the reference wells is extended to all the other wells in the reservoir through a process of statistical aggregation. This stage allows us to obtain lithostratigraphic columns in terms of facies for all the wells in the reservoir.

Three-dimensional distribution of facies
The three-dimensional distribution of facies is usually obtained by applying stochastic algorithms, using the three-dimensional stratigraphic model as a base (see above).

These algorithms, which will be discussed in greater detail below, allow us to generate extremely realistic geological models, related to all available data, and sometimes even to dynamic data. Fig. 4 shows an example of this type of model, demonstrating the degree of detail which can be obtained in what are now routine geological studies.

These models use a vast number of basic cells, often in the order of tens of millions, thus allowing an extremely detailed representation of the real geological structure of the reservoir. In a later stage, after a process of simplification and reduction of the number of cells (upscaling), these geological models (in terms of the petrophysical characteristics of the reservoir rock) are input into the dynamic model to simulate the production behaviour of the reservoir.
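The cluster-analysis step described above can be sketched with a bare-bones k-means on two log curves. The gamma ray and density samples below are synthetic; a real workflow would normalize the curves, use more variables, and rely on a library implementation with random restarts:

```python
# Bare-bones k-means "cluster analysis" on two log curves, gamma ray
# (API units) and bulk density (g/cm3), to define two log facies.
# The six samples are synthetic: three sand-like, three shale-like.
samples = [(25, 2.30), (30, 2.32), (28, 2.28),     # low GR: sand-like
           (95, 2.55), (105, 2.60), (100, 2.58)]   # high GR: shale-like

def kmeans(points, iters=10):
    # Deterministic initialization from the first and last samples;
    # real implementations use random restarts and normalized curves.
    centers = [points[0], points[-1]]
    groups = [[] for _ in centers]
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)),
                    key=lambda i: (p[0] - centers[i][0]) ** 2 +
                                  (p[1] - centers[i][1]) ** 2)
            groups[i].append(p)
        centers = [tuple(sum(v) / len(g) for v in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

centers, groups = kmeans(samples)
for c, g in zip(centers, groups):
    print(f"facies centre GR={c[0]:.1f}, RHOB={c[1]:.2f}: {len(g)} samples")
```

Each resulting cluster centre plays the role of a log facies prototype; new depth intervals (in reference wells or elsewhere) are then assigned to the nearest centre.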
[…] generally provide fairly accurate values, and can also be applied under conditions of temperature and pressure corresponding to original reservoir conditions.

The problems associated with this type of measurement, where they exist, are linked to the representativeness of the rock sample. A typical example is the measurement of secondary porosity, which, being linked to genetic factors whose spatial intensity is extremely irregular, may not be at all representative of average reservoir conditions. As such, it may be difficult to determine the porosity of fractured rocks, or rocks affected by intense dissolution and/or cementation phenomena. Another example of poor representativeness is provided by rocks of conglomerate type, in which the distribution of the porous system is highly irregular, at least at the core scale.

The methods most frequently used to determine porosity are those based on the interpretation of well logs. The quantitative interpretation of porosity is of particular significance in reservoir studies when the determination of the porous volume of the reservoir turns out to be highly complex. This is true, for example, of old fields, with little data of low quality and resolution; of carbonate reservoirs, characterized by prevalently secondary porosity; and of fractured reservoirs, where well instruments may at times turn out to be completely inadequate for a quantitative calculation of porosity.

In all of these cases it is essential to integrate the normal petrophysical interpretation, based on log and core data, with all those techniques, static and dynamic, which may provide information, even of an indirect nature, on the porous volume of the reservoir. This integration process may make a fundamental contribution to the evaluation of the porous volume of the reservoir, and the understanding of its spatial distribution.

Water saturation
The porous system of a reservoir rock is filled with fluids, typically water and hydrocarbons. The relative distribution of these fluid phases within the porous space depends on a series of factors linked to the chemical and physical properties of the rock and the fluids themselves, as well as the interaction between rock and fluid (the rock wettability). Determining the saturation conditions of the reservoir rock represents one of the most important stages in a reservoir study, since it influences not only the calculation of the amount of hydrocarbons in place, but also the determination of fluid mechanics, and thus the productivity of the wells. This is generally a complex stage, which frequently involves a high degree of uncertainty in the final construction of the integrated reservoir model.

The water saturation of a rock, like its porosity, may be measured on cores, or on the basis of logs. In the laboratory, meaningful measurements of water saturation may be obtained using Dean-Stark type extraction data on preserved samples, at least in cases where mud filtrate invasion is limited, and where the expansion of the gaseous phase does not lead to a significant change in the sample’s initial saturation conditions. We can often obtain data of considerable accuracy by using suitable coring techniques and non-invasive oil-based drilling muds, at least in areas of the reservoir which are distant from the transition zone, also known as the capillary fringe. An example of a systematic study of this type, carried out on the Prudhoe Bay field in Alaska, is described in McCoy et al. (1997).

The water saturation of a rock may also be determined using capillary pressure measurements, based on the fact that capillary forces are responsible for the relative distribution of water and hydrocarbons within the porous space.

For the purposes of reservoir studies, water saturation is mainly measured on the basis of well logs recorded in uncased boreholes, and particularly electrical resistivity/induction logs, generally using the famous Archie equation, first published back in 1942 (Archie, 1942). In cased boreholes, on the other hand, water saturation can be measured using data obtained with pulsed neutron-type instruments. These also have the advantage of being recordable through the production tubing, while the well is producing. These instruments are often used in systematic monitoring of the evolution of saturation conditions in reservoirs, and therefore represent an extremely interesting source of information for reservoir studies. For example, the ability to monitor the advancement of oil-water or gas-water contacts in the various zones of the reservoir as a function of time not only allows us to optimize the management of the field, but also provides information which is essential in calibrating the results of the reservoir model.

Permeability
Permeability (see Chapter 4.1) is without doubt the most important petrophysical reservoir parameter. Permeability determines both the productivity of the wells and the reservoir’s ability to feed drainage areas, and thus the reservoir’s capacity to sustain economic rates in the long term. On the other hand, this is also the most difficult parameter to determine. Permeability is a property which can be measured […]
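The Archie equation mentioned above can be sketched as follows. The a, m and n constants used here are the conventional textbook defaults, and the log readings are purely illustrative:

```python
def archie_sw(phi, rt, rw, a=1.0, m=2.0, n=2.0):
    """Water saturation from the Archie equation:
    Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n)."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

# Illustrative readings: 25% porosity, deep resistivity Rt = 20 ohm.m,
# formation water resistivity Rw = 0.05 ohm.m
sw = archie_sw(phi=0.25, rt=20.0, rw=0.05)
print(f"Sw = {sw:.2f}")   # -> Sw = 0.20
```

In real interpretations the tortuosity factor a, the cementation factor m and the saturation exponent n are calibrated on core data, and shaly-sand corrections are applied where clay conductivity is significant.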
[…] of permeability in the well. Permeability is calculated using equations based on the proton relaxation time, and the results obtained may be fairly accurate, especially where some of the input parameters can be calibrated on measurements carried out on core samples in the laboratory.

Petrophysical correlations
Permeability is often obtained using a correlation with porosity by means of core measurements (Nelson, 1994). However, this method tends to generate permeability profiles which are unnaturally regular; there are various types of statistical data processing which allow us to preserve at least in part the heterogeneity of the original permeability distribution. These include, for example, regressions for individual lithological facies, and multiple linear regressions (Wendt et al., 1986).

Empirical equations
Various empirical equations exist in the relevant literature for the estimate of permeability on the basis of known petrophysical parameters. In some specific cases, these equations may provide fairly acceptable results, but it is always important to check them using available core data.

Neural networks
This is a recent methodology, which allows us to generate permeability profiles using logs or other petrophysical profiles. The most interesting aspect of this methodology (Mohaghegh and Ameri, 1996) is that the obtained estimates correctly represent the original degree of heterogeneity of the data measured, and the results do not suffer, as statistical methods do, from the smoothing effect. Particular attention should be paid during the preliminary ‘training’ process of the neural networks; this requires adequate calibration data, without which the results obtained may be misleading. Table 1 illustrates the characteristics of these various methods.

By integrating the data derived from these different techniques we can often generate reliable permeability models, which reflect both the static and dynamic aspects of this property. This allows us to improve and shorten the validation phase (history matching) of the dynamic simulation model, thereby optimizing the quality of the reservoir study and the time required to perform it.

Determination of net pay
The net pay of a reservoir represents that portion of rock which effectively contributes to production. This value is calculated using appropriate cut-off values applied to petrophysical parameters. Although the simplicity of the term might lead one to think otherwise, cut-off is one of the most controversial concepts within the community of geologists and reservoir engineers, since there is no clear shared methodology for its definition. This is also evident from the lack of literature on the subject, despite the fact that the determination of net pay is practically unavoidable in any reservoir study (Worthington and Cosentino, 2003).

One of the key points in determining the cut-off to be applied to petrophysical curves is an understanding of its dynamic nature. This is because the cut-off is linked to conditions that imply the productive capacity of hydrocarbons under given pressures, and with a given development plan. Typically, a porosity cut-off is selected on the basis of permeability versus porosity graphs drawn up using data obtained from core analysis, thus fixing a limit value for permeability often equivalent to a conventional value of 1 mD (millidarcy).

In selecting the cut-off, we must consider at least the following two factors. Firstly, the cut-off must be chosen on the basis of fluid mobility rather than permeability alone. Consequently, in the same geological formation, the value of the cut-off varies as a function of the fluid present. This is why many of the world’s gas fields produce from reservoirs with extremely low permeability, just a few mD, whereas
the cut-offs normally applied to heavy oil reservoirs are in the order of tens of mD. Typical cut-off values for mobility lie in the range of 0.5-1 mD/cp.

Secondly, the choice of cut-off must be a function of production mechanisms. In reservoirs which produce by simple fluid expansion (depletion drive), the value of the cut-off depends on the prevalent pressure level. It is obvious that rocks with low permeability, subjected to a high pressure differential (the difference between reservoir pressure and the pressure imposed in the production tubing), can contribute to production. As a result, in a reservoir of this type, the real cut-off changes over time, as pressure differences increase. The cut-off’s dependency on time emphasizes another aspect of this complex problem. By contrast, in reservoirs dominated by convective phenomena (e.g. reservoirs subjected to secondary recovery processes using water injection), where pressure does not change significantly during production, the cut-off depends more on the efficiency of the displacement process, and is thus more generally linked to concepts of Residual Oil Saturation (ROS).

It should be stressed, however, that even taking into consideration the aspects described above, the selection of an appropriate cut-off value is difficult, and often elusive. This explains why such a choice is highly subjective and difficult to justify. It is no coincidence that one of the most controversial aspects of ‘reservoir unitization’ processes (the pooled production of a reservoir which extends over two or more production leases, by agreement or imposed by law) is precisely the choice of cut-off and the determination of net pay.

The main problem in determining the cut-off lies in the choice of the reference value for permeability, which represents the boundary between productive and non-productive rocks. Various factors should contribute to this choice: a good knowledge of the reservoir’s lithologies and fluids, the prevalent production mechanism, and the analysis of data which may provide direct or indirect information for the purpose (production tests and DST, data obtained with measurements performed using WFT and NMR-type well test tools, etc.). The integration of all these types of information allows an appropriate choice of the values to be adopted.

Once the permeability cut-off value for commercial production has been defined, other associated petrophysical cut-off values may be obtained fairly simply on the basis of diagrams (crossplots) of reservoir properties. This methodology is illustrated schematically in Fig. 6.

Fig. 6. Procedure for defining a consistent set of petrophysical cut-offs. K, permeability; Φ, porosity; SW, water saturation; Vsh, shale volume; c, cut-off.

Where a lithological classification is available (see above), this procedure should be carried out independently for each facies. This usually results in greater accuracy, and consequently a more effective distinction between producing and non-producing rocks. In some cases, the lithological classification may also lead to the definition of facies which are reservoir and non-reservoir, thus making the determination of net pay even easier, especially when working on complex three-dimensional geological models.

Finally, it is always advisable to perform sensitivity analyses on the method employed by using different working hypotheses, and thus different cut-off values, and noting the variations in the final values for the volume of hydrocarbons in place. This phase often allows us to refine our initial hypotheses, and to optimize our final choice.

Distribution of petrophysical parameters
The petrophysical well interpretation forms the basis for the subsequent stage of the study, consisting in the lateral (2D) or spatial (3D) distribution of reservoir properties. In both cases, the most complex problem is the lack of information on those parts of the reservoir between wells, especially when dealing with highly heterogeneous geological formations, or those characterized by poor lateral continuity.

Traditionally, the interpolation of known values measured at the wells has represented the classic methodology for the construction of reservoir maps, with the geological/sedimentological model forming the only reference point for this operation. In the past, the reservoir geologist drew these maps manually; only from the 1980s onwards did computer mapping techniques begin to be used. Since the 1990s the situation has changed radically. On the one hand, the availability of computers with increasingly high processing and graphic capabilities has definitively changed the way reservoir geologists work. On the other, the development of new methodologies such as geostatistics and the extraordinary evolution of techniques for acquiring and processing geophysical data have provided geologists with new tools, allowing them to build more accurate and less subjective models. In the following sections we will describe separately the two possible approaches: two-dimensional and three-dimensional modelling.

Two-dimensional modelling of reservoir parameters
Two-dimensional geological modelling consists in the generation of a set of maps representing the lateral distribution of reservoir parameters. We can distinguish between two basic types of map: those which describe the geometry of geological units (top, bottom and thickness of the various layers: see above), and those which describe their petrophysical properties: porosity, water saturation, net/gross ratio, and permeability. It should be stressed that the latter type of map, whilst not strictly speaking necessary for the static model, is essential for dynamic simulations.

The procedures used to generate maps of porosity and net/gross (the ratio of net pay to gross thickness) are basically similar. Mean values are calculated in the wells for each geological unit, and these values are then adopted for the interpolation process by using computer mapping techniques. In the simplest cases, as we have said, this operation is performed solely on the basis of the sedimentological reservoir model, with fairly reliable results, at least where there is a high density of existing wells. However, considerable […] which prevent this type of uncontrolled extrapolation.

This procedure may be improved by using geostatistical techniques. In this case, the correlation function adopted is not predefined, as in the case of commercial algorithms. Instead, it is calculated directly on the basis of available data, with obvious benefits in terms of the accuracy of the end result. These correlation functions (the variogram, or its opposite, the covariance) express the real lateral continuity of the variable being modelled, and also allow us to take into account possible directional anisotropies. The geostatistical algorithm used in the next stage of the evaluation process is known as kriging. This algorithm allows us to represent accurately the lateral distribution of the parameters, and has the additional advantage of providing an evaluation of local uncertainty (kriging variance).

A further improvement of the expected results may be obtained by using seismic data. Geophysics is the only direct source of information on areas of the reservoir which are distant from wells, and in recent years the geophysical techniques available for this purpose have improved considerably. This approach is based on a possible correlation between particular characteristics (or ‘attributes’) of the recorded seismic signal, and the petrophysical characteristics of the reservoir (typically porosity and/or net pay). This correlation is defined in the calibration phase, by comparing surface seismic data with data measured at the wells (sonic and velocity logs, VSP, etc.). Once the correlation has been defined, we proceed to integrate the seismic data, generally using the following methods (in order of complexity):
• The normal well data interpolation, improved by using maps of seismic attributes; these are used to calculate the large-scale trend of the parameter under consideration.
• The conversion of the map of the seismic attribute (e.g. amplitude or acoustic impedance) into a porosity map, using the correlation defined at the wells. Later, the resulting map is modified to be consistent with available well values.
• The geostatistical approach, using spatial distribution functions calculated on the basis of the correlation between well data and seismic data. The use of collocated cokriging techniques (Xu Wenlong et al., 1992) has become widespread in recent years.

Fig. 7 shows an example of a porosity map generated by integrating information obtained from […]
attention must be paid to the peripheral areas of the wells with geophysical data.
reservoir, where the mapping algorithm may This type of approach to the construction of
extrapolate meaningless values. In these cases, it is reservoir maps is becoming increasingly common in
common practice to use reference control points, current practice, due mainly to the availability of
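The second of the integration methods listed above, the conversion of an attribute map into a porosity map via a correlation defined at the wells, can be sketched in a few lines. All impedance and porosity values below are invented for illustration, and the final step of reconciling the map with well values is omitted:

```python
import numpy as np

# Hypothetical calibration data at five wells: acoustic impedance (attribute)
# and porosity. All numbers are invented for illustration.
imp_wells = np.array([6500.0, 7200.0, 5800.0, 6900.0, 6100.0])
phi_wells = np.array([0.21, 0.15, 0.27, 0.17, 0.24])

# Calibration phase: a linear correlation defined at the wells.
slope, intercept = np.polyfit(imp_wells, phi_wells, deg=1)

# Conversion of a (toy) impedance map into a porosity map. In practice the
# result would then be further corrected to honour the well values exactly.
imp_map = np.array([[6000.0, 6400.0],
                    [7000.0, 6600.0]])
phi_map = slope * imp_map + intercept

print(phi_map.round(3))
```

The fitted slope is negative here, reflecting the usual inverse relationship between impedance and porosity in such calibrations.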
This type of approach to the construction of reservoir maps is becoming increasingly common in current practice, due mainly to the availability of highly sophisticated software applications. These allow us to visualize seismic and traditional geological data simultaneously, with obvious benefits for the modelling process. However, considerable care is required in these operations, since seismic signals are influenced by a broad range of factors (lithology, petrophysical characteristics, fluid content, overlying formations), and it is thus important to check the correlation between seismic data and well data carefully. Spurious correlations are more common than one might think, especially where only a few wells are available for control (Kalkomey, 1997).

There are also various methodologies for the production of water saturation maps. As for porosity and net/gross, the most traditional technique is based on the direct mapping of values measured at the wells for each geological layer. This procedure works fairly well where a large number of wells are available, and has the added advantage of reflecting the values effectively measured in the wells themselves. However, this methodology fails to take into account the correlation with other petrophysical parameters (porosity and permeability), and does not allow an accurate reproduction of the capillary fringe (see Chapter 4.1). Moreover, it is prone to consistency problems in the petrophysical interpretation of the various wells.

Another technique frequently used to generate saturation maps consists in the direct application of a porosity-water saturation correlation. In cases where pore geometry is relatively simple, we can frequently observe a linear correlation between these parameters on a semilogarithmic scale. The main advantage of this technique lies in its speed of execution and the consistency of results. However, it does not allow us to model the capillary fringe; its principal application is thus for gas fields, and in general for those reservoirs where the height of the capillary fringe can be disregarded.

Other techniques for generating saturation maps rely on the application of capillary pressure curves, which reproduce the distribution of the fluid phases relative to the height above the water-hydrocarbon contact. These functions may be derived from capillary pressure data measured in the laboratory (see above), or they may be calculated on the basis of multiple linear regressions. In the latter case, both petrophysical (porosity) curves and height above the contact are used, and this allows us to simultaneously take into consideration the dependence on the porous system, and on the distance from the interface between the fluids. These methods, whilst more time-consuming, generally represent the most satisfactory compromise for the generation of saturation maps, partly because the methodology employed is similar to that used for dynamic simulation in the initialization phase of the model. In this light, this method promotes greater consistency between the hydrocarbons in place values calculated during the geological modelling phase, and those calculated during the dynamic simulation phase.

Fig. 7. Example of a porosity map generated by integration with seismic data (courtesy of L. Cosentino).

The construction of an accurate permeability map is one of the most important aspects of a reservoir study, since the results of the dynamic simulation model are largely dependent on it. Various methodologies are available, and the choice of which to use depends on the characteristics of the reservoir under examination, on the available data, and on available human and technological resources. The traditional method, as in the case of porosity and net/gross, involves direct mapping of available well values. However, as compared to other petrophysical parameters, this methodology has greater limitations, linked to the following aspects.

Availability of data. Generally, the availability of data for the mapping process is more limited than for other petrophysical parameters, given that, with the partial exception of nuclear magnetic resonance logs, permeability data are available only from cored wells.

Type of data. As already discussed (see above), there are generally various possible sources of permeability data, each of which provides characteristic values relating to scale, saturation conditions and type of information (direct/indirect). The homogenization of these data required prior to a mapping process often turns out to be an arduous task, and subject to compromises.

Spatial variability. The spatial continuity (lateral and vertical) of permeability is usually much lower than that of other reservoir parameters.
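The spatial continuity invoked here, for permeability as for any other parameter, is quantified by the experimental variogram introduced earlier. A minimal one-dimensional sketch, with a synthetic permeability series standing in for regularly spaced cored-well data, might look like this:

```python
import numpy as np

def experimental_variogram(values, max_lag):
    """Semivariance gamma(h) = mean of (z(x+h) - z(x))^2 / 2 for each lag h."""
    return np.array([0.5 * np.mean((values[h:] - values[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

# Synthetic permeability series (mD) standing in for regularly spaced core data.
rng = np.random.default_rng(0)
perm = 100.0 + np.cumsum(rng.normal(0.0, 5.0, size=200))

gamma = experimental_variogram(perm, max_lag=10)
# For a spatially continuous variable, semivariance grows with lag distance.
print(gamma.round(2))
```

A variable with poor continuity would instead show a variogram that jumps almost immediately to its plateau (a large "nugget"), which is the behaviour the text describes for permeability in heterogeneous formations.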
In the case of highly heterogeneous formations, this continuity may be as little as one metre, or even entirely nonexistent. It is worth remembering that most algorithms used in software packages assume a predefined and implicitly very high spatial continuity, which generates fairly regular maps. In the case of permeability this is often unrealistic.

Despite this, the mapping of permeability using production tests carried out in wells may generate accurate maps, especially when a sufficiently large number of tests are available. These permeability values are often extremely representative, and allow us to produce consistent maps which are particularly well-suited to dynamic simulation. In the case of fractured reservoirs, where core data are shown to be inadequate for the representation of the actual reservoir permeability, this type of approach is often a forced choice. Finally, it should be stressed that, as for other reservoir parameters, these interpolations may be further improved by using geostatistical techniques and kriging algorithms.

An alternative methodology which is frequently employed is based on the generation of a permeability map from a map of porosity, using a correlation between the two parameters, generally calculated on the basis of available core data. In this case, the resulting permeability map will intrinsically resemble that of porosity, the implicit assumption being that the spatial correlation function for these two parameters is of the same type. However, this is normally inaccurate, and the resulting maps often appear unnaturally regular. Furthermore, it should be stressed that the relationship between porosity and permeability on which this method rests is often far from clear, especially in the case of carbonate sediments. As such, the results may be improved through a careful analysis of the basic correlation, and the identification of lower-order correlations, preferably for each individual facies.

Three-dimensional modelling of reservoir parameters

The 2D methodology described in the previous paragraph is gradually being replaced by more complex techniques, based on a three-dimensional approach to geological modelling. It is now possible to generate and visualize rapidly three-dimensional models of any reservoir parameter, with a resolution that frequently exceeds tens of millions of cells. This means that the reservoir geologist can quickly check different working hypotheses and analyse results directly on the screen of his own computer, with obvious benefits in terms of time and the accuracy of the end results. Three-dimensional modelling may be applied to all reservoir parameters, basically using the same procedures already described for two-dimensional models.

Generally speaking, two types of approach can be identified: in the first, the distribution of petrophysical parameters is carried out directly in the three-dimensional space of the reservoir, starting from well profiles (single-stage model). This method does not require a three-dimensional lithological model of the facies (see above). In the second, the distribution is implemented on the basis of the lithological model. In this case, the petrophysical parameters are distributed following the three-dimensional modelling of the facies, in accordance with statistical laws specific to each facies (two-stage model).

The second method has the advantage of being based on a geological reference model which forms the basis for lithological modelling. This generally allows a better assignment of petrophysical properties, especially in the presence of complex lithologies characterized by different porous systems.

A particularly interesting aspect of 3D modelling is the possibility of integrating seismic data, traditionally used in a two-dimensional context, directly in three dimensions. Thanks to the availability of sophisticated processing algorithms which allow us to improve the vertical resolution of seismic data, and to the use of new techniques to characterize the seismic signal, we can identify seismic facies within the set of seismic data. These in turn can be correlated with the more traditional facies deriving from the lithological characterization of the reservoir. Fig. 8 shows an example of profiles derived from seismic data characterized in terms of seismic facies. Examples of this type represent a notable point of convergence between lithological, petrophysical and seismic modelling, the integration of which may produce extremely accurate three-dimensional models.

Fig. 8. Example of seismic profiles characterized in terms of seismic facies (courtesy of L. Cosentino).
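The porosity-permeability correlation discussed earlier in this section is typically fitted as a straight line on a semilogarithmic scale, ideally per facies. A minimal sketch with invented core data (values are illustrative only, not from any real field) might be:

```python
import numpy as np

# Invented core measurements: porosity (fraction) vs. permeability (mD).
phi = np.array([0.10, 0.14, 0.18, 0.22, 0.26, 0.30])
k = np.array([0.5, 2.0, 9.0, 40.0, 180.0, 800.0])

# A straight line on a semilogarithmic scale: log10(k) = a*phi + b.
a, b = np.polyfit(phi, np.log10(k), deg=1)

def perm_from_phi(phi_value):
    """Transform a porosity value (or a whole porosity map) through the fit."""
    return 10.0 ** (a * phi_value + b)

print(round(float(perm_from_phi(0.20)), 1))  # permeability predicted at 20% porosity
```

In a two-stage workflow, one such fit would be computed for each facies and applied only to the cells of that facies, which is the "lower-order correlations" refinement the text recommends.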
4.5.6 Integrated geological model

Until a few years ago, the geological model referred to a workflow rather than an object. During the past ten years the extraordinary development of information technologies and the spatial modelling of oil fields has occasioned such radical changes in the way reservoir geologists work, and even think, that the meaning of the geological model has changed considerably. On the one hand, it has become clear that the integration of different disciplines, static and above all dynamic, is fundamental for the correct static characterization of the reservoir. On the other, the information platforms on which we work today allow the gradual construction of a model (first structural, then stratigraphic, lithological, and finally petrophysical), which comprises and summarizes the results of the interpretations carried out by the various experts participating in the interdisciplinary study.

The integrated geological model has thus taken on a revolutionary meaning compared to the past, becoming a virtual object which represents the actual reservoir present underground in a discrete (but extremely detailed) way. It is characterized quantitatively by petrophysical parameters distributed within the three-dimensional space of the reservoir, and may be modified and updated rapidly if new data become available, for example from new wells.

The theoretical and practical basis for this new approach to reservoir geology is represented by stochastic modelling. The use of stochastic (or geostatistical) models is relatively recent, but is becoming the most common practice among reservoir geologists. From the 1990s onwards numerous algorithms have been developed, the most versatile being available in commercial applications which have made them fairly simple to use. In brief (Haldorsen and Damsleth, 1990), stochastic modelling refers to the generation of synthetic geological models (in terms of facies and petrophysical parameters), conditioned to all available information, both qualitative (soft) and quantitative (hard).

These models generate equiprobable realizations, which share the same statistical properties, and which represent possible images of the geological complexity of the reservoir. There is no a priori method for choosing which realization to use in a reservoir study, and this hinders the full acceptance of these methodologies by the geological community. On the other hand, the availability of a theoretically unlimited series of realizations allows us to explore thoroughly (for a given algorithm and its associated parameters) the uncertainties linked to the available data. The stochastic approach therefore represents a considerable improvement on traditional geological modelling techniques.

Currently, the most frequently used algorithms for stochastic modelling belong to either the pixel-based or object-based category. In pixel-based models, also known as continuous models, the variable simulated is considered a continuous random function, whose distribution (often of Gaussian type) is characterized by cut-off values which identify different facies or different intervals for petrophysical values. The most commonly used algorithms in this category are truncated Gaussian random functions (Matheron et al., 1987), and functions of indicator kriging type (Journel et al., 1990).

These models are applied especially in the presence of facies associations which vary continuously within the reservoir, as is frequently the case in geological formations of deltaic type or shallow water marine reservoirs. No a priori assumptions are made on the form and extension of the sedimentary bodies, which are simulated solely on the basis of the spatial distribution functions used (variograms and proportionality curves). This approach is often adopted in cases characterized by relatively high net/gross ratios, in other words in prevalently sandy geological formations with intercalations of clay or other non-productive layers.

By contrast, object-based models, also known as Boolean models, generate three-dimensional distributions of sedimentary bodies, obtained by juxtaposing objects of simplified geometry, such as disks or tabular bodies, within a clayey matrix. The parameters of these bodies (orientation, sinuosity, length, width, etc.) can be estimated on the basis of the sedimentological model adopted, geophysical data, outcrops of comparable rocks, or on the basis of production test interpretations. This type of model is used more frequently for fluvial-type reservoirs, characterized by channels or meanders located within prevalently clayey geological units, where the overall net/gross ratio is relatively low. In these contexts, we can obtain extremely interesting results, with highly realistic images of the geology simulated. By contrast, in cases where the net/gross ratio is higher (typically above 40%), and when the number of conditioning wells is high, these algorithms may require an extremely long time to process.

Fig. 9 shows an example of a geological model generated with a pixel-based algorithm. Note the […]
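The truncation idea behind pixel-based models can be illustrated in a few lines: a spatially correlated Gaussian field is thresholded at cut-off values chosen from target facies proportions. The field generator below is a crude moving average, not a production simulator, and all proportions are invented:

```python
import numpy as np

rng = np.random.default_rng(42)

# A Gaussian random field on a 2D grid. Spatial correlation is mimicked here
# by a simple moving average; real simulators use proper covariance models.
noise = rng.normal(size=(64, 64))
field = np.zeros_like(noise)
for di in range(-2, 3):
    for dj in range(-2, 3):
        field += np.roll(np.roll(noise, di, axis=0), dj, axis=1)
field /= 25.0

# Cut-off values chosen so the facies honour target proportions
# (invented: 30% shale, 50% sand, 20% cemented sand).
cuts = np.quantile(field, [0.30, 0.80])
facies = np.digitize(field, cuts)  # 0 = shale, 1 = sand, 2 = cemented sand

props = [float(np.mean(facies == f)) for f in range(3)]
print([round(p, 2) for p in props])
```

Because the cut-offs are quantiles of the simulated field, the realized proportions match the targets almost exactly, while the spatial correlation of the underlying field controls the size and connectivity of the facies patches.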
4.5.7 Calculation of hydrocarbons in place

The determination of Original Hydrocarbons In Place (OHIP, or OOIP for oil and GOIP for gas) is generally considered the final stage of the static reservoir study. It is during this stage that the description of the reservoir, in terms of external and internal geometry and the properties of the reservoir rock, is quantified through a number expressing the amount of hydrocarbons present in the reservoir at the time of discovery.

In fact, the most important number for the economic evaluation of a field is that relating to the reserves; in other words, that portion of hydrocarbons which can actually be recovered with a given development plan. The relation between hydrocarbons in place and Recoverable Reserves (RR) is expressed by the well-known equation:

[1] RR = OHIP · Rf

where Rf is the recovery factor. The value of this factor, and consequently of the reserves, depends both on the geological characteristics of the reservoir, and on a series of other elements such as the type of hydrocarbon, the characteristics of drive mechanisms, the development plan adopted, the surface equipment, gas and oil prices, etc. (see Chapter 4.6). The value of hydrocarbons in place, on the other hand, is independent of these factors, and therefore extremely important, especially because it gives a clear and immediate picture of the importance and potential of the existing accumulation.

Basically, there are two techniques for estimating hydrocarbons in place: the traditional method, based on geological volumetric calculation techniques; and methods based on material balance (see Chapter 4.3). In this context, it is worth remembering that dynamic simulation does not provide an independent estimate of hydrocarbons in place, since the values calculated by the simulator simply derive from the geological model used as input.

Below, only geological evaluation methods are described in detail. It should be stressed, however, that material balance techniques may often provide extremely accurate estimates of hydrocarbons in place, and that it is the reservoir geologist’s task to check the agreement between the various methods, and to justify any disagreements.

Volumetric evaluations
These refer to estimates of the quantity of original hydrocarbons in place calculated using the results of the integrated geological model. These estimates are based on the following formula:

[2] OHIP = GBV · (N/G) · φ · (1 − Sw)

where GBV is the Gross Bulk Volume of rock in the reservoir; N/G the net to gross (ratio of net pay to gross thickness); φ the porosity (fraction); Sw the water saturation (fraction); and (1 − Sw), equal to Sh, the hydrocarbon saturation (fraction).

If we know the mean values of these parameters for the reservoir in question, we can quickly calculate the amount of hydrocarbons in place. In fact, in common practice, this calculation is not performed using mean values (except as a preliminary evaluation), but rather using surfaces (in two dimensions) or volumes (in three dimensions) representing the spatial distributions of the parameters in the equation. All the computer applications commonly used in two- or three-dimensional static modelling supply the relevant calculation algorithms, allowing us to obtain the volume of hydrocarbons in place simply and rapidly.

In traditional two-dimensional modelling, based on the combination of surfaces (grids), we obtain a map known as the equivalent hydrocarbon column (gross pay · N/G · φ · Sh), which provides a clear and immediate picture of the hydrocarbon distributions within the reservoir. The value of OHIP is then obtained simply by integrating this map. In the case of three-dimensional models, the OHIP value is calculated directly on the basis of the integrated geological model, using suitable calculation algorithms which realize the sum of the volume of hydrocarbons present in each of the basic cells of the model.

It is important to note that Eq. [2] supplies a value for OHIP under reservoir conditions. To convert this into surface conditions, we need to take into consideration the variation in volume that oil and/or gas undergo when they reach the surface. This variation in volume, which is mainly a function of pressure, is measured experimentally in the laboratory, and is known as the Formation Volume Factor (FVF). In the case of oil, the equation linking downhole volume and surface volume is as follows:

[3] OHIPST = OHIPR / Bo

where OHIPST is the volume under stock tank conditions, OHIPR is the volume under reservoir conditions, and Bo is the FVF of the oil, expressed in reservoir barrels over stock tank barrels. In the case of gas, the FVF is indicated with an equivalent volume factor Bg.
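Eq. [2] summed cell by cell over a 3D grid, followed by the conversion of Eq. [3], can be sketched as follows. The grid dimensions, property ranges and Bo value are all invented:

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (10, 20, 20)  # a toy 3D grid: nz, ny, nx cells

# Per-cell properties of a hypothetical model (all values invented).
gbv = np.full(shape, 50.0 * 50.0 * 2.0)  # gross bulk volume per cell, m3
ntg = rng.uniform(0.5, 0.9, shape)       # net-to-gross, fraction
phi = rng.uniform(0.15, 0.25, shape)     # porosity, fraction
sw = rng.uniform(0.2, 0.5, shape)        # water saturation, fraction

# Eq. [2], evaluated cell by cell and summed: OHIP at reservoir conditions.
ohip_res = float(np.sum(gbv * ntg * phi * (1.0 - sw)))

# Eq. [3]: conversion to stock tank conditions via the oil FVF.
bo = 1.25  # reservoir volume per stock tank volume (invented)
ohip_st = ohip_res / bo

print(f"OHIP reservoir: {ohip_res:,.0f} m3, stock tank: {ohip_st:,.0f} m3")
```

Since Bo is greater than one for oil, the stock tank volume is always smaller than the reservoir volume, which is a quick sanity check on any volumetric calculation.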
It should be emphasized that the application of this formula often leads to misunderstandings, because reports on the PVT analysis of reservoir oils (see Chapter 4.2) usually give different values for the volume factor, according to the experiments carried out in the laboratory. We can thus define a differential Bo, a flash Bo, and other Bo types deriving from separation tests at different pressures and temperatures. These Bo values usually differ from one another, especially in the case of volatile oils. Furthermore, by combining differential Bo values with those from separation tests we can calculate a composite Bo, which takes into account both the behaviour of the oil under reservoir conditions (differential test), and the actual separation conditions at the surface. This composite value generally represents the best approximation of the fluid’s volumetric behaviour, and is that which should be used in Eq. [3].

The direct use of the value under reservoir conditions expressed by Eq. [1] eliminates possible ambiguities relating to the choice and use of the volume factor, especially when data calculated volumetrically must be compared with data calculated using the simulation model, where the volume factors are determined using more complex calculations.

Deterministic and probabilistic evaluations
Generally speaking, the volume of hydrocarbons in place may be calculated deterministically and/or probabilistically.

Deterministic values of OHIP are obtained simply by combining the mean values (in one dimension), surfaces (two dimensions) or grids (three dimensions) of the reservoir parameters indicated in Eq. [2]. These estimates are deterministic in that all the parameters are calculated in a univocal way, without taking into account the possible uncertainties associated with each of them. In other words, the estimates calculated for the representation of these parameters are implicitly considered to be correct.

This is the type of estimate traditionally supplied by the reservoir geologist, and most frequently performed. However, the process of constructing a geological model on the basis of insufficient, scattered information (wells) involves uncertainties due to errors of measurement, the lack of representative data, interpretative problems, etc. As a result, the value for OHIP obtained using this type of procedure is just one of many possible values, and depends on the specific interpretative process adopted. If we were to use, for example, a different interpolation algorithm, we would usually obtain a different value for OHIP which is, a priori, equally valid.

In contrast to deterministic evaluations, probabilistic evaluations generally provide a much more realistic estimate of the amount of hydrocarbons in place, since they also evaluate the accuracy of the estimate itself. The probabilistic approach involves taking into account the probability distributions of every single parameter involved in the calculation. Each of these probability distributions quantitatively reflects the degree of knowledge, and thus of uncertainty, of the parameter in question. In the simplest case (one dimension), these distributions are sampled repeatedly and at random (Monte Carlo method), ultimately generating a distribution of OHIP values. This distribution is characterized by statistical parameters (mean, median, standard deviation, etc.) which give a concise representation of the results obtained. In two or three dimensions, the Monte Carlo method can nevertheless be applied, replacing simple one-dimensional distributions with surface and grid distributions. In any case, the final result is still represented by a frequency distribution, and therefore a probability distribution, for OHIP values.

In general, however, when making a probabilistic evaluation of hydrocarbons in place, the preferred methodology is that of stochastic modelling.

Uncertainties relating to geological modelling
The reservoir geologist has the difficult task of reconstructing with maximum accuracy the geometry and petrophysics of a reservoir about which he usually has little, and mostly indirect, information. It is therefore obvious that the final model will always present some degree of uncertainty. The quantitative evaluation of uncertainties relating to geological modelling is one of the most complex and interesting aspects of a reservoir study.

In a typical static reservoir modelling study, we can identify four main sources of uncertainty.

Uncertainties linked to the quality of data and their interpretation. All of the basic data in a study, from geophysical data to logs and core data, are associated with errors of measurement which influence the accuracy of the final result. Even though it is in theory possible to quantify these errors, this task is rarely carried out, and the basic data are generally assumed to be correct. This is even more true of the interpretative stages.

Uncertainties linked to the structural and stratigraphic models. The structural interpretation carried out by the geophysicist is in most cases of a deterministic nature, and does not include quantifying associated uncertainties, although it is clear that this phase of the work is to some degree subjective. The same can be said of the correlation phase (stratigraphic model), especially when dealing with depositional environments characterized by poor lateral continuity (e.g. continental type deposits).
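The one-dimensional Monte Carlo procedure described above can be sketched as follows. All parameter distributions are invented, and P90/P50/P10 follow the usual industry convention in which P90 denotes the value exceeded with 90% probability:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000  # number of Monte Carlo draws

# Probability distributions for the parameters of Eq. [2]; each reflects an
# assumed degree of uncertainty about a hypothetical field (values invented).
gbv = rng.triangular(0.8e9, 1.0e9, 1.3e9, n)  # gross bulk volume, m3
ntg = rng.uniform(0.55, 0.75, n)              # net-to-gross
phi = rng.normal(0.20, 0.02, n)               # porosity
sw = rng.uniform(0.25, 0.40, n)               # water saturation

# One OHIP value per joint draw gives the output distribution.
ohip = gbv * ntg * phi * (1.0 - sw)

# P90/P50/P10: P90 is the 10th percentile of the distribution, i.e. the
# volume exceeded in 90% of the realizations.
p90, p50, p10 = np.percentile(ohip, [10, 50, 90])
print(f"P90={p90:.3e}  P50={p50:.3e}  P10={p10:.3e} m3")
```

Replacing these one-dimensional distributions with stochastic realizations of full surfaces or grids gives the two- and three-dimensional variants mentioned in the text.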
Matheron G. et al. (1987) Conditional simulation of the geometry of fluvio-deltaic reservoirs, in: Proceedings of the Society of Petroleum Engineers annual technical conference and exhibition, Dallas (TX), 27-30 September, SPE 16753.

Mezghani M. et al. (2000) Conditioning geostatistical models to flowmeter logs, in: Proceedings of the Society of Petroleum Engineers European petroleum conference, Paris, 24-25 October, SPE 65122.

Mohaghegh S., Ameri S. (1996) Virtual measurement of heterogeneous formation permeability using geophysical well log responses, «The Log Analyst», 37, 32-39.

Nelson P.H. (1994) Permeability-porosity relationships in sedimentary rocks, «The Log Analyst», 35, 38-62.

Vail P.R. et al. (1977) Seismic stratigraphy and global changes of sea level, in: Payton C.E. (edited by), Seismic stratigraphy. Applications to hydrocarbon exploration, «American Association of Petroleum Geologists. Memoir», 26, 63-98.

Wendt W.A. et al. (1986) Permeability prediction from well logs using multiple regression, in: Reservoir characterization. Proceedings of the Reservoir characterization technical conference, Dallas (TX), 29 April-1 May 1985, 181-221.

Worthington P., Cosentino L. (2003) The role of cut-offs in integrated reservoir studies, in: Proceedings of the Society of Petroleum Engineers annual technical conference and exhibition, Denver (CO), 5-8 October, SPE 84387.

Xu Wenlong et al. (1992) Integrating seismic data in reservoir modeling. The collocated cokriging alternative, in: Proceedings of the Society of Petroleum Engineers annual technical conference and exhibition, Washington (D.C.), 4-7 October, SPE 24742.

Luca Cosentino
Eni - Agip
San Donato Milanese, Milano, Italy