
Computers & Geosciences 50 (2013) 4–15

Contents lists available at SciVerse ScienceDirect — journal homepage: www.elsevier.com/locate/cageo

Hierarchical benchmark case study for history matching, uncertainty quantification and reservoir characterisation

D. Arnold*, V. Demyanov, D. Tatum, M. Christie, T. Rojas, S. Geiger, P. Corbett
Institute of Petroleum Engineering, Heriot-Watt University, UK

ARTICLE INFO

Article history:
Received 11 April 2012
Received in revised form 31 August 2012
Accepted 4 September 2012
Available online 26 September 2012

Keywords:
Uncertainty quantification
History matching
Risk
Geomodelling
Geostatistics
Benchmark

ABSTRACT

Benchmark problems have been generated to test a number of issues related to predicting reservoir behaviour (e.g. Floris et al., 2001, Christie and Blunt, 2001, Peters et al., 2010). However, such cases are usually focused on a particular aspect of the reservoir model (e.g. upscaling, property distribution, history matching, uncertainty prediction, etc.), and the other decisions in constructing the model are fixed: log values that relate to the distribution of cell properties away from the wells, fixed grids and structural features, and fixed fluid properties. This is because all these features require an element of interpretation, from indirect measurements of the reservoir, noisy and incomplete data, and judgments based on domain knowledge.

Therefore, there is a need for a case study that considers interpretational uncertainty integrated throughout the reservoir modelling workflow. In this benchmark study we require the modeller to make interpretational choices as well as to select the techniques applied to the case study, namely the geomodelling approach, history matching algorithm and/or uncertainty quantification technique. The interpretational choices are around the following areas:

(1) Top structure interpretation from seismic and well picks.
(2) Fault location, dimensions and the uncertainty in the connectivity of the fault network.
(3) Facies modelling approach.
(4) Facies interpretations from well log cutoffs.
(5) Petrophysical property prediction from the available well data.
(6) Grid resolution: a choice between the number of iterations and the model resolution needed to capture the reservoir features adequately.

The semi-synthetic study is based on the real field data provided: production data, seismic sections to interpret the faults and top structures, wireline logs to identify facies correlations and the saturation profile, porosity and permeability data, and a host of other data. To make this problem useable in a manageable time period, multiple hierarchically related gridded models were produced for a range of different interpretational choices.

© 2012 Elsevier Ltd. All rights reserved.

1. Introduction

Uncertainty is an issue we need to deal with in every reservoir, due to the sparse and/or low-resolution data available and our limited knowledge about the reservoir. Creating a simulation model to forecast petroleum reservoir production requires the modeller to make a number of choices about how to construct a representative model. These choices range from which numerical value to assign a reservoir property based on measured data to interpretational choices such as the depositional model.

Typically a geomodeller would produce many hundreds of realisations of the reservoir to assess the uncertainty in a pre-production field. Critically, these models attempt to cover the uncertainties in the reservoir through a range of possible numerical input parameters that produce variations in key reservoir features. Where we can miss out uncertainty is in the interpretational elements of modelling that are difficult to describe using numerical parameters, such as the choice of which faults are present, the picking of the top structure, the depositional model, cut-off selection and permeability prediction models. Different possible interpretations of the reservoir, for instance, would require different conceptual models for the facies distributions and different modelling approaches.

* Corresponding author. Tel.: +44 131 451 8298.
E-mail addresses: dan.arnold@pet.hw.ac.uk (D. Arnold), vasily.demyanov@pet.hw.ac.uk (V. Demyanov).

0098-3004/$ - see front matter © 2012 Elsevier Ltd. All rights reserved.
http://dx.doi.org/10.1016/j.cageo.2012.09.011

The modelling choices made are related to how complex a model of the real system we choose to create. The complexity issue is well known as the bias-variance trade-off, where over-complex models result in inflation of the uncertainty (variance) while over-simplistic ones lead to bias. A meta-model that subsumes specific candidate models would lead to considerable inflation of variance; therefore there is a need to find ways of mixing models that are consistent with what we believe is plausible (Christie et al., 2011). On the other hand, if the models in the ensemble come from the same code (are related) the variance may well be understated (P. Challenor et al. in Christie et al., 2011). Therefore, it is important to include bias terms in the model corresponding to the effects not accounted for by the model.

When it comes to choosing from different modelling algorithms and multiple interpretations of data and knowledge, uncertainty quantification inevitably becomes subject to that choice, which is consistent with the subjective nature of uncertainty (Caers, 2011). From a probabilistic point of view, uncertainty can be seen as a "correct" posterior PDF, which is unknown and needs to be sampled (Oliver et al., 2008).

Furthermore, uncertainty can be seen as a relative term, subject to the measurement scale and the precision with which we define the things we want to measure or estimate. Uncertainty analysis based on evaluating the covariance matrix of the posterior probability is difficult for complex problems where the posterior pdf is itself complex. A Bayesian approach implies there is one "true" posterior to characterise the uncertainty, defined as the product of the prior probability and the likelihood. The determination of the unique posterior pdf of a multi-scenario model can become quite difficult. A Bayesian approach joins the probabilities by means of "strong" conjunction; relaxing the way this conjunction is performed would possibly provide a wider evaluation of uncertainty. There are non-Bayesian approaches to combining prior and evidence, e.g. by means of conjunction (Tarantola, 2005). There have been attempts to use the Dempster–Shafer generalisation of the Bayesian theory of subjective probability to measure uncertainty in reservoir geological properties (Kim, 2002). Another approach was proposed in the perception-based theory of probabilistic reasoning, which uses fuzzy logic to describe perceptions and subjective probability (Zadeh, 2002).

The issue of multiple scenarios is compounded when we move to post-production fields and want to add production data to the uncertainty quantification process. Here the model is history matched to the measured well production data, the quality of fit defines the likelihood of the model, and this is typically now done using a range of automated approaches (Floris et al., 2001). Uncertainty in automated history matching approaches is usually described with an inference from an ensemble of multiple models, which come from a prior range of the model parameters or state vectors updated through each iteration. There are several difficulties associated with this. The uncertainty in the model definitions, components and complexity is not easy to take into account. For instance, if there are uncertainties about the structure it can become difficult to re-grid the model to new structural configurations automatically.

Distinguishing between different geological interpretations also becomes an issue. Data assimilation approaches, such as Ensemble Kalman filters (Evensen et al., 2007), can potentially handle interpretational uncertainty within the range of the initial ensemble of realisations; however, the consistency of the converged ensemble with geologically realistic distributions may be an issue.

Particularly tricky aspects of the uncertainty to account for in history matching include the top structure, layering, faults (numbers, dimensions, displacements and locations) and other structural features. As a result we tend to keep these features fixed, preferring instead to change elements that are easy to parameterise in the simulation model. This means that significant uncertainties in the gross rock volume (top structure), net-to-gross (layering) and partitioning (faults) of the reservoir are typically not accounted for by the uncertainty quantification process. These features are developed principally from seismic data and well correlations, so they are based on the interpretation of geologists/petrophysicists and (a) are subject to uncertainties in depth conversion and (b) have uncertainties due to the interpretation process, where the error cannot easily be quantified as we do not typically produce many independent seismic interpretations.

A number of studies have considered the uncertainty in seismic interpretation (e.g. Bond et al., 2007, Rankey and Mitchell, 2003) and show a potential for significant variation in the interpretation of the same data due to the background of the interpreter (Bond et al., 2007) and overconfidence in the quality of their interpretation (Rankey and Mitchell, 2003). In Rankey and Mitchell (2003), 6 interpreters were given the same seismic data from a carbonate reservoir. They all believed that because the seismic data was apparently easy to interpret, their subsequent interpretations were very accurate, with only a small error in the location of the reservoir top. In fact, while most of the reservoir top was accurately identified by all the geologists studied, the edges of the reservoir were less well defined. A comparison of the 6 interpretations showed that the portions of the reservoir that were less well defined added considerable variation to the volumetric estimates, even though the other parts of the field were the same for all interpretations. There is therefore a need to develop techniques that account for both the uncertainties in measured reservoir inputs that are easy to handle, such as relative permeability uncertainties, and the interpretational uncertainties in the reservoir.

A further complication in accounting for the uncertainty is the grid resolution of the model. Most previous history matching case studies provide a single grid resolution model to match to real field production data, or to production data from a truth case model of the same or a higher resolution. The resolution of the grid and the accuracy of the gridding approach are important in history matching, as the total run time available for the process is a limiting factor. In most cases we can either run many low resolution models that have a higher solution error in their predictions, or fewer high resolution models with less error that potentially do not identify the best history matched models, due to insufficient iterations of the model.

Solution errors are the difference between the exact mathematical solution and the numerical algorithm solution used to represent it in the simulation model. Any assumptions and simplifications from the mathematical model, errors in rounding numbers by the simulator, or numerical errors due to the grid resolution all contribute to the solution errors. A solution error model was implemented to account for upscaling errors in O'Sullivan and Christie (2005) and Christie et al. (2006).

There are also uncertainties associated with the missing physics in the model, which therefore may have an impact on the observations. This so-called model inadequacy, once accounted for with a random statistical process, improved quantification of the overall uncertainty based on the more consistent inference from the generated model response.

In this paper we describe a new case study, called the Watt Field (after James Watt, the founder of Heriot-Watt University), which we have developed to consider both the impact of how you choose to estimate the uncertainty and of what model you choose to do this with. The benchmark study addresses mostly the issues associated with an early stage of a brown field development. Previous case studies have concentrated on the choice of uncertainty quantification method (PUNQ-S3 (Floris et al., 2001)) or facies/petrophysics modelling (such as the Brugge case study (Peters et al., 2010)).

For the case of models like PUNQ-S3, many parameterisations were developed by different authors (Floris et al., 2001, Demyanov et al., 2004, Hajizadeh et al., 2010) to account for the model uncertainty within a given geological scenario. The Stanford VI case study provided a good example of a realistic synthetic full scale model with different depositional environments and associated seismic (Castro et al., 2005). The Brugge case study gave the modeller choices from 104 prior realisations of the geology, along with high resolution log data and a choice of petrophysical correlations. SPE10 (Christie and Blunt, 2001) is a widely used model for upscaling studies which deals with the problems of grid resolution and solution errors. As yet no study has combined these choices with other uncertainties that need to be accounted for, such as the shape of the top structure and the fault network uncertainty.

This paper proposes a new case study that includes the kinds of interpretational uncertainties present in a real reservoir as well as other uncertainties. Specifically, we have produced a number of realisations of the model based on different top structure interpretations, fault models, facies modelling/depositional environment choices and relative permeability/capillary pressure uncertainties, for a number of grid resolution choices. Our intention is to encourage researchers to look into accounting for uncertainties in their interpretations of the reservoir and to develop techniques that account for these uncertainties.

2. Uncertainty quantification of producing fields

Extensive work has been carried out on developing techniques for uncertainty quantification in petroleum reservoirs. Most common approaches develop a Bayesian methodology, as it allows the engineer to update an initial estimate of uncertainty to a new posterior estimate using production data. Bayes' theorem is described by the relationship between the posterior and prior

p(m|O) = \frac{p(O|m)\, p(m)}{\int_m p(O|m)\, p(m)\, dm}    (1)

where p(m|O) is the posterior probability of the model, an updated estimate of the reservoir uncertainty from the initial prior probability p(m) and the likelihood p(O|m).

By calculating the likelihood we can update the initial prior probability to the improved posterior. The steps in this process require the calculation of the likelihood from the mismatch between the production data and the simulation model responses, where the likelihood increases with a closer match between the data and the simulation predictions.

The following areas are key components in the Bayesian inference process for integrating production data into uncertainty quantification:

1. Define an objective function that describes the closeness of fit between the history data and the simulation model response.
2. Automate the history matching process to find good matches to the data. This is done using an optimisation algorithm to find good results quickly, or through assimilation methods by perturbing the model state vector (Evensen et al., 2007) (see Section 2.2).
3. The model is history matched using a set of parameters that encompass the uncertainty. The definition of these uncertain parameters is called the parameterisation. The choice of parameterisation is important in making sure the model is able to calibrate to the data adequately and produce good forecasts.
4. With many matched models found by history matching, we can calculate the likelihoods of the model parameter space and infer the posterior probability using Bayesian methodologies. There are many routes to calculating the Bayesian inference (see Section 2.4).
5. We may need to repeat steps 1–3 (or 4 if we are doing Bayesian inference) for multiple scenarios of the reservoir model and potentially use all scenarios in estimating uncertainty.
6. We balance run time with grid resolution to minimise solution errors while producing enough simulations to describe the uncertainty. Therefore, the grid resolution must be chosen appropriately and may require testing.

2.1. Objective function

An objective function is a mathematical expression that measures how close a problem has been brought towards an optimal value. In the case of history matching, the objective function is a measure of the difference between the observed and simulated results, and we aim to minimise this value. The most commonly used objective function for history matching is the least squares norm, which calculates a measure of the discrepancy between the simulated and historical values as a numerical value called the "misfit", M. The least squares misfit formula is defined as

M = \sum_{i=1}^{N} \frac{(q_{obs}(t_i) - q_{sim}(t_i))^2}{2\sigma_i^2}    (2)

where q_{obs} is the observed or historical rate and q_{sim} is the simulated result at time t_i, N is the number of data points and \sigma_i^2 represents the measurement error in the observed data, assuming the errors are independent and identically distributed.

The use of the least squares misfit is important when carrying out Bayesian inference, as by assuming Gaussian errors in our production data we can define the likelihood function from the misfit, where

P(O|m) \propto e^{-M}    (3)

Without the Gaussian assumption a rigorous likelihood definition may be difficult to produce; however, for many practical problems an objective function different from least squares may be needed. In the case of correlated errors, \sigma_i^2 is replaced by the full covariance matrix.

The likelihood function describes how likely it is that the model response explains the historical data, and thus how likely it is that the model represents the true reservoir configuration. A well-matched model will therefore result in a low misfit value and a correspondingly high likelihood value.

2.2. Optimisation algorithms

Optimisation methods were classified by Christie et al. (2005) as either calibration methods or data assimilation methods. Calibration methods are an automated version of the traditional history matching method, whereby a complete run of the simulation is carried out and the match quality to the production data is used to move the model towards a better solution. Data assimilation methods carry out a similar function, but data is calibrated for each time step and the optimisation step adjusts the model before the next step is run. The main calibration methods are gradient and stochastic search algorithms, while the main data assimilation method used for history matching is the Ensemble Kalman Filter (EnKF).

Gradient based methods require the calculation of the derivative of the objective function with respect to the model parameters, typically as either gradients or sensitivity coefficients.
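To make the misfit and likelihood of Eqs. (2) and (3) concrete, the sketch below evaluates them for a hypothetical one-parameter forward model and scans a prior range for the best match, in the spirit of the calibration methods described above. The forward model, parameter range and noise level are all invented for illustration; this is a toy sketch, not the workflow prescribed by the benchmark.

```python
import numpy as np

def misfit(q_obs, q_sim, sigma):
    """Least-squares misfit M of Eq. (2), assuming independent errors."""
    return np.sum((q_obs - q_sim) ** 2 / (2.0 * sigma ** 2))

def likelihood(m):
    """Likelihood of Eq. (3): P(O|m) proportional to exp(-M)."""
    return np.exp(-m)

def simulate(decay, t):
    """Hypothetical forward model: production rate declining with time."""
    return 100.0 * np.exp(-decay * t)

t = np.arange(1.0, 11.0)          # observation times
q_obs = simulate(0.30, t)         # synthetic "history" (true decay = 0.30)
sigma = np.full_like(t, 2.0)      # assumed measurement error

# Scan a prior range of the uncertain parameter and keep the best match,
# a crude stand-in for the stochastic samplers discussed in Section 2.2.
candidates = np.linspace(0.1, 0.5, 41)
misfits = np.array([misfit(q_obs, simulate(d, t), sigma) for d in candidates])
best = candidates[np.argmin(misfits)]
print(best)
```

Because the likelihood decays exponentially with the misfit, a well-matched model dominates any posterior weighting built from such an ensemble.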

Methods include Gauss–Newton, Levenberg–Marquardt and steepest descent, and these are very efficient at finding locally optimal solutions. Stochastic optimisation methods such as Genetic Algorithms (GA) (Erbas and Christie, 2007), the Neighbourhood Approximation algorithm (NA) (Sambridge, 1999a), Particle Swarm Optimisation (PSO) (Mohamed et al., 2006) and Differential Evolution (DE) (Hajizadeh et al., 2009) are global estimation methods that are designed to find many local minima, assuming a global minimum cannot be differentiated—probably a good assumption for reservoir history matching, where there is a limit on the number of simulations we can run. Mariethoz et al. (2010) pointed out that importance sampling can lead to underestimation of uncertainty due to sampling bias when the posterior is not sampled uniformly but preferentially in the regions of better fit. In MCMC sampling this bias is corrected by weighting the acceptance probability with a ratio between the entire prior and its subset. The stochastic evolutionary algorithms referred to in this study differ from MCMC adaptive sampling in that, while sampling from the prior, the acceptance probability is set to 1. Therefore, a further inference stage is required in this case to evaluate the posterior probability with an MCMC method (i.e. Gibbs sampling in NAB, see Section 2.4). The bias correction in NAB is ensured by normalising to the volume around each sample and multiplying by the prior to approximate the posterior.

Ensemble based optimisation methods (Evensen et al., 2007), the most common of which is Ensemble Kalman Filtering (EnKF), differ from stochastic sampling and gradient based methods because the simulation models are run forward one timestep at a time, whereupon EnKF updates the model state vectors before moving on to the next timestep. EnKF has recently been used in many applications to fairly complex reservoir models, including structural uncertainty. For instance, Seiler et al. (2010) demonstrated an uncertainty study of fault geometry; however, it is limited to a single structural scenario based on the given number of faults. This problem can be tackled within the EnKF approach using multi-point statistics simulations of the fault network, as used by Sun (2011).

All techniques have shown an ability to locate good history matched models quickly (e.g. Hajizadeh et al., 2009, Mohamed et al., 2009); however, the choice of algorithm has an impact on the resolved area of parameter space, and therefore on the estimation of the posterior probability, as shown by Erbas and Christie (2006) and the PUNQ project (Floris et al., 2001).

2.3. Parameterisation techniques

The choice of parameterisation defines the way natural dependencies in the model are described. The accuracy, complexity and realism of the model depend on the way it is parameterised. Furthermore, the efficiency of calibrating the model to data and its predictive power also depend on the number and nature of the parameters. Some parameterisations may be very efficient in getting a good calibration match but do not actually address the key uncertainties. For instance, we might ignore geological uncertainties in favour of local factors, like skin, to tune the model around the wells.

The difficulty arises when a model includes parameters that cannot be directly inferred from nature, or this inference is non-unique and bears large uncertainty. For instance, connectivity has a major impact on flow, but full field reservoir models are usually defined by correlation models and proportions of geological bodies. More recent training image based models give good descriptions of connectivity, although they are not controlled directly by connectivity measures.

Engineering approaches often provide consistent control over the volume and connectivity of the reservoir compartments, but they typically have limited capability vs. full field geological models when making infill drilling decisions.

A common approach for history matching is to parameterise the relative permeability curves (Okano et al., 2005). These are frequently uncertain due to the low number of available measurements and geological heterogeneity in the system. The relative permeability uncertainty increases further in three phase flow cases.

Geological parameterisation describes techniques that include parameters relating to geological features, such as the dimensions of facies bodies, orientations and relationships to petrophysical properties. Arnold (2009) and more recent work by Rojas et al. (2011) showed the impact of realistic geological priors on posterior inference, reducing the volume of realistic parameter space by avoiding unrealistic models and improving the validity of the posterior inference.

2.4. Bayesian inference

The Bayesian inference step evaluates the posterior probability from Eq. (3) from an ensemble of generated models. Bayesian inference is most simply applied by sampling from the prior distribution using a Monte Carlo technique such as Markov Chain Monte Carlo (MCMC) or rejection sampling (RS) from the posterior (Liu et al., 2001). These approaches sample randomly across parameter space, so many iterations may be required to find good local minima.

Various types of MCMC algorithm have been developed and applied to inference problems in reservoir predictions, such as Population MCMC (Mohamed et al., 2010a) and Hamiltonian Monte Carlo (Mohamed et al., 2010b). NA-Bayes (NAB) is another algorithm used to infer uncertainty from the parameter space sampled in assisted history matching. It uses Gibbs sampling to estimate the posterior uncertainty based on the sampling from any optimisation algorithm (Sambridge, 1999b).

EnKF gradually optimises a model after each assimilation step/datum and as a result provides an estimate of the posterior probability, based on the assumptions that the model is linear, the priors are Gaussian distributed and the ensemble size is infinite.

2.5. Handling multiple scenarios

When presented with multiple geological scenarios, a better option than picking a single scenario would be either to parameterise the model so that we can sample across a parameter space between the scenarios identified, or to create separate parameterisations for each scenario, sample from each, and then recombine the results in a Bayesian framework.

Scenarios are constructed through a hierarchy of choices based on uncertainties in the model inputs at each stage of the model build. Choices earlier in the model build, e.g. the shape of the top structure or the locations of faults, will propagate through the entire modelling workflow. Any combination of these choices can be produced to create a possible realisation of the reservoir. Prior probability estimates of each model to be used in Bayesian uncertainty quantification may well be qualitative rather than quantitative where we cannot describe the prior based on some kind of parameter space. Authors such as Wood and Curtis (2004) have formalised this difference between quantitative and qualitative uncertainty in terms of Bayes theory. The key issues therefore are to (a) account for the (qualitative) probability of each individual interpretation of the model in a systematic way and (b) handle potentially different numbers of parameters for different parameterisations of the model.
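As an illustration of recombining per-scenario results in a Bayesian framework, the sketch below weights each scenario's evidence (approximated from the misfits, Eq. (2), of its matched models) by a qualitative scenario prior and normalises to posterior scenario probabilities. The scenario names, misfit values and prior weights are all invented for illustration and are not part of the benchmark dataset.

```python
import numpy as np

# Hypothetical misfit values (Eq. (2)) for an ensemble of matched models
# under three structural interpretations; numbers are invented.
scenario_misfits = {
    "3-fault model": np.array([12.0, 10.5, 11.2]),
    "5-fault model": np.array([9.8, 9.5, 10.1]),
    "7-fault model": np.array([14.0, 13.2, 15.5]),
}
# Qualitative prior beliefs in each interpretation (sum to 1).
scenario_priors = {"3-fault model": 0.5, "5-fault model": 0.3, "7-fault model": 0.2}

# Scenario evidence approximated by the mean likelihood exp(-M) of its
# matched models; posterior scenario weights then follow from Bayes' rule.
evidence = {k: np.mean(np.exp(-m)) for k, m in scenario_misfits.items()}
unnorm = {k: scenario_priors[k] * evidence[k] for k in evidence}
total = sum(unnorm.values())
posterior = {k: v / total for k, v in unnorm.items()}

for k, p in posterior.items():
    print(f"{k}: {p:.3f}")
```

Note how the better-matched 5-fault interpretation overtakes the 3-fault interpretation despite its lower prior weight: the exponential likelihood dominates the qualitative prior once production data are included.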

The option of sampling in a parameter space developed between scenarios has been tackled previously by Suzuki et al. (2008), where a gradual deformation technique was applied to effectively interpolate between 6 possible interpretations of the structural model to create new realisations. As this was a single parameterisation of the reservoir, adjusting only the gradual deformation parameters to move between the interpretations, the resulting ensemble of models could be used in Bayesian inference techniques, as the number of dimensions of parameter space does not change. A recently proposed metric space approach (Scheidt and Caers, 2008) provides a way to both explore the relationships and evaluate the similarity between realisations through their separation in the metric space.

An alternative approach is Bayesian model averaging (Pickup et al., 2008), where the posterior probability for each of a set of possible models can be generated using standard assisted history matching and Bayesian inversion, and the posteriors are then combined to calculate an average posterior over all possible posteriors. In Pickup et al. (2008) the posterior inference was influenced by the total number of parameters, where smaller numbers of parameters were considered to be more likely due to less chance of overfitting (the Occam principle).

Transdimensional sampling such as Reversible Jump Monte Carlo (RJMC) (Sambridge et al., 2006) can be applied to different parameterisations of the model simultaneously for history matching. The algorithm in essence jumps between parameterisations to sample from all the prior distributions and estimate the posterior for all possible models in one step. As this approach requires RJMC to be applied to the initial history matching step rather than as a post processor, it will require more function evaluations than algorithms such as PSO or DE to find local minima.

Overall it is clear that, when presented with multiple possible scenarios of the reservoir model, we must account for the uncertainties arising from the differences between these models. This benchmark was created to test new techniques that address this issue.

2.6. Gridding and numerical solution errors

The choice of grid resolution presents the modeller with a dilemma in that, by reducing the dimensions of the grid cells, we can reduce the numerical solution errors in the model response; however, the simulation run time will increase depending on the available computer power. Realistically, for the purposes of history matching in the time frames available to engineers, models must be limited in their complexity.

Christie et al. (2006) applied solution error models to account for solution errors in low resolution models when history matching; however, in the absence of such techniques the modeller must make a decision on the appropriate model to use given the time available.

3. Field description

3.1. Overview

The Watt field is a synthetic study based on a mixture of real field and synthetic data to describe a realistic field example seen through appraisal into the early development life stage. The top structure and wireline data are based on real field data; however, the fluid properties, relative permeability and capillary data are synthetic. The field development plan is also synthetic, resulting in an artificial production response. The model spans a 12.5 km by 2.5 km surface area, elongated in the East/West direction, with a total modelled thickness of around 190 m, much of which is below the oil water contact. The field is located around 1555 m

The Water/Oil contact (WOC) is identified from wireline and RFT data at a constant 1623.5 m subsurface.

The interpreted depositional environment for the field is a braided river system, where a number of different possible outcrop analogues and modelling techniques could be applied. Common facies types in this kind of environment are fluvial channel sands, overbank fine sands and background shales.

The field was initially appraised through a set of 6 wells, Wells A–F, then subsequently developed through a set of 16 horizontal production wells located across the central part of the reservoir, and 5 horizontal and 2 vertical (one of which is a recompletion of Well B) injection wells around the edges. Horizontal multi-lateral wells were selected in the original development plan to maximise the distance from the WOC, due to the low relief of the field, and to increase oil production rates.

An interpretation of the top structure and major faults was picked from the seismic in the time domain and then depth converted. Faults are identified by displacement in the seismic data and curvature observed in the top structure. The depth conversion develops a single depth surface realisation that is then tied to the wells for a number of possible well horizon picks identified from the wireline data provided as part of this case study.

The major faults were identified where possible from the available seismic data, though there is ambiguity as to the number and extent of the faults in the reservoir. The general trend of the faults seen in the seismic data is East/West. Different structural realisations of the reservoir were possible due to uncertainty in the number and location of faults, specifically sub-seismic faults. Three possible top structures, along with the major faults and the OWC, are shown in Fig. 1 in (a) plan view and (b) cross section. Fault seal is represented simply as transmissibility multipliers that are constant along the fault surface. Figs. 1 and 2 show significant variations in the extent of the OWC and the thickness of the reservoir zone for each of the 3 produced realisations.

The depositional environment of the reservoir is identified as probably being a braided fluvial system. Data is provided in the appendix to this benchmark study as to the possible structures present in this system and the probable inputs into the model for facies modelling. Common facies types in this kind of environment are fluvial channel sands, overbank fine sands and background shales.

Wireline logs are collected from across the 6 appraisal wells only, with no data coming from the horizontal production wells. We provide porosity, permeability and facies logs based on core plug data from Wells A and B, and porosity predictions from neutron density logs from wireline data in the other wells.

Electrofacies identification could be made from any of the log data available. In this case we have developed a set of facies logs based on the Relative Porosity Difference (RPD), calculated from the neutron and density porosity estimates from all 6 appraisal wells. RPD is expressed by

RPD = \frac{Por_{Neu} + 0.05 - Por_{Den}}{Por_{Den}}    (4)

where Por_{Neu} is the neutron porosity and Por_{Den} is the density porosity. Cutoffs of 0.6, 0.7 and 0.8 were used to create 3 possible facies logs.

Permeability/porosity data from the cored wells A and C are used to predict permeability in the other wells and outside the cored sections. Fig. 3 shows a cross plot for all cored data with a global poro-perm correlation for Well A. As can be seen, multiple interpretations of the poro-perm correlations are possible, with
subsurface with an initial reservoir pressure of 2500 psi as the prediction curve given here (red line) being used later in this
measured from repeat formation testing (RFT) and well test data. paper as a permeability predictor.
D. Arnold et al. / Computers & Geosciences 50 (2013) 4–15 9

Fig. 1. Seismic top structure and major identified faults. Seismic lines, top structure and fault data is provided in the data pack.
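Fault seal in these models is represented by a single constant transmissibility multiplier per fault (Section 3.1). A minimal sketch of that idea, with hypothetical face transmissibilities and fault-face indices:

```python
# Sketch of fault seal as one constant transmissibility multiplier per fault:
# every cell-face transmissibility crossing the fault is scaled by the same
# factor. The transmissibility values and face indices are illustrative.

def apply_fault_multiplier(transmissibilities, fault_faces, multiplier):
    """Scale the transmissibility of every cell face crossing the fault."""
    return [
        t * multiplier if i in fault_faces else t
        for i, t in enumerate(transmissibilities)
    ]

trans = [10.0, 8.0, 12.0, 9.0]   # illustrative face transmissibilities
# Faces 1 and 2 cross the fault; a multiplier of 0.1 represents partial seal.
sealed = apply_fault_multiplier(trans, fault_faces={1, 2}, multiplier=0.1)
```

A multiplier of 1 leaves the fault open and a multiplier of 0 seals it completely, so the single scalar spans the full seal uncertainty.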

Fig. 2. Pay zone thickness maps for the 3 top structure scenarios.
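The rel perm uncertainty described below is expressed through a Corey model with exponents varying over roughly 1.5 to 6. A hedged sketch of such Corey curves, with illustrative endpoint saturations rather than the supplied SCAL values:

```python
# Sketch of a Corey oil-water relative permeability model. Endpoint
# saturations and maximum kr values are illustrative assumptions.

def corey_krw(sw, swc=0.2, sor=0.2, cw=2.0, krw_max=1.0):
    """Water relative permeability from a Corey power law."""
    s = (sw - swc) / (1.0 - swc - sor)        # normalised water saturation
    s = min(max(s, 0.0), 1.0)
    return krw_max * s ** cw

def corey_kro(sw, swc=0.2, sor=0.2, co=2.0, kro_max=1.0):
    """Oil relative permeability from a Corey power law."""
    s = (1.0 - sw - sor) / (1.0 - swc - sor)  # normalised oil saturation
    s = min(max(s, 0.0), 1.0)
    return kro_max * s ** co

# A higher water exponent suppresses krw at intermediate saturations,
# which is the kind of variation spanned by the 1.5-6 exponent range.
kr_low = corey_krw(0.5, cw=1.5)
kr_high = corey_krw(0.5, cw=6.0)
```

Varying Cw, Co and Swc over their prior ranges reproduces the family of curves that brackets the measured samples shown in Fig. 4.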

PVT and relative permeability data are supplied for the reservoir. PVT data were sampled across the 6 wells and show no variation in properties, therefore this is kept as a fixed known. SCAL data were sampled from a number of core plugs from wells A, D and F, and show variation between the braid bar and overbank sand facies and within those facies types. Ranges of uncertainty in the rel perm curves are shown in Fig. 4 and can be described by a Corey model with variations in the exponents of 1.5–6. There is no significant capillary pressure in this reservoir, therefore no J-functions are provided for the model.
A corner point simulation/modelling fine grid was developed for a ''truth case'' model to provide a production history. A truth case realisation of the reservoir facies, porosity and permeability was generated with particular assumptions about their distribution and spatial correlation. As the modelling choice, based on the provided data, is itself an uncertainty in the model, the particulars of the truth case modelling approach and settings are not expanded on in this paper. Seismic was not used for conditioning the facies model due to its poor quality and confidentiality.
Production history for the field is generated from the truth case model based on a synthetic development scenario. The field development plan consists of a set of horizontal production and injection wells drilled and completed over a 4 year period, as well as a recompletion of well B as an injector. A total production history of 20 years was produced, but only 8 years of BHP, oil, gas and water rates for each of the production wells were made available for the history period. The field was initially controlled by a maximum fluid rate to account for topsides handling constraints, switching to a BHP constraint once the BHP falls below the limiting pressure of 1000 psi. A random Gaussian noise of 15% was added to all the production data to simulate the impact of measurement errors in the data.

3.2. Model uncertainties and modelling choices

The purpose of this paper is to create a case study that includes uncertainties in the interpretation of the reservoir. When presented with these kinds of uncertainties in real field studies, geologists and engineers typically make deterministic choices in creating models of the reservoir. A single model is often selected for simulation and subsequent history matching. Therefore, to create the same situation, we have generated a set of models based on several choices for each of the key interpretational elements, the modelling approach choices and the grid resolution. Users can choose a combination of predefined descriptions of the reservoir structure, grid resolution and saturation profile, or they can interpret their own from the data.
A set of key uncertainties were identified, which are:

(1) Top structure seismic interpretation
(2) Fault network definition
(3) Grid resolution
(4) Cutoffs for electrofacies identification

Table 1
Table of uncertain choices to make in choosing a scenario for this model. The combination of grid resolution, top structure, fault model and facies cut-off uncertainty leads to 81 different possible combinations of the options.

Model property                Description                                   File name
Grid                          100 m by 100 m by 5 m                         G-1
                              100 m by 100 m by 10 m                        G-2
                              200 m by 200 m by 5 m                         G-3
Top structure                 1                                             TS-1
                              2                                             TS-2
                              3                                             TS-3
Fault model                   1                                             FM-1
                              2                                             FM-2
                              3                                             FM-3
Facies model (cutoffs)        0.6                                           CO-1
                              0.7                                           CO-2
                              0.8                                           CO-3
Modelling approach            Data is provided for different possible field
                              depositional models based on different outcrop
                              analogues. Data from these sources could be
                              used in the construction of the geological model.
Relative permeability data    Coarse sand relperms                          RP_0_1, RP_0_2, RP_0_3
                              Fine sand relperms                            RP_1_1, RP_1_2, RP_1_3
Simulation model              100 m by 100 m by 5 m                         100 x 100 x 5
                              100 m by 100 m by 10 m                        100 x 100 x 10
                              200 m by 200 m by 5 m                         200 x 200 x 5

Fig. 3. Poro perm cross plot for Facies 0 in Well A and prediction model used in case studies Case 1 and Case 2. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)
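The discrete options in Table 1 combine multiplicatively, and a short sketch enumerating them confirms the 81 scenario count (the file-name tags follow the table's G-/TS-/FM-/CO- convention; the joined scenario names are a hypothetical labelling, not file names from the data pack):

```python
# Enumerate every combination of the four discrete uncertainties in Table 1:
# 3 grids x 3 top structures x 3 fault models x 3 facies cutoffs = 81.
from itertools import product

grids = ["G-1", "G-2", "G-3"]
top_structures = ["TS-1", "TS-2", "TS-3"]
fault_models = ["FM-1", "FM-2", "FM-3"]
facies_cutoffs = ["CO-1", "CO-2", "CO-3"]

scenarios = [
    "_".join(combo)
    for combo in product(grids, top_structures, fault_models, facies_cutoffs)
]
# 81 predefined scenario grids in total, one per combination
```

Any history matching study on this benchmark can iterate over this list (or a subset of it) to expose the interpretational uncertainty alongside the parameter uncertainty.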

Fig. 4. Relative permeability data gathered from wells A and C for (a) Fine sand (facies 1) and (b) Coarse sand (facies 0). A total of 6 samples are provided, 3 for each facies
type. No data is provided for the shale facies.

(5) Facies modelling approach
(6) Porosity/permeability prediction
(7) Relative permeability

For each of the first 4 uncertainties we have developed a range of possible descriptions and provide data so that users can choose an appropriate model from a set of possible models. We have created a set of 81 simulation grids based on every combination of our predefined interpretations. The truth case lies in the model space of these choices, though it is not necessarily any one of these models for all the uncertainties. In addition, we provide the raw surface and fault data and wireline logs so that other simulation models and facies interpretations can be developed.
For the last 3 uncertainties we provide the necessary data (rel perm curves, poro perm data and descriptions of possible depositional models) for the researcher to develop a range of possible models based on the 81 grids provided.
The key uncertainties are expanded upon in a separate Appendix document provided in the download pack for this field study. All scenarios are listed in Table 1 for each uncertainty along with the relevant file name of the simulation input data.

4. Test of case study

From the 81 pre-defined scenarios created for this case study (see the Appendix/download for all possible scenarios) we chose two at random to attempt history matching. We history matched a set of models with different parameterisations using the RAVEN assisted history matching software (www.Epistemy.com). A single set of history match results was produced for each case study using the in-built Particle Swarm Optimisation (PSO). NA-Bayes was applied to the ensemble of models to create estimates of the posterior probability, creating P10, P50 and P90 estimates of forecast production. A comparison of the P10–P90 estimates is given at the end of the paper to show any differences due to the choice of scenario, parameterisation and grid.

4.1. Case 1: Layered model parameterisation case study

The first attempt at history matching was done using a layered model with a simple parameterisation of porosity, permeability and relative permeability. For this attempt the 200_200_5 grid was used with structural model 2. The 40 layers of the model were broken into 8 layers of constant properties, the top 6 layers being 4 cells thick and the lower 2 layers being 8 cells thick. Each layer was assigned a constant porosity parameter, and horizontal permeability was predicted from an estimated porosity-permeability function taken from Well A and Well C data. Initially a constant function was applied based on the main sand facies, as this is the predominant reservoir facies. The function was predicted as

Kh = 10^(16.0 φ − 1.0)    (5)

where φ is porosity. Vertical permeability was parameterised by 4 multiplier parameters, 1 for each of 4 layers 10 cells thick. Finally, we parameterised the relative permeability using an oil-water Corey function for the entire reservoir, parameterising the Corey exponents Cw and Co and the critical water saturation Swc. A total of 15 parameters were chosen for this parameterisation and are given in Table 2.

Table 2
Table of parameters and prior ranges.

Parameter                         Prior range
Porosity (Layers 1–4)             0.05–0.3
Porosity (Layers 5–8)             0.05–0.3
Porosity (Layers 9–12)            0.05–0.3
Porosity (Layers 13–16)           0.05–0.3
Porosity (Layers 17–20)           0.05–0.3
Porosity (Layers 21–24)           0.05–0.3
Porosity (Layers 25–32)           0.05–0.3
Porosity (Layers 33–40)           0.05–0.3
PermZ (Layers 1–10)               0.01–1
PermZ (Layers 11–20)              0.01–1
PermZ (Layers 21–30)              0.01–1
PermZ (Layers 31–40)              0.01–1
Corey water component (Cw)        1–6
Corey oil component (Co)          1–6
Critical water saturation (Swc)   0.1–0.5

The model was history matched using RAVEN's Flexi-PSO algorithm run for a total of 500 iterations to WBHP, WWPR and WOPR for all wells. The results are shown in Fig. 5 for a subset of the history matched wells. The sigma values for each well are based on the 15% noise value for each data type.
The overall match quality for this parameterisation is variable between wells. As can be seen in Fig. 5, the well matches for some wells (Well 7) are good for water and oil rates; however, other wells have predicted either earlier (Well 5) or later (Well 3) water breakthrough. Field rates are approximately correct for all parameters as the over- and under-predictions of water breakthrough times effectively cancel out.

Fig. 5. History matching results for Case 1—layered parameterization of the Watt reservoir.

4.2. Case 2: Zonal parameterisation case study

Based on the results from Case 1 we identified that the well behaviour was different for each well; the layered parameterisation with equal properties along the layer therefore provided good matches in some wells and poor matches in others. A zonal parameterisation approach was developed to break the reservoir into regions around each well.
In this case study we divided the models into 9 different sectors (see Fig. 6) around each group of wells to allow more freedom to tune individual wells. Each sector is parameterised separately with an individual porosity, Kz, critical water saturation and Corey oil and water components. The prior ranges applied to all sectors are given in Table 3, with a total of 45 parameters to describe the model uncertainty.

Fig. 6. Zonal divisions around each group of production wells in the model.

Table 3
Prior ranges for each of the 9 sectors/zones of the model.

Parameter                         Prior range
Porosity                          0.05–0.3
PermZ                             0.01–1
Corey water component (Cw)        1–6
Corey oil component (Co)          1–6
Critical water saturation (Swc)   0.1–0.5

As before, horizontal permeability was predicted based on an estimated porosity-permeability function based on the main sand facies. We again used the same Kh prediction function used in Case 1, given in Eq. (5) above. Vertical permeability was predicted by a field-wide constant multiplier. The same 200 m by 200 m by 5 m grid/top structure 1 combination was used again in this case study to allow a more direct comparison with Case 1. The model was history matched using RAVEN's standard PSO algorithm run for a total of 500 iterations to WBHP, WWPR and WOPR for all wells. History matching results are presented in Fig. 7 for Wells 7, 5 and 3a.
Overall the match quality is no better than for Case 1, with adequate matches for some wells (e.g. Well 7) and poor matches in others (Well 3a). Comparing the field rates for Cases 1 and 2 shows that both approaches manage a reasonable field match and both produce similar quality matches (Fig. 8). This suggests that both models are scaled appropriately for the field, though heterogeneities in the real field are not captured in these simple models and may well be the cause of the discrepancies in individual well matches. These results from Cases 1 and 2 suggest the simple models are missing some feature of the reservoir that is important in controlling the water breakthrough in individual wells. We expect this is linked to the complexity of the real field case controlling the influx of water, which cannot be expressed by simple poro perm representations of the reservoir.
Additionally, there is a potential issue around the number of runs carried out. Only a single run of 500 iterations was carried out for each study, therefore we could (a) increase the number of iterations of the models and (b) run several repeated runs with different initial seeds to allow different areas of parameter space to be explored. Alternatively a more exploratory algorithm could be employed, such as the standard PSO algorithm used in Case 2.
A comparison of misfit vs. iteration for Cases 1 and 2 (Fig. 9) shows that the significantly faster convergence of the Flexi-PSO (Case 1) towards good models is countered by a very rapid refinement of the models into a local minimum.
So, over and above the differences in the model grid, parameterisation and prior definition, we see differences in the results arising from the sampling algorithm behaviour. Therefore we may well want to run different algorithms as well as different scenarios of the model for appropriate numbers of iterations.
Overall the performance of our trial parameterisations is limited in predicting individual well performance, and the simple

Fig. 7. Production history matches for Case 2 parameterisation for Wells 7, 5 and 3A.
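The contrast between the aggressive Flexi-PSO run and the more exploratory standard PSO run can be illustrated with a minimal, generic particle swarm on a toy one-dimensional misfit. This is a sketch of the algorithm family only, not RAVEN's implementation, and the inertia and acceleration constants are illustrative:

```python
# Minimal particle swarm optimisation sketch: larger inertia keeps the swarm
# exploratory, smaller inertia drives rapid refinement towards the current
# best model, mirroring the trade-off discussed for the two RAVEN runs.
import random

def pso(misfit, lo, hi, n_particles=20, iters=200, inertia=0.7,
        c_personal=1.5, c_global=1.5, seed=42):
    """Minimise `misfit` over [lo, hi] with a basic particle swarm."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(n_particles)]
    v = [0.0] * n_particles
    pbest = x[:]                        # personal best positions
    pbest_f = [misfit(p) for p in x]    # personal best misfits
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            # velocity update: inertia term plus pulls towards the bests
            v[i] = (inertia * v[i]
                    + c_personal * rng.random() * (pbest[i] - x[i])
                    + c_global * rng.random() * (gbest - x[i]))
            x[i] = min(max(x[i] + v[i], lo), hi)
            f = misfit(x[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = x[i], f
                if f < gbest_f:
                    gbest, gbest_f = x[i], f
    return gbest, gbest_f

# Toy quadratic misfit with its minimum at x = 3
best_x, best_f = pso(lambda x: (x - 3.0) ** 2, lo=-10.0, hi=10.0)
```

Tuning the inertia and acceleration constants shifts the balance between exploring new regions of parameter space and refining known good models, which is the behaviour compared in Fig. 9.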

Fig. 8. Comparison of field rates for Case 1 (bottom) and Case 2 (top).

engineering approach may not lead to good forecasts; however, the model does have the capacity to match field rates reasonably well. Improved parameterisations of the field may well lead to better history matches, possibly suggesting that more geological complexity should be included in the model to better capture behaviour.

5. Conclusions

The Watt Field case study was designed to test the influence on history matching and uncertainty quantification of different parameterisations, structural models, interpretations and model building methods. The scenarios developed were designed to cover key uncertainties in the reservoir model and require the modeller to make decisions around which is the appropriate model selection. There are many choices to make in this benchmark problem and many ways in which the reservoir properties can be modelled spatially; thus we expect future work on this field to include a large number of modelling techniques applied to a range of model scenarios.
Here we have attempted only 2 simple and non-geological parameterisations of the reservoir. For Case 1 we attempted a layer based parameterisation, providing a general description of the porosity and permeability in 8 defined layers. The idea here is that the model has a high NTG and sand properties are fairly constant along the reservoir. The speed of water influx into the wells from below is controlled by vertical permeability values for 4 layers. For Case 2 we zoned the model into 9 sectors, 1 for each well, that were controlled by their own porosity, Kv and relative permeability parameters. Here we saw little improvement in the overall quality of the history match, with similar match results in the wells.
For all cases the reservoir complexity makes simple parameterisations of the reservoir less useful in matching to individual wells, though the field matches were adequate. Individual well rates in both the engineering cases did not provide good matches to all wells, though some wells matched quite well. This is something we observe in real fields, where regional complexity in the field is not captured by the model.
The work so far at matching this model is very early stage. At this point we have only made a couple of attempts at matching the model and not yet carried out any Bayesian inference of the models, this being part of a planned future attempt at this problem. Our intention for this work is to revisit the study to investigate the differences between the different scenario choices in terms of predictions and uncertainty inferences. As there are a large number of possibilities with this benchmark, we have only scratched the surface of the modelling that could be done. The aim of this case study is to promote new techniques for interpolating between scenarios and to deal with regridding issues, to allow more geological variability to be included in the history matching and uncertainty quantification process. We expect a range of techniques could be applied (some referenced in this paper) to account for this range of possibilities. We hope that this will prompt a number of groups to attempt this problem and to create techniques that better cover the qualitative uncertainties in geological models. Techniques such as Bayesian Model Averaging (BMA) would be suitable to deal with this problem and provide a possible route to bringing together a number of match attempts using different benchmark scenarios and different modelling/parameterisation approaches.
The benchmark case study can possibly be used to guide sensitivity studies for the identification of factors influencing development decisions. A sensitivity study is a good way to evaluate the robustness of the model response; however, the plausibility of the models obtained from a sensitivity study must be evaluated. Experimental design and response surface methods conventionally used in sensitivity analysis need to be applied across multiple scenarios. Furthermore, factors with more impact in one scenario may have less impact in another, especially if their impact has a compensating nature. For instance, continuity of a shale barrier between the horizontal sand layers would have a major impact on vertical flow; in the absence or fragmental presence of the shale barrier, the spatial variation of the kv/kh ratio can bring the major impact.
Furthermore, the accurate definition of priors for some of these qualitative geological uncertainties will be a challenge, and while we can assume that all scenarios are equally likely, geological knowledge may well lead us to believe that some models are more likely than others. Assigning a qualitative/judgment based prior probability to the model may require the input from several geologists/engineers to reduce bias in the estimate.

Fig. 9. Model convergence over 500 iterations for 2 different algorithm configurations for Cases 1 (red squares) and 2 (blue diamonds). Case 1 used Flexi-PSO, which produces a better overall match but is more aggressive, as illustrated in the refined view in (b). Case 2 used a more exploratory PSO configuration and is therefore less refining over the same 500 iterations. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

Acknowledgements

We would like to thank the JIP sponsors of the Uncertainty Project for their support in funding the ongoing work in uncertainty in Heriot-Watt, Schlumberger and Epistemy for providing the Petrel, Eclipse and Raven software, and the Heriot-Watt IPE MSc courses and data sponsors of that course for providing some of the data used in this field. We greatly appreciate the review comments from Jef Caers and the other reviewer that helped to improve the paper and tie in ideas.

Appendix A. Supporting information

Supplementary data associated with this article can be found in the online version at http://dx.doi.org/10.1016/j.cageo.2012.09.011.

References

Arnold, D., 2009. Geological Parameterisation of Petroleum Reservoir Models for Improved Uncertainty Quantification. Ph.D. Thesis, Heriot-Watt University, 198 pp.
Bond, C.E., Gibbs, A.D., Shipton, Z.K., Jones, S., 2007. What do you think this is? Conceptual uncertainty in geoscience interpretation. GSA Today 17 (11), 4–10.
Caers, J., 2011. Modeling Uncertainty in the Earth Sciences. Wiley, p. 6.
Castro, S.A., Caers, J., Mukerji, T., 2005. The Stanford VI Reservoir. 18th Annual Report, Stanford Center for Reservoir Forecasting, Stanford University, May 2005.

Challenor, P., Tokmakian, R., 2011. Modelling future climates. In: Christie, M., Cliffe, A., Dawid, P., Senn, S. (Eds.), Simplicity, Complexity and Modelling. Wiley and Sons, Ltd., pp. 69–81 (Chapter 5).
Christie, M.A., Blunt, M.J., 2001. Tenth SPE comparative solution project: a comparison of upscaling techniques. SPE Reservoir Engineering and Evaluation 4, 308–317.
Christie, M.A., Glimm, J., Grove, J.W., Higdon, D.M., Sharp, D.H., Wood-Schultz, M.M., 2005. Error analysis and simulations of complex phenomena. Los Alamos Science 29, 6–25.
Christie, M., Cliffe, A., Dawid, P., Senn, S. (Eds.), 2011. Simplicity, Complexity and Modelling. Wiley, p. 31.
Christie, M., Demyanov, V., Erbas, D., 2006. Uncertainty quantification for porous media flows. Journal of Computational Physics 217 (1), 143–158.
Demyanov, V., Christie, M., Subbey, S., 2004. Neighbourhood algorithm with geostatistical simulations for uncertainty quantification reservoir modelling: PUNQ-S3 case study. Presented at the 9th European Conference on the Mathematics of Oil Recovery (ECMOR IX), Cannes, France, September 2004.
Erbas, D., Christie, M., 2006. How does sampling strategy affect uncertainty estimations? Oil & Gas Science and Technology.
Erbas, D., Christie, M.A., 2007. Effect of sampling strategies on prediction uncertainty estimation. SPE 106229, 8.
Evensen, G., Hove, J., Meisingset, H.C., Reiso, E., Seim, K.S., Espelid, O., 2007. Using the EnKF for assisted history matching of a North Sea reservoir model. SPE 106184-MS, SPE Reservoir Simulation Symposium, 26–28 February 2007, Houston, Texas, USA.
Floris, F.J.T., Bush, M.D., Cuypers, M., Roggero, F., Syversveen, A.R., 2001. Methods for quantifying the uncertainty of production forecasts: a comparative study. Petroleum Geoscience, S87–S96.
Hajizadeh, Y., Christie, M., Demyanov, V., 2009. Application of differential evolution as a new method for automatic history matching. SPE 127251, Kuwait International Petroleum Conference and Exhibition, 14–16 December 2009, Kuwait City, Kuwait.
Hajizadeh, Y., Christie, M., Demyanov, V., 2010. Comparative study of novel population-based optimization algorithms for history matching and uncertainty quantification: PUNQ-S3 revisited. SPE 136861, Abu Dhabi International Petroleum Exhibition and Conference, Abu Dhabi, UAE, 1–4 November 2010.
Kim, Ch.-S., 2002. New uncertainty measures for predicted geological properties from seismic attribute calibration. In: Wong, P., Aminzadeh, F., Nikravesh, M. (Eds.), Soft Computing for Reservoir Characterization and Modeling.
Liu, N., Betancourt, S., Oliver, D.S., 2001. Assessment of uncertainty assessment methods. SPE 71624.
Mariethoz, G., Renard, P., Caers, J., 2010. Bayesian inverse problem and optimization with iterative spatial resampling. Water Resources Research 46, W11530, 17 pp. http://dx.doi.org/10.1029/2010WR009274.
Mohamed, L., Christie, M.A., Demyanov, V., 2006. Comparison of stochastic sampling algorithms for uncertainty quantification. SPE 119139.
Mohamed, L., Christie, M., Demyanov, V., 2009. Comparison of stochastic sampling algorithms for uncertainty quantification. SPE 119139, SPE Reservoir Simulation Symposium, The Woodlands, USA, 2–4 February.
Mohamed, L., Calderhead, B., Filippone, M., Christie, M., Girolami, M., 2010a. Population MCMC methods for history matching and uncertainty quantification, B012. In: Proceedings of the 12th European Conference on the Mathematics of Oil Recovery, 6–9 September, Oxford, UK.
Mohamed, L., Christie, M., Demyanov, V., 2010b. Comparison of stochastic sampling algorithms for uncertainty quantification. SPE Journal 15 (1), 31–38, SPE-119139-PA, March 2010.
Okano, H., Pickup, G., Christie, M., Subbey, S., Sambridge, M., Monfared, H., 2005. Quantification of uncertainty in relative permeability for coarse-scale reservoir simulation. SPE Journal 1, 11, SPE 94140.
Oliver, D.S., Reynolds, A.C., Liu, N., 2008. Inverse Theory for Petroleum Reservoir Characterization and History Matching. Cambridge University Press, p. 270.
O'Sullivan, A., Christie, M., 2005. Error models for reducing history match bias. Computational Geosciences 9, 125–153.
Peters, L., Arts, B., Brouwer, G.K., et al., 2010. Results of the Brugge benchmark study for flooding optimization and history matching. SPE Reservoir Evaluation & Engineering 13 (3), 391–405.
Pickup, G.E., Valjak, M., Christie, M.A., 2008. Model complexity in reservoir simulation. Presented at the 11th European Conference on the Mathematics of Oil Recovery, Bergen, Norway, September 2008.
Rankey, E.C., Mitchell, J.C., 2003. Interpreter's corner: that's why it's called interpretation: impact of horizon uncertainty on seismic attribute analysis. The Leading Edge 22, 820.
Rojas, T., Demyanov, V., Christie, M., Arnold, D., 2011. Use of geological prior information in reservoir facies modelling. In: Marschallinger, Zobl (Eds.), Proceedings of the IAMG 2011, Salzburg, Austria, pp. 266–285.
Sambridge, M., 1999a. Geophysical inversion with a neighbourhood algorithm-I. Searching a parameter space. Geophysical Journal International 138 (2), 479–494.
Sambridge, M., 1999b. Geophysical inversion with a neighbourhood algorithm-II. Appraising the ensemble. Geophysical Journal International 138 (3), 727–746.
Sambridge, M., Gallagher, K., Jackson, A., Rickwood, P., 2006. Trans-dimensional inverse problems, model comparison and the evidence. Geophysical Journal International 167 (2), 528–542.
Scheidt, C., Caers, J., 2008. A new method for uncertainty quantification using distances and kernel methods: application to a deepwater turbidite reservoir. SPE Journal 14 (4), 680–692.
Seiler, A., Aanonsen, S.I., Evensen, G., Lia, O., 2010. An elastic grid approach for fault uncertainty modelling and updating using the ensemble Kalman filter. SPE 130422-MS.
Sun, A.Y., 2011. Identification of geologic fault network geometry by using a grid-based ensemble Kalman filter. Journal of Hazardous, Toxic, and Radioactive Waste 15 (4), 228.
Suzuki, S., Caumon, G., Caers, J., 2008. Dynamic data integration for structural modeling: model screening approach using a distance based model parameterisation. Computational Geosciences 12, 105–119.
Tarantola, A., 2005. Inverse Problem Theory and Methods for Model Parameter Estimation. SIAM, p. 13.
Wood, R., Curtis, A., 2004. Geological prior information, and its applications to geoscientific problems. In: Curtis, A., Wood, R. (Eds.), Geological Prior Information: Informing Science and Engineering. Geological Society, London, Special Publications, London, p. 239.
Zadeh, L.A., 2002. Toward a perception-based theory of probabilistic reasoning with imprecise probabilities. Journal of Statistical Planning and Inference 105, 233–264.
