Lecture 1
Introduction
Petroleum reservoir management is a dynamic process that recognizes the
uncertainties in reservoir performance resulting from our inability to fully
characterize reservoirs and flow processes. It seeks to mitigate the effects
of these uncertainties by optimizing reservoir performance through a
systematic application of integrated, multidisciplinary technologies. It
approaches reservoir operation and control as a system, rather than as a set
of disconnected functions. Reservoir management has advanced through
several stages over the past 30 years: techniques have improved, background
knowledge of reservoir conditions has deepened, and computer automation has
streamlined data processing and management.
The reservoir management process consists of three stages:
1. Reservoir description;
2. Reservoir model development;
3. Reservoir performance, well performance, and field development.
The primary objective in a reservoir management study is to determine the
optimum conditions needed to maximize the economic recovery of
hydrocarbons from a prudently operated field.
[Figure: the integrated reservoir management team, comprising the reservoir engineer, geophysicist, production engineer, drilling engineer, economist, and operations & facilities management.]
Fig. 1.2 Example of integrated reservoir modeling of a slope channel system. (a) An integrated
display of the reservoir model, along with the 3D seismic data and geological and reservoir
properties. (b) Three cross sections of sand probability. (c) One layer of the 3D lithofacies model.
Using Geostatistics in Modeling
Just a few decades ago, statistics was a narrow discipline used mostly by
specialists. The explosion of data and advances in computation over the last
several decades have greatly expanded the use of statistics in science and
engineering. Despite this progress in the quantitative analysis of the
geosciences, statistics is still underused in quantitative multidisciplinary
integration. Many geoscience problems are, at their core, problems of
statistical inference. Traditional probability and statistics are known for
their frequency interpretation and analysis; most statistical parameters and
tools, such as the mean, variance, histogram, and correlation, are defined
using frequentist probability theory. Geostatistics, a branch of spatial
statistics, concerns the characterization and modeling of spatial phenomena
based on descriptions of spatial properties. Combining traditional statistics
with geostatistics is critical for analyzing and modeling reservoir properties.

In the era of big data, we are often overwhelmed by information. What matters
is not the information itself, but knowing how to handle it; data analytics
and the integration of information are the keys. Data cannot speak for itself
unless data analytics is employed. In big data, everything tells us something,
but nothing tells us everything. We should not focus solely on computing
capacity; instead, we should pay more attention to data analytics, including
checks of data quality, correlation analysis across data types, causal
inference, and understanding the link between data and physical laws.
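The frequentist parameters named above (mean, variance, histogram, correlation) can be sketched in a few lines of Python. The porosity and permeability values below are synthetic and purely illustrative; the variable names and the linear porosity–log-permeability relationship are assumptions, not data from any real field.

```python
import numpy as np

# Synthetic well-log samples (illustrative values, not a real field)
rng = np.random.default_rng(42)
porosity = rng.normal(0.18, 0.03, 200)                        # fraction
log_perm = 2.0 + 25.0 * porosity + rng.normal(0, 0.3, 200)    # assumed log10(mD) trend

mean_phi = porosity.mean()                  # sample mean
var_phi = porosity.var(ddof=1)              # sample variance
hist, edges = np.histogram(porosity, bins=10)                 # histogram counts
r = np.corrcoef(porosity, log_perm)[0, 1]   # Pearson correlation

print(f"mean porosity         : {mean_phi:.3f}")
print(f"sample variance       : {var_phi:.5f}")
print(f"histogram counts      : {hist}")
print(f"porosity-log(k) corr  : {r:.2f}")
```

With the assumed trend plus noise, the correlation coefficient comes out strongly positive, which is the kind of relationship a frequentist analysis would quantify before any spatial modeling.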
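The spatial characterization that geostatistics adds to these frequentist tools is often summarized by the empirical semivariogram, which measures how dissimilar a property becomes as the separation distance grows. The sketch below implements the classical (Matheron) estimator on a synthetic 1D porosity transect; the data, lag choices, and tolerance are assumptions for illustration only.

```python
import numpy as np

def empirical_variogram(coords, values, lags, tol):
    """Classical semivariogram estimator:
    gamma(h) = (1 / (2 N(h))) * sum over pairs ~h apart of (z_i - z_j)^2
    """
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    d = np.abs(coords[:, None] - coords[None, :])       # pairwise separations
    sq = (values[:, None] - values[None, :]) ** 2       # pairwise squared differences
    iu = np.triu_indices(len(values), k=1)              # count each pair once
    d, sq = d[iu], sq[iu]
    gammas = []
    for h in lags:
        mask = np.abs(d - h) <= tol                     # pairs within the lag bin
        gammas.append(sq[mask].mean() / 2.0 if mask.any() else np.nan)
    return np.array(gammas)

# Synthetic 1D transect: smooth spatial trend plus measurement noise
rng = np.random.default_rng(0)
x = np.arange(0.0, 100.0, 1.0)                          # sample locations (m)
z = 0.15 + 0.02 * np.sin(x / 15.0) + rng.normal(0, 0.005, x.size)

lags = np.array([1.0, 5.0, 10.0, 20.0])
gamma = empirical_variogram(x, z, lags, tol=0.5)
print(dict(zip(lags.tolist(), gamma.round(6).tolist())))
```

Because nearby samples are more alike than distant ones, the semivariogram rises with lag distance; that growth is precisely the spatial structure that geostatistical models (e.g., kriging) exploit and that purely frequentist summaries ignore.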