This paper was prepared for presentation at the 2018 SPE Annual Technical Conference and Exhibition held in Dallas, Texas, 24-26 September 2018.
This paper was selected for presentation by an SPE program committee following review of information contained in an abstract submitted by the author(s). Contents
of the paper have not been reviewed by the Society of Petroleum Engineers and are subject to correction by the author(s). The material does not necessarily reflect
any position of the Society of Petroleum Engineers, its officers, or members. Electronic reproduction, distribution, or storage of any part of this paper without the written
consent of the Society of Petroleum Engineers is prohibited. Permission to reproduce in print is restricted to an abstract of not more than 300 words; illustrations may
not be copied. The abstract must contain conspicuous acknowledgment of SPE copyright.
Abstract
The Petroleum Engineering and Analytics Research Laboratory (PEARL) at West Virginia University played a
key role in estimating the amount of oil being discharged into the Gulf of Mexico during the Deepwater
Horizon disaster in 2010. PEARL's calculation of the amount of oil being discharged into the Gulf of
Mexico was based on Smart Proxy technology. Smart Proxy is a unique and innovative implementation
of Artificial Intelligence and Machine Learning that develops fast and accurate proxy models of high fidelity
numerical simulations without compromising the complex physics or the high resolution of the original
numerical simulation models. Smart Proxy modeling was implemented by many of PEARL's scholars
in their Ph.D. dissertations. Execution of a Smart Proxy model takes from fractions of a second to
a few minutes on a PC workstation (laptop) to accurately replicate the detailed results of high fidelity numerical
simulation models that take hours or days to run on High Performance Computing facilities (HPC -
clusters of a large number of CPUs or GPUs).
As a subcontractor to the U.S. Department of Energy's National Energy Technology Laboratory (NETL),
PEARL was asked to join a team of scientists and engineers from multiple DOE national laboratories
across the United States in an effort to estimate the amount of oil being discharged into the
Gulf of Mexico. Since BP's initial estimate of 1,000 barrels per day appeared to be an underestimate (this
estimate was later increased to 5,000 barrels per day after it was challenged by the U.S.
government), the Obama administration asked the Secretary of Energy (Nobel Laureate Steven Chu) to
set up a team to provide a realistic estimate of the oil discharge rate into the GOM.
This article presents a summary of the efforts by PEARL in estimating the oil discharge rate into the GOM
during the Deepwater Horizon disaster. PEARL's use of Smart Proxy modeling of numerical reservoir
simulation models, coupled with detailed numerical models of flow in pipes and the network, allowed the team to
generate millions of scenarios in order to exhaustively examine and quantify the uncertainties associated
with everything that was unknown and being estimated at the time of the analyses.
The final estimate of the oil discharge rate into the GOM calculated by PEARL proved to be highly
accurate once the well was contained and the flow was actually measured, providing a valuable use-case of
the application of Artificial Intelligence and Machine Learning in the upstream oil and
gas industry.
2 SPE-191613-MS
Figure 2—Pictures of the oil discharge from the riser on the seafloor in the Gulf of Mexico.
After comprehensive studies and analyses performed by scientists in the Flow Rate Technical
Group (FRTG), using a large number of different approaches, it was finally estimated that the initial flow rate
was 62,000 barrels per day. The total estimated volume of leaked oil was approximately 4.9 million barrels (210
million US gal; 780,000 m3), with plus or minus 10% uncertainty, including oil that was collected, making
it the world's largest accidental spill. These figures were challenged by BP upon their release. However,
after the containment of the spill and actual measurement of the flow rate, the discharge rate was measured
to be approximately 57,400 barrels per day.
"Additional estimates of the flow rate were derived when the well was shut in for the well integrity test
on July 15, 2010. The mechanism for shutting in the well was to close off the flow with a three-ram capping
stack that was mated with the upper flange of the LMRP on the top of the BOP. Government scientists in
Houston had requested that the capping stack be equipped with redundant pressure gauges. When the choke
valve in the capping stack was throttled back in a series of precisely controlled steps to close off the well,
pressure readings from the capping stack taken at the time were analyzed by three separate Department of
Energy (DOE) laboratories to yield very consistent results for the flow rate of the well at the time of shut
in: 53,000 BPD" [1].
A large number of papers have been published on this specific topic (application of Artificial Intelligence and Machine Learning in the
upstream oil and gas industry), initiated from the research and development at the Petroleum and Natural
Gas Engineering (PNGE) department at West Virginia University. During this time period PEARL (and
its predecessors) has graduated more than 15 Masters and 10 Ph.D. scholars. Currently, several M.S. and
Ph.D. candidates are involved in the research and development at PEARL.
At the time of the Deepwater Horizon incident, several of PEARL's projects were being sponsored by the U.S.
Department of Energy, specifically at the National Energy Technology Laboratory (NETL). During this
time PEARL was a subcontractor to NETL on multiple projects. Several project managers and team
leaders at NETL who later became the main leaders of the DOE's Flow Rate Technical Group were in close
contact with PEARL and were well aware of the quality of its research and development team.
The PEARL team was asked to use its expertise in numerical simulation, Artificial Intelligence, and
Machine Learning to help NETL and the other national labs involved in the process
estimate the amount of oil that was being discharged into the Gulf of Mexico as a result of the
Deepwater Horizon incident. Members of the PEARL team who contributed to the results of this project included
Vida Gholami, Saeed Zargari, Ognjen Grujic, and Camilo Calderon.
The PEARL team recommended using its latest development in the application of Artificial Intelligence
and Machine Learning to simulation and modeling, called "Smart Proxy Modeling". After the idea
and the details of Smart Proxy Modeling were presented to the team leaders at NETL, they agreed
with the approach. Two Smart Proxy models were developed, one for the numerical reservoir simulation
model and one for the flow in the pipes and the network. The two Smart Proxy models were coupled and
executed while access was provided to tens of parameters that were uncertain, a change in any one of
which would completely change the coupled model's outcome. The Coupled Smart Proxy Model (CSPM)
would run in fractions of a second, making it possible to perform tens of millions of coupled
simulation runs.
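As a rough illustration of the coupling idea, the sketch below stands in for the two Smart Proxies with simple analytic functions (the actual proxies were trained machine-learning models; every number and formula here is a hypothetical placeholder, not the study's model) and iterates them to a consistent rate and pressure at their interface:

```python
# Minimal sketch of coupling two proxy models. The two functions below are
# hypothetical analytic stand-ins for the trained Smart Proxies.

def reservoir_proxy(p_wf):
    """Stand-in reservoir proxy: oil rate (BPD) delivered to the wellbore
    for a given flowing bottomhole pressure (psi). Assumed values."""
    p_res, pi = 11_850.0, 12.0  # assumed reservoir pressure and productivity index
    return max(pi * (p_res - p_wf), 0.0)

def network_proxy(q):
    """Stand-in wellbore/BOP/riser proxy: bottomhole pressure (psi)
    required to push rate q (BPD) through the pipe network. Assumed values."""
    p_seafloor = 2_250.0        # assumed pressure at the riser exit
    return p_seafloor + 0.12 * q  # assumed linear friction/hydrostatic term

def coupled_rate(tol=1e-6, max_iter=200):
    """Iterate the two proxies to a consistent interface rate and pressure."""
    p_wf = 5_000.0              # initial guess for bottomhole pressure
    q = reservoir_proxy(p_wf)
    for _ in range(max_iter):
        q = reservoir_proxy(p_wf)
        p_new = network_proxy(q)
        if abs(p_new - p_wf) < tol:
            break
        p_wf = 0.5 * (p_wf + p_new)  # damped update for stable convergence
    return q, p_wf
```

Because each evaluation is just a handful of arithmetic operations, millions of such coupled runs become affordable, which is what makes exhaustive uncertainty sweeps practical.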
Figure 4—The schematic diagram of the wellbore casing, tubing, drill pipe, and the Blow Out Preventer (BOP).
The same simplification technique (estimation of the reservoir contribution, in the form of oil rate, to the flow
in the pipes and the network) was initially used in this study as well. However, as the importance and
influence of this boundary condition in calculating the final discharge rate at the seafloor became more
obvious, and as more data was provided to the research and development teams over time
(on a daily and sometimes hourly basis), it became clear that estimating the
reservoir contribution (oil rates) as a boundary condition for the simulation and modeling of the flow in
pipes and the network using simple analytical approaches would be an over-simplification that would result
in unrealistic outcomes.
At this point it was decided to perform numerical reservoir simulation using the most up-to-date data
being provided by BP. Furthermore, the numerical reservoir simulation model had to be updated as
new data emerged and was shared with the national labs. The numerical reservoir simulation model for
Macondo was developed and updated regularly as new data and information became available.
Any new piece of information would impact the outcome
of the numerical reservoir simulation model, which would then directly impact the pipe and network
simulation model and the final outcome of the study (the rate of oil discharge at the seafloor).
Furthermore, a detailed simulation of the flow in pipes and the network was also developed. This simulation
model extended from the interface of the wellbore with the reservoir to the end of the riser on the seafloor where
the oil was being discharged. The simulation of the flow in the pipes and the network included the casing,
the drill-pipe, the Blow-Out Preventer (BOP), and the riser. Many unknowns had to be taken into account
in this process, including the buckling of the drill-pipe and its role in the flow of the fluid
through the wellbore; the state of the blow-out preventer and the role it played in the amount of
fluid going through it, along with the impact of the annular space, pipe rams, annular preventers, and blind
shear rams; and finally the way the riser had reacted, buckled, and come to rest
on the seafloor. Also included in the simulation model was the possible role of the drill pipe throughout the
network, including inside the riser.
Figure 4 through Figure 7 show the schematic diagrams explaining how the pipe and network flow
simulation model was constructed. As shown in these figures, the pipe and network flow simulation
model included the casing, the drill pipe, the BOP, and the riser. Since very little was known about the details
that needed to be modeled, certain assumptions had to be made and multiple cases had to be studied.
The four main cases included in this study are shown in Figure 5. Each of the cases shown in this
figure included a large number of uncertainties about the inner workings of the casing,
the drill pipe, the BOP, and the riser and their potential impact on one another. Therefore,
the uncertainties associated with all the parameters related to the drill pipe, the BOP,
the riser, and their interactions had to be quantified. This uncertainty quantification becomes
even more complex once it is noted that a large number of uncertainties were also associated with the numerical
reservoir simulation models.
Figure 6—The general schematic of the coupled reservoir-wellbore-BOP-riser simulation model. Separate
simulation models were generated for the reservoir and the rest of the system. Smart Proxy models were
generated for each of the simulation models and then coupled into one complete
and highly accurate (when compared to the numerical simulation models) Smart Proxy model with a very small
computational footprint (thousands of coupled Smart Proxy model runs can be executed in just a few seconds).
The process of completing the modeling and uncertainty quantification that is demonstrated in
Figure 8 includes the following steps:
1. Developing the Numerical Reservoir Simulation model,
2. Building a Smart Proxy Model for the Numerical Reservoir Simulation model,
3. Developing the simulation model for the flow in pipes and the network,
4. Building a Smart Proxy Model for the flow in pipes and the network simulation model,
5. Coupling the Smart Proxy for the Numerical Reservoir Simulation with the Smart Proxy for the flow
in pipes and the network,
6. Identifying the uncertain parameters that are involved in each of the models and therefore
in the Coupled Smart Proxy Model (CSPM),
7. Identifying the range of the uncertain parameters (the min., the max., and the most likely values),
8. Performing Monte Carlo Simulation, using the Coupled Smart Proxy Model (CSPM) as the objective
function,
9. Identifying the P10, P50, and P90 of the Coupled Smart Proxy Model (CSPM) outcome, i.e. the oil
discharge rate at the seafloor.
Figure 8—Workflow used for the development of the Coupled Smart Proxy Modeling (CSPM).
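The uncertainty-quantification steps of the workflow can be sketched as a Monte Carlo loop around a stand-in for the Coupled Smart Proxy Model. The parameter names, ranges, and the cspm() formula below are hypothetical placeholders, not the study's actual model:

```python
# Monte Carlo sketch of steps 6-9: triangular sampling of uncertain
# parameters, a hypothetical stand-in for the CSPM, and P10/P50/P90.
import random

random.seed(0)

# Step 7: (min, most likely, max) for each uncertain parameter (assumed values).
uncertain = {
    "reservoir_pressure_psi": (11_000, 11_850, 12_500),
    "bop_restriction_factor": (0.4, 0.7, 0.95),
    "riser_friction_factor":  (0.8, 1.0, 1.3),
}

def cspm(p_res, bop, fric):
    """Hypothetical stand-in for the coupled proxy: discharge rate, BPD."""
    return 9.0 * (p_res - 2_250.0) * bop / fric

# Step 8: Monte Carlo simulation using the CSPM as the objective function.
samples = []
for _ in range(100_000):
    draws = [random.triangular(lo, hi, mode)      # note the (low, high, mode) order
             for lo, mode, hi in uncertain.values()]
    samples.append(cspm(*draws))

# Step 9: P10, P50, and P90 of the oil-discharge-rate distribution.
samples.sort()
p10, p50, p90 = (samples[int(f * len(samples))] for f in (0.10, 0.50, 0.90))
```

Only the near-instant runtime of the coupled proxy makes a loop of this size practical; with the full numerical models, one hundred thousand evaluations would take years of HPC time.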
Numerical simulation models have shortcomings when used for Field
Development Planning (FDP), Production and Recovery Optimization (PRO), or Uncertainty Quantification
(UQ), the last of which is the subject of this particular study. These shortcomings can seriously impact studies such as
the oil spill rate estimation in the aftermath of the Deepwater Horizon incident, which is overwhelmed by a large
number of uncertainties. These shortcomings can be summarized as follows:
1. Numerical simulation models are uncertain by nature, and their effective use for FDP, PRO, and UQ
requires the incorporation of technologies that allow such uncertainties to be quantified.
2. Numerical simulation models constructed for non-academic, real case scenarios are very large
(millions or tens of millions of cells). Therefore, these models have massive computational footprints.
This fact limits their practical utilization for uncertainty analysis and quantification and for detailed field
studies, development planning, and optimization.
The industry has been well aware of these shortcomings and, during the past several decades, has spent
a considerable amount of effort to address them. The response of the industry has been to develop proxy
models to address the above-mentioned issues with the numerical simulation models. The proxy models
that have been developed can be divided into the following major categories:
a. Reduced Order Models - ROM: Numerical simulation models are computationally expensive, especially
those with formulations that model the complex physics incorporated into the simulation. The inclusion of
such complex physics sometimes requires smaller grid blocks (cells) in order to accommodate proper
convergence of the numerical techniques that are used to solve the complex, non-linear, and higher
order partial differential equations involved in the fabric of such models.
In order to accomplish their objective of reducing the computational footprint of the numerical
simulation models, Reduced Order Models (ROM) usually incorporate one or both of the following
techniques:
i. They simplify/reduce the physics used to build the model,
ii. They reduce the model's resolution in space and time. Sometimes, to accomplish
this, they must incorporate a reduction in the physics of the problem as well.
b. Statistical Response Surfaces - SRS: Here we emphasize the word "Statistical" to draw attention
to the fact that common response surface development techniques use traditional statistics (not
machine learning) as their base. The differences between traditional statistics and Artificial
Intelligence and Machine Learning run much deeper than some may believe and/or admit. Although
these differences show themselves in the techniques used to accomplish their objectives, they
stem mainly from their philosophical approaches to problem solving. The differences are so
deep-seated that they may be likened to the dissimilarities between the Aristotelian and the Platonic
views of the world.
The Smart Proxy Modeling that was used in this study is capable of overcoming the above-mentioned
shortcomings of numerical simulation models by increasing their run-time speed (shrinking their
computational footprint) by multiple orders of magnitude, without compromising the complex
physics incorporated in the numerical simulation and without decreasing its resolution in time and space.
Many reservoir/production engineers and reservoir/production modelers refer to this claim as "too good
to be true". They correctly point out that "you can't have your cake and eat it too" and that
there is only so much one can do to push the envelope by creating better, more efficient algorithms or by using
High-Performance Computing (HPC, including GPUs) to speed up the execution of traditional numerical
simulation models. This is true as long as one operates within the same paradigm that was used to
develop the numerical simulation, i.e. computational science. That is the main reason behind the lack of
major success in this area by those incorporating mathematical, statistical, and/or other approaches,
including multiple versions of Principal Component Analysis (PCA) and Proper Orthogonal Decomposition
(POD). While these techniques have been proven to enhance the speed and efficiency of numerical simulation
models, they are incapable of the type of modifications that would transform this technology to new levels
of computational footprint, such as being able to perform complete, high fidelity simulation and modeling
on a smartphone, tablet, or laptop in seconds.
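For contrast, the core of the POD approach mentioned above can be sketched in a few lines: it compresses a matrix of simulator "snapshots" onto a handful of dominant modes via the singular value decomposition. The snapshot data below is synthetic and purely illustrative:

```python
# Minimal POD sketch: build a synthetic snapshot matrix, extract a
# reduced basis with the SVD, and measure the reconstruction error.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic snapshots: 2000 grid cells x 50 time steps, generated from
# 3 underlying spatial modes plus small noise (assumed toy data).
n_cells, n_snaps, n_true = 2000, 50, 3
modes = rng.standard_normal((n_cells, n_true))
coeffs = rng.standard_normal((n_true, n_snaps))
snapshots = modes @ coeffs + 0.01 * rng.standard_normal((n_cells, n_snaps))

# POD = SVD of the snapshot matrix; keep the r dominant left singular vectors.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
r = 3
basis = U[:, :r]                  # reduced spatial basis (n_cells x r)
reduced = basis.T @ snapshots     # r x n_snaps reduced coordinates
reconstructed = basis @ reduced   # lift back to full resolution

rel_err = np.linalg.norm(snapshots - reconstructed) / np.linalg.norm(snapshots)
```

This illustrates both the appeal of POD (a 2000-dimensional state compressed to 3 coordinates) and the limitation the text describes: the reduced basis is only as rich as the snapshots and physics it was built from.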
The reason behind the success of Smart Proxy Modeling is that this technology incorporates
a completely different paradigm in problem-solving. Instead of computational science, Smart Proxy
Modeling uses Artificial Intelligence and Machine Learning to teach the detailed physics of reservoir and
production engineering to computer algorithms using the vast amount of data generated by the
complex, high fidelity numerical simulation models. A good example of and analogy for this technology is
the driverless car, which theoretically could have been developed using numerical simulation methods but would
never have been practical in real time. It was the paradigm shift in problem-solving (a shift from
computational science to data-driven analytics) that made driverless cars a reality.
Figure 9—Sample of results generated by the Coupled Smart Proxy Modeling technology
Using Artificial Intelligence and Machine Learning to solve oil and gas related problems has a long
history in the Petroleum and Natural Gas Engineering department at West Virginia University. Developing
Smart Proxy Models for wellbore/surface facility models [3] and for Numerical Reservoir Simulation
models [4][5][6][7][8][9][10][11][12][13][14][15][16][17][18][19][20][21][22] had been practiced in the
past with highly successful outcomes. Details of how Smart Proxy models are developed have been published
in a large number of papers, as referenced in this article, and in a recently published book [23]; therefore,
these details will not be covered again here.
Results of this study are shown in Figure 9, Figure 10, and Figure 11. As can be seen in these
figures, a comprehensive Monte Carlo Simulation was performed using the Coupled Smart Proxy Model as
the objective function. During the Monte Carlo Simulation, a large number of uncertain parameters were
analyzed simultaneously, using the ranges that had been identified for each of them and assigning different
types of distributions. The Coupled Smart Proxy Model was then executed thousands of times, and
the resulting distribution of oil discharge rates was used to identify the P10,
P50, and P90. This process was repeated for multiple scenarios (as shown in Figure 5), and the final results
for all possible cases were presented to the leaders of the Flow Rate Technical Group at the National Energy
Technology Laboratory.
Figure 10—A sample of the results generated by the Coupled Smart Proxy Modeling technology. The
P50 values of three of the scenarios generated by the Coupled Smart Proxy Modeling
technology put the calculated oil discharge rate at 76,220 BOPD, 58,217 BOPD, and 52,423 BOPD.
The final results, part of which are presented in Figure 9 and Figure 10, were then used to perform more
detailed analyses, such as generating Type Curves, as demonstrated in Figure 11. The Type Curves demonstrate
the sensitivity of the calculated discharged oil rate as a function of the different uncertain parameters. A very large
number of such graphs were generated, using a large number of changing parameters, in order to develop
a good understanding of how the numerical reservoir simulation model and the flow in pipes and
network simulation model were treating the different parameters involved in the analyses. As a result
of these analyses, the PEARL team learned how to treat the uncertainties associated with the different
parameters during its analyses. The more sensitive parameters were treated with more care, and more
attempts were made to better understand the data communicated to the team during the course
of this project.
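A type-curve analysis of this kind can be sketched as a one-at-a-time sensitivity sweep: vary a single uncertain parameter across its range while holding the others at their most likely values, and record the proxy's output. The proxy function and parameter range below are hypothetical placeholders, not the study's actual model:

```python
# One-at-a-time sensitivity sweep producing type-curve-style data.
# All numbers below are assumed, illustrative values.

def proxy_rate(p_res=11_850.0, bop=0.7, fric=1.0):
    """Hypothetical stand-in for the coupled proxy: discharge rate, BPD.
    Defaults represent each parameter's most likely value."""
    return 9.0 * (p_res - 2_250.0) * bop / fric

def type_curve(param, lo, hi, n=11):
    """Sweep one named parameter over [lo, hi] in n steps, holding the
    others at their defaults; return (value, rate) pairs for plotting."""
    step = (hi - lo) / (n - 1)
    return [(lo + i * step, proxy_rate(**{param: lo + i * step}))
            for i in range(n)]

# One type curve: sensitivity of the discharge rate to BOP restriction.
bop_curve = type_curve("bop", 0.4, 0.95)
```

Curves whose rates swing widely over the parameter range flag the sensitive parameters that, as described above, warranted the most careful treatment during the analyses.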
As was mentioned before, the final estimate of the oil being discharged from the riser to the seafloor that
was presented to the Flow Rate Technical Group by the PEARL team was in the range of 50,000 to
55,000 barrels of oil per day. The Smart Proxy modeling using Artificial Intelligence and Machine Learning
was developed using the software applications provided to the PEARL group by Intelligent
Solutions, Inc. (WWW.IntelligentSolutionsInc.com).
Figure 11—The final version of the Coupled Smart Proxy Modeling technology was able to
generate Type Curves to analyze the impact of the uncertain parameters on the oil discharge rate.
Figure 12—U.S. Secretary of Energy, Nobel Prize winning physicist Steven Chu, honoring WVU PEARL's director,
Shahab D. Mohaghegh, for his contribution to the Flow Rate Technical Group (Deepwater Horizon incident).
Upon completion of the project, whose final results later proved to be highly accurate, the Flow
Rate Technical Group, including the director of PEARL at West Virginia University, was honored by the
U.S. Secretary of Energy, Steven Chu (Figure 12 and Figure 13), in an award ceremony held at the U.S.
Department of Energy headquarters in Washington D.C.
Concluding Remarks
Artificial Intelligence and Machine Learning will be one of the most important contributors to the oil and
gas industry's future. The contribution of this technology will become even more impactful as it is used to
address the most complex problems that the oil and gas industry is facing. Today, Artificial Intelligence and
Machine Learning can build highly complex, large, full field reservoir models and history match the production
of highly complex mature reservoirs that teams of reservoir engineering and reservoir
modeling experts from top national and international oil companies have been unable to history match
using traditional numerical reservoir simulation models [24]. This technology has also been used successfully
to address, analyze, predictively model, and optimize complex completions and the hydraulic fracturing
process in unconventional resources [25], a problem that traditional techniques have failed to address
properly, effectively, and efficiently.
While Artificial Intelligence and Machine Learning are enjoying unprecedented enthusiasm from the
younger generation of petroleum engineers, the technology's ability to deliver impactful
results is still questioned, mainly by those who are highly dedicated to traditional approaches to engineering problem solving
and have little interest in fundamental changes in how engineering problems can be approached, addressed,
and solved. This group of individuals is strongly (although unintentionally) supported by two other groups:
petroleum engineering domain experts who have little to no understanding of Artificial Intelligence and Machine
Learning but nevertheless try to use it to solve problems and end up treating Artificial Intelligence
and Machine Learning as a regression tool, and those with high expertise in Artificial Intelligence and
Machine Learning but little to no understanding of the petroleum engineering domain expertise that is a
fundamental requirement for the successful use of this technology in our industry. Both of these groups contribute
to the hype and generate, at best, mediocre results while spending large amounts of resources, and at the end of the
day they add to the traditionalists' arguments against Artificial Intelligence and Machine Learning.
This paper, summarizing the contribution of Artificial Intelligence and Machine Learning in developing
the Smart Proxy models that were able to accurately estimate the oil discharge rate in such a highly complex
situation, along with many of the articles and books mentioned in the references, demonstrates the capabilities
of Artificial Intelligence and Machine Learning in solving highly complex oil and gas related problems.
What is required to be successful in this arena is (a) domain expertise in the oil and gas industry and (b)
the willingness to dedicate oneself to the art and science of Artificial Intelligence and Machine Learning
and become an expert in the field.
References
1. Marcia K. McNutt, Rich Camilli, Timothy J. Crone, George D. Guthrie, Paul A. Hsieh, Thomas
B. Ryerson, Omer Savas, and Frank Shaffer; "Review of flow rate estimates of the Deepwater
Horizon oil spill". PNAS (Proceedings of the National Academy of Science) December 11, 2012.
109 (50) 20260–20267; https://doi.org/10.1073/pnas.1112139108
2. Mohaghegh, S., Arefi, G. R., and Ameri, S.; "Design and Development of an Artificial Neural
Network for Estimation of Formation Permeability." SPE 28237, Proceedings, SPE Petroleum
Computer Conference. July 31- August 3, 1994, Dallas, Texas.
3. Mohaghegh, S., Hutchins, L. and Sisk, C.; "Prudhoe Bay Oil Production Optimization: Using
Virtual intelligence Techniques, Stage One: Neural model Building." SPE 77659, Proceedings,
2002 SPE Annual Conference and Exhibition. September 29-October 2, San Antonio, Texas.
4. Mohaghegh, S. D., Modavi, A., Hafez, H., Haajizadeh, M., Kenawy, M., and Guruswamy, S.;
"Development of Surrogate Reservoir Models (SRM) For Fast Track Analysis of Complex
Reservoirs." SPE 99667, Proceedings, 2006 SPE Intelligent Energy Conference and Exhibition.
11-13 April 2006, Amsterdam, the Netherlands.
18. Shahkarami, A., Saint Francis University, Mohaghegh, S.D., Gholami, V., West Virginia
University, and Bromhal, G., National Energy Technology Laboratory; "Application of Artificial
Intelligence and Data Mining Techniques for Fast Track Modeling of Geologic Sequestration of
Carbon Dioxide - Case Study of SACROC Unit." SPE 173406, SPE Digital Energy Conference
and Exhibition. The Woodlands, Texas, 3-5 March 2015.
19. Mohaghegh, S. D., Intelligent Solutions, Inc. & West Virginia University, Abdulla, F., and
Abdou, M. ADCO, Gaskari, R., and Maysami, M., Intelligent Solutions, Inc.; "Smart Proxy: an
Innovative Reservoir Management Tool; Case Study of a Giant Mature Oilfield in the UAE." SPE
177829, ADIPEC - Abu Dhabi International Petroleum Exhibition and Conference. Abu Dhabi,
UAE, 9-12 November 2015.
20. Faisal Alenezi, and Mohaghegh, S. D.; "A Data-Driven Smart Proxy Model for A
Comprehensive Reservoir Simulation". 2016 4th Saudi International Conference on
Information Technology (Big Data Analysis) (KACSTIT). Riyadh, Saudi Arabia - doi: 10.1109/
KACSTIT.2016.7756063.
21. Qin He, Saint Francis University, Shahab D. Mohaghegh, Intelligent Solutions, Inc.& West
Virginia University, Zhikun Liu, Xi'an Shiyou University; "Reservoir Simulation Using Smart
Proxy in SACROC Unit - Case Study." SPE-184069 - SPE Eastern Regional Meeting 2016
(URTeC). Canton, Ohio - September 13 – 15, 2016.
22. Alenezi, F. Saudi Aramco. Mohaghegh, S. D.; "Developing a Smart Proxy for the SACROC
WaterFlooding Numerical Reservoir Simulation Model." SPE-185691. SPE Western Regional
Meeting. Bakersfield, CA. 23 April 2017.
23. Data-Driven Analytics for Geological Storage of CO2. Shahab D. Mohaghegh. CRC Press -
Taylor and Francis Group. Boca Raton - London - New York. ISBN-13: 978-1-138-19714-5.
24. Data-Driven Reservoir Modeling. Shahab D. Mohaghegh. Society of Petroleum Engineers,
Richardson, Texas. ISBN: 978-1-61399-560-0.
25. Shale Analytics. Shahab D. Mohaghegh. Springer International Publishing. Cham, Switzerland.
ISBN: 978-3-319-48753-3.