
2020-04-20

CMOST for Conventional & Tight Reservoirs
Alex Novlesky (Calgary)
Thanh Nguyen (Houston)

Agenda
• CMOST Overview
• CMOST Functionality and Tutorials
– Sensitivity Analysis
– History Matching
– Optimization
– Uncertainty Assessment (Time Permitting)


Overview

CMOST AI: Going Beyond Today’s Reservoir Simulation Workflow


Product Suite

[Product suite diagram] Visualization: Pre-Processing · Smart Text Editor · Fluid Property Modelling · Black Oil & Conventional · Compositional & Unconventional · Thermal & Advanced Processes · Visualization: Post-Processing

History Matching, Optimization & Analysis (spans the entire suite)

Typical Workflow for a Brown Field

[Workflow diagram] Starting from a reservoir model and a field history file:
• Sensitivity analysis → parameter sensitivities
• History matching → matched model and parameter histograms
• Model optimization → optimal operating conditions and optimal model
• Forecast uncertainty assessment → uncertainty quantification


CMOST Process

[Process loop]
1. Experimental design & optimization algorithms select a combination of parameter values (parameterization)
2. Parameter values are substituted into the simulation dataset
3. The simulation is run
4. Results are analyzed (objective functions & proxy analysis) and the cycle repeats

Sensitivity Analysis


Sensitivity Analysis Goals

• Determine which parameters have an effect on results


• E.g. “I expect that rock compressibility is between values A
and B. Does this uncertainty impact my results?”
• Determine how much of an effect parameters have on
results
• E.g. “If permeability is increased by 50 mD, how much will
cumulative oil increase?”

Sensitivity Analysis Process

• Select parameters to analyze


• E.g. porosity
• Select range of values to analyze
• E.g. between 20-30% porosity
• Select results (Objective Functions) to analyze
• E.g. Cumulative Oil



Sensitivity Analysis Methodology

• One Parameter at a Time (OPAAT)


• Each parameter is analyzed independently while remaining
parameters are set to their reference value
• Response Surface Methodology
• Multiple parameters are adjusted together then results are
analyzed by fitting a response surface (Polynomial
equation or Neural Network) to results


One Parameter at a Time (OPAAT)

• Analyzes each parameter independently


• While analyzing one parameter, the method freezes the other
parameters at their reference values (Median or Default)
• Process Repeated for each parameter

• Measures the effect of each parameter on the objective


function while removing the effects of the other parameters.
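The OPAAT procedure can be sketched in a few lines. This is a toy illustration, not CMOST's implementation: `f` stands in for the simulation objective, and the parameter names, reference values and ranges below are made up.

```python
# OPAAT sensitivity sketch: vary one parameter across its range while
# all other parameters stay frozen at their reference values.
# `f` is a hypothetical stand-in for a reservoir-simulation objective.

def f(porosity, perm_md, rock_comp):
    # toy linear response, NOT a real simulator
    return 1000.0 * porosity + 2.0 * perm_md + 1e4 * rock_comp

reference = {"porosity": 0.25, "perm_md": 300.0, "rock_comp": 1e-5}
ranges = {"porosity": (0.20, 0.30),
          "perm_md": (100.0, 500.0),
          "rock_comp": (5e-6, 5e-5)}

def opaat_effect(name):
    """Objective swing when only `name` moves from its low to high value."""
    lo_args = dict(reference); lo_args[name] = ranges[name][0]
    hi_args = dict(reference); hi_args[name] = ranges[name][1]
    return f(**hi_args) - f(**lo_args)

effects = {p: opaat_effect(p) for p in reference}
```

Ranking the absolute effects then gives the tornado-plot ordering of the parameters.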



One Parameter at a Time (OPAAT)


Response Surfaces and Proxy Modelling


Response Surface / Proxy Modelling

• A response surface is a proxy for the reservoir simulator that allows fast estimation of the response
• Simulation using full reservoir physics trains the proxy model
• Proxy modelling fills in gaps and analyzes trends
• Rapid estimation of simulation results without requiring additional runs
• Correlation between response and parameters: NPV = f(x1, x2, …, xn)

Polynomial Regression Method


• Experimental Design generated as
training data for response surface
• Latin Hypercube Experimental Design
(default)
• Multiple parameters changed
simultaneously
• Training data based on simulation
results
• Response surface (polynomial equation)
is fit to simulation results
• Linear
• Linear + Quadratic
• Linear + Quadratic + Interaction terms



Experimental Design
• Design of experiment (DoE) is a structured, organized method that is used to
determine the relationship between different variables (Xs) affecting a process and
the outputs of that process (Ys).

• With well-structured data matrices, DoE delivers accurate results even when the
matrix that is analyzed is quite small.

• Good Designs are:


• Orthogonal (no correlation between inputs)
• Space Filling (inputs spread out, no clustering of experiments)
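A space-filling design like the default Latin hypercube can be sketched as follows (a minimal illustration, not CMG's actual sampler): each parameter's unit range is split into one stratum per experiment, one point is drawn per stratum, and the strata are shuffled independently per column.

```python
import random

def latin_hypercube(n_samples, n_params, rng=random.Random(42)):
    """Latin hypercube sketch: exactly one sample in each of n_samples
    equal strata per parameter, with stratum order shuffled per column."""
    cols = []
    for _ in range(n_params):
        # one point per stratum on [0, 1), then shuffle the strata
        col = [(k + rng.random()) / n_samples for k in range(n_samples)]
        rng.shuffle(col)
        cols.append(col)
    return [[cols[j][i] for j in range(n_params)] for i in range(n_samples)]

# scale the unit design to engineering ranges, e.g. porosity 0.20-0.30
pts = latin_hypercube(10, 2)
porosity = [0.20 + 0.10 * row[0] for row in pts]
```

Because every stratum of every parameter is hit exactly once, the design is space filling; shuffling columns independently keeps correlation between inputs low (near-orthogonality).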

Experimental Design Comparison


Latin Hypercube Design (Default) vs. Classic Experimental Design

[Table: ~25 experiments per design over FracPerm, FracSpacing, FracWidth, HalfLength and ProducerBhp. The Latin Hypercube design spreads experiments across each parameter's full range, while the classic design combines only the extreme (low/high) levels of each parameter.]


Polynomial Regression Method


Linear Model
$y = a_0 + a_1 x_1 + a_2 x_2 + \cdots + a_k x_k$

Linear + Quadratic Model (Simple Quadratic)
$y = a_0 + \sum_{j=1}^{k} a_j x_j + \sum_{j=1}^{k} a_{jj} x_j^2$

Linear + Quadratic + Interaction (Quadratic)
$y = a_0 + \sum_{j=1}^{k} a_j x_j + \sum_{j=1}^{k} a_{jj} x_j^2 + \sum_{i<j} a_{ij} x_i x_j$

Statistically insignificant terms are automatically removed


Model type is automatically chosen but can be changed if necessary
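The fitting step can be illustrated with ordinary least squares. For brevity this sketch uses a single parameter and the simple quadratic model (normal equations solved with Cramer's rule); the multi-parameter case just adds a column per linear, quadratic and interaction term.

```python
# Fit y = a0 + a1*x + a2*x^2 to "simulation" training points by least
# squares. Toy one-parameter sketch, not CMOST's regression engine.

def fit_quadratic(xs, ys):
    n = len(xs)
    s = lambda p: sum(x ** p for x in xs)                 # sum of x^p
    sy = lambda p: sum((x ** p) * y for x, y in zip(xs, ys))
    # normal-equation system M a = b for the 3 coefficients
    m = [[n, s(1), s(2)], [s(1), s(2), s(3)], [s(2), s(3), s(4)]]
    b = [sy(0), sy(1), sy(2)]
    det = lambda a: (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
                     - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
                     + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
    d = det(m)
    def col_replaced(c):
        a = [row[:] for row in m]
        for r in range(3):
            a[r][c] = b[r]
        return a
    return [det(col_replaced(c)) / d for c in range(3)]  # Cramer's rule

# training runs sampled from a known quadratic response (no noise)
xs = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0]
ys = [3.0 + 2.0 * x - 0.5 * x * x for x in xs]
a0, a1, a2 = fit_quadratic(xs, ys)
```

With noiseless quadratic data the true coefficients are recovered exactly, which is a handy sanity check before fitting real, noisy training data.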

Polynomial Regression Method


Polynomial Regression Tornado Plot

Increasing PERMH_L1 (permeability) from 2625 mD to 4375 mD results in an increase in Cumulative Oil of 12,461 STB on average.


Response Surface Verification

• When using the response surface methodology, one should verify that the response surface provides a valid match to the simulation data
• This can be verified through:
  – Response Surface Verification Plot
  – Statistical Summary



Proxy Model QC
Check the QC plot to make sure you have a reliable proxy model before analyzing results
Results & Analyses → Objective Functions → Proxy Analysis

[Cross plot: proxy prediction (x-axis) vs. actual simulation results (y-axis)]
• Good fit when data points fall close to the 45° line
• Training data is used to generate the proxy model
• Verification experiments are used to check predictability

Statistical Checks (Quick Summary)

R²
• Measure of how closely the data fits the response surface (goodness of fit)
• R² = 1 → perfect fit
• R² = 0 → model predicts no better than the overall response mean
• R² prediction gives an indication of proxy model predictability

F Ratio
• Tests the hypothesis that all the regression parameters (except the intercept) are zero (have no effect on the objective function)
• If Prob > F is high (> 0.05) there may be problems with the inputs (no parameters are impacting results)

t Ratio
• Statistic that tests whether the true parameter (coefficient) is zero
• Prob > |t| is the probability of getting an even greater t-statistic (in absolute value), given the hypothesis that the parameter (coefficient) is zero
• Probabilities less than 0.1 are often considered significant evidence that the parameter (coefficient) is not zero
• Used to filter statistically insignificant terms

VIF (Variance Inflation Factor)
• Measure of multi-collinearity due to poor sampling of the design space
• It is suggested that the variance inflation factors should not exceed 4 or 5 (otherwise poor estimates of parameter coefficients/parameter sensitivity)
• If the design matrix is perfectly orthogonal (no correlation between parameter inputs), the variance inflation factor for all terms will be equal to 1
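The R² check can be computed directly; a minimal sketch (CMOST's statistical summary reports much more than this):

```python
# R^2 goodness-of-fit check for a proxy model: 1 - SSE/SST.

def r_squared(actual, predicted):
    mean = sum(actual) / len(actual)
    sse = sum((a - p) ** 2 for a, p in zip(actual, predicted))  # residuals
    sst = sum((a - mean) ** 2 for a in actual)                  # total var.
    return 1.0 - sse / sst

sim = [10.0, 12.0, 15.0, 20.0]         # "actual" simulation results
good_proxy = [10.1, 11.9, 15.2, 19.8]  # close predictions
mean_proxy = [14.25] * 4               # predicts only the overall mean
```

A proxy that only predicts the response mean lands at R² = 0, matching the rule of thumb on the slide; computing the same statistic on held-out verification runs gives the prediction-R² flavour.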


Proxy Analysis

Morris Analysis

Analysis technique for estimating the mean and standard deviation of parameter effects.

Sensitivity measures:
• 𝛍 (mean effect): assesses the overall importance of an input factor on the model output
• 𝛔 (standard deviation of effect): describes nonlinear effects and interactions

[𝛍–𝛔 plot regions]
1. Low 𝛍 & low 𝛔: non-influential input parameters
2. High 𝛍 & low 𝛔: influential, linear effect on the objective function
3. High 𝛍 & high 𝛔: influential, nonlinear and/or interaction effects
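The two Morris measures can be illustrated with a crude elementary-effects sketch. Assumptions: unit parameter ranges, a toy model, and independent random base points rather than the method's usual trajectory scheme.

```python
import random, statistics

def morris_effects(f, n_params, n_trajectories=50, delta=0.1,
                   rng=random.Random(1)):
    """Crude Morris screening: for each parameter collect elementary
    effects (f(x + delta*e_i) - f(x)) / delta at random base points,
    then report their mean (mu) and standard deviation (sigma)."""
    effects = [[] for _ in range(n_params)]
    for _ in range(n_trajectories):
        x = [rng.uniform(0.0, 1.0 - delta) for _ in range(n_params)]
        fx = f(x)
        for i in range(n_params):
            xp = list(x); xp[i] += delta
            effects[i].append((f(xp) - fx) / delta)
    mu = [statistics.fmean(e) for e in effects]
    sigma = [statistics.pstdev(e) for e in effects]
    return mu, sigma

# toy model: x0 is linear (high mu, low sigma), x1 is non-linear
# (high sigma), x2 is inert (low mu, low sigma)
toy = lambda x: 5.0 * x[0] + 10.0 * x[1] ** 2 + 0.0 * x[2]
mu, sigma = morris_effects(toy, 3)
```

The three parameters land in exactly the three plot regions described above: x2 near the origin, x0 on the 𝛍 axis, x1 high in both measures.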


Sobol Method
Variance-based technique, where the
main idea is to quantify the amount of
variance that each input factor
contributes to the total variance of the
output.

• Global sensitivity analysis
• Handles nonlinear responses effectively
• Measures the effect of interactions in non-additive systems
• Quantifies the measures in percentages
• Interpretation of the results is intuitive and easy to understand
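The first-order Sobol index can be illustrated with a brute-force nested Monte Carlo estimate of Var(E[Y|Xi])/Var(Y), shown here on a cheap additive toy model. Real implementations use far more efficient estimators (e.g. Saltelli sampling); this sketch only demonstrates the variance-share idea.

```python
import random, statistics

def first_order_sobol(f, i, n_params, n_outer=200, n_inner=200,
                      rng=random.Random(7)):
    """Brute-force S_i = Var(E[Y | x_i]) / Var(Y) for independent
    U(0,1) inputs: the share of output variance contributed by x_i."""
    cond_means, all_y = [], []
    for _ in range(n_outer):
        xi = rng.random()                 # fix parameter i
        ys = []
        for _ in range(n_inner):          # average over the others
            x = [rng.random() for _ in range(n_params)]
            x[i] = xi
            y = f(x)
            ys.append(y)
            all_y.append(y)
        cond_means.append(statistics.fmean(ys))
    return statistics.pvariance(cond_means) / statistics.pvariance(all_y)

# additive toy model Y = x0 + 2*x1: x1 contributes 4x the variance of x0,
# so in theory S0 = 0.2 and S1 = 0.8
toy = lambda x: x[0] + 2.0 * x[1]
s0 = first_order_sobol(toy, 0, 2)
s1 = first_order_sobol(toy, 1, 2)
```

Because the indices are variance fractions, they sum to roughly 1 for an additive model, which is what makes the percentage interpretation intuitive.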


Parameters

Inputs to Simulation


Parameterization

• Parameters are variables in the simulation model that will be adjusted when creating new datasets
  • E.g. porosity, permeability, etc.
• A template dataset is generated which identifies the parameters that are adjusted
• Master Dataset (.cmm)
  • Almost identical to a normal simulation dataset except CMOST keywords are added to identify where parameter values should be changed


Master Dataset

A master dataset can be created in multiple ways:
• CEDIT (CMG Text Editor)
• Builder



Parameterization of Simulation Model (Builder)

[Builder screenshot] Select a dataset section, select a parameter from that section, review the list of parameters, then export the master dataset (template file).

Parameterization of Simulation Model (CEDIT)


Dataset Template
• Complementary to Builder
• Create CMOST parameters
• Better syntax highlighting
– Highlight CMOST parameters
– Fold no-need-to-see sections
• Easy navigation
– Different sections of the dataset
– Navigate CMOST parameters
• Handle include files
– Create/extract include files
– View include files
– Parameterize include files



Master Dataset Syntax


Original Dataset:
POR CON 0.20

Master Dataset:
POR CON <cmost>this[0.20]=Porosity</cmost>

Anatomy of the tag:
• POR CON: simulator keywords
• <cmost> … </cmost>: CMOST start and end markers
• this[0.20]: original (default) value in the dataset
• Porosity: CMOST variable name

No spaces in variable names; the CMOST portion is case sensitive.
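The substitution step could be re-implemented as follows. This regex-based sketch is hypothetical; it only mirrors the tag shape shown above and is not CMG's parser.

```python
import re

# Hypothetical re-implementation of the substitution step: replace each
# <cmost>this[default]=Name</cmost> tag with the chosen value for Name,
# falling back to the bracketed default when no value is supplied.
TAG = re.compile(r"<cmost>this\[([^\]]*)\]=(\w+)</cmost>")

def substitute(master_text, values):
    def repl(match):
        default, name = match.group(1), match.group(2)
        return str(values.get(name, default))
    return TAG.sub(repl, master_text)

master = "POR CON <cmost>this[0.20]=Porosity</cmost>"
dataset = substitute(master, {"Porosity": 0.27})  # new trial dataset
```

Note the variable name is matched with `\w+` (no spaces), consistent with the naming rule on the slide.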

Include File Substitution

Parameterize sections of the dataset by substituting include files (e.g. realizations R1, R2, R3):
• Geological realizations
• Relative permeability tables
• Other tabular input (viscosity vs. temperature, compaction tables, etc.)


Dependent Parameters Using Formula

• Syntax highlighting
• Shows what variables are available to be used to create
formulas
• Test and check the formula anytime


Pre-Simulation Commands

• Passes the dataset to a separate application before submitting to the simulator
• Run CMG Builder Silently
• Run SKUA-GOCAD Silently
• Run Petrel Silently
• Run User Defined Command
• Can be used to create new geostatistical realizations,
recalculate formulas in Builder, recalculate rel. perm.
curves, etc.



Coupling with Geological Software

[Diagram: a geological model coupled to the simulation model]

Objective Functions

Outputs from Simulation


Objective Functions

• An Objective Function (OF) is something (an expression or a single quantity) for which you wish to achieve some goal
• Usually this goal is to achieve a minimum or maximum value
• In the case of history matching, one usually wishes to minimize the error between field data and simulation results
• In the case of optimization, one usually wishes to maximize something like NPV


CMOST Objective Functions

Basic Simulation Results


• Values directly taken from simulation results (no modification)
• User Defined Formulas (taken from Fundamental Data)
• Time Durations

History match error


• Percentage relative error between simulation and measured results
• Perfect match: 0%

Net Present Value


• Simplified NPV calculation
• Incorporates discount (interest) rate to evaluate cash flows w.r.t. time


Objective Functions
Characteristic Date Times
• Identifies dates to evaluate objective functions
• Specific Dates (fixed)
• Date where maximum or minimum value is found (dynamic)
• Date when value surpasses a specified criterion (dynamic)

Advanced Objective Functions


• User defined objective function based on formula or code
• JScript or Python
• Excel
• User Defined Executable (e.g. Matlab)

Soft Constraints
• Re-evaluates objective functions based on simulation results

Use Excel Spreadsheet

After each simulation is done:
• CMOST writes parameter values and simulation results to Excel cells
• Excel calculates the objective function using a formula or VBA code
• CMOST reads the value back and uses it as the objective function value

[Chart: production volumes (oil, water, steam injected), cash flow and discounted cash flow vs. year, with price assumptions for bitumen price, steam cost and water treatment in $/bbl]


Control Centre
Running CMOST


Engine Settings

• Defines the task type (Study Type → Engine)
  • Task type can be modified from what was originally selected when creating the study
• Any other options related to the engine can be modified from this page


Simulation Settings
• Simulation related settings:
• Schedulers (Local, Cluster, Cloud)
• Simulator version
• Number of CPUs per job
• Maximum simulation run time
• Job record and file management
• Data I/O Cleanup


Submit Simulations to Compute Cluster


• Cloud
• CMG Scheduler
• Microsoft HPC
• IBM Platform LSF
• Oracle Grid Engine
• Portable Batch System (PBS Pro)



Experiments Table

Toolbar actions:
• Refresh
• Create new experiments (experimental design or user defined)
• Export to Excel
• Table configuration (select columns, filter data)
• Reprocess
• Check quality (experimental design quality)
• Open log file
• Open dataset in CEDIT or Builder
• Open SR3 in Results (interactive data visualization tool)

List of experiments:
• Raw data from simulation results
• Parameter values used
• Objective function results
• Able to sort and filter results
• Set ratings for highlighting
• Miscellaneous data (status, file path, etc.)

Now Hands on Work!!!

Any Questions?


History Matching and Optimization


History Matching Goals

• In history matching, we are trying to reduce the error between the simulation results and field-measured data
• By matching the simulation model to the historical behaviour, we have more confidence that the model will be able to predict future behaviour
• When creating a simulation model, there may be uncertainty in the input parameters; these are the parameters that should be adjusted when history matching



History Matching Process

• Select parameters to analyze


• E.g. porosity, permeability
• Select range of values to analyze
• E.g. between 20-30% porosity
• Select results (Objective Functions) to match
• E.g. Cumulative Oil
• CMOST will search for the best combination of parameter
values that will give the lowest history match error


Objective Function Hierarchy

• Objective Functions can be defined for each well


• Global Weighted Average will determine optimal case

[Hierarchy diagram]
Total Error
• Well 1 Error, Well 2 Error, Well 3 Error
• Each well error is made up of term errors such as oil production error, water production error, gas production error and bottom-hole pressure error


Calculating History Match Error


$\sqrt{\dfrac{\sum_{t=1}^{N_t}\left(Y_t^{\,s} - Y_t^{\,m}\right)^2}{N_t}}$

where $Y_t^{\,s}$ is the simulated value, $Y_t^{\,m}$ the measured value and $N_t$ the number of measurements.

• For each measured data point, calculate the difference between the simulated and measured result
• Square the terms to make them positive
• Sum up all of the points at all times
• Divide by the number of measurements to get the average square
• Take the square root to get the average error
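The un-normalized error above is a plain root-mean-square mismatch, e.g.:

```python
import math

def rms_error(simulated, measured):
    """Root-mean-square mismatch between simulated and measured series."""
    assert len(simulated) == len(measured)
    sq = [(s - m) ** 2 for s, m in zip(simulated, measured)]
    return math.sqrt(sum(sq) / len(sq))

measured = [100.0, 110.0, 125.0, 140.0]   # field rates (made-up numbers)
simulated = [102.0, 108.0, 125.0, 144.0]  # model output at the same dates
err = rms_error(simulated, measured)
```

A perfect history match drives this value to zero.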

Calculating History Match Error


$\text{TermError}_j = \dfrac{\sqrt{\dfrac{\sum_{t=1}^{N_t(j)}\left(Y_{j,t}^{\,s} - Y_{j,t}^{\,m}\right)^2}{N_t(j)}}}{\Delta Y_j^{\,m} + 4\,\text{Merr}_j}$

where $\Delta Y_j^{\,m}$ is the maximum difference of the measured values and $\text{Merr}_j$ the measurement error.

• The error is normalized so that terms with different units can be compared
• This is done by dividing by the maximum difference of the measured values
• Measurement error can also be included
  • Merr represents 1 standard deviation from the mean
  • The factor of 4 is used to include a 95% confidence interval (2 standard deviations on either side of the mean)


Calculating History Match Error

$Q_i = 100\% \times \dfrac{\sum_{j=1}^{N(i)} tw_{i,j}\,\text{TermError}_{i,j}}{\sum_{j=1}^{N(i)} tw_{i,j}}$

$Q_{\text{global}} = \dfrac{\sum_{i=1}^{N_w} w_i\,Q_i}{\sum_{i=1}^{N_w} w_i}$

• Each local objective function is made up of a weighted arithmetic average of its term errors
• The global objective function is made up of a weighted arithmetic average of the local objective functions
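A sketch of the full roll-up, using made-up well data and weights: each term error is the RMS mismatch normalized by the measured spread plus 4·Merr, then weighted averages are taken at the well and global levels.

```python
import math

def term_error(sim, meas, merr=0.0):
    """Normalized history-match term: RMS mismatch divided by
    (max spread of measured data + 4 * measurement error)."""
    n = len(meas)
    rms = math.sqrt(sum((s - m) ** 2 for s, m in zip(sim, meas)) / n)
    return rms / (max(meas) - min(meas) + 4.0 * merr)

def weighted_avg(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# local objective per well = weighted average of its terms (in %)
well1 = 100.0 * weighted_avg([term_error([10.0, 20.0], [12.0, 18.0]),
                              term_error([5.0, 9.0], [5.0, 10.0])],
                             [1.0, 1.0])
well2 = 100.0 * weighted_avg([term_error([1.0, 3.0], [1.0, 2.0])],
                             [1.0])
# global objective = weighted average of the locals (well 1 doubly weighted)
q_global = weighted_avg([well1, well2], [2.0, 1.0])
```

Because every term is a dimensionless percentage, oil rates, pressures and water cuts can all be combined in one global error.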

Field Data Weighting

• Able to weight each measured data point individually


• Remove or reduce weight of outliers



Optimization Methods

• CMG Bayesian Engine (HM Only)


• CMG DECE (Designed Evolution, Controlled Exploration)
• Particle Swarm Optimization
• Latin Hypercube plus Proxy Optimization
• Differential Evolution
• Random Brute Force Search


Optimization Philosophy

Mathematical optimization
• Mathematicians are particularly interested in finding the true absolute optimum.
• Optimum 0.000001 is much better than 0.01 even though it may take 20 extra days to
achieve the former.
Engineering optimization
• Engineers are more interested in quickly finding optima that are close to the true optimum.
• Optimum 0.01 is much better than 0.000001 if it takes 20 fewer days to achieve the former.
CMOST Optimization Philosophy
• Engineering optimization
• Not intended to solve pure mathematical problems



Engineering Optimization Methods

CMG DECE Optimization Algorithm



DECE Characteristics
• Handles continuous & discrete parameters
• Handles hard constraints
• Asynchronous – complete utilization of distributed computing power
• Fast and stable convergence


PSO Optimization Algorithm


• A population based stochastic optimization technique developed in 1995 by James
Kennedy and Russell Eberhart.
• Let particles move towards the best position in search space, remembering each local
(particle’s) best known position and global (swarm’s) best known position.
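A minimal PSO sketch of that update rule, on a toy objective (CMOST's implementation and defaults will differ):

```python
import random

def pso_minimize(f, bounds, n_particles=20, n_iters=60,
                 w=0.7, c1=1.5, c2=1.5, rng=random.Random(3)):
    """Minimal particle swarm: each particle keeps some inertia (w) and
    is pulled toward its own best-known position (c1) and the swarm's
    best-known position (c2)."""
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                # clamp positions to the search-space bounds
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# toy quadratic objective with its optimum at (2, -1)
sphere = lambda x: (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2
best, best_val = pso_minimize(sphere, [(-5.0, 5.0), (-5.0, 5.0)])
```

In practice `f` would be a full simulation run, which is why the swarm's asynchronous evaluation across distributed hardware matters.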



Latin Hypercube plus Proxy Optimization Algorithm

1. Generate an initial Latin hypercube design
2. Run simulations using the design
3. Get the initial set of training data
4. Build a proxy model using the training data (polynomial, RBF neural network, or multilayer neural network)
5. Find possible optimum solutions using the proxy
6. Run simulations using these possible solutions
7. If the stop criteria are not satisfied, add the validated solutions to the training data and return to step 4; otherwise stop

Latin Hypercube plus Proxy Optimization Algorithm


32
2020-04-20

Proxy Optimization

[Plots: initial Latin hypercube design, then optimization using the proxy]

Which Optimizer Should I Use?



Now Hands on Work!!!

Any Questions?


Optimization Goals
• History matching and optimization are very similar in that in each case one would like to find the maximum or minimum of an objective function
• In history matching, we are trying to reduce the error between the
simulation results and field measured data
• With optimization, we are trying to improve an objective function
• Find maximum NPV
• Find maximum recovery
• Etc.
• Typically with optimization, the parameters that will be adjusted are
operational parameters as opposed to reservoir parameters when
history matching



Optimization Process
• Select parameters to analyze
• E.g. Injection rate, well spacing
• Select range of values to analyze
• E.g. between 200-500 bbl/day injection rate
• Select results (Objective Functions) to improve
• E.g. NPV, recovery factor
• CMOST will search for the best combination of parameter values that
will maximize your objective function
• In some cases we may want to minimize an objective function such
as when looking at run times during numerical tuning


Calculating Net Present Value (NPV)

• Net Present Value (NPV) is often used as an economic


indicator to evaluate the value of a project

• A discount rate (I) is used to incorporate the time value


of money
• Money now is worth more than money later



Calculating Net Present Value (NPV)


• In CMOST, the cash flow is always calculated based on the period
that has been selected.
• Daily
• Monthly
• Quarterly
• Yearly
• Yearly discount rate will be converted to the period of interest
• E.g. Daily:

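The period conversion and discounting can be sketched as follows, assuming the standard compound-interest relation (1 + I) = (1 + i_p)^periods and cash flows occurring at the end of each period (a simplified illustration, not CMOST's exact NPV engine):

```python
# Simplified NPV sketch: the yearly discount rate is converted to the
# selected cash-flow period, then each period's cash flow is discounted
# back to time zero.

def periodic_rate(yearly_rate, periods_per_year):
    # (1 + i_p)^periods_per_year = 1 + I  =>  i_p = (1 + I)^(1/p) - 1
    return (1.0 + yearly_rate) ** (1.0 / periods_per_year) - 1.0

def npv(cash_flows, yearly_rate, periods_per_year=1):
    i = periodic_rate(yearly_rate, periods_per_year)
    return sum(cf / (1.0 + i) ** t
               for t, cf in enumerate(cash_flows, start=1))

# made-up yearly cash flows: -1000 invested, then 600 per year, at 10%
value = npv([-1000.0, 600.0, 600.0], 0.10)
```

With `periods_per_year=365` the same function handles the daily case mentioned above; money received later is discounted more heavily, capturing the time value of money.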

Robust Optimization


Traditional Nominal Optimization

Are we “precisely wrong?”

Robust Optimization Workflow for a Brown Field in CMOST

1. Sensitivity analysis
2. History match using the Bayesian engine → 100 history-matched models
3. Probabilistic forecast → P10, P30, P50, P70, P90 models
4. Robust optimization using the P10-P90 models → optimal operating conditions
5. Uncertainty assessment → probabilistic forecast of the optimized model


Results: Optimal Wells Locations


[Figures: optimal well locations for the base case and several realizations; base case histogram]


Base Case vs. Nominal

Base Case vs. Nominal & Robust Optimizations


Now Hands on Work!!!

Any Questions?


Uncertainty Assessment



Uncertainty Assessment
• Analysis carried out to determine the likely variation in simulation results due to
uncertainty, in particular, of reservoir variables
• Even when a history match has been conducted there may still be alternate
inputs that could achieve an equally good match

Greenfield Forecast
• Constrained by: rock measurements, fluid measurements, etc.
• Any combination of properties is possible

Brownfield Forecast
• Constrained by: rock measurements, fluid measurements, historical production & injection rates, etc.
• Limited combinations of reservoir properties will match the historical rates

Greenfield Forecast - Monte Carlo Simulation

• Input probability distributions derived from experience (prior probability density functions)
  • [Plots: probability distributions for input parameters, e.g. DWOC, SORG, SORW]
• Pick random values (that follow the input distributions) and calculate the objective function, e.g. NPV = F(DWOC, SORG, SORW, …)
• Repeat for thousands of iterations
  • [Histogram: resulting net present value distribution (M$)]
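The Monte Carlo loop above can be sketched as follows; the input distributions and the NPV proxy formula here are invented placeholders, not CMOST's.

```python
import random

# Greenfield-style Monte Carlo sketch: draw inputs from assumed prior
# distributions, evaluate a cheap objective proxy, repeat many times,
# then read percentiles off the resulting output distribution.
rng = random.Random(11)

def sample_npv():
    dwoc = rng.uniform(390.0, 410.0)         # depth of water-oil contact
    sorw = rng.triangular(0.25, 0.35, 0.30)  # residual oil to water
    # made-up NPV proxy in M$, standing in for a full simulation
    return 50.0 + 0.1 * (410.0 - dwoc) - 100.0 * (sorw - 0.30)

npvs = sorted(sample_npv() for _ in range(5000))
p10, p50, p90 = (npvs[int(0.10 * 5000)], npvs[int(0.50 * 5000)],
                 npvs[int(0.90 * 5000)])
```

The P10/P50/P90 values summarize the output histogram exactly as in the slide's NPV distribution.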


Uncertainty Assessment


Prior Probability Density Functions

• Defines the likelihood of a parameter value being selected in Monte Carlo simulation for continuous parameters
• A probability for each value is defined for discrete parameter types
• Available distribution types: normal, lognormal, triangle, uniform, custom, discrete
• Prior probability distributions of geology-related uncertain parameters should be set according to the geological observations and interpretations:
  • core samples,
  • well logging/testing,
  • etc.
• If there is no existing data regarding the nature of the reservoir properties, uniform distributions can be assumed


Parameter Correlations

• Some parameters may be related to each other
  • E.g. porosity and permeability may correlate with each other
• The correlation coefficient defines how closely related parameters are to each other
  • A value of 1 means the parameters are directly related
  • A value of 0 means the parameters have no relation to each other
  • Negative correlations indicate one parameter tends to increase as the other tends to decrease
• [Scatter plots: sample correlation coefficients of 0.0, 0.25, 0.50 and 0.75]
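Correlated Monte Carlo samples can be generated with the 2×2 Cholesky trick, e.g. for a target correlation of 0.75. This is a sketch on standard normals; a full sampler would also map them back to the parameters' marginal distributions.

```python
import random, math

def correlated_normals(n, rho, rng=random.Random(5)):
    """Draw n pairs (z1, z2) of standard normals with correlation rho,
    via the 2x2 Cholesky factor: z2 = rho*z1 + sqrt(1 - rho^2)*e."""
    out = []
    for _ in range(n):
        z1, e = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        out.append((z1, rho * z1 + math.sqrt(1.0 - rho ** 2) * e))
    return out

def sample_corr(pairs):
    """Sample Pearson correlation coefficient of a list of pairs."""
    n = len(pairs)
    mx = sum(p[0] for p in pairs) / n
    my = sum(p[1] for p in pairs) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in pairs)
    sx = math.sqrt(sum((p[0] - mx) ** 2 for p in pairs))
    sy = math.sqrt(sum((p[1] - my) ** 2 for p in pairs))
    return sxy / (sx * sy)

# e.g. jointly sampled porosity/permeability drivers with rho = 0.75
pairs = correlated_normals(20000, 0.75)
```

With 20,000 samples the realized correlation lands close to the requested 0.75, reproducing the scatter-plot patterns shown on the slide.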

Probabilistic Forecasting


Probabilistic Forecasting (UA)

• Given residual uncertainties in the HM (or other) variables, what impact will those
uncertainties have on the NPV or other objective functions going forward?

• It is important to recognize that the HM process develops alternative realizations and that parameters which are part of these realizations cannot have their values changed independently and arbitrarily


Probabilistic Forecasts

• History matching (HM) is an inverse problem with non-unique solutions
  – Perfect HM ≠ Perfect Prediction
• Probabilistic forecasting reduces risk in making business decisions
  – Provides a range of possible outcomes
• [Plot: cumulative oil (bbl) forecast fan]


Probabilistic Forecasts
Deterministic forecasts may be misleading:
• Only provides one solution
• Ignores uncertainty

Probabilistic forecasts are preferred:
• Range of possibilities
• Quantification of risk

[Plots: deterministic vs. probabilistic cumulative oil (bbl) forecasts]

Now Hands on Work!!!

Any Questions?



Working With CMOST


Main CMOST Components

Base Files
• To begin a CMOST project, a completed simulation dataset (.dat)
along with its Simulation Results (.sr3) files are required
CMOST Project
• A CMOST Project is the main CMOST file that can contain
multiple related studies



CMOST File System

Project Name: SAGD_2D_UA

Project File: SAGD_2D_UA.cmp

Project Folder: SAGD_2D_UA.cmpd


Best practice: All files related to the project should
be stored in the project folder.


Main CMOST Components

CMOST Study
• A CMOST study contains all of the input information for
CMOST to run a particular type of task
• Information can be copied between studies
• Study types can be easily switched
• The new study type will use as much information from the
previous study type as possible



CMOST File System (cont’d 1)

Study Name: BoxBen

Study File: BoxBen.cms


Study File Auto Backup: BoxBen.bak

Study Folder: BoxBen.cmsd


Don’t modify/delete files in the study folder unless you know what you’re doing.

CMOST File System (cont’d 2)

Vector data repository file: *.vdr

VDR stores compressed simulation data required for objective function calculations

Subset of SR3 results

Never modify or delete vdr files manually



Licensing Multiplier

• CMOST uses only partial licenses when running simulations
  • E.g. run 2 STARS simulations while using only 1 STARS license
• Applies to other license types (Parallel, Dynagrid, etc.)
• License multipliers: IMEX 4:1, GEM 2:1, STARS 2:1

Further Assistance

Email: support@cmgl.ca

• Zip an entire project or selected studies


• Email or ftp the zip file to CMG



Diagnostic Zip for Support Request




CMG’s Vision
To be the leading developer and supplier of
dynamic reservoir technologies in the WORLD
