
Large Dimension Parametrization with Convolutional Variational Autoencoder: An Application in the History Matching of Channelized Geological Facies Models
Potratz, Julia, Canchumuni, Smith W. A., Castro, Jose David B., Emerick, Alexandre A.,
Pacheco, Marco Aurélio C.

Júlia Potratz
Rio de Janeiro - RJ - Brazil
jupotratz@gmail.com

July 1-4, 2020

Introduction to the Problem

Author Info ICCSA 2020 Online, July 1-4, 2020 2


Motivation

Schematic architecture of the convolutional variational autoencoder (CVAE).
Motivation

● The standard CVAE network requires a very large number of trainable parameters for large models.
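To make the parameter-count motivation concrete, the sketch below builds a minimal CVAE in Keras with the reparameterization trick. It is an illustrative assumption, not the authors' exact architecture: the layer widths, a 48 × 48 input (chosen so the strided convolutions divide evenly; the paper's cases are 45 × 45 and 100 × 100), and the number of convolutional stages are all placeholders. Only the latent dimension of 500 comes from the tuning slide.

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers


class Sampling(layers.Layer):
    """Reparameterization trick: z = mean + exp(0.5 * log_var) * eps."""

    def call(self, inputs):
        z_mean, z_log_var = inputs
        eps = tf.random.normal(tf.shape(z_mean))
        return z_mean + tf.exp(0.5 * z_log_var) * eps


def build_cvae(input_shape=(48, 48, 1), latent_dim=500):
    # Encoder: strided convolutions down to the latent Gaussian parameters.
    enc_in = keras.Input(shape=input_shape)
    x = layers.Conv2D(32, 3, strides=2, padding="same", activation="relu")(enc_in)
    x = layers.Conv2D(64, 3, strides=2, padding="same", activation="relu")(x)
    x = layers.Flatten()(x)
    z_mean = layers.Dense(latent_dim)(x)
    z_log_var = layers.Dense(latent_dim)(x)
    z = Sampling()([z_mean, z_log_var])
    encoder = keras.Model(enc_in, [z_mean, z_log_var, z])

    # Decoder: mirror of the encoder, mapping z back to a facies image.
    dec_in = keras.Input(shape=(latent_dim,))
    h, w = input_shape[0] // 4, input_shape[1] // 4
    y = layers.Dense(h * w * 64, activation="relu")(dec_in)
    y = layers.Reshape((h, w, 64))(y)
    y = layers.Conv2DTranspose(32, 3, strides=2, padding="same", activation="relu")(y)
    y = layers.Conv2DTranspose(1, 3, strides=2, padding="same", activation="sigmoid")(y)
    decoder = keras.Model(dec_in, y)
    return encoder, decoder
```

Note how the `Flatten`/`Dense` pair ties the parameter count to the input resolution: doubling the image side roughly quadruples the dense-layer parameters, which is the scaling problem the CVAE-IN and CVAE-DSC variants address.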



Adopted Methodology

● We present two innovative deep learning architectures based on the traditional CVAE network to work with large-dimension imagery.

● CVAE network incorporating the Inception module in its structure (CVAE-IN).

● CVAE network incorporating the depthwise separable convolution in its structure (CVAE-DSC).



CVAE with Inception Module

Schematic architecture of the convolutional variational autoencoder with Inception module (CVAE-IN).
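A minimal sketch of a GoogLeNet-style Inception block in Keras, to illustrate the building block the CVAE-IN variant incorporates. The branch widths and the pooling branch are assumptions for illustration; the slide does not specify the exact block configuration used in the paper.

```python
from tensorflow import keras
from tensorflow.keras import layers


def inception_block(x, filters):
    # Four parallel branches: 1x1, 1x1->3x3, 1x1->5x5, and pool->1x1,
    # concatenated along the channel axis. The 1x1 convolutions keep the
    # parameter count low while capturing features at multiple scales.
    b1 = layers.Conv2D(filters, 1, padding="same", activation="relu")(x)
    b2 = layers.Conv2D(filters, 1, padding="same", activation="relu")(x)
    b2 = layers.Conv2D(filters, 3, padding="same", activation="relu")(b2)
    b3 = layers.Conv2D(filters, 1, padding="same", activation="relu")(x)
    b3 = layers.Conv2D(filters, 5, padding="same", activation="relu")(b3)
    b4 = layers.MaxPooling2D(3, strides=1, padding="same")(x)
    b4 = layers.Conv2D(filters, 1, padding="same", activation="relu")(b4)
    return layers.Concatenate()([b1, b2, b3, b4])
```

Because every branch preserves the spatial size (`padding="same"`, pooling stride 1), the block can be dropped into an encoder at any resolution, which relates to the size-invariance claim for CVAE-IN in the conclusions.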



CVAE with Depthwise Separable
Convolution

Schematic architecture of the convolutional variational autoencoder with depthwise separable convolution (CVAE-DSC).
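The parameter savings from a depthwise separable convolution can be checked directly in Keras by comparing `Conv2D` against `SeparableConv2D` on the same input. The channel counts below (64 in, 128 out) are illustrative assumptions, not the paper's layer sizes.

```python
from tensorflow import keras
from tensorflow.keras import layers

inp = keras.Input(shape=(100, 100, 64))

# Standard convolution: 3*3*64*128 weights + 128 biases = 73,856 params.
std = keras.Model(inp, layers.Conv2D(128, 3, padding="same")(inp))

# Depthwise separable: a 3x3 depthwise pass (3*3*64 = 576 weights)
# followed by a 1x1 pointwise pass (64*128 = 8,192 weights) + 128 biases
# = 8,896 params, roughly an 8x reduction for the same output shape.
sep = keras.Model(inp, layers.SeparableConv2D(128, 3, padding="same")(inp))

print(std.count_params(), sep.count_params())
```

This per-layer reduction is why the conclusions note that the total parameter count of CVAE-DSC grows very little as the models get larger.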
Apply Decoder Networks with ES-MDA

● Apply the decoder architectures to history matching using the ES-MDA (Ensemble Smoother with Multiple Data Assimilation) algorithm.
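A minimal NumPy sketch of one ES-MDA analysis step, assuming the standard formulation with a diagonal measurement-error covariance and inflated observation perturbations. In the workflow above, the ensemble `M` would hold the latent vectors fed to the trained decoder, and `D` the forward-model predictions; the function names and shapes here are this sketch's own conventions, not the authors' code.

```python
import numpy as np


def es_mda_update(M, D, d_obs, Cd, alpha, rng):
    """One ES-MDA analysis step.

    M     : (Nm, Ne) ensemble of model parameters (e.g. latent vectors)
    D     : (Nd, Ne) predicted data from the forward model
    d_obs : (Nd,)    observed data
    Cd    : (Nd,)    diagonal of the measurement-error covariance
    alpha : inflation coefficient for this iteration
    """
    Ne = M.shape[1]
    dM = M - M.mean(axis=1, keepdims=True)
    dD = D - D.mean(axis=1, keepdims=True)
    Cmd = dM @ dD.T / (Ne - 1)          # cross-covariance params/data
    Cdd = dD @ dD.T / (Ne - 1)          # auto-covariance of predicted data
    # Perturb the observations with noise inflated by alpha.
    Duc = d_obs[:, None] + np.sqrt(alpha) * np.sqrt(Cd)[:, None] \
        * rng.standard_normal(D.shape)
    K = Cmd @ np.linalg.inv(Cdd + alpha * np.diag(Cd))
    return M + K @ (Duc - D)
```

Running this for Na iterations with coefficients satisfying sum(1/alpha_i) = 1 (e.g. alpha = Na at every step) recovers the multiple-data-assimilation scheme referenced in the results.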



Experiments and Results

● Dataset

○ Case 1: 45 × 45
○ Case 2: 100 × 100



Experimental Setup

● Computational resources
○ NVIDIA Tesla P100 GPU cluster
○ Implemented with the Keras framework

● Training mechanisms:
○ Early stopping
○ Learning-rate decay by a factor of 0.5
○ Adam method for stochastic optimization
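The training mechanisms listed above map directly onto standard Keras callbacks. This is a sketch, not the authors' configuration: only the 0.5 decay factor comes from the slide, while the monitored metric and patience values are assumptions.

```python
from tensorflow import keras

# Early stopping and plateau-based learning-rate decay (factor 0.5 as on
# the slide; patience values are assumed), trained with the Adam optimizer.
callbacks = [
    keras.callbacks.EarlyStopping(monitor="val_loss", patience=10,
                                  restore_best_weights=True),
    keras.callbacks.ReduceLROnPlateau(monitor="val_loss", factor=0.5,
                                      patience=5),
]
optimizer = keras.optimizers.Adam()
# model.compile(optimizer=optimizer, loss=...)
# model.fit(x_train, x_train, callbacks=callbacks, validation_split=0.1)
```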



Experimental Setup

● Dataset:



Experimental Setup

● Tuning the models

○ Latent vector dimension
 ■ best value = 500

○ Batch size used in training:

Cases    CVAE   CVAE-IN   CVAE-DSC
Case 1    24      20         32
Case 2    12      24         32



Results

● Number of trainable parameters



Results

● Reconstruction accuracy on the validation set

Cases    CVAE    CVAE-IN   CVAE-DSC
Case 1   96.7%   96.0%     96.2%
Case 2   95.4%   93.1%     95.1%



Results

● Reconstructed realizations for Case 1



Results

● Reconstructed realizations for Case 2



Results

● History Matching with ES-MDA

○ Ne = 200 realizations for all cases

○ Na = 4 data-assimilation iterations for Case 1

○ Na = 10 data-assimilation iterations for Case 2



Results

● Production Curves in Case 1 and Case 2



Conclusions

● The number of parameters of the CVAE-DSC alternative increases very little in comparison with the original CVAE.

● The proposed CVAE-IN architecture is invariant to the size of the reservoir model.

● The experiments showed that these modifications do not affect the capability of the network to preserve the geological realism of the models.



Thank you everyone!

