Kingdom® 2018
VelPAK User Guide
February 2018
© 2018 IHS Markit™. All Rights Reserved.
Trademarks and Copyright
This manual was produced by IHS Markit.
February 2018
IHS Markit Kingdom® software and all of its components, AVOPAK, CGMPAK, GeoSyn®,
LoadPAK, PAKnotes®, Petra®, SynPAK®, Tunnel L+, Tunnel O, VelPAK®, VuPAK®,
Kingdom 1D Forward Modeling, Kingdom Colored Inversion, The Kingdom Company,
Kingdom Data Management, Kingdom DM Catalog Builder, Kingdom Illuminator,
Kingdom Seeker, and Kingdom I3D Scan are trademarks of IHS Markit.
Portions of data loading are copyrighted by Blue Marble Geographics.
Mapping API for the Spatial Explorer map provided by Esri ArcGIS Runtime SDK for .NET.
Kingdom Geophysics contains components under U.S. Patent Numbers 6,675,102,
8,265,876, and 9,105,075.
VuPAK® includes OpenInventor® and VolumeViz from FEI Visualization Sciences Group,
Inc. Some components or processes may be licensed under U.S. Patent Number 6,765,570.
Tunnel L+ includes OpenWorks® and SeisWorks® Development Kit from the Halliburton
Corporation.
Kingdom Connect and Tunnel O include OpenSpirit® FrameWork from OpenSpirit, a
TIBCO Software Group Company. Kingdom Data Management includes components from
OpenSpirit that are copyrighted by OpenSpirit, a TIBCO Software Group Company.
Kingdom Gateway plug-in for the Petrel* E&P software platform uses the Ocean* software
development framework; * is a mark of Schlumberger.
Kingdom® 1D Forward Modeling includes software developed as part of the NPlot library
project available from: http://www.nplot.com/.
Portions of Kingdom® bitmap graphics are based on GD library by Boutell.Com, Inc. Further
information about the company can be found at www.boutell.com.
PAKnotes TIFF support is based in part on libtiff.
Copyright Notice
© 2011 - 2018 IHS Markit. For Internal use only. All rights reserved.
This manual contains confidential information and trade secrets proprietary to IHS Markit Ltd.
and its affiliated companies (“IHS Markit”). No portion of this manual may be reproduced,
reused, distributed, transmitted, transcribed or stored on any information retrieval system, or
translated into any foreign language or any computer language in any form or by any means
whatsoever without the express written permission of IHS Markit. For more information,
please contact Customer Care at kingdom_support@ihsmarkit.com.
Misuse Disclaimer
IHS Markit makes no representations or warranties of any kind (whether express or implied)
with respect to this manual or the Kingdom® software, and to the extent permitted by law, IHS
Markit shall not be liable for any errors or omissions or any loss, damage or expense
incurred by a user. IHS Markit reserves the right to modify the Kingdom® software and any
of the associated user documentation at any time.
Acknowledgments
IHS Markit wishes to gratefully acknowledge the contributions of the many client software
testers in preparing the Kingdom® software, and greatly appreciates the invaluable feedback
and contributions of its enthusiastic Beta testers, smoke testers, amber testers, and staff.
Contents
VelPAK - Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Abort. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
Model Tree . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
Model Tree - Top Bar Icons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
Refresh Tree . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
Show Unused slots . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
Model Tree - the Data Elements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Model - Profile . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Model - Fault . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
Model - Surface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
Event Number . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
Model - Well. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
Model Tree - The Expanded Data Slots . . . . . . . . . . . . . . . . . . . . . . . . . 72
Model Tree - Cut/Copy/Paste/Delete . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
Preferences . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
Console . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171
Profile Displays . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 172
Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 253
Theory of Fault Allan Diagrams . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 253
What to do with an Allan Diagram . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 256
Summary of Steps for the successful creation of an Allan Diagram . . . . 258
Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 507
Quick Guide to the Curve Module . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 508
Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 555
The Optimizing process within VelPAK . . . . . . . . . . . . . . . . . . . . . . . . . . 555
Concepts of Optimizing in VelPAK. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 559
Types of Optimize . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 561
Fit Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 562
Calculation Details . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 562
Well_Grid_Fit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 563
Well_Line_Fit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 564
Well_Point_Fit. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 564
Layer_Grid_Fit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 565
Layer_Line_Fit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 565
Layer_Point_Fit. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 566
Residual Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 566
Well_Grid_Residual . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 566
Well_Line_Residual . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 567
Well_Point_Residual. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 567
Layer_Grid_Residual . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 567
Layer_Line_Residual . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 568
Layer_Point_Residual. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 568
Overlay displays available in the 2D Parameter Space . . . . . . . . . . 568
Grid Nodes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 568
Line Fit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 569
Line Residual . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 570
Point Fit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 571
Point Residual . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 572
Point User. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 573
Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 617
Pre-defined Workflows. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 618
The Nested Workflows. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 618
The Nested Workflow Hierarchy. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 619
Analyze Module:
Stochastic Modelling in VelPAK . . . . . . . . . . . . . . . . . . . . . . . . . 681
Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 681
P10, P50, P90 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 681
Parametric Variation and Sequential Gaussian Variation (SGS) . . . . . . . 682
Methods of Parametric Variation . . . . . . . . . . . . . . . . . . . . . . . . . . . . 682
Sequential Gaussian Simulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . 685
Glossary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 735
A . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 735
AOI or Area of Interest . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 735
B . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 736
Background Velocity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 736
BIN file . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 736
C . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 737
D . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 737
Datums . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 737
Depth conversion. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 737
Diagnostic mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 737
Dix Interval Velocity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 737
E. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 739
Event . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 739
F. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 740
Fly out . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 740
G . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 741
Grid . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 741
GridEvent. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 741
GridDesc . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 742
GridType . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 742
H . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 744
I . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 744
Interpretation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 744
Interval Velocity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 744
INDT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 744
Instantaneous Velocity. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 744
Isochron . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 744
Isopach . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 744
J . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 745
K . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 745
K . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 745
L. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 746
Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 746
M . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 747
Model Tree . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 747
Model Tree Slots . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 747
Module . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 747
N . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 748
O . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 748
Optimization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 748
P . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 749
Pods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 749
Profile Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 749
Property Grid. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 750
Pseudo Velocity. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 750
Q . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 751
R . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 751
RMS Velocity, Vrms. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 751
S . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 752
Seismic Velocity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 752
Stacking Velocity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 752
Surface Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 752
T . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 754
TKS Link . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 754
U . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 754
V . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 754
V0 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 754
VelPAK Model File. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 754
Velocity Gradient . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 754
Velocity Intercept. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 754
Velocity Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 755
Velocity Volume Generation (VVG) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 755
W . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 756
Well Velocity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 756
Mapping - Location: Standard navigation data read into VelPAK to produce a basemap. Input: Import ASCII file, or via seismic workstation link. Provides the seismic X,Y data for all other VelPAK functions.
Profile - Profile Faults: An intrinsic part of the Profile data. Input: Import ASCII file, create new profile faults inside VelPAK, or via seismic workstation link. Linking these to the Map faults is the basis of Allan Diagrams.
- Intersections between lines and misties between horizons can be displayed on your filled or
unfilled profile.
Base Maps
The Basic Map option displays on-screen a basic basemap stick plot of the map data within
the memory model, along with an XY grid around the map.
Faults and other types of flags taken from the snapped profiles can be marked onto
basemaps, and these flags can be viewed on any selected horizon on the standard
basemap.
Once these Fault Polygons are assigned a name, these polygons can be viewed in profile to
produce Allan Diagrams (in the Fault Module) of the horizons that run into and out of the fault.
Ribbon Maps
This will construct a polygon of color varying with the time/velocity/depth values that are
posted along a seismic line, for any selected horizon within your model.
You DO NOT need to SNAP your data before displaying them as a Ribbon Map, so this
can act as a very useful QC check on data that have just been read in, highlighting
inconsistencies in the general trend of the data.
Depth Conversion
Depth conversion of surface data within VelPAK, using a variety of methods or ‘Create-your-Own’ methods.
Process
Grid manipulation: multiplying, adding, etc., grids together or with constants to produce new
grids. A whole set of other processes is also available, including Grid-to-Grid depth
conversions and ‘Create-your-Own’ methods.
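The grid arithmetic described above can be sketched in a few lines. The sketch below is plain Python, not VelPAK code; the grid values, the two-way-time convention and the use of None for null nodes are illustrative assumptions:

```python
# Hypothetical grids on a common mesh, stored row by row:
# two-way time in milliseconds and average velocity in m/s.
# None marks a null (unmapped) grid node.
twt_ms = [[1200.0, 1250.0], [None, 1300.0]]
v_avg  = [[2000.0, 2050.0], [2100.0, 2100.0]]

def depth_convert(twt, vel):
    """Grid-to-grid depth conversion: depth = Vavg * (TWT / 2),
    converting two-way time from milliseconds to seconds first.
    Null nodes propagate to the output grid."""
    return [[None if t is None else v * (t / 1000.0) / 2.0
             for t, v in zip(trow, vrow)]
            for trow, vrow in zip(twt, vel)]

depth_m = depth_convert(twt_ms, v_avg)
print(depth_m[0][0])  # 1200.0 m for 1200 ms TWT at 2000 m/s
```

Multiplying a grid by a constant, or adding two grids node by node, follows the same pattern.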
Note: Fault Generation only generates further Map Faults on lower (or higher) horizons
using the fault pattern picked. Profile faults occurring only in lower horizons that do
not appear in the original horizon from which the Map Fault pattern was picked would
not appear as fault patterns, and therefore you would have to pick them manually.
[Figure: two fault patterns on FAULT1. Left: the Map Fault pattern picked by the user, using FU/FL information from the Profile. Right: the Map Fault pattern generated by Fault Gen using the Green Horizon fault pattern picked by the user. FU = Fault Upper Flag; FL = Fault Lower Flag.]
Trim
Allows you to Trim data in a model, using polygon(s) as control. Polygons are used as trim
control for data either within them or outside the polygons, as selected in the dialog.
Data trimmed in the model are not retrievable. It is recommended that if you trim data, you
use the SAVE AS option to save your trimmed data as a new model, keeping your original
data model intact.
The well log module is used to specify which tops correspond to the top and base of seismic
layers. The Well Curve is displayed in the centre of the display. To the right the thicknesses
are shown according to the grid isopachs generated between seismic layers.
To the left thicknesses are shown according to the Well Tops picked by you to define that
layer.
The Tie is a method to check how well the current depth conversion method ties to the well
data in the model.
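The tie can be thought of as a residual check at each well. The sketch below is illustrative Python only; the well names and depths are hypothetical, and this is not VelPAK's internal calculation:

```python
# Hypothetical depths (m) for one layer: depths predicted by the
# current depth conversion method vs. the picked well tops.
predicted = {"W-1": 1510.0, "W-2": 1482.0, "W-3": 1535.0}
well_tops = {"W-1": 1500.0, "W-2": 1490.0, "W-3": 1530.0}

# Tie residual per well: positive means the method predicts too deep.
residuals = {w: predicted[w] - well_tops[w] for w in predicted}

# An RMS residual summarizes how well the method ties all the wells.
rms = (sum(r * r for r in residuals.values()) / len(residuals)) ** 0.5
print(residuals, round(rms, 2))
```

A small RMS residual indicates that the current depth conversion method ties the well data closely.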
GO TO the Well Log Module Main Documentation
Many of the VelPAK processes can be turned into Workflows, which lay out step by step
the actions required to complete a particular task. Tasks that are cumbersome and/or
repetitive can be completed with a click of the mouse in Workflows, which guide you through
each process step by step, leaving very little room for error or confusion.
VelPAK makes extensive use of workflows to assist you.
GO TO the Workflow Main Documentation
[Figure: customized VelPAK window with all views docked in the main window]
If you close a design and reopen it, the document windows have the last layout that you used.
This means that the same windows are open, and they have the same sizes and positions.
Your favorite layout can be loaded and saved from the File drop-down.
If no layout has been defined, the views will initially be displayed in tab form; clicking a tab
will bring the required module to the front.
Note: Pop-outs need to be pinned before dragging is possible.
2. Continue to drag the item over the window; you will see docking icons appear:
3. As you drag the detached window over these docking icons, the area of the window will
become shaded, showing you where the window will be attached when you release the
mouse button.
[Figure: docked position when the window is held over the right-hand docking icon, and a floating window]
4. Unless you drop the window while the docking icons display a shaded zone showing
where the window will be docked, the window will become a floating one.
File
New - Will clear the current model allowing you to load a new set of data.
Open - the option for loading a VelPAK binary save file previously created from VelPAK.
Go to Input/Output from VelPAK for full details
Open TKS - the option for importing relevant data from your open TKS project.
Go to Kingdom Link input for full details.
Save/ Save As - the options to save your data as an internal VelPAK binary model. Save
will automatically save it as a previously defined named model, Save As will ask you to
give a new file name.
If the model contains SEG Y velocity volumes using "File> Save As..." will prompt if you
wish to copy those to the new model or not (press Cancel at the prompt and the SEGYs
will not be copied).
Note that if you have a project with a lot of data in it, for example grids and images, you
can turn on/off the data you wish to save from the Preferences tab of the Model Tree.
Import/Export - the method of reading data in or exporting data as ASCII files.
Go to Input/Output from VelPAK for full details.
Printing Options - standard Windows printing options available to you are utilized here.
See below for details.
Layout - allows you to load or save the layout you have designed for your VelPAK
windows. Default will restore the standard VelPAK window. Go to Layout - docking and
floating details.
Recent Files - A list of recently used VelPAK projects will appear here.
Printing
Print Page Set-Up
Use the Page Setup dialog box to determine the paper size, page orientation and margin
width.
To set or change the page setup:
• In the “Paper” panel, select the paper size and its source in the printer.
• Orientation:
- Portrait orientation places the top of the image against one of the page's narrow sides.
- Landscape orientation places the top of the image against one of the page's wide
sides.
• Enter the appropriate values in the margin boxes. The left and top margins indicate the
position on the page of the image’s left and top edges; the opposite for the right and
bottom.
• Click Printer to select a different printer.
• Click OK to close the dialog box and save the settings.
Print Preview
To the left is a thumbnail of the generated image. Drag the red window over this
thumbnail to move the main image, or use the standard zoom keys to view the generated
image in larger or smaller detail.
Standard ‘Windows’ Print page. Before using this dialog box, open the Print Setup dialog box
to select the paper size, page orientation and margin width.
• Select a printer in the Printer name box.
• Select print range if you have more than one image to print from the Print Preview.
• To print the image to a file instead of to the printer, select the “Print to file” check box.
Clicking Print opens the Print to File dialog box, where you can choose a location and
name for the new file.
• Set the number of copies you want to print.
• Click OK. VelPAK sends the file to the printer or opens the Print to File dialog box.
Note: Print, Print Set Up and Print Preview are also available from Icons on all the Module
tabs.
Edit
The Edit Delete options provide ‘multi-layers’ of delete, so that single or all elements can be
removed from the memory model.
Delete All
Lines, Faults, Surface, Wells
This option will delete ALL elements stored under the various types of data you select to
delete. Proceed with caution.
Delete Profile Faults/ All Event / Selected Event
Use this option to delete elements of your VelPAK model relating to the Profile data; that
is Faults and Horizons.
Delete Surface Data
Any type of Map data can be selectively deleted.
This includes XYZs, Faults, Polygons and Grids as well as all or some of the Surface
Location navigation lines.
For the navigation lines you need to select the lines you wish to delete from a basemap,
unless you want to delete all the navigation data. Go to Delete Options below.
All other types of MAP data refer to the slot or slots in the Model Tree under the horizon
where the data are stored.
Note: For deletion of individual items of data in slots within the Model Tree, use the Model
Tree delete options.
Delete Options
All
As this suggests, all data held under the selected type will be deleted.
Selected
Certain delete functions give you the option to delete selected lines. These are lines
selected from your basemap. Go to Select Lines.
Current
This deletion option refers to the line currently selected on the map, profile or from
within the Model Tree.
Unsnapped
For profile data and map location data this will delete all unsnapped lines.
No Interpretation
For Surface locations with no profile data attached
Delete Stacking Velocities
All
All stacking velocities in the model will be deleted.
Current
Delete Stacking Velocities on the currently selected line. This deletion option refers to
the line currently selected on the map, profile or from within the Model Tree.
Selected
Delete Stacking Velocities on certain, selected lines. These are lines selected from
your basemap. Go to Select Lines.
Unsnapped
Delete stacking velocities on all unsnapped lines.
No Interpretation
Delete stacking velocities for Surface locations with no profile data attached
Sort Model
Provides an option to either sort and display the lines within your VelPAK model
Numerically or Lexically within the line selection display. It is preferable to sort 2D lexically
but 3D numerically.
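The difference between the two orderings is easy to see in a small sketch (plain Python; the line names are hypothetical):

```python
# Hypothetical 2D line names (text) and 3D inline numbers.
lines_2d = ["BX-2", "AX-10", "AX-2"]
inlines_3d = [100, 20, 3]

# Lexical sort treats names as text - natural for 2D line names.
print(sorted(lines_2d))             # ['AX-10', 'AX-2', 'BX-2']

# Numeric sort - natural for 3D inline/crossline numbers.
print(sorted(inlines_3d))           # [3, 20, 100]

# Sorting the 3D numbers lexically would misorder them:
print(sorted(inlines_3d, key=str))  # [100, 20, 3]
```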
Reset Cache
Some items of VelPAK are stored within the Cache (grids for example). This option will
reset/clear all items from the cache memory.
Rebuild
On the rare occasions when you suspect that the VelPAK model you are working in has
been corrupted in some way, this option will rebuild the database.
This option would usually only be used after consulting with the Help Desk staff.
View
Opens the main Modules including the Workflow into the main VelPAK window.
Tools
Lever Analyse - The Analyse Utility launches an external program; it takes the data from the
Analyse module of VelPAK and performs sensitivity analysis/spider plotting - comparing the
relative importance of the results of multiple realizations in producing the volume data derived
from the Analysis module of VelPAK. An on-line help system is available from the Help drop
down within Analyse.
HiDef Tool - launches a new window that allows log-scale velocity detail to be applied to a
SEGY velocity volume whilst honouring the depth conversion velocity trends and geological
layering.
Scanit - Scanit is a visual, general purpose data formatting tool which allows you to reformat
your data files into any columnar flat file format you require.
An on-line help system is available from the Help drop down within Scanit.
Volume - Volume is a stand-alone program available from the Tools drop-down menu.
Working on volumes within TKS (possibly generated by VelPAK, but not necessarily), its main
usages are two-fold:
Time-Depth Curve text file production - This uses an Interval Velocity, Average Velocity
or RMS Velocity Volume loaded into TKS to produce Time-Depth curve values for
selected wells within the area of the volume. ASCII time-depth pairs are produced which
are then read into the TKS for the selected wells.
Velocity Volume text file production - This will extract velocities from a velocity volume
and produce an ASCII text file of the Time/Velocity pairs that can be read into VelPAK via
the Import - Profile Stack option.
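As a sketch of what time-depth curve production involves, the snippet below accumulates travel time down a column of interval velocities at a well location. It is illustrative Python only; the interval values are hypothetical and this is not the Volume tool's actual algorithm:

```python
# Hypothetical interval-velocity column at one well location:
# (top_depth_m, base_depth_m, interval_velocity_m_per_s)
intervals = [(0.0, 500.0, 1800.0),
             (500.0, 1200.0, 2200.0),
             (1200.0, 2000.0, 2600.0)]

# Accumulate one-way time through each interval and double it,
# giving two-way-time/depth pairs like those written to ASCII.
pairs = [(0.0, 0.0)]        # (TWT in ms, depth in m)
twt_ms = 0.0
for top, base, v_int in intervals:
    twt_ms += 2.0 * (base - top) / v_int * 1000.0
    pairs.append((round(twt_ms, 1), base))

print(pairs)  # first interval: 2 * 500 m / 1800 m/s = 555.6 ms at 500 m
```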
An on-line help system is available from the Help drop down within Volume.
Window
Shows you which modules of VelPAK you currently have open and allows you to switch
between them in a similar fashion to pressing the tab for each window.
Note: This only shows modules that are currently docked as opposed to floating.
Help
Brings up the VelPAK standard Windows Help system on either the Contents, Index or Search
page.
Explore takes the user to the directory where the PDF versions of the help files are stored.
StartUp will bring up the ‘What’s New’ start up screen that comes up when you first open
VelPAK.
About will show the version of VelPAK that you are using.
Display Types Icon: a drop-down selection of the display types.
All pop-ups and fly-outs can be made into fixed “Property Grids” windows by clicking the
‘drawing pin’ icon to the top right, in standard Windows fashion, and can be removed from
the main window. Property Grids hold the actions and crucial options relating to the profile
module.
Abort
Any process within VelPAK can be aborted by using the Abort button.
Model Tree
[Figure: the Model Tree expanded to show unused slots]
The Model tree displays all data within the model you are looking at. Each element can be
expanded in standard ‘Windows’ fashion. Use the Model tree (or the Surface & Slot selector)
to select the surface, line, profile, well, etc. that you wish to analyze, and the data within that
element.
You can use the Model Tree to delete specific elements from your model, used alongside
the main bank of delete options available from the Edit drop-down at the top of the main
VelPAK window.
Data Elements stored under the Model Tree such as Fault files, XYZ files, Grids can be
moved or copied into other slots within the Model Tree within the same horizon. Go to Model
Tree - Cut/Copy/Paste/Delete for details.
Double-clicking on certain elements of the Model Tree will automatically turn on the overlay
for that data: for XYZ data, for example, XYZ and XYZ Label overlays are turned on. Double
clicking on a grid name will automatically display the grid in "Shaded" mode.
The Model Tree can be removed from view by clicking on the Drawing pin to the top right. It
will sit as a tab at the left of the window ready to be opened up. It can also be made to be an
independent window by clicking on it and dragging the top bar away from the main VelPAK
window.
Refresh Tree
The tree should automatically update and refresh itself after most procedures; however, you
can manually update the Model Tree using this button.
[Figure: Model Tree with Show Unused slots toggled on]
If the same example is toggled so that the unused slots are hidden, the Model Tree becomes
considerably more manageable:
[Figure: Model Tree with Show Unused slots toggled off]
Model - Profile
Seismic Lines are stored under the Profile Data Element. You can scroll through the lines in
the Model Tree to select the Profile to be displayed in the Profile Module. Double-clicking on
a line name will display the profile in the Profile module window.
Note: If the profile has not been depth converted the ‘Display Type’ must be set to ‘Time’ for
the display to be shown.
The selected line will also be displayed on the Surface Module as the selected line (in blue).
• Random 2D lines are displayed as R:[Line Name]
• Inlines are displayed as I:[Line Name]
• Crosslines are displayed as X:[Line Name]
Note: Random 2D lines, inlines and/or crosslines with no interpretation will not populate the
Profile section of the Model Tree.
Model - Fault
Named Profile Faults are stored under the Fault Data Element. These are faults that have
been assigned a name, either from within VelPAK or from the external package that originally
produced the fault patterns.
Right-clicking on Faults in this part of the Model Tree will delete named surface faults but not
profile faults (use Edit -> Delete Profile Faults to delete fault sticks on a profile).
Within VelPAK, Profile Faults need to be named in order to select them for display as Allan
Diagrams.
Model - Surface
All data relating to Surfaces within VelPAK are stored here, under the relevant horizon
number.
Note: For Layers the convention is that these are always stored under the horizon at the
base of the layer.
Note: Double-clicking on an item in the project tree will automatically turn on the overlay for
that data, where appropriate.
Event Number
Up to 32 Events/Horizons can be utilized within a VelPAK project. Eight horizons will be
shown unless you are using the Model Tree in the Show Unused mode.
XYZ
All XYZ data are stored under these slots.
Fault
Surface Map Fault Polygons [not Profile Faults stored under the Fault Data Element]. These
can be read in via the Input/Output routines or manually digitized from the Profile flag pattern
on a Surface. Putting the fault patterns in different slots within the Model Tree allows you to
switch on and off between separate fault patterns.
Note: You cannot display more than one slot at a time on your surface; to ‘merge’
separate fault patterns you will need to export the data to ASCII, concatenate the files
and import the data into a new slot.
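As an illustration of this merge workaround (not a VelPAK function; the file names and sample contents below are hypothetical), the exported ASCII files can be concatenated with a short script before re-import:

```python
# Minimal sketch: merge two exported ASCII fault-pattern files by
# concatenating them before re-importing into a new slot.
# File names and sample contents are hypothetical.
def concatenate_ascii(inputs, output):
    with open(output, "w") as out:
        for path in inputs:
            with open(path) as src:
                out.write(src.read())

# Stand-in exports: each line is "X Y Z SEGMENT_NUMBER".
with open("fault_slot1.txt", "w") as f:
    f.write("1000.0 2000.0 0 1\n")
with open("fault_slot2.txt", "w") as f:
    f.write("3000.0 4000.0 0 1\n")

concatenate_ascii(["fault_slot1.txt", "fault_slot2.txt"], "faults_merged.txt")
```

The merged file can then be imported into a single slot via the usual Input routines.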
Polygon
Polygons - used in Trim, Analyse and Group to define areas for manipulation.
Polygons can be stored in any slot (Time, Depth, General01 etc.) within a certain event but
you cannot display polygons from more than one slot together on your surface.
More than one polygon area can be defined within the one slot within one event.
If polygons have been read in from your seismic workstation, a number of polygons can be
read in and stored within the same slot of your VelPAK model. These will retain the same
names as they have been assigned in the original project. Their names can be checked in
Polygon mode within the Surface module by selecting the polygon on screen using the
Polygon Segment Edit option.
Alternatively, if you are digitizing polygons on screen from within the Surface module of
VelPAK, you can name each polygon individually using the Polygon Segment Edit option. For
details of how to enter Surface Polygons manually, Go Here.
Note: To ‘merge’ polygons from different slots or events you will need to export the data to
ASCII, concatenate the file and import the data into your selected new slot.
Grid
All the grids produced from within VelPAK or Input into VelPAK from an external source are
stored here. Grids can be moved or copied into other slots within the Model Tree within the
same horizon event. Go Here for details of this.
Interpretation
Due to their nature, interpretation slots within the Model Tree are slightly different from the
rest of the Data Elements. The Time, Velocity and Depth interpretation data are stored under
these slots for each horizon within the model but you do not need to ‘activate’ them from here
in order to display the data on a Surface.
This is because routines which access the Interpretation data within VelPAK can use
Snapped, Un-snapped or Merged (a mixture of the two) data; the choice between these three
types is made in the Property Grids that relate to the features that use these data; for
example, in Gridding, where you can choose between the following as your input XYZ data:
Note: You can only Delete interpretations from these slots; a function also available from
the Edit drop-down menu, which gives you more control over the deletion routines.
Model - Well
A list of all the Wells within the model. Wells can be deleted from this list.
Toggle On
Further down, the slots named Average Velocity, Isochron, through Stack Interval Velocity
Smooth and Stack Debiased Velocity Smooth, through to Depth Conversion System 06, are
all named for use in various Depth Conversion processes. The standard supplied processes
within the Depth Conversion modules allow all these interim values (usually grids) to be
output into their relevant slots here while producing the final required Depth Grid.
For an example of a depth conversion that would use these slots Go Here.
While Grids and XYZ are the most common elements to be stored in these slots, there are
the same named slots available under Fault and Polygon data types.
Note: Profile and corresponding Line data can only be deleted using this technique, not
copied etc.
Major Data Types - Profiles, Fault, Surface and Well - can also be deleted via this method.
This will delete ALL the data elements stored under that Data Type, so proceed with caution,
and be aware that if there is a large amount of data stored under that Type then it could take
some time to delete.
Delete
Note: The Edit Delete option from the Command Drop Down menus at the top of the main
VelPAK window provide ‘multi-layers’ of delete, so that single or all elements can be
removed from the memory model. This may be a more appropriate method of deleting
data elements or types. Go Here for details.
The Surface&Slot selectors are directly linked to the Model Tree; they show and allow you to
select the various surfaces and data slots currently active within VelPAK.
Keeping the Surface&Slot selectors open and on display helps you to see at a glance what
slots are active and what surface you are currently working in.
Note: This information is also shown at the bottom right of the main VelPAK window.
The Previous / Current horizon event selection is not available under the Surface&Slot
selection; due to its nature it is only available as a feature of the Workflow Module within
VelPAK.
Properties
The Properties window displays the values for the data type and element that has been
selected in the Model Tree.
For example, selecting a Profile seismic line name in the Model Tree will display the min/max
values associated with it.
Selecting a Surface in the Model will display other relevant information associated with it.
The Properties window can be pinned out or removed from view by clicking on the drawing
pin to the top right. If removed it will sit as a tab at the left of the window ready to be opened
up. It can also be made to be an independent window by dragging the top bar.
Preferences
The Preferences tab is stored behind the Properties tab. It has two distinct options: Save and
Well.
Save
The Save part of the tab allows you to select which items from the VelPAK project you
wish to save when you save the project as another named file and project. For example, you
may have a number of images generated from a realisation run or a large SEGY file stored
within the project directory files. You can turn the various items on or off (Yes/No to save) so
that these are not saved in the new project.
You can manually copy them across to the correct directory if required at a later date.
Well
The Well part of the Preferences tab stored behind the Properties tab allows the user to
select what part of the well labels they wish to have displayed on map. Choose to turn on or
off the Name, Number or Unique Well Identifier as shown in the screen grabs below.
Console
The console shows you what action VelPAK is taking following each user button press.
In some modules, clicking the mouse button - either the left-hand or middle button - will bring
up certain values for the data element you have clicked on. For example, for Stacking
Velocities in the Surface module, pressing the middle mouse button on a velocity point will
give a display in the Console of all the values for that point:
Like all other fly-out windows in VelPAK the console window can be kept out on display by
‘pinning’ the drawing pin on the top right of the window. It can also be made to be an
independent window by dragging the top bar.
Hot Key
The following hot keys are available: z, i, o, p, u, d, l, r, f, t.
Zoom - Any map data displayed can be enlarged by selecting Zoom from the View Menu.
Zoom under the View Menu allows you to define a window, with the mouse, in which you
wish to zoom. This is done by holding the arrow cursor down at one corner of the area of
interest and dragging it over to the opposite corner of the area. If you wish to stop your
zoom operation drag around a small area only.
Zoom In - Selecting Zoom In from the View Menu enlarges the centre of the display by
50%. When you have ‘zoomed into’ an area you can zoom out by selecting Zoom Out or
Reset from the View Menu.
Zoom Out - Selecting Zoom Out from the View Menu decreases the centre of the display
by 50%.
Pan - On selecting this option, wherever the mouse arrow is pointing will become the
centre of the new display. Pan Up, Down, Right and Left are self-explanatory.
Refresh - Will refresh whatever is currently displayed in the window.
Reset - This will reset a zoomed display back to the original display parameters.
The color tables within VelPAK allow selection from standard VelPAK color bars, or the
import of a user's own color bars. Color palettes can also be created and saved.
Available within:
Surface >Display > Contour_Colour
where it sets the color bar for grid/horizon displays
Profile >Display > Velocity_Colour
where it sets the color bar for time or depth velocity fill options
Profile >Display > Seismic_Colour
where it sets the color bar for SEG Y seismic file display
3D >Display > Contour_Colour
where it sets the color bar for mesh contours.
Convert from IHS Kingdom - will read in tables from your Kingdom project and
store them locally in the VelPAK color table area.
Click on the color table bar to insert pegs which can then be used to set the color-range
according to the color palettes on the left.
This allows the user to depth convert layer by layer quickly and efficiently using a variety of
standard depth conversion methods. Most of these tasks generate standard velocity
functions or apply typical depth conversion methods to the selected event.
The Wizard Overview page opens in a new tab.
This tab may be docked and floated just like all the other tabs.
Note: Generally when running the Wizard it is a good idea to float it away from the other
windows, so you can interact efficiently with both the Wizard dialogs and the screens
it presents.
Extensive context help screens assist the user in understanding the methods available.
Wizards may be “replayed”, without further user interaction, to update the model should
minor changes be made to the model data.
For a detailed guide for individual methods selectable from the wizard refer to the extensive
Helps on each individual Wizard screen.
The Wizard screen starts with the Help screen open.
To start the Wizard running, click the Run button in the Run Wizard column for the event you
wish to use. If you are working in a new model, you would naturally run the wizard for the first
event, then again for the second event, etc.
To change the depth conversion method for an existing layer, you can go back into the set up
via the Run option for that event.
Note: It is important to remember that if you change the depth conversion function or
method for a given layer, all layers below that become invalidated and you will need to
re-run the Wizard for the deeper layers.
When you have set up a Run for an event, the final Wizard screen for that event saves the
run as a run file named after the event and the type of depth conversion method you have
just set up. This run file can be replayed by pressing the Review Wizard button and
selecting the run file that corresponds to the run you wish to recreate. This can be useful if
data in the model has changed (for example, well data added or deleted) but you wish to
assess the changes to the model using the same options as before.
Note: Be careful to review any regressions, optimisation results etc. Also be aware that you
would typically need to replay the wizard from top to bottom if data has changed. Only
the most recently run Wizard options are stored in the replay file.
General Concepts
The Wizard runs in parallel to the main application
The Wizard runs “in parallel” to the main application. This means that you can fully interact
with the application, even part way through the Wizard (assuming it isn’t doing any
processing, gridding etc.). This enables you to display, edit, correct or delete data mid-Wizard
and continue. Most Wizard pages feature an action button, typically labeled “Generate” or
“Apply” depending upon the context. If you edit data within the model, press the appropriate
action button to update the changes to the model, graphs etc.
If in doubt, simply rerun the Wizard from the beginning - most options are very quick to work
through.
Certain operations should not be done during Wizard execution unless you fully understand
the consequences to your model.
• Do not change the Surface number, either through the use of the icons, or setting
the Surface number selector. If you have pressed an action button in the Wizard to
generate results, you can change the Surface number to compare the results with a
different Surface, but ensure you change the Surface back before pressing the action or
Next button again.
• Do not change the layer definition. If this is required, exit the Wizard, re-do the layer
definition, then rerun the Wizard for all layers.
• If you use Save or Save As on the model mid-Wizard and then close and reload the model,
the Wizard state is not saved. Therefore you will need to rerun the Wizard for a given layer
from the start.
• Do not run a workflow mid-Wizard – results will not be guaranteed to be correct from
either the Wizard or Workflow.
• If you close the Wizard window, you will need to restart the Wizard from the beginning.
Note: If you go back a page, you may need to reset the parameters as they are not
always stored in the user interface.
Help information
Each Wizard page has a corresponding Help screen. It gives extensive hints and tips on what
the function or method is doing. Hyperlinks make navigation around the information efficient.
You can right-click on the help display and select Back to go back to a previous point as
required.
If you no longer wish to see the help, you can “collapse” the help screen down using the Help
icon to give you more space on the screen.
Windows layout
To make using the Wizard as efficient as possible, it’s a good idea to make a layout of the
main tools that makes window switching as “click-free” as possible. Layouts may be saved
and loaded using the File > Layout menu.
If you are unfamiliar with how the modules can be moved and docked, and how to save your
layout for future sessions, please refer to the Tutorial provided with the software (found in the
program installation folder doc\Tutorial subfolders).
Generally you will always need to see the Surface and Well module, so ensure these are
docked and visible at all times.
The Create button allows the user to create a Workflow from the run files currently used in the
Wizard. The Workflow module can be used to load the workflow (the default workflow name
is Wizard.xml) & run it. You can customise the workflow as required.
Using the Looped toggle option and the adjacent number field, you can create a “looped”
workflow that randomises the parameters according to the multi-realisation ranges selected
by the user when running the Wizard. This allows a range of models to be generated
(typically 500-1000) using slightly different parameters. The variation in the depth surfaces
(and volumetrics, if computed) allows the user to get some understanding of the stability of
the depth conversion and the P90/P50/P10 volumetrics distribution.
ASCII Load
Data can be loaded into VelPAK via space-delimited ASCII data files. As long as the data are
in the correct order per line, and are separated by spaces, then VelPAK will be able to read
them in.
If the data are not already in the VelPAK Generic Format there are two methods available to
the user to re-format the data to read it in. These two methods are by using Scanit or using
Filters.
Scanit
Scanit is a valuable, visual, general purpose data formatting stand-alone tool which allows
you to reformat your data files into any columnar flat file format you require. It is available
from the ‘Tools’ drop-down menu at the top of VelPAK.
Scanit will produce output ASCII files in the correct format for VelPAK which can then be read
into VelPAK via the Input routines.
This is a more modern, visual method of re-formatting data to load into VelPAK.
Filters
Various filters exist to import GeoQuest and Charisma and Landmark ASCII data in to
VelPAK.
If an appropriate FILTER does not exist you can write your own and store it for future use.
This is done in the Filter Tab of the ASCII load window; for details Go Here.
Using a Filter will re-format the input file as it reads the data into VelPAK. This is the older,
more complex version of reading data into VelPAK which relies on the user having some
knowledge of ‘gawk’ - an old unix programming language. Although most users would now
move to Scanit to re-format their files the Filters method has been left in VelPAK so that users
may still use filters previously written.
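As an illustration of the kind of re-formatting a filter performs (VelPAK filters themselves are gawk scripts; this Python sketch with a hypothetical input layout is for explanation only), a filter essentially re-orders and re-emits the columns of each input line:

```python
# Illustration only: the column re-ordering an import filter performs.
# The input layout "Line SP Time Vrms" and the target order are
# hypothetical examples, not a real VelPAK filter.
def reorder_columns(line, order):
    cols = line.split()
    return " ".join(cols[i] for i in order)

record = "EW-001 345 1200 2450"
print(reorder_columns(record, [0, 1, 3, 2]))  # -> "EW-001 345 2450 1200"
```

In a real gawk filter the same re-ordering would be a one-line print statement applied to every record of the file.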
VelPAK Binary models are read into VelPAK using the Top Line of VelPAK options: File -->
Open --> Model.
This loads a VelPAK Model that has been previously loaded, displayed and stored within
VelPAK.
Select the FILE option at the top of the VelPAK window and select Open.
A standard Windows File selector will open allowing you to scroll through your directories to
find your VelPAK Binary model.
The default Model File name extension is ‘.bin’. If your model is not named to this convention,
change the ‘Files of type’ extension window to show All Files (*.*) to find your model.
If you are opening a new project Merge or Replace will work in the same way.
If you are using Merge or Replace on an already created project it is important to select the
correct mode for Replacement or Merging of the data.
Replace
In Replace mode all data of a certain type (e.g. Wells) will be deleted and replaced with the
new selection. If you have 1000 wells in a project and then select one more well in Replace
mode, all wells will be deleted and you will be left with the one well you have just selected.
Merge
Merge mode will attempt to merge newly selected data with the data already within the
project. Due to the nature of the different data types Merge will work in different ways
dependent on the data. This is outlined in the table below:
Full details of the link are on the help pop-ups for each screen of the link.
Types of Data for selection and what you can do with them in VelPAK
Wells
Well locations, sonic logs or time-depth charts and well tops.
Surfaces
2D Seismic
2D volume slices can be read into VelPAK to produce navigation data in VelPAK for display
as a Surface basemap.
3D Seismic
3D information can be read in to display the extents of any 3D survey selected and will be
displayed on the Surface Map module and can be used within the Volume output for
generation of a new volume.
3D SEGY volumes are used for visual displays within the 3D Visualization module or as a
backdrop in the 2D profile module of VelPAK.
Note: Any transfer of SEGY volumes will take time!
Polygons
Once in VelPAK, polygons define required areas of the data surveys. Polygon control is not
used with the Link - data must be selected via polygons or other methods within TKS prior to
transfer.
Once selection has taken place through the Link, pressing Apply will proceed to process the
data to read into VelPAK.
Abort will stop the process if required.
The process will announce when it has finished reading the data in.
Press Exit to close the Link and VelPAK will have the project loaded.
It is recommended that you save the model immediately.
Note: If the data are not already in the VelPAK Generic Format there are two methods
available to the user to re-format the data to read it in. These two methods are by
using Scanit or using Filters.
File Set
Using the File Set option will allow the reading in of external files containing all the data in the
VelPAK project. This will have been produced by the File -> Export -> File Set option.
The data will default to be stored in the ‘export’ directory of the project directory from which
the data was exported. The default directory from importing data is the import directory of the
project directory for your current project but both of these directories can be changed using a
standard File Selector if required.
On selecting File Set and moving to the correct directory where all the data files and the index
file are stored, the index file name needs to be selected; as standard this will be called
“index.txt”. This index file provides the indexing such that the files will be read in and
labeled correctly and swiftly on import.
This means that data can be exported from VelPAK and imported into a new project
seamlessly without having to define each individual data type for import/export.
This process may take some time if there are a lot of data in the project.
Note: If the data are not already in the VelPAK Generic Format there are two methods
available to the user to re-format the data to read it in. These two methods are by
using Scanit or using Filters.
Select the required filter from the Filter Tab if required. Press OK to load the file.
For details of the Replace/Merge/Ignore option Go Here.
Note: The LINE data must be loaded before the stack data are loaded.
Stacking data comes from a variety of sources and it is very likely you will need to re-format
the data using Filters or Scanit to load the data into VelPAK. For further assistance on this
contact your local helpdesk.
Prefix - Lines within VelPAK brought in from elsewhere are prefixed with the Survey
Name. The Prefix option here will add the Survey Name to the Line Name of your
stacking velocity file. The drop down in the Prefix slot will display all Surveys for line
names currently held in the VelPAK project allowing you to select the relevant survey
name.
Note: If the data are not already in the VelPAK Generic Format there are two methods
available to the user to re-format the data to read it in. These two methods are by
using Scanit or using Filters.
Selecting the File -> Import -> Surface Location option will bring up the window to select
where the Surface location data file is stored. Bring up the file selector by pressing the ‘...’
to the right of the input window and move through the file directories to find your data file
to read in.
Select the required filter from the Filter Tab if required. Press OK to load the file.
In-lines (Lines) are prefixed I: .
X-lines (Traces) are prefixed X: .
Random lines are prefixed R: .
VelPAK does not deal with latitudes and longitudes. If these are in your original data file
they will be ignored.
For details of the Replace/Merge/Ignore option Go Here.
Note: If the data are not already in the VelPAK Generic Format there are two methods
available to the user to re-format the data to read it in. These two methods are by
using Scanit or using Filters.
From the slots tab that comes up on selecting which data you are to import, choose from
the list Time, Depth, Velocity, Error, Misc 01...and so on, for the branch slot in the Model
Tree you want the imported data to be stored in.
For example, if you have a Velocity Grid for Event 1 to be imported you will select Import -
> Surface Grid
The Load Event Grid window will be displayed. Select which slot you want the data to go
into from the Slots tab on display.
Note: For data files you wish to read in relating to an Event horizon within the VelPAK model
you can read in files for each horizon at the same time. Up to 32 horizons can be used
with a single VelPAK model.
When you have selected the file(s) to read in, press OK to load the file. Select the
required filter from the Filter Tab if required.
For details of the Replace/Merge/Ignore option Go Here.
Once the data are read into VelPAK the Model Tree will be populated accordingly.
Note: If the data are not already in the VelPAK Generic Format there are two methods
available to the user to re-format the data to read it in. These two methods are by
using Scanit or using Filters.
Select the required filter from the Filter Tab if required. Press OK to load the file.
Replace/Merge/Ignore
On all the import file load pages, against each input assigned there is the drop-down option to
Replace, Merge or Ignore the particular data file you have selected.
Replace
This is the default which is used when loading in new data or for replacing that already in
the memory model.
Replace will over-write any data in the project that has the same name or identity,
regardless of whether that data in the model has been modified from the original input file.
Merge
This option is used for adding other data into your model.
Merge does not replace or delete the data at all and must be used for separate areas of
data, otherwise you will get duplication of data in the model.
Ignore
Ignores or does not load the file.
Note: Line Navigation data must be input before the Stacking Velocities are input.
SurveyName – Survey name (this must match that previously loaded exactly).
Shot Point – Shotpoint in the 2d survey.
Time - Time of stack velocity curve in milliseconds.
RMS-Velocity - Max. Coherency Stacking Velocity (as picked on semblance plot).
Depth – optional column of depths (set to zero if not required).
Old (Dix) = 0
Average = 1
Interval = 2
RMS = 3
Unknown = 4
Inline – the Inline number. If this number is not prefixed with the survey name (e.g.
Galleon:1500), you must select the survey prefix on the ‘File >> Import >> Profile Stack…’
dialog. Go Here for further details.
Crossline – the crossline number.
Time - Time of stack velocity curve in milliseconds.
RMS-Velocity - Max. Coherency Stacking Velocity (as picked on semblance plot).
[Depth] – optional column of depths (set to zero if not required).
[Average-Velocity] – optional column of Average Velocities (if these are provided there is
no requirement to apply the Dix equation to the RMS Velocities).
X/Y - optional XY values.
Original Velocity - optional column for Original/Other velocity.
Type Code - optional column for Type Codes - as above.
All optional values should be 0 (zero) if not used.
Note: VelPAK does not deal with latitudes and longitudes. If these are in your original data
file they will be ignored.
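Where an Average-Velocity column is not supplied, interval velocities are conventionally derived from the RMS velocities using the Dix equation. A minimal sketch of that calculation follows (the function name and sample values are illustrative, not VelPAK's internal routine):

```python
import math

# Dix equation: interval velocity between two picks (t1, v1) and
# (t2, v2) of an RMS (stacking) velocity function.
# Times in ms, velocities in m/s; sample values are illustrative only.
def dix_interval_velocity(t1, v1, t2, v2):
    return math.sqrt((v2 * v2 * t2 - v1 * v1 * t1) / (t2 - t1))

# Interval velocity between picks at 1000 ms / 2000 m/s and
# 1500 ms / 2200 m/s (roughly 2553 m/s).
v_int = dix_interval_velocity(1000.0, 2000.0, 1500.0, 2200.0)
```

Note that the equation assumes t2 > t1 and a well-behaved (increasing) RMS function; noisy picks can make the term under the square root negative.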
X Y Z
X Y Z SEGMENT_NUMBER SEGMENT_NAME
X - X co-ordinate.
Y - Y co-ordinate.
Z - Z depth (not used on input - use 0).
SEGMENT_NUMBER - A number that denotes a single segment; a change of number
denotes a new segment.
SEGMENT_NAME - The name of the polygon (optional). If the data is from a TKS project
this will be the name as allocated in that project.
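The segment convention above - a change of SEGMENT_NUMBER denotes a new segment - can be sketched as a small parser (the sample records are hypothetical; this is not VelPAK code):

```python
# Group "X Y Z SEGMENT_NUMBER [SEGMENT_NAME]" records into polygons:
# a change of segment number denotes a new segment.
# Sample records are hypothetical.
def group_segments(lines):
    polygons, current, last_seg = [], [], None
    for line in lines:
        parts = line.split()
        x, y, z, seg = float(parts[0]), float(parts[1]), float(parts[2]), parts[3]
        if seg != last_seg and current:
            polygons.append(current)  # number changed: close the segment
            current = []
        current.append((x, y, z))
        last_seg = seg
    if current:
        polygons.append(current)
    return polygons

records = ["100 200 0 1", "110 210 0 1", "300 400 0 2", "310 410 0 2"]
polys = group_segments(records)  # two polygons of two points each
```

The same grouping logic applies to surface fault polygons, which share this segment format.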
Note: This option will only accept ‘Unix’ CPS1 Binary formats (not ‘PC’ based). This option
will accept grids previously output from software packages VelPAK and Cubit.
WELLNAME TOPNAME Z
Note: If the file is a Deviation file then the Time slots must be INDT and not zero values.
Note: HORIZONTAL DRILLED wells must be entered as deviated wells with DEV in the
appropriate column. This format allows for reversal of time and/or depth within the
data file.
ASCII Export
There are a number of ASCII formats available to output data from VelPAK.
The EXPORT options are laid out in the same form as all data are stored and viewed within
VelPAK, and are therefore split into Profile, Surface, Well and Curve types of export.
The Save and Save As options from the drop down are only used to save a binary VelPAK
model.
All other forms of output are accessed via the EXPORT drop down option.
The VelPAK binary save model is an internal format that cannot be used within any other
packages. However, it is the simplest way of saving your data while working within VelPAK.
This method of save allows you to quickly read the data back into your VelPAK memory
model.
Save
The Save Option will save the model as the file name that has previously been assigned
to the model. No warning message will come up to warn you that you are saving the
model over the previously saved file. If no file name has been assigned to the model yet,
the following message will appear.
ASCII Output
The EXPORT options are laid out in the same form as all data are stored and viewed within
VelPAK, and are therefore split into Profile, Surface, Well and Curve types of export.
File Set
Using the File Set option will produce external indexed data files containing all the data in the
VelPAK project.
The process produces a series of data files for all the different data types within VelPAK.
Along with these there will be an Index.txt file which provides the indexing such that these
files will be read in and labelled correctly on import.
The data will default to be stored in the ‘export’ directory of the project directory from which
the data was exported. This directory can be changed using a standard File Selector if
required.
This means that data can be exported from VelPAK and imported into a new project
seamlessly and swiftly without having to define each individual data type for import/export.
On selecting File Set, an index file name needs to be assigned; as standard this is “index.txt”.
This is the file that will then be selected on Import into another VelPAK project.
This process may take some time if there is a lot of data in the project.
Note: If you want your data exported in something other than the generic format for the
output you have specified you will need to use the filters provided under the Filter Tab
or write a new one to change the format. See Filters for full details. Alternatively you
can output the data in VelPAK generic format and edit the output file to the format you
require by using Scanit from the Tools drop-down menu.
Profile Faults
Faults can be output as Time, Velocity or Depth.
Use Filters to output data in a format other than VelPAK generic format.
X Y Z SEGMENT_NUMBER SEGMENT_NAME
X - X co-ordinate
Y - Y co-ordinate
Z - Z depth
SEGMENT_NUMBER - A number that denotes a single segment; a change of number
denotes a new segment.
SEGMENT_NAME - The name assigned to the surface fault within VelPAK.
Profile Events
Note: There are three types of Profile Event output available from the drop-down menu; this
is just one of them. The other two are Profile XYZ and Profile Cont. These output the
profile data in certain different ways to the Profile Event option. For details of the
styles of output and which one may be more suitable for you go to the relevant
sections.
Outputs the data in the VelPAK generic format. The data does NOT need to be snapped to
output it in this form.
One purpose of this is to be the inverse of the input; you would use this format to dump
your VelPAK model in ASCII and re-input.
However, it is also possible to use Filters with this export method so that data are
exported in a different format. Alternatively you can output the data in VelPAK generic
format and edit the output file to the format you require by using Scanit.
Horizons can be output as Time, Velocity or Depth. When one of these is selected, a
window will come up to enter where you wish the data files to go.
Since profile data are dependent on events, it is expected that you will enter an output
data file for each event you wish to export.
Unsnapped Profile
Snapped Profile
However, VelPAK generic format will output the real event data, so that in the example
above just the data for the two events shown will be output as two separate files, according
to the original event data.
Overhanging events will have the data output continuously along the event, so that there is a
possibility that one shotpoint could have multiple Z values. Therefore this output is not
appropriate for GRIDDING techniques. (Use XYZ output for Gridding.)
Type in the names you want as your output files.
Note: On the Event output you can press the SAVE/IGNORE widget to select which events you wish to output out of all the events in your model.
Select the directory you wish the data files to go into, and select a filter if required.
Profile XYZ
This produces an XYZ file in either Time, Velocity or Depth that allows a VelPAK model to be read into any suitable software package that can read ASCII files.
Note: Your data must be successfully snapped for this output style to work.
[Diagram: a reversed event split into layers labelled Time 1, Time 2, Time 3, Time 1]
In a six Z situation (multiple over-thrusting, shown below) the following result will be seen: Time 1 will start again as soon as it can, in order to produce one continuous layer that can be read into a mapping package (such as Equipoise Software’s XGEO) and displayed on a scaled, posted map.
[Diagram: a six-Z event split into layers labelled Time 1, Time 2, Time 3, Time 1, Time 2, Time 3, Time 1]
The FLAG field in the data file can be defined as follows:
FUR
FU
FL
FLR
T
P
Within the XYZ file, the MASK value is particularly interesting as it represents the geological
situation. This can be of great benefit to advanced computer mappers.
For full details of the MASK value Go Here.
[Diagram: event Z values labelled Z1, Z2, Z3, Z1]
Line Name  X          Y           Shotpoint  Z1       Z2     Z3     Masks  Event No.  Segment Name
EW-001     499225.91  6144965.50  939.00     345      1e+30  1e+30  4 1    2          R/EW-002/0
EW-001     499674.41  6144962.50  956.00     335      1e+30  1e+30  4 0    2          R/EW-002/0
EW-001     499836.41  6144958.00  962.00     338      1e+30  1e+30  4 0    2          R/EW-002/0
EW-001     499989.56  6144952.50  967.63     368.233  1e+30  1e+30  4 1    2          R/EW-002/0
Field Number  Field Name
1             LINE NAME
2             X
3             Y
4             SHOTPOINT
5             Z1
6             FLAG
7             Z2
8             FLAG
9             Z3
10            FLAG
11            MASK
12            MASK2
13            EVENT NUMBER
14            SEGMENT NAME
FORTRAN FORMAT - X, Y, Z
(A16, 3(1X, F10.2), 3(1X, F10.0, 1X, A4), 2(1X,Z8), 1X, I2, 1X, A32)
Indeterminate Z value = 1.0E+30
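As an illustration of that layout (a sketch using hypothetical example values, not real project data; note that Z8 is a hexadecimal descriptor, so the two MASK fields are written in hex), an equivalent awk printf might look like this:

```shell
# Sketch: emit one record in the layout
# (A16, 3(1X, F10.2), 3(1X, F10.0, 1X, A4), 2(1X, Z8), 1X, I2, 1X, A32).
# Values are hypothetical; the three FLAG fields are left blank here.
awk 'BEGIN {
  printf "%-16s %10.2f %10.2f %10.2f %10.0f %-4s %10.0f %-4s %10.0f %-4s %08x %08x %2d %-32s\n",
    "EW-001", 499225.91, 6144965.50, 939.00,
    345, "", 612, "", 890, "", 4, 1, 2, "SEG-NAME"
}'
```

Each conversion mirrors one FORTRAN descriptor: `%-16s` for A16, `%10.2f` for F10.2, `%10.0f` plus `%-4s` for each F10.0/A4 pair, `%08x` for Z8, `%2d` for I2 and `%-32s` for A32.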
Profile Cont
This produces an XYZ file that does not split up the horizons or events when encountering reversal zones. The continuous output file produced is not suitable for most mapping packages; its primary use is for volume modelling.
This format is detailed here.
Use the Filter Tab on output to provide the option of writing Awk filters to produce custom
output.
Note: Your data must be successfully snapped for this output style to work.
If the segment name has been included on input it will also be output.
The output file produced consists of only one value for X,Y and Z in all circumstances.
[Diagram: an overthrust event output as a single continuous layer, labelled Time 1 throughout]
For example, the overthrust illustrated above would produce a file with three Z values (Z1, Z2, Z3) if output as an XYZ file, but only one Z value if output as a continuous output file.
The continuous file produced contains the FLAGs and MASK values described for the XYZ output.
For full details of the MASK value Go Here
Line Name  X  Y  Shotpoint  Time  Velocity  Depth  Vup/Vdown  Flag/Masks/Event Number  Segment Name
EW-001 478918.41 6145025.50 177.00 801.5 2730 954.047 1e+30 1e+30 4 0 2 R/EW-001/2
EW-001 479180.41 6145024.50 187.00 809 2730 965.727 1e+30 1e+30 4 0 2 R/EW-001/2
EW-001 479392.50 6145026.50 195.00 821 2730 979.936 1e+30 1e+30 4 0 2 R/EW-001/2
EW-001 479633.59 6145028.50 204.00 832 2730 992.055 1e+30 1e+30 4 0 2 R/EW-001/2
EW-001 479977.00 6145026.50 217.00 829 2730 990.96 1e+30 1e+30 4 0 2 R/EW-001/2
EW-001 480295.31 6145025.00 229.00 835 2730 998.91 1e+30 1e+30 4 0 2 R/EW-001/2
EW-001 480622.00 6145024.00 241.00 846 2730 1013.92 1e+30 1e+30 4 0 2 R/EW-001/2
Field Number  Field Name
1             LINE NAME
2             X
3             Y
4             SHOTPOINT
5             TIME
6             VELOCITY
7             DEPTH
8             VUP
9             VDOWN
10            FLAG
11            MASK
12            MASK2
13            EVENT NUMBER
14            SEGMENT NAME
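As a sketch of the kind of custom Awk filter the Filter Tab supports (a hypothetical example, not one of the shipped filters; `cont.txt` is an assumed input file name), the following extracts X, Y and DEPTH (fields 2, 3 and 7) from the continuous output, dropping records with an indeterminate (1.0E+30) depth:

```shell
# Hypothetical custom filter for VelPAK continuous output:
# keep X ($2), Y ($3) and DEPTH ($7), skipping indeterminate depths.
awk '$7 < 1e29 { printf "%.2f %.2f %.3f\n", $2, $3, $7 }' cont.txt
```

The `$7 < 1e29` guard filters out the 1e+30 indeterminate-value sentinel before printing.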
Profile Stack
Outputs the data in a VelPAK GENERIC FORMAT style.
Use the Filter Tab on output to provide the option of writing Awk filters to produce custom
output. Alternatively you can output the data in VelPAK generic format and edit the output
file to the format you require by using Scanit.
Example of STACK DATA file output for this case:
Note: No filters are available through this option to save the data in any other format than
the one stated above.
Note: Only XYZ Export gives you the option to export values using a filter. The filters available for XYZ output are for 2DI and EarthVision formats.
Grid Data
Saves the internal grids generated in VelPAK as an external CPS1 Maps 15 format grid.
If you wish to use your internal grid in another package in a grid style other than CPS1, the recommended course of action is to turn the grid values into XYZs from the Model tree and output the XYZ data file for use in the other package. See Model Tree for full details of how to do this.
From the File > Export > Surface data drop-down menu, choose from the list (Time, Depth, Velocity, Error, Misc 01 and so on) according to the branch slot in the Model Tree where the data to be exported is sitting.
For example, if you want to export a Surface Velocity Grid for Event 1, select Export > Surface Grid. The Save Event Grid - Velocity window will be displayed, opening on the Slots tab, which allows you to select which grid slot you want to export; in this case you would select the Interval Velocity slot.
Then select the Files tab and select where you want the files to be stored; bring up the file selector by pressing the ‘...’ to the right of the export window to give an output file name to the grid to be output for that event from the Model Tree (in this example, stored within the velocity slot for that particular event).
Note: There are no pre-defined filters available for Well data output.
From the File > Export > Well data drop-down menu, choose the type of Well data you wish to output.
The relevant Save Well window will be displayed. Bring up the file selector by pressing
the ‘...’ to the right of the export window to give an output file name.
Select the required filter from the Filter Tab if required.
SAVE/IGNORE
Selecting IGNORE allows you to retain the save file name information in the slots, but to
ignore the output of data to these files.
Press OK to save the file.
For details of the Filters tab option that is used to change VelPAK generic format into output ASCII files of varying formats, Go Here. Alternatively, you can output the data in VelPAK generic format and edit the output file to the format you require by using Scanit.
WELLNAME X Y
WELLNAME TOPNAME Z
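For instance, a minimal hypothetical Awk filter (not a supplied one; `tops.csv` is an assumed input file name) converting a comma-separated well-tops list into the WELLNAME TOPNAME Z layout above could be:

```shell
# Hypothetical filter: "well,top,depth" CSV lines ->
# space-delimited "WELLNAME TOPNAME Z" records.
awk -F',' '{ printf "%s %s %.2f\n", $1, $2, $3 }' tops.csv
```

The `-F','` option sets the comma as the input field separator; the printf rewrites each record in the space-delimited layout.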
Note: Grids need to be in the TIME SLOT of the Model Tree for appropriate event for them
to be read.
Also note that Layers need to be defined in your model for this to work.
On deviated wells the values will be taken as shown in the diagram below.
[Diagram: a deviated well track crossing the tops Top Winterton, Rot Halite and Brockelschiefer. Where the time-depth curve lies on a defined layer, the layer will be labelled in the output file; where there is no layer for the time-depth curve, it is marked ‘NONE’.]
The input and output files are seen on screen together with a Pattern Window above the
Input Window and a Middle Window for extraction of subsets from the original file.
Typically, users would use the Pattern Window as the means of extracting required data from
the Input Window as a subset of data to be placed in the Middle Window. This subset would
be displaying the data in a way that allowed the extraction of columns of data to be processed
and displayed in the bottom Output Window.
Once the output file has been created to the satisfaction of the user, a template of the actions
just used on screen to produce the required format can be saved for future use.
For example, ASCII horizon data read out of GeoQuest in its M7 2D format can be passed through the relevant filter in VelPAK and turned into the correct VelPAK generic format to be read into VelPAK. This is shown in the example dialog above.
For a list of all the Filters provided with VelPAK, refer to the end of this chapter, Go Here.
1. Select the Files you wish to read in /read out of VelPAK on the Files Tab.
See ASCII Input Process and ASCII Export for details.
2. Select the Filters Tab.
3. Select the Filter you wish to use for the data from the Filter drop-down.
Use the down arrow to select from the available inputs.
Double-click to bring it into the Filters window.
4. The Filters code will be displayed in the Filters window. Press OK to run the files through
the filters selected either into VelPAK or out of it.
1. You can either write your own filter or, to modify a ready-made filter, select the filter you wish to modify from the Filter drop-down.
2. Refer to Writing your Filter for details of the language used to write the filters.
3. Load your input file to test the filter you have written by selecting the LOAD option next to
the Input window. A File Select window will come up allowing you to move to the directory
where your data are stored and select the relevant file.
4. Press TEST to test the filter you have written against the data you have loaded into the Input window. The Output window will either show you the file as it appears after being passed through the filter you have written, or you will see an error from within the filter.
5. If you are happy with the output from the filter you have written, use Save As to save the
filter with a suitable file name in the specific filter’s directory for the data you have written
the filter for.
6. You can then use this new filter as the selected filter on this tab for when you go back to
the Files Tab to select the data files you wish to read in/ read out using the new filter.
Note: Users of the VelPAK filter Gawk routines are reminded of the GNU General Public License Terms and Conditions of use.
Filter Libraries
Supplied with VelPAK filters are several gawk library routines which will be of use to anyone
creating their own filters or for those just trying to understand the filters supplied. The library
routines are located in the following file:
$TOPIXDIR/filters/LIBRARY
Routines to note are:
• CHRLEN - Returns the length of a character string without trailing spaces.
• CHRLEFT - Returns the character string minus the leading blanks.
• CHRFILL - Replaces blanks in a character string with underscores.
• CHRUNFILL - Replaces underscores in a character string with blanks.
• ABS - Gives the absolute value.
• INDIRECT - Performs a process prior to the filter execution.
For details of all the functions see the LIBRARY file.
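For orientation, the string helpers above could be sketched in awk along these lines (illustrative re-implementations under assumed semantics, not the shipped LIBRARY code):

```shell
# Illustrative awk re-implementations of the LIBRARY string helpers.
awk '
  function CHRLEN(s)    { sub(/ +$/, "", s); return length(s) }  # length without trailing spaces
  function CHRLEFT(s)   { sub(/^ +/, "", s); return s }          # strip leading blanks
  function CHRFILL(s)   { gsub(/ /, "_", s); return s }          # blanks -> underscores
  function CHRUNFILL(s) { gsub(/_/, " ", s); return s }          # underscores -> blanks
  function ABS(x)       { return x < 0 ? -x : x }                # absolute value
  BEGIN {
    print CHRLEN("abc   ")      # 3
    print CHRLEFT("  left")     # left
    print CHRFILL("a b c")      # a_b_c
    print CHRUNFILL("a_b_c")    # a b c
    print ABS(-7)               # 7
  }'
```

Because awk passes scalar function parameters by value, the `sub`/`gsub` calls modify only the local copy of each argument.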
DATA TYPE  FILTER NAME  DESCRIPTION
CURVE  TD_CURVE_CALIBRATED  Converts old ‘VelPAK’ format curve files into the correct format for VelPAK.
This polygon filter would not normally be used in the Filter Tab of the Input Polygon routine. It is the awk script used within the TKS Link to use polygons (previously defined in TKS) to draw specific Well information into VelPAK. Go Here for information about this in the TKS link.
STACK  VARIOUS  Examples of Stack descriptions (column data) within the filter file.
A number of filters exist for the many varied formats of Stacking Velocities. Please look at the different examples given if required.
It is recommended, however, that Scanit should usually be used to re-format Stacking Velocity data.
Data Type  Filter Name  Use
CONT  Profile_Cont_to_TKS_CUL
      Profile_Cont_to_TKS_CUL_PLUS_DEPTH_NO_F
covered only if its contents constitute a work based on the Program (independent of having been made
by running the Program). Whether that is true depends on what the Program does.
1. You may copy and distribute verbatim copies of the Program’s source code as you receive it, in any
medium, provided that you conspicuously and appropriately publish on each copy an appropriate
copyright notice and disclaimer of warranty; keep intact all the notices that refer to this License and to
the absence of any warranty; and give any other recipients of the Program a copy of this License along
with the Program.
You may charge a fee for the physical act of transferring a copy, and you may at your option offer
warranty protection in exchange for a fee.
2. You may modify your copy or copies of the Program or any portion of it, thus forming a work based
on the Program, and copy and distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:
a) You must cause the modified files to carry prominent notices stating that you changed the files and
the date of any change.
b) You must cause any work that you distribute or publish, that in whole or in part contains or is
derived from the Program or any part thereof, to be licensed as a whole at no charge to all third parties
under the terms of this License.
c) If the modified program normally reads commands interactively when run, you must cause it, when
started running for such interactive use in the most ordinary way, to print or display an announcement
including an appropriate copyright notice and a notice that there is no warranty (or else, saying that
you provide a warranty) and that users may redistribute the program under these conditions, and
telling the user how to view a copy of this License. (Exception: if the Program itself is interactive but
does not normally print such an announcement, your work based on the Program is not required to
print an announcement.)
These requirements apply to the modified work as a whole. If identifiable sections of that work are not
derived from the Program, and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those sections when you distribute them
as separate works. But when you distribute the same sections as part of a whole which is a work
based on the Program, the distribution of the whole must be on the terms of this License, whose
permissions for other licensees extend to the entire whole, and thus to each and every part regardless
of who wrote it.
Thus, it is not the intent of this section to claim rights or contest your rights to work written entirely by
you; rather, the intent is to exercise the right to control the distribution of derivative or collective works
based on the Program.
In addition, mere aggregation of another work not based on the Program with the Program (or with a
work based on the Program) on a volume of a storage or distribution medium does not bring the other
work under the scope of this License.
3. You may copy and distribute the Program (or a work based on it, under Section 2) in object code or
executable form under the terms of Sections 1 and 2 above provided that you also do one of the
following:
a) Accompany it with the complete corresponding machine-readable source code, which must be
distributed under the terms of Sections 1 and 2 above on a medium customarily used for software
interchange; or,
b) Accompany it with a written offer, valid for at least three years, to give any third party, for a charge
no more than your cost of physically performing source distribution, a complete machine-readable
copy of the corresponding source code, to be distributed under the terms of Sections 1 and 2 above on
a medium customarily used for software interchange; or,
c) Accompany it with the information you received as to the offer to distribute corresponding source
code. (This alternative is allowed only for noncommercial distribution and only if you received the
program in object code or executable form with such an offer, in accord with Subsection b above.)
The source code for a work means the preferred form of the work for making modifications to it. For an
executable work, complete source code means all the source code for all modules it contains, plus any
associated interface definition files, plus the scripts used to control compilation and installation of the
executable. However, as a special exception, the source code distributed need not include anything
that is normally distributed (in either source or binary form) with the major components (compiler,
kernel, and so on) of the operating system on which the executable runs, unless that component itself
accompanies the executable.
If distribution of executable or object code is made by offering access to copy from a designated place,
then offering equivalent access to copy the source code from the same place counts as distribution of
the source code, even though third parties are not compelled to copy the source along with the object
code.
4. You may not copy, modify, sublicense, or distribute the Program except as expressly provided
under this License. Any attempt otherwise to copy, modify, sublicense or distribute the Program is void,
and will automatically terminate your rights under this License. However, parties who have received
copies, or rights, from you under this License will not have their licenses terminated so long as such
parties remain in full compliance.
5. You are not required to accept this License, since you have not signed it. However, nothing else
grants you permission to modify or distribute the Program or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by modifying or distributing the Program
(or any work based on the Program), you indicate your acceptance of this License to do so, and all its
terms and conditions for copying, distributing or modifying the Program or works based on it.
6. Each time you redistribute the Program (or any work based on the Program), the recipient
automatically receives a license from the original licensor to copy, distribute or modify the Program
subject to these terms and conditions. You may not impose any further restrictions on the recipients’
exercise of the rights granted herein. You are not responsible for enforcing compliance by third parties
to this License.
7. If, as a consequence of a court judgment or allegation of patent infringement or for any other reason
(not limited to patent issues), conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of
this License. If you cannot distribute so as to satisfy simultaneously your obligations under this License
and any other pertinent obligations, then as a consequence you may not distribute the Program at all.
For example, if a patent license would not permit royalty-free redistribution of the Program by all those
who receive copies directly or indirectly through you, then the only way you could satisfy both it and
this License would be to refrain entirely from distribution of the Program.
If any portion of this section is held invalid or unenforceable under any particular circumstance, the
balance of the section is intended to apply and the section as a whole is intended to apply in other
circumstances.
It is not the purpose of this section to induce you to infringe any patents or other property right claims
or to contest validity of any such claims; this section has the sole purpose of protecting the integrity of
the free software distribution system, which is implemented by public license practices. Many people
have made generous contributions to the wide range of software distributed through that system in
reliance on consistent application of that system; it is up to the author/donor to decide if he or she is
willing to distribute software through any other system and a licensee cannot impose that choice.
This section is intended to make thoroughly clear what is believed to be a consequence of the rest of
this License.
8. If the distribution and/or use of the Program is restricted in certain countries either by patents or by
copyrighted interfaces, the original copyright holder who places the Program under this License may
add an explicit geographical distribution limitation excluding those countries, so that distribution is
permitted only in or among countries not thus excluded. In such case, this License incorporates the
limitation as if written in the body of this License.
9. The Free Software Foundation may publish revised and/or new versions of the General Public
License from time to time. Such new versions will be similar in spirit to the present version, but may
differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the Program specifies a version number of
this License which applies to it and “any later version”, you have the option of following the terms and
conditions either of that version or of any later version published by the Free Software Foundation. If
the Program does not specify a version number of this License, you may choose any version ever
published by the Free Software Foundation.
10. If you wish to incorporate parts of the Program into other free programs whose distribution
conditions are different, write to the author to ask for permission. For software which is copyrighted by
the Free Software Foundation, write to the Free Software Foundation; we sometimes make exceptions
for this. Our decision will be guided by the two goals of preserving the free status of all derivatives of
our free software and of promoting the sharing and reuse of software generally.
NO WARRANTY
11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR
THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO
THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM
PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR
CORRECTION.
12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL
ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO
LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU
OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.
END OF TERMS AND CONDITIONS
Appendix: How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest possible use to the public, the best
way to achieve this is to make it free software which everyone can redistribute and change under
these terms.
To do so, attach the following notices to the program. It is safest to attach them to the start of each
source file to most effectively convey the exclusion of warranty; and each file should have at least the
“copyright” line and a pointer to where the full notice is found.
<one line to give the program’s name and a brief idea of what it does.> Copyright (C) 19yy <name of
author>
This program is free software; you can redistribute it and/or modify it under the terms of the GNU
General Public License as published by the Free Software Foundation; either version 2 of the License,
or (at your option) any later version.
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without
even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See
the GNU General Public License for more details.
You should have received a copy of the GNU General Public License along with this program; if not,
write to the Free Software Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
Also add information on how to contact you by electronic and paper mail.
If the program is interactive, make it output a short notice like this when it starts in an interactive mode:
Gnomovision version 69, Copyright (C) 19yy name of author Gnomovision comes with ABSOLUTELY
NO WARRANTY; for details type `show w’. This is free software, and you are welcome to redistribute it
under certain conditions; type `show c’ for details.
The hypothetical commands `show w’ and `show c’ should show the appropriate parts of the General
Public License. Of course, the commands you use may be called something other than `show w’ and
`show c’; they could even be mouse-clicks or menu items--whatever suits your program.
You should also get your employer (if you work as a programmer) or your school, if any, to sign a
“copyright disclaimer” for the program, if necessary. Here is a sample; alter the names:
Yoyodyne, Inc., hereby disclaims all copyright interest in the program `Gnomovision’ (which makes
passes at compilers) written by James Hacker.
<signature of Ty Coon>, 1 April 1989 Ty Coon, President of Vice
This General Public License does not permit incorporating your program into proprietary programs. If
your program is a subroutine library, you may consider it more useful to permit linking proprietary
applications with the library. If this is what you want to do, use the GNU Library General Public License
instead of this License.
Note: If you load new well data into the model, or change existing well data, it will be necessary to set up the layer definition again in the import link.
Note: If you are using only stacking velocities and no well data, then layer definition is not
necessary. If you generate pseudo-wells from stacking velocities, pseudo-tops are
automatically created and added to the existing layer definition.
The following chapter outlines the advanced layer definition procedures that are available within VelPAK, in particular how to set up VelPAK to work with horizontal wells, where the track may well exit and re-enter the target horizon.
If you load new well data into the model, or change existing well data, it is necessary to
perform the advanced layer definition again if you are using it. Layer definition should always
be performed immediately after well data import to ensure model integrity.
If you are using only stacking velocities and no well data, then layer definition is not
necessary. If you generate pseudo-wells from stacking velocities, pseudo-tops are
automatically created and added to the existing layer definition.
Layer Module takes tops information typically used in a VelPAK project and produces a
Master Layer Definition containing all the tops used but in chronostratigraphic order.
The Master Layer Definition contains flags against each top:
OK - Order completely resolved.
Check - Order almost certainly correct but not completely resolved.
Bad - A contradiction has occurred.
The Model Tops List can be used to deactivate tops that serve no purpose in the Layer
Definition - they will not appear in the Master Layer Definition which makes it easier to select
the tops that are to be used.
Minor ordering problems that are caused by repeated tops (e.g. Zechstein Rafts) can be
tolerated. The VelPAK model is then updated by applying the Master Layer Definition to the
well data – doing an automatic layer definition.
Note: Clearly, tops relating to the seismic pick must always be correct; however, tops that do not relate to the layers do not have to be correct. Tops marked ‘check’ or ‘bad’ (although they should be checked) that do not interfere with the layering can be ignored or can be turned inactive.
The actual sorting and searching theory that the program uses to establish the chronological listing is extremely complex. Readers who wish to investigate further are referred to the following book:
D. Knuth. The Art of Computer Programming, Vol. 3 (Sorting and Searching). Addison-Wesley, 1973.
Horizontal Wells
3. Check integrity of each Well OK/Check/Bad in the Master Layer Definition. See Check.
4. Assign/Select tops to define layers in Master Layer Definition. See Assign.
Inactive Tops
Inactivating tops is simply a method of making the layer definition easier to manage by
making unused tops inactive/invisible.
Highlight the top you wish to inactivate or activate and use the -/+ icons at the top of the Model Tops List, or click the right-hand mouse button on the left panel next to the top name, to change the top’s mode.
Making a top active or inactive will cause it to be displayed or removed from the Master Layer
Definition panel.
Extra Tops
‘Extra’ tops are for use in drilling horizontal wells. If a well track ‘porpoises’ in and out of a
layer the measured depths for where the track cuts through the layer can be entered as Extra
‘tops’ which are then used in the error correction of a re-depth conversion to more accurately
define the layer.
Full details of using VelPAK for tracking a Horizontal well and depth conversion can be found
in the standalone document, please contact customer support for further details.
Extra tops need to be assigned the numerical value of the layer which the track cuts through.
This is always the top of the layer the track cuts through, regardless of whether the track is
cutting into or out of the layer, as shown in the picture below.
To use the Extra function, highlight the top you wish to make Extra and use the Extra icon at the top of the Model Tops List, or click the right-hand mouse button on the left panel next to the top name and select Extra.
An alert box will pop up allowing you to assign the Extra top to the correct layer.
A top assigned as ‘Extra’ automatically becomes inactive in the Master Layer Definition.
Find - Find the highlighted well in both the Master Layer Definition and Well Layer
Definition lists.
Rename - Allows you to rename your tops. Selecting the top will bring up the alert box
to allow renaming.
Delete - Deletes the selected and highlighted tops from the model permanently.
Build Find
The "Surface" at the top of the list is a nominal top created by VelPAK to represent a
top at the seismic datum, and cannot be deleted. Double-click in the grey cell to the
left of the top name to assign it to a layer.
Build - Builds the Master Layer Definition list from the information in the model. The
resultant display will be what you use to check and assign the layers.
Find in Wells - Given a highlighted top in the Master Layer Definition, selecting this
will highlight all references to that top in the Well Layer Definition.
Table Headers
Top - The Name of the Top. You may find that the same actual top has been entered
in the model with different spellings, for example. This display allows you to see any
discrepancies and fix them to turn ‘Check’ or ‘Bad’ into ‘OK’.
Layer - The Layer number and color will be displayed here when layers have been
assigned.
Status - Check/Bad/OK. See Status below.
Current - Allows you to see what layer the top is currently defined as - for use when
adding extra tops into the list or changing the layer of a top.
Well Name - List of the well names.
Panel
Find in Wells - Given a highlighted top in the Master Layer Definition or Well Layer
Definitions, selecting this will highlight all references to that top in both panels.
Check - A very powerful facility; designed to give you a quick check of the data before
(or indeed after) proceeding with the building of the well definitions. Once the Master
Well Definition has been run and each top in the model assigned an ID number
(ascending chronologically), pressing Check will highlight in the Well Layer Definition
tab which tops are out of order according to the ID number. You can then review these
tops and see why they are out of order.
However, bear in mind that it may not be the highlighted top that is at fault.
Find in Master - Given a highlighted top in the Well Layer Definition, selecting this will
highlight that top in the Master Layer Definition.
Delete in Wells - Deletes the selected and/or highlighted tops from the model
permanently.
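The ordering test behind the Check button can be sketched as follows. This is a minimal illustration of the logic described above, not the actual VelPAK implementation; the top names and ID numbers are hypothetical.

```python
# Sketch of the Check logic (an assumption about the internal test, not
# VelPAK source). Each top in the Master list carries an ID that ascends
# chronologically; a well's tops, read top-down, should therefore show
# ascending IDs. Any top that breaks the ascent is flagged for review.
# As the manual warns, the flagged top is not necessarily the one at fault.

def check_well_order(master_ids, well_tops):
    """Return tops whose master ID is lower than a top already seen above them."""
    flagged = []
    highest_seen = None
    for top in well_tops:
        top_id = master_ids[top]
        if highest_seen is not None and top_id < highest_seen:
            flagged.append(top)      # out of order relative to the tops above it
        else:
            highest_seen = top_id
    return flagged

# Hypothetical example: 'Chalk' was entered below 'Trias' in this well.
master = {"Seabed": 1, "Chalk": 2, "Trias": 3, "Basement": 4}
print(check_well_order(master, ["Seabed", "Trias", "Chalk", "Basement"]))
```

With the hypothetical IDs above, 'Chalk' is flagged because its ID (2) is lower than that of 'Trias' (3), which was seen above it in the well.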
Table Headers
[Screenshot callouts: Assigned Layers for this Well; First Well in List; Second Well in List]
Status - Check/Bad/OK/Unused
When the preliminary Master List has been Built, it will contain one of the three following flags
against each top:
OK - Order completely resolved.
Check - Order almost certainly correct but not completely resolved; some tops are
imperfectly defined but statistically the top is likely to be in the right place.
Bad - A direct contradiction has occurred between at least two wells. Note that this
does not mean that the entry is necessarily wrong - for example it is quite geologically
valid to have a repetition of a layer down a well. However the instance needs to be
looked at.
Note: Clearly, tops relating to the seismic pick must always be correct. However, tops
that do not relate to the layers do not have to be; tops marked ‘Check’ or ‘Bad’
(although they should be checked) that do not interfere with the layering can be
ignored.
Double-click on the gray box to the left of the top that defines the next layer. All older
tops will become assigned as the next layer, in this case Layer 2.
Once you have defined your Master Well List you will want to apply the layers to the wells in
your project.
Referring to the first screenshot in this chapter (Go Here), the suggested layout for working
in the Layer Tool Module has both Layer Tool tabs on display, with the Well Module to the
right so that you can visualize the layers displayed on the curve once they are applied.
Apply - Applies the Master Well List to the Well Logs. See Well Log Module for full
details of the manual procedure that this button does automatically. The layers will be
seen on activating the Well Log Module. Grids within the model will be shown if
available to the right of the well curve.
Display Options
Previous/Next Arrows
Use the previous/next arrows at the top of the module tab to move to the previous or next
Event, Well or Line within the VelPAK model.
Tip: Using the Time Filled / Depth Filled in conjunction with the Event Fill Verify Option
allows you to verify the layer model, to check that Snap has worked according to the
geological model. It can be a very useful check to see that the layers are being read
correctly. Basically, if the Fill model you are looking at looks OK, in that all the colors for
all your layers are falling within the layer you expect, you can be fairly confident that
your Depth Conversion will work correctly, layer by layer.
Time or Depth/Vel Shaded - Displays a filled section in Time or Depth, filling with the
instantaneous, average or interval velocities set up in an option further down the list on
the Event Option dialog. Thus this display can show a gradational color fill throughout the
different horizon layers.
Note: The Verify Fill Process cannot be used with the T/V or D/V FILL.
Overlay Options
Display on
Display off
Label on
Label off
Turn a display element ON or OFF on the Profile, with or without its label.
Line Mistie - Displays intersections and mistie ‘blobs’ on your time or depth profile, with a
line display and the line name of the intersecting line displayed when the ‘Line Value
Display’ is switched on.
On TIME profiles,
Green line intersections show SNAPPED lines.
Red line intersections show UNSNAPPED lines.
On DEPTH profiles,
Green line intersections show DEPTH CONVERTED lines.
Red line intersections show lines NOT DEPTH CONVERTED
Grid Display On
Gives the Event Blobs
Well Track -
Displays the well locations and well tracks stored in your model on the profile.
The extreme distance a well can be situated to include it or exclude it from being
displayed on a Profile is set under the Event Dialog.
Seismic Backdrop - Displays the SEG Y as a backdrop for the profile. You will need to tell
VelPAK where the SEG Y data are; the data are not read in but cached as the display
progresses through the model’s Profiles. Random lines created from the surface module (Go
Here) will also display a crude backdrop.
[Toolbar: Point Edit - Insert, Delete, Edit, Move, Pick; Segment Edit - Insert, Delete, Edit, Move, Split, Merge, Swap, Flip, Void, Term]
The Profile Edit options within VelPAK are crucial to being able to modify horizon data such
that they can be successfully snapped, and used in applications such as MAP or Depth
Conversion.
Pressing the left-hand mouse button over the option you require will cause that option to be
activated. The Console Window of VelPAK displays what edit mode you are in.
Here follows a summary of each option. For full details of each option select the appropriate
Edit option Details.
Note: For any of these procedures to work, you must select a data point on the event
horizon, not the line joining them.
Point Insert - Allows you to insert extra points - one at a time - onto a selected horizon.
Go To Details
Point Delete - Allows you to delete points from a selected horizon. Go To Details
Point Edit - Selecting edit and then a particular data point brings up a dialog box where
the values for this point are displayed. Go To Details
Point Move - Allows you to select a point and move it. Go To Details
Point Pick - Used as a method of selecting the event horizon you want to edit. Go To
Details
Segment Insert - Allows you to insert a new distinct segment within the current event
horizon. Go To Details
Segment Delete - Will cause an entire segment or event horizon to be deleted. Go To
Details
Segment Edit - Selecting a particular segment of data brings up a dialog box where the
segment name is displayed. Go To Details
Segment Move - Allows you to move an entire segment to another area of the profile. Go
To Details
Segment Split - Allows a segment to be split. Go To Details
Segment Merge - Allows you to merge together two segments of a discontinuous event
horizon. Go To Details
Segment Swap - Allows you to swap the priority of segments within a model. It does not
swap the data information for any given segment, merely the number of the segment. Go
To Details
Segment Flip - Provides a reverse snapping direction for an individual event horizon or
event horizon segment. Go To Details
To flip the whole profile to its mirror image, use the function from the Profile -> Event drop-
down menu.
Segment Void - Voiding a segment allows you to retain a segment within your model
while excluding it from any application or output routine within VelPAK. Go To Details
Further procedures:
Note: The option ‘FLIP SEGMENT’ for individual segments can be found under Event
Segment Edit.
UNDO LINE
This clears all the edits to all the event horizons made since the last save of the model for
the line you are currently displaying in profile.
An alert box appears before undoing, for verification of the process.
UNDO EVENT
This option allows you to undo the single last edit you have done to the model.
An alert box appears before restoring the last saved model, for verification of the process.
Display Dialog
Tabs include:
General tab - to change the main input features of the Profile display.
Range tab - gives you the opportunity to display a range of the profile in the X and Y
directions (Y being time, depth or velocity).
Name tab - to change the name of the event horizon. For user reference only.
Color tab - to change the color of the event horizon displayed on the profile.
When you have made your edits, press to apply the changes and activate the
option.
Use the down arrow () or double click on input name to select from available inputs.
Option
Display_Type - Same selection as Display Type drop down on main Profile window. Go
Here for details.
Fill_Verify- If this is activated, between each layer filling there will be an alert box asking
you whether you wish to display the next layer.
Tip: The reason for this is to be able to keep an eye on the layer-filling to check that each
layer is filling correctly. The data may have appeared to have snapped correctly, but for
some reason something has not quite worked correctly. You could then get a layer being
filled by one layer/color, and then overwritten by another. This would cause the depth
conversion to act incorrectly. If you had ‘No Verify’ activated it is unlikely that you would
be able to see the layers behaving incorrectly due to the speed of the fill.
Reversal_Display -
No - Do not highlight reversal zones.
Yes -Highlight reversal zones for all event horizons.
Current Event - Only highlight reversal zones for current event horizon.
Current Event_Z1_Z2_Z3 - only highlight reversal for current event horizon with a color
coding to differentiate between the multiple Z values occurring in the reversal zone.
These different Z values can be output separately into an output file for gridding
purposes.
Search_Events - Used as an aid for editing event horizons when points from different
event horizons lie on top of each other. You can set VelPAK to pick one in preference to
the other.
Search Upwards - Searches up from the bottom of the model’s defined event horizon
list to select what is usually termed an ‘older’ event horizon.
Search Downwards - Searches down from the top of the model’s defined event
horizon list to select what is usually termed a ‘younger’ event horizon.
Segment_Numbering - Show on the Profile display the number and direction of each
segment. This can be particularly useful to debug a troublesome event horizon. It
may be easier to see a problem just using the ‘One Event’ option rather than displaying
the number and direction of all event horizons. Select the event horizon via the
Surface&Slot selector.
Velocity
Velocity_Color - Choose the stored color table to best suit your display. Click on the box
to the right of the color selection slot to bring up a list of color tables available to you. The
color tables within VelPAK allow selection from standard VelPAK color bars, or import of
user's color bars. Go Here for further details of the Color Table options.
Velocity_Increment - Defines the resolution of the fill display. A value of 1 will give a fine
gradation on display, a value of 50 (maximum) will give a very ‘blocky’ display.
Velocity_Type - The type of Velocity you wish to view when your section is filled.
Interval Velocity - fills with the interval velocity between layers (not supported for
Time_Vel_Volume fill).
Instantaneous Velocity - will calculate the Instantaneous Velocity for each layer
using the Vup/Vdown calculated during the Depth Conversion process. It fills with the
instantaneous velocity computed at each sample (this replaces the Vup/Vdown fill in
older versions of VelPAK).
Average Velocity - will display as standard the average velocity.
Note: The Vup/Vdown option must be activated within the Depth Conversion Options for this
to be calculated when Depth Converting and thus available for use in the display
here.
Seismic
Segy_FileMode - Segy3D/Segy2D - Select whether the SEG Y file is a 2D or 3D file.
Segy_FileName - Select the File where your SEG Y file is stored. Note this is not read in
to the project but remains cached while you are in the Seismic Overlay mode.
Segy_Iline - User supplied value for the in-line header word location for 3d SEGY files
(typically 17 for a SEGY export)
Segy_Xline - User supplied value for the cross-line header word location for 3d SEGY
files (typically 25 for a SEGY export)
Seismic_Color - Choose the stored color table to best suit your display. Click on the box
to the right of the color selection slot to bring up a list of color tables available to you. The
color tables within VelPAK allow selection from standard VelPAK color bars, or import of
user's color bars. Go Here for further details of the Color Table options.
Seismic_Type - Density, Wiggle or Contour. Note that on large data files Wiggle can
cause a very dense display and Contour can look very similar to Density until zoomed in
(and also take a while to generate).
Density - Sample values represented by color raster display.
Wiggle - Wiggle traces plotted using the parameters set in the Wiggle_* options below.
Wiggle_Factor - Factor of wiggle amplitude
Wiggle_Fill - Positive or negative or both.
Wiggle_Line - enable wiggle line - Yes/No.
Wiggle_Skip - number of traces to skip.
Wiggle_Type - Type of Wiggle - Fill_Mono/ Wiggle_line/ Fill_color/ Shade_color.
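The Segy_Iline and Segy_Xline header word locations above can be read as 1-based byte positions within the 240-byte SEG Y trace header. As a hedged sketch (an illustration of the convention, not VelPAK source), reading a header word at such a location might look like:

```python
import struct

# Sketch of interpreting a header word location such as Segy_Iline=17 or
# Segy_Xline=25 (an assumption about the convention, not VelPAK source):
# the value is the 1-based byte position of a big-endian 4-byte integer
# inside the 240-byte SEG Y trace header.

def read_header_word(trace_header: bytes, location: int) -> int:
    """Read a big-endian int32 at a 1-based byte location of a trace header."""
    offset = location - 1        # convert the 1-based location to a 0-based offset
    return struct.unpack_from(">i", trace_header, offset)[0]

# Hypothetical trace header with inline 1042 at location 17 and xline 205 at 25.
header = bytearray(240)
struct.pack_into(">i", header, 16, 1042)
struct.pack_into(">i", header, 24, 205)
print(read_header_word(bytes(header), 17), read_header_word(bytes(header), 25))
```

The byte locations, like the typical values 17 and 25 quoted above, depend entirely on how the SEG Y file was exported.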
Well -
Well_AOI - Marks the extreme distance a well can be situated to include it or exclude it
from being displayed on a Profile. Default of 50 is in whatever units the X-Y values are in.
Overlays/Labels - These are exactly the same as the ‘Overlay’ options available at the
top of the Profile module tab. The overlays and their labels can be turned on
here via a Yes/No toggle or by the Overlay drop down. For full details of what these
displays do Go Here.
When you have made your edits, press to apply the changes and activate the
option.
Range Tools
AOI - Copies the currently displayed Area of Interest (AOI) into the Range extents and
sets the XY_Type to "User".
Seismic Colorbar options - allows user to scroll, flip and reverse the color bar
selected on the Display -> General -> Seismic_Colour slot.
Options
Use the down arrow () or double click on input name to select from available inputs.
Axis
X_Type, Y_Type - ‘Auto’ displays each profile using the range of the entire model. ‘User’
allows you to define the Shot Range and Time/Velocity/Depth range as required.
Range
Depth_Max, Depth_Min - Specify the Depth unit range down the Y axis that you wish to
display.
Shot_Max, Shot_Min - Specify the Shotpoint range along the X axis that you wish to
display.
Time_Max, Time_Min - Specify the Time unit range down the Y axis that you wish to
display.
Velocity_Max, Velocity_Min - Specify the Velocity unit range down the Y axis that you
wish to display.
Scale
Depth_Scale, Time_Scale, Velocity_Scale - The number of Time/Vel/Depth units per
cm you wish to have displayed.
Shot_Scale - Specify the number of Shotpoints per cm to display.
When you have made your edits, press to apply the changes and activate the
option.
NAME - allows you to type in an actual geological name for your event horizons. This
name cannot be read in from any of the ASCII input processes, nor can it be
output through any ASCII output processes. However, this name will be displayed in the
Properties dialog for the relevant event horizon.
When you have made your edits, press to apply the changes and activate the
option.
Use the down arrow () or double click on input name to select from available inputs.
COLOR - provides a slide bar for selecting a color to correspond with a particular event
horizon or fault within VelPAK. There are 32 colors available ranging from ‘color 0’ (black)
to ‘color 32’ (white). The color you have selected will be displayed in the ‘COLOR’ column
along with its respective number.
Snap Dialog
Tabs include:
General tab - to change the main input features for the Snap option.
Bias tab - to aid the snapping procedure by telling the program the way to go to keep the
horizon integrity - be that ‘UP’ or ‘DOWN’.
Missing tab - This provides an up or down option to tell VelPAK what to do when event
horizon data are missing from the profile, since the layer above an event horizon that is
missing due to truncation would be different from the layer above an event horizon that
has been pierced.
Pod tab - to designate certain event horizons as ‘Pods’ or ‘Lenses’ if required; these
are discontinuous over the Profile.
Clean tab - to assist in the cleaning of horizons that abut profile faults and thus speed up
the snapping routine.
Discussion
Data output from workstations basically consists of segment-oriented data, i.e. a collection
of digitized data points measured by SP/CDP and TWT. These points are collected into a
segment until there is a break in the data - usually caused by a fault.
The primary purpose of the SNAP process is to:
[Figure: fault mask codes FU, FUR, FL, FLR]
5. Generate a unique code describing the geological situation of the data point. This is
known as the Mask value. Full details of this powerful feature Go Here.
6. Generate closed polygons of the layers such that color filled displays can be generated
and depth conversion can be accomplished.
Once the process has been completed and the data have been re-interpolated (if necessary),
the data generated by the DATA OUTPUT procedure represent a highly sampled description
of the geological top or base of a layer. If these data are then gridded within VelPAK using
the gridding routines, the resulting grids will accurately represent the top or base of the
layer along the profiles.
Note: You do not need to snap the whole model to use Depth Conversion routines or display
the profile surfaces as Ribbon Maps; you have the options of ‘Snap, Unsnapped or
Merged’ to select the data which suits your model best. For a discussion on this Go
Here.
When you have made your edits, press to apply the changes and activate the
option.
Use the down arrow () or double click on input name to select from available inputs.
Option
Snap_Thickness - Will change the thickness of the tracking line; a visual aid to watching
the snapping track process.
Snap_Type -
Current Line - Only the current line will be snapped and then the process will stop.
Un-snapped Lines - The memory model flags whether a section has been snapped or
not; the bracketed word ‘(snapped)’ appears after the line name. This option will
therefore only process ‘un-snapped’ sections.
All Lines - Regardless of the sections’ snap status, all lines will be (re)processed.
Snap_Verify - this will ask for verification before snapping the next line. A layer-by-layer
time-filled display will be seen of your profile before continuing to the next profile to be
snapped.
Aperture
A specification of how far an event horizon or a fault can automatically be extrapolated by
the program. An example of when you would want to use this is to either make sure the
fault planes connect to an event horizon above or below, thus giving your event horizons
a path, or to make sure the fault planes do not connect to an event horizon above or
below, thereby stopping the event horizon following an incorrect path.
Event_Point - The number of shotpoints or CDPs to search in the X direction.
Event_Value - The Y axis of the search radius for extrapolation - measured in msecs or
metres (for depth profiles).
Fault_Point - The number of shotpoints or CDPs to search in the X direction.
Fault_Value - The Y axis of the search radius for extrapolation - measured in msecs or
metres (for depth profiles).
Ignore_Points - The number of points of an event horizon leading into a fault to ignore.
Typical setting is 1 to ignore a possible spurious point as the last point leading up to a
fault.
Use_Points - The number of points to use in the least-square fit equation to extend/
interpolate the event horizon up to a fault.
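The Aperture options above can be illustrated with a small sketch. This is an assumed reading of Use_Points and Ignore_Points (a least-squares line fit, as the text states), not VelPAK's actual code; the shotpoint values are hypothetical.

```python
# Sketch of the Use_Points / Ignore_Points idea (assumed logic, not VelPAK
# source): drop the last Ignore_Points points leading into the fault, fit a
# least-squares line through the final Use_Points remaining points, and
# extrapolate the horizon to the fault's shotpoint.

def extrapolate_to_fault(points, fault_sp, use_points=4, ignore_points=1):
    """points: list of (shotpoint, value) pairs leading up to the fault."""
    trimmed = points[:-ignore_points] if ignore_points else points
    fit = trimmed[-use_points:]
    n = len(fit)
    sx = sum(sp for sp, _ in fit)
    sy = sum(v for _, v in fit)
    sxx = sum(sp * sp for sp, _ in fit)
    sxy = sum(sp * v for sp, v in fit)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # least-squares slope
    intercept = (sy - slope * sx) / n
    return slope * fault_sp + intercept

# Horizon dipping 2 ms per shotpoint, with a spurious last point that
# Ignore_Points=1 discards before the fit; extrapolated to the fault at SP 110.
pts = [(100, 1200), (102, 1204), (104, 1208), (106, 1212), (108, 1290)]
print(extrapolate_to_fault(pts, 110))
```

With the spurious point at SP 108 ignored, the fitted line extends the horizon cleanly to 1220 ms at the fault; including it would have dragged the extrapolation off trend.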
Delay
Delay - A numerical input. In complex situations it can be useful to watch the SNAP
process work; this delay factor slows the process so you can watch the decision
making. The speed is dependent on the number of points on each event horizon of your
profile; changing the value of Delay from 0 to 1 may be enough to slow the process down.
Note: Use the keyboard ESC key to stop the snap if the Delay makes the process too slow.
Delay_Type -
All Events - Will apply the delay to all event horizons in the profile.
Current Event - Will apply the delay to just the currently selected event horizon.
Thinking_Time - The amount of ‘thinking time’ you allow the model before it will give up
and request manual intervention.
Expert
When snapping less than perfect data, you will experience a lot of Alert boxes where the
program is either awaiting confirmation before proceeding to do something automatically
(such as merge) or expecting you to do some edits to the data. If you have a lot of
segmented data where the merges or splits are obvious you have the option to tell
VelPAK to do these auto-merges and auto-splits without an Alert Box popping up every
time.
Auto_Intersect - will allow segments to cross horizons.
Auto_Merge - will automatically merge segment data together without asking for
verification.
Auto_Splits - will automatically split data where appropriate without asking for
verification.
Show_Pauses - Yes/No - Switches off any other non-essential alert boxes that are not to
do with ‘AUTO SPLIT’ or ‘AUTO MERGE’.
Note: The EXPERT OPTIONS will only Auto Merge or Auto Split the simple merges and
splits common to this feature. If VelPAK finds a more complex and difficult situation
which it cannot cope with, it will still stop and alert you.
When you have made your edits, press to apply the changes and activate the
option.
Use the down arrow () or double click on input name to select from available inputs.
Bias
Bias - Select whether to change Bias to be Up or Down. See discussion below.
This is where a BIAS can be used: you can tell VelPAK that, should the event horizon
disappear, the way to go to keep the horizon integrity is either ‘UP’ or ‘DOWN’.
Example of using the ‘BIAS’ option: [Figure: ‘DOWN’ BIAS applied on snapping]
VOID SEGMENT: It should be said at this point that there is another method available to
suggest to VelPAK the way to go to keep the layer integrity. This entails inserting a ‘VOID’
segment in an appropriate place to force the layer down (or up). This would be done on
profiles where any one event horizon would require to go both up and down to keep the
geological structure sensible. For full details of how to edit this:
GO TO ‘VOID’
This is set in the Snap Dialog from within the ‘Bias’ Tab.
2. BIAS set for a chosen event horizon for Selected Line (Also known as ‘Line Bias’). This
sets the chosen event horizon to a particular BIAS that could be different to the event
horizon on an adjacent line.
3. BIAS set for Single Segment (Also known as ‘Seg Bias’) of one event horizon. Usually
used to get around a particular unique geological situation.
Both these methods are set by selecting the Segment Edit Option on the data. Clicking on the
chosen segment will highlight that segment, and will call up a Segment Edit dialog box where
you can set the BIAS for either this one segment, or the whole event horizon for that one
profile currently displayed. GO TO Segment Edit.
When you have made your edits, press to apply the changes and activate the
option.
Missing
Missing - Select whether a missing event horizon has disappeared ‘Up’ or ‘Down’. See
discussion below.
VelPAK’s definition of a Pod is a discontinuous, lens-like layer of rock that does not
on-lap or pinch out; it exists as an isolated layer within another layer.
Pods can be depth converted successfully within VelPAK. Any number of pods (up to a
maximum of 32 named event horizons) can be accommodated within a profile and each pod
can have its own or a common velocity method.
In normal usage VelPAK presumes and expects layers to be continuous over the whole
profile. If a layer has disappeared it is because it has been truncated by a younger event
horizon or pierced by an older event horizon; the snapping logic will then continue the
tracking of the layer that has disappeared along this new event horizon until the end of the
profile.
Clearly with Pods this is not the case and so a special case needs to be made to inform the
VelPAK snapping logic to treat the pods as single, discontinuous instances.
To distinguish between pods and usual, continuous event horizons within VelPAK there are
two things a user must know to do within the set up:
1. The event horizons in VelPAK which are assigned to a pod must occur as an event
horizon number greater than the oldest normal horizon in the model (as if chronologically
older.) For example:
Event 1
Event 2
Event 3
Event 4
Event 8
Event 5
2. The event horizons in VelPAK which are assigned to a pod must be identified as such
under the Pod tab of the Snap Property Grid within the Profile module.
Details of how to correctly read horizons into VelPAK as Pods can be found in the technical
paper ‘Overhangs & Pods’ available in the Standalone Documents directory of VelPAK
documentation.
Pod Options
When you have made your edits, press to apply the changes and activate the
option.
Use the down arrow () or double click on input name to select from available inputs.
Pod
01_Pod, 02_Pod - Yes/No - Select which horizons in your model are to be treated as
Pods. As discussed, Pod horizons need to occur chronologically below the oldest
normal horizon in the model.
What is Clean?
A utility designed to automatically clean horizon data that abut faults - and sometimes
inadvertently overrun the fault - in profile mode, and thus speed up the snapping routine.
Such overruns are often caused by the different picking techniques for horizons and faults
within the workstation environment; when the data are brought into VelPAK, the program
would be unable to successfully snap horizon data that overrun the fault. Manually editing
these few points would be laborious; Clean mode does this automatically.
Clean Options
When you have made your edits, press to apply the changes and activate the
option.
Use the down arrow () or double click on input name to select from available inputs.
Option
Clean Type - Select which profiles are to be cleaned in the model; current, unsnapped or
all lines.
Clean
Clean_Point - Specifies the number of shot points from a fault segment inside which
horizon data points are deleted. The point where the horizon touches (or overruns) the
fault will become the center of a circle of deletion which will be this number of shot points
in radius.
Clean_Verify - Yes/No. Yes allows you to see graphically what will be deleted/
cleaned. You must set this option to No for the cleaning process to actually take
place.
Note: There is no Undo facility on this option; it is recommended you save your model
before activating the clean process.
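The Clean_Point deletion radius can be sketched as follows. This is an assumed illustration of the behaviour described above, not VelPAK source; the shotpoint values are hypothetical.

```python
# Sketch of the Clean_Point deletion (assumed logic, not VelPAK source):
# the shotpoint where the horizon touches or overruns the fault becomes
# the centre of a deletion window Clean_Point shotpoints in radius, and
# any horizon point inside that window is removed.

def clean_horizon(horizon, contact_sp, clean_point=2):
    """Keep only horizon points more than clean_point shotpoints from the contact."""
    return [(sp, v) for sp, v in horizon if abs(sp - contact_sp) > clean_point]

# Hypothetical horizon touching a fault at shotpoint 205; the points at
# SP 204, 205 and 206 fall inside the radius and are deleted.
horizon = [(200, 1500), (202, 1504), (204, 1510), (205, 1490), (206, 1520)]
print(clean_horizon(horizon, 205, clean_point=2))
```

After cleaning, the snapping routine can extend the horizon back to the fault using the Aperture settings rather than tripping over the overrunning points.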
Depth Dialog
Depth Conversion methods within the Profile Module of VelPAK work solely on the TIME
PROFILE data.
For Grid-to-Grid Depth Conversion and manipulation use the Depth Module or Process
Module available from the Surface Module. Go Here.
Once data within VelPAK has been successfully snapped, a layer cake depth conversion can
be performed on the data.
Depth conversion methods are set up under the Parameters tab.
Within VelPAK’s depth conversion options, there are a certain number of depth conversion
methods already defined. These can just be utilized for any given layer, by inserting the
correct variable values, according to your depth conversion parameters.
However, there is the added dimension within VelPAK’s depth conversion in that you can
write your own methods, or amend the ready stored methods, and save them for future use.
The Depth Conversion methods within VelPAK are written in a language called Xpress. This
has been specifically designed so that it is possible for non-programmers to write the text for
a new depth conversion method.
For details of how to write a depth conversion method in Xpress, go to Writing and Saving
your own process method.
Note: All event horizons within your model need to have a valid depth Conversion method
set up in order for the Profile Layer-Cake depth conversion to run successfully. Set up
is done per event horizon; set up one event horizon in the Depth Dialog Parameters
and then move to a new event horizon via the VelPAK Surface&Slot selector and set
up a new set of Depth Conversion Parameters for that event horizon...and so on.
Tabs include:
When you have made your edits, press to apply the changes and activate the
option.
Use the down arrow () or double click on input name to select from available inputs.
Option
Depth_Type -
Current Line - Only the current line will be depth converted.
Unconverted Lines - Any lines not already depth converted will be converted.
All Lines - All lines regardless of depth converted status will be converted.
Depth_Verify -
No - No depth display is shown while depth converting.
Yes - Displays a filled depth section after each section is completed.
Hide Graphics
No - Will show the actual profile data points that have been layer-cake depth
converted.
Yes - Will not show data points converted during the process.
Expert
Calc_VupVdown - Options - No/Yes/Yes With Logging.
Allows you to have the Depth Conversion calculate the Velocity Up and the Velocity
Down, with or without a log.
Details of the theory behind these options can be found in the Vup/Vdown theory section
- Go Here. The ‘Yes with Logging’ produces a file called ‘VelPAK.log’. This would contain
a listing for EVERY SINGLE DATA POINT FOR EVERY SINGLE EVENT HORIZON FOR
EVERY SINGLE LINE in your project! It is therefore not advised that this option is turned
on unless you seriously need to look at each individual VUP VDOWN value. (Don’t forget
that you can see individual VUP VDOWN values for selected data points using the POINT
PICK or POINT EDIT option).
However, should you so decide, the log will look something like this:
The print out will go into the directory from which you loaded your model.
[Screenshot callouts: Input Bank 01, Input Bank 02]
Options
Formula - Select the formula to use from the drop down list. Go Here for further details of
the Formulae available.
From - Specify where the values are to come from:
User Defined
You can enter your own parameters directly into the appropriate Parameter banks
below this input.
Optimize
Selecting an Optimize option will call in the relevant parameters as derived from the
Optimize module.
Note: Your Depth Conversion method here would have to be the same as the method used
to derive the parameters in Optimize.
Input Banks -
When you enter a Formula from the drop down selection the Input banks will fill up
various fields with the names of the input/outputs it requires. Other fields have to be
filled in by you if you have specified User as where the information for your Depth
Conversion is coming from.
Having a look at the actual formula selected in the Formula tab will show the list of input/
output variables that have to be defined.
For example:
2 Variables
The relevant number of Input banks will appear depending on the Conversion method
selected. These will be labelled with the type of input required such as Input v0, Input k.
01_Value - If the input is a constant value, the value will be entered here.
If a Grid is to be used and not a Constant then the following are to be entered:
01_GridEvent - The Event Horizon under which the grid to be used as input 01 is stored
in the Model Tree. Note the Previous / Current option which allows you select the
previous or current event horizon rather than name a specific event horizon where the
grid is to be found. Usually this option is only utilized when constructing Generic
Workflows which can work on any event horizon.
01_GridType - The slot under the event horizon in the Model Tree where the grid is
stored.
Tip: A complete list of the system Depth Conversion methods already stored within VelPAK
is given. Go Here. In that section, each method has been written out in full, along
with a discussion of that particular method.
Non-editable display of the Formula you have selected from the parameters tab.
Tip: Always make a back up of the user-defined process text file before attempting to write
your own method.
If you experience difficulties in saving, with an error saying that the program is unable to save
this process to this file, it is likely that the process_user.txt file is still write-protected. Check
with your System Administrator to change the write protection for this file.
Full details of how to write a depth conversion method are given here.
[Figure: a three-layer profile with velocities V1, V2 and V3 between Events 1, 2 and 3; at point A on Event 2, VUP is calculated using the V2 method from above and VDOWN using the V3 method from below.]
For VUP, calculate depths at two slightly greater times - in this case 0.01ms and 0.02ms,
using the V2 method (starting at Event 1) and then calculate the velocity using the formula:
Vup = (4Z1 - 3Z - Z2) / (2Δt)
where Z is the depth at A, Z1 is the depth 0.01 ms below A, Z2 is the depth 0.02 ms below A
and Δt is 0.01 ms. This formula gives a more accurate value than a formula based on Z
and Z1 alone.
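The expression above is the standard three-point one-sided finite difference, which is exact whenever depth varies quadratically with time. A quick numeric check (an illustrative sketch in Python, not VelPAK's process language; the quadratic profile and its coefficients are invented for the demonstration):

```python
def v_from_depths(z, z1, z2, dt):
    # Three-point one-sided difference: V = (4*Z1 - 3*Z - Z2) / (2*dt)
    return (4 * z1 - 3 * z - z2) / (2 * dt)

# Hypothetical quadratic depth-vs-time profile; its exact derivative is 2*a*t + b.
a, b, c = 0.5, 2000.0, 100.0
t, dt = 1.0, 0.01
z  = a * t**2 + b * t + c
z1 = a * (t + dt)**2 + b * (t + dt) + c
z2 = a * (t + 2 * dt)**2 + b * (t + 2 * dt) + c
print(v_from_depths(z, z1, z2, dt))   # ~2001.0, the exact derivative 2*a*t + b
```

A two-point difference (Z1 - Z)/Δt would only be first-order accurate, which is why the manual notes this formula gives a more accurate value.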
For VDOWN, also calculate depths at 0.01ms and 0.02ms below A, but using the V3 method
(starting at Event 2) instead and then calculate the velocity using the same formula:
Vdown = (4Z1 - 3Z - Z2) / (2Δt)
extern v0,k,last_time,time,last_depth,depth,error
[Screenshot: Event 2 with colored blobs plotted where it is a real pick.]
In the above example the Depth Conversion processes have distinguished between where
Event 2 is an actual 'pick' and where it has become the base of a layer. Where Event 2 is the
Current event horizon and it is a real pick, colored 'blobs' have been plotted. Where the blobs
have been plotted, an error of -100.0 has been added.
This displays two features of the Depth Conversion Xpress processes: firstly, the ability for
you to select portions of the event horizons to have depth processes (usually errors) added;
and secondly, the ability of the window display to plot 'blobs' on the event horizon at the
points where the process will act, allowing you to see at a glance whether the process is
acting as and where you wish.
plot(x, y, color, size)
For more detail of these options refer to The Language used in Depth Conversion
[Diagram: a profile with a TWT (ms) axis from -400.0 to 400.0, in which two event horizons,
including Event 1, lie partly at negative two-way times.]
The example above shows a theoretical situation where two event horizons have negative
two way times.
Within VelPAK, most depth conversion problems can be avoided by including the following
code in the depth conversion formulae.
# Check for Negative Time
if (last_time < 0) {
    last_time = 0
    last_depth = 0
}
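The effect of the guard can be shown numerically. The sketch below is illustrative Python, not VelPAK code: `depth_at` is a hypothetical helper that mirrors the interval arithmetic used in the worked example (the divisor 2000 converts two-way milliseconds to one-way seconds):

```python
def depth_at(t_ms, v_ft_s, last_time_ms=0.0, last_depth_ft=0.0):
    # Guard against negative two-way times, as in the process snippet above.
    if last_time_ms < 0:
        last_time_ms = 0.0
        last_depth_ft = 0.0
    # Interval depth: two-way time in ms -> one-way seconds is a divide by 2000.
    return last_depth_ft + (t_ms - last_time_ms) / 2000.0 * v_ft_s

print(depth_at(100, 4000))                    # 100/2000 * 4000 = 200.0 ft
print(depth_at(100, 4000, last_time_ms=-300,
               last_depth_ft=-600))           # clamped to zero, still 200.0 ft
```

Without the clamp, a negative last_time would inflate the interval (t - last_time) and push the computed depth too deep.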
[Diagram: a worked example over a TWT (ms) axis from -400.0 to 400.0, with Event 1 and
Event 2 crossing sea level (0.0). Interval depths are computed as ID = IT/2000 * V: for
example, IT = 100 ms at 4000 ft/s gives ID = 100/2000 * 4000 = 200 ft, and IT = 400 ms at
8000 ft/s gives 1600 ft. Negative interval times give negative depths, e.g. IT = -100 ms at
4000 ft/s gives -200 ft, producing depths such as D = -500 ft above datum.]
As shown in the diagram above, by setting the last time and depth to zero the correct depths
can be calculated. It should be noted that the full interval is not calculated.
This technique fails with the interval method when the second event horizon is also above
zero. The part of event 2 that is above zero is not used in the calculation of the depth for
event 1 (since we calculate depth 1 first, regardless of event 2). An average method must
therefore be used to calculate the depth for event 1 in this particular case.
[Diagram: the same model in depth, with Event 1 and Event 2 plotted against a DEPTH (ft)
axis running from -400.0 to 1600.0 through sea level at 0.0.]
Reint Dialog
The Re-interpolation option of VelPAK allows data to be re-interpolated to a smaller shotpoint
increment.
Note: The re-interpolation option never acts on data whose shotpoint increment is already
smaller than the requested re-interpolation increment.
Output from a Seismic Workstation would naturally have more points picked for the event
horizon than for a fault plane, as in the example above. If the model for this event horizon
were output into an XYZ data file to be gridded, control down the fault would not be as great
as for the rest of the event horizon.
Re-interpolation allows you to re-interpolate at whatever shotpoint increment you want, so
that there will be as many points along a fault as along the event horizon.
On selection of the Options within Re-Interpolation you have the chance to select what
shotpoint increment you wish your points to re-interpolate to, and whether you want the re-
interpolation to occur on just the line on display on-screen, or for other selected lines within
the memory model.
When you have made your edits, press APPLY to apply the changes and activate the
option.
Use the down arrow () or double-click on an input name to select from the available inputs.
Option
Reint_Type -
Current Line - Only the current line will be re-interpolated and then the process will stop.
Un-snapped Lines - Only ‘un-snapped’ sections will be re-interpolated.
All Lines - All lines will be re-interpolated.
Reint_Verify - Between each event horizon re-interpolation, an alert box will ask for
verification to continue.
Shot_Increment - The shotpoint increment you wish to re-interpolate to.
When all the values have been selected as you wish, pressing APPLY will activate the re-
interpolation. You will see any re-interpolated points appear as white points. They will remain
white until the event horizons are re-displayed, or the data are snapped once more.
Note: Re-interpolation turns the data into ‘Unsnapped, raw’ data again regardless of
whether it has been snapped in the past. If you wish to snap your data you must run
the snap routine (again) on the data. Re-running the snap process should not take a
long time (in editing time) if it has already been snapped once.
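Conceptually, re-interpolation fills each picked segment with points at the requested shotpoint increment. The sketch below is an assumed linear-interpolation model in Python, for illustration only (VelPAK's internal algorithm is not documented here, and the integer shotpoints are an assumption):

```python
def reinterpolate(shots, times, increment):
    """Linearly re-interpolate (shotpoint, time) picks to a finer increment.
    Assumes integer shotpoints in ascending order; illustrative sketch only."""
    out = []
    i = 0
    s = shots[0]
    while s <= shots[-1]:
        while shots[i + 1] < s:          # advance to the bracketing pick pair
            i += 1
        frac = (s - shots[i]) / (shots[i + 1] - shots[i])
        out.append((s, times[i] + frac * (times[i + 1] - times[i])))
        s += increment
    return out

print(reinterpolate([0, 10], [100.0, 200.0], 5))
# [(0, 100.0), (5, 150.0), (10, 200.0)]
```

The original picks are preserved and new points appear between them, analogous to the white points VelPAK displays until the data are re-snapped.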
Stuff Dialog
The Module ‘Stuff’ is designed to allow you to add an event horizon to your model using
isochron/isopach grids stored within the VelPAK Model Tree.
Note: The grids to be used are always isopachs of some description, not absolute values.
The Stuff Module stuffs layers above or below the selected event horizon.
WARNING:
STUFF WILL MESS UP YOUR MODEL (with regard to Time/Depth relationship)! Make
sure you actually want to do it before proceeding.
STUFF should be the final procedure you do on a Model.
It is recommended that you make a copy of your original model before proceeding with a
STUFF.
The Stuff Module can work on either Time or Depth profiles. Stuff assumes whichever mode -
time or depth - you are currently displaying is the mode to Stuff in.
Stuffing in Time
Your model needs to be snapped before you can Stuff. Adding in a new event horizon will
mean you will need to re-snap your model in order to continue working on the model with
other processes.
Stuffing in Depth
If you are adding depth event horizons via the Stuff Process then you will end up with a model
that has broken the relationship between time and depth - in that you will have depth event
horizons with no time event horizons.
VelPAK warns you of this before you Stuff new depth event horizons.
Note: The grids to be used are always isopachs of some description, not absolute values.
When you have made your edits, press APPLY to apply the changes and activate the
option.
Use the down arrow () or double-click on an input name to select from the available inputs.
Option
Stuff Test - Allows you to see what the stuffed event horizon would look like. A visual
white line is displayed where the new event horizon(s) will go.
Tip: Don’t try to zoom on this white line, since it will disappear.
Stuff Type - Select to use your chosen ‘springboard’ event horizon to stuff UP or DOWN
from it.
Stuff Verify - Verify gives alert messages before each new event horizon is stuffed.
Surface_Number - The surface to be used as the ‘spring board’ for the new Event. For
example if Event 2 is selected here and Stuff Type is set to Down then the new event
horizon will be placed between Event 2 and Event 3.
Input
Grid - Select the isopach/isochron grid which is to be used in the Stuff procedure from the
Model Tree or the Surface&Slot selector.
Extrapolation
Ignore Points - The number of points of an event horizon leading into a fault to ignore for
the least square fit equation. Typical setting is 1 to ignore a possible spurious point as the
last point leading up to the fault.
Shot Aperture - Shotpoint Aperture. The shotpoint distance within which to check for
intersections with faults and event horizons - if the ends of (stuffed) segments are outside
this distance they do not generate new points and just terminate where they are.
Use Points - The number of points to use in the least square fit equation to extend/
interpolate the event horizon up to a fault.
[Diagram: a simple fault, stuffing one event. The stuffed layer lies above the original layer;
situation A marks where the stuffed event horizon crosses the fault, situation B where it
falls short of it.]
Situation A
The Stuffed event horizon is simply chopped off where it meets the fault.
Situation B
Clearly the stuffed event horizon needs to be extended/interpolated to reach the
fault. This is done by performing a 'least square fit' on the points preceding the end of
the fault. However, the very last point of the event horizon leading up to the fault can
often be spurious, and so the default is set to ignore the last point. The default
number of points to use to interpolate the event horizon to the fault is four. These
values can be changed in the Stuff General tab.
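The extrapolation step amounts to an ordinary least-squares line fit. The helper below is illustrative Python, not VelPAK's implementation; the names use_points and ignore_points simply mirror the tab settings described above:

```python
def extrapolate_to_fault(shots, times, fault_shot, use_points=4, ignore_points=1):
    # Fit a least-squares line through `use_points` picks, skipping the last
    # `ignore_points` (which may be spurious), then evaluate at the fault.
    xs = shots[len(shots) - use_points - ignore_points: len(shots) - ignore_points]
    ys = times[len(times) - use_points - ignore_points: len(times) - ignore_points]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my + slope * (fault_shot - mx)

# The last pick (999.0) is spurious and ignored; the 10-per-shot trend is extended.
print(extrapolate_to_fault([1, 2, 3, 4, 5], [10.0, 20.0, 30.0, 40.0, 999.0], 6))
# 60.0
```

With the defaults, a single bad point next to the fault cannot drag the extrapolated intersection off trend, which is exactly why the last point is ignored.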
What Stuff does when it is stuffing a layer that is too thick to fit.
[Diagram: Event 1 above Event 2, with the isochron to be added to Event 2 shown below.]
The isochron to add to Event 2 in some places is too thick for the thickness
between Event 2 and Event 1.
In this case, VelPAK will chop the isochron where it extends beyond Event 1, thus
truncating the new Event 2 against Event 1.
[Diagram: the resulting New Event 2 truncated against Event 1, with the original Event 2
renumbered as New Event 3.]
Profile Dialog
This option is usually used in conjunction with the generation of a 2D Random Line over your
map data, using the Line Insert Edit mode. See Process for Producing a Random 2D Map
Line.
Once you have your Random line, use the Profile Mode to generate profile data over the line.
Random lines can be generated in Time or Depth.
For Time Profile generation the feeder profiles must be snapped first.
For Depth Profile generation the profiles must be depth converted first.
Tip: Lines must be snapped for time profile generation and depth converted for depth profile
generation.
Profile_Type
Event - Takes the data from all snapped intersecting profile event horizons that the
random line crosses through.
Grid - Takes the data from Grids stored in the selected Model Tree slot for each event
horizon in the model. This would usually be the default TIME or DEPTH slot for the
event horizon (depending on whether it is a time or depth profile you wish to
generate), but any Miscellaneous slot can be activated for any event horizon if
required.
Expert
Multi_Bias - Used to change the default bias of the joining algorithms when there are
multiple Z values for one shotpoint. See the discussion below.
Recalc_Distance - Specify interpolation distance.
Recalc_Shot - When the Profile is generated the original 1,2,3,4,5 style numbering of the
Shotpoints will AUTOMATICALLY be generated using the XY locations. The Re-Calculate
Shot option gives you further control over the interpolation distance if required.
Line Mistie - takes the random line profile data from the other profiles in the model.
Grid Profile - takes the values for each event horizon from the XY point on the Grid stored in
the Time or Depth slot for that particular event horizon in the Model Tree.
Mistie Display
Green, Blue or Red lines display the Line Intersections or Grid node points.
Colored blobs represent where the event horizon would be.
On TIME profiles,
Green line intersections show an intersection with a SNAPPED line.
Red line intersections show an intersection with an UNSNAPPED line.
On DEPTH profiles,
Green line intersections show an intersection with a DEPTH CONVERTED line.
Red line intersections show an intersection with a line that is NOT DEPTH
CONVERTED.
Activating the Grid display option before generating the Profile will give you a display similar
to a Mistie display of the Grid values for each event horizon for the XYZ of the Random Line
map data.
If you pick a line like this with the geological structure thus:
[Diagram: picked shotpoints 1, 2 and 3 following the structure, unevenly spaced along the
line.]
The original display on the profile, using misties or grids would space each shotpoint out
like this, below; causing possible distortion of data.
[Diagram: the same shotpoints 1, 2 and 3 spaced out evenly on the profile.]
Example on real data:
[Screenshot: a picked random line.]
A profile generated using true XYs will produce a truer picture, using shotpoint numbers
valid throughout the model. These values will remain unless the line and profile data are
deleted.
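Numbering shotpoints from the true XY locations can be sketched as follows (illustrative Python; the spacing parameter is an assumed nominal shot interval, not a documented VelPAK setting):

```python
import math

def shotpoints_from_xy(points, spacing):
    # Shotpoint numbers proportional to cumulative along-line distance,
    # so picks keep their true geometric separation on the profile.
    dist = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dist.append(dist[-1] + math.hypot(x1 - x0, y1 - y0))
    return [1 + d / spacing for d in dist]

# Two legs of length 5 and 10: the second pick sits twice as far from the third
# as from the first, and the shotpoint numbers reflect that.
print(shotpoints_from_xy([(0, 0), (3, 4), (3, 14)], 5))
# [1.0, 2.0, 4.0]
```

Evenly renumbering the same three picks as 1, 2, 3 would be the distorted spacing shown in the diagram above.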
XYZ Dialog
This produces up to 3 XYZ files in either Time, Velocity or Depth that will be placed in three
separate user-defined slots in the Model Tree.
Note: Your data must be successfully snapped for this output style to work.
[Diagram: a three-Z over-thrust; a vertical trace crosses Time 1, Time 2, Time 3 and then
Time 1 again.]
In a six Z situation (multiple over-thrusting), the following result will be seen: Time 1
will start again as soon as it can, in order to produce one continuous layer that can be
displayed on a map.
[Diagram: a six-Z multiple over-thrust; a vertical trace crosses Time 1, Time 2, Time 3,
Time 1, Time 2, Time 3 and Time 1.]
When you have made your edits, press APPLY to apply the changes and activate the
option.
Use the down arrow () to select from the available inputs.
Option
XYZ_Type - Time/Depth/Velocity. Select the type of Z value to be produced. Time will be
snapped time.
Output
Output_XYZ - Select the slot in the Model Tree where the first of up to three XYZ files will
be put, containing Z1, Z2 and Z3 data in that order. If there are no Z2 or Z3 data then the
subsequent slots will not be filled.
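The multi-Z splitting described above can be pictured like this (illustrative Python; the modulo-3 assignment matches the repeating Time 1/2/3 labelling in the over-thrust diagrams, but VelPAK's exact rule may differ):

```python
def split_xyz(picks):
    # Distribute multiple Z values per (x, y) into up to three XYZ lists:
    # the 1st, 4th, ... crossings go to file 1, the 2nd and 5th to file 2, etc.
    files = [[], [], []]
    for x, y, zs in picks:
        for i, z in enumerate(zs):
            files[i % 3].append((x, y, z))
    return files

# One trace crossing four Z values: the fourth crossing rejoins file 1,
# keeping Time 1 a single continuous layer for mapping.
z1, z2, z3 = split_xyz([(500.0, 600.0, [100, 300, 500, 900])])
print(z1)   # [(500.0, 600.0, 100), (500.0, 600.0, 900)]
```

If a pick has no second or third Z, the corresponding lists simply stay empty, matching the note that the subsequent slots are not filled.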
Editing Procedures
Point Edit: Insert, Delete, Edit, Move, Pick.
Segment Edit: Insert, Delete, Edit, Move, Split, Merge, Swap, Flip, Void, Term.
The Profile Edit options within VelPAK are crucial to being able to modify horizon data such
that they can be successfully snapped, and used in applications such as MAP or Depth
Conversion.
Note: You are able to edit not only the time profile, but the resultant depth profiles. However,
if you do this then the relationship between the time profile and depth profile will be
lost.
Pressing the left-hand mouse button over the option you require will cause that option to be
activated. The bottom right of the VelPAK window displays what edit mode you are in.
Note: For any of these procedures to work, you must select a data point on the event
horizon, not the line joining them.
Insert
This allows you to insert extra points - one at a time - onto a selected horizon.
The event horizon on which points are to be inserted is instantly defined by VelPAK as soon
as you select it from the Surface selector or Model Tree. The cursor must be on a data point
on the event horizon (rather than the line joining the data points) in order for the event horizon
to be found. Keeping the mouse held down, move to where you want the inserted point to be.
Note the different directions of insertion for the left and middle mouse buttons.
[Diagrams: with the mouse button held down over a data point, the new point is inserted
forward or backward of the selected point, depending on which button is used.]
Delete
[Diagram: hold the mouse button down over the data point to be deleted.]
Edit
Selecting Edit and then a particular data point brings up a dialog box where the values for
this point are displayed.
You can then edit any of the values displayed for this datum point. Pressing DONE will store
these values, overwriting any values generated from anywhere else. Doing another Depth
Conversion, for example, on the data will overwrite these manually edited values.
Move
This allows you to select a point and move it. This is done by using the arrow mouse to select
the point you wish to move, pressing the LEFT-HAND mouse button or MIDDLE mouse
button and keeping it held down to drag the point to where you want it.
Using the middle button will give you a real-time display of time and shotpoint as the point is
being moved. The values appear in the Console.
Pick
Point Pick can be used as a method of selecting the event horizon you want to edit. Clicking
on an event horizon in Point Pick mode will change the selected Event in the model, allowing
you to ‘Insert a new Segment’ of that event horizon type, for example.
Pick gives information about the selected point in the Console window.
Clicking with the middle mouse button provides a generic browser that displays the Time and
Shotpoint for the current position of the mouse.
Segment Edit: Insert, Delete, Edit, Move, Split, Merge, Swap, Flip, Void, Term.
Note: For any of these procedures to work, you must select a data point on the event
horizon, not the line joining them.
Insert
This allows you to insert a new distinct segment within the current event horizon.
The current event horizon to be inserted can be defined by one of three ways:
1. Selecting the event horizon in the EVENT OPTION page or:
2. Selecting POINT PICK and clicking on an event horizon point or:
3. Selecting the event horizon in the Event Selection list always displayed along the top of
the VelPAK window.
To insert a segment:
On activating the Segment Insert option, the first point of the data can be selected by
clicking and holding down the left mouse button. By releasing the mouse button over the
desired area the end of the segment is specified and the segment drawn.
Delete
This option will cause an entire segment or event horizon to be deleted. Selecting a data
point within a segment will cause that segment to be highlighted.
There will be an alert box asking for verification of the process before deletion takes place.
If you wish to delete only part of an event horizon you can use the Split function described
below to break the event horizon line.
The Undo facility can be used to restore data that should not have been deleted.
Edit
Selecting Edit and then a particular segment of data brings up a dialog box where the
segment name is displayed.
You then have the ability to change this name as you wish.
The LINE BIAS and SEG(ment) BIAS refer to additional biases that can be added to the
whole line or just the segment you have called up to help nudge the snapping routines into
geologically correct snapping.
The Bias is explained in BIASSING THE PROFILE DATA.
Move
Split
Merge
This allows you to merge two pieces of a discontinuous event horizon together.
Data can be merged from left to right or right to left.
On activating the Merge option, the end data point of the segment of data must be found
using the cursor arrow. Keeping the mouse pressed, the arrow is then dragged over to the
other point of the segment to be merged with.
You may find you need to Zoom in to be able to select the Start Point and End Point for the
merge.
If the data point has been found and the join highlighted, a message will appear: ‘Merge
specified segments?’
Swap
Swap is designed to allow you to swap the priority of segments within a model.
It does not swap the data information for any given segment, merely the number of the
segment.
Usually, the priority of segments within the VelPAK model is from left to right across the
section, and usually this would cause the snapping no problem when it came to following the
layers.
However, when you are dealing with complicated over-thrusting or reverse faulting, there may
be a time when you realize that the priority of two segments needs to be swapped to get the
layer following correct.
VelPAK stores the number of each segment, and the tracking process (see Profile Module -
SNAP Procedure) aims to join the segments in order. In the case of complex faulted
overthrusts these segment numbers can be inappropriate.
Use the option under EVENT to activate the DISPLAY SEGMENT NUMBERS to make this
clear. The SWAP facility then becomes quite clear as segments can be swapped such that if
segments are joined in ascending order the appropriate geological situation is achieved.
On activating the Swap option, select the first of the two segments you wish to swap. This will
become highlighted. Then select the segment you want to swap it with. This too will become
highlighted and an alert box will appear asking if you want to swap these two segments.
Flip
The FLIP option provides a reverse snapping direction for an individual event or event
segment.
Note: To flip the whole profile to its mirror image, use the function from the main bank of
Profile Display Icons. Go Here for details.
The example below shows a situation where flip would be used to advise VelPAK of the
correct way to snap the event horizon.
[Diagram: an event horizon with its snapping direction marked, running between points A
and B.]
Void
Voiding a segment allows you to retain a segment within your model, which is not used in any
application or output routine within VelPAK.
The Void option is usually used on small segments of horizons, rather than an entire horizon.
On activating the Void option, the segment to be voided is selected by holding the cursor over
a data point within the required segment.
The segment will turn white, and an alert box will prompt for verification of the process.
[Diagram: a ‘V’-shaped fault block with segments of event 1 on either side; event 2 is
absent within the block.]
In this example, VelPAK would not necessarily know where the event 2 should go;
whether it should track the layer above or the fault below.
The BIAS option could be used to induce the tracking to go down, or up, however it may
be the case that a BIAS of ‘down’ in this particular geological case would cause another
geological case within the same model to behave incorrectly. It is therefore necessary to
be able to define some way of imposing a situation purely for the geological case in hand.
In the above example, a ‘Void’ segment of horizon two inserted at the bottom of the ‘V’ of
the fault block tells the model to track the layer downwards.
When the data are snapped, the void segment is read as normal, and the layer tracking
for the layer two would behave correctly. However, having the segment as ‘Void’ implies
to the model that the data must not be read into any application or output procedure,
since the segment is actually false, (since we know from the original picks that horizon
two is absent within the fault block).
So, in the example given, a Void segment can be inserted within the fault block. This can
be ‘Voided’ so that it shows up as white (on a black background).
Term
Term can be used to terminate an event horizon or segment when data are missing. Upon
snapping, when VelPAK finds a terminated segment, it does not attempt to snap to the end of
the line or to join the ends of two segments together; instead, VelPAK draws a vertical line to
the event horizon above.
To mark a section as terminated click Term and select the end of the event horizon or
segment to be terminated. VelPAK then draws a vertical line from this point to the event
horizon above.
Click Once
On depth conversion VelPAK ignores the terminated event horizon area and the event
horizons immediately below it.
[Diagram: surface and profile fault planes, showing the view looking at a profile and the
view forming the Allan diagram along the fault.]
The standard form of looking at an Allan Diagram is to select the directional axis most
appropriate to viewing the diagram ‘head on’; down the throw of the fault.
[Screenshot: the Allan diagram without event horizons displayed.]
A standard Allan diagram of a fault on random lines which may not be straight would bring all
the lines ‘up’ or ‘down’ to lie on the one flat plane of the diagram. The lines may therefore not
lie at regular intervals along the diagram.
The view perpendicular to the standard Allan diagram would display the faults where they lie.
Profile faults would therefore be hidden behind other Profile faults. Since the display is
‘hollow’ all the profile faults would actually be displayed, but the results may appear
confusing.
Example 2
Checking for Fault Abutments.
Display Options
Previous/Next Arrows
Use the previous/next arrows at the top of the module tab to move to the previous or next
Event, Well or Fault within the VelPAK
model.
Display Types
Note: the data would need to have been depth converted for the depth display.
Visible Layers
The only mode available is Line Pick mode, with the only edit option available being Fault
Segment Pick.
Clicking on a fault will give you the profile with a black line at that point.
Fault Dialog
Fault Dialog- General Tab
When you have made your edits, press APPLY to apply the changes and activate the
option.
Use the down arrow () to select from the available inputs.
Display
Display_Type - Choose to display faults in Time or Depth.
Option
Line_Label - Choice of labelling displays which best suits the density of data you are
displaying.
Line_Type - Choose which set of lines to use. The choice here between in-lines and
cross-lines will affect the directional axis required for a standard Allan diagram.
Random is also available as an option.
View_Direction - You have the option of selecting the best directional axis in which to
view the fault plane ‘head-on’; along the throw of the fault. The other directional axis can
be selected to view the display perpendicular to the standard Allan diagram.
For example; Viewing South - North instead of North - South would simply ‘flip’ the display
you see.
A standard Allan diagram of a fault on random lines which may not be straight would bring all
the lines ‘up’ or ‘down’ to lie on the one flat plane of the diagram. The lines may therefore
not lie at regular intervals along the diagram.
The view perpendicular to the standard Allan diagram would display the faults where they
lie. Profile faults would therefore be hidden behind other Profile faults. Since the display is
in fact hollow, all the profile faults would actually be displayed, but the results may appear
confusing.
[Screenshots (same data): VIEW NORTH-SOUTH, the standard ‘head-on’ directional axis;
VIEW EAST-WEST, the axis perpendicular to the standard diagram.]
Overlays
Overlays_Into - Yes/No
Overlays_Leaving - Yes/No
Choose to have just one set of lines on the display for event horizon data for one side of
the fault plane, either going into the fault as displayed or leaving the fault (depending on
the directional view selected), or have both sets of lines displayed, showing event
horizons going into and leaving the fault.
If ‘Yes’ is selected for both, the thinner lines will represent event horizons going INTO the
fault, thicker lines represent event horizons LEAVING the faults, according to the
directional view selected.
[Screenshot: thinner lines show an event going INTO the fault; thicker lines show an event
LEAVING the fault.]
When you have made your edits, press APPLY to apply the changes and activate the
option.
Display
01_Display/02_Display - Yes/No options allowing you to select the event horizons you
wish to display.
Use the down arrow () to select from available inputs.
The default is to just display the profile fault planes. You can if you wish turn these fault
planes off and just view the event horizons.
2. In Surface Module, turn the Visible Layers on for Flag data, including labels, and Faults.
You may or may not also want to have the line locations displayed.
3. Select Fault from the Edit Mode and then Segment Insert.
4. Use your cursor to click on every Fault Flag location you wish to include in your fault
plane. You may well need to zoom in close to be sure you are getting the fault plane
points close enough to the flag points for them to be successfully snapped onto them.
5. Use the Fault POINT MOVE option to move the fault segment points you have just
inserted to snap onto the Fault Flag points. Black, unsnapped Fault Plane points will turn
red when they are snapped onto the points.
Note: You can change the range for this Fault snapping process to pick up on faults, and
you can even turn the automatic snap feature off altogether, all available from the
Surface Display General dialog. Go there.
A useful check at this point to see if all your points are snapped is to turn all the
displays OFF on your Map, apart from Fault. You will then see at a glance if any fault
points are black and not red.
6. NAME your fault segment/plane using the Segment Edit option. Select this option and
then click on a data point for the Fault Plane you wish to name. The following alert box will
pop up.
This option will name/rename not only the Fault Plane you have just defined, but will
also rename the profile faults for that plane, if you answer ‘OK’ to the alert box that
follows. The Profile faults must be named the same as the Map fault for the Allan
diagrams, under the Fault Module, to work successfully for all profile faults along the
plane.
Situation 1
Situation 1 - referring to the diagram above.
Clearly on a 3D survey it becomes easily apparent that some lines are not being
displayed on the Allan Diagram.
Reasons for this:
1. The simplest reason is that the fault flag on the missing line(s) has not been picked, or
that the flag has been picked but the pick was not close enough to the point for it to be
automatically snapped onto it (turning the black Map Fault Plane point red).
To solve this, go back to the Surface display and edit/insert the Surface fault plane points.
2. The Profile Faults for the missing line(s) are not the same name as the Surface Fault
Plane. The profile faults should be automatically named the same if you answer Yes to
the prompt when you come to name your Surface Fault Plane. However it is possible to
re-edit the names of these profile faults (especially if you are selecting more Surface
Map Fault Planes for another event horizon and the same profile fault is selected).
To solve this, in Profile mode do a segment edit on the profile fault not appearing and
change its name manually.
Situation 2
Situation 3
On the profile you can then use Segment Edit on the faults on display to see what
their names are.
You will then see at a glance if any fault points are black and not red. This means they are
not snapped. This check can be further enhanced by using the Fault+Labels option so
that the names of the faults are displayed on your map.
This will display not only the segment numbers and direction of the event horizons, but
also display the names of your profile fault segments. This gives a very quick method of
checking exactly which profile fault has actually been picked, and moving through
Previous and Next profiles you can see quickly if a different fault has been inadvertently
named on one (or more) profile.
Display Options
Previous/Next Arrows
Use the previous/next arrows at the top of the module tab to move to the previous or next
Event, Well or Line within the VelPAK model
Display Type
Select type of Surface display to view.
Basic Surface
The Basic option displays on-screen a basemap stick plot of the surface data within the
memory model, along with an XY grid around the surface.
Selection of a new profile to be displayed in the memory model takes place by clicking the
mouse on the required displayed line.
The color varies on lines over the area depending on the line’s current state:
Ribbon Map
This will construct a polygon of color varying with the time/velocity/depth values that are
posted along a seismic line.
You DO NOT need to SNAP your data before displaying it as a Ribbon Map.
The size of each polygon, and hence the thickness of the color ribbon along each seismic line
is defined by the value at which the ‘ribbon value’ on the options dialog is set.
• Select the type of Ribbon map you wish to display; Snapped, Unsnapped or Merged
Time, Velocity or Depth or XYZs from the Surface Dialog page.
The Ribbon Size Scale Bar allows the ribbon thickness to be changed. These values are
simply relative to each other; a value of 1 will produce a smaller ribbon than a value of 5.
Contour Map
The Contour Map option provides an on-screen contour display, with or without basemap
details according to the set-up. In order to produce the contour map, VelPAK uses the
gridding parameters specified within the GRIDDING Dialog.
Shaded Contour
The shaded contour map produces a map display similar to the one shown below.
The shades of colors used are selected on the Surface Dialog.
The colors are according to the color shading table selected. A key down the right-hand side
of the display shows what values the colors represent.
There is the option for gray shading if you want to produce a hard copy display on a black and
white printer.
The contour interval required is defined on the Surface Dialog.
Display off
Display on
Label off
Label on
Turn a display element ON or OFF on the Surface, with or without element labels.
Displaying one or all of these display types will give you the option of EDITING, INSERTING
or MOVING points and/or segments (depending on what type of data you are in). Go
here for details of these procedures.
Note: These layers can also be made visible or not via the Surface Display General
Property Grid.
Select XYZ, FAULT, POLYGON or GRID from the Layers Visible drop down. Select the
relevant label from the Layers Visible drop-down to see the values for that element.
Note: This option is intrinsically linked to the Model Tree and Surface&Slot selector. In order
to get your values displayed (be they points or points and numerical values), you need
to select the relevant input on the Model Tree or Surface&Slot selector under the
relevant event horizon.
For example:
Note: You must have the same event horizon selected on the Model Tree/Surface&Slot
selector for a surface to be displayed with this selected information on it.
The full Stack Information will display the Line Name (Inline) and Shot Point and the
following information for that point.
Type -
Old (Dix) = 0
Average = 1
Interval = 2
RMS = 3
Unknown = 4
Well Locations displayed using this feature will be displayed through all event horizons.
Deviated Well track information is taken from the Curve file, and can therefore be displayed by activating the Well Deviation Overlay.
A Black Well
is your currently selected well. On selection, any colored well will turn black. The well will return to its original color on de-selection.
A Green Well
Means there are:
TOPS and
CURVE data stored for that well in the Model,
and the layer has been defined for this event horizon either in Time or Time and Depth.
See The Well Log Module.
An Orange Well
Means there are:
TOPS and
CURVE data stored for that well in the Model, but the layer has not been defined for
the selected event horizon.
A Blue Well
Means there are
No TOPS but either a real or a dummy CURVE for that well in the Model.
Turn data elements ON for editing and selection on-screen using Edit. See Surface Edit
Icons (below) for details. Default is None so no Edit type will be selected.
Insert Delete Edit Move Pick Insert Delete Edit Move Split Merge
Point Edit Segment Edit
Note: The Edit Icons will be shown as ACTIVE depending on what Element of data you are
editing. Select the Element from the Mode Drop-Down. If the Edit Icon is grayed out
then it is not available as an Edit method for that Data Element.
Each data type available for display within the Surface Module also has various edit facilities
associated with it.
These are used not only to EDIT or MOVE data points on-screen but also to INSERT new
data points or segments of data such as Fault Planes or Polygons.
Random 2D lines can also be added to the model through this method. See Random Line
Generation.
The actual edit procedures for Points and Segments appear in the same form as they do for
Event Profile editing; however you will notice that not all edit procedures are available for all
data types.
The middle mouse button on Point Pick mode for some (not all) data types will give you an
extended data listing in the Console.
See individual data type editing below for details.
Note: To successfully use the edit procedures for each type of data, make sure the data type you want to edit is switched on from the Edit Mode drop-down. If it is not turned on, VelPAK will prompt to switch the data type on for you.
Deselect Invert
Lines
Profile selection for 3D displays
Selection is by pressing the CTRL key on the keyboard along with the left mouse button while in Line Point Pick Mode.
Any number of lines can be selected for display as profiles on the 3D display.
Whatever display type is set on the Surface-Display-General tab (Inline/Crossline/Random/All), this will be the type of line available for selection for display as your 3D profile.
Wells
This option is only for selecting or inverting your selection of wells for display on a 3D display. To select wells use the Edit Mode Well -> Point Pick and, as in the profile selection above, use the Ctrl key on your keyboard and the left mouse button to select or de-select the wells you wish to view in the 3D display. Use the Deselect or Invert icons to deselect or toggle (invert) your entire selection.
Deselect Invert
Wells
It is not necessary to assign a name to your data in the Model Tree slot, but if you have many different fault patterns, for example, these would all be stored under different slots of the Model Tree to keep them separate, and you may find it easier to name them to keep track of them.
To export these internally created files, use the File - Export options from the top menu of VelPAK. Selecting the correct data type to output will place an external file in a selected directory.
See Data Export.
Point
Insert Delete Edit Move Pick
Edit facilities available for XYZ are, by their nature, all POINT edit options.
Insert
Allows you to add XYZs to your model. In this mode, click on the window where you want
a new XYZ to be, and the following box will appear, asking you for a value for your new
XYZ point.
Delete
Deletes XYZ points by clicking on them on the window. No alert will come up for
confirmation of this process...proceed with caution!
Edit
Edit your XYZ by clicking on your selected XYZ point. The following box will pop-up for
you to change the value as required.
Move
Click down on your selected XYZ and keeping the mouse button depressed, drag your
XYZ over to its new location. No alert will come up for confirmation of this process, and no undo function is available...proceed with caution!
Pick
Will bring up information of your selected XYZ point. The information is the X,Y location,
the Z value and the PNT is the line in the data file your selected XYZ is on.
Point Segment
Insert Delete Move Pick Insert Delete Info Move Split Merge
For Editing Surface Fault planes but also for INSERTING new surface fault planes and the
FAULT module (display of Allan diagrams).
For Surface Fault Planes to be used successfully within VelPAK for Allan diagrams, when
inserting fault planes the points must be SNAPPED to the flag markers from the Event
Profiles. The snapping takes place automatically if you get the surface fault point close
enough to the Event Flag point. (The snapping mentioned has no connection to the VelPAK
snapping module.)
These edit options react in a similar way to the Event Module Point and Segment options. For
details of how to use any of the options (for example how to insert to the left or to the right of
the point you are on) see Profile Edit Icons in Detail.
Point Insert
This allows you to insert extra points - one at a time - onto a selected fault plane.
Point Delete
This allows you to delete points from a selected fault plane.
Point Move
Moves fault plane points. In this case usually towards an event horizon flag point. Moving
the fault plane point to within a short distance of the event horizon flag point will cause the
fault plane point to automatically snap onto the event horizon flag point, which is the
desired effect for the Fault module. Fault plane points that have successfully snapped
onto the Flag point appear red; ones that are only ‘close, but not touching’ appear black.
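The snap behavior described above can be sketched roughly as follows; the point representation and `tolerance` parameter are illustrative assumptions, not VelPAK internals (the actual snap range is set on the SURFACE Dialog):

```python
import math

def snap_point(fault_pt, flag_pts, tolerance):
    """Hypothetical sketch of the auto-snap: if a moved fault-plane
    point lands within `tolerance` of an event-horizon flag point, it
    snaps exactly onto that flag (drawn red); otherwise it stays where
    it was dropped (drawn black)."""
    for flag in flag_pts:
        if math.dist(fault_pt, flag) <= tolerance:
            return flag, True    # snapped -> drawn red
    return fault_pt, False       # unsnapped -> drawn black
```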
Point Pick
This gives information in the status bar (and also in the Console window if it is open)
about a selected surface fault point within the display.
Segment Insert
This allows you to insert a new Surface Fault segment on the current event horizon. This
would usually involve joining Flags from the Event Profile marked on a basemap to create
a new Surface Fault plane.
On activating the Segment Insert option, the first point of the data can be selected by
clicking and holding down the left mouse button. By releasing the mouse button over the
desired area the end of the segment is specified and the segment drawn.
For full details of this important option see How to insert a Surface Fault Plane.
Segment Delete
This option will cause an entire fault segment to be deleted. Selecting a data point within
a segment will cause that segment to be highlighted.
There will be an alert box asking for verification of the process before deletion takes
place.
If you wish to delete only part of a fault you can use the SPLIT function described below
to break the surface fault.
Segment Info
Selecting Info and then a particular fault segment will bring up a dialog box where the
segment name is displayed.
In order for the Fault Module to work successfully, your Surface faults must have a
designated name. On changing the name in this dialog box and pressing OK, the
following alert will appear:
Replying ‘OK’ to this will name/rename not only the Fault Plane you have just defined, but
will also rename the profile faults for that plane (providing your fault plane points have
been successfully snapped onto the flag points such that they appear RED on the VelPAK
display). The Profile faults must be named the same as the Surface fault for the Allan
diagrams, under the Fault Module, to work successfully for all faults along the plane.
Segment Move
Allows you to move an entire fault to another area of the surface.
On activating the segment Move, select the fault segment you wish to move, by selecting
a point on the segment, and drag it to where you want that point to lie in the segment’s
new position. On releasing the mouse button the entire segment will move to that position
and an alert box will ask you for confirmation of the move.
Segment Split
Allows you to split a designated fault segment. Select the point you wish the fault to be
split at.
An alert box will ask you for confirmation of this action.
Segment Merge
This allows you to merge two discontinuous faults together.
On activating the Merge option, the end data point of the segment of data must be found using the cursor arrow. Keeping the mouse button pressed, the arrow is then dragged over to the other point of the segment to be merged with. A message will appear, “Merge specified segments”, if the data point has been found.
The NAME of the fault will be taken from the fault segment picked first to merge with the other segment.
Go into the Surface&Slot selector (or Model Tree) and, for the event horizon you are working in, select the slot under Faults you wish these data to be stored in. If you do not select a slot then the data will default to the first slot (Time). You do not need to assign a name to this slot yet - or ever if you wish - but it is useful to name it eventually to keep track of what that slot within the Model Tree page holds.
Select FAULT SEGMENT INSERT. Use your cursor to click on every Fault Flag location you
wish to include in your fault plane. You may well need to zoom in close to be sure you are
getting the fault plane points close enough to the flag points for them to be successfully
snapped onto them.
Use the Fault POINT MOVE option to move the fault segment points you have just inserted to snap onto the Fault Flag points. Black, unsnapped Fault Plane points will turn red when they are snapped onto the points.
Note: You can change the range for this Fault snapping process to pick up on faults, and
you can even turn the automatic snap feature off altogether from the SURFACE
Dialog. Go there.
NAME your fault segment/plane using the Segment Edit option. Select this option and then
click on a data point for the Fault Plane you wish to name. The following alert box will pop up.
This option will name/rename not only the Fault Plane you have just defined, but will also
rename the profile faults for that plane, if you answer ‘OK’ to the alert box that follows. The
Profile faults must be named the same as the Surface fault for the Allan diagrams, under the
Fault Module, to work successfully for all profile faults along the plane.
Point Segment
Insert Delete Move Pick Insert Delete Edit Move
Segment Insert
This allows you to insert a new polygon on the current event horizon.
Note: The last point inserted always joins up with the first point of the polygon. Each
polygon created will be one complete and separate ‘segment’ on the display.
Segment Delete
This option will cause an entire polygon to be deleted. Selecting a data point on a polygon
will cause that segment to be highlighted.
There will be an alert box asking for verification of the process before deletion takes
place.
Segment Edit
Selecting a particular polygon from a number you may have displayed for that event for
that Model Tree Data type will bring up an alert allowing you to name that polygon - or see
the name of the individual polygon if it had been brought in from your original project.
Individual named polygons can be used within the Analyze module.
Segment Move
Allows you to move an entire polygon to another area of the map.
On activating the segment Move, select the polygon you wish to move, by selecting a
point on the polygon, and drag it to where you want that point to lie in the polygon’s new
position. On releasing the mouse button the entire polygon will move to that position and
an alert box will ask you for confirmation of the move.
Point
Delete Edit Pick
No SEGMENT EDIT Features
Edit facilities available for Grid nodes are, by their nature, all POINT edit options.
Delete
Changes the GRID node value to Indeterminate (INDT) by clicking on the selected node on the window. The grid node will turn from blue to purple. No alert will come up for confirmation of this process...proceed with caution!
Edit
Edit your Grid node by clicking on your selected node on-screen. The following box will
pop-up for you to change the value as required.
Pick
Brings up a hover window over the points dynamically over the grid.
Point Segment
Insert Delete Edit Move Pick Insert Delete Edit
Note: Unlike all other display elements Line data does not need to be switched on/displayed
to be able to use Point Pick to pick a line. This is due to the large volume of data that
can occur with line data; you may not always want to see it all even though you want
to pick from it.
Segment Edit
Select the Edit Info option on the random line you have just created to rename the line as
you wish.
Segment
Insert
3. Use left hand mouse button to insert your Random line over your data area.
4. The name of your line will default to be ‘None’. Select the Edit Info option on the line you
have just created and rename the line as you wish.
Segment
Info
5. You now have a line with as many shotpoints on it as points you have drawn. There will be a blank profile attached to this line; to produce the Random profile, you need to go into Profile mode. See Profile Module - Profile Mode.
Note: Random Profile generation works by taking the data from SNAPPED profile data,
only.
[Diagram: a Random Line inserted through points marked X on the map]
There are no options to edit event horizon data from within the Surface Module.
To edit profile data you would need to go into the Profile module.
Point
Delete Edit Pick
For deleting Velocity Stack Points and reading velocity information on individual points.
Pick
Using the left-hand mouse button on Point Pick to bring up all the information for that
point.
Edit
Select the stack point: usually only used to change the stack point’s Fit status to be active
or inactive in the Optimize and Curve modules.
Point
Insert Delete Edit Move Pick
No Segment Options
For editing aspects of the Surface Well displays and displays under the layer definition for the
selected well. New Well locations can also be INSERTED from here.
Insert
Select the point where you want to insert a well or select a well to edit and press the
mouse button. This Insert dialog will appear allowing you to assign the name to the well
and also to fine-tune the XY Location as required.
Delete
Select the well you want to delete. An alert will appear for confirmation.
Edit
Well Edit is used to not only edit the XY location or name of a well but also the status of
the well.
Also the value (Yes or No) for the well’s Fit and/or Residual values can be made active or
inactive in the Optimize and Curve modules.
The well selected can be assigned in a particular group.
Deselect Layer? Yes/No - will deselect the layer for the current event layer displayed in the Surface module (and shown at the top of the Edit dialog box).
Note: Deselection of the layer cannot be undone. It will need to be redefined in the Layer Definition module.
The layer will be deselected in the Well Module and in the data tab of the Optimize module:
Move
Select the Well and drag it to where you want to re-locate it.
WARNING - No alert appears for confirmation about the move, and there is no UNDO
facility.
Pick
Hovering over the well location in Pick mode will show you information about the well.
Clicking on a well using the left-hand mouse button will select it and change the display in
the Well module to that well.
Point
Pick
Point
Pick
Deselect Invert
Lines displayed as a lighter or brighter green shade on the map have been selected for
mapping purposes.
DESELECT
Will deselect all basemap lines selected.
INVERT
Inverts your selection of lines. All your selected lines will become deselected and vice-
versa.
Single line selection and de-selection is by pressing the middle mouse button while in
Line Point Pick Mode.
The Surface Dialog under Surface Module deals with the display style and set up of surfaces
to be displayed in the VelPAK Surface Module window.
Tabs include:
General tab - Basic set up details for particular maps.
Range tab - Defines the range that the Surface display will show.
Color tab - Changes the color of various items on the Surface.
Mask tab - The setting up of the MASK value within VelPAK which allows a unique
number to be given to any geological situation occurring within the memory model.
When you have made your edits, press the Apply tick to apply the changes and activate the option.
Use the down arrow to select from available inputs.
Parameter - Calculates the Contour Increment from the Z range of the data on display. Pressing this will automatically update the Contour - Contour_Increment value further down this Property Grid. Roughly speaking, the contour increment is calculated to give you approximately 25 levels between the minimum and maximum irrespective of the magnitude of the range; it is computed to be a number divisible by 2, 4, 5, 10 etc., or the inverse of this if the number falls between 0 and 1.
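As a rough sketch of that heuristic (the exact rounding rule VelPAK applies is not documented, so this is an approximation):

```python
import math

def contour_increment(z_min, z_max, levels=25, nice=(1, 2, 4, 5, 10)):
    """Approximate the auto contour increment: roughly `levels`
    contours between min and max, rounded to a 'nice' step (1, 2, 4,
    5 or 10 times a power of ten; for ranges between 0 and 1 this
    naturally yields fractional steps, e.g. 0.04)."""
    raw = (z_max - z_min) / levels
    exponent = math.floor(math.log10(raw))  # power-of-ten scale of the raw step
    mantissa = raw / 10 ** exponent         # 1.0 <= mantissa < 10.0
    step = min(nice, key=lambda n: abs(n - mantissa))
    return step * 10 ** exponent

print(contour_increment(0.0, 2500.0))  # -> 100
print(contour_increment(0.0, 1.0))     # -> 0.04
```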
Option
Display_Type - Same selection as Display Type drop down on main Surface window. Go
Here for details.
Line_Type - Random/ In-lines/ Crosslines/ All/ Current
Select the type of lines you wish to have displayed on the surface display. This also
makes the Line Type selected available for selection from your surface display to be
shown as a profile on the 3d display.
Spacing_Inline /Spacing_Outline - Gives control of inline and crossline spacing for
clearer displays
Well_Box - Yes/No - draws a white box around the Well Label to be able to read the label
on a busy map.
points inserted that are close to Surface Fault point will not automatically snap to the
point.
Ribbon
Ribbon_Size - Select the ribbon thickness. These values are simply relative to each
other; a value of 1 will produce a smaller ribbon than a value of 5. Default of 3.
Ribbon_Type - Time/Velocity/Depth/XYZ
Select the type of Ribbon map you wish to display. These data can come either from Profile data (Time, Velocity or Depth in Snapped, Unsnapped or Merged mode) or from an XYZ slot that has been selected from the Model Tree or Surface&Slot selector.
For a discussion of Snapped, Unsnapped and Merged Data, Go Here.
Volumetrics
Spill_Crest - None/Spill/Crest/Both - if selected, the spill point or the crest from the last
run within the VelPAK project is shown on the map on the target horizon.
Note: A summary display of the spill points for a multiple realisation run can be seen by selecting the XYZ layer visibility of the XYZ data slot ‘Height Above: Analyse Spill Location Values’. Go Here for further details.
Overlays/Labels - These are exactly the same as the ‘Layers Visible’ overlay options available at the top of the Surface module tab. The overlays and their labels can be turned on here via a Yes/No toggle or by the Layers Visible drop down. For full details of what these displays do, Go Here.
When displaying data on a Ribbon Map or using data for Gridding, you are able to specify what type of data you wish to use in terms of whether the data are Snapped, Unsnapped or Merged.
These are terms relating solely to the mapping of Profile data within the Model.
Note: If you are not using profile data to grid or Ribbon display then the Input would always
be coming from an XYZ slot within your model and the correct input to select would
always be Xyz_slot.
Snapped Data
In an area of intense geology within your profile-based model, where there is heavy
faulting or over thrusts, truncations or piercements, in order to be able to Depth Convert
the structures you would probably find you would need to Snap the data to make
continuous surfaces to depth convert.
For a fuller discussion about snapping Go Here.
Unsnapped Data
It may be that the whole model does not require snapping due to the nature of the
geology or time constraints. In this case the data can be left unsnapped.
Note: Surfaces which are not continuous throughout a profile, but are profile (layer-cake) depth converted, will experience velocity errors if left unsnapped. Use the QC layer display tools within the Depth Conversion routines to check for these errors. Snapping the data would ensure that the layers were continuous and thus no velocities could ‘escape’ into the incorrect layer.
Merged Data
Possibly the most likely scenario for profile-based models is that a lot of the area within
your model may not be geologically intense or significant enough to warrant the time
needed to snap the data. You would snap the area of interest but leave the rest of the
profiles unsnapped.
In this case you would select the merged data option for gridding and display. Merged
option takes the snapped profile data if it exists, otherwise it uses the unsnapped profile
data.
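The merged selection rule above amounts to a per-point preference for snapped data, which can be sketched in one line (the profile values here are purely illustrative):

```python
def merged_value(snapped, unsnapped):
    """Merged mode: use the snapped profile value where one exists,
    otherwise fall back to the unsnapped value."""
    return snapped if snapped is not None else unsnapped

# Per-shotpoint values along one profile (None = no snapped value there):
snapped_profile = [1210.0, None, 1275.0]
unsnapped_profile = [1205.0, 1240.0, 1270.0]
merged = [merged_value(s, u) for s, u in zip(snapped_profile, unsnapped_profile)]
print(merged)  # -> [1210.0, 1240.0, 1275.0]
```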
Note: You cannot snap part of a line and leave the rest unsnapped; the whole of a profile needs to be snapped from one end to the other for all event horizons. If this is impossible to do, you would need to split the line and hold it as two separate lines - with two separate line names - within the model. (Trim may be the best method to use to do this.)
Note: Leaving defaults as Auto, you would not usually have to change these values unless
you wanted to work only on a specific area of your model.
Get Current AOI - Copies the currently displayed Area of Interest (AOI) into the Range
extents and sets the XY_Type to "User".
Axis
Area Of Interest
Type
Current - sets the AOI to that defined in the dialog.
Polygon - sets the AOI to the extents of the currently displayed polygon set.
XY_Type -
Auto - Takes the extent of the model as the range displayed in the Surface.
Grid - Takes the range of the Grid currently selected in the Model Tree as the extent of
the display area of the Map.
User - Allows you to define your own Range entirely independent from Grid or Model.
Z_Type -
Auto - Takes the Zmin and Zmax of the model as the range displayed in the Z direction
on the Surface.
Grid - Takes the Zmin and Zmax of the Grid currently selected in the Model Tree as Z
direction
User - Allows you to define your own Zmin and Zmax Range entirely independent from
Grid or Model.
Range
X_Min/X_Max, Y_Min/Y_Max, Z_Min/Z_Max
If User is selected above, you will need to assign the Min/Max range of values for your surface display here.
Scale
XY_Scale - Adjust the display to real size as opposed to the usual ‘shrink to fit’ window
display.
Allows you to choose the color you wish to display the various elements on your map.
When you have made your edits, press the Apply tick to apply the changes and activate the option.
Color
F_Color - Fault flag marker displayed on your map (if requested) taken from your
snapped profile data.
FL_Color - Fault Lower flag marker displayed on your map (if requested) taken from your
snapped profile data.
FT_Color - Fault with no Throw flag marker displayed on your map (if requested) taken
from your snapped profile data.
FU_Color - Fault Upper flag marker displayed on your map (if requested) taken from your
snapped profile data.
Grid_Color - Color of Grid Nodes displayed on your map (if requested).
Indt_Color - Color of Indeterminate Grid Nodes displayed on your map (if requested).
P_Color - Piercement flag marker displayed on your map (if requested) taken from your
snapped profile data.
Polygon_Color - Color of Polygons displayed on your map (if requested).
Note: For Polygon colors - old models will use yellow as the default color; new models will default to the more easily visible orange.
T_Color - Truncation flag marker displayed on your map (if requested) taken from your
snapped profile data.
XYZ_Color - Color of XYZs displayed on your map (if requested).
When you have made your edits, press the Apply tick to apply the changes and activate the option.
The MASK value within VelPAK allows a unique number to be given to any geological
situation occurring within the memory model.
This MASK number is written in HEX.
VelPAK supports 32 event horizons; each event horizon ID is represented internally by setting a bit in a 32-bit word.
This option gives you a chance to map out a particular MASK and see its extent over the
model area.
Taking the scenario below for example:
Type in the HEX relevant to the geological situation in the Mask tab.
Select Ribbon Surface = Yes to produce a ribbon map of the Mask value. (If you leave this as No you will get a Ribbon map display of whatever is currently selected in the Surface Display Property Grid.)
Note: You must select RIBBON MAP option on the main SURFACE window of VelPAK
Display drop down to have your chosen ribbon map displayed.
Bit Number:   1  2  3  4   5   6   7    8    9  etc.  32
Binary Value: 1  2  4  8  16  32  64  128  256  etc.
Bit Number:        1 2 3 4 5 6 7 8 9
Binary Mask Value: 1 0 0 1 0 0 0 0 0
In the above example; the Hex number 9 represents Event 3 on a fault plane.
Example 2) Event 3 and 4 on a fault plane would be represented thus:
Bit Number:        1 2 3 4 5 6 7 8 9 10
Binary Mask Value: 1 0 0 1 1 0 0 0 0 0
In the above example, the Hex number 19 represents Event 3 and 4 on a fault plane.
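The two worked examples can be reproduced with a short sketch. The bit layout (bit 1 flags a fault plane, event N sets bit N+1) is inferred from those examples rather than stated explicitly, so treat it as an assumption:

```python
def mask_hex(events, fault_plane=False):
    """Sketch of a 32-bit MASK value. Assumed layout, inferred from the
    manual's examples: bit 1 marks a fault plane; event N sets bit N+1."""
    mask = 1 if fault_plane else 0
    for n in events:
        mask |= 1 << n           # event n -> bit n+1 (bits numbered from 1)
    return format(mask, 'X')     # MASK values are written in HEX

print(mask_hex([3], fault_plane=True))     # Event 3 on a fault plane -> 9
print(mask_hex([3, 4], fault_plane=True))  # Events 3 and 4 on a fault plane -> 19
```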
MASK 2
It is possible for more than one bit to be set, for example, an inserted point may have been
moved (ASCII 3).
Various utility programs to further manipulate these data fields are available from Software Technical Support.
Grid Dialog
Gridding allows you to grid data from a variety of different sources:
Grid time, velocity or depth data from a selected event horizon on a profile in your
model. This can either be snapped, unsnapped raw or merged (both raw and snapped).
Grid any XYZ data file stored in the XYZ slots from the Model Tree page.
VelPAK uses GTGRID as its basis for Gridding, providing access to a library of gridding routines prepared by Geophysical Techniques (GT), Inc.
Fault Polygons and other Polygons can be used within the gridding methods.
Model Tree
Time Grid1
Interval Grid1
showing four
Grids stored
under Event 1
Grids are also selected in the Surface&Slot selector for each event horizon, along with other VelPAK elements - XYZ files, Polygons, and Faults. For each event horizon the Surface&Slot selector will show which Grid is currently selected - this will be the grid that is displayed on the Surface display.
4. On the Grid General tab select where your Input data are going to come from. If it is an
XYZ file select what slot in the Model Tree the input XYZ file is stored:
5. On the Grid General tab select where you want your Output grid to go:
6. On the Grid Range tab set up the Range for your grid, selecting from the Range Tools at the top of the tab: get the range from the XYZ file, from a defined parent Grid, or from the extremes of the entire Model, or use Blank parameters to manually enter values.
Range Tools
7. If you are using a Parent Grid to define the Range set up the Event Horizon and Slot
where this can be found.
8. Press the Parameter Tool at the top of the tab to automatically fill in the gridding
parameters.
Parameter
Tool
9. Press the Apply tick to start the Gridding Process and select OK to the
Confirmation of Gridding Parameters Alert.
10. To see contours displayed on your map make sure you have the display type set to
Contour or Shaded contour.
11. If the contours are too close together or there are too few of them change the Contour
Increment in the Surface General display.
Note: The Variogram and Data tabs are only used if Krig is selected as the Gridding Method.
When you have made your edits, press the Apply tick to apply the changes and activate the option.
Use the down arrow or double click on the input name to select from available inputs.
The property grid will change depending on which gridding method has been selected.
Details of the inputs can be found with the gridding methods in this manual.
Option
Tools
Tools
Delete - Will delete the grid currently selected in the Output_Grid slot. If you have changed the AOI of the grids in the VelPAK model it may be advisable to delete grids, otherwise there may be errors. Used mostly in the wizard and workflows.
XYZ Parameter calculation - Used to compute certain gridding parameters from the XYZ data for particular gridding methods - for example, if Fast Kriging is selected as the gridding method this will calculate the Kriging Mean of the data values.
Option
Method - Select the Gridding method to use.
When you click on Method, click on the down arrow to the right (as usual) to bring up the selection. Not ‘as usual’, however, is that this will then bring up a set of tabs where you can select the type of grid and the actual grid method you require.
Move through the types using the left/right arrows that will appear.
See Methods available which summarizes the advantages and disadvantages of each
method and the types of data best suited to each method.
Type - Selecting ‘Grid all Events’ will generate output grid files for each event horizon. The output grids will be stored in the slot selected in Output grid, under each separate event horizon.
Verify - Set to Yes for an alert dialog to appear before gridding proceeds and a grid is generated.
Input
Input - Select what type of data you are going to use for gridding; snapped or unsnapped/
raw or merged profile data in depth, time or velocity or XYZ data files from any variety of
sources and currently stored in an XYZ slot within the Model Tree.
For a discussion of Snapped, Unsnapped/Raw and Merged Data, Go Here.
Input_Drift - used in Kriging - Select the data slot from the Model Tree where the grid is
being stored. This allows you to input a grid data set which is present at all locations
which are to be estimated by kriging, to use as a trend. The resulting map from kriging
with an external drift will honour the “hard” input data points, but will follow the trend or
shape of the external drift where there is no hard data. Go here for further details of
External Drift.
Input_Fault - If faults are being used in the gridding routine - select the data slot from the Model Tree where the fault polygons are being stored. For example, selecting ‘General 1’ here will access the Fault Polygon data set that is currently sitting under the Fault branch for the event horizon you are in.
Input_Polygon - If polygons are being used in the gridding routine - select the data slot from the Model Tree where the polygons are being stored.
For example selecting ‘General 1’ here will access the Polygon data set that is currently
sitting under the Polygon branch for the event horizon you are in.
Input_XYZ - If XYZ data files have been selected as the input data type - select the data slot from the Model Tree where the XYZ data file is being stored. For example, selecting ‘Interval Velocity’ here will access the XYZ data set that is currently sitting under the XYZ branch for the event horizon you are in, under ‘Interval Velocity’.
Output
Output_Grid - Select which slot in the Model Tree the generated grid file is to be placed in. This will relate to the event horizon you are gridding. If you have selected to grid all event horizons, a grid will be placed in the same slot for each of the event horizons being gridded.
Output_Variance - Select a slot to place a grid of variance - if required; a grid of variance
is produced from Kriging showing the uncertainty at each of the output points.
When you have made your edits, press to apply the changes and activate the
option.
The Range Tab allows you to set the X, Y and Z range for the grid to be generated, as well as
other gridding set-up inputs. You can either type these values in yourself or bring values in
from other sources: the Model range, the XYZ file range or a ‘Parent’ grid range. Since grid-
to-grid processes within VelPAK rely on the grids being entirely compatible and congruent,
using a Parent Grid to determine the range of the grid you wish to grid up is very often
the method to use.
Option
Range Tools and Parameter Tool
XYZ - Use the XYZ file (selected in the General Tab) to calculate the Area of Interest.
Grid - Use the Parent Grid (entered at the bottom of this tab) to calculate the Area of
Interest.
Parameter - Use the data to calculate gridding parameters. This option will estimate the
Interpolations and the Radius given the data and the range that you wish to grid up.
NOTE that this must be done AFTER the Range tool has been used to define the AOI of
your grid!
The Parameter Tool takes the area of the grid definition and divides it by the number of
data points it finds. Once it knows the area per point, it then works out the radius of the
circle which would encompass, on average, 64 points. If the density of the data you are
working on is very variable this will not work correctly, and you will need to work out a
suitable radius yourself.
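The calculation described above is simple enough to sketch. The function below is an illustration of the area-per-point logic, not VelPAK's internal code; the function name and the default of 64 points are taken from the description above:

```python
import math

def estimate_search_radius(area, n_points, target_points=64):
    """Estimate a search radius whose circle would contain, on average,
    target_points data points, assuming roughly uniform point density."""
    area_per_point = area / n_points
    # pi * r^2 = target_points * area_per_point  =>  solve for r
    return math.sqrt(target_points * area_per_point / math.pi)
```

If your data density is very uneven, this average-based estimate breaks down, which is exactly why the manual suggests working out a suitable radius yourself in that case.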
Area Of Interest
Increment Factor - default = 1. If required, a value of 10 or 20 would be entered here, but a
factor of 0.1, for example, can also be used.
This will multiply the XY Increment by the given factor and recalculate it to generate a
(usually) coarser grid without changing the XY increment set-up itself.
A default of 1 will use the XY increment as defined in the inputs below in the property grid.
A particular usage of this option would be when using the Krig_Sgs method of
gridding, or some other form of Kriging, which can be very slow using the standard XY
increment from your project. The grid can be generated using an increment factor, but
in order to make the grid match the Parent grid in the project the data points would
have to be output from VelPAK, read back in, and used to make a grid compatible with
the Parent grid using a quick gridding method such as ‘Global’.
Note that every time the grid is run using an increment factor it uses the set XY increment
from the parent grid as its base to calculate the increment. So the effect is not cumulative
if the increment factor is used more than once.
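The non-cumulative behaviour can be stated as a one-line rule. This is an illustrative sketch (the function and parameter names are invented for the example, not VelPAK identifiers):

```python
def effective_increment(base_xy_increment, increment_factor=1.0):
    """XY increment actually used for gridding. The factor always scales the
    parent grid's base increment, so running the grid twice with the same
    factor is not cumulative."""
    return base_xy_increment * increment_factor
```

Running the grid repeatedly with factor 10 always yields 10x the base increment, never 100x.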
Number_Columns - The defined number of columns of nodes which will be generated in
your grid. These values are determined by the program once you have input the XY Min/
Max and the XY Increment.
Number_Rows - The defined number of rows of nodes which will be generated in your
grid. These values are determined by the program once you have input the XY Min/Max
and the XY Increment.
X_Increment, Y_Increment - These control the size of the grid increment, in coordinate
units. This is an important variable as it not only affects the process run time but also the
quality of data fit. A ‘noisy’ detailed surface requires a fine grid to fit each point. A less
detailed surface only requires a coarse grid.
X_Maximum, X_Minimum, Y_Maximum, Y_Minimum - These values are deduced from
whichever XY source you choose, as defined from the Range Tools at the top of this page.
Alternatively, you can type your own range in from scratch. The program will round these
up and down according to your XY Increment. You can change these to select just a
portion of your data file to be gridded.
Parent
If you are to use parameters from a grid already in the model, this is where you enter the
Event Horizon number and the slot in which that grid is to be found. When you have
selected these, press the Range tool at the top of the tab to read the parameters into the
relevant AOI slots.
Parent_Event - Select the Event Horizon under which the parent grid is stored.
Parent_Grid - Select the slot from the Model Tree where the parent grid is stored.
Note: You must press the Range tool at the top of the tab to call the Parent grid values
in as the AOI.
[Variogram plot: estimated value per lag and variance model fit]
Note: The Variogram Tab is only used if KRIG is selected as the Gridding Method.
The variogram is a plot showing the spatial variation within a data set. This is a cross-plot
of distance between points against variance. To show the underlying trends, the data are
grouped into a number of “lags”, each lag corresponding to a range of distances.
The calculated variogram displays the points for each lag and the experimental model fit.
By changing the Nugget, Range and Sill in the Model parameters set-up, the curve can be
best fitted to the red lag points, and these values then used in gridding.
The process works on the XYZ data file selected in the ‘General’ tab of the Gridding
Properties.
[Variogram tab toolbar: Calculate Tool and Parameter Tool]
The square boxed Tools at the top of the Variogram tab are to calculate and display the
variogram and estimate the variogram parameters.
Calculate and Display - Using the parameters estimated from the Parameter tool,
this will calculate, display and update the variogram. An initial model is calculated and
displayed using a Spherical function. This can give you an idea of what the parameters
should be. It will be a reasonable approximation using the most likely values based on the
statistics drawn from the XYZs, but it will not handle outlying points well. You would
expect, therefore, to have to adjust these parameters before reaching a model you are
happy with.
Options:
Experimental
Direction - The direction vector in which to look. This is a compass bearing, so 0 is North.
Lags - The total number of lags to be considered. The largest separation between points
which will be considered will be (lag length) x (number of lags).
Length - The length of each individual lag.
Number - The number of XYZ points to be used in the calculation of the
experimental variogram. These points are chosen systematically and at equal spacing,
in the order in which they are loaded.
It is recommended that a small number of points is used first, until you are happy with the
rest of the Variogram parameters; a large number of points selected can make the
generation of the variogram very slow.
Tolerance - The tolerance (in degrees) either side of the direction. A tolerance of 90 will
include all data values.
Model
Anisotropy - The difference in range for different directions. For example, a data set
composed of North-South channels would have a much larger correlation range in the
northerly direction than in the easterly direction. In this case, there would be significant
anisotropy. Within VelPAK, the anisotropy is a number between 0 and 1. If the anisotropy
is 1, there is no anisotropy (the correlation length is the same in all directions). If the
anisotropy is 0.5, then the correlation length is twice as long in one direction as it is in the
perpendicular direction.
Azimuth - The direction of the Anisotropy. This is a compass bearing, so 0 is North.
Nugget - The value of the variogram at zero lag. Data samples at the same location may
have different values due to, for example, measurement error. If this is the case, the
variogram will not go through zero at zero lag.
Range - The distance at which the variogram reaches the “sill”. The value of the
variogram will generally increase with increasing lag, until it levels off at some value. The
distance at which this happens is called the range. This parameter is also known as the
correlation length. Points which are closer than this distance will show some degree of
correlation. Outside this distance, points are independent.
Sill - The value of the variogram where it flattens off - where there is no longer seen to be
any apparent correlation between the variance and the distance. Typically the lower the
value entered here the less noise would be seen on the grid.
If you require the SGS result to match a particular Standard Deviation then set the Sill
parameter to SD squared.
Type
The type of model function to be used. Each function gives rise to a different shaped
curve, corresponding to fundamentally different types of data distribution. The most
appropriate function is usually fairly evident from the experimental data.
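The Spherical function used for the initial model fit can be written down explicitly. The sketch below is an illustration of the standard spherical variogram model in terms of the Nugget, Sill and Range parameters described above, not VelPAK's internal code:

```python
def spherical_variogram(h, nugget, sill, range_):
    """Spherical variogram model: jumps to the nugget just above zero lag,
    rises smoothly, and flattens at the sill once h reaches the range.
    Beyond the range, points are treated as uncorrelated."""
    if h <= 0.0:
        return 0.0  # by convention the variogram is zero at exactly zero lag
    if h >= range_:
        return sill
    r = h / range_
    return nugget + (sill - nugget) * (1.5 * r - 0.5 * r ** 3)
```

Other model types (exponential, Gaussian, etc.) differ only in the shape of the rise between the nugget and the sill.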
Used in conjunction with the variogram tab for Kriging depth conversion methods.
Randomise - Pressing this will take a random value from within the Range Min/Max
and place it in the Range slot within the Variogram tab for variogram generation.
Range Min/Max - User entered values between which the program will generate the
random number for the Range within the Variogram tab.
Popular

Kriging_Fast
  Advantages: Optimised Ordinary or Simple kriging. No need to construct a variogram.
  Favourite method for residual or parameter mapping at well locations.
  Disadvantages: Can be slow with a large number of points.
  Best used for: Small to medium sized data sets. Will tie the data.

Kriging
  Advantages: Ordinary Kriging based on GSLIB. An opportunity to examine data and
  construct a variogram. One of the best gridding methods available.
  Disadvantages: Can be slow with a large number of points, and a variogram needs to be
  constructed. It is more complex to set up than other gridding methods and therefore more
  prone to user errors.
  Best used for: Small to medium sized data sets. Will tie the data.

Simple Kriging
  Advantages: Simple Kriging based on Gstat. An opportunity to examine data and
  construct a variogram. One of the best gridding methods available. Quicker than Kriging.
  Disadvantages: Slow (but not as slow as normal Kriging).
  Best used for: Small to medium sized data sets. Will tie the data.

Radial Basis Function
  Advantages: Good general-purpose method. Result similar to Kriging.
  Disadvantages: Tends to ramp off in areas of poor data. Unable to control with a
  variogram.
  Best used for: Small to medium sized data sets. Will tie the data.

Natural Neighbours
  Advantages: Very fast and stable method.
  Disadvantages: Poor in sparse data areas.
  Best used for: Large data sets with good coverage. Will tie the data.

Smoothing

Smooth
  Advantages: Generates a smooth map.
  Disadvantages: Only suitable for dense data sets.
  Best used for: Noisy data such as that derived from stacking velocities.

Loess
  Advantages: Good smoothing algorithm for noisy data such as stacking velocities.
  Visually very pleasing results.
  Disadvantages: Can take a long time if there are lots of input data points. Gridding time
  surfaces from 3D interpretation can take a long time. Does not always honour the input
  data, so not good for gridding well parameters such as interval velocity.
  Best used for: Stacking velocities. There is no limit to the size of dataset; however, the
  larger it is the longer it takes.

Specialist

Kriging Sgs (Sequential Gaussian Simulation)
  Advantages: Adds noise back into the data to undo the smoothing effect of Kriging. If you
  require the SGS result to match a particular Standard Deviation then set the Sill
  parameter to SD squared.
  Disadvantages: Slower than normal Kriging. Due to the randomization factor of this
  algorithm the grid will be different each time it is generated, and the corresponding
  contour map could also show dramatic differences, which could be disconcerting.
  Best used for: Error maps and depth conversion parameter maps such as V0. Typically
  only used in multiple realizations.

System of Linear Equations

Multi Level B Spline

Block Radial Basis Function
  Advantages: As Radial Basis but splits large data sets into blocks.
  Disadvantages: Can leave artefacts from blocking in sparse data areas.
  Best used for: Larger, dense data sets.

Deprecated

Global
  Advantages: Fast. Visually pleasing results.
  Disadvantages: Only works for a small data set (<10000 points). Performs excessive
  averaging on larger data sets.
  Best used for: Small data sets with no discontinuity (faults).

Random
  Advantages: Handles faults and large data sets well. Honours local gradient.
  Disadvantages: Not appropriate for noisy data - if data are noisy, spurious results can
  occur.
  Best used for: Random, large data sets with or without discontinuity.
Popular Methods
Kriging Methods: Kriging, Simple Kriging and SGS Kriging.
Kriging is a group of geostatistical techniques to interpolate the value of a random field at an
unobserved location from observations of its value at nearby locations.
Kriging belongs to the family of linear least squares estimation algorithms. As illustrated in
the graph below, the aim of kriging is to estimate the value of an unknown real-valued
function, f, at a point x*, given the values of the function at some other points x1, ..., xn. A
kriging estimator is said to be linear because the predicted value is a linear combination of
the observed values.
METHOD = ‘Kriging’
Kriging is a statistical gridding technique. Kriging produces the most statistically likely
map, given the statistical distribution of the input data. Put another way, the average of a
large number of random models (each of which is constrained to match the distribution
shown by the input data) will tend towards the kriged results; the variance of these
models will tend towards the variance from kriging.
Kriging is an “exact interpolator”: in other words, if a grid node is exactly coincident with
an input data point, then the grid node will be assigned that value. Otherwise, kriging
generates a smooth map, which tends to the mean value of the data at large distance
away from the data points.
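The mechanics of the "exact interpolator" property can be seen in a small worked example. The sketch below is an illustration of ordinary kriging with an exponential variogram model, solved directly; it is not VelPAK's GSLIB-based implementation, and all names and parameter defaults are invented for the example:

```python
import math

def exp_variogram(h, sill=1.0, rng=1000.0):
    """Exponential variogram model, reaching ~95% of the sill at h = rng."""
    return sill * (1.0 - math.exp(-3.0 * h / rng))

def solve(a, b):
    """Gaussian elimination with partial pivoting for the small kriging system."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            for k in range(c, n + 1):
                m[r][k] -= f * m[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][k] * x[k] for k in range(r + 1, n))) / m[r][r]
    return x

def ordinary_kriging(pts, vals, target):
    """Estimate at target as a linear combination of the data values.
    Weights come from the variogram between data points, with a Lagrange
    multiplier row forcing the weights to sum to 1 (unbiasedness)."""
    n = len(pts)
    a = [[exp_variogram(math.dist(pts[i], pts[j])) for j in range(n)] + [1.0]
         for i in range(n)]
    a.append([1.0] * n + [0.0])
    b = [exp_variogram(math.dist(p, target)) for p in pts] + [1.0]
    w = solve(a, b)[:n]
    return sum(w[i] * vals[i] for i in range(n))
```

A grid node coincident with a data point recovers that point's value exactly, and a node midway between two equal-distance points takes their average, matching the behaviour described above.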
The power of kriging comes partly from the large number of parameters that are available
to specify how the interpolation is to occur, particularly through the specification of a
variogram model. The ability to incorporate soft information through the use of an external
drift is also extremely powerful.
There are a number of excellent Geostatistical textbooks available, and it is
recommended that the interested reader obtains one of these if any of the concepts
mentioned in this part of the manual are unfamiliar.
The curve-fit module should be used to check that the drift data set is well correlated with
the data to be gridded. If a drift surface is used which bears no resemblance to the
primary data, the answers can look very strange.
Another consideration with external drift is the search radius. If the search radius is very
small, external drift kriging will still generate a value everywhere where the drift surface is
present. Away from the data values, the resulting surface will be identical to the drift
surface.
In the above diagram the thin dashed line shows how the data actually behaves, but this is
only known at the well locations. The solid line represents the velocity model derived by
kriging the values at the well points. It represents the best guess away from control points.
The thick dashed red line represents the uncertainty of the kriged estimate (lines of +/- two
standard deviations). The thin dotted line is a random realization generated using Sequential
Gaussian Simulation. It is no more accurate than the kriged estimate, but it is more “realistic”.
Note: Due to the randomization factor of this algorithm the grid will be different each time
it is generated, and the corresponding contour map could also show dramatic differences.
An advanced method of using SGS is to use it in multiple realisations, setting the sill to
the square of the Standard Deviation. The grids generated can be QCed in the Analyse
module in the Statistics -> Statistics Fly-out using Type = Standard Deviation - the grid-of-
grids generated should show the same value as the SD squared entered as the Sill.
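That QC can also be reproduced outside VelPAK. The sketch below assumes each realization has been exported as a flat list of node values (an assumption for illustration); the per-node standard deviation across the stack should approach the square root of the Sill entered:

```python
import statistics

def node_standard_deviations(realizations):
    """Per-node population standard deviation across a stack of SGS
    realization grids. realizations: list of equal-length lists of node
    values; node i aligns across all grids."""
    n_nodes = len(realizations[0])
    return [statistics.pstdev([grid[i] for grid in realizations])
            for i in range(n_nodes)]
```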
Smoothing Methods
METHOD = ‘SMOOTH’
The Smooth method uses a simple unweighted circular average method to generate a
smooth map. The only parameter it uses is the Search Radius. All the input data values
within the search radius of each node are averaged, and the result is assigned to that grid
node. There is no octant test, so the output grid will extrapolate beyond the data by the
search radius. You can chop off this extra coverage using PROC to constrain the grid to
have the same areal extent as one of your time grids, for example.
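The per-node calculation above amounts to an unweighted circular average. The sketch below illustrates a single node (the loop over the grid and the PROC trimming step are omitted, and the names are invented for the example):

```python
import math

def smooth_node(node_x, node_y, points, search_radius):
    """Unweighted average of all data values within search_radius of a node.
    points: iterable of (x, y, value). Returns None when no point falls
    inside the circle, i.e. the node is left unassigned."""
    inside = [v for (px, py, v) in points
              if math.hypot(px - node_x, py - node_y) <= search_radius]
    return sum(inside) / len(inside) if inside else None
```

Because every node with at least one point in range is assigned, the output extrapolates beyond the data by the search radius, as noted above.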
probability distribution; this affects how much weight is given to outlying points (those with
extreme Z values). Optionally, a series of iterations can be performed which progressively
down-weight points which are a long way from the regression surface. The following
parameters therefore need to be specified:
Span
Degree
Family
Iterations
For more information on LOESS please refer to CLEVELAND, W.S. and GROSSE, E.,
1991, Computational methods for local regression, Statistics and Computing, 1, 47-62.
Specialist Methods
METHOD = ‘ Krig_Sgs’ - see Kriging
METHOD = ‘Randomised’
Produces a grid of a constant value between -1.0 and +1.0. Useful in writing workflows where
a random scalar is required.
is assigned to data through the use of a weighting power that controls how the weighting
factors drop off as distance from a grid node increases. The greater the weighting power, the
less effect points far from the grid node have during interpolation. As the power increases, the
grid node value approaches the value of the nearest point. For a smaller power, the weights
are more evenly distributed among the neighboring data points.
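The effect of the weighting power can be sketched as follows. This is an illustrative inverse-distance function, not the GTGRID code; the `power` parameter plays the role of the weighting power described above:

```python
def idw_estimate(x, y, points, power=2.0):
    """Inverse-distance weighted estimate at (x, y).
    points: iterable of (px, py, value). A higher power makes the nearest
    point dominate; a lower power spreads weight more evenly."""
    num = den = 0.0
    for px, py, v in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0.0:
            return v  # node coincides with a data point
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den
```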
Deprecated Methods
METHOD = ‘Global’
The Global method provides an approximation for generating a minimum curvature
solution. This method can only be used on data that contains less than 10,000 points. It is
the fastest gridding method for small data sets, and also tends to produce the most
visually pleasing results.
This method employs point averaging whenever the number of points exceeds 1000.
This may cause the resulting surface not to pass directly through the input points that
were averaged out.
All nodes are calculated regardless of the input point distribution. The resulting surface as
represented by the Z values at each node in the grid is the exact solution to a set of
equations that approximates the relationships among a thin metal plate, a set of desired
displacements at the random input points, and the forces applied at each point.
By solving this system of equations the necessary values of the force are determined at
each point. These values are then used to compute the plate displacement at every grid
node. If no averaging is performed, this solution provides a surface that passes through
every point specified and has the least amount of curvature possible, as shown in Figure
1.
If averaging was performed to reduce the number of points, the solution will not
necessarily pass through those points that were averaged.
[Figure 1: Interpolated curve, shown in two dimensions, using the Direct method. No
averaging is performed.]
METHOD = ‘Random’
The Random method uses the Radial Search for Scattered Points algorithm. This is the
default method for GTGRID. This approach is good for large data sets that contain noisy
data.
The radial search uses randomly positioned input points to compute grid nodes in the
“immediate vicinity” of each input point.
The radial search technique relies on localized fits of a plane to the selected subsets of
the input points. Using each input point as an origin, a subset of the surrounding points is
formed by selecting the nearest two points in each octant about the origin. Each point is
then weighted according to its displacement from the origin. The plane is also constrained
to pass through the point selected as the origin.
If a successful fit is accomplished, the grid nodes about the point selected as an origin are
then assigned Z values.
Only grid nodes that are interior to the convex hull of the input data point distribution will
be assigned a Z value (the convex hull is a boundary that encompasses all the data
points. For example, if each data point were a nail in a board, the convex hull could be
represented by a rubber band stretched around all the nails). This is done to minimize
edge effects that are inherent to this approach.
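The "rubber band" boundary can be computed with a standard algorithm. The sketch below uses Andrew's monotone chain (a common convex hull construction, shown here for illustration; it is not the GTGRID code):

```python
def convex_hull(points):
    """Andrew's monotone chain: returns the 'rubber band' boundary around
    the points, in counter-clockwise order, excluding interior points."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]
```

Only grid nodes inside this boundary are assigned, which is the edge-effect safeguard described above.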
This process repeats using every input point as an origin. GTGRID attempts to assign all
unassigned grid nodes during secondary gridding. Note that some grid nodes are
computed more than once if the input point distribution is such that more than one point is
contained within a grid cell.
METHOD = ‘Cluster’
The Cluster method uses the Radial Search for Clustered or Linear Data algorithm. The
Cluster Method differs from the Random Method in that additional effort is employed
during the calculation of grid nodes near input points, to minimize the extrapolation of
excessive gradients due to noise or points that are very close together.
METHOD = ‘Weighted’
This method uses the Weighted algorithm. This is a relatively fast method of mapping
points onto a specified grid without any attempt at interpolation or gradient estimation.
Use this method if you want to input a set of values which were computed on the desired
grid size. Each grid node is a weighted combination of the data points surrounding it within
adjacent cells. Secondary gridding is then employed to fill unassigned grid nodes. If the
input distribution is fairly uniform, this method is much faster than the others. Figure 2
shows the curve fit produced with this method.
[Figure 2: Curve fit produced with the Weighted method.]
Process Dialog
The Grid Processing Property Grid is an extremely powerful tool to deal with grid
manipulation; multiplying, adding etc. grids together or with constants to produce new grids.
A whole host of other processes are available including Grid-to-Grid depth conversions. You
also have the opportunity to make ‘Create-your-Own’ methods. Go Here for details.
Grids can be processed to give new grids which you can then display or use. In common with
other VelPAK programs, the grids are stored internally within the binary VelPAK model. They
can only be accessed for processes external to VelPAK by using the File/Export/Surface
Grid option.
The grid processing methods are set up from the Parameter Tab.
Note: Each set up refers to one event horizon. Once you have set up the Process run for
one event horizon select a new event horizon from the Surface&Slot selector or
Model Tree and then select the process to occur on that event horizon.
Tabs include:
General tab - (currently inactive).
Parameter tab - for entering formula inputs and outputs.
Formula tab - for viewing the formula you are using.
Setting up the Formula to use in the Parameter Tab will change the Parameter banks in the
Property Grid. These banks of information will change from simply saying Unused to being
filled in with inputs such as Name: top_time, Type Input..etc. Each of these Parameter banks
relate to an input or output element from the defined process formula for this event horizon.
Input Bank 01
Input Bank 02
When you have made your edits, press to activate the option.
Use the down arrow () or double click on input name to select from available inputs.
Options
Formula - Select the formula to use from the drop down list. Go Here for further details of
the Formulae available.
From - [currently inactive].
Input Banks -
When you enter a Formula from the drop down selection the Input banks will fill up
various fields with the names of the input/outputs it requires. Other fields have to be filled
in by you if you have specified User as where the information for your process is coming
from.
Having a look at the actual formula selected in the Formula tab will show the list of input/
output variables that have to be defined.
For example:
7 Variables
Each Input or Output Bank has its own values to be set up - for example:
01_Value - If the input is a constant value, the value will be entered here.
If a Grid is to be used and not a Constant then the following are to be entered:
01_GridEvent - ** User Entered ** - The Event Horizon under which the grid to be used
as input 01 is stored in the Model Tree. Note the Previous / Current option which allows
you to select the previous or current event horizon rather than name a specific event horizon
where the grid is to be found. Usually this option is only utilized when constructing
Generic Workflows which can work on any event horizon.
01_GridType - ** User Entered ** - The slot under the event horizon in the Model Tree
where the grid is stored.
Non-editable display of the Formula you have selected from the parameters tab.
Process Description
MERGE SHALLOW - If INPUT1 is less than INPUT2 then the result equals INPUT2;
otherwise the result will be INPUT1. Go here for a detailed example.
MERGE SHALLOW AND BLANK - If INPUT1 is less than INPUT2 then the result is INDT;
otherwise it equals INPUT2. Go here for a detailed example.
MERGE DEEP AND BLANK - If INPUT1 is greater than INPUT2 then the result is INDT;
otherwise it equals INPUT2. Go here for a detailed example.
INVERT INDTS AND SET TO 1 - Changes Input grid values from INDT to 1 and from
values to INDT.
MATH FUNCTIONS - Gives you the various outputs of Math functions: arccos,
arcsin, arctan, cosine, hcosine, exponential, natural_log, log_10, sine, sineh, tangent,
tangenth.
PRESERVE INSIDE BAND - Given a constant or grid upper and lower limits, the
INPUT1 grid will be trimmed above and below these values. Ideally the resultant would
look like a donut grid.
[Diagram: input grid with Band Upper = 20 and Band Lower = 10, and the resultant
trimmed grid]
BLANK INSIDE BAND - Given a constant or grid upper and lower limits, the INPUT1 grid
will be blanked inside these values.
V0 given DEPTH, ERROR & K - Works out V0 given the Depth, Error and K from INPUT1.
            Input1      Input2
“Shallow”   Shallower   Deeper
“Deep”      Deeper      Shallower
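Node by node, the merge processes above behave like these sketches (with None standing in for VelPAK's INDT null value — an assumption for illustration, not VelPAK code):

```python
INDT = None  # stand-in for VelPAK's indeterminate (null) grid value

def merge_shallow(a, b):
    """MERGE SHALLOW: INPUT2 where INPUT1 < INPUT2, otherwise INPUT1."""
    return b if a < b else a

def merge_shallow_and_blank(a, b):
    """MERGE SHALLOW AND BLANK: INDT where INPUT1 < INPUT2, otherwise INPUT2."""
    return INDT if a < b else b

def merge_deep_and_blank(a, b):
    """MERGE DEEP AND BLANK: INDT where INPUT1 > INPUT2, otherwise INPUT2."""
    return INDT if a > b else b
```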
Note: You can work on these examples yourself by accessing the ‘merge deep & shallow’
directory containing the binary file and a workflow from within the VelPAK training set
of directories.
Tip: Always make a back up of the user-defined process text file before attempting to write
your own method.
If you experience difficulties in saving, with an error saying that the program is unable to save
this process to this file, it is likely that the process_user.txt file is still write-protected. Check
with your System Administrator to change the write protection for this file.
The language that the processes are written in is the same as that which the Depth
Conversion methods are written in. For more details of this language Go Here.
Depth Dialog
For Depth Conversions of Profile data use the Depth tab within the Profile Module.
The depth conversion methods are set up from within the Depth Option module.
Note: Each set up refers to one event horizon. Once you have set up the Process run for
one event horizon select a new event horizon from the Surface&Slot selector or
Model Tree and then select the process to occur on that event horizon.
The depth conversion option can use the values, such as V0 and k, created by the
‘OPTIMIZE’ part of VelPAK. If you select a Depth Conversion method that uses these values,
they will be automatically displayed to show what values are being used.
Within VelPAK depth conversion options, there are a certain number of depth conversion
methods already defined. These can be utilized for any given layer, either by using the
calculated values generated via the Optimize option or by inserting the correct variable
values, according to your depth conversion parameters.
These ready-stored depth conversion methods contain all the usual methods you would
expect, such as Interval Constants, Interval Grids and Curves, as well as a number of
standard formulae such as velocity varying with two-way time.
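One such standard formula, velocity increasing linearly with depth (V = V0 + k*Z), has a familiar closed-form depth from two-way time. The sketch below illustrates the mathematics only; the exact Xpress formulation used inside VelPAK may differ:

```python
import math

def depth_from_twt(twt_s, v0, k):
    """Depth below datum for the linear-velocity model V(z) = V0 + k*z.
    twt_s: two-way time in seconds; v0: velocity at datum; k: gradient (1/s).
    Integrating dz/dt = V0 + k*z over one-way time t gives
    z = (V0 / k) * (exp(k*t) - 1)."""
    t = twt_s / 2.0  # one-way time
    if k == 0.0:
        return v0 * t  # constant-velocity limit
    return (v0 / k) * (math.exp(k * t) - 1.0)
```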
However, there is an added dimension within VelPAK’s depth conversion in that you can
write your own methods, or amend the ready-stored methods, and save them for future use.
The Depth Conversion methods within VelPAK are written in a language called Xpress. This
has been specifically designed so that it is possible for non-programmers to write the text for
a new depth conversion method. For details of writing your own methods Go Here.
Tabs include:
General tab - setting up the event horizons to be depth converted.
Parameter tab - for entering formula inputs and outputs.
Formula tab - for viewing the formula you are using.
Setting up the Formula to use in the Parameter Tab will change the Parameter banks in the
Property Grid. These banks of information will change from simply saying Unused to being
filled in with inputs such as Name: vel, Type Input ..etc. Each of these Parameter banks relate
to an input or output element from the defined depth conversion formula for this event
horizon.
Option
Type - Select whether the Depth Conversion to be set up under the Parameters tab is to
run on just the current event horizon or whether all Depth Conversions set up for all event
horizons are to be activated.
Initially you would probably set this to Current_Event as you set up the parameters for
each event horizon separately and run them; but once you have set the depth conversion
up for each event horizon you would probably set this to All_Events, to be able to run the
depth conversion again for all layers in the event of changing one layer.
Verify - will verify the parameters set up before running.
Input Bank 01
Input Bank 02
Use the down arrow () or double click on input name to select from available inputs.
When you have made your edits, press to activate the option.
Options
Formula - Select the formula to use from the drop down list. Go Here for further details of
the Formulae available.
From - Specify where the values are to come from:
User Defined
You can enter your own parameters directly into the appropriate Parameter banks below
this input.
Optimize
Selecting an Optimize option will call in the relevant parameters as derived from the
Optimize module.
Note: Your Depth Conversion method here would have to be the same as the method used
to derive the parameters in Optimize.
Input/Output Banks -
When you enter a Formula from the drop down selection the Input banks will fill up
various fields with the names of the input/outputs it requires. Other fields have to be filled
in by you if you have specified User as where the information for your Depth Conversion
is coming from.
Having a look at the actual formula selected in the Formula tab will show the list of input/
output variables that have to be defined.
For example:
Variables
01_Name - The name of the first input to be defined. This is automatically entered as a
named variable from the formulae list (as shown in the example above).
01_Value - If the input is a constant value, the value will be entered here. If you have
selected ‘User’ as where your depth conversion method will come from, you must enter a
value in this slot. If you have selected an Optimize or Curve method to be used for your
depth conversion method then this value will be filled with the a0 value derived
(typically the V0 value).
If a Grid is to be used and not a Constant then the following are to be entered...
01_ GridDesc - The name of the grid (as displayed in the Properties panel and Model
Tree).
01_GridEvent - ** User Entered ** - The Event Horizon under which the grid to be used
as input 01 is stored in the Model Tree. Note the Previous / Current option which allows
you to select the Previous or Current event horizon rather than name a specific event
horizon where the grid is to be found. Usually this option is only utilized when constructing
Generic Workflows which can work on any event horizon.
01_GridType - ** User Entered ** - The slot under the event horizon in the Model Tree
where the grid is stored.
Toggle On
Non-editable display of the Formula you have selected from the parameters tab.
containing processes written originally by the Users. An original ‘User-Defined’ file is also
supplied with VelPAK but this can be edited from the original by a user.
If you wish to write your own process, write it in a standard text editor and append it in the
standard format to the file ‘process_user.txt’. This can be found in the VelPAK/sys directory.
Contact your local helpdesk for assistance with this if necessary.
Be aware of the syntax convention for the name of the formula, so that it is read by VelPAK.
Tip: Always make a backup of your user-defined process text file before attempting to write
your own method.
If you experience difficulties in saving, with an error saying that the program is unable to save
the process to the file, the process_user.txt file is probably still write-protected. Ask
your System Administrator to remove the write protection on this file.
mask - Binary mask value of the current point being depth converted. This can be viewed
interactively in a profile window by using point pick mode and checking the resulting
information.
vdown_array - (not guaranteed correct in overthrust situations) Contains an array of
velocity-down values that correspond to the intersections above. Values can be retrieved
using the new Xpress user function called getarr - see below.
depth_array - (not guaranteed correct in overthrust situations) Contains an array of depth
values that correspond to the intersections above. Values can be retrieved using the new
Xpress user function called getarr - see below.
inflect - Contains a 1 if we are in a reversal zone; a 0 if we are not.
fault - Contains a 1 if we are not on a fault; a 0 if we are.
real_event - Contains a 1 if we are converting real existing data points; a 0 if we are
converting virtual intersections with other event horizons.
multi_event - Contains a 1 if there are multiple intersections with this event horizon, down
from the surface; a 0 if there are not.
current_event - Contains the current event horizon number we are depth converting. This
is also set for virtual intersections, down from the surface.
total_intersect - Contains the number of intersections down from the surface to this point.
current_intersect - Contains the current intersection we are depth converting.
event_array - Contains an array of event horizon numbers that correspond to the
intersections above. Values can be retrieved using the new Xpress user function called
getarr - see below.
time_array - Contains an array of time values that correspond to the intersections above.
Values can be retrieved using the new Xpress user function called getarr - see below.
Expressions
Expressions must consist of one of the following:
numeric constant e.g. 5, 123.4
character constant e.g. “ABC”
variable e.g. time, temp1
expr operator expr e.g. var1+10.2, diff^2
- expr e.g. -3, -intvel
(expr) e.g. (total*10)
Brackets may be used to override normal operator precedence and may be nested to any
level.
The operators, in decreasing order of precedence, are:
^ exponentiation
- unary minus (negation)
* / multiplication, division
+ - addition, subtraction
Logical Expressions
Logical expressions have the following syntax:
expr lop expr e.g. sum<10.1, var!=5
Functions
Xpress has a rich set of built-in functions. All function arguments may be either constants or
variables. Functions have the following syntax: function(arguments) e.g. log(x),
grid(“GGTIME”,x,y).
Control Flow
a. If-Else
if(lexpr) {
statements
} else {
statements
}
b. Else-IF
if(lexpr) {
statements
} else if(lexpr) {
statements
} else if(lexpr) {
statements
} else {
statements
}
Comments
Comment lines start with a # (hash) and are ignored by the parser.
Statements
A statement must consist of one of the following:
variable=expression e.g. total=sum1*(value-1.2)
control flow statement (if, else if, else)
comment
Example
Clearly the Xpress language is a powerful tool for writing virtually any depth conversion
method.
For example, supposing a multi-variate statistical method has been derived where:
V=(X*A)+(Y*B)+(2WT*C)+(ISOC*D)+E
X=X UTM coordinate
Y=Y UTM coordinate
2WT = Two way time in msecs
ISOC = Interval time in msecs
A = -0.02
B = -0.03
C = 1.6
D = -2.4
E = 112328
The Xpress code would be as follows:
a = -0.02
b = -0.03
c = 1.6
d = -2.4
e = 112328
isoc = last_time - time
intvel = (x*a)+(y*b)+(time*c)+(isoc*d)+e
isop = intvel * (isoc / 2000)
depth = last_depth + isop
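As a cross-check, the same layer calculation can be sketched in plain Python. The coordinates, times and depths below are illustrative only; note also that the Xpress listing computes isoc = last_time - time, while this sketch uses time - last_time so that the interval time is positive when time increases with depth - check which convention your variables follow.

```python
# Multi-variate interval velocity: V = X*A + Y*B + 2WT*C + ISOC*D + E
# Coefficients taken from the worked example above.
A, B, C, D, E = -0.02, -0.03, 1.6, -2.4, 112328.0

def depth_convert(x, y, time_ms, last_time_ms, last_depth):
    """Depth-convert one layer with the statistical velocity function.

    time_ms and last_time_ms are two-way times in milliseconds; the
    isoc/2000 factor converts two-way milliseconds to one-way seconds.
    """
    isoc = time_ms - last_time_ms              # interval two-way time (ms)
    intvel = x * A + y * B + time_ms * C + isoc * D + E
    isop = intvel * (isoc / 2000.0)            # layer thickness
    return last_depth + isop

# Illustrative (not survey) values:
new_depth = depth_convert(x=200000.0, y=1500000.0,
                          time_ms=2000.0, last_time_ms=1600.0,
                          last_depth=2000.0)
```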
LINEAR (V0+KZ)
Discussion:
Linearly accelerating velocity function [1]
# LINEAR (V0+KZ)
# V = Vo + kZ
extern v0,k,last_time,time,last_depth,depth,error
depth=INDT
} else {
depth = last_depth
} else {
depth=depth+error
[1] Least Square Determination of the Velocity function V = V0 + KZ for any set of time depth
data. Geophysics, Volume 8. J.A. Legg Jr., J.J. Rupnik.
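The listing above is an excerpt from the process file and omits several guard lines. As an independent cross-check (not VelPAK code), the closed-form depth step for V = V0 + kZ is a standard result and can be sketched in Python:

```python
import math

def linear_v0kz_depth(v0, k, last_depth, isoc_owt_secs):
    """Depth at the base of a layer for V = V0 + k*Z.

    v0: velocity at datum (Z = 0); k: velocity gradient (1/s);
    isoc_owt_secs: one-way interval time through the layer, in seconds.
    Integrating dZ/dt = V0 + k*Z from Z_top over time t gives
        Z_base = Z_top + (v_top / k) * (exp(k*t) - 1),
    where v_top = V0 + k*Z_top is the velocity at the top of the layer.
    """
    if k == 0.0:
        return last_depth + v0 * isoc_owt_secs  # constant-velocity limit
    v_top = v0 + k * last_depth
    return last_depth + (v_top / k) * math.expm1(k * isoc_owt_secs)
```

The k == 0 branch reproduces the constant-velocity limit, mirroring the zero-value guards in the Xpress listings.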
# V = Vo + kZ - p.k.Zt
extern v0,k,p,last_time,time,last_depth,depth,error
depth=INDT
} else {
depth = last_depth
v01 = v0 - k * p * last_depth
} else {
# V = Vo + kZ - kS
extern v0,k,last_time,time,last_depth,depth,error,sea_floor
depth=INDT
} else {
depth = last_depth
} else {
HOUSTON
# HOUSTON
# V = Vo.(kZ + 1)^(1/2)
extern v0,k,last_time,time,last_depth,depth,error
depth=INDT
} else {
if( v0==0.0) {
depth = last_depth
} else {
BINOMIAL
# BINOMIAL
# V = Vo.(kZ + 1)^(1/m)
extern v0,k,m,last_time,time,last_depth,depth,error
depth=INDT
} else {
if( m==0.0) {
depth = INDT
depth = last_depth
} else {
p = (m-1)/m
tmp1=(k*last_depth + 1)^p
depth=(tmp2 - 1)/k
depth=depth+error
SLOTNICK
# SLOTNICK
# V = Vo.e^(kZ)
depth = INDT
} else {
if( v0==0.0) {
depth = last_depth
} else {
tmp2 = v0 * k * isoc_owt_secs
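Again as a cross-check rather than a transcription of the process file, the SLOTNICK function V = V0.e^(kZ) integrates to a closed-form depth step:

```python
import math

def slotnick_depth(v0, k, last_depth, isoc_owt_secs):
    """Depth step for the exponential function V = V0 * exp(k*Z).

    Integrating dZ/dt = V0*exp(k*Z) from Z_top over one-way time t gives
        Z_base = -(1/k) * ln(exp(-k*Z_top) - k*V0*t),
    valid while the argument of ln stays positive.
    """
    if k == 0.0:
        return last_depth + v0 * isoc_owt_secs  # constant-velocity limit
    arg = math.exp(-k * last_depth) - k * v0 * isoc_owt_secs
    return -math.log(arg) / k
```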
FAUST [1]
# FAUST
# V = a.Z^(1/m)
extern a,m,last_time,time,last_depth,depth,error
[1] Seismic Velocity as a function of depth and geological time. Geophysics, Volume 16. L.Y. Faust.
depth=INDT
} else {
if( m==0.0) {
depth = INDT
depth = last_depth
} else {
tmp1 = (m - 1) / m
if(last_time == 0.0) {
} else {
depth=depth+error
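A Python cross-check of the FAUST power law V = a.Z^(1/m); the (m - 1)/m exponent matches the tmp1 term in the excerpt above (this sketch is not VelPAK code, and assumes m is neither 0 nor 1):

```python
def faust_depth(a, m, last_depth, isoc_owt_secs):
    """Depth step for V = a * Z**(1/m) (Faust-type power law).

    One-way time from the surface integrates to
        t(Z) = Z**p / (a*p),  with p = (m - 1) / m,
    so advancing by the layer isochron dt gives
        Z_base = (Z_top**p + a*p*dt)**(1/p).
    """
    p = (m - 1.0) / m
    return (last_depth**p + a * p * isoc_owt_secs) ** (1.0 / p)
```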
HYPERBOLIC TANGENT
# HYPERBOLIC TANGENT
# V = V1.tanh(aZ + b)
depth = INDT
} else {
a1 = a/1e6
b1 = b/1e3
if( v1==0.0) {
depth = last_depth
} else {
depth = tmp4 / a1
PARABOLIC
# PARABOLIC
# V = (w + eZ)^(1/2)
depth = INDT
} else {
depth = INDT
} else {
tmp2 = e * (isoc_owt_secs ^ 2) / 4
extern vel,last_time,time,last_depth,depth,error
depth=INDT
} else {
extern vel,time,depth,error
depth=INDT
} else {
# Given seismic time from grid, interpolate depth from nominated well
# time-depth curve
extern last_time,time,last_depth,depth,error,isochron,isopach,interval_velocity,average_velocity
depth = INDT
isopach = INDT
isochron = INDT
interval_velocity = INDT
average_velocity = INDT
} else {
depth=well("none",time)
depth=depth+error
isopach = depth-last_depth
isochron = time-last_time
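The well("none", time) lookup above interpolates a depth from the nominated well's time-depth curve. A minimal Python equivalent of that lookup, assuming simple linear interpolation between curve samples (the curve values are illustrative):

```python
from bisect import bisect_left

def depth_from_td_curve(curve, t):
    """Linearly interpolate a depth at time t from a time-depth curve.

    curve: list of (time, depth) pairs sorted by increasing time,
    standing in for VelPAK's well("none", time) lookup.
    Times outside the curve are clamped to the end samples.
    """
    times = [p[0] for p in curve]
    i = bisect_left(times, t)
    if i == 0:
        return curve[0][1]
    if i == len(curve):
        return curve[-1][1]
    (t0, z0), (t1, z1) = curve[i - 1], curve[i]
    return z0 + (z1 - z0) * (t - t0) / (t1 - t0)

# Illustrative time (ms) / depth pairs:
td = [(0.0, 0.0), (500.0, 600.0), (1000.0, 1400.0), (1500.0, 2400.0)]
depth = depth_from_td_curve(td, 750.0)   # midway between 600 and 1400
```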
Trim Dialog
Allows you to Trim data in the model, using polygon(s) as control. Polygons are used as trim
control for data either within them or outside the polygons, as selected in the Options.
THIS TRIM IS A FINAL SOLUTION! - data trimmed in the model is not retrievable. If you trim
data, it is recommended that you use the SAVE AS option to save your trimmed data
as a new model, keeping your original data model intact.
You can select which types of data are to be trimmed from the selected polygon area on the
Options page. In ‘Current Event’ mode the trim option will only trim data on
the Event Horizon selected in the Surface&Slot. In this case the polygon used to trim must
be the currently selected polygon under the current event. In ‘All_Events’ mode all of
the specified data types will be trimmed for all of the events. In this case the polygon used
to trim can be stored under any event, but it must still be the selected, current polygon in
the current event.
Line Data, which runs through all event horizons, will also be deleted whichever Events are
selected.
You can select more than one Polygon area for trimming in any one trim process; the process
will prompt for a Yes/No against each polygon in the model. By the nature of the trim process
however, this would only work with trimming data inside individual polygons.
For Grids the Trim option will trim just the selected grid under the selected event. The ‘All
Events’ option does not work for Grids.
For details of inserting and defining your Polygons please refer to Polygon Edit Mode.
Use the down arrow () or double click on the input name to select from available inputs.
When you have made your edits, press to apply the changes and activate the
option.
Option
Events - Choose to Trim data on all event horizons or selected event horizon. Note that
this option is inactive for Grid Trimming which needs to be done event by event.
Type - Choose to Trim your data inside or outside your defined Polygon.
Verify - YES: an alert box will pop up for verification of your action before
proceeding to Trim.
This option is especially useful if you have a number of Polygons defined over an area for
one event horizon and you wish to pick and choose which Polygons you wish to trim
within and which you do not.
Data
Event, Grid, Location, Stack, Well, XYZ
Select the type of data you wish to have trimmed. Note that for Grids the Trim option will
trim just the selected grid under the selected event. The ‘All Events’ option does not work
for Grids.
Group Dialog
Group controls how wells are grouped together. This can be used to take account of different
geological provinces in the area, or to include / exclude particular wells from the calculations.
Wells can be grouped either by drawing polygons on the surface display, or by editing the
group number of each well. When groups are defined, VelPAK will only use the wells in the
current group. Grouping also applies to stacking velocities.
Draw a polygon around the wells you want to group. Refer to Polygon Edit for details of
inserting Polygons into your VelPAK model. When you apply the Group set-up those wells
inside the Polygon you have just drawn will become grouped.
Use the down arrow () or double click on the input name to select from available inputs.
When you have made your edits, press to apply the changes and activate the
option.
Option
Active - Some parts of VelPAK operate on the active group only, and this can be changed
by using this value. When grouping is active, all wells numbered 0 are ignored. The
current group of wells is colored white, and the other groups are also color-coded by their
different number. Hence, when grouping is active, the normal well colors (which tell you
how much data are available) are replaced by a color scheme which tells you which group
each well is in. This is particularly important when you have overlapping polygons.
Because VelPAK implements the grouping concept by assigning a number to each well, it
is not possible for one well to be in two different groups. The well color will tell you which
group the well has been put in if it is within a polygon.
Type - Group or Ungroup. The default option ‘Group’ is used to make the groups, while
the ‘Ungroup’ option resets the group number, so all the wells have group 0 as their group
number (i.e. they are not in a group).
This is the original dataset, with the wells colored according to their data (in this case all
green showing that they are fully loaded). The current well is black.
1. Select the Event Horizon you are going to be working on.
2. Draw a polygon around three of the wells.
3. Bring out the Group Property Grid and apply the group of wells within the polygon to the
first group by pressing the Activate tick.
4. The wells are now colored by group rather than by data. The currently active group is
colored black, and wells in group 0 (in other words not in a group) are colored red.
5. Under Edit Mode select Well Location and Point Edit and click on one of the selected,
black wells.
You will see that the box that appears for the well selected will show that this well is now
defined as being in Group 1.
6. Next, draw another polygon around one of the other wells and press Apply Group again.
This well will fall into Group 2, since it is in a different polygon from the wells in Group 1.
7. Now two groups are defined. The new group defined by the smaller polygon has its
well(s) colored yellow, indicating that they are in Group 2. The current group is still Group
1, so its wells are still colored black. To select Group 2 as the active group, change the
Active Group to 2 in the Property Grid and press the ‘tick’.
8. The well in Group 2, which was yellow, has turned black since it is now in the active
group. The wells in Group 1 have changed to orange since they are no
longer active.
XYZ Dialog
XYZ produces or manipulates XYZ data files from the grids stored within the model.
The options are:
Convert
Interpolate
Delete
Copy
Concatenate
Use the down arrow () or double click on the input name to select from available inputs.
When you have made your edits, press to apply the changes and activate the
option.
Option
Type
Convert converts your grid into XYZ data, placing it in the XYZ slot you define in the
Output_XYZ slot. This produces an XYZ file with XY values at regular spacing, exactly as
the Grid nodes were defined.
Interpolate will take the random XY values from a selected XYZ data file in the Input_XYZ,
then go to a selected Grid and extract Z values from the Grid at those random XY
values. A new XYZ data file will be created and placed in the slot defined in Output_XYZ.
You can thus produce any number of XYZ files, all with the same XY values but with new
Z values. Reasons for doing this include determining grid values at specific locations,
such as the XY values of a deviation survey.
Delete will delete the XYZ data file selected in the Input_XYZ slot.
Copy will allow you to copy the XYZ data file selected in the Input_XYZ slot and place it in
the slot selected in the Output_XYZ.
Concatenate adds the selected XYZ values from Input_XYZ onto the Output_XYZ slot
XYZ file. The result will be stored in the slot selected under Output_XYZ.
Verify - Yes/No
Will bring up an alert for clarification, showing you what the process is about to do.
Input
Input_Grid - Grid selected from Model Tree to be used in conversion or interpolation.
Input_XYZ - Not used in Convert mode. Select the input XYZ file from which the XYs
are to be used to extract interpolated Z file from the Input Grid file named above.
Output
Output_XYZ - Select slot in the Model Tree where the output XYZ file is to be put.
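The kind of sampling the Interpolate option performs can be sketched as bilinear interpolation on a regular grid. The function and grid layout below are illustrative, not VelPAK internals:

```python
def bilinear(grid, x0, y0, dx, dy, x, y):
    """Interpolate a Z value from a regular grid at an arbitrary (x, y).

    grid[j][i] holds Z at (x0 + i*dx, y0 + j*dy) - a simplified stand-in
    for sampling a model grid at the XY locations of an input XYZ file.
    (x, y) must lie inside the grid extent.
    """
    fx = (x - x0) / dx
    fy = (y - y0) / dy
    i, j = int(fx), int(fy)
    tx, ty = fx - i, fy - j
    z00, z10 = grid[j][i], grid[j][i + 1]
    z01, z11 = grid[j + 1][i], grid[j + 1][i + 1]
    return (z00 * (1 - tx) * (1 - ty) + z10 * tx * (1 - ty)
            + z01 * (1 - tx) * ty + z11 * tx * ty)

grid = [[0.0, 10.0], [20.0, 30.0]]   # 2 x 2 grid of Z values
z = bilinear(grid, 0.0, 0.0, 100.0, 100.0, 50.0, 50.0)   # centre -> 15.0
```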
The Well Module (default display Well Tab) is used to view and specify well curves, tops and
layers and to specify which tops correspond to the top and base of each seismic layer. Grid
thicknesses can be related to portions of the curve as a means of QC.
The layers can be individually selected for each well manually via this module; or can be
automatically derived using the Master Well List produced from the Layer Module.
The Tie dialog can be used as a method to check how well the current depth conversion
method ties to the well data in the model in either volume mode or profile mode. Optionally,
the results from this mode can be used to modify the depth conversion process so that the
wells are tied perfectly.
Select the well that is to be displayed using the Well Selection in the Model tree.
The Data Tab displays a summary of information for all the wells in the model.
Well tab
What you are looking at on the Well Display
[Figure: Well display schematic. The well curve is shown with Tops (Sea Floor, Horizon 1 to
Horizon 8) on one side and Grids on the other. Grid 1 for layer 1 is the thickness between the
Surface and Horizon 3; Grid 2 for layer 2 is the thickness between Horizon 3 and Horizon 5;
Grid 3 for layer 3 is the thickness between Horizon 5 and Horizon 8. The columns are
labelled Tops, Curve and Grids.]
Viewing a Well
Note: If the well does not pierce a time grid at top and base of the layer, there will be no
highlighting.
The grid titles are displayed on the Well display. If the grid label is empty (“None”) or auto-
generated (“XPRESS”) then the posted label is taken from the Surface name; otherwise the
user-defined grid label is posted.
Note: Your TIME grid must be stored in the TIME GRID slot within the Model Tree for the
relevant event horizon i.e. NOT the DEPTH GRID slot, not the General 01 SLOT etc.
The normal procedure for selecting the tops is to start with Event 1 and the first well curve.
You then select the tops corresponding to Time grids.
A detailed example of how to pick Tops is given here.
Data tab
For a better understanding of the well layer properties, there is a summary Data table tab in the Well
display.
This table can be copied and pasted to Excel, etc., by clicking on the top left-hand corner of the
table to highlight the whole data area.
Details of how this data table can be sorted and the columns filtered are documented in the
Curve chapter here.
Details of what each column means can be found at the end of this chapter here.
Display Types
Visible Layers
Only ‘None’ is currently available as the Layer display on the Well Log. This cannot be
turned off.
Edit Mode
Select which EDIT mode you are in: TOPS edit mode or CURVE edit mode.
Pressing the left-hand mouse button over the option you require will cause that option to be
activated. The bottom right of the VelPAK window displays what edit mode you are in.
Switching between the two, you will notice that the Edit options (icons to the right of the
mode) change according to which edit options are available under each mode.
Tip: Use Tops Pick to manually select Layer Tops/Bases. Go here for details.
Insert
Used to insert a new top in your well display, in depth or time (depending on the current
axes). Press the mouse button at the point on the curve where you wish to enter a new Top.
When you release the mouse, a dialog box will pop up showing the time or depth you have
selected and asking what you want to call the top.
Delete
Click on Delete and select a well to delete. An alert box will pop up for verification.
Edit
Click on a top to rename it or change the time or depth. An alert box will pop up for you to
edit the values.
Pick
Pick mode is used to select and deselect tops in order to define each layer.
[Figure: Click on the Top to define the layer for event 1.]
After selecting tops for the first well, move onto the next well in the model, select the
thickness that represents the time grid and so on.
After completing the selection for Event 1, select Event 2 and repeat the selection
process for each well.
You can then repeat the process for the remaining event horizons.
A detailed example of how to pick Tops is given here.
Insert
Delete
Note: There will be no alert to warn that you are deleting curve points, but you will be able to
see whether you have ‘hit’ or ‘missed’ the points to delete by looking at the Console
window information.
Move
Splitting a Sonic Log - Click on the point of the sonic curve down to which you wish the
sonic to be deleted.
Set as Find
Single Depth
Well Conversion
Well
When you have made your edits, press to apply the changes and activate the
option.
Use the down arrow () or double click on the input name to select from available inputs.
General Tab
Option
Display_Type - Select between Time or Depth display for the well.
Overlay_Model - Yes/No - overlays the current depth converted velocity curve function.
Overlay_Stack - Yes/No - view your stack (pseudo) wells if you have created them (via
the Amalg option of the Velocity module). Particularly useful if you have created them at
your Well Location so you can compare the two. Moving to another well on the display
using the red arrows will move this display on to the next well too.
Overlay_Well - select which well (if any) you wish to overlay on top of the selected well.
The default of blank will not overlay any well trace. Moving to another well on the display
using the red arrows will NOT move this overlay to the next well.
Show_Events - Choose between ALL Event Horizons being shown (as Grids to the right
and picked layers to the left of the well curve) or just ‘Current’ being the event horizon
currently selected in the Surface&Slot.
Top_Type - Choose which tops are to be displayed on your well display:
All - displays all formation tops in the model for the current well
Layer - displays only tops used in the layer definition
Used - displays all tops flagged as “Used” in the Layer definition table
UsedAndExtra - displays all tops flagged as “Used” and “Extra” in the Layer definition
table
Range Tab
Tie Dialog
Tie can be used as a method to check how well the current depth conversion method ties to
the well data in the model.
Optionally, the results from this dialog can be used to modify the depth conversion process so
that the wells are tied perfectly. Activating the TIE process produces X,Y, Z error points and
these need to be gridded to form an ‘ERROR’ grid.
Tip: After TIE has run you will need to go to the GRID option in the Surface Module and set
up a Grid run from these XYZ values generated.
Note: The TIE option honors the Residual Flag within the Data Tab of the Curve and
Optimize modules. If the Residual Flag for a well is turned off within the Data tab it will
not be used to produce the Error XYZ data file here.
At each well position, TIE uses the current depth conversion method to calculate a depth.
This is the depth which you would get if you were to depth convert the entire
time grid. You do not need to have activated the Depth Conversion set-up before using this
option, but you do need to have gone into the Depth Property Grid and selected your depth
conversion method.
Depth Grid 1
DEPTH
Depth Grid 2
Straight or
Deviated Wells
In order to activate the TIE feature, the wells in your model need to have had their Well
Layers defined from tops for each layer, via the Layer Module or as previously described in
the Edit facilities of this module.
Note: To use the Grid or Curve Depth Conversion the Time Seismic grids need to be
available and defined for each layer.
Note: TIE sets the values of the ‘error’ slot in the Depth Conversion Parameter page to zero
because it uses the depth conversion formula to work out what the error should be;
the reason for this being if an old error grid was present then the TIE procedure would
give you the error INCLUDING the old error grid and thereby give you an erroneous
result.
When you have made your edits, press to apply the changes and activate the
option.
Use the down arrow () or double click on the input name to select from available inputs.
After TIE has run you will need to go to the GRID Dialog to set up a Grid run from these
generated XYZ values.
The file will be stored under the Error XYZ slot for the relevant event horizon. For
Time_Grid_Converted option the file will be called Depth Errors(Tie to Time Grid).
Time_Curve_Converted - This option uses the CVL time from the Well Curves as the
source of the time. It will calculate the errors at the wells, producing X,Y, Z error points.
After TIE has run, you will need to go to the GRID Dialog page to set up a Grid run from
these XYZ values generated.
The file will be stored under the Error XYZ slot for the relevant event horizon. For
Time_Curve_Converted option the file will be called Depth Errors(Tie to Curve Grid).
Note: The time grid method will interpolate the depth value at the top of the layer from the
defined depth grid. The time-depth curve method will use the stored Top. Therefore, if
the previous depth grid has not been TIED to the wells, the different errors obtained
by the two techniques may not be entirely due to the CVL/seismic time mistie.
General comments for producing the best Error Grid via this method
1. The Grid spacing for the error grid must be small enough to account for the error values
from each well spot.
2. If you are using depth conversion methods that rely on conditional variables, care
must be taken that these conditional variables act as you intend. They can be
tested using the ‘plot’ option within the Depth Conversion.
Calibrate Dialog
The Well Calibration module assigns a time to a depth on a log. If you have a log that has not
been processed from the surface then the time curve will start at a different place to the depth
curve. The Calibrate module assigns an adjustment to the time curve, either by using a grid
layer to shift the curve or by using a constant value.
This is a simple method of calibration that does not rely on Check Shot data.
Note: VelPAK assumes that check-shots or time-depth curves are ‘Calibrated’, i.e. the times
are good, so ‘Uncalibrated’ will appear in the title only if they are not calibrated.
2. To reset the status of the well, change the type to 'Uncalibrate' as follows:
3. Once this done then the well can be re-calibrated in two ways:
a. Either a constant shift:
or
b. Use the two-way time from an event (surface); this is the more common option
Before After
Note: In order for your Time grid to be found and used, it must be stored in the TIME
GRID slot within the Model Tree under the relevant Event; i.e. NOT the DEPTH GRID
slot, nor the General 01 SLOT, etc.
Well: 12/34a
Note: If the well does not pierce a time grid at top and base of the layer, there will be no
highlighting.
The normal procedure for selecting the tops is to start with Event 1 and the first well curve.
You then select the tops corresponding to Time grids.
Note: You must be in Tops Point Pick mode to select your top.
Event 1 will be from the Surface. Select the Surface by pointing the cursor arrow on the
‘Hidden Top’ at the zero time. (To find the ‘Hidden Top’ you can zoom out on the display to
see it). This defines the top of the layer. Then point the cursor on the Top that should
represent the base of the layer represented by the Time grid for event 1.
[Figure: The left-hand side of the curve displays the thickness; the right-hand side of the
curve displays the thickness of the Grid.]
When you have gone through and selected all the grid thicknesses you want defined
within your model, select another well and repeat the process. (You could also go
through your model selecting Event 1's thickness on the Tops for all wells, then select
Event 2, define the Tops for all wells, and so on.)
Selecting Surface 0 (Faults) from the Model Tree or Surface&Slot selector will display all the
picked grid thicknesses for the well you are currently viewing.
TOP_TIME - Time from the time-depth curve at the top of the layer. The time at the top
well pick, obtained by interpolating the time-depth curve at Top_Depth.
TOP_DEPTH - Depth at the top of the layer. The depth at the top well pick.
TOP_TIME_GRD - Time from the time grid at the top of the layer. The value of the top
time grid interpolated at the top well pick (i.e. at TOP_X, TOP_Y).
TOP_DEP_TIM_GRD - Depth from the time grid at the top of the layer using the T-D
curve. Interpolate depth from the T-D curve using the time from the time grid at TOP_X,
TOP_Y. These are the values displayed on the right-hand side of the WELL TOPS
display in velocity-depth mode.
TOP_TIM_DEP_GRD - Time from the depth grid at the top of the layer using the T-D
curve. Interpolate time from the T-D curve using the depth from the depth grid at
TOP_X, TOP_Y.
Layer number
CF_ERROR - (Internal Only) Calc shifted formula RMS error
DF_ERROR(grid tie) - (Internal Only) Depth formula (tied to grid) RMS error
DF_ERROR(well tie) - (Internal Only) Depth formula (tied to well) RMS error
GRID_ISOCHRON - BOT_TIME_GRD - TOP_TIME_GRD
GRID_ISOPACH - BOT_DEPTH_GRD - TOP_DEPTH_GRD
Well Location
There are four different locations associated with a well: the surface location, the location of
the well bore at the top well pick, the location of the well bore at the base well pick and the
location of the well bore at the mid-point depth. The values will be identical for a vertical well,
but will be different in the case of a deviated well.
The different locations are illustrated below.
[Figure: Well location schematic. A is the XY at the top of pick (TopX,TopY), B the XY at the
bottom of pick (BotX,BotY) and C the XY at the mid point (MidX,MidY), shown against the
Top Grid, Base Grid and the Top Time or Depth Grid on a Time/Depth axis.]
Well Definitions
The following diagram of a log defines the terms Top Well Pick and Base Well Pick.
[Figure: Log schematic defining the picks. Surf_X,Surf_Y is the surface location;
Mid_X,Mid_Y (MID_TIME, MID_DEPTH) is the mid point; Bot_X,Bot_Y (BOT_TIME,
BOT_DEPTH) is the Base Well Pick; points B, C and A are shown on the Time axis.]
Grid Definitions
[Figure: Grid definitions. Points A (TopX,TopY), C (MidX,MidY) and B (BotX,BotY) are
shown on the Top Time Grid and Base Time Grid, together with Top_Depth_Grd,
Top_Time_Grd, Bot_Depth_Grd and Bot_Tim_Grd.]
Remember that you may need to consider Mistie discrepancies between Grid and Well
Pick.
Bottom Time - (Bot_Tim_Grd)
The value of the base time grid interpolated at the base well pick (i.e. at BOT_X, BOT_Y).
Top Depth - (Top_Depth_Grd)
The value of the top depth grid interpolated at the top well pick (i.e. at TOP_X, TOP_Y).
Top Time - (Top_Tim_Grd)
The value of the top time grid interpolated at the top well pick (i.e. at TOP_X, TOP_Y).
Bottom Depth - (Bot_Depth_Grd)
The value of the base depth grid interpolated at the base well pick (i.e. at BOT_X,
BOT_Y).
Vertical Apparent IV
THIS IS THE VALUE NEEDED IN A DEPTH CONVERSION TO TIE THE TIME GRID WITH
THE WELL DEPTH.
The interval velocity derived from the top time and depth grids, TOP_TIM_GRD_BOT and
TOP_DEP_GRD_BOT, and the time and depth values at the well base pick, BOT_TIME_GRD
and BOT_DEPTH.
[Figure: Vertical Apparent Interval Velocity, measured between Top_Tim_Grd_Bot (Top
Time at BotXY) and Bot_Tim_Grd / Bot_Depth_Grd at the base well pick.]
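Under the assumption that the grids hold two-way times in milliseconds (check your model's units), the vertical apparent interval velocity can be sketched as:

```python
def vertical_apparent_iv(top_time_grd_bot, top_dep_grd_bot,
                         bot_time_grd, bot_depth, twt_ms=True):
    """Vertical apparent interval velocity at the base well pick.

    The top grids are interpolated at BotXY (TOP_TIM_GRD_BOT,
    TOP_DEP_GRD_BOT) and combined with the base values (BOT_TIME_GRD,
    BOT_DEPTH). Times are assumed to be two-way ms when twt_ms is True.
    """
    dz = bot_depth - top_dep_grd_bot
    dt = bot_time_grd - top_time_grd_bot
    if twt_ms:
        dt = dt / 2000.0       # two-way ms -> one-way seconds
    return dz / dt

v = vertical_apparent_iv(1000.0, 1100.0, 1500.0, 1700.0)
# dz = 600, dt = 0.25 s -> 2400 (depth units per second)
```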
Misties
With deviated wells (or indeed straight wells) the values of the tops will not necessarily tie
exactly with the grid for the same surface. If this is the case, you will need to be aware of
which of the two values you want.
The diagram below shows an exaggerated mistie between the well pick and grid. This
highlights the fact that if GRIDS are used then it is the X,Y position of point A (the surface XY
location of the well’s top pick) that is used as the Grid Value; not where the well track ‘enters’
the grid surface.
[Figure: Exaggerated mistie between well pick and grid. Point A (TopX,TopY) lies on the
Top Time Grid; BotX,BotY marks the base of the well track.]
This information also covers the Curve inputs. Not all inputs are relevant to the Well Log Data
table.
STACK CURVE
STACK LAYER
SURFACE XYZ
SURFACE GRID
WELL CURVE
Uncalibrated Time
The uncalibrated time value at each depth sample. Curve data can be displayed for more
than one well. All the samples within the current layer are displayed.
Depth
The depth value at each sample (these first five columns are what is loaded via File >
Import > Well Curve). Curve data can be displayed for more than one well. All the
samples within the current layer are displayed.
Calculated Well Depth
Depth results that come from the current depth conversion set-up
WELL LAYER
Well Layer - Isochron
Along Hole(GridTime)
BOT_TIME_GRID
BOT_TIME
(base well pick)
Layer(GridTime,GridTime,BotXY)
Layer(GridTime,WellTime,BotXY)
BOT_DEPTH_GRID
Along Hole(Well)
The isopach based on the top and base well picks (i.e. TOP_DEPTH and
BASE_DEPTH). These may well be offset from the grid values for the top and base of the
layer due to misties (as shown in the diagram below). Values are only meaningful for
vertical wells; they should be treated with caution for deviated wells.
BASE_DEPTH
(base well pick)
Layer(GridDepth,GridDepth,BotXY)
Layer(GridDepth,WellDepth,BotXY)
Locations from one of the main VelPAK curve Definitions. Go Here for details.
Time(WellTime,GridTime,BotXY)
Depth(TieMode,BotXY)
Depth(WellDepth,GridDepth,BotXY)
Global Fit ()
Global Residual ()
Well Fit ()
Well Residual ()
Remember that you may need to consider Mistie discrepancies between Grid and Well
Pick.
Top Time
One of the main Well Definitions. Go Here for details.
Top Time(GridDepth) - (Top_Time_Dep_Grd)
The time obtained by interpolating the well time-depth curve at TOP_DEPTH_GRD. Time
values are obtained by interpolating the well time-depth curve at the grid-derived depths:
[Figure: points B, C and A interpolated on the Top Depth Grid.]
Middle Time
One of the main Well Definitions. Go Here for details.
Bottom Time
One of the main Well Definitions. Go Here for details.
Bottom Time(GridDepth) - (Bot_Tim_Dep_Grd)
The time obtained by interpolating the well time-depth curve at BOT_DEPTH_GRD. Time
values are obtained by interpolating the well time-depth curve at the grid-derived depths.
Top Depth
One of the main Well Definitions. Go Here for details.
Middle Depth
One of the main Well Definitions. Go Here for details.
Bottom Depth
One of the main Well Definitions. Go Here for details.
Bottom Depth(GridTime) - (Bot_Dep_Tim_Grd)
The depth obtained by interpolating the well time-depth curve at BOT_TIME_GRD. Depth
values are obtained by interpolating the well time-depth curve at the grid-derived times.
These are the values displayed on the right-hand side of the WELL TOPS display in
velocity-depth mode.
Top(GridTime,GridDepth)
The average velocity from the surface to the top of the layer, based purely on the grid
values i.e. TOP_TIME_GRD and TOP_DEPTH_GRD.
Top(GridTime,WellDepth)
The average velocity to the top of the layer, based on the grid time and well depth i.e.
TOP_TIME_GRD and TOP_DEPTH.
Top(WellTime,WellDepth)
The average velocity from the surface to the top well pick, based purely on the well values
i.e. TOP_TIME and TOP_DEPTH.
Bottom(GridTime,GridDepth)
The average velocity from the surface to the base of the layer, based purely on the grid
values i.e. BOT_TIME_GRD and BOT_DEPTH_GRD.
Bottom(GridTime,WellDepth)
The average velocity to the base of the layer, based on the grid time and well depth i.e.
BOT_TIME_GRD and BOT_DEPTH.
Bottom(WellTime,WellDepth)
The average velocity from the surface to the base well pick, based purely on the well
values i.e. BOT_TIME and BOT_DEPTH.
Along Hole(GridTime,WellDepth)
The interval velocity based on the grid times and well depths i.e. TOP_TIME_GRD,
TOP_DEPTH, BOT_TIME_GRD and BOT_DEPTH.
Along Hole(WellTime,WellDepth)
The interval velocity between the well picks, based purely on the well values i.e.
TOP_TIME, TOP_DEPTH, BOT_TIME and BOT_DEPTH.
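As a rough sketch of how these average and interval velocities relate to the time and depth values (the units, the two-way-time convention and the sample numbers are assumptions for illustration, not taken from VelPAK):

```python
# Velocity definitions sketched from time/depth pick values.
# Assumes two-way times in ms and depths in m; hence the factor 2000 to
# convert a two-way time in ms into a one-way time in seconds.
def average_velocity(time_twt_ms, depth_m):
    """Average velocity from surface: depth divided by one-way time (m/s)."""
    return depth_m / (time_twt_ms / 2000.0)

def interval_velocity(top_time_ms, top_depth_m, bot_time_ms, bot_depth_m):
    """Interval velocity between two picks: thickness over one-way isochron."""
    return (bot_depth_m - top_depth_m) / ((bot_time_ms - top_time_ms) / 2000.0)

# e.g. Top(WellTime,WellDepth) uses TOP_TIME and TOP_DEPTH:
v_top = average_velocity(1000.0, 1250.0)
# Along Hole(WellTime,WellDepth) uses both well picks:
v_ah = interval_velocity(1000.0, 1250.0, 1400.0, 1850.0)
print(v_top, v_ah)
```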
Layer(GridTime,GridDepth,BotXY)
The interval velocity based purely on the grid times and depths i.e. TOP_TIME_GRD,
TOP_DEPTH_GRD, BOT_TIME_GRD and BOT_DEPTH_GRD.
Layer(GridTime,GridDepth,TopXY)
The interval velocity derived purely from the top time and depth grids interpolated at the base well pick (i.e. TOP_TIM_GRD_BOT, TOP_DEP_GRD_BOT, BOT_TIME_GRD and BOT_DEPTH_GRD).
Vertical Interval Velocities
The vertical interval velocities are derived at the location of the base well pick (i.e. at BOT_X, BOT_Y). They correspond more closely with the interval velocities used in depth conversion.
Layer(GridTime,WellDepth,TopXY)
Layer(GridTime,WellDepth,BotXY)
Layer(GridTime,WellDepth,BotXY,Well Tie)
The interval velocity derived from the top time and depth grids (TOP_TIM_GRD_BOT and TOP_DEP_GRD_BOT) and the time and depth values at the base well pick (BOT_TIME_GRD and BOT_DEPTH).
THIS IS THE VALUE NEEDED IN A DEPTH CONVERSION TO TIE THE TIME GRID WITH THE WELL DEPTH.
The vertical interval velocities are derived at the location of the base well pick (i.e. at BOT_X, BOT_Y). They correspond more closely with the interval velocities used in depth conversion.
WELL TOP
Well Top - Location
X Coordinate
The X coordinate of this top; this is calculated by looking at the time-depth curve, and
interpolating the X coordinate off the deviation information there.
Y Coordinate
The Y coordinate of this top, calculated as above. In a deviated well, the X and Y
coordinates will be different at each top.
The velocity module is used for conditioning and modelling velocities within VelPAK.
From the link, RMS, Average or Interval velocities can be read in and stored within VelPAK.
Within this module these velocities can be calibrated spatially against the well data, the Dix routine can be applied to RMS velocities, pseudo-wells can be produced, and velocity volumes generated.
Amalg
The amalgamation process creates a combined time-depth curve (pseudo-well) from surrounding time-depth curves generated via the Dix process. This works by gathering data within a search radius.
Pseudo-wells at the same location as a true well can be generated and the results compared.
Alternatively a grid of pseudo-wells can be generated automatically or from the survey
locations.
Calibrate
The Calibrate fly out is used to calibrate the velocities within VelPAK layer by layer to the well
velocities in the model.
Volume
Velocity Volume Generation generates 2D or 3D velocity volumes.
XYZ
The XYZ option implements a simple and fast method for getting velocity information from the
stacking velocity data.
Individual controls on which of the various velocity data types to plot as well as the well
curves.
Note: From the Model Tree select Event 0 (no event horizon) to display all velocities for all
event horizons.
Select the specific event horizon you require from the Model Tree for detailed display
of one event horizon.
In the Map Module – In Edit mode make sure it is set to velocity so that you get the Velocity
points displayed on the map.
In Point Pick mode select a stacking velocity point. The dialogue of values associated with that point will pop up. The Mode column tells you what type of velocity it is: Old (Dix) = 0, Average = 1, Interval = 2, RMS = 3, Unknown = 4.
Display Dialog
Display Dialog - General Tab
When you have made your edits, press Apply to apply the changes and activate the option.
Display
Display_Type - Controls the display rather than the calculation. The data can be plotted in the time-velocity or depth-velocity domains.
Size - Controls the size of the symbols on the plot.
Overlays
These are exactly the same options as the ‘Overlays’ options from the top of the Velocity
module tab. The overlays can be turned on here via a Yes/No toggle or by the Overlay
drop down.
Choose which Velocities to display -
Average_Overlay - Velocity after DIX transformation & integration from surface (Green).
RMS_Overlay - RMS Debiassed velocity (Black).
Well_Overlay - Well velocity curves (selected well in Blue other wells in Purple).
Note: DIX only operates on the current selected GROUP of stacking velocities. Go To
Group Overview.
DIX
[Diagram: the DIX workflow - the interval velocity (Vint) is derived from the RMS velocities over each interval, anisotropy and heterogeneity corrections are applied via a calibration factor, and the result is integrated.]
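The Dix transformation itself can be sketched with the classic Dix equation; this minimal version (with made-up sample values) leaves out the anisotropy and heterogeneity calibration factor that VelPAK applies:

```python
import math

def dix_interval_velocity(v_rms_top, t_top, v_rms_bot, t_bot):
    """Classic Dix equation: interval velocity between two RMS picks.
    Times are two-way times; any consistent units will do."""
    num = v_rms_bot ** 2 * t_bot - v_rms_top ** 2 * t_top
    return math.sqrt(num / (t_bot - t_top))

# Made-up example: RMS 2000 m/s at 1.0 s, RMS 2200 m/s at 1.5 s.
v_int = dix_interval_velocity(2000.0, 1.0, 2200.0, 1.5)
print(round(v_int, 1))
```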
When you have made your edits, press Apply to apply the changes and activate the option.
The General tab is used to specify the path name to the stacking velocity volume segy file to be used in the calibration routine, and the trace header byte positions if it is going to be used for the calibration routine.
Note: For this to be used the Type must be set for each or any of the Events on the Factor
tab to Volume.
When you have made your edits, press Apply to apply the changes and activate the option.
This tab sets up the calibration and the method for each event.
Note: If all events are being calibrated they should all be done at once.
Note: Only the number of events that are in the project will be shown. Each Factor set up
01_Factor...02_Factor etc. relates to the relevant event layer in the model. For
example - if there are only 4 event layers in the model there will only be four Factors
available for input. The above example is for a seven layer model.
Note: If RMS velocities are being stored and used, this calibration will perform the DIX equation on the values. If other calibration methods (Grid, Volume) are being used, a Dix Value Factor MUST BE PERFORMED first as part of the calibration routine, by running the Calibration at 100%.
3. In the Amalg Fly Out: generate pseudo-wells at the well locations.
• Press the Create from Wells ‘+’ at the top of the Amalg Property Grid.
• Press Apply on Amalg to create pseudo well logs at the well locations
• This will produce XYZ data at the wells of the factor calculated by comparing the
pseudo-well isopach for the chosen event from stack data with the real isopach.
• These values generated can be displayed on the surface module by selecting ‘XYZ
Layer Visible’ and the correct XYZ slot - in the example above the slot General_01.
5. Grid these values up using the Surface Grid option making sure General01 (in this
example) is the XYZ slot selected for gridding and the Gridding method is Global:
6. Review the Grid in the Surface module. If the Grid of the calibration factors is not good you can de-select one or any number of the XYZ points from being used in the grid run. For example:
The following grid shows a bull’s eye where the XYZ value is 109.34.
• Set Display to only show the Stack wells over the area and not the ‘real’ wells by
setting Well_type to Stack.
• Click on the well. The Edit Well dialogue will come up:
Note: It is not easy to get this layer back once deselected so only do this if you are sure you
do not want this XYZ point for this layer used in the calibration!
• Back in the Velocity module re-generate the XYZ percent overwriting the XYZ file
stored in General01.
• In the Surface module re-generate the grid (Global gridding method):
Once you are happy with the grid of the calibration factor enter the grid as the factor type
for the calibration process:
• In the Velocity module set the Factor for the event you have been working with from Value to the grid slot where the percent Calibration grid is stored – in this example General01 – and press Apply.
• Press the Create from Wells ‘+’ at the top of the Amalg Property Grid.
• In the above example you can see the variation between the Real well trace and the
well trace developed from the well stack. This is what we are aiming to change in this
case by creating a HiDef velocity volume.
4. Run HiDef Tool from the Tools drop down menu.
5. You will not necessarily have a background model. In this case you do not want a background model - so say No to the alert that comes up.
6. Select the .sgc for the geometry of the volume. If no survey is selected it makes one
called None.sgc. It contains more or less all of the information in the Volume property grid
in VelPAK. It is needed to define geometry, SEGY type etc. Note that VelPAK always
makes a .sgc when it makes a velocity volume also.
7. Set geological model spread parameters for each layer from the HiDef - Set Up tab.
Within the layers you have defined are intermediate micro-layers or micro-horizons, which are modelled on the geological relationship between the bounding horizons.
You will need to assign what relationship to the bounding horizons this layer has, typically
geology running proportionally to the bounding horizons or running parallel to the upper
horizon or parallel to the lower horizon.
Refer to the HiDef User Guide for full details of this process.
8. In the Project Parameters Tab, in Simple View use the Bulk Selection of Wells option to turn off all real wells, leaving only the stack wells on.
You will see that wells with the status ‘Has Tops Has Logs’ and layer definitions assigned to them will have been turned off (de-activated). The stack wells, however, which can be seen if you scroll down to the bottom of the list of wells, will remain on.
Once the wells are turned off you now want to generate the Instantaneous velocity
volume using just the pseudo wells.
9. In the 3D range of the HiDef fly out select a name for the Output SEGY. Make sure it is an
appropriate name - you will need to select it again soon.
16. Set the volume generation going under the Run 3D tab in the same way you did before.
This will be the velocity volume of the percentage difference between the derived stack
wells and the actual well curve that will be used to calibrate the velocities back in the
Calibrate fly out in the Velocity module of VelPAK.
Once the process has completed you must re-generate the values of the pseudo-well
velocities via the Amalg feature:
18. Go into the Amalg fly out and press Apply again.
The velocities will be re-generated at the well-locations.
19. Check the calibrated well against the real well trace in the well module (you may need to
refresh the Well display).
XYZ Dialog
Overview of XYZ Option
The XYZ option implements a simple and fast - (but less predictive than Amalg) - method for
getting velocity information from the stacking velocity data.
It is necessary to process the stacking velocities using the DIX option before using LAYER. This generates a horizon-consistent velocity slice. The data created is in the form of
an XYZ file, which is stored in the Velocity XYZ slot under the relevant event horizon in the
Model Tree. The XYZ file created is assigned the name StackvelsXXX where XXX is the
method used.
AV - Average Velocity.
IV - Interval Velocity.
RMS - RMS (Debiassed) Velocity.
This can then be gridded up using the Grid option (under the Surface module), and an
average velocity or interval velocity depth conversion can be performed.
The horizon-consistent velocities are extracted by performing a linear interpolation on each
stacking velocity curve at the time from the time grid.
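That extraction step can be sketched as follows, assuming each stacking velocity curve is held as time-velocity pairs; the values, coordinates and function name are hypothetical:

```python
import numpy as np

# Hypothetical stacking velocity analysis at one XY location: (TWT ms, velocity m/s).
vel_time = np.array([0.0, 400.0, 900.0, 1600.0])
vel = np.array([1500.0, 1800.0, 2200.0, 2600.0])

def velocity_at_horizon(grid_time_ms):
    """Linearly interpolate the velocity curve at the time read off the time grid."""
    return float(np.interp(grid_time_ms, vel_time, vel))

# One record of the resulting XYZ file: location plus the extracted velocity.
x, y, grid_time = 451200.0, 6120500.0, 650.0
xyz_record = (x, y, velocity_at_horizon(grid_time))
print(xyz_record)
```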
Note: Be sure to have the correct Event Horizon selected in the Model Tree or Surface&Slot
selector.
When you have made your edits, press Apply to apply the changes and activate the option.
Option tab
Display
Velocity_Type - Interval/ Average/ RMS - used to select the type of velocity to be
extracted.
Interval velocity calculates the interval velocity between the current event horizon and
the next event horizon above. This is suitable for use with the Simple Interval Velocity
depth conversion method.
Average velocity type generates average velocity from surface down to the current event
horizon. This is suitable for use with the Simple Average Velocity depth conversion
method on the DEPTH Option page.
RMS velocities can also be extracted.
Verify - Yes/No
Output
Select the slot in the Model Tree where you want the output file to be stored. Typically this
will be the Velocity slot under the XYZ data for the event horizon you have selected,
although any slot can be allocated:
The XYZ file created is assigned the name StackvelsXXX where XXX is the method used.
AV - Average Velocity.
IV - Interval Velocity.
RMS - RMS (Debiassed) Velocity.
Amalg Dialog
Overview of Amalg Option
The amalgamation process creates a combined time-depth curve (pseudo-well) from surrounding time-depth curves generated via the Dix process. This works by gathering data
within a search radius. It can be particularly interesting to generate a pseudo-well at the same
location as a true well and compare results. Alternatively a coarse grid of pseudo-wells can
be generated automatically or from the survey locations. This is a powerful technique to
smooth the errors in the stacking velocity data whilst maintaining information. Once the grid
of pseudo-wells is complete the Curve module can be used to generate a set of control points
of interval velocity for gridding.
The pseudo-wells generated by this process are identical to real wells as far as the rest of the
program is concerned. So you can use these pseudo-wells in the Optimize module to fit
functions to them, as well as obtaining interval velocity maps etc. It is recommended that
function generation on pseudo-wells is performed in the velocity-depth domain.
The time-depth curve for each pseudo-well is defined for all layers defined in the current
model.
For the Amalg process to distinguish between ordinary wells and stacking velocity pseudo-
wells, the pseudo-wells will have the word “stack” in lower case as the first five characters of
the well name. If generated as a coarse grid each well will be assigned a number following the ‘stack’ name. If generated at an actual well the ‘stack’ will be followed by the actual well name.
The process steps for using Amalg are as follows:
1. Decide whether to run the process as a Grid of pseudo-wells, at well locations or survey
locations.
2. Create them by pressing the correct ‘+’ at the top of the Amalg Property Grid.
(The three ‘+’ options are: Create Grid, Create From Wells, Create From Survey Locations.)
3. Decide on the Radius around each well the process is to gather the information from and
the spacing, if applicable.
4. Press the Apply button to generate the curves associated for your created well(s).
5. View the generated locations on the Surface module - selecting Stack as your Well_Type
in Display.
6. View the curve in the Well Module noting that for pseudo-wells at actual well locations you
can choose the Overlay_Well as either the stack well or the real well to view both curves
on the same display.
When you have created your well(s), press Apply to apply the changes and activate the option. Go Here for details of what happens when this is activated.
Delete - Use to delete all the well locations prefixed with ‘stack’ within the model.
Spacing - The UTM spacing for the generation of your grid of pseudo-wells. The wells are
always positioned starting at the origin of your data area. They are orientated parallel to
the X-Y axis.
Verify - Yes/No - If this is set to Yes the program will expect verification via an Alert Box as to whether to create each ‘Stack’ well. Obviously when creating a large grid over a Model area this could slow the procedure down greatly.
Amalg Methodology
When the Amalg Apply tick is activated, VelPAK scans the list of wells in the Model Tree until
it finds one called “stackXX”. It then works out if there are any stacking velocity points it can
use to generate a pseudo-sonic. It generates a cylinder, with a radius of the search distance
based on the value of the mute distance (fixed from Dix) at the midpoint. The top and base of
the cylinder are the time grids for the top and base of the layer. Only data points from within
this cylinder are considered. If a stacking velocity analysis has only one Time-Velocity pair
inside then it is ignored.
These ‘cylinders’ are then added together to create a curve for the whole trace length. Note
that there may be overlaps due to the geology between the layers. This is compensated for
using the Two-way time at that point.
VelPAK then combines the data points within the cylinder, and rearranges them to form a
pseudo-sonic log. It does this by calculating an interval velocity between each sample down
each stacking velocity analysis curve, and a midpoint depth for each of these sample pairs. It
then sorts these midpoint depths into order, and re-integrates to get a time-depth curve. This
is then in the same form as all the other VelPAK sonic logs, which are stored as time-depth
curves.
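One plausible reading of those combine-sort-reintegrate steps, with made-up curves (the real gathering, mute-distance and overlap handling are more involved than this sketch):

```python
import numpy as np

# Two hypothetical stacking-velocity-derived time-depth curves inside the cylinder
# (two-way time in seconds, depth in metres). Values are illustrative only.
curves = [
    (np.array([0.0, 0.5, 1.0, 1.5]), np.array([0.0, 400.0, 950.0, 1600.0])),
    (np.array([0.0, 0.6, 1.2]),      np.array([0.0, 500.0, 1150.0])),
]

# 1. Interval velocity and midpoint depth for each consecutive sample pair.
pairs = []
for t, z in curves:
    v_int = np.diff(z) / np.diff(t)   # interval velocity of each segment
    z_mid = (z[:-1] + z[1:]) / 2.0    # midpoint depth of each segment
    pairs.extend(zip(z_mid, v_int))

# 2. Sort the pooled segments by midpoint depth.
pairs.sort(key=lambda p: p[0])

# 3. Re-integrate: walk down the sorted segments, accumulating time and depth.
times, depths = [0.0], [0.0]
prev_depth = 0.0
for z_mid, v in pairs:
    dz = z_mid - prev_depth           # thickness down to the next midpoint
    times.append(times[-1] + dz / v)
    depths.append(z_mid)
    prev_depth = z_mid
# (times, depths) is now a pseudo-well time-depth curve.
```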
If Verify is set to ‘Yes’ - the program then asks for confirmation of whether you want to
create this pseudo-well. If you click on Cancel, the program deletes that particular “stack”
well, and then asks you if you want to carry on with Amalg.
If you click OK on this second dialog, VelPAK will proceed to the next “stack” well.
Note: If a “stack” well has no data points within its cylinder, VelPAK will not ask whether to
create the pseudo-well; it will just delete it.
There are two possible reasons for VelPAK deleting a “stack” well without asking you about it:
1. The “stack” well may be so far from the stacking velocity data that no analysis locations
are within the cylinder.
or
2. The “stack” well grids are indeterminate at the location. However, the process will seek to go as far down the trace as possible, so if only higher grids are present and lower grids are absent a curve will be generated down to where the grids are absent.
It is possible for the time-depth curve to extend above and below the time grid range if the
time grids have a substantial variation over the radius of the input data. It is also possible that
the generated curve data may not initially cover the entire time grid range because the
amalgamation process only works with midpoint values. However, in this case, the curve will
be extrapolated upwards or downwards, as necessary, using the first and last interval velocity
values in order to ensure that the time-depth curve is defined at both the top and base time
grids.
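The extrapolation described here can be sketched as follows; the curve values are invented and the exact VelPAK behaviour may differ:

```python
import numpy as np

# Hypothetical pseudo-well curve covering only part of the grid time range
# (two-way time in seconds, depth in metres).
t = np.array([0.4, 0.8, 1.2])
z = np.array([500.0, 1100.0, 1800.0])

def extend_to(t_curve, z_curve, t_top, t_base):
    """Extrapolate with the first/last interval velocity so the curve
    is defined at both the top and base grid times."""
    v_first = (z_curve[1] - z_curve[0]) / (t_curve[1] - t_curve[0])
    v_last = (z_curve[-1] - z_curve[-2]) / (t_curve[-1] - t_curve[-2])
    t_out, z_out = list(t_curve), list(z_curve)
    if t_top < t_curve[0]:    # extend upwards at the first interval velocity
        t_out.insert(0, t_top)
        z_out.insert(0, z_curve[0] - v_first * (t_curve[0] - t_top))
    if t_base > t_curve[-1]:  # extend downwards at the last interval velocity
        t_out.append(t_base)
        z_out.append(z_curve[-1] + v_last * (t_base - t_curve[-1]))
    return np.array(t_out), np.array(z_out)

t2, z2 = extend_to(t, z, 0.2, 1.5)
print(list(t2), list(z2))
```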
The top and base TOPS values for the pseudo-well are calculated by interpolating the
pseudo-well time-depth curve at the top and base grid times. This ensures that the well layer
will always tie perfectly with the time grids.
If several analyses have similar midpoint depths but different interval velocities, this will lead to a strange looking pseudo-sonic. However, because the intervals are very small, these points do not have a serious effect on the time-depth curves. Before deciding that a pseudo-sonic looks unrealistic, check the information on a time-depth display. You may find that it is quite OK when looked at in this domain. Go Here for details of how to do this.
For the second layer onwards, the last_time and last_depth must also be specified in the
equation.
Note: In order to generate a velocity volume which is referenced to the sea surface, (zero
time, zero depth), a velocity model must be defined for all intervals. In some cases the
velocity model may start at the seafloor, for which a depth grid has been imported into
VelPAK from another source. If this is the case then a depth conversion must be
entered for Event 1, as a velocity model must be defined for this interval.
When you have made your edits, press Apply to apply the changes and activate the option.
Use the down arrow () or double click on input name to select from available inputs.
Parameter Increment Tool
Parameter
Increment
Update
Parameter - Calculates the correct volume starts and stops, taking into account the selected increment, to conform with the SEGY selection.
Options
File - Output file name. Use the File Selector to select a location and filename or accept the default “volume.out”. It may help to suffix SEG-Y filenames with .sgy and TDQ format files with .avf (the standard Landmark naming convention).
Survey - Load Survey Information - this accesses a list in the author directory created
during the Link procedure of every 3d survey loaded to any VelPAK project under that
author. An alert message will pop up. Select the survey you wish the 3d information to be
loaded from and press OK.
Type - The output may be a diagnostic, a SEG-Y file or an ASCII file in Landmark’s TDQ format.
Diagnostics - Produces a text file containing diagnostic information about the model,
methods and values for each layer written to it as text values.
TDQ from Grid - A TDQ volume made using the grids in the model will be generated.
The Volume ASCII output follows Landmark’s TDQ format. AWK filters can be used to reformat the output file into another format, such as ESSO V2, if required. Contact software support for advice on this.
Segy from Model or Segy from Profile - This output is required so that the velocity
volume can be loaded into interpretation software such as the Kingdom Suite, Petrel,
OpenWorks or GeoFrame.
The commonest form is SEGY from Model: a velocity volume (of the type set by Dimension) will be generated from the model and the defined survey extents.
A SEGY from Profile velocity volume (of the type set by Dimension) will be generated from the snapped, depth converted profiles for this survey.
The format is a 3D regular orthogonal mesh with a velocity at each node. (The
velocity may be interval, RMS or average velocity.)
The mesh co-ordinate dimensions are:
• X location (UTM co-ordinate).
• Y location (UTM co-ordinate).
• Time or Depth.
• The output is an IEEE format SEG-Y. The trace header locations are as follows:
• in-line is byte position 189, integer format.
• crossline is byte position 193, integer format.
• X co-ordinate is byte 73, floating point format.
• Y co-ordinate is byte 77, floating point format.
• no. of samples per trace is byte 115, integer format.
• sampling interval is byte 117, floating point format.
- The SEG-Y output only allows constant sampling.
Note: Remember, the data and trace header formats are IEEE not IBM.
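A minimal sketch of reading those trace header fields; big-endian byte order and the 4-byte/2-byte field widths are my assumptions, since the manual gives only the 1-based byte positions and integer/floating-point types:

```python
import struct

def read_trace_header_fields(hdr: bytes):
    """Unpack the fields listed above from a 240-byte SEG-Y trace header.
    Byte positions in the text are 1-based, hence the '- 1' offsets.
    Field widths and big-endian order are assumptions, not from the manual."""
    inline, = struct.unpack_from(">i", hdr, 189 - 1)  # inline, integer
    xline,  = struct.unpack_from(">i", hdr, 193 - 1)  # crossline, integer
    x,      = struct.unpack_from(">f", hdr, 73 - 1)   # X co-ordinate, IEEE float
    y,      = struct.unpack_from(">f", hdr, 77 - 1)   # Y co-ordinate, IEEE float
    nsamp,  = struct.unpack_from(">h", hdr, 115 - 1)  # samples per trace, integer
    dt,     = struct.unpack_from(">f", hdr, 117 - 1)  # sampling interval, IEEE float
    return inline, xline, x, y, nsamp, dt

# Round-trip a synthetic header to show the layout.
hdr = bytearray(240)
struct.pack_into(">i", hdr, 188, 1200)        # inline
struct.pack_into(">i", hdr, 192, 1005)        # crossline
struct.pack_into(">f", hdr, 72, 451200.0)     # X
struct.pack_into(">f", hdr, 76, 6120500.0)    # Y
struct.pack_into(">h", hdr, 114, 1501)        # samples per trace
struct.pack_into(">f", hdr, 116, 4.0)         # sampling interval
print(read_trace_header_fields(bytes(hdr)))
```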
Option
File - output volume file name. This will be stored in the <model name>/volume directory.
Survey - Select the 2d survey loaded from the Kingdom project you wish to use for
volume generation (or Profile volume or seismic display) by clicking here and then on the
Select icon.
The Survey SP/Trace table is displayed below for the selected survey.
Parameters
Dimension - Instantaneous, Average Velocity or Depth (less common). A SEGY
volume with depth/ instantaneous velocity/ average velocity values at each time sample
will be generated. (Exactly as 3D).
Time_Increment - Output time increment of the time volume - typically 4ms
Time_Min - Minimum time, for example 0ms.
Time_Max - Maximum time of the time volume.
Survey
Trace, Shot, Ratio - for information.
Apply - Pressing Apply on this tab will generate the Geometry. **Proceed with caution** - this option does not generate your Volume!! (Use the Apply tick on the General 3D/2D tab.) This option will replace all the 3D or 2D seismic map location data in your project. It can be thought of as a separate facility to the rest of the Volume Generator in that, given the 1st, 2nd and 3rd point inputs, it will generate real seismic line data within your project. An alert will come up to warn you of this before proceeding.
Geometry Test - Once you have set up the X Y limits of the area you are going to generate the volume of, pressing this option will give you a (2d) visualization of that area.
The red that you see is not a real item on the display; zooming in will cause it to
disappear. It is there purely to test that your 3 point XY parameters are sensible.
Note: You must have the Surface Module open to visualize this (although not necessarily
displayed as the top tab).
Inline / Xline
The increments here specified will affect the resolution of the output file and the size of
the file. This option allows you to use the corner points of your 3D survey to define the
geometry, but output the velocity cube over a smaller area.
Inline/Xline_Clipping - Inline/Xline_Rounding - The rounding and clipping parameters help define the points for which values are output. For example, take a crossline range of 1004 to 1037 with an increment of 5. If rounding only is toggled on, then for inline 1200 values will be output at crosslines 1004, 1005, 1010, ..., 1035, 1037. If both rounding and clipping are switched on then values will be output for crosslines 1005, 1010, ..., 1035. If neither is switched on then values will be output for 1004, 1009, 1014 etc.
Line Type - Inline/Xline or both.
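The rounding/clipping behaviour can be reproduced as a sketch; this is my reading of the worked example above, not VelPAK's actual code:

```python
def output_crosslines(start, stop, inc, rounding=False, clipping=False):
    """Which crossline numbers get output, per the rounding/clipping example."""
    if not rounding:
        # Neither option on: step from the unrounded range start.
        return list(range(start, stop + 1, inc))
    # Rounding on: output at multiples of the increment inside the range...
    first = ((start + inc - 1) // inc) * inc
    lines = list(range(first, stop + 1, inc))
    if not clipping:
        # ...plus the unrounded range ends, when rounding alone is on.
        if start not in lines:
            lines.insert(0, start)
        if stop not in lines:
            lines.append(stop)
    return lines

print(output_crosslines(1004, 1037, 5))                                # 1004, 1009, 1014, ...
print(output_crosslines(1004, 1037, 5, rounding=True))                 # 1004, 1005, 1010, ..., 1035, 1037
print(output_crosslines(1004, 1037, 5, rounding=True, clipping=True))  # 1005, 1010, ..., 1035
```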
Batch Execution
When running a velocity volume it is possible to generate the volume in batch mode
accessing as many processors on the PC as you want and are available.
Select the Batch Execution using Multi Processors icon seen on the bank of icons for the module.
Inline Start / Inline Stop - Setting the Inline Start and Stop here will override whatever value is set in the Property Grid.
Number of Tasks - Taken from the number of processors the PC has this would usually
be left as the default allowing all CPUs to be used during the batch execution.
Abort - Allows you to Abort the process at any time.
This can be used for advanced batch runs on a number of different computers:
• Set up the Number of Tasks - CPUs to be used
• Press the Abort button once the 12 Tasks have started
• Pass the batch files to the other computers (since they are ‘simple’ batch files)
• When they have finished, use the Collate button to collate these task batches
Collate - this is an advanced user option as mentioned above. The process will take a
while and the progress of it will be shown in the console window.
Note that running the batch process on one computer (over a number of CPUs) will
automatically collate the runs on finishing.
The task manager Performance tab will show all CPUs in use.
As the processors finish their allocated loops the bar will turn green.
In the directory of the VelPAK project there will now be a new sub-directory called ‘batch’.
Within this sub-directory will be further sub-directories containing all the information that is
generated during a workflow looping run for each of the tasks run.
There are also batch files (*.bat) stored in the batch directory which have been used to run the processes. For advanced users, these would allow you to re-run each task individually if required. They can also be put onto another PC entirely and re-run - providing VelPAK is installed on that PC too. (Advanced use only!)
When the process is finished all the information stored in the ‘batch’ directory sub-directories will be collated and copied across to the relevant directories of the project.
• A Volume Generation can take a long time to generate on a large Survey. You may like to
select a portion of ten or so Inlines (possibly over an area where there is some good well
information available) to begin with to check that what you are setting up to generate is
what you want.
• Once the depth conversion is set, go to the Velocity Volume General Tab.
• Set Type to Segy for reading into your seismic package. You may want to run ‘Diagnostic’
as the type before you run the actual Segy. This will produce a file containing values for
the parameters set up. Selection of which diagnostics to output happens at the bottom of
the Property Grid. But NOTE that this will be a very large file even if you just select one
line from a survey to output; it is recommended you use the Inline and Xline specifiers
below to output a small portion of one line.
• Select the Survey button to get a survey - an alert box will pop up allowing you to select from the Surveys stored within your VelPAK project - and press the P button at the top to get the correct volume start/stop extreme limits.
• If you want to set up just a portion of the survey to output as a volume to test it, set the
Inline Min/Max values to reflect this. Note that Max input comes before Min in the inputs.
• Set Z Range Z_increment to be 4ms.
• Set Z_Max to the limit of your data which you can read off a Profile if one is available.
Press the Activate button.
Display Options
Previous/Next Arrows
Display Types
The method of selecting what type of display you wish to see.
Overlay Options
All the overlay options are basic on/off selections. Selecting the item once will turn on the
relevant display, selecting it again will turn the display off. You can build up as many of the
overlay option selections as you wish.
Well - Selecting this will display the well curve tracks stored in your model. The display of
the well track can be edited in the Display -> General fly-out.
Contour - Adds contours to the surfaces selected. For this reason the contour overlay must always have the surface overlay on also. The contour increment can be edited in the Display -> General fly-out.
3d Display fly-out
3d Display - General Tab
Use the down arrow () or double click on an input name to select from the available inputs.
When you have made your edits, press Apply to apply the changes and activate the option.
Cache All Lines - Allows the display of profile lines if the Segy is set up in the Profile
module (Display General Fly-Out).
Option
Background_Colour - choose the background colour of the 3d display.
Grid_Colour - Choose the colour of the grid and axes within which the 3d display is
presented.
Legend_Type - choose between seeing a Legend of the values of the surface, velocity or
seismic or no legend
Surface
Surface_Colour - select color tables for the surfaces. Go here for details of the color
tables in VelPAK.
Surface_Contour - Choose the increment that best suits the contouring of your data.
Surface_Slot - select the type of grid for display in the 3d display.
Profile
Profile_Mipmap - Yes/No - Yes can reduce memory usage and speed up graphics on a
low-end graphics card.
Profile_Trim - Yes/No - trims the interpretation of the profiles back to the interpretation
min/max.
Profile_Type - Selected/Current - choose between a number of selected profiles to be
displayed (selected via the selection procedure outlined here) or the current profile
selected on the profile module display.
Note: If current is selected, detaching one of either the Profile or 3d modules from the main
VelPAK window will allow you to move through the profiles in the profile module (using
the blue Line arrows at the top of the display) and view the profile on the 3d display.
Well
Top_Colour - the colour of the tops displayed on the wells will either be colored by layer
or by well.
Top_Disc - A disk can be drawn to highlight where the tops are on the wells. Color of the
disk relates to whether Top_Colour is set to Layer or Well (above).
(Examples show Top Disc Size values of 0, 500 and 2000, each with tops colored by layer.)
Top_Size - Size of font for tops displayed on the wells.
Top_Type - choose which tops are to be displayed on the well curve of the 3d display:
All - displays all formation tops in the model for the current well
Layer - displays only tops used in the layer definition
Used - displays all tops flagged as “Used” in the Layer definition table
UsedAndExtra - displays all tops flagged as “Used” and “Extra” in the Layer definition
table
Velocity_Colour - select color tables for the velocity displayed within the well trace
column. Go here for details of the color tables in VelPAK.
Well_Colour - color of well label.
Well_Text - size of font of the well label.
Well_Thick - thickness of the well trace in pixels; a value between 1 and 10.
Velocity
Velocity_Max, Velocity_Min - Velocity unit range for display.
Seismic
Seismic_Max, Seismic_Min - Seismic unit range down the Y axis for display.
Light
X_Light, Y_Light, Z_Light - Allows the light shading of the display to be adjusted
according to the surfaces and profiles on display
Rotation
XY_Rotate, Z_Rotate - interactively changed when dragging the left mouse button over
the display, this input allows you to set an exact numerical rotation for XY and Z rotations.
XY - An angle from 0 to 360 (where 0 and 360 are True North) from which to view the
model.
Z - The angle of elevation between 0 to 90 from which to view the model.
Scale
X_Scale, Y_Scale, Z_Scale - change the perspective of the X, Y, Z axes relative to each
other.
Translate
X_Translate, Y_Translate, Z_Translate - Translation of cube values to real world
coordinates. (Not generally used).
Select the surface grid layers to be displayed. The type of grid (i.e. Time/Depth etc., from
slots in the Model Tree) is selected on the General tab.
As well as the Yes/No option, the amount of opacity can also be selected, allowing individual
surfaces to be more or less see-through. The options 25/50/75 correspond to 25%, 50%
or 75% see-through.
Use this tab to edit the display for the Profile on the 3d display. Go Here for details of the
Profile tab.
The Curve module enables you to cross plot various data items from VelPAK and visually
inspect the relationship between those items. A line of best fit can be generated and
displayed. You can then interactively edit which items contribute to the fit.
Selection and activation of the curve data occurs from the Property Grid to the right of the
window. A comprehensive list of items is given in the Advanced Tab selection, whilst the
Standard Tab option offers a selection of well-used Graphs, Cross Plots and Relationships.
Values derived from graphing can be used in the VelPAK Depth Conversion.
Once your display data are on view, you can interactively edit the points and the view using
the ‘Style’ option and direct mouse actions on the graph display.
Data that can be selected comes from Curve, Layers, Stack, Surface, Tops and XYZ data.
All data plotted on the Graph display are also displayed numerically on the Data tab. The
Data tab allows you to manipulate or filter the data displayed and add and delete sets of data.
Data continues to be added cumulatively to this displayed data file until cleared by you or
exiting the module.
Clicking on a Well Location on the Curve will automatically bring up that well within the Well
Module.
The XYZ tab dialog allows you to export the data from the curve into XYZ slots within VelPAK
for use within Gridding.
The data can be exported in Microsoft Excel format or a specialist curve “.cft” file, which
allows the same graph with the same set-up parameters to be brought in and displayed
at a later date.
Using the Stack Curve Show option under the Display Property Grid you can select or
deselect whole groups of data points rather than individually. This curve will then become
deselected permanently, including within the Optimize module.
6. Selected data can and will be added to the existing graph and data tab unless you clear
the graph using the clear icon.
Curves can be saved using the special feature icons available only in this module.
Graphs can be saved/read in as a curve-fit “.cft” files and data can be saved in ‘.xls’
format.
7. Export the XYZ data produced from the curve using the XYZ data tab.
New - clears data from Graph and Data tab. Your data will be lost if it has not been saved
previously.
Open - opens a previously stored curve file in the ‘.cft’ format.
Save - saves the curve information as a ‘.cft’ format file. When re-opened, this will bring in
the numerical data in the Data tab and display the Graph, including any styles you have
selected to display your curve and any filtering that you have done to the numerical data.
Note: The name of the default ‘.cft’ curve file is stored within the Properties of the model for
reference.
Save as Excel - will save the numeric data as an Excel file (.xlsx). Filtered data will be
displayed within the Excel file but in gray rather than the default black to distinguish it.
Point Edit - Used to de-select and select wells to make them inactive for graphing
purposes.
Point Pick - used to select wells on the curve to display in the well module.
Note: The other options on the icon bar for the curve module are either currently inactive or
are documented elsewhere - in the case of the Printing options.
Graph Tab
A0, A1, A2 - The values derived from the fit; A0 is the constant, A1 the gradient for a
linear fit, A2 for a second-order fit, and so on.
Standard Error - the estimated standard deviation of the error in that method
R2 - R2 is a statistic that will give some information about the goodness of fit of a model.
In regression, the R2 coefficient of determination is a statistical measure of how well the
regression line approximates the real data points. An R2 of 1.0 indicates that the
regression line perfectly fits the data.
Adjusted R2 - Adjusted R2 is a modification of R2 that adjusts for the number of
explanatory terms in a model. Unlike R2, the adjusted R2 increases only if the new term
improves the model more than would be expected by chance. The adjusted R2 can be
negative, and will always be less than or equal to R2.
F-stat - Tests the null hypothesis that the straight-line model is correct. Since the non-
central F-distribution is stochastically larger than the (central) F-distribution, one rejects
the null hypothesis if the F-statistic is too big. How big is too big (the critical value)
depends on the level of the test and is a percentage point of the F-distribution.
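The fit statistics above can be illustrated with a short sketch in plain Python. This is not part of VelPAK, and the sample points are invented for illustration:

```python
# Compute A0, A1, R2 and adjusted R2 for a simple linear least-squares fit.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]          # invented X data
ys = [2.1, 3.9, 6.2, 7.8, 10.1]         # invented Y data

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Least-squares fit y = A0 + A1*x
sxx = sum((x - mean_x) ** 2 for x in xs)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
a1 = sxy / sxx                 # gradient (A1)
a0 = mean_y - a1 * mean_x      # constant (A0)

# R2 = 1 - SS_residual / SS_total
ss_res = sum((y - (a0 + a1 * x)) ** 2 for x, y in zip(xs, ys))
ss_tot = sum((y - mean_y) ** 2 for y in ys)
r2 = 1.0 - ss_res / ss_tot

# Adjusted R2 for p explanatory terms (p = 1 for a linear fit);
# always less than or equal to R2, as noted above
p = 1
adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)

print(a0, a1, r2, adj_r2)
```

An R2 close to 1.0 here would indicate, as described above, that the regression line closely approximates the data points.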
User P0 / User P1 (only if Show_User is set to Yes) - the A0 and A1 values for the
user-defined linear fit selected in the User option.
Data Tab
Data that is selected to be displayed on the Graph is displayed numerically in this Tab. Each
selection of data is added to the end of the data already on display. The ‘Clear Graph’
option that clears the graph of all its display also clears this tab window. Data can be
filtered to your specifications here to be displayed on the graph. Data that has been filtered
out is displayed in gray. Data that is still active is shown in black.
Clicking on the top of each column will sort the data alphanumerically in standard fashion.
Data can be sorted using more than one column using a ‘nested’ sort by holding the ‘shift’
button down while sorting.
For more control over which well data is utilised in the model, particularly with projects with
many wells, the Data table filters are cumulative. The right-mouse-button context menu
allows fast resetting of all the filters applied, and can force the data table to be rebuilt from
the model.
Data seen in the tab can be output as an Excel ‘.xlsx’ file to read into a spreadsheet. These
data and any filtering that has been done to it can also be saved as a ‘.cft’ file.
Data can be filtered on any number of columns within the file. Filter criteria are brought up for
the individual columns by clicking on the ‘funnel’ icon at the top of each column. This funnel
icon will change from gray to blue if a filter has been applied on that column:
You can select a number of methods as the filter for individual columns:
All - All elements will be displayed (default).
Custom - User defined selection of Operator and Operand to filter data. See below for
details.
Custom Filter
Gives you the ability to build up a list of conditions with which to filter the data.
You can use ‘And’ or ‘Or’ to select the operators used.
Operand - The objects that are manipulated.
Operator - The symbol that represents a specific action.
As the operand you can select another column label from the data file being used as the
qualifier for this filter, or a value from within the column currently being filtered.
For example, to select data only within a certain Block you could set up the filter for the X
Location column to be X values within the block limits (shown below), and similarly for the Y
Location.
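The And/Or Operator and Operand logic can be sketched in plain Python; the column names and block limits below are invented for illustration and are not VelPAK identifiers:

```python
# Each row of the Data tab is modelled as a dict of column -> value (invented data)
rows = [
    {"Well": "W1", "X Location": 1200.0, "Y Location": 5400.0},
    {"Well": "W2", "X Location": 2500.0, "Y Location": 5600.0},
    {"Well": "W3", "X Location": 3100.0, "Y Location": 4900.0},
]

# Block limits for the Custom filter example (invented values)
x_min, x_max = 1000.0, 3000.0

# Two conditions on the 'X Location' column joined with 'And':
# X Location >= x_min And X Location <= x_max
filtered = [r for r in rows
            if r["X Location"] >= x_min and r["X Location"] <= x_max]

print([r["Well"] for r in filtered])
```

A similar second filter on the 'Y Location' column would complete the Block selection described above.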
The layout of the columns can be changed to suit your requirements by accessing the arrow
to the right of the filter motif. Clicking on it will bring up a list of the columns within the Data
tab. You can then select which column you wish to be placed in this position.
On this Dialog you will select which data are to be plotted on which axis, from what source
your data are to come (i.e. Well Values or XYZs), how the fit is to be applied and how the
graph will look.
The tabs are:
General tab
Listing the display variables for the points and fitted curve. Selection of the type of Graph
Fit to take place.
For any changes to these inputs to take effect, press the Apply tick.
The Apply Robust button turns off the outliers in the model.
Option
Flip_X - Yes/No - selecting Yes the X axis will be flipped.
Flip_Y - Yes/No - selecting Yes the Y axis will be flipped.
Inactive - Choose whether to show inactive points. If shown, they are displayed in grey.
Label - Labelling of points on the Graph.
Label_Color - Color of label for point. See Color Below.
Label_Font - Font used for the label of points on the graph. See Fonts Below.
Title
Title, X_Title, Y_Title - These are automatically filled according to what has been
selected as the graph to be displayed. They can also be edited here if required. The
automatically generated titles are the same as the definitions for the Advanced Curve
Definitions.
Confidence
The confidence region is calculated in such a way that if a set of measurements were
repeated many times and a confidence region calculated in the same way on each set of
measurements, then a certain percentage of the time, on average, (e.g. 95%) the
confidence region would include the point representing the "true" values of the set of
variables being estimated. However, it does not mean, when one confidence region has
been calculated, that there is a 95% probability that the "true" values lie inside the region,
since we do not assume any particular probability distribution of the "true" values and we
may or may not have other information about where they are likely to lie.
Constant_Factor - only used when generating a constant; it is the factor of the standard
deviation used to generate the constant.
Linear_Percent - the percent confidence region to be displayed.
Show_Bands - green and blue bands displayed on the graph showing the upper and
lower confidence limit.
Show_User - a red line showing the user-defined fit.
Fit
Fit - Type of Fit to select:
None - No fit will be drawn on the Graph.
Constant - A horizontal, flat line will be drawn giving a constant value.
Linear - Fits the data to a straight line:
y = mx + c
and will produce two output values, constant ‘c’ and coefficient ‘m’.
Note: These derived values, ‘c’,’m’,’a’ and ‘b’ are displayed in the Results section below.
The ‘c’ value (constant) is always A0, the ‘m’ is always A1, and in the examples above
the ‘a’ value would be A2 and ‘b’ would be A3.
Fit_Color - The color of the fitted line on the graph. See Color Below.
Fit_Font - See Fonts Below.
Fit_Symbol - Pick a keyboard symbol to use. Default is no symbol.
Legend - The results of the curve fit can be plotted on the graph.
Choose from None / BottomLeft / BottomRight/ TopLeft / TopRight.
Go Here for a discussion on what the results being displayed are.
The example below shows BottomRight selection, with adjusted Font size and Font Color.
Note: Full mathematical details of the Robust Algorithms are from the book ‘Robust
Regression and Outlier Detection’ by Peter J. Rousseeuw and Annick M. Leroy,
1987. Wiley Series in Probability and Mathematical Statistics.
Robust - Yes/No. Yes will reduce the influence of outliers. The following options will
change the look of any outlying points excluded due to setting Robust to ‘Yes’.
Fonts - All the above Font inputs have the same set up options. On selecting the Font
option, the following standard font box appears to allow you to set up the font as you wish.
These changes and more can also be made by expanding the ‘Fonts’ input line using the
‘+’ symbol to the left of the input.
Bold - True/False.
GdiCharSet - Specifies the GDI character set that this Font object uses.
GdiVerticalFont - True if this Font object is derived from a GDI vertical font; otherwise,
false.
Italic - True/False.
Strikeout - True/False.
Underline - True/False.
Options
Filter - Normal/Fit/ Residual - in Normal mode filtering will take place according to how
the wells have been de-selected in the Data tab; any de-selected wells will not be used. If
however only the Fit or the Residual of a well has been de-selected then using this Filter
will allow only one or the other’s values to be displayed on the Curve. This option is
intrinsically linked to the Optimize module and vice versa.
Point_Reduction - Multiple by which to down-sample the number of data points used.
Info
X_Mnemonic - Automatically filled input from either the Standard or Advanced graph sets
selection of the X-axis value to display. It is not possible to re-name the axis here; it is
given active text status to allow you to copy and paste the mnemonic elsewhere.
Y_Mnemonic - Automatically filled input from either the Standard or Advanced graph sets
selection of the Y-axis value to display. It is not possible to re-name the axis here; it is
given active text status to allow you to copy and paste the mnemonic elsewhere.
Note: The X and Y axis titles are displayed at the top of this tab; this is for reference only. If
you wish to change the label titles this should be done in the Display property grid fly-
out.
Select the graph required and press the Activate tick to activate the display.
Standard Graphs

Well Layer
Interval Velocity v Isochron - defines the relationship between Interval Velocity and
Isochron for the specified layer. A typical application would be where a salt layer interval
velocity increases as the salt thins.
Interval_Velocity_v_Depth_to_Bottom_of_Layer - as above, but with values taken to the
bottom of the layer.
Average_Velocity_v_Time_to_Base_of_Layer - simple approximation.
Time_v_Seismic_Time - WELL_LAYER_BOT_TIME v WELL_LAYER_BOT_TIME_GRD.

Well Curve
V0 v Kz from sonic log
Depth v Time from sonic log
Average Velocity v Time from sonic log
Values are calculated by linear regression of the sonic logs.
Each of the axis selectors is used to specify what data are displayed in which axis within the
graphing window.
Note: The X and Y axis titles are displayed at the top of this tab; this is for reference only. If
you wish to change the label titles this should be done in the Display property grid fly-
out.
Note: The current event defines the base of the layer, while the top of the layer comes from
the previous event horizon. For the first event horizon, the top of the layer is assumed
to be the surface (i.e. zero depth and zero time).
Select the axes required and press the Activate tick to activate the display.
Go Here for a Guide to activating the Curve Module.
A DETAILED GUIDE to all the inputs under the Advanced Curve tab is at the end of this
chapter and available Here.
The XYZ tab dialog allows you to export the data from the curve into XYZ slots within VelPAK.
Note: The Save as Excel icon option on the Curve module also allows you to save the
numeric data as an external ‘Excel’ file.
This module takes the values of the X and Y axes from the graph and produces XYZ values
as required in the designated slots.
Type - Normal/Predicted. Predicted only works on linear regressions: given the X value
of a point it will work out the Y value at which that X crosses the fitted line. Only the
predicted ‘new’ X,Y points will be output in the XY output file produced.
(Diagram: the original X values plotted with their predicted values on the fitted line.)
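A minimal sketch of the Predicted calculation, assuming a linear fit y = A0 + A1·x has already been derived (the numbers below are invented):

```python
# Hypothetical linear-fit results: A0 (constant) and A1 (gradient)
a0, a1 = 1500.0, 2.5

# Invented input points as (x, observed_y) pairs
points = [(100.0, 1730.0), (200.0, 2050.0), (300.0, 2210.0)]

# 'Predicted' mode: for each point's X value, take the Y value where
# that X crosses the fitted line; only these new X,Y pairs are output
predicted = [(x, a0 + a1 * x) for x, _ in points]

print(predicted)
```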
Inactive_XYZ1 slot takes the values that have been selected as currently inactive for the
X axis.
Inactive_XYZ2 slot takes the values that have been selected as currently inactive for the
Y axis.
For each output type select the XYZ Model Tree slot you wish the XYZ data to be stored in.
For example:
(Example diagram: Active_XYZ2 (Y axis) takes the values of the active points;
Inactive_XYZ2 (Y axis) takes the values of the inactive points.)
(Diagram: A - XY of the top of the pick (TopX,TopY); B - XY of the bottom of the pick
(BotX,BotY); C - XY of the mid-point (MidX,MidY), shown against the Top and Base Time
or Depth Grids.)
Definitions
As you will see from the selection of definitions available from the Advanced curve option
there is a vast array to choose from to make your curve.
Most of the definitions stem from a few main definitions as listed below.
Well Definitions
The following diagram of a log defines the terms Top Well Pick and Base Well Pick.
(Diagram labels: Base Well Pick with BOT_TIME, BOT_DEPTH at Bot_X, Bot_Y; mid-point
with MID_TIME, MID_DEPTH at Mid_X, Mid_Y; surface location Surf_X, Surf_Y.)
Grid Definitions
(Diagram: the Top Time Grid and Base Time Grid with points A (TopX,TopY), B (BotX,BotY)
and C (MidX,MidY), and the grid values Top_Depth_Grd, Top_Time_Grd, Bot_Depth_Grd
and Bot_Tim_Grd.)
Remember that you may need to consider Mistie discrepancies between Grid and Well
Pick.
Bottom Time - (Bot_Tim_Grd)
The value of the base time grid interpolated at the base well pick (i.e. at BOT_X, BOT_Y).
Top Depth - (Top_Depth_Grd)
The value of the top depth grid interpolated at the top well pick (i.e. at TOP_X, TOP_Y).
Top Time - (Top_Tim_Grd)
The value of the top time grid interpolated at the top well pick (i.e. at TOP_X, TOP_Y).
Bottom Depth - (Bot_Depth_Grd)
The value of the base depth grid interpolated at the base well pick (i.e. at BOT_X,
BOT_Y).
Vertical Apparent IV
THIS IS THE VALUE NEEDED IN A DEPTH CONVERSION TO TIE THE TIME GRID WITH
THE WELL DEPTH.
The interval velocity derived from the top time and depth grids, TOP_TIM_GRD_BOT and
TOP_DEP_GRD_BOT and the time and depth values at the well base pick BOT_TIME_GRD
and BOT_DEPTH.
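As a plain-Python sketch of that calculation (the numbers are invented; times are assumed to be one-way times in seconds and depths in metres, which may differ from your project's units):

```python
# Grid and well values interpolated at the base-pick XY location (invented)
top_tim_grd_bot = 1.200    # top time grid at BotXY (s)
top_dep_grd_bot = 1800.0   # top depth grid at BotXY (m)
bot_time_grd    = 1.650    # base time grid at BotXY (s)
bot_depth       = 2475.0   # well base-pick depth (m)

# Vertical apparent interval velocity: the velocity that converts the
# grid time interval into the well depth interval at this location,
# tying the time grid to the well depth
v_int = (bot_depth - top_dep_grd_bot) / (bot_time_grd - top_tim_grd_bot)

print(v_int)
```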
(Diagram: Top_Tim_Grd_Bot - the top time grid value at BotXY; the Vertical Apparent
Interval Velocity spans from there down to Bot_Tim_Grd / Bot_Depth_Grd at the base pick.)
Misties
With deviated wells (or indeed straight wells) the values of the tops would not necessarily tie
exactly with the grid for the same surface. If this is the case you will need to be aware which
of the two values you will want.
The diagram below shows an exaggerated mistie between the well pick and grid. This
highlights the fact that if GRIDS are used then it is the X,Y position of point A (the surface XY
location of the well’s top pick) that is used as the Grid Value; not where the well track ‘enters’
the grid surface.
(Diagram: the Top Time Grid with point A at TopX,TopY, offset from where the well track,
at BotX,BotY, enters the grid surface.)
STACK CURVE
STACK LAYER
SURFACE XYZ
SURFACE GRID
WELL CURVE
Uncalibrated Time
The uncalibrated time value at each depth sample. Curve data can be displayed for more
than one well. All the samples within the current layer are displayed.
Depth
The depth value at each sample (these first 5 columns are what is loaded in via File ->
Import -> Well Curve). Curve data can be displayed for more than one well. All the
samples within the current layer are displayed.
Calculated Well Depth
Depth results that come from the current depth conversion set-up.
WELL LAYER
Well Layer - Isochron
Along Hole(GridTime)
Layer(GridTime,GridTime,BotXY)
Layer(GridTime,WellTime,BotXY)
(Diagram labels: BOT_TIME_GRID; BOT_TIME (base well pick); BOT_DEPTH_GRID.)
Along Hole(Well)
The isopach based on the top and base well picks (i.e. TOP_DEPTH and
BASE_DEPTH). These may well be offset from the grid values for top and base of layer
due to misties (as shown in the diagram below). Values are only meaningful for vertical
wells, and should be treated with caution for deviated wells.
(Diagram label: BASE_DEPTH (base well pick).)
Layer(GridDepth,GridDepth,BotXY)
Layer(GridDepth,WellDepth,BotXY)
Locations from one of the main VelPAK curve Definitions. Go Here for details.
Time(WellTime,GridTime,BotXY)
Depth(TieMode,BotXY)
Depth(WellDepth,GridDepth,BotXY)
Global Fit ()
Global Residual ()
Well Fit ()
Well Residual ()
Remember that you may need to consider Mistie discrepancies between Grid and Well
Pick.
Top Time
One of the main Well Definitions. Go Here for details.
Top Time(GridDepth) - (Top_Time_Dep_Grd)
The time obtained by interpolating the well time-depth curve at TOP_DEPTH_GRD. Time
values are obtained by interpolating the well time-depth curve at the grid-derived depths.
Middle Time
One of the main Well Definitions. Go Here for details.
Bottom Time
One of the main Well Definitions. Go Here for details.
Bottom Time(GridDepth) - (Bot_Tim_Dep_Grd)
The time obtained by interpolating the well time-depth curve at BOT_DEPTH_GRD. Time
values are obtained by interpolating the well time-depth curve at the grid-derived depths.
Top Depth
One of the main Well Definitions. Go Here for details.
Middle Depth
One of the main Well Definitions. Go Here for details.
Bottom Depth
One of the main Well Definitions. Go Here for details.
Bottom Depth(GridTime) - (Bot_Dep_Tim_Grd)
The depth obtained by interpolating the well time-depth curve at BOT_TIME_GRD. Depth
values are obtained by interpolating the well time-depth curve at the grid-derived times.
These are the values displayed on the right-hand side of the WELL TOPS display in
velocity-depth mode.
Top(GridTime,GridDepth)
The average velocity from the surface to the top of the layer, based purely on the grid
values i.e. TOP_TIME_GRD and TOP_DEPTH_GRD.
Top(GridTime,WellDepth)
The average velocity to the top of the layer, based on the grid time and well depth i.e.
TOP_TIME_GRD and TOP_DEPTH.
Top(WellTime,WellDepth)
The average velocity from the surface to the top well pick, based purely on the well values
i.e. TOP_TIME and TOP_DEPTH.
Bottom(GridTime,GridDepth)
The average velocity from the surface to the base of the layer, based purely on the grid
values i.e. BOT_TIME_GRD and BOT_DEPTH_GRD.
Bottom(GridTime,WellDepth)
The average velocity to the base of the layer, based on the grid time and well depth i.e.
BOT_TIME_GRD and BOT_DEPTH.
Bottom(WellTime,WellDepth)
The average velocity from the surface to the base well pick, based purely on the well
values i.e. BOT_TIME and BOT_DEPTH.
Along Hole(GridTime,WellDepth)
The interval velocity based on the grid times and well depths i.e. TOP_TIME_GRD,
TOP_DEPTH, BOT_TIME_GRD and BOT_DEPTH.
Along Hole(WellTime,WellDepth)
The interval velocity between the well picks, based purely on the well values i.e.
TOP_TIME, TOP_DEPTH, BOT_TIME and BOT_DEPTH.
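The average and interval velocity definitions above reduce to simple ratios. This plain-Python sketch uses invented well-pick values (one-way times are assumed):

```python
# Invented well-pick values: one-way times (s) and depths (m)
top_time, top_depth = 0.800, 1200.0   # TOP_TIME, TOP_DEPTH
bot_time, bot_depth = 1.300, 2200.0   # BOT_TIME, BOT_DEPTH

# Top(WellTime,WellDepth): average velocity from surface to the top pick
v_avg_top = top_depth / top_time

# Bottom(WellTime,WellDepth): average velocity from surface to the base pick
v_avg_bot = bot_depth / bot_time

# Along Hole(WellTime,WellDepth): interval velocity between the two picks
v_int = (bot_depth - top_depth) / (bot_time - top_time)

print(v_avg_top, v_avg_bot, v_int)
```

The grid-based variants use the same ratios with the grid-derived times and depths substituted for the well values.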
Layer(GridTime,GridDepth,BotXY)
The interval velocity based purely on the grid times and depths i.e. TOP_TIME_GRD,
TOP_DEPTH_GRD, BOT_TIME_GRD and BOT_DEPTH_GRD.
Layer(GridTime,GridDepth,TopXY)
The interval velocity derived purely from the top time and depth grids interpolated at the
base well pick (i.e. TOP_TIM_GRD_BOT, TOP_DEP_GRD_BOT, BOT_TIME_GRD and
BOT_DEPTH_GRD). Vertical interval velocities: these are derived at the location of the
base well pick (i.e. at BOT_X, BOT_Y). They correspond more closely with the interval
velocities used in depth conversion.
Layer(GridTime,WellDepth,TopXY)
Layer(GridTime,WellDepth, BotXY)
Layer(GridTime,WellDepth,BotXY,Well Tie)
The interval velocity derived from the top time and depth grids (TOP_TIM_GRD_BOT and
TOP_DEP_GRD_BOT) and the time and depth values at the well base pick
(BOT_TIME_GRD and BOT_DEPTH).
THIS IS THE VALUE NEEDED IN A DEPTH CONVERSION TO TIE THE TIME GRID
WITH THE WELL DEPTH.
The vertical interval velocities are derived at the location of the base well pick (i.e. at
BOT_X, BOT_Y). They correspond more closely with the interval velocities used in depth
conversion.
WELL TOP
Well Top - Location
X Coordinate
The X coordinate of this top; this is calculated by looking at the time-depth curve, and
interpolating the X coordinate off the deviation information there.
Y Coordinate
The Y coordinate of this top, calculated as above. In a deviated well, the X and Y
coordinates will be different at each top.
In the Optimize module, parameter values are derived for different velocity functions, and
various displays are produced to derive best-fit parameter values for all classic V0 and K
depth-conversion methods for each required layer.
Optimize uses non-linear least-squares curve fitting of time-depth curve data derived from
sonic logs, check-shots or tops and times to derive the best function parameters for each
well. It has the facility to allow one parameter to be fixed and the other mapped, e.g. fixed
K and contoured V0. With close interaction between the parameter grid and the mapping
modules you can immediately see the V0 and K values that best suit your data.
Tip: For the new user there are many different aspects of the Optimize module which could
well appear confusing. It is thoroughly recommended therefore that a new user works
through the Training guide for the Optimize module which will lead you through a typical
flow of work.
• For each node on the grid, Optimize calculates the best ‘Fit’; how well the V0,K parameter
pair fits the log data for this one well. The ‘Residual Line’ is also calculated; any
parameter pair along this Residual line would result in a zero depth error for this well.
For each pair of points on the time depth curve Optimize compares the known depth of
the base point (B) against the depth predicted by the function parameters.
(Diagram: points A and B on the time-depth curve, with Depth and Time axes.)
• So for each possible V0,K pair Optimize calculates an RMS residual. It then minimizes this
value using its optimization techniques.
This provides us with the Best Fit function.
• Optimize then looks at the Best Fit for all of the wells and the mean Best Fit.
This is called the ‘Layer Fit’.
• Optimize can also look at the residual lines for all wells. This can be used to detect and
exclude outliers. The best residual is the point at which the mean value of the residuals
and, given that, the fit to the log data are optimum.
(Diagram: the User Fit line against the Intercept (V0) axis.)
2. It is advisable to always have the gradient axis starting at zero. This axis would therefore
show you the V0 values that would produce no error for each selected well.
3. Given the type of well and curve data you have, it may not even be appropriate to run
Optimize on it. For a good result from Optimization the Residual lines for all wells
should show some sort of convergence in the V0 and K domain. If the Residuals do not
converge then you may as well use a function or a simple contoured V0 grid for your
depth conversion.
(Diagrams: two Residuals displays plotted in the V0, k domain; one labelled ‘Good Data for
Optimization’, with residual lines converging, and one labelled ‘Bad Data for Optimization’.)
4. All values are re-generated if a well is selected or de-selected. If you have set some
values in the Generate General tab to ‘No’ then the original values of these will be wiped
and replaced with blanks. (It will not keep the values as derived from an earlier
generation).
(Diagram label: Non-Conforming Outliers.)
7. Curves and tops can be selected or de-selected separately; you may well have sound
tops where the curve is poor or vice versa.
Types of Optimize
Optimize types can be a baffling mix of lines and grids, fits and residuals. Here we try to
explain all the different elements: how they are calculated and what their displays show.
FIT - how well the data fits to the log data:
Well_Point_Fit
Layer_Point_Fit
RESIDUAL - the ‘zero-error’ line; the set of V0 and K values that would produce a
zero error for a well:
Well_Point_Residual
Layer_Point_Residual
Fit Values
How well the data fits to the log data for the given depth conversion method set-up.
Calculation Details
For each pair of sample points on the well curve the depth isopach is worked out using the
given V0 and K pair for that point and the time from the well. Optimize compares the known
depth of the base point against the depth predicted by the function parameters for each
sample. The RMS is then taken of all these difference values for the given V0 and K, and
this is the value assigned to the point or node. Note that values are normalized within the
process so values are always under ‘1’. Also be aware that each calculation for each sample
down the curve takes the true time-depth value of the sample above as its starting value,
so the effect of the calculations is not cumulative down the curve.
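The calculation can be sketched in plain Python. This assumes an instantaneous-velocity function V(z) = V0 + K·z (one of several function forms; not necessarily the one configured in your model) and omits the normalization step; the time-depth samples are invented:

```python
import math

# Invented time-depth samples from a well curve: (one-way time s, depth m)
td = [(0.0, 0.0), (0.2, 320.0), (0.4, 700.0), (0.6, 1140.0), (0.8, 1650.0)]

def rms_fit(v0, k, samples):
    """RMS depth error for one V0,K pair. Each step starts from the TRUE
    depth of the sample above, so errors are not cumulative down the curve."""
    errs = []
    for (t0, z0), (t1, z1) in zip(samples, samples[1:]):
        dt = t1 - t0
        # Depth predicted at t1 by integrating dz/dt = V0 + K*z from z0
        z_pred = (z0 + v0 / k) * math.exp(k * dt) - v0 / k
        errs.append(z_pred - z1)
    return math.sqrt(sum(e * e for e in errs) / len(errs))

# Evaluate a small grid of V0,K nodes and pick the pair with the lowest RMS
best_rms, best_v0, best_k = min(
    (rms_fit(v0, k, td), v0, k)
    for v0 in range(1400, 1801, 100)
    for k in (0.2, 0.4, 0.6, 0.8))
print(best_rms, best_v0, best_k)
```

Optimize uses proper non-linear minimization rather than this brute-force grid search, but the quantity being minimized is the same kind of RMS depth error.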
Well_Grid_Fit
Given the depth conversion parameters entered, a grid is generated for each well in the
model. The generation takes each pair of V0 and K values and works out the depth value,
comparing it to the time-depth curve data for that layer, for that well. The value plotted is the
difference between these two values. As the values of V0 and K approach the true values
the grid values will reach a low (or trough). Contouring these values up will therefore show
where the best values of V0 and K will be.
(Diagram: a grid of nodes over the X (V0) and Y (k) axes, 10 samples in each direction.)
If you displayed both the Well Grid Fit result and the Well Line Fit as shown in the magnified
example below, there will be a slight discrepancy due to the angle of the trough not being
exactly perpendicular to the fixed Y axis. However, this is usually insignificant.
(Magnified example of the possible discrepancy between grid and line fit: the actual value
taken lies along the fixed Y parameter line, in the X (V0), Y (k) plane.)
Well_Point_Fit
The best fit given the values that produced the Line Fit and the Y (k) value. Naturally this
should always fall on the Line Fit at the very bottom of the contour trough.
Layer_Line_Fit
An average of all the Line Fits generated for the selected wells in the model. These values
are generated from scratch in the same way that the Well Line Fit is generated, except that
each calculation uses all the well values at each point.
The average value for all the Layer Fits is an average RMS.
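One common way to combine per-well RMS values into a single layer average, assuming equal weighting per well (an assumption for illustration, not necessarily VelPAK's exact weighting), is the RMS of the per-well values:

```python
import math

# Invented per-well RMS fit values for the selected wells
well_rms = [12.0, 8.5, 15.2, 9.8]

# Combine: square each well's RMS, average, then square-root
layer_rms = math.sqrt(sum(r * r for r in well_rms) / len(well_rms))

print(layer_rms)
```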
Residual Values
The ‘zero-error’ line; the value at which any combination of V0 and K would produce a zero
error for a well, for the given depth conversion method set-up. Bear in mind this is a non-
unique solution: a combination of bad V0 and Ks could give the same zero error as a
combination of ‘best’ V0 and Ks could give.
The actual methodology of calculating the values for Grid, Line and Point are the same as for
the FIT; however what the procedure is calculating for the Residual values is obviously
different. Given the seismic time at the top and base of layer, the last depth (if a buried layer)
and the depth conversion formula set up, the generation calculates the error using the V0 and
K of that point. In essence it does exactly what the Tie option within VelPAK does.
Well_Grid_Residual
The Grid Residual display tends to be striped contours (as opposed to the trough usually
seen in the Fit Contour display) where the focus would be the zero contour (implying zero
error along that line).
Well_Point_Residual
How can a Well Point Residual be calculated, when any of the V0 and K pairings down the
Well Line Residual line can give zero error? The answer is that for this value the best Fit is
calculated also, and the Well Point Residual is the point that falls on the Well Line residual
line, closest to the Fit point.
Note: The relationship between the Well Fit and the Residual may not make this point a
valid or useful point; this should be kept in mind when viewing this point!
[Figure: the Well Point Residual and the Well Point Fit marked on the display.]
Layer_Grid_Residual
The average of the grid values calculated for the selected wells.
Layer_Point_Residual
One point calculated from the average of all the Well Point Residual values. This value is
generated from scratch in the same way as the Well Point Residual, except that each
calculation uses all the values at each point.
Grid Nodes
The nodes of the grid that has been generated for the current active Map Type.
Standardly this grid will be displayed as pink dots. Zooming in on the nodes with the Grid
Node Label switched ‘on’ will show the values of the node which would be currently displayed
as a contour map.
Note: These are grids of specific data relating to the Optimize generation routines and
should not be confused with standard grids stored within VelPAK. The layer grids are,
however, still stored within the Model Tree under the Expanded Slot ‘Optimization Fit’.
The Fit shows how well the well data fits the log data. The current well or all wells can be
selected from the Display Property Grid.
The ‘zero-error’ line; the value at which any combination of V0 and K would produce a zero
error for a well. The current well or all wells can be selected from the Display Property Grid.
The best fit given the values that produced the Line Fit and the Y (k) value.
Displays the lowest point of the Residual (zero error) line; the point where residual is closest
to the Well Fit.
Displays the point picked by you when the Point Edit option is activated.
No display at all!
• Are you in a selected Layer? Check you are not in ‘Layer 0’ (not a selected layer).
• Check that this is for all wells, not just the well currently selected. Bringing up the well
curve display (from the Well module) may well help you see a reason for yourself.
• If you fail to get any reasonable display it may be because you have not defined the
layers within this module. Layer display in this case will provide a display of one color
band over the entire area.
• You may find that it is just this one well you have no fit data for; scrolling through other
wells either in the Well Module or in the optimize display may show this up for you.
• You have no Point or Line display but you have contours; although there is no ‘trough’
and they appear to veer off to a corner.
Your Optimize calculation window has been incorrectly defined. Re-define your
Optimize generation Min/Max window to a larger window of calculation. (Standardly a
V0 of 0 to 5000 and k of 0 to 3 are sufficient). Note that using the Point Edit option on
the Display would give you the values of the Fit and Residual Points and from there
you should be able to see what sort of values are expected.
Note: Saving the model with the usual ‘Save’ method saves only the current set-up as it is
when you press the save option.
Previous/Next Arrows
Event Well
Use the previous/next arrows at the top of the module tab to move to the previous or next
Event or Well within the VelPAK model.
Select type of display to view on your Optimize window. It is likely you would want Contour or
Shaded Contours shown at all times while you are working in the Optimize module.
Basic
Will display just the fit and/or residual lines and Map Types as selected.
Ribbon
Reflects the underlying value of the parameter space along a fit or residual line. Particularly
useful for looking at a Line Fit since they can prove invaluable at spotting outliers whose Fit
appears to be valid (as a line) but the value of each point on the Fit line shows discrepancy.
Contour
Black contours of the Optimize grid will be displayed.
Shaded
Shaded color contours of the Optimize grid will be displayed.
Turn Display element ON or OFF for display on the Optimize view. With or without element
labels.
Note: These visible layers can also be selected to be on or off via the Display General
Property Grid.
Go Here for a full discussion and examples of the Layers Visible Overlays.
Line Fit - The line showing how well the data fits the log data for a fixed Y value (usually k).
Line Residual - The ‘zero-error’ line; the value at which any combination of V0 and K
would produce a zero-error for a well.
Point Fit - The actual lowest point of discrepancy for the fit down the line.
Point Residual - The actual lowest point of residual error for the fit down the line.
Point User - Displays where the User point has been selected.
Surface Node - Displays the Surface Grid nodes over the Optimize display.
Labels - displays the labels of the respective items on display.
Note: For both these options, selection can occur on the well as well as the line on display.
Point Pick - selects the current well as active - and switches the display to this well in the
Well Module.
Point Edit - will allow you to select or deselect wells within the model for use within the
Optimization routine. The wells will become grayed out until you regenerate again.
Note: If you deselect a well you will need to regenerate the Optimize display which can take
some time. Once you have regenerated the display the de-selected wells will cease to
be visible. If you wish to re-select them (and therefore have to re-generate the display
again) you will need to go in to the Data Tab and make the de-selected wells active
again.
Value Edit
In Value Edit mode, using the left mouse button on the optimize display will allow you to
change the position of the User value point on the display. Select Value Edit and then click on
the place on the display where you want your User value point to be. A blob of color will
appear on the display and an Alert style window will pop-up with all the relevant information
for that point.
Tip: If you click a point and do not see a point appear then check in the Display Color tab that
your ‘Layer_User_Color’ is not set to Foreground or the same color as the color contour
over the area.
The alert window will show the values of the v0 and k at the selected User point.
Pressing OK will fix the User value point on the Optimize display to be this value until this
method is executed again.
Note: The User values are the values that will be used in the Depth Conversion if the
Formula is set to come from Optimize_User (as shown below).
Value Move
Selecting the Value Move button over the optimize window at the required place will
activate an impressive set of processes! At a click of the button you will get a grid display
(in the Surface module) of either the X or Y axis - whichever is set to be variable - for all
the (selected) wells in the area. Standardly this would be ‘Fixed k’ and a ‘variable v0’;
allowing you at the click of the middle button to produce and view a “V0 to tie” error map
which would calculate the V0 necessary given the fixed k to tie your wells over the area.
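As an illustration of this ‘V0 to tie’ calculation, assuming the linear instantaneous-velocity law v(z) = V0 + kz (so layer thickness for interval one-way time t is (V0/k)(e^(kt) − 1)); the function name and the formula choice are assumptions, not VelPAK's documented internals:

```python
import math

def v0_to_tie(k, t_interval, dz):
    """V0 that makes the thickness computed from the interval one-way
    time t_interval equal the well thickness dz, for a fixed k.
    Assumes v(z) = V0 + k*z, i.e. dz = (V0/k) * (exp(k*t) - 1)."""
    if abs(k) < 1e-12:
        return dz / t_interval          # constant-velocity limit
    return dz * k / math.expm1(k * t_interval)

# e.g. a 200 m layer crossed in 0.1 s one-way time, with k fixed at 0
print(v0_to_tie(0.0, 0.1, 200.0))       # → 2000.0
```

Mapped over every selected well, such values would form the kind of error grid the middle-button display produces.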
Note: In order that you get the required instantaneous display of the correct values, certain
parts of VelPAK need to be set up or selected before pressing the middle button. This
is documented below.
Values will be output to the three slots in the Model Tree as named in the XYZ tab. (Go
Here for details of what values will be output for each slot.)
2. The XYZ slot selected in the Gridding set-up will be gridded up and the resultant grid
displayed (if the Surface display is set to contour the correct grid.)
Note: It is worth noting that once all these tabs and property grids are set up, if ‘User’ is
selected as XYZ Type then using the Value Edit option will give you an almost-
immediate error grid for the selected User V0 and k, time after time after time with no
further set-up required.
Graph Tab
Provides the 2D Parameter Space for the optimized display, values and fit.
Use the Optimize Display Property Grid to change any display parameters.
Data Tab
Data that is displayed on the Optimize window is displayed numerically in this Tab.
Data can be filtered to your specifications here to be displayed on the graph.
Comment column - note the Comment column which will store any comment you have
regarding individual wells (like for example why you chose to make a well Inactive).
Data can be filtered on any number of columns within the file. Filter criteria are brought up for
the individual columns by clicking on the ‘funnel’ icon at the top of each column. This funnel
icon will change from gray to blue if a filter has been applied on that column:
You can select a number of methods as the filter for individual columns:
All - All elements will be displayed (default).
Custom - User defined selection of Operator and Operand to filter data. See Custom
Filter for details.
Blanks - Will filter out blank values in the field.
NonBlanks - Will filter out non-blank values in the field.
Column Values - Lists all data in the field to select.
Custom Filter
Gives you the ability to build up a list of conditions with which to filter the data.
You can use ‘And’ or ‘Or’ to select the operators used.
Operand - The objects that are manipulated.
Operator - The symbol that represents a specific action.
As the operand you can select another column label from the data file being used as the
qualifier for this filter, or a value from within the column currently being filtered.
For example, to select data only within a certain Block you could set up the filter for the X
Location column to be X values within the block limits (shown below), and similarly for the Y
Location.
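The And/Or conditions described above can be sketched as a small predicate builder (illustrative only; VelPAK's dialog builds the same logic interactively):

```python
import operator

# symbol -> comparison function
OPS = {"=": operator.eq, "<>": operator.ne,
       "<": operator.lt, "<=": operator.le,
       ">": operator.gt, ">=": operator.ge}

def custom_filter(rows, conditions, combine="And"):
    """rows: list of dicts (column name -> value).
    conditions: list of (column, operator symbol, operand) tuples.
    combine: 'And' or 'Or', as in the Custom Filter dialog."""
    join = all if combine == "And" else any
    return [r for r in rows
            if join(OPS[op](r[col], val) for col, op, val in conditions)]

wells = [{"X": 1200.0, "Y": 850.0}, {"X": 4300.0, "Y": 900.0}]
# keep wells whose X Location lies within hypothetical block limits
print(custom_filter(wells, [("X", ">=", 1000.0), ("X", "<=", 2000.0)]))
# → [{'X': 1200.0, 'Y': 850.0}]
```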
The layout of the columns can be changed to suit your requirements by accessing the arrow
to the right of the filter motif. Clicking on it will bring up a list of the columns within the Data
tab. You can then select which column you wish to be placed in this position.
Note: The display generated by Optimize can be re-displayed using the Activate tick of the
Display property grid, so you do not need to re-generate the values every time. All the
displays are saved in the binary model, so pressing the Display Activate after re-
loading the model will re-generate the displays.
When you have made your edits, press the apply button to apply the changes and activate
the option.
Note: The Map Type and Well Type options above are used in conjunction with each other;
Go Here for a discussion of what the combinations of the two selections would
display.
Contour
Contour_Color - Click on the box to the right of the color selection slot to bring up a list of
color tables available to you. The list will bring up all the color tables previously defined in
TKS (as well as some default VelPAK color tables available; Cyan_Red, Grey,
Topographic and Seismic).
This option utilizes stored TKS color tables that are usually stored in the ‘Colorbars’
directory of the TKS installation on your PC. If this directory is moved elsewhere then
VelPAK will still be able to find it.
To create a new color table for use within VelPAK you will need to go into TKS and define
and save it in the standard way; the list within VelPAK will automatically be updated with
the new color table the next time it is brought up.
Contour_Fit_Incr[ement] - choose the increment that best suits the contouring of the
FIT data.
Allows you to select the Range of your display from the ‘Grid’ - [the parameters of generation
of the Optimize window as set up in the Parameters tab of the Generate property grid] - or
leave it as Auto, in which case any spurious value-displays that fall outside the grid area will
also be shown within the window.
When you have made your edits, press the apply button to apply the changes and activate
the option.
Use the down arrow to select from the available inputs.
Option
Axis
XY_Type - Select the XY range either from the entire range as defined for the whole
optimization or just the selected grid.
When you have made your edits, press the apply button to apply the changes and activate
the option.
Use the down arrow to select from the available colors.
Color - All of the Color inputs have the same color options. Select the color for the drop-down
selections.
Once the Optimize values have been generated a crucial part of the procedure is to be able
to display and view exactly what you want to see on your display. This is done using a
combination of the Map_Type and the Well_Type on the display Property Grid. Here we set
out what each selection would display.
Note: In this listing it is presumed that all the Layers Visible are switched on (although not
necessarily the labels). It is also presumed that the Display Type is set to ‘Shaded’
(contours) unless otherwise stated.
The Well Grid Fit grid is displayed as a shaded contour map for the current well. No other Fit
methods are displayed since no ‘Well Type’ has been selected.
The Well Grid Fit grid is displayed as a shaded contour map for the current well. The current
Well Line Fit and the Well Point Fit are displayed as the light blue line. Note that the Well Line
Residual and the Well Point Residual for the current well are also displayed. (These are
displayed because they are selected as ON in the Layers Visible and can therefore easily be
turned off if deemed incongruous on the Fit display.)
The Well Grid Fit grid is displayed as a shaded contour map for the current well. The line fits
and residuals for all (selected) wells are displayed over the contour pattern of the current
well. This would allow you to compare the fits of all the wells with the contour fit pattern of this
one selected well.
Displays the Well Grid Residual contour display for the currently selected well.
Displays the Well Grid Residual contour display for the currently selected well. The current
Well Line Residual and the Well Point Residual are displayed as the dark blue line. The Well
Line Fit and the Well Point Fit for the current well are also displayed. (These are displayed
because they are selected as ON in the Layers Visible and can therefore easily be turned off
if deemed incongruous on the Residual display.)
The Well Grid Residual grid is displayed as a shaded contour map for the current well. The
line fits and residuals for all (selected) wells are displayed over the contour pattern of the
current well. This would allow you to compare the residual of all the wells with the contour fit
pattern of this one selected well.
The average of all the well grid fits will be shown as the Layer_Grid_Fit contour display. The
line and point layer fit and line and point layer residual will also be displayed if switched on
under Layers_Visible even though the actual well selection is set to None.
The average of all the well grid fits will be shown as the Layer_Grid_Fit contour display.
The fit and residual of the current well is also shown as well as the average fit and residual of
the layer’s selected wells.
The average of all the well grid fits will be shown as the Layer_Grid_Fit contour display.
All well fits and residuals will also be displayed.
The average of all the well grid residuals will be shown as the Layer_Grid_Residual contour
display. The line and point layer fit and line and point layer residual will also be displayed if
switched on under Layers_Visible even though the actual well selection is set to None.
The average of all the well grid residuals will be shown as the Layer_Grid_Residual contour
display.
The fit and residual of the current well is also shown as well as the average fit and residual of
the layer’s selected wells.
The average of all the well grid residuals will be shown as the Layer_Grid_Residual contour
display.
All well fits and residuals will also be displayed if switched on under Layers Visible.
Generate Dialog
Note: The display generated by Optimize can be re-displayed using the Activate tick of the
Display property grid, so you do not need to re-generate the values every time. All the
displays are saved in the binary model, so pressing the Display Activate after re-
loading the model will re-generate the displays.
[Figure 1: the grid of calculation points. Values will be calculated at each of these points, and
then contoured. Y axis (V0): 0 to 5000, Levels = 10, Step (Increment) = 500, with example
Levels and Steps marked. X axis (k): 0 (Minimum) to 5 (Maximum), Levels = 16, Step = 0.31225.]
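The Levels and Step values in Figure 1 follow a simple relation, sketched here with the figure's numbers (the levels-to-step relation and axis names are inferred from those numbers):

```python
def axis_values(minimum, maximum, levels):
    """Evenly spaced calculation levels across one axis, Step = range / levels."""
    step = (maximum - minimum) / levels
    return step, [minimum + i * step for i in range(levels + 1)]

step_v0, v0_levels = axis_values(0.0, 5000.0, 10)   # Y axis: Step = 500.0
step_k, k_levels = axis_values(0.0, 5.0, 16)        # X axis: Step = 0.3125
print(step_v0, step_k, len(v0_levels) * len(k_levels))  # → 500.0 0.3125 187
```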
Note: Since the values are being re-calculated, the display generated could be slightly
different from the original one.
[Table 1: Function Parameters (depths in metres); e.g. Constant V0 = 3894.]
The limiting velocities used in the asymptotic methods (V1 in the Hyperbolic tangent function
and Sm in the Asymptotic slowness function) were constrained to realistic values.
The corresponding parameter values when using data with depths measured in feet are
shown in Table 2 below.
[Table 2: Function Parameters (depths in feet); e.g. Constant V0 = 12780.]
Even when real time-depth data are used, it is unlikely that the best fit parameter values will
vary from the values in the tables by more than a factor of 10, although this is by no means
guaranteed. However, these values should normally provide a good starting point for any
optimization.
two intervals, the top one having a higher velocity than the lower one. Ideally, one should
then sub-divide the unit into two units. This is not always welcome news because it
means that one would need to identify the corresponding seismic horizon separating the
two units. The second situation is where the instantaneous velocity is continuously
decreasing downwards at each location. Clearly, in such a case, the unit cannot and
should not be split into sub-units. Almost invariably, in such cases, while the
instantaneous velocity is decreasing downwards at each location, the individual velocity
components increase in value with depth when followed regionally. Hence, as the unit is
followed from a shallow location to a deeper location, the individual components increase
in value and the interval velocity has likewise increased with depth.
There are three alternative ways to model the unit when the gradient is negative, given
below in order of preference (i.e. to produce the best velocity model):
(i) Have a small tolerance limit (e.g. 1ms solution trough) for all locations. Fix k at an
appropriate negative value and produce a map of V0 over the area, guided by
variations in the depth of the unit, (depth of the base of the unit that immediately
overlies it).
(ii) Revert to an interval velocity model, again guided by variations in the depth of the
unit.
(iii) Accept the regional V0, k combination on the positive side of k. This is the most
convenient alternative but is likely to give large residuals. These residuals would
again follow the depth to the top of the unit (in general) and can therefore be removed
accordingly.
Note: The Green, Amber, Red Symbols on the General Generate property grid
are there to aid you in this by setting the inputs to the Minimum (green), Medium
(amber) and Maximum set of values that can be generated. Generating a Green set
of values will generate the quickest values, the Red set will generate more types of
values and thus take longer to complete.
Standardly you would first define a rough estimate of the Parameter Space to give a range of
V0s and Ks to examine. Once the real area of interest becomes apparent from the initial run,
you would change this range and zoom into it. Having defined this real area of interest you
may then request further parameters to generate, since the area over which it has to
calculate is smaller.
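The coarse-then-refined approach can be sketched as a repeated grid search that zooms the parameter space around the best point of each pass (illustrative; `misfit` stands in for VelPAK's fit evaluation):

```python
def grid_search(misfit, v0_range, k_range, levels=10):
    """One pass: evaluate the misfit on a coarse grid, return the best node."""
    best = None
    for i in range(levels + 1):
        v0 = v0_range[0] + i * (v0_range[1] - v0_range[0]) / levels
        for j in range(levels + 1):
            k = k_range[0] + j * (k_range[1] - k_range[0]) / levels
            err = misfit(v0, k)
            if best is None or err < best[0]:
                best = (err, v0, k)
    return best

def refine(misfit, v0_range, k_range, passes=3, shrink=0.25):
    """Zoom the parameter space around the best point of each pass."""
    for _ in range(passes):
        _, v0, k = grid_search(misfit, v0_range, k_range)
        dv = (v0_range[1] - v0_range[0]) * shrink
        dk = (k_range[1] - k_range[0]) * shrink
        v0_range, k_range = (v0 - dv, v0 + dv), (k - dk, k + dk)
    return v0, k

# toy misfit with its minimum at V0 = 2100, k = 0.6
v0, k = refine(lambda v, g: (v - 2100) ** 2 + 1e6 * (g - 0.6) ** 2,
               (0.0, 5000.0), (0.0, 3.0))
print(v0, k)
```

Each pass covers a smaller area, so more detail can be generated for the same run time.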
[Table: one row per parameter, with columns - Parameter to Generate, Display in VelPAK,
Dependency on any other parameter being generated, Speed, Recommended on first pass,
Recommended on second pass, Recommended on third pass.]
When you have made your edits, press the apply button to apply the changes and activate
the option.
Use the down arrow to select from the available inputs.
Note: Remember that you need to set up the depth conversion method (but no input
parameters) you wish to use on the layer from the Surface module depth conversion
property grid set-up before generation of the Optimize values can occur.
Note: Setting these values will automatically change the Display of the optimize window to
the ‘best’ display given the values generated.
Option
Filter - Normal/Fit/ Residual - in Normal mode filtering will take place according to how
the wells have been de-selected in the Data tab; any de-selected wells will not be used. If
however only the Fit or the Residual of a well has been de-selected then using this Filter
will allow only one or the other’s values to be displayed in the Optimize window. This
option is intrinsically linked to the Curve module and vice versa.
Info - Information on the Optimization process will be displayed in the Console window.
Residual - How the error per well will be generated: either as RMS or as an Average.
Samples - The number of curve samples to use. Using 0 means use ALL curves.
Well
All values are re-generated if a well is selected or de-selected. If you have set some
values in the Generate General tab to ‘No’ then the original values of these will be wiped
and replaced with blanks. (It will not keep the values as derived from an earlier
generation).
Go Here for full details of the values that can be calculated.
Well_Grid_Fit - Given the depth conversion parameters entered, a grid is generated for
each well in the model. The generation takes each pair of V0 and K values and works out
the depth value, comparing it to the time-depth curve data for that layer, for that well. The
value plotted is the difference between these two values.
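This generation can be sketched as follows (assuming an RMS difference against the well's time-depth samples and a linear-velocity depth formula; both are illustrative choices, not VelPAK's documented internals):

```python
import math

def well_grid_fit(td_pairs, v0_values, k_values):
    """For each (v0, k) node, compare predicted depths against the well's
    time-depth curve and store the RMS difference (assumed measure)."""
    def depth(v0, k, t):
        # assumed linear-velocity law v(z) = v0 + k*z
        return v0 * t if abs(k) < 1e-12 else (v0 / k) * math.expm1(k * t)
    grid = {}
    for v0 in v0_values:
        for k in k_values:
            errs = [depth(v0, k, t) - z for t, z in td_pairs]
            grid[(v0, k)] = math.sqrt(sum(e * e for e in errs) / len(errs))
    return grid

# toy time-depth curve generated with v0 = 2000, k = 0 (z = 2000 * t)
curve = [(0.1, 200.0), (0.2, 400.0), (0.3, 600.0)]
grid = well_grid_fit(curve, [1500.0, 2000.0, 2500.0], [0.0, 0.5])
print(min(grid, key=grid.get))   # node with the smallest misfit → (2000.0, 0.0)
```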
Well_Grid_Residual - Given the seismic time at the top and base of layer, the last depth
(if a buried layer) and the depth conversion formula set up, the generation calculates the
error using the V0 and K of that point. In essence it does exactly what the Tie option
within VelPAK does.
Well_Line_Fit - This method scans on the Y axis (k) and gives the best fit of the solution
trough for the V0. The Y axis is divided into 10 samples to provide the fixed k values
across the Optimize area. Taking the seed value, the calculations along the sample lines
begin at a point somewhere in the middle of the optimize area. As each point is
generated the routine checks whether its value is lower than the points on either side of it
along the line. The lowest value point along the line is thus found, and it is through
this point that the line fit passes.
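The scan described above can be sketched as follows (`misfit` stands in for the fit evaluation; the sampling counts are illustrative):

```python
def line_fit(misfit, v0_min, v0_max, k_min, k_max, v0_steps=50, k_samples=10):
    """For each fixed k sample, find the V0 whose misfit is lower than both
    neighbours along the line - the trough the Line Fit passes through."""
    line = []
    dk = (k_max - k_min) / (k_samples - 1)
    dv = (v0_max - v0_min) / v0_steps
    for j in range(k_samples):
        k = k_min + j * dk
        v0s = [v0_min + i * dv for i in range(v0_steps + 1)]
        errs = [misfit(v, k) for v in v0s]
        for i in range(1, len(errs) - 1):          # interior local minimum
            if errs[i] <= errs[i - 1] and errs[i] <= errs[i + 1]:
                line.append((v0s[i], k))
                break
    return line

# toy misfit whose trough follows V0 = 2000 - 500 * k
trough = line_fit(lambda v, k: (v - (2000 - 500 * k)) ** 2,
                  0.0, 5000.0, 0.0, 3.0)
print(trough[0], trough[-1])
```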
Well_Line_Residual - Using the given depth conversion formula, a line per well, in the
parameter space (close to the zero contour of the Grid residual) marking the line down
which the error would be zero.
Well_Point_Fit - Using the depth conversion formula, a point per well, in the parameter
space, this is the best fit given the values that produced the Line Fit and the Y (k) value.
Naturally this should always fall on the Line Fit at the very bottom of the contour trough.
Well_Point_Residual - Using the depth conversion formula, a point per well, in the
parameter space, this will be the point that falls on the Well Line residual line, closest to
the Fit point.
Layer
Layer_Line_Fit - Using the depth conversion formula, a line per layer, an average of
all the Line Fits generated for the Selected wells in the model.
Layer_Line_Residual - Using the depth conversion formula, a line per layer, an
average of all the Line Residuals generated for the Selected wells in the model.
Layer_Point_Fit - Using the depth conversion formula, the average of all currently
selected wells, representing the best possible layer surface fit to the time-depth data.
Layer_Point_Residual - Using the depth conversion formula, a point per layer, in the
parameter space, representing the best possible layer surface fit calculated by
searching down the layer line residual.
Layer_Grid_Fit - An average of all the currently selected well values calculated for
each grid node for all the wells in the model.
Layer_Grid_Residual - The average of all currently selected wells in the parameter
space, representing all possible residual values for the time-depth data.
Input Bank 01
Input Bank 02
When you have made your edits, press the apply button to apply the changes and activate
the option.
Use the down arrow to select from the available inputs.
Note: Normally just the X-axis value and the Y-axis Values are set here in the Parameter_01
and Parameter_02 banks; however some depth conversion definitions have a third
and fourth parameter banks to set up. Go Here for details of FIXING a parameter in a
three (or more) parameter function.
you wish your graph to be divided up into and the program will work out the values for
you.
Input v0 - standardly the first input would be the V0 parameter; the parameter along the
X-axis.
01_Increment - The number of divisions between your Min/Max range you wish to act
on. Standardly this value is set to 50 (or 51 to give an even count for the Min/Max of the
grid).
01_Maximum - (note MAXIMUM is set before MINIMUM) - the maximum value of the
initial Optimize display.
01_Minimum - (note MINIMUM is set before MAXIMUM) - the minimum value of the
initial Optimize display.
Note: Once generated the Minimum and Maximum can be refined either by editing this
value here or by using the standard VelPAK Zoom facility on the display.
Input k - standardly the second input would be the k parameter; the parameter along the
Y-axis.
02_Maximum - (note MAXIMUM is set before MINIMUM) - the maximum value of the
initial Optimize display.
02_Minimum - (note MINIMUM is set before MAXIMUM) - the minimum value of the
initial Optimize display.
Note: Once generated the Minimum and Maximum can be refined either by editing this
value here or by using the standard VelPAK.
Further Banks of Inputs; 03_Increment etc. and beyond - Some formulae expect more
inputs than just the two standard V0 and k. An example of a formula that requires more
than two parameters to be set up is the “Hyperbolic Tangent Function”. (Go Here for
details of this formula.) In this case the other Input Banks here would be set up in a
similar way to Input Banks 01 and 02 detailed above.
The mechanism that allows you to do a random hit in the selected parameter space, using
the parameter constraints given.
XYZ Dialog
The XYZ Property grid is used to output XYZ values generated on your display from your
Optimize set-up parameters to be used within other parts of VelPAK. For example the
generation of a ‘V0 to tie’ error grid to be added to your Depth Conversion method.
Type - The ‘Fix X or Fix Y’ values will use either the User, Residual or Fit point displayed
as the value at which to fix the axis.
Output
The slots within the Model Tree where the value generated will be placed.
Errors - Output slot for Errors XYZs (Fixed X and Y).
X_Axis, Y_Axis - Output slots for the Gradient and Intercept.
All these values have specialized default slots within the Model Tree where they will go
if you do not select any other slot. For example, the ‘Depth Conversion Gradient’ slot is
one of the ‘Expanded Slots’ within the Model Tree named just for this output.
Note: All three output Model Tree slots will be filled regardless of which of the three you
have selected. This is to avoid confusion with over-writing; all slots will always store
the values generated for the last XYZ run. The table below shows what values will be
placed in the slots according to what method you have selected.
Fix_Y
Standardly it is the Fix_Y option here that would be used (output) most since this would be
the intercept, usually V0, value given a fixed k for your selected Type (User, Fit or Residual).
Fix_X
Since rocks tend to have a fixed gradient the Fixed_X option would not usually be used.
However, should it be used then it is of the same principle as Fix_Y but with the fixed value
being the X axis.
Many of the VelPAK processes can be turned into Workflows which step-by-step can lay out
the actions required to complete a particular task. Tasks that are cumbersome and/or
repetitive can be completed with a click of the mouse in WorkFlows. They guide you through
each process step-by-step, leaving very little room for error and confusion.
You can load workflows for each depth conversion type, and expert knowledge can be
incorporated and workflows extended. This makes workflows ideal for audit trails and Quality
Assurance.
You can design your own workflows and save them; they can be specific to a company or a
province. Workflows can also be used for tutorials, training, testing and demonstration.
VelPAK is designed to run from XML.
It is sometimes not practical or possible to automate all activities within VelPAK, in which
case Pause options are introduced, allowing you to provide the input required before continuing.
Note: It is recommended that the user become familiar with the workings and concepts of
the Ready Made Workflows as an introduction to the Workflows processes within
VelPAK.
• Older, ‘classic’ Workflows are still active in later versions even though the workflow
system has been updated. They can be found in the ‘Classic Workflows’ directory from
the Workflow fly-out.
• Pre-defined workflows can be used with the simplest user intervention to set them up for
each or any Event Horizon.
• There is a bank of Workflows to select from which can be used to run the major VelPAK
processes and a bank of Components which are drawn in to each workflow which contain
repetitive routines of work such as initializing the grids and displaying particular map
types.
• The user can edit the workflows as they wish. These then default to being saved in the
local project directory. The system workflows will not be written over.
• If there is a locally stored workflow or node stored in the project directory this will always
be the one used in the workflow - not the system parameter set. This allows the user to
change displays and parameters as they wish solely for the project they are in while
retaining the main system’s integrity.
Event Workflow
Event-independent pre-written workflows for a particular VelPAK process, built up of
component workflows. Pause nodes will stop the flow and prompt the user for relevant
information if necessary (with a “Don’t Ask Me Again” check box).
[Diagram: Start → Depth → Pause → Gridding → Mapping → Error → End.]
Component Workflow
A number of Nodes with a Start and End, pre-written for certain repetitive tasks. Built-in
Pause nodes will stop the flow and prompt the user for relevant information if necessary
(with a “Don’t Ask Me Again” check box). A real example of expanding a workflow like this
can be seen here.
[Diagram: Start → Pause → Node → Node → Node → End.]
Node
The Node is the link to the property grid fly outs in the main
VelPAK program - change the values in the fly out from the
Model Tab of the workflow - the changed values will then be
saved under this node.
anticipating that you wish to join this node to another node in the flow. The cursor
changes to the standard ‘finger pointer’.
Select the red dot on the Start and drag the resulting arrow to the Event workflow node.
This should be fairly easy; the arrow line is trying to join with a node at all times so you
should see it snap to the node without having to be too precise.
6. If you have an Event 2 in the project to depth convert you will need to repeat this process
under the Event 2 selector node.
To add more events to the workflow:
7. In the Workflows property grid fly-out open up the Event directory/tree. You will see all the
Event workflows listed.
8. Drag the workflow for the Event you require into the canvas.
9. You will probably need to delete arrows to fit the event workflow and subsequent depth
event workflow into the Master Workflow.
10. Make sure the End node is attached to the final depth conversion method you have put in.
11. Save your workflow! This will default to be saved within the workflow sub-directory of your
project. This is not the ‘User Workflows’ seen in the Workflows property grid fly-out which
is part of the VelPAK system.
[Diagram: the Master Workflow canvas, showing Control Nodes, Nodes and a Component
Workflow joined by arrows.]
Double-clicking on the ‘Optimise Main Residual.xml’ component workflow within this workflow
will expand out the workflow in Scratch 2 tab:
Control Nodes
Node
Node
Node
Node
Component
Workflow
Node
Under this expanded workflow there are more nodes, control nodes and component workflows. Double-clicking on the ‘Optimise Parameter Choice.xml’ component workflow will open it up in the Scratch 3 tab; this is where the Pause nodes that ask the questions the user will have to answer during the main run of the Master Workflow are actually stored:
Note: To be able to see what is going on within VelPAK while you run this workflow it is
recommended that you detach the workflow tab and have it away from the main
VelPAK window.
1. Press the Start button and the program will move through the workflow (the nodes turn red when they are in action). The process moves through the Initialise and Event nodes and pauses on the Depth Event workflow. The Pause comes up asking the question which, as you can see above, is actually sitting in the workflow ‘Optimise Parameter Choice.xml’, which is nested within the workflow ‘Optimise Main Residual.xml’, which is in turn nested within the Event Workflow ‘Optimise V0kz Residual Error Multi.xml’ chosen for this depth conversion.
This is a ‘Branch’ - the decision you choose will change which branch the flow goes down, as seen in the screenshot above.
2. Say Yes to the question and press OK. Almost immediately the next pause will come up;
this is also shown in the screenshot above as the text in the left branch pause.
The program has attempted to work out the Optimize Ranges for this model. To the right
you can see the model set up; if these look acceptable press ‘OK’.
Note: The ‘Do not ask again’ check box - check this if you do not want this pause to pop up next time you run the workflow on this project.
3. The next pause to come up is another Branch to select the gridding method you want to
use for the error.
Select Global as your method. If you were to answer Kriging a number of very different
pause-questions would need to be answered.
4. There then follows a number of branches and pauses as the program displays the error grids, depth maps, velocity maps, etc.
Although on the Master Workflow the Optimise V0kz Residual Error Multi.xml is still showing red, indicating it is the workflow still in use, the program has moved on from the nested workflows discussed above and is now accessing the workflow component ‘Display Maps Multi.xml’ which is nested under the ‘Error Multi.xml’ further on in the flow of the ‘Optimise V0kz Residual Error Multi.xml’.
Project workflows - Primary workflows pertaining to the project as a whole can be stored in the project workflow area and accessed via the ‘Open Workflow’ icon on the workflow page. New workflows (or amended pre-defined workflows) can be saved in the project directory using the standard ‘Save Workflow’ icon on the workflow screen, which will open automatically in the project workflow area.
Edited ‘Cached’ workflows - Any component, user or pre-defined workflows that are loaded into the project and edited in some way are automatically stored in the ‘cache’ of the project. These are sub-directories of the project’s workflow area, stored per event in the form:
C:\Software\VelPAK\galleon\workflow\01
C:\Software\VelPAK\galleon\workflow\02
C:\Software\VelPAK\galleon\workflow\03 ...etc
This local variation of the supplied workflows will be named the same as the supplied
workflows and will be the default workflow used in all workflow runs from that project. In
order for the supplied System workflow of the same name to be used again the cached
workflow will need to be cleared (using the Clear Run Cache clear option from the top line of
the Workflow screen) or renamed.
Note: A User can also store a workflow anywhere on the system and load it as required
using the standard File Selector.
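The precedence rule above - an edited ‘cached’ copy in the project’s workflow area is used in preference to the supplied System workflow of the same name - can be sketched as follows. This is an illustrative sketch only: the function and exact directory layout are assumptions based on the description above, not VelPAK’s actual implementation.

```python
import os

def resolve_workflow(name, event, project_dir, system_dir):
    """Return the path of the workflow file a run would use.

    An edited ('cached') copy, stored per event under the project's
    workflow area, takes precedence over the supplied System workflow
    of the same name until the cache is cleared or the file renamed.
    (Hypothetical helper - directory layout inferred from the manual.)
    """
    cached = os.path.join(project_dir, "workflow", "%02d" % event, name)
    if os.path.exists(cached):
        return cached                         # the cached local copy wins
    return os.path.join(system_dir, name)     # fall back to the System workflow
```

Clearing the run cache, in this picture, simply deletes the per-event copy so the System path is returned again.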
Workflow
The main area where you can see your workflow; set it up and run it.
Process Node building blocks are dragged across to this window and set up in the order the workflow is to process them.
The size of the workflow can be changed using the Tool bar options at the top of the Workflow
window.
Scratch1,2,3
Three areas that can be used to set up rough preliminary workflows or parts of a workflow.
The scratch areas work in exactly the same way as the main View area; data can be saved,
run, edited etc.
Note: Saving your workflow will only save either the Scratch window or the Workflow
window that is currently on display. If you wish to save all the windows you will need to
save into four separately named files.
If you are looking at a component workflow and you wish to expand it then double-clicking on
it will expand the workflow in the next tab. However, note that a workflow expanded when in
the final Scratch area ‘Scratch 3’ will be expanded within that tab.
Workflows and Scratch areas can be brought out from their default tabbed display and all
presented in the workflow module window as single panels, as shown below.
Top Toolbar
New Workflow - New will clear the View window. Make sure you have saved your
previous workflows before pressing New (Note that Undo will restore it if cleared in error).
Open Workflow - Will bring up the standard file selectors to open your workflow.
Workflows always have the extension ‘.xml’.
Note: The name of the default ‘.xml’ workflow file is stored within the Properties of the model
for reference purposes.
Save Workflow - Will bring up the standard file selectors to save your workflow. Workflows always have the extension ‘.xml’. If the file name already exists, an alert will come up asking for confirmation of the action.
Note: Saving your workflow will only save either your Scratch or your View window in one
saved file depending on which of the two tabs is on display. If you wish to save both
you will need to save into two files.
Pause -
Turning Pause ‘ON’ and then pressing Run will make the workflow pause after every
process node for clarification. This can be useful when setting a workflow up. (You can also
enter a Pause node from the Node property grid into the flow for a permanent ‘hard’
pause in the run if required.)
If pressed during a workflow in action it will cause the workflow to pause at the next
opportunity.
Stop - Stops execution of the workflow.
Ignore Pause Nodes - Allows the user to ignore pause nodes placed in the workflow.
Watch Sublevel Execution - Will expand the sub-level workflows while running the
workflow. A useful tool to see what is going on during the workflow. Each component and
node will be highlighted as the workflow runs.
Clear Run Cache - Clears any locally stored workflows that have been created or
modified and stored in the ‘cache’ directory under the project you are working in (under
the event they have been created for). The option gives you the choice of clearing the
cache from all events or just the current event.
Edit - Cut, Copy, Paste, Delete, Select All - Allows the Edit of all elements of the Node.
To select the item to edit click on it with the mouse; alternatively use the Select All to
highlight all items in the workflow. Using the mouse button to drag over an area will allow
selection of those elements within the drag window. Use the Control or Shift Key to select
specific items.
Undo - To Undo an action. Clicking more than once will undo the previous actions too.
The undo pertains to the set up of the workflow display and does not work on internal
changes made to the Node.
Redo - Will redo an action you previously undid. Continual pressing of this option will redo
all previous actions you undid until you have reached the present state at which time
pressing it will do nothing further.
Zoom In, Out, Size to Fit, Normal - Zoom In and Out will do what it states; Size to Fit will
take the whole workflow display and fit it in the View window at the size the window is.
Normal will take it back to the default size of text and nodes boxes.
Align Left, Middle, Right, Top, Center, Bottom - Allows you to align the selected node
boxes amongst themselves in the alignment mode selected. Joining flow lines will not
align, but they will of course move according to where their attached nodes have moved
to.
Same Width, Height, Size - Select a start node and one or more further nodes to make them the same width, height or overall size as the start node. The start node will be selected with a green highlight; other nodes will be light blue.
Hide Arrows -
Turning this ON will hide the arrows of the joining flow lines.
Allow Resize -
Turning this ON will allow the node boxes to be resized.
Allow Edit -
Turning this ON will allow the text in the node boxes to be edited. You can enter as much
text in the node box as required. The box will resize if there is a large amount of text
within the box, regardless of whether the ‘Allow Resize’ option is turned on or off.
Note: Care should be taken not to clutter this area up since it is the system area accessible
by all users and all projects.
These workflows are continually updated as well as added to by the user, so the Workflow Property Grid may not look exactly as it does above; however, here is a run-through of the main top level directories.
When a workflow node/process/component etc. is dragged onto the canvas you will see that they appear in different colors or react differently. This is so you can tell what type of node/process/component it is within the workflow.
For full details of all the types of workflows available from within the Workflows Property Grid
fly-out Go Here.
The Nodes tab holds building blocks for each individual VelPAK process that can be added to
the workflow.
Drag the nodes from the Nodes Property Grid into the View window to build up your workflow.
Set-up of the nodes is done via the Model tab.
Within the Nodes Property Grid you will see Workflow Control nodes, followed by Selector
nodes followed by Module nodes:
Workflow Control Nodes - These are nodes that are used within the running of the
workflows; the Start, End and Pause nodes and Comment/Balloon options, the Execute,
Branch, Loop and Surface nodes. They tend to control the flow of the workflow process
(although not always).
Note: Further description of what each of the Workflow Control Nodes does, along with examples, can be found here.
Comment / Balloon - Allows you to add comments to the workflow. Use Allow Edit from the top toolbar to edit the comments.
Start /End - Start the flow with the Start node and end the flow with the End node.
Pause - If you wish to pause the workflow you can add a Pause node into the flow. An
alert will come up and the workflow will only start again once you have selected ‘Y’ to start
it running again. (You can also run the whole workflow in Pause mode from the toolbar at
the top of the workflow window. This would allow you to pause after each node but not
have to edit the workflow to have Pause nodes within it). The Pause that comes up will
have two areas on display: to the left it shows you the Pause comment (if any) and the ability to turn the Pause off; to the right it shows the set-up of the property grid of the node that is following the pause. The user can then use this pause in the workflow to edit
this property grid - for example the contour increment.
This is described more thoroughly in the Workflow Control Nodes section here.
Execute - is used when you are working with pre-defined and nested workflows. It is the
name of the ‘sub-routine’ nested workflow that is to be run within another top-level
workflow.
For example, you would make up a workflow you were happy with, save and name it
‘Example1.xml’. When you want to run this workflow within another workflow you would
bring in the Execute node and rename its display name from ‘Execute’ to ‘Example1.xml’.
When you are running this top level workflow and it reaches the Execute node named
‘Example1.xml’ the workflow routine will go and run all the processes stored within the
workflow ‘Example1.xml’ before returning back to the top-level workflow.
Note: All the pre-defined workflows already stored within VelPAK have their name.xml
shown in the Workflows property grid. These nodes are all Execute nodes that
have been re-named to the name of the processes they will do.
Branch - Allows the user to branch the workflow to have a choice of up to three
different paths to take. There is also the option to ignore the branch option question
and always select a particular branch if required - set up in the Parameters tab. This is
described more thoroughly - with examples - in the Workflow Control Nodes section,
here.
Loop - Allows the user to loop the workflow process for as many times as specified in
the Parameters tab. Used in re-iteration processes or for creating a ‘movie’ of a
display feature that is changing as the program loops. The number of iterations to be
used is set up in the Parameters tab. This is described more thoroughly - with
examples - in the Workflow Control Nodes section, here.
Pin - A ‘cosmetic’ node, which can be used to make the workflow look neater - as its
name suggests it can pin the flow lines anywhere within the display. A Pin node can
have up to three input and one output flow lines with any of the four edge connectors
being used for the output flow arrow; thus it can bring together three branches back
into one flow line to continue with the workflow with a neater looking display. This is
described more thoroughly - with examples - in the Workflow Control Nodes section,
here.
Check - Acts to check whether the run you are doing is part of a multiple realization
loop or a single run. It is a three-pronged automatic conditional branch, dependent on the model number of the run it is executing. If it is a single run then
there is no model number involved and the check button will run through the workflow
in standard fashion. If a model number is detected (be that zero for the first run or a
positive value for subsequent runs) it will go through the run and loop back to run
again in multiple realization mode. This is described more thoroughly - with examples
- in the Workflow Control Nodes section, here.
Seed - used in multiple realizations; this resets the seed to get the same random
number starting point for each multiple flow. Otherwise you would not be able to
recreate the exact same run if the run started from a different random number every
time. This is described more thoroughly - with examples - in the Workflow Control
Nodes section, here.
Layout - Loads the currently saved layout. This is described more thoroughly - with
examples - in the Workflow Control Nodes section, here.
Image - will record the current image seen on the screen as a ‘*.png’ file and store it in
the model directory under an ‘image’ directory. Under the image directory they will be
stored in directories labelled per event. The name of the image captured will be the
same as the name of the image node (which therefore would default to image.png).
This is described more thoroughly - with examples - in the Workflow Control Nodes
section, here.
Repaint - Refreshes the graphics during a run or turns them on or off. This is
described more thoroughly - with examples - in the Workflow Control Nodes section,
here.
Selector Nodes - Allows you to select the various data slots you want to be currently active in the workflow; these relate to the Surface&Slot and Preferences selectors within VelPAK.
In a standard workflow you would put one or both of the Selector process nodes in as one of the first nodes, since they allow you to select the various data slots and surfaces you want to be currently active in the workflow. It is also quite likely you would add one or both of the Selector process nodes many times within a workflow as you change event horizon or selected surface. For full details of the
Selector Go Here.
Modules - All VelPAK property grids are laid out under the respective Module names.
Dragging the relevant node into the workflow or scratch canvas areas will allow you to set
up the entries within the grid via the Model tab.
Cut/Copy/Paste/Delete - these options do exactly the same as on the top toolbar. They
allow the Edit of all elements of the Node.
Toggle Resize/Edit - will toggle one of the two buttons on the top toolbar to allow resize
or edit of the node. The icons on the top toolbar will go orange when these modes are
on.
Resize will allow the node boxes to be resized.
Edit will allow the text in the node boxes to be edited. You can enter as much text in
the node box as required. The box will resize if there is a large amount of text within
the box, regardless of whether the ‘Allow Resize’ option is turned on or off.
Properties - Will bring up the Parameter Property Grid:
This is exactly the same as the Parameters Property Grid which is discussed here.
Model
The Model Tab will be blank on initial entry in the Workflow module.
Once you have dragged a process node into the View window and double-clicked on the
node the Model Tab will display the details corresponding to the Property Grid for that node.
Note: The Model Tab is showing exactly what is set up in the relevant Property Grid within
VelPAK.
Full details of the different property grids are found in the relevant parts of the VelPAK
manual.
Parameters
The Parameter Tab contains the information stored in the selected node.
Note: This information also comes up from clicking on a node within the canvas using the right hand mouse button, as discussed here.
Text - This box shows you what text is currently in the selected node. The text can be
changed in this option, or by editing the node on the canvas itself.
Type - Tells you what type of node it is; unless you have renamed the text line above
these will both say the same. However, it is likely that when developing a work flow you
will want to rename them to tell you what the node is in the flow to do - in which case the
Type input here allows you to see what the original name of the node was and therefore
the type of node it is. This is also useful in looking at ready-made workflows.
Color - Changes the color of the selected node and its Text.
Execute - Yes/No - (default yes) - allows you to switch off the execution of the node.
Certain options in VelPAK require setting up of more than one property grid before the
option can be successfully activated; for example the Gridding routines where you would
need to set up items in the ‘Range’ tab as well as the ‘General’ tab before activation. In
this case you would set the first node to set up as ‘Execute - No’ allowing the workflow to
set up that node information but not execute the Gridding generation until the next node
has been set up as required. The second of these nodes in this example would remain
with the default ‘Execute - Yes’ and the gridding would therefore be activated on the
second node.
A node can be seen to be set to ‘No’ at a glance within a workflow when there is a small
red square in the top right corner of the node as seen below.
Branch - Choice - Ask, A, B, C - The default set up for the Branch node is ‘Ask’ which
means that when the workflow comes across a Branch node it will ‘Ask’ which branch you
wish to go down. If you set the Branch node to ‘Do not ask me again’ while you select
Branch A, B or C then the option here in the Parameters will become set to A, B or C. The
Branch node will then not ask you which branch you wish to proceed down. The setting
can be changed back to ‘Ask’ here or set to A, B or C before running the workflow so that
it will not pause during the run on the Branch Node.
Ask Yes - Resets the ‘Do not ask again’ option on the branch to be unchecked to force the
branch to ask again.
Loop - How many times the Loop node option is to be run - set to the number of times
you wish it to loop.
Value - What this is depends on what the node is that the Parameters fly out is showing.
For example the Repaint and Check nodes would have a value here of zero or one
depending on whether it was to be ‘on’ or ‘off’. Other nodes would not use this value input
at all. Value details are documented against the relevant inputs in the Workflow Control Nodes section, here.
Note: Care should be taken not to clutter this area up since it is the system area accessible
by all users and all projects.
These workflows are continually updated as well as added to by the user, so the Workflow Property Grid may not look exactly as it does above; however, here is a run-through of the main top level directories.
When a workflow node/process/component etc. is dragged onto the canvas you will see that they appear in different colors or react differently. This is so you can tell what type of node/process/component it is within the workflow.
Depth
All the depth conversion processes are stored under this area. Note they are blue workflow
processes.
When you have selected the depth conversion process you want to run, you would usually
add this to the Simple workflow found under the Example directory. A step-by-step guide of
what to do is documented here.
A full list of these is described here.
Event
The Event sub-directory lists simple workflows that will change the event to the one numbered. These are intended to be used as ‘component’ workflows within a top level workflow to move to the next event to be processed.
Classic Workflows
Contains classic workflows and component workflows from previous releases.
User Workflows
An area for users to store their own workflows (blank on installation).
Components are pre-defined workflows which contain repetitive routines of work such as
initializing the grids and displaying particular map types. They are intended to be nested
within a top level workflow to run required but repetitive routines within it.
Note that component workflows are green in color. Double-clicking on the node will open up the component workflow in a scratch tab. There may well be further nested workflows within this one component.
Utility workflows allow for Grids to be saved externally for use in the Analyse module
Statistics tab.
Drag
Holding your cursor over the Start node will give you a red dot where the programme is
anticipating that you wish to join this node to another node in the flow. The cursor changes to
the standard ‘finger pointer’.
Select the red dot on the Start and drag the resulting arrow to the End node. This should be
fairly easy; the arrow line is trying to join with a node at all times so you should see it snap to
the End node without having to be too precise.
Once you have started the workflow add various processes between the Start and the End
node. You will need to delete the line joining the Start and End nodes as shown above, by
clicking on it and deleting using the right hand mouse button, the red delete cross on the top
toolbar or the keyboard delete key.
Double clicking on the node will allow you to activate the node without necessarily having to
join the nodes up to make a flow until the end when you are happy you have produced the
workflow you want.
Clicking with the right hand mouse button on a node in the canvas will bring up a menu with
certain edit options for the node and its properties, allowing you to change them as you wish.
Pause
If you wish to pause the workflow you can add a Pause node into the flow. An alert will come
up and the workflow will only start again once you have selected ‘Y’ to start it running again.
You can also run the whole workflow in Pause mode from the toolbar at the top of the
workflow window. This would allow you to pause after each node but not have to edit the
workflow to have Pause nodes within it.
The Advanced use of the Pause button is that it allows you to see and if necessary edit the
node that follows the pause. The user can then use this pause in the workflow to edit this
property grid - for example the contour increment.
Note: Any changes you make to the Property Grid linked to the pause node will be
automatically saved as an edited ‘cached’ copy of this workflow. This locally stored
workflow will be the workflow used by default when called in to the program again.
Execute
Execute - is used when you are working with pre-defined and nested workflows. It is the
name of the ‘sub-routine’ nested workflow that is to be run within another top-level workflow.
For example, you would make up a workflow you were happy with, save and name it
‘Example1.xml’. When you want to run this workflow within another workflow you would bring
in the Execute node and rename its display name from ‘Execute’ to ‘Example1.xml’. When
you are running this top level workflow and it reaches the Execute node named
‘Example1.xml’ the workflow routine will go and run all the processes stored within the
workflow ‘Example1.xml’ before returning back to the top-level workflow.
Note: All the pre-defined workflows already stored within VelPAK have their name.xml
shown in the Workflows property grid. These nodes are all Execute nodes that
have been re-named to the name of the processes they will do.
Note: This is not to be confused with the ‘Execute - Yes/No’ option in the Parameters tab.
Go Here for details of this.
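The Execute behaviour described above - descend into the named nested workflow, run everything in it, then return to the top-level flow - amounts to a simple recursion. The following sketch uses an invented list-of-nodes representation purely for illustration; it is not how VelPAK stores or runs its ‘.xml’ workflows.

```python
def run_workflow(workflow, library, log):
    """Run a workflow given as a list of nodes.

    An ordinary process node (a string) is 'executed' by logging its
    name. An Execute node - here a ('execute', name) tuple renamed to
    the nested workflow's file name - runs the named workflow from
    `library` before returning to the top-level flow.
    """
    for node in workflow:
        if isinstance(node, tuple) and node[0] == "execute":
            run_workflow(library[node[1]], library, log)  # descend into the nested workflow
        else:
            log.append(node)  # an ordinary process node
    return log

# Hypothetical example: 'Example1.xml' saved earlier, then nested.
library = {"Example1.xml": ["grid", "contour"]}
top = ["start", ("execute", "Example1.xml"), "end"]
# run order: start, grid, contour, end
```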
Branch
Branch - Allows the user to branch the workflow to have a choice of up to three different
paths to take. A ‘Do not ask again’ check box allows a particular branch to be selected
permanently if required.
The default set up for the Branch node is ‘Ask’ which means that when the workflow comes
across a Branch node it will ‘Ask’ which branch you wish to go down. If you set the Branch
node to ‘Do not ask me again’ while you select Branch A, B or C then the option here in the
Parameters will become set to A, B or C. The Branch node will then not ask you which branch
you wish to proceed down. The setting can be changed back to ‘Ask’ here or set to A, B or C
before running the workflow so that it will not pause during the run on the Branch Node.
Note: The ‘Do not ask again’ option once unchecked can be changed back to ‘Ask’ in the
Parameters tab.
Having made the choice the program can then merge and go back to running the same
nodes.
Carriage Returns in between each branch item will give you the correct set-up; no Carriage Returns will give you just the one (incorrect) input.
Note: You do not need to precede the choices with a), b) or c) or 1), 2), 3) etc.
Entering only two items will give only two branch options.
The branch to the left is always the first choice in the list, the branch to the bottom will always
be the second choice in the list and the branch to the right will always be the third choice in
the list (if used).
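The rules above - one branch option per carriage-return separated line, up to three options, mapped to the left, bottom and right branches in list order - can be illustrated with a small hypothetical parser (the function name and representation are invented for this sketch):

```python
def branch_options(text):
    """Map a Branch node's text onto its branches.

    Options are separated by carriage returns; the first item is the
    left branch, the second the bottom branch and the third (if used)
    the right branch. With no carriage returns there is just one input.
    """
    items = [line.strip() for line in text.splitlines() if line.strip()]
    return dict(zip(["left", "bottom", "right"], items[:3]))
```

Two items therefore give only two branches, and a single unbroken line gives one.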
Loop
Loop - Allows the user to loop the workflow process for as many times as specified. Used in
re-iteration processes or for creating a ‘movie’ of a display feature that is changing as the
program loops. The number of iterations to be used is set up in the Parameters tab.
Unlike most other nodes, the loop node can be joined from or to another node on all four
sides of the node; however each of the four join-points is specific in what it does:
Top Join Point - The prior processes in the workflow must be joined to the loop from the
top point
Bottom Join Point - The process to be looped must be joined to the loop node from the
bottom point
Left Join Point - The process must be returned to the loop node for a further iteration at
the left join point
Right Join Point - When all iterations have completed the ‘End’ or further individual
processes must be joined to the Loop node from the Right join point.
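In effect, the Loop node repeats the attached process the number of times set in the Parameters tab before the flow continues. The sketch below is illustrative only; `process` stands for whatever node is attached to the bottom join point.

```python
def run_loop(process, iterations):
    """Run `process` the requested number of times, as a Loop node
    set to `iterations` in the Parameters tab would."""
    results = []
    for i in range(iterations):     # bottom join: the process to be looped
        results.append(process(i))  # left join: return for a further iteration
    return results                  # right join: continue with the workflow
```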
Pin
Pin - A ‘cosmetic’ node, which can be used to make the workflow look neater - as its name
suggests it can pin the flow lines anywhere within the display. A Pin node can have up to
three input and one output flow lines with any of the four edge connectors being used for the
output flow arrow; thus it can bring together three branches back into one flow line to
continue with the workflow with a neater looking display.
It is a useful way of adding a comment within the workflow itself rather than adding a
comment to the side. Note that the comment entered in a pin node will not be displayed when
the workflow is running, unlike a comment put in a Pause node.
Check
Check - Acts to check whether the run you are doing is part of a multiple realization loop or a
single run. It is a three-pronged automatic conditional branch, dependent on the
model number of the run it is executing. If it is a single run then there is no model number
involved and the check button will run through the workflow in standard fashion. If a model
number is detected (be that zero for the first run or a positive value for subsequent runs) it will
go through the run and loop back to run again in multiple realization mode.
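The Check node’s decision reduces to a conditional on whether a model number is present (an illustrative sketch; the return labels are invented for this example):

```python
def check(model_number):
    """Decide how the run proceeds at a Check node.

    No model number -> a single run: continue through the workflow
    in standard fashion. A model number (0 for the first run, a
    positive value for subsequent runs) -> run and loop back for
    the next realization.
    """
    if model_number is None:
        return "single-run"            # standard pass through the workflow
    return "multiple-realization"      # loop back to run again
```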
Seed
Seed - used in multiple realizations; placing the seed node in a flow as shown below will reset
the seed to get the same random number starting point for each multiple flow. Otherwise you
would not be able to recreate the exact same run if the run started from a different random
number every time.
Use a workflow that has the seed node placed within it in conjunction with the Model
Realisation Selector to move to the model you want to within the run.
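The reasoning behind the Seed node can be demonstrated with Python’s standard generator - VelPAK’s own random number generator is internal, so this is only an analogy: resetting the seed makes every flow start from the same random number sequence, so a run can be recreated exactly.

```python
import random

def realization(seed):
    """Reset the generator to a fixed seed - as the Seed node does -
    then draw the 'random' values for this flow."""
    random.seed(seed)
    return [random.random() for _ in range(3)]

# Two runs with the same seed give identical sequences, so the
# exact same run can be recreated; different seeds diverge.
```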
Layout
Layout - Loads the currently saved layout.
Image
Image - will record the current image seen on the screen as a ‘*.png’ file and store it in the
model directory under an ‘image’ directory. Under the image directory the images will be
stored in directories labelled per event. The name of the image captured will be the same as
the name of the image node (which therefore would default to image.png).
If this is run in multiple realization mode then the model run number will also be part of the image name. The name of the image will contain the model number along with the name of the
image node; for example Depth000.jpg, Depth001.jpg, Depth002.jpg etc.
Note that the images will be overwritten if another run with the same node name and run
model number under the same event is run. In workflows that have already been set to
produce images, (and found under the component workflows) a branch question precedes
the image generation node to warn the user.
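A hypothetical sketch of the naming rule described above - node name, plus a zero-padded model number in multiple realization mode, stored per event under an ‘image’ directory. The zero-padded event directory name here is an assumption modelled on the cache directories (01, 02, ...) shown earlier, not a documented detail.

```python
import os

def image_path(model_dir, event, node_name, model_number=None, ext="png"):
    """Build the path an Image node would write to: an 'image'
    directory under the model directory, a sub-directory per event,
    and the node name - with the model run number appended
    (zero-padded to three digits) in multiple realization mode.
    Note: this overwriting-prone path depends only on node name,
    event and model number, which is why images from a repeat run
    replace the earlier ones."""
    name = node_name if model_number is None else "%s%03d" % (node_name, model_number)
    return os.path.join(model_dir, "image", "%02d" % event, "%s.%s" % (name, ext))
```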
Repaint
Repaint - Refreshes the graphics during a run or turns them on or off.
Batch Execution
When running a large number of workflows for stochastic modelling it is possible to run the
workflows in batch mode accessing as many processors on the PC as you want and are
available.
Select the Batch Execution using Multi Processors option.
Loop Start / Loop Stop - In the same way that the loop parameter and node can be set
within a workflow, this will set the workflow process to loop for as many times as
specified. Setting it here will override whatever value for loop is set in the workflow.
Number of Tasks - Taken from the number of processors the PC has, this would usually be left as the default, allowing all CPUs to be used during the batch execution.
Abort - Allows you to Abort the process at any time.
This can be used, as an advanced option, to run the batch on a number of different computers:
• Set up the Number of Tasks - the CPUs to be used
• Press the Abort button once the 12 Tasks have started
• Pass the batch files to the other computers (since they are ‘simple’ batch files)
• When they have finished, using the Collate button will collate these task batches
Collate - this is an advanced user option as mentioned above. The process will take a
while and the progress of it will be shown in the console window.
Note that running the batch process on one computer (over a number of CPUs) will
automatically collate the runs on finishing.
The task manager Performance tab will show all CPUs in use.
As the processors finish their allocated loops the bar will turn green.
In the directory of the VelPAK project there will now be a new sub-directory called ‘batch’.
Within this sub-directory will be further sub-directories containing all the information that is
generated during a workflow looping run for each of the tasks run.
There are also batch files (*.bat) stored in the batch directory which have been used to run
the processes. For advanced users, these would allow you to re-run each task individually if
required. They can also be put onto another PC entirely and re-run, provided VelPAK is
installed on that PC too. (Advanced use only!)
When the process is finished all the information stored in the ‘batch’ directory sub-directories
will be copied across to the relevant directories of the project.
Memory Allocation
Be aware that if your PC does not have enough RAM, running this many processes may
cause it to run out of RAM and move into swap space, which will slow the processing
down considerably.
For every run, VelPAK reports at the bottom right of the screen how much memory one
model takes to run and store its data.
In the example above running 8 models at once using the 8 CPUs available on this PC would
use up approximately 3.2GB of RAM. Therefore it needs to be run on a PC with at least 4GB
of RAM (to allow for other processes running on the PC at the same time).
(You can check the amount of RAM on your Windows 7 PC by looking at the System
information from the Control Panel.)
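The arithmetic behind this estimate can be sketched as follows (the per-model memory figure is illustrative, standing in for whatever VelPAK reports at the bottom right of the screen):

```python
# Hypothetical figures: memory per model as reported by VelPAK, and the
# number of parallel tasks (usually one per CPU).
per_model_mb = 410        # e.g. VelPAK reports ~410 MB per model
tasks = 8                 # one task per CPU, as in the 8-CPU example

# Total RAM needed to run all tasks at once, in GB
needed_gb = per_model_mb * tasks / 1024
print(f"Approximate RAM needed: {needed_gb:.1f} GB")   # → 3.2 GB
```

Add headroom for the operating system and other processes when choosing the PC, as noted above.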
There are currently 52 depth Event Workflows available from the Workflows Property Grid fly
out. For each of these, once selected and the run started, the user is prompted to select
the preferred gridding method.
The workflows available are shown below; the checks against each workflow show which
gridding methods you will be asked to choose from during the run.

Method Type | Function Name | Gridding Method: None | Global Function | Loess Smooth | Kriging (Ordinary / Simple / Ext. Drift / SGS) | Residual Type*

Note: If Kriging is selected then a further sub-selection will be required for Ordinary Kriging,
Simple Kriging, Kriging with External Drift or SGS Kriging (Go Here for details of
Kriging.)
User Average X X X X X X
User Mid point depth v IV X X X X X X
User IV v isochron X X X X X X
User Isopach v isochron X X X X X X
User V0KZ X X X X X X
* Residual Type - Choose to correct in the error domain or depth conversion intercept domain
(typically V0K but in the case of Faust this is ‘M’).
Method | Discussion | Data Requirements within VelPAK

Single Well Time Depth Curve.xml | Production of curve using a single well | User must select well to use in the Well Module
Stacking Velocity

Method | Discussion | Data Requirements within VelPAK

Stack Velocity Dix Smooth Calib.xml | Extraction of Dix interval velocity, calibration to well velocities, smoothing using xxx metres and depth conversion using the smoothed grid | As above but also ‘Well Curves’
Utilities

Method | Discussion | Data Requirements within VelPAK

Test Depth Surface for Tie to Wells.xml | Grids the depth at base of layer and compares this to the depth surface; maps the result, which should be zero at all wells | Tops. Layers Defined. Time grids covering the area of your wells (check that deviated curves do not take the well out of the area defined by the grid). Depth conversion to the layer above already done.
VelPAK can exploit a ‘Solution Trough’ to vary parameters; except for hypothetical cases
that are never encountered in practice, the solution for the parameters of analytic velocity
versus depth functions (e.g. Vz = V0+Kz) is inherently non-unique. This non-uniqueness
means that there is no particular parameter combination that represents the solution. The
average discrepancy between the observed velocity-depth (or time-depth) curve and the
calculated function curve gives a measure of the degree of fit between the two curves. In
the parameter space, within a given degree of fit that may be considered as the margin of
tolerance for the particular problem in hand, every parameter combination provides a
valid solution. The solution trough is defined as the region in the parameter space (e.g.
V0 and K) containing these solutions.
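A minimal sketch of this idea, using synthetic velocity-depth observations and an assumed tolerance (all figures are illustrative; this is not VelPAK's actual algorithm):

```python
import random

random.seed(0)
# Synthetic observed velocity-depth points around a 'true' V(z) = V0 + K*z
depths = [500.0 + i * 125.0 for i in range(21)]          # 500..3000 m
v_obs = [1800.0 + 0.45 * z + random.gauss(0.0, 20.0) for z in depths]

def avg_misfit(v0, k):
    """Average discrepancy between the observations and V(z) = V0 + K*z."""
    return sum(abs(v0 + k * z - v) for z, v in zip(depths, v_obs)) / len(depths)

# Scan the (V0, K) parameter space; every pair whose misfit is within the
# tolerance is a valid solution -- together they form the solution trough.
tolerance = 50.0                                          # m/s, problem-dependent
trough = [(v0, k)
          for v0 in range(1500, 2101, 10)
          for k in [0.30 + 0.005 * i for i in range(61)]
          if avg_misfit(v0, k) <= tolerance]
print(f"{len(trough)} parameter pairs lie within the tolerance")
```

The non-uniqueness shows up directly: many (V0, K) pairs fit the data equally acceptably, not just one.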
Note: Due to the randomization factor of this algorithm, the grid will be different each time
it is generated, and the corresponding contour map can also show dramatic differences.
Definition of Constraints
The constraints put on the multiple realizations depend on which method(s) you have selected
to run within your workflow.
Curve Fitting using Optimization - use the Solution Trough to define a contour
level.
RMS of Residual Errors - use the Solution Trough to define a contour level.
Curve Fitting and RMS Error - use both Solution Troughs to define contour levels.
Confidence of regression - use the Curve Regression plot to define the Confidence
level.
Kriging - use the Variogram to define the Range min and max.
Sequential Gaussian Simulation - use the Variogram to define the Range and Sill,
SGS takes care of the rest.
Spill Point
Note: It is quite likely that the spill point could be different for each realization.
The Spill Point is the structurally lowest point in a hydrocarbon trap that can retain
hydrocarbons. Once a trap has been filled to its spill point, further storage or retention of
hydrocarbons will not occur for lack of reservoir space within that trap. The hydrocarbons spill
or leak out, and they continue to migrate until they are trapped elsewhere.
If the structure is ‘filled to spill’ then the spill point and value will vary for each depth
realization. VelPAK automatically calculates the spill point and the Height Above
surface accordingly.
The Column Height Algorithm works by ‘flooding’ the depth structure to determine all of the
possible accumulations based on pure structural closure.
Phase 1 - VelPAK calculates the initial Height Above Contact surface showing all
possible accumulations.
Phase 2 - VelPAK then enables the user to ‘flood’ any accumulation by using an
existing well or choosing a target or seed point.
Each value is collected in the data tab and an image can be generated if specified in
the workflow.
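The Phase 2 ‘flood’ can be sketched as a simple flood fill from a seed node; this is an illustrative reconstruction of the idea, not VelPAK's actual implementation:

```python
from collections import deque

def flood_accumulation(depth, contact, seed):
    """Collect the grid nodes connected to `seed` that sit above the
    contact (depth < contact): a sketch of 'flooding' one accumulation
    on a depth structure held as a 2D list."""
    rows, cols = len(depth), len(depth[0])
    if depth[seed[0]][seed[1]] >= contact:
        return set()                      # seed lies below the contact
    filled, queue = {seed}, deque([seed])
    while queue:
        i, j = queue.popleft()
        for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            if 0 <= ni < rows and 0 <= nj < cols:
                if (ni, nj) not in filled and depth[ni][nj] < contact:
                    filled.add((ni, nj))
                    queue.append((ni, nj))
    return filled

# A small dome: depths increase away from the crest at node (1, 1)
grid = [[2050, 2030, 2050],
        [2030, 2000, 2030],
        [2050, 2030, 2050]]
print(len(flood_accumulation(grid, contact=2040, seed=(1, 1))))  # → 5
```

Only the cells connected to the seed and shallower than the contact are ‘flooded’, which is how a seed confines the calculation to a single accumulation.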
For a multi-realisation of a number of runs of VelPAK with different parameters, spill points
and crests can occur many times at the exact same XYZ node. This facility takes the XYZ
locations of the spill points from the Data tab and counts how many fall on the same XYZ.
It is this count that is plotted on the maps if the correct XYZ data file is selected.
In the picture above the spill point of the P50 run of a realisation is displayed; with XYZ
‘Height Above: Analyse Spill Location Values’ turned on, it can be seen that 564 other runs of
the 1000 realisations also had their spill point at that node.
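The counting step can be sketched as follows (the spill-point coordinates are made up for illustration):

```python
from collections import Counter

# Hypothetical spill-point locations collected from the Data tab,
# one (X, Y, Z) tuple per realization run.
spill_points = [
    (451200.0, 6712800.0, 2431.5),
    (451200.0, 6712800.0, 2431.5),
    (452350.0, 6713100.0, 2440.0),
    (451200.0, 6712800.0, 2431.5),
]

# Count how many runs spilled at the exact same XYZ node -- this count
# is what gets plotted when the spill-location XYZ file is selected.
counts = Counter(spill_points)
for xyz, n in counts.most_common():
    print(xyz, n)
```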
For each run data is collected in the data tab. Any column can be graphed in histogram or
scatter mode.
Volumetric Calculation
Once the ‘Height Above’ and ‘Spill Point’ calculation has produced the correct surface,
volumetrics can be easily calculated.
Inputs – any valid combination of
Top reservoir,
Base reservoir,
Contact as surface, constant or Automatic Spill,
Multiple or individual Polygons,
Seed Point to ‘flood’.
Outputs
Cubic Metres, Cubic Feet, Acre Feet etc.
Depth conversion is generally responsible for the bulk of the uncertainty in the estimation of
field reserves. The standard VelPAK depth conversion routines [and associated workflows]
produce accurate velocity models very quickly, but it is important to derive an estimate of the
uncertainty in the depth horizons of interest. However, running a manual sensitivity analysis
is tedious and prone to error and only a limited number of test cases can be run.
The Analyze tool enables the user to develop a reasonable estimate of depth uncertainty
based on a large number of depth models derived from one or more depth conversion
schemes, and allows the user to identify the key parameters controlling the range of Gross
Rock Volume.
An 'absolute' answer to the depth conversion problem is impossible; the Analyze Module
gives confidence as to the likely range of uncertainty in the model, and the deterministic
parameters which will yield credible P50 (most likely) and end-point (P10, P90) solutions.
The Analyze module will:
1. Record the depth conversion scheme in a summary table,
2. Record individual models and well results for each iteration,
3. Allow graphing of the results,
4. Allow capture of the recorded images,
5. Run volumetric calculations.
Typically the process is run as a looped depth conversion process in the workflow system,
which will generate multiple depth conversion models as shown in the “Simple Loop.xml”
example below.
The Analyze Module then allows filtering and analysis of the results.
[Figure: wells A–E, each with associated well info]
New - will clear the data on display in the Analyze tab. A default set up will have been
saved in the model directory (under the relevant event).
Open - opens a ‘*.ana’ file saved in the model directory (under the relevant event). A
default .ana file is saved here when Analyze is first run.
Save - if you make changes to the way the data table is filtered you can save your
new data display as a new ‘*.ana’ file.
Save as Excel - will save the data in Excel’s .xlsx format (Office 2007 and later). The
more recent format is chosen since it allows saving of more than 64,000 rows (which is
often the case in the Analyze options).
Previous/Next Arrows
Event Well
Use the previous/next arrows at the top of the module tab to move to the previous or next
Event, Well within the VelPAK model.
The only mode is point pick mode, which is used to display point values in the
Graph tab.
Graph
The graph tab will show a graphical display of whatever values have been selected from the
Generate Dialog.
Analyze Info
Histogram Mode
Scatter Mode
Note: This feature will only work if the workflow you have run has been set to generate
images.
The model number of the image selected by clicking a point on the scatter plot is shown
at the bottom of the image display.
Data
Data that is generated through the Analyze option via the workflow run is displayed
numerically in this Tab.
Note: Some columns are only used in “Well” mode and some only used in “Model” mode.
Go Here for discussion.
Clicking on the top of each column will sort the data alphanumerically in standard fashion.
Data can be sorted using more than one column using a ‘nested’ sort by holding the ‘shift’
button down while sorting.
For more control over which well data is utilised in the model, particularly in projects with
many wells, the Data table filters are cumulative. The right-mouse-button context menu
allows fast resetting of all the filters applied, and can force the data table to be rebuilt from
the model.
Data seen in the tab can be output as an Excel ‘.xlsx’ file to read into a spreadsheet. These
data and any filtering that has been done to it can also be saved as a ‘.cft’ file.
Select Summaries
Selecting the symbol on a particular column brings up the Select Summaries box which
allows the user to display the Average, Count, Max/Min and/or the Sum of that column. The
results will be shown at the top of the column. To remove the values from display select again
and uncheck the values in the box.
Data can be filtered on any number of columns within the file. Filter criteria are brought up for
the individual columns by clicking on the ‘funnel’ icon at the top of each column. This funnel
icon will change from gray to blue if a filter has been applied on that column:
You can select a number of methods as the filter for individual columns:
All - All elements will be displayed (default).
Custom - User defined selection of Operator and Operand to filter data. See below for
details.
Blanks - Will filter out blank values in the field.
NonBlanks - Will filter out non-blank values in the field.
Column Values - Lists all data in the field to select.
Custom Filter
Gives you the ability to build up a list of conditions with which to filter the data.
The layout of the columns can be changed to suit your requirements by accessing the arrow
to the right of the filter motif. Clicking on it will bring up a list of the columns within the Data
tab. You can then select which column you wish to be placed in this position.
Column Description
When looking at data within the Analyze module some data will have been generated per
model and some will have been generated per well within a model.
Although this data is hierarchical, within the Data tab of the Analyze module it is laid out as a
‘flat list’ for graphing purposes. This means that data relating to just the Model is repeated.
Go Here for example.
Model No. - The number of the model under which the values are generated. Model 1, for
example, is the first of perhaps 1000 random models generated and is not linked to
Model 2 at all.
Well Information
Well Name - Name of well
Fit - Related to whether you have put Yes/No to Fit in the Optimize set up.
Residual - Related to whether you have put Yes/No to Residual in the Optimize set up.
Error Before - Z value of Error used in the gridding.
Error After - Should be zero if you have done an error correction on every well. If you turn
a well off you can see what the error should be for that well (a method of cross validation).
Base Depth - The depth at the base well pick.
Base Grid Depth - The value of the base depth grid interpolated at the base well pick
Well Isochron - The isochron based on the top and base time values (i.e. TOP_TIME
and BOT_TIME). These may well be offset from the grid values for top and base of layer
due to misties (as shown in the diagram below). Values are only meaningful for vertical
wells. They should be treated with caution for deviated wells.
Grid Isochron - The isochron based on the top and base time grid values
(i.e. TOP_TIME_GRD and BOT_TIME_GRD). Values are only meaningful for
vertical wells. They should be treated with caution for deviated wells.
Vertical Apparent IV - The interval velocity derived from the top time and depth grids,
TOP_TIM_GRD_BOT and TOP_DEP_GRD_BOT and the time and depth values at the
well base pick BOT_TIME_GRD and BOT_DEPTH
Well IV - Well Interval Velocity
Model Information
Opt Param1 / Opt Param2 - the two parameters of the depth conversion function used in
the Optimise routine - most commonly the values of V0 and K. These can be two of three
parameters of a function, providing that the third is constant. Another common
function is Faust, where these parameters would be ‘a’ and ‘m’. [It would be
expected that the user knows what these values refer to for each layer in the
Analyse.]
Fit Result / Res Result - the values at the given Optimize Parameters.
Regression Param1 / Regression Param2 - the values of a0 and a1 if applicable.
Confidence Value / Confidence Result - The set value and result derived from the
Confidence input in the Curve module.
Range - a Kriging parameter - if used this is the range used for that event.
Spill Point - the depth at which the Spill occurs.
Crest Value - The maximum depth of the reservoir.
Top Zmin / Top Zmax / Base Zmin / Base Zmax - the minimum and maximum of the
depth grid used in the volumetric calculation (if used).
Pos Volume - Positive volume in whatever units it has been generated in.
Random - (for internal purposes only) - the number of times the random number
generator had to be run to get to the start of a given model.
Inactive / Active - the current state of the data grid - whether filtering has happened
or not.
Data2
Data2 is an internal data table which shows the data values that are displayed in the Graph
tab, which can be a useful QC tool.
Anim
The tab where the animation will show, as set up in the Anim dialog.
Summary
The summary tab is used to summarize the depth conversion methods and values you have
run on the model. The Base Case shows the methods and the parameter values used for
each horizon while the Random shows the method, the values that were randomized in the
run and the minimum/maximum values assigned to the randomized values. The Random
case will also show in the relevant columns any volumetric calculations that have taken place
plus the P10, P50 and P90 values.
Summary Columns
Gives a listing of all the values for the parameters used or generated in the Base and
Random cases.
Note: Some columns pertain to only the Base Case or the Random Case and also to what
depth conversion method has been chosen.
Event No. / Event Name - Listing the event number and name
Method Type - Indicates where the values have come from for the depth conversion
method.
Method Name - The Depth Conversion method used for that layer.
Param1 Name/Param1 Value - Name and value of the Parameter 1 used.
Param2 Name/Param2 Value - Name and value of the Parameter 2 used.
Grid Radius - If a grid is used as a parameter - the radius is here.
Conf Type/Conf Percent/Conf Factor - Confidence values.
Fit Min/Fit Max - Min/Max of the box used to optimize
Res Type - RMS or Average.
Res Min/Res Max - Residual Min/Max.
Range Min/Range Max - Min/Max of the ‘wobble’ variogram range value (in Kriging
mode).
Range Value - Base range value (in Kriging mode)
When you have made your edits, press the apply button to apply the changes and activate
the option.
Use the down arrow to select from available inputs.
Options
Flip_X - Yes/No - selecting Yes the X axis will be flipped.
Flip_Y - Yes/No - selecting Yes the Y axis will be flipped.
Label - Labelling of points on the Graph.
Label_Color - Color of label for point. See Color below.
Label Font - Font used for the label of points on the graph. See Fonts below.
Title
Title - The title of your plot. This is editable here but will be overwritten by the
default title if another graph is selected.
X_Title - The titles of your X and Y axes. These are editable here but will be
overwritten by the default titles if another graph is selected.
Histogram
Bins - When drawing the histogram this sets how many bins or containers the data will
fall into; given the range of the X axis the program will calculate the width of the bins
accordingly.
Probability - Yes/No - Choose to display the probability curve that has been calculated;
showing the P10, P50, P90 and M0 points.
Color - All of the above Color inputs have the same color options. Select the color for the
input from either two sets of pre-defined color tables Tabs; Web and System, or from the
customized Tab.
Fonts - All the above Font inputs have the same set up options. On selecting the Font
option, the following standard font box appears to allow you to set up the font as you wish.
These changes and more can also be made by expanding the ‘Fonts’ input line using the
‘+’ symbol to the left of the input.
Generate Dialog
Generate Dialog - General Tab
When you have made your edits, press the apply button to apply the changes and activate
the option.
Use the down arrow to select from available inputs.
Option
Group - Model/Well - a very important selection. Go here for a discussion on Model
versus Well.
Type - Scatter/ Column/ Histogram - The type of graph to be displayed. Depending on
your selection on the standard tab this will change automatically.
When you have made your edits, press the apply button to apply the changes and activate
the option.
Analyze Info
On all these graphs holding the cursor over the data will bring up the Analyze Information.
Note: The different symbols and colors do not represent anything - they merely help to
distinguish between one point and another.
Column
For example Crest Value vs Model Number:
Histogram
For example Number of Models vs Positive Volume:
Scatter
For example Error After vs Error Before
Column
For example Error Before vs Well Name:
Histogram
For example Number of Wells vs Error after:
When you have made your edits, press the apply button to apply the changes and activate
the option.
Note: If each well value is plotted, the number of points may be considerable (30,000
points, for example) compared to the number of points when model data is plotted.
Statistics Dialog
Statistics Dialog - A Discussion
The General statistics dialog is the process by which the information usually generated
through a looping workflow is collected and - if requested - a Volume produced for each
model run.
The top option of the dialog page is the Active Yes/No input. Whether you put yes or no to this
will change the function of the dialog.
In Inactive mode (Active = No) requested information from the looping workflow will be
collected and displayed in the Analyze data tabs.
In Active mode (Active = Yes) requested information from the looping workflow will be
collected and displayed in the Analyze data tabs and a volume will be calculated from the
parameters set up within the dialog.
Note: The workflows set up to run the Volume dialog will not pause to check that you have
set the Volume units and values correctly - you will need to set the Volume dialog up
for the correct units before running the workflow.
Below is an example of a looping workflow for running the Analyze Statistics module.
The purple nodes seen below relate to the tools at the top of the Statistics tab (shown next to
the node).
The looping workflow starts and proceeds through each event with the Analyze stats node
set to Active = No for all but the final event. This means that although all the data will be
collected for events 1 - 3, it is only at event 4 that the volume calculations take place.
Note: The top grid in these calculations is the current grid; the base grid is taken from
the event below.
[Figures: the volume can be calculated between the TOP surface and the Contact or Spill; between the Top Structure and Base Structure bounded by the Contact or Spill; or between the BASE surface and the Contact or Spill]
End - finalizes the run and turns the temporary data files where the data has been
collected into the Data tabs.
Save Zmap Grid - writes a Zmap grid to the external project directory; the name of
which will be constructed from the event number and name.
Save Statistics Grid - writes a Zmap grid to the external project directory; the
name of which will be constructed from the event number and name.
Note: If you want a seed point which is not represented by a well in the model you will need
to insert a pseudo-well using Well Edit Point Insert within the Surface module.
Top/Base - set up the grid or value to be used as the top and base of the reservoir in the
calculation. (Go Here for a discussion on these.)
Top_Value / Base_Value - enter a value here for the top / base of the reservoir if a
constant is to be used and not a grid for either the Top or Bottom.
Top_Grid_Event / Base_Grid_Event - Select the Event number the grid is stored under.
Top_Grid_Type / Base_Grid_Type - Select the location of the grid (usually Depth).
Note: The top grid in these calculations is the current grid; the base grid is taken from
the event below.
Contact
The Contact can be a value or a grid.
Contact_Value - Enter a value here for the contact if a constant is to be used.
Contact_Grid_Event - Select the Event number the grid is stored under.
Contact_Grid_Type - Select the location of the grid (usually Contact).
Seed - A drop down list of all the wells in the model to select the well location controlling
the flood Spill algorithm. The seed will be activated as part of the volumetrics calculation
by selecting Spill_Active = Yes.
Spill_Active - Yes / No. Setting this to Yes will activate the seed as set above.
Note: While usually thought of as the contact between oil and water or gas and water the
contact could be used to determine the volume of a particular ‘slice’ of volume.
Polygon
Use polygon or polygons to control the reservoir volume.
Polygon_Type - select where the polygon is stored in the Model Tree.
Polygon_Type_Name - If more than one polygon is stored in the Model Tree slot the
drop down here will display all polygons stored in this slot and allow you to select which
polygon you want to use.
Volume
Mapping Units - Meters / Feet. Until this point VelPAK has not used units, but for
volumetric purposes it needs to know which units the model is in.
Volume_Grid_Type - the output slot where the volume grid will go. Defaults to the Height
Above slot.
Volume_Scale - Millions_Cubic_Metres / Millions_Cubic_Feet / Hectare_Meter /
Acre_Feet - The user decides what scale is required for the volumetrics calculation.
Given the map units and the volume units, VelPAK will calculate the volume in whatever
scale is entered here.
Volume_Units - Not necessarily the same as the Mapping Units; the Z measurement can
often be in feet when the Mapping Units (X,Ys) are in meters. Make sure this
Volume Unit is set to the actual unit the Volume_Grid_Type (above) is generated in.
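The scale conversion implied by these settings can be sketched as follows, assuming the volume grid has been computed in cubic metres (the option names follow the list above; the conversion factors are standard):

```python
# Standard conversion factors: cubic metres per unit of each scale.
M3_PER = {
    "Millions_Cubic_Metres": 1.0e6,
    "Millions_Cubic_Feet": 0.3048**3 * 1.0e6,   # 1 ft = 0.3048 m
    "Hectare_Meter": 1.0e4,                     # 10,000 m^2 x 1 m
    "Acre_Feet": 4046.8564224 * 0.3048,         # acre area x 1 ft
}

def scale_volume(volume_m3, scale):
    """Express a volume computed in cubic metres in the chosen scale."""
    return volume_m3 / M3_PER[scale]

vol = 2.5e7                                     # 25 million cubic metres
print(scale_volume(vol, "Millions_Cubic_Metres"))   # → 25.0
print(scale_volume(vol, "Hectare_Meter"))           # → 2500.0
```

If the depth grid were in feet rather than metres (the Volume_Units caveat above), the raw volume would first need scaling by 0.3048 before these factors apply.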
Discussion
The statistics tab allows a selected statistical process to be run on the grids stored. One grid
will be produced and placed in a specified VelPAK slot to display the results.
For example, setting the slots as shown below will take all the grids stored in the
external grid directory named ‘Height Above_xxx’. For each node on each grid the value will
be read, and every time the value of the node is greater than the constant (in this case zero)
one is added to the count for that node.
When all the grids have been read, the percentage of grids achieving a value
greater than the constant is assigned to that node in the output grid.
Note: For this example the height above has already been defined since it is the ‘Height
Above’ set of grids being used - therefore Constant=0. This technique could equally
well be used on a set of depth grids with the constant being defined as the oil water
contact, for example.
Typically in the centre of a model one would expect to achieve a 100% hit rate for this
statistical process, while around the edges it would taper off - this is shown in the example
below. This therefore visually displays the probability of closure for the model.
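A minimal sketch of this statistic, using tiny in-memory grids in place of the stored ‘Height Above’ grid files:

```python
# For each node, count across all grids how often the value exceeds the
# constant, then express that count as a percentage in the output grid.
grids = [
    [[5.0, 0.0], [12.0, -3.0]],     # three realizations of a 2x2 grid
    [[2.0, -1.0], [8.0, 1.0]],
    [[7.0, 0.0], [15.0, -2.0]],
]
constant = 0.0                       # height above contact already built in

rows, cols = len(grids[0]), len(grids[0][0])
output = [[0.0] * cols for _ in range(rows)]
for i in range(rows):
    for j in range(cols):
        hits = sum(1 for g in grids if g[i][j] > constant)
        output[i][j] = 100.0 * hits / len(grids)

print(output)   # central nodes approach 100%, edge nodes taper off
```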
[Figure: probability-of-closure map - a grid node in the central model shows Z = 100 (%), while a node at the edge of the model shows Z = 24.6 (%)]
Properties
Constant - Used for percentage above and percentage below processing types.
Input - The generic name for the set of grids to be used in the process. Select the generic
name from the drop down list. This will select the set of grids from the ‘grids’ sub-directory
from the model directories.
For example selecting ‘Height Above’ will actually select the named file:
“Height_Above_004_Base_Upper_Permian_Event_4_Created_by_Xpress_xxx.cps1” in
the grids directory where xxx is the number of the run it was generated from.
Output - The slot in the VelPAK model tree where the calculated output grid will go.
Default is General 01.
Note: The results of the above two may look a bit strange plotted as a contour map!
Anim Dialog
The Animation dialog animates the images generated during looping workflows (if the Image
option is chosen by the user during the workflow routine).
Note: To see the animation working you need to have the Anim tab to the front of the
Analyze Tabs. If it is not to the front the animation will be running out of view. Press
the abort button, bring the Anim tab to the top and play again.
When you have made your edits, press the apply button to apply the changes and activate
the option.
Use the down arrow to select from available inputs.
Tools
Delete Anim Images - *Caution!* - this will delete all images in the directory with
the selected Basename.
Options
Basename - The generic name for the set of images to be displayed in animation mode;
the Image option in workflows will have generated a number of these images. The
name can be selected from the selector which pops up by pressing the icon to the
right of the basename slot.
Delay - Set to an arbitrary default of 500, this can be changed to make the animation
move faster or slower as required.
Index - The number of the run you want to start the animation from.
Step - Choose to step the animation - a step of 5, for example, would mean only every
fifth image generated would be displayed in the animation.
Type - Choose to run the animation forward or back; ‘ping-pong’ will go forward and
then back, and so on.
Glossary
A
AOI or Area of Interest
This is the region in space that the velocity model occupies. It is determined by the extents of
the data loaded into the model.
Generally it is a good idea to create a grid of the sea bottom, basement, or a laterally
ubiquitous event to act as a master or parent grid. You can then generate all other grids to
match the X and Y extents and cell size of this grid. Note the cells in VelPAK grids must be
square; the X dimension must match the Y dimension, and all grids must have the same
cell size.
Note that with surface mode depth conversion, ‘holes’ in a grid are propagated downwards
through the volume, so it is important to precondition the grids correctly prior to depth
conversion. See ‘Preparing Grid Data for the VelPAK Model’ in the ‘Introduction’ chapter
under ‘Help > Tutorials > VelPAK’.
When creating a grid in the Surface module, the Grid fly out has a Range tab that features
icons for automatically setting or computing the AOI. Be consistent with these options in the
model.
Apparent Velocity
Also known as pseudo velocity. The velocity estimated by matching seismic event
interpretations in time to formation tops from wells in depth.
Average Velocity
The ratio of the depth of an event to the vertical travel time of the seismic wavefront to that
event. In VelPAK, this is referenced from the seismic datum. Note that as seismic times are
typically measured in two way travel time, the average velocity to any given depth is given by:

Vavg = 2 x z / TWT

where z is the depth below the seismic datum and TWT is the two-way travel time to that depth.
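A quick numerical illustration with hypothetical values: a horizon 2000 m below the seismic datum picked at 1.6 s of two-way time has an average velocity of 2500 m/s.

```python
# Average velocity from depth and two-way time (illustrative values):
depth = 2000.0        # metres below seismic datum
twt = 1.6             # two-way travel time, seconds
v_avg = 2.0 * depth / twt
print(v_avg)          # → 2500.0
```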
In the Velocity module, average velocities for wells can be plotted (after Layer Definition has
been performed). In this module average velocities can be computed from RMS stacking
velocities and overlain with the well average velocities.
An average velocity volume can be generated using the Volume fly out of the Velocity
module. The average velocity volume is most useful for depth conversion of surfaces not in
the velocity model.
See also Seismic Velocity.
B
Background Velocity
The regional trend of the velocity field.
BIN file
See VelPAK Model File
D
Datums
A discussion of datums is given in the whitepaper ‘VelPAK and Reference Datums’ available
to download on the IHS Kingdom website.
Depth conversion
Depth conversion is usually the final objective of a study using the VelPAK module. In
Surface Mode, grids are depth converted as you work down through the layers. In Profile
Mode, the velocity characteristics of each layer must be defined prior to performing the depth
conversion of each horizon interpretation.
Within Kingdom, there are a variety of methods for accomplishing each of these tasks, as
indicated in the following table.
[Table: depth conversion methods available within Kingdom for Grids, Horizons, Faults, Seismic volumes, and T-D charts]
Diagnostic mode
There is a useful option in the Velocity Volume Generation fly out that allows you to generate
an ASCII text file that details all of the parameters in the velocity model used for depth
converting each layer. To use this option, set the ‘Type’ parameter to ‘Diagnostic’. The output
file will be that set in ‘File’ (which will be written into the author’s project folder). Set the
‘Diags’ flags to ‘Yes’ or ‘No’ to incorporate more or less information into the text file.
Note that the ASCII files generated can be huge in size so limit the velocity volume
generation range to a few traces in the area of interest!
Dix equation
The Dix equation converts RMS (stacking) velocities to the interval velocity of the layer
between two reflectors:

Vint = sqrt( (V2^2 x t2 - V1^2 x t1) / (t2 - t1) )

where V1 and V2 are the RMS velocities to the upper and lower reflectors and t1 and t2 are
the corresponding travel times.
Note that this equation makes the assumption that the surfaces are flat, parallel layers.
You can apply the Dix equation to RMS stacking velocities using the Velocity module Dix fly
out. This computes both interval and average velocities for that layer. Note that as RMS
velocities are typically several percent higher than the average velocity, the default
percentages for the Dix equation are set to 92%.
See Dix, C.H. (1955), Seismic Velocities from Surface Measurements, Geophysics, 20: 68-86.
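As a sketch of the Dix conversion for a single layer (an illustration only, assuming consistent units for velocity and time; not VelPAK's implementation):

```python
import math

def dix_interval_velocity(vrms_top, t_top, vrms_base, t_base):
    """Dix (1955) interval velocity of the layer between two reflectors.

    vrms_top, vrms_base: RMS velocities down to the top and base of the layer
    t_top, t_base:       travel times to the top and base (same convention)
    """
    numerator = vrms_base ** 2 * t_base - vrms_top ** 2 * t_top
    return math.sqrt(numerator / (t_base - t_top))

# Sanity check: for the first layer (top at the datum, t = 0), the
# interval velocity equals the RMS velocity to its base.
print(dix_interval_velocity(0.0, 0.0, 2000.0, 1.0))  # 2000.0
```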
E
Event
An Event is either a horizon or a grid, loaded into the velocity model, that delineates a
change in velocity regime in the vertical domain. Typically these correspond to changes in
rock lithology. Events correspond to a layer number. Therefore, the first event represents the
bottom of the first geological layer in the model.
If using well data in the velocity model, formation tops must be associated to Events using the
Layer definition module.
F
Fly out
Fly outs are pop-out menus, typically located down the right-hand side of each module
window. The fly out content is specific to each module window. Fly outs can be 'pinned'
open by pressing the pin icon in the top right-hand corner of the fly out.
Multiple fly outs can be pinned open at any one time – they will stack on the screen in order to
maximize space for the main display window.
G
Grid
A grid is a spatial representation of data that contains data samples in a regular array.
It is strongly recommended that all grids used in VelPAK have a square grid cell size and
have the same extents as a master (or 'Parent') grid. Typically the 'Parent' grid will be
Event 1, and should cover the whole area of interest. This may typically be the water bottom,
the bottom of the weathered layer, etc.
Using Surface Mode, grids will be depth converted from shallowest to deepest, so if a grid
above the current event covers less of an area, only the lesser area can be depth converted.
Grids may be created in VelPAK from XYZ data points or by processing other grids (on the
Process fly out of the Surface Module).
Grids are stored in the Model tree under the Surface >> Event Name >> Grid branch, and are
stored by GridType.
See also AOI, GridEvent, GridDesc and GridType.
GridEvent
The number of the grid event in the velocity model (0 is the surface, 1 is the shallowest event
e.g. sea bed, increasing downwards). In dialogs, a grid event can be specified by number,
or by using the nomenclature of 'Previous' & 'Current'. The 'Current' surface is the one set in
the 'Surface' dialog in the model tree window; 'Previous' is the grid directly above.
For example, if the user wished to add the isopach of the current grid 3 to the depth of grid 2
to give the new depth grid for surface 3, and surface 3 is current, the depth grid's event could
validly be specified either as '2' or as 'Previous'.
GridDesc
Prefixed by a parameter number, GridDesc is a text field that the user can edit to apply a
short description to a grid held in the model tree. The grid description is displayed in the
model tree after the GridType label.
You can edit the description by clicking the grid in the model tree and editing the ‘Name’ field
in the ‘Properties’ table.
See also GridEvent and GridType.
GridType
Prefixed by a number, GridType refers to the named slot in the model tree to read data from
or write data to. There are a range of GridTypes predefined. You cannot add to the GridType
list but there are generic ‘GeneralXX’ types you can use to store any data type in. You can
add your own descriptive text to a GridType using the GridDesc field.
Note that it is perfectly safe to use the same input and output GridType in an operation. For
example, to multiply a depth domain grid by 0.3048, you would use the Surface module
Process fly out, specifying the same GridType for both the input and the output grid.
The output depth grid will overwrite the input grid after it has performed the multiplication.
You can move data between GridTypes in a specified layer by right clicking the source slot,
selecting ‘Copy’, then right-clicking the destination slot and selecting ‘Paste’. If you wish to do
this in a workflow, you can use the node Surface.Process.Parameters and use the Math+
(Add) formula to add 0 to the source grid and store in the required output slot.
I
Interpretation
Horizon and fault interpretation is imported into the VelPAK model using either the TKS Link
or ASCII import. This is generally only ever used in Profile Mode and must undergo the
Snapping procedure before use.
Snapped interpretation cannot be directly put back into the Kingdom project database as the
data may have been merged to produce pods, channels and multi-Z component
interpretation. Instead it must be exported to ASCII files in segments for import back into the
project database.
Interval Velocity
The ratio of the vertical thickness of a layer to the vertical travel time of the seismic wavefront
through that layer. See also Seismic Velocity and Dix Interval Velocity.
INDT
Indeterminate or NULL value. These occur in grids where, for example, a fault polygon masks
the data, grid cells are divided by a zero value, or the gridding algorithm does not interpolate
between sparse data points.
INDT values in grids can be replaced by a user specified constant (or values from another
grid) using the Surface Module ‘Process’ fly out; on the ‘Parameters’ tab set the ‘Formula’ to
‘SET INDTS TO GRID OR VALUE’, and specify the parameters as per usual.
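The effect of that formula can be illustrated with a small Python sketch (this is not VelPAK code, and the function name set_indts is hypothetical):

```python
INDT = None  # null marker standing in for VelPAK's INDT value

def set_indts(grid, value=None, other_grid=None):
    """Replace INDT (null) cells with a constant, or with the corresponding
    cell of another grid, mimicking the 'SET INDTS TO GRID OR VALUE' formula.
    """
    out = []
    for i, row in enumerate(grid):
        new_row = []
        for j, cell in enumerate(row):
            if cell is INDT:
                new_row.append(other_grid[i][j] if other_grid is not None else value)
            else:
                new_row.append(cell)
        out.append(new_row)
    return out

grid = [[2000.0, INDT], [2100.0, 2200.0]]
print(set_indts(grid, value=1500.0))  # [[2000.0, 1500.0], [2100.0, 2200.0]]
```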
Instantaneous Velocity
The velocity at any given time or depth.
Isochron
A time thickness grid of a layer. After depth converting a layer, the Isochron grid for that layer
can be displayed by double-clicking the Isochron grid in the Model Tree.
Isopach
A thickness grid of a layer in the depth domain. Technically, the isopach grids generated by
VelPAK are in fact depth domain isochore maps, as they measure vertical thickness. In the
strictest terms, an isopach is the thickness of a layer measured normal to the bedding planes,
but VelPAK uses the vernacular meaning of the term.
After depth converting a layer, the isopach grid for that layer can be displayed in the ‘Surface’
module by double-clicking the Isopach grid in the Model Tree.
K
K
See Velocity Gradient.
L
Layer
Layer has two meanings in VelPAK:
1. The geological region between two surface events
2. The visible data on the Surface map and other displays. The visible layers may be toggled
on & off using the ‘Display’ fly out or the ‘Layers Visible’ drop down
Layer definition
This is the process of assigning the layer model, defined by the layer event data, to the well data.
This is done by matching up formation tops that correspond to the layer events (grids) in the
velocity model.
Layer definition is performed in the Layer module. Multiple formation tops can be assigned
within one layer. The layer model can be seen in the Well module after the model has been
applied. The layer definition in the wells is displayed to the left hand side of the well curve, the
layer definition made by the event layers (grids) is displayed to the right hand side of the well
curve. Ideally, the color blocks should match up either side of the curve.
M
Model Tree
Analogous to the Project Tree displayed in the main application window, the model tree
contains a list of data in the velocity model. However, the model tree differs in that it contains
a sorted arrangement of Surfaces, ordered from the shallowest layer (the seismic reference
datum) down to the deepest layer. Profile and well data is stored in alphabetical order in the
model tree.
By default, the model tree only displays data that is present in the model. To display model tree
slots unoccupied by data, click the ‘Show Unused’ icon in the model tree icon bar.
Module
A VelPAK workspace activated by clicking the Module name icon on the main VelPAK display.
Modules may be detached from the main display by double-clicking the module name tab
after the module has been activated.
To reattach the detached module to the main display, double-click the title bar of the module
window.
The available modules are as follows:
O
Optimization
In VelPAK, optimization is a powerful & robust technique to determine the best parameters for
a given velocity function using well velocity data. The objective function is set by the user in
the Surface module Depth fly out (e.g. [Optimize] Linear V0 KZ, V = V0 + Kz), the parameter
space is defined, the process is run, and the optimal parameters are displayed for automatic or
user selection.
A number of pre-defined workflows are built into the ‘Workflows’ fly out of the ‘Workflow’
module, to make using this technique very simple.
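As an illustration of the idea only (a closed-form least-squares fit standing in for VelPAK's parameter-space search; the function name is hypothetical):

```python
def fit_linear_v0_k(depths, velocities):
    """Ordinary least-squares fit of V = V0 + K*z to well velocity samples.

    Returns (V0, K). This is a simple closed-form regression, standing in
    for VelPAK's search over a user-defined parameter space.
    """
    n = len(depths)
    mean_z = sum(depths) / n
    mean_v = sum(velocities) / n
    cov = sum((z - mean_z) * (v - mean_v) for z, v in zip(depths, velocities))
    var = sum((z - mean_z) ** 2 for z in depths)
    k = cov / var
    v0 = mean_v - k * mean_z
    return v0, k

# Synthetic well data generated from V = 1500 + 0.5*z recovers the parameters.
zs = [0.0, 500.0, 1000.0, 1500.0, 2000.0]
vs = [1500.0 + 0.5 * z for z in zs]
print(fit_linear_v0_k(zs, vs))  # (1500.0, 0.5)
```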
P
Pods
Pods are discrete geological units whose velocity can be modeled and are used exclusively
in Profile Mode. Typically they are introduced during or after the main layers of the profile
have been defined. This event number must be classed as a Pod using the Pod tab on the
Snap fly out in profile mode.
Typical uses for pods are to model channels, shallow surface anomalies, gas seepage
between layers, and even laccoliths.
Profile Mode
If the geology of the project is complex and cannot be represented by a grid, Profile mode
can be used. Typical scenarios are for regions with reverse faulting, salt diapirs and channel
systems.
A profile is a single vertical section. This could be a 2d line, inline or crossline from a 3d
survey, or a random (or arbitrary) line across multiple surveys.
Profile mode works by depth converting the data loaded into each profile. Typically this is
horizon data brought into the model from the interpretation project. The horizon data must
undergo the snapping process to produce a 2d sealed earth model, prior to modeling the
velocity changes in each layer of the profile.
Unlike Surface mode, the velocity function for each layer must be defined prior to depth
converting the profile.
Note that Profile mode can use data (e.g. interval velocity grids) to depth convert the horizons.
In addition to layers, profiles can also incorporate sealed units called pods, which may or may
not intersect with the layers in the profile.
Property Grid
Property grids are user interface elements: tables of parameters and information, typically
seen in the Fly Outs. They are not to be confused with grid data such as time or depth grids
(although there is a property grid, under the Model Tree, that contains information on the grid
selected in the Model Tree).
Pseudo Velocity
See Apparent Velocity.
R
RMS Velocity, Vrms
Defined by:
Vrms = sqrt( sum(Vi^2 * ti) / sum(ti) )
where Vi is the interval velocity of layer i and ti is the travel time through layer i, summed
over all layers down to the reflector of interest.
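Assuming the standard definition Vrms = sqrt(sum(Vi^2 * ti) / sum(ti)), a short illustrative sketch (not VelPAK code):

```python
import math

def rms_velocity(interval_velocities, interval_times):
    """RMS velocity down to the base of the deepest layer supplied.

    interval_velocities: interval velocity of each layer, shallowest first
    interval_times:      travel time through each layer (same convention)
    """
    numerator = sum(v * v * t for v, t in zip(interval_velocities, interval_times))
    return math.sqrt(numerator / sum(interval_times))

# Two layers of equal travel time at 2000 and 3000 m/s:
print(rms_velocity([2000.0, 3000.0], [1.0, 1.0]))  # ~2549.5
```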
S
Seismic Velocity
The rate at which a seismic wavefront travels through the medium. Seismic velocity is
typically anisotropic within a medium. In flat-bedded sediments, typically the horizontal
velocity is higher than the vertical velocity.
With sedimentary rocks, usually the velocity increases with depth within a uniform layer due
to burial compaction and diagenetic effects. Hence there is often a velocity gradient within a
layer.
Velocities may be categorized as average velocity, interval velocity or instantaneous velocity.
Stacking Velocity
Stacking velocity is the value used to stack seismic traces, typically derived by semblance or
coherency methods in seismic processing packages. Typically hyperbolic moveout with offset
is assumed in the derivation of stacking velocities.
See also RMS Velocity.
Surface Mode
When the geological layers can be represented by a series of grids, it is best to use VelPAK
to model the layer velocities and then depth convert each grid in turn (see Layer cake depth
conversion). This situation lends itself well to the use of workflows.
When working entirely with grids, there is no advantage to be gained by bringing horizon data
into the model.
When the geological layers cannot be represented by surface grids, Profile mode can be
used to model vertical sections. Surface mode is much less labor intensive to work in than
Profile mode.
T
TKS Link
This is the link between the velocity model, and the main Kingdom project database. Data
can be loaded into the velocity model from the database using ‘File >> Open TKS…’ and
stored from the model into the database using ‘File >> Save TKS…’.
Once data has been loaded into the model, it can be edited, manipulated, or deleted without
changing the data in the Kingdom database.
See also VelPAK model file.
U
V
V0
See ‘Velocity Intercept’.
Velocity Gradient
The rate of change of seismic velocity within a layer, often denoted by the symbol ‘K’. This
typically represents the increase of velocity with depth in a homogeneous layer due to
compaction, and may be referred to as the compaction gradient.
The value of K for a layer may be derived using VelPAK’s ‘Curve’ and ‘Optimize’ modules if
well velocity data is present. The latter is recommended, especially if only limited well data is
available.
It is important to note that the value of K is unique to the velocity function being used. For
example, for the Slotnick equation K may typically be 0.000258 but for the Houston equation
K may be 0.00924 for the same rock mineralogy. Please refer to the VelPAK on-line Help
section ‘Velocity Function Parameter Ranges’ for more information.
See also Velocity Intercept and Instantaneous Velocity.
Velocity Intercept
The instantaneous velocity at the seismic datum, typically denoted by V0 (V-nought). For
example, in the linear velocity function V = V0 + Kz:
V = instantaneous velocity at depth z below the datum
V0 = velocity intercept at the datum (z = 0)
K = velocity gradient
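As a hedged sketch (not VelPAK's implementation), depth conversion with the linear function V = V0 + Kz follows from integrating dz/dt = V0 + Kz with respect to one-way time:

```python
import math

def depth_from_time_linear(v0, k, twt_s):
    """Depth below the datum from two-way time for V = V0 + K*z.

    Integrating dz/dt = V0 + K*z over one-way time t gives
    z = (V0 / K) * (exp(K * t) - 1), reducing to z = V0 * t when K = 0.
    """
    t = twt_s / 2.0  # one-way time
    if k == 0.0:
        return v0 * t
    return (v0 / k) * (math.exp(k * t) - 1.0)

# With no gradient, depth is simply V0 times one-way time:
print(depth_from_time_linear(2000.0, 0.0, 2.0))  # 2000.0
```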
Velocity Model
A velocity model is created from a set of geological surfaces, methods, and parameters.
Typically this will be in Surface Mode, where grids are used to define the geological structure,
but this may also be in Profile Mode, where horizon interpretations and fault sticks are used
to define the structure for any given vertical section. Methods are defined for each layer in the
structural model (e.g. V = V0 + Kz), along with any associated parameters for that method.
Depth conversion of surfaces takes place in a top-down fashion, but to generate a complete
velocity volume, methods and parameters for every layer in the model need to be defined.
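A minimal sketch of the top-down depth conversion for a single trace location, assuming one constant interval velocity per layer (VelPAK's per-layer methods are far more general):

```python
def layer_cake_depths(event_twts, interval_velocities):
    """Top-down depth conversion for a single trace location.

    event_twts:          two-way times to each event, shallowest first (s)
    interval_velocities: constant interval velocity of each layer (m/s)

    Each event depth = depth of the event above + isochron * Vint / 2
    (the factor of 2 converts two-way time to one-way).
    """
    depths = []
    prev_t, prev_z = 0.0, 0.0
    for t, vint in zip(event_twts, interval_velocities):
        isochron = t - prev_t                # two-way time thickness of the layer
        z = prev_z + isochron * vint / 2.0   # depth-converted thickness (isopach)
        depths.append(z)
        prev_t, prev_z = t, z
    return depths

print(layer_cake_depths([1.0, 2.0], [2000.0, 3000.0]))  # [1000.0, 2500.0]
```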
W
Well Velocity
The velocity recorded from well data. VelPAK can use either a sonic log (which is
automatically converted from slowness to velocity on import into the model) or a Time-Depth
chart for each well. If using a sonic log, ideally it should be calibrated to checkshot data. If
this data is not available, a crude calibration to a time constant (or structural grid) may be
made in the Well Module.
X-Y-Z
XYZ
Collections of values representing data in space. Each surface has a number of predefined
XYZ slots associated with it; these match the GridTypes. All XYZ data must exist within the
predefined slots.
By default, horizon data imported through the Link as XYZ data will be stored in the ‘Time’
slot in the Model Tree. You can move this data to another slot by right clicking in the ‘Time’
slot, selecting ‘Copy’, then right-clicking the destination slot and selecting ‘Paste’.
To generate grids in VelPAK, the input data must be XYZ data. Therefore, to regrid an
existing grid, the grid needs to be converted to XYZ points first. This is done on the XYZ fly
out of the surface module.