MineSight for Modelers: Geostatistics
Workbook E005, Rev. B
2002, 2001, 1994, and 1978 by MINTEC, inc.
MineSight Overview............................................................................................Over-1
Geostatistics Overheads.........................................................................................1
Classical Statistics.................................................................................................1-1
Variograms..............................................................................................................2-1
Declustering............................................................................................................4-1
Model Calculations..................................................................................................9-1
Quantifying Uncertainty........................................................................................10-1
Change of Support.................................................................................................11-1
This workbook is designed to present concepts clearly and then give the user
practice through exercises to perform the stated tasks and achieve the required results.
All sections of this workbook contain a basic step, or series of steps, for using
MineSight with a project. Leading off each workbook section are the learning objectives
covered by the subject matter within the topic section. Following this is an outline of the
process using the menu system, and finally an example is presented of the results of the
process.
MineSight provides a large number of programs with wide ranges of options within
each program. This may seem overwhelming at times, but once you feel comfortable
with the system, the large number of programs becomes an asset because of the
flexibility it affords. If you are unable to achieve these key tasks or understand the
concepts, notify your instructor before moving on to the next section in the workbook.
The following terms and conventions are used in the Mintec workbooks:

Drag - move the mouse while holding down the left mouse button.

Highlight - drag the mouse pointer across data, causing the image to reverse in color.

Select - highlight a menu list item; move the mouse over the menu item and click the mouse.

Actions or keyboard input instructions - are printed in Times New Roman font, italics, embedded within arrow brackets, and keys are separated with a + when used in combination. For example, applying bold face to type is indicated by <ctrl+shift+b>.

Button/Icon - printed in bold with the initial letter capitalized, in Times New Roman font. For example, Print, on a button, indicates an item you click on to produce an action.

Menu Commands - printed in Arial font, bold, with a vertical bar separating menu and command. For example, File | Open means access the File menu and choose Open.

Parameters - printed in Arial font, lower case, in bullet format.
Questions or Comments?
Note: if you have any questions or comments regarding this training documentation,
please contact the Mintec Documentation Specialist at (520) 795-3891 or via e-mail at
market@mintec.com.
What Is MineSight?
Digitized data is utilized in the evaluation of a project in many ways. It can be used to define geologic information in section or plan, topography contours, structural information, mine designs, and other information that is important in evaluating the ore body. Digitized data is used or derived in virtually every phase of a project, from
drillhole data through production scheduling. Any digitized data can be triangulated and
viewed as a 3-D surface in MineSight.
A variety of drillhole data can be stored in MineSight, including assays, lithology and
geology codes, quality parameters for coal, collar information (coordinates and hole
orientation), and down-the-hole survey data. Value and consistency checks can be
performed on the data before it is loaded into MineSight. After the data has been stored
in the system, it can be listed, updated, geostatistically and statistically analyzed, plotted
in plan or section and viewed in 3-D. Assay data can then be passed on to the next
logical section of MineSight which is compositing.
Compositing Operations
Composites are calculated by benches (for most base metal mines) or mineral seams
(for coal mines) to show the commodity of interest on a mining basis. Composites can
be either generated in MineSight or generated outside the system and imported.
Composite data can be listed, updated, geostatistically and statistically analyzed, plotted
in plan or section and viewed in 3-D. Composite data is passed on to the next phase of
MineSight, ore body modeling.
Modeling Operations
GSM the vertical dimensions are a function of the seam and interburden thicknesses. For
each block in the model, a variety of items may be stored.
Typically, a block in a 3DBM will contain grade items, geological codes, and a
topography percent. Many other items may also be present. For a GSM, the seam top
elevation and seam thickness are required. Other items, such as quality parameters, seam
bottom, partings, etc. can also be stored. A variety of methods can be used to enter data
into the model. Geologic and topographic data can be digitized and converted into codes
for the model, or they can be entered directly as block codes. Solids can also be created
in the MineSight 3-D graphical interface for use in coding the model directly. Grade
data is usually entered through interpolation techniques, such as Kriging or inverse
distance weighting. Once the model is constructed, it can be updated, summarized
statistically, plotted in plan or section, contoured in plan or section, and viewed in 3-D.
The model is a necessary prerequisite in any pit design or pit evaluation process.
This set of routines works on whole blocks from the 3-D block model, and uses either
the floating cone or Lerchs-Grossmann technique to find economic pit limits for
different sets of economic assumptions. Usually one grade or equivalent grade item is
used as the economic material. The user enters costs, net value of the product, cutoff
grades, and pit wall slope. Original topography is used as the starting surface for the
design, and new surfaces are generated which reflect the economic designs. The designs
can be plotted in plan or section, viewed in 3-D, and reserves can be calculated for the
grade item that was used for the design. Simple production scheduling can also be run
on these reserves.
Pit Design
The Pit Design routines are used to geometrically design pits that include ramps,
pushbacks, and variable wall slopes to more accurately portray a realistic open pit
geometry. Manually designed pits can also be entered into the system and evaluated. Pit
designs can be displayed in plan or section, can be clipped against topography if desired,
and can be viewed in 3-D. Reserves for these pit designs are evaluated on a partial block
basis, and are used in the calculation of production schedules.
Production Scheduling
This group of programs is used to compute schedules for long-range planning based
upon pushback designs (or phases), and reserves computed by the mine planning
programs. The basic input parameters for each production period include mill capacity,
mine capacity, and cutoff grades. Functions provided by the scheduling programs
include:
Calculation and reporting of production for each period, including mill production
by ore type, mill head grades and waste
Preparation of end-of-production period maps
Calculation and storage of yearly mining schedules for economic analysis
Evaluation of alternate production rates and required mining capacity
Mine Evaluation

[Figure: MineSight project data flow]

Project
Drillhole Assays: Enter, Scan, Load, Edit, List, Dump, Rotate, Add Geology, Statistics, Variograms, Plot Collars, Plot Sections, Special Calculations, 3-D Viewing and Interpretation
Composites: Load, Edit, List, Dump, Add Geology, Add Topography, Statistics, Variograms, Variogram Validation, Plot Sections, Plot Plans, Special Calculations, Sort, 3-D Viewing and Interpretation
Digitized Data: Digitize, Load, Edit, List, Dump, Plot, 3-D Viewing
Mine Model: Initialize, Interpolate, Add Geology, Add Topography, List, Edit, Statistics, Reserves, Special Calculations, Plot Sections, Plot Plans, Contour Plots, Sort, 3-D Viewing & Solids Construction
Long Range Planning & Short Range Scheduling
MineSight Capacities

Drillholes
No limit to the number of drillholes; only limited by the total number of assays in the system
99 survey intervals per drillhole
524,285 assay intervals per file
8,189 assay intervals per drillhole
99 items per interval
Multiple drillhole files allowed (usually one is all that is required)

Composites
524,285 assay intervals per file
8,189 composites per drillhole
99 items per composite interval
Multiple composite files allowed (usually one is all that is required)

Geologic Model
3-D block model limit of 1000 columns, 1000 rows and 400 benches
Gridded seam model limit of 1000 columns, 1000 rows and 200 seams
99 items per block
Multiple model files allowed (usually one is all that is required)

Reserves
20 material classes
20 cutoff grades for each material class
10 metal grades
Multiple reserves files allowed

Blastholes
524,285 blastholes per file with standard File 12
8,189 blastholes per shot with standard File 12
4,194,301 blastholes per file with expanded limit File 12
1,021 blastholes per shot with expanded limit File 12
99 items per blasthole
Multiple blasthole files allowed (usually one is all that is required)
Objective: To make you familiar with the basic concepts of statistics, and the geostatistical tools available to solve problems in geology and mining of an ore deposit.

Classical statistical methods:
Sample values are realizations of a random variable
Samples are considered independent
Relative positions of the samples are ignored
Does not make use of the spatial correlation of samples

Geostatistics Topics

1
Geostatistics
Throughout this workbook, geostatistics will refer only to the statistical methods and tools used in ore reserve analysis.

Universe
The source of all possible data (for example, an ore deposit can be defined as the universe; sometimes a universe may not have well defined boundaries).

Population
Like universe, population refers to the total category under consideration. It is possible to have different populations within the same universe (for example, population of drillhole grades versus population of blasthole grades; sampling unit and support must be specified).

Random Variable
A variable whose values are randomly generated according to a probabilistic mechanism (for example, the outcome of a coin toss, or the grade of a core sample in a diamond drill hole).
2
Frequency Distribution

[Figure: example PDF, x-axis 0 to 11, y-axis 0 to 1]

Measures of location:
Mean
Median
Mode
Min, Max
Quartiles
Percentiles
3
Mean, Median, Mode

4

Quartiles
In example: 1, 1, 1, 2, 3, 3, 5, 7, 7, 11
Q1 = ?
Q3 = ?

5

Quartiles
1, 1, 1, 2, 3, 3, 5, 7, 7, 11
Q1 = 1
Q3 = 6
Mean = 4.1

Deciles, Percentiles, Quantiles
1, 1, 1, 2, 3, 3, 5, 7, 7, 11
D1 = 1
D3 = 1
D9 = 7

6
Variance
Example: 1, 1, 1, 2, 3, 3, 5, 7, 7, 11
S² = 11.21
S = 3.348
Removing the high value (11):
S² = 6
S = 2.445

Interquartile Range
IQR = Q3 - Q1
Not used in mining very often

7
Descriptive Measures: Skewness

Example: 1, 1, 1, 2, 3, 3, 5, 7, 7, 11
M = 4.1
Sk = 1/10 {(1-4.1)³ + (1-4.1)³ + (1-4.1)³ + (2-4.1)³ + (3-4.1)³ + (3-4.1)³ + (5-4.1)³ + (7-4.1)³ + (7-4.1)³ + (11-4.1)³}
= 1/10 (-29.79 - 29.79 - 29.79 - 9.26 - 1.33 - 1.33 + 0.73 + 24.39 + 24.39 + 328.51)
= 276.7/10
= 27.67

Remove high value: 1, 1, 1, 2, 3, 3, 5, 7, 7
M = 3.3
Sk = 1/9 {(1-3.3)³ + (1-3.3)³ + (1-3.3)³ + (2-3.3)³ + (3-3.3)³ + (3-3.3)³ + (5-3.3)³ + (7-3.3)³ + (7-3.3)³}
= 1/9 (-12.17 - 12.17 - 12.17 - 2.20 - 0.03 - 0.03 + 4.91 + 50.65 + 50.65)
= 67.44/9
= 7.49

8
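The skewness computation above can be checked with a short script (a sketch; the third-moment form here follows the workbook's Sk = (1/n) Σ (xi - m)³, without standardizing by the cube of the standard deviation):

```python
def skewness_third_moment(values):
    """Average cubed deviation from the mean, as in the workbook example."""
    n = len(values)
    m = sum(values) / n
    return sum((x - m) ** 3 for x in values) / n

data = [1, 1, 1, 2, 3, 3, 5, 7, 7, 11]
print(skewness_third_moment(data))       # strongly positive, driven by the 11
# Smaller positive skew once the high value is removed (the slide rounds the
# mean to 3.3, so its figure differs slightly from the exact result here).
print(skewness_third_moment(data[:-1]))
```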
Coefficient of Variation Coefficient of Variation
9
Example of cdf (normal)
Find the proportion of sample values above a 0.5 cutoff in a normal population that has m = 0.3 and s = 0.2.
Solution:
First, transform the cutoff, x0, to unit normal:
z = (x0 - m) / s = (0.5 - 0.3) / 0.2 = 1
Next, find the value of F(z) for z = 1. From the table, F(1) = 0.8413.
Calculate the proportion of sample values above the 0.5 cutoff, P(x > 0.5), as follows:
P(x > 0.5) = 1 - P(x <= 0.5) = 1 - F(1) = 1 - 0.8413 = 0.16
Therefore, 16% of the samples in the population are > 0.5.

Lognormal Distribution
The logarithm of a random variable has a normal distribution:
f(x) = 1 / (x σ √(2π)) e^(-u) for x > 0, σ > 0
where
u = (ln x - μ)² / 2σ²
μ = mean of logarithms
σ² = variance of logarithms

10
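The same proportion can be reproduced with the standard normal CDF from Python's standard library (a sketch of the worked example above):

```python
from statistics import NormalDist

m, s, cutoff = 0.3, 0.2, 0.5
z = (cutoff - m) / s                # standardize the cutoff: z = 1.0
p_above = 1 - NormalDist().cdf(z)   # 1 - F(1) = 1 - 0.8413
print(f"{p_above:.4f}")             # 0.1587, i.e. about 16% of samples exceed 0.5
```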
Statistical Analysis
To organize, understand, and/or describe data
To check for errors
To condense information
To uniformly exchange information

Error Checking
Avoid zero for defining missing values
Check for typographical errors
Sort data; examine extreme values
Plot sections and plan maps for coordinate errors
Locate extreme values on map; isolated? Trend?
Data Analysis and Display Tools
Frequency Distributions
Histograms
Cumulative Frequency Tables
Probability plots
Scatter Plots
Q-Q plots
Correlation
Correlation Coefficient
Linear Regression
Data Location Maps
Contour Maps
Symbol Maps
Moving Window Statistics
Proportional Effect

[Figure: line-printer histogram listing showing frequencies, cumulative percentages, and asterisk bars]

11
Histogram Plot
Histograms with skewed data

12

Scatter Plots
[Figure: scatter plot of Mo versus Cu for Cu < 0.5, Mo < 0.05; best-fit line y = 8.363x + 0.049]

Quantile-Quantile plots
A straight line indicates the two distributions have the same shape
A 45-degree line indicates that the mean and variance are the same

13
Q-Q plot

Covariance
Cxy = 1/n Σ (xi - mx)(yi - my)
where
mx = mean of x values and
my = mean of y values

[Figure: scatter plot divided into quadrants at (mx, my), showing the signs of (x - mx) and (y - my) in each quadrant]

14

Covariance
C = 20.975

Correlation
r = Covxy / (σx σy)
[Figure: example scatter plots with correlation coefficients r = -0.03 and r = -0.97]

15
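The covariance and correlation formulas above can be sketched directly (population form, dividing by n as in the slide):

```python
def covariance(xs, ys):
    """C_xy = 1/n * sum((x - mx) * (y - my)), as defined above."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n

def correlation(xs, ys):
    """r = Cov_xy / (sigma_x * sigma_y), using population standard deviations."""
    sx = covariance(xs, xs) ** 0.5
    sy = covariance(ys, ys) ** 0.5
    return covariance(xs, ys) / (sx * sy)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # perfectly linear relation, so r = 1
print(covariance(xs, ys), correlation(xs, ys))
```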
Correlation Coefficient
r = -0.08

Data Location Map

Moving Window Statistics
Divide area into several local areas of the same size
Calculate statistics for each smaller area
Useful to investigate anomalies in mean and variance
Possible outcomes:
Mean and variability are both constant
Mean is constant, variability changes
Mean changes, variability is constant
Both mean and variability change

16
Proportional Effect Plot
17
Spatial Continuity
18
Variogram
Variogram Parameters

Computation 1
For the first step (h=15), there are 4 pairs:
1. x1 and x2, or .14 and .28
2. x2 and x3, or .28 and .19
3. x3 and x4, or .19 and .10
4. x4 and x5, or .10 and .09
Therefore, for h=15, we get
γ(15) = 1/(2*4) [(x1-x2)² + (x2-x3)² + (x3-x4)² + (x4-x5)²]
= 0.125 [(-.14)² + (.09)² + (.09)² + (.01)²]
= 0.125 (.0196 + .0081 + .0081 + .0001)
= 0.125 (.0359)
γ(15) = 0.00448

Computation 2
For the second step (h=30), there are 3 pairs:
1. x1 and x3, or .14 and .19
2. x2 and x4, or .28 and .10
3. x3 and x5, or .19 and .09
Therefore, for h=30, we get
γ(30) = 1/(2*3) [(x1-x3)² + (x2-x4)² + (x3-x5)²]
= 0.16667 [(-.05)² + (.18)² + (.10)²]
= 0.16667 (.0025 + .0324 + .0100)
= 0.16667 (.0449)
γ(30) = 0.00748
19
Computation 3
For the third step (h=45), there are 2 pairs:
1. x1 and x4, or .14 and .10
2. x2 and x5, or .28 and .09
Therefore, for h=45, we get
γ(45) = 1/(2*2) [(x1-x4)² + (x2-x5)²]
= 1/4 [(.14-.10)² + (.28-.09)²]
= 0.25 [(.04)² + (.19)²]
= 0.25 (.0016 + .0361)
= 0.25 (.0377)
γ(45) = 0.00942

Computation 4
For the fourth step (h=60), there is only one pair: x1 and x5. The values for this pair are .14 and .09, respectively. Therefore, for h=60, we get
γ(60) = 1/(2*1) (x1 - x5)²
= 0.5 (.14-.09)²
= 0.5 (.0025)
γ(60) = 0.00125

If we take another step (h=75), we see that there are no more pairs. Therefore, the variogram calculation stops at h=60.

20
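The four computations above can be reproduced with a short routine for regularly spaced samples (a sketch; it assumes the five values .14, .28, .19, .10, .09 sit on a line 15 m apart):

```python
def variogram(values, step):
    """Experimental semivariogram gamma(h) for samples on a regular line.

    gamma(h) = 1/(2n) * sum of squared differences of all pairs h apart,
    where h = step * lag and n is the number of pairs at that lag.
    """
    out = {}
    for lag in range(1, len(values)):
        pairs = [(values[i], values[i + lag]) for i in range(len(values) - lag)]
        out[step * lag] = sum((a - b) ** 2 for a, b in pairs) / (2 * len(pairs))
    return out

g = variogram([0.14, 0.28, 0.19, 0.10, 0.09], step=15)
for h, gamma in sorted(g.items()):
    print(h, gamma)   # gamma at h = 15, 30, 45, 60
```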
Variogram Models
Geometric anisotropy: same sill and nugget, different ranges
Zonal anisotropy: same nugget and range, different sills

21

Modeling Anisotropy
Modeling Geometrical Anisotropy: a recipe
[Figure: directional variograms at 0°, 45°, 90°, and 135°]

Other variogram types:
Normal
Relative
Logarithmic
Covariance Function
Correlograms
Indicator Variograms
Cross Variograms
22
Relative Variogram
γR(h) = γ(h) / [m(h) + c]²
c is a constant parameter used in the case of a three-parameter lognormal distribution.

Pairwise Relative Variogram:
γPR(h) = 1/(2n) Σ [(vi - vj)² / ((vi + vj)/2)²]
vi and vj are the values of a pair of samples at locations i and j, respectively.
Reduces or eliminates the impact of extreme data values on the variogram structure.

Logarithmic Variogram
Variogram using the logarithms of the data instead of the raw data:
y = ln x, or
y = ln (x + c) for a 3-parameter lognormal distribution

To transform log parameters back to normal values:
1. Ranges stay the same.
2. Estimate the logarithmic mean (α) and variance (β²). Use the sill of the logarithmic variogram as the estimate of β².
3. Calculate the mean (μ) and the variance (σ²) of the normal data:
μ = exp (α + β²/2)
σ² = μ² [exp (β²) - 1]
4. Set the sill of the normal variogram = the variance (σ²).
5. Compute c (sill - nugget) and c0 (nugget) of the normal variogram:
c = μ² [exp (clog) - 1]
c0 = sill - c

Covariance Function
C(h) = 1/N Σ [vi vj] - m-h · m+h
v1,...,vn are the data values
m-h is the mean of all the data values whose locations are -h away from some other data location.
m+h is the mean of all the data values whose locations are +h away from some other data location.
γ(h) = C(0) - C(h)

Correlogram
ρ(h) = C(h) / (σ-h · σ+h)
σ-h is the standard deviation of all the data values whose locations are -h away from some other data location:
σ²-h = 1/N Σ vi² - m²-h
σ+h is the standard deviation of all the data values whose locations are +h away from some other data location:
σ²+h = 1/N Σ vj² - m²+h

Indicator Function
i(x;zc) = 1, if z(x) < zc; 0, otherwise
where:
x is location,
zc is a specified cutoff value,
z(x) is the value at location x.
23
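Steps 2-5 of the log-to-normal back-transformation can be sketched numerically (the logarithmic mean, sill, and nugget below are assumed example values, not taken from the workbook):

```python
import math

# Assumed logarithmic-variogram parameters (hypothetical example values)
alpha = -1.0             # mean of the logarithms
beta2 = 0.5              # variance of the logarithms = sill of the log variogram
c0_log = 0.1             # nugget of the log variogram
c_log = beta2 - c0_log   # sill - nugget in log space

# Step 3: mean and variance of the normal (untransformed) data
mu = math.exp(alpha + beta2 / 2)
sigma2 = mu ** 2 * (math.exp(beta2) - 1)

# Steps 4-5: sill, c, and nugget of the normal variogram
sill = sigma2
c = mu ** 2 * (math.exp(c_log) - 1)
c0 = sill - c
print(mu, sill, c, c0)
```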
Cross Variograms
γCR(h) = 1/(2n) Σ [u(xi) - u(xi+h)] [v(xi) - v(xi+h)]
Used to describe cross-continuity between two variables
Necessary for co-kriging and probability kriging

Cross Validation
Predicts a known data point using an interpolation plan
Only the surrounding data points are used to estimate this point, while leaving the data point out
Other names: point validation, jack-knifing
Either the variance of the errors or the weighted square error (or variance) is used as the comparison criterion

Example results (actual, estimate, error):
Mean = 0.6991, 0.7037, -0.0045
Std. Dev. = 0.5043, 0.3870, 0.2869
24
The Necessity of Modeling
Suppose we have the data set below. It provides virtually no information about the entire profile.

Deterministic Models
Depend on:
Context of data
Outside information (not contained in data)

Random Variables
A random variable is a variable whose values are randomly generated according to some probabilistic mechanism.
The result of throwing a die is a random variable. There are 6 equally probable values of this random variable: 1, 2, 3, 4, 5, 6.
Since the outcomes of a R.V. are numerical values, we can define another random variable by performing mathematical operations on the outcome of a random variable.
Example: if D is the variable defined as the result of throwing a die, 2D can be the variable defined as the result of throwing the die and doubling the result.

25
Parameters of a Random Variable
The set of outcomes and their corresponding probabilities is sometimes referred to as the probability distribution of a random variable.
These probability distributions have parameters that can be summarized. Example: Min, Max, etc.
The complete distribution cannot be determined from the knowledge of only a few parameters.
Two random variables may have the same mean and variance but their distributions may be different.
26
Expected value Joint Random Variables
Covariance Independence
27
Random Functions
A R.F. is a set of random variables that have some spatial locations and whose dependence on each other is specified by some probabilistic mechanism.

Parameters of a R.F.
The set of realizations of a random function and their corresponding probabilities is often referred to as the probability distribution.
Like the histograms of sample values, these probability distributions have parameters that summarize them.
28
Random Process Assumptions Stationarity
29
Ensuring Unbiasedness
Estimated value: Z* = Σ wi Z(xi)
Estimation error: R* = Z* - Zo = Σ wi Z(xi) - Zo
Average error: r = 1/n Σ R*
Set the expected value of the average error to zero:
E{r} = E{1/n Σ R*} = 1/n Σ E{R*} = 0
To guarantee that E{r} = 0, make E{R*} = 0:
E{R*} = E{Σ wi Z(xi) - Zo}
= Σ wi E{Z(xi)} - E{Zo}
Using the strong stationarity requirement:
E{Z(xi)} = E{Zo} = E{Z}
Therefore,
E{R*} = Σ wi E{Z} - E{Z} = 0 =>
(Σ wi - 1) E{Z} = 0 => Σ wi - 1 = 0 => Σ wi = 1
The unbiasedness condition is therefore: sum of weights, Σ wi = 1.

Two limitations:
The average error is not guaranteed to be zero, only the expected value
The result is valid only if the linear combination belongs to the same statistical population
30
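A tiny numeric check of the condition above (a sketch: under strong stationarity every sample shares the same expected value m, so the expected error reduces to (Σ wi - 1) · m):

```python
def expected_error(weights, m):
    """E{R*} = sum(w_i) * m - m under a constant expected value m."""
    return sum(weights) * m - m

m = 0.85  # hypothetical stationary mean grade
print(expected_error([0.5, 0.3, 0.2], m))   # weights sum to 1 -> expected error 0
print(expected_error([0.6, 0.6, 0.2], m))   # weights sum to 1.4 -> biased estimate
```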
Inverse Distance

Kriging
Definition:
Ordinary kriging is an estimator designed primarily for the estimation of block grades as a linear combination of available data in or near the block, such that the estimate is unbiased and has minimum variance.

B.L.U.E. stands for best linear unbiased estimator:
Linear because its estimates are weighted linear combinations of available data
Unbiased since the sum of the weights adds up to 1
Best because it aims at minimizing the variance of errors
31
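Grade interpolation by inverse distance weighting, named above as an alternative to kriging, can be sketched as follows (the exponent and sample layout are assumed for illustration; MineSight's own implementation has many more options):

```python
def idw_estimate(samples, target, power=2.0):
    """Inverse distance weighted estimate at `target`.

    samples: list of ((x, y), grade) pairs; power: distance exponent.
    """
    num = den = 0.0
    for (x, y), grade in samples:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0.0:
            return grade              # at a sample location, return the sample itself
        w = 1.0 / d2 ** (power / 2)   # weight = 1 / distance^power
        num += w * grade
        den += w
    return num / den

samples = [((0, 0), 0.14), ((15, 0), 0.28), ((30, 0), 0.19), ((45, 0), 0.10)]
print(idw_estimate(samples, target=(20, 0)))   # weighted toward the nearby 0.28
```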
Error Variance
Using the R.F. model, you can express the error variance as a function of R.F. parameters:
σ²R = σ²z + ΣΣ wi wj Ci,j - 2 Σ wi Ci,o
where
σ²z is the sample variance
Ci,j is the covariance between samples
Ci,o is the covariance between samples and the location of estimation
See Isaaks and Srivastava pg 281-284

Error variance increases as the variance of the data increases
Error variance increases as the data become more redundant
Error variance decreases as the data are closer to the location of estimation

Result (the ordinary kriging system, from minimizing the error variance subject to the unbiasedness condition):
Σj (wj Ci,j) + μ = Ci,o
Σ wi = 1
32
Block Kriging (cont.)
In point kriging, the covariance matrix D consists of point-to-point covariances. In block kriging, it consists of block-to-point covariances.
The covariance values CiA are no longer point-to-point covariances like Ci0, but the average covariance between a particular sample and all of the points within the block A:
CiA = 1/A Σ Cij
In practice, A is discretized using a number of points in the x, y and z directions to approximate CiA.

Kriging Variance
σ²ok = CAA - [Σ (wi CiA) + μ]
Data independent: the kriging variance depends on the data configuration, not on the data values.
33
Effect of Scale
Effect of Shape
34
Search Strategy Search strategy (cont.)
Include at least a ring of drill holes with enough
samples around the blocks to be estimated
Declustering
Clustering in high grade area:
Naïve mean = (0+1+3+1+7+6+5+6+2+4+0+1)/12 = 3
Clustering in mean grade area:
Naïve mean = (7+1+3+1+0+6+5+1+2+4+0+6)/12 = 3

35

Declustering
Clustering in low grade area:
Naïve mean = (7+1+6+1+0+3+4+1+2+5+0+6)/12 = 3

Data with no correlation do not need declustering (pure nugget effect model)

Polygonal

36
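A simple cell-declustering weight calculation can illustrate the idea (this is a common alternative to the polygonal method named above; the grid size and sample locations are assumed):

```python
from collections import Counter

def cell_declustering_weights(points, cell):
    """Weight each sample inversely to the number of samples in its grid cell."""
    cells = [(int(x // cell), int(y // cell)) for x, y in points]
    counts = Counter(cells)
    raw = [1.0 / counts[c] for c in cells]
    total = sum(raw)
    return [w * len(points) / total for w in raw]  # normalized to average 1

# Two clustered samples share a cell, one sample is isolated
points = [(1.0, 1.0), (2.0, 2.0), (40.0, 40.0)]
weights = cell_declustering_weights(points, cell=10.0)
print(weights)   # clustered samples get lower weight than the isolated one
```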
Cross Validation
Check:
Histogram of errors
Scatter plots of actual versus estimated values
Remember:
All conclusions are based on observations of errors at locations where we do not need estimates.
37
Quantifying Uncertainty
[Figure: two sample configurations, N = 4 (M = 8.825) and N = 16 (M = 8.825), that return the same kriging variance]

Other approach
Incorporate the grade in the error variance calculation:
Relative Variance = Kriging Variance / Square of Kriged Grade
38
Change of Support
[Figure: above a cutoff of 10, one support gives N = 2 (50%) with M = 11.15; the other gives N = 5 (31%) with M = 18.6]
39
Krige's Relation (cont'd)
How to calculate σ²pb?
Integrating the variogram over a block provides the variance of points within the block:
σ²pb = γ̄(block) = 1/n² ΣΣ γ(hi,j)

Calculation of the Affine Correction
K² = σ²b / σ²p ≤ 1
From the variogram averaging:
K² = [γ̄(D,D) - γ̄(smu,smu)] / γ̄(D,D)
= 1 - [γ̄(smu,smu) / γ̄(D,D)] ≤ 1
Affine correction factor, K = √K² ≤ 1
40
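The affine correction factor can be sketched from the average variogram values (the γ̄ values below are assumed for illustration, not taken from the workbook):

```python
import math

def affine_correction_factor(gamma_bar_DD, gamma_bar_smu):
    """K = sqrt(K^2), with K^2 = 1 - gamma_bar(smu,smu) / gamma_bar(D,D)."""
    k2 = 1.0 - gamma_bar_smu / gamma_bar_DD
    return math.sqrt(k2)

# Hypothetical average variogram values for the deposit D and the SMU blocks
K = affine_correction_factor(gamma_bar_DD=0.80, gamma_bar_smu=0.30)
print(round(K, 4))   # about 0.79; K <= 1, so block variance is reduced
```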
Change of Support (other) Change of Support (other)
41
Equivalent Cutoff Calculation
zp = (σp / σsmu) zsmu + m [1 - (σp / σsmu)]
The ratio σp / σsmu is basically the inverse of the affine correction factor K.
This ratio is ≥ 1.

Numeric Example
Let the mean of composites m = 0.0445, and the specified cutoff grade zsmu = 0.055.
If the ratio σp / σsmu = 1.23, what is the equivalent cutoff grade?
zp = 1.23 (0.055) + 0.0445 (1 - 1.23) = 0.0574
42
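The numeric example can be verified directly (a sketch of the formula above):

```python
def equivalent_cutoff(z_smu, m, ratio):
    """z_p = ratio * z_smu + m * (1 - ratio), with ratio = sigma_p / sigma_smu."""
    return ratio * z_smu + m * (1.0 - ratio)

zp = equivalent_cutoff(z_smu=0.055, m=0.0445, ratio=1.23)
print(round(zp, 4))   # 0.0574, matching the worked example
```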
Simple Kriging
Z*sk = Σ wi [Z(xi) - m] + m,  i = 1,...,n
Z*sk - estimate of the grade of a block or a point
Z(xi) - sample grade
wi - corresponding simple kriging weights assigned to Z(xi)
n - number of samples
m = E{Z(x)} - location dependent expected value of Z(x)

Cokriging
Suitable when the primary variable has not been sampled sufficiently.
Precision of the estimation may be improved by considering the spatial correlations between the primary variable and a better-sampled variable.
Example: extensive data from blastholes as the secondary variable; widely spaced exploration data as the primary variable.
43
ORK matrix Nearest Neighbor Kriging
44
Indicator Kriging Indicator Kriging (applications)
45
Indicator Function at point x
The (A;zc) function
46
Indicator Variography Median Indicator Variogram
47
Affine Correction
The equation for affine correction of any panel or block is given by
Φ*v(A;z) = Φ*(A;zadj)
where
zadj = adjusted cutoff grade = K(z - ma) + ma
Use affine correction if there is a discontinuity between grade populations.

Grade Zoning
Grade zoning is usually applied to control the extrapolation of grades into statistically different populations.
Often grade zones or mineralization envelopes correspond to different geologic units.
Determine how the grade populations are separated spatially:
Is there a reasonably sharp discontinuity between the grades of the different populations?
Or is there a larger transition zone between the grades of the different populations?
48
Grade Zoning (cont'd)
Characterizing the contact between different spatial populations:
Calculate the difference between the average grades within each population as a function of distance from the contact:
Δzi = zi - z(-i)
If the average difference in grade Δzi vs. distance from the contact is more or less constant, then there is probably a discontinuity between the different populations.
49
Geostatistics Overheads Proprietary Information of Mintec, inc.
Learning Outcome
In this section you will learn:
Classical Statistics
The most common statistical operations available within MineSight are:
Histograms
Correlations
Mineral deposit data is generally not independent. It is for this reason that
geostatistics was developed.
Different geologic zones may have different statistical populations. Mixing zones
may produce incorrect statistics.
Different types of samples have different volumes and should be kept separate
for analysis, e.g., drillhole assays and bulk samples.
Although samples may be equal in size, they may not have an equal volume of
influence. Drilling tends to be closer spaced in higher grade areas, so the statistics
may indicate a higher proportion of ore than actually exists.
General Statistics
On the MineSight Compass Menu tab, <select the Group Statistics, and the
Operation Calculation; from the procedure list, select procedure p40101.dat - Statistics
(assays)>. Fill out the panels as described.
There is also information provided regarding the number of intervals excluded based
on the selection criteria input through the procedure panels:
Finally, a plottable histogram is also generated; this can be plotted using the M122MF
program, which is controlled with the MPLOT panel. The MPLOT panel can also be
used to create deferred plot files from the histogram.
Exercise 1
Use procedure p40201.dat to generate composite statistics. Use the same parameters
as in the assays. Use item length as the weighting item. Compare histograms and
statistics.
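Exercise 1's length-weighted statistics can be illustrated with a small sketch (the composite grades and lengths below are made up; MineSight's procedure computes these internally):

```python
def weighted_mean(values, weights):
    """Length-weighted mean: sum(w * v) / sum(w)."""
    return sum(w * v for v, w in zip(values, weights)) / sum(weights)

cu = [0.31, 0.52, 0.18]      # hypothetical composite CU grades
length = [15.0, 15.0, 7.5]   # composite lengths used as the weighting item
print(round(weighted_mean(cu, length), 4))
```

The short composite contributes half the weight of the full-length ones, so the result differs from the simple average.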
Exercise 2
Generate composite statistics for those composites that have ALTR = 1 and 2 only.
Hint: this can be done with a change to Panel 3.
Exercise 3
Repeat the exercise for those composites that have ROCK = 1 and 2 only.
Exercise 4
Generate lognormal composite data statistics for those composites that have ROCK =
1.
The report file tabulates the bench by bench mean and standard deviation for CU.
There is also information provided regarding the number of intervals excluded based
on the selection criteria input through the procedure panels:
The MPLOT panel will come up, allowing you the option of plotting the resulting
histogram. <Click the Exit button to exit without displaying the plot preview>.
Exercise
Generate composite Cu statistics within ALTR codes.
Probability Plots
On the MineSight Compass Menu tab, <select the Group Statistics, and the
Operation Plot; from the procedure list, select procedure p41201.dat - Probability Plot
(Assays or Composites)>. Fill out the panels as described. Notes:
Panel 1 Select File or Drillholes to Use
<Select File 11 Assays to do the cumulative probability plot, and leave the First and
last survey ref# windows blank to use all assays>.
Notes:
Exercise 1
Generate probability plots for those assays that have ALTR = 1 and 2 only. Hint: this
can be done with a change to Panel 3.
Exercise 2
Repeat the exercises using the composites.
Exercise 3
Overlay the composite probability plot on the assay probability plot and compare.
Hint: Output only the cumulative probability curve for composites and overlay it on the
full assay probability plot.
Exercise 4
Generate probability plots without using log transformation.
analysis based on item values. You can also specify an optional boundary file in this
panel. <Use the RANGE command to specify ROCK Types 1 and 2 only>.
When you close the report file, the MPLOT panel will come up, giving you the
opportunity to preview the contour data plot, send it to the plotter, or generate a deferred
plot file for later use.
Exercise
Generate In-situ statistics for assays in elevation range 2500 to 3000.
extension for both the run and report files. Leave the titling options blank for this
exercise.
There are also symbol plots for the scatter plot and histograms for each of the
specified analysis items.
When you close the report file, the MPLOT panel will come up, giving you the
opportunity to preview the scatter plot, send it to the plotter, or generate a deferred plot
file for later use.
This panel allows you to define portions of the data to include or exclude from the
analysis based on item values. You can also specify an optional boundary file in this
panel. Use the RANGE command to specify ROCK Types 1 and 2 only.
Panel 1
<Enter dat403.cu as the ASCII input file. Use std (min 0, max 1) for the y axis and the
mean (min 0, max 0.8) for the x axis. Check the apply limits to input data and free
format switches. Use cu as the file extension for the run and report files.>
Panel 2
<Std (y-axis) is the 4th column and mean (x-axis) is the 5th column in input ASCII
DATA, therefore, enter 4 and 5 respectively.>
Panel 3
<Enter a title (Mean vs. Std). Use a dash line for both best-fit and 45 degree line.>
Panel 4
Leave this panel blank.
Variograms
Prior to this section you calculated the composites. In this section you can develop
variograms of the composites. Following this, you can initialize the mine model and do
Kriging.
Learning Outcome
In this section you will learn:
The Variogram
In geostatistics, geologic samples such as assays or thickness values are not
independent samples.
Samples in proximity to one another are usually correlated to some degree. As the
distance between samples increases this degree of correlation declines until the samples
are far enough apart where they can be considered to be independent of one another.
The variogram is a graph that quantifies the spatial correlation between geologic
samples. It is a plot with the average squared assay difference between all pairs of
samples a distance h apart, γ(h), plotted along the y-axis, and the distance h plotted
along the x-axis.
Logically you would expect this squared difference γ(h) to increase as the distance h
between the sample pairs increases. Once you reach a distance where the sample pairs
are independent, the average squared difference is not related to the distance h
anymore and the curve levels off.
This distance where the samples are no longer correlated is called the range of the
variogram, and the value of γ(h) where it levels off is called the sill. Theoretically the sill
is equal to the variance of the samples.
The distance over which the samples are correlated can be and usually is different in
different directions. This is called Anisotropy and simply states that mineralization may
be more continuous in one direction than another. Therefore, variograms are computed
in different directions.
At distance h = 0 (i.e., two samples at the same location) the sample values should be
identical. In reality they usually are not. This discontinuity is described in geostatistics
as the nugget effect. Its value should be small if correct sampling and assaying
procedures are used.
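The experimental variogram described above can be sketched in Python. This is an illustrative routine, not MineSight's implementation; the function name and arguments are invented, and it uses the conventional semivariogram one-half factor — drop the 0.5 to match the plain average squared difference described in the text.

```python
import numpy as np

def experimental_variogram(coords, values, lags, tol):
    """Average squared difference between sample pairs, binned by separation.

    coords: (n, 2) array of sample locations; values: (n,) assays.
    lags: 1-D array of lag-bin centers; tol: half-width of each bin.
    Returns gamma(h) for each lag (NaN where a bin has no pairs).
    """
    n = len(values)
    gamma = np.full(len(lags), np.nan)
    counts = np.zeros(len(lags))
    sums = np.zeros(len(lags))
    for i in range(n):
        for j in range(i + 1, n):
            h = np.linalg.norm(coords[i] - coords[j])
            for k, lag in enumerate(lags):
                if abs(h - lag) <= tol:
                    # conventional semivariogram one-half factor
                    sums[k] += 0.5 * (values[i] - values[j]) ** 2
                    counts[k] += 1
    mask = counts > 0
    gamma[mask] = sums[mask] / counts[mask]
    return gamma
```

Plotting `gamma` against `lags` gives the experimental variogram curve whose range, sill, and nugget are discussed above.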
Variogram Models
The variogram model is the equation of a curve that best fits the variogram generated
with your data. Variogram models (single or nested) available in MineSight are:
Spherical
Exponential
Linear
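As a sketch of what these three model shapes look like as equations (one common parameterization; conventions differ between packages, so treat these as illustrative forms rather than MineSight's exact definitions):

```python
import numpy as np

def spherical(h, c0, c, a):
    """Spherical model: nugget c0, sill contribution c, range a."""
    h = np.asarray(h, dtype=float)
    g = np.where(h < a, c0 + c * (1.5 * h / a - 0.5 * (h / a) ** 3), c0 + c)
    return np.where(h == 0, 0.0, g)  # gamma(0) = 0 by definition

def exponential(h, c0, c, a):
    """Exponential model: approaches the sill asymptotically."""
    h = np.asarray(h, dtype=float)
    return np.where(h == 0, 0.0, c0 + c * (1.0 - np.exp(-h / a)))

def linear(h, c0, slope):
    """Linear model: no sill; gamma grows without bound."""
    h = np.asarray(h, dtype=float)
    return np.where(h == 0, 0.0, c0 + slope * h)
```

The exponential model reaches about 95% of its sill at three times the parameter `a`, which is why a later section recommends using three times the range as the search distance for exponential models.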
Notes:
H-Scatter Plots
On the MineSight Compass Menu tab, <select the Group Statistics, and the
Operation Plot; from the procedure list, select procedure p31101.dat - H-Scatter Plot>.
Fill out the panels as described.
Notes:
When you close the report file, the MPLOT panel will come up, giving you the
opportunity to preview the H-Scatter data plot, send it to the plotter, or generate a
deferred plot file for later use.
Notes:
Exercise
Generate h-scatter plots for pairs 0-50m apart in different horizontal directions.
The variogram value is under the column V(H). It is plotted along the y-axis on the
graph. The distance h is under the column DISTANCE. It is plotted along the x-axis
on the graph.
Modeling Variograms
Notes:
On the MineSight Compass Menu tab, <select the Group Statistics, and the
Operation Calculation; from the procedure list, select procedure p30002.dat -
Variogram Modeling>. Fill out the panels as described.
Exercise 1
Interactively fit a spherical variogram model to the experimental variogram by
following these steps:
Exercise 2
Interactively modify the spherical variogram model you just created by following
these steps:
2. Click on your Nugget Value (Point 1) and move it around with the mouse to a new
location. Click right to store the new location.
3. Select Edit Model again. Click on your Sill/Range Value (Point 2) and move it
around with the mouse to a new location. Click right to store the new location.
5. Point to the desired value for range on the X-axis (e.g., 500) and click.
6. Click on Point 2 and move it around. (Note that only the Sill Value is changing.)
Click right to store the new Sill.
7. Select Edit Model and Fix Sill (Make sure Fix Range is off)
8. Point to the desired value for Sill on the y-axis (e.g., .1) and click.
9. Click on Point 2 and move it around. (Note that only the Range is changing.) Click
right to store the new Range
Exercise 3
Try an exponential model and compare the fit (visually) with the spherical.
Exercise 4
Try to model the directional variogram at 0, 45, 90, and 135 degrees.
Exercise 5
If the directional variograms are difficult to model, add an absolute lag tolerance of
25. Don't use composite values above 3. Use extension 002.
Exercise 6
Try a different horizontal angle increment (30). Use extension 003.
Exercise 7
Try to compute variograms using different vertical angle orientations. Compare
horizontal variograms to the vertical ones. Use extension 004.
Exercise 8
Compute variograms using different variogram type options, such as correlogram.
Use extension 005.
Exercise 9
Run variograms (as in Exercise 5) for rock type 2. Use extension 006.
Notes:
Exercise 10
Run variograms for different windowing angles and/or band widths. Check reports
and see how the number of pairs used changes.
In the report file, a summary appears for each variogram calculated, as well as a
summary for a combined variogram.
The variogram value is under the column V(H). It is plotted along the y-axis on the
graph. The distance h is under the column DISTANCE. It is plotted along the x-axis
on the graph.
On the following page is the plot of the variogram points of the combined
variograms.
Notes:
Exercise 1
Model the combined down-hole variogram. Pick up a nugget value.
Exercise 2
Calculate and model a down-hole variogram for rock type 2. Pick up a nugget value.
Exercise 3
Generate down-hole variograms using composite data.
           Minimum   Maximum
Easting       2250      2750
Northing      5250      5750
Notes:
Exercise 1
Overlay an ellipse on the variogram contours using a 200m major axis and a 125m
minor axis. What is the major axis orientation? Adjust lengths until the ellipse fits the
contours. If you find an orientation for which you don't have variograms, rerun the
variogram programs for the new orientation.
Exercise 2
Repeat for rock type 2.
Having studied the individual variograms, down-the-hole variograms, and the contour
maps for each rock type, decide on one set of variograms.
For example:
Notes:
Rock type   Nugget   Sill   Range (major)   Range (minor)   Model type
Assume for the vertical axis the same ranges as the minor axis. If you use an
exponential model, use three times the range as search distances in the interpolation
programs.
Results
The resulting variogram file will be brought up on the screen.
Exercise 1
Set up variogram parameter file for Rock Type 2.
Exercise 2
Set up variogram parameters for both Rock Types 1 and 2 in the same file. Hint:
specify the geology label as ROCK in the first panel.
Notes:
Learning Outcome
In this section you will learn how to use point validation for variogram evaluation
Interpolation Controls
There is a large range of parameters for controlling the point interpolation.
Point interpolation program M524V1 outputs the results for each composite used to
an ASCII file. These results are evaluated using program M525TS and the statistical
summaries are output to the report file.
Point Validation
From the MineSight Compass Menu tab, <select the Group Statistics, and the
Operation Calculation; from the procedure list, select procedure p52401.dat - Point
Validation>. Fill out the panels as described.
<Use major, minor, and vertical search distances of 210, 120, and 120, respectively. Enter
MEDS for the rotation angle specification and check the box to Use anisotropic
distances?>
Notes:
Results
This section of the report shows summary statistics for actual composite grades versus
the results from different interpolations.
This section of the report (on the next page) shows the statistics of the differences
between the actual and kriging values. The histogram shown is the histogram of the errors.
Notes:
This section of the report file (on the next page) shows correlation statistics between
the actual and kriging values.
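The comparison the report performs — bias, spread of the errors, and correlation between actual and estimated values — can be sketched as follows (an illustrative function, not program M525TS itself):

```python
import numpy as np

def validation_summary(actual, estimated):
    """Point-validation error statistics: errors are estimate minus actual."""
    actual = np.asarray(actual, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    errors = estimated - actual
    return {
        "mean_error": errors.mean(),          # bias; near zero is good
        "std_error": errors.std(ddof=1),      # spread of the errors
        "correlation": np.corrcoef(actual, estimated)[0, 1],
    }
```

A higher nugget in the variogram model tends to lower the correlation and raise the standard error, which is exactly what the exercise below asks you to verify.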
Notes:
Exercise
Modify the variogram parameter file. Use nugget of 0.02. Rerun the point validation
procedure. Compare results (you should get a lower correlation and a higher standard
error).
Declustering
Notes:
Prior to this section you calculated the composites and sorted statistics. In this section
you will use the cell declustering technique to decluster the composite data. This is not
required for later work.
Learning Outcome
In this section you will learn:
Declustering
There are two declustering methods that are generally applicable to any sample data
set. These methods are the polygonal method and the cell declustering method. In both
methods, a weighted linear combination of all available sample values is used to
estimate the global mean. By assigning different weights to the available samples, one
can effectively decluster the data set.
In this section you will be using the cell declustering method, which divides the entire
area into rectangular regions called cells. Each sample receives a weight inversely
proportional to the number of samples that fall within the same cell. Clustered samples
will tend to receive lower weights with this method because the cells in which they are
located will also contain several other samples.
The estimate one gets from the cell declustering method will depend on the size of the
cells specified. If the cells are very small, then most samples will fall into cells of their
own and will therefore receive equal weights of 1. If the cells are too large, many
samples will fall into the same cell, thereby causing artificial declustering of samples.
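A minimal sketch of the cell declustering weights described above (illustrative code, not the MineSight routine; the cell indexing assumes 2-D data):

```python
import numpy as np

def cell_declustering_weights(coords, cell_size):
    """Weight each sample inversely to the number of samples in its cell.

    coords: (n, 2) array of sample x/y; cell_size: (dx, dy).
    Weights are normalized to sum to n, so the declustered mean is simply
    a weighted average of the sample values.
    """
    coords = np.asarray(coords, dtype=float)
    ix = np.floor(coords[:, 0] / cell_size[0]).astype(int)
    iy = np.floor(coords[:, 1] / cell_size[1]).astype(int)
    cells = list(zip(ix, iy))
    counts = {c: cells.count(c) for c in set(cells)}
    w = np.array([1.0 / counts[c] for c in cells])
    return w * len(w) / w.sum()
```

The declustered mean is then `np.average(values, weights=cell_declustering_weights(coords, (50, 50)))`; repeating this over a range of cell sizes reproduces the cell-size-versus-mean graph of Exercise 2.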
Results
Notes:
The report file shows summary statistics for the original and the declustered samples.
Exercise 1
Obtain declustered data using cell sizes 45 x 45 and 40 x 40.
Exercise 2
Create a graph of the cell sizes vs. mean values. The cell size that gives the lowest
declustered mean should be the best choice.
Exercise 3
Try declustering using Rock Type 1 only.
Notes:
When you exit the report file, the MPLOT panel will appear, giving you the option to
preview, plot directly to the plotter, or generate a deferred plot file for later use.
<Preview the plot, then close the preview with the X and Exit the MPLOT panel>.
Learning Objective
In this section you will learn:
Types of Interpolations
There are several methods of interpolation provided to you.
Polygonal assignment
Relative elevations
Trend plane
Gradients
Kriging
Interpolation Controls
There is a large range of methods for controlling the interpolation available.
IDW Interpolation
On the MineSight Compass Menu tab, <select the Group 3D Deposit Modeling,
and the Operation Calculation; from the procedure list, select the procedure p62001 -
IDW Interpolation.> Fill out the panels as described.
Results
The results of the interpolation are saved to the model file 15; to view the results,
create a model view in MineSight 3-D.
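For reference, the inverse distance weighting calculation itself can be sketched for a single block; the power and maximum search distance shown are illustrative defaults, not MineSight's:

```python
import numpy as np

def idw_estimate(block_xy, comp_xy, comp_val, power=2.0, max_dist=240.0):
    """Inverse-distance-weighted estimate of a block from nearby composites."""
    d = np.linalg.norm(np.asarray(comp_xy, float) - np.asarray(block_xy, float), axis=1)
    use = d <= max_dist
    if not use.any():
        return np.nan                  # no composites inside the search
    d = np.maximum(d[use], 1e-10)      # guard against a composite exactly on the block
    w = 1.0 / d ** power
    return float(np.sum(w * np.asarray(comp_val, float)[use]) / w.sum())
```

Closer composites dominate the estimate as the power increases; the search distance plays the same screening role as the ellipsoidal search parameters in the panels.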
Exercise
Rerun for Rock Type 2. Change search distances and use option omit. Change the
following panels:
Again, the results for the interpolation of Rock type 2 can be checked by creating a
Model View in MineSight 3-D.
Ordinary Kriging
Ordinary kriging is an estimator designed primarily for the local estimation of block
grades as a linear combination of the available data in or near the block, such that the
estimate is unbiased and has minimum variance. It is a method that is often associated
with the acronym B.L.U.E. for best linear unbiased estimator. Ordinary kriging is
linear because its estimates are weighted linear combinations of the available data,
unbiased since the sum of the weights is 1, and best because it aims at minimizing the
variance of errors.
The conventional estimation methods, such as inverse distance weighting method, are
also linear and theoretically unbiased. Therefore, the distinguishing feature of ordinary
kriging from the conventional linear estimation methods is its aim of minimizing the
error variance.
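The ordinary kriging system that produces those minimum-variance weights can be sketched as follows. This is the textbook formulation for a single point, with an invented function name; it is not the M624V1 implementation:

```python
import numpy as np

def ordinary_kriging_weights(comp_xy, block_xy, gamma):
    """Solve the ordinary kriging system for one point/block.

    gamma(h) is a variogram model callable. The extra row and column enforce
    sum(weights) = 1 via a Lagrange multiplier, which is what makes the
    estimate unbiased. Returns (weights, lagrange_multiplier).
    """
    comp_xy = np.asarray(comp_xy, dtype=float)
    block_xy = np.asarray(block_xy, dtype=float)
    n = len(comp_xy)
    A = np.ones((n + 1, n + 1))
    A[n, n] = 0.0
    for i in range(n):
        for j in range(n):
            A[i, j] = gamma(np.linalg.norm(comp_xy[i] - comp_xy[j]))
    b = np.ones(n + 1)
    b[:n] = [gamma(np.linalg.norm(c - block_xy)) for c in comp_xy]
    sol = np.linalg.solve(A, b)
    return sol[:n], sol[n]
```

The estimate is the dot product of the weights with the composite grades; unlike inverse distance weighting, the weights account for redundancy between clustered composites through the left-hand-side variogram matrix.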
Spherical
Linear
Exponential
On the MineSight Compass Menu tab, <select the Group 3D Deposit Modeling, and
the Operation Calculation; from the procedure list, select procedure p62401.dat -
Ordinary Kriging>. Fill out the panels as described.
Notes:
Panel 2 - M624V1: Kriging Search Parameters
<Specify a 3-D search to use all composites within 210m (based on variograms)
horizontally and 50m vertically of a block. Specify that the closest composite must be
within 100 meters, and allow a minimum of 3 composites and a maximum of 16>.
The results of the interpolation are saved to the file 15 - you can check the results
visually by creating a Model View in MineSight 3-D.
Exercise
Repeat calculations for Rock Type 2. Use variograms calculated in Section 2. Change
search distances as you did for IDW.
find out how small changes in search parameters affect your interpolation.
Examine Results
<Open file rpt624.dbg.> This is a list of the composites used:
Note all of the composites are well inside the 240m maximum search distance.
Distances reported are adjusted by anisotropy. If you want to see the real distances,
rerun procedure without the anisotropic option on. You should now notice that the
order of the composites has changed.
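One way the anisotropy adjustment can work is to scale each axis separation by its search range before taking the distance; the sketch below assumes no rotation and uses the 210/120/120 ranges from the panel purely as illustrative defaults:

```python
import numpy as np

def anisotropic_distance(dx, dy, dz, ranges=(210.0, 120.0, 120.0)):
    """Separation scaled by the search-ellipsoid ranges (rotation omitted).

    Each axis separation is divided by its range, so a scaled value of 1.0
    means "on the ellipsoid surface"; multiplying by the major range
    expresses the adjusted distance back in metres.
    """
    r = np.array(ranges, dtype=float)
    return float(np.sqrt((dx / r[0]) ** 2 + (dy / r[1]) ** 2 + (dz / r[2]) ** 2) * r[0])
```

Under this scaling a composite along the minor axis reports a larger adjusted distance than its true distance, which is why the composite ordering changes when the anisotropic option is turned off.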
Exercise 2
<Rerun procedure. This time apply a max distance of 120m. Change the ellipsoidal
search distances to 120x90x90.> What do you notice?
Exercise 3
<Rerun procedure. This time apply a max distance to the closest composite equal to
50m.> What do you notice?
Notes:
Exercise 4
<Rerun procedure. Change max distance to the closest composite back to 100m. This
time apply max number of composites to be used equal to 7.> Check report and study
results:
Exercise 5
<Rerun procedure using the outlier options from the last panel. Use an outlier value
of 0.70. Use max distance for outliers equal to 75.> Open report:
It seems like there are no more composites available. <To add one more (7 total)
increase max distance to 240m as well as the ellipsoidal searches to 240x180x180.>
Exercise 6
<Rerun procedure using the octant/quadrant search options.>
Ellipsoidal Search
Notes:
Exercise
<Run procedure p624db.dat to build some ellipses and view them in MineSight 3-D.>
Do not use octant/quadrant options.
In MineSight 3-D, you can import the ellipse as a MineSight object (import file
ellips.msr). Adjust the properties of the object if needed.
Learning Outcome
In this section you will learn:
Kriging
Interpolation Controls
There is a large range of parameters for controlling the point interpolation.
Point interpolation program M524V1 outputs the results for each composite used to
an ASCII file. These results are evaluated using program M525TS and the statistical
summaries are output to the report file.
Notes:
Point Validation
On the MineSight Compass Menu tab, <select the Group Statistics, and the
Operation Calculation; from the procedure list, select procedure p52401.dat - Point
Validation>. Fill out the panels as described.
Results
This report (on the next page) shows summary statistics for actual composite grades
versus the results from different interpolations.
Notes:
This section of the report shows the statistics of the differences between the actual and
kriging values. The histogram shown is the histogram of the errors.
This section of the report file shows correlation statistics between the actual and
kriging values.
Notes:
This section of the report file shows correlation statistics between the actual and
inverse distance values.
Exercise
Change some of the search parameters and rerun the above procedure. What do you
observe?
Learning Outcome
In this section you will learn:
Model Statistics
From the MS Compass Menu tab, <select the Group Statistics, and the Operation
Calculation; from the procedure list, select procedure p60801.dat - Statistics (Model)>.
Fill out the panels as described.
This section of the report file shows the tonnage and grade of CUID values at each
bench. The grade is reported at whatever minimum value is specified on Panel 3.
Notes:
When you close the report file, the MPLOT panel will come up, giving you the
opportunity to preview the contour data plot, send it to the plotter, or generate a deferred
plot file for later use.
Notes:
Exercise
Generate model statistics for items from the IDW, polygonal, and Kriging methods
separately. Use the same cutoff intervals. We will use the data output files from each run
to make grade/tonnage plots. Use filename extensions cui, cup, and cuk, respectively.
Grade/Tonnage Plots
From the MineSight Compass Menu tab, <select the Group Statistics, and the
Operation Plot; from the procedure list, select procedure pgtplt.dat - Grade/Tonnage
Plots>. Fill out the panel as described.
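The grade/tonnage numbers behind such a plot amount to a cumulation above each cutoff; a minimal sketch (illustrative, not the pgtplt.dat logic):

```python
import numpy as np

def grade_tonnage(block_grades, block_tonnes, cutoffs):
    """Tonnage and mean grade above each cutoff, for a grade/tonnage plot."""
    g = np.asarray(block_grades, dtype=float)
    t = np.asarray(block_tonnes, dtype=float)
    rows = []
    for c in cutoffs:
        above = g >= c
        tons = t[above].sum()
        grade = (g[above] * t[above]).sum() / tons if tons > 0 else 0.0
        rows.append((c, tons, grade))
    return rows
```

Running this on the IDW, polygonal, and Kriging items over the same cutoffs is what lets the three curves be compared on one plot.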
Notes:
Notes:
This section of the report file has a bench-by-bench breakdown of CUID values.
When you exit the report file, the MPLOT panel will appear, giving you the option to
preview, plot directly to the plotter, or generate a deferred plot file for later use.
<Preview the plot, then close the preview with the X and Exit the MPLOT panel>.
Notes:
Model Correlations
<On the MineSight Compass Menu tab, select the Group Statistics, and the Operation
Calculation; from the procedure list, select procedure p61801.dat - Correlations
(Model)>. Fill out the panels as described.
Notes:
When you exit the report file, the MPLOT panel will appear, giving you the option to
preview, plot directly to the plotter, or generate a deferred plot file for later use.
<Preview the plot, then close the preview with the X and Exit the MPLOT panel>.
Exercise
Calculate the correlations between polygonal and Kriging grades. What do you
observe?
Notes:
Probability Plots
On the MineSight Compass Menu tab, <select the Group Statistics, and the
Operation Plot; from the procedure list, select procedure p61901.dat - Probability Plot
(Model)>. Fill out the panels as described.
Notes:
Exercise
Create separate probability plots for Rock Types 1 and 2. Compare results.
Learning Outcome
In this section you will learn how to perform calculations using information stored in
the model.
Model Calculations
<From the MineSight Compass Menu tab, select the Group Statistics, and the
Operation Calculation; from the procedure list, select procedure p61201.dat - User-Calcs
(Model)>. Fill out the panels as described.
The results are stored directly to the specified item in the model file. To check the
results visually, simply create a Model View in MineSight 3-D.
Notes:
Learning Outcome
In this section, you will learn how to quantify your confidence in the results of the
block model calculations.
Kriging variance,
A distance of 39m corresponds more or less to 25% of the model. Fifty percent of the
model was assigned distances up to 57m, and 75% up to 77m. The distances are not true
distances (they are anisotropic).
On the MS Compass Menu tab, <select the Group 3D Deposit Modeling, and the
Operation Calculation; from the procedure list, select procedure p61201.dat - User-
Calcs (Model)>. Fill out the panels as described.
These values for ZONE will be used to define proven ore (ZONE = 1 or 2), probable
ore (ZONE = 3), and possible ore (ZONE = 4).
Exercise
Make a model view of item ZONE in MS2 to check the results of the code
assignment.
Kriging Variance
Make a model view of the item CUKVR as it was calculated by running procedure
P62401.DAT. Use cutoffs of 0.039, 0.055 and 0.087 (quartiles).
Notes:
Notes:
Learning Outcome
In this section you will learn:
Change of Support
The term support at the sampling stage refers to the characteristics of the sampling
unit, such as the size, shape and orientation of the sample. For example, channel
samples and diamond drillcore samples have different supports. At the modeling and
mine planning stage, the term support refers to the volume of the blocks used for
estimation and production.
It is important to account for the effect of the support in our estimation procedures,
since increasing the support has the effect of reducing the spread of data values. As the
support increases, the distribution of data gradually becomes more symmetrical. The
only parameter that is not affected by the support of the data is the mean. The mean of
the data should stay the same even if we change the support.
Global Correction
There are some methods available for adjusting an estimated distribution to account
for the support effect. The most popular ones are affine correction and indirect
lognormal correction. All of these methods have two features in common:
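The affine correction, the simpler of the two, shrinks each value toward the mean by the square root of a variance adjustment factor; a minimal sketch (the factor f = block variance / point variance is supplied by the caller here):

```python
import numpy as np

def affine_correction(point_grades, variance_ratio):
    """Affine change of support: shrink values toward the mean.

    variance_ratio f = block variance / point variance (0 < f <= 1).
    The corrected values keep the same mean but have variance f * var(z),
    matching the reduced spread expected of larger blocks.
    """
    z = np.asarray(point_grades, dtype=float)
    m = z.mean()
    return m + np.sqrt(variance_ratio) * (z - m)
```

Because the transform is linear, the shape of the distribution is preserved; the indirect lognormal correction, by contrast, also makes the distribution more symmetrical.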
Results
The output report file summarizes the following:
Exercise 1
Change block size to 10x10 and re-run the procedure. What change do you see in the
block variance?
Exercise 2
Change block discretization to 10x10x5 and see the effect on the block variances of
20x20 blocks.
Exercise 3
If you have another variogram parameter file, try running the procedure with it. What
do you observe?
Results
Look into the file BLOCK.DAT using Notepad or another text editor. The first
column in this file displays the theoretical block grades after the change of support
correction is made. The second column contains the original data as input.
Notes:
When you exit the report file, the MPLOT panel will appear, giving you the option to
preview, plot directly to the plotter, or generate a deferred plot file for later use. Preview
the plot, then close the preview with the X and Exit the MPLOT panel.
Results
Notes:
The report file (rpt508.lvc) contains a summary of the Volume-variance correction
results.
Exercise 1
Run stats on item CUBLK to look at the new distribution. Use extension AFF.
Exercise 2
Run the Volume-Variance Correction using the indirect lognormal method. Use 0.72
for the coefficient of variation. (Store back to CUBLK, run statistics, use extension
ILM.)
Exercise 3
Combine all histograms in one plot:
On the MineSight Compass Menu tab, select the Group Plotting, and the Operation
Plot; from the procedure list, select procedure anyplt.dat - Plot any USERF/DATAF. Fill
out the panels as described.
Plotting Panel
Enter scale and x,y limits (plotting units, not project). Plotting files are USERF. Use
appropriate shift commands.
Results
The plot should look like this:
Notes:
If you run procedure psblkv.dat for this size of block, you should get a block variance of
0.19157; therefore, the correction factor should be 0.19157/0.17699 = 1.08. The average
model grade for Rock Type 1 is 0.6917 and the c.v. is 0.5025.
Notes:
Results
Exercise 1
Run stats on item CUKGG to look at the new distribution.
Exercise 2
Plot grade tonnage curves of CUKGG and CUKRG items to compare the original and
adjusted grade distribution.
Learning outcome
In this section you will learn about outlier restricted kriging and how to set it up in
MineSight.
<Make a CU probability plot and determine the outlier value. Use only benches 2540
to 2600.> For the next steps we are going to assume a cutoff grade for outliers equal to
1.2 (there seems to be a deviation from a straight line in the probability plot around this
grade).
Calculate indicators
Notes:
In order to calculate the probability of each block being above grade 1.2, we are going
to assign indicator values to the composites: 0 if the grade is 1.2 or below,
1 otherwise.
This step requires that you have an additional item in file 9 to store the indicators, with
min = 0, max = 1 (or more), and precision = 1.
<Use item ROCKX. For benches 2540 to 2600 assign value zero to item ROCKX (use
rock types 1 and 2). Then assign value ROCKX = 1 for Cu > 1.2.>
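The indicator assignment above amounts to a simple threshold; a one-line sketch (illustrative, using the 1.2 cutoff from the text):

```python
import numpy as np

def indicator(cu, cutoff=1.2):
    """Indicator transform: 1 where the grade exceeds the cutoff, else 0."""
    return (np.asarray(cu, dtype=float) > cutoff).astype(int)
```

Interpolating these 0/1 values (by IDW here, or by kriging in the next section) yields, for each block, an estimate between 0 and 1 that can be read as the probability of exceeding the cutoff.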
Calculate probabilities
Assign the probability of occurrence of outlier grades to the blocks. This step
requires that you have an additional item in file 15 to store the probabilities. The item
should be initialized with min =0, max =1, precision = 0.01 or 0.001. Use Inverse
Distance Weighting to assign the probabilities.
Use the same search parameters used in previous runs. Add one more item to
interpolate (CUIND using ROCKX). Interpolate only benches 24 to 28. Use OR1 and
OR2 as extensions for run files and reports.>
Perform ORK
<Select Group 3D-Modeling, Operations Calculations. Run procedure p62401.dat
Ordinary Kriging
Use the same search parameters used in previous runs. Replace model item to
interpolate to CUPLY. Interpolate only benches 24 to 28. Use OR1 and OR2 as
extensions for run files and reports. Add the ORK options in the last panel (use item
ROCKX from composite file and item CUIND from model).>
Learning outcome
In this section you will learn:
- How to calculate the probability of a block having a grade value above the Cutoff
Indicator Kriging
The basis of the technique is transforming the composite grades to a (0 or 1) function.
All composite grades above cutoff can be assigned a code of 1 whereas all the
composites below can be assigned a code of 0. Then a variogram can be formed from
the indicators which can be used for Kriging the indicators. The resulting Kriging
estimate represents the probability of each block having a grade value above the cutoff.
Assign Indicators
On the MineSight Compass Menu tab, <select Group Composites, Operation Type
Calculations. Run procedure p50801.dat -User-Calcs (comps) to assign indicators to
item altrx.>
Variogram of Indicators
On the MineSight Compass Menu tab, <select Group Composites, Operation type
Calculation. Run Procedure p30302.dat-Variograms (comp).>
Krige Indicators
Notes:
On the MineSight Compass Menu tab, <select Group 3D Deposit Modeling | Operations Type
Calculations. Run procedure p62401.dat - Ordinary Kriging.>
Panels 6, 7, 8
Leave those panels blank.
Notes:
Learning Outcome
In this section you will learn:
Overview
Multiple Indicator Kriging (M.I.K.) is a technique developed to overcome the
problems with estimating local recoverable reserves. The basis of the technique is the
indicator function which transforms the grades at each sampled location into a [0,1]
random variable. The indicator variograms of these variables are estimated at various
cutoff grades. The technique consists of estimating the distribution of composite data.
The distribution is then corrected to account for the actual selective mining unit (SMU)
size. This yields the distribution of SMU grades within each block. From that
distribution, recoverable reserves within the block can be retrieved. Accumulation of
recoverable reserves for these blocks over a volume gives the global recoverable
reserves for that volume.
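A hedged sketch of the final accumulation step: given the estimated probabilities above each indicator cutoff and a mean grade per indicator class, the recoverable fraction and grade above a mining cutoff fall out directly. The function and its inputs are invented for illustration and assume the mining cutoff coincides with one of the indicator cutoffs:

```python
import numpy as np

def recoverable_above_cutoff(cutoffs, probs_above, class_means, mining_cutoff):
    """Recoverable fraction and grade above a mining cutoff from MIK output.

    cutoffs: indicator cutoffs (ascending); probs_above[k]: estimated
    probability the grade exceeds cutoffs[k]; class_means[k]: mean grade of
    the class between cutoffs[k] and cutoffs[k+1] (last class is open-ended).
    """
    k = list(cutoffs).index(mining_cutoff)
    p = np.asarray(probs_above, dtype=float)
    m = np.asarray(class_means, dtype=float)
    # probability mass of each class above the cutoff: p[k]-p[k+1], ..., p[-1]-0
    mass = np.diff(np.append(p[k:], 0.0)) * -1.0
    tonnage_frac = p[k]
    grade = float((mass * m[k:]).sum() / tonnage_frac) if tonnage_frac > 0 else 0.0
    return tonnage_frac, grade
```

Summing these per-block results over a volume gives the global recoverable reserves described above; in practice the block distribution is first corrected to SMU support before this step.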
Uses of Indicators
Indicators can be used to:
estimate distributions
Incremental Statistics
Notes:
<On the MineSight Compass Menu tab, select Group Statistics, and the operation
Calculation. Run procedure p40201.dat - Statistics (composites).
Run statistics on item CU, using 100 increments of 0.02. Use extension IK1 for
reports and run files. Check the option not to accumulate the intervals. Do that only for
rock type 1.>
Results:
Notes:
This procedure is used to analyze where the M.I.K. cutoffs should be for a given
grade distribution and the number of M.I.K. cutoffs specified. The basis of analysis is
the metal contained in each indicator class. It is possible that the user may try this
procedure several times until he or she is satisfied with the results.
Results:
Notes:
Cutoffs: 0.42 0.56 0.68 0.78 0.88 0.98 1.12 1.30 1.46 1.66
In the report file, a summary appears for each variogram calculated. This is a 3-D
omni-directional variogram for the first indicator cutoff (IND1).
Notes:
Use the output from previous procedure as input (dat303.mik). Program M300V1
will display on the screen a list of the 2 directional variograms plus the 2-D Global
variogram and the 3-D Global variogram for each indicator. Click on 3-D Global on
each of the 10 indicators and then on Exit Panel.
Notes:
Notes:
Cutoff   Mean     Variogram type   Nugget   Sill-nugget   Range
0.42     0.1907   3                0.05     0.20          75
0.56     0.4802   3                0.05     0.20          75
0.68     0.6132   3                0.05     0.20          63
0.78     0.7241   3                0.06     0.18          63
0.88     0.8232   3                0.06     0.17          57
0.98     0.9199   1                0.05     0.15          51
1.12     1.0279   1                0.03     0.13          41
1.30     1.2050   1                0.03     0.10          41
1.46     1.3727   1                0.03     0.06          41
1.66     1.5539   1                0.02     0.03          32
MIK Reserves
Notes:
<On the MineSight Compass Menu tab, select Group MIK, Operations Calculations.
Run procedure p60801.dat - Statistics (model).>
Learning Outcome
In this section you will learn:
The program can be run from procedure p62101.dat. The procedure can be found in
MineSight Compass Menu, under Group 3D Modeling | Operations Calculations.
Exercise
<Run procedure p62101.dat.> Try to use the same interpolation parameters as you
did with Inverse distance and Kriging methods.
- the plane must pass through the function value at the point in question (i.e., through
the Z (grade) value), and
- the angles the plane makes with vectors or lines to all of the various points in
the neighborhood must be minimized.
The angles are weighted by a function of how far or near the various neighboring
points are from the point of interest. After the tangent planes are generated, block values
are calculated from neighboring composites (now with gradients). The user needs to
specify how many neighbors are used in this calculation. The gradient information for
each composite is evaluated at the block location and the calculation of the block value
is weighted by the distance from the block to each composite.
Exercise
<Run procedure p62501.dat under Group 3D Modeling | Operation
Calculations.> Try to use the same search parameters as in previous methods.
Notes: