Explor. Mining Geol., Vol. 7, Nos. 1 and 2, pp. 117-127, 1998
© 1999 Canadian Institute of Mining, Metallurgy and Petroleum.
All rights reserved. Printed in Canada.
0964-1823/98 $17.00 + .00
Practical Quality Control Procedures in Mineral Inventory Estimation
SCOTT D. LONG
Mineral Resources Development Inc.
San Mateo, California, U.S.A.
Abstract: Grass roots exploration has an emphasis on good precision at low (sub-economic) concentrations of an element, in order to identify "anomalies" which may lead to drilling targets.
Although it is wise to have well-defined quality controls in place during early stage sampling, the
control procedures at this stage tend to be rudimentary. When a deposit warrants detailed delineation,
the need for accurate geological characterization and assaying of ore-grade rock must be supported
by appropriate quality control procedures.
As a project returns more promising results, and the likelihood of the project warranting a pre-
feasibility study increases, it is time to tighten quality assurance requirements and quality control
protocols. Delays in instituting adequate quality control and verification procedures may lead to
costly remedial actions that can also delay completion of the prefeasibility or feasibility study. Some-
times, as a project matures, the quality control and quality assurance aspects fail to keep pace with
the changing nature of the work: an increasing emphasis on precise and accurate assaying of ore-
grade rock.
A resource model is like a house of cards. The foundation of the house is the geological obser-
vations, the sampling methods and the geological interpretations performed upon those samples.
Upon the sampling, which includes all the sub-sampling steps of sample preparation, rest the assay
results. The quality of the assays can be no better than the quality of the sampling. Good assaying technique will preserve all biases and imprecisions introduced during sampling and sample preparation; it cannot remove them. If the sub-samples and/or assays are bad and must be redone, all the
succeeding levels of the house of cards must also be rebuilt, or at least patched up.
From geological observations, a geological interpretation is built, the quality of which is critical
to the construction of an acceptable resource model. Geological elements are difficult to assess, and
are not discussed here; they are dependent upon the expertise of the geologists involved. Geological
knowledge contributes essential quality control elements to resource/reserve determinations (Sinclair and Postolski, 1999; Postolski and Sinclair, 1999).
This paper consists of five sections: (1) overview of quality control protocols, (2) maintaining
quality control of databases, (3) quality control of drill sampling, (4) optimizing crushing and grinding requirements, and (5) quality control of assay data acquisition. © 1999 Canadian Institute of Mining, Metallurgy and Petroleum. All rights reserved.
so that they are readily accessible to technicians performing the work. This helps maintain consistency in the procedures when there are changes in the workforce.

Thirdly, traceability is essential and it applies on several levels. Anonymity erodes quality. This applies as much to who checks the rejects and pulps for grind quality as it does to who maps, logs the drill core or drill chips or handles the samples. A common problem encountered with geologic data is inconsistent mapping and logging. To monitor this, it is worthwhile to include the identity (typically the initials) of the geologist as one of the fields entered into any database table of geologic data. Maintaining a reference set of drill core samples which show all or most of the rock-alteration combinations encountered during drilling, and a training set of portions of two or three drill core and/or drill chip holes, are practical tools for maintaining uniformity of quality.

A project may pass through several companies before it is brought to feasibility. Procedural changes should be documented as clearly as possible. The new project manager should strive (make reasonable efforts) to acquire all relevant quality control and quality assurance documentation from previous drilling campaigns, as well as ensure adequate correlation between geologists. Too often, these details are overlooked in the transfer of data from one party to another. The lack of traceability information may jeopardize the use of these data in subsequent resource models, pending geological correlation, verification by check assays on stored rejects, or twinning old drill holes. Rude surprises await the unwary.

Maintaining Quality Control of Databases

Drilling projects usually rely upon relational databases to store and manage their data. Some of the better-known personal computer databases are ACCESS, DBASE, PARADOX and INFORMIX. Older software programs include FOXPRO and RBASE. There are some more powerful software packages such as ORACLE. Some geologic packages, such as TECHBASE, also contain relational databases. These programs are all designed to keep data in order, match up data from different sources using unique identifiers, answer questions about the data, and provide the capability to edit or change the data.

Database Entry

Database entry and reporting forms can be avoided by importing and exporting selected data from and to spreadsheets. A good flow of analytical data consists of receiving computer files from the laboratory (attested by them to be identical to the printed laboratory certificate) provided in spreadsheet format (such as LOTUS, EXCEL or QUATTRO-PRO). These files can be saved as back-up files and edited copies can be loaded into the relational database. The spreadsheets typically are named with the certificate name, with the original computer files stored with the same care as the original assay certificates.

Whenever data are transferred from one software package to another, it is not sufficient to rely on built-in check mechanisms. In addition, basic tools such as checksums for columns or strings of data, or checksums with rank order weighting, should be used to ensure transfer of data without errors. If data are entered from a keyboard rather than received electronically, they should be entered twice (double-entry) by two different individuals, and the two sets of data cross-matched in a relational database to trap all errors. Data should be entered from originals, not faxes or poor-quality photocopies where legibility may be compromised. The same goes for any critical geological information used to construct the resource model. If hand-entered data are thought to be virtually error-free, check a random selection of three percent for errors. A quality standard for data entry is less than one error in 10 000 keystrokes.

Most projects do not adequately compile in their relational database all the information needed to easily evaluate the quality of the data and to manage that data. In the case of assay certificates, essential information in the header of the certificate, such as date, analytical method, and what samples were analyzed together, usually is not preserved. All that is necessary to do so is to add a field (column) to record the certificate name, which is always unique for a particular laboratory. For even more detail (convenient if the majority of assay reports are large and the sample numbers are not well-ordered within it), the line number of each sample number can be recorded in a second field; this step is easily accomplished using the "data fill" menu function in a computer spreadsheet.

A separate relational database table can then be maintained which contains all the header information from each certificate. With this structure in place, it is possible to arrange data accurately by time sequence and to group data having different detection limits for a particular element, different assay procedures, or different laboratories. Locating the hard copy of particular results becomes simple. Adding this step, which can be done in a few seconds in a spreadsheet by copying the certificate name into a column for all rows of assay results, maintains the ability to evaluate many aspects of data quality after the data pass to a relational database.

Handling Special Values

Managing values below detection is a common problem. Relational databases and spreadsheets do not interpret a less-than sign (<) as a numeric character; hence, if the computer file of assay results has this character in data fields, some modification must be made to the data. The most common convention is to replace the less-than sign with a minus sign. This is the best approach, since the meaning is unequivocal, and no information is lost. Information is lost if one replaces these values with a zero, and there is also risk that such values will become confused with "not analyzed," a very different result when the data are composited!

Using a zero value when a value below the detection limit is reported becomes ambiguous within a database if an
element has different detection limits over the life of the project. The author has seen a large database with one element having four different detection limits. In that case, some below-detection entries were entered as half the detection limit, some as one unit less than the detection limit, and some as zero. The element was a contaminant; the levels of that contaminant could not be estimated from the data as recorded in that database. In general the rules should be: make the minimal changes necessary so that the data can be easily checked against assay certificates, and make changes which are unambiguous and thus easily reversible. Some software packages have an insidious tendency to introduce zeroes into blank numeric fields; this is another reason to avoid using zero.

A value of zero is best reserved for those intervals that are not analyzed intentionally because they are known, based on geological observation and previous testing, to contain negligible amounts of the element of interest. In this case, it may be preferable to assign a value of zero only in the "preferred" or "to use" field, and use a special code, such as "-9990", in the assay field. There are more sophisticated (statistical) methods of handling dilution effects. A zero assignment is conservative for commodities but becomes optimistic and cannot be used for contaminants.

Sometimes, a greater-than sign (>) is used to report results that exceed the upper range of measurement. These samples should be re-analyzed, using a method with a higher upper limit. It is unwise to load these (>) values into a database, where some subsequent user may misinterpret them as accurate measurements of concentration. The author recently audited a project in which he was provided with a database of check assays in which the highest reported grades exceeded the stated upper limit of analysis for one of the check assay laboratories. Results with the greater-than symbol (>) should be assigned a specific code so they can be recognized confidently.

Unambiguous, useful numeric codes in assay fields typically consist of negative numbers, such as -999, that are impossible to confuse with a below-detection limit value. A well-ordered database may have a variety of codes to cope with different situations, such as -9997 for insufficient sample, -9998 for sample lost, -9999 for not analyzed, and so on. In such cases, a table describing the meaning of each code should also be maintained. For example, the number -99500 might indicate that the value was above an upper limit of analysis of 500 ppm; this code is easy to generate by performing a global replacement of -99 for > in a column of the assay computer file. Codes such as these are useless unless well documented, unambiguous and consistently applied.

To ensure all available information is used adequately, further editing of the data may be needed once transferred into the assay table. For example, one may have two determinations for a given element, both valid, and wish to take the mean for use in the resource model. Or, one may wish to assign a non-negative value to below-detection limit values (a necessary step before compositing values). Rather than lose information, it is best to either create a new table with the modified values, or create a new field within the existing table, which has the "preferred" or "to use" value. Document the method of obtaining this value in detail, so that others can confirm it. In this way, no information is lost, and auditors can easily trace the chain of steps that leads from the assay certificates to the values used in the resource model.

Quality Control of Drill Sampling

Evaluation of Diamond Drill Sampling

Core recovery is the first evaluation step. This is particularly important in vein-type deposits with structural control, as controlling structures and schistosity reduce the mechanical competency of the rock, leading to selective core loss in metal bearing structures. Core recovery should be monitored systematically and recorded on the drill logs, paying particular attention to expected or known mineralization intercepts. Other core features warrant attention. "Washboard type" undulations on the core surface indicate too high a drilling pressure and hole vibration, leading to lower core recoveries in weaker rocks.

Evaluation of Rotary Drilling Sampling

Drilling recovery is often neglected for rotary samples, probably because it is more difficult to measure. To obtain a reasonable estimate of rotary drill recovery, the drill hole diameter, the sample split, the wet weight and the dry weight of the sample must all be studiously recorded. Failures most often occur where changes in the sample split fail to be recorded (such as changing from collecting a 1/8 to a 1/4 split, when drilling recovery drops). Very wet samples are problematical. Rock density changes complicate the picture. Despite all this, recovery data from rotary drill holes can reveal problems that might otherwise be overlooked. For example, recoveries greater than 100% which correlate with pronounced decreases or increases in grade may reveal down-hole sloughing and contamination problems.

Rotary drilling should also be evaluated for down-hole contamination by examining changes in grade which occur whenever another segment of drill rod is added (cyclicity testing), because there is an increased probability of down-hole contamination when this occurs. This is done by comparing assays of those samples straddling the point at which a drill rod was added, with other adjacent sample pairs. Indications of down-hole contamination may lead to a decision to modify or change drilling methods. Clearly, this is a factor which should be explored early in any drilling program, since there is no remedy for contaminated samples.

Bulk Specific Gravity (Density)

Bulk specific gravity (density) is as critical as assay results in estimating the total metal content of a deposit, and is too often badly neglected. The author is familiar with one case in which ore tonnage was underestimated by about 15% because the engineers modeling the deposit used a tonnage factor from another deposit, even though the presence of large quantities of barite was known.

More often, bulk specific gravity is mis-estimated as a consequence of two factors. Core samples are weighed in water without first sealing them with wax, so that pore space volume is not preserved, leading to underestimation. Conversely, less competent sections of core, which often have a lower bulk specific gravity, are not subjected to measurements, leading to overestimation. This constitutes a selection bias. Just like assays, bulk specific gravity measurements should be systematic and they should be subjected to independent checks. Enough density measurements should be made of each lithological/alteration/metallurgical category of ore, so that a reliable estimate of the error on the mean value can be calculated. Bimodal or skewed distributions of density measurements indicate the likely presence of domains and the need to review geological categories.

Rig Duplicates and Half-cores

For rotary drilling samples, there may be "rig" or "field" duplicates collected from a rotary (wet) or riffle (typically Gilson) splitter at the drill site. Each duplicate is submitted with a unique sample number (preferably one which preserves its anonymity) and subjected to the same sample preparation and analytical steps as other samples. These additional splits can be collected at a uniform interval, such as every 20th sample, or they may be selected at random.

Most geologists are very cognizant of the fact that when splitting core, it is important to try to obtain two halves having the same grade, hence, to submit a half-core for analysis without biasing the selection in favor of higher or lower grade halves. Even in favorable conditions, split core rarely provides true duplicate samples, as there is a spatial (geological) component in these duplicates which is not present in drill chips. Sometimes the technician splitting the core will be instructed to alternately submit the left or right halves of each split piece, after first attempting to cut the piece as normal to structures as possible, while still obtaining two half-cylinders. The risk of bias exists where the mineralization is easily recognizable, its distribution is highly non-uniform, and technicians are not adequately trained or supervised. When unintentional, an introduced bias is likely to be slight; if intentionally done, it can be huge. If supervision is close, and if there are good records of who worked on what samples (that is, chain of custody records), testing for bias can be made more definitive, and any potential damage can be minimized.

Old assaying may be tested by analyzing the other half of drill core, or in some cases, re-splitting the half-core and analyzing a quarter core. Analyzing the stored half of the core where it still exists provides the only test for selection bias during the core splitting process. Typically, at least 50 samples, taken from several contiguous runs within the ore zone(s), need to be selected for such a check, in order to obtain a sufficient population for a statistically valid test. If results for the second half show negative bias, the two results should be averaged. Data from the first half of the core should be disregarded where the other half of the core was incomplete. Results from such samples may therefore be expected to have an additional contribution to variance which is a function of geological variation. In such cases, and in the case of twin hole studies, where the spatial separation is much larger and less well-known, it is advisable to compare distributions of results for the entire hole, and average values (means and medians) of grade populations, rather than individual sample pairs.

Some portions of drill holes may not warrant any duplicate samples, because, based on geologic information, they are very unlikely to provide grades above the lowest anticipated ore/waste cutoff. Such intervals should be verified in a cursory fashion by sampling, particularly next to mineralized intervals.

Use of the entire core for assay is generally an unacceptable practice and should be considered only as a last resort. Use of entire core destroys evidence of tampering; it also prevents revisiting geological interpretations. Ore mineral losses can be exacerbated in some rock by wet sawing. In such cases, the answer is to use a core-splitter, and be very conscientious about including half the fines after splitting. Where large samples are required, reliance should be placed on larger core diameter or reverse circulation drilling rather than using entire core. This is one of the reasons why many gold deposits depend largely upon reverse circulation drilling, even though in doing so, geologic understanding is reduced and sample quality control can be more difficult.

Optimizing Crushing and Grinding Requirements

Required Specifications

Experimental studies by Gy (1979) have demonstrated the 95% passing size (P-95) to be the most reliable particle size standard for evaluating crushing and grinding. Any criterion other than P-95 should be recast by selecting an appropriate, usually larger, sieve size. For example, it may turn out that a laboratory's quoted 60% passing a 10 mesh screen is equivalent to 95% passing 1/4 in.

To establish an appropriate protocol, duplicate samples should be collected initially at each step of the sample preparation process where the sample is split to reduce mass prior to proceeding to the next comminution step. A well-balanced sample preparation protocol will produce approximately equal contributions to sampling variances for each such step. The variances of each step are additive, with the greatest proportion carried by the step contributing the greatest variance, because estimates of precision are proportional to the square root of the sum of variances. Once the "worst step" is identified, immediate efforts to improve precision should focus upon this step, before any other step is targeted.

Coarse rejects and pulps should be periodically checked to be certain that the sample preparation protocol is being rigorously followed. While the laboratory should be doing this as part of its internal quality control, the project manager should keep in mind competing considerations, such as sample throughput at the laboratory, which may cause quality to suffer, particularly in periods of high sample load. Checks on coarse rejects should be done using a single sieve of the size expected to pass 95% of the material. In addition to requesting that the preparation manager perform this check daily, one should perform an independent check on randomly-selected coarse duplicates on a weekly or bi-weekly basis. Checks on pulp quality should be performed by the check assay laboratory on about 10% of the pulps received by them.

Minimizing the Nugget Effect

One should accept the fact that a material with a substantial nugget effect will incur higher sample preparation costs in order to reduce the effect. Such added effort may involve grinding larger masses of samples to smaller particle sizes prior to taking sub-samples, plus perhaps performing screen fire assays on a larger sample pulp, or firing sample quantities larger than one assay-ton (about 30 grams). Otherwise, the resource model will suffer, and the project will carry greater risks in its estimates of tons and grade above cutoff, its mine plan, its pit design and plant capacities. Poor precision means greater uncertainty, which translates into lower efficiencies, higher risk, and lower profits. Geostatisticians may place fewer tons in the "proven and probable" categories than would be the case if the results were more reproducible. It is usually far cheaper to upgrade preparation and analytical protocols than to drill holes at a closer grid spacing.

Use of Duplicate Samples

Duplicate sample submissions should be used to gauge the adequacy of the sample preparation protocol, as well as for evaluating analytical precision. Several different types of duplicate samples should be evaluated, including those performed on coarse rejects (typically, the portion of material from the crusher which is not passed on to the pulverizer for final sample preparation) and on sample pulps. Rig duplicates may also be collected from splits of chips from reverse-circulation drills. The remaining core-half of split core is sometimes submitted as a check, particularly as part of a "due diligence" examination of a property's value.

Coarse Reject Duplicates

In North and South America, coarse reject material typically consists of the output of a jaw or roll crusher. It is possible to have more than one type of coarse reject, if there is more than one stage of crushing and grinding followed by splitting. In Australian gold prospects, there is often no coarse reject, because the entire sample is reduced to a fine pulp in a very large pulverizer. It is preferable to submit coarse reject duplicates to the same laboratory that routinely analyzes the samples (the primary laboratory). If such duplicate samples are submitted to a different laboratory (the check assay laboratory) and the results do not agree, one cannot tell, without additional information, whether the problem rests with the analytical technique or with the preparation stages. If coarse reject duplicates are analyzed in an identical manner, any bias may be more clearly ascribed to problems with the sample preparation.

Same-pulp Duplicates

At least a portion of same-pulp duplicates should be sent to a second independent laboratory; these are the check assays. Some geologists take the position that there is no need for check assays at a second laboratory if adequate numbers of standard reference materials (SRMs), having well-established grades, are inserted into the sample stream in such a way as to maintain their anonymity. Since an outsider can never be certain that anonymity is maintained throughout a project's life, and since a set of SRMs may not be adequate at uncovering every manner in which bias may be introduced into a sample population, it is best to confirm results at a second reputable laboratory.

As a rule of thumb, one should expect to obtain agreement on assays of the same pulp by the same laboratory of 10% for 90% of such pairs, and 20% for a coarse reject material (assuming there is only one such), using the pair difference divided by the pair mean as the estimate of precision. These are levels of precision which include an allowance for Type I errors such as sample mix-ups. If rig duplicates show a precision of 30% (for 90% of pairs), the variance is evenly distributed through the process, and one can improve precision by improving any step. Where there is a nugget effect, such levels of precision may not be obtainable using an affordable preparation protocol.

Quality Control of Assay Data Acquisition

The first requirement of a quality assurance program for assaying is the selection of the laboratory. Whenever possible, a certified laboratory should be selected (ISO 25, Canada Standards, or other equivalent accreditation). A laboratory's reputation also should be checked, but even good reputation and accreditation do not free the project manager and project geologist from diligent supervision. Particularly in remote areas, the laboratory should be visited before selection and occasionally during field work, preferably unannounced. This is how one learns that 'half assay ton' rather than 'one assay ton' portions are being weighed and assayed, or that other less obvious steps of the agreed-upon sample preparation protocols are not being respected. The first step of quality assurance consists of improving precision and reducing errors at the source.

Quality control procedures of assay results include the following steps: 1) selection of check assays and avoidance of selection bias; 2) graphs of duplicate sample results; 3)
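The rule of thumb given above (the pair difference divided by the pair mean, with agreement within 10% expected for 90% of same-pulp pairs) is easy to check once the duplicate pairs are tabulated. A minimal sketch in Python; the function names and the four sample pairs are hypothetical, not data from the paper:

```python
def pair_precisions(pairs):
    """Relative precision of each duplicate pair: absolute pair
    difference divided by the pair mean, as in the rule of thumb."""
    return [abs(a - b) / ((a + b) / 2) for a, b in pairs]

def fraction_within(pairs, tolerance):
    """Fraction of pairs whose relative difference is <= tolerance."""
    prec = pair_precisions(pairs)
    return sum(p <= tolerance for p in prec) / len(prec)

# Hypothetical same-pulp duplicates (original, check), grades in g/t.
dups = [(1.00, 1.04), (0.52, 0.50), (2.10, 2.08), (0.90, 1.20)]

# Rule of thumb: 90% of same-pulp pairs should agree within 10%.
meets_rule = fraction_within(dups, 0.10) >= 0.90
```

Here three of the four hypothetical pairs agree within 10%, so the criterion is not met; in practice the same calculation would be run separately for same-pulp, coarse-reject and rig duplicates against their respective 10%, 20% and 30% thresholds.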
tend to come back lower and low values higher. When there is no bias, these effects exactly cancel. Choosing a disproportionate number of high values from a population, however, introduces a selection bias; the check assays will tend to average lower than the average of the original results. The magnitude of the effect ranges from negligible (where precision is very good) to quite large (where precision is poor).

In some cases the person evaluating the check assays introduces a selection bias by, for example, grouping the results into grade ranges based upon the original result, and then comparing the averages of the grade ranges. Where precision is poor, such as for data having a substantial "nugget effect," the lowest grade range is much lower for the original data than for the check results, while the reverse is true in the highest grade range. If, however, the data were to be ordered by the check assay data, one would find that the check assay data for the lowest grade range has a lower mean than the original data. This apparent paradox is a consequence of the selection bias created by grouping the data by one of the sets of results.

The best way to avoid selection bias is to be sure that samples selected for re-assay are chosen in such a way that they are random with respect to grade for the sample population being evaluated, and to use all values to establish an average value for a sample, avoiding a selection based on only one of the results. Using the mean of the two results for each sample does not introduce a selection bias, and the means can be used for grouping the data. In such a case, one should keep in mind that precision in any group near the detection limit may appear better than it actually is, because at these low concentrations, pairs with the poorest precision are excluded from that range; if one of the members of the pair is high enough, the pair mean will place that pair in a higher grade range, even if the other member of the pair is below detection.

Drill holes outside the area of interest, or intervals identified by geologic observation to be waste, can be excluded from check assaying; these are selections that are not based directly on assay results, and so do not have a selection bias.

Most geologists also realize the fallacy of selectively re-assaying extreme values. Because extreme low results are more likely to reflect negative measurement errors than positive ones, one can obtain a higher average result from selectively re-assaying samples below the median grade. The amount of "improvement" will be a function of the precision. Quite outstanding "improvements" can be obtained when the precision is poor. Variants of this fallacy include assaying slag from fire assay, at gold levels where precision is very poor, to account for gold "losses" from the sample. Since negative grades cannot occur, an accumulation of positive errors takes place with these rounds of re-assaying.

Graphing Duplicate Sample Results

The most common treatment of duplicate data is to make a scatter plot of the data. This is a good first step. Such plots have tremendous shock value when the distribution looks like a shotgun blast. Scatter plots serve the widest readership of the feasibility document, because everyone who reads them will understand them. The same is true of histograms. Other plots can be used, but they are often most effective if the subject is first introduced in terms of histograms and scatter plots.

Fig. 1. Example of a scatter (xy) plot with a linear fit. This example shows poor agreement between the check assays and the original assays. Poor correlations lead to very different linear regression fits, depending upon which variable is defined as the independent variable.

The next refinement usually done is to calculate a classical least-squares regression fit of one result to the other and to calculate the correlation coefficient. Different equations are obtained depending on which variable is taken as the dependent variable. These two equations can be very similar if the paired data plot closely about a line, but where data are widely scattered, the two equations can be very different. A linear fit between the two extremes is preferred, and one recommended procedure is a reduced major axis fit, which estimates the true functional relationship between the
[Figures 2 and 3 appear here. The Fig. 2 curves compare three groups of duplicate pairs: All, Weight > 2.5 kg, and Weight > 3.5 kg. Captions for both figures follow.]
Fig. 2. Example of a cumulative frequency of precision plot. For this plot, the estimate of precision is taken as the pair difference divided by the pair mean. For each line on the graph, the precision estimates are sorted, and a cumulative percent for that line calculated.

Fig. 3. Relative standard deviation versus grade. The graph is made by sorting the duplicate pair data by pair means, calculating the pair difference for each pair, then calculating a moving window (30 pairs were selected in this case) of standard deviations of those differences, divided by the mean of the same 30 pairs. The result is plotted against the mean value of the same 30 pairs, to create a curve of the form y = 1/x. In this example, precision begins to worsen when grade falls below about 4 oz/T Ag.

… often done visually. If outliers have been excluded when performing the linear fit, they should be identified on the scatter plot for ease of general evaluation and audit (Fig. 1).
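The moving-window construction described in the Fig. 3 caption can be sketched in Python. The window size follows the caption's choice of 30 pairs; the simulated duplicate data are illustrative, not from the paper.

```python
import random
import statistics

def precision_vs_grade(pairs, window=30):
    """Moving-window relative precision versus grade: sort duplicate
    pairs by pair mean, then for each window take the standard
    deviation of the pair differences divided by the window's mean
    grade. Returns (grade, relative standard deviation) points."""
    pairs = sorted(pairs, key=lambda p: (p[0] + p[1]) / 2)
    curve = []
    for i in range(len(pairs) - window + 1):
        win = pairs[i:i + window]
        diffs = [a - b for a, b in win]
        grade = statistics.fmean((a + b) / 2 for a, b in win)
        curve.append((grade, statistics.stdev(diffs) / grade))
    return curve

# Hypothetical duplicates with a constant absolute error, so the
# relative precision should worsen (the 1/x shape) as grade falls.
random.seed(1)
pairs = [(g + random.gauss(0, 0.2), g + random.gauss(0, 0.2))
         for g in [0.5 + 0.1 * k for k in range(100)]]
curve = precision_vs_grade(pairs)
```

Plotting the resulting points reproduces the shape of Fig. 3 and shows at what grade the curve leaves its flat region.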
Scatter plots of duplicate data give a good picture of all the data, but they are difficult to compare. To obtain direct comparisons between two scatter plots, one needs a plot that compares distributions of differences, rather than the data themselves. This can be achieved by plotting the cumulative frequency of some measure of precision. A useful measure of precision is the pair difference divided by the pair mean, the measure utilized by state geological laboratories of the former U.S.S.R. Several such lines can then be plotted on the same graph without clutter. This approach can be applied to many comparisons because it assumes nothing about the distribution it is representing, excepting that the groups compared are representative of the same population (Fig. 2).

Summary statistics, such as correlation coefficients, often have an underlying assumption that the distribution of errors is bell-shaped. Because a duplicate population will contain gross Type I errors unrelated to measurement errors, the distribution is usually far from bell-shaped, even for populations where no value is below the analytical detection limit and no nugget effect is present.

Relative Precision/Detection Limit/Ore Cutoff

Relative precision worsens as the concentration of the element being assayed decreases toward the analytical detection limit. Thompson and Howarth (1978) define a "practical detection limit" as the concentration at which the "precision" measured by duplicate samples is equal to 100% of the concentration. If one uses this nomenclature, be sure to keep the word "practical," because this "practical detection limit" can be very different from the analytical detection limit, which is determined on laboratory calibration standards lacking the sub-sampling variance of different aliquots from the same pulp. It is important to know how the relative precision changes with respect to grade; estimates of this can be obtained from the duplicate results.

It is important to demonstrate that the sampling and analytical protocols are appropriate for the anticipated ore/waste cutoffs; the range of probable ore/waste cutoffs should occur in the flat (sub-horizontal) region of a precision versus grade curve (Fig. 3). Some low-grade gold projects, particularly those with a substantial nugget effect, push the ore/waste cutoff boundary down to where the precision of the assays cannot provide a good estimate of the gold present, even when results are composited.

Use of Standard Reference Materials (SRMs)

A good standard reference material has the following characteristics: low variability in assay results under optimized conditions, grade near important boundaries, and a matrix (ore and gangue minerals) similar to the project material. Low variability in assay results is a function of sample uniformity and assay method.

When an appropriate assay method is optimized, most of the variability should be a consequence of the distribution of particle sizes of the minerals containing the element of interest, the uniformity of the distribution of those particles between the different sub-samples used for the analytical determination, and the amount of material used in the determination. Just because a deposit has high variability (as in many deposits with visible gold) does not mean that the standard reference materials used to monitor the laboratory's analytical performance should also be highly variable. On the contrary, one of the purposes of the standard is to demonstrate that the variability present is not a consequence of poor assaying practices, but is inherent in the samples being analyzed. This means that materials collected from a project may not always make good standards, or may require special treatment in order to become adequate standards.
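The cumulative-frequency-of-precision curve described for Fig. 2 can be computed directly from duplicate pairs. The sketch below uses the absolute pair difference divided by the pair mean as the precision measure; the data values are hypothetical.

```python
def cumulative_precision(pairs):
    """Cumulative frequency of a precision measure: absolute pair
    difference divided by pair mean (the Fig. 2 measure). Returns
    (precision, cumulative percent) points for plotting; several
    such curves can share one graph without clutter."""
    prec = sorted(abs(a - b) / ((a + b) / 2)
                  for a, b in pairs if a + b > 0)
    n = len(prec)
    return [(p, 100.0 * (i + 1) / n) for i, p in enumerate(prec)]

# Hypothetical duplicate pairs.
pairs = [(1.0, 1.1), (2.0, 1.8), (0.5, 0.7), (3.0, 3.0)]
curve = cumulative_precision(pairs)
```

Because nothing is assumed about the shape of the error distribution, curves for different groups (for example, different sample weights, as in Fig. 2) can be compared directly.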
Misinterpretations of the results of inserted standard reference materials can be avoided if the material is carefully split into portions for submission at the outset of the project, rather than the all-too-common, happy-go-lucky "scoop as you go" practice. Those last scoops can be very different from the first. It is also important to place the packets containing standard reference materials in random order. Then one will not confront subtle, mysterious shifts in the baseline value of an inserted standard that turn out, after some hair-pulling, to be the consequence of opening a new jar of supposedly homogeneous material.

SRMs that have grades near important grade "boundaries" have additional benefits because they provide a method of measuring the precision of the assay technique at that level. The most important of these boundaries is the ore-waste cutoff. While one cannot know the precise cutoff that may be selected near the completion of a feasibility study, a competent geologist will know the probable range in which this cutoff will occur. For an especially thorough evaluation, two standards, one at the bottom and one at the top of this "cutoff range," can be used. In the author's experience, this often becomes an important issue in deposits where the range of probable ore-waste cutoff values drops below about fifty times the analytical detection limit. With well-prepared standards in this grade range, one may determine early in a project's life whether the analytical method is adequate for the contemplated cutoffs. Another important "boundary" is any grade level which encompasses more than a few percent of the "above-cutoff" sample population and also prompts a change in the analytical protocol. Typical changes in protocol for such "high" samples are a dilution, or a new digestion using a lower sample weight. For gold, there may be a repeat assay with a gravimetric finish, whereas the typical sample might be done with an atomic absorption finish.

The matrix of a sample can impact the precision and, in rare cases, the accuracy of determinations. A typical example is whether a sample submitted for fire assay is "oxide" or "sulfide." The analytical laboratory will "adjust the flux" used to smelt the sample in the fire assay process, depending upon its assessment. There are other possible adjustments for high clay, high carbonate, etc. One sometimes obtains a false sense of security from SRM results where the SRMs used have none of the problems present in the samples. This is another reason why check assays should be performed at other laboratories in order to verify assay accuracy.

Use of Blanks

Blanks are most effective when used to check for contamination during sample preparation. This requires use of a bulk sample, which is submitted and processed like a drill chip sample. Geologists usually take little care to insure that such bulk blanks are free of gold or a gold nugget effect. Sometimes river gravels are submitted, which can show a substantial nugget effect. A more reliable blank material is barren, unaltered volcanic rock, which is usually much more consistent in composition. Suitable pre-crushed material can be obtained from quarries or distributors of landscape supplies. These should be tested before purchase in quantity, by obtaining small samples. Some effort should be made to approximate the matrix of the project's samples, although this is somewhat less critical than for standard reference materials, since the primary purpose of blanks is to detect contamination, rather than to check assay methodology.

Blanks are sometimes used as a check to see if samples have been kept in order. They are not particularly effective at this with low-grade samples because often the most probable result is near the detection limit; thus the blank is often like a zebra hiding in a herd of zebras. Good-quality standard reference materials are more effective at detecting sample mis-orderings.

Maintaining the Anonymity of Quality Control Materials

To the extent possible, it is preferable to keep the identity of inserted standard reference materials, blanks, and duplicates anonymous. This insures that these materials are treated in a fashion identical to that of the samples they are assessing. Reputable laboratories have their own internal quality controls. While the control exercised from outside the laboratory can never be as thorough as that exercised by a well-operated laboratory over its own control samples, external control provides the advantages of being tailored to the needs of your project, and of serving as an independent check and method of controlling the quality of your samples.

When a laboratory is aware of the identity of control samples, there can be a tendency to take their presence into account when deciding whether to pass or fail a batch of samples. These decisions can be very subtle, such as taking an additional instrument reading on the inserted standard and averaging the results (something that takes only a couple of seconds to do), or applying very slightly more stringent requirements on the laboratory's own internal standards on those batches of samples which are known to contain the client's standards. At the other extreme, complete falsification of SRM results may take place. Most reputable laboratories, which are those that perform the majority of analytical work for the mining industry, make no distinction between a client's inserted controls and a client's samples, except for such obvious distinctions as noting that a sample is submitted as a pulp and not as drill cuttings, which is necessary for invoicing.

Often the same commercial laboratory is in charge of both the sample preparation and the analysis of samples. Commercial laboratories generally discount the cost of preparing samples in order to obtain sufficient work for their analytical laboratories. This approach also allows a reputable laboratory to maintain good quality control on the sample preparation. When the task is split between two different laboratories, a laboratory's price list for preparation typically increases. Surcharges of 100% are not unusual. This can be an expensive way to maintain the anonymity of inserted standard reference materials!

In such cases, a reasonable compromise may be to provide pulps of quality control materials with anonymous, unique numbers where one type is indistinguishable
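The random-order insertion recommended for standard reference materials can be applied to a whole submission. Below is a minimal sketch, assuming samples and QC materials are tracked as simple ID strings; all names are hypothetical, not from the paper.

```python
import random

def build_submission(samples, qc_items, seed=None):
    """Insert QC materials (SRMs, blanks, duplicates) at random
    positions in the submission order, so that shifts in laboratory
    performance are not confounded with position in the batch."""
    rng = random.Random(seed)
    order = list(samples)
    for item in qc_items:
        # Choose any position, including before the first or after
        # the last sample.
        order.insert(rng.randrange(len(order) + 1), item)
    return order

# Hypothetical sample and QC identifiers.
samples = [f"S{i:04d}" for i in range(1, 21)]
qc = ["SRM-low", "SRM-high", "BLANK-1", "DUP-0007"]
submission = build_submission(samples, qc, seed=42)
```

Fixing the seed makes the insertion reproducible for audit, while the labels themselves would in practice be replaced by anonymous, unique numbers as the text recommends.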
…tion, "run in order listed") in the final assay certificate. This is not an optimum approach for monitoring laboratory performance!

Guidelines on Assay Quality Control

A reputable laboratory will have its own quality assurance program. Results of their standards and duplicates can often be obtained from them upon request. In fact, some laboratories have reported that this information has been rejected by some clients when offered. These results typically show a better level of precision and accuracy than controls inserted by the client, because the laboratory's reports do not include performance on failed batches, because these are re-run.

There are no strict guidelines for how much should be spent on quality control and quality assurance. One fairly widespread recommendation is that not more than 10% of

Table 3. Optimizing drill sampling requirements

Diamond Drill Hole Monitoring
• Core recovery, drill advance and penetration rate.
• Core quality: "washboard surfaces," lost intervals, etc.

Monitoring of Reverse Circulation, Rotary, etc., Drilling
• Hole diameter, sample split (1/2, 1/4, 1/8, etc.), wet and dry weight, rock density.
• Recovery (should be lower than 100%).
• Check for assay changes associated with drill string additions (drill hole caving).

Bulk Specific Gravity (Density)
• Measurements of each lithological/alteration/mineralogical mineralized category, to allow a reliable estimate of the error of the mean.
• Analyze the distribution of measurements: skewed or bimodal distributions may indicate the presence of domains and the need to review geological categories or procedural problems.

Table 4. Optimizing crushing and grinding requirements

Required Specifications for Crushing and Grinding
• Particle size evaluation for crushing and grinding should be based on 95% passing the sieve size used (P95), as required by Gy's sampling formula. Other sieve tests should be redone to P95.
• Evaluate a protocol by using duplicate samples at various stages.
• Determine the step contributing the greatest variance and give priority to reducing this imprecision.
• Review other steps if contributing significant variance.

Steps to Minimize the Nugget Effect
• Grind larger sample masses to smaller sizes prior to sub-sampling.
• Screen gold particles for fire assay out of larger pulp portions.
• Use sample quantities larger than 30 g for fire assay.

Duplicate Sample Checks for Independent Appraisal of Adequacy of Sample Preparation
Assay and grinding checks of coarse rejects and pulps at a separate laboratory, using an identical analytical method (when the nugget effect is not significant):
• for 5% of pulps: agreement of +/- 10% on 90% of pairs;
• for 5% of coarse rejects: agreement of +/- 20% on 90% of pairs; and
• for rig duplicates: agreement of +/- 30% on 90% of pairs.
Use similar assay methods to avoid variations from this cause.
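The duplicate-pair acceptance criteria of Table 4 (for example, pulps agreeing within +/- 10% on 90% of pairs) can be checked programmatically. The sketch below uses hypothetical pair values; the tolerance and pass rate come from the table.

```python
def agreement_rate(pairs, rel_tol):
    """Fraction of duplicate pairs whose two results agree within a
    relative tolerance of the pair mean."""
    ok = sum(1 for a, b in pairs
             if abs(a - b) <= rel_tol * (a + b) / 2)
    return ok / len(pairs)

def passes_criterion(pairs, rel_tol, required=0.90):
    """Table 4-style check, e.g. pulps: +/- 10% agreement on 90% of
    pairs."""
    return agreement_rate(pairs, rel_tol) >= required

# Hypothetical pulp duplicate pairs (same units for both members).
pulp_pairs = [(1.00, 1.05), (2.00, 2.30), (0.50, 0.52), (3.00, 3.06),
              (1.50, 1.49), (0.80, 0.83), (2.50, 2.48), (1.20, 1.26),
              (0.90, 0.95), (1.10, 1.08)]
result = passes_criterion(pulp_pairs, rel_tol=0.10)
# Nine of the ten pairs agree within 10%, meeting the 90% criterion.
```

The same functions apply to coarse rejects (20% tolerance) and rig duplicates (30% tolerance) simply by changing `rel_tol`.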