
Geometallurgy Data Management – A Significant Consideration

V Liebezeit1, M Smith2, K Ehrig3, P Kittler4, E Macmillan5 and C Lower6

ABSTRACT
Data management is a significant, but perhaps unrecognised, consideration for any organisation
embarking on a geometallurgy program. While resource geologists are accustomed to dealing with
thousands or millions of samples and their associated assays, metallurgists and mineralogists
have traditionally carried out only a few expensive and complicated tests and consequently tend to
manage the resultant data in spreadsheets. The emerging discipline of geometallurgy arrives with
significant challenges as a result of the sheer volume of data generated.
As an example, the Olympic Dam geometallurgy program has generated more than:
• 60 000 physical subsamples analysed for up to 30 analytes;
• 25 000 spreadsheet-based laboratory test results;
• 30 000 sized fractions analysed via QXRD, MLA and/or QEMSCAN;
• multiple external providers of data; and
• more than seven terabytes of data.
Considerations for any geometallurgy data management system include storage space, data
integrity, accessibility, traceability (sample provenance), maintenance and ease of use, integration
with data providers and the ability to modify the system to adapt to changing requirements.
Further, aspects of a geometallurgy program, such as conducting a range of tests on one sample,
producing subsamples such as concentrates and tailings, compositing and routine QA/QC may not
be well handled by existing software and providers geared to standard geochemistry, mineralogy or
metallurgical test programs. The development of a geometallurgy database using existing software
(AcQuire), and the associated file management system will be described for a large geometallurgy
program at Olympic Dam.
As with many aspects of geometallurgy there is no approach that meets the needs of every site
or program. Recognition of data management as an issue, a structured approach in the program
planning phase and commitment to executing the strategy will increase the chances of successfully
mining the geometallurgical data.

INTRODUCTION
A fundamental principle of geometallurgy is to test a large number (high volume) of samples and to relate metallurgical response to ore properties. Results from the metallurgical program must be able to be associated with the ore characterisation program, and with the spatial location of the sample. This means analysing and correlating data from a number of sources. Metallurgists have traditionally carried out test work based on relatively few bulk or composite samples, supplemented by so-called variability samples, and therefore data has generally been managed in spreadsheets. Spreadsheets will always be a valuable tool for analysis of the data but may not be the most appropriate repository for long-term use and storage of large data sets. Spreadsheets often do not include documentation about the intent, assumptions or limitations of the particular metallurgical test program, and may contain only limited information about the assays (methods, detection limit and original units).

Resource geologists are used to having well defined data structures to support their data. There are numerous commercial solutions available to support universally accepted data structures, such as assays, downhole geology and geophysics. There is nothing comparable in the geometallurgy world.

In addition, in the excitement of kicking off a major test program, the relatively boring issue of managing and accessing the data tends to get overlooked. However, when the mine planners, design teams and upper management are

1. Senior Engineer, Long-Term Planning, BHP Billiton, 55 Grenfell Street, Adelaide SA 5000. Email: vanessa.liebezeit@bhpbilliton.com
2. Senior Database Geologist, BHP Billiton, 55 Grenfell Street, Adelaide SA 5000. Email: michelle.smith@bhpbilliton.com
3. Principal Geometallurgist, BHP Billiton, 55 Grenfell Street, Adelaide SA 5000. Email: kathy.ehrig@bhpbilliton.com
4. MAusIMM, Senior Geometallurgist, BHP Billiton, 55 Grenfell Street, Adelaide SA 5000. Email: paul.kittler@bhpbilliton.com
5. MAusIMM, Project Geometallurgist, BHP Billiton, 55 Grenfell Street, Adelaide SA 5000. Email: edeltraud.macmillan@bhpbilliton.com
6. Geometallurgist, BHP Billiton, 55 Grenfell Street, Adelaide SA 5000. Email: chantelle.lower@bhpbilliton.com

THE FIRST AUSIMM INTERNATIONAL GEOMETALLURGY CONFERENCE / BRISBANE, QLD, 5 - 7 SEPTEMBER 2011 237

clamouring for results, the prospect of manually combining data from thousands of spreadsheets into coherent data sets for analysis can be a major and time-consuming obstacle.

DEFINING THE DOMAIN

Geometallurgy program scope
There are four metals recovered at Olympic Dam (copper, uranium, gold and silver). The existing process is relatively complex and includes grinding and flotation, a hydrometallurgical facility comprising concentrate and tailings leach circuits coupled with copper and uranium solvent extraction and uranium oxide production, and a smelter/refinery facility to produce copper cathode and gold and silver bullion. Future process routes may or may not include all of these elements, and the decision on process route is intrinsically linked to the characterisation of the ore. Therefore the type of information required to geometallurgically characterise the Olympic Dam orebody is extensive and includes:
• assays inclusive of potentially economic or deleterious elements;
• mineral characterisation of head and selected products via Mineral Liberation Analyser (MLA) and QEMSCAN® (‘QEM’) automated mineralogy systems;
• comminution data;
• flotation data at a sufficient level of detail to characterise the deportment of minor elements;
• leach extraction of valuable and deleterious elements for both flotation concentrates and tailings; and
• settling and rheology behaviour of flotation and leaching products.
Metallurgical tests and analytical suites were devised to meet these requirements. Five service providers were initially involved in delivering the range of tests. In the first comprehensive geometallurgy program there were 500 - 1000 samples for full metallurgical characterisation, and approximately 10 000 samples for mineralogical characterisation.
The primary continuing objective of the geometallurgy programs is to populate the Olympic Dam resource block model (approximately eight million 30 × 30 × 15 m blocks) with sufficient geometallurgical information to enable value-based mine planning.

Data management scope
The scope of the data management requirements for the Olympic Dam geometallurgy program was identified early in the first study. It was clear that, given the number of samples to be analysed, significant volumes of data would be generated throughout the program. The data would be generated by multiple laboratories, and delivered in a variety of digital formats as shown in Table 1.
The geometallurgical program budget is significant and it was critical that the data be utilised to its full potential by being accessible to, and usable by, future project investigation teams. Each team member had prior experience of projects where previous test results were examined but ended up being thrown out because the test conditions were unclear, or of spreadsheets of results from previous test programs that could not be verified from their primary source. Thus it was identified that a degree of discipline and rigour with respect to data management was required to ensure the maximum benefit from the Olympic Dam geometallurgy program.
Additionally, the aims of the program work were so research-focused that it was clear the resultant data would need to be reviewed from many different angles to extract the knowledge that the data represented. There were a large number of relationships to be defined to characterise the behaviour of the known valuable and deleterious elements, as well as investigations into mineral occurrences which had not previously been seen in the ore processing zones at Olympic Dam. As such, the geometallurgical team were aware that all the data needed to be hosted in a centralised repository which would permit ‘slicing and dicing’ of the data. In short, they recognised that a database was required.

Data management objectives
Explicit recognition of the primary objectives of the data management system is a critical element of program planning. Objectives will vary between companies, operations, projects, and even between programs. For the Olympic Dam geometallurgy program the primary objectives of the system were to:
• provide a single, centralised data source accessible by multiple users;
• provide mechanisms, preferably automated, to import and validate data provided by various laboratories;
• store sufficient data such that future personnel could fully replicate the test conditions/assumptions (ie the system is a data repository, not an analysis tool);
• ensure that all data fit within a predefined data structure that would facilitate standardised querying and reporting of that data; and
• provide easy access to the data, ie export to Excel or statistical analysis packages.
Secondary objectives included:
• providing functionality to support project management, such as tracking despatch of samples and listing pending results; and
• providing basic quality assurance/quality control (QA/QC) functionality for assay data.
Organisations of different sizes may have very different objectives and requirements in the area of ease of use and maintenance of the system. In the case of Olympic Dam, the geometallurgical team was closely aligned with the resource

TABLE 1
Example of data generated per primary geometallurgy sample (primary sample GMxxx).

| Data generated | Number of reports | Format of report | Size |
| Assays | 75 subsamples (up to 30 analytes) | Text files (*.csv) | <20 kB per file |
| Metallurgical tests | 30 | Spreadsheets (*.xls) | <500 kB per file |
| Mineralogy type A | 5 | Database (*.mdb) | ~25 MB per file |
| Mineralogy type B | Up to 5 | ‘Datastores’ | >100 MB per datastore |
| Mineralogy type C | 1 | Spreadsheets (*.xls) | <20 kB per file |
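The variety of delivery formats in Table 1 implies one import path per file format. A minimal Python sketch of such format-based routing; the handler names and registry are illustrative assumptions, not part of the Olympic Dam system:

```python
from pathlib import Path

# Hypothetical per-format handlers; each would parse its file type
# and return records for loading. Stubs shown for illustration only.
def load_csv_assays(path: Path) -> str:
    return f"assays:{path.name}"

def load_xls_met_tests(path: Path) -> str:
    return f"met:{path.name}"

def load_mdb_mineralogy(path: Path) -> str:
    return f"mineralogy:{path.name}"

IMPORTERS = {
    ".csv": load_csv_assays,      # assay text files
    ".xls": load_xls_met_tests,   # metallurgical test spreadsheets
    ".mdb": load_mdb_mineralogy,  # MLA Access databases
}

def route(path: Path) -> str:
    """Dispatch a laboratory file to the importer for its format."""
    try:
        importer = IMPORTERS[path.suffix.lower()]
    except KeyError:
        raise ValueError(f"no importer registered for {path.suffix!r}")
    return importer(path)
```

Rejecting unregistered formats up front mirrors the paper's point that an importer should refuse files it cannot interpret rather than load them silently.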


planning team and there was familiarity with various database and data management solutions. A different organisation may require that data is managed with software that already exists, or may prefer manual rather than automated data entry techniques. Whichever objectives are set, explicit statement with respect to these requirements proves beneficial in determining possible solutions to the data management challenge.
A specific objective for Olympic Dam was to include a certain level of the mineralogy data in the database. This objective arose from the relative inflexibility of a certain interface for analysis of mineralogical data supplied by one of the service providers. While an excellent platform for examining one sample (or size fractions from one sample) in a lot of detail, it was difficult to compare data between multiple primary samples and extremely time-consuming to extract data manually. Thus it was decided that the data management system should accommodate the storage of basic mineralogical data.

DEVELOPMENT AND DESIGN

Identifying requirements
The geometallurgical team identified some fundamental concepts during the development of the program scope and execution plan. Each primary sample came from a resource delineation drill hole; that is, the sample could be related to a spatial location within the deposit.
Multiple primary samples were to be analysed. A series of analytical and metallurgical ‘procedures’ were to be performed on each primary sample during the initial phase of work. Any sample could be tested by multiple procedures. Furthermore, the same procedure could be executed on a sample multiple times as part of a replicate program. Each procedure would have specific conditions and attributes. Many procedures would generate physical subsamples, such as flotation products, sized products or leach kinetic samples.
Data relating to each subsample may need to be stored, and a subsample from one procedure could be used as the feed stock for another procedure. For instance, a flotation tailings subsample would be the feed to a tails leach test. The data from both the flotation and leach tests needed to be related to the primary sample and its spatial location.
These initial concepts helped form the framework of the system and clarified some obvious requirements:
• each sample would need a unique identifier;
• each procedure would need a unique identifier;
• each test event (ie when a sample is ‘analysed’ using a procedure) would need a unique identifier which would indicate the number of times the procedure had been run on the sample;
• the subsamples generated by each procedure would need to be identified;
• the data elements that were to be collected for each test and each subsample needed to be identified and rationalised; and
• the ability to associate a test with a ‘parent’ test needed to be provided.
Thought was given to the future extensibility of the system, which raised some additional requirements:
• tests should include information on the phase of work;
• in the event of multiple test runs being conducted, the ability to flag a single test as the ‘preferred’ test was needed; and
• a method of monitoring the ‘history’ of a sample would be required, including the ability for geometallurgy team members to log comments against individual test events.
The geometallurgical team had already devised a definitive sample numbering scheme that provided unique identifiers to tests and samples, eg GMxxxx-YYY-n-ZZZZ, where GMxxxx represents the primary sample, YYY represents the procedure conducted on the sample, n is the test run number (‘zero’ being the initial run) and ZZZZ is the subsample identifier. Procedures and subsamples were given identifiers as shown by the examples in Table 2. Systematic nomenclature, regardless of the ultimate data management solution, will always be required in order to maintain physical sample labelling and file naming, and to maintain sample ‘provenance’ across service providers.

Prototype
The system was initially developed in Microsoft Access as a prototype following verbal discussions with the geometallurgical team. Although somewhat limited in functionality, the system was serviceable and was used to manage actual program data. The prototype helped identify and cement ideas about the requirements of the final system; for example, demonstrating that the final system would need to store information about the order in which subsamples were extracted, to facilitate the correct reporting and (where appropriate) calculation of some results.
Using the knowledge gleaned from the prototype, requirements were captured in a comprehensive document that clearly outlined each element of functionality the geometallurgy team required, as well as providing detailed information as to procedure types, sample types, subsamples and any parent-child relationships between tests. Provision of such a document assists greatly with the development of information systems since it ensures all stakeholders have a

TABLE 2
Example of procedure and subsample identifiers.

| Procedure ID | Procedure description | Sub-sample ID | Sub-sample description |
| FKR | Flotation test – kinetic rougher | RC1 | Rougher concentrate 1 |
| FKR | Flotation test – kinetic rougher | RC2 | Rougher concentrate 2 |
| FKR | Flotation test – kinetic rougher | RC3 | Rougher concentrate 3 |
| FKR | Flotation test – kinetic rougher | RC4 | Rougher concentrate 4 |
| FKR | Flotation test – kinetic rougher | RT | Rougher tail |
| LFT | Leach test on flotation tailings | LR | Leach residue |
| LFT | Leach test on flotation tailings | LL | Leach liquor |
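A systematic scheme such as GMxxxx-YYY-n-ZZZZ lends itself to mechanical validation wherever identifiers cross system boundaries. A minimal Python sketch; the field widths and the `parse_sample_id` helper are assumptions for illustration, as the paper does not specify exact formats:

```python
import re
from typing import NamedTuple

class SampleId(NamedTuple):
    primary: str    # GMxxxx primary sample
    procedure: str  # YYY procedure code, eg FKR
    run: int        # n test run number ('zero' being the initial run)
    subsample: str  # ZZZZ subsample code, eg RT

# Pattern follows the GMxxxx-YYY-n-ZZZZ scheme described in the text;
# the exact field widths are assumptions, not taken from the paper.
_ID = re.compile(r"^(GM\d{4})-([A-Z]{3})-(\d+)-([A-Z0-9]+)$")

def parse_sample_id(text: str) -> SampleId:
    """Split a full identifier into its components, rejecting malformed input."""
    m = _ID.match(text)
    if not m:
        raise ValueError(f"malformed sample identifier: {text!r}")
    primary, proc, run, sub = m.groups()
    return SampleId(primary, proc, int(run), sub)
```

Validating on entry is one way to preserve the ‘provenance’ the text emphasises: a mislabelled file fails loudly at the door rather than silently corrupting the chain from subsample back to drill hole.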


clear understanding of the problem domain and allows the development team to align their development against the requirements. Prioritisation of requirements also ensures that initial implementations focus on the most important functionality, with ‘bells and whistles’ left to the final stages of implementation. Requirement prioritisation can also reduce the inevitable ‘scope creep’.

System design
The provision of the requirements document greatly assisted the development team since it clearly outlined the functionality that would ultimately be required. Clearly, the volume of data precluded the use of Microsoft Access, which has a 2 GB file size limit. The preferred database platform was SQL Server since this environment is readily available at Olympic Dam. Thus, the development team had two options: to build a custom SQL Server database and associated interface, or to use or adapt a commercial solution.

Custom solution
Undoubtedly a custom solution would have provided a system that fully met all the user requirements. However, development of application interfaces (either desktop or web-based) is time-consuming and such solutions can be difficult to support since extensive in-house knowledge is required. BHP Billiton prefers not to develop in-house software unless no other solution is feasible.

Commercial solution
Use of commercial solutions is generally the preferred option because an interface, or a tool to adapt or develop an interface, is provided and some of the required functionality is likely to already be present in the system. In addition, in-house support is easier to locate when commercial solutions are used, because there is a wider pool of potential employees familiar with well-known commercial packages. Both the Geological Borehole Information System (GBIS) and AcQuire were considered since both provide a database schema suited to storing geological data. Each of these solutions uses a client/server model. The database resides on a centralised server (SQL Server or ORACLE) and user workstations have the client program (ie the interface) installed on their computers. The client application is configured to communicate with the database server and display the data in a manner intuitive to the users, as well as facilitating querying and reporting. AcQuire has a standard data model and a flexible interface, whereas GBIS has a more flexible data model and a relatively fixed front-end interface. Both systems have the capacity to support typical geological data structures such as defining drill holes, samples with downhole intervals, assay support and capturing geological data.
It was decided to implement the geometallurgical system using AcQuire, partly because Olympic Dam already used the AcQuire database system to host its geological data (hence the data manager was familiar with the software) and partly because the standard schema met many of the sample/assay/workflow based requirements without customisation. Despite this, the underlying database schema was adapted quite significantly to support the mineralogical analysis and incorporate the concept of procedures and tests. The design and implementation of the resultant solution was achieved in collaboration with Snowden Technologies.
The extent to which a commercially available solution can be implemented depends entirely on the requirements. Various alternatives can be evaluated with respect to the requirements, cost and schedule. A baseline or default solution should be constructed as part of the evaluation.

DATABASE DESCRIPTION
Figure 1 is a representation of some of the principal data compounds (ie collections of data tables) in the AcQuire schema, as well as the data compounds added to support the geometallurgical team’s requirements. The principal AcQuire compounds used are the sample/subsample and despatch systems.

Standard AcQuire compounds

Sampling system
A significant proportion of the geometallurgical data is supported by the default AcQuire sample compound. Individual tests are actually stored as samples. This facilitates

FIG 1 - Representation of principal data compounds in Geometallurgical Database: the default AcQuire Drillhole, Sample, Subsample and Despatch compounds (with the Geology, Water, Events and Survey compounds unused), together with the Phase, Procedure Definition, Test and Mineralogy compounds added to the AcQuire schema; the Test compound supports the parent-child relationship.


the link between the data in the Sample compound and the non-AcQuire test compound. The sampling tables include provision of hierarchical keys that support the parent/child relationship that occurs between tests and subsamples, as well as between subsamples and tests conducted on the subsample. The example in Figure 2 shows that Test 2 was conducted on the Test 1 subsample material. Additional metadata relating to both the tests and the subsamples is also stored in the system, but is not shown in Figure 2.
Subsequent to the design of the Olympic Dam geometallurgical system, AcQuire has introduced a new ‘fractions and composites’ compound into the schema. The tables in this compound are designed to maintain a hierarchy of samples and subsamples to multiple levels; any future geometallurgical solution designed using AcQuire would almost certainly take advantage of this compound.
AcQuire includes a mature QA/QC system which has been extensively used by the geometallurgical team.

Despatch system
The default AcQuire despatch system is used to maintain a record of when samples were sent to laboratories for analysis, the types of analysis requested, when the data was returned and, where applicable, information relating to how assay analysis was conducted, such as methods and detection limits. The despatch system is largely used to assist the geometallurgical team monitor the progress of their work programs and to track outstanding or overdue test results.

Schema additions
Customisation of a commercial product such as the AcQuire database schema is not to be taken lightly and should only be performed if absolutely necessary. In this instance, the benefits of using AcQuire (the existing interface, the close association with other related AcQuire-hosted data sets, the in-house knowledge and development capabilities) made the decision to append tables to the existing database a logical one. All information relating to the ultimate design was provided to AcQuire Technologies Pty Ltd in the spirit of industry cooperation, to assist with the development of tools that would be useful to multiple industry user-bases.
There are three fundamental ‘concepts’ that do not logically correlate with the functionality provided by AcQuire: the notion of a fixed procedure definition; the fact that the same procedure may be run during multiple phases of work (but that the definition may change in each phase); and the provision of a data structure that represented a ‘test event’ on a sample.
Additionally, the mineralogy data is essentially ‘geological’ information stored about individual samples. Owing to the extremely large volume of data generated, the data structures to support this data must be highly optimised to minimise storage space and support fast querying of the data. The default AcQuire geology compound was not deemed sufficient to store the mineralogy data; a new compound was required.
Snowden Technologies created new compounds to support those requirements not fulfilled by the default AcQuire schema. These new compounds are entered into the existing AcQuire metasystem which controls interaction of the data compounds. This ensures that the customised compounds are accessible by all elements of the default AcQuire desktop interface application.

Phases and procedure definitions
The new Phase compound stores the unique Phase identifier and the intent, limitations and assumptions of each Phase. The same types of analyses can be conducted during each phase, but the procedure may evolve in each subsequent stage of work. For example, the subsamples generated by procedure ‘ABC’ in Phase 1 may be refined and different to those subsamples generated by the same procedure in Phase 2. Each procedure definition and resultant test event is linked to a Phase. A sample can be analysed during multiple phases of work.
Procedure definitions (or ‘templates’) are provided by the Procedure compound. This ensures consistency and therefore assists with standardisation of data collection, querying and reporting. A procedure definition includes a unique identifier for the analytical/metallurgical procedure, the phase of work

FIG 2 - The data relationships between tests, samples and subsamples: records in the non-AcQuire Test compound (Test 1, Test 2) are mirrored in the standard AcQuire Sample compound, where the primary sample is linked to Test 1, Test 1 to its subsample, and that subsample to Test 2 and its own subsample.
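The parent/child relationship shown in Figure 2 can be sketched as a simple parent-reference store: every test or subsample records the record it was derived from, so provenance can be walked back to the primary sample. The dict-based model below is illustrative only; AcQuire's hierarchical-key tables are considerably more elaborate:

```python
# Illustrative parent references matching the Figure 2 example:
# Test 1 is run on the primary sample, yields a subsample, and
# Test 2 is then run on that subsample. Names are placeholders.
parents = {
    "TEST1": "SAMPLE",
    "SUBSAMPLE1": "TEST1",
    "TEST2": "SUBSAMPLE1",
}

def provenance(node: str) -> list[str]:
    """Chain from a record back to its primary sample (the root)."""
    chain = [node]
    while node in parents:
        node = parents[node]
        chain.append(node)
    return chain
```

In the relational setting this walk corresponds to a self-join (or recursive query) over the hierarchical key columns; the point is that every result row remains traceable to a spatial location via its primary sample.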


FIG 3 - The ‘instantiation’ of sample tests from a procedure definition: in Phase A, procedure ABC, with ordered subsamples FLOAT1 (1), FLOAT2 (2), FLOAT3 (3) and RT (4), is applied to sample S001 to generate test S001-ABC-0 with subsamples S001-ABC-0-FLOAT1 to S001-ABC-0-RT, and a repeat test S001-ABC-1 with the corresponding S001-ABC-1 subsamples.
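The instantiation step in Figure 3 amounts to copying a procedure's subsample template and stamping each entry with the sample, procedure and run number. A hypothetical Python sketch; the `PROCEDURES` registry and `create_test` helper are illustrative, not the AcQuire implementation:

```python
# Ordered subsample template per procedure, following the Figure 3 example.
PROCEDURES = {
    "ABC": ["FLOAT1", "FLOAT2", "FLOAT3", "RT"],
}

def create_test(sample: str, procedure: str, run: int) -> dict:
    """Instantiate a test event by copying the procedure definition."""
    template = PROCEDURES[procedure]
    test_id = f"{sample}-{procedure}-{run}"
    return {
        "test_id": test_id,
        # Subsample identifiers inherit the test identifier as a prefix,
        # preserving the template's order.
        "subsamples": [f"{test_id}-{sub}" for sub in template],
        "complete": False,  # set True once laboratory results are appended
    }
```

Because each run number yields a distinct test identifier, replicate runs of the same procedure on the same sample coexist without collision, which is what allows a single run to later be flagged as ‘preferred’.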

the procedure is applicable to, a description of the purpose, assumptions and limitations of the procedure, and a listing of the valid subsamples that will be generated by the procedure. Information standard to the subsample can also be stored; for example, the order in which subsamples are generated by the execution of the procedure may be constant, so each subsample is assigned a numeric order identifier.

Test events
A test event is the execution or instantiation of a procedure against a sample. This process is displayed in Figure 3. When a geometallurgist creates a test in the database, the appropriate procedure definition for the applicable phase of work is copied to create the sampling framework of the test; at this point a unique identifier is assigned to the test event and to each subsample copied from the procedure definition. When results are received from laboratories, they are appended to the relevant subsamples and the test is marked as complete.

Mineralogy
The Olympic Dam geometallurgical program includes the capture of mineralogy data for multiple samples from both Mineral Liberation Analyser (MLA) and QEMSCAN® (QEM) automated mineralogy analysis systems. In the original program of work the MLA analysis was conducted on a split of the original drill hole composite sample. The QEM analysis was conducted on sized subsamples from flotation and leach tests; that is, the QEM test was a ‘child’ of another test. In later programs MLA analysis was also used for the sized subsamples. Of the data sets collected by these procedures, the Mineralogy compound is designed to store grain size distribution, mineral associations, the mineralogy of individual particles and grains, and an assessment of the leachability, lockability and liberation of various minerals within a sample. A comprehensive list of minerals is maintained, as well as ‘mineral suites’ which represent groupings of similar minerals that are reported by the mineralogical analysis. For example, ‘Carbonates’ includes dolomite, ankerite and siderite.
Figure 4 shows the data distribution within the geometallurgical database. The entire database is approximately 53 GB in size, of which 49.6 GB is actual analytical data. Of the analytical data, 49.1 GB (99 per cent) is contained within the mineralogy compound. To maximise performance, surrogate integer keys are used on the mineralogy tables. The use of integers reduces the amount of data stored because descriptive, alphabetic keys use more storage space. Indexes are maintained to promote the efficiency of queries conducted on the tables. A series of data views are available to provide users with a more ‘human-readable’ view of the data. SQL Server triggers (event-triggered code) are used to control the insertion, updating and deletion of mineralogy data through the views into the underlying data tables.

Security and interface
Access to the database is restricted to members of the geometallurgical team. Access is maintained by SQL Server security groups.
The interface to the geometallurgical database is via the AcQuire client application. An AcQuire workspace has been created that allows authenticated users to create phases, procedure definitions and tests, and review imported data sets. Export and reporting functions are developed on demand for the geometallurgical team.

USE OF THE SYSTEM AT OLYMPIC DAM
Figure 5 represents the activities and data flow, divided into decisions and activities and the corresponding data entry into AcQuire. The explicit definition of the phases of work, the procedures and subsamples, and the analytes and methods has already proved to be an integral part of the definition of program scopes at Olympic Dam.

Importing data into the database
At the start of each phase of work, the geometallurgical team liaises with laboratories to create a workable template that will be used to report the data results for each procedure. For the metallurgical testing (comminution, flotation, leach) the data was provided in a formatted Excel spreadsheet, which is converted to CSV (comma-separated) ASCII format prior to import.
The mineralogy testing proved to be rather more complex. For both MLA and QEM the laboratories tended towards reporting results in ‘datastores’ or individual databases that


FIG 4 - Data distribution in Olympic Dam geometallurgical database: Mineralogy 99 per cent; Other 1 per cent; Sample 0.8 per cent; Test 0.1 per cent; Despatch 0.1 per cent; Phase and Procedure negligible.
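The surrogate-integer-key approach used on the mineralogy tables can be illustrated in a few lines: rows carry a compact integer instead of a repeated mineral name, and an inverse lookup reproduces the ‘human-readable’ form that the data views provide. The key allocation below is an assumption for illustration; in the actual system the keys live in SQL Server lookup tables:

```python
mineral_keys: dict[str, int] = {}

def key_for(mineral: str) -> int:
    """Return the surrogate key for a mineral, allocating one if new."""
    return mineral_keys.setdefault(mineral, len(mineral_keys) + 1)

# Rows store (surrogate key, value) rather than repeating the name;
# the 1.5 values are dummy weight-per-cent figures for illustration.
rows = [(key_for(m), 1.5) for m in ("dolomite", "ankerite", "dolomite")]

# Inverse lookup restores the 'human-readable' view a data view would give.
names = {v: k for k, v in mineral_keys.items()}
readable = [(names[k], pct) for k, pct in rows]
```

With tens of thousands of sized fractions each carrying many mineral rows, replacing a repeated text key with a small integer is a meaningful storage and index-size saving, which is the optimisation the text describes.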

required additional processing using client software to extract results. Given the number of samples analysed, the overhead of the manual processing to extract results from individual datastores was significant. Both laboratories worked with BHP Billiton to adjust their client software, either to allow the generation of multiple results sets or by publishing a programmable API (application programming interface) that allowed the creation of a customised Visual Basic program that automatically extracted the required data from the data sets. MLA data is imported extremely efficiently from individual Access MDB databases, and QEM data is imported from a series of ASCII files automatically extracted from individual QEM datastores.
An existing automated import system designed by Snowden Technology for the AcQuire geological database was adopted to provide similar functionality to the geometallurgical team. A series of importers (either AcQuire import objects or SQL Server stored procedures) have been developed to support the import of each individual file format provided by the laboratories engaged to conduct testing for the geometallurgical programs. On receipt of results, team members transfer the digital files to a designated folder for the specific phase/procedure on a file server. Every hour, the database automatically examines each of the import folders and attempts to import the data contained in each file using the importer interface designed for the phase/procedure. Successfully loaded files are transferred to a ‘loaded’ folder; rejected files are transferred to a ‘reload’ folder. Team members periodically review the reload folder and fix the rejected files. This system reduces the time taken to enter data into the database and also provides an initial data integrity check; the importer will not load a file if it is formatted incorrectly or if the test/sample identifiers listed in the file do not match those that already exist in the database.
This aspect of data validation has proved to be extremely useful in minimising manual scrutiny of files for errors. The Olympic Dam geometallurgy program has generated more than 20 000 reports from various analytical laboratories.
These requirements have, however, required changes to some laboratory practices, such as the prevention of entering text-based comments in numeric columns, and the reporting of ‘less than detection limit’ assays using the ‘<’ symbol instead of as a negative number. Further, Olympic Dam has imposed the requirement of maintaining test and subsample nomenclature at a service provider level. Now that the Olympic Dam system is established, a clear set of guidelines can be provided to potential service providers as part of the initial tender or scope of work package.
The original data files are maintained as original source files. This increases the storage capacity requirements of the system, but is deemed an essential backup to the database as well as providing a demonstrable audit trail.

Data analysis
As described earlier, the geometallurgical team decided that the database would primarily be a data repository and the focus would be on fundamental test data. Therefore, despite the importance of uranium leach extraction, it is not stored as an explicit variable in the database and the database cannot be queried directly to deliver ‘uranium extraction’. Instead, a standard report was created containing all the information needed to calculate the extraction and other parameters from the leach tests. This is extracted and used as an input to various data analysis software packages, depending on the requirements. This ensures that the data is the most current data prior to use.
The QA/QC functionality within AcQuire was used extensively, beyond its original intent as a method of comparing duplicate data sets. It proved to be an excellent tool for comparison of assay data sets and assisted in identifying sample swaps, quick visual reviews of flotation and leach behaviour, and identifying outliers on a statistical basis for further investigation.
Other reports are developed on request and may include calculations, or generation of a report mimicking the
and the import functionality has allowed many common original worksheet. Commonly used reports which produce
data reporting errors to be intercepted. The more stringent data for multiple samples include the comminution report

THE FIRST AUSIMM INTERNATIONAL GEOMETALLURGY CONFERENCE / BRISBANE, QLD, 5 - 7 SEPTEMBER 2011 243
[Figure: flowchart of the logical data flow. Define the phase of work (assign a unique identifier, assumptions, limitations and intent); define laboratories, procedures, subsamples, assay suites and mineral lists/groupings; collect samples; define samples, identifying each physical sample with a unique identifier; devise the test schedule; create test events and subsamples (the procedure definition is copied to create an 'empty' test/subsample framework in which each element has a unique identifier); create despatches and send samples to the laboratory; analysis is performed and results are returned; import results; review results (re-testing if required); the test is finalised. All data is entered into AcQuire.]

FIG 5 - Logical data flow.
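The central step in Fig 5 — copying a procedure definition to create an 'empty' test/subsample framework, with every element carrying its own unique identifier so results can later be imported against it — can be illustrated with a minimal sketch. The class names, fields and the flotation procedure shown are illustrative assumptions, not the actual AcQuire schema.

```python
import itertools
from dataclasses import dataclass, field

# A monotonically increasing counter stands in for the
# database-generated unique identifiers in the real system.
_ids = itertools.count(1)


@dataclass
class Subsample:
    sample_id: str  # provenance: the primary sample it derives from
    name: str       # eg "Concentrate" or "Tailings"
    uid: int = field(default_factory=lambda: next(_ids))
    results: dict = field(default_factory=dict)  # empty until import


@dataclass
class TestEvent:
    sample_id: str
    procedure: str
    uid: int = field(default_factory=lambda: next(_ids))
    subsamples: list = field(default_factory=list)
    finalised: bool = False


# A procedure definition lists the subsamples that every test of
# that type produces (hypothetical flotation procedure shown).
PROCEDURES = {"FLOT01": ["Feed", "Concentrate", "Tailings"]}


def create_test_event(sample_id: str, procedure: str) -> TestEvent:
    """Copy the procedure definition to build an 'empty' framework."""
    event = TestEvent(sample_id=sample_id, procedure=procedure)
    for name in PROCEDURES[procedure]:
        event.subsamples.append(Subsample(sample_id, name))
    return event


event = create_test_event("OD-0001", "FLOT01")
print([s.name for s in event.subsamples])  # ['Feed', 'Concentrate', 'Tailings']
```

Because every test event and subsample is created with its own identifier before any results exist, sample provenance is preserved from the primary sample down to each laboratory product.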

(summarised comminution data with head assays and mineralogy); uranium mineral speciation incorporating mineralogy, assay and weight by size fraction; QA/QC reports for comparison of replicate data sets; assay extraction for minor element deportment calculations; and gangue mineralogy reports. These output data sets are then analysed in various ways with spreadsheets, statistical packages or exploratory data analysis packages, with the ultimate aim of populating the block model.

EXTENSIBILITY

Links to other databases

The samples analysed by the geometallurgical team were sourced from drill hole core. In the geometallurgical database, each sample is flagged with the identifier of the drill hole it originated from. Other data relating to the drill hole, such as geology and drilling information, is not replicated in the geometallurgical database; instead, the geometallurgical and geology AcQuire databases can be queried as one system using a linked server.

Web-based reporting

The geometallurgical database is hosted by Microsoft SQL Server 2005. A web-based reporting interface is included with this Microsoft product, which facilitates the delivery of reports and data to users who may not have access to the AcQuire system. The reporting application includes a subscription service that allows scheduled emailing of specified reports to user groups. Currently used to notify users of rejected imports, web-based reporting is likely to become a larger component of the geometallurgical database as the geometallurgical team moves towards data publication rather than data collection.
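The hourly hot-folder cycle described in the import section — scan a designated folder, check that every sample identifier in a file already exists in the database, then move the file to a 'loaded' or 'reload' folder — might be sketched as follows. The CSV layout, folder names and validation rule here are illustrative assumptions; the production importers are AcQuire import objects and SQL Server stored procedures.

```python
import csv
from pathlib import Path


def import_folder(base: Path, known_ids: set) -> dict:
    """Scan base/'import' for CSV result files; accept files whose sample
    identifiers all already exist, otherwise route them to 'reload'."""
    loaded, reload_ = base / "loaded", base / "reload"
    loaded.mkdir(exist_ok=True)
    reload_.mkdir(exist_ok=True)
    outcome = {}
    for path in sorted((base / "import").glob("*.csv")):
        try:
            with path.open(newline="") as fh:
                rows = list(csv.DictReader(fh))
            # Integrity check: every identifier must match the database.
            ok = bool(rows) and all(r.get("sample_id") in known_ids for r in rows)
        except OSError:
            ok = False  # unreadable files are treated as rejects
        dest = loaded if ok else reload_
        path.rename(dest / path.name)
        outcome[path.name] = "loaded" if ok else "reload"
    return outcome
```

In the production system this scan runs on a schedule and the validated rows are written to the database; rejected files simply wait in the reload folder for manual correction, which is what makes the pattern an effective first-pass data integrity check.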


Search library

The geometallurgy search library was introduced as a web-based interface in response to a request to make the results from the geometallurgical database widely accessible. The search library provides authenticated users with the ability to conduct 'range'-based searches (for example, find all the samples that have a copper assay between one and three per cent, have four per cent chlorite and have been analysed by flotation testing). Users of the search library do not need to understand the structure or underlying assumptions of the database; the system simply delivers summary results to them. Most users of the search library are not authenticated to use the geometallurgical database; they consume information from the system that the geometallurgical team has elected to make available.

Additional data sets

The original design and implementation of the geometallurgical database was based on the nomenclature and structure of the Olympic Dam geometallurgical team's requirements for their initial phase of work. Since implementation, the system has been identified as a potential host for various other data sets. To date, the system has incorporated information on stope grade control samples and monthly composite samples from the plant, as well as the drill core composites that it was originally designed to manage.

LESSONS LEARNED

Commitment and resources

The explicit identification of a geometallurgy program's data management requirements is critical. The data generated from a program must be managed in some form or another; it is recognition of the differences between geometallurgy and conventional metallurgical or geological programs which may lead to a geometallurgy-specific solution. The cost of a geometallurgy-specific data management solution is not necessarily greater than that of conventional methods, although this does not become apparent until the costs of an alternative or baseline are calculated.

In the case of the Olympic Dam system, there is no doubt that the resultant system is a product of the skills and experience base of the available resources, and of the commitment of the organisation to a relatively sophisticated solution. It is extremely valuable to have personnel who can liaise between the technical user base and the information system developers, leading to a streamlined development process and a clearer understanding of the possibilities and limitations. The Olympic Dam data management solution, like geometallurgy itself, requires a mixture of skills.

Implementation at the start of the program

One of the most important lessons learned was the importance of implementing the system as early as possible in the program to obtain the maximum benefit. One of Olympic Dam's test work programs was run by one of the technical teams, and it was later identified that the sample source and test work program were aligned with the geometallurgy program objectives; it therefore made sense to include the data in the geometallurgical database. However, this required a huge amount of manual work even though the number of samples was relatively small. The contract metallurgical laboratory had not maintained the sample nomenclature (every sample was renamed based on the order in which it was unpacked) and the Excel worksheets were not named consistently. Therefore every single spreadsheet report required manual rework. In addition, a major workaround was required because the metallurgical laboratory would not, or could not, supply the original assay files. In contrast, recent programs which specified the sample nomenclature and file formats have enabled loading of more than 1000 worksheets and 1000 assay files with virtually no changes.

Engagement of service providers

The willingness of the service provider to maintain test and subsample nomenclature is critical to the success of whatever system is selected. Every data management system must rely on correctly named files, and for geometallurgy programs, sample provenance (maintaining the relationship of the subsamples and data to the primary sample and its spatial location) is crucial. Sample nomenclature and reporting requirements are now included as a specific aspect of Olympic Dam geometallurgy scope of work documents. Every system has its constraints, and a number of accommodations might need to be made on both sides, but the willingness of a service provider to work within the selected system is a considerable advantage.

CONCLUSIONS

Time spent defining objectives and developing requirements is never wasted in any project, and the data management aspect of a geometallurgy program is no exception. Defining the scope of the geometallurgical test work program enables the number and form of the reports to be examined, to determine the scope of the data management challenge. Objectives and requirements relating to data storage, location and accessibility, degree of manual data management, production of data sets for analysis, program management and QA/QC should be set. The various options for data management can then be evaluated against this explicit set of objectives and requirements.

The implementation of a logical nomenclature assists in both program and data management, and should be developed at the start of the program in conjunction with the service providers. A range of commercial solutions is available based on geological data management, but some degree of customisation is required to accommodate metallurgical and mineralogical data.

It is recognised that a large geometallurgy program requires a variety of skills and experience in the areas of geology, resource estimation, metallurgy and mine planning. It is less recognised that skills in data management are equally important to achieving maximum ongoing benefit from the program. The Olympic Dam example is only one possible solution to the data management challenge, but it has demonstrated the value of a truly cross-disciplinary approach to geometallurgy data management.

ACKNOWLEDGEMENTS

The permission of BHP Billiton to publish this paper is gratefully acknowledged. In addition, the authors wish to acknowledge the expertise and assistance of Michael McMahon from Snowden Technologies in the development of the Olympic Dam geometallurgical database, and the cooperation of AcQuire Technology Solutions Pty Ltd in allowing customisation of the AcQuire database schema. Finally, the authors would like to thank all of the service providers to the geometallurgy program (Amdel Bureau Veritas Australia, David Wiseman Pty Ltd, JKTech, ALS Mineralogy, SGS Minerals, CSIRO Land and Water, and Pontifex and Associates) for their cooperation as the file format requirements for the Olympic Dam geometallurgical database developed.

