

A discussion of Deswik Pseudoflow Pit Optimization
in comparison to Whittle LG Pit Optimization

Julian Poniewierski
Senior Mining Consultant

PSEUDOFLOW EXPLAINED

1. INTRODUCTION

The purpose of this document is to inform the user about Deswik Pseudoflow, within the context of the mining industry's most accepted pit optimization process, which uses Whittle software based on the Lerchs-Grossman (LG) algorithm.

In the past 30 years, Whittle and the LG algorithm have become synonymous with the optimization of open pits. "Whittle" now suffers from having become a generic term for the process of pit optimization – similar to the genericization of the Hoover brand with vacuuming in the UK (such that in the UK, people hoover their carpets).

It took 20 years from the formulation of the LG algorithm (1965) for it to be incorporated in the first commercially available software (Whittle Three-D), with another 10 years before it became a mainstream approach to open pit optimization. It is now 15 years since the formulation of the pseudoflow algorithm, and there are at least three commercial implementations available, including Deswik's.

In summary, both are variations of network flow algorithms that achieve the same result, with pseudoflow being a computationally more efficient algorithm developed some 35 years after the original LG algorithm. Just as "hoovering" does not need to be done with a Hoover vacuum cleaner – indeed the Dyson cyclonic vacuum is recognized as far more efficient at vacuuming – the pseudoflow algorithm should now be used to replace the LG algorithm in "whittling" your pit optimization.

It should be noted that the Deswik implementation is not constrained (nor is it aided) by the extensive set-up parameter tables and inputs provided in the Whittle software for cost and revenue calculations. For Deswik's pseudoflow implementation, the user is required to calculate the revenues and costs for each block in the block model used, and is required to do their own block regularization within the Deswik.CAD environment. This enables the set-up to be as flexible as required by the user (unconstrained by template set-up dialogs). The user is thus in full control of how the costs and revenues are calculated and assigned. This does however require the user to be fully familiar with their block model, cost structures and revenue parameters (which we believe is a "good thing").
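The per-block revenue and cost calculation that the user must supply can be sketched very simply. The function and all figures below are hypothetical illustrations, not Deswik code: a block's net value is the better of processing it as ore or dumping it as waste, after paying to mine it.

```python
def block_value(tonnes, grade_pct, recovery, price_per_t_metal,
                mining_cost_per_t, processing_cost_per_t):
    """Net undiscounted value of a block if mined (hypothetical sketch).

    The block is sent to the destination (mill or waste dump) that
    gives the better outcome once the mining cost is sunk.
    """
    mining = tonnes * mining_cost_per_t
    revenue = tonnes * (grade_pct / 100.0) * recovery * price_per_t_metal
    as_ore = revenue - tonnes * processing_cost_per_t - mining
    as_waste = -mining
    return max(as_ore, as_waste)

# 1000 t at 1.0% metal, 90% recovery, $6000/t metal, $3/t mining,
# $20/t processing: revenue 54 000 less 23 000 of costs -> +31 000.
# A barren block simply carries its -3 000 mining cost.
```

In practice the user's calculation can be as simple or as elaborate as the deposit demands (multiple products, royalties, grade-dependent recovery), which is exactly the flexibility the unconstrained set-up provides.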

2. HISTORY OF PIT OPTIMIZATION

2.1. MANUAL PROCESS

Prior to the development of computerized methods of pit optimization and pit design, mining engineers used manual interpretation methods with evaluation on manually drawn cross-sections (on paper, linens, or film). A simple optimization of economic pit depth was usually carried out by hand calculator (or slide rule) for regular shaped orebodies using incremental cross-sectional areas. The incremental stripping ratio (the ratio of the tonnage of waste that must be moved to access the next mass increment of ore) on each cross-section was compared against the break-even stripping ratio for the estimated ore grade and appropriate revenue and cost inputs. The final pit shell was then produced by drawing increasingly larger pit shells on cross section such that the last increment had a strip ratio equal to the design maximum.

This was a very labor intensive approach and could only ever approximate the optimal pit. The design had to be done on a large number of cross sections and was still inaccurate because it treated the problem in only two dimensions, and it relied heavily on the "gut feel" of an experienced designer using trial and error. In cases of highly variable grade the problem became extremely complex.

2.2. FLOATING CONE

Pana (1965) introduced an algorithm called the Moving (or Floating) Cone, which is a heuristic approach. The method was developed at Kennecott Copper Corporation during the early 1960s (McCarthy, 1993) and was the first computerized attempt at pit optimization, requiring a three dimensional computerized model. Mintec/MineSight (being a US based company and an early supplier of solutions to Kennecott) were an early implementer of the floating cone algorithm (and may still offer it in their solution suite).

The projected ultimate pit limits are developed by using a technique of a moving "cone" (or rather a frustum of an inverted cone – that is, the "pointy" end has been cut to a minimum mining area), with the sides of the cone reflecting an overall pit slope. The cone is moved around in the block model space to generate a series of interlocking and overlapping frustum shaped removal increments, which can, during mining, be separated into waste and ore. However, the shortcoming of this approach is that it creates overlapping cones, and it is incapable of examining all combinations of adjacent blocks. In cases of highly variable grade, the algorithm fails to consistently give realistic results.

2.3. LERCHS-GROSSMAN

It was also in 1965 that Lerchs and Grossmann published a paper that introduced two modeling approaches to solving the open pit optimization problem: a Graph Theory algorithm, and a Dynamic Programming algorithm, the latter being an application of an operations research technique. Both methods gave an optimum pit limit for an undiscounted cash flow, based on an economic block model of an ore body and its surrounding waste, and determined which blocks should be mined to obtain the maximum dollar value from the pit. The Lerchs-Grossman (LG) algorithm is well documented in the technical literature (Lerchs and Grossman, 1965; Zhao and Kim, 1992; Seymour, 1995; Hustrulid and Kuchta, 2006).

The LG method took into account two types of information:

i. The value in dollars of each block once it has been uncovered. In the case of a waste block this will be negative and will be the cost of blasting, digging and haulage. In the case of an ore block, the removal cost will be offset by the value of the recovered ore, less any processing, sales, and other associated costs. Any block of ore is given a value which reflects this.

ii. The required mining slopes. For each block in the optimization, the LG method needs details of what other blocks must be removed to uncover it. This information is stored as "arcs" between the blocks ("nodes").

The LG method was based on a mathematical technique which was unusable in practice until a practical optimization program called Whittle Three-D was developed by Jeff Whittle of Whittle Programming Pty Ltd in the mid-1980s.
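The floating cone's blind spot – its inability to examine combinations of adjacent blocks – can be seen on a toy model. The sketch below (hypothetical values, 45 degree slopes, two levels) implements a simple floating cone pass and compares it against a brute-force search over all valid (precedence-closed) pits; it is an illustration of the failure mode, not any commercial implementation.

```python
from itertools import combinations

# Hypothetical 2D model: surface (z=0) waste blocks cost -3 each;
# two adjacent ore blocks at z=1 are worth +7 each.
BLOCKS = {(0, 0): -3, (1, 0): -3, (2, 0): -3, (3, 0): -3,
          (1, 1): 7, (2, 1): 7}

def predecessors(b):
    """Blocks one level up that must be mined before b (45 degree cone)."""
    x, z = b
    return {(px, z - 1) for px in (x - 1, x, x + 1)
            if (px, z - 1) in BLOCKS} if z > 0 else set()

def cone(b):
    """b plus everything that must be removed to reach it (transitive)."""
    todo, out = [b], set()
    while todo:
        cur = todo.pop()
        if cur not in out:
            out.add(cur)
            todo.extend(predecessors(cur))
    return out

def floating_cone():
    """Mine any ore block whose remaining cone has positive value."""
    mined = set()
    changed = True
    while changed:
        changed = False
        for b, v in BLOCKS.items():
            if v > 0 and b not in mined:
                c = cone(b) - mined
                if sum(BLOCKS[x] for x in c) > 0:
                    mined |= c
                    changed = True
    return sum(BLOCKS[b] for b in mined)

def brute_force_optimum():
    """Best value over all precedence-closed subsets (tiny model only)."""
    best = 0
    blocks = list(BLOCKS)
    for r in range(1, len(blocks) + 1):
        for sub in combinations(blocks, r):
            s = set(sub)
            if all(predecessors(b) <= s for b in s):
                best = max(best, sum(BLOCKS[b] for b in s))
    return best

# Each ore cone alone is 7 - 9 = -2, so the heuristic mines nothing;
# mining both ore blocks shares the stripping: 14 - 12 = +2.
```

The heuristic returns a pit value of 0 while the true optimum is +2, because the two ore cones only pay for the shared surface stripping together.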

Given the block values (positives and negatives) and the structure arcs, the LG method progressively builds up a list of related blocks in the form of branches of a tree (called a "graph" in mathematics). Branches are flagged as "strong" if the total of their block values is positive; such branches are worth mining if they are uncovered. Branches with negative total values are flagged as "weak"; weak branches are those which will be left behind when mining is complete.

The LG method then searches for structure arcs which indicate that some part of a strong branch lies below a weak branch. When such a case is found, the two branches are restructured so as to remove the conflict. This may involve combining the two branches into one (which may be strong or weak) or breaking a "twig" off one branch and adding it to the other branch. The checking continues until there is no structure arc which goes from a strong branch to a weak branch. At this point the blocks in all the strong branches taken together constitute and define the optimum pit.

It should be noted that this is a purely mathematical solution. Except for the information given by the arcs, the LG algorithm "knows" nothing about the positions of the blocks – nor indeed about mining. The LG algorithm works only with a list of vertices and a list of arcs. Whether these are laid out in one, two or three dimensions and how many arcs per block are used is immaterial to the logic of the method. As such the LG algorithm provides a mathematically optimum solution to the problem of maximizing the pit value (importantly, note that this is for an undiscounted cash flow value).

In effect, what the LG algorithm has done is to find the maximum closure of a weighted directed graph, in which the vertices represent the blocks in the model, the weights represent the net profit of each block, and the arcs represent the mining (usually slope) constraints.

The basic LG algorithm has now been used for over 30 years on many feasibility studies and for many producing mines, and various heuristics and techniques were developed to maximize its performance. Also note that it took some 20 years between the publication of the LG method (1965, which was also the year that the floating cone method was computerized) and the first commercially available adoption of the LG method (Whittle's Three-D).

2.4. NETWORK FLOW SOLUTIONS

In their 1965 paper, Lerchs and Grossmann indicated that the ultimate-pit problem could be expressed as a maximum closure network flow problem, but recommended their direct approach, possibly due to computer memory constraints at the time. In 1976, Picard provided a mathematical proof that a "maximum closure" network flow problem (of which the open cut optimization problem is one) is reducible to a "minimum cut" network flow problem, and is hence solvable by any efficient maximum flow algorithm. Sophisticated network flow algorithms could therefore be used in place of the LG algorithm. When the number of vertices is large (greater than a million), network flow algorithms perform orders of magnitude faster and compute precisely the same results.

One of the first efficient maximum flow algorithms used in solving the open pit optimization problem was the "push-relabel" algorithm (Goldberg and Tarjan, 1988), which numerous authors subsequently implemented (King et al., 1992; Goldfarb and Chen, 1997). This was the algorithm that MineMax implemented in their first pit optimizer software offering. Hochbaum and Chen's study (2000) showed that the push-relabel algorithm outperformed the LG algorithm in nearly all cases.

Development of more efficient network flow algorithms has continued. The generally accepted most efficient algorithms currently available are the various pseudoflow algorithms developed by Professor Dorit Hochbaum and her colleagues at the University of California, Berkeley (Hochbaum, 2001, 2008). The "highest label" implementation of the pseudoflow algorithm in particular is consistently faster than the generic LG methods and is also usually faster than the alternative "lowest label" implementation. The increase in speed can be from two to 50 times faster than the LG methods, and theoretically much faster for larger problems (Muir, 2005).

Pseudoflow methods give new life to the LG pit optimization: they calculate identical results in a fraction of the time.
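Picard's reduction of maximum closure to minimum cut can be shown in a few lines. The sketch below (hypothetical block values and layout, not Deswik's engine) builds the standard network – source to positive blocks, negative blocks to sink, infinite-capacity precedence arcs – and solves it with a plain Edmonds-Karp maximum flow; the optimal undiscounted pit value is the total positive value minus the minimum cut.

```python
from collections import defaultdict, deque

INF = float('inf')

def max_flow(cap, s, t):
    """Edmonds-Karp (BFS augmenting paths) on a nested capacity dict."""
    flow = 0
    while True:
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in list(cap[u].items()):
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:          # no augmenting path left
            return flow
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        aug = min(cap[u][w] for u, w in path)
        for u, w in path:            # push flow, open residual arcs
            cap[u][w] -= aug
            cap[w][u] = cap[w].get(u, 0) + aug
        flow += aug

def optimal_pit(values, precedes):
    """Optimal undiscounted pit value via max closure -> min cut.

    values:   {block: net dollar value}
    precedes: {block: set of blocks that must be mined first}
    """
    cap = defaultdict(dict)
    S, T = 's', 't'
    positive_total = 0
    for b, v in values.items():
        if v > 0:
            cap[S][b] = v
            positive_total += v
        elif v < 0:
            cap[b][T] = -v
        for p in precedes.get(b, ()):
            cap[b][p] = INF          # b cannot be taken without p
    return positive_total - max_flow(cap, S, T)
```

On the two-level toy model used earlier (four -3 surface blocks over two +7 ore blocks), this returns the true optimum of +2 that the floating cone misses. Production solvers replace Edmonds-Karp with pseudoflow or push-relabel, but the reduction is the same.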

3. ALGORITHM PERFORMANCE COMPARISONS

Muir (2005) gave the most comprehensive analysis of pseudoflow algorithm performance, together with a practical demonstration of the identical results achieved in comparison to the LG algorithm in solving a pit optimization. These analyses and results were presented to the mainstream mining industry in the 2005 AusIMM Spectrum series publication Orebody Modelling and Strategic Mine Planning. Key results of Muir's analysis are reproduced herein.

Table 1 (from Muir, 2005) shows the relative run-times for several variants of both the LG and pseudoflow algorithms.

Table 1 – Optimization times (seconds) to various pit levels for a 220 x 119 x 38 profit matrix (after Muir, 2005)

Bench    LG    LGS    LLP   LLPS   LLPQ   HLPQ
  26    285     56    186     91     23      9
  28    398     94    247    107     35     13
  30    632    130    327    125     54     17
  32    878    176    410    145     83     28
  34   1157    243    480    152    107     27
  36   1387    478    541    157    116     28
  38   1527    628    556    160    126     29

Legend:
LG   – Normal Lerchs-Grossmann
LGS  – Subset Lerchs-Grossmann
LLP  – Lowest Label Pseudoflow (no priority queue)
LLPS – Subset Lowest Label Pseudoflow (no priority queue)
LLPQ – Lowest Label Pseudoflow (priority queue)
HLPQ – Highest Label Pseudoflow (priority queue)

It can be seen from these results that the "highest label pseudoflow priority queue" (HLPQ) implementation took just under 2% of the time taken by the standard LG algorithm to solve the 38 bench pit optimization problem. The relative solution times shown in Table 1 are plotted in Figure 1.

[Figure 1 – Solution times for four pit optimization algorithms for different bench number pit problems]

Table 2 (from Muir, 2005) shows that the number of blocks and the profit value for the HLPQ solution were identical to the LG solution of the same 38 bench pit optimization problem.

Table 2 – Statistics for level 38 for the 220 x 119 x 38 profit matrix (after Muir, 2005)

                     LG           LLPQ         HLPQ
Profit value         57 118 058   57 118 058   57 118 058
Blocks removed       95 228       95 228       95 228
Blocks remaining     830 754      830 754      830 754
Branches relinked    950 175      329 599      420 244
Branches pruned      1 703 036    459 454      638 088
Time (seconds)       1527         126          29

It should be noted that the code written by Muir (2005) is the underlying calculation engine implemented in Deswik Pseudoflow. As a check of the correct implementation of that code, the results for the Deswik implementation were compared against four publicly available test data sets from Minelib (Espinoza et al., 2012; http://mansci-web.uai.cl/minelib/Datasets.xhtml). The Pseudoflow results were identical to the published results at Minelib. The specific data sets the results were checked against included Marvin, KD and P4HD.

In addition to Muir's paper, there are a couple of other known examples of published comparisons between the LG algorithm and network flow solutions to the pit optimization problem. The push-relabel algorithm implemented by MineMax was compared to the LG algorithm by SRK (Kentwell, 2002) and was found to produce "the same results for the actual optimal pit calculations" (to within less than 0.01%), with the differences appearing to be due to block coarseness and slopes. Jiang (2015) stated that the final pit limits from using a pseudoflow algorithm implementation versus the Whittle LG implementation have always been found to be materially the same, with any minor differences that are observed always being due to how the various implementations compute the slope angle constraints.

4. MODELING ISSUES TO NOTE

Having shown that it has been proved that the pseudoflow algorithm will give identical results to the LG algorithm, it is appropriate to also point out that no algorithmic solution will provide the exact "true" optimization solution. There are a large number of approximations inbuilt into the algorithmic solution of the pit optimization problem, as well as a number of common errors and uncertain assumptions used in the process. The huge effort dedicated to the development of sophisticated optimization algorithms is usually not matched by similar attention being paid to improving the correctness and reliability of the data used in the modeling, and the user should be aware that the process of pit optimization is based on coarse and uncertain estimated input parameters.

Deswik therefore recommend that the user concentrate on the overall picture and on getting as accurate as possible estimates of the "big ticket" items. In summary: "don't sweat the small stuff". Deswik also advise designing for risk minimization of the downside in your assumptions, but also checking the optimistic upside scenario to determine infrastructure boundaries and future development plans.

Some of the numerous sources of error, uncertainty and approximation that need to be recognized are discussed below.

4.1. SOLUTION APPROXIMATION "ERRORS"

a. The effect of using blocks with vertical sides to represent a solution (a pit design) that has non-vertical sides. It is possible to output a smoothed shell through the block centroids, but note that this will not give the same tonnes and grade result as the block based optimization when the surface is cut against the resource model blocks.

b. Slope accuracy representation. The accuracy of the overall slope created in the modeling process, with respect to the slope desired to be modeled, will depend upon the height and number of dependencies (arcs) used to define the slope. Larger blocks will generally give less slope accuracy, while smaller blocks that allow greater accuracy will require more modeled arcs (block precedencies) and will slow the processing down. An accuracy tolerance of around 1° average error is usually considered acceptable, but this will always need to be checked for suitability.

c. Changes in converting a shell to a pit design. These arise from the approximation of the overall slope with respect to the actual design, and the effects of placement of haul roads on that overall slope. A difference of 5% in tonnes is quite common during this process, and this will change the value of the selected shell used for design.

d. Effect of minimum mining width on the bottom of a shell. Many pit optimizations are undertaken without consideration of the minimum mining width at the bottom of each shell – even when the package used provides such a facility. At present Deswik's implementation of Pseudoflow does not have a tool to consider minimum mining width – but this is in the development plan.

e. Effect of stockpiling. The pit optimization algorithms – both Whittle LG and Deswik Pseudoflow – assume the value generated is the value that occurs at the time of mining, and stockpiling delays the recovery of that value. When the time value of money is taken into account, the value of a stockpiled block of ore can be a fraction of the value used in the pit optimization. Stockpiling for 10 or more years means the time related difference between when the pit optimization assigns the value and when the value is actually realized in the plan increases further. Mines with significant amounts of marginal stockpiled ore will therefore suffer a significant oversizing effect from the difference between when the algorithm values a block and when that value is actually generated. If an elevated cut-off grade policy is used in the scheduling of the pit early in the pit's life, as per the strategies advocated by Hall (2014) and Whittle (2009) as a means of maximizing the NPV (Lane, 1988), then the tonnage stockpiled will be increased.
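The erosion of a stockpiled block's value can be sketched with a one-line present-value calculation; the discount rate and block value below are hypothetical.

```python
def present_value(cash, years, rate=0.10):
    """Discount a cash flow received `years` from now at `rate` per annum."""
    return cash / (1 + rate) ** years

# A marginal block worth $100 at the time of mining recovers only a
# fraction of that if it sits on a stockpile before being processed:
# at 10% p.a., ten years on the stockpile leaves roughly $38.60 of
# present value, yet the optimizer credited the full $100 at mining.
```

This is why the value assigned to marginal stockpile feed in a shell selection can substantially overstate the value actually realized in the schedule.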

4.2. COMMON INPUT/OUTPUT ERRORS AND ISSUES

a. Using Revenue Factor (RF) = 1 shells for final pit design. The pit limits that maximize the undiscounted cashflow for a given project will not maximize the NPV of the project. This is due to the fact that the cost of waste stripping precedes the margins derived from the ore ultimately obtained: the effect of discounting means the discounted costs outweigh the more heavily discounted revenues, so the peak of the discounted cash flow for the specified case occurs at a lower overall tonnage than the peak of the undiscounted total cash curve. As discussed by Whittle (2009), when the time value of money is taken into account the curve of discounted cash value versus tonnage tends to be flat at the top, and the outer shells of the RF = 1 pit can be shown to reduce value. The optimal pit from a Net Present Value (NPV) viewpoint can lie between revenue factors of 0.65 and 0.95, depending on the deposit's structure and the mining constraints (minimum mining width, maximum vertical advancement per year, and limit on total movement). Despite the fact that this aspect is well discussed in the technical literature, selection of the RF = 1 shell is still commonly seen in the industry for ore reserves work and project feasibility studies. Once the final shell has been selected and the ultimate pit designed, the marginal material in that pit can be reconsidered for inclusion in ore reserves and stockpiling if so desired. It should also be noted that for NPV maximization, a variable cut-off grade or cut-off value policy should be adopted (as per Lane, 1988).

b. Cut-off grade. If blocks with extremely small values (cents per tonne of positive value) are left within the block model used (effectively the use of a marginal cut-off grade value of zero), then a lot of ore will be processed in the project for very little value. Additionally, it is common for the last third of the life-of-mine to be quite marginal, with a significant percentage of the ore being mined and processed for little more than practice – as discussed in Poniewierski (2016). Whilst it is worth maintaining the option to operate during this period and in this part of the deposit in case prices, costs or technology improve, this part of the resource should not be regarded as a core part of, and driver of, a project (Whittle, 2009). Deswik suggest that in order to avoid this situation, a value cut-off greater than zero be applied; a suitable value would be the minimum desired percentage margin on the processing and sales costs employed. Such blocks would have their value set to zero, so that they do not influence the optimal shell selection.

c. Errors in block model regularization and the assumed Smallest Mining Unit (SMU). If a block model is used that features grade estimated blocks smaller than the SMU size, then unrealistic mining selectivity will be built into the result; the over-selectivity can easily result in pits with an expectation of double the value of a pit selected from a block model with an appropriately sized SMU. If a model is regularized to larger than the SMU size for purposes of processing speed, the ore/waste tonnage classifications and grades at the SMU size need to be maintained and not smoothed out to the larger regularized block size.

d. Processing plant performance parameters. Aside from grade, the other big factor with significant uncertainty used in the calculation of the revenue received for a block of ore is the processing plant recovery. The commonly used constant recovery will almost always be wrong: either because it is optimistically over-estimated, because variations in recovery with grade, mineralogy and hardness are not taken into account, or because there is a fixed tail component that is not taken into account. Variations in mineralogy and hardness can be expected compared to the material on which the recovery used in the model was based. Additionally, it should be noted that project value can often be increased by sacrificing metal recovery to pursue lower cost, higher throughput – as discussed by Wooller (1999).
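The value cut-off suggested above – a minimum margin on processing and sales costs, below which marginal ore is zeroed so it cannot steer the shell selection – can be sketched as follows. The margin and costs are hypothetical, and this is an illustration of the idea rather than Deswik's implementation.

```python
def screened_value(value_per_t, proc_sales_cost_per_t, min_margin=0.05):
    """Zero out marginal ore that does not clear a minimum margin.

    Blocks with a small positive value (below `min_margin` times the
    processing and sales cost) are treated as valueless for shell
    selection; real ore keeps its value and waste keeps its cost.
    """
    cutoff = min_margin * proc_sales_cost_per_t
    if 0 < value_per_t < cutoff:
        return 0.0
    return value_per_t

# With $20/t processing and sales costs and a 5% minimum margin, any
# ore block worth less than $1/t is zeroed before optimization.
```

Material screened out this way can still be reconsidered for reserves or stockpiling once the final shell has been selected and designed.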

4.3. INPUT UNCERTAINTIES

a. Geological uncertainty. The block model has been created from sparse and imperfect data, and the accuracy of the model depends on the fault interpretation, the lithology interpretation, the mineralization limits, the mineralization grade modeling, and the competence of the geologist interpreting all the available geological data. In addition to the grade uncertainty, there is also density uncertainty and in-situ moisture uncertainty. In the author's experience, 70 to 80% of all resource models suffer from over-call to some degree: many resource models have contained metal errors of at least 10% (model over-call), and up to 30% has been seen. Cases of under-call do also occur, and these will predominate in the literature, as no-one likes to discuss the bad outcomes publicly.

b. Effect of Inferred resources. Should these be included or not included? If included, the design will change when these are converted to Indicated or Measured status (or fail to be converted). If not included, the selected shell may understate the ultimate potential of the deposit. Either way, the treatment chosen needs to be a conscious decision.

c. Costs. In the analysis of costs and revenues, there is usually a great deal of uncertainty in the costs being used in the pit optimization, which rest on data that makes assumptions and estimations about contractor costs and labor costs, and on parameters used to estimate costs such as equipment selection, annual production rate, plant capacity and requirements, fuel and power costs, and sustaining capital costs. Except for operating mines with a good understanding of their detailed cost driver history, these can easily be in error by 50% or more. There is also usually an imperfect understanding of fixed and variable costs, with costs applied that do not truly reflect the changes in costs as the pits being assessed change in size. It also needs to be noted that fixed costs (or time period costs) should be applied on the basis of the mine/mill system bottle-neck, which as a general rule is often the SAG mill (with power rather than tonnage being the limit). This is one of the biggest sources of "error" in pit optimization.

d. Geotechnical uncertainty. While a lot of focus can be spent on ensuring the desired overall angles are modeled accurately, in many cases the slopes provided for use may be little more than a geotechnical engineer's guesstimate based on very little rock mass quality data, with sparse and imperfect knowledge of faulting, jointing, bedding and hydrology. Even in operating pits, the geotechnical conditions can change quickly from those currently assumed. This is also one of the major sources of error in a pit optimization.

e. Economic uncertainty. We have to make assumptions about the macro-economic environment: commodity prices, exchange rates, inflation, interest rates, etc. are all just estimates. For the commodity price in particular, we can confidently state that the price used will be 100% wrong over the life-of-mine (it will never be one static value).

f. Dilution and loss. Dilution and loss are nearly always "guesses" – except for sites with a number of years of operating experience and a good reconciliation system that allows for assessment of the dilution and loss (which is not all that common).
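Since the price assumption is certain to be wrong, one practical response is to sweep the optimization inputs across a price range and watch which material flips between ore and waste. The sketch below uses a hypothetical marginal block to show how a downside price case turns it negative; the figures are illustrative only.

```python
def block_value_at_price(tonnes, grade_frac, recovery, price_per_t_metal,
                         total_cost_per_t):
    """Net value of a block at a given price assumption (hypothetical)."""
    metal_t = tonnes * grade_frac * recovery
    return metal_t * price_per_t_metal - tonnes * total_cost_per_t

# A marginal block: 1000 t at 0.5% grade, 90% recovery, $23/t total cost.
# Sweep the price assumption +/-30% around a $6000/t base case:
base = 6000.0
for factor in (0.7, 0.85, 1.0, 1.15, 1.3):
    v = block_value_at_price(1000, 0.005, 0.9, base * factor, 23.0)
    # At factor 1.0 the block is worth +$4000; at factor 0.7 it is
    # loss-making, so the downside case would leave it out of the pit.
```

Running the full optimization over such scenarios, rather than a single static price, supports the earlier advice to design for the downside while checking the upside for infrastructure and development planning.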

There are numerous set-up factors and parameters that can cause differences in pit optimization results, and the user should be aware of all of these to avoid falling into common error traps.

5. SUMMARY

Both the Lerchs-Grossmann and pseudoflow algorithms are variations of network flow algorithms that achieve the same result. Pseudoflow is however a computationally more efficient algorithm, developed some 35 years after the original Lerchs-Grossman algorithm (1965), with the first implementation for mining discussed in 2005 (Muir, 2005); it has now been available for use for some 15 years.

It should be noted that the Deswik implementation is not constrained (nor is it aided) by the pre-defined template inputs provided in the Whittle software for cost and revenue calculations (these templates can be restrictive both for very simple set-ups and for complex set-ups not catered for). For Deswik's Pseudoflow implementation, the user is required to calculate the revenues and costs for each block in the block model used, and is required to do their own block regularization within the Deswik.CAD environment. This enables the cost and revenue calculations to be as simple or as complex as required by the user (unconstrained by template set-up dialogs). The user is thus in full control of how the costs and revenues are calculated and assigned, but this does require the user to be fully familiar with their block model, cost structures and revenue parameters (which we believe is a "good thing").

If differences are seen between a Whittle LG result and a Deswik Pseudoflow result, it will be a difference in the set-up used.

REFERENCES

Alford, C G and Whittle, J, 1986. An application of Lerchs–Grossmann pit optimization to the design of open pit mines, in Proceedings Large Open Pit Mining Conference, pp 201-207 (AusIMM–IE Aust Newman Combined Group).

Carlson, T R, Erickson, J D, O'Brain, D T and Pana, M T, 1966. Computer techniques in mine planning, Mining Engineering, 18:53-56.

Chandran, B G and Hochbaum, D S, 2009. A computational study of the pseudoflow and push-relabel algorithms for the maximum flow problem, Operations Research, 57(2):358-376.

Dagdelen, K, 2001. Open pit optimization – strategies for improving economics of mining projects through mine planning, in Proceedings 17th International Mining Congress and Exhibition of Turkey.

Espinoza, D, Goycoolea, M, Moreno, E and Newman, A, 2012. MineLib: a library of open pit mining problems, Annals of Operations Research, 206(1):93-114.

Francois-Bongarcon, D M and Guibal, D, 1984. Parameterization of optimal design of an open pit – beginning of a new phase of research, Transactions of the AIME, 274:1801-1805.

Goldberg, A and Tarjan, R, 1988. A new approach to the maximum-flow problem, Journal of the Association for Computing Machinery, 35:921-940.

Goldfarb, D and Chen, W, 1997. On strongly polynomial dual algorithms for the maximum flow problem, Mathematical Programming B, 78(2):159-168.

Hall, B, 2014. Cut-off Grades and Optimising the Strategic Mine Plan, Spectrum Series 20, 311 p (The Australasian Institute of Mining and Metallurgy: Melbourne).

Hochbaum, D S, 2001. A new-old algorithm for minimum-cut and maximum-flow in closure graphs, Networks, 37(4):171-193.

Hochbaum, D S, 2008. The pseudoflow algorithm: a new algorithm for the maximum-flow problem, Operations Research, 56(4):992-1009.

Hochbaum, D S and Chen, A, 2000. Performance analysis and best implementations of old and new algorithms for the open-pit mining problem, Operations Research, 48(6):894-914.

Jiang, Y, 2015. Impact of Reblocking on Pit Optimization, LinkedIn article, available at https://www.linkedin.com/...optimization-yaohong-d-jiang

Kentwell, D, 2002. MineMax Planner vs Whittle Four-X – an open pit optimization software evaluation and comparison, MineMax whitepaper, available from https://www.minemax.com/downloads/Minemax-Planner-vs-FourX.pdf

Kim, Y C, 1978. Ultimate pit design methodologies using computer models – the state of the art, Mining Engineering, 30:1454-1459.

King, V, Rao, S and Tarjan, R, 1992. A faster deterministic maximum flow algorithm, in Proceedings of the Third Annual ACM-SIAM Symposium on Discrete Algorithms, Orlando, FL, January, pp 157-164.

Lane, K F, 1988. The Economic Definition of Ore: Cut-off Grades in Theory and Practice (Mining Journal Books: London).

Lerchs, H and Grossmann, I F, 1965. Optimum design of open pit mines, The Canadian Mining and Metallurgical Bulletin (Transactions), 58:47-54.

McCarthy, P L, 1993. Pit optimization, internal paper for AMC, whitepaper available on the AMC website, www.amcconsultants.com

Muir, D C W, 2005. Pseudoflow, new life for Lerchs-Grossmann pit optimization, in Orebody Modelling and Strategic Mine Planning, Spectrum Series No 14, pp 97-104 (The Australasian Institute of Mining and Metallurgy: Melbourne).

Pana, M T, 1965. The simulation approach to open-pit design, in Short Course and Symposium on Computers and Computer Applications in Mining and Exploration (eds: J Dotson and W Peters), pp ZZ-1–ZZ-24 (College of Mines, University of Arizona: Arizona).

Picard, J C, 1976. Maximal closure of a graph and applications to combinatorial problems, Management Science, 22(11):1268-1272.

Poniewierski, J, 2016. Negatively geared ore reserves – a major peril of the break-even cut-off grade, in Proceedings AusIMM Project Evaluation Conference, Adelaide, 8-9 March 2016, pp 236-247 (The Australasian Institute of Mining and Metallurgy: Melbourne).

Whittle, J, 2009. Misguided objectives that destroy value, in Proceedings Orebody Modelling and Strategic Mine Planning, pp 97-101 (The Australasian Institute of Mining and Metallurgy: Melbourne).

Wooller, R, 1999. Cut-off grades beyond the mine – optimising mill throughput, in Proceedings Third Biennial Conference on Strategic Mine Planning, pp 217-230 (Whittle Programming: Melbourne).