RISK and DECISION ANALYSIS
in Projects
Second Edition
John Schuyler
PROJECT MANAGEMENT INSTITUTE
Library of Congress Cataloging-in-Publication Data
Schuyler, John R., 1950-
  Risk and decision analysis in projects / by John Schuyler. 2nd ed.
    p. cm.
  Includes bibliographical references and index.
  ISBN 1-880410-28-1
  1. Decision making. 2. Uncertainty. 3. Risk management. I. Title.

2001019196 CIP
ISBN: 1-880410-28-1

Published by: Project Management Institute, Inc.
Four Campus Boulevard
Newtown Square, Pennsylvania 19073-3299 USA
Phone: +610-356-4600 or visit our website: www.pmi.org
© 2001 Project Management Institute, Inc. All rights reserved.
"PMI" and the PMI logo are service and trademarks registered in the United States and other nations; "PMP" and the PMP logo are certification marks registered in the United States and other nations; "PMBOK", "PM Network", and "PMI Today" are trademarks registered in the United States and other nations; and "Project Management Journal" and "Building professionalism in project management." are trademarks of the Project Management Institute, Inc. Printed in the United States of America. No part of this work may be reproduced or transmitted in any form or by any means, electronic, manual, photocopying, recording, or by any information storage and retrieval system, without prior written permission of the publisher. PMI@ books are available at special quantity discounts to use as premiums and sales promotions, or for use in corporate training programs, as well as other educational programs. For more information, please write to the Business Manager, PMI Publishing Division, Forty Colonial Square, Sylva, NC 28779 USA. Or contact your local bookstore. The paper used in this book complies with the Permanent Paper Standard issued by the National Information Standards Organization (239.481984).
CONTENTS

This book uses EV to abbreviate expected value, the central concept in decision analysis. We hope this does not cause undue confusion with the earned value concept in project management.

Preface xi

PART I – INTRODUCTION TO RISK AND DECISION ANALYSIS

Chapter 1 – Risk and Decision Analysis 3
    Introduction
    Decision Problems
    Risk and Uncertainty
    Frequency and Probability Distributions
    Expected Value, the Best Estimator
    Who Does All This Work?
    Credible Analysis
    Three Pillars of Decision Analysis
    Appendix 1A – Moment Methods

Chapter 2 – Decision Analysis Process 23
    Ten Steps toward Better Decisions
    Feel Good about Your Decision

Chapter 3 – Decision Policy 29
    Intuition Is a Poor Method
    Decision Maker's Preferences
    Attitude toward Different Objectives
    Attitude toward Time Value
    Attitude toward Risk
    Expected Value
    Popular Equations
    Summary: Expected Value, the Best Estimator
    Wastewater Plant Example
    Crane Size Decision
    Decision Policy Summary

Chapter 4 – Utility and Multi-Criteria Decisions 41
    Exceptions to Expected Monetary Value Decision Policy
    Conservative Risk Attitude
    Utility Function for Risk Policy
    Multi-Criteria Decisions
    Decision Policy Summary

Chapter 5 – Decision Trees 59
    Decision Trees
    Tree Software
    Decision Tree Summary

Chapter 6 – Value of Information 67
    Value of Information
    Common Simple Situation
    Revisiting the Wastewater Plant Problem
    Plant Information Alternative
    Value of Information Summary
    Appendix 6A – Bayesian Analysis
    Appendix 6B – Killing the Project in Time
        Gateways
        Options Add Value

Chapter 7 – Monte Carlo Simulation 81
    Approximating Expected Value
    Monte Carlo Technique
    Wastewater Plant Simulation
    Simulation in Practice
    Evaluating Alternatives
    Comparing Simulation to Trees

Chapter 8 – Project Risk Management – By the Numbers 97
    The Business Perspective
    Model Scope
    Pre-Project Risk Management
    During the Project
    Early Warnings
    Point-Forward Analysis
    Asset Value Perspective
    Keep Your Perspective
    Risk Prioritization Needs Quantification
    Continuous Risk Events
    Operational Risks
    Portfolio Risks
    Environmental Hazards
    Interest Rate and Exchange Rate
    Commodity Prices
    Analysis Risks (Reducing Evaluation Error)
    Appendix 8A – Comparison with the PMBOK® Guide – 2000 Edition
        PMBOK® Guide Sections
    Appendix 8C – Mitigating and Avoiding Risks
    Appendix 8E – Risk Management Plan

PART II – MODELING AND INPUTS

Chapter 9 – Modeling Techniques 121
    Forecasts from Models
    Deterministic Cashflow Models
    Deterministic Project Models
    Deterministic or Stochastic?
    Modeling Tools
    Dynamic Simulation Models
    Modeling Process
    Sensitivity Analysis
    Summary – Toward Credible Evaluations

Chapter 10 – Probability Distribution Types 141
    Probability Distributions
    Discrete Distributions
    Continuous Distributions
    Which Distribution Is Best?

Chapter 11 – Judgments and Biases 151
    Three Roles
    Judgments
    Biases
    Improving Evaluations
    Summary

Chapter 12 – Relating Risks 161
    Correlation
    Sources of Correlation
    Ways to Represent Correlation
    Human Factors
    Summary

Chapter 13 – Stochastic Variance 169
    Base Case versus Stochastic Model
    Example Project Model
    Variance Analysis
    Appendix 13A – New Venture Analysis

PART III – SPECIAL TOPICS

Chapter 14 – Exploiting the Best of Critical Chain and Monte Carlo Simulation 183
    Critical Chain
    Decision Analysis with Monte Carlo Simulation
    Comparing Approaches
    Combining Methods
    Summary

Chapter 15 – Optimizing Project Plan Decisions 191
    It's an Optimization Problem
    Simplifying Project Decisions
    Optimizing Activity Starts
    Incentives
    Optimization Experience
    Summary

Chapter 16 – Probability Rules 201
    Venn Diagrams and Boolean Algebra
    Thinking Logically
    Key Probability Theorems

Chapter 17 – Expert Systems in Project Management 207
    Smart Computers
    Expert Systems
    Appendix 17A – Neural Networks
    Appendix 17B – Fuzzy Logic

Appendix A – Summary of Methods 219
    Some Additional Methods

Appendix B – Decision Analysis Software 223
    Spreadsheets
    Monte Carlo Simulation
    Decision Tree Analysis
    Project Risk Management

Glossary 231
    Abbreviations
    Symbols
    Terms

Bibliography 251

Index 253
LIST OF FIGURES

Frequency Histogram of Fifty Cost Estimates
Frequency Histogram of 500 Estimates
Probability Distribution for Project Cost
Distributions of Outcome Value
Risk Taxonomy
Distribution of Project Costs
Cumulative Frequency Distribution of the A/E Ratio
Distribution of the Actual/Estimate Ratio
Average Evaluation Error versus Evaluation Number
Four Distributions with the Same Standard Deviation
Evaluation Weak Link
Decision Analysis Process
Value Meter Metaphor for Decision Policy
Crane Size Decision Tree
Decision Tree Solution with DATA
Example Utility Function
Decision Model for EMV Policy
Decision Model for EU Decision Policy
Comparing Two Alternatives
Project Value Contours and Two Alternatives
Simple Objectives Hierarchy
Three Utility Scales
Decision Criteria Distributions
Decision Node
Chance Node
Decision Tree to Evaluate Three Initial Options
Current Decision Model
Expanded Decision Model
Used Plant Prior Probabilities
Information Alternative
Delayed Skid Alternative
Inverted Probability Tree
Gateways in Drug R&D
Decision Tree Model
Solved Drug R&D Decision
Stochastic (Probabilistic) Model
Triangle Distribution for Material Cost Uncertainty
Simulation Flow Diagram
Sampling a Discrete Distribution
Probability Distribution for One Input
Cumulative Probability Distribution
Decision Model
Three Original Alternatives
Evaluation Alternative
Incremental Improvement
The Asset and Project Models
Risk Profiles Deviating from Base Case
Prioritizing Risks
Simple Contingency
Risk Response Planning Hierarchy
Contingency Model
Project Risk Database
Gantt Chart
CPM Activity-on-Arc Diagram
Typical Activity Node Representation
Four Precedence Relationships
Cash Equivalents
Evaluation Scope in Asset Life Cycle
Influence Diagram
Spider Diagram
Sensitivity Chart for Chance of Success
Tornado Diagram
Sensitivity times Uncertainty
Sensitivity Chart from Simulation
Binomial Distribution
Poisson Distribution
Continuous Distribution
Three-Level Approximation
Uniform Distribution
Triangle Distribution
Normal Distribution
Lognormal Distribution
Exponential Distribution
Beta Distribution
Roles in Project Analysis
Sources of Probabilities
Probability Wheel
Scatter Diagram Showing Correlation
XY Graph
Uncertainty versus Competency
Completion Times and Earliest Start
Distribution for NCF
Cumulative Frequency Distribution
Frequency Distribution for PV
Forecast of Activity A-E Completion Time
Cumulative Frequency Distribution for PV
Project Cost versus Time to Complete A-E
When Activity A-E Can Start
Optimizing Activity A-E's Start
Activity Network Diagram
Distribution of Project Cost
Influence Diagram of Activity Start Decision
Distribution of Slack Time after Activity A-E, as Judged
Optimization Progress
Venn Diagram
A Tree Path Is a Joint Probability
Division Point between "Tall" and "Not Tall"
Example Frame Structure
Simple NN
Project Outcome Belief Graph
Defining an "Assumption Cell" in Crystal Ball
Distribution and Statistics Screen in @RISK
Decision Tree in PrecisionTree
Sensitivity to Daily Crane Cost Variable
TopRank Sensitivity Graph
Comparing Effect of Actions on Project Uncertainty

LIST OF TABLES

Project Estimate and Actual Costs
Project Activities for Using Decision Analysis
Payoff Table for Crane Size Selection
Completion Time for All Activities except Wastewater Plant
Wastewater Plant Alternatives
Time to Acquire and Install Wastewater Plant
Plant Cost Information
Calculations in Back-Solving the Tree
Probabilities for Used Plant
Joint Probability Table
Input Distributions
Three-Level Input Distribution
Improved Base Plate Material
Spreadsheet Model
Example Variance Analysis
NCF Distribution Statistics
Comparing the Two Approaches
Project Cost for Five Cases
Delay Days Given Activities 1-5 Are Critical
Example Rules
Comparing ESs and NNs
Techniques of Uncertainty
Stochastic Method Selection Guide
are receiving great attention these days. Decision analysis is a discipline that helps people choose wisely under conditions of uncertainty. more managerial perspective from this book about the craft of project management and risks. is the most changed. Recent evaluation and project management software provides help in assessing and managing threats and opportunities. . project risk management. PMI has available another risk management book: Project and Program Risk Management: A Guide to Managing Project Risks and Opportunities. we are learning more about value creation and how to work with uncertainty. The Project Management Institute (PMIB)realized "explosive growth of about 350 percent during 19952000. Max Wideman. Furthermore. every day each of us deals withor at least worries aboutrisk. with special attention to the decisions in project management. I believe that the two books nicely complement each another. Fortunately. This book introduces risk and decision analysis. yet few of us have had formal training in decision making. PMI released A Guide to the Project Management Body of Knowledge (PMBOKB Guide) . The Guide's Chapter 11. and a value engineeringlike approach can often deliver additional value. especially.2000 Edition. edited by R. Project owners are becoming more demanding of performance and less tolerant of surprises.PREFACE Is there anything more important to the success of a project than making good decisions? This management skill is certainly near the top of the list. Shorter business cycles and evergreater competition are pressuring organizations for better resource management. Opportunities often accompany risk. We are seeing increasing interest in probabilistic techniques for all types of evaluations. As this edition was being written. Project management and. Project Risk Management. His popular book provides a different.
Organization

We divided the book into three sections:

PART I - INTRODUCTION TO RISK AND DECISION ANALYSIS. This first half of the book introduces decision analysis and discusses how it applies to project risk management. Chapter 2 outlines a general problem-solving process, and Chapter 8 describes project risk management.

PART II - MODELING AND INPUTS. Here we provide more details about project modeling, and discuss using probability distributions for uncertain inputs.

PART III - SPECIAL TOPICS. We finish by discussing some emerging techniques. These topics include critical chain project management, optimization, and expert systems.

Chapter 16 is a brief tutorial about probability rules. At the end of the book, we have appendices about decision analysis software and a table of methods for dealing with uncertainty. We hope you will find the Glossary and Index useful.

Acknowledgments

This book collects a series of eighteen articles from PM Network, PMI's professional monthly magazine. In mid-1992, I was invited to contribute an installment to a series on risk analysis edited by Richard "Dick" Westney, who initiated the idea, which eventually led to the series. The one article gave way to three installments. The first twelve articles appeared in the first month of every quarter, and the last six articles appeared in odd-numbered months during 2000.

I have been assisted by many people at PMI. Dr. Francis M. Webster, editor-in-chief (1987-1994), guided the first eight installments. Fran kept me focused on project management when I would occasionally revert to my roots. His passion for the project management discipline and his depth of knowledge made each piece robust and lucid. James Pennypacker was PMI Publications' publisher fall 1994-1998. Jim skillfully oversaw installments nine through twelve, as well as the manuscript for this book, in 1996. Sandy Jenkins, PMI Publishing's managing editor, encouraged and shepherded the PM Network articles numbers thirteen through eighteen during 2000.

I am especially indebted to the book editors, Mark Parker with the first edition and Toni Knott on this edition, who worked hard to polish my manuscript. For both editions, Michelle Owen redrew the illustrations and handled layout. Thanks Michelle, Toni, and Mark.
The PM Network series and the two book editions have been a pleasant endeavor. I have greatly enjoyed the association, professionalism, and contributions of the PMI staff. Although the probability concepts are well established, our tools and knowledge of how to apply them continuously evolve. I'm pleased to be making a small contribution toward advancing project management.

PMI Publishing and I will be grateful for your comments and suggestions, which will aid in a future revision of this book. You may send corrections to booked@pmi.org. In addition, please send comments and suggestions to me:

John Schuyler
Aurora, Colorado
john@maxvalue.com

Amplifications and corrections to the book are posted online at http://www.maxvalue.com/rdap2rev.htm
PART I

INTRODUCTION TO RISK AND DECISION ANALYSIS

Decision analysis is perhaps the most powerful management technique since the invention of the organization hierarchy. This first part introduces the methods and approach of applying decision analysis in project decisions.

Chapter 1 - Risk and Decision Analysis
Chapter 2 - Decision Analysis Process
Chapter 3 - Decision Policy
Chapter 4 - Utility and Multi-Criteria Decisions
Chapter 5 - Decision Trees
Chapter 6 - Value of Information
Chapter 7 - Monte Carlo Simulation
Chapter 8 - Project Risk Management: By the Numbers
CHAPTER 1

Risk and Decision Analysis

Probability is the language of uncertainty.

Introduction

Decision analysis (DA), a subset of operations research (also called management science), is the discipline for helping decision makers choose wisely under conditions of uncertainty. These methods explicitly recognize uncertainties in project forecasts. This analysis technology, sometimes called risk analysis, on the leading edge in the 1970s and earlier, is becoming mainstream practice. The methodology is proven, accessible, and, I hope you will agree, easily understood.

This book explores the approach and principal techniques of DA. The techniques are applicable to all types of project decisions and valuations. Committing to fund a project does not end the decision making, for decisions continue to be made throughout the project life cycle. The quality of these decisions impacts cost, schedule, and performance. This chapter introduces some initial terminology and a few important concepts.

DA involves concepts borrowed from probability theory, statistics, psychology, finance, and operations research. The formal discipline is called decision science. Operations Research/Management Science (OR/MS or ORMS) is the broad discipline of applying quantitative methods to problems of business and other organizations. Decision science emerged as a subcategory of management science during the 1960s. Fortunately, a few basic concepts in probability and statistics go a long way toward making better decisions.

Decision Problems

Most decision problems are about resource allocation: Where do we put our money, time, and other resources? Although this book is oriented toward project management decisions, DA techniques are universal. They apply in any professional discipline and to problems in your personal life.

DA is a general problem-solving process and applies to three types of problems:

- Choosing between alternatives
  - Buy, make or build, or lease
  - Size or number of units of equipment to purchase
  - Best use or disposition of an asset
- Determining the optimal values for decision variables
  - Bid amount to maximize the value of the bid opportunity
  - Optimal capacity or configuration of a facility or equipment
- Appraising value
  - Project or venture value, or elements of projects or ventures
  - Estimating transaction or project proceeds

All decisions seem to fall into one of these categories, and we evaluate each type in a similar manner.

People are maturing in traditional planning and evaluation practices. Increasing business complexity, speed, and competition pressure us to improve decision making, and dealing with uncertainty seems a logical improvement. Yet still, evaluation practices and decision policy remain a weak link in many organizations (see Figure 1.1). This is despite great sophistication in the technical science and engineering disciplines. The purpose of this book is to strengthen that weak link.

Sophisticated decision making doesn't mean more complicated, however. We can solve many decision problems on the back of an envelope. All we need is a way to measure value under uncertainty, the topic of Chapter 3. Low- and moderate-cost computer software is now available to implement the methods described in this book.

DA provides the only logical, consistent way to incorporate judgments about risks and uncertainties into an analysis. When uncertainties are significant, these techniques are the best route toward credible project decisions. Applying these techniques should lead to faster, more confident decisions. Applying the DA approach requires:

- Capturing judgments about risks and uncertainty as probability distributions
- Having a single measure of value or the quality of the result
- Putting these together with expected value calculations.
Figure 1.1 Evaluation Weak Link (disciplines shown: Material Science, Engineering Design, Production, Project Evaluation, Other Engineering and Scientific Disciplines)

Credible Analysis

Your job may involve estimating project or activity costs. How would you evaluate the quality of your estimates? Most forecast users recognize two principal, desirable characteristics in a credible analysis:

- Objectivity: Unbiased; over a number of projects, estimates proving neither too high nor too low on average.
- Precision: Reasonable closeness between estimates and actual values; that is, minimal random "noise" in the estimates.

Objectivity (lack of bias) tells us about the error of estimations compared to what actual values result. We demonstrate low bias by having an average forecast error of approximately zero. Precision tells us about the magnitude of the errors, and we desire small errors, of course. Forecast accuracy is a composite of low bias and high precision. In competitive situations, both bias and precision are important. For most internal purposes (where we're competing against nature, for example), objectivity is more important.

Applying the expected value (EV) concept enables more accurate forecasts and estimates. Understanding a few things about probability distributions will be helpful before introducing EV later in this chapter.
Before we leave the topic of credible analysis, we should recognize several additional important facets:

- The value measure in the model must correspond to the mission and objective(s) of the organization. We are trying to estimate incremental value to the organization. A decision policy tells us how outcome value is measured. For example, how are we to make trade-offs between cost, schedule, and performance? We discuss decision policy in Chapter 3.
- The analysis scope should encompass all the important effects of the decision alternatives.
- The model must faithfully represent the assumptions as presented. Decision analyses are normally very transparent: The values and calculation details are readily available for inspection by a decision maker and other interested persons.
- Those persons preparing forecasts and evaluations should provide adequate disclosure and communication.

Risk and Uncertainty

Risk and uncertainty describe the possibility of different potential outcomes. For many people, risk is approximately synonymous with uncertainty. It is sometimes helpful to make a distinction.

Risk is the quality of a system that relates to the possibility of different outcomes. Some systems feature inherent randomness, such as games of chance. In project management, the risks and uncertainties reflect unknowns and variability in nature, materials, and human systems. Informally, "risk" is used when there is a large, usually unfavorable, potential impact. Typically, the contingency event either happens or does not; for example, risk of failure. An informal distinction is that uncertainty applies when the outcome is variable. Uncertainty refers to variability in some value, such as a future commodity price. We know there is going to be a price; the uncertainty is how much.

The definitions for risk and uncertainty are controversial. Few topics will generate as much debate as trying to achieve consensus definitions for these two common words. Because there is no good antonym in English for a "good risk," many of us allow "risk" to encompass either undesirable or desirable outcomes, or both.

In business, we often classify risk events as either "threats" or "opportunities." This taxonomy, shown in Figure 1.2, is consistent with SWOT analysis, where a situation analysis focuses in turn on Strengths, Weaknesses, Opportunities, and Threats. (This is a good process for creating decision-making opportunities.) This classification is useful in project risk management, where we want to reduce the probability and impact of a threat and increase the probability and impact of an opportunity.
Some project management professionals are classifying uncertainty (i.e., a surprise or contingency) as the most general term. They then classify events with good outcomes as opportunities and those with bad outcomes as risks or threats. Another definition dimension is whether we should classify risks and uncertainties as "known unknowns" or "unknown unknowns." I'll not argue the definitions here; I am merely offering definition suggestions according to what appears to be most common usage. Even if we disagree, there is little risk of serious miscommunication.

Figure 1.2 Risk Taxonomy

Frequency and Probability Distributions

Suppose that you and forty-nine other professionals estimate Project Cost. The collected values range from $26.7 million to $76.0 million. We can display the fifty estimates as a frequency histogram, as illustrated in Figure 1.3.

Here is the procedure for charting a frequency histogram. First, divide the range of values into (typically) five to fifty intervals (segments, bins) of equal width. Next, count the number of values occurring within each interval. Now, draw a bar graph, one bar per interval, with the bar heights proportional to the number (or frequency) of occurrences within each interval. If you desire, label the y-axis with either numbers of values in the interval or percents of the total. Labeling the y-axis is optional, because anyone viewing such a chart expects that all of the data values are represented.

Suppose that, instead, we have 500 data points. We obtain a more accurate estimate of a central, average value as the number of data points increases. More data and smaller value segment partitions provide the additional detail shown in Figure 1.4. If we obtain a great many data points and use smaller intervals, the frequency distribution converges into a smooth and continuous curve, or, simply, a probability distribution: a probability density function (p.d.f.).

A probability distribution can completely represent one's judgment about a chance event. If one or more inputs to an analysis are probability distributions, then project cost and other outputs will be probability distributions also.
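The binning procedure just described translates directly into a few lines of code. The sketch below is illustrative only: the fifty estimates are synthetic values drawn from an assumed triangle distribution, not the book's data.

```python
import random

def frequency_histogram(values, num_bins=10):
    """Count how many values fall into each of num_bins equal-width intervals."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / num_bins
    counts = [0] * num_bins
    for v in values:
        # Map the value to a bin index; the maximum value goes in the last bin.
        i = min(int((v - lo) / width), num_bins - 1)
        counts[i] += 1
    return counts

# Fifty synthetic cost estimates ($ millions), roughly spanning $26.7M-$76.0M.
random.seed(1)
estimates = [random.triangular(26.7, 76.0, 40.0) for _ in range(50)]

counts = frequency_histogram(estimates, num_bins=10)
print(counts)       # bar heights, one per interval
print(sum(counts))  # every one of the 50 estimates is represented
```

Each count is a bar height; plotting the list as a bar chart reproduces the Figure 1.3 style of display.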
Figure 1.3 Frequency Histogram of Fifty Cost Estimates (x-axis: Cost Estimate, $millions)

Figure 1.4 (x-axis: Cost Estimate, $millions)

A p.d.f. represents the population of all possible outcomes. Suppose that we want to judge the uncertainty in a new cost estimate coming from the same population as the values obtained for Figures 1.3 and 1.4. Figure 1.5 shows a probability distribution for the next sample of a Project Cost estimate. This figure shows the relative likelihood of estimate values along the x-axis. The units of the y-axis are not important; the y-axis is scaled so that the area under the curve equals 1. Here, "density" describes the correspondence between area and probability.

Fitting such a curve to lots of representative data, this function would represent our best judgment of the distribution of cost estimates. Unlike a frequency distribution, the p.d.f. is not representing data; it represents a judgment about uncertainty. In the case of Figure 1.5, it represents a judgment about new cost estimates.

Figure 1.5 Probability Distribution for Project Cost (x-axis: Cost Estimate, $millions; annotated with the mode and the mean or expected value)

Popular Central Measures

There are two especially important statistics annotated in Figure 1.5:

- The most likely value is about $40 million, and we expect more data samples to be near this value than any other point. Statisticians call this peak the mode. The mode is more probable than any other value. This statistic is useful in describing a distribution's shape, yet has little application in decision making.
- The expected value, $45 million, is the probability-weighted average. This is the most important and most useful statistic: It is the best single measure of value under uncertainty. It represents the average of many values sampled from the population of project cost estimates represented by this probability distribution. Statisticians call this the mean, and use the Greek letter μ as its symbol.

If we know the probability distribution, then we know that the EV is the probability-weighted average of the distribution. We will see that the EV calculation is analogous to a center-of-mass calculation with a two-dimensional mass of uniform density.
The "expected" term misleads many people. The confusion arises because, in some situations, the EV is an impossible outcome value, and, in most cases, it is unlikely that we will realize the precise EV. Nonetheless, as we'll discuss, an EV is the best single-point estimate under uncertainty. Expected value originated in a mathematics concept called expectation theory.

The mode and the mean are two "central measures" that indicate the "center" or position of the distribution curve on the x-axis. Another central measure is the median. This centermost value, at 50 percent probability, is widely used with demographics to indicate "average" values, such as house prices or salaries [1].

Central measures, variously, represent an "average" value. Because average is so often used (abused) in different ways, it has lost most of its meaning. Whenever someone says something like, "We used an average cost for this assumption," I feel the need to ask that person what he means by "average."

Cumulative Probability Density Curve

The p.d.f. is easily converted into a cumulative (probability) density function (c.d.f.). To translate the p.d.f., moving from left to right, sum the area under the curve. This is equivalent to progressively integrating the p.d.f. Figure 1.6 shows two versions of cumulative distributions. Notice that the cumulative distribution ranges from 0 to 1 on the y-axis.

The solid curve sloping upward to the right is the c.d.f. The c.d.f. (solid curve in Figure 1.6) that we have been discussing is often called "the cumulative less-than distribution." Some people, most in fact, prefer to "flip over" this cumulative curve, rotating it around a horizontal axis. Then we have the reverse cumulative distribution. The dotted curve in Figure 1.6, sloping downward to the right, is the reverse cumulative distribution, which is also called the "exceedance" or "greater-than" form of cumulative probability distribution. This format is especially popular when larger x-axis values are better, because management is accustomed to viewing "better" on graphs as being upward or rightward.

The cumulative format allows us to read confidence levels and intervals directly. Here are example interpretations of the solid curve in Figure 1.6:

- Project Cost estimates are equally likely to be above or below $43.5 million (the median, annotated in Figure 1.6).
- About 74 percent of the Project Cost estimates are below $50 million. The 74 percent (less-than) confidence limit or confidence level is $50 million.
- About 80 percent of the Project Cost estimates are between $33 million and $57 million. This $33 to $57 million range represents an 80 percent confidence interval.

Either cumulative curve contains exactly the same information as the p.d.f.
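Reading confidence levels off a cumulative curve corresponds to a simple calculation on sorted sample data. The sketch below is illustrative, using synthetic samples from an assumed distribution rather than the book's data: the fraction of samples at or below a point is the less-than cumulative probability, and the greater-than (exceedance) form is its complement.

```python
import random
from bisect import bisect_right

random.seed(2)
# Synthetic Project Cost samples ($ millions); made-up parameters for illustration.
samples = sorted(random.triangular(20.0, 80.0, 40.0) for _ in range(10000))

def cumulative_less_than(x):
    """Fraction of samples <= x: the 'less-than' cumulative probability."""
    return bisect_right(samples, x) / len(samples)

def cumulative_greater_than(x):
    """The reverse ('exceedance' or 'greater-than') form."""
    return 1.0 - cumulative_less_than(x)

median = samples[len(samples) // 2]       # 50 percent confidence level
p10 = samples[int(0.10 * len(samples))]   # 10th percentile
p90 = samples[int(0.90 * len(samples))]   # 90th percentile
# The p10-to-p90 range is an 80 percent confidence interval.
print(round(median, 1), round(p10, 1), round(p90, 1))
```

Because the samples are sorted once, each confidence-level lookup is just a binary search.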
Figure 1.6 Cumulative Probability Distribution (x-axis: Cost Estimate, $millions; y-axis: cumulative probability, 0 to 1)

Often, the inequality symbols "<" or "≤" or ">" or "≥" are used when labeling the cumulative functions. The use of the "=" part of the symbol is common, though this is optional and irrelevant with continuous distributions, because the probability of realizing the precise boundary value is virtually zero.

Discrete Events

Thus far, we have been examining continuous distributions, where there is a continuum of possible outcomes. We have a discrete (probability) distribution when we have two to many (usually countable) possible specific outcomes. The simplest form of chance event is binary: Either something happens, or it does not.

In many situations, a risk is a binary event. Often we have a project contingency, an event that either happens or not. If this risk event occurs, then there is an impact on the project. Project risk management is mostly about considering discrete risk events and their impacts, and determining what actions are cost effective in mitigating such risks. We will discuss this process in Chapter 8 about project risk management.

Suppose we characterize a population with a p.d.f., such as in Figure 1.5. Figures 1.3 and 1.4 illustrate sampled data. If we obtain many data from this population, the frequency distribution's shape converges to the p.d.f. This is perhaps the most important concept in statistics. More commonly, one works instead with a judgment about a distribution for the cost of a single project.
Expected Value

Suppose that, in estimating Material Cost, an expert judges that the outcome could range from $2.2 million to $2.7 million, with a most likely value of $2.3 million. We then have three points: low, most likely, and high values. She further believes that the p.d.f. changes linearly between these points. These three parameters are sufficient to completely and uniquely describe the triangle distribution, illustrated in Figure 1.7. This probability distribution, or its cumulative equivalent, completely expresses her material cost assessment.

Figure 1.7 (x-axis: Material Cost, $millions)

The judged distribution comes from subjective judgment, comparable historical data, modeling, or some combination. Regardless of how it comes about, a probability distribution fully discloses someone's judgment about the value of a project model input variable.

Converts to a Single Value

How should you respond if your boss or the client insists on a single-value estimate instead of a distribution? The best single-point estimate is the expected value (EV), because EV is the objective (unbiased) value to represent a probability distribution. In the case shown in Figure 1.7, $2.4 million would be the single-point estimate. One would be confident that:

- Performing many similar projects (independent and no learning) would result in an average material cost of approximately $2.4 million.
- Over the long run, the average estimate error will approach zero.

Of course, the model and judgments that go into it must be unbiased.
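For a triangle distribution, the EV has a standard closed form: the average of the low, most likely, and high values. This brief, illustrative sketch reproduces the $2.4 million single-point estimate for the Material Cost judgment and checks it by sampling:

```python
import random

low, most_likely, high = 2.2, 2.3, 2.7   # Material Cost judgment, $ millions

# The EV of a triangle distribution is the mean of its three defining points.
ev = (low + most_likely + high) / 3
print(round(ev, 4))   # 2.4

# Sampling check: the long-run average of many "similar projects" approaches the EV.
random.seed(3)
n = 200_000
avg = sum(random.triangular(low, high, most_likely) for _ in range(n)) / n
print(round(avg, 2))
```

Note that the EV ($2.4 million) exceeds the most likely value ($2.3 million) here because the distribution is skewed toward the high side.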
Thus, EV is the only unbiased single-point forecast. Note that forecasts that use the most likely (peak) or median (50 percent confidence) values would systematically understate Material Cost.

Calculating Expected Value

There are several ways to calculate the EV of a probability distribution. If we have the distribution expressed as a mathematical formula, one might be able to solve the integral equation:

    EV = ∫ x f(x) dx, integrated from -∞ to +∞

where x is the value of the variable and f(x) is the p.d.f. of x. This equation is almost never one that we can solve with symbolic mathematics. In almost every situation, it is very difficult to solve even simple mathematical operations with probability distributions. We do not know how to integrate a project model.

Fortunately, we can solve for EVs. Mostly, we are calculating with simple equations, just multiplying and adding. For a discrete distribution with N possible outcomes, the EV is the sum of the outcome values times their probabilities:

    EV = Σ xi P(xi), summed over i = 1 to N

where xi are the outcome values, and P(xi) are the probabilities of these outcomes. This formula is the fundamental calculation used in decision trees, which we will discuss in Chapter 5. Excel's SUMPRODUCT function is a convenient way to do this calculation, such as solving EVs in payoff tables. Monte Carlo simulation, discussed in Chapter 7, is another popular calculation tool that uses a statistical sampling method to solve the EV integration.

There are perhaps more equations in this chapter than in the entire rest of the book. Do not be alarmed. Do not worry about the calculus. I have included formulas for those persons who may be interested.

EV calculations have two highly desirable properties when we are forecasting a cost or value:

- If we add an amount, X, to all outcomes, then the EV increases by X. For example, if a decision burdens a project with a cost X = $10k, and this affects all possible outcomes for this project, then the project's EV cost increases by $10k ("delta property").
- If we multiply all outcomes by a factor Y, then the EV changes by that factor. For example, suppose we split a project fifty/fifty with a partner. Then our share of EV cost, EV volume, and EV revenue is reduced by half ("distributive property").
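Both the discrete EV formula and these two properties can be verified in a few lines. The payoff table below is hypothetical, invented for illustration (it is not an example from the book):

```python
# A small, hypothetical payoff table: cost outcomes ($ thousands) and probabilities.
outcomes = [100.0, 200.0, 400.0]
probs    = [0.25, 0.50, 0.25]      # probabilities sum to 1

def expected_value(xs, ps):
    """EV = sum of outcome values times their probabilities (like Excel's SUMPRODUCT)."""
    return sum(x * p for x, p in zip(xs, ps))

ev = expected_value(outcomes, probs)
print(ev)   # 225.0

# Delta property: adding $10k to every outcome raises the EV by exactly $10k.
ev_shifted = expected_value([x + 10 for x in outcomes], probs)
assert ev_shifted == ev + 10

# Distributive property: a fifty/fifty split halves our share of the EV.
ev_half = expected_value([0.5 * x for x in outcomes], probs)
assert ev_half == 0.5 * ev
```

The same probability-weighted sum is what a decision tree performs at each chance node when it is "solved back."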
We can obtain EVs for input distributions by numeric integration or graphical methods. We usually calculate the EVs of input variables, and use these for a base case analysis. This is not the base project plan, but merely one possible projection useful for discussion. Note that the base case cost or value should not be used for decision making, because its outcome is often substantially different from the EV result. Using EV inputs often does not provide an EV result. Chapter 13, Stochastic Variance, describes this phenomenon in detail.

At first impression, it may appear that DA requires more work from the project or evaluation professional. Actually, DA can save time overall. There are several reasons for this:

- There is a process for the decision evaluation. This keeps the evaluation project team focused and directed. The focus is on making progress toward the organization's objective.
- The analysis incorporates the value function (decision policy). This measures the value of each outcome and alternative, and any trade-offs are explicitly valued in the decision policy.
- There is less work for additional what-if cases. Probability distributions already characterize the possibilities.

The main analytical benefit of using DA is improved calculation accuracy in valuing each alternative. Chapter 2 describes a ten-step DA process. I hope that you will find it common sense and in many ways similar to the problem-solving process that you already use.

Objective Project Estimates

One key to improving project estimates is performance feedback. It is important to compare forecasts to what actually occurs. This reveals the accuracy of the estimates. Accuracy, you may recall, is a combination of low bias and high precision. Project data that verify an objective estimating process will have these characteristics:

- The average estimate error approaches zero over many projects.
- This relationship (bullet above) holds for any classification subset of the data (e.g., stratified by project size or type).

Even if sufficient data are not available, we intend these features to be in an objective evaluation process.

Let's see how this works with synthetic project data. Consider the case of a company with original cost estimates and actual cost data on its last 100 projects or similar activities. Table 1.1 summarizes these data, with values for the first ten projects shown in detail. The average Project Cost is $11.2 million, with most projects in the $500,000 to $30 million range. The samples for actual costs, column 3 in Table 1.1, are from the population of projects, shown in Figure 1.8. Figure 1.9 is a graph comparing actual and estimate project data from column 5 in Table 1.1.
Table 1.1 Project Estimate and Actual Costs (dollar amounts are $000s)

Figure 1.8 Distribution of Project Costs (x-axis: Project Cost, $millions)

Figure 1.9 (Ratio Actual/Estimate)

For illustration, Figure 1.10 presents the same data as a cumulative frequency distribution. It has exactly the same information content as Figure 1.9. As discussed earlier, sometimes we prefer a cumulative distribution format that directly represents confidence levels. For example: There is about a 60 percent confidence that the actual result will be within the range of 0.75 to 1.20 times the estimate. From Figure 1.10, we see that the probability of the Actual/Estimate (A/E) ratio being less than 0.75 is about 20 percent, and the probability of A/E being less than 1.2 is about 80 percent. The width of this range is a measure of the evaluation precision. Alternatively, we could use the standard deviation or another statistic.
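The A/E feedback calculation can be sketched as follows. The data here are synthetic, and the multiplicative (median-unbiased lognormal) error model is an assumption made for illustration; it is not the book's demonstration simulation.

```python
import random, math

random.seed(4)

# Synthetic portfolio: 100 estimates with median-unbiased multiplicative noise.
n = 100
estimates = [random.uniform(0.5, 30.0) for _ in range(n)]            # $ millions
actuals = [e * math.exp(random.gauss(0.0, 0.2)) for e in estimates]  # assumed noise model

ratios = sorted(a / e for a, e in zip(actuals, estimates))

def confidence_below(r):
    """Empirical confidence that the A/E ratio falls below r."""
    return sum(1 for x in ratios if x < r) / len(ratios)

# Average error (actual minus estimate) should hover near zero if unbiased.
avg_error = sum(a - e for a, e in zip(actuals, estimates)) / n

print(round(confidence_below(1.0), 2))  # chance actual comes in under estimate
print(round(avg_error, 2))
```

Sorting the ratios and reading off cumulative fractions is exactly how a chart like Figure 1.10 is constructed from project history.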
Figure 1.10 Cumulative Frequency Distribution of the A/E Ratio (x-axis: Ratio Actual/Estimate)

There is a 56 percent chance that the actual result will be lower than the estimate. This is contrary to what many expect, and is caused by chance deviations. Since the A/E ratio is nearly 1 in these figures, these estimates appear to be unbiased. With perfectly objective analyses, the ratio will approach 1 as the number of evaluations becomes very large. Whether or not we have sufficient data to demonstrate this, it is a characteristic of objective evaluations; that is, those without appreciable bias.

Table 1.1 and Figures 1.9 and 1.10 are feedback tools for monitoring and improving project estimates. With objective analysis, the average error is diluted as the number of observations (projects) in the sample increases, as shown by the decreasing Average Error, column 6 in Table 1.1. Figures 1.11a and 1.11b show the average error plotted as a time series. Figure 1.11b shows a great volume of synthetic data on a log scale generated by the demonstration simulation. Even with objective evaluations, the Cumulative Error, shown in the last column in Table 1.1, will usually diverge in either direction.

Figures 1.11a and 1.11b Average Evaluation Error versus Evaluation Number (x-axes: Project Number)

Summary: Expected Value, the Best Estimator

Probability is the language of uncertainty. A probability distribution is a graph or mathematical expression that completely represents the likelihood of different possible outcomes. Quantifying a judgment doesn't necessarily mean that it is more accurate. However, a numeric representation is at least unambiguous. Also, conveniently, quantified risks and uncertainties can be employed in useful calculations.

We have discussed the most popular and most useful statistic, EV. For logical, consistent decision making, we want to know the EV of each alternative. Sometimes, distributions for the alternatives are difficult to compare. The EV calculation collapses a distribution into a single value number for comparison. Perception of value depends upon the preferences of the decision maker. Next, in Chapter 2, we discuss a near-universal approach to problem solving. We discuss decision policy in Chapter 3.

APPENDIX 1A

MOMENT METHODS

This section describes some other useful statistics, with particular emphasis on concepts from DA. You may skip this section if you are not inclined to read more about statistics. The second most important statistic is the standard deviation, which is a measure of uncertainty. The standard deviation, conveniently, is symbolized by the Greek letter sigma (σ).
Standard deviation is the square root of another statistic called the variance. Variance (σ²) is the EV of the squared difference between possible outcomes and the mean. For a typical continuous probability distribution, about two-thirds of the probability is in a 2σ interval centered within the highest portion of the probability distribution.

The "parameter method" and the "method of moments" are two alternative names given to these evaluation techniques. Parameter refers to statistics about the location and shape of probability distributions. Moment refers to the calculation of the statistics. A mechanical analogy is the moment arms in calculating torque [3]. The mean is the first moment about the origin for a p.d.f. The variance is the second moment about the mean, a calculation similar to the moment of inertia. Usually, only the first two moments are used [2].

Popular Equations

The fundamental equations in moment methods are based upon adding means and variances. When adding distributions, the means (µ's or EVs) always add:

E(A + B) = E(A) + E(B)

where E( ) is a common representation for the EV of whatever is inside the parentheses. When we add normal and independent distributions, the variances (σ²) also add:

σ²(A+B) = σ²(A) + σ²(B)

The sum of normal distributions is a normal distribution. If there are many additive distributions of similar magnitude, then the sum will be approximately normal regardless of the shape of the individual distributions (from the central limit theorem). This sometimes guides us to when a normal distribution is the best distribution shape. Chapter 10 discusses the most popular distribution types.

We can sometimes work with statistics of probability distributions instead of the full distributions. Equations for adding means and variances form the basis for traditional PERT calculations and many simple calculations in quality control.

Figure 1A.1 shows four distribution shapes, each with a standard deviation of ten units. Notice that the four distributions appear to have comparable widths.
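The additive formulas above lend themselves to a short calculation. The sketch below is not from the book, and the activity durations are invented; it sums independent activity-duration distributions PERT-style, adding means and variances:

```python
import math

# Hypothetical schedule: (mean, standard deviation) in days for three
# activities on a single path. Values are invented for illustration.
activities = [(10.0, 2.0), (20.0, 3.0), (15.0, 2.5)]

# Means always add when distributions are summed.
total_mean = sum(mean for mean, sd in activities)

# For independent distributions, variances (sigma squared) add as well;
# the total standard deviation is the square root of the summed variance.
total_var = sum(sd ** 2 for mean, sd in activities)
total_sd = math.sqrt(total_var)

print(total_mean)          # 45.0 days
print(round(total_sd, 3))  # about 4.387 days, well under 2 + 3 + 2.5
```

Note that the standard deviations themselves do not add; only the variances do, which is why the combined spread is smaller than the sum of the individual spreads.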
[Figure 1A.1: Four Distributions with the Same Standard Deviation (x-axis from −30 to +40); figure not reproduced.]

For example, engineers often assume component dimensions are normally distributed and independent. We can then add means and variances to get the total dimension mean and variance for an assembly.

Correlation

Correlation between two variables means that one variable somehow affects or is affected by another. Association is another name for this relationship; independence is an antonym. A positive correlation between X and Y means that an increase in X is generally associated with an increase in Y. This concept is important to the discussion that follows. The correlation coefficient ranges from −1 (perfect negative correlation) to +1 (perfect positive correlation), and is calculated in Excel with the CORREL function. If ρAB = 0, then A and B are uncorrelated; independent variables always have ρ = 0, though zero correlation by itself guarantees independence only in special cases, such as jointly normal variables.

Somewhat more complex, the variance of two added, non-independent, normal distributions is calculated by:

σ²(A+B) = σ²(A) + σ²(B) + 2σAB

or, equivalently,

σ²(A+B) = σ²(A) + σ²(B) + 2·ρAB·σA·σB

where ρAB (Greek letter rho) is the linear correlation coefficient between elements A and B. The covariance term in the earlier formula is calculated:

σAB = E[(A − µA)(B − µB)] = E(AB) − µA·µB

which, for a discrete distribution in two variables xi and yi (such as represented in a joint probability table), is the probability-weighted sum of (xi − µX)(yi − µY) over all cells.
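For readers who want to see the covariance and correlation definitions in action, here is a small sketch using invented sample data (the book itself points to Excel's CORREL function for the same calculation):

```python
import math

# Hypothetical paired observations of two project variables; the data
# are invented purely to illustrate the formulas.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Covariance: sigma_AB = E[(A - mu_A)(B - mu_B)]
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / n

# Linear (Pearson) correlation coefficient: rho = cov / (sigma_A * sigma_B)
sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs) / n)
sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys) / n)
rho = cov / (sd_x * sd_y)
print(round(rho, 4))  # close to +1 for this strongly positive relationship

# Variance of the sum of the two correlated variables,
# per the formula in the text:
var_sum = sd_x ** 2 + sd_y ** 2 + 2 * rho * sd_x * sd_y
print(round(var_sum, 4))
```

Because 2·ρ·σA·σB equals 2·σAB, the last line reproduces the covariance form of the same equation.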
The correlation coefficient used popularly in simulation is a rank-order version of this statistic. We often see an "r-squared" statistic (r², sometimes capitalized R², the coefficient of determination) in reports about regression models; it refers to the same statistic, only computed with sample data. R² expresses the portion of the variance in actual data explained by the (usually regression) model. We will revisit correlation coefficients, though without the equations, in Chapter 12, Relating Risks.

This appendix has described some statistics calculations that appear in the literature. Moment methods are sometimes useful in simple cases, and they can be useful in first approximations. However, the more rigorous calculations described in subsequent chapters are important for confidence in the evaluation calculations (except fuzzy logic in Chapter 17).

Endnotes
1. There are other central measures. The geometric mean (GM) is especially useful with inflation and other measures involving ratios. The GM is the nth root of the product of n values. There is also a harmonic mean that has applications in electronics and fluid flow.
2. The mean is the first moment about the origin. The variance is the second moment about the mean. Skewness is the third moment about the mean divided by σ³, and kurtosis is the fourth moment about the mean divided by σ⁴.
3. Moment calculations are common in engineering, such as when determining a moment of inertia. The weighted center of a probability distribution is the balance point about which the torque (sum of the first moments) is zero. If you cut out a p.d.f. printed on card stock, this figure balances on the x-axis at the EV. Thus, the EV is the probability graph's center of gravity projected to the x-axis.
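Endnote 1's geometric mean is easy to illustrate. In this sketch the annual inflation factors are invented; the calculation itself is just the nth root of the product of n ratios:

```python
import math

# Hypothetical annual inflation factors (ratios) over four years.
factors = [1.03, 1.05, 1.02, 1.04]

# Geometric mean: the nth root of the product of n values.
gm = math.prod(factors) ** (1 / len(factors))

# The GM is the constant annual factor with the same cumulative effect
# as the year-by-year factors.
print(round(gm, 5))
```

Compounding the GM for four years reproduces the cumulative inflation exactly, which is why the arithmetic mean is the wrong average for ratios.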
CHAPTER 2

Decision Analysis Process

In most work, we expect professionals to use consistent, proven processes. When you go to a dentist for some work, such as filling a tooth, you expect him to know exactly how to do the job. A sales consultant friend of mine, Don Aspromonte, uses dentistry as an example. As the dentist approaches your tooth, he isn't asking, "What method should I try today?" Rather, he will apply a tried and proven process. He has successfully filled many cavities with excellent results.

Ten Steps toward Better Decisions

Decision analysis is a process for solving problems. While most decisions are unique, the general process can be the same. Figure 2.1 outlines a ten-step sequence for decision making. The following subsections describe each step in more detail.

1. Proactively Identify Decision Opportunities

Most of us are so busy that we seldom go looking for problems. However, there may be better opportunities to add value than what we have planned. Be proactive in creating or identifying decision-making opportunities. Brainstorming techniques are helpful. A SWOT analysis is a good source of ideas: performing a situation analysis, identifying Strengths, Weaknesses, Opportunities, and Threats. Follow this with planning your strategies and tactics.

2. Define the Problem

Ask exactly what is the problem we want to solve. Are we addressing a root problem or merely a symptom?
[Figure 2.1: Ten-Step Decision Process. The figure pairs each step with example techniques and deliverables: brainstorming, SWOT analysis, and strategizing; framing and scoping the problem; identifying chance events, relationships, and sequence; building an influence diagram, often followed by a decision tree or system-flow diagram; decomposition; judged outcomes and probabilities of chance events; models to project outcomes, usually including impact on cash flow, with PV as the value measure set by policy; sensitivity analysis; rethinking the correct problem, level of detail, new alternatives, and the value of additional analysis; variance analysis and evaluation of the decision experience; and individual feedback to the experts who provided inputs.]

People often talk about framing a problem, that is, defining its boundaries. Frame and scope the problem to be addressed. Evaluate the several major approaches first; you can fine-tune later. Try to deal with root causes, not symptoms. What is the context of the decision in terms of corporate mission, goals, and strategy? What are the constraints, and how firm or real are they?
Step 2, if not Step 1, is a good time to involve the decision maker(s) and stakeholders. They may have ideas and important information to contribute, and communication with and the support of these people are important to project success.

3. Identify Alternatives

Good decision-making opportunities (Step 1) and good alternatives are more important than the actual evaluation methods. Be creative. Ensure that plenty of time and attention are devoted to this step. For a complex problem, such as designing a facility, there are often several stages of design decisions. Perhaps you should determine the best technology and location first, then fine-tune the configuration.

Many other authors include a step something like "Identify the relevant criteria" in the decision process. This includes any opinion-weighting or tie-breaking rules. I think decision policy should be known and stable and not reexamined with each new decision.

4. Develop the Decision Model

Develop a map, diagram, or outline of the decision problem. Identify the decision and chance events, their relationships, and their sequence. This should include the alternatives and the possible outcomes of each. Decision trees and influence diagrams are good techniques, because they provide an illustration of the problem. In some cases, flow diagrams or scripting may be more convenient. Does this decision affect other decisions? Are there any synergies? Sometimes we need to expand the scope of the evaluation model to encompass effects of this decision on other projects and operations.

5. Quantify Judgments about Uncertainty

From Step 4, we know what chance events are included in the model. Usually, we enlist the most knowledgeable available person to judge a chance event (as a probability distribution). We also need the expert's judgment about any relationships (correlations) with other variables in the model. Sometimes there may be multiple experts, or at least opinions, involved with a chance variable. It is the decision maker's responsibility to designate those who provide the judgments going into the model. Sometimes a chance event is difficult to judge directly. Decomposing the system further and building a submodel may be a good way to develop a distribution for a variable in the top-level decision model.
6. Develop the Valuation Model

We need a means to determine a value for every possible outcome. Every combination of decision and chance variable outcomes is a unique scenario, and the valuation model must generate a value for every possible scenario generated by the decision model. Commonly in business, we want a single number to measure the quality of the outcome. What scale you use comes from your decision policy. Most often, the model projects incremental net cash flow and present value. Utility scales are useful for risk-averse organizations and for those with multiple criteria. If using multiple criteria, your decision policy should describe the multicriteria value function.

The core of the model, and the most work if it is a special evaluation, is usually a deterministic operations and cash-flow model. If you have an existing template or economic model, we can often do this deterministic modeling step very quickly. For new problem types, creating the model might require considerable time and effort. Work to keep the model as simple as possible for the purpose at hand. For unimportant chance events, set these to single values or combine the effects with other variables. Decomposition and sensitivity analysis are important modeling techniques. The valuation model links to the decision model: the decision model tells the cash-flow model what the decision and chance variable values of the scenario are, and vice versa.

7. Calculate Expected Value for Each Alternative

Decision tree analysis and Monte Carlo simulation are the workhorses of decision analysis. There is also a calculation method for influence diagrams, closely aligned with decision trees. Payoff tables and other methods are suitable for some simple situations. It is often expedient to use multiple expected-value solving methods in the same evaluation, for example, using trees for what trees do best and simulation for what simulation does best. Discard (prune) alternatives as soon as you determine that they are inferior. We seldom need to optimize every possibility; for optimization or valuation problems, we only need to do analysis to the point where we know the best alternative. Remember, ensure that additional analysis work costs less than the incremental benefit.

8. Rethink the Problem

Are you solving the right problem? Do the decision and cash-flow models have the appropriate level of detail? Does this need to be solved now? Or at all?
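Step 7's fold-back of probability-weighted outcomes can be sketched briefly. The alternatives, probabilities, and payoffs below are invented for illustration; only the calculation pattern is the point:

```python
# Each alternative maps to its chance-node branches as
# (probability, outcome value) pairs. Data are hypothetical.
alternatives = {
    "A": [(0.6, 100.0), (0.4, -20.0)],
    "B": [(0.3, 250.0), (0.7, -50.0)],
}

def expected_value(branches):
    """Probability-weighted average of the branch outcomes."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9  # sanity check
    return sum(p * value for p, value in branches)

evs = {name: expected_value(branches)
       for name, branches in alternatives.items()}
best = max(evs, key=evs.get)
print(evs)   # {'A': 52.0, 'B': 40.0}
print(best)  # 'A' has the higher EV
```

With riskier branch sets, the same loop structure extends to multi-stage trees by folding back from the leaf nodes.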
" I suggest not inviting managers outside the project evaluation team. Continue looking for decisionmaking opportunities. mitigating threats and enhancing opportunities? 9. You may need to loop back to an earlier step to include a new alternative or risk. the person responsible for implementing the decision has also been involved in the evaluation process. The primary purpose is to help team members recognize and learn from the experience. A variance analysis may be a useful tool. templates. Ideally. Decision analysis methods are useful also for the many decisions a project manager faces. Nothing causes people to become conservative faster than a fear of punishment. This carries forward into project implementation and detailed planning. Monitor the risks and assumptions throughout the project life cycle. What is the value of additional analysis? Are there costeffective sources of information that would better define some chance events? Are there ways to add control to the strategies. The decision analysis process works best when everyone is motivated to provide his best. Implement the Best Alternative We should consider who will implement the decision in our evaluation.Chapter 2Decision Analysis Process While this tenstep process may look linear. budget to get the project team members together again to review the decision process. Anticipate where there may be decision points to reallocate resources. What did you learn? Did the decision model prove adequate? Were the outcomes of chance events in the middle regions of the distributions? Feedback is most effective when it is personal to the experts who provided inputs to the evaluation. often there is necessary rework because of new information or a new insight about the problem. and such might prove useful to later teams. Analyze the factors and contributions to the difference (variance) between forecast and actual outcome. data sources. unbiased efforts. 10. 
Placing a copy of the decision analysis document might provide inspiration to others pursuing similar evaluations. PostAnalyze the Decision Perhaps the most important single way to improve the quality of decision Plan and making in your organization is a postimplementation review [I]. The learning organizationis a popular concept in modern management. The postimplementation review is not a "search for the guilty. What are the lessons learned that you want to preserve to share with others in the organization? Models. . We want to reward people for making good decisionsnot for experiencing good outcomes. Project feasibility analysis involves some early project planning.
Who Does All This Work?

You and your colleagues do! Except perhaps for Step 6, developing the deterministic valuation model, all of this work is best performed by people close to the problem. They are typically engineers, scientists, and others who understand the technology, problem, and project. Usually, the technical problems are also people problems. The project evaluation team must often sell the recommendation not only to decision makers, but to other stakeholders as well. Involve these people early and throughout the process.

I hope you will find this general problem-solving process useful in keeping the project team focused and directed to the evaluation task at hand. This ten-step evaluation process should save you time. There should be little or no need for separate "what-if" analyses; usually, the decision model has already considered the possibilities.

Endnotes
1. The post-implementation review is the topic of Lessons Learned Through Process Thinking and Review by George Pitagorsky (2000).
CHAPTER 3

Decision Policy

With an appropriate choice of a value measure, the expected value becomes a foundation for logical, consistent decisions under uncertainty. Chapter 1 introduces probability distributions and the expected value concept: EV is the probability-weighted average of all possible outcomes represented in a probability distribution. We saw that expected value (EV) is the only unbiased predictor and the best single-value estimate for forecasting. I hope this chapter will guide you in crafting or reaffirming a logical decision policy. An example problem illustrates two tools for calculating expected value: a payoff table and a decision tree.

Intuition Is a Poor Method

Many organizations continue using decision practices that are decades out of date. Most traditional decision criteria do not work well with uncertainty, which is most of the time. Partially responsible are misconceptions in project evaluation in many of the textbooks about capital-investment analysis. We all rely daily upon our intuition, and the purpose here is not to replace intuition. Rather, the quantitative methods will bolster our intuition. Because human intuition is typically so poor at processing probabilities and dealing with details, important evaluations under uncertainty will benefit from the decision analysis approach. As with any discipline, it is important to understand the theory and to apply best practices.

Decision Maker's Preferences

Decision analysts customarily refer to a decision maker (DM) when describing the person with this role in the decision-making process. In an
organizational context, the DM acts, or at least should act, according to the organization's mission and objective(s). Without a clear set of values, one decision is as good as another. It's essential to have a grip, a clear understanding, of what your values and priorities are.

The decision analysis approach is straightforward. All three analysis problem types introduced in Chapter 1 (ranking alternatives, appraising value, and optimizing decision variables) can be solved by having a way to assess value. In addition to calculations with probabilities, the decision analysis approach needs a way to measure outcome value. Consider a value meter, illustrated in Figure 3.1 (Value Meter Metaphor for Decision Policy; after Kotkin 2000). An analogy is a voltmeter. (Decision trees, as we will soon see, look much like electronic circuit diagrams.) The value meter has a scale calibrated in dollars, another currency, or even some arbitrary scale that we design, somewhat as a voltmeter's scale is calibrated in voltage. Which alternative, the path we choose on a decision diagram, has the highest value? In making decisions, we want to find the alternative with the greatest value reading. Recognizing uncertainty (continuing the metaphor, a fluctuating voltage), we will choose, when appropriate, the identified alternative with the highest EV.

Preferences are the dimensions of value. They are the DM's attitudes toward different characteristics of decision problems, based upon beliefs and values. Most decision analysts find it convenient to segregate preferences into three categories: objective(s), time value, and risk attitude. We can deal with each preference separately, providing a more logical and consistent basis for evaluation analysis. Value is in the context of the organization's objective(s); there may be multiple objectives, and typically these are aligned with different stakeholder group interests. We examine these three dimensions of decision policy in the following sections.
Attitude toward Different Objectives

What is the mission? What is the organization's reason for existing? Creating wealth is the customary purpose of business. Is this an objective? Since stockholders own the business, most people, certainly most investors, agree that "maximizing shareholder value" is the logical and legitimate objective. The usual driving objective is maximizing the monetary value of the enterprise (or total return to stockholders). Enduring corporations, however, have a core purpose beyond just making money: something to inspire the troops and customers [1]. I cannot overemphasize the importance of being clear about the organization's mission and purpose.

Decision policy should specify how we measure "goodness" or progress toward the organization's objective. Where creating monetary wealth is the most important objective, money is a convenient measurement unit, and decision policy is often based upon money. This is the basis for valuing the outcomes of decisions. A good technique for recognizing nonmonetary considerations in the analysis is to translate them into money equivalents. Other considerations that ultimately affect cash flow and enter the evaluation include:

+ Operating legally
+ Doing business in socially acceptable ways: providing a quality product or service, creating and protecting jobs, contributing to the community, and recognizing the interests of employees, suppliers, and customers, while not trashing the environment or exploiting any group of people
+ Doing business with countries, companies, and people whom we like.

In my opinion, being a good corporate citizen is consistent with optimizing shareholder value. Otherwise, multiobjective decisions are more difficult, because it is less clear how to measure value. We will discuss multicriteria decision-making methods later in this chapter.

For example, providing great customer service is an important theme in many companies. Poor customer service destroys the brand. Yet, excessive customer service would be at the expense of shareholders. Determining the right customer-service level is an optimization problem. Investments in health, safety, and the environment are other examples with a similar solution approach.
Attitude toward Time Value

Most everyone prefers to receive money sooner rather than later. Conversely, we prefer to postpone payment obligations. We call this preference the "time value of money." Present value (PV) discounting is the generally accepted method to recognize time preference for money. The customary equation is:

PV = Σ CFt / (1 + i)^t

where
PV = present value at a reference time. This is the effective, or as-of, date, typically the effective date of the next decision.
CFt = a net cash-flow amount, positive or negative, realized at a future time.
t = the number of compounding periods, usually in years, between the as-of date and when the future cash-flow (CF) amount is realized.
i = the PV discount(ing) rate, usually expressed as a rate per year.

The discount rate, i, represents the DM's attitude toward the time value of money. This is similar to an annual interest rate, hence the symbol i. This equation always works. CF is typically projected and discounted period by period, with t measured from the as-of date to the middle of the projection period; we call this mid-period discounting. Beware that Microsoft Excel and functions in most other analysis tools typically assume, wrongly for most evaluations, that the t's should be integers.

Almost all companies use a fixed PV discount rate in their decision policy; that is, the rate chosen applies to all future periods. Sometimes it is appropriate to use different rates for different years, based on expectations about changing inflation and/or capital market conditions. I recommend that a company choose i such that PV objectively measures the increase to company value from doing the project, as viewed from an average stockholder's perspective.

When we measure the outcome in monetary units, the impact of a decision is the effect on the organization's future net cash flow. Discounting makes most sense in the context of money, but costs and benefits measured in nonmonetary units can be time-discounted as well. If the value measure is something other than money, the discount rate reasoning is difficult, and the rationale is murky. How would you represent time value for outcomes measured in human lives? Or old-growth trees? Analysts usually apply the same formula.
EV PV is the value measure broadly recommended for widely held corporations seeking to maximize monetary value. This measure is so important that it has its own name: expected monetary value (EMV).

Cash-Flow Analysis Guidelines

Many people involved with project evaluation make serious errors in PV calculations. Here are several guidelines for proper discounted cash-flow (DCF) analysis:

+ PV and net present value (NPV or PV) are generally synonymous. CFs and PVs are usually net of investment expenditures, except when calculating a benefit/cost ratio. Design the CF projection to be cash receipts net of investment and cash costs. In most cases, including financing is unnecessary because of the typical cost-of-capital reference for determining the discount rate.
+ Forecast the incremental CF effect on the company from doing the project (or whatever alternative is being considered). Often, we compare doing a project against a "do-nothing" or "current plan" alternative, set to zero value for reference. Be careful that overhead costs are legitimately incremental, and recognize other incremental effects, such as the effects of contract terms and taxes.
+ Ignore sunk costs. Do a "point-forward" analysis: previous investment expenses or other prior CFs should be ignored when analyzing today's decision. Recognize prior investments only when those amounts are part of future net CF calculations.
+ Set the PV discount rate to the company's marginal, after-tax cost of capital. A firm's chief financial officer should be able to provide this rate. This is necessary for objectively measuring incremental company value.
+ Do not increase the discount rate for risk! In decision analysis, we handle risks explicitly and separately with probabilities. Analysts often assume a risk premium in the discount rate, and this results in double risking when probabilities are applied; most companies are double-risking their projects when they use stochastic analysis. Awareness of this issue is what I consider the most important single insight that I've gained in twenty-five years of evaluation experience [2]. Also, ensure that the discount rate is consistent with whether or not the forecast model incorporates inflation.
+ Be careful about timing. Discount CFs to a single date, usually the next decision or investment point. Be sure your calculation tool properly recognizes the timing of CFs. Mid-period discounting, where a period's entire CF amount is assumed realized at the center of the respective forecast period, is the most widely accepted timing approach.
Attitude toward Risk

You may skip this section if you are comfortable with measuring value as money, and if you are unemotional about the money amounts.

Risk attitude, or risk preference, is a fascinating aspect of decision analysis. Unaided intuition, unknowingly, leads us to make inconsistent tradeoffs between value and risk. We do better in project evaluations, decisions, and negotiations by understanding the concept of risk preference. The uninitiated typically exhibit excessive conservatism ("Risk $1 million with a 90 percent chance of failure?!"). That is, most people make choices that are inconsistent with their long-term objectives.

For example, consider the manager of a large project who foregoes a chance to invest $1 million to gain a 50 percent chance at saving $10 million in project PV cost. This behavior may seem rational if the manager or project has a modest budget. For this example, the EMV of the investment decision is:

EMV = −$1 million + [.5 (+$10 million) + .5 ($0)] = +$4 million

For the second term, there are equal chances of realizing the $10 million saving or nothing; net of the investment, there are equal chances between a $1 million loss and a $9 million PV net gain. An investment like this will reduce the project's cost by $4 million on average. That is, a risk-neutral decision maker would be indifferent between having $4 million cash in hand or having this project opportunity. Risk-neutral means being objective about money; the scale of the outcomes does not matter. This is clearly a good decision in a portfolio of many similarly sized decisions. Even conservative individuals should be nearly risk-neutral for modest day-to-day decisions, and the project manager generally will be better off using a neutral attitude toward risk when evaluating alternatives. Upon realizing the implications of conservative selections, most people begin to make future decisions more objectively. This is the purpose of risk policy.

How much risk aversion is appropriate for a corporation? The ideal perspective, though perhaps impractical, is to consider decision outcomes relative to the size of the entire organization's capital. A more proper perspective, perhaps, is to compare the possible project outcomes to the collective net worth of the stockholders, as viewed from an average stockholder's perspective; one approach is to assess a typical stockholder's net worth and multiply this times the number of stockholders. Very risky projects would have little effect on individual investors. When the project's value, or better, the company's value, is large in comparison to the outcomes of a particular decision, then risk aversion should have little effect on decision making. For large, widely held companies, a nearly risk-neutral decision policy is perhaps best. This means being an EMV decision maker. The EMV decision criterion is widely used in decision analysis, either as a decision criterion by itself or as a reference to the "objective" value. This leads us to the EMV decision rule, a complete decision-policy statement suitable for most projects: Choose the decision alternative having the greatest EMV.

However, if one does want a conservative risk policy as part of the decision policy, there is a straightforward way to do this. We determine a utility function, unique to the DM or organization, that relates "value" (i.e., utility) to wealth. It defines how the DM feels about money, and this function also portrays a DM's risk policy; typically, a utility function expresses a conservative risk policy. The utility function transformation provides a guide for making tradeoffs between objective value and risk. In calculations, we translate outcome value (usually PV) into utility units, and the usual methods apply for calculating expected (value) utility. Then, the expected utility decision rule is to choose the alternative with the highest expected utility (EU). Many people may have never seen or heard of the technique. The next chapter details this topic.

Most companies use DCF analysis for project evaluations, yet most organizations fail to set such a risk-policy guideline, even though setting the guideline is straightforward. Instead, people subjectively make tradeoffs between risk and value, and they do this poorly. The result is suboptimal decision making across the organization: some parts of the organization are rejecting opportunities that would be approved elsewhere.

Decision Policy Summary

Consistent decisions in an organization require a clear decision policy. Dollars or other currency units measure value. PV discounting reflects attitude about the time value of money. For most purposes, EMV adequately measures value; fortunately, adjusting for risk aversion is not important for most decisions. If there is insufficient money to fund every attractive project, a common ranking criterion is risked discounted return on investment (DROI):

DROI = EMV / E(PV Investment)

DROI prioritizes what investments to make first. The denominator is usually money (expected PV investment) but can be any acting constraint.
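The chapter's $1 million example and the DROI ranking criterion can be checked with a few lines; the dollar figures are those from the text, while the variable names are mine:

```python
# EMV of the example: invest $1 million for a 50 percent chance of
# saving $10 million in project PV cost (figures in $ millions).
p_save = 0.5
emv = -1.0 + (p_save * 10.0 + (1 - p_save) * 0.0)
print(emv)  # 4.0, i.e., EMV = +$4 million

# A risk-neutral (EMV) decision maker accepts any opportunity
# with positive EMV, regardless of outcome scale.
assert emv > 0

# DROI, the ranking heuristic from the Decision Policy Summary:
# DROI = EMV / E(PV investment). Here the expected PV investment
# is the $1 million put at risk.
expected_pv_investment = 1.0
droi = emv / expected_pv_investment
print(droi)  # 4.0
```

Under a capital constraint, projects would be funded in descending DROI order until the budget is exhausted.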
This ranking criterion is an effective heuristic (rule of thumb) applicable to simple resource-constraint situations.

If the organization needs a conservative risk policy, we use an EU decision rule instead (described briefly in this chapter and in more detail in Chapter 4): Choose the alternative having the greatest expected (value) utility. For day-to-day decisions, these rules are equivalent. The rest of this book, except Chapter 4, applies the EMV decision rule.

Crane Size Decision

The problem-solving approach in this example is typical, and it illustrates use of the EMV decision rule. In this example, the decision is what size crane to order for a specific task. The project manager is planning Activity 15, which requires use of a crane. A Medium Crane and a Large Crane are available by rental. The medium-sized crane is adequate for performing Activity 15. A large crane, however, would accomplish the activity faster. Initial costs to bring in the crane are $10,000 for the Medium Crane and $15,000 for the Large Crane. Daily crew costs are $1,500 and $2,000, respectively. The decision pending is: Which size crane is the best alternative?

If the project goes according to the baseline plan, Activity 15 is not on the critical path ("not critical") and does not delay the overall completion time. (Critical path is discussed in Chapter 9 and Chapter 14.) A conventional project analysis sees no benefit of greater crane capacity on the project completion time; in that view, the Large Crane is inferior because of its higher cost. However, a stochastic (probabilistic) project model shows the potential for Activity 15 (A15) to be on the experienced critical path. Suppose that the probability that A15 will be critical is .30 (30 percent, the criticality index) with the Medium Crane. That is, there is a .70 chance that A15 will not be critical.

In a typical project model, we can make this approximation: If the activity is critical (on the critical path), it delays the project by the full time to complete the activity. If the activity is not critical, it has no effect on the project completion time. Table 3.1 shows the crane-size alternatives and possible outcomes, valued in activity Completion Days; it represents best judgments about completion days with either crane. We will solve this using two different presentation and calculation methods: a payoff table and a decision tree.

[Table 3.1: Delay Days Given Activity 15 Is Critical (table values not legible in this copy).]
2. is independent of whether A15 is on the critical path. for most daytoday decisions. each day's delay reduces the asset value by $17. In the Crane Size decision. Regardless. In evaluating the Crane Size alternatives. This. We will also assume that the Time to Complete A15.000 per day to the value of the asset. too. are a popular way to show the possibilities and perform the EV calculations. be aware that there are often situations when we will change an activity's criticality if we shorten its duration enough so that another path becomes critical. Symmetrically. Early project completion adds $17. the difference in EV cost is the basis for making the decision. is not always a safe assumption. with either crane. and sometimes the full model is necessary for proper calculations. this matters only if Activity 15 is on the critical path. If we wanted to show the benefit of early completion. a simplified analysis is quite appropriate. such as that illustrated in Table 3. depending upon whether Activity A15 is critical. Fortunately.000. we are concerned only with incremental project cost for each alternative: Daily ( ~) [2Initial c r y ) + Franc] cost + (g:! ) = In this calculation.Chapter 3Decision Policy Table 3 . However. we might subtract the baseline Days to Complete from the Days to Complete in computing Delay Cost. Payoff Table Which Crane Size should the project manager choose? Payoff tables. Per Day Daily Cost is either $17. . 1 Delay Days Given Activity 15 Are Critical We will assume this holds for this example analysis. the Delay Cost is always positive.000 or zero.
Table 3.2 Payoff Table for Crane Size Selection (amounts are in $thousands)

One dimension of Table 3.2 shows the chance event outcomes; I use rows in this case, one for each joint-event combination of [Critical or Not] x [Time to Complete A15]. The other table dimension (columns in this example) represents the decision alternatives. Value amounts inside the table are the "payoffs," hence the name. I also include columns to show the components of the EV calculation, though these columns are often omitted. Most people use computer spreadsheets for the layout and calculations.

Payoff tables are sometimes an excellent way to present a simple decision situation. For most people, combining values and probabilities is difficult with unaided intuition. The Crane Size decision shows the power of simple decision analysis when intuition and conventional analysis fall short.

Decision Trees

Decision tree analysis is a standard calculation method in decision analysis. Figure 3.2 shows a decision tree analysis for the Crane Size problem. Squares represent decision points, and circles represent chance events. The tree diagram is the decision model and serves as a template for calculating expected values.

Decision trees can accommodate more complex problems than payoff tables. Trees are often nonsymmetric and have subsequent decision points, which are difficult features to represent in a table. Decision trees are flexible in expressing the logic of a complex decision. They can quickly become enormous (hundreds or thousands of branches); we can solve small problems easily without special software, but decision tree software is immensely helpful. An example decision tree program is DATA (Decision Analysis by TreeAge); Figure 3.3 is a screen showing the solved tree. Chapter 5 and Chapter 6 feature decision tree analysis.
[Figure 3.2 Crane Size Decision Tree: costs in $thousands ($k), with branches for On Critical Path and Time to Complete (days) under the Medium Crane and Large Crane alternatives]

[Figure 3.3 Decision Tree Solution with DATA]
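A minimal sketch of the EV cost comparison follows. Because Table 3.1 is not legible here, the completion-day outcomes and their probabilities are invented stand-ins; only the initial costs, daily crew costs, the $17,000-per-day delay cost, and the .30 criticality index come from the example text:

```python
# EV incremental cost for each crane, following the text's cost equation:
#   Cost = Initial + Daily Crew Cost x Days + Delay Cost/Day x Delay Days,
# where delay cost accrues only if A15 turns out to be critical.
# The (days, probability) outcomes below are illustrative stand-ins,
# NOT the values from Table 3.1.
P_CRITICAL = 0.30            # criticality index for A15
DELAY_COST_PER_DAY = 17_000  # $ per day of project delay

cranes = {
    "Medium": {"initial": 10_000, "daily": 1_500,
               "days": [(16, 0.25), (24, 0.50), (34, 0.25)]},
    "Large":  {"initial": 15_000, "daily": 2_000,
               "days": [(12, 0.25), (18, 0.50), (26, 0.25)]},
}

def ev_cost(c):
    """Probability-weighted incremental project cost for one crane choice."""
    total = 0.0
    for days, p in c["days"]:
        crew = c["initial"] + c["daily"] * days
        # Expected delay cost: full activity duration counts as delay
        # only in the "critical" branch (probability P_CRITICAL).
        delay = P_CRITICAL * DELAY_COST_PER_DAY * days
        total += p * (crew + delay)
    return total

for name, c in cranes.items():
    print(name, round(ev_cost(c)))  # Medium 171700, Large 146350
```

With these made-up durations, the Large Crane's faster work more than repays its higher initial and daily costs, which is exactly the kind of benefit a conventional (non-probabilistic) analysis misses.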
Methods Summary

We have now seen two decision model formats: a payoff table and a decision tree. The payoff table works adequately in simple decision models involving a single decision point, several alternatives, and several discrete outcomes. Usually, the chance events in these tables apply symmetrically to every alternative. However, a problem's structure is often different for each alternative, and decision trees handle this easily. Decision trees are better able to represent sequences of chance events and subsequent decision points. Monte Carlo simulation, described in Chapter 7 and elsewhere, is another important EV calculation method that is better suited to some problem types.

Endnotes
1. I have detailed the rationale for determining a PV discount rate in "Rational Is Practical: Better Evaluations through the Logic of Shareholder Value," Proceedings, 1995 Society of Petroleum Engineers' Hydrocarbon Economics and Evaluation Symposium.
2. Good reading about a company's core purpose is Built to Last: Successful Habits of Visionary Companies, by James C. Collins and Jerry I. Porras (1997).
CHAPTER 4

Utility and Multi-Criteria Decisions

This chapter is for people and organizations that are not content to value decision outcomes in terms of money or money-equivalence.

Exceptions to Expected Monetary Value Decision Policy

Most businesses base their decisions upon monetary value; at least, that is the primary consideration. The cost/benefit/risk analysis focuses on the calculation of expected monetary value (EMV). Recall that EMV is the expected value (EV) of the present value (PV) of net cash flow. (The PV formula is in Chapter 3.)

Maximizing EMV is a suitable decision policy for most business situations. However, not all circumstances are appropriate for EMV maximizing. The primary exceptions are:
• When there is not enough money, or other resource, to do all of the worthwhile projects or actions. Then most organizations use a ranking criterion (described in Appendix 8B), such as the discounted return on investment described in Chapter 3.
• When possible outcome values are relatively large and the decision maker is conservative. Even if monetary wealth is the objective, a decision maker may want a conservative risk policy.
• When money is not an appropriate measure of value, or when it is not the sole measure, and non-money considerations cannot be converted easily into monetary equivalents.
This chapter shows how utility theory provides a logical, consistent way to deal with these last two situations.

Conservative Risk Attitude

Logical decision making is based upon appraisal: we appraise, or value, the alternatives in order to choose the best one, and implement it. In decision analysis, we use probabilities to represent judgments about uncertainty. In evaluating a particular alternative, we calculate or judge the value for each possible outcome. We then weigh these possible outcomes with their probabilities of occurrence. The PV outcome distribution shows the range of possible outcomes and each outcome's likelihood of occurrence. The probability-weighted average value is the EV, which, for PV of net cash flow, is the expected monetary value (EMV). In a credible analysis, all of the analysis inputs should be judged as objectively as possible. Properly done, the analysis results in an unbiased distribution of project value (or cost). The risk-neutral decision maker need only know the EV PV, and the EMV decision rule would be the organization's decision policy.

Example of Risk Aversion

What does it mean to be "risk-neutral" or "conservative?" Suppose your company is having an expensive component manufactured, and manufacturing cost is a major uncertainty. There is a large fabrication contingency that potentially will quadruple the component's fabrication work and cost. You are negotiating a cost-plus fabrication contract with the supplying manufacturer. Under the cost-plus contract, you, the buyer, bear the cost risk. Discussions have been candid, and both parties are sharing all data, especially cost details. Assume that monetary value, PV, is the measure of value, and that other factors, such as performance and schedule, are fixed. (Throughout our discussion, you may rescale the amounts to make the outcomes more closely match the magnitude of important costs typically encountered in your business.)

With the cost-plus contract, the possible cost outcomes to your company, the buyer, and their probabilities are:

Best Outcome = $1 million cost, p = .90
Worst Outcome = $4 million cost, p = .10

Then:

EV cost = .9($1 million) + .1($4 million) = $1.3 million
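The EV arithmetic can be checked in two lines:

```python
# EV of the buyer's cost under the cost-plus contract (amounts in $millions)
outcomes = [(1.0, 0.90),   # best: $1 million fabrication cost
            (4.0, 0.10)]   # worst: the contingency quadruples the cost
ev_cost = sum(cost * p for cost, p in outcomes)
print(round(ev_cost, 2))  # 1.3
```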
You are about to close the deal when the supplier surprises you by suggesting that she is willing to change to a fixed-cost contract. She asks, "What fixed price would you be willing to pay instead [1]?" The equivalent fixed price, to you, depends on your company's attitude toward risk:
• If the company is risk-neutral, it would be indifferent toward either paying a fixed $1.3 million or accepting the cost-plus contract outcomes and risks.
• If the company is willing to pay somewhat more than the $1.3 million EV cost, it is conservative or risk-averse. For example, a conservative buyer might agree to a higher price, say, $1.5 million.
• If the risky cost-plus contract is preferred to a fixed $1.3 million price, the company is risk-seeking in this situation. This is unusual behavior, except when there is entertainment value in gambling or when striving to reach a goal.

Conservative Behavior

Conservative behavior is widespread, almost universal. Willingness to pay a risk premium is common decision behavior: everyone pays more than the EV payoff when playing casino games or buying lottery tickets. Should risk attitude be an important part of decision policy? Yes, if it is important; often, decision makers will make substantial value adjustments for risk attitude.

For large, publicly held corporations and governments, risks are shared by many investors. For these entities, decision makers can afford to be risk-neutral. For individuals and small, closely held companies, a conservative risk policy is more suitable. Even so, decisions with large potential gains or losses may require conservatism. For small decisions, many of us believe that the EMV decision rule is appropriate. The EV concept still applies for decision making under uncertainty; when the outcomes become large compared to the reference net worth, however, a conservative risk policy becomes important.

Utility is a measure of value reflecting the preferences of the decision maker, based upon beliefs and values. Utility theory describes how to modify an EMV decision analysis for a conservative risk attitude. With utility, outcome value is measured in utility units instead of PV. The decision rule is to choose the alternative having the highest expected (value) utility (EU). Small, day-to-day decisions will be almost exactly the same as with the EMV decision rule.
Utility Function for Risk Policy

In this section, we are using utility theory only for expressing risk attitude. Later, we will discuss how utility theory applies also to multi-criteria decisions.

Economists and decision theorists use the word utility synonymously with value. Assume that maximizing monetary value is the objective. The objective value measure, then, would be PV using an appropriate discount rate. However, we could graph a function relating how we perceive value to the actual monetary value. This graph is a utility function. It translates objective value, PV (x-axis), into perceived value, utility (y-axis). Figure 4.1 shows an example utility function. The curve, for most persons, is usually deduced from the decision maker's answers to a series of hypothetical decision problems; I prefer the smooth exponential shape.

The upward-sloping straight line in Figure 4.1 represents the utility function of a risk-neutral decision maker, an EMV decision maker, where value is proportional to PV. This is a risk-neutral decision policy, and the objective measure is EMV. For conservative persons, however, value is not a linear function of PV. Utility curves that are concave downward represent conservative risk attitudes. Risk aversion is exhibited in what economists call the law of diminishing marginal utility: incremental positive amounts add incrementally less value as wealth is accumulated. Twice the positive PV does not represent twice as much value. As an example, winning a $1 million prize would be a tremendously exciting event; if one already has $10 million, the added $1 million would be hardly noticed. Thus, for positive PVs (benefits), incremental utility value decreases as PVs get larger. Conversely, costs or losses are amplified for negative PVs.

For small decisions, near zero PV, the straight line and the curve are nearly coincident. The result is that a value determined by EMV is nearly identical with the value obtained when recognizing risk aversion.

The y-axis scale is arbitrary, regardless of shape, including choice of origin. Utility units are often called utils. Many decision analysts scale the y-axis so that the worst possible outcome has utility = 0 and the best possible outcome has utility = 1. Rather than arbitrarily defining the scale, I prefer to scale the y-axis in somewhat tangible units: risk-neutral (RN) dollars. This makes the utility measure more meaningful: an RN$1 million cost is one million times worse than a $1 cost, and a +RN$1 million benefit is one million times better than a $1 benefit.
[Figure 4.1 Example Utility Function: utility (RN$ millions) versus outcome PV ($millions), showing a risk-neutral straight line and a concave exponential curve for a conservative risk policy]

In principle, we could use a graph such as that in Figure 4.1 to convert money to utility units. However, the calculation precision and convenience are better if we have a formula for the curve shape. I use and recommend this exponential utility function formula:

U(x) = r(1 - e^(-x/r))  (in units of "RN$")

where
x is the outcome value in PV$
r is the risk tolerance coefficient in $

In Figure 4.1, r = $10 million. The risk tolerance coefficient is typically one-fifth or one-sixth of the organization's net worth; some theorists suggest that this curve is appropriate to a company with a net worth of about $60 million. The degree of risk aversion is usually proportional to the size of the company. The organization's net worth and the one-fifth or one-sixth factor produce a risk tolerance coefficient (r) that is too low for widely held, public corporations, in my opinion. A higher r is more suited because many diversified stockholders share the risks.

The EV of U(x) is called EU. The inverse function, for obtaining the certainty equivalent (CE), is:

CE = -r ln(1 - EU/r)

where EU is the EV utility.

Certainty Equivalent

A simple example will show how we use the utility function to help make decisions. Figure 4.2 shows an EMV-based decision tree for the contract example. The objective cost forecast and assessment is the EV of the possible outcomes. The cost-plus contract alternative, at $1.3 million EV cost, and the fixed-price contract alternative have equal value to a risk-neutral decision maker.

[Figure 4.2 Decision Model for EMV Policy: EMV = $1.3 million; the cost-plus branch has outcomes of $1 million and $4 million, versus a fixed price of $X]

Recognizing a conservative risk policy requires a slight adjustment to the method. Assume that the company's risk policy is the utility function we saw in Figure 4.1. What we need to know is what guaranteed cost, $X, is equivalent to the uncertain cost-plus alternative. Here, $X represents what we call the certainty equivalent (CE).

First, calculate the EU of the cost-plus contract. From the utility function of Figure 4.1 or its equation:

U($1 million) = RN$1.0517 million
U($4 million) = RN$4.9182 million

(I'm showing the additional decimal places to demonstrate the calculation method, rather than to imply that we have great precision in these numbers.) Calculating EU (actually, EV utility):

EU = .9(1.0517) + .1(4.9182) = RN$1.4384 million

We cannot compare dollars and utility because of the different units. EU is suited for comparing alternatives that all have uncertainties; here, however, we are comparing an uncertain alternative to a fixed-dollar outcome. We use the utility function to translate EU into CE. Figure 4.1 (or the inverse transform equation) shows that EU = RN$1.4384 million corresponds to a PV of $1.3439 million. Thus, for this company, a $1.3439 million fixed-price contract is equivalent to the uncertain cost-plus contract. Figure 4.3 shows the decision tree solution solved with the EU decision policy.

[Figure 4.3 Decision Model for EU Decision Policy: EU = RN$1.4384 million; CE = $1.3439 million, the point of indifference; outcome utilities RN$1.0517 million for the $1 million cost and RN$4.9182 million for the $4 million cost]

Note that CE and EMV are nearly the same in this case. The approximately $44,000 (= $1.3439 - $1.3000 million) difference between CE and EMV is called the risk premium. This premium is the additional amount the company is willing to pay, or sacrifice, to avoid risk. The risk premium is only 3.4 percent of the EMV, even though the PV range is 30 percent of the scale.

Applying the Risk Policy

In decision analysis, we value the alternatives with either their EUs or CEs. Either the EU or CE criterion results in the same decisions. The EMV decision policy is a special case: CE = EMV when the utility curve is a straight line (that is, when r is infinite). Interesting situations arise when the highest EMV alternative is not the same as the highest EU (or CE) alternative. That is the point of having a risk policy: to show which risk-versus-value tradeoffs are appropriate. When evaluating alternatives with utility, if any alternative is a fixed-dollar amount, then conversion to CEs is necessary. I like to calculate every node's EMV, EU, and CE when backsolving decision trees. If there are costs placed along tree branches, these costs may then be subtracted from the CEs as the tree is backsolved.

An interesting application arises in a problem of optimizing participation fraction, or "working interest," in a project. If the project has a positive EMV, the EMV decision rule says, "Take it all." (We are assuming that not investing is the alternative and has a $0 value.) Keeping 100 percent of the project is often desirable. However, if the project cost is high and there is sufficient risk, you may not want it all; if the potential downside is serious, it may be advisable to share the risk with one or more partners. What is the optimal interest you would want in a project? A utility function representing risk attitude is the logical way to solve this problem. By iteration, deduce the shape of the function of EU (or CE) versus participation fraction, and choose the participation interest that maximizes EU. If the project has a positive EMV, then some fraction of ownership is always desirable; with sufficient risk, a fractional share of the project will maximize utility.

When running a Monte Carlo simulation (we'll discuss how in Chapter 7), we calculate the outcome PV, then convert it to utility units, in every trial. EU is the average of this utility-measured outcome. When using simulation, it is customary to provide the decision maker with risk-versus-value profile curves. These are typically reverse cumulative frequency distributions for PV, and they are the natural presentation format for simulation results. Figure 4.4 shows example reverse cumulative distribution curves produced by simulation; most decision tree software provides similar, although sometimes more stairstep-looking, graphs.

[Figure 4.4 Comparing Two Alternatives: reverse cumulative probability versus PV of net cash flow ($millions), with a $7.5 million EMV marked]

In Figure 4.4, Alternative B has a higher EMV and has greater uncertainty. Which alternative has the best risk-versus-value profile? That depends upon the decision maker, who may visually compare the curves, allowing his intuition to make the decision. This is the typical and unfortunate reality: alternatives with better EMVs are usually associated with greater risk. When the curves cross, and when the best-EMV alternative is riskier (wider), having a risk policy is useful.
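The certainty-equivalent arithmetic above can be verified directly from the two formulas. In this sketch, costs are carried as negative PVs in $millions, with r = $10 million as in Figure 4.1:

```python
import math

R = 10.0  # risk tolerance coefficient r, in $millions

def utility(x):
    """Exponential utility in RN$; x is outcome value (PV) in $millions."""
    return R * (1.0 - math.exp(-x / R))

def certainty_equivalent(eu):
    """Inverse utility: CE = -r ln(1 - EU/r)."""
    return -R * math.log(1.0 - eu / R)

# Cost-plus contract outcomes, as values (costs are negative PVs):
outcomes = [(-1.0, 0.90), (-4.0, 0.10)]
eu = sum(p * utility(x) for x, p in outcomes)
ce = certainty_equivalent(eu)
emv = sum(p * x for x, p in outcomes)

print(round(eu, 4))        # -1.4384  (RN$ millions; a cost)
print(round(ce, 4))        # -1.3439  (equivalent fixed price, $millions)
print(round(ce - emv, 4))  # -0.0439  (risk premium, about $44,000)
```

The signs come out negative because costs are negative values; dropping the signs reproduces the RN$1.4384 million EU, the $1.3439 million CE, and the roughly $44,000 risk premium worked out in the text.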
The more consistent, and recommended, way is to convert each curve into its EU (or CE), then choose the best alternative based upon the EU decision rule.

The utility function is a succinct, complete risk policy for an organization or individual decision maker. It enables decision makers at all levels in the organization to make consistent risk-versus-value tradeoffs. This is a powerful concept for a conservative company that wants to delegate decision making downward.

Multi-Criteria Decisions

Multi-criteria decision making (MCDM) is an approach for problems where value is multi-dimensioned. Usually these arise because of multiple objectives or goals. A city government, for example, faces tradeoffs across many dimensions such as schools, roads, and police protection. It can apply MCDM to solve such problems in a reasonably consistent way.

There are three principal ways to deal with MCDM problems:
• Recast the problem solely as an economic issue, and maximize EMV, EU, or CE.
• Convert non-money considerations into monetary equivalents, and maximize EMV, EU, or CE.
• Use a multi-criteria value function (which can have risk aversion embedded), then choose the best alternative based upon the EU decision rule.

Monetary Value Objective Approach

In project management, goals commonly relate to cost, time, and performance. How are we to make tradeoffs between these three dimensions? Understanding and quantifying tradeoffs is difficult unless a value model is used. Decision analysis keeps separate: objective value, time value of money, risk attitude, and judgments about uncertainty. This decomposition facilitates more logical and more consistent evaluations.

Suppose a project plan is in trouble. The main alternatives are:
• Spend an additional $1 million to complete on time.
• Finish three months late but within budget.

Both alternatives have unattractive outcomes compared to the original feasibility study. Which alternative is better? Figure 4.5 maps project outcome value according to two dimensions, cost and schedule. We measure project outcome value in PV dollars; while the uncertainties are unresolved, we would be measuring project value as EMV. The project feasibility model determines the shape and position of the iso-value contours. In this case, for illustration, the contour undulations reflect the seasonality of the business.

[Figure 4.5 Project Value Contours and Two Alternatives: iso-value contours of project value versus EV Completion Time (days), locating Alternative 1 (cost overrun) and Alternative 2 (late)]

Interpolating between the curves: Alternative 1 has EMV = $2.5 million, and Alternative 2 has EMV = $1.4 million. If maximizing EMV is the corporate decision policy, then Alternative 1 is the clear and logical choice.

Monetary-Equivalents Approach

Dollars are a convenient and standard measure for comparing decision alternatives, especially in business. Increasingly, even government and not-for-profit organizations are making decisions based primarily upon monetary value. Often, costs or monetary value represent 80 percent or more of the decision basis. Minor additional considerations can be included in the analysis by factoring in monetary-equivalent values for the side factors. Sometimes, however, developing a comprehensive value model is impractical or inappropriate.

Consider the situation of a national oil company that located a new refinery. One site was near an urban area that already had most of the needed infrastructure and good availability of labor. An alternative was to locate the refinery in an impoverished, rural area where jobs were greatly needed. Most of the decision was driven by the refinery's value in the nation's
economy. However, because the company was government-owned, it was important to consider and balance other secondary objectives, such as improving employment opportunities. A key challenge is devising a way to measure value. Economists can assess the value, cost, and timing of job creation; for example, it may cost $5,000 to create a new job in the local economy. Although possible, it is usually cumbersome to attribute incremental cash flow to an employee, because cash-flow benefits are too hard to quantify. All that we need for the substitution approach are ways to translate non-monetary criteria into money equivalents; we can then add the cost equivalents into the cash-flow projections and make the decision based upon EMV or CE, calculated in the usual way. EMV-maximizing businesses also use MCDM on occasion, such as for hiring decisions and employee evaluations.

Multi-Criteria Value Functions

For true MCDM problems, multiple objectives are involved, usually the result of having multiple stakeholder groups. Different stakeholders, of course, have different preferences. It is sometimes impractical to develop a model to characterize different alternatives' impacts on corporate value. In project management, most decisions are made considering impacts on cost, schedule, and performance. MCDM uses a value function that combines various attributes (measured by decision criteria) of the problem. The objectives and criteria form a hierarchy, as shown in Figure 4.6. This example structure has only two levels, but sometimes objectives hierarchies have several more levels.

[Figure 4.6 Simple Objectives Hierarchy: Business Objective: Making Money, with subobjectives Cost, Completion Time, and Performance]

A popular way of constructing the hierarchy is the analytic hierarchy process (AHP) [2]. This technique uses subjective, pairwise comparisons to determine the numeric weights of decision criteria and criteria scores for candidate alternatives. A mathematical technique translates a matrix of comparisons into composite criteria weights and alternative scores. Regardless of whether a hierarchy structure is used, the principles of decision analysis still apply.
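A sketch of the AHP weighting step follows. The pairwise comparison matrix for three criteria is invented for illustration, and the "mathematical technique" is approximated here by power iteration toward the matrix's principal eigenvector, one common way AHP weights are derived:

```python
# Illustrative AHP-style weighting. Entry a[i][j] states how much more
# important criterion i is than criterion j (reciprocals below the
# diagonal). The criteria and judgments are made up for illustration.
a = [
    [1.0,   3.0, 2.0],   # cost     vs (cost, schedule, performance)
    [1/3.0, 1.0, 0.5],   # schedule
    [0.5,   2.0, 1.0],   # performance
]

def ahp_weights(m, iterations=50):
    """Approximate the principal eigenvector by power iteration,
    normalized so the weights sum to 1."""
    n = len(m)
    w = [1.0 / n] * n
    for _ in range(iterations):
        w = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

weights = ahp_weights(a)
print([round(w, 3) for w in weights])  # [0.54, 0.163, 0.297]
```

With these judgments, cost dominates (about 54 percent of the weight), followed by performance and then schedule, mirroring the relative strengths stated in the comparison matrix.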
[Figure 4.7 Three Utility Scales: component utility versus Completion Time (months), System Performance (transactions/hour), and Development Cost ($millions)]

Consider a software project manager who is evaluating staffing options in terms of cost, schedule, and performance. Because she wants to be systematic in her thinking, she develops a utility scale for each dimension, as shown in Figure 4.7. She scales the x-axes according to the range of possibilities. The y-axis of each scale represents goodness, or utility value, in the context of the respective attribute; each utility function is unique to a subobjective. The utility scales are arbitrary. Arbitrarily, one might have the component utilities use zero-to-one scales and have the weighting fractions total one; this ensures that the composite utility value function lies within a zero-to-one range.

Once the manager has the relevant quantitative attribute value measures, she can weight each attribute so that the value behaves according to her intuition about the problem. The combination is a single objective function. Because of simplicity's appeal, most people use a linear additive formula like this for the value equation:

Utility = wt1 x Utility1 + wt2 x Utility2 + wt3 x Utility3

where the weights, wt1, wt2, and wt3, are just scaling factors weighting each component. Sometimes organizations craft more complex formulas with multiplicative or exponential terms. Regardless of the shape, ideally, the utility functions can also incorporate the decision maker's attitude toward risk.

Note that this valuation approach is highly subjective. The strength of MCDM, however, is in providing a structure that addresses all important factors and helps keep the process somewhat logical and complete.
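A minimal sketch of such a linear additive value function follows. The component utility curves, attribute ranges, and weights below are invented for illustration, not taken from Figure 4.7:

```python
# Linear additive multi-criteria value function with zero-to-one
# component utilities and weights that total one. All curves, ranges,
# and weights are invented for illustration.
def clamp01(u):
    return max(0.0, min(1.0, u))

def u_cost(millions):   # $0 is best; $1.0 million or more is worst
    return clamp01(1.0 - millions / 1.0)

def u_time(months):     # 0 months is best; 9 months or more is worst
    return clamp01(1.0 - months / 9.0)

def u_perf(tph):        # 0 transactions/hour is worst; 1,000 is best
    return clamp01(tph / 1000.0)

WEIGHTS = {"cost": 0.4, "time": 0.3, "perf": 0.3}  # sum to one

def composite_utility(cost, time, perf):
    return (WEIGHTS["cost"] * u_cost(cost)
            + WEIGHTS["time"] * u_time(time)
            + WEIGHTS["perf"] * u_perf(perf))

# One candidate outcome: $0.6 million, 6 months, 800 transactions/hour
print(round(composite_utility(0.6, 6.0, 800.0), 3))  # 0.5
```

Because the component utilities and weights both lie in zero-to-one ranges, the composite utility is also bounded between zero and one, which makes scores from different alternatives directly comparable.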
The decision rule is to pick the alternative having the greatest EU. Note that it doesn't make much sense to attempt calculating a CE in MCDM. If the decision maker is comfortable using monetary equivalents, she is better off converting the metrics directly into monetary equivalents. Having a way to measure value enables a project manager to make consistent decisions throughout the project.

Monte Carlo Solution

We will see how we can solve a MCDM problem with simulation. Consider our manager deciding whether to staff her software project for normal or fast-track development. The difference between the strategies is the number of programmers/analysts assigned to the project. A project model captures the project team's understanding of the development process and the team dynamics. We would have input variables in areas of application complexity, changing requirements, design performance, coding productivity, and amount of rework.

We run the simulation project model for each staffing alternative. Each trial produces a Completion Time, a System Performance, and a Development Cost. We convert each dimension into value components with the respective utility functions, as in Figure 4.7. We then, for each trial, combine the component utility values, weighted according to the value function. Figure 4.8 shows the three criteria distributions, and Figure 4.9 shows the composite value distribution with EUs indicated. The simulation model produced the following analysis results:

Normal Staffing: EU = 0.723
Fast-Track Staffing: EU = 0.693

Based on EU, the project manager logically chooses Normal Staffing. The EUs are sufficient to make the decision; the graphs provide supplemental information for the manager to check and enhance her intuition about the project.

Three Pillars of Decision Analysis

Let me restate and amplify something important in Chapter 1. Decision analysis has three key characteristics:

1. Judgments as Probabilities. To recognize uncertainty, we weigh outcome values with probabilities in the usual EV calculations. Usually we ask the most knowledgeable people available to express their expert judgments about risks and uncertainties as probability distributions.
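The procedure described under Monte Carlo Solution can be sketched end to end. All distributions, utility scales, and weights below are invented, so the resulting EUs will not match the 0.723 and 0.693 reported above; the point is the mechanics: simulate the criteria in each trial, convert them to component utilities, combine with weights, and average to get EU:

```python
import random

random.seed(42)

# Invented 0-1 utility scales for the three criteria
def u_time(days): return max(0.0, min(1.0, (950 - days) / 750))
def u_cost(cost): return max(0.0, min(1.0, (4e6 - cost) / 3e6))
def u_perf(tph):  return max(0.0, min(1.0, tph / 1200))

WT = (0.3, 0.4, 0.3)  # invented weights for (time, cost, performance)

ALTERNATIVES = {
    # (low, high, mode) triangular parameters per criterion, all invented
    "Normal":     {"time": (400, 900, 550), "cost": (1.2e6, 3.0e6, 1.8e6),
                   "perf": (600, 1100, 900)},
    "Fast-Track": {"time": (300, 850, 450), "cost": (1.8e6, 3.8e6, 2.6e6),
                   "perf": (500, 1100, 850)},
}

def expected_utility(alt, trials=20_000):
    """EU = average over trials of the weighted composite utility."""
    total = 0.0
    for _ in range(trials):
        t = random.triangular(*alt["time"])   # args: low, high, mode
        c = random.triangular(*alt["cost"])
        p = random.triangular(*alt["perf"])
        total += WT[0] * u_time(t) + WT[1] * u_cost(c) + WT[2] * u_perf(p)
    return total / trials

eus = {name: expected_utility(a) for name, a in ALTERNATIVES.items()}
print({k: round(v, 3) for k, v in eus.items()})
best = max(eus, key=eus.get)
print("Choose:", best)
```

With these made-up inputs, the faster schedule of the fast-track option does not compensate for its higher cost, so Normal staffing again has the higher EU; changing the weights or distributions can easily reverse the choice.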
[Figure 4.8 Decision Criteria Distributions: simulation forecast frequency distributions for Completion Time (days), Performance (transactions/hour), and Development Costs ($)]
For example. However. many forprofit companies use PV of net cash flow. The EV calculation is the basis for calculating value under uncertainty. 9 2. because of not having a "manyrepeatingexperiments"condition. Others may be unconvinced regarding the applicability of the EV concept (#3). and performance so that we know how to make tradeoffs in one dimension against the others. Any of these requirements can be barriers to wholly accepting decision analysis. we need a quantitative way of relating cost. Almost always. In my experience. PV. The company or decision maker has a way of measuring outcome value as a single parameter. Single Value Measure. reduce emissions. We condense the distribution of possible outcome values into a single measureEV. one solution to this dilemma is to consider EV lives in terms of other people's lives. Some people' are uncomfortable about quantifying risk (#I). safety. Expected Value. for this purpose. This value function may be composed of several criteria that represent measures for each of several objectives. how do we trade off human lives versus money? For some people. we can improve safety. schedule. Where in our organization can we apply money to save lives? . That is why we have devoted this and the previous chapter to decision policy. the value function (#2) is the cause of most situations where decision analysis is not wholly satisfactory. In project management. and improve service and product quality by spending more money. and the environment. the probabilityweighted averagefor valuing alternatives. Even more troublesome are decisions involving health.Chapter 4Utiliv and MultiCriteria Decisions 0.5 Utility Value 0 . for example. This is the "value function" or "objective function" that we seek to optimize. 3. for example.
And at what cost? If we look to the most efficient place in our company to put additional money into safety (assuming we already operate a generally safe workplace), this best opportunity provides a benchmark (as EV cost per EV life) for the marginal cost of saving lives.

Decision Policy Summary
Making logical, consistent choices under uncertainty requires all three characteristics in your evaluation and decision-making process. You can always perform stochastic calculations by putting judgments about risks and uncertainties into models (#1). Decision policy tells us how to value the potential outcomes (#2). And evaluations (of reliability, benefit, and so on) under uncertainty are greatly eased with the EV calculations (#3).

Decision analysis explicitly recognizes risks with probabilities, which are used to weight possible outcomes. Using probabilities in this way is much better than recognizing "risk" as simply one more criterion. After applying probabilities, we arrive at a single value for each alternative. All approaches examined in this chapter employ a means to arrive at a single value measure for possible outcomes.

Project decisions are complicated when there are several objectives or sub-objectives. Decision policy conflicts arise, most often, because of multiple objectives. The starting point is always: Be clear about the objectives. Management should be clear about whose interests are being represented and what is being optimized. Consider these example single objectives:
- For corporations: Maximize shareholder value.
- For governments: Maximize the quality of life for citizens.
- For individuals: Maximize personal happiness over a lifetime.
- For professional associations: Maximize the organization's value to members.

Decisions are easier if there is focus on a single objective, and we should craft a value function to measure what is important. Measuring value toward the objective may have several components. Multiple criteria are sometimes used because it is inconvenient, or impossible, to fully develop the project model to assess value. Then we have a MCDM problem. I have described three approaches for structuring decision policy in such cases.
Endnotes
1. Game theory would have us posture the optimal strategic response to the vendor's question. For the fixed-cost equivalent in the contract discussion, we are concerned only with the CE.
2. The popular AHP technique is discussed by its inventor in Saaty 1994, and somewhat less enthusiastically in Zahedi 1989, both in Interfaces. Interfaces is the leading journal reporting operations research (management science) applications. The articles are about applications of quantitative methods, including decision analysis, and exclude most theoretical details. Interfaces is available in most university libraries or from INFORMS, the Institute for Operations Research and the Management Sciences, 290 Westminster Street, Providence, RI 02903.
Decision Trees

This chapter describes what many consider the most powerful modeling and calculation technique in decision analysis. A decision model represents our understanding of an evaluation situation. Conceptually, any decision, no matter how complex, can be analyzed with a decision tree analysis. Other techniques, especially Monte Carlo simulation, have advantages and disadvantages for certain problems. Decision tree analysis is especially suited for everyday problems when one wants to pick the best alternative quickly and proceed.

A decision tree is a graphical representation of expected value (EV) calculations. It acts as a blackboard to develop and document our understanding of the problem, and facilitates team collaboration and communication. The tree consists of decision, chance, and terminal nodes connected by branches. By convention, the tree is drawn chronologically from left to right and branches out, like a tree lying on its side. Decision tree analysis language includes colorful biological analogies.

Node Types
There are three types of nodes in a decision tree:
- Decision nodes, represented by squares, are variables or actions that the decision maker controls.
- Chance event nodes, represented by circles, are variables or events that cannot be controlled by the decision maker.
- Terminal or end nodes, represented in a decision tree diagram by unconnected branches, are endpoints where outcome values are attached.

The starting (usually) decision node
is called the root, and the radial lines are called branches (sometimes twigs). The terminal nodes are sometimes called leaves, and an overly complex tree bears the label, a bushy mess.

Tree Annotations
We label the decision tree diagram with these numbers:
- Probabilities for each outcome branch emanating from chance nodes.
- Outcome values, representing present values (PVs) of the resulting cash-flow stream, discounted to the date of the root decision. Most commonly, the terminal node values represent the entire outcome. Sometimes, we may choose to place benefits and costs along branches as they are realized, though this is not customary.
- Node expected values, calculated during the process of solving the tree. These EVs are EMVs or EV costs when we measure value in dollars or other currency. When solving with a utility function as risk policy, I like to show the expected (value) utility (EU) and certainty equivalent (CE) for each node.

Tree Calculations
Solving a decision tree is a simple back-calculation process, sometimes called back-solving or folding back the tree. Starting from the terminal nodes at the right, moving left, we solve for the value of each node and label it with its EV. This becomes the value of the node and of the branch leading to it. Here are the three simple rules for solving a tree:
- At a chance node, calculate its EMV (or EV cost) from the probabilities and values of its branches.
- Replace a decision node with the value of its best alternative. We are applying the EMV decision rule.
- If a cost value lies along a branch, recognize that cost in passing from right to left. That is, subtract the cost when solving a tree to maximize EMV; treat it as a "toll" to be paid when traversing the tree. (Sometimes a tree is designed to solve for EV costs, in which case a cost along a branch is added.)

Wastewater Plant Example
In this case, we assume that your company is developing a new gold mine that will be operated by another company. Development is nearing completion when your team discovers that water discharge contaminant levels are above those seen in the original survey samples. A regulatory agency now mandates an improved treatment facility
to recover heavy metals from the mine discharge. This extended pollutant-recovery capability was not designed into the nearly completed mine facility. In addition to the investment and operating costs of the Plant, adding this capability has the potential to delay mine startup. The best general solution is to construct or bring in a wastewater treatment module (referred to as the Plant in the following tables and figures). Three plant alternatives are being considered:
- Used: Buy a used, skid-mounted plant.
- Skid: Buy a new, skid-mounted wastewater plant.
- Fixed: Buy a new, fixed plant.

Figure 5.1 shows how the root decision node will appear. The square in the figure represents the decision point, and the branches coming out of the right side are alternatives. Later, we could add a branch for another alternative to be evaluated. For illustration in this chapter, the nodes have reference numbers inside, though this is not customary.

[Figure 5.1 Decision Node]

Data Assumptions
There are uncertainties affecting the mine project completion time. For all other activities except the wastewater system, the engineers judge the completion time in three scenarios, as shown in Table 5.1. Further, these completion times are independent of the wastewater treatment system.

Table 5.1 Completion Time for All Activities except Wastewater Plant

Other Activities Complete    Probability
in three months                  .30
in four months                   .40
in five months                   .30
                                1.00
The engineers use three weighted scenarios in each case as the form to judge the times to implement each plant-acquisition option. Engineers for the plant vendors and the project team jointly make these assessments, as three-level discrete distributions, as shown in Table 5.2. Each pair of numbers represents an outcome value (second in each pair) with its associated probability (first value in each pair); this lottery notation is a way of expressing a discrete chance event.

[Table 5.2 Time to Acquire and Install Wastewater Plant: three-level distributions for the Used, Skid, and Fixed options; e.g., the Used option is {.30, 3; .35, 4; .35, 12 months}, as repeated in Table 6.1.]

Figure 5.2 illustrates what a Time Plant Operational chance node looks like for the Skid option. The branches coming off the right of the node represent the possible outcomes. Note that the branch probabilities at any chance node always sum to 1.

[Figure 5.2 Chance Node: the Time Plant Operational node for the Skid option.]

Although costing more initially, the Skid plant costs less to operate and is faster to install than the Used plant. A Fixed plant costs more to buy, but less to operate, than a skid-mounted plant, and requires more time to install; plus, it is likely to have greater salvage value at mine abandonment than a Used plant. Other information about the options is shown in Table 5.3. Clearly, trade-offs will be necessary in making the decision.

An important consideration is the cost of delaying the mine startup. The mine is projected to operate for six years. Assume that, from the project feasibility model, the project team determines that delays to mine startup are valued at $150,000 per month (pre-tax), due to increased development costs and delayed mine production; the opportunity loss is the foregone asset value. Early completions have the opposite effect. These values are known well enough that single-point estimates can be used. The other mine activities are independent of the wastewater plant.
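As an aside (my sketch, not the book's), the lottery notation maps directly onto a small calculation. The Python below computes the EV of the Used option's three-level distribution, using the probabilities quoted for it in Table 6.1:

```python
# EV of a discrete "lottery" {p1, v1; p2, v2; p3, v3}.
# The (probability, months) pairs are the Used plant's time-to-operational
# distribution; the EV is simply the probability-weighted sum.
used_lead_time = [(0.30, 3), (0.35, 4), (0.35, 12)]

# Branch probabilities at a chance node must sum to 1.
assert abs(sum(p for p, _ in used_lead_time) - 1.0) < 1e-9

ev_months = round(sum(p * m for p, m in used_lead_time), 2)
print(ev_months)  # 6.5
```

The long 12-month tail dominates: it contributes 4.2 of the 6.5 EV months.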
[Table 5.3 Plant Cost Information (all costs are pre-tax $thousands): Acquisition + Installation and Salvage Value are $650 and $0 for Used, $1,150 and $200 for Skid, and $1,480 and $600 for Fixed; monthly operating costs are highest for the Used plant ($8.0) and lowest for the Fixed plant.]

Try Your Intuition
Now you have seen all of the input data. The plant choice is not obvious, because each alternative has advantages:
- Used: Lowest initial cost.
- Skid: Fastest and lowest-risk acquisition time; medium investment; medium operating costs.
- Fixed: Lowest operating costs; highest salvage value.
No single alternative dominates across all attributes. Can you judge the best alternative by intuition? Try estimating the EV cost for each option. What is the difference in EV cost between the first choice and the next best alternative? For most of us, even this small problem is much too complex to internalize in our heads. Most of us need an analytic tool to have confidence in our selection.

Evaluating Options
The appraisal approach of decision analysis is a logical way to evaluate the three plant-acquisition alternatives. Over half the work of a typical custom evaluation involves developing a project cash-flow model. The model underlying the outcome values shown in Figure 5.3 includes, in addition to the assumptions in Tables 5.1 through 5.3, such items as inflation, depreciation, cost of capital, and taxes. The model provides the outcome value for each scenario: a combination of decisions and chance-event outcomes. These terminal values are from the cash-flow projection model; the outcome value is PV costs. Since all monetary values are costs, one may work with EV costs instead of EMVs; minimizing EV cost is exactly equivalent to maximizing EMV.

Figure 5.3 shows the complete decision tree model. The terminal nodes (right column) show the outcomes if the respective tree paths are realized. The root node is the Type Plant Decision. There are two chance events affecting outcome value: Time Plant Operational and Time Other Activities Complete. The latest event controls when the mine development project is complete. In labeling the tree, we annotate these chance-event outcomes with labels (months to complete) and the probability of the respective outcome.
[Figure 5.3 Decision Tree to Evaluate Three Initial Options: the full tree, with Time Plant Operational and Time Other Activities Complete chance nodes and the cost ($k) at each terminal node.]
We back-solve the decision tree to evaluate the three alternatives. When solving the tree, we label each chance and decision node with its EV cost. Table 5.4 shows the sequence and components of the node EV cost calculations; note that there are fewer calculations at each column of nodes in progressing from right to left through the tree. For large trees, the progressive thinning of branches provides considerable calculation efficiency (as compared to a payoff-table approach).

[Table 5.4 Calculations in Back-Solving the Tree: node-by-node EV cost calculations, from EV(5) through EV(13) at the right of the tree to EV(2), EV(3), and EV(4) at the left.]

Applying the expected value decision rule at the root node:

EV(1) = Minimum{EV(2), EV(3), EV(4)} = Minimum(1136, 1120, 1135) = $1,120k

Values are shown to the nearest $thousand, although the calculations were done with greater precision. Acquiring a Skid plant (EV cost $1,120,000) appears the best of the three alternatives, although its superiority is slight ($16,000 better than the next-best alternative). The "cut" marks on branches indicate the inferior alternatives that have been "pruned." Normally, the nodes do not have numbers, though I numbered them here to illustrate the calculations.

Tree Software
We can draw small decision trees on the back of an envelope and solve them with a hand calculator. A mostly manual solution is practical for trees of up to, say, 100 branches. (By "manual," I include the use of a hand calculator.) If we will be solving a tree more than once, including revisions to the numbers, a computer will help. I often build a Microsoft Excel template for small- and medium-sized trees; one can even use Excel's Draw functions to illustrate the tree structure.
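Before reaching for software, the back-calculation itself is easy to sketch. The helpers below are my illustration (not the book's code): a chance node is a probability-weighted sum, and a decision node solved for EV cost takes the cheapest branch. The alternative EV costs are the rounded $thousands quoted above:

```python
def ev_chance(branches):
    """EV at a chance node: probability-weighted sum of (p, value) branches."""
    return sum(p * v for p, v in branches)

def solve_decision_cost(alternatives):
    """Decision node under an EV-cost policy: pick the cheapest alternative."""
    best = min(alternatives, key=alternatives.get)
    return best, alternatives[best]

# A chance node: the Table 5.1 probabilities applied to three branch
# costs ($thousands, terminal values that reappear later in Figure 6.4).
print(round(ev_chance([(0.3, 780), (0.4, 870), (0.3, 960)]), 1))  # 870.0

# The root decision, using the three alternatives' EV costs ($thousands).
choice, cost = solve_decision_cost({"Used": 1136, "Skid": 1120, "Fixed": 1135})
print(choice, cost)  # Skid 1120
```

A full solver would apply `ev_chance` recursively from the leaves inward, which is exactly the folding-back process described above.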
For people who frequently develop decision trees, and for anyone needing to develop a large tree, I recommend obtaining and learning decision tree software. The two programs that I use and teach with are DATA™ by TreeAge Software, Inc., and PrecisionTree™ by Palisade Corporation. More information is in Appendix B, Decision Analysis Software.

Decision Tree Summary
Decision tree analysis is a convenient way to analyze project decisions having one or several important uncertainties. We use probability distributions to encode judgments about risk and uncertainty. For all but the simplest of problems, decision tree models greatly bolster our intuition. Many practitioners believe that the insights gained into decision problems are more important than the numerical results. Further comfort comes from credible quantitative-analysis results.

A single value measure, gauging benefits or progress toward the organization's objective, is a central idea in decision analysis. Money is a convenient measurement scale for most purposes. The Wastewater Plant example combines cost, schedule, and performance criteria into a single monetary value measure.
CHAPTER 6

Creating good decision alternatives [1] is one of the most important project management functions. This chapter shows how to solve value of information problems, where there is an option to obtain more information.

Value of Information
More information, even imperfect, is generally useful in revising prior assessments. Often, there are opportunities to do additional analysis or to otherwise get more information before making a resource commitment.

Revisiting the Wastewater Plant Problem
In the Wastewater Plant example in Chapter 5, we evaluated the plant alternatives with decision tree analysis. Schedule and performance criteria are incorporated explicitly into the analysis as cost equivalents, based upon their impact on mine value. The decision policy I recommend is to choose the alternative having the lowest EV cost. This is the expected monetary value decision rule, only we are minimizing costs. The expected value (EV) costs are remarkably similar, ranging from $1.120 to $1.136 million.

There are two chance events affecting the cost outcome: Time Plant Operational and Time Other Mine Activities Complete. Whichever chance event is latest controls when the mine commences production. Figure 6.1 shows the decision model in a "shorthand" format: implicitly, a copy of every following node is attached to the end of each branch of the preceding node. Here are example sources of additional information:
- Prototyping, market studies, and similar directed research
- Obtaining and analyzing more data
- Modeling in greater detail
- Spending additional effort in eliciting expert judgments about model input variables

[Figure 6.1 Current Decision Model: shorthand tree, Time Plant Operational followed by Time Other Mine Activities Complete, with node reference numbers 5 through 13.]

What is the value of additional information? We determine this by adding an information alternative to the decision tree. A new chance node in the tree represents the possibilities of information-gathering results. We will use the information to revise project assessments or, more often, prior probability assessments. The equation to determine the value of additional information is:

(value of additional information) = (value of new decision tree) - (value of old decision tree)

In this calculation, the convention is to temporarily ignore the cost of acquiring the additional information. We then compare the value of information to the cost of obtaining the information; if the cost is less than the value, then it is better to acquire the information. Time permitting, it might appear that additional data and analysis are always desirable. More often, we should weigh the cost, any delay, and other possible impacts against the value added.
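The equation is trivial to apply once both trees are solved. A sketch (my helper name, not the book's), using EV costs in $thousands from this chapter's example; for a cost-minimization tree, the value of information is the old EV cost minus the new EV cost:

```python
def value_of_information(ev_cost_old_tree, ev_cost_new_tree):
    """Value added by an information alternative when minimizing EV cost.

    Both inputs are the EV costs of the best strategy in the old and new
    trees; here the new tree's EV cost already embeds the (after-tax)
    cost and delay of acquiring the information.
    """
    return ev_cost_old_tree - ev_cost_new_tree

# Best alternative without information (Skid): EV cost $1,120k.
# Tree including the Evaluate Used Plant branch: EV cost $1,058k.
print(value_of_information(1120, 1058))  # 62 ($thousands added)
```

If the result were negative, the information alternative would be worth less than proceeding directly, and we would not acquire it.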
Additional information is most often imperfect. Project delays, like information costs, can also be temporarily ignored at first by assuming that delay is a component of the cost of information.

A similar concept is value of control. In a decision analysis, "control" means controlling or influencing the outcome of chance events. Information and control concepts often lead to adopting new alternatives with more-desirable risk characteristics. Examples include:
- Acquiring services or products through turnkey contracts that fix costs
- Limiting losses with insurance
- Protecting commodity prices by forward-selling production
- Specifying higher-grade materials to reduce the chance of failure
- Using conservative operating practices
- Investing in backups, redundancy, and flexibility
Note that the analysis itself does not change the risks; control alternatives, often adding to costs, do. Valuing additional control is similar to valuing additional information: compare the new control alternative with the previous best alternative, incorporating the cost and delay of obtaining control. Be certain that the evaluation properly compares the new versus the old tree.

Plant Information Alternative
Now consider a new information alternative for the Wastewater Plant: evaluating the Used Plant first. We can spend approximately one month testing and evaluating the used plant to determine when it could be operational. The initial cost of evaluating the Used Plant alternative is $30,000 (pre-tax) and is not refundable. This includes a deposit plus the evaluation cost; the used plant would need inspection anyway, and the deposit would apply to the installation expense if we chose this alternative.

The evaluation will not change the actual lead-time if we select the Used Plant. The value of the information is in providing a basis to reassess the Used Plant lead-time probabilities. However, if we spend one month evaluating the Used Plant and then make a decision to purchase a new one, then the new wastewater plant acquisition is delayed one month. Lead-times for the other alternatives would be the same from the date of commitment.

Figure 6.2 shows the expanded decision model. The decision tree shorthand illustrates the structure in compact form. The nodes are numbered to match the example calculations to the figures.

Revising Probabilities
The used plant evaluation and testing provide information useful for revising the prior probability assessments in the Time Plant Operational event.
However, additional information adds value to the project only if it has the potential to change what you are going to do. If we already know the plant alternative with today's information, further information-gathering and analysis are pointless.

[Figure 6.2 Expanded Decision Model: the Chapter 5 tree plus an Evaluate Used Plant branch; nodes 14 through 22, with nodes 23 through 40 repeating the earlier structure.]

For simplicity, I classified the evaluation results into just two outcomes: "Favorable" and "Unfavorable." Project engineers judge the quality of this imperfect information. Figure 6.3 has its nodes in inverse sequence from the decision model (see Figure 6.2 and Figure 6.4). The first node holds the prior probability assessments about the event that matters. The probabilities at the second column of nodes in Figure 6.3 are judgments about the quality of the information. Eliciting the judgments in this manner separates them, so that the expert(s) need consider only one event at a time. These nodes are not numbered, as this tree is a side calculation, an intermediate step.

Note that the evaluation node in this section of the tree is followed by the investment decision. For the decision model (see Figure 6.4), we need to reverse the sequence of the nodes in Figure 6.3. This is necessary to represent how and whether the new information adds value.
[Figure 6.3 Used Plant Prior Probabilities: prior probabilities for Time Plant Operational (3, 4, or 12 months) followed by conditional probabilities for the "Favorable" or "Unfavorable" Evaluation Outcome (e.g., 5/6 and 6/7); the resulting joint probabilities are .25, .05, .14, .21, .05, and .30.]
My custom is to place double quotes around the information event outcome labels to help distinguish the information event (Evaluation Outcome) from the event of interest (Time Plant Operational). The expert assessed probabilities for the evaluation outcomes conditioned upon the Time Plant Operational event outcomes. Representing conditional probabilities is a convenient and powerful feature of decision trees.

The order of nodes in Figure 6.3 is the logical way to express the assessments and relationships between the two events: the cause-and-effect relationship runs from left to right. However, the sequence is in reverse order from what we need to solve our real-world problem. The actual sequence would be according to the project: get the evaluation result; make the plant decision; experience how long it takes to complete the plant. We need to reverse or invert the nodes from the sequence in which the probabilities are assessed. We do this with a straightforward technique called Bayesian revision (using Bayes' theorem; see Appendix 6A, Bayesian Analysis). We revise the prior probabilities about Time Plant Operational based upon the Evaluation Outcome. Table 6.1 and Figure 6.4 show the original and revised probabilities for each Time Plant Operational case, based upon the evaluation outcome.
Evaluating the New Alternative
Figure 6.4 shows the decision model extended for the Evaluate Used Plant alternative. The terminal or end values on the right of the diagram are


[Figure 6.4 Information Alternative: the expanded tree; terminal PV costs ($thousands) along the Deferred Plant and Order New paths include 780, 870, 960, and 1590.]

Table 6.1 Probabilities for Used Plant (Test and Evaluation Outcome Probabilities and Times to Install for Three Cases)

Current information, original probabilities: {.300, 3; .350, 4; .350, 12 months}
"Favorable" evaluation, probability = .44; revised probabilities: {.568, 3; .318, 4; .114, 12 months}
"Unfavorable" evaluation, probability = .56; revised probabilities: {.089, 3; .375, 4; .536, 12 months}
present value (PV) costs for the respective paths through the tree, and include any penalty for a delayed mine opening. Figure 6.5 details the calculations for the EV cost of the delayed Skid alternative, given the Evaluate Used Plant choice. This value changed from $1.120 million to $1.134 million because of the added evaluation cost (about $18,000 after taxes), less the present-value timing adjustment (about $4,000) of the delayed investment.
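The $1.134 million figure can be cross-checked from the components just quoted (amounts in $thousands; the variable names are mine):

```python
# EV cost of the Skid alternative without the evaluation step, plus the
# after-tax evaluation cost, less the present-value timing benefit of
# deferring the plant investment one month.
skid_ev_cost = 1120
after_tax_evaluation_cost = 18
pv_timing_adjustment = 4

delayed_skid_ev_cost = skid_ev_cost + after_tax_evaluation_cost - pv_timing_adjustment
print(delayed_skid_ev_cost)  # 1134, i.e., $1.134 million
```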
[Figure 6.5 Delayed Skid Alternative: tree detail for the Skid plant following a one-month Used Plant evaluation; terminal PV costs ($thousands) of 1036, 1064, 1128, and 1221 give an EV cost of $1,134k.]
Again, new information adds value only if it has the potential to change a decision. In this example, the project manager should choose to install the Used Plant instead of the Skid Plant if the evaluation result is "Favorable." If the evaluation result is "Unfavorable," then install the Skid Plant as planned. The Evaluate Used Plant option is the best alternative: spending a pre-tax $30,000 on inspection adds a net $62,000 (= $1.120 - $1.058 million) of after-tax value to the project.

The decision tree was back-solved to compute the EV cost for each alternative. Starting at the right, each chance node is replaced (and annotated) with its EV cost, and every decision node is replaced with the value of its best alternative. Here are several example calculations, starting at the upper right of Figure 6.4 and working backwards:

EV(23) = $870,000 = .3(780) + .4(870) + .3(960)
EV(17) = $961,000 = .568(870) + .318(897) + .114(1590)
EV(15) = $961,000 = minimum of (961, 1134, 1149)
EV(14) = $1,058,000 = .44(961) + .56(1134)
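These example calculations can be replayed mechanically. A short sketch (the node numbers follow Figure 6.4; the helper is mine):

```python
def ev(branches):
    """Chance-node EV: probability-weighted sum of (probability, value) pairs."""
    return sum(p * v for p, v in branches)

# Working backwards through the information branch ($thousands):
ev23 = ev([(0.3, 780), (0.4, 870), (0.3, 960)])         # Time Other Activities node
ev17 = ev([(0.568, 870), (0.318, 897), (0.114, 1590)])  # Used Plant, "Favorable" case
ev15 = min(961, 1134, 1149)                             # plant decision: least EV cost
ev14 = ev([(0.44, 961), (0.56, 1134)])                  # Evaluation Outcome node

print(round(ev23), round(ev17), ev15, round(ev14))  # 870 961 961 1058
```

Rounding to the nearest $thousand reproduces the four values above.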
Value of Information Summary
Many projects have a sequence of decision points, and analyzing these options is extremely important in an evaluation. Decision trees are especially useful for analyzing situations with decision points after additional information will become known. We continually have decision opportunities; we cannot put many into a decision tree, though it is important to represent the key decisions. One option that adds value to a project is the ability to terminate a project that is performing poorly. Appendix 6B discusses project "kill" decisions.

We extended the Wastewater Plant decision tree analysis to value a new alternative to get more information. Many projects have opportunities to improve value through such additional data gathering and analysis. In most cases, it is convenient to structure the decision model using the new information to revise prior probabilities. Alternatively, we can revise assessments for event-outcome values. This usually involves modeling the system to understand and quantify the relationship between the information event (symptom) and the chance event of interest. Occasionally, we may want to model changes in both chance-event probabilities and chance-event outcome values.

Fairly complex problems can be handled in an orderly way with decision tree analysis. For some problems, the tree can become exceedingly large. The preeminent calculation alternative for such cases is Monte Carlo simulation, discussed next in Chapter 7.
APPENDIX 6A
BAYESIAN ANALYSIS
Bayes' theorem is credited to Rev. Thomas Bayes, an 18th-century British clergyman. Bayes' formula is the most common way to revise probabilities based upon new information. Bayes' theorem is most often stated as:

P(ei | A) = [ P(A | ei) P(ei) ] / [ Σj P(A | ej) P(ej) ],  summing j = 1 to N

where
ei = outcome i of the chance event of interest
N = number of possible outcomes for ei
A = attribute of new (imperfect) information, the outcome of an information event (symptom).

While the formula is more ominous than difficult, a more convenient solution technique is to construct and inspect a joint probability table. Table 6A.1 contains the full information content of Figure 6.3. The values inside Table 6A.1 are the joint probabilities, i.e., the probabilities for the compound events comprised of combinations of Evaluation Outcome and Time Used Plant Operational. The values in the table margins have the extraordinary name, marginal probabilities. Note that the right-column marginal probabilities are the same prior probabilities for Time Plant Operational as in Figure 6.3.

We can read (with some practice) the probabilities for the inverted tree, as shown in Figure 6A.1, directly from Table 6A.1, by inspection. For example, let's assume that we have a "Favorable" Evaluation Outcome. There is a .44 chance of this occurring (the marginal probability, the total of the left column). The "Favorable" outcome restricts us to the left side of the table, which has three possibilities for months until the plant is operational: {3, 4, or 12}. The probabilities of these outcomes are in proportion to {.25, .14, .05}. However, these values (joint probabilities) do not sum to 1; they must be normalized in order to total 1. We accomplish this by dividing each number in the left column inside Table 6A.1 by the column total. The normalized numbers then represent the revised probabilities conditional on the "Favorable" Evaluation Outcome, as shown in Figure 6A.1.

Still unclear? If you are confused about Bayes' rule, you are not alone. Although straightforward once understood, Bayesian revision usually requires a more careful explanation plus a couple of hours' practice before a person is comfortable with the process.
Table 6A.1 Joint Probability Table

Time Used Plant      Evaluation Outcome
Operational          "Favorable"   "Unfavorable"   Marginal
3 months                 .25            .05           .30
4 months                 .14            .21           .35
12 months                .05            .30           .35
Marginal                 .44            .56          1.00

[Figure 6A.1 Inverted Probability Tree: Evaluation Outcome ("Favorable" .44, "Unfavorable" .56) followed by the revised Time Used Plant Operational probabilities.]
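The normalize-a-column procedure is only a few lines of code. A sketch in Python using the Table 6A.1 figures (the likelihoods are the Figure 6.3 conditionals, e.g., a 5/6 chance of a "Favorable" result if the plant would really be ready in 3 months):

```python
# Bayesian revision by building one column of the joint probability table.
priors = {3: 0.30, 4: 0.35, 12: 0.35}         # P(months), Time Plant Operational
p_fav_given = {3: 25/30, 4: 14/35, 12: 5/35}  # P("Favorable" | months)

# Joint probabilities P("Favorable" and months): the table's left column.
joint_fav = {m: priors[m] * p_fav_given[m] for m in priors}

# Marginal probability of a "Favorable" evaluation (the column total).
p_fav = sum(joint_fav.values())

# Normalize the column to get the revised (posterior) probabilities.
posterior = {m: round(joint_fav[m] / p_fav, 3) for m in joint_fav}
print(round(p_fav, 2), posterior)  # 0.44 {3: 0.568, 4: 0.318, 12: 0.114}
```

The output matches the "Favorable" row of Table 6.1, confirming that the joint-table shortcut and Bayes' formula are the same computation.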
APPENDIX 6B
KILLING THE PROJECT IN TIME
Things happen [2]. Some projects will never reach the customer. When necessary, let's hope that the project "kill" decision is deliberate and made in time. New information during the project helps us to determine when dramatic intervention may be necessary.
Early Warnings
Experienced project managers recognize trouble. Some clouds on the horizon include:
- Poor productivity: habitually late targets; many activities exceeding estimates, and none reported earlier (suggesting activity teams might be wasting slack); unfavorable variances; rework; missed performance tests
- Tension, stress, and burnout among team members; requests for transfers out
- Waning enthusiasm (perhaps because the need for the project deliverable is diminishing, the company strategy has changed, or the problem is proving more difficult)
- Worsening communications: difficulty in scheduling meetings; lateness in returning phone calls; responding slowly to requests for information
- Improving communications: requests for meetings and information; heightened formalism about change requests and approvals; adding someone to review the project
A persistence of sick-project symptoms indicates a need for intervention. When things start to go wrong, often many things go wrong together. The appropriate, and extreme, change might be terminating the project.
Gateways
Research projects often have a formal series of "gateways" or stopping points. Perhaps best known is the process for a new drug, from conception to market. Each successful drug goes through a series of stages of development, testing, and approval. (A fast-tracking strategy may shorten or bypass certain steps.) Each milestone is a decision point, primarily to continue or drop. In the decision tree model, Figure 6B.1, we represent decision points as squares when the decision is not automatic, with the alternatives coming off the right side. A decision tree is a graphical representation of the problem, and provides a template to calculate the EV for each alternative. The normal decision policy is to choose the alternative with the highest EV.
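The rollback arithmetic for a gateway tree like Figure 6B.1 can be sketched in a few lines; the stage costs, success probabilities, and payoff below are hypothetical illustrations, not the figure's values:

```python
# Minimal sketch of rolling back a gateway decision tree: at each stage we
# spend the stage cost, then either fail (no further value) or advance with
# probability p. All numbers are hypothetical.
stages = [  # (stage_cost_$M, probability_of_success)
    (5.0, 0.6),   # early development stage (hypothetical)
    (20.0, 0.5),  # testing stage (hypothetical)
    (10.0, 0.8),  # approval stage (hypothetical)
]
payoff = 500.0  # $M value if all stages succeed (hypothetical)

ev = payoff
for cost, p in reversed(stages):  # roll back from the terminal node
    ev = -cost + p * ev           # EV at the decision to fund this stage

# Fund the project only if the rolled-back EV beats dropping (EV = 0).
decision = "Go" if ev > 0 else "Drop"
```

Rolling back from the right gives the EV at each gateway, which is exactly what tree software such as DATA computes node by node.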
[Figure 6B.1 Gateways in Drug R&D: decision tree ($millions) with preclinical, Phase I through Phase III clinical trial, and FDA approval stages, each with success and failure branches]

[Figure 6B.2 Solved Drug R&D Decision: the solution tree with node EVs and joint probabilities at the terminal nodes]
Figure 6B.2 shows the solution tree drawn and evaluated using the DATA decision tree program [3]. The node EVs are in boxes to the right of each node. At the explicit decision node (square), two "cut" marks indicate the "pruned" inferior Drop branch. In tree diagrams, circles represent chance variables, that is, risks and uncertainties. In the pharmaceutical model, I can represent gateway decisions as chance variables because success at any stage obviously means that we go forward (on the upward branch at a chance node). The cumulative outcome value and joint probability are shown at each terminal node (triangles at end nodes).
PointForward Analysis
In evaluating a decision, our attention should be forward-looking. I am assuming here that the decision maker wants to do what is right for shareholders and is not posturing to optimize his career. Despite whatever has happened in the past, the decision best serving the organization is the alternative having the best EV, calculated from possible future outcomes.
Humans sometimes have difficulty in only looking forward. Animals are better at quickly cutting their losses. We humans often behave in ways that exhibit the fallacy of sunk costs. This is when our decision is influenced by sunk investment, money we've already spent. Sometimes a project has so much investment inertia that we continue pouring in money. We may be "putting good money after bad." Rather than continuing to invest, hoping that we'll get lucky, it may be time to stop the hemorrhaging. We've been taught expressions, such as "That's spilled milk" and "water under the bridge." The maxims are good ones: Look forward. Remember the fallacy of sunk costs the next time you feel committed to finishing that expensive meal when you are full, or when you have an hour invested in a movie that you are not enjoying.
Options Add Value
A lengthy project typically has milestones. Figure 6B.2 shows planned decision points. However, there are many others, a continuum really, where we can reexamine the project's viability at any point. The best presentation of a decision analysis is the distribution of possible outcomes. A manager may say, "These extreme outcomes aren't possible. I would never let the situation get this bad before doing something." We know that we have many opportunities to reallocate resources during a project, even perhaps killing the project. The evaluation model should include the most important of these decisions. Incorporating the decisions into the model, and the resulting calculations, more faithfully represents the true project value. Decision points, options in the language of finance, add value to a project. In finance, risk is the friend of an option holder, because it is the uncertainty that creates the value of the option.
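A minimal numeric sketch of why a kill option adds value; every probability and dollar figure below is hypothetical:

```python
# Without the option we ride out both outcomes; with it, we drop the project
# when the bad outcome appears, limiting the loss. All values hypothetical.
p_bad = 0.3
good_value, bad_value = 200.0, -150.0  # $k outcome values (hypothetical)
kill_cost = -20.0                      # $k cost of killing early (hypothetical)

ev_no_option = (1 - p_bad) * good_value + p_bad * bad_value
ev_with_option = (1 - p_bad) * good_value + p_bad * max(bad_value, kill_cost)
option_value = ev_with_option - ev_no_option  # value added by the decision point
```

Note that the option's value grows with the probability and severity of the bad outcome, which is the sense in which risk is the friend of the option holder.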
Feel Good about Your Decision
A good decision is one that is consistent with the decision policy of the organization and all of the information available at the time. Here are some key points about decision making:
■ Good decisions are correlated to good outcomes. That is the justification for doing a decision analysis. Good decisions do not ensure success, though they often improve the likelihood of success.
■ A success is not always the result of a good decision. However, a pattern of good decisions over the long run will increase the likelihood of success.
We want to reward people for making good decisions. This is an uncommon perspective. In most participative organizations, the organization feels better having participants share in the project outcome, good or bad. However, in the context of a portfolio of decision opportunities, making individual choices to maximize individual project success impairs long-run value. Killing projects, downsizing, divesting businesses, and such are typically unpopular decisions. Think big picture and long run. This occasionally means a timely decision to kill a project, when that choice is the EV-maximizing alternative. We should appreciate decision makers who make good decisions under emotionally difficult circumstances. By definition, we cannot control chance events. However, we should sleep well at night knowing that we have done the best with what we know.
Endnotes
1. An excellent source of inspiration in creating and evaluating alternatives is Keeney 1992. Keeney's book is about multicriteria decision making, which we discuss at the end of Chapter 4. 2. Adapted from Center for Business Practices 2001. 3. Figures 6B.1 and 6B.2 were generated using the DATA decision tree program by TreeAge Software, Inc., and are used with permission.
CHAPTER 7

MONTE CARLO SIMULATION

Monte Carlo simulation is perhaps the most popular of the various management science techniques. The simple, elegant method provides a means to solve equations with probability distributions.
Approximating Expected Value
Decision trees solve for the expected values (EVs) of decision alternatives. Conceptually, we can solve any decision problem with a decision tree. For some problems, however, a combinatorial explosion of branches makes calculations cumbersome or impractical. Monte Carlo simulation (simulation) uses a random sampling process to approximate EVs. This technique easily deals with many possible outcomes, and has other important advantages over decision tree analysis.
Wastewater Plant Revisited
We will continue the discussion about the Wastewater Plant example from Chapters 5 and 6. Figure 7.1 and Table 7.1 summarize the decision tree model. We have considered four alternatives for acquiring a government-mandated wastewater treatment facility that was not part of the original project. The chief uncertainties in this problem are the time to implement the plant choice and the longest completion time of all other mine-development activities. Fixed assumptions include mine life, plant operating costs, salvage values, and delay cost (per-month impact on mine value of delaying or accelerating startup).
[Figure 7.1 Decision Tree Model: the three plant alternatives plus the Evaluate Used Plant branch with a one-month evaluation and deferred decision; chance nodes for Time Plant Operational and Time Other Mine Activities Complete, with outcomes of various months]

Table 7.1 Wastewater Plant Alternatives

Short Name   Alternative
Skid         Acquire a new, skid-mounted wastewater plant ($650k initial outlay).
Fixed        Acquire a new, fixed plant ($1150k initial outlay).
Used Skid    Acquire a used, skid-mounted plant ($480k initial outlay).
Evaluation   Take one month ($30k expense) to evaluate the condition and time needed to ready the used, skid-mounted plant, then decide which plant to acquire.

Difficulties in Extending the Tree Analysis

Decision tree analysis provides a logical, credible basis for the wastewater plant decision. Some problems, however, are quite cumbersome to solve as decision trees, and simulation provides another approach better suited to some problems.

Consider the three plant type alternatives. There are twenty-seven possible outcomes:

3 decision alternatives x 3 Time Plant Operational outcomes x 3 Time Other Mine Development Activities Complete outcomes = 27

The paths in the Evaluate Used Plant alternative (lower) part of the tree add another fifty-four possible outcomes. As we add decision and chance nodes, decision trees often grow exponentially. Adding a few more uncertainties can lead to thousands of decision tree paths to evaluate. A decision tree can easily become a bushy mess.

Decision trees also require that we represent uncertainties as discrete probability distributions; decision trees accept only discrete chance events, and we must convert any truly continuous events into discrete approximations. For example, we characterized the outcomes for Time Other Mine Activities Complete as being exactly three, four, or five months. We considered only three completion time scenarios. In reality, the possibilities are infinite, and we could measure time in weeks, days, or even shorter periods. A continuous distribution represents a better view of this variable. Additional outcome branches on the chance nodes would provide finer resolution, which can sometimes be important, but exacerbate the combinatorial explosion. In competitive bidding, for example, even small differences can affect who wins the contest.

Monte Carlo Technique

We credit legendary mathematician John von Neumann (originally Johann, 1903-1957) with popularizing the Monte Carlo technique, while participating in the design of the atomic bomb. He recognized that a relatively simple sampling technique could solve certain mathematical problems that are otherwise impossible. Among the applications is solving for EV, the probability-weighted average of a probability distribution.

Simulation does not suffer the difficulties mentioned earlier. It allows a richer, more detailed representation. While not a panacea, it has advantages in certain situations. A valuable side-benefit is that we easily obtain approximate outcome probability distribution shapes.

Conceptually, either decision tree analysis or simulation can solve any evaluation problem. There are advantages and disadvantages to each technique. The choice depends upon the problem, tools at hand, and personal preference.

Simulation depends upon two essential elements:
■ A model that projects project outcome and outcome value
■ A technique that repeatedly generates scenarios, driven by randomly sampling input probability distributions.

The details follow.
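The twenty-seven-path count can be verified mechanically; the labels below are illustrative stand-ins for the tree's branches:

```python
# Enumerate the combinations behind the combinatorial explosion: 3 alternatives
# x 3 plant completion times x 3 other-activity completion times. Outcome
# levels here are illustrative month values, not the full tree detail.
from itertools import product

alternatives = ["Skid", "Fixed", "Used"]
plant_times = [3, 4, 12]   # months (illustrative outcome levels)
other_times = [3, 4, 5]    # months

paths = list(product(alternatives, plant_times, other_times))
n_paths = len(paths)       # 27 terminal paths
```

Adding one more three-outcome chance node multiplies the count by three again, which is why trees grow bushy so quickly.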
Figure 7.2 shows, conceptually, how we extend a conventional, single-valued deterministic model (so-called because every input is singly determined) for a simulation analysis. The word simulation is a very general term, used whether or not the model involves probabilities. In decision analysis, we are always talking about stochastic (probabilistic) models, so that simulation in this book always refers to a model solved with the Monte Carlo method.

A simulation model is a straightforward extension to the customary, deterministic project model. The deterministic model sometimes needs little modification to prepare for simulation. We only need changes necessary to ensure that the model's calculations are valid over all possible ranges and combinations of input values. If one or more inputs to the model are probability distributions, then the outputs will be probability distributions also. Instead of a single outcome value, such as a present value (PV), simulation yields a distribution for value. We can generate projections for time-spread variables, such as net cash flow.

[Figure 7.2 Stochastic (Probabilistic) Model: uncertain variables and known values feed the project activities and cash-flow model, with decisions and decision rules; outputs include a value distribution, projection schedules, PV and NCF confidence bands, and time-spread variables]

Inputs as Distributions

Probability distributions express expert opinions about uncertainties. An expert's forecast for time to complete an activity is better as a distribution than as a single-value estimate. The distribution completely represents the expert's opinion about the outcome range and the relative likelihood of values within that range.

Simulation Process

An iterative loop surrounds the deterministic project model and controls the process, generating many plausible "trial" solutions. The foundation for simulation is a random sampling process, where the system performs many trials and generates a possible case for the behavior of the project. We generate many possible project scenarios (trials). Trials, in sufficient number, preserve the characteristics of the original probability distributions and approximate the solution distributions. Then, we examine the distributions of trial outcome values, and display them as EV and confidence curves.

Every trial pass through the model generates a plausible scenario. We can inspect any trial result to determine what combination of input values led to this outcome scenario projection. Extreme cases can be examined to see what conditions gave rise to these outlier results. Examining outliers is a powerful method of validating the model. The simulation process is appealing because it is easily understood and not a black-box solution. This is why simulation persists as perhaps the most popular technique in operations research/management science.

Figure 7.3 is a flow diagram of a simulation. Most of the action is at the left; each trial is a pass through the steps at the left. The program generates many cases until a predetermined number of trials, or until a stopping rule condition, is satisfied. Here is a typical sequence of steps in the simulation process:

1. Sample probability distributions representing the several random, or stochastic, variables.
2. Substitute the trial values of the random variables into the deterministic model.
3. Resolve the model, obtaining project results and outcome values, e.g., time and cost to complete.
4. Store preselected outcome values, e.g., in a data file.
5. Return to Step 1 and repeat until the number of trials is sufficient to provide the required level of precision.

When the trials are complete, we analyze the stored results, the generated synthetic data. Averaging trial values approximates EVs. Averaging the PV outcomes approximates the expected monetary value (EMV). Frequency distributions and time-spread variable confidence bands are easy to obtain, for example, the PV distribution shown in the lower left of Figure 7.3. Typically, several hundred trials are necessary to obtain enough data for reasonable precision in the EV calculation. The precision of the EV and probability distribution shape approximations improves as we increase the number of trials.
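The five-step loop can be sketched in Python; the "model" here is a hypothetical one-variable stand-in (cost = months x $150k delay cost), not the Wastewater Plant model:

```python
# Minimal sketch of the five-step simulation loop. The deterministic model
# and the input distribution are hypothetical illustrations.
import random

random.seed(42)  # reproducible trials
trials = []
for _ in range(5000):
    # Step 1: sample the random variable (normal, mean 4 mo, sd 0.775 mo).
    months = random.gauss(4.0, 0.775)
    # Steps 2-3: substitute into the deterministic model and resolve it.
    cost_k = 150.0 * months
    # Step 4: store the outcome value.
    trials.append(cost_k)
# Step 5 is the loop itself; averaging the stored trials approximates the EV
# (here, close to 150 x 4 = 600).
ev_cost_k = sum(trials) / len(trials)
```

With a few hundred trials the average is already a reasonable EV estimate; more trials tighten the approximation, as the text notes.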
[Figure 7.3 Simulation Flow Diagram: sample the random variables, place the sample values into the model, recalculate the deterministic model, and store the value and other parameters of interest for this trial; iterate until N trials are complete, then calculate the average as EMV, aggregate into a histogram, and sort the data for a cumulative frequency curve]

The EMV may be sufficient information for decision making. However, in the spirit of full and clear communication, it is good practice to include the PV distribution in the analysis presentation. Different distribution formats are available:
■ Aggregating PVs into groups by size and displaying the values as a frequency histogram provides the approximate shape of the PV probability (density) function.
■ Sorting PVs by magnitude and displaying PV as a function of rank yields the cumulative frequency distribution.

These frequency distributions approximate the shape of the solution probability distributions.

How Random Sampling Works

We obtain sample, or trial, values for chance variables by a simple process. In simulation, we often call these random variables, because a random number generator drives the sampling process. Synonyms include chance variable and stochastic variable. We will look at conventional Monte Carlo sampling for continuous and discrete events.
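The sort-by-rank construction of the cumulative frequency curve can be sketched directly; the trial values below are hypothetical:

```python
# Build an approximate cumulative frequency distribution from stored trials.
pv_trials = [2.1, 1.4, 1.9, 1.6, 1.4]  # hypothetical PV outcomes, $millions

pvs = sorted(pv_trials)  # sort PVs by magnitude
n = len(pvs)
# Each point pairs a PV with the fraction of trials at or below that PV.
cumulative = [(pv, (i + 1) / n) for i, pv in enumerate(pvs)]
```

Plotting the pairs gives the stair-stepped cumulative curve; with hundreds of trials the steps become fine enough to read confidence levels directly.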
"Risks. . for illustration. then we could apply a time. then Weather Delay = True. then Time to Complete Other Activities = 3 months else if RN < . or other impact in the form of a continuous distribution. we can avoid this discrete abstraction: we can represent the full range of possible outcomes of any uncertainty.3. When using Monte Carlo simulation. [There are many combinations of branch outcome values and probabilities that will provide the same target mean Q and standard deviation (4. let's look at how we could sample the discrete Time to Complete Other Activities distribution in a simulation. The example logic works like this: RN = random number If R N < . Most random number generators.4. we needed to abstract Time to Complete Other Activities into a discrete distribution.7.1 With simulation. as illustrated in Figure 7. This is easy to simulate with logic such as: Probability of Weather Delay = . we seldom would use a discrete a p proximation for a continuous event." however. However. as explained next. Each partition's width corresponds to the probability of the corresponding outcome.Chapter 7Monte Carlo Simulation Time to Complete Other Activities Probability 3 months 4 months 5 months Table 7. such as the RAND () function in Excel. Table 7. zero to one. are often binary: either the risk event happens or not. cost. A random number function provides a random sampling parameter between zero and one. then Time to Complete Other Activities = 4 months else Time to Complete Other Activities = 5 months. divide the zerotoone interval into segments whose widths correspond to the probabilities of different outcomes. If true. provide a uniform distribution with equallylikely values in the range.24 RN = random number If RN < Probability of Weather Delay. To set up the discrete distribution for sampling.2 ThreeLevel Input Distribution Discrete Distribution In the decision tree analysis.2 shows my threelevel distribution.
[Figure 7.4 Sampling a Discrete Distribution: the zero-to-one interval partitioned into segments for the 3-, 4-, and 5-month outcomes]

Continuous Distribution

After an expert takes time to judge Time to Complete Other Activities, she would normally express her opinion as a probability distribution. This distribution is this expert's forecast for this variable. Consider the probability density function in Figure 7.5 [1]. This is a normal (Gaussian) distribution shape, with μ = 4 months and σ = .775 month.

[Figure 7.5 Probability Distribution for One Input]

In the decision tree analysis, Time to Complete Other Activities was abstracted into the three-value discrete distribution, shown in Table 7.2. With simulation, we can represent the full range of possible outcomes. In simulation, we do not need to convert the form of the original distribution if it was a continuous distribution. Simulation software often allows directly entering the probability density distribution as an input assumption.

Conceptually, the sampling process uses the cumulative form of the distribution. We want to convert the density curve of Figure 7.5 into the cumulative curve of Figure 7.6. The conversion is straightforward; do not worry about the actual calculation method. We obtain the cumulative distribution by adding the area (i.e., integrating) under the probability density function from left to right. A probability density distribution has a normalization requirement that the area under the curve equals 1. For any Time to Complete, t, the curve intercept is the probability that the outcome will be less than t. Both Figure 7.5 and Figure 7.6 fully represent someone's judgment about the uncertainty. The cumulative form is more convenient for our present purpose.

[Figure 7.6 Cumulative Probability Distribution]

Refer to Figure 7.6 for this explanation of how (traditional) Monte Carlo sampling works for a continuous distribution [2]. On a single pass through the simulation model, we want to determine a trial value for this activity completion time. Most random number generators produce a number corresponding to a zero-to-one uniform distribution. This is the sampling parameter and, conveniently, maps to the cumulative probability axis. To obtain a trial value for this variable:

1. Enter the y-axis at the sampling parameter (random number).
2. Move rightward to the cumulative curve.
3. Move down to the corresponding value on the x-axis. This x-axis value is the trial value for this variable.

Suppose the random number generator provides a value of .685. As indicated in Figure 7.6, this corresponds to 4.4 months on the x-axis. We substitute this trial value into the deterministic project model. We next obtain trial values for the other random variables in similar fashion, using a different random number for each variable. We then solve the model for a trial projection. This is but one particular scenario in the simulation run.
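The three-step cumulative-curve lookup is the inverse transform method; a sketch using Python's standard library and the distribution from the text (mean 4 months, standard deviation .775):

```python
# Inverse transform sampling: map a uniform random number through the
# inverse cumulative distribution to get a trial value.
from statistics import NormalDist

dist = NormalDist(mu=4.0, sigma=0.775)  # the expert's forecast from the text
rn = 0.685                              # sampling parameter (random number)
trial_value = dist.inv_cdf(rn)          # enter y-axis, read down to x-axis
# trial_value is about 4.37 months, matching the 4.4 read from Figure 7.6.
```

In a full simulation, `rn` would come from the random number generator on each trial, and each input variable would get its own draw.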
Note that if we sample an input distribution many times and graph the values in a frequency histogram, then the shape will approximate the original probability distribution. The match improves with more trials and finer histogram divisions. If we sample a distribution many times and average the result, the average approximates EV. The key to simulation is that this random sampling process preserves the character of the original distributions.

The mathematically inclined may recall the integral equation that random sampling solves:

EV = ∫ x f(x) dx

where x is the outcome value, and f(x) is the probability density function. We need simulation because, for most evaluation problems of interest, the integration defies direct mathematical solution. We do not have a simple equation form for the solution f(x); it is distributed through the fabric of our model. The simulation process performs the integration for us, approximately. Monte Carlo simulation is solving a very difficult, if not impossible, calculus problem for us!

Wastewater Plant Simulation

Figure 7.7 shows a revised decision model for the Wastewater Plant decision of the gold mine development project. Table 7.3 documents the probability distributions in the Wastewater Plant model. We now use continuous probability distributions when we can, to represent the possibilities more completely. Simulation allows a richer representation. Being able to use continuous distributions is an important advantage that simulation has over decision trees. We can incorporate virtually any number of chance events into a simulation model. This is unlike decision trees, which can quickly become too large to solve.

The Evaluate Used alternative in Figure 7.7 involves inspecting and testing the used unit. On the basis of the evaluation outcome, we obtain an estimate for Time to Complete Used Plant. In the decision tree analysis, the evaluation outcome was either "Favorable" or "Unfavorable." These outcomes were correlated with the time to complete the used unit through a joint probability table, and the probabilities for the different times to complete outcomes were revised by applying Bayes' rule. In the simulation, the uncertainty of this imperfect information is represented by a separate Estimate Error distribution. While we can discretely model Bayes-like dependencies easily, we can use a more real-to-life description. Here is the way I model the estimation result:

(Estimated Time to Install) = (True Time to Install Used Plant) x (Estimate Error Function)
[Figure 7.7 Decision Model]

To further embellish the Wastewater Plant example, I added another chance event, a Delay Cost variable. This is the per-month financial impact of delayed (or accelerated) mine startup. (In previous decision tree models, this had been a fixed impact, pretax $150k per month.)

Figure 7.7 shows the various distributions used in the simulation model. Note that the discrete distributions used earlier are now represented by either normal or beta distributions. The normal distribution has two parameters (μ and σ). Beta distributions are simple functions that can resemble many shapes (by adjusting two shape parameters, a scaling coefficient, and an offset), ranging from a symmetric, normal-looking distribution to highly skewed (asymmetric) distributions. Here, beta distributions provide asymmetric bell-shaped distributions with positive skewness (longest tail on the positive side). Chapter 10 describes these and other popular distribution shapes.

[Table 7.3 Input Distributions: Time to Complete Other Activities, Skid Unit, Fixed Unit, and Used Unit; Used Unit Estimation Error; and Delay Cost, each shown as the discrete distribution used in the decision trees and the normal or beta distribution used in the simulation]

Results

Table 7.4 shows the results of a 500-trial simulation run. The numbers, while close to those obtained from the decision tree analysis, are somewhat different due to the increased detail of the continuous distributions and the added Delay Cost variable.

[Table 7.4 EV Costs]

The Used Later cost is lower than the Used Now cost because, in this model, the testing and inspecting cost ($30,000, including deposit) is expensed rather than capitalized for taxes. This cost, however, reduces the investment required if this Used alternative is chosen after inspection.

Without the Evaluate Used option, the decision would be made today by comparing the distributions for the three base plant alternatives. Figure 7.8 presents the cumulative distribution curves [3] for these options. The decision policy would be to choose the alternative having the lowest EV cost. The Skid alternative has a slight advantage in having the lowest EV cost. The skid plant also is the least risky (risk evidenced by the width of the distribution). The stair steps are caused by using monthly time periods in the cash-flow projection model.

It is also an option to inspect the used plant before committing to a choice. After taking one month to do this evaluation, the project manager will again compare the plant alternatives, based upon the evaluation information. Now the Fixed and Skid alternatives would be delayed one month, slightly increasing their EV costs. The Used alternative is also reassessed. Figure 7.9 compares the Evaluate Used alternative with the Skid alternative.

[Figure 7.8 Three Original Alternatives: cumulative distributions of Wastewater Plant Cost, $millions]

Pursuing the information alternative, Evaluate Used, adds $96,000 of value to the project. Figure 7.10 shows a histogram of simulation results for the improvement offered by the Evaluate Used alternative. Although the histogram appears lumpy, this has little impact on the EV calculation. Random sampling and the accumulation buckets typically result in an irregular distribution shape. The simulation model in this analysis uses discrete, monthly time periods. This contributed to the discontinuities in the resulting distributions. Without too much effort, we could modify the model to use weeks, days, hours, and so on, to meet the precision requirements of the study. Superimposed on the figure is the cumulative distribution. The cumulative curve appears smoother, because it avoids the time bucket effect and dilutes any random sampling errors.

Simulation in Practice

Simulation is excellent for calculating distribution curves for various outcome parameters of interest. Cumulative curves are most popular, in general, for comparing the risk-versus-value profiles of the different alternatives.
[Figure 7.9 Evaluation Alternative: cumulative distributions of Wastewater Plant Cost, $millions]

To minimize computer running time while increasing detail, a closely allied technique called Latin Hypercube Sampling (LHS) has become very popular in recent years. This is a hybrid between uniform sampling and traditional Monte Carlo sampling. LHS reduces the number of simulation trials by as much as 90 percent to obtain a desired precision. In addition, there are sampling techniques generally described as "design of experiments" (mentioned briefly in Appendix A, Summary of Methods).

Software

I think we can declare that Microsoft won the spreadsheet war. Most people are using Microsoft® Excel in developing custom business models. This applies to project models as well, unless using project scheduling software such as Microsoft Project. Excel is a great analysis tool for small- to medium-sized models, and I use Excel for these. Personally, I prefer a procedural programming language for the big jobs, though other analysts are comfortable creating enormous spreadsheet models. The data-analysis functions are already available, and payoff tables are straightforward. Excel already has functions for sampling normal and binomial distributions, though not as convenient as these next add-in tools, and I will not be surprised if a future release includes simulation capabilities.
[Figure 7.10 Incremental Improvement: histogram and superimposed cumulative distribution of the incremental value of the "Evaluate" alternative, $thousands]

We need simulation because probability distributions are difficult, if not impossible, to combine mathematically. In recent years, two add-in products have dominated the market for Monte Carlo simulation in spreadsheets: @RISK® by Palisade Corporation and Crystal Ball® by Decisioneering, Inc. These tools have different heritages and somewhat different design approaches. Both companies have aggressively matched most features of the other's products, so each program has most of the other's capabilities, and the software users are the beneficiaries. In my opinion, these are both excellent tools. Nonetheless, depending upon particular needs, some people may find one program better than the other. Appendix B, Decision Analysis Software, has vendor information, including Web sites with demonstration downloads.

@RISK versions are available also for Lotus 1-2-3® and for Microsoft® Project. There are at least two competing simulation add-ins for Project, a testament to both the popularity of Microsoft Project and increasing recognition of Monte Carlo simulation in the project management community.

Optimization, Chapter 15's topic, is particularly difficult with uncertainty. Note that Decisioneering has a companion program to Crystal Ball, OptQuest®, that optimizes decision variables. RISKOptimizer® is Palisade's solution. Optimization under simulation typically requires from 100 to 1,000 times more computer time than solving a deterministic project model. For the really big jobs, both companies offer versions that distribute the simulation across a PC network.
Comparing Simulation to Trees

Simulation and decision trees are the principal computation techniques in decision analysis. Monte Carlo simulation is a complementary calculation alternative to decision tree analysis. Both methods solve for EVs, though they do so in very different ways. Each technique has its advantages and disadvantages. The nature of the problem at hand, available tools, and personal preference determine the choice of method.

Simulation is usually preferred in situations:
- Having many significant uncertainties and contingencies
- Involving a portfolio (e.g., strategic decisions involving a portfolio of projects), providing additional insights and comparisons of risk-versus-value profiles
- Involving complex decision policy (i.e., one not maximizing EV)
- Where outcome probability distributions are desired.

In contrast, decision trees are usually preferred for situations:
- Involving a sequence of decisions (e.g., value of information problems); usually, these involve Bayes' theorem calculations to revise prior probabilities
- Where there are low-probability events
- Where a simple decision model will suffice (often, decision analysts start with a decision tree model and then move to simulation when the model becomes more complex).

Appendix A, Summary of Methods, includes a table that matches evaluation problem features to the better EV calculation tool. See Appendix B, Decision Analysis Software, for vendor information.

Endnotes
1. This figure shows a Cell Assumption window from Crystal Ball® by Decisioneering, Inc. Used with permission.
2. This is conventional Monte Carlo sampling for independent variables. There are other Monte Carlo simulation sampling methods available, notably Latin hypercube sampling.
3. Technically, the outcomes of a simulation are frequency distributions. Frequency refers to sample data, in this case, synthetic data from simulation trials. A frequency distribution converges to the true probability density distribution if collected in many fine bins and after sufficient trials.
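The claim that both methods solve for the same EVs can be illustrated with a toy risk model, computed once by probability-weighting the branches (tree style) and once by sampling (simulation style). The event probability and impacts below are invented for illustration.

```python
import random

# Toy risk model: an event occurs with p = .3; if it occurs, the impact is
# 50 or 100 (equally likely); otherwise 0.
p_event = 0.3

# Decision-tree style: enumerate the branches and probability-weight them.
ev_tree = p_event * (0.5 * 50 + 0.5 * 100) + (1 - p_event) * 0

# Simulation style: sample the same model many times and average.
random.seed(1)
trials = 200_000
total = 0.0
for _ in range(trials):
    if random.random() < p_event:
        total += random.choice([50, 100])
ev_sim = total / trials

print(ev_tree)  # about 22.5 (exact, up to floating point)
print(ev_sim)   # about 22.5 (converges with more trials)
```

The tree answer is exact; the simulation answer carries sampling error that shrinks as trials increase, which is one reason trees suit simple models and low-probability events.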
CHAPTER 8

PROJECT RISK MANAGEMENT: BY THE NUMBERS

Project risk management is about optimizing value. Decision analysis is the discipline for making decisions under uncertainty and provides a logical, consistent process for project management decisions. In this discussion, I will emphasize quantification and value-based decision making beyond what the PMBOK® Guide suggests.

The Business Perspective

A Guide to the Project Management Body of Knowledge (PMBOK® Guide) 2000 Edition defines project risk management (PRM) as "the systematic process of identifying, analyzing, and responding to project risk." In decision analysis, our understanding of a system is structured and quantified. Probability is the language of uncertainty, and experts express their judgments about risks and uncertainties as probabilities and probability distributions.

In Chapter 3, I stressed that most organizations whose objective is to maximize shareholder value are well suited to an expected monetary value (EMV) decision policy. EMV is the expected value (EV) of the present value (PV) of net cash flow. The project cash flow can be either actual cash or cash-flow equivalents: project decision policy is most effective when we combine the usual triple objectives of time, cost, and performance into a single value measure. Organizations with multiple objectives and others with multiple decision criteria can also apply decision analysis methods, as described in Chapter 4.
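As a sketch of the EMV arithmetic: discount each scenario's net cash flow to a present value, then probability-weight the PVs. The cash flows, probabilities, and 10 percent discount rate below are invented for illustration.

```python
def pv(cash_flows, rate):
    """Present value of net cash flows for years 1..n at discount rate `rate`."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Two possible outcomes for a small project ($thousands, invented numbers):
scenarios = [
    (0.6, [-200, 150, 150, 150]),  # success, p = .6
    (0.4, [-200, 40, 40, 40]),     # disappointment, p = .4
]
rate = 0.10

emv = sum(p * pv(cfs, rate) for p, cfs in scenarios)
print(f"EMV = ${emv:,.1f}k")  # → EMV = $57.8k
```

A positive EMV means the probability-weighted PV of net cash flow favors proceeding, under an EMV decision policy.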
Then, the logical decision policy consists of measuring value (utility) with a multicriteria value function and maximizing EV utility (expected utility, EU).

Model Scope

Most projects develop or build some form of asset. The asset life-cycle model represents how business value depends upon the completion date and changes in the asset's functionality or performance. The asset model relates such elements as:
- Cost, time, and performance
- Value of early completion
- Impact of parameters, such as scope changes
- Project projections, including criticality indexes.

Figure 8.1 illustrates relationships between the asset model and the project model. The asset model, itself, may be part of a higher-level corporate or resource-planning model. Both may incorporate the results of detailed submodels developed to understand particular processes, subprojects, or system components. The asset model should always be a stochastic model, that is, incorporating probability distributions, because every project has uncertainties.

[Figure 8.1 The Asset and Project Models: the asset life cycle (operation, decommissioning or conversion) containing the project model and detailed submodels]
The amount of detail appropriate, in either an asset or a project model, depends upon the situation and the capabilities of the analysis team. Models with hundreds or thousands of probability distributions would overwhelm most of us. PRM benefits from focusing on the key drivers, risk mitigation actions, and the effects of uncertainty. Expert system assistants (Chapter 17) will enable us to develop, maintain, and apply larger models.

As Figure 8.1 illustrates, the asset life-cycle model typically contains a simplified project submodel, developed initially before the detailed project model. In contrast, project execution and control benefit from planning in great detail. The full project model typically contains ten to 100 times more detail than the asset model, in terms of resolution. Thus, I'm advocating (a) a stochastic asset model with a lesser-detailed project model inside, and (b) a deterministic project model for detailed planning (not evaluation) and control. The detailed project model is typically deterministic, that is, without distributions. If less than, say, fifty activities, we might incorporate distributions into the detailed project model. We can do most risk-response evaluations on the side as simple decision analyses.

In modeling the project, do not neglect end of life. Some assets will have significant decommissioning or abandonment costs. The asset may have a successor use or be saleable. We might decide to build on into a next version or renovate to extend the planned life.

PMBOK® Guide Sections

The Project Management Institute (PMI) released the PMBOK® Guide 2000 Edition in December 2000, as I was writing this section. The Guide's chapter on risk management is the most obvious area for applying decision analysis; most changed in the new edition is the Guide's Chapter 11, Project Risk Management. Decision analysis is useful throughout the project life cycle. Table 8.1 identifies additional areas of project management, keyed to PMBOK® Guide sections, that benefit from decision analysis.

Table 8.1  Project Activities for Using Decision Analysis

  2.1     Project Phases and the Project Life Cycle
  4.1     Project Plan Development
  5.2     Scope Planning
  6       Time Management (project modeling)
  7       Cost Management
  8.1.2   Benefit/Cost Analysis
  10.3.2  Tools and Techniques for Performance Reporting (variance analysis, earned value)
  11      Project Risk Management
  12.1.2  Tools and Techniques for Procurement Planning (contract type selection)
  12.4    Source Selection
Pre-Project Risk Management

Risk management begins in the feasibility study. A stochastic (probabilistic) model of the asset life cycle provides the basis for justifying the project. This includes a preliminary and coarse project model for calculating distributions for time and cost to do the project. The structure of the summary project model developed at this stage has a preliminary work breakdown structure (WBS) or other activity network representation. Include major discrete contingencies. Important risks provide opportunities for improving the project: What alternatives exist to improve the project's risk-versus-value profile?

We should maintain this asset model throughout the life cycle. Update this model as new information becomes available. As illustrated in Figure 8.1, there are two project models in different detail. Check the project model and asset model after revisions to ensure consistency.

During the Project

The numbering on the following risk management functions matches the PMBOK® Guide 2000 Edition, and I use the same function numbers and names. I am in general agreement with the PMBOK® Guide 2000 Edition, with two noted exceptions. My Appendix 8D, Comparison with the PMBOK® Guide 2000 Edition, describes several exceptions between PMI's official pronouncement and the process that I describe.

11.1 Risk Management Planning

In the early stages of project initiation, we design and implement the formal risk-management methodology. This defines roles and responsibilities, databases, and control reporting, and the purpose of Risk Management Planning is to fix the process for this project. Appendix 8B, Risk Management Plan, discusses plan elements in more detail. In a streamlined risk-management process, the procedures and systems are not reinvented with a new project. Rather, it is hoped that, as with decision policy, the process is a proven, consistent, and well-understood part of the way the organization operates. Special circumstances may, of course, require some modification. Probability distributions, described next, express judgments about uncertain input variables.
11.2 Risk Identification

In detailed project planning, the project team creates, as applicable, the WBS, the bill of materials, and an activity network diagram. The project model details elements in schedule and cost estimation. What are the threats and opportunities? Risk Identification is about reviewing every input variable: every activity, key material, and resource. Checklists built up from others' experiences are valuable in getting started and in ensuring completeness. Also, consider potential changes in the environment as risks; i.e., the assumptions may change. Classifying risks helps identify common and redundant entries. Influence diagrams are able to represent relationships not evident in the activity network diagram.

Qualitative Risk Analysis (PMBOK® Guide Section 11.3) has its place, if desired, to preliminarily classify risks into three categories: "Important," "Not Important," and "Possibly Important." Beyond this, I strongly urge quantification.

11.3 Risk Assessment

In my scheme, this replaces PMBOK® Guide Section 11.3, Qualitative Risk Analysis. As you might have guessed from the book's subject and this chapter's title, the emphasis in PRM will be on quantifying judgments about risks and uncertainties. We need judgments for the values of every input variable in the model; the list includes, especially, risks from Risk Identification. Typically, the most capable available persons judge these values. Their assessments include:
- Single values for variables that are well known or can be calculated with precision; single values also suffice for less-important variables (prioritized subjectively at this stage)
- Probability distributions for important uncertain values in the project model
- Probabilities for discrete risk events
- Probability distributions for impact variables that become relevant when the associated risk events occur (e.g., remedial action cost incurred because of a delayed shipment)
- Single-value or probability distributions for contingency plan implementation costs.

The project base plan is a single scenario where the inputs are EVs for continuous chance variables. For a discrete variable, I use the outcome nearest the EV. Stochastic calculations at an appropriate level, perhaps the asset model, are the means to evaluate schedule, cost, and performance.
In PRM, the most common risk representation is in two parts:
(a) A discrete risk event for which someone judges a probability of occurring.
(b) The cost or time impact on the project as a continuous distribution, given that risk event (a) happens.

That is, the impact is conditioned on the occurrence of the risk event. These risk events can be either desirable (opportunities) or not (threats); I often use the word contingency for such events. These are the "known unknowns" explicitly appearing in the model. I also suggest assessing risk for "unknown unknowns" (things we haven't yet thought about) and an aggregation of risks individually too small to warrant specific inclusion.

As we've discussed, a probability distribution completely describes a judgment about uncertainty for a single chance event. However, there may be interactions between variables (see Chapter 12). We further need the experts to describe how judgments are associated (correlated) with one another. Influence diagrams are useful in mapping interrelationships and designing detailed submodels.

11.4 Risk Analysis

In my scheme, this replaces PMBOK® Guide Section 11.4, Quantitative Risk Analysis. Risk Analysis is about understanding overall project risk. The stochastic asset model is the basis for credible forecasts. Some practitioners carry stochastic calculations through the detailed project model as well, though this may be overkill. Sensitivity analysis (described in Chapter 9) quantifies how uncertainty and changes to input values affect the outcome. The purpose of sensitivity analysis is to help prioritize input variables and model construction details.

A common display of project risk is a cumulative distribution diagram, such as the solid curve in Figure 8.2. Cumulative distributions show the possibilities. Depending upon the model, the target outcome is either asset value or project cost. Here we view project cost distributions (which may incorporate cost-equivalent considerations for schedule and performance) both before and after implementing risk-mitigation actions. The Base Case is only one scenario, used here as the zero reference. An asset's value distribution would be similar (assuming the model includes all life-cycle effects), though the x-axis would instead be the asset value outcome (usually PV). Project cost and asset value uncertainty are among the outputs of a feasibility study.

[Figure 8.2 Risk Profiles Deviating from Base Case: cumulative probability (0 to 1.00) versus project cost deviation from the Base Case ($thousands, about -800 to +1,200), with curves Before Actions and After Actions]

Prioritizing risk events is important in PRM and is done here in Risk Analysis. The popular graph type of Figure 8.3 (either side) shows discrete risks versus their conditional EV impacts. (Many practitioners use a qualitative matrix instead.) Figure 8.3 characterizes the importance of identified (discrete) risks [1]. For clarity, the illustration omits "opportunities" (favorable effects). Importance of a risk depends upon both its probability and impact. Iso-EV cost contours are one way to segregate risks into importance categories. Risk management attempts to move threats "southwest" and opportunities "northeast."

11.5 Risk Response Planning

What do we do about the risks? This is the topic of Risk Response Planning. For this discussion, we will assume risks are only of the "threat" variety. Use brainstorming and checklists again to identify candidate actions for mitigating each risk (or enhancing opportunities). Appendix 8C, Mitigating and Avoiding Risks, may provide you with some ideas. Risk mitigation actions affect the probability of the risk event or its impact, should the risk event occur, or sometimes both. Actions typically reduce or eliminate risk or affect the impact.

From among the potential actions, which are cost-effective to implement? The optimal project plan incorporates the combination of actions that adds most value to the organization. Figure 8.4 shows the hierarchy of this process. Appendix 8B, Risk Management Plan, describes more about evaluating candidate actions. We should maintain and embellish, as appropriate, the project model part of the asset model.
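The two-part risk representation (a discrete event, plus a continuous impact conditioned on the event) maps directly into a sampling routine: draw the discrete event first, and draw the impact only if the event occurs. The 25 percent permit-delay risk and its triangular impact below are invented for illustration.

```python
import random

random.seed(7)

def sample_risk_cost(p_event, low, mode, high):
    """Two-part risk: a discrete event, with a continuous impact only if it occurs."""
    if random.random() < p_event:
        return random.triangular(low, high, mode)
    return 0.0

# Invented example: a .25 chance of a permit delay costing $20k-$80k (mode $35k).
trials = 100_000
ev_cost = sum(sample_risk_cost(0.25, 20, 35, 80) for _ in range(trials)) / trials

# Hand check: EV = .25 x (20 + 35 + 80) / 3 = $11.25k
print(f"EV cost of the risk: ${ev_cost:.2f}k")
```

Summing such samples across all identified risks, trial by trial, is how the "known unknowns" roll up into a project cost distribution like Figure 8.2.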
In a simple world, we could do a risk/cost/benefit analysis on every candidate action. The decisions would be simple: Implement those actions that add more EV benefit than their EV cost. However, correlations cloud the picture:
- A risk mitigation action may affect more than one risk.
- Several actions may be candidates for mitigating one risk.
- Synergies: "Killing two birds with one stone."
- "Diminishing returns": Implementing actions may affect the feasibility of mitigating other risks and the effects of other actions (both in probability and impact). Either action may appear feasible, but together they are not.

These are common problems in portfolio analysis. Here, instead of a portfolio of investments, we have a suite of actions and are selecting a portfolio of response actions. What if there isn't enough money (or other constraining resource) to perform all worthwhile risk-abatement actions? Like elsewhere in life, we prioritize. Instead of implementing everything with a positive EMV, notwithstanding the portfolio issues just mentioned, we would prioritize to do the best with what we have. Most practitioners use a return-on-investment-like criterion. I mention the most popular ranking criterion, discounted return on investment (DROI), in Chapter 3:

    DROI = EMV / E(PV Investment)

Choose to implement those actions with the highest DROI first.

11.6 Risk Monitoring and Control

An inventory of risks and actions is core to risk management. A database repository is the foundation. Appendix 8B, Risk Management Plan, provides more details about this database. Data fields include various classifications, responsibilities, watch points, and time windows.

At major milestones or other suitable points, post-implementation reviews are essential for participants' learning and for capturing knowledge. Recording original estimates and actual outcomes is key to judgment performance feedback. This is perhaps the single most important way in which to improve an organization's planning and evaluation processes and skills.

Keep Your Perspective

While the earlier outline appears linear, the process is iterative and features rework cycles. The asset and project models should be coordinated, and we alternate between looking at the forest versus looking at the trees. There is much subjectivity in analysis work, and part of the analysis discipline is focusing on elements that matter most.
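A sketch of ranking by DROI under a budget constraint follows; the action names and figures are hypothetical, and the greedy fill is only a heuristic, since the synergies and diminishing returns noted above can make the truly best portfolio differ from a simple ranking.

```python
# Rank candidate risk-response actions by DROI = EMV benefit / E(PV investment),
# then fund them in order until the budget runs out. All figures are invented.
actions = [
    {"name": "Expedite permits",  "emv_benefit": 120, "ev_investment": 40},
    {"name": "Second supplier",   "emv_benefit":  60, "ev_investment": 30},
    {"name": "Weather insurance", "emv_benefit":  25, "ev_investment": 20},
]
for a in actions:
    a["droi"] = a["emv_benefit"] / a["ev_investment"]

budget = 60
funded = []
for a in sorted(actions, key=lambda a: a["droi"], reverse=True):
    if a["ev_investment"] <= budget:
        funded.append(a["name"])
        budget -= a["ev_investment"]

print(funded)  # → ['Expedite permits', 'Weather insurance']
```

Note how the second-ranked action is skipped for lack of remaining budget while a cheaper, lower-ranked one still fits, exactly the knapsack flavor of the portfolio problem.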
Building asset and project models is the most fun in analysis work. Conditional branching is key to making the model realistically dynamic. It is always interesting to experiment, learning about what is important and how the project behaves under different circumstances.

There are four appendices to this chapter:
8A  Quick-and-Dirty Decisions
8B  Risk Management Plan
8C  Mitigating and Avoiding Risks
8D  Comparison with the PMBOK® Guide 2000 Edition.
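Conditional branching can be sketched in a few lines: the model inspects an intermediate result and switches behavior, here crashing the build phase when design runs late. All durations and distributions below are invented for illustration.

```python
import random

random.seed(3)

def project_duration_days():
    """Conditional branching: a late design phase triggers a crash (overtime) build."""
    design = random.triangular(20, 60, 30)     # days: low, high, mode
    if design > 45:                            # early-warning condition
        build = random.triangular(25, 40, 30)  # crashed build: faster, costlier
    else:
        build = random.triangular(30, 55, 40)  # normal build
    return design + build

durations = [project_duration_days() for _ in range(50_000)]
print(sum(durations) / len(durations))  # EV total duration, roughly 76 days
```

Because the branch depends on a sampled value, the model's behavior differs trial by trial, which a static single-scenario plan cannot capture.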
APPENDIX 8A

QUICK-AND-DIRTY DECISIONS
There is a common misconception that decision analysis is time consuming. Many evaluations require only a few minutes with pencil, paper, and a hand calculator. Once you are clear about the best course of action, stop the analysis! No further value will be added. Decisions that warrant more extensive analyses have one or more of these characteristics:
- Outcomes are difficult to value (requiring detailed cash-flow modeling).
- Outcome values are significant for the entity, and warrant serious attention to the decision.
- The best alternatives appear to have similar values or cost.
- There are too many variables, compounded by correlations, to process in one's head.
Common Simple Situation
Decision tree analysis is your solution method of choice if you are ever marooned on a desert island. A solar calculator will help! Regardless, you can sketch a decision model on the back of an envelope or in the sand, and do some quick calculations.

Suppose you have created a base project plan. Upon further reflection, you now recognize a contingency that could affect project cost (and/or schedule). Figure 8A.1 shows a simple tree-like model of this contingency. Brainstorming about the possibilities is an important project management function. Corporate planners call it SWOT analysis: identifying Strengths, Weaknesses, Opportunities, and Threats. Note that contingencies can be either unfavorable (threats or risks) or favorable (opportunities). A large project may have hundreds of identified risks and opportunities.

Once we identify a contingency, the manager should seek actions to exploit or mitigate the situation. This is when project management can be proactive, rather than waiting and possibly having to react. The actions we implement may change the probability that the contingency event will occur, affect the project cost distribution (i.e., impact) should the contingency event occur, or possibly both. We can often meet a contingency with several candidate abatement (or exploitive) actions. An action can preempt the contingency event. More often, actions result in only partial control. Flexibility and contingency plans are among the ways to reduce the impact of negative contingencies that do occur.
[Figure 8A.1 Simple Contingency: a chance node for whether the contingency happens; if it happens, the impact on the base plan is Low, Medium, or High; if it does not happen, there is no impact]
Table 8A.1  Improved Base Plate Material

  Upgrade to                          Probability(1) of     Cost(2) to
                                      Being Adequate        Rebuild
  Thicker Stock                       .40                   $5,000
  Premium Grade Stock                 .90                   $10,000
    (but more difficult to machine)
  Machined Casting                    1.00                  $30,000

(1) Conditional, given that the original stock and fabrication prove inadequate.
(2) Includes monetary-equivalent penalties for project delay.
Here is a typical situation where a quick-and-dirty decision model is adequate. Suppose your project involves building several electromechanical instruments. Alignment is critical for certain components mounted on a base plate. The intended aluminum stock for fabricating the base plate is easy to work, but may prove too flexible and nondurable in the application. Your team assesses a .12 chance that the aluminum base plate chosen for fabrication will be inadequate. There are three upgrade alternatives, shown in Table 8A.1. You could upgrade to one of these other choices now, or see how the originally planned aluminum stock works out. Assume that if an initial base plate is not satisfactory, you will have sufficient information then to know what solution is required.

We can use the Figure 8A.1 template to express judgments about the risk and impact of the contingency. Figure 8A.2 shows how we can appraise the contingency. The EV cost of the contingency is $1,200. Whatever alternatives we consider must reduce the EV cost by more than the cost of implementing the action. We decision analysts call this a value of imperfect control problem.
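The $1,200 EV can be verified directly from the tree. This sketch assumes the Figure 8A.2 resolution split given inadequacy of .40 thicker stock, .50 premium stock, and .10 casting (the .50 and .10 are quoted later in the appendix; .40 is the remainder).

```python
# EV cost of the base-plate contingency (Figure 8A.2 values):
# .12 chance the planned plate proves inadequate; given that, the rebuild is
# thicker stock (.40, $5,000), premium stock (.50, $10,000), or casting (.10, $30,000).
p_inadequate = 0.12
rebuild = [(0.40, 5_000), (0.50, 10_000), (0.10, 30_000)]

ev_impact = sum(p * cost for p, cost in rebuild)  # $10,000 given inadequacy
ev_contingency = p_inadequate * ev_impact
print(f"EV cost of contingency: ${ev_contingency:,.0f}")  # → $1,200
```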
[Figure 8A.2 Contingency Model: chance node "Base Plate Proves Too Weak?"; Yes (.12) leads to resolutions Resolve with Thicker Stock (.40, $5,000), Premium Stock (.50, $10,000), or Casting (.10, $30,000); No (.88) leads to $0]
[Figure 8A.3 Decision Model: a choice between Use Planned Aluminum Stock and Thicker Stock Initially; each branch has a chance node "Base Plate Proves Too Weak?" with resolution costs of $5,000, $10,000, or $30,000 if it proves too weak, and $0 otherwise]
An alternative: Your company can decide now to use thicker aluminum stock in the initial fabrication. This initially will cost $500 more. Is this prudent? Since the $500 is less than the $1,200 EV cost of plate inadequacy, the action alternative passes the first feasibility test. Figure 8A.3 shows the decision model. You want to evaluate whether this conservative approach is a justifiable precaution.
The thicker base plate reduces the contingency probability from .12 to .072. If it is inadequate, then the probability that we need the Premium Stock increases from .50 to .833. Similarly, the probability of Need Casting increases from .10 to .167. In this case, the candidate action (Thicker Stock Initially) affects both the probability and the impact of the contingency.

The EV cost of refabrication drops from $1,200 to $960 if initially we use the thicker plate. However, the case with the thicker plate costs $500 more up front. Thus, the comparable initial EV cost with the thicker plate is $1,460. Therefore, choosing Planned Aluminum Stock is most appropriate (for a risk-neutral decision maker or an organization that has an EMV decision policy). The Thicker Stock Initially alternative is a worse option, having a $260 higher EV cost. Thus we obtain a logical, defensible basis for the decision.

This type of quick-and-dirty decision analysis fits many situations. A basic understanding of decision analysis should be in every project manager's and professional's toolkit.
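The comparison can be laid out end-to-end with a small helper; all probabilities and costs come from the appendix text (the thicker-stock case rounds to $1,460 as stated, with a small difference from using .833/.167 rather than exact fractions).

```python
def ev_cost(p_event, resolutions, upfront=0):
    """EV cost = up-front cost + P(event) x EV(rebuild cost given event)."""
    return upfront + p_event * sum(p * cost for p, cost in resolutions)

planned = ev_cost(0.12, [(0.40, 5_000), (0.50, 10_000), (0.10, 30_000)])
thicker = ev_cost(0.072, [(0.833, 10_000), (0.167, 30_000)], upfront=500)

print(f"Planned aluminum stock:  ${planned:,.0f}")  # → $1,200
print(f"Thicker stock initially: ${thicker:,.0f}")  # → $1,460
```

Since the planned-stock branch has the lower EV cost, the EMV decision policy keeps the original plan, matching the appendix's conclusion.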
APPENDIX 8B
RISK MANAGEMENT PLAN
I resist suggesting outlines and formats for economic evaluations and project plans. The formalism you need depends upon the situation. However, three topics in this appendix may serve well as the elements of a PRM plan:
- Sensitivity analysis
- Evaluating alternatives
- Inventorying risks and actions.
Sensitivity Analysis
Modeling is the heart of evaluating a system and the alternatives that we have to affect the behavior of that system. Sensitivity analysis is the quantitative process of determining which variables are most important in the analysis, through determining their influence on the outcome value. Sensitivity analysis is a full section in Chapter 9, Modeling Techniques. Judgment plays an important role. Some risks are clearly not important. We should merely document those risks identified and considered "insignificant." Briefly describing our rationale is good practice (in case something proved significant after all).
Evaluating Alternatives
We only need to model in sufficient detail to make an informed decision. For most decision situations, we do not need to model the enterprise or even the entire project. Typical decisions are sufficiently localized so that great simplifying assumptions are reasonably valid.

Adding value is the goal in decision making. Quantitative methods are much more likely to produce consistent, meaningful results. Lots of books and articles describe matrices of risk versus impact; Figure 8.3 shows an improved, quantified version. For choosing risk abatement actions, many practitioners advocate a return-on-investment decision criterion:

    ROI = Profit Improvement / Cost of Action

Most people deduct the Cost of Action in the numerator. If not, this is usually and more properly called a benefit/cost ratio (BCR). Either way produces the same decisions. The decision rule is to implement the action if the ROI > 0 or if the BCR > 1.
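The equivalence of the two decision rules is easy to demonstrate; the benefit and cost figures below are arbitrary examples.

```python
def roi(profit_improvement, cost_of_action):
    """ROI with the Cost of Action deducted in the numerator."""
    return (profit_improvement - cost_of_action) / cost_of_action

def bcr(profit_improvement, cost_of_action):
    """Benefit/cost ratio: cost NOT deducted from the numerator."""
    return profit_improvement / cost_of_action

# Same accept/reject decision either way: ROI > 0 exactly when BCR > 1.
for benefit, cost in [(150, 100), (90, 100), (100, 100)]:
    assert (roi(benefit, cost) > 0) == (bcr(benefit, cost) > 1)
print(roi(150, 100), bcr(150, 100))  # → 0.5 1.5
```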
We usually measure costs and benefit values as PV cash flow. Of course, since we're talking about risk, the numerator and denominator should also be EVs; then this improved criterion is the DROI. It is simpler and less confusing just to talk about EMVs, though I point out these alternatives because they appear so often.

Inventorying Risks and Actions

As mentioned earlier in the 11.6 Risk Monitoring and Control subsection, an inventory of risks and actions is core to risk management, and the heart of PRM is the database of risks and actions. A database repository is the foundation. Figure 8B.1 illustrates the structure of this database.

[Figure 8B.1 Project Risk Database]

Here are some example data fields in addition to the elements in Figure 8B.1:
- Area or department owning the risk
- Person responsible for the risk
- Relevant time horizon (a Gantt chart would be a useful display of risks and actions)
- Activities affected
- Response resources needed if the risk event occurs
- Watch points and early indicators.
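A minimal sketch of one record in such a database follows, with illustrative fields drawn from the list above; the class, field names, and values are assumptions, not an actual schema from any tool.

```python
from dataclasses import dataclass, field

@dataclass
class RiskRecord:
    """One entry in a project risk inventory (illustrative fields only)."""
    name: str
    owner_department: str
    responsible_person: str
    probability: float       # chance the risk event occurs
    impact: float            # EV cost impact if it occurs, $
    activities_affected: list = field(default_factory=list)
    watch_points: list = field(default_factory=list)

    @property
    def ev_cost(self):
        return self.probability * self.impact

risk = RiskRecord("Permit delay", "Engineering", "J. Smith", 0.25, 45_000,
                  activities_affected=["Site preparation"],
                  watch_points=["Agency backlog report"])
print(risk.ev_cost)  # → 11250.0
```

Storing probability and impact as numbers (rather than high/medium/low grades) lets the inventory itself compute the EV-based rankings discussed earlier.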
I am unaware, at present, of any computer solution that ties everything together. These elements are usually managed separately: the stochastic asset model, the (usually deterministic) detailed project model, and the risks and actions inventory. Seamlessly integrating these project management elements is a complex application-design problem. Perhaps we await an intelligent assistant, such as an expert system described in Chapter 17.
APPENDIX 8C

MITIGATING AND AVOIDING RISKS

It is an unhappy fact of life that there are usually many more things that can go wrong with a project than can unexpectedly go right. Although this section focuses on the downside, the project manager should be constantly vigilant for potential opportunities.

PMBOK® Guide Section 11.5.2, Tools and Techniques for Risk Response Planning, classifies actions into four areas:
.1 Avoidance. Changing the project plan to eliminate the risk.
.2 Transference. Move the risk to a third party, usually by contract. Insurance, guarantees, options, and fixed prices are examples.
.3 Mitigation. If we are unable to avoid or transfer the risk, we might find a way to reduce the probability of a threat and/or its impact, should the risk event occur.
.4 Acceptance. This is the default, "do nothing," when we are unable to find a cost-effective action to deal with the risk.

The following outline describes some ways that you might find useful when avoiding, transferring, and mitigating risks.

Portfolio Risks
- Participate in many ventures (diversify).
- Share risks by having partners (dilution or diffusion).
- Spread risks over time.
- Group complementary risks into portfolios.
- Seek lower-risk ventures.
- Specialize and concentrate in a single, well-known area, where the company enjoys advantages.
- Increase the company's capitalization.

Commodity Prices
- Hedge or fix prices in the futures markets.
- Use long- or short-term sales (price and volume) contracts.
- Tailor contracts for appropriate risk sharing.
Interest Rate and Exchange Rate
- Use swaps, floors, ceilings, collars, and other hedging instruments.
- Restructure the balance sheet.
- Denominate or index certain transactions in a foreign currency.

Environmental Hazards
- Buy insurance.
- Increase safety margins; overbuild and overspecify designs.
- Develop and test an incident-response program.
- Have backup and redundant equipment.
- Acquire additional inventory and spares.

Operational Risks
- Hire contractors under turnkey contracts.
- Provide better training and tools.
- Increase training.
- Increase safety margins.
- Operate with redirect and bailout options at project milestones.
- Tailor risk-sharing contract clauses.
- Monitor key and indicator variables.
- Conduct tests, pilot programs, and trials.
- Seek additional information.

Analysis Risks (Reducing Evaluation Error)
- Use better techniques (i.e., decision analysis).
- Validate models.
- Involve multiple disciplines, and communicate cross-discipline.
- Perform parallel analyses with alternative approaches, models, and people.
- Include evaluation practices with project post-reviews.
APPENDIX 8D

COMPARISON WITH THE PMBOK® GUIDE 2000 EDITION

I support professional standards efforts. These standards are mostly a volunteer effort, and I appreciate that many people devoted what time they could spare to the 2000 Edition. In mid-2000, I submitted twenty-four comment forms to the PMBOK® Guide Exposure Draft. The project committee chose to reject or only partially implement some suggestions, and I wish that I could have invested more time in the PMBOK® Guide revision. Despite important improvements, there are several areas in the PMI standard that are not quite right, in my humble opinion. Key points I offered include: (a) the requirement of a clear decision policy; (b) risk management can apply to continuous events; and (c) emphasis on risk quantification. Below, I'll expand on these topics.

Asset Value Perspective

My background is corporate planning and economic evaluation. The corporate-asset perspective is especially important in decision policy: How do we measure value? We will assume your organization is a for-profit corporation for this discussion. In order to make decisions during the project, we need to know how project delivery affects the asset value after handoff. In Chapter 4, we discuss how schedule and performance outcomes affect net cash flow and, hence, asset PV. So, the feasibility study needs an asset life-cycle model so as to value different alternatives for the organization, and I encourage project managers to begin risk management during the feasibility study. Corporate decision policy should apply to all projects; that's why I omitted an "identify decision criteria" step in Chapter 2's ten-step decision process. I would like to see the next PMBOK® Guide edition be more emphatic about encoding decision policy: objectives, time value, and risk attitude. I offered suggestions throughout the PMBOK® Guide about recognizing the importance of an asset model that parallels, though in less detail, the project model.

Continuous Risk Events

I am pleased that the Guide defines risk in a way so that the effects on the project can be either favorable or unfavorable. Only one passage in the Guide's Chapter 11 mentions classifying risks into threats and opportunities. The taxonomy is useful, and I would like to see this in the next edition. Uncertainty didn't make the Guide's Glossary, and I'm happy to suggest a definition in this book. I suspect these terms would be easier to incorporate into the Guide if people could (would) agree on terminology usage. We don't want to get hung up on arguable definitions. In the context of PRM, let's talk threats: the potential for an undesirable effect.

My point is this: Project risks are not always discrete events. The traditional risk model has us experience the impact only if the risk event occurs, and we implement a risk abatement action to reduce the probability and/or impact of the risk. However, the world is not always so simple. In some circumstances, there may be actions we can take to mitigate or exploit an uncertainty (risk) that is other than a binary event. It might be a continuous-chance event, such as inflation (further complicated by the time dimension), or a discrete chance event with more than just true/false outcomes. During a long project, for example, inflation might be important. A continuous distribution represents both the risk and the impact. I would normally call a continuous distribution an uncertainty, despite my uncertainty definition in this book suggesting that this is the customary case, but this is splitting hairs.

Another point is that most people oversimplify the risks and actions. For example: A risk management plan might implement a contingency action (e.g., increase the flexibility of crew availability) if an early-warning condition occurs (e.g., several predecessor activities are forecasting to finish late). If the risk event occurs (e.g., the activity cannot start when planned), the impact might be higher costs with overtime. So in this example, we have multiple discrete risk events, contingent decision points, and at least two occurrences that may affect project costs and schedule.

Risk Prioritization Needs Quantification

My biggest outstanding issue is about qualitatively characterizing and prioritizing risks. Assigning grades such as high/medium/low to a risk is asking for confusion. Using numbers (probabilities and distributions) provides unambiguous communication and a legitimate ranking and decision-criterion calculation (risked DROI or EV, respectively). I'm hoping that the next Guide revision will mostly eliminate Qualitative Risk Analysis, and that its functions will be attributed to a quantitative Risk Analysis process. Maybe these together should be called Risk Analysis.

In this chapter, I described what I believe is best practice in applying decision analysis to PRM. Modeling, from conception to abandonment, is key to planning and decision making. That is largely the reason for this book!
Chapter 8—Project Risk Management—By the Numbers

Endnotes
1. In this situation, the revised probabilities are easy to calculate. More often, we would use Bayes' theorem.
2. Figure 8.3 is adapted from a chart produced by PROAct™ project risk management software, by Engineering Management Services, used with permission (see Appendix B, Decision Analysis Software).
MODELING AND INPUTS

The model is the heart of most analyses, especially for dealing with uncertainty. The project model captures and expresses our understanding of the project as a system. Chapters in this part further describe evaluation methods:

Chapter 9 Modeling Techniques
Chapter 10 Probability Distribution Types
Chapter 11 Judgments and Biases
Chapter 12 Relating Risks
Chapter 13 Stochastic Variance
CHAPTER 9
Modeling Techniques

Decision analysis and the associated techniques help decision makers choose wisely under conditions of uncertainty. A decision analysis approach is applicable when:
- The situation has two or more alternatives.
- At least one alternative has multiple possible outcomes.
- The range of possible outcomes is significant enough to warrant attention to the decision.

For most systems, such as projects, developing a model is the key to understanding the possibilities. The model represents the dynamics of the project and its behavior under different decisions and circumstances. Most often, a model is constructed and used to project different possible futures. Each projection, or scenario, is summarized into an outcome value. This single value measures goodness or progress toward the organization's objective. This chapter focuses upon quantifying outcome values.

Forecasts from Models

Forecasting is the most important analytic problem in business. Nearly every decision presupposes a forecast: What might happen with each alternative? A projection is a scenario that reflects a particular set of assumptions. The assumptions may be single-point forecasts of the input variables or merely what-if values. A model predicts what will happen if the assumption values are realized. Most often, a projection is a solution of a deterministic model, that is, a model without probabilities. A base project plan, for example, is one projection of what might happen.
A forecast is a projection based upon forecasts of the individual input assumptions. Traditional forecasts use single-value forecasts of the model inputs. Single-value input assumptions calculate through the model to result in a single-value outcome; recall that such models are termed deterministic because every value is singly determined. Here, we are talking about a stochastic, or probabilistic, model: the inputs should be probability distributions when the values are unknown. Example inputs include activity costs and completion times. In the context of this discussion of decision analysis, the best forecast is the expected value (EV) outcome.

The term forecast implies an analytic process of estimation and calculation. In addition to input values, the forecast model includes assumptions about the structure of the system, for example, the work breakdown structure (WBS) or activity-nodes network. This structure is the fabric of the project model.

Forecasting Approaches

There are three general approaches to forecasting:
- Using intuition or guessing
- Extrapolating the past, such as by linear regression
- Modeling the system, based upon one's best understanding, and then using the model to generate a forecast.

Intuition provides forecasts of questionable value. Seldom are the assumptions or reasoning adequately stated. The prediction is believable only if the source person possesses recognized superior experience and a record of reasonably accurate judgments. Because intuition is hard to define, it is difficult to achieve consensus and train successors.

Extrapolation requires suitable historical data and assumes that circumstances and behavior in the future will be similar to what was experienced in the past. This implies that conditions tomorrow will be like conditions yesterday.

Modeling involves designing and building a representation of the system. The model is an abstraction of the real world. Modeling is particularly valuable in situations that are new, unique, or complex.

Deterministic Project Models

You may skip this section if you are already familiar with CPM, PERT, and PDM. This section is an introduction to the methods, for those who may have missed this. Among the quantitative techniques that I learned in business school, project scheduling models were perhaps the most interesting. These models are intuitive and an elegant way to think about projects.

Gantt Charts

American industrial engineer Henry Gantt (1861-1919) originated a form of bar chart that bears his name. He developed this in the context of World War I military projects. A Gantt chart, still popular, displays the activities in a project, as illustrated in Figure 9.1. It shows time to complete and sequence for every activity. This charting tool helps break down the project into a reasonable number of activities, though it is just a graph and has no calculation methodology. It might also help determine schedule dependencies and resource balancing.

[Figure 9.1 Gantt Chart: Activities A through F plotted against Project Week, 1 to 23.]

Critical Path Method

Perhaps the most popular project planning tool came from the chemical and construction industries. The critical path method (CPM) grew out of a joint effort between DuPont Company and Remington Rand Univac during 1956-59. A CPM diagram is a directed network representing the sequence of project activities. Originally the representation was:
- Activities as arcs, and these connect
- Nodes (also called "connection points" or "events").

The "activity-on-arc" method clearly shows each activity and its sequence in the project, as illustrated in Figure 9.2.
[Figure 9.2 CPM Activity-on-Arc Diagram]

The assessed completion time for each activity is the number on its arc. The connection points contain two numbers: the upper number is the earliest time that this point can be reached, and the lower number shows the latest time to reach this point. Solving the CPM model involves these steps:
1. Calculate the earliest times by working left to right, chronologically, through the network. The earliest completion at a node is the latest time from all of its activity connections coming from the left.
2. Calculate the latest times by working right to left, back-calculating times from the activities at the right. The latest completion time at a node is the earliest time.
3. The difference between the numbers inside the nodes represents slack, indicating that connecting activities can be somewhat late and not delay the project.
4. Find the activities connecting nodes with zero slack. This sequence is the critical path (CP). Any delay in these activities correspondingly lengthens the time to complete the project.

In recent years, it has become standard to represent activities on nodes. Nonetheless, the CPM label persists.

If there is value in shortening the overall completion time, we can examine time-versus-cost trade-off alternatives for each activity along the critical path. The value of accelerating, or "crashing," an activity is approximately:

Value added by a crash program = (Days shortened by the crash program x Value/Day shortened) - Cost of the crash program
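The forward and backward passes in the four steps above can be sketched in code. This is a minimal illustration using a hypothetical four-activity network (not the network of Figure 9.2):

```python
def topo_order(activities):
    # Order activities so every predecessor comes before its successors.
    order, seen = [], set()
    def visit(name):
        if name in seen:
            return
        seen.add(name)
        for pred in activities[name][1]:
            visit(pred)
        order.append(name)
    for name in activities:
        visit(name)
    return order

def solve_cpm(activities):
    """activities: {name: (duration, [predecessor names])}."""
    order = topo_order(activities)
    # Forward pass (left to right): earliest start/finish.
    early = {}
    for name in order:
        dur, preds = activities[name]
        es = max((early[p][1] for p in preds), default=0)
        early[name] = (es, es + dur)
    project_end = max(ef for _, ef in early.values())
    # Backward pass (right to left): latest start/finish.
    late = {}
    for name in reversed(order):
        dur, _ = activities[name]
        succs = [s for s, (_, ps) in activities.items() if name in ps]
        lf = min((late[s][0] for s in succs), default=project_end)
        late[name] = (lf - dur, lf)
    # Zero slack (float) marks the critical path.
    slack = {n: late[n][0] - early[n][0] for n in activities}
    critical = [n for n in order if slack[n] == 0]
    return early, late, slack, critical
```

For example, with A (5 weeks), B (10 weeks after A), C (3 weeks after A), and D (8 weeks after B and C), activities A, B, and D have zero slack and form the critical path, and the project completes at week 23.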
The crash-value calculation is an approximation because the critical path is seldom certain, and other activities may become (more) critical as we accelerate one activity.

Program Evaluation and Review Technique

The United States Navy and Booz, Allen and Hamilton developed the Program Evaluation and Review Technique (PERT) during the 1950s for the Polaris submarine project. The foundation of a PERT analysis is an activity network diagram, with activities on the nodes. It is similar to CPM, but with the addition of probability distributions. Most practitioners call these either CPM or PERT diagrams.

In PERT, we obtain estimates of completion time for each activity. Traditionally, these are in the form of three points:
- Optimistic completion time (L)
- Pessimistic completion time (H)
- Most likely (mode) completion time (M).

Here are the traditional formulas for the statistics (using "moment methods" of Appendix 1A). The mean time to complete a task is:

mean = (L + 4M + H) / 6

The standard deviation of the completion time is approximately:

sigma = (H - L) / 6

These formulas approximate the statistics for a beta distribution. Then, the project has a mean completion time for activities along the CP:

mean(CP) = sum of the activity means along the CP

The standard deviation is:

sigma(CP) = square root of the sum of the activity variances (sigma squared) along the CP

These calculations have two serious problems: In PERT, we choose the CP deterministically. In a stochastic model, we would find that other paths might become critical. Also, the standard deviation formula assumes activity independence. These deficiencies are sometimes serious and warrant more-rigorous calculation methods, notably Monte Carlo simulation (simulation). Despite these issues, PERT is a tremendous invention and is the underlying concept in some commercial project-planning programs.
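A small sketch of the traditional PERT formulas above; the three-point estimates used in the example are hypothetical:

```python
import math

def pert_stats(low, mode, high):
    # Beta-approximation statistics for one activity's completion time.
    mean = (low + 4 * mode + high) / 6
    sd = (high - low) / 6
    return mean, sd

def path_stats(estimates):
    """estimates: list of (L, M, H) tuples for activities along the CP.
    Means add; variances add (assuming activity independence)."""
    stats = [pert_stats(*e) for e in estimates]
    mean = sum(m for m, _ in stats)
    sd = math.sqrt(sum(s * s for _, s in stats))
    return mean, sd
```

For an activity estimated at (4, 5, 12) weeks, the mean is 6.0 weeks and the standard deviation is about 1.33 weeks; a two-activity path of (4, 5, 12) and (2, 3, 4) has a mean of 9.0 weeks.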
Precedence Diagramming Method

"Industrial-strength" project-planning software has evolved to use a more flexible activity-on-node notation. Critical path analysis has evolved to a more general form, called the precedence diagramming method (PDM). The chief advantage is more flexibility to represent activity starting and finishing conditions.

Figure 9.3 shows a typical node representation. The project team supplies the Activity Code, Activity Description, and Estimated Duration. Estimated Duration would be a single value for a deterministic model; it could be a probability distribution in a stochastic model. The other boxes will contain values calculated by the program.

[Figure 9.3 Typical Activity Node Representation: Earliest Start, Estimated Duration, Earliest Finish (top row); Activity Code, Activity Description (middle row); Latest Start, Float, Latest Finish (bottom row).]

The four relationships possible between predecessor and successor activities include:
- Finish-to-Start (the only relationship recognized in traditional CPM and PERT methods)
- Finish-to-Finish
- Start-to-Start
- Start-to-Finish.

The arrows in Figure 9.4 show the four relationship types. Although cumbersome, we can represent the last three relationships in PERT or CPM by using dummy nodes; PDM eliminates the need for dummy nodes for overlapping activities. A further PDM embellishment is the ability to include a time delay on a link. This allows a simple way to show delays, such as for paint-drying time.
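As an illustration, here is a hypothetical sketch of how each link type (with an optional lag, such as paint-drying time) constrains a successor's schedule. The FS/SS/FF/SF codes and the lag convention are assumptions for this sketch, not drawn from any particular scheduling package:

```python
def successor_early_start(link, pred_es, pred_ef, succ_duration, lag=0):
    """Earliest start of a successor implied by one predecessor link."""
    if link == "FS":   # finish-to-start: start after predecessor finishes
        return pred_ef + lag
    if link == "SS":   # start-to-start: start after predecessor starts
        return pred_es + lag
    if link == "FF":   # finish-to-finish: finish after predecessor finishes
        return pred_ef + lag - succ_duration
    if link == "SF":   # start-to-finish: finish after predecessor starts
        return pred_es + lag - succ_duration
    raise ValueError(f"unknown link type: {link}")
```

With a predecessor spanning weeks 5 to 10 and a 4-week successor, an FS link with a 2-week lag gives an earliest start of week 12, while an FF link gives week 6.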
[Figure 9.4 Four Precedence Relationships: Finish-to-Start, Finish-to-Finish, Start-to-Start, and Start-to-Finish links between predecessor and successor activities.]

Deterministic Cashflow Models

This discussion describes decision making from the perspective of a business enterprise, although the techniques apply to all entity types.

Chapter 3 and Chapter 4 describe valuing outcomes through decision policy. Decision making is easiest if a single value measure expresses the quality of the outcome. Project managers are traditionally concerned with performance, schedule, and cost. While these dimensions are important, it is impossible to make consistent decisions without a way to determine a composite value. One needs a logical way to trade off one dimension in terms of another. I recommend converting nonmonetary dimensions into money equivalents. This is the simplest way to deal with multiple objectives or multiple decision criteria.

Cash Counts

In business, value derives from net cash flow (NCF). Presumably, this cash flow is available to distribute to investors or to reinvest in the business. Everything important to the decision is reflected in either revenues or cash expenditures. Thus, project performance and schedule translate into cashflow impact.

Discounted cashflow (DCF) analysis is the basis for most modern financial analysis. The present value (PV) calculation transforms an incremental NCF forecast into incremental corporate value. For most purposes, PV of the net cash flow is the single-value measure; this is often called net present value (NPV). There are many arguable details, such as inflation rate, tax rates, and cost-of-capital assumptions. The general process, however, is straightforward and reasonably consistent.

Figure 9.5 illustrates this evaluation approach. Information generated in the development model and feasibility model can be used to generate NCF, and then the PV or EMV. Shaded blocks in the figure indicate common project outcome criteria that inherently lead to multicriteria decision making (MCDM, in Chapter 4).
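A minimal sketch of the PV calculation: discount an incremental NCF forecast at an assumed cost of capital. The discount rate and cash flows below are hypothetical:

```python
def npv(rate, cashflows):
    """Present value of a net-cash-flow stream.
    cashflows[0] occurs now; cashflows[t] at the end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))
```

For example, investing $100 thousand now for $60 thousand at the end of each of the next two years, at a 10 percent cost of capital, gives an NPV of about $4.13 thousand.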
[Figure 9.5 Cash Equivalents: decision model input assumptions (including cost) feed models, outcome forecasts, and decision criteria, leading to PV or EMV.]

Problem and Model Scope

A good project model contains sufficient operating and financial detail to reasonably represent the impacts of the relevant decision alternatives. The subject system of the analysis may be all or part of an industry, a business, a project, or a transaction. Decision analysis techniques are fully general and apply to construction or non-construction projects equally.

The model's scope is one of the most important analysis-design decisions. Sometimes managers concern themselves only with a development or construction phase. However, this is usually inadequate. The scope usually needs to consider the remaining life cycle of the project and sometimes the life cycle of the product of the project (asset), as illustrated in Figure 9.6. Completion time and asset performance also impact value by affecting cash flows. Projects often influence other areas of corporate operations, and even other projects. When this is the case, we need sufficient detail to adequately forecast the impact on corporate NCF. We want to avoid modeling the entire corporation for every decision, however; simplifying assumptions are usually necessary. All important details and aspects of the problem should be incorporated into the model.

The appropriate level of detail depends upon the decision at hand. Sometimes a quick inspection of the model outputs will indicate an obvious decision. In other situations, the relative differences in outcome values may be small. When this is the case, further analysis effort is warranted, perhaps incorporating further detail in the model, at least to the point when the best alternative is clear. The incremental effect on corporate NCF and its PV are generally the most useful model outcomes. Accuracy is usually better, and more significant for our purpose, regarding the difference between alternatives than for the actual alternative values.

[Figure 9.6 Evaluation Scope in Asset Life Cycle: feasibility, construction, operating, and decommissioning-and-reclamation models span construction, operation, and end of life; corporate net cash flow leads to PV or EMV.]

The need to estimate incremental corporate value drives the scope of the decision model. The scope of the problem-solving process is also important. Recall that Proactively Identify Decision Opportunities is the first step of the decision analysis process, described in Chapter 2. It is unfortunate that most people are so busy and reactive that they do not have time to create decision-making opportunities. Increasingly, businesses are recognizing the value of proactively creating new alternatives. The situation may be where a modest effort or investment potentially frees or safeguards much greater resources. We professionals should continually ask, "What can we do to improve the value of this project?" And, "What might happen, and what can we do to protect from or exploit the contingency?"

Modeling Process

Typically, an evaluation process is initiated when a problem surfaces. Something happens, different from plan, or a new idea or information surfaces. It usually involves a choice about allocating resources such as time, money, or materials.
Often, a cross-disciplined project team is involved or assigned to the problem. The team should first define the problem (Step 2 in the ten-step process) and include a situation description. Before beginning to actually construct a model, it is helpful to conceptualize the model's organization or structure. The general planning rule applies: "An investment in planning has a tenfold savings in execution." A modeling conceptual error, found at an early stage, is much easier and less risky to correct.

Determining the appropriate model detail is a balancing act. The model should encompass all important features of the system. It is not the accuracy of the estimates that is important in decision making; rather, it is the relative values between alternatives. If the best two alternatives have similar values, further analysis is usually appropriate to refine these choices and discriminate the better value. The project team and decision makers should be confident that the model behaves in a way that is consistent with their understanding of the situation.

Modeling Flows

Understanding the project and its elements is essential to developing a valid model. The model represents our understanding of how the project (or asset) behaves under different conditions. This may include optimizing operating design and parameters for each major strategy choice. If one is selecting the melt process for a casting plant, for example, three alternatives are available: cupola, induction furnace, and arc furnace.

We must choose a logical framework for developing the model. It usually works well to adopt one or more of the following as a theme for the modeling process:
- Sequence of activities
- Flow of information
- Flow of physical units
- Flow of labor hours and material quantities
- Flow of cash
- Flows of income and expenses (accounting book basis).

We build these models with mathematical formulas and variables. Here are two example statements:

NetCashFlow = CashMargin - IncomeTax - CapitalExpenditures
DatePrototypeTestingBegins = Maximum(PrototypeConstructionFinish, TestingFacilityConstructionFinish)

Factors in the model may have inherent flow relationships. The idea of conservation of mass, for example, is widely applicable. It is wise to build in checks to ensure that all resource units are accounted for.
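The two example model statements above can be written directly as functions; all the input values shown in the usage note are hypothetical:

```python
def net_cash_flow(cash_margin, income_tax, capital_expenditures):
    # NetCashFlow = CashMargin - IncomeTax - CapitalExpenditures
    return cash_margin - income_tax - capital_expenditures

def date_prototype_testing_begins(prototype_construction_finish,
                                  testing_facility_construction_finish):
    # Testing cannot begin until both predecessors are done.
    return max(prototype_construction_finish,
               testing_facility_construction_finish)
```

For instance, a cash margin of 100 with income tax of 30 and capital expenditures of 40 gives a net cash flow of 30; if prototype construction finishes in week 12 and the testing facility in week 15, testing begins in week 15.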
System Diagrams

The first step in any modeling effort is to identify the objective of the analysis. A diagram of the problem is a good early investment. We can do this with little time and effort, before laboring with the details of formulas and values. A system-flow diagram is useful in depicting essential features.

In project management, the starting point may be the project plan as a WBS. This provides a list of the project's activities and the precedence relationships. The graphical equivalent is the CPM, PERT, or PDM diagram. Costs and, possibly, performance details can be built upon the base project-schedule model.

A similar diagramming approach, focusing more on the variables, is an influence diagram. Figure 9.7 shows an example. The arrows on the arcs in Figure 9.7 show the direction of influences or causality. Common symbol shapes in influence diagramming include:
- Rectangles or squares for decisions
- Ovals or circles for risk or uncertainty variables
- Double ovals for deterministic calculation variables (i.e., formula calculations)
- Rounded-corner rectangles for payoff variable(s), the basis for decisions (such as EMV).
In Figure 9.7, the subject decision is whether to accelerate an activity, Construct Prototype. Typically, this activity is part of a much larger program or project. Prototype Testing follows, although there may be a delay. The Prototype Testing Start Date affects the activity costs and the project's overall time to complete. Whether an activity is on the CP determines whether delaying the activity directly delays the project's overall completion time.

Influence diagrams are a great way for a project team to begin working on a new, unfamiliar problem. Members can discuss the variables and their relationships while drawing the diagram on a whiteboard. Sometimes teams resolve problems, without resorting to quantitative modeling, through the insights gained just by developing the graphic model. Such diagram drawing is the central practice in system dynamics, the discipline of modeling dynamic systems with rate-of-change (differential) equations. "Loops and links" are often positive or negative feedback connections, illustrating (qualitatively) how the system behaves. Detailed modeling should not proceed until everyone is satisfied that the diagram adequately and faithfully expresses the team's understanding of the system and decision alternatives.

Other Modeling Concepts

Modeling proficiency comes with practice. Following are several useful concepts in modeling.

Decomposition is the process of breaking down something complex into understandable and workable components. Decomposition enables better intuition and allows dealing with more detail than would otherwise be possible. This technique pervades most analyses. Often, we are interested in a probability distribution that is difficult to judge directly. In such cases, we can develop a submodel to better understand a portion of a system. We can then copy the submodel's distribution into the higher-level model. We can also break out submodels for organization convenience, and link them into a main model.

Synthesis is the process of combining components into a larger whole. In modeling, synthesis would naturally follow decomposition: having decomposed the problem into smaller parts, we can combine the components into the overall model. Simplicity is a hallmark of a good model; there is elegance in a minimal representation that adequately matches the project manager's view of reality.

Sensitivity analysis is very important in deciding which elements significantly impact the outcome. It permits the modeler to examine the degree to which an outcome variable is affected by changes in input variables. If a relationship is found to be important, additional analysis may be warranted to further define the probability distribution of the input variable, or to more precisely define the formula relationships in the model.
Modeling Tools

Computer assistance is preferred for all but the simplest decision problems. Computers allow us to include a much greater scope of considerations in the models. Available tools include:
- Computer spreadsheets
- Formula-based modeling tools, such as Javelin® or Lotus® Improv™
- Procedural, high-level programming languages, such as Microsoft® Visual Basic™
- Graphical modeling tools, such as High Performance Systems' ithink®, some of which now permit visual display of the results of dynamic interactions
- Modeling languages, e.g., CACI's SIMSCRIPT II®
- Task-specific modeling programs, such as Microsoft Project
- A variety of other tools to aid in defining and refining logical relationships between variables, such as flow-charting and diagramming programs.

There is a well-established industry in project-scheduling software. Much of the commercial software is limited to certain aspects of project management, including controlling cost, scheduling, resources, or completion time. Models to calculate construction costs are often the most detailed. However, project cost is but one component of corporate NCF. Other tools are being developed (or are available, but not used extensively) that permit more sophisticated modeling of project network diagrams (PNDs). These rely on PNDs that permit other logical relationships between activities that are not typically available in commercial programs for project planning. Research has shown how to construct model logic modules for phenomena in typical projects; reusing standard modules will lead to more efficient model development in the future. Unfortunately, new situations most often require partially or wholly custom models, tailored to the situation at hand.

The deterministic project model is the core of a planning or an evaluation analysis. Its main function is to generate projections and, hence, values for each possible outcome scenario. The model aids in making forecasts, regardless of the probabilistic technique involved. Usually, we evaluate every path through a decision tree, and every trial scenario in a simulation, with a deterministic model. We then use probabilities to weight the outcome values into the EV forecast of a decision variable such as EMV. Typically, 60 percent of the effort of a decision analysis is in constructing the deterministic cashflow model.
Spreadsheet programs are the most popular business-modeling tool. For larger problems, formula-based tools provide a more manageable and productive environment. For the largest models, or for those with complex formulas or conditional branching, procedural programming languages have an advantage: detailed modularity. Microsoft Visual Basic™ is powerful, and well-done models are mostly readable by nonprogrammers. The graphical modeling tools will become increasingly functional and popular. There are many tools available for project planning. An increasing number of these solve the project activities network with simulation. These tools generate forecasts of cost and completion time as distributions. At this writing, there are at least three add-in simulation programs for Microsoft Project. Other vendors of project management software have added simulation capabilities. Beyond the current tools, we can potentially reduce much of the modeling effort through rule-based expert systems (a subset of artificial intelligence, discussed in Chapter 17). Actually, the amount of work is unchanged; we delegate more of it to the computer.
Sensitivity Analysis
Project models consist of variables and formula relationships. We combine input variables in formulas to determine how the project system behaves and the corresponding outcome value. Obviously, some variables are more important than others. Sensitivity analysis is the process of analyzing the relative importance of elements in the model. Usually we focus attention on the input variable assumptions and not on the model structure. A nonprobabilistic, or deterministic, cashflow model provides the basis for valuing each possible outcome. Monte Carlo simulation and decision tree analysis are ways to recognize uncertainty in input variables; the input distributions represent judgments about risks and uncertainties. Applying decision tree analysis, Monte Carlo simulation, or other probabilistic techniques turns the deterministic model into a stochastic model. The real purposes of sensitivity analysis are to:
- Identify key variables warranting special attention
- Provide information to back up a presentation and recommendation.

The analyst may want to allocate time for obtaining expert judgments in rough proportion to the importance of the variables. Typically, two to five input variables cause over 90 percent of the uncertainty in the outcome value.
Sensitivity to Individual Inputs In the simplest application, we perform sensitivity analysis by making alternative "whatif" trials of a deterministic model with step changes in one or more of the variables. These can be either chance or decision variables.
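The one-variable-at-a-time stepping described above can be sketched as follows; the cost model and its base-case inputs are hypothetical:

```python
def project_cost(x):
    # Hypothetical deterministic cost model: materials plus labor.
    return (x["material_units"] * x["material_unit_cost"]
            + x["labor_hours"] * x["wage_rate"])

base = {"material_units": 1000, "material_unit_cost": 0.5,
        "labor_hours": 20, "wage_rate": 15.0}

def spider_data(model, base_case, factors=(0.8, 0.9, 1.0, 1.1, 1.2)):
    """Step each input through factors of its base-case value, holding
    the other inputs fixed; returns one row of outcomes per variable."""
    rows = {}
    for var in base_case:
        rows[var] = []
        for f in factors:
            trial = dict(base_case)
            trial[var] = base_case[var] * f
            rows[var].append(model(trial))
    return rows
```

Plotting each row against the deviation factors (0.8 to 1.2) gives the legs of a spider diagram such as Figure 9.8.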
[Figure 9.8 Spider Diagram: project cost ($ thousands, roughly 500 to 700) versus deviation from base case (0.8 to 1.2) for MaterialsUnits, MaterialsUnitCost, WageRate, and PerformanceReward.]
Usually, only one variable is changed at a time, holding the other variables at their base-case values. This form of sensitivity analysis is easy to perform, and does not involve probabilities. A spider diagram, such as in Figure 9.8, is a popular way to present simple sensitivity analysis. The x-axis is a factor deviation from the base case. Although the y-axis can be any dependent parameter (PV, internal rate of return [IRR], and so on), the outcome value measure is most meaningful.
* IRR is the discount rate that makes PV = 0. This popular decision criterion represents the yield of the unamortized investment, and is often used for ranking investments under a capital constraint.
Figure 9.8 shows the model's outcome sensitivity to changes in individual input variables; the more sensitive the model to a variable, the steeper the slope. A drawback to the spider diagram is that percent deviations do not indicate the range of uncertainty for each variable. A variant of this diagram adjusts the lengths of the "legs" to confidence bounds (for example, 80 percent confidence limits) for the respective variables. Some practitioners further embellish the graph with "webs" connecting the confidence points. Figure 9.9 shows a popular sensitivity chart in project evaluations, where the project can either succeed or fail. This is a singlevariable spider diagram, with chance of success as the input variable and EMV as the outcome value.
[Figure 9.9 Sensitivity Chart for Chance of Success: EMV ($ thousands, -50 to 200) versus project chance of success (0.0 to 1.0).]
A recently more-popular sensitivity graph is the tornado diagram, shown in Figure 9.10. This has the benefit of expressing the combined effect of the variable range and model sensitivity to that variable. We run alternative what-if cases, changing one variable at a time to a low confidence bound (e.g., 10 percent), then to a high confidence bound (e.g., 90 percent). The graph in the figure prioritizes variables in sequence of importance, according to the range of the resulting outcome values. When the graph is oriented in the manner shown, the outline of the bars resembles a tornado. Note that the top four or five variables account for most of the outcome uncertainty.
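The tornado calculation above can be sketched as code: swing each input between its low and high confidence bounds (others at base) and rank variables by the resulting outcome range. The cost model and the bounds below are hypothetical:

```python
def cost(x):
    # Hypothetical deterministic cost model.
    return x["material_units"] * x["unit_cost"] + x["labor_hours"] * x["wage"]

base = {"material_units": 1000, "unit_cost": 0.5, "labor_hours": 20, "wage": 15.0}

# Low/high confidence bounds (e.g., 10 and 90 percent) per uncertain input.
bounds = {"material_units": (800, 1300),
          "unit_cost": (0.25, 0.75),
          "wage": (13.0, 18.0)}

def tornado(model, base_case, bounds):
    """Returns (variable, low outcome, high outcome) tuples, sorted by
    descending swing; plotted as horizontal bars, this is a tornado chart."""
    bars = []
    for var, (lo, hi) in bounds.items():
        lo_case, hi_case = dict(base_case), dict(base_case)
        lo_case[var], hi_case[var] = lo, hi
        bars.append((var, model(lo_case), model(hi_case)))
    bars.sort(key=lambda b: abs(b[2] - b[1]), reverse=True)
    return bars
```

In this sketch, unit cost produces the widest bar (outcomes 550 to 1050), so it tops the tornado.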
Leverage Model
Both deterministic and stochastic models are suited to sensitivity analysis. Stochastic models show the additional effect of uncertainty in the input variables. Variables having a high degree of uncertainty have a correspondingly greater influence upon project outcome uncertainty. In the tornado diagram featured in Figure 9.10, two effects determine the importance of a variable to the analysis outcome value:
- The sensitivity of the model to changes in the variable's value (i.e., the slope of the line in a spider diagram)
- The uncertainty, or range, of the variable.

These two aspects combine in a "leverage" effect, as illustrated in Figure 9.11. Symbolically, we have this formula relationship:
(variable's importance) ≈ (uncertainty in the variable) x (sensitivity of the model to this variable)
[Figure 9.10 Tornado Diagram: horizontal bars for Material Units, Material Unit Cost, Performance Reward, Wage Rate, Direct Labor Hours, Shipping Unit Cost, and Overhead, plotted against Project Cost ($ thousands, 400 to 700).]
Variable Interactions

Sometimes in performing sensitivity analysis, we simultaneously change two or more variables to determine the joint, or combined, effect. The joint variance (change) is usually different than the sum of the component variances. Analysis of joint effects is cumbersome with conventional sensitivity analysis utilizing the deterministic model.

There is a relatively new way to do sensitivity analysis that recognizes the possible interactions of variables in the calculations. Run a Monte Carlo simulation, saving the trial values of the input variables in addition to outcome values. For each input variable, calculate a correlation coefficient between the input and outcome variables. The correlation coefficient formula is in Appendix 1A, Moment Methods. This statistic represents the degree of association (correlation) between the variable's random sample values and the project's trial outcome values. The magnitude of the correlation coefficient is a measure of sensitivity to that variable.

The correlation coefficient (ρ) ranges from +1 when the variables have a perfectly positive correlation to -1 when they have a perfectly negative correlation (both rare). A +1 would mean that the sequenced (ranked) input variable values and outcome values are perfectly paired. If the correlation coefficient is close to +1 or -1, then we should simply use a formula to express the dependent variable as a function of the independent variable. A zero correlation means that there is no apparent pattern (correlation) between an input and the outcome variable.

Figure 9.12 is a sensitivity chart illustrating the relative importance of each input variable, according to variable rank correlation. (This graph was generated with Crystal Ball®. @RISK produces a similar chart. See Appendix B.)
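A sketch of this simulation-based method: run trials, save each trial's inputs and outcome, then rank inputs by the magnitude of their correlation with the outcome. The distributions and the cost formula are hypothetical, and for brevity this uses a plain (Pearson) correlation on the trial values; a rank correlation, as in Figure 9.12, would correlate the ranks of the values instead:

```python
import random

def correlation(xs, ys):
    # Pearson correlation coefficient between two equal-length samples.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def sensitivity_by_correlation(trials=2000, seed=1):
    random.seed(seed)
    samples = {"material_units": [], "wage_rate": [], "overhead": []}
    outcomes = []
    for _ in range(trials):
        mu = random.uniform(800, 1300)
        wr = random.uniform(13, 18)
        oh = random.uniform(48, 52)   # nearly certain, so little influence
        samples["material_units"].append(mu)
        samples["wage_rate"].append(wr)
        samples["overhead"].append(oh)
        outcomes.append(0.3 * mu + 20 * wr + oh)   # trial project cost
    coeffs = {v: correlation(s, outcomes) for v, s in samples.items()}
    # Prioritize variables by the magnitude of the coefficient.
    return sorted(coeffs.items(), key=lambda kv: abs(kv[1]), reverse=True)
```

Because material units contribute the most outcome variance here, they rank first, and the nearly certain overhead drops to the bottom, mirroring the bar widths in a chart like Figure 9.12.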
[Figure 9.11: Sensitivity times Uncertainty. Importance of a Variable's Uncertainty = (Variable Range from Distribution) x (Sensitivity in Deterministic Model).]
[Figure 9.12: Sensitivity Chart from Simulation. Target forecast: Project Cost. Bars show each input's rank correlation with the outcome, on a scale from -1 to +1: Material Units .63, Material Unit Cost .45, Wage Rate .38, Direct Labor Hours .34, Performance Reward .28, Shipping Unit Cost .21, Overhead .09. Asterisks mark correlated assumptions.]
Note: Rank correlation uses the rank sequence of data values (ordinal scale data), rather than the values themselves (ratio scale data), in computing the correlation coefficient.
As with a tornado chart, we prioritize input variables by the width of the bar. We can obtain a similar sensitivity chart, with the same ranking of variables, by approximating each input variable's contribution to the variance (σ²) of the outcome variable. If I could use only one form of sensitivity graph, this would be it.

There is one little problem, however. Sometimes we have a strong correlation between a minor variable and an important variable. In the model in Figure 9.12, Shipping Unit Cost is of minor importance, but is highly correlated to Material Units (ρ = .5 in the model). Thus, sometimes we want to temporarily disable correlation in doing sensitivity analysis to get a better prioritization of input variables. Disabling correlation allows such minor variables to drop out.

Dynamic Simulation Models

In most Monte Carlo project simulations, uncertainties are resolved at the start of a simulation trial. Random samples of input probability distributions (trial values) are substituted in the deterministic model. The system then solves the model to determine the forecast outcome and its value for that trial. The typical analysis has many assumptions, policies, and rules of thumb.

A dynamic model makes adjustments as the project simulation progresses, much as the real system where a manager can intervene. Key decision points are placed where management interventions can occur. At subsequent decision points, forecasts will use then-current conditions and apparent trends. Decisions are based upon the then-current forecasts. Example intermediate decision points include:
■ Evaluating whether to continue a project after test results
■ Changing specifications to adapt to a changed outlook for market demand
■ Expediting activities to make up for project delays.

Such models reflect the adaptive nature of real-world project management. We can, in effect, make projects self-healing. Situations where dynamic simulation is applicable include:
■ Shared resource loading
■ Activity duration and cost dependent upon actual start time versus original plan
■ Effects of contingencies, which often ripple through the system (when things start to go wrong, often a chain of unpleasant surprises is set into motion).

A decision tree is perhaps the simplest form of a dynamic project model. Solving such a decision tree defines choices that will be based upon event outcomes realized up to that point. Decision points during a project can greatly complicate the model, however. A decision tree analysis can explicitly represent only a few subsequent decision points. Several decisions and chance events can overwhelm a decision tree approach. We can accommodate more detail and flexibility in a simulation model. These dynamic simulation models are some of the most interesting to develop.
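As a sketch of such a dynamic model (not from the book: the activity distributions, the 28-day trigger, and the crash cost are invented for illustration), a decision rule evaluated inside each trial adjusts the remaining work mid-project:

```python
import random

random.seed(42)

def simulate_project(n_trials=10_000):
    """Monte Carlo sketch of a dynamic project model. A decision rule,
    evaluated inside each trial, crashes phase 2 whenever phase 1 runs
    late -- the code analog of a spreadsheet IF statement. The activity
    distributions, the 28-day trigger, and the $15k crash cost are all
    invented assumptions for illustration."""
    trials = []
    for _ in range(n_trials):
        phase1 = random.triangular(20, 35, 25)   # days (low, high, mode)
        phase2 = random.triangular(30, 50, 38)
        cost = 100.0                             # base cost, $thousands
        if phase1 > 28:          # intervention point: are we behind plan?
            phase2 *= 0.8        # expedite the remaining work ...
            cost += 15.0         # ... at a price
        trials.append((phase1 + phase2, cost))
    return trials

trials = simulate_project()
mean_duration = sum(d for d, _ in trials) / len(trials)
mean_cost = sum(c for _, c in trials) / len(trials)
print(f"EV duration: {mean_duration:.1f} days, EV cost: ${mean_cost:.0f}k")
```

Removing the decision rule and rerunning shows the value of the intervention: the expected duration rises while the expected cost falls to the base cost, exactly the trade-off the rule encodes.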
Simulation is the only technique that can represent such diverse and important details in a decision model. The key to dynamic modeling is representing conditional branching. IF statements accomplish this in spreadsheets. Decision rules in the project model, representing the company's decision policy, can drive the model's behavior. Modeling in this detail enables more realistic project planning and evaluation.

Summary: Toward Credible Evaluations

The aim of decision analysis is faster, more confident decisions. Modeling is a key method of gaining insights into the behavior of a complex system or any reasonably sized project. It is difficult to overstate the importance of a good model.

The alternative to decision analysis is intuition. While the intuition of the experienced manager can add significantly to the decision-making process, the wise decision maker supplements intuition with logical and rational decision methods.

Model building looks time-consuming, and it can be. Of course, the extent of an analysis is often subject to time and resource constraints. The extent of the analysis, however, should always be appropriate to the decision being made. Analysis only adds value when the analysis outcome may affect the decision, and the decision maker should consider the value of further analysis.
CHAPTER 10

PROBABILITY DISTRIBUTION TYPES

Probability is the language of uncertainty. Representing judgments about uncertainty is key to using stochastic (probabilistic) project models. Usually, the most qualified people available are asked to provide their opinions about values that go into the model. When the expert does not know a significant parameter with confidence, a probability distribution can completely express his judgment about uncertainty. A common anxiety among people learning decision analysis is choosing the type of probability distribution to represent an uncertainty. Here we examine the most popular distributions and situations where and why they apply.

Probability Distributions

Probability refers to a number, representing chance of occurrence. Most people are comfortable with this everyday idea. A phrase such as "a 40 percent (.4) chance of delivery delay" seldom presents confusion, so long as we are clear about what "delivery delay" means.

Often a risk is a binary event: a contingency event either happens or not. It will be either true or false (0 or 1). Spreadsheet IF statements are a prime example of using binary chance events.

In this chapter, I'll use mean (μ) instead of expected value (EV), though they are synonymous. We will use p to represent probability of success. In the example figures, the numbers in parentheses are the "arguments" of the distribution function. Some distributions also use the standard deviation (σ) statistic as an argument. These are typically the way we specify the respective distributions to a Monte Carlo simulation (simulation) computer tool.
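A binary risk event like the delivery delay above is easy to specify to a simulation tool. This sketch is illustrative only: the $200k base cost and $25k delay impact are invented assumptions, while the .4 probability is the chapter's example.

```python
import random

random.seed(1)

p_delay = 0.40   # the judged probability of delivery delay from the text

def trial_cost():
    """One simulation trial. The binary chance event is sampled as 0 or 1;
    the $200k base cost and $25k delay impact are invented assumptions."""
    delayed = 1 if random.random() < p_delay else 0   # binary distribution
    return 200.0 + delayed * 25.0                     # $thousands

costs = [trial_cost() for _ in range(20_000)]
expected = sum(costs) / len(costs)
print(f"EV cost: ${expected:.1f}k")   # close to 200 + .4 * 25 = $210k
```

The 0/1 outcome multiplied by an impact value is the simulation equivalent of a spreadsheet IF statement.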
Discrete Distributions

Does the chance event have two, or perhaps several, distinct possible outcomes? If so, we are talking about a discrete probability distribution. If the outcome value is along a continuum, we have instead a continuous distribution.

Probability distributions represent the range of possible values and the probabilities of values within the range. Each possible outcome is assigned a probability, and these probabilities must sum to 1. Simple systems often produce discrete distributions. Frequently, the potential outcomes are integers, such as "number of work interruptions." The parameter name often begins with "number of." Examples include number of equipment breakdowns, number of people available, number of units needed, and so forth.

For more complex situations than the binary risk event, we have other types of discrete probability distributions. Figure 10.1 and Figure 10.2 show the most popular discrete distributions. Though we don't have the distribution formulas printed here, because of space, most are simple enough. Software tools, even some hand calculators, are widely available with functions for the distributions described in this chapter. We will look at the distribution shapes that are most useful in project evaluation and in project management.

[Figure 10.1: Binomial Distribution, Binomial(10, .2). Integer outcomes 0 through 6.]

Binomial Distribution

A collection of independent events, each with the same probability of success, would exhibit a binomial distribution. For example, suppose that we have ten independent research projects, and we judge each as having a p =
.2 chance of success. Figure 10.1 shows a binomial distribution with n = 10 trials and p = .2 probability of success on any trial. Though elements in actual systems may not be fully independent or have the same probability of success, the binomial distribution is often a reasonable approximation for a simple portfolio model. This distribution is often a starting point for representing simple portfolios, such as research projects.

Poisson Distribution

Sometimes events happen randomly, such as defects and work interruptions. The number of such events in a time or other interval is described by a Poisson distribution. The only parameter needed to specify this distribution is the average (μ) number of events occurring during an interval. This distribution is widely used in simulating queuing (waiting-line) systems, such as representing the number of work requests during a time period. Figure 10.2 shows a Poisson distribution with μ = 2 occurrences.

[Figure 10.2: Poisson Distribution, Poisson(2). Integer outcomes 0 through 7.]

Three-Level Estimates

In decision tree analysis, we can use only discrete distributions. Thus, we must convert continuous chance events into discrete approximations. Typically, three branches are adequate: high, medium, and low case. These are called "three-level estimates." We should choose these three values and their probability weights so that the original continuous distribution's statistics are preserved. More than three outcomes can be used, if you want more resolution.

Suppose that your expert judges the uncertainty in volume of a natural gas reservoir as a lognormal distribution with μ = 120 and σ = 50 billion
cubic feet (Bcf). Figure 10.3 illustrates this judgment.

[Figure 10.3: Continuous Distribution, as Judged. Probability density versus Recoverable Gas, Bcf (0 to 400).]

For a decision tree analysis, I discretized the distribution into the form in Figure 10.4. With simulation, we would use the continuous distribution directly.

Continuous Distributions

In project management and other types of planning and analysis, most uncertainties will be continuous. Cost, time, and quality metrics are prime examples. Figures 10.5 through 10.10 show the most popular continuous distribution types.

You should remember that the y-axis is scaled such that the total area under the curve, representing probability, equals 1. The probability of an x-axis value is proportional to the height of the curve above that point. Normally a scale is omitted from the y-axis. It is possible to show a scale, though the units are unnecessary.

Uniform Distribution

The simplest continuous distribution is the uniform. This is often called a rectangle or boxcar distribution, because of its shape. If (a) one knows the
upper and lower bounds of the range of possible values, and (b) if every value within that range is equally probable, we are describing a uniform distribution. Figure 10.5 illustrates a uniform distribution bounded at 0 and 2 units.

[Figure 10.4: Three-Level Approximation. Discrete outcomes 182.8, 108.0, and 65.2 Bcf.]

[Figure 10.5: Uniform Distribution, Uniform(0, 2).]

However, it is a rare system that generates values that are uniformly distributed. Many people incorrectly assign this distribution shape to uncertainties. More often, values in a central part of the range are more probable. Examples of uniform distributions include: position of a rotating shaft that stops randomly, position at which a cable breaks, and a small section of a broad distribution. The shape of a particular distribution is usually less important than its position (μ) and width (which we might characterize with σ).

Triangle Distribution

The triangle distribution is very popular. Simplicity is the appeal. Just three values fully describe the triangle distribution: minimum (L), most likely (M, mode), and maximum (H). Even people without training in probability can relate to these points, though I've never seen a system in business or in nature generate values with a triangular shape. Figure 10.6 shows an example.

Let me offer two cautions when using the triangle distribution:
■ People tend to judge distributions too narrowly. The low and high parameters are the actual extremes, and not just pessimistic and optimistic cases.
■ Many people will confuse "most likely" (the mode) with the best single-point estimate. It is not, unless the triangle happens to be symmetric. The best single-point estimate is the mean (μ, EV), and this happens to have a simple formula: μ = (L + M + H) / 3. Drawing the distribution will help eliminate the confusion.

[Figure 10.6: Triangle Distribution, Triangle(1, 2, 4).]

Normal Distribution

The most commonly observed distribution is the normal. The normal distribution occurs often in nature and in business. Galileo observed the familiar bell-shaped distribution when measuring the positions of stars and planets.

When we sum many independent, continuous chance events together, the total tends to be normally distributed. Amazingly, this is true regardless of the shape of the component distributions. The central limit theorem, one of the fundamental principles in statistics, describes this key behavior. Simple project models often assume that activities are independent. Then, if the activities have similar duration, the distribution for time to complete a chain of activities is approximately normal.

Figure 10.7 shows a normal distribution with a μ = 2 and a σ = 1 unit. The vertical dashed lines in Figure 10.7 show σ intervals. In theory (and equation), the normal distribution is boundless. As a practical matter, for planning and evaluation work, there is virtually no chance (0.3 percent) of being outside the range of μ ± 3σ.

Most every statistics book includes a table of the standard normal distribution (μ = 0, σ = 1). From a table or the function, one can determine confidence intervals. You can obtain the same values from the NORMDIST function in Excel. For example, an interval μ ± σ represents a 68 percent confidence interval for a normal distribution.
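The central limit behavior is easy to see by simulation. In this sketch (a hypothetical chain of ten activities, each triangular, with invented parameters), the total duration comes out close to normal, and about 68 percent of trials fall within one standard deviation of the mean:

```python
import random

random.seed(3)

def chain_duration():
    """Total duration of ten independent activities in series; each
    activity is Triangular(low=4, high=9, mode=6) days -- an invented
    example, not from the book."""
    return sum(random.triangular(4, 9, 6) for _ in range(10))

trials = [chain_duration() for _ in range(20_000)]
n = len(trials)
mu = sum(trials) / n
sigma = (sum((x - mu) ** 2 for x in trials) / n) ** 0.5

# Central limit theorem at work: although each activity is triangular,
# the total is approximately normal, so roughly 68 percent of trials
# should fall within mu +/- sigma.
share = sum(1 for x in trials if mu - sigma <= x <= mu + sigma) / n
print(f"mu = {mu:.1f} days, sigma = {sigma:.2f}, within one sigma: {share:.2f}")
```

The printed share lands near the normal distribution's .68, even though no normal distribution was sampled anywhere in the model.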
[Figure 10.7: Normal Distribution, Normal(2, 1).]

Lognormal Distribution

We often observe data where the frequency distribution is positively skewed, that is, having a longer tail to the right. When two or more distributions are multiplied, such as time x hourly cost, the product is typically positively skewed. As we multiply more independent components together, the product becomes more lognormal in shape. This, too, is a result of the central limit theorem. Some older readers may remember adding logarithms on slide rules to multiply. The log of values obtained from a lognormal population plot as a normal distribution. This is a common distribution shape for uncertainties that are positively skewed.

Figure 10.8 shows an example lognormal distribution, with μ = 2 and σ = 1. Both the normal and lognormal distributions are completely described by μ and σ. Some computer tools accept alternate statistics to specify the distribution, such as the 10 percent and 90 percent confidence points.

Special "probability graph paper" is available for plotting the cumulative frequency distribution. This has one axis linear in standard deviations of a normal distribution. If the cumulative frequency of the data plots as a line on a linear value scale, then the data are normally distributed. If the data plot as a line on a log value scale, then the data are lognormally distributed.

Exponential Distribution

The exponential distribution is a counterpart to the discrete Poisson distribution. The exponential distribution is most commonly used to represent time between arrivals of random events. Figure 10.9 shows an exponential distribution with μ = 2 units:
[Figure 10.8: Lognormal Distribution, Lognormal(2, 1).]

[Figure 10.9: Exponential Distribution, Exponential(2).]

Like the Poisson, we use the exponential distribution in discrete-event simulations, such as of queuing systems. Common uses include representing the time until the next arrival of a service request or unscheduled work interruption.

Beta Distribution

The beta distribution is a mathematical function that assumes a variety of profiles, depending upon two shape parameters. With the shape factors, we can make many different symmetric and asymmetric shapes. In the distribution illustrated in Figure 10.10, the shape parameters are 2 and 5, and 3 is a scaling factor. The basic distribution is bounded by 0 and 1. We can reposition and rescale the distribution as needed. Project management professionals have used beta distributions since the 1950s to represent activity times-to-complete in PERT project models.
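Repositioning and rescaling is a one-line transform. This sketch stretches the basic Beta(2, 5) variate from [0, 1] onto a task's duration range; the shapes match the Figure 10.10 example, while the 10-to-40-day range is an invented assumption.

```python
import random

random.seed(5)

def beta_duration(low, high, a=2.0, b=5.0):
    """Sample a value by repositioning and rescaling the basic Beta(a, b)
    variate from [0, 1] onto [low, high]. Shapes 2 and 5 match the
    chapter's Figure 10.10; the 10-to-40-day task range is invented."""
    return low + (high - low) * random.betavariate(a, b)

samples = [beta_duration(10, 40) for _ in range(10_000)]
mean = sum(samples) / len(samples)
# Beta(2, 5) has mean 2 / (2 + 5), so the rescaled mean should be near
# 10 + 30 * 2 / 7 = 18.6 days.
print(f"mean duration: {mean:.1f} days")
```

The positive skew of Beta(2, 5) gives the long right tail typical of activity durations, which is one reason the shape has persisted in PERT-style models.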
[Figure 10.10: Beta Distribution, Beta(2, 5), scaled to the interval 0 to 3.]

Which Distribution Is Best?

If you are the expert, the best distribution is the one that completely expresses your belief about the uncertainty. A probability distribution completely describes a judgment about a single variable. Be aware that mathematicians have described hundreds of distributions; the distributions listed earlier usually suffice. Fortunately, the popular software tools allow users to specify custom distribution shapes, and your system need not produce a distribution with a recognizable mathematical function.

Often we have data that suggest that a system is producing a common distribution shape. Without data, perhaps the best way to obtain a distribution is to model the subsystem that causes the uncertainty. If the expert can explain how the process works, we can model it and use the model to generate a distribution shape. The amount of effort in detailed modeling should be consistent with the importance of the variable to the decision at hand.

Usually our best available experts provide judgments that go into the project models. Delegating responsibility for these inputs is a responsibility of the decision maker. First, we will discuss the source of these probability distributions in Chapter 11, Judgments and Biases. There is an additional dimension of expert opinion: Are there influencing relationships among the variables? That is the topic of Chapter 12.
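The three-level estimate described earlier can be generated directly from a judged distribution. In this sketch, the chapter's gas-reservoir judgment (lognormal with mean 120 and standard deviation 50 Bcf) is simulated and discretized using P10/P50/P90 values with 30/40/30 weights; that weighting is a common rule of thumb and an assumption here, not necessarily the book's method.

```python
import math
import random

random.seed(9)

# The chapter's judgment: reservoir volume ~ lognormal, mean 120, sd 50 Bcf.
mean, sd = 120.0, 50.0
sigma = math.sqrt(math.log(1 + (sd / mean) ** 2))  # sigma of underlying normal
mu = math.log(mean) - sigma ** 2 / 2               # mu of underlying normal

samples = sorted(random.lognormvariate(mu, sigma) for _ in range(100_000))
pct = lambda q: samples[int(q * len(samples))]

# Discretize with P10/P50/P90 and 30/40/30 weights -- a common rule of
# thumb, assumed here for illustration (not necessarily the book's rule).
low, med, high = pct(0.10), pct(0.50), pct(0.90)
ev = 0.3 * low + 0.4 * med + 0.3 * high
print(f"P10 = {low:.1f}, P50 = {med:.1f}, P90 = {high:.1f}, EV = {ev:.1f} Bcf")
```

With this weighting, the discretized EV stays close to the judged mean of 120 Bcf, and the bracketing branch values land near the 65.2 and 182.8 Bcf outcomes shown in Figure 10.4.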
CHAPTER 11

JUDGMENTS AND BIASES

Psychology is often a part of decision science: many investigators are interested in better understanding how people do make decisions. Our discussion here is about how people should make decisions. We would like to avoid biases and cognitive illusions.

Previous chapters describe forecast and evaluation models: models that generate scenario outcomes and forecasts for each alternative. Decision analysis is a mostly quantitative discipline. This chapter focuses on the expert judgments that are inputs to these models.

Three Roles

There are three roles in the decision process, as shown in Figure 11.1:
■ Experts or assessors provide the judgments that go into an evaluation. Usually, the most knowledgeable people available provide these inputs.
■ Evaluation analysts are primarily responsible for developing the quantitative project models.
■ Decision makers review the forecasts and judge whether the analysis is credible. They make the selection (usually accepting the team's recommendation) and initiate implementation.

Overlapping duties are common, as people "wear different hats."

Judgments

A judgment is a value assessment performed by a person. There are two basic approaches to judgments, or assessments, as inputs to an evaluation:

Deterministic Assessments (non-probabilistic):
■ Single-point values (e.g., a particular cost)
■ Single-line forecasts (e.g., inflation rate over time)
Probabilistic Assessments (stochastic):
■ Continuous probability distributions (e.g., hours to complete an activity)
■ Discrete probability distributions (e.g., number of subcontractors bidding)
■ Event probabilities, that is, whether or not an event happens (e.g., whether a task will require rework). This binary event type is the simplest of discrete distributions.

[Figure 11.1: Roles in Project Analysis. Assessors (engineers, scientists, economists, and other experts) feed the Evaluation, which informs the Decision.]

A conventional deterministic analysis uses only input variable forecasts. Even in decision analysis, this is adequate to represent minor variables or those known with high confidence. When a value is uncertain, the representation takes the form of a probability distribution. Considering a single variable alone, a probability distribution can completely represent an expert's judgment about the possible outcomes of a chance event. Risk analysis is the process of assessing the probability distribution for one or perhaps several variables. There may be several risk analyses as part of a decision analysis. Decision analysis is more encompassing, and is about evaluating alternatives in the context of decision policy.

Objective and Subjective Probability

Because a human performs a judgment, the resulting probability distribution is called a subjective probability. Subjective probability assessments, as inferred by their name, reflect someone's belief. The opinion is personal, despite often incomplete information. Two people, given the same information, will often arrive at somewhat different assessments.

The extreme opposite is objective probability. This is supported by a comprehensive understanding of the system or by conclusive evidence: either knowing the value and likelihood of every possible element in the parent population, or completely understanding the stochastic process providing variable outcome values. Examples would be probabilities associated with cards, dice, and other elements of chance, where we have essentially complete knowledge of the system.

[Figure 11.2: Sources of Probabilities. A continuum from 100% Objective Probability (perfect understanding of the process giving rise to the probability distribution, or complete data from the parent population) to 100% Subjective Probability (intuition, or guessing without any evidence or support).]

Figure 11.2 shows the continuum between the two extremes. Fully objective assessments are rare: with sufficiently complete data or system knowledge, we could approach a 100 percent objective probability assessment, which is rare. Most judgments lie somewhere in the middle, far from either end. Judgments always involve a degree of subjectivity on the part of the expert. Even with considerable data from similar situations, the expert must assess to what degree the data still apply.

When someone judges the likelihood of an event, the probability reflects the expert's degree of belief that the event will occur. The probability is not an outcome prediction. For a binary risk event, the outcome will be 0 or 1 (false or true). Here are two common and illogical statements:
■ "I judge that the probability is 30 percent, but I may be wrong." There is no such thing as an incorrect probability, or it would always be wrong. Rather, the assessment should be a single number.
■ "The probability is 30 to 40 percent." A range is nonsense (unless predicting where the final judgment will lie). The expert should settle on a single fraction.

It is important that judgments be as objective as possible. Being "objective" as to source (see Figure 11.2) reduces the effect of emotion. Possibly confusing is also using "objective" when referring to judgment quality as meaning free from bias. Biases (i.e., not objective as to quality) are more likely when the source of the judgment is subjective. For this reason, we often hear a question like, "How objective is that assessment?"

In most evaluations, you will find several key variable assessments that are highly subjective. Nevertheless, subjective probabilities are often all that are available. In such cases, appropriate use of even soft judgments about important variables leads to improved decisions. The decision model provides a logical way to use everything important that we know about the problem. We use decision analysis in believing that there is a correlation between good decisions and good outcomes. A good decision is consistent with the data and with the organization's decision policy. Consistently making good decisions increases the likelihood of favorable results over time, but does not guarantee a good outcome in a particular case.

Eliciting Judgments

Eliciting means to bring out or draw out the judgment of an expert. Unless the expert has been well trained in probability concepts, eliciting a judgment is usually best accomplished through an interview process. The interviewer, often a decision analyst, acts as an interested but objective party. In the interview, we ask the expert to consider the possibilities:
■ Is this the right variable? Have we clearly defined what it means? Is this the right level of detail?
■ What is assumed in this assessment?
■ What is the worst (or best) possible outcome? What has to go right for this to occur? What might go wrong?

In formulating an opinion, it is important to first determine the range of the distribution. Then work inward. Bracketing the possibilities and working inward avoids anchoring on an initial central value. Do not fall into the trap of starting with the best value and attempting to build a distribution around it. If you do, you will likely fall into the trap of the anchoring bias (discussed later in this chapter). Different perspectives of the variable are discussed and quantified until the expert is confident that the resulting probability distribution accurately reflects her best judgment.

Some experts are uncomfortable with quantifying their judgments directly as probability distributions. In such cases, analogs may help diffuse the probability anxiety. One means is to use a probability wheel, such as that illustrated in Figure 11.3, which uses pie-shaped sections to represent the possible outcomes. The expert adjusts the size of the pie sections until she feels that the areas are proportional to the likelihood of the respective outcomes. A scale visible only to the interviewer shows the corresponding probabilities. A similar approach would be to use a device where linear distances represent probability weights.

[Figure 11.3: Counteroffer Probability Wheel. Pie sections for outcomes such as Win Contract and Refused.]

A classic analogy to uncertainty is an urn or other container of colored balls or marbles. An interviewer chooses colors and number of balls to represent different possible outcomes of a chance event. For example, a project manager might be considering which component to use in a critical application. Component X is reliable but expensive. Component Y has a "high" probability of working in the product application and costs one-third as much. If the company implements Component Y and it fails, then the losses will bankrupt the company. Assume white marbles represent Component Y's "success," and black marbles represent "failure." The interviewer adjusts the mixture until the manager is indifferent between:
■ Choosing the safe but expensive Component X
■ Randomly drawing a marble from the urn, and realizing the project outcome corresponding to the marble color.

If the fraction of black marbles is less than the actual risk of Component Y failure in the application, then choose Component Y.

Overconfidence about Our Knowledge

Actual forecasting behavior and clinical experiments repeatedly show that people are overconfident about their knowledge. Consider persons asked to provide, say, an 80 percent confidence interval for a parameter. (An 80 percent confidence interval is defined by two bounds, such that there is only a 10 percent chance of the outcome being lower than the lower bound, and a 10 percent chance of being higher than the upper bound.) It makes sense that the less someone knows about a variable, the wider her confidence interval should be. Typically, only about half of survey participants have the true outcomes contained in their judged 80 percent confidence intervals. That means the typical person is responding with a 50
objectivity is synonymous with lack of bias. If artificial situations are necessary for practice." survey participants continue to make their confidence intervals too small. requires attention in three areas of the evaluation: Biasfree assessments of input variables A value function. which faithfully measures progress or goodness toward the organization's objective Integrity in the decision model structure. If we elicit the judgment by using an interviewer. Then. first ask for a 90 percent. The statistic of most interest is the EV. . Biases One hallmark of a credible evaluation is objectivity (discussed in Chapter 1). Other interesting results from such surveys (see Capen 1976) include: People without knowledge of a topic are often unable to differentiate between 30 percent and 98 percent confidence intervals. The solution to this overconfidence bias is practice and feedback. such as by putting some real money at stake. then a 50 percent. they can discuss reasons why more extreme values might occur. For example.Chapter 11Judgments and Biases percent confidence interval. but not the whole range. the larger the confidence interval that they assign (they more fully appreciate the possibilities). Even when told beforehand that "most persons are overconfident about their knowledge. This is not objectivity used in the sense of what was the source of the probability assessment (data or intuition). the less chance that their interval includes the true value. embodying the decision policy.The more a typical person knows about the topic. when the request was for an 80 percent confidence interval. Achieving objectivity. The average error (actual minus estimate) should tend toward zero over many forecasts. Asking the person to judge two confidence intervals will help. In the discussion that follows. This represents a confidence interval. The more people know about the general subject (not the specific question). 
Asking for a 95 percent interval typically gets about a 65 percent confidence range. it helps to make the exercise as meaningful as possible. A bias is a repeated or systematic distortion of a statistic. so that calculations themselves do not introduce any biases or other errors. he will often first ask the expert for a plausible range. confidence interval. Expert assessors should practice and be given feedback on the quality (outcomes) of their assessments. or low bias. being too high or too low on average.
Some judgment biases arise from emotions and motivations; others are from relying on ill-founded heuristics or rules of thumb. There are also perception biases, similar to optical illusions, which arise because our mental model may not be consistent with reality.

Common Biases in Evaluation or Decision Policy

Decision policy has three elements, any of which can introduce bias:

- Objective. An organization's management should look carefully at what objective is to be maximized in decision policy. An objective value measure is a statistic that corresponds directly and logically to the objective of the organization. Most businesses profess to maximize shareholder value, and so should be measuring monetary value. Different stakeholders have different objectives, and if the value or quality measure does not directly correspond to the organization's purpose, there will be a bias.
- Time value. Present value discounting adjusts cash flows to reflect the time value of money. Most companies use a discount rate that is too high for probability-adjusted forecasts, creating a bias against long-term projects.
- Risk policy. Many managers believe that they have a responsibility to act conservatively. They will willingly sacrifice a risk premium (see Chapter 4), in terms of reduced expected monetary value, to lower risk. This practice may be intentional and appropriate.

Belief and Perception Errors

Several obvious biases arise from emotions and motivational effects. Listed below are the more common biases that might be inadvertently introduced into an analysis:

- Believing what one wants to believe. The natural tendency is to shade judgments to reinforce what we want to do. A study is more likely to be commissioned if the decision maker expects it to support her present opinion. A contractor may have an excessive desire to win a bid in order to appear busy and successful. If members of a project evaluation team will be able to work on an enjoyable project, they may inadvertently deliver optimistic assessments. Projects (or bids) with unreasonably optimistic projections are often selected over projects (or bids) submitted with more reasonable projections.
- Biases toward new information. People often seek new information only when a "favorable" report is expected. "Bad news doesn't sell." Even when delivered, bad news is often discounted, faulted, or ignored.
- Availability. Some scenarios are easier to imagine than others. We remember cases that are more famous or where we were more closely involved, such as those in our industry or area of expertise. The easier it is to imagine or remember, the higher the believed likelihood of occurrence. Exceptions are remembered and weighted as if commonplace. The most recent data or experience is weighted most heavily. Discrepant and unusual events, even though rare, are given undue weight.
- Sensitivity to the cause of the problem. People are willing to do more to save a species endangered by hunting than one endangered by natural change. Life-threatening risks are more acceptable if the person has a choice about being exposed to the risk (e.g., drives a motorcycle without a helmet).

Additional biases arise because of poor understanding of statistics or probability theory:

- Insensitivity to sample size. People are prone to make hasty generalizations. Often there is insufficient data to draw a conclusion.
- Insensitivity to prior probabilities. Judgment bias occurs when the current situation is rare, neither common nor unique. It is then difficult to objectively consider the possibilities.
- Framing. Very different judgments can result from the way we ask (frame) questions. We are sensitive to the wording and context of outcomes; this is called the framing of an event description. Surveys about losing (saving) jobs or lives often show that the answers depend upon the form of the questions. Consider a major project that will earn a $20 million profit for the company if it succeeds, but will lose $2 million if it fails. We might frame the approval question either way, asking: "What minimum probability of success would you require to be willing to approve this project?" or "What maximum probability of failure would you accept and still be willing to approve this project?" Such questions often evoke a response different from the complement question. If any person were to answer both questions, the responses should add to 1. Often they do not.
- Anchoring. People do not like to change their minds. They "anchor" to a previous position, especially if the earlier position was publicly stated. Later, they are more likely to recognize information that reinforces the original judgment or decision. We also mentioned anchoring when judging an uncertainty: psychologists have demonstrated that people assessing a probability distribution should not try to build around an initial "best guess." If one has an initial impression of the best forecast, he often keeps this constantly in mind when building out the distribution; the expert inadvertently anchors to the original number. This effect is minimized if the assessment starts at the extremes of the distribution and works inward.

Improving Evaluations

Some of these assessments may be highly subjective. Nevertheless, consistent, objective evaluations are achievable. Here are some recommendations:

- Make a conscientious effort to best capture the experts' judgments in assessing input variables.
- Strive to avoid judgment biases in assessing input variables. Recognize that biases do exist. We can avoid or minimize some of these by carefully structuring the problem, and we can detect some errors by validation: data can be used to validate models, and sometimes we can develop models to validate outcome data.
- Quantify subjective assessments, and separate judgments from preferences. It is far better to use probabilities to deal with uncertainty.
- Use an appropriate value measure, and use the discount rate only to represent preference for time value of money. A common error, in effect, is to increase the present value discount rate to compensate for risk.
- Apply a consistent risk policy. Most often, in the final stage of analysis, the decision maker simply compares risk-versus-value (or cost) profiles of the different alternatives and chooses the best one (see Chapter 3 and Chapter 4 for a more consistent, quantitative approach). A conservative attitude is a bias, but it is often appropriate; some situations demand conservative decision making. Note, however, that if everyone is slightly conservative, the resulting analysis can be overwhelmingly conservative.
- In developing a valid model, it helps to decompose the project system into submodels and to model interrelationships between variables. A suitable stochastic method (e.g., Monte Carlo simulation) correctly propagates the probability distributions throughout the calculations.
- Provide performance feedback to the individual assessors, and use lessons learned to improve the evaluation process.
- Perhaps the most powerful way to improve decisions is postimplementation reviews, also called postevaluations or postaudits. These are followup analyses of decisions after the general results become known. It is important to preserve the original reasoning by documenting the analysis: there is a psychological bias toward rationalization, and many people have difficulty reconstructing their thought processes after learning the outcome.

Whether additional investigation is warranted can be evaluated as a value of information problem (see Chapter 6).

Summary

As professionals, most of us spend a great portion of our workdays processing or preparing information for others to act upon. This work is either evaluation itself or preparation for evaluation. Everyone upstream from the decision maker should be as objective as possible. Otherwise, the analysis is questionable and of little value. Employers and clients are better served by ensuring that evaluations are objective, a hallmark of professionalism.

A colleague once approached me, saying, "I could use the decision analysis methods if only I had good data." When do we ever have "good" inputs to planning evaluation models? Rarely! My response is always to point out that we are going to make the decision regardless of whether we have good data. A decision analysis approach lets us do the best we can, given what we know, even when some of those inputs are very subjective.

Chapter 12 continues our discussion about judgments, next focusing on relationships among variables.
CHAPTER 12

Relating Risks

In modeling projects, a common beginner's mistake is assuming that chance events are independent. Usually, a chance event (risk or uncertainty) is related to one or more other chance events. Decision variables can influence chance events also. When something starts to go wrong in a project, often many things begin to go wrong together. A delay in one activity delays a chain of subsequent activities. Encountering unexpected complexity or bad weather increases activity completion times. Activity Cost is typically associated with Activity Time to Complete. Positive feedback can quickly drive a system to spiral out of control. These are examples where we have correlations, or associations, between variables in the project model. In this chapter, we will discuss some of the causes of correlation and ways to represent this in project models.

Correlation

Association and correlation are synonyms. Either term refers to a relationship, often causal, between two or more variables, where at least one of the variables is a chance variable, that is, a risk or an uncertainty. Characterizing the association between events is a second dimension of assessing uncertainty. In stochastic modeling, if any two variables are related, we should express that relationship in the model; otherwise, the analysis results will be wrong.

Often, understanding the system is sufficient to identify whether one variable is dependent upon another. If data are available, the easiest way to determine whether two paired data variables are related is to plot and inspect a scatter diagram (also called an x-y plot or crossplot). If we see a pattern in the points, then there is correlation between the x and y variables. Alternatively, we can use a statistic, called the correlation coefficient, to measure the degree of linear association (correlation). Figure 12.1 demonstrates a relationship between Person-Hours to Complete Project and
e. and the correlation coefficientis about +0. in this case.= I  =I 9  I 9 I =GIm# =:. Standard PERT analysis (in Chapter 9). This is positive correlation.. Such simple models do not consider nearcritical paths that have a probabilistic influence on project completion time.. Further.*. (A negative correlation would be welcome. We see that an increase in Project Team Size is generally associated with an increase in PersonHours to Complete Project. I 1 I 0 0 0 0 3 s e c I I I = 9 == = =: .Im** I= :== I  . If variations in time and materials increase together.7 in this case. in value of information problems (Chapter 6). Most often.9 9 . the effect on the project is greater than the sum of the variances in time and materials alone.w a. This is not necessarily due to optimism by the project professionals. 1 Scatter Diagram Showing Correlation Project Team Size. Decision trees are especially well suited for representing conditional probabilities. correlation adds to uncertainty in a project. . whom we asked to judge activity completion times. For example. one or more chance variables are partially dependent on another event outcome. . n .Chapter 12Relating Risks I a.s 1 I I    2 e I I Project Team Size Figure 1 2 . for example. A wellknown deficiency in deterministic project models is optimistic completion times. v I I  . total project uncertainty is worse than if time and materials were independent. considers only activities on the deterministic critical path. the informationevent outcomes (symptoms) are correlated with an event of interest (affecting value). A convenient way to represent the relationship between the chance events is by using a joint probability table. nonindependent) probabilities.) 162 . for example.9  m I = :* . simple approaches usually assume independence between activity duration times.w 0 I a E I I I I = I a .L. = I # = m 9. Typically. because it would decrease overall uncertainty. 
Most analyses involve at least some conditional (i.* = I I m = .m = = . That is. I I* .
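The joint-table mechanics just mentioned are easy to exercise in code. The probabilities below are hypothetical placeholders, not the book's Table 12.1 values; the point is how marginal and conditional probabilities fall out of a joint table:

```python
# Hypothetical joint probabilities P(study result, market demand); they sum to 1.
joint = {
    ("Favorable", "High"): 0.24, ("Favorable", "Medium"): 0.10, ("Favorable", "Low"): 0.02,
    ("Inconclusive", "High"): 0.08, ("Inconclusive", "Medium"): 0.15, ("Inconclusive", "Low"): 0.07,
    ("Unfavorable", "High"): 0.03, ("Unfavorable", "Medium"): 0.10, ("Unfavorable", "Low"): 0.21,
}

# Marginal probability of each study result (sum over demand outcomes).
def marginal_result(result):
    return sum(p for (r, _), p in joint.items() if r == result)

# Conditional probability P(demand | study result): the revision used in decision trees.
def conditional(demand, result):
    return joint[(result, demand)] / marginal_result(result)

print(f"P(Favorable) = {marginal_result('Favorable'):.2f}")
print(f"P(High | Favorable) = {conditional('High', 'Favorable'):.3f}")
```

Because the symptom and the event of interest are correlated, the conditional probabilities differ from the marginals; with independence, they would match.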
Sources of Correlation

The following paragraphs describe situations where correlation arises.

Direct cause-and-effect relationship. Often one variable is partially dependent upon one or more other variables. For example, Vehicle Cost is determined, in part, by Distance Driven, weight of the vehicle, and average load.

Common drivers. Two variables may share a common influence. The number of people on a project team obviously affects salary expenses. Having more people also means more communication channels, which leads to higher costs in coordination efforts. Thus, salary and coordination costs are correlated. Other correlation examples include:
- Low unemployment level, the driver, affecting recruiting lead-times and salary levels
- Inflation and wage rates, affecting many activities across the board
- System or problem complexity, which may be unknown at the project start
- Estimation biases
- Changes in project scope.

Bottlenecks. Shared resources, materials, and other system constraints can cause correlations. Higher demand for shared resources and common materials usually results in longer lead-times. Similarly, competition for exclusive use of a resource results in the customary inverse relationship between price and demand.

In value of information problems (options), we have one or more information chance event(s) (symptoms). One or more chance events of interest affects the symptoms we observe. For example, the underlying demand for our product (perhaps we want to forecast next year) affects the results we obtain with a market survey this year.

Ways to Represent Correlation

We should balance the effort in representing correlation with its importance in the model and decision situation. An influence diagram (discussed in Chapter 9) is an excellent way to map relationships between elements in a system. In this method, we connect decision, chance, calculation, and payoff variables with "influence" arcs showing the direction of cause-and-effect relationships and calculation-precedence relationships. In the subsections that follow, I will describe several correlation modeling methods.
Probability Tables

In Chapter 6, we use joint probability tables to represent probability relationships between two events. There is correlation between the Market Study Result and Market Demand, such as is shown in Table 12.1. Product demand, in turn, drives our project's profits, and it also affects the results we obtain with a market survey; otherwise, the survey information would be worthless.

Table 12.1 Joint Probability Table

Fortunately for our calculations, most often we need deal with only one pair of chance events at a time. With simulation, we have additional techniques for representing correlation. Asset and project models are more likely to use Monte Carlo simulation, which we learn about later, as the stochastic calculation process. This is the approach for the remaining three subsections.

Sampling Actual Data

A data set consists of values sampled together. For example, for different drugs introduced to the marketplace, we might have data of unit volumes at monthly intervals after introduction. If we have a collection of representative data sets, we can use it for sampling by either:
- Randomly selecting a data set, with or without replacement, and using the data-set values as trial values in the simulation. This has an advantage of not needing analysis in order to understand the relationships. (Of course, without analyzing the data, we might miss an opportunity to gain important insights.)
- Analyzing the data, and finding a way to represent the relationships mathematically.

Fitting to Data

As seen in Figure 12.1, a scatter diagram shows a pattern when there is correlation. Figure 12.2 shows another data collection where a clear pattern exists, and the relationship appears approximately linear. Direct Labor Hours explains most of the Total Task Cost. Of course, workforce composition, materials, and other variables also affect total cost.
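The data-set sampling approach described above is worth a quick sketch: because each simulation trial draws a whole observation, the relationships among the variables are preserved without any modeling. The paired observations here are hypothetical:

```python
import random

# Hypothetical paired observations (direct labor hours, total task cost $k),
# each pair recorded together from a past task.
data_sets = [(520, 88), (760, 114), (980, 150), (1230, 171), (1510, 205), (1880, 248)]

random.seed(1)

# One simulation trial: draw an entire data set (with replacement), so the
# hours/cost association in the data carries into the model automatically.
def trial():
    return random.choice(data_sets)

samples = [trial() for _ in range(1000)]
mean_cost = sum(cost for _, cost in samples) / len(samples)
```

The alternative, analyzing the data and representing the relationship mathematically, is what the regression discussion that follows addresses.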
Figure 12.2 XY Graph (Total Task Cost, $k, plotted against Direct Labor Hours)

We could "fit" a dependent variable to one or more "driver" variables using regression. Regression is a statistical technique that finds the best fit of equation coefficients to one or more independent data variables. The simplest method, and probably well suited to the data included in Figure 12.2, is a "linear regression" to the formula:

Total Task Cost = a + b x Direct Labor Hours

where a computer routine chooses coefficients a and b to best match a line through the data. Most popular is the "least squares" method, where the best set of equation parameters provides the lowest sum of the errors squared. Many handheld calculators now provide linear regression functions. Fitting a function to multiple variables and curve-fitting to a more complex equation (e.g., a polynomial equation) are extensions of the same idea. For truly difficult problems, neural networks (discussed in Chapter 17) may be a suitable "massive regression" approach.

To make synthetic data generated from a fitted equation more realistic, we might add a noise function (such as a normal distribution about the best-fit line). Noise in real and synthetic data represents other variables and effects not recognized in the model.
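For a single driver variable, the least-squares fit has a simple closed form, so no library is needed. A sketch with hypothetical hours/cost observations:

```python
# Closed-form simple linear regression: minimizes the sum of squared errors.
def least_squares(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope b = cov(x, y) / var(x); intercept a = mean_y - b * mean_x.
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Hypothetical (Direct Labor Hours, Total Task Cost $k) observations.
hours = [500, 800, 1000, 1300, 1600, 2000, 2400]
cost = [95, 120, 140, 168, 195, 235, 270]

a, b = least_squares(hours, cost)
print(f"Total Task Cost = {a:.1f} + {b:.3f} x Direct Labor Hours")
```

In a simulation, each trial would evaluate this equation at a sampled driver value, optionally adding a noise term around the fitted line as described above.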
Correlation Coefficient

If an increase in one variable is generally, though not always, associated with an increase in another, they are said to have a positive correlation. The opposite effect is negative correlation. A statistic, called a correlation coefficient, ρ, measures the degree of linear association between paired data values. Unrelated variables have ρ ≈ 0. Perfect positive correlation has ρ = +1, and perfect negative (anti-) correlation has ρ = -1; in these extreme cases, one variable is simply a deterministic function of the other variable. The two strongly related variables shown in Figure 12.2 have ρ = 0.85. The formula for the (Pearson moment-product) correlation coefficient is in Appendix 1A. Microsoft Excel's CORREL function is a convenient way to obtain the correlation coefficient statistic.

Correlation coefficients are a quick and easy way to represent correlation in a model; a correlation coefficient for any pair of variables is perhaps the simplest way of expressing correlation. The popular @RISK and Crystal Ball tools for Monte Carlo simulation with Excel spreadsheets both provide convenient ways to relate chance events with correlation coefficients. If there are many relationship pairs, then a correlation matrix is used. This is a matrix of correlation coefficients of all variables involved.

There is a trick to the way popular simulation programs use correlation coefficients. Instead of using correlation calculated with actual data values, @RISK and Crystal Ball use rank correlation. That is, the correlation coefficient refers to the rank order of the variables, rather than to the actual values. If you want to calculate the rank correlation coefficient, you will first need to convert the values to their ranks. For example, if you have 100 data values for a variable, sort the data and then replace the values with the rank numbers 1 to 100. Do this for the other variable as well. Then calculate the correlation coefficient using the ranks of the variables. I understand that the rank correlation mathematics works better than value correlation when the distributions have different shapes. The coefficient approach presumes that there is a linear relationship between the variables, which is another deficiency of the method. The next technique is the best if circumstances warrant the extra modeling effort.
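Before moving on, the rank-replacement recipe just described can be checked in a few lines. The data are illustrative; note how a curved but strictly increasing relationship gives a rank correlation of exactly 1 while the value (Pearson) coefficient falls short:

```python
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def ranks(values):
    # Replace each value with its rank (1 = smallest), as described above.
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def rank_correlation(xs, ys):
    return pearson(ranks(xs), ranks(ys))

x = [1, 2, 3, 4, 5, 6]
y = [v ** 3 for v in x]            # monotonic but nonlinear relationship

value_rho = pearson(x, y)          # below 1: a straight line fits imperfectly
rank_rho = rank_correlation(x, y)  # exactly 1: the rank orders match perfectly
```

This is why rank correlation behaves better when the paired distributions have different shapes: it measures monotonic association rather than straight-line fit. (This simple ranking sketch does not handle tied values.)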
Detailed Modeling

The superior modeling approach, if doing the work is justified, is to model the system in detail. Correlation is a characteristic of many systems when there are cause-and-effect relationships and when there are common "driver" variables. We represent the functioning of the system, as best understood. My motto is, "If you can describe it, we can model it." Again, an influence diagram is a good way to map the relationships. How are these variables related? What is the calculation hierarchy? Can we further decompose these elements? If you can explain why, or describe how, the project and its environment work or behave, then we can express that knowledge in a model.
Human Factors

Analysis is always easier with good data. Fully objective probabilities are possible when we completely understand the system, such as with most games of chance. Unfortunately, biases creep into our judgments about uncertainty. Further, human behavior affects the outcome of a process in progress. People often spoil the plan. The following two subsections describe examples where human behavior affects chance events.

Competence

Often, we estimate cost or time to complete without sufficient consideration as to who will be doing the work. In project planning, we may detail the work to be done and then apply metrics and various methods in estimating time and cost. This is straightforward planning and cost estimation. Who does the work, however, often is as important as how the work gets done. With programmers, for example, individual productivities can range by a factor of ten or more. Similarly, a project manager can have a dramatic influence on the project team's productivity, as illustrated in Figure 12.3. Factor the people into the project plan and forecasts.

Figure 12.3 Uncertainty versus Competency (PV Project Cost $ distributions broaden and shift for ineffective and incompetent performers)

Response to Progress

As I read Eliyahu Goldratt's book, Critical Chain (1997), I was struck by his clear explanation of human behaviors in project performance. Despite being a novel, the book vividly explains why judgments about uncertainty so often are biased. You will probably recognize the following behaviors.
Wasting slack times. This is largely procrastination, what Goldratt calls the "student syndrome" in Critical Chain. People like to work toward objectives, and just knowing a time-to-complete target provides an achievement incentive; this is why "stretch" budgets are so popular. Beyond a point, however, some people will give up, and less-driven workers may be inclined to coast a bit. Early finishes are not reported, while late finishes push back successor activities.

Goldplating (embellishment). If you are ahead of schedule and cost budget, why not add more features or make it a better work product? This is especially true in software development.

Catch-up. If behind schedule and budget, most conscientious people are inclined to work harder.

Multitasking. Our original intent may be to work systematically on tasks in a logical sequence, one at a time. Urgencies and waiting delays, however, often necessitate overlapping different tasks and different projects. The resulting inefficiency worsens matters, creating need for further multitasking. (This is positive feedback in the system, working to our detriment.)

Project progress reporting bolsters a feeling of control. Some companies have found that this is an illusion, and that reporting provides little impact on project success (perhaps because some organizations do not share the progress reports with the people actually doing the work).

Decomposition is a key method that project planners use for making better forecasts. Breaking down tasks into details reduces biases in estimation and the undesired behavior effects; the procrastination and embellishment behaviors, mentioned earlier, suggest why. Plus, planning the details leads to a more realistic work assessment. For example, we can segregate contingencies from the normal portion of an activity's work (and address them with project risk management, Chapter 8).

Correlation affects the forecast quality in a stochastic model. Garbage in produces garbage out. The next chapter describes what I think is the most important reason for people to use decision analysis: you get a better number!
CHAPTER 13

Stochastic Variance

One of the benefits of decision analysis is improved accuracy. Using expected value (EV) inputs in a project model does not always provide the EV result (however measured). Sometimes the difference is dramatic. This chapter explains stochastic variance: that portion of the deviation, actual minus estimate, attributable to the stochastic calculations. It discusses the causes of stochastic variance and how to reconcile a decision analysis to a conventional, deterministic analysis.

Base Case versus Stochastic Model

Deterministic refers to traditional project models with single-point inputs. Stochastic, or probabilistic, refers to project models where we represent uncertain inputs as probability distributions.

Base Case

Usually, a base case projection (the base project plan) is a projection made with a deterministic model. For a project, this might be a base plan: what happens if everything goes according to the input parameters. When planning your project, suppose you realized your base-case project model. Would this ensure that the project is completed in exactly the forecast time? Generally, no.

For example, suppose you had used a stochastic (probabilistic) model to forecast expected value (EV) Completion Time. What if, through extraordinary coincidence, your team completes each activity in the same time as the EV estimate for each activity? That is, the project executes exactly as modeled. Customarily, the best single-value assessments are the inputs to the base-case model, and those values are EVs if
the experts assess distributions for the input variables.

Would the deterministic model then give the project's EV outcome? Generally, no. The EV of the completion time distribution is often different than the base case, and that difference is the stochastic variance:

(Stochastic Variance) = (Base Case Outcome) - (EV of Outcome Distribution)

I chose this name in reference to variance analysis, the process of reconciling a forecast with actual results; cost accountants do variance analysis. The stochastic variance effect is certainly not new, but I have seldom seen any reference to this calculation phenomenon in the literature, so I thought stochastic variance would be a good name.

This calculation difference can sometimes be dramatic. In one analysis (see Appendix 13A), the venture required a $2.8 million investment. The base case present value (PV) was $1.2 million, yet the stochastic model's expected monetary value (EMV) was $4.6 million. Thus, the real value of the project was almost four times the value of the conventional analysis! This surprises many people who think that decision analysis is an embellishment mainly to characterize risk.

Stochastic Forecast

Of course, a project team should incorporate its best information in forecasting. Probability distributions (and correlation details) express complete judgments about uncertainty. Anything less rigorous may introduce a bias or calculation error into the forecast. If the experts assess distributions for the input variables, the outcome variables will be distributions. We would like the effects of probability distribution inputs to propagate through the model calculations, so we need to calculate with probabilities. The stochastic variance phenomenon arises with other variables as well, such as costs across time; in order to get an EV projection of a time-spread variable, we need to calculate with probabilities. For discrete variables, normally we use the outcome nearest the EV.

Two key points:
- The best single-point input values are their EVs. The deterministic project (or asset) model solution is the base case. This base case is but one scenario, though one that is especially useful as a reference for discussion.
- Deterministic and stochastic calculations provide different best estimates, and the base case is often an incorrect forecast. Thus, you should not use the base-case scenario for making decisions.
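A tiny numeric example shows how a base case computed from EV inputs can differ from the EV of the outcome. The numbers are mine, not the book's: the outcome function is nonlinear (a budget-overrun penalty), so evaluating it at the EV input is not the same as taking the EV of the evaluated outcomes.

```python
# Project cost is uncertain: three scenarios with probabilities (illustrative).
scenarios = [(0.3, 80), (0.4, 100), (0.3, 140)]   # (probability, cost $k)

# Outcome: a $25k penalty applies if cost exceeds the $110k budget.
def outcome(cost):
    return cost + (25 if cost > 110 else 0)

ev_cost = sum(p * c for p, c in scenarios)              # 106.0
base_case = outcome(ev_cost)                            # 106.0: no penalty at the EV input
ev_outcome = sum(p * outcome(c) for p, c in scenarios)  # 113.5: penalty weighted by probability
stochastic_variance = base_case - ev_outcome            # -7.5
```

The deterministic base case never "sees" the penalty because the EV input sits below the threshold, while the probability-weighted calculation charges the penalty in the high-cost scenario.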
Merge Bias

In Chapter 9, we see activity networks where often two or more predecessor activities must finish before a successor activity can begin. I once heard a project modeler say, "Nothing good ever happens at a merge point." Neglecting this "joining" or "merging" effect is one reason why many projects complete later than forecast. Regarding stochastic behavior, one source of stochastic variance is merge bias, illustrated in Figure 13.1. This merge bias goes unnoticed without using a stochastic project model.

Consider the following. Activities A, B, and C must be completed before a successor activity, D, can start. Assume that A, B, and C all begin together at time 0. There are uncertainties in these estimates: each activity has a probability distribution representing judgment for its duration. The judged activity-completion times are skewed, lognormal probability distributions, and have EV (same as mean, μ) completion times of 10, 11, and 9 days, respectively. The width or uncertainty of each distribution is characterized by its standard deviation, σ. In the figure's notation, the μ, or EV, is the first value in parentheses, and σ is the second value in the parentheses.

Figure 13.1 Completion Times and Earliest Start (duration distributions for A, B, and C, and the resulting distribution for D's start)

Thus, D starts when the latest of A, B, or C completes: Max(A,B,C), read "maximum of A, B, or C (completion dates)." This corresponds to the MAX function in Excel. In a conventional, deterministic project analysis, the latest completion time determines when D can begin. The EV start date for D, however, is often later than the latest EV completion date for A, B, or C.

Here are the calculation details for the example. With a conventional activity network model (CPM or PDM), Task D can start in 11 days (see Figure 13.1). This is the EV completion time for B, the latest of the best single-value estimates for A, B, and C. Task D's EV start date would be 11 days only if the completion times of A, B, and C were known precisely, or at least within narrow ranges. This would also be true if the distributions for A and C had no chance of exceeding 11 days. This is not the case shown in Figure 13.1: any of the preceding activities could delay D's start beyond 11 days.

A simple Monte Carlo simulation demonstrates the effect. With the assumed lognormal distributions, the EV start date for Task D turns out to be 13.05 days (excess calculation precision is retained to illustrate the concepts). The 2.05-day increase beyond the largest EV completion date for any prior activity is the merge bias. The deterministic solution had the successor activity starting in 11.0 days, while the EV start under uncertainty was 13.05 days, a 2.05-day (2.05/11 = 19 percent) difference. Thus, projects with uncertainty in activity completion times usually take longer than the base-case schedule. This 2.05-day merge bias, or stochastic variance, is the effect of uncertainty in the network project model. This is one way that stochastic variance can arise from the model structure.

However, using 11 days to schedule the D start is unwise, and the project manager probably should not use 13.05 (or rounded) days for planning Activity D's start either. The optimal project plan will likely use a different planned start for D, and this optimization process is the subject of Chapter 15.

Sometimes, the difference between forecast and actual can be substantial through no fault in project execution. When reconciling a forecast with the project's actual outcome, the variance analysis should separately recognize the component attributable to stochastic variance. Many people first become interested in risk and decision analysis as a way to characterize the uncertainty in estimates. An even greater benefit is more accurate estimates for decision making. Whenever the forecast or estimation is important, we should explicitly incorporate distributions in representing risks and uncertainties.
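The merge-bias simulation sketched above is easy to reproduce. The standard deviations below are my own illustrative choices (the book's exact spreads are not recoverable here), so the simulated EV start will not match 13.05 days exactly; the point is that it exceeds the 11-day deterministic answer.

```python
import math
import random

random.seed(42)

def lognormal_with(mean, sd):
    """Sample a lognormal with the given arithmetic mean and standard deviation."""
    sigma2 = math.log(1 + (sd / mean) ** 2)
    mu = math.log(mean) - sigma2 / 2
    return random.lognormvariate(mu, math.sqrt(sigma2))

# EV durations 10, 11, and 9 days (from the text); the sds are assumptions.
activities = [(10, 3), (11, 3), (9, 4)]

trials = 100_000
# D starts when the last predecessor finishes: Max(A, B, C).
starts = [max(lognormal_with(m, s) for m, s in activities) for _ in range(trials)]
ev_start = sum(starts) / trials

print(f"Deterministic start: 11.0 days; simulated EV start: {ev_start:.2f} days")
```

Tightening the assumed standard deviations shrinks the gap toward zero, which matches the text's observation that the 11-day answer would be right only if the durations were known within narrow ranges.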
A stochastic model of the project system helps with the answer.Chapter I S S t o c h a s t i c Variance With a conventional activity network model (CPM or PDM). This would also be true if the distributions 1 days. is the effect of uncertainty in the network project model. Sometimes.preceding activities could delay D's start beyond 1 1 days. This 2. 9 The project manager probably should not use 13. Many people first become interested in risk and decision analysis as a way to characterize the uncertainty in estimates. A simple Monte Carlo simulation demonstrates the effect.1).1. while the EV start under uncertainty was 13. This is the EV completion time for B.1 days difference.0 days. B.
shows one format for explaining what happened in reference to the calculated forecast. the EV projection is an objective forecast. e. a bonus when the team meets a performance target The merge bias. We started this chapter with the question: What if for every input variable: Actual outcome value = EV of the assessed probability distribution? Using these EV input values provides the base case deterministic analysis.Cost. Example Simulation Analysis and Variance Report Let's examine a simple model calculated with Monte Carlo simulation to show the projection and a variance analysis: Net Cash Flow = NCF = Volume x Price . caused by constraints on activity starts and finishes in the activity network structure Contingency plans and other subsequent decision points represented in the project model Correlation. This means that it is unbiased. Stochastic modeling better represents reality. When done properly. Bias can arise from an improper model structure: its formula relationships and constraints. The example variance analysis. The problem is that the base case answer is often different from the EV of the outcome variable's distribution. The difference is the stochastic variance.Forecast Outcome. Decision analysis provides a more accurate forecast for many problems. between variables. The forecast is unbiased if the input values and model calculations are unbiased. It also helps the decision maker better understand the business or proposed transaction. Forecasts Are Usually Wrong A stochastic model produces a more accurate forecast because it correctly recognizes uncertain variables and how they interact through the calculations. it is important that a postanalysis recognize that portion of the total variance attributed to the stochastic calculations. The stochastic variance can arise from common situations: Asymmetry and nonlinearity (perhaps the most common cause). or association.. 
Total Variance = Actual Outcome − Forecast Outcome

We can split this into two components:

Total Variance = Deterministic Variance + Stochastic Variance
In a decision analysis, we capture experts' complete judgments about uncertainties in the form of probability distributions. Assume that a company's experts provided the following assessments for the three random variables (Table 13.1, the spreadsheet model):

- Volume: Lognormal distribution, with μ = 10,000 units/mo. and σ = 2,000 units. Independent variable.
- Price: Normal distribution, with μ = $30/unit and σ = $5/unit. Correlated with Volume with a correlation coefficient ρ = −0.5.
- Cost: Triangle distribution, with low = $140,000, most likely = $180,000, and high = $280,000; this has μ = $200,000. Correlated with Volume with ρ = 0.95.

I developed this example with one of the popular spreadsheet add-in products (see Decision Analysis Software, Appendix B). The deterministic solution is simple. Using the EVs for each parameter, Table 13.1 shows that the base case forecast is:

NCF(deterministic) = 10,000 units × $30/unit − $200,000 = $100,000

This would be the forecast with the conventional, single-point calculation approach.

Figure 13.2 displays the simulation NCF distribution, and Table 13.2 shows the NCF distribution statistics. The Monte Carlo simulation model forecasts (EV) net cash flow at $95,090. Although the distribution in the figure appears approximately centered at about $100,000, the EV NCF is $95,090. The stochastic variance is $100,000 − $95,090 = $4,910, or about a 5 percent difference. Here the bias is significant, and often the stochastic variance is substantially greater in proportion to the mean forecast (as with the example in Appendix 13A).
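A rough sketch of this simulation in Python rather than a spreadsheet add-in. The normal-score (NORTA-style) transform below is one common way to induce the stated correlations; the Price-Cost correlation is not given in the table, so a compatible value is assumed, and the negative Volume-Price sign is the demand-curve reading that makes the simulated EV fall below the base case:

```python
import numpy as np
from math import erf, sqrt, log

rng = np.random.default_rng(42)
N = 100_000

# Correlations among (Volume, Price, Cost) normal scores.
# Volume-Price -0.5 and Volume-Cost 0.95 per the table; Price-Cost -0.475
# is an assumption chosen only to keep the matrix positive definite.
R = np.array([[1.0,   -0.5,   0.95],
              [-0.5,   1.0,  -0.475],
              [0.95,  -0.475, 1.0]])
Z = rng.standard_normal((N, 3)) @ np.linalg.cholesky(R).T

# Volume: lognormal with mean 10,000 and sd 2,000 units
s2 = log(1 + (2000 / 10000) ** 2)                  # log-variance
volume = np.exp(log(10000) - s2 / 2 + sqrt(s2) * Z[:, 0])

# Price: normal, mean $30/unit, sd $5/unit
price = 30 + 5 * Z[:, 1]

# Cost: triangular($140k, $180k, $280k) via inverse CDF of the normal score
u = 0.5 * (1 + np.vectorize(erf)(Z[:, 2] / sqrt(2)))
a, m, b = 140_000, 180_000, 280_000
fc = (m - a) / (b - a)
cost = np.where(u < fc,
                a + np.sqrt(u * (b - a) * (m - a)),
                b - np.sqrt((1 - u) * (b - a) * (b - m)))

ncf = volume * price - cost
base = 10000 * 30 - (a + m + b) / 3   # deterministic base case = $100,000
print(round(base), round(ncf.mean()))  # the simulated EV falls below the base case
```

The roughly $5,000 shortfall comes almost entirely from the negative Volume-Price covariance; the Cost correlation shifts the spread but not the mean, because Cost enters the formula additively.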
Figure 13.2 shows the distribution for NCF, and Figure 13.3 shows the cumulative distribution for the NCF projection. An example interpretation: there is about a 57 percent chance that NCF will be less than $100,000. The graph in Figure 13.3 conveys much more information to the decision maker than a single-value estimate would provide. As a decision maker, wouldn't you prefer the graph, with the EV annotated?

Summary

EVs provide the best single-point assessments for each continuous parameter in an analysis, and the EV of the outcome distribution provides the best single-point project forecast. What if, through extraordinary coincidence, the project realizes the EV of each input parameter? This is the base case, and the outcome would be the actual result. However, the outcome value is often different from the EV forecast. We should separately recognize the stochastic and deterministic variances in a post-analysis.

Table 13.3 is an example variance report for the NCF projection; this table illustrates one possible format. The joint variance in Table 13.3 results from simultaneous changes across multiple variables in the calculation. Cost accountants sometimes define certain item variances to include joint components; that is, they define component variances to include all parts of the joint variance(s). I recommend separating the individual and joint contributions.
Table 13.3, the Project/Budget Variance Report, illustrates the format. The original-projection section lists the EV assumptions for Volume, Price, and Cost; the NCF from EV parameters; less the stochastic variance; giving the NCF forecast (from simulation). The post-analysis section lists the actual parameter outcomes and each variable's contribution to variance, the joint variance, a deterministic variance check (NCF at actual parameter outcomes minus NCF at EV parameters), the stochastic variance, the total variance, and the actual NCF realized. Notes to the table:

- NCF = Volume × Price − Cost.
- Variance amounts in parentheses are unfavorable.
- μ and σ are symbols for the mean and standard deviation, respectively; x̄ and s are the sample mean and standard deviation of the simulation results. For distributions representing judgments about input variables, these are approximations for the true solution μ and σ.
- The symbols E(x), expected value, and mean (μ) are synonymous.
- Although the standard deviation of the outcome distribution is huge compared to the mean, the resulting standard error of the mean for 10,000 trials is only about $700. For illustration here and elsewhere, more digits are being shown than are significant.

The simulation's EV forecast differs from the base case, and the post-analysis separately recognizes this correction effect that arises with the more accurate stochastic calculations. This difference is the stochastic variance. A more precise and unbiased EV forecast is perhaps the most important benefit of a decision analysis approach. Post-analyses are perhaps the single
best way to improve our decisions.

Figure 13.3 Cumulative Frequency Distribution (NCF, $thousands)

With a solid decision analysis process, we should reward people for good decisions, not for being lucky with chance outcomes. We should judge the evaluation team members based on their creativity in generating ideas, and on the completeness and integrity of their analyses. In a post-analysis, it is important to recognize the individual sources of deviation from forecast. The variance analysis format described in this chapter supports a distinction between good decisions and good outcomes.
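The standard-error note above is easy to check. The NCF standard deviation used below is an illustrative assumption of the right order of magnitude for this example, not a figure taken from the table:

```python
import numpy as np

rng = np.random.default_rng(1)

# 10,000 simulated NCF trials ($); mean and sd are assumed for illustration
trials = rng.normal(95_000, 70_000, 10_000)

xbar = trials.mean()                # sample mean: the simulation's EV estimate
s = trials.std(ddof=1)              # sample standard deviation of outcomes
se = s / np.sqrt(len(trials))       # standard error of the simulated mean

print(round(xbar), round(s), round(se))
```

Even though individual outcomes scatter by tens of thousands of dollars, 10,000 trials pin the EV itself to within a few hundred dollars, which is why extra reported digits are not significant.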
APPENDIX 13A

Several years ago, Devereux Josephs, a fellow certified management consultant, and I were analyzing a new venture. This was a real-to-life example, part of a joint presentation to the Institute of Management Consultants. Dev was showing traditional DCF analysis. He developed a Lotus 1-2-3 spreadsheet model for a typical new product venture, net of capital recovery, income taxes, operating expenses, and cost of capital. His model showed:

Base Case Deterministic Model
- $2.8 million investment in plant and working capital
- Yielding $1.20 million net PV, discounting at 10 percent/year
- 14.7 percent internal rate of return (IRR)
- 9.5 years to payout (time to make cumulative NCF positive)
- $8.8 million total NCF = net profit

While not a thrilling value, the new venture appears economic. PV as a value measure is appropriate and widely accepted. This was the base case analysis: the traditional analysis shows that the project has a PV of $1.20 million.

Many of the model assumptions are uncertain, such as the rate of unit sales growth. I modified Dev's deterministic model to make it a stochastic model, as would be used in a decision analysis. I built the distributions around the original, single-point values so that these original values were the distribution EVs, and I also put in some correlation between key variables. I solved the model using Monte Carlo simulation. In effect, the calculation propagates the probability distributions through the model, and instead of a single PV, we obtain a PV distribution. Figure 13A.1 and Figure 13A.2 show two forms of the resulting PV distribution. The simulation outcomes ranged from −$2.5 million to over +$42 million. This range is useful information in characterizing the venture's uncertainty, though I suggest focusing on perhaps an 80 percent confidence interval (about −$2 to +$19 million) and not getting too excited about the distribution tails. From the cumulative distribution, we see that there is almost a 50 percent chance that the project will have a positive value.
I replaced the ten most important input variables in the model with probability distributions, making these parameters "fuzzy."
isn't it? The stochastic results reflect a $3.4 million correction to the base case value. This adjustment is the stochastic variance.

Figure 13A.1 Frequency Distribution for PV (NPV, $millions; Mean = EMV = 4.584)

Figure 13A.2 Cumulative Frequency Distribution for PV (NPV, $millions)

The EMV is what counts most. The best assessment of value, or forecast, is the mean or EV. In recommending this new venture, the project team might say: "The venture is estimated to add $4.6 million to the company's net worth." This is the value added to the corporation beyond recovering the investment and interest. For the $2.8 million investment, our EMV is $4.6 million. To a large, adequately funded, and (essentially) risk-neutral company, the EMV may be all that is necessary for decision making. Note that a portfolio of projects quickly dilutes the effect of individual projects having high risk. To small, risk-averse companies, this risk profile would warrant a value adjustment to reflect their attitudes about risk.

If, through extraordinary
coincidence, internal and external circumstances were such that we realized the values of the individual input forecasts, then the actual outcome in the look-back analysis would be PV = $1.20 million. The unit manager might reasonably ask: You forecast this venture to have an EMV of $4.6 million. Yet, coincidentally, the EVs of the input variables were realized (their best individual forecasts), and now you are saying our value added is only $1.2 million. What happened to the missing $3.4 million?

In this example, the principal cause of the discrepancy is something not seen in the deterministic model. If we launch the new venture and, after a while, it fails to cover its cash operating costs, management will terminate the venture and sell the assets. This is an option to truncate losses, yet keep all of the upside potential. That's good, isn't it? The stochastic model incorporates these possibilities, to which the deterministic model is blind.

Dealmakers should be alert to hidden value. If one party can better assess value and risks, it possesses much better information for negotiating.
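The option to truncate losses can be sketched with assumed numbers. The PV distribution and salvage floor below are hypothetical; the point is only that max(PV, salvage) has a higher EV than PV itself, which a deterministic model run at the EV inputs can never show:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000

# Underlying venture PV without management flexibility ($millions).
# Illustrative distribution; the appendix's real model had ten uncertain
# inputs feeding a full DCF. Its EV matches the $1.2M base case.
pv = rng.normal(1.2, 9.0, N)

salvage = -2.0   # assumed floor: terminate the venture and sell the assets

ev_no_option = pv.mean()                          # deterministic-style EV
ev_with_option = np.maximum(pv, salvage).mean()   # losses truncated at salvage

print(round(ev_no_option, 1), round(ev_with_option, 1))
```

Truncating the left tail raises the EV by a couple of million dollars in this sketch while leaving the upside untouched, the same mechanism the appendix credits for the missing $3.4 million.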
This part discusses topics of interest at the fringe of mainstream project management. Additionally, a chapter on probability concepts provides the fundamentals for working with probabilities, the language of uncertainty.

Chapter 14  Exploiting the Best of Critical Chain and Monte Carlo Simulation
Chapter 15  Optimizing Project Plan Decisions
Chapter 16  Probability Rules
Chapter 17  Expert Systems in Project Management
CHAPTER 14

EXPLOITING THE BEST OF CRITICAL CHAIN AND MONTE CARLO SIMULATION

Critical chain project management is a hot new method that extends critical path analysis to the constraining resource. This chapter describes an even better approach: adding in techniques of decision analysis to improve a traditional approach to project planning and risk management.

Critical Chain

Many people regard Eliahu Goldratt's book Critical Chain (1997), written as a novel, as the most significant project management publication in recent years. In this book, he applies theory of constraints (TOC) to project management. The obvious constraint in projects, of course, is the critical path. Many managers have seized upon the radically different critical chain approach as an improved way of planning and managing projects. CCPM addresses several key problems in project scheduling. However, CCPM uses, primarily, deterministic calculations, and, in my opinion, the method ignores much of what we have learned in decision analysis about understanding and modeling risk. In this chapter, I will compare and contrast my understanding of CCPM with a traditional decision analysis approach. Then, we will discuss ways to blend the best features of the two styles.
Theory of Constraints

Many consider Goldratt to be the father of the theory of constraints, which is widely heralded in manufacturing. His first best-selling book, a novel called The Goal: A Process for Ongoing Improvement (1984), is about debottlenecking manufacturing operations. Simply put, greater throughput can be achieved by attacking the bottleneck constraint. His process, whether talking about manufacturing inventory or project time, is essentially the same:

- Identify the constraint.
- Exploit the constraint, making sure that the constraining resource is fully used, yet never waiting for work.
- Subordinate other activities to the constraint.
- Elevate the constraint: seek ways to raise the productivity or availability of the constraining resource.

The analogy between production management and project management is effective. Manufacturing's throughput is much like project completion time. Focusing on project critical path activities is the manufacturing equivalent of concentrating on bottleneck production processes; the obvious project "bottleneck" is the sequence of activities along the critical path. Manufacturing managers avoid starting work prematurely, because this builds excess work-in-process inventory. The project analog of excess work-in-process inventory is prematurely starting activities off the critical path; otherwise, people will usually waste the slack.

Critical Chain Project Management

In the Critical Chain story, Goldratt applies his philosophy to project management, a powerful new approach to project management. Critical Chain, both the book and the approach, has been widely reviewed and discussed. Reading this book is an enjoyable way to challenge one's beliefs about project management and to gain new insights. Other references that I've found most helpful include articles by Leach (1999) and by Elton and Roe (1998), and a book by Newbold (1998).

Critical Chain defines CCPM. The key features of the CCPM approach include:

- The schedule model is deterministic, with buffers to deal with uncertainty. Deterministic means that the inputs to the model are all singly determined values.
- Remove padding from activity estimates. Goldratt suggests estimating activity completion times at 50 percent confidence points (medians).
- Schedule high-risk activities early so that problems can be detected and addressed early.
- Don't start work prematurely; otherwise, people will usually waste the slack.
- Eliminate buffers along the critical chain. Put a main buffer at the end of the project to protect the customer's completion schedule. Also place buffers at the end of "feeding" activity chains that merge into the critical path.
- Determine the critical path and resource constraints. Adjust the project plan(s) as necessary to exploit the constraining resource(s).
- Encourage early activity completions, and avoid wasting slack times. Instill a culture where an activity team is ready to begin working on its critical activity and will devote 100 percent of its effort to this activity for the duration.
- Work to plan, and avoid tampering. Monitoring and communicating buffer statuses are key to managing the ongoing project.

Contracting people and other resources to be 100 percent applied to an activity would be difficult in many situations; there are other projects competing for the resource that would similarly like 100 percent dedication upon demand. "It's always a people problem" is an old saying in management consulting. CCPM addresses project management's challenge to effectively deal with uncertainty, and it reduces the effects of two behavioral problems: biasing estimates and wasting slack times. Part of this is accomplished by using median rather than conservative time estimates. However, there is much more that we can do to manage uncertainty.

Decision Analysis with Monte Carlo Simulation

Decision analysis is the discipline for making decisions and evaluations under uncertainty. It is characterized by analyses having three key features:

- Judgments about uncertainty, and often analysis results, are expressed as probability distributions.
- We determine the quality of an outcome by a numeric value function that measures goodness or progress toward the organization's objective. In project scheduling, the driving objective is minimizing project completion time.
- The optimal decision policy is to choose the alternative having the best EV. Usually this means maximizing EV present value (PV) net cash flow or minimizing EV PV cost. (These decision rules are the same except for a sign change.) The expected value (EV) calculation compresses the distribution of possible outcome values into a single number.

Monte Carlo simulation (simulation) is perhaps the easiest and most useful way to add in details about uncertainties. It is a straightforward process that allows distributions to represent activity time estimates and other uncertain inputs. We use a project model to forecast the behavior of the project as a system. Simulation solves the project schedule, providing distributions and EVs for outputs such as time to complete and cost. There are several commercially available programs that provide simulation capabilities for models in Microsoft Excel, Microsoft Project, and various other project planning tools.
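As a minimal sketch of such a stochastic schedule calculation, consider a toy two-path merge with assumed triangular durations. One pass of simulation yields both a completion-time distribution and each merging activity's criticality index (the fraction of trials in which that activity drives the merge):

```python
import numpy as np

rng = np.random.default_rng(11)
N = 100_000

# Toy network: A and B run in parallel, both feed C (illustrative durations).
a = rng.triangular(4, 6, 12, N)   # weeks; EV = 7.33
b = rng.triangular(5, 7, 9, N)    # weeks; EV = 7.00
c = rng.triangular(2, 3, 5, N)    # weeks; EV = 3.33

finish = np.maximum(a, b) + c     # C starts at the later of the two finishes

# Criticality index: share of trials in which each activity is on the
# realized critical path (C is always critical in this toy network).
crit_a = np.mean(a >= b)
crit_b = np.mean(b > a)

print(round(crit_a, 2), round(crit_b, 2), round(finish.mean(), 1))
```

Note that the EV completion time exceeds max(7.33, 7.00) + 3.33 weeks, the merge bias again, and that neither parallel activity is "the" critical one; each is critical only some fraction of the time.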
Comparing Approaches

Table 14.1 highlights and contrasts features of the CCPM and decision analysis approaches. Deterministic calculations are, in my opinion, a deficiency in CCPM. The key analytic benefit of simulation is that the result is a more accurate estimate. For example, project managers have recognized that uncertainty in merging activity paths produces a longer schedule; this is because it is the latest of predecessor completion times that drives the start of the next activity. Traditionally, this effect is often called merge bias. In Chapter 13, I use a more general term for this effect: stochastic variance.

Contrary to popular belief, summing activity completion times does not produce a normal distribution. Unfortunately, most real projects contain many correlations. Often, when things start to go wrong, problems affect many activities and thus compound the effects. (This is positive correlation, even though the effects are quite negative!)

With stochastic (probabilistic) project models, the project manager can forecast such elements as:

- The probability that an activity lies on the critical path (looking back from project completion). This is called the criticality index.
- The distribution for time to complete the project or any sequence of activities.
- The distribution of project cost or, better, project value to the customer.

With the simulation model, we can also perform cost/benefit analyses for candidate risk-mitigation actions and for evaluating possible activity-accelerating (crashing) efforts.

Traditionally, project scheduling starts activities as soon as possible. Critical Chain explains why we should avoid earliest-possible starts by default; the intent is to minimize possible delays from time overruns for most noncritical activities. From the implementations of which I've read, the critical chain approach remedies some problems with a traditional stochastic planning model and is simpler for people doing the calculations.

Combining Methods

There are complementary strengths in both CCPM and decision analysis. Combining the methods, taking some of the best features of each, ought to be an improvement. Here is a strategy to realize benefits from blending critical chain and decision analysis approaches for project management:

- Develop the work breakdown structure and activity network diagram in the usual way.
Table 14.1 Comparing the Two Approaches

Theme. Critical chain: focus on the deterministic critical path or critical chain. Monte Carlo simulation: manage priorities and risks using criticality indexes and calculated distributions.
Plan changes. Critical chain: strive to keep fairly static, except for adjusting buffers. Monte Carlo simulation: dynamically update the active project plan.
Scheduling starts. Critical chain: pull (start tasks only as needed by finish date, with buffers). Monte Carlo simulation: traditionally push (start tasks as early as possible); can be pull (start tasks only as needed by finish date, with buffers), and can incorporate dependencies for delayed start and other dynamic conditions.
Calculations. Critical chain: deterministic (single values). Monte Carlo simulation: stochastic (probability distributions).
Execution. Critical chain: work to plan; make it happen. Monte Carlo simulation: work with the possibilities.
Features of both: reduce or eliminate multitasking and wasting slack. Simulation models confirm that delaying noncritical activity starts reduces multitasking; starting activities prematurely risks premature investment and rework due to changes.

Continuing the blended strategy:

- Obtain quality judgments about activity completion times and resource requirements, usually by interviewing the best available experts. These judgments should be expressed as probability distributions. Breaking apart important contingencies (the probability of occurrence, and the impact distribution if the contingency occurs) is a good way to obtain a better representation and a more realistic estimate. Figure 14.1 illustrates this decomposition: in the example activity, either or both of two significant contingencies could occur, lengthening the time to complete Activity AE. The base case projection assumes that neither of the two significant risk events occurs, and we use the narrow distribution mean for the deterministic model input.
- Develop a stochastic project model. Recognize resource constraints. As available resources permit, shorten the project plan to conduct more activities in parallel (being careful to recognize resource conflicts and multitasking inefficiencies).
- Task leaders provide the forecasts for activity time and materials, and the cost of the activity. The activity manager maintains the model inputs (more about this later) for time to complete the activity. A baseline completion time (or cost) for managing an activity would be the EV time to complete. We can maintain this perpetually throughout the project life cycle.
- A culture of trust and objectivity is essential for obtaining quality estimates and for other human facets of effective project management. Learning is essential to improving future judgments: a graph of actual outcomes versus estimates is an excellent feedback tool (as we discussed in Chapter 1), and post-evaluations should be nonpunitive.
Figure 14.1 Forecast of Activity AE Completion Time (weeks), with and without the major contingencies

AE is the code for the example activity, "Assemble Equipment." For this activity, the project model calculated a criticality index of 0.35. Figure 14.2 shows a forecast, as a probability distribution, of when all of Activity AE's predecessor activities will be finished, so that AE can begin; a subproject model is the source of the distribution. The blended strategy continues:

- Schedule start times for low-criticality-index activities to optimize project value.
- Provide activity leaders with convenient access to the project model.
- Apply optimization techniques for planning starting times for each activity (the topic of Chapter 15).

The stochastic project model is the core of effective project risk management. The project manager should be especially involved in monitoring ongoing activities (dedicated efforts to the extent possible, reporting early completions) and in alerting workgroups to be prepared as work on critical (or near-critical) activities approaches. Changes to the top-level project model may be necessary in extreme cases. In complex situations of shared resources, we need to expand the scope of the stochastic model to incorporate multiple projects and their supporting resources. (Figures in Chapter 15 illustrate additional types of graphic information available to Activity AE's manager.)

Stochastic Model Benefits

Modeling major contingencies provides activity managers with information about such elements as:

- The probability that the activity will be on the critical path (the criticality index), and how the activity affects the time to complete the project.
- The forecast, as a probability distribution, of when all predecessors to the activity will be complete (see Figure 14.2).
Cost/benefit/risk analyses for activity decisions can produce outputs such as Figures 14.1 through 14.3:
Figure 14.2 When Activity AE Can Start (weeks until predecessor activities are complete; mean 10.04 weeks)

- The time to complete the activity (such as in Figure 14.1), as either modeled or judged directly by the activity leader or the best available expert. In some cases, it is useful to build a submodel of the activity that details the major contingencies.
- A graph of project EV cost versus the particular activity's time to complete, such as is illustrated in Figure 14.3. This function and the criticality index (0.35 here) are useful in evaluating alternatives for delaying, advancing, slowing, or accelerating work on this activity.
- A graph or other means of comparing decision variables with project costs (the next chapter describes this optimization problem).

Figure 14.3 shows total project cost as a function of time to complete Activity AE. Chapter 15, about optimization, continues this example, and includes a distribution for project completion cost and a sensitivity chart of EV cost versus an activity's planned start date.

Summary

Project teams should stay flexible. A project is a dynamic system, and the best project and activity managers will be alert and responsive to changes as they occur. Regularly examine where resources might be better allocated.
Figure 14.3 Project Cost versus Time to Complete AE (weeks)

Capable leadership of a motivated, well-trained team is of paramount importance to project success. However, even skilled people perform complex planning processes poorly without the assistance of quantitative tools. The methodologies described here can be very helpful in bolstering human judgment. This is an exciting time in project management. The project management body of knowledge has changed dramatically during the past twenty years, and low-cost software and computing power now enable modeling projects as never before. We have the means at hand to dramatically improve productivity in project execution.

The next chapter shows an example stochastic project model, built in Excel and incorporating probability distributions to represent chance variables, and the types of information this produces for decision making.
CHAPTER 15

OPTIMIZING PROJECT PLAN DECISIONS

Here we look at project planning, especially project risk management, in a decision analysis context. A simple example project shows the types of information available from a stochastic project model. The stochastic project model provides the means to optimize decision variables.

There are numerous decisions in project management. In project schedule planning, decisions fall into three primary categories:

- Structure of the activity network. Sequencing the activities, including which to execute in parallel. The decision about the network structure is discrete, though the possibilities may be many.
- Resources and resource levels. Identifiable resources, with perhaps variable levels of commitment for each, sometimes need to be planned around the resources' commitments elsewhere. Even within the same project, a resource critical path (i.e., a critical chain) controls.
- Planned start dates for each activity. These date assignments are the decision variables optimized under uncertainty in this chapter's example.

The example in this chapter focuses on activity planned-start-date decisions. Figure 15.1 shows one example: project cost sensitivity to one decision variable, the planned start date for Activity AE, while all other activity planned starts are at their optimal values.

It's an Optimization Problem

Managers often face decision problems where the choices are more complex than simply yes or no, choosing from among the available limited options.
Figure 15.1 Optimizing Activity AE's Scheduled Start (EV project cost, $thousands, versus AE's scheduled start, weeks after project start)

Chapters 10 through 12 discuss ways to express judgments about resource availability, productivity, and other uncertainties. This chapter builds upon the last in demonstrating decision analysis as an alternative and improvement over traditional and critical chain deterministic methods. A better approach than deterministic planning is to explicitly represent project uncertainties in a stochastic evaluation model. In this chapter, we will focus on optimizing the schedule model already structured: choosing the best planned activity-start times so as to minimize overall project cost (thus maximizing project value).

One of the tenets of critical chain project management (CCPM) is: avoid having the same people work on parallel activities (multitasking) to the extent possible. This usually means restraint in starting activities. The main buffer, to protect the customer, is placed at the end of the project. However, this view oversimplifies.

Example Project Model

Figure 15.2 shows a simple project model for delivering a custom technology system. The arrows in Figure 15.2 show the precedence relationships. The Assemble Equipment activity, "AE," is singled out in this and the prior chapter to show example information available to this activity's manager. The stochastic model embodies uncertainties in two forms: activity completion times and possible rework. Having two potential rework activities realistically complicates this straightforward project. Ideally, a rework activity would be modeled as a looping-back cycle until we get it right. However, representing this in the model is much simpler if we can reasonably assume that only one rework pass is required to remedy defects.
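The planned-start optimization can be sketched as a small simulated grid search. The predecessor-finish distribution and the two penalty rates below are assumptions for illustration, not the chapter's model values, and the activity is treated as always critical for simplicity:

```python
import numpy as np

rng = np.random.default_rng(21)
N = 50_000

# When AE's predecessors are actually finished (weeks): an assumed
# illustrative distribution, not the book's subproject model.
pred_done = rng.triangular(6.0, 7.5, 11.0, N)

IDLE = 2.0    # $k/week: the crew arrives at the planned start and must wait
DELAY = 5.0   # $k/week: the start (and project) slips past the ready date

def ev_cost(planned):
    """EV cost ($k) of scheduling the crew to arrive at week `planned`."""
    idle = IDLE * np.maximum(pred_done - planned, 0.0)   # crew waiting
    late = DELAY * np.maximum(planned - pred_done, 0.0)  # slack wasted
    return float((idle + late).mean())

grid = np.arange(6.0, 11.001, 0.25)
best = grid[int(np.argmin([ev_cost(t) for t in grid]))]
print(best)   # an interior optimum balancing the two penalties
```

Starting too early buys idle crew time; starting too late wastes slack and risks the project date. The EV-cost curve is U-shaped, like Figure 15.1, with an interior optimum rather than an earliest-possible or latest-possible start.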
Small buffers at the end of "feeding chains" merging with the deterministic critical path minimize risk of project delay.
All precedence relationships are finish-to-start, except for one finish-to-finish relationship: Design Interface (DI) must finish before Design Software (DS) finishes. Branches with decimal fractions, representing probabilities, show the two potential rework activities, Modify Software (MS) and Major Rework (MR).

Figure 15.2 Activity Network Diagram

In addition to judgments about the need for the rework activities, project planning includes judging activity completion times. For simplicity, we are looking at project cost only, and we assume that minimizing the total project cost is the objective; what we really care about, though, is total project value. Further, let's assume that the company is neutral about risk, and that expected value (EV, or mean) cost is the appropriate project outcome measure. The EV calculation simplifies decision making by reducing the distribution to an equivalent-value single number. Figure 15.3 shows a distribution for project costs produced with the subject project model; the project outcome forecast is from Monte Carlo simulation. The best single-point forecast of total project cost is $223,000. Included in the analysis and the complete project model, though not shown here, is the effect of late or early finish on the overall project cost.

Optimizing Activity Starts

An influence diagram assists in modeling and understanding the elements of a decision. We discussed this method in Chapter 9. The influence diagram in Figure 15.4 shows an activity's start-date decision and the elements in evaluating when to plan an activity start. The planned activity start dates are the decision variables in this model.
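The probabilistic rework branches can be sketched as a simulation in which each rework activity occurs on a single pass with some probability. The probabilities and durations below are assumed for illustration; the chapter's actual branch values are in the network diagram, not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(9)
N = 100_000

# Base duration of the activity chain (weeks): an illustrative triangle
base = rng.triangular(10, 12, 16, N)

# Two potential rework activities, each modeled as one pass that either
# occurs or not (assumed probabilities and duration distributions).
p_ms, p_mr = 0.30, 0.10
ms = np.where(rng.random(N) < p_ms, rng.triangular(1, 2, 4, N), 0.0)
mr = np.where(rng.random(N) < p_mr, rng.triangular(3, 5, 9, N), 0.0)

total = base + ms + mr
print(round(base.mean(), 1), round(total.mean(), 1))
```

The rework branches lift the EV duration above the no-rework base case, which is exactly the kind of contingency-driven stochastic variance a deterministic plan misses.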
[Figure 15.3 Distribution of Project Cost: frequency versus total project cost, $0 to $600 thousand.]

This calculation model includes only cost penalties for idle crews. Everything important to the decision should be included in the model, despite that some of these details might involve highly subjective judgments. A more complete model might also include the effects of:
- Changing requirements, with the amount of rework exacerbated by premature activity completions
- Multitasking inefficiencies, affecting time to complete and quality (premature activity starts worsen this also)
- Correlations between times to complete various activities.

The project manager can optimize project start times and other planning decisions. A project plan can be updated for progress and as other information becomes known. Activity managers focus on their respective pieces.

One important perspective is free float (slack): How much can we delay this activity's start before affecting the early or planned start of its successor activities? Figure 15.5 shows the forecast of slack time between the completion of Activity AE and the beginning of its earliest successor. This is the distribution of the implicit buffer between AE and its successor, and not a focus item for the project manager. The mean slack time (free float) is 4.9 weeks.

Note that there are no explicit buffer representations. Activity and resource buffers are intrinsic to the model.

Be careful how people use the information about slack time! You don't want the slack to be wasted. Certainly there should be slack between completion of some activities and the start dates of their successor activities. Yet knowing with high confidence that there is some slack might lead people to focus less on the task at hand, tempting them to "waste the slack."
For me, this was a key message in Eliyahu Goldratt's Critical Chain (1997). The stochastic model shows that there is about a 0.3 chance (see Figure 15.5) that Activity AE will be critical.

[Figure 15.4 Influence Diagram of Activity Start Decision]

Incentives

This model recognizes all values as costs. Increased asset value, due to early completion, is incorporated as a negative cost:
- Early project finish was valued at +$8,000 per week (a negative cost).
- Late finish was valued at a $10,000 cost per week.

Penalty functions represent the cost effects:
- A fraction of crew time, typically put at 30 percent, is wasted for the time after planned start before an activity can begin, due to precedence requirements.
- A finish before the forecast finish wastes, typically, 25 percent of the idle crew time.
[Figure 15.5 Distribution of Slack Time after Activity AE: probability versus buffer between Activity AE and successor, 0 to 20 weeks.]

Optimization Experience

Solving stochastic models is computationally more time-consuming than conventional analyses. Because direct calculations with probability distributions are so difficult, Monte Carlo simulation solves a difficult calculus problem using a simple and elegant random sampling process. Because Monte Carlo is an approximation technique, we need 100 to 1,000 times more computer time to get reasonably precise results. Fortunately, computer power is continually becoming more accessible and inexpensive. Nonetheless, computation efficiency requires that project models be kept reasonably simple.

In the past two years, two stochastic optimization tools have become available for project models in Microsoft Excel spreadsheets. They use very different methods, though both have the same calculation goal:
- OptQuest® for Crystal Ball® by Decisioneering Corporation. This tool applies tabu search and neural network technologies for optimizing decision variables.
- RISKOptimizer™ by Palisade Corporation. This tool uses a genetic algorithm approach to optimizing @RISK models.

Depending upon the problem, one tool may be better than the other. My limited tests thus far are inconclusive as to advantage, so I've disguised the exhibits as to the particular analysis tool used. Both RISKOptimizer and OptQuest provide progress reports, similar to the one shown in Figure 15.6, for monitoring the program's progress. See Appendix B, Decision Analysis Software, for more details.

An intelligent automated assistant could guide most or all of the optimization. Further, another automated assistant could guide the activity network construction.
These are among the topics of Chapter 17, Expert Systems in Project Management.

[Figure 15.6 Optimization Progress: project cost ($thousands) versus trial solution, showing hours 3 through 8 of automated optimization followed by a human-guided final optimization.]

Table 15.1 summarizes selected deterministic and stochastic results. The decision variables are planned dates, in weeks, to start the respective activity. The base case in each column is the deterministic solution. The variance from the base case value results from carrying probability distributions through the calculations; stochastic variance (Chapter 13) is useful as a component in explaining the difference between forecast and actual. The mean value, used for making decisions, represents the stochastic solution. Use outcome EVs when making project decisions.

Merge bias is the effect causing the difference between the base case and the EV forecast. Most stochastic project models show that projects require more time than suggested by the base case. This is because the latest of the predecessor activities typically determines when the next activity can start (the lines merge on an activity network diagram). Merge bias is an example of the stochastic variance topic of Chapter 13.

Comparing columns 2 and 3, notice that penalties for delay (in this model) are worse than costs of early starts. This project model behaves such that an early-starts solution is nearly optimal, because the penalties for late finishes dominate.

Then, the table shows three progressive stages of optimizing the stochastic model (using a 300 MHz Pentium II). The automated optimization started from the latest-starts CPM solution, using the "latest-starts" values for the activity start dates.
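The merge bias described above is easy to demonstrate numerically. A minimal sketch (the uniform 8-to-12-week durations are invented, not the book's model): two parallel predecessor paths each have an EV duration of 10 weeks, yet the EV of the merge point, the later of the two finishes, exceeds 10 weeks.

```python
import random

rng = random.Random(1)
N = 100_000

# Two parallel predecessor paths, each uniform 8..12 weeks (EV = 10 weeks).
# The successor can start only when BOTH are done, so the merge takes the max.
merge_times = [max(rng.uniform(8, 12), rng.uniform(8, 12)) for _ in range(N)]
ev_merge = sum(merge_times) / N

print(f"EV of each path: 10.0 weeks; EV at the merge: {ev_merge:.2f} weeks")  # about 10.7
```

A deterministic base case would schedule the successor at week 10; the stochastic model correctly expects it later.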
We see summary statistics for the distributions resulting from two traditional deterministic critical path method (CPM) solutions, using earliest starts and latest starts.
Codes for Activities:
DS Design Software
DI Design Interface
ST Software Testing
PC Procure Components
RT Recruit Technicians
AE Assemble Equipment
IT Integrated Testing
MS Modify Software (if indicated in ST)
MR Major Rework (if indicated in IT)

[Table 15.1 Project Cost for Five Cases]

After three computer-hours, with one tool, the mean project cost fell from $242k to $233k. Another five hours further reduced mean cost to $226k. Figure 15.6 illustrates the progress graph of this second period of automated optimization. Then I finished the job, manually adjusting decision variables in steps that brought EV cost down to $222.6k. With 5,000 trials, the standard error of the mean is approximately $0.6k [1].

Three figures in Chapter 14 show distributions useful for Activity AE's manager:
- Figure 14.1 Forecast of Activity AE Completion Time (obtained in this case from a submodel for the activity)
- Figure 14.2 When Activity AE Can Start (distribution of when all predecessor activities will be complete)
- Figure 14.3 Project Cost versus Time to Complete AE (useful for evaluating options to shorten or extend this activity).

However, this is insufficient to decide when to plan Activity AE's work start.
The criticality index for AE is 0.35, meaning that AE has a 35 percent chance of being on the critical path and directly affecting time to complete the project. The criticality index will migrate toward zero or one as the project proceeds. Even when AE is under way, the activity team may be unsure whether its activity is going to be on the critical path. Still, Activity AE's manager should be receiving more definite information about whether AE will be critical, enabling her to optimally plan the activity's start date.

The previous chapter shows a distribution for the time between Activity AE's completion and another distribution for the start of the earliest possible successor activity. Rather than planned intervals, the stochastic model reveals buffers as distributions. These distributions are interesting, though perhaps unnecessary, for decision making.

The core theme in critical chain project management (CCPM) is to avoid wasting slack. Yet people checking on the distribution of the implicit buffer might be tempted to waste slack. The reality is that there isn't a confident amount of slack available to waste.

Simplifying Project Decisions

The analysis in this chapter deals with decisions to plan activity starts. Most decisions are optimization problems where there is a range of choices for one or several decision variables. Focusing on optimizing project value provides a more rigorous way for making project planning, and all other, decisions. Decision analysis provides a more realistic and more accurate representation of the project and alternative values. With a good system, the project manager has better information for making decisions under uncertainty.

Endnotes

1. For conventional Monte Carlo sampling (the standard technique), the standard error of the mean is σ = s/√n, where s is the sample standard deviation and n is the number of trials in the simulation. This is the uncertainty in the EV, with 68 percent confidence. The error is typically one-third of this formula result when using Latin hypercube sampling, and we then need other means to approximate this statistic.
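The endnote's s/√n formula can be checked with a quick sketch. The normal distribution and its parameters here are illustrative only, chosen so the result lands near the text's $0.6k figure.

```python
import random
import statistics

rng = random.Random(7)
n = 5_000
# Illustrative trial outcomes: project costs in $k, roughly like the text's model.
trials = [rng.normalvariate(223, 42) for _ in range(n)]

s = statistics.stdev(trials)     # sample standard deviation
se_mean = s / n ** 0.5           # standard error of the mean, s / sqrt(n)
print(f"sample std dev: {s:.1f} $k")
print(f"standard error of the EV: {se_mean:.2f} $k")
```

With 5,000 trials and a spread of about $42k, the EV is pinned down to roughly ±$0.6k at 68 percent confidence.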
CHAPTER 16

Probability Rules

Probability is the language of uncertainty. In school, at some point in our lives, our teachers presented the topics in this chapter. As soon as we learned fractions, we should have begun a lifetime of communicating and working with probabilities. It is a pity that our education was neglected; unfortunately, for most of us, without frequent use, the skills are lost. With just four algebraic rules, we can do a lot with probabilities. Here is a short refresher if you need one.

Venn Diagrams and Boolean Algebra

A useful aid for visualizing events and probabilities is a Venn diagram, such as that shown in Figure 16.1. English statistician J. Venn (1834-1923) invented the diagram that bears his name. A Venn diagram represents "event space" and all of the possibilities that we relate together. The outer border is typically a rectangle, and we partition the area inside to represent the possibilities (events). The areas, representing probabilities, are dimensionless, and the total area must equal one. The shapes are not important, although the graph is more meaningful if areas are drawn roughly proportional to the probability of the corresponding events.

Each piece represents two or more events together happening or not happening, and we call these joint events. In Figure 16.1, we have five joint events.

In school, I sometimes worried, "What if we're on a line?" Or, worst of all, "What if we're on an intersection of lines?" One need not be concerned, I finally realized, because a line has no area and, thus, zero probability.
In the scenario in the Venn diagram in Figure 16.1, there are five possible outcomes:
A·B̄·C̄ = A only
Ā·B·C̄ = B only
Ā·B̄·C = C only
A·B·C̄ = A and B = the intersection of A and B
Ā·B̄·C̄ = neither A nor B nor C.

[Figure 16.1 Venn Diagram]

Here is some useful notation in what we call logic or Boolean algebra:
Ā means not A
p(A) means the probability of A
AB or A·B means A and B (the intersection)
P(A | B) means the probability of A, given B
A + B means A or B (the union).

Key Probability Theorems

There are three primary logic operations, operating on the values true and false. You may recognize these as Microsoft® Excel spreadsheet variables and functions:
TRUE or TRUE()
FALSE or FALSE().
We can assign either value to a cell or the result of an equation, and these functions take arguments. As examples:
= TRUE()
= IF(AE_TimeToComplete < 20, TRUE, FALSE)
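The five joint events can be checked by brute-force enumeration of the truth assignments of A, B, and C; a small sketch:

```python
from itertools import product

# Enumerate all truth assignments of three events A, B, C and label the ones
# that can occur in the chapter's Venn diagram (A and B overlap; C stands apart,
# so assignments pairing C with A or B are left unlabeled as impossible).
labels = {}
for a, b, c in product([True, False], repeat=3):
    if a and not b and not c:
        labels[(a, b, c)] = "A only"
    elif b and not a and not c:
        labels[(a, b, c)] = "B only"
    elif c and not a and not b:
        labels[(a, b, c)] = "C only"
    elif a and b and not c:
        labels[(a, b, c)] = "A and B"
    elif not (a or b or c):
        labels[(a, b, c)] = "neither A nor B nor C"

print(sorted(labels.values()))
# ['A and B', 'A only', 'B only', 'C only', 'neither A nor B nor C']
```

Exactly five assignments get labels, matching the five joint events in the diagram.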
The parentheses behind TRUE and FALSE, if used, indicate that these are functions. You can omit the parentheses, since there are system variables, TRUE and FALSE, in Excel that work the same.

Complement Rule

Here is an obviously valid concept called the complement rule:

P(Ā) = 1 - P(A).

For example, if we know that P(success) = .8, then we also know that P(not success) = 1 - .8 = .2 = P(failure).

Excel has a NOT() function, which returns the complement of the argument inside the parentheses. For example, if we have a cell named "MachineCenter14Available" that has a value of True, then:

= NOT(MachineCenter14Available)

evaluates to False.

The complement rule is closely aligned with the idea that the sum of the probabilities must always add to one. We call this the normalization requirement. If we have weights for several possible outcomes, then we can convert the weights into probabilities by normalizing. This is done by dividing each weight by the sum of the weights. The resulting numbers, now representing probabilities, will then always add to one.

Addition Rule

We represent the union of events with either a "+" or "∪" symbol. This means that either A or B is True, or both. The addition rule is:

P(A + B) = P(A) + P(B) - P(A·B).

The last term, P(A·B), surprises many people. Referring to Figure 16.1, we can see why it is necessary: if we add the areas for A and B together, we have twice counted the intersection area (the overlapping portions of A and B). So, this last term in the addition rule corrects for otherwise double-counting that intersection piece.

For example, suppose we have:
Cell B10: Lathe1Available, containing a formula that evaluates to True.
Cell D10: Lathe2Available, containing a formula that evaluates to False.
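Normalization is a one-line calculation; a small sketch with invented weights:

```python
# Convert outcome weights into probabilities by dividing by the sum (normalization).
weights = {"low": 2.0, "medium": 5.0, "high": 3.0}   # invented weights
total = sum(weights.values())
probs = {k: w / total for k, w in weights.items()}

print(probs)                 # {'low': 0.2, 'medium': 0.5, 'high': 0.3}
print(sum(probs.values()))   # sums to 1, within floating-point rounding
```

However the raw weights were assessed, the normalized values satisfy the requirement that probabilities add to one.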
Cell F10: LatheAvailable, containing the formula = OR(Lathe1Available, Lathe2Available), which evaluates to True.

Modelers often use the addition rule when they want to calculate the probability of success when there are two or more teams working separately toward some goal. If either or both efforts succeed, then we have successfully achieved that goal.

In Figure 16.1, we see that there is no common area between A and C. These are said to be mutually exclusive. In this special case, the addition rule simplifies:

P(A + C) = P(A) + P(C).

Multiplication Rule

This next rule calculates the probability of two events happening together. This is the logical "and" operation, and we symbolize it with the multiplication dot "·" or the "∩" symbol. Often people omit the dot, as with implicit multiplication between two symbols. Thus:

A and B = A·B = AB = A∩B.

In Figure 16.1, this is the slightly darker area in common with both A and B. The multiplication rule formula is:

P(AB) = P(A) P(B | A).

The probability of B is conditioned by (dependent on) the outcome of A. Every directed path through a tree, left to right, describes a joint event; the darkened topmost path in Figure 16.2 is AB.

Independence

The three probability rules described in this chapter all work, whether or not the events are independent. We usually know, from our understanding of the system (project), which elements are independent and which are influenced by others. We appreciate the cause-and-effect relationships.

An interesting feature of probability is that if A depends upon B, then, at least numerically, B also depends upon A. If independent, then B tells us nothing about A. In fact, we often use the multiplication rule as an expression and a test for independence: A and B are independent if P(A) = P(A | B). If this is true, then the converse must also be true: P(B) = P(B | A).
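The two-teams example above can be worked numerically with invented probabilities, combining the addition rule with independence:

```python
# Two teams work independently toward the same goal (invented probabilities).
p_a = 0.6   # P(team A succeeds)
p_b = 0.5   # P(team B succeeds)

# Addition rule: P(A + B) = P(A) + P(B) - P(A·B),
# where independence gives P(A·B) = P(A) * P(B).
p_goal = p_a + p_b - p_a * p_b
print(round(p_goal, 3))   # 0.8

# Equivalent route via the complement rule: 1 - P(both efforts fail).
assert abs(p_goal - (1 - (1 - p_a) * (1 - p_b))) < 1e-12
```

Either route gives the same answer, which is a handy cross-check when building models.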
In fact. because if one is true. in effect. behind the three already discussed. With new information. Thus. These are very different concepts. I count this as the fourth probability rule.. . and each task is independent of the others. I added the center part to show that Bayes' rule is. 2 A Tree Path Is a Joint Probability The multiplication rule has a simpler form for independent events: We can string any number of events together in this fashion. then they are not independent.Chapter l&Probability Rules Figure 1 6 . The normalization requirement explains both the multiplication rule and Bayes' rule. if two events are mutually exclusive (e. we cut away part of the Venn diagram and normalize probabilities for the pieces that are left. then the special case of the multiplication rule will calculate the probability of success. Here is a somewhat different presentation of Bayes' rule: where ei N A = outcome i of = number of the chance event of interest possible outcomes for ei = attribute of new (imperfect) information. then the other cannot be. *% Warning: People often confuse independence with mutually exclusive.g. Bayes' Rule Appendix 6A presents Bayes' rule. just a rearrangement of the multiplication rule. if everything has to work for the project to be a success. the outcome of an information event (symptom). so long as each element is independent. separate circles).
Thinking Logically

The four probability rules are sufficient to do most operations with probabilities. Some of the Internet search engines support Boolean searches using the same concepts. For example, you might search using: "power plant" and nuclear and not "North America." I once saw a search tool screen with a little Venn diagram indicating whether the and or the or operation was in effect.

I have observed that schoolchildren in Europe and Asia learn these concepts more thoroughly. As people become more numerically and logically literate, we can expect greater fluency with these probability concepts. We will see more of this.
CHAPTER 17

Expert Systems in Project Management

Knowledge management is a hot topic in business today, and people are finding ways to capture knowledge and embed it in software. Artificial intelligence techniques, including rule-based expert systems, provide new ways to represent and share hard-won knowledge. The effective expert system (ES) can assist a nonexpert in performing tasks, such as planning and analysis, at nearly expert-level performance. We will also discuss an allied technology, neural networks.

Smart Computers

Artificial intelligence (AI) is a broad area of computer science devoted to designing ways to make computers perform tasks previously thought to require human intelligence. The main applications of AI include:
- Natural language processing (computers that understand our speech and words)
- Vision systems (such as recognizing objects and patterns)
- Robotics and control systems
- ESs (rule- and frame-based, including "fuzzy logic")
- Neural networks.
How does a project manager perform his job? How do you train a person to be a good project manager? The idea behind expert systems is that we can capture the knowledge needed to perform certain types of work in a knowledge base of rules. A computer program then applies these rules to guide a user in performing some task.

You have probably used ESs already, such as the "wizards" often found in some popular computer programs. This technology is often operating behind the scenes in software systems. ESs and neural networks are often called "knowledge systems."

Some people bristle at the name "expert system," because such programs are vastly inferior to the overall capabilities of a human expert. Some of us think that we humans are inventing our successors, and that our tools may replace us (see Moravec 1999a and 1999b). That possibility is at best (or worst) a few years away. Meanwhile, smart computer programs promise better tools in project management.

Expert Systems

An ES is a computer program that solves a procedural problem in much the same way that a human expert would. The ES contains expert knowledge about a narrow problem. It assists a less-experienced person in performing a particular task in a manner similar to, and with nearly the performance of, a human expert. The most popular ES applications include:
- Diagnosis and prescription
- Classification
- Planning and configuring.

ES technology offers software developers a way to build in project management expertise, and we're sure to see more commercial offerings in this area. In project management, the following are principal areas where knowledge systems have special potential; these are all functions that project managers perform:
- Exception reporting. An ES can continuously monitor project data and conditions. It is then able to provide prompt warnings of exceptions and trends, matters possibly requiring intervention. Automated systems can communicate "this needs attention" alerts by email, pager, voicemail, or conventional paper reports.
- Model building. Knowledge about generating projection models, forecasting formulas, and industry accounting practices are example knowledge areas that we can encode as rules. An ES is much easier to configure than describing the logic by hard-coding with IF statements.
- Cost estimation. There are already fielded ESs and neural networks (described later) for this application.
- Analysis. Almost any analysis procedure that we can carefully describe is suited to automated analysis. An ES is an appropriate approach to automating the manual tuning of the project optimization described in Chapter 15.
Representing Knowledge

There are two common ways of representing knowledge in an ES: frames and rules.

Frames are a way to store information about "objects" (chunks of data). Each frame represents a hierarchical data structure relating objects to each other and containing attributes of each object. There is often a hierarchy of object types, and some of the attributes can be "inherited": certain features of the "parent" object can be inherited by its "children." In a project model, for example, the activities are often detailed as tasks, which in turn are further detailed into actions or steps. Modern databases and object-oriented programming languages have increasingly incorporated this approach in recent years. Figure 17.1 is a simple frame example for a software project.

[Figure 17.1 Example Frame Structure: Truck Transport Billing System. Type Project: New Development; Platform: PCs running MS Windows; Language: C++; Interfaces: General Ledger, Accounts Receivable.]

Rules are the action part of the knowledge base; they encode the logic for solving the problem. A typical ES contains 50-500 such rules. Figure 17.2 shows two example rules:

IF Weather Forecast = Rain and Outdoor Activity
THEN Expect Weather Delay (CF = 0.7)

IF Change Order and Electrical
THEN Need Architect Approval and Need Engineer Approval (CF = 1.0)

[Figure 17.2 Example Rules]

We must clearly define the task and make the rule base as complete as reasonably possible; commonsense reasoning is not a feature of the system. Most often, the ES functions only as an advisor, and a human decides whether or not to accept the program's advice. Users can make notes about exceptions (overrides), and these are a common source of new rules to be added.

Well-designed systems usually feature an "explanation facility." This is a way of backtracking through the rules that were used ("fired") in arriving at the recommendation or conclusion. By examining the system's reasoning, the program user taps into the wisdom of the experts who contributed to building the knowledge base. It is a great learning tool.
Some among us believe we humans are inventing our successors, and that our tools will replace us (see Appendix 17A, Neural Networks).

I once invested most of a year in developing a prototype ES for financial modeling. The knowledge base and inference engine represent the process of developing feasibility models (functionally equivalent to what an analyst would develop with a computer spreadsheet). Using the automated assistant, I could build custom evaluation models, much as we now do with computer spreadsheets, in one-twentieth the time.

The world isn't black and white, and builders of these systems often want to recognize shades of gray. Sometimes data are incomplete and uncertain, and not all rules apply to every situation. The most popular way to represent confidence in knowledge is by "fuzzy logic," where we rate knowledge completeness as confidence factors (CFs, also called certainty factors, appended to the rules in Figure 17.2) on a 0-to-1 scale. Zero means "no information or knowledge," and 1 means "absolutely true." We can attach confidence factors to: (a) assertions provided by the user or knowledge base, (b) rules in the knowledge base, and (c) conclusions "inferred" by the ES as it operates. If the problem has multiple solutions, then CFs provide a means to rank the proposed solutions. Sometimes people confuse CFs with probabilities, though they are not the same. See Appendix 17B for more about this.

Operation

The "reasoning" part of an ES is its inference engine (IE). The IE is a procedure that works through the frames and rules in order to draw conclusions. The IE borrows from rules of logic, analysis, and search techniques. This is what gives an ES its power. With these knowledge representations and search techniques, it is difficult to imagine a procedural problem that cannot be solved with ES technology.

Typical analysis tasks where we might want assistance include:
- Helping design the project to meet a list of target specifications, bill of materials, and cost estimates
- Helping develop the project plan, developing the work breakdown structure, while watching for resource problems and looking for ways to level workgroup loading and shorten the schedule
- Analyzing the project situation, generating action lists, and suggesting remedies
- Monitoring for risk events and early warning signs.

Here's an example "utility" ES. Teknowledge Corp. developed the Wine Advisor several years ago to demonstrate its ES software. The company acquired permission to extract wine-selection rules from a respected wine authority's book. Knowledge engineers encoded the expert's wisdom into a knowledge base of about 200 rules.

Suppose your project is to prepare an elegant meal for your family or friends. The Wine Advisor assists you in deciding what type of wine to buy. The ES asks questions about what food you are planning to serve: Will you have meat? If so, what type? Will there be a sauce? Tomato or cream? And so on. The program asks only pertinent questions, based upon rules in the knowledge base and inferences drawn from answers that you provided earlier. The interactive process takes only a couple of minutes.
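A rule base with certainty factors can be sketched in a few lines of forward chaining over rules like those in Figure 17.2. The CF combination used here, the minimum of the antecedent CFs times the rule CF, is one common convention assumed for illustration, not necessarily the one any particular ES shell uses:

```python
# Toy forward chaining over rules with certainty factors (CFs).
# Assumed convention: CF(conclusion) = min(CFs of antecedents) * CF(rule).
facts = {"Weather Forecast = Rain": 0.8, "Outdoor Activity": 1.0}

rules = [
    ({"Weather Forecast = Rain", "Outdoor Activity"}, "Expect Weather Delay", 0.7),
]

changed = True
while changed:                      # keep firing rules until nothing new is concluded
    changed = False
    for antecedents, conclusion, rule_cf in rules:
        if antecedents <= facts.keys() and conclusion not in facts:
            cf = min(facts[a] for a in antecedents) * rule_cf
            facts[conclusion] = cf
            changed = True

print(round(facts["Expect Weather Delay"], 2))   # 0.56
```

Adding a rule is just appending a tuple to the list, which is the maintainability advantage rules have over hard-coded IF statements.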
The Wine Advisor then presents a list of recommended wines, prioritized according to the selection expertise of the expert.

Applying the Technology

What makes a good candidate for an ES? Here are guidelines for qualifying a problem for an ES solution:
- The problem is important. Solving the problem currently requires several hours to several days.
- There are recognized experts who are substantially better at solving the problem than nonexperts. Experts are scarce and expensive.
- A body of knowledge related to the problem exists. We can teach the skill, we can recognize a good solution, and the experts can explain the process and reasoning behind it.
- Much of the work is procedural, and frame structures and rules work well for representing this knowledge.

Engineering design and cost estimation are prime application areas for ES developers, because much of the work in design and estimation is procedural. In over two decades of popular use, ES technology has diffused into some of the project management applications. Following are several examples:
- Digital Equipment Corp. developed an early, large-scale ES to configure complex computer orders. Millions of combinations were possible from among thousands of components. Everything had to be shipped, fit together, and work.
- Part of my experience has been in developing an ES for project cash-flow modeling. The knowledge base can recognize different types of projects in different industries and in different countries.
- Even if we do a good initial project plan, maintaining the project database for all changes is often a major effort. Monitoring is especially time-consuming and tedious. An ES can manage many of the details, while reporting problem areas (elegant "exception reporting").

Software tool developers have long recognized the need for dynamic programs: those that change their function and presentation to suit the circumstances. The old, and still most common, way to provide flexibility is by embedding conditional branching logic using IF statements. When we hard-code the logic this way, the program is "brittle" (easily fails) and difficult to change and maintain. Expressing the logic as rules is much simpler for tasks suitable for an ES design approach. ESs take more computer time to operate, but generate fresh reasoning as needed.
We don't have time to learn everything, and we don't have time to do all that we're supposed to be doing. The future of project management includes tools to help us work faster and improve our performance in an ever-accelerating world. Intelligent assistants (utilities) will increasingly appear in the marketplace, ready to help us with selected tasks. Our "assistant in the box" never gets tired and never has a bad day. With ESs, we don't have to be a specialist to perform some tasks at an expert level of proficiency. Further, we can accomplish work faster, more alertly, and more rigorously.

Two technologies closely allied to ESs are neural networks and fuzzy logic. People are already using these methods in project management applications. The two following appendices describe these emerging technologies.
APPENDIX 17A

An artificial neural network (NN) is a simple arrangement of calculation nodes, called neurons, such as that displayed in Figure 17A.1. We design NNs to operate in ways approximating, and perhaps improving upon, the operation of a living brain. Don't worry about becoming obsolete quite yet: the human brain has approximately 100 billion neurons, while a typical artificial NN has perhaps fifty neurons.

In an NN structure, we relate various inputs to one or several outputs. An NN relates {several to many inputs} to {one to several outputs}. Weighted values of the input nodes determine what happens at successor nodes. Also, there may be one or more "hidden" layers of nodes to improve performance. Knowledge is represented solely by the network configuration and the weights assigned to the branches. The key is "training" the network, that is, cycling through data to establish the best configuration and best weights for each connection. Once trained, the network can examine new data and predict the outcomes.

Recall from Chapter 12 that regression determines equation coefficients in which a calculated variable is a function of one or more other variables. We are most familiar with linear regression, where we find the best line to represent a relationship between values plotted on an x-y graph, calculating a "least squares" fit through data points of y versus x. All but the simplest handheld calculators will do linear regression. One can view an NN as sometimes massive, multivariable, nonlinear regression. NNs mathematically process complex data, sometimes with astonishing accuracy. The reasoning is difficult for the user to decipher; we care mainly whether the "black box" generates good predictions.

NNs are used for pattern recognition, filtering data, controlling processes, and optimization. Estimation is an obvious application in project management. Given the parameters of the project or activity and historical data, an NN can predict time and cost to complete. Applications include forecasting project performance, estimating cost, estimating the best competitive bid amount, recognizing patterns, and analyzing trends. An NN can process data, and an ES can process the information further. Both ESs and NNs capture knowledge and experience, yet the two methods are very different and complementary. (Additional detail is in McKim 1993.)

Low-cost PC software is available for building NNs. Some of the programs require little user knowledge, automatically training the network with the available data and testing the performance of different network structures.
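To make the regression analogy concrete, here is a minimal sketch, illustrative only (the data, weights, and network size are hypothetical, not from this book), of an ordinary least-squares line fit alongside a single feed-forward pass through one hidden layer of sigmoid neurons:

```python
import math

# Ordinary least-squares fit of y = a + b*x: the kind of regression
# an NN generalizes to many variables and nonlinear relationships.
def least_squares(points):
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# One feed-forward pass: weighted input values determine each hidden
# neuron's activation; weighted activations determine the output.
def forward(inputs, hidden_weights, output_weights):
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs)))
              for ws in hidden_weights]
    return sum(w * h for w, h in zip(output_weights, hidden))
```

Training would adjust the hidden and output weights to minimize prediction error over historical project data: the "least squares" idea applied to a nonlinear, multivariable model.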
Figure 17A.1 Simple NN (input nodes, hidden layer nodes, and output)

Table 17A.1 compares and contrasts ES and NN technologies:

Knowledge. ESs: more concrete; thinking in rules. NNs: more fuzzy; judgment and intuition; elusive, in that we don't know the rules.
Knowledge Representation. ESs: rules and frames (objects with inheritance). NNs: network pattern and connection weights.
Capturing Knowledge. ESs: domain expert plus knowledge engineer. NNs: data, though domain expertise is helpful.
Databases. A common feature of both.
Structure. ESs: many suitable, including frames.
Decision Making. ESs: by reasoning, not by "recognizing" the solution.
Changing Data. ESs: need to revise the knowledge base (brittle). NNs: progressive learning (plastic).

Table 17A.1 Comparing ESs and NNs

ESs are feasible only with procedural problems; we need to be able to describe the rules for solving the problem. Sometimes with an ES approach, it would be too expensive to build a knowledge base of rules. One graduate student trained an NN in an evening, and it performed sunspot forecasting comparable to an ES that took a sunspots expert months to build.
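The "thinking in rules" side of the comparison can be illustrated with a toy forward-chaining inference loop; the rules and fact names below are hypothetical examples, not from this book:

```python
# ES knowledge expressed as if-then rules: (set of conditions, conclusion).
RULES = [
    ({"schedule_slip", "on_critical_path"}, "project_late"),
    ({"project_late", "penalty_clause"}, "cost_overrun"),
]

def infer(facts):
    """Forward chaining: fire every rule whose conditions hold,
    adding the conclusion as a new fact, until nothing changes."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts
```

Given the facts schedule_slip, on_critical_path, and penalty_clause, the loop derives project_late and then cost_overrun: the system reasons to the solution rather than "recognizing" it, which is the essential contrast with an NN.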
APPENDIX 17B
FUZZY LOGIC

Probability theory deals with propositions, rules, and inferences that are either true or false. Fuzzy set theory, or fuzzy logic, deals with propositions that have vague meanings. Fuzzy set theory was invented by Lotfi Zadeh, professor at the University of California, Berkeley, in the 1960s to handle vague meanings in logical inference. Some call this approach to uncertainty approximate reasoning.

If one is evaluating the chance that a project will exceed its scheduled completion time, we can express this judgment unambiguously as a probability. Alternatively, the assessment might be that the project will be "substantially delayed." The words "substantially delayed" are somewhat vague, but they still describe the outcome. Fuzzy logic provides a quantitative approach to representing such "soft" declarations.

The basis is the idea of membership. Suppose we polled many people regarding the height that divides the classifications "Tall" and "Not Tall" for United States adult males. The responses might form a cumulative frequency distribution, such as that in Figure 17B.1. The y-axis, Degree of Membership, represents the fraction of people who classified the x-axis value as "Tall." Degree of Membership measures the confidence that the proposition "Tall" is true; in Figure 17B.1, we are representing the degree to which people would consider a height to be "Tall." This has the appearance and same dimension as a cumulative probability function. While the membership function is not equivalent to a probability distribution, they appear similar in ways.

This degree of membership idea leads to its use as a confidence factor (CF). We can characterize the truthfulness quality of an assertion with a number ranging from 0 to 1 (see Figure 17B.1). So, if we have an assertion that a person has the attribute "Tall," that says something "fuzzy" about whether the person is a member of the class of people considered "Tall." This function serves as a way to encode the quality or completeness of information. Many ESs use CFs to represent the reliability of assertions, rules, or outcomes. We can assign CFs to assertions and rules, representing the degree of confidence in that piece of information, and the program assigns CFs to resulting inferences.

CFs may look and sometimes act similar to probabilities. However, they are not probabilities, and we should not use CFs for risking, in my opinion. The 0 to 1 CF range, which has the same bounds as a probability, does not have the same meaning as probability. Only a CF = 1 means 100 percent probability. A CF = 0 means only that the information has no value; it does not mean that the opposite assertion is assuredly false.
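A membership function like that in Figure 17B.1 can be approximated with a simple piecewise-linear ramp. The 168 cm and 193 cm break points below are assumed for illustration only; the book's figure spans roughly 160 to 200 cm:

```python
def membership_tall(height_cm, lo=168.0, hi=193.0):
    """Degree of membership in the fuzzy set "Tall": 0 below `lo`,
    1 above `hi`, rising linearly in between (assumed break points)."""
    if height_cm <= lo:
        return 0.0
    if height_cm >= hi:
        return 1.0
    return (height_cm - lo) / (hi - lo)
```

A height of 160 cm has membership 0, a height of 200 cm has membership 1, and heights in between take fractional values, mirroring the fraction of poll respondents who would call that height "Tall."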
Figure 17B.1 Division Point between "Tall" and "Not Tall" (degree of membership versus height, about 5' to 7', or 160 to 200 cm)

Fuzzy set calculations are simple, though somewhat arbitrary, and they propagate quickly through a model. There is an algebra for fuzzy calculations. The solution is fast and reproducible. In project models, some people have used fuzzy logic assertions instead of crisp, single-value inputs. The fuzzy results propagate through the model in all calculations. The result is membership functions for value, cost, schedule, and so on. Figure 17B.2 shows the outcome result of a simple project model.

Figure 17B.2 Project Outcome Belief Graph (degree of membership versus days to complete, roughly 260 to 400 days)
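One common way to propagate fuzzy inputs, sketched here under assumed triangular membership shapes (an illustration of the general technique, not this book's specific method), is interval arithmetic on alpha-cuts: at each degree of membership, an input is an interval, and intervals combine end to end:

```python
def tri_cut(low, mode, high, alpha):
    """Alpha-cut of a triangular fuzzy number (low, mode, high):
    the interval whose degree of membership is at least `alpha`."""
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def add_cuts(cut_a, cut_b):
    # Fuzzy addition at one alpha level: interval endpoints add.
    return (cut_a[0] + cut_b[0], cut_a[1] + cut_b[1])
```

Adding two hypothetical task durations, say (100, 150, 220) and (160, 180, 210) days, cut by cut builds the membership function of total days to complete, the kind of result shown in Figure 17B.2.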
The main drawback to fuzzy logic is that the calculations do not have the logical rigor of probability theory. One can obtain strange "distribution" shapes from simple and seemingly innocent combinations of variables. Nearly unpredictable results occur with complex formulas and conditional branching (IF statements).

Fuzzy logic's mathematical operations are well suited to control systems and other situations where precision is less important. Control of air conditioners, elevators, and autofocus cameras are common fuzzy logic applications. For some purposes, we can achieve adequate accuracy with perhaps 1 percent of the computational effort that probability calculations require.
APPENDIX A

Decision tree analysis and Monte Carlo simulation are the most popular calculation techniques for evaluations under uncertainty. There are other quantitative techniques useful in dealing with uncertainty in specific situations. Table A.1 lists decision analysis and related techniques, along with their key strengths and weaknesses. Several appear only in this appendix: scenario analysis, approximate integration, and design of experiments and Taguchi methods.

Scenario analysis is a planning technique focusing on plausible alternative futures and management responses. We consider possible events that would impact the organization or project. We next identify and evaluate actions to take advantage of opportunities and to protect against threats. The emphasis is not on a forecast, but on developing insights about what could affect the project. Few practitioners of scenario analysis apply probabilities to the individual scenarios, and that is the reason I omitted this topic from earlier discussion.

Approximate integration holds promise. The method is known by several names: convolution, quadrature, fast calculus integration, or cubic spline integration (splines were flexible strips of wood used by draftsmen). The results are reproducible and fast.

Design of experiments and Taguchi methods provide an efficient approach to sensitivity analysis for many-variable systems. This provides an effective way to prioritize alternatives for improving value or for reducing variance [1].
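The convolution idea behind approximate integration can be sketched for discrete distributions. This is a simplified illustration; practical implementations work with many more points and with rescaling:

```python
from collections import defaultdict

def convolve(pmf_x, pmf_y):
    """Distribution of X + Y for independent discrete X and Y,
    each given as a dict mapping outcome -> probability."""
    total = defaultdict(float)
    for x, px in pmf_x.items():
        for y, py in pmf_y.items():
            total[x + y] += px * py
    return dict(total)
```

Convolving two fair-coin counts, {0: .5, 1: .5} with itself, gives {0: .25, 1: .5, 2: .25}: the exact distribution of the sum, obtained directly rather than by sampling.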
Table A.1 Techniques of Uncertainty (method, key strengths, and key weaknesses)

Decision Tree Analysis. Strengths: graphical layout of the expected value calculation; evaluating alternatives with sequential decisions (e.g., value of information); fast, reproducible solutions. Weaknesses: must convert continuous into discrete distributions; must limit the number of decision alternatives and chance event outcomes.

Influence Diagram. Strengths: similar to decision trees; better represents relationships between variables. Weaknesses: calculations are more difficult.

Monte Carlo Simulation. Strengths: very generally applicable; can accommodate complexity. Weaknesses: solution is approximate and changes with the random number seed; time versus accuracy tradeoff; poor precision with low-probability events.

Moment Methods (parameter method). Strengths: simple, fast, reproducible solutions. Weaknesses: provide only statistics about the shape of the solution distribution; little recognition in practice and literature.

CPM, PERT, and PDM. Strengths: simple. Weaknesses: only one critical path is recognized; a simplistic project network model may be inadequate; with PERT, calculations often ignore important details, such as correlation; limited representation of uncertainty.

Sensitivity Analysis. Strengths: simple if nonprobabilistic; fast.

Multi-Criteria Approaches, Analytic Hierarchy Process. Strengths: simple; probabilities can be used if the criteria hierarchy represents the value function. Weaknesses: risk or uncertainty is merely one of several attributes; does not recognize risk versus value tradeoffs; problems with consistency; seldom quantifies risks and uncertainties.

Fuzzy Logic. Strengths: low-to-medium complexity; fast, repeatable solution. Weaknesses: only approximates probabilistic reasoning; emerging technique.

Other Methods:

Scenario Analysis. Strengths: can accommodate complexity under contingencies. Weaknesses: seldom quantifies risks and uncertainties.

Design of Experiments, Taguchi Methods. Strengths: value optimizing or variance reduction with efficient handling of many decision (controllable) variables, using Low and High for each chance event. Weaknesses: medium complexity.

Approximate Integration. Strengths: fast, reproducible solutions; used in stochastic calculations. Weaknesses: potential developments needed to improve accuracy.
Approximate integration provides suitably accurate results for simple probabilistic models involving addition and multiplication. One obtains the solution directly, in the form of an outcome probability distribution. This helps in contingency planning. The signal processing field has used these calculation techniques for perhaps three decades. There are a few other calculation methods, usually for special situations, but trees and simulation remain the workhorses of decision analysis.

Deterministic or Stochastic?

Is stochastic analysis even needed? If so, what method types are appropriate? This book is about stochastic decision models. However, conventional deterministic models will suffice in some situations, such as when:
The evaluation is not very important.
Risk is low: reasonably assured volumes, costs, and prices.
The evaluation is preliminary.
Doing a base case projection or sensitivity (what-if) analysis.
The decision is made already, and perhaps the projection is only for a budget plan.

People often approach risk and decision analysis to determine the uncertainty in an estimate. They are often surprised when the answer changes due to effects causing stochastic variance (Chapter 13).

People often ask: Which tool is better, decision trees or Monte Carlo simulation? The answer depends upon the problem; each technique is better at solving certain problems. Neither technique is inherently superior. Table A.2 lists decision situation attributes versus tools. Check marks show, with all other things equal, whether decision tree analysis or Monte Carlo simulation would be preferred.

Endnotes
1. See Santell, M. P., J. R. Warner, and J. C. Jung Jr. 1992. Part I, May 1992, pp. 36-40; Part II, July 1992, pp. 34-38; Part III, August 1992, pp. 69-74.
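The complementary character of the two workhorses shows up even on a toy problem; the probabilities and payoffs below are hypothetical. A decision tree computes the expected value exactly, while simulation approximates it, and the simulated answer moves with the random number seed:

```python
import random

# Hypothetical two-outcome venture: 60% chance of +100, else -20.
P_SUCCESS, V_SUCCESS, V_FAILURE = 0.6, 100.0, -20.0

# Decision tree style: exact expected value from one chance node.
ev_tree = P_SUCCESS * V_SUCCESS + (1.0 - P_SUCCESS) * V_FAILURE

# Monte Carlo style: sample many trials and average the outcomes.
random.seed(1)
TRIALS = 100_000
ev_sim = sum(V_SUCCESS if random.random() < P_SUCCESS else V_FAILURE
             for _ in range(TRIALS)) / TRIALS
```

Here ev_tree is exactly 52.0, while ev_sim lands near 52 but varies with the seed and trial count: the "approximate solution" weakness noted in Table A.1. Simulation earns its keep when the model has too many chance events for a tree.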
Table A.2 Stochastic Method Selection Guide

Selection factors favoring decision trees:
Outcome values easily obtained, and few chance and decision nodes (hand solution feasible).
Sequential decisions in the model.
Low-probability events.
CE or EU decision criterion.

Selection factors favoring simulation:
Decision criteria other than value, or only EMV.
More than 5-8 chance events.
Chance events much better represented by continuous outcomes (e.g., a dynamic price model).
Complex, dynamic system.
Complex economic and competitive environments.
Optimizing a continuous decision variable.
Analyzing portfolio strategy.
Output distribution(s) desired.

Correlations can be represented by joint probability tables, by correlation coefficients, or by modeled dependency relationships.
APPENDIX B

In recent years, decision analysis' growing popularity is partly the result of low- to moderate-cost personal computer (PC) software. Software is the great enabler. Coupled with relentless competition and increasing computer power is an increasing awareness of decision analysis.

Most project software is deterministic. Traditional analysis is deterministic, that is, without using probabilities. As project professionals become competent in spreadsheet or project modeling software, such as Microsoft Project, many people who are not professional analysts have been developing sophisticated project models. They might ask, "What's next? How can I improve the quality of these models?" Representing uncertainty is a logical and important step toward improvement.

This section provides brief descriptions and contact information for five programs. Two are for decision tree analysis, two are for Monte Carlo simulation, and another is for inventorying project risks and actions.

First a disclaimer: I try to maintain objectivity about resources that I may be recommending to readers and clients. A foremost duty as a consultant is to be independent. However, I do have friends in these companies. Though I am not on any company's payroll, I assist in marketing PROAct on my Web site, and I've delivered training and consulting in association with two of the other companies. It is hard to be completely neutral, and I am trying to be as objective as possible in this section.

Trademarks: 1-2-3, Microsoft, @RISK, Analytica, OptQuest, and Palisade are federally registered trademarks. DATA, DecisionTools Suite, DPL, PrecisionTree, PROAct, RISK+, RISKOptimizer, TopRank, Lotus, and Visual Basic are also trademarks of the respective companies. The software screen shots are used with permission.
Figure B.1 TopRank Sensitivity Graph (tornado graph of percent change in NPV)

Spreadsheets

The ubiquitous computer spreadsheet is powerful for many types of calculations. Microsoft Excel is the world's most popular modeling platform. Most people developing cash-flow projection models are doing so in spreadsheets. Most of the popular decision analysis tools are able to link to models in or through Excel.

It is easy to develop templates for small- to medium-sized decision trees. The most straightforward are payoff tables. I recommend that you first sketch the decision model structure to be clear about the problem. Then lay out an organization of cells that will represent nodes in the tree model, including expected values. Though unnecessary, it is helpful, visually, to draw the tree diagram on the worksheet. Use the Draw functions (in Excel) to superimpose a network of node symbols and connecting branches. Developing a template provides documentation and ease of recalculation. Of course, if you are only going to solve the tree once, don't bother with the spreadsheet.

Palisade Corporation (vendor information listed later in this appendix) has an interesting tool for sensitivity analysis for spreadsheet models. TopRank prioritizes input variables in terms of their importance in affecting a selected outcome variable. The what-if analysis default is to change each input cell by plus and minus 10 percent. Figure B.1 shows an example tornado diagram from TopRank (graph modified slightly, for clarity, in this book).
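The plus-and-minus 10 percent what-if scan is easy to mimic by hand for a toy model; the model and numbers below are invented for illustration. Swing each input both ways, record the output range, and rank inputs by swing to get the tornado ordering:

```python
# Hypothetical one-period value model: the output variable to examine.
def npv_model(price, volume, cost):
    return price * volume - cost

BASE = {"price": 10.0, "volume": 1000.0, "cost": 4000.0}

def swing(var):
    """Output range when one input is varied -10%/+10% of base."""
    low, high = dict(BASE), dict(BASE)
    low[var] *= 0.9
    high[var] *= 1.1
    return abs(npv_model(**high) - npv_model(**low))

# Largest swing first: the bar order of a tornado diagram.
tornado_order = sorted(BASE, key=swing, reverse=True)
```

In this made-up model, price and volume each swing the result by 2,000 while cost swings it by only 800, so cost plots as the shortest bar.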
Simulation Ready

The Monte Carlo simulation add-ins provide three main capabilities to Microsoft Excel and, in one case, to Lotus 1-2-3:
1. Sampling functions for popular and custom distribution shapes, and correlation.
2. Control of the simulation run process: monitoring convergence, and storing selected output variables for each trial.
3. Analyzing the simulation results: presenting statistics and frequency distribution graphs.

Excel already has at least two sampling functions (#1) and data-analysis capabilities (#3). Persons familiar with Microsoft Visual Basic for Applications can write a procedure to handle the simulation looping (#2).

Monte Carlo Simulation

For spreadsheet models, there are two strong choices:

Crystal Ball by Decisioneering Corporation
1515 Arapahoe St., Ste. 1311
Denver CO 80202
Sales phone: (800) 289-2550
Sales direct: (303) 534-1515
Fax: (303) 534-4818
http://www.decisioneering.com

Crystal Ball Pro includes the OptQuest optimization program, which uses heuristics, tabu search, and neural networks. Fred Glover, my optimization professor at the University of Colorado, is the chief architect of OptQuest. He is world renowned, and therefore his association gave OptQuest instant credibility. Figure B.2 shows an example Crystal Ball screen to define an input variable.

The competing simulation tool is:

@RISK by Palisade Corporation
30 Decker Rd.
Newfield NY 14867
Sales phones: (800) 432-7475 in the United States; otherwise (607) 277-8000
Fax: (607) 277-8001
http://www.palisade.com

Palisade offers @RISK versions for Lotus 1-2-3 and for Excel. Figure B.3 shows an example outcome distribution with @RISK for Excel. @RISK for Microsoft Project allows most types of project plan numeric inputs to be a distribution.
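For readers without either add-in, the three capabilities can be sketched in a few lines of code; the 80/100/140-day triangular activity duration is a made-up example:

```python
import random
import statistics

random.seed(7)

# 1. Sampling: draw activity durations from a triangular distribution
#    (low = 80 days, mode = 100 days, high = 140 days).
# 2. Run control: loop over trials, storing the output variable.
results = [random.triangular(80.0, 140.0, 100.0) for _ in range(10_000)]

# 3. Analysis: summary statistics of the output distribution.
mean_days = statistics.mean(results)
p90_days = sorted(results)[int(0.9 * len(results))]
```

The sample mean should land near the theoretical (80 + 100 + 140) / 3, about 107 days, and the 90th percentile near 124 days; a real add-in adds convergence monitoring, correlation, and graphing on top of this loop.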
 . Ste. you might also consider: Analytica by Lumina Decision Systems P. I believe genetic algorithms will find a good solution to most any optimization problem. Sepulveda.Although not an "intelligent" search strategy.com If you are not a fan of spreadsheet modeling and like the influence diagramming technique.. =  Help Definingan 'Assumption Cell" in Crystal Ball 0 % @RISK was initially developed for Lotus 123. Inc. hence the name "@RISK" that provides additional @ functions for sampling distributions. 111 N. Box 320126 . .O. 333 Manhattan Beach CA 90266 Phone: (310) 7986396 Fax: (310) 7984226 http://www.Appendix %Decision Analysis Sofcware . which uses "@" to preface names for its functions. In addition to @RISKfor Project. here is one other product for adding simulation to Microsoft Project: RISK+ by C/S Solutions.cssolutions. RISKOptimizer is Palisade's genetic algorithm optimization package built around @RISK.
so that PrecisionTree is an especially good candidate for people who will be doing both decision tree and simulation analysis. These are designed to work together. If you are modeling in Excel. .4 is an example treea simple onebuilt with PrecisionTree. Decision Tree Analysis Perhaps the easiesttolearn decision tree program is: PrecisionTree by Palisade Corporation.L W I N A (toll free) and (408) 3541841 Fax: (408) 3549562 http://www. and a the Excel formula functions are available.lumina. Figure B.Appendix &Decision Analysis Sofhvare Los Gatos CA 950320102 Phones: (877) ~ ~ . This program is the only one I know that builds the tree inside an Excel l l of worksheet. PrecisionTree and @RISKare the pillars in Palisade's DecisionTools Suite. though this product features an "intelligent arrays" approach for representing data. and possibly linking both together.com Analpca also links to Excel models. this makes linking easy.
Another good program that I use is:

DATA by TreeAge Software, Inc.
1075 Main St.
Williamstown MA 01267
Phone: (413) 458-0104 or 888-TREEAGE
Fax: (413) 458-0105
http://www.treeage.com

I perceive that DATA is more powerful than PrecisionTree, though more complex to learn and use for simple problems. DATA has been around longer, and this one-product company focuses on excellence in its one tool. DATA has a particularly strong following in the medical industry, where its Markov chain analysis features are widely used in modeling medical treatments. Figure B.5 shows an example sensitivity graph (modified slightly for clarity) for Daily Crane Cost in the Crane Size Decision from Chapter 3.

Yet another decision tree program worth a look is:

DPL by Applied Decision Analysis, Inc.
2710 Sand Hill Rd.
Menlo Park CA 94025
Phone: (650) 854-7101
Fax: (650) 854-6233
http://www.adainc.com
Figure B.5 Sensitivity to Daily Crane Cost Variable (value versus day cost, $thousands/day)

"DPL" stands for Decision Process Language, a scripting language. All of the decision tree programs mentioned in this section are able to first develop the decision model as an influence diagram, and the program interprets this to generate either a decision tree or an influence diagram. With DATA and PrecisionTree, the conversion from influence diagram to decision tree is one way. DPL is the only program where modifying the tree also updates the influence diagram.

Project Risk Management

Most people with a formal process for project risk management are doing it much the same. Chapter 8 embellishes the typical approach and adds emphasis on quantification in ways consistent with a decision analysis approach.

A colleague and friend, Jerry Gilland, worked in the aerospace industry for many years and started doing this by hand. Jerry evolved into using spreadsheets and eventually wrote his system in Visual Basic:

PROAct by Engineering Management Services, Inc.
4650 Shawnee Pl.
Boulder CO 80303-3816
Phone: (303) 494-8009
Fax: (303) 494-3515
PROAct demo: http://www.maxvalue.com/pa.htm
Figure B.6 Comparing Effect of Actions on Project Uncertainty

What caught my eye was the overlay graph of distributions, before and after implementing selected risk mitigation actions, shown in Figure B.6. PROAct is primarily a database for capturing and maintaining an inventory of risks and actions. It uses a simple Monte Carlo simulation calculation for approximating project cost (or schedule) uncertainty.
GLOSSARY

This glossary is organized into three sections: Abbreviations, Symbols, and Terms (in one section, alphabetically). Terms are drawn from statistics and probability, project management, economic analysis, and risk and decision analysis.

Abbreviations

AHP analytic hierarchy process
AND activity network diagram
CCPM critical chain project management
c.d.f. cumulative (probability) density function
CE certainty (or certain) equivalent
CP critical path
CPM critical path method
DA decision analysis
DCF discounted cash flow (see present value)
DROI discounted return on investment
E(x) expectation or expected value of (some parameter)
EMV expected monetary value
EU expected utility
EV expected value
MCDM multicriteria decision making
NCF net cash flow
NPV net present value (same as PV)
OR operations research
OR/MS operations research/management science
PDM precedence diagramming method
p.d.f. probability density function
PERT program evaluation and review technique
PIR postimplementation review
PRM project risk management
PV present value, e.g., PV.10 = PV @ 10%/year
R&DA risk and decision analysis
ROI return on investment
WBS work breakdown structure

Symbols

μ population mean
ρ Pearson correlation coefficient
σ population standard deviation
σ² population variance
σx̄ standard error of the mean
σij covariance between i and j
Σ summation; series expression elements summed from n = 1 to N
i annual interest rate, annual discount rate, or annual inflation rate
P(x) or p(x) probability of event x
P(A) probability of (event) A
P(A + B) probability of A or B (or both)
P(AB) probability of A and B (jointly)
P(A|B) probability of A given B
r risk tolerance coefficient (or risk aversion coefficient, its inverse)
s sample standard deviation
t time, especially in the PV formula
x̄ sample mean
< less-than
> greater-than
≤ less-than or equal-to
≥ greater-than or equal-to
· multiplication, or "and" Boolean operator
+ addition, or "or" Boolean operator
Ā not A
Terms

accuracy Jointly having both low bias and high precision characteristics. Generally refers to the measurement's agreement with the true value.

accuracy ratio Ratio of (actual outcome)/(original estimate), used to gauge the objectivity of forecasts.

action In project risk management, an effort to affect a risk event or uncertainty. With threats, the desire is to reduce the likelihood of the chance event, its impact, or both. With opportunities, the desire is to increase the likelihood, its impact, or both.

activity An element of work as part of a project. An activity takes time and consumes resources.

activity network diagram (AND) Any project activity diagram, typically either a CPM/PERT-type diagram or a precedence diagramming method (PDM) type.

addition theorem or rule Rule for calculating the "or" union of two events: P(A + B) = P(A) + P(B) − P(AB).

analytic hierarchy process (AHP) A system for multicriteria decision making, where the task is to choose the best alternative with respect to value measured by a hierarchy of subobjectives or attributes. Attributes are weighted by making pairwise comparisons. Alternatives are also rated along each attribute by making pairwise comparisons.

appraisal approach View that all three types of decision problems (ranking, valuing, and optimizing) can be solved by calculating a numerical value measure.

artificial intelligence (AI) The broad area of computer science attempting to embody machines with humanlike intelligence. Subsets of AI include expert systems, neural networks, robotics, natural language systems, and vision systems.

as-of date The effective date of an evaluation; the date to which cash flows are discounted.

assessor Expert; the person responsible for providing judgments (assessments) about one or more uncertainty inputs for an analysis.

asset Right to or ownership of something of value, such as property.

asset model A life-cycle model of a project, product, or other asset, such as used in concept feasibility analysis.

association See correlation.

assumption A condition, such as a relationship or a circumstance, that is presumed to hold for the purpose of the evaluation.

average Refers to any of the "central measures," most often to the mean, median, or other central measure. This term is ambiguous without clarification, because some people use this average when referring to the mode.
base case A reference deterministic solution obtained by using the expected values of each continuous input distribution and (usually) the mode of each input discrete distribution. In project management, this is called the base plan or baseline plan.

Bayes' theorem The formula used to revise probabilities based on new information; the concept used in Bayesian analysis.

beta distribution A simple distribution that can assume many shapes within two bounds, depending upon two shape factors. Used in the classic PERT project schedule calculations.

bias A repeated or systematic distortion of a statistic or value, imbalanced about its mean.

binary risk event A two-outcome chance event, usually "success" or "failure."

binomial distribution A discrete distribution representing two-outcome chance events of independent trials.

Boolean algebra Name for the common logic of AND, OR, and NOT operations. After George Boole.

buffer In project management, refers to a planned interval to allow for one or more predecessors to finish before the next activity's planned start.

capital constraint Any limit of funds. Literally, this would be a limit on capital investment (assets that will have a long-term life); in common use, this can be any budgetary restraint.

cash flow (noun) Money entering or leaving the company treasury. Net cash flow equals cash receipts less cash disbursements. Often refers to the cash generated and available to reinvest in the enterprise or to distribute to creditors.

central limit theorem The mean of the sum of a large number of independent distributions is approximately normal, regardless of the shape of the component distributions. This requires that no component distribution contributes significantly to the sum.

central measure In statistics, any of the measures of the "center" of a probability or frequency distribution, such as mean, median, and mode.

certainty equivalent or certain equivalent (CE) The amount, known with certainty, that a decision maker would be just willing to exchange for an uncertain gamble. The difference between CE and EMV is the risk premium or risk penalty; this difference is attributed solely to the decision maker's attitude toward risk.

certainty factor or confidence factor (CF) See confidence factor.

chance event An experiment, process, or measurement for which the outcome is not known beforehand. These events cannot be controlled, as a decision variable can. Represented in decision models with the (synonymous) random variable, stochastic variable, and chance node.

complementary outcomes (events) Two distinct and different outcomes that together represent all the possible outcomes of a chance event.
complement theorem or rule Rule ensuring that the probabilities of complementary outcomes add (normalization requirement): P(A) + P(Ā) = 1.

conditional probability The probability that an event will occur, given that another event has already occurred. Sometimes called contingent probability.

confidence factor (CF) A number, ranging from 0 to 1, that indicates the confidence in an assertion or inference. "0" means of no value, and "1" means with complete confidence. CFs are used in rule-based expert systems.

confidence interval An interval range representing the degree of confidence in the estimate of a statistic. When involving two or more dimensions, the bounded surface is called a confidence envelope.

conservative (risk attitude) See risk aversion.

contingency 1. A possible event that would change the project plan. 2. A reserve amount of project funding (or time) to accommodate risk elements that will likely occur.

contingency plan An action or plan readied for execution in the event that a contingency occurs.

continuous event A chance event having an infinite number of possible outcomes along a continuum.

continuous event simulation A model solving for behavior of a system where time is advanced in small, uniform increments.

correlation Relationship between variables such that changes in one (or more) variable(s) is generally associated with changes in another. Synonymous with association. Correlation is caused by one or more dependency relationships.

correlation coefficient A measure of the linear correlation between two variables; a number ranging from −1 to +1.

crash (an activity) Accelerate a project activity, usually by applying more time and money.

credible analysis One that warrants the confidence of the decision maker as being suited to purpose.

credible model One that is accepted and used by the decision maker.

critical chain project management (CCPM) A philosophy of project management that places a buffer at the end of the project (to protect the client) and buffers at the end of activity chains feeding into the (presumed) critical path.

critical path (CP) The sequence of activities that determines the completion time of a project.

critical path method (CPM) A deterministic analysis of the project activity network diagram (AND) to determine which activities lie along the critical path and to determine overall project completion time.

criticality index The probability that an activity lies on the critical path, ranging from 0 to 1. Obtained from the fraction of times in a Monte Carlo project simulation where a specific activity was critical (i.e., contributed to lengthening the project completion time).
cumulative density function or distribution (c.d.f.): Computed as the integral of the probability density curve. A graph showing the probability that the parameter will exceed (or not exceed) particular values along the x-axis. There are two forms, greater-than and less-than types, and both present equivalent information.

D

decision: A commitment of resources that is revocable only at some cost.
decision analysis (DA): A process for helping decision makers choose wisely under uncertainty. Truly using decision analysis requires (1) capturing judgments about risks and uncertainty as probability distributions; (2) having a single value measure (usually NPV, utility, or an objective function); and (3) putting these together with expected value calculations. The formal discipline is called decision science, a subset of operations research (management science). The subject involves concepts borrowed from probability theory, statistics, psychology, finance, and operations research.
decision criterion: Any parameter useful in making a decision.
decision rule: A procedural rule for decision making, intended to provide the optimal expectation of progress toward the entity's objective. See good decision.
decision tree: A graphical representation of a decision problem and the expected value calculations, consisting of decision, chance, and terminal nodes connected by branches.
delta property: Feature of expected value such that a constant amount added to each outcome increases the EV by that amount.
density distribution (or function): See probability density function.
dependence or dependency: When the outcome of one chance event influences, or is influenced by, the outcome of another chance event. Opposite of independence. Dependent relationships are often represented by formula relationships or with correlation coefficients. Partial or shared dependency is the cause of correlation.
deterministic: Said of a model where all parameters are fixed or "determinate"; a single-point solution. The complementary term is stochastic (see).
discount rate: The "interest" rate used for PV discounting.
discounted cash flow (DCF): The sum of individual yearly cash flow amounts that have been multiplied by the appropriate PV discount factors.
discounted cashflow analysis (DCF analysis): Projecting a future cash flow stream and determining its present value.
discounted return on investment (DROI): Ratio of PV/PV(investment), a popular criterion for ranking investments under a capital constraint.
discrete event or distribution: A chance event that has a finite number of outcomes, e.g., the number of "heads" from flipping ten coins.
discrete event simulation: Monte Carlo simulation where the emphasis is on state changes of objects within the system. cf. continuous event simulation.
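The DCF arithmetic above takes only a few lines; the cash flows and the 10 percent discount rate in this sketch are hypothetical:

```python
def present_value(cash_flows, rate):
    """Discounted cash flow: sum of yearly amounts times PV discount factors.
    cash_flows[0] occurs at time 0 (undiscounted); cash_flows[t] at end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical project: invest 100 now, receive 60 at the end of each of two years
npv = present_value([-100, 60, 60], 0.10)
print(round(npv, 2))  # 60/1.1 + 60/1.21 - 100 = 4.13
```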
discretizing: Converting a continuous probability distribution into an approximating n-level (often, three-outcome) discrete distribution.
distribution: See probability distribution.
distributive property: Feature of expected value such that when every outcome is multiplied by a factor, the EV changes by that factor.
dividend: Portion of corporate profits paid to investors. Usually, the corporate tax has already been paid, and dividends are taxed again as ordinary income to the investors. The original investment capital is recovered when the stock is sold.

E

economic project: One judged to provide a positive value contribution. Most often, the test is whether NPV or EMV is positive.
elicit (judgment): The process of interviewing an expert and drawing out that person's complete judgment about a chance event. Alternately, drawing out a representation of the decision maker's preferences.
EMV decision rule: Decision policy to choose the alternative with the highest EMV. Suited when (1) enough money is available to do all EMV-positive projects, and (2) the company is risk-neutral.
evaluation: General term for any type of analysis used for asset appraisal, project assessment, feasibility study, engineering evaluation, and all other types of analyses related to decisions.
event: The outcome of a chance event. Synonym: outcome. See chance event.
event tree: Tree diagram showing what failures could have caused particular events. Complement of fault tree.
expected monetary value (EMV): Expected value of a measurement expressed in monetary terms. Usually refers to the expected value of the present value of net cash flow, E(PV) or EV PV.
expected utility (EU): Expected value of utility, where utility is typically the risk-attitude-adjusted NPV. Objective value (such as NPV) is transformed with a utility function. Utility may be represented by an objective function in multicriteria decision making.
expected value (EV): Probability-weighted average of all possible outcomes. When the outcomes are measured in monetary units, the term usually used is expected monetary value.
expert system (ES): A computer program (knowledge system) that contains expertise about a specific problem. The knowledge is usually represented by if-then rules. Using the expert system allows a nonexpert to perform the task at a proficiency level comparable to a real expert.
exponential distribution: A distribution typically used for representing the time between arrivals of random events.
exponential utility function: Any of several equations, such as U(x) = r(1 - e^(-x/r)), translating objective value (such as NPV) into utility units.
feasible: The characteristic of a worthy project where the expectation is value-added (or cost-saved).
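A minimal sketch of the expected value calculation and the EMV decision rule; the two alternatives, their probabilities, and the dollar figures are invented for illustration:

```python
def expected_value(outcomes):
    """Probability-weighted average of (probability, value) outcome pairs."""
    assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9  # normalization requirement
    return sum(p * v for p, v in outcomes)

# Hypothetical alternatives, each a discrete NPV distribution ($ millions)
alternatives = {
    "drill":    [(0.3, 20.0), (0.7, -4.0)],   # 30% chance of success
    "farm_out": [(0.3, 6.0), (0.7, 0.0)],
}
emvs = {name: expected_value(dist) for name, dist in alternatives.items()}
best = max(emvs, key=emvs.get)  # EMV decision rule: choose the highest EMV
print(best, round(emvs[best], 2))
```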
feedback: System concept where a portion of the output is fed back to the input, especially negative feedback used for stabilization or control.
float: The amount of time that an activity can be delayed from its earliest start and not delay the project. cf. free float.
forecast: A judged or predicted view of the event sequence or future state of the world; usually calculation or estimation is involved. A forecast is the best projection (see). Prediction may imply a deterministic outcome: prediction, foretell, and prophesy tend to be used when the outcome is specific (deterministic) and when special knowledge (e.g., divine communication or extraordinary ability) is involved. Futurists are careful to avoid the term "prediction" because it implies certainty. What they do, they say, is try to forecast what might happen, not what will happen (Eric Lekus, The Baltimore Sun).
framing: Putting the decision problem into context, relating it to particular goals and objective(s), identifying a stakeholder or other value perspective, and recognizing various constraints. This often involves limiting the scope of the analysis. Framing bias is a psychological effect caused by perspective or context, such as wording a survey question in terms of people dying (or surviving) or in terms of project success (or failure).
free float: The amount of time that an activity can be delayed from its earliest start without delaying any successor activity starts.
frequency distribution: A graph or other characterization of the observed values in a sample data set, commonly graphed as a frequency histogram showing the frequency of occurrence of each value segment. Its shape approximates the true probability density function, and it is called a relative-frequency diagram when the frequencies are expressed as fractions. Simulation trial results are often presented in the form of a frequency histogram (bar graph).
full-cycle economics: Project analysis from the point of inception through asset abandonment.
fuzzy logic: Rules of logic based on the idea that events can be classified by degree of membership. A qualitative assessment, such as "high cost," can be quantified and used in logical calculations.
Gantt chart: A chart with horizontal bars indicating the duration of project activities. The horizontal dimension is time.
Gaussian distribution: See normal distribution.
genetic algorithm: An optimization programming technique that uses an analogy of natural selection, including genes, crossover, and mutation. The system "evolves" to a good solution. No intelligence is involved.
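Float and free float come out of the standard forward and backward passes through the activity network. A sketch with a hypothetical four-activity network (the names and durations are invented):

```python
# Hypothetical network: A precedes B and C; both B and C precede D
durations = {"A": 2, "B": 4, "C": 1, "D": 3}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
order = ["A", "B", "C", "D"]  # topological order

early_finish = {}
for a in order:  # forward pass: earliest finish of each activity
    es = max((early_finish[p] for p in preds[a]), default=0)
    early_finish[a] = es + durations[a]

project_end = early_finish["D"]
late_start = {}
for a in reversed(order):  # backward pass: latest start without delaying the project
    lf = min((late_start[s] for s in order if a in preds[s]), default=project_end)
    late_start[a] = lf - durations[a]

# Float = late start - early start; zero-float activities are on the critical path
floats = {a: late_start[a] - (early_finish[a] - durations[a]) for a in order}
print(project_end, floats)  # 9 {'A': 0, 'B': 0, 'C': 3, 'D': 0}
```

Here A-B-D is the critical path, and C can slip three days without delaying the project.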
goal: A target or milestone, marking progress toward the organization's objective. A goal is usually measurable and has a stated time for achievement.
good decision: One that is logical and consistent with the values of the decision maker (or organization) and all of the data available at the time.

H

heuristic: Rule-of-thumb that provides rough guidance on actions and their consequences; a technique that usually provides a good, although not optimal, solution.
histogram: Graph showing the frequency of observations counted in segments of the value range, usually presented as a vertical bar chart. See frequency distribution.
hurdle rate: Minimum acceptable level of an investment selection criterion. Most often, this is the cost of capital. For example, projects having an IRR less than the hurdle rate are automatically discarded.
impact: In project risk management, the effect that a risk or opportunity will have on cost, schedule, or performance.
independence: The characteristic where one event does not affect the occurrence of another, and vice versa.
indifference probability: That probability of success, in one lottery, at which you are indifferent between the reference and having this lottery. Similar to the certain equivalent concept.
inflation: A rising general level of prices and wages in an economy.
influence diagram: Network diagram of system variables with arcs indicating the direction of "influence" or time-sequenced relationships. Variables and formulas can be quantified and expected values solved by an iterative process as an alternative to decision tree calculations.
interest: Amount paid for the use of funds, e.g., interest earned by savings in a bank account.
internal rate of return (IRR): A discount rate that yields a PV = 0. Synonymous with rate of return (ROR) and discounted cashflow rate of return (DCFROR). Calculating IRR requires an iterative, trial-and-error procedure. There may be multiple roots when the cumulative net cashflow curve changes sign more than once.
intersection: A joint event comprised of two or more separate events; the intersection of two sets, such as represented in a Venn diagram. For example, the event (A and B) is a joint event.
intuition: The believed capacity for guessing accurately; judgments based upon feelings and not logical thinking.
investment: 1. An amount paid to acquire an asset. 2. Ownership in a project, asset, or entity.
joint probability: The probability of two or more events occurring together.
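Because IRR has no closed form, it is found iteratively, as the entry notes. A bisection sketch, assuming a single sign change in NPV over the search bracket (the cash flows are invented):

```python
def npv(rate, cash_flows):
    """NPV with cash_flows[t] occurring at the end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Find a discount rate where NPV = 0 by bisection. Assumes NPV changes
    sign exactly once between lo and hi; as the glossary warns, multiple
    roots are possible when cumulative cash flow changes sign more than once."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(lo, cash_flows) * npv(mid, cash_flows) <= 0:
            hi = mid  # root lies in the lower half
        else:
            lo = mid  # root lies in the upper half
        if hi - lo < tol:
            break
    return (lo + hi) / 2

# Hypothetical: pay 100 now, receive 121 in two years -> IRR = 10%
print(round(irr([-100, 0, 121]), 4))  # 0.1
```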
judgment: A probability distribution assessment performed, at least in part, by a human; subjective probability assessment.

K

kurtosis: A measure of the peakedness of a probability density function. The kurtosis statistic is the fourth moment about the mean divided by σ^4. The benchmark for comparison is the normal distribution, whose kurtosis equals 3.
Latin hypercube sampling (LHS): An improved sampling technique for Monte Carlo simulation that reduces the number of trials required to obtain acceptable convergence. The probability function to be sampled is stratified into equiprobable layers. For successive trials, each layer is sampled without replacement.
law of large numbers: The sample mean can be made as close to μ as we desire by a sufficiently large sample size: as n approaches infinity, x̄ approaches μ.
likelihood: Usually synonymous with probability.
lognormal distribution: A frequently encountered positively skewed, continuous distribution; the log values are normally distributed. It arises when the value is the product of other distributions.
management science (MS): See operations research.
marginal: Incremental difference. Said of cash flows, profit, cost of capital, and so on.
marginal probability: The unconditional or absolute probability, e.g., P(A). These are the probabilities that are in the "margins" of a probability table.
mean: The arithmetic average of equally likely outcomes or a set of observations; the probability-weighted average. Synonymous with expected value when referring to the mean of a probability distribution. Symbols: μ for populations (p.d.f.'s) and x̄ for data.
median: The most central value of a population or sample set; half of the other values lie above, and the other half below. With an even number of equally likely outcomes or observations, the median is the average of the two centermost values, since with sample data having high resolution there are seldom two identical values.
merge bias: The EV start time of a project activity is later than the maximum EV finish time of predecessor activities. This effect occurs at the joining or merging of arcs in a project (activity) network diagram.
mitigate: To eliminate or lessen the risk or effect of, e.g., mitigate the impact of bad weather.
mode: The particular outcome that is most likely; the highest point on a probability density distribution curve. This is usually the best estimator for a chance event. A curve with two localized maxima is called bimodal. With frequency histogram data, the mode is typically chosen as the midpoint of the bin having the most counts.
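The stratified-layer idea behind Latin hypercube sampling can be shown in one dimension for a 0-1 uniform; real simulation tools also pair the strata across multiple input variables, which this sketch omits:

```python
import random

random.seed(1)

def latin_hypercube_uniform(n):
    """LHS of a 0-1 uniform: stratify the range into n equiprobable layers,
    sample once within each layer (so no layer is sampled twice), then
    shuffle the trial order."""
    samples = [(k + random.random()) / n for k in range(n)]
    random.shuffle(samples)
    return samples

lhs = latin_hypercube_uniform(10)
# Exactly one sample falls in each tenth of the range:
print(sorted(int(10 * u) for u in lhs))  # [0, 1, 2, ..., 9]
```

Plain Monte Carlo sampling could put several of the ten samples in the same layer; the stratification is what speeds convergence.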
model: A simplified representation of a system of interest. Models for decision purposes usually represent a business enterprise, project, or transaction and consist of variables and mathematical formulas.
moment: First moment: the product of a point value times the distance to a reference axis. The second and higher moments have the distance squared, cubed, and so on. The mean is the first moment of x about the origin; the variance is the second moment of x about the mean.
monetary equivalent: The result of translating different facets of value into a single measure: money.
Monte Carlo simulation: See simulation, Monte Carlo.
multicriteria decision making (MCDM): Where an objective function is comprised of several criteria. Typically, the criteria are chosen to measure goodness along the dimension of each objective.
multiplication theorem or rule: Rule for calculating the "and" intersection of two events: P(AB) = P(A) * P(B|A).
multitasking: When a resource is concurrently performing multiple tasks. Considered less efficient than tasking sequentially.
mutually exclusive decision alternatives: A situation where no alternative contains another alternative. Mutually exclusive projects means only one project can be done.
mutually exclusive outcomes: A situation where outcomes are defined such that no one outcome contains values of another outcome.
nature's tree: Refers to the requirement that decision trees be constructed to recognize the natural sequence of events.
net cash flow (NCF): Cash flow from operations, net of capital expenditures, overhead, and taxes. Usually, the "net" adjective is in reference to the principle that the cash flows should be net of the investment.
net present value (NPV): Same as present value (PV, see) or present worth.
neural network (NN): A computer circuit that attempts to replicate the operation of the human brain. The circuit consists of a simple arrangement of nodes, called neurons, and their interconnections. Neural networks can be viewed as massive, multivariable, nonlinear regression. Successful applications include pattern matching, noise filtering, and vision recognition.
node: Principally two types: decision point (decision node) or chance event (chance node), usually with reference to a decision tree or influence diagram. Influence diagrams additionally have objective variable (i.e., value measure) nodes and others. End nodes are terminal nodes.
normal distribution: The frequently encountered, bell-shaped distribution. Also called Gaussian distribution.
normalization requirement: The requirement that the sum of the probabilities of all possible outcomes equals one. Normalizing is a fundamental means of revising probabilities.
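The multiplication rule and the normalization requirement in a tiny numeric sketch; the probabilities are invented:

```python
# Hypothetical two-stage venture: P(A) = first test succeeds;
# P(B | A) = follow-up succeeds given the first did
p_a = 0.6
p_b_given_a = 0.5

# Multiplication rule: P(AB) = P(A) * P(B|A)
p_a_and_b = p_a * p_b_given_a
print(p_a_and_b)  # 0.3

# Normalization requirement: the probabilities of all terminal
# outcomes of the probability tree sum to one
p_outcomes = [p_a * p_b_given_a, p_a * (1 - p_b_given_a), 1 - p_a]
assert abs(sum(p_outcomes) - 1.0) < 1e-12
```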
O

objective (noun): The purpose of an organization. Popularly, less precisely used to mean a goal.
objective analysis: One that is free from bias, i.e., not affected by personal beliefs, requiring bias-free assessment inputs, an objective value measure, and calculation integrity.
objective function: A mathematical function to be optimized in decision making. In multicriteria decision making (MCDM), the objective function combines selected criteria into a composite value measure.
objective or neutral risk attitude: Describing a person, organization, or policy that is unwilling to pay (or require) a premium for uncertainty. With an objective risk attitude, the certainty equivalent is exactly the expected monetary value.
objective probability: A probability assessment that is determined from complete knowledge of the system or of the parent population.
objective value measure: A statistic that corresponds directly and logically to the objective(s) of the decision maker. This is typically NPV, which is converted to expected utility for the risk-averse decision maker.
operations research (OR): The science of applying systems analysis, statistics, and mathematics to problems of organizations. Synonymous with management science and often abbreviated together: OR/MS.
opportunity: In project management, the potential for something to be better than expected; often used as an antonym of threat. Threats and opportunities, together, comprise risks.
optimal: Adjective meaning the variable is chosen to achieve the optimum value.
option: 1. An owned right to sell or buy at some future time. 2. Alternative.
outcome: A particular result of a chance event.

P

payoff table: A two-dimensional matrix used to calculate expected values, representing decision alternatives in one dimension and chance event outcomes in the other dimension.
population: The set of possible outcomes for a chance event; the entire set of values representing the event of interest. Synonymous with universe.
portfolio: A company or individual's holdings of projects, investments, or opportunities.
postimplementation review (PIR) or post audit: Analysis of a project after the general outcome has been determined, typically based upon full-cycle economics. Also called post analysis, post evaluation, and lookback analysis.
precedence diagramming method (PDM): Building a project model of activities linked to show precedence constraint relationships (at least four types, e.g., Start-after-Start).
precision: Ability to systematically repeat a measurement or obtain an evaluation result, without regard to the true value.
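A payoff table reduces to one expected value per row. The plant-sizing alternatives, demand states, probabilities, and NPVs below are hypothetical:

```python
# Columns = chance outcomes (low, base, high demand); probabilities sum to 1
probs = [0.25, 0.50, 0.25]

# Rows = decision alternatives; entries are hypothetical NPVs ($ thousands)
payoffs = {
    "small plant": [60, 80, 90],
    "large plant": [-20, 70, 150],
}

evs = {alt: sum(p * v for p, v in zip(probs, row)) for alt, row in payoffs.items()}
print(evs)  # small plant: 77.5, large plant: 67.5
```

Under the EMV decision rule the small plant wins here, even though the large plant has the best single outcome.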
prediction: See synonyms at forecast.
preference: A decision maker's attitude about a particular aspect of the decision process. Decision analysts classify preferences into three categories: objectives, time value of money, and risk attitude. These preferences are composites of the decision maker's beliefs and values.
present value (PV): The sum of discounted cash flow values. The discount rate represents policy or attitude toward time preference of money, as is often the case in asset valuation.
probability P(x): The likelihood of an event occurring, expressed as a number from 0 to 1 (or equivalent percentages). Synonyms: chance, likelihood, odds. cf. objective probability and subjective probability.
probability density function (p.d.f.): A mathematical or graphical representation of the likelihood of different outcomes from a chance event. The integral of a p.d.f. equals one. Common examples include the binomial, normal, and triangle distributions. Also called probability distribution and probability function.
probability distribution: See probability density function.
probability mass function (p.m.f.): "Mass" instead of "density" is often used when referring to a discrete distribution. The sum of the probabilities of all possible outcomes equals one.
probability tree: A diagram of chance events, annotated with outcome probabilities, useful in calculating expected values and joint probabilities; in effect, a decision tree without any decision nodes.
program evaluation and review technique (PERT): An analysis of a project network diagram to (1) determine which activities are on the deterministic critical path, and (2) apply activity completion time distributions to determine a distribution for the overall project completion time. Flaw: neglects merge bias.
project network diagram (PND): Any schematic showing the sequence of project activities. Also called activity network diagram, or precedence network diagram because of the emphasis on precedence relationships between activities.
project risk management (PRM): "Risk management is the systematic process of identifying, analyzing, and responding to project risk. It includes maximizing the probability and consequences of positive events and minimizing the probability and consequences of events adverse to project objectives. It includes the processes of risk management planning, risk identification, qualitative risk analysis, quantitative risk analysis, risk response planning, and risk monitoring and control" (Project Management Institute 2000, 206).
projection: A calculated future; a view of the sequence of events or future state of the world under an assumed set of assumptions, usually involving production volumes, cash flow, and so on across time periods. Synonyms: scenario, case, forecast.
random error: A distortion in a statistic. Usually this is "noise" without bias.
random number: A number obtained from sampling (usually) a 0-1 uniform distribution and used for sampling in Monte Carlo simulation. A table of random digits serves the same purpose by placing a decimal point in front of the integers.
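The merge bias that PERT neglects is easy to demonstrate numerically: the expected value of the later of two merging finish times exceeds the larger of their individual expected values. A sketch with hypothetical uniform finish-time distributions:

```python
import random

random.seed(7)

trials = 50_000
# Two hypothetical parallel predecessor chains, each finishing at EV = 10 days
max_finish = []
for _ in range(trials):
    f1 = random.uniform(8, 12)
    f2 = random.uniform(8, 12)
    max_finish.append(max(f1, f2))  # the successor starts when BOTH are done

ev_of_max = sum(max_finish) / trials
print(round(ev_of_max, 1))  # about 10.7, later than max(EV1, EV2) = 10
```

A deterministic PERT pass would schedule the successor at day 10; the simulation shows the expected merge point is roughly two-thirds of a day later.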
random variable: A symbol or measure of a chance event. Also called a stochastic variable.
rank correlation: The statistic used in popular Monte Carlo simulation software to express correlation. It is the Pearson correlation coefficient computed on the value ranks rather than the actual values (the Spearman rank correlation).
rate of return (ROR): Amount earned per unit of investment per time period, expressed as an annual percentage rate. Often used as if synonymous with internal rate of return. Also common is computing the return each year, e.g., total return in one year divided by starting investment, then averaging.
regression: Determining coefficients of an equation that best fits sample data. The coefficients are usually chosen so as to minimize the square of the errors (least-squares fit). The names of regression analyses refer to the form of the dependent variable equation: linear, multiple linear, and polynomial (curve fitting).
reliability: Probability that a system or component will perform its intended function over a specified period of time.
reserve: In project management, an amount of money or time provided to mitigate risk.
return on investment (ROI): Net benefits (cash flow before deducting investment) divided by the initial investment: ROI = net profit / investment. Often, the numerator and, possibly, the denominator are averaged over the life of the asset.
risk: The quality of a system that relates to the possibility of different outcomes; there are unknowns about conditions of nature and about how systems operate. Risk is approximately synonymous with uncertainty. Informally, "risk" is used when there is a large, usually unfavorable potential outcome (typically a discrete event), e.g., risk of failure, and "uncertainty" is used when there is a range of possible outcomes, e.g., price uncertainty. I prefer to use "risk" when dramatically different outcomes are possible, such as when high outcome / low outcome > 2, and "uncertainty" when the outcomes are merely variable. Because there is no good antonym for risk, I'm okay with allowing risk to be the potential for either undesirable or desirable outcomes. In project management, risk events are often classified as "threats" or "opportunities." See uncertainty.
risk analysis: The process of assessing a probability or the shape of a probability distribution. The term is sometimes used in place of decision analysis; some people combine the two into the inclusive R&DA, risk and decision analysis.
risk attitude: Preference about risk. One of: risk averse (risk avoiding), risk neutral (neutral about risk, i.e., an EMV decision maker), or risk seeking.
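Rank correlation in a few lines: apply Pearson's formula to the ranks. A monotonic but nonlinear relationship then still scores a perfect 1.0 (the data points are invented; ties are ignored in this sketch):

```python
def ranks(xs):
    """Rank values 1..n in ascending order (ties not handled in this sketch)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def pearson(xs, ys):
    """Pearson linear correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def rank_correlation(xs, ys):
    """Spearman rank correlation: Pearson's formula applied to ranks."""
    return pearson(ranks(xs), ranks(ys))

xs = [1, 2, 3, 4, 5]
print(rank_correlation(xs, [x ** 3 for x in xs]))  # 1.0 despite the nonlinearity
```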
risk aversion: Dislike of risk. See conservative risk attitude. The risk aversion coefficient is the reciprocal of the risk tolerance coefficient (both used in utility functions).
risk penalty: See risk premium.
risk policy: Approved method for making trade-offs between value and risk, often expressed as a utility function.
risk premium: The amount of expected monetary value forsaken by using a value derived from risk policy: Risk Premium = EMV - Certainty Equivalent. This difference is attributed solely to the decision maker's attitude toward risk. Risk premium is always nonnegative. Sometimes called risk penalty.
risk tolerance coefficient: The parameter in an exponential utility function that represents the decision maker's risk attitude. This is typically one-fifth or one-sixth of the person's or entity's net worth. The reciprocal of the risk tolerance coefficient is called the risk aversion coefficient.
robust model: A model whose formulas and structure are valid for all possible combinations of input variables.
robust solution: A solution obtained from a decision model that is insensitive to moderate changes in the value function and probability assessments.

S

sampling (with experimental data): Obtaining representative examples from a parent population or from experiment.
sampling (for Monte Carlo simulation): Obtaining a trial value of a probability distribution. With conventional Monte Carlo sampling, a random number between 0 and 1 is used to enter the y-axis of a cumulative probability distribution; the corresponding x-axis value is extracted and becomes that variable's value in the simulation trial. See Latin hypercube sampling.
sampling with and without replacement: When sampling from a finite population, sampling may or may not restore the original population set for the next sampling event. When the sample is removed and not replaced in the population, this is sampling without replacement, and each sample alters the event probabilities for future sampling.
scenario: A possible sequence of events and a future state of the world; a projection.
scenario analysis: A planning technique that examines responses to plausible alternative futures.
scope: The problem definition that specifies what is and what is not included in the project deliverables.
sensitivity analysis: An analysis to determine how variations in input values affect the outcome value, usually done by examining the effect of changing one variable at a time from the base case assumptions. Important to the evaluation team in understanding which variables are most important. See sensitivity chart.
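Conventional Monte Carlo sampling, as described above, inverts the cumulative distribution: enter the y-axis with a 0-1 random number and read off the x-axis value. For the exponential distribution the inverse has a closed form, so the sketch stays short (the mean of 5.0 is arbitrary):

```python
import math
import random

random.seed(3)

def sample_exponential(mean):
    """Invert the exponential c.d.f. F(x) = 1 - exp(-x/mean):
    given u = F(x) from a 0-1 uniform, x = -mean * ln(1 - u)."""
    u = random.random()
    return -mean * math.log(1 - u)

samples = [sample_exponential(5.0) for _ in range(100_000)]
print(round(sum(samples) / len(samples), 1))  # sample mean lands near 5.0
```

The same inverse-transform idea works for any distribution; when no closed-form inverse exists, simulation software inverts a tabulated cumulative curve numerically.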
sensitivity chart: Chart showing prioritized importance of input variables based upon the correlations calculated between input variables and the outcome value measure. Variables can be ranked by correlation coefficient, rank correlation coefficient, or estimated contribution to total variance.
simulation, Monte Carlo: A process for modeling the behavior of a stochastic (probabilistic) system: representing artificially the essential features of a system in a model, which is used to anticipate the behavior and performance of the system under assumed conditions. A sampling technique is used to obtain trial values for key uncertain model input variables. By repeating the process for many trials, a frequency distribution is built up, which approximates the true probability distribution for the system's output. This random sampling process, averaged over many trials, is effectively the same as integrating what is usually a very difficult or impossible equation.
skewness: A character of distributions where deviations about the mean are asymmetric. Positively skewed distributions appear to lean to the left and have a longer tail in the positive direction. A symmetric distribution has a skewness of zero. The skewness statistic is the third moment about the mean divided by σ^3.
spider diagram: A graph of the sensitivity of value to several input values.
standard deviation (SD, σ, or s): Square root of the variance. The standard deviation is more meaningful than the variance because it has the same units as the distribution measured.
standard error of the mean (SEM or σx̄): A measure of the uncertainty of x̄ in representing the true population mean: SEM = s / √n. SEM is useful in determining whether a sufficient number of trials has been run in a Monte Carlo simulation. It overstates the error when Latin hypercube sampling is used.
statistic: A number that describes some attribute of a population or event space; a statistic provides information about a distribution. The most common statistics are mean, mode, median, standard deviation, and variance. To distinguish between statistics from populations and statistics from samples, different letters are used:

             Mean   Standard deviation
Population    μ            σ
Sample        x̄            s

stochastic (pronounced stow-KAS-tic): An adjective meaning probabilistic, statistical, chaotic, or random. The term is complementary to deterministic (see). (See stochastic dominance and stochastic variable.)
stochastic dominance: A situation where one alternative is better than all others at all levels of cumulative probability. This is observed with cumulative probability curves when the curves do not overlap. When (1) the curves do cross and (2) the best EV alternative is more risky, then risk attitude is important to the decision.
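A stopping rule built on the standard error of the mean, sketched for an arbitrary 0-100 uniform input and an invented precision target:

```python
import random

random.seed(11)

def run_until_converged(sample, target_sem, batch=1_000, max_trials=200_000):
    """Stopping rule: keep simulating in batches until the standard error
    of the mean (s / sqrt(n)) falls below the target precision."""
    values = []
    while len(values) < max_trials:
        values.extend(sample() for _ in range(batch))
        n = len(values)
        mean = sum(values) / n
        s = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
        if s / n ** 0.5 < target_sem:
            break
    return mean, len(values)

mean, n = run_until_converged(lambda: random.uniform(0, 100), target_sem=0.5)
print(n, round(mean))  # stops after a few thousand trials, mean near 50
```

Note the glossary's caveat: with Latin hypercube sampling this statistic overstates the error, so the rule would stop later than necessary.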
stochastic variable: Same as random variable.
stochastic variance: The difference between (1) the value obtained in a discrete model using expected value inputs (i.e., the base case), and (2) the value obtained with a stochastic model that carries probability distributions through the calculations.
stopping rule: A test for halting a Monte Carlo simulation run after achieving sufficient convergence. This is often based upon the standard error of the mean statistic.
subjective probability: Probability assessment or judgment that is, at least in part, based on opinion, hunches, feelings, and/or intuition. A subjective probability is someone's belief about whether an outcome will occur.
substitution principle: Allows a chance node, in a decision or probability tree, to be equivalently replaced by its expected value.
sunk costs: Money that has already been spent. Prior costs (and benefits) should not be considered in making present and future decisions, except as to possible future tax or contract effects.
SWOT analysis: Abbreviation for a planning and brainstorming process, named after its elements, identifying: Strengths, Weaknesses, Opportunities, and Threats.
time value of money: The concept that a dollar today is worth more than a dollar tomorrow. The relationship is reflected in the time preference of the decision maker. The present value concept and formula are generally accepted to represent time value in the decision model's value function.
tornado diagram: A sensitivity graph showing the output value range produced by the range of each key uncertain input variable, arranged in decreasing magnitude of effect.
triangle distribution: A continuous distribution uniquely specified by its range (low and high) and its mode. This is the most popular distribution in Monte Carlo simulation because of its simplicity and ease of sampling.
uncertainty: Often used synonymously with risk (see). An informal distinction is that uncertainty is used when the outcome is merely variable, such as a future product price. Some project management professionals classify uncertainty (i.e., a surprise or contingency) as the most general term, divided into events with good outcomes, called opportunities, and those with bad outcomes, called risks or threats; an event having both good and bad outcomes is thus classified as both a risk and an opportunity. (I am not among them.) I prefer and recommend the taxonomy described at risk.
uniform distribution: A continuous distribution with constant probability density between two bounds.
union: Composite event consisting of two or more events, e.g., (A or B), which includes the possibility of both A and B. Equivalent to the logical "or" operation.
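Stochastic variance is why base-case models can mislead: carrying the distribution through a nonlinear value function gives a different answer than plugging in the expected value. A sketch with a hypothetical capacity-capped payoff (all numbers invented):

```python
import random

random.seed(5)

def payoff(demand, capacity=100):
    """Hypothetical asymmetric value function: sales are capped at capacity."""
    return min(demand, capacity)

# Deterministic base case: plug in the expected demand of 100
base_case = payoff(100)  # = 100

# Stochastic model: carry the demand distribution through the calculation
trials = 100_000
ev = sum(payoff(random.uniform(50, 150)) for _ in range(trials)) / trials
print(base_case, round(ev))  # the stochastic EV is lower, about 87.5
```

The gap between the two answers (here roughly 12 percent of the base case) is the stochastic variance defined above: upside demand is capped while downside demand is not.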
unknown  1. An event or variable not identified in the evaluation, perhaps because it was never thought of. 2. An event or variable not evaluated.

util (or utile)  Arbitrary unit of measurement expressing value on a utility scale.

utility  A scale of value reflecting a preference of the decision maker. Utility value is measured in arbitrary units called utils or utiles. See preference.

utility function  A graphic or mathematical function relating the value of various outcomes to the intrinsic value to a particular decision maker. Represents value versus an objective measure, such as money: the x-axis (the utility function's argument) is calibrated in some directly measurable units, such as dollars; the y-axis origin and scale are arbitrary. Also called utility curve and risk preference curve. A popular (and recommended) utility function is the exponential utility function: U(x) = r(1 - e^(-x/r)), where x is the outcome (usually in $NPV) and r is the risk tolerance coefficient. This function is sufficient to completely and succinctly express a company's risk policy.

value  1. Monetary or material worth. 2. Worth in usefulness or importance to the possessor. 3. Utility (see).

value engineering  Systematically designing a project or product, constantly attending to life-cycle value improvement opportunities while still meeting the functional criteria. Value analysis is value engineering applied to something already designed or built.

value function  See objective function.

value of control  Value of being able to influence the outcomes of chance events.

value of information  Value of being better able to assess the outcomes of chance events.

values  Deeply held beliefs about what is right and wrong.

variable  A symbol in a model that has a value or can be evaluated. Synonyms: parameter, stochastic variable.

variance (σ² statistic)  1. The expected value of the sum of the squared deviations from the mean. A popular measure of uncertainty; variance is the standard deviation squared (see). 2. Difference between the forecast and actual outcome, normally calculated so that an unfavorable variance is negative.

variance analysis  A post-evaluation for the purpose of reconciling (explaining) the difference between the forecast and actual outcome, usually detailed by components. Variance in this context is the difference between forecast and actual.

Venn diagram  A diagram showing relationships between all possible outcomes of chance events.
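The exponential utility function U(x) = r(1 - e^(-x/r)) can be illustrated with a worked sketch. This is not code from the book, and the venture figures are hypothetical: each outcome is converted to utility, probability-weighted into an expected utility, and the inverse of U then gives the certainty equivalent, which falls below the expected monetary value for a risk-averse policy.

```python
import math

def exponential_utility(x, r):
    """U(x) = r * (1 - exp(-x/r)); x is the outcome (e.g., $NPV),
    r is the risk tolerance coefficient in the same units as x."""
    return r * (1.0 - math.exp(-x / r))

def certainty_equivalent(eu, r):
    """Invert U to find the certain amount with the same utility."""
    return -r * math.log(1.0 - eu / r)

# Hypothetical 50/50 venture: +$200k or -$100k; risk tolerance r = $250k
r = 250_000
eu = (0.5 * exponential_utility(200_000, r)
      + 0.5 * exponential_utility(-100_000, r))   # expected utility
ev = 0.5 * 200_000 + 0.5 * (-100_000)             # expected monetary value
ce = certainty_equivalent(eu, r)                   # CE < EV when risk averse
```

The gap between EV and CE is the risk premium the decision maker implicitly pays for avoiding the gamble.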
work breakdown structure (WBS)  An outline of a project that details, in hierarchical levels, the work to be performed.

yield  The amount earned during one year on an investment divided by the investment value at year start. Often used as synonymous with return on investment. For multiyear situations, internal rate of return (IRR) is the most common yield measure, although there is much confusion and debate about its usefulness.
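The multiyear yield measure in the entry above can be sketched numerically. This is not code from the book, and the cash-flow figures are hypothetical: internal rate of return is the discount rate at which net present value equals zero, and a simple bisection search finds it when NPV crosses zero once.

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[t] occurs at end of year t."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return: the rate where NPV = 0, by bisection.
    Assumes NPV changes sign exactly once on [lo, hi]."""
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if npv(lo, cash_flows) * npv(mid, cash_flows) <= 0:
            hi = mid          # root lies in the lower half
        else:
            lo = mid          # root lies in the upper half
        if hi - lo < tol:
            break
    return (lo + hi) / 2.0

# Hypothetical: invest 100 now, receive 60 at the end of years 1 and 2
rate = irr([-100, 60, 60])    # about 13.1 percent
```

Bisection avoids the convergence troubles Newton-style IRR solvers can have, at the cost of assuming a single sign change in NPV over the search interval.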
BIBLIOGRAPHY

Capen, E. C. 1976. The Difficulty of Assessing Uncertainty. Journal of Petroleum Technology (August): 843-50.

Center for Business Practices. Project Management Best Practices Report (January).

Collins, James C., and Jerry I. Porras. 1994. Built to Last: Successful Habits of Visionary Companies. HarperBusiness.

Elton, Jeffrey, and Justin Roe. 1998. Bringing Discipline to Project Management. Harvard Business Review (March-April): 153-59.

Goldratt, Eliyahu. 1984. The Goal: A Process for Ongoing Improvement. Great Barrington, MA: North River Press.

Goldratt, Eliyahu. 1997. Critical Chain. Great Barrington, MA: North River Press.

Keeney, Ralph L. 1992. Value-Focused Thinking: A Path to Creative Decisionmaking. Cambridge, MA: Harvard UP.

Killing the Project in Time. Project Management Journal (December): 28-33.

Kotkin, Joel. The Future Is Here! But Is It Shocking? (Interview with Alvin and Heidi Toffler.) Inc. (December).

Leach, Larry P. 1999. Critical Chain Project Management Improves Project Performance. Project Management Journal (June): 39-51.

McKim, Robert A. Neural Network Applications for Project Management: Three Case Studies. Project Management Journal.

Moravec, Hans. 1999. Rise of the Robots. Scientific American (December): 125-35.

Moravec, Hans. 1999. Robot: Mere Machine to Transcendent Mind. Oxford UP.

Newbold, Robert C. 1998. Project Management in the Fast Lane: Applying the Theory of Constraints. St. Lucie Press.

Pitagorsky, George. 2000. Lessons Learned Through Process Thinking and Review. PM Network (March): 35-39.

Project Management Institute. 2000. A Guide to the Project Management Body of Knowledge (PMBOK® Guide), 2000 Edition. Newtown Square, PA: Project Management Institute.

Saaty, Thomas L. 1994. How to Make a Decision: The Analytic Hierarchy Process. Interfaces 24 (November-December): 19-43.

Santell, M. P., J. R. Jung Jr., and J. C. Warner. 1992. Optimization in Project Coordination Scheduling Through Application of Taguchi Methods. Project Management Journal 23 (September): 5-15.

Schuyler, John R. 1995. Rational Is Practical: Better Evaluations through the Logic of Shareholder Value. Proceedings, 1995 Society of Petroleum Engineers' Hydrocarbon Economics and Evaluation Symposium, Dallas, March 27-28. Paper No. 030066.

Zehedi, Fatemeh. 1986. The Analytic Hierarchy Process: A Survey of the Method and Its Applications. Interfaces 16 (July-August): 96-108.
245 10. 9156.232 0 standard deviation. 48. chance 12122. 233 actions 1 1 . 197. 19. 113. 148.178.247 discrete variable 101. 139.170. 51.243 perception 157 bid amount 4. 23334 certainty equivalent (CE) 45. 163. 23435 p mean. 232. 233 life cycle 100 model(s) 98103. See cumulative density function (c.f. 84.4749.3132. 234 behavioral problem 185 benefivcost ratio 33. 213. 130. 29. project cashflow central limit theorem 19. 233 asset 4. 159. 1 C CCPM See criiical chain project management (CCPM) c. 137. 234 Boolean algebra 202. 10203. analysis. 117.84. 79.231. 225. 30.f.41. 174. 17879. 10203.166.238. 169. 41. 87. 11112. 21617. 96. 231. method (CPM) calculation(s) 4. 21. 14. 81. 55. 156. 221 capital constraint 135.230. 170.186. analysis analytic hierarchy process (AHP) 51. 194. 23334. Greek letter rho 137.240 See also central measure and expected value (EV) probabilityweighted 9.) CE See certainty equivalent (CE) CF See certainty factor (CF) CP See critical path (CP) CPM See critical path. 5960. 112. 14. decision analysis (DA). 116. 233 appraisingvalue See value. 233. 66. 15354. 153.244 See also discounted cashflow and model(ing). 17375. 99.237. Don 23 assessor(s) 151. 234. 14647. analysis.d.137.235.243 activity start date 193. 197. 199.17374. scenario. 246 A AHP See analytic hierarchy process (AHP) Al See artificial intelligence (Al) AND See activity network diagram (AND) accuracy 5.19697. 157.INDEX @RISK 246 95. 125.237.23334. 143.242. 33. 199. 166. 146. 56. 14. 10. 207. 236 cash flow 26. 235 See also correlation 1 1 . 186.134. 223. 57. 65.219. 213 See also competitive bidding binary distribution(s) 141 1 . 75.209.196. 32.233 value 37. 10506. 88. 197. 237. 17C71.116. assumption 6. 240 . 197 addition rule 20304 See also mutually exclusive. 128.242 merge 17173.233. 233 Aspromonte. 34.245.183. 96. 62.244 lifecycle 9899. 114. 63.239.234. 53. 137. 17. 186. 101. 90. 18. 116.245 certainty factor (CF) 210.224. 60. 180. average 5. 83. 141. p correlation coefficient.236. 7. 105. 
234 in PERT 148 bias(es) 5.247 methods 36.164.223. 163. 234. 161. 14243. 128.185. 15659. Greek letter sigma 146.23839. pointforward analysis. 171. 59. lookback analysis. 1 1 1 beta distribution(s) 91. 234 binomial distribution(s) 94. 234 chance event(s) See event(s). 27. 69. 107. 72.234 Bayes' theorem 71. 195 association 20.169. 170 See also projection. 102.117.24041. 117. 13.d. 37.231. base case base plan 101.173. 102. appraising artificial intelligence (Al) 134. and actions inventory activity network diagram (AND) 101.240. 35.243. 156. Greek letter mu 9.186. 173. 234. addition rule analysis See cost/benefiVrisk analysis. 96. 125. credible analysis. 118. sensitivity analysis. 21.233. 166. 170. 234 central measure($ 10.12728.22527 B base case 14. 12. 131. risks(s) analysis. 42. 164. 233. 56.128.159. 51. 231.81. 37. 53. postanalysis. 116. 130. 107.239 See also database of risks and actions and risks. 3738. and variance. 217. 192.139. 204 logic operations 202 buffer($ 18485. 135. S W O T analysis.223. 234. 152 binary event 1 binary risk event 142. 174. decision tree($. 125. 16748. 233 See also expert system(s) artificial neural network See neural network (NN) asof date 32. 17. 232. 171. 55. 97.
251 See also theory of constraints contingency. 38. 223.86.231. 8788.157. 40. 8 1 4 . 78.239. 96. 17678. 233 decision rule 3536. 130 three pillars of 53 with good data 167 with Monte Carlo simulation 185 See also risk. 224.12627. and performance 3.133. 9698. 4144. 84. 228 crash(ing) 124. 78. 14.237. 170.242 singlevalue 12. 99100. 6&69. 38. sunk See sunk costs covariance 20. 49. 5152. 34. 6.216 decision model 2528. 135. and decision analysis (RDA) decision criterion 34. 232 Crane Size decision 38. 3536. 185.199. 80.60. 15 conservative risk attitude See risk($. schedule. 47. 67.127. simple model 173 See also theory of constraints continuous distribution(s) 83. 2932. 111. 149 matrix 166 rank 13738. 236 decision tree($ 13.190. 219.14041. 243 decision point($ 27.172. 96. 7778.225. 107. 22829 DROI See discounted return on investment (DROI) database of risks and actions See risk($.231. 110.154.236. 14.159. 27. 4243. 6667 cost estimates See project cost esimates costplus contract 4243. 225 cumulative density function (c. 67. 51. 154. 229.236 coefficient 2021. 129.241. 79. 169. 66.174. 236 cumulative distribution(s) 10.23437. 14244.111.f. 235.140. 199.197. 29.4748. 50. 122. 156. 45. 178. 175. 15. 16970. 217. 143 competitive bidding 83 See also bid amount complement rule 203 conditional branching 106.151. 137. 215. 18. 71. 140 decision making 3 4 .239. 235 confidence level($ 10. 224. 18384.186. database of debottlenecking 184 decision analysis (DA) 34.38. 146. 18689. 40. 93. 83 . 140. 6. 8789. 111.245.241 decision policy 4. 55. 15556. 196. 90. 152. 235 method (CPM) 123. 151. 107. 6. 129. 110. 36. 66. 211. 57.117. 73. conditional confidence factor 210. 191.236 and CCPM 18687 appraisal approach 63. 2930. 102. 79. 30. 5556. 178 custom distribution(s) 149.83 combinatorial explosion 81.22324. 35. 133. 67. 236 decision science 34. 166. 18. 18.175. 173. 238. 140. and actions. 77. 9091. 117. 3435. 7980. 80.199. 2526. 74. 65. 16162. 110.183.247 decision node 5961. 107. 100.152. 102. 
2829. 24748 decisionmaking 6. 121. 231. 236 See also good decision decision maker($ 3. 61. 59. 13738.194. 172.236. 92. 22729. 166. 225 DA See decision analysis (DA) DATA 38.Index chance of success 135.235 critical path (CP) 124. 101. 43.80. 59. 38. 105.246 rank order 166 See also association cost/benefit/riskanalysis 41 See also risk. 168. 14344. 60 base plate 108. 26. conservative constraint(s) 24. 161. 6. 56. 14. 73.243 analysis 26. 4 M 9 . 133.130. 110 blackboard 59 bushy mess 60.128. 84. 166. 178. 107. 74.235 criticality index 36. 46.238. 162. 129. 55. 149.d. 49. 235 Crystal Ball 9596. 29. 233 approach 2930 best presentation of 79 guarantee 154 justification for doing 79 no explicit buffer 194 problem type 30. 27. 235 credible analysis 5. 173. 25. 179.237 decision problems 34. 133. 98. 23536 simulation 23536 control 19.246 detailed modeling 132. 175.97. 139. 173. 6971. 31. 195 critical chain project management (CCPM) 183187. 107.247 continuous event 87.40. 139. 23435 fuzzy logic 210 confidence interval(s) 10. 223 backsolving 47. 18345. 59. 235 Critical Chain 16768. 65. 121. 4243.225. 74.) 10. 168.15657. 242. 140. 163.231. 139. 117. 6667. 186. 134. 44.185. 23. 77. 221. 53. 47. 23. 4144.240.17274. 244.233. 109. 15252. 177 tenstep process 27. 4647 cost.199. 5557. 59.173. 27. 16365. 80. 207. 101 hierarchy 51 costs. 25. 40 process 2324. 217 dynamic modeling 140 conditional probability See probability. 66.116. 48. 241. 5253. 186. 24245. 9. 42. 140. 49. 3334. 127.234. 89. 69.243 See also value of control correlation 25. 134. analysis cost equivalent(s) 51. attitude. 99.220. 221. 234. 154.159. 66.193. 63.232. 130. 19192.
221 discount rate 3233. 102. 132.166 exception reporting 208.17%80. 910. 236 density distribution(s) 8889. 13034. 198 forecast 133. 220 calculate 13. 77 calculating 38. 148. 81. error distribution(s). 197 utility 35.117.242 example calculations 73 See also mean 19.238.15152. 96.23839. 123.17%80.80. 212.243 time preference 32.16667. 23739. 1314. 21. 69. 131. 60. 91 delta property 13. 4549. 210 See also expert system expectation theory 10 expected monetary value (EMV) 3335. 53.20243.238. 62.247 chance 7. 139. feedback fixedcost contract 43 float 238 free 194. 28. 63. 152.16777.132.219.186.247 frames 20910 framing 24.242.104.23436. 6566. 75.90. 89. 125. expected value (EV) 45.130. 125. 127. 29. for independent events joint 201. 204.205. 69. 47. 16975.245.208.236. 13334.8586. 158.234. 7475. 236 midperiod 3233 See also present value (PV) discrete distribution(s) 13.194.48. Gaussian distribution(s). 238 positive 161.20344. 146. 127. 65. 126. 211 Wine Advisor 21011 See also credible analysis and knowledge. 9. 157.179.33.16976. 196. 102. distribution(~).185.9091. lognormal distribution(s).243.binary distribution(s).custom distribution(s). 117.238 Delay Cost 37. 139.220 COS~(S) 13. 235.146.243. 8586. 60.176. 6567. 15. 8 7 4 . 60. 26. 107. 134. 146.19395. 59.155. 187 delay 3637. triangle distribution(s).234.16667. 21517. 237 cost estimation 101. 141. 158. 243 discounted return on investment (DROI) 35. 207. 238 See also judgment(s) frequency distribution 7. 239. 233. system exponential distribution(s) 14748.248 formula(s) 13. fallacy of feedback 17.213 deterministic 26. 44. 237 exceptions 41 expected utility (EU) 3536.246 See also binary event. 53.80. 137. 231. 40. and continuous event fallacy of sunk costs See sunk costs. 110. 99. 32. binomial distribution(s). and "niform d stribution(s) distributive property 13.173.Poisson distribution(s): probabilitv. 232.152.223.231.246 frequency histogram See histogram.96.243 See also sampling distribution(s) See beta distribution(s). 
19798. 135. 63. 66. 132. 105.15152. continuous distribution(s). 24044. 14143. 1214.21314. 178. 96. 50.234. 5960. 4243.172. 237. 38. exponential distribution(s).83.239. 94. 32. 40. 117. 49. 113. 232.237. 236.141. 99100. 88. 60. 33. 98. 63. 248 128.4043. 45. 27. 157.246 model(s) 8485. 7273. 9698. 92.18588.22425. 238 See also confidence factor. 6243. 41. 240 dependence(y)(ies) 90. 26.13334. 13637. frequency distribution(s). 161. 56. 247 discounted cashflow 33. 68. 113. 189. 61. 7778.234.10145. 52. 236. 237 dynamic simulation models 139 i EMV See expected monetary value (EMV) EMV decision rule See expected monetary value (EMV). 45. 91. 7071. 147. 217. 168 See also performance. 87. 185.136.16566. 122.236. 24243 calculation(s) 4. 243. 208. 239. 9. 139.93.180.241. 27. 4145.193.166. 12122. 131. normal distribution(s). 11.162.187. 169. 168. .16144. 239 space 201.142.234. 197. 1 1 1 .208.122.233.199. 67. 171. 10810.231. 236. 215.170. 43. 20. fuzzy logic . 41. 94. 84. 12728.8345. density distribution(~).4546. 132. 187. 44. cumulative distribution(s).227.192. 178. 43.163.156. 159.231. decision rule EU See expected utility (EU) EV See expected value (EV) early start 197 error distribution 90 evaluation(s) See postanalyis and scope event($ 6. 67. 126.24445 decision rule 3536. frequency funy logic 21. 20. 84. 87.232. discrete distribution(s).227 CORREL function 20. 13334. 219 detail 7. 14. 20. 187. 117. 117. binary risk event. 36. 98. . 18. 95. 91. 73.236 discounting 35. 13. 23536 design of experiments 94.225.24&48 and decision tree 5960. 190. 238 forecast(s)(ing) 3. 156.19799. 18384. 81. 166. 245. 3738. 124. 153. 4751.65. 2527. Excel 13. 121. 40.13940.154. 11. 112. 133. 157. 96. 5556. 105.248 independent See multiplication rule. 182930. 237 See also singlepoint estimate($ expert system(s) 99.Index decomposition 26. 167. 237 exponential utility function 45. 187. 135.197. 15558.23338. 97.23940. 14243. 237. 23.245.192. 207.
24347 See also multicriteria decision making (MCDM).205. 53.11~17 See also project risk management (PRM). 2425.19294. 111.236. and work breakdownstructure moment 166.81.211. 16263.9596. 225. 6970. A .246 method 19.15556.196. 239 influence diagram 2526. 199. deterministic. 194.233.174.16566.167. 22628. 90. 127. 21314. 122. model(s).178. 83. 97.171. 240. 166. 62. 4748. simulation model(s). 34. 49. 68. 240.159. dynamic modeling. 193. of information interest 32. 147. 7980. 242 marginal 33. 53.163. 26. 42.15660. model. 67. 7475.23940.134. 3233.166. 59. 157. 15152.232. 156. 239. safety. for independent events inflation 21.146. model(s)(ing).247 J joint event($ See event(s).119. 174.123. 125.219.163. 24. 190. 105. 239 frequency 7.141.24142 with simulation 53 multitasking 168. 1 0 M 2 .208.233 illogical statements 153 quantify(ing) 18.241.185.86. project model(ing).24&41. 154 expert('s)(s') 25. 178.241 imperfect information 67.179. 240 See also law of diminishing marginal utility mean 910.246. 13132. 157. 37. 167. 140. 153. 87. 63. 33.238. 187.240 LHS See Latin Hypercube Sampling (LHS) Latin Hypercube Sampling (LHS) 94. 154 bias (es) 15154. model(s).19394.237 See also expert system($ H health. 70.238. and the environment 31. 15659 eliciting 68.2000 Edition xi. 34. 241 multiple objectives See objective(s) I IF statements 14041. 115 mode 910.238 gateways 77 Gaussian distribution 238. 240 lookback analysis 180 See also postimplementationreview (PIR) M MCDM See multicriteria decision making (MCDM) management science 34. simple model.114. 14546. 125. 51. scope. 231.149. 123 midperiod discounting See discounting.247 about risk($ and uncertainty(ies) 4.237. with simulation multicriteria decision making (MCDM) 49.231. 103.162.125. 44. 66.166. 21.23334.236. merge methodology 3. 163. 185 best 9.137. 81. 187. 71. 141. 239 See also decision criteria Guide to the Project Management Body of Knowledge (PMBOKB Guide). 48.116.24@42.226. 9. 63. 74. decision model(s). 
205 See also value.236.248 building 140. 93. 27. 56. 241 Monte Carlo sampling 96.13132. 111.24&48 See also expected value (W and standard error of the mean merge bias See bias(es). 208 computer tools for 147 flows in 130 project cashflow 63 synthesis 132 value 4950 See also asset. 4950.20405. 249 intersection 20103.124. 2425. 158. 13536. 92.221. 101. 236. 66.23234. 21. 170. 239. 55 heuristic(s) 36.164. 38. 170. 192.21011. 117. 90. 233 histogram 93. 190.179. 75. 106. 241 See also multiplication rule intuition 29. 187. 195 good decision 25.236. Eliahu 16768. 194 See also framing . 89. 10102.239 See also correlation coefficient and multiplication rule. 154.239.236. 174. 90. 100. 239 hierarchy 1. 232.181. 108. 99102. 5153.18344.87.171. 48.193.166. dynamic simulation models. 154 subjective 12.18384. 17274.174.239. stochastic. in PMBOK@ Guide K knowledge 105. 230. 240 law of diminishingmarginal utility 44 lessons learned 27.141.24647 model(ing) 6. 143. 241 Goldratt. 122. 56. 102. 199 See also spider diagram Monte Carlo simulation 13.176. 127. 151. 134 about uncertainty 9. 51.242 system 208. 137 monetaly equivalent($ 41. 1921. 209.225.217 See also conditional branching IRR See internal rate of retum (IRR) independence 20. midperiod milestones 79.239. 154. contigency.229. 57. 159 lognormal distribution(s) 143. 81.24&41.125.223. 3637.19495.20711. 156. 5253. conditional branching. 68. 132. 90.21415. 53. 125. 134.144. 56. 75. 7779.G Gantt chart 112. valuation. 177. joint judgment(s) 7.127.24142 internal rate of retum (IRR) 135. 40.19799.23536. 1112.
43.126. 8890.181. 208. 67.121.154.62. 56. 1 4 H 7 . 225.55.116. 4749.527. 35. 63. 70. 17. 172.15859. 68. 49.102.d. and performance. 122. 40.81.12728.16. exponential distribution(s).23637. density function (p. 196. 97. 78.23637. 32. 131. 147. 14. 53. perception performance 42. 43.173. 227 parameter method 19 participation fraction See optimizing.111. 237. 213. 48. 95102.164.105.198. 214. 74. A 2000 Edition PRM See project risk management (PRM) W See present value (W) Palisade Corporation 66.114. 26. 49. continuous distribution(s). 143.135. 242 of decision 80 postanalysis 173. 12. 53. 91. 9394.175. 223. See probability. 105.21213.185.f. 75.186.242 precision 5.247 See also discounting 1 1 . and monetaly equivalent($ pointfolward analysis 33 Poisson distribution(s) 143. 24546 outlier results 84 PIR See postimplementation review (PIR) PMBOe Guide . 95. 2. 83. 92. 1819. 224 chance 60. 66. 147 population 89.88. 51. 110.6263. 248 See also present value (W) neural network (NN) 165.f. 96. 242 precedence diagrammingmethod (PDM) 122. 234. 210. 3638. 102.180.107.153.239. 12728.248 present value (PV) 26. 153.16768.231.90. and uniform distribution(s) function 215.225 outcome 7.173.18788. 19. triangle distribution(s). 78. 127 probability 15253. 23&37. 5960. 7374.233. 241 OR See operations research (OR) ORIMS See operations researchhnanagement science (ORIMS) objective(s) 6. 24142.97. 172.128. 231. 156. 223. 36. 17. 132. 238. participation fraction payoff tabie(s) 13.234. 193.23132.238 optimizing 3031. 205. 94.24347 discretizing 237 threelevel estimates 143 See also beta distribution(s). 97. 213. custom distribution(s).239.185. 55. 240 /management science (ORIMS) 34.159. 73. 172. 31.34. 233.56. 241 addition rule 204 NN See neural network (NN) NORMDIST function 146 NOT function 203.110. 45.170. 179.20748.90. 51. 72. lognormal distribution(~).126. 78.16971. 77.124. 243 joint 20. 4.242 in SWOT analysis 6 optimization 26. 38. 90. 8W39.236. 6. 85. 
discrete distribution(s).246 feedback 14. 1214.239. 10. 242 perception biases See bias(es). 1114. 96. 203.67.19~97. 8. 199. 224.190.243 distribution(s) 45.8386.221.242 PrecisionTree 66. 9 107.217. 240.d.30. 4445.173. 19.17374.24246 conditional 235 density function (p. 130. 46. 41. 232. 241.159 See also cost.242 to truncate losses 180 OptQuest 95.23639 function 52. 232. 240. 242. 56 value(s) 6.243. p. 20748.145.67. 24344 risk 34. 16. normal distribution(s).147. skewness. schedule.244. 187.95. 186.131. 192. 55. 1 1 .Index multiplication rule 204 for independent events 205 mutually exclusive 205. 9798. 41.215. 19699.234 NW See net present value (NPW net present value (NPW 33. 174.22526.84. 22425.6263.546. 139. 233 participation fraction 4748 option($ 52. 233.247 See also decision node normal distributionts) 19.65. 127. 25.241.87. 18889.185. 21. 165. binominal distribution(s).2000 Edition See Guide to the Project Management Body of Knowledge (PMBOlP Guide). 24243 objectivity 5. 8486. 19G97. 30. 79.24041 normalization requirement 89.240.) 713. 231.196.242 opportuntty 4. 101. 243 preference 18. 53.196. 34. 202.15&59. 22729 prediction 122.49.248 multiple 30.251 applications 212. 235. 3435. 710. 62. 3235. 239 judged too narrowly 146 121. 126.167.223.20446.149. 236. 4 M 5 . 3436.130. 103. 24546 portfolio 34.24142.223. 13. 11415. probability 3. 4044. 152.117. 232.96. 14.15557. 105. 66. 241 artificial 213 structure 213 node 47.14142.f. 31.116. 29.17576 postaudits 159 See also postimplementation review (PIR) postimplementation review (RR) 27. 178. Poisson distribution(~).) PDM See precedence diagrammingmethod (PDM) PERT See program evaluation and review technique (PERT) .8385.191.112. 29.d.238. 65. 6263. 29. 217.185.13~37. 233 operations research (OR) 236. 159. 163. 238. 3132. 16263. 116. 50. 240.23233.14142. 14. 178.
19192. 25. 53. 12. 77. 240 prior 6%70 range is nonsense 153 rectangle 131. 152. 133. 187. 133. 184. 234.243 analysis 219. random rank correlation See correlation. 167. 245 event 11. probabilityweighted wheel 154 See also subjective probability program evaluation and review technique (PERT) 122. 41. 154 See also average.244 policy 3436.Index marginal 75.225. S SUMPRODUCT function 13 SWOT analysis 6. risk($ 67. 244 and actions. and performance scope 6. 144. 36.216. 199 spider diagram 13536. 232.18687. 90. 94.238 See also dynamic simulation models.117. 157. and decision analysis (R&DA) RAND 87. 67. 8689.239 See also cost. 243 management xixii plan 117 mitigation 99. 43. 219. 12122. in PERT progress reporting 168 projectplanningsoftware 126 See also critical path method (CPM). 17172.239. 2627. 21415. function premium 33. 101.21011. 157. 16970.123 RlSKOptimizer 95. 248 9&103. 62. 123. 66. See also event. 224. 56.122.10506. 102. 24246 conservative 4344. 107. 121. (PRM) 242 rules 25. 234. 232. 238 11718. 1 1 1 . 245 profile 179 tolerance coefficient (r) 4546.131. 223. 34. 10142. 51. 93.99. inventory of 105.14849.241. 152. 107. 49. 157. 96. 21. 11112. 196. 12&29. 238. 170. 10143. 47. 105 rectangle distribution(s) See regression 122. 164 See also xy plot scenario 26.243 calculations 19 See also beta distributions.13334. 232.146. 100. 187. 202. 105. 92. 244 See also discounted retum on investment (DROI) rework 27. preference.164.229. 42.18548. 140. 185.243 base case 14. 49. 221.19093. 4244.199. 20612.46. 131. 103. 245 Q qualitative assessment 238 R R&DA See risk. 179. 208. precedence diagramming method (PDM). 233 identification 101.243 common risk representation 102 in PMBOlP Guide 97. 23738 185. 87.194. 186 singlepoint estimate 10.22627.91. 247 See also opportunity. 9495. 144.165. 111. 131. database of 112 and actions. 107. 60. 230 neutral 42. 245. 247 triangle 78. rank ranking alternatives 30 ranking criterion 3536. 121. 8991. 52. 240. 84. 235. 13940.16162. 
and trial value($ model(s) 53. 168. 117. 230 and decision analysis (R&DA) 172. 13437.209. 163. 79. 36. 92.230. 11213.245 number function 87 random sampling 87. 237 analysis 3.23234. 210. 22526. 1213. 84. 114. 30.128. 46. 224. 53. 245 schedule 42. 196. 169. 148. 44. 10102.168. 232. 246 slack 124. 49.162.89. 89.114 projection 3233. 43. 4344. 69. 9394. 105. 245.19697. 173. 97. schedule.235 project network diagram 133. project plan 14. 201 tree 243. 110. 243 project risk management (PRM) xixii. 35. in SWOT analysis sampling 8384. 246 random variable See variable. 8291. 12526. and project risk management 17173. stopping rule. 204. 168. 89. 23. 164.84. 112.4G49.89. costhenefitlrisk analysis. 98. 100 averse 43 neutral 34.244 rsquared (r2) 21 return on investment (ROI) 41.213. 246 stakeholder(s) 25. 5556.216. 63. 103. 11. 157. 93 process 81. and program evaluation and review technique (PERT) project cost estimates 910 project model(ing) xii. 221. 235. 248 See also utility. 53. 147.84.235. 146 See also triangle distribution weight 143. Monte Carlo simulation. 19294 . 105.23233.188. risk. 63.245 sensitivity analysis 26. 159. 114. 132. 102. 13941. 117. 116. 146 Size Crane decision 37 skewness 21.169.187. binary risk. 201. 194 wasting 77.191. 84.139. 28. 151. 244 attitude 30. 63.113. 238.223.243. 6. 188. 133. 60. 18487. 175.24547 See also discrete distribution(s) and random sampling scatter diagram 161. 4 W 9 . 245 aversion 34. 11617.151. 127.245 See also spider diagram sensitivity chart See spider diagram simulation 17.41. 226 ROI See return on investment (R01) random error 243 random number 86. 139.
141. and utility. wasting slack See slack. 80. 87. 81. 1113. T trial value($ 85. time value. 2 4 M 7 stochastic 33. 5556.199. 186.248 random variable 234. 185. 141.44. 157.248 TreeAge Software. 157.Index appraising 4. 87. 74. 240. 21516. 82.63. 2526.244 xy plot 161 and stochastic variance 17072. 910. 36.20506. 147. 49. 248 Zadeh.24M8 analogy 155 weaklink 4 and funy logic 215 and probability(ies) 3. 32. 28 value 7.197.12630. 4243. 162. 122. 79.15255. 122.246 calculations 56. 46. 102 utility 26. 174.164. 244. 141. 247 Venn diagram 20142.236 135. value system dynamics 132 variable($ 6. 55. 93. 221. 197. 138. 70. 53. 18687. 15253.241. 17273. 243. 248 17273. 114. 35.175. 172.23849 . 29. 49. 175 timespread variable($ 8485 See also covariance and stochastic. about uncertainty. 42. 170. 23. value.23. singlepoint estimate. 137.247 variance 1921. 117. 98. 247 of information 6768. 139. 113. variable threat &7. 173. objective measure 44. 49.244. 13139.173. 237. 236 function 35. 96. about risk($ and uncertainty(ies). variance tornado diagram 136.3038. 145. 219.246. 60. 16566. 18.187. 243 model 26. 219. 96. 147.50. 19293.158.180. 248 function 14. 172. 248 See also law of diminishingmarginal utility and utils utils 44. 247 W WBS See work breakdown structure (WBS) wastewater plant 6162. 66. 247 X Y union 20243.242. 36. 210. 233.239.242. 248 variable 86. 243. 8889. and standard deviation yield 135. 228 standard deviation 16.22425. 1819.248 standard error of the mean 19899. 159 subjective probability 15253.178. 24648 of imperfect control 108 variance 14.157. 171.199. 168. 69. 15152. 248 measure See utils value 44. 164. 248 of money 32. 5253. judgment(s). 243. 99. 66.19091. 170. 89.246. 127.124.24748 measure 6. 10143. 176 monetary 31. 197. 136.249 242. 125. 4243. 4647. 247 See also addition rule unknown unknowns 7. 107. 99100. outcome sunk costs 33. 97. 247 joint 137. 8485. 20. 4043. 177.19597. 77.157 model(s)(ing) 98. 16974. 5556. 157.233. 199.232. 249 uniform distribution(s) 87. 
4449. 24546 triangle distribution(s) 12.23334. 16263. 199 stopping rule 85. 179.192. 100. 242.159. 134. 27.4142. 13436. 91 uncertainty(ies) 34. 247 theory of constraints 18384 See also stochastic.232.235.191. 247 analysis 27. time value.17980.161.219.219. 156. 79. 56. expected value (EV). 131. 90 U example 6 M 7 .191. 245.159.14445. Lotfi 215 v valuation 52.8081.241. 170. 48. 244.88. 18. 167. 161. 159. 247 riskversus 4849. 89.166. 224. 232. 30 engineering xi. work breakdown structure (WBS) 10001. 90.20243. 145. 116. 10142. 33. 131. 16970. 5253. 12526.33. 10243.18586. 17374. 1213. 11617. 2930. 51.213.171. wasting 53.233. 44. 247 of control 69. 176. 60. 98.140. 27. 17. 26. 186.248 time value 30.178. present value (PV). fallacy of 79 of money. 248 stopping points 77 See also imperfect information project 35.241.149. 247 value(s). 73. 234. 121. 91.21921.236. 83.163. 245.23335. 27.221. 149 and risk 16162. 159. 243. 210. 178 See also scatter diagram See also judgment(s). 154.188. 232. Inc. 247 See also asset. 101.
Upgrade Your Project Management Knowledge with First-Class Publications from PMI

A Guide to the Project Management Body of Knowledge (PMBOK® Guide), 2000 Edition
PMI's PMBOK® Guide has become the essential sourcebook for the project management profession and its de facto global standard, with over 700,000 copies in circulation worldwide. This new edition incorporates numerous recommendations and changes to the 1996 edition, including: progressive elaboration is given more emphasis; the role of the project office is acknowledged; the treatment of earned value is expanded in three chapters; the linkage between organizational strategy and project management is strengthened throughout; and the chapter on risk management has been rewritten with six processes instead of four. Newly added processes, tools, and techniques are aligned with the five project management processes and nine knowledge areas. For example, reserve time, variance analysis, and activity attributes are added to Chapter 6 (Project Time Management); estimating publications and earned value measurement are added to Chapter 7 (Project Cost Management); and project reports, project presentations, and project closure are added to Chapter 10 (Project Communications Management). This is one publication you'll want to have for quick reference both at work and at home.
ISBN: 1880410230 (paperback); ISBN: 1880410222 (hardcover); ISBN: 1880410257 (CD-ROM)

PMI Project Management Salary Survey, 2000 Edition
This 2000 Edition updates information first published in 1996 and expands coverage to over forty industry affiliations in nearly fifty countries in seven major geographic regions around the world. Its purpose is to establish normative compensation and benefits data for the project management profession on a global basis. The study provides salary, bonus/overtime, and deferred compensation information for specific job titles/positions within the project management profession. It also contains normative data for a comprehensive list of benefits and an array of other relevant parameters. The PMI Project Management Salary Survey, 2000 Edition is a vital new research tool for managers and HR professionals looking to retain or recruit employees, current members of the profession or those interested in joining it, researchers, and academics.
ISBN: 1880410265 (paperback)

Project Management for the Technical Professional
Michael Singer Dobson
Dobson, project management expert, popular seminar leader, and personality theorist, understands "promotion grief." He counsels those who prefer logical relationships to people skills and shows technical professionals how to successfully make the transition into management. This is a witty, supportive management primer for any "techie" invited to hop on the first rung of the corporate ladder: a skillful translation of general management theory and practice into tools, techniques, and systems that technical professionals will understand and accept. It's also an insightful guide for those who manage technical professionals, and it includes self-assessment exercises, helpful "how to do it" sidebars, and action plans. "The exercises and case studies featured here, along with the hands-on advice, hammer home fundamental principles. An intriguing complement to more traditional IT management guides, this is suitable for all libraries." Library Journal
ISBN: 1880410761 (paperback)

Project Management Experience and Knowledge Self-Assessment Manual
In 1999, PMI® completed a role delineation study for the Project Management Professional (PMP®) Certification Examination. In addition to being used to establish the test specifications for the examination, the study describes the tasks (competencies) PMPs perform and the project management knowledge and skills PMPs use to complete each task. Each task has three components to it: what the task is, why the task is performed, and how the task is completed. The manual helps individuals assess their experience and knowledge against these tasks within the project management processes (e.g., initiating the project or planning the project); it is not intended to guarantee or infer success or failure by individuals in their project management career, examinations, or related activities.
ISBN: 1880410249 (paperback)

Project Management Professional (PMP) Role Delineation Study
In 1999, PMI® completed a role delineation study for the Project Management Professional (PMP®) Certification Examination. The Role Delineation Study is an excellent resource for educators and administrators.
ISBN: 188041029X

Earned Value Project Management, Second Edition
Quentin W. Fleming and Joel M. Koppelman
Now a classic treatment of the subject, this heavily illustrated second edition may be the best-written, most easily understood project management book on the market today. Project managers will welcome this fresh translation of jargon into ordinary English.

The Project Surgeon: A Troubleshooter's Guide to Business Crisis Management
Boris Hornjak
Hornjak shares his "lessons learned" in this best practice primer for operational managers. He shows how to make tough decisions under fire and turn around troubled projects of any size, and ultimately how to break the failure/recovery cycle. Then his emphasis turns to crisis prevention.
ISBN: 1880410753 (paperback)

Risk and Decision Analysis in Projects, Second Edition
John R. Schuyler
Schuyler, a consultant in project risk and economic decision analysis, helps project management professionals improve their decision making with techniques including expected value, decision trees, Monte Carlo simulation, probabilistic techniques, judgments and biases, utility and multicriteria decisions, the value of information, and stochastic variance.
ISBN: 1880410281 (paperback)

Visit PMI's website at www.pmibookstore.org
~ ~ practitioners. and identifies the knowledge and skills that are required to complete the task. PMI@completed a role delineation study for the Project Management Professional (PMP@) Certification Examination. (paperback) Risk And Decision Analysis in Projects Second Edition John R. (paperback) Earned Value Project Management Second Edition Quentin W. so you can free your best and brightest to focus on opportunities. ~ trainers. he explainsand demystifies key concepts and techniques. this second edition updates this straightforward presentation of earned value as a useful method to measure actual project performance against planned costs and schedules throughout a project's life cycle. modeling techniques. optimal decision policy. ~ and individ~ ual~ interested in pursuing PMP certification. and in any industry.org .mahng skills and integrate them into daily problemsolving. address problems when they occur. He writes with a dual purposefirst for the practid manager thrust into a crisis situation with a mission to turn things around. Each howledge and skills PMPs use the study's tasks is linked to a performance domain (e. A role delineation study identifies a profession's major performance domains (e. The selfassessment rating should not be used to predict. The role delineation task statements are presented in this manual in a format that enables you to assess how your project management experiences and trainindeducation knowledge levels prepare you to complete each of the task statements.pmi.. and how the task is com~leted. The authors describe the earned value concept in a simple manner so that it can be applied to any project. ISBN: 1880410273 (paperback) A veteran of business recovery. The authors have mastered a unique "earlywarning" signal of impending cost problems in time for the project manager to react. Individuals may use all of these tools to enhance understanding and application of PM knowledge to satisfy ~ersonal and professional career objectives. 
It describes the tasks that are performed in each domain.g. and prevent them from happening again.org or Shop at Our Online Bookstore at www. instead of on troubleshooting problems.
Dobson Teaming for Quality H. and details the basic knowledge and processes of project management. metaphysics. Called SMARTTM. and reduces lessons learned to a simple format that can be applied immediately to your projects. who has spent thirty years practicing. and matrix organization. The book collects the experiences and wisdom of thousands of people and hundreds of projects. More than two hundred software tools are listed with comprehensive information on systems features. and leadership skills needed by project managers. sponsors of a project. It says that you must first have a strong foundation in time management and priority setting. Its philosophytopractice approach will help people team in ways that promote exceptionally high levels of bonding. He proposes the "Enterprize Organization"as a model that takes advantage of the strengths of the functional organization. Francis M. His new book offers a stimulating synthesis of classical philosophy. such as homemade tomato sauce for pasta." The author. performance analysis. and handle emergencies. Webster Jr. made from the bottom up. Aligned. putting you in charge for possibly the first time in your life! ISBN: 1880410656 (paperback) Recipes for Project Success Al DeLucia and Jackie DeLucia This book is destined to become "the" reference book for beginning project managers. from scope management to work breakdown structure to project network diagrams. and how they handle multiple projects. discusses the technical. particularly those who like to cook! Practical. refers to himself as "the olde curmudgeon. ISBN: 1880410796 (paperback) The Juggler's Guide to Managing Multiple Projects Michael S. Are your projects SMART? Find out by reading this peopleoriented project management book with an attitude! ISBN: 1880410486 (hardcover) tion). resource analysis. and two decades of personal teaming experience to explain how individuals can choose change for themselves. independent. and collective agreement (consensus). 
the root causes of their resistance to change. Webster Jr. It is based on research and best practices. individual creative expression (innova This comprehensive book introduces and explains taskoriented. armaroundtheshoulder approach. how they perform time analysis. ISBN: 188041015X (paperback) Neal Whitten is a twentythreeyear veteran of IBM and now president of his own consulting firm. David Shuster Don't Park Your Brain Outside: A Practical Guide to Improving Shareholder Value with SMART Management Francis T. and teaching project management. dispenses insider information to novice project managers with a friendly. single dishes. ISBN: 1880410524 (paperback) ISBN: 1880410591 (CDROM) The Enterprize Organization: Organizing Software Projects for Accountability and Success Neal Whitten The Project Sponsor Guide Neil Love and Joan BrantLove This tothepoint and quick reading for today's busy executives and managers is a oneofakind source that describes the unique and challengingsupport that executives and managers must provide to be effective sponsors of project teams. to increasingly . Shuster shows how personal work fulfillment and corporate goals can work in alignment. SMART has saved significant time and money on the hundreds of large and small. and Transitional. project tracking. ISBN: 188041063X (paperback) Project Management Software Survey The PMI@Project Management Software Survey offers an efficient way to compare and contrast the capabilities of a wide variety of project management tools. writing about. cost analysis. behavioral science. particularly crossfunctional projects. and much more. while reducing or eliminating their weaknesses. or are. The survey is a valuable tool to help narrow the field when selecting the best project management tools. Former editorinchief for PMI@. An excellent introduction for those interested in the profession themselves or in training others who are. determine their resource requirements. 
He provides a history and description of all the components of modern project management. (paperback) tive. Regenera Shuster believes most attempts at corporate cultural change die because people fail to realize how addicted they are to the way things are. consulting on. this new approach is Strategically Managed. Hartman has assembled a cohesive and balanced approach to highly effective project management. Hartman Don't Park Your Brain Outside is the thinking person's guide to extraordinary project performance.PM 101 According to the Olde Curmudgeon Francis M. and the degree to which their willingness to change depends on the moral philosophy of management. management theory and processes. ISBN: 1880410559. and cost reporting. administrative. projectized organization. They are applied to the everyday task of cookingfrom simple. then introduces the concept of Portfolio Management to timeline multiple projects. simple and complex projects on which it has been tested. tempered by hardwon experience. It is also helpful reading for facilitators and project leaders. charting. and interdependent levels of project portfolios. The Project Sponsor Guide is intended for executives and middle managers who will be. logically developed project management concepts are offered in easily understood terms in a lighthearted manner. Here he provides a practical guide to addressing a serious problem that has plagued the software industry since its beginning: how to effectively organize software projects to significantly increase their success rate. It is deceptively simple.
ISBN: 1880410214 (paperback) . ISBN: 1880410583 (paperback) New Resources for PMP@ Candidates The following publications are resources that certification candidates can use to gain information on project management theory. design. information technology. ISBN: 1880410133 (hardcover) PMBOK Q&A Use this handy pocketsized.complex dishes or meals for groups that in turn require an understanding of more complex project management terms and techniques. Mantel Jr. The transition between cooking and project management discussions is smooth. and Deliver Projects on Time and Within Budget by Robert K. Editor Project Management Casebook edited by David I. and disintermediation to changing demography. et al. techniques.. Includes project management concepts and termsold and newthat are not only defined but also are explained in much greater detail than you would find in a typical glossary. principles. Scheduling. and procedures. Kliem and Irwin S. It's a reference you'll want to keep close at hand. and Carl Phillips Human Resource Skills for the Project Manager by Vijay K. As "management by projects" becomes more and more a recommended business practice worldwide. The PMBOK@Guide is an official standards document of the Project Management Institute and will continue to serve as one of the reference documents for the Project Management Professional (PMP") Certification Examination through 2001. the PMBOKB Guide becomes an essential source of information that should be on every manager's bookshelf. and tidbits of information provided with the recipes are interesting and humorous. Patricia Digh. Seventh Edition by Harold Kerzner Tools and Tips for Today's Project Manager Ralph L. Verma The New Project Management by J. Project Management: A Systems Approach to Planning. Koppelman EffectiveProject Management: How to Plan. 
A Guide to the Project Management Body of Knowledge (PMBOKa Guzde) by the PMI Standards Committee Global Literacies: Lessons on Business Leadership and National Cultures by Robert Rosen (Editor). Serves as a tool for learning about the generally accepted knowledge and practices of the profession. Ludin This guidebook is valuable for understanding project management and performing to quality standards. ISBN: 1880410613(paperback) The Future of Project Management Developed by the 1998 PMI@Research Program Team and the futurist consultant firm of Coates and Jarratt. Meredith and Samuel J. Also included are tips on handling such seemingly simple everyday tasks as how to say "No" and how to avoid telephone tag. and Controlling. P M B O K @ ' Guide. questionandanswer study guide to learn more about the key themes and concepts resented in PMI's international standard. Project &Program Risk Management by R. More than 160 multiplechoice questions with answers (referenced to the PMBOKB Guide1996 Edition) help you with the breadth of knowledge needed to understand key project management concepts. and Thomas Walker Earned Value Project Management. Fourth Edition by Jack R. It 'Overs everything from knowbots. Verma Principles of Project Management by John Adams. Max Wideman. Davidson Frame Organizing Projects for Success by Vijay K. social values. et al. ISBN: 1880410710 (paperback)  A Guide to the Project Management Body of Knowledge (PMBoKB G ~ 1996 ~ ~ ~d i ~~ i ) ~ ~  The basic reference for everyone who works in project management. Fleming and Joel M. et al. Manage. PMP Resource Package Doing Business Internationally: The Guide to CrossCultural Success by Terence Brake. Danielle Walker. Wysocki. after whlch the 2000 Edition will be used. and markets. ISBN: 1880410125 (paperback). Second Edition by Quentin W. Inc. this guide to the future describes one hundred national and global trends and their implications for project management. 
Project Management Experience and Knowledge SelfAssessmentManual by Project Management Institute Project Management: A Managerial Approach. Cleland. both as a recognized profession and as a general management tool. nanotechnology.
pmibookstore. and "PMI Today" are trademvks registered in the United States and other nations. Jeffrey Trailer.0609 Email: pmiorders@abdintl. "PMBOK".6206 Fax: +412. Pinto ISBN: 1880410435 ( m a c k ) Leadership Skilb for Project Managers. Karen M. All rights reserved.Project Management for Managers Mihaly Gmg. Inc. Sackman ISBN: 1880410036 (paperback) The world's 6reatest Project Russell W.741. Nigel J. *a ThePlJWBookofPtoject~Forms ISBN: 1880410311 (paperback) ISBN: 1880410508 (diskette) h. Pinto. Inc. Max W h a n ISBN: 1seo41O~:O (paperback) Principles of Project Management John Adams et al. Pennsylvania 151431020 USA 02001 Project Management Institute. Jeffrey Mmher ISBN: 1880410478 (paperback) ISBN: 1880410575 (CDROM) u n g the Brojgct T Human m t s of Project Management Series.com Mail: P M I Publications Fulfillment Center P O Box 1020 Sewickley."are trademarks of the Project Management Instimte. Richard Puerzer. .)Sec&Eckted by R.Instructor's Manual Edited by David I. Max wdeman ISBN: 1tW341001X (paperl>aCk) ~ P r w l Project Management Casebook. "PMP" and the PMP logo are certification marks registered in the United States and other nations. Verma ISBN: 1880410400 (paperback) AFrws#rkk~~MtlManageRlent ISBN: 1880410826. Verma ISBN: 1880410419 (paperback) Annotated Bibliography of Project and Team Management DavidI. Participants' Manwl Set (paperback) Order online at www. Bursic. A. Richard Puerzer.ct Management Groups in Large Functional Frank Toney. Karen M. keland ISBN: 1883410117 (paperback) . Volume Three Vijay K. Trailer ISBN: 1880410494 (paperback) D. l . Cleland. Guy Rafe. A. Ray Powers ISBN: 1880410052 ( W r b a c k ) Vlacflmu I . Facilitator's Manual Set (3ring binder) ISBN: 188041080X. Yamslav Wasak ISBN: 1880410451 (paperback) ~ ~ ~ ~ R. c t ~ F ~ ISBN: 1880410621 (paperback) Organizing Projects for Success Human Aspects of Project Management Series. Todd Palmer. Cleland. 
Damall ISBN: 188041046X (paperback) Psnw&rol#icsinProjectManagement Jeffrey K. Sm~th ISBN: 1880410540 (paperback) Project Leadership: From Theory to Practice Jeffery K.org Book Ordering Information Phone: +412. Bursic. ISBN: 1880410303 (paperback) h P W I p I .. "PM Network". Michele Govekar ISBN: 1880410109 (paperback) ~umm Resource Skltls for the Project Manager Human Aspects of Project Management Series. and "Project Management Journal" and "Building professionalismin project management. Yaroslav Vlasak ISBN: 1880410184 (oa~erback) . Volume One Vijay K.voropalw I SBN: 1880410028 ( p a p e h k ) The Virtual Edge Margery Mayer ISBN: 1880410168 (paperback) The ABCs of DPC Edited by PMl's Desi@ProcurementConstructMnSpecific lntsrest Group ISBN: 1880410079 (paperback) Project Management Casebook Edited by David I. Verrna ISBN: 1880410427 (paperback) Value Management Practice Michel Thiry ISBN: 1880410141 (paperback) How to Turn Computer Problems into Compcttthre Advantage Tom lngram ISBN: 1880410087 (paperback) Achieving the Promise of Information Technology Ralph B. Pinto.Cleland. "PMI" and the PMI logo are service and trademarks registered in the United States and other nations. . Volume Two Vijay K.6t pmctbs of koj. QI#r#v Lervls R. Editors' Choice Series Edited by Jeffrey K . Peg lhoms.741. . Jeffrey W.
This action might not be possible to undo. Are you sure you want to continue?
Use one of your book credits to continue reading from where you left off, or restart the preview.