Decision Analysis
for Managers
A Guide for Making Better
Personal and Business Decisions
Second Edition
David Charlesworth
Decision Analysis for Managers: A Guide for Making Better Personal and Business Decisions,
Second Edition
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or
transmitted in any form or by any means (electronic, mechanical, photocopy, recording, or any other),
except for brief quotations not to exceed 400 words, without the prior permission of the publisher.
Excel, PowerPoint, Visio, and Word are Microsoft Corporation trademarks. Analytica is a Lumina
Decision Systems trademark. Crystal Ball is an Oracle Corporation trademark. TreeAge Data is a TreeAge
Software Inc. trademark. @Risk is a Palisade Corporation trademark.
Cover and interior design by Exeter Premedia Services Private Ltd., Chennai, India
10 9 8 7 6 5 4 3 2 1
Success (getting what you want) depends on luck and good decision
making. You can’t control your luck, but you can maximize your odds by
making the best possible decisions.
Broadly speaking, this book organizes and presents otherwise formal
decision-making tools in an intuitively understandable fashion. The
presentation is informal, but the concepts and tools are research-based and
formally accepted. Whether you are a business owner, a manager or team
leader, or a senior professional, these tools will help you make better
decisions in both your personal and your business life.
In the second edition of Decision Analysis for Managers, chapters
focusing on risk analysis and decision quality have been added.
Keywords
business decision making, decision analysis, decision and risk analysis
(D&RA), decision framing, decision making under uncertainty, decision
quality, making decisions, multiattribute decision analysis, personal
decision making, portfolio analysis, project risk, risk, strategic decision
making, trade-offs, uncertainty
Contents
Preface
Acknowledgments
Success (getting what you want) depends on luck and good decision
making. You can’t control your luck, but you can maximize your odds by
making the best possible decisions. This book introduces you to tools that
can help you make better personal decisions and that will help teams and
work groups align quickly and make better recommendations and decisions.
It gives you frameworks for gaining insight when the decisions and/or the
people involved in those decisions are difficult. Whether you are a business
owner, a manager or team leader, or a senior professional, these tools will
help both your personal and your business life.
Acknowledgments
It is difficult to list all the people who help a person formulate a set of
experiences and frameworks that constitute a career. I’ve been fortunate to
have had the opportunities to work on challenging and interesting problems
with some rather amazing people. Here are the high points—errors of
omission are sincerely regretted! We’ll start with decision analysis (DA):
Tom Sciance (now an adjunct professor at the University of Texas) was the
visionary leader at Conoco who started both David Skinner (author of
Introduction to Decision Analysis) and me on the DA adventure. SDG was
our early DA mentor and helped us through those first Conoco projects.
Hollis Call of ADA was also a source of advice and encouragement. Gary
Bush, Pat Leach, Bill Haskett, Nick Martino, Paul Wicker, OJ Sanchez,
Stefan Choquette, James Mitchell, Bob DeWolff, Mike Stallsworth, and the
rest of the early DSI (Decision Strategies, Inc.) gang influenced my
thinking about DA considerably and provided helpful advice, ideas, and
collaboration. Professors Ronald Howard (Stanford), Bob Winkler (Duke),
and Bob Clemen (Duke) have been inspiring and encouraging.
In my association with Chevron for over 13 years (consultant and
employee), I’m grateful that I was able to work in a company that is serious
about using state-of-the-art decision science. Frank Koch, Larry Neal, Elton
Allen, Dale Nichols, Jeff Circle, and Craig McKnight have been very
generous with their time, ideas, encouragement, and collaboration.
My underlying management philosophy was influenced most by John
O’Bryan, Dave French, John Lindsay, and Jarl Swanson of the bygone
Louviers (DuPont) training group (and our “gurus,” Ed Klinge and Ted
Keegan). I’m grateful to the teams that I managed at Parkersburg, Savannah
River (now SRNL), Ponca City, and Gainesville for putting up with and
responding to my constant pushing for synergy and a “skunk works”
culture. People I’ve known who led by example and served as role models
and inspiration include Herb Eleuterio, Bob Cook, and Tom Sciance
(DuPont), Todd Wright, Harry Harmon, and Ed Albenesius (SRNL), John
Bailey (PCR), and Sam Trobee, Bill Hauhe, and Doug Quillen (Chevron).
Manny Gonzalez (retired Chevron Alliance Manager) is one of the most
successful entrepreneurs I’ve ever met and was an absolute joy to work
with (his bosses, Matt Palmer, Melody Meyer, and Paul Siegele, are to be
commended for letting Manny run).
My last Chevron assignment involved working with JPL. I thank Dean
Wiberg, Larry Bergman, and project managers Jeff Hall, Andrew Aubrey,
Harish Manohara, Kymie Tan, and Hari Nayar for their hard work,
competency, and willingness to use the DA toolkit as a way of working
day-to-day with me.
Heath Lynch and Cole Brinkley from Chevron’s Area 52 read the first
edition thoroughly and asked many excellent questions, which gave me
thoughts as to which topics needed expansion in this edition.
I am very appreciative of Doug Quillen and Paul Wicker for reviewing
and commenting on this manuscript. Paul kept me grounded on DA issues,
and Doug added very valuable perspective and insight. Nancy Winchester,
formerly a Chevron colleague, is especially appreciated for her ability to
see what others miss. Larry Kingrey found inconsistencies and items that
needed clarification; I am grateful for his detailed review of the first edition.
Business Expert Press is to be commended for taking on the project.
I especially appreciate the encouragement and the detailed review and
comment by Don Stengel (Professor, California State University at Fresno).
Finally, I’m thankful to my wife Debbie, who has uncomplainingly
trudged off to DAAG meetings, edited the DA books that we’ve published
together, and provided encouragement and good advice for the many years
we’ve had the good fortune to be together.
As I’m writing this, I can’t help but think of Zig Ziglar (who recently
passed away). I had the chance to meet Zig (after reading several of his
books) in the mid-1980s—he was an inspiration to me and was as gracious
and bright in person as his writings would lead you to believe he would be.
When two of my colleagues and I had lunch with Zig, he told us, “You can
have anything in this world that you want as long as you help enough other
people get what they want.” I’ve had some difficult situations in life when I
wondered if his saying was correct—sometimes it seems like the bad guys
are winning. But with the benefit of experience and hindsight, I think he is
correct, and those are good words to live by. Sometimes when we’re doing
financial analysis and we are focused on trying to maximize net present
value (NPV), it is easy to forget why we’re doing what we’re doing. My
sincere hope is that this book will give you some tools to help you get what
you want, and that it will provoke you to think about why you want what
you want. Enjoy!
CHAPTER 1
The premise of this book is simple: there is a set of tools and mental
frameworks referred to as decision analysis (DA) that can help you, your
family, and the teams and groups that you work with improve your decision
making. You don’t need to hire expensive consultants to use the tools—the
concepts are not that difficult and the tools are very useful.
Here is the context I’d like you to envision: you and I have just sat
down in first class seats for a three-hour plane trip. You know that I
have a background in both management and DA consultation and you
have asked me how DA could help you in your job as a manager (or
team leader or senior professional). We have a pad of paper to draw
on, and we will talk about concepts, examples, and stories rather than
the mathematics underpinning DA.
Many of the examples we’ll discuss involve personal decisions—my
friend David Skinner (DA practitioner, author, and entrepreneur) and I
discovered while teaching DA at Conoco that people absorb the
concepts more quickly from personal examples than from business
examples.
So get comfortable, and let’s talk about management and DA tools! Since
you have an inquisitive mind and have several questions, we’ll begin with
FAQs (frequently asked questions) about DA:
Why is DA important?
What is DA?
Where did DA come from?
Does DA work?
What is a good decision?
Why hasn’t DA been more widely adopted?
How do the DA tools fit together?
Then we’ll talk about the tools and how you, as a manager, a team
member, or a family member can use them.
We cannot control our luck, but we can use DA to improve the quality of
our decisions and subsequently increase the chances of getting what we
want (see Figure 1.1).
An important distinction DA makes is that good outcomes and good
decisions are correlated, but there is no guarantee of a good outcome as a
result of making a good decision. We cannot control outcomes; all we can
do is control the decisions we make. Improving the quality of our decision
making through time improves our odds, but we are still at the mercy of
luck. My friend Patrick Leach noted that it is ironic that in professional
poker, knowing the odds is just the start of being able to play, whereas in
business, “the players have the skills but do not understand the odds.”1 DA
helps you understand the odds.
When I introduce this concept while teaching DA, I stop and ask the
students for examples of poor decisions that can have good outcomes.
Invariably a student will bring up winning the lottery, as the expected value
(probability of winning times the amount you win) is considerably less than
what you have to pay for the ticket. Lifestyle choices we make fit into this
part of Figure 1.1: some of us can smoke cigarettes all our lives and have no
adverse health effects; others will get cancer or heart disease from making
the decision to smoke.
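To make the lottery arithmetic concrete, here is a minimal Python sketch. The ticket price, jackpot, and odds below are invented for illustration; they are not figures from the book:

```python
# Invented numbers: a $2 ticket with a 1-in-300,000,000 chance at a $500M jackpot.
p_win = 1 / 300_000_000
jackpot = 500_000_000
ticket_price = 2.00

# Expected value: probability of winning times the amount you win,
# minus what you pay for the ticket.
expected_value = p_win * jackpot - ticket_price
print(f"{expected_value:.2f}")  # → -0.33
```

Even with a jackpot of half a billion dollars, the ticket loses about a third of a dollar on average, which is why buying it is a poor decision despite the occasional spectacular outcome.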
When I ask for an example of a disappointing outcome from a good
decision, usually a wildcat oil well comes up as an example. You can
carefully analyze the seismic data and pick the best spot to drill a well and
still come up with a dry hole. In terms of personal decisions, you can do
careful research on a stock or mutual fund and invest carefully, only to have
some unforeseen factor drive down the value of your investment. Health is
also an example of this part of Figure 1.1: we can eat healthy foods,
exercise appropriately, and still get cancer or heart disease. All we can do is
make the best decisions possible, thereby improving our odds of getting
what we want.
The other point I make when discussing Figure 1.1 is that making poor
decisions, especially with regard to lifestyle choices, will eventually catch
up with you. We’ll talk about the implications of poor decision making
relative to ethics and lifestyle choices at the end of the book. For example,
Ralph Keeney concluded in a recent study that about half of the deaths in
the United States are a direct result of poor lifestyle decisions (we’ll talk
more about this in Chapter 14).
One of the biggest problems we have in business is that managers are
usually rewarded (or penalized) for outcomes rather than the quality of their
decisions. The problem with rewarding luck is that sooner or later luck runs
out. If you are in a position where you evaluate managers’
performances, the long-term success of your company depends on making
this distinction (talent versus luck) as fairly as you can.
Companies Use DA
That said, the tools are available for our use to improve the odds of
getting what we want. I’m optimistic—I think that as managers experiment
with the DA tools and find out that the tools are useful, DA will
increasingly become a significant part of the way we run our businesses.
There is one caveat, though. All of the DA tools assume purposeful and
transparent motivation and communication. For situations where this
assumption may not hold in your dealings with other parties, a game
theory approach is more appropriate (see, for example, Game
Theory for Business by Paul Papayoanou). If the culture of your own
organization needs significant repair, that is another topic. The DA tools can
help improve an organization’s culture, but only to the extent that
communication is direct and honest.
Note that the influence diagram is included as both a framing tool and an
analysis tool. This is because the influence diagram serves as a bridge
between framing and analysis.
Also, note that once the course of action becomes clear, it is time to
transition from DA to planning.
CHAPTER 2
About Facilitation
Another facilitation factor is the number of people you include in your
meetings. There are trade-offs between having everybody included versus
having a manageable team.
Problem-solving meetings are the same way—the more people you have,
the less time for each person to speak and the more difficulty you have
facilitating. However, excluding a key person from your meeting is likely to
generate hurt feelings, set your project back, and deprive you of input
necessary to solve the problem.
Another consideration is the willingness of the people to work together
and try the tools.
When I was consulting, one time I got to the meeting and the room
contained 18 people, most of whom had never met each other before.
The project manager and I had discussed the meeting and had
developed an opportunity statement (see the next section), but I didn’t
realize there were that many people involved. I thought to myself,
“What am I in for?” The group was fabulous, though—they did
everything I asked them to do, and all of them participated with energy
and competence. We ended up with a delightful product that later
withstood a surprise audit by the parent company. One of the things we
did during the two-day workshop was split into smaller groups and
divide up the tasks—this allowed people to work on what they were
interested in.
Problem/Opportunity Statement
The first element of any decision analysis (even a very simple one) is a
problem statement. (Some companies always refer to these starting points
as “opportunity” statements, which can result in team cynicism if the
situation is truly a problem.)
This should be a concise one-, two-, or three-sentence (or bullet point)
statement as to why you are doing something. If I’m consulting, I meet with
the project manager before the meeting and develop the statement together
with him or her. If I’m convening a team to address the problem, I draft the
problem statement ahead of time and send it to the team as a pre-read.
It is very important to give the team time to discuss the problem
statement and edit it together. Once the team edits the statement together,
they align and own it. You do have to make sure that the edits are correct—
don’t abandon the content, but you have to be willing to adjust to include
others’ perspectives. It is also important to make sure that the statement is
technically correct, given the state of knowledge of your team and
associated subject-matter experts.
As an example statement, let’s use a career path decision as the situation
we have to deal with. Perhaps you or someone in your family is graduating
from college or grad school and has to choose among multiple job offers.
Or perhaps you or one of your friends is working for a company whose
future is limited (poor decision-making practices), so you’ve engaged a
recruiter and have ended up with multiple job offers. Here’s an example
statement:
Opportunity statement: We have to choose between multiple possible
career paths that have become available, specifically three competing
job offers. We want to maximize both financial and lifestyle benefits
for the family unit.
In this case, assuming the job offers are good ones, this is best
characterized as an opportunity rather than a problem!
My thinking concerning this very important DA tool has changed
recently as a result of new perspective….
My friend Brian Hagen, an experienced DA consultant, is in the process
of publishing a book that notes the distinctions associated with problems,
risks, and opportunities.3 This can be powerful, as management typically
approaches these three types of situations differently (and with different
biases). Brian notes that there are usually “tipping point” events associated
with problems, risks, and opportunities and we need to understand where
these tipping points are.
A problem is a situation where some kind of tipping point has occurred
or is certain to occur. My friend Tom Sciance notes that, sooner or later,
“the oxcart ends up in the ditch” and you have to figure out how to get it out
of the ditch. (This could be a literal tipping point!) “Do nothing” may be an
alternative in a problem situation (leaving our oxcart there for somebody
else to haul off), but there are always consequences associated with the do-
nothing alternative (losing the money we spent for the oxcart and maybe
paying a fine if the authorities figure out that it is ours). From a financial
perspective, it is very unlikely that the problem was included in the budget;
hence more money is likely to be required to resolve the situation.
A risk is a potential problem. In our case, there is a risk that if the road is
wet, the oxcart could slide into the ditch. There is a risk that an axle could
break and cause the oxcart to slide into the ditch. Here, the tipping point
hasn’t happened, but it is something that we would mitigate if we could do
so. In this case, we’d add the cost of mitigation to our budget if we deemed
it necessary.
An opportunity is a situation where something positive may be
obtainable by expending time and monetary resources. In our case, perhaps
a new transportation company has started up, which will allow us to hire
them to haul our stuff with their oxcart at a reduced cost.
Understanding whether the situation is a problem, a risk, or an
opportunity at the start of the decision analysis can add very valuable
perspective on which tools you might use and how you approach the
situation.
Issue Raising
Once we have edited our statement, it is time for issue raising. This is a
classic brainstorming exercise—we try to capture everybody’s thoughts in
reaction to the problem as fast as we can. We don’t judge the merit of the
opinion—we just write it down. My former DSI colleagues from Calgary,
James Mitchell and Bob DeWolff, call this “rant time.”
The reason we need to do this is that people cannot move on to exploring
new thoughts (and others’ thoughts) very well if they don’t get a chance to
speak their opinions first. They will sit there and get frustrated while
waiting for a chance to interrupt with their thought(s). Plus, if the facilitator
captures the thoughts accurately, it demonstrates that he/she is listening,
which is key to getting good involvement as you go through the tools. You
can ask clarifying questions if you don’t understand the person’s point. And
you don’t have to capture every word—try to headline the key thoughts.
One project I facilitated had a good problem statement that the team
edited and agreed to. However, during the two-day workshop, by 10:30
AM on the first day we were still in rant time! It turns out that there
was another project that the team was worried about, and they couldn’t
stay on topic due to this other project. So we called a break. The
project manager and I talked about the other project, and when we
came back we shifted and collaboratively drew the decision trees for
the other project. Once the team worked through the decisions and
uncertainties associated with the second project and got some clarity
about how they were going to approach it, they were comfortable
putting it behind them and going back to the “real” problem.
Back to our multiple-career path example, here are some example issues
that might come out as part of the discussion:
Which is more important, financial remuneration or lifestyle?
How do we account for the higher risk (and higher potential reward)
of one of the companies?
Which job includes the most interesting work?
Which is more important: long-term career or short-term
compensation?
How do we incorporate opinions and preferences of key
stakeholders (friends and family) into our decision-making process?
Situation Analysis
So far, we’ve delineated the problem and recorded our team’s reaction to
the situation. At this point, it is appropriate for somebody who has technical
and business knowledge about the problem situation to brief the rest of the
team as to the technical and business situation. If the situation involves a
competitive landscape, the standard tools for examining your position
relative to the competition apply (e.g., Porter’s five forces, SWOT1).
Stakeholder Analysis
Stakeholder Analysis can be completed quickly or it may take quite a bit of
time, depending on how many stakeholders are associated with the problem
and how complex their interests are. The simplest way to start this is to
brainstorm a list of stakeholders, combine them into groups, examine the
interests of each stakeholder, assess how powerfully they can influence the
outcome, and describe how the problem affects them.
As this is being done, you are likely to hear comments noting actions that
need to be taken as part of implementation—comments such as, “We need
to be sure that we file for this permit by a certain date, otherwise we’ll be
delayed,” or “We need to meet with the person quickly to determine
whether they’ll support our work or not, as they could derail it.” Be sure to
note these comments on a “bucket list” or action item list so that the
thoughts are captured.
What Do We Want?
One time when I was consulting, I met with a group of managers talking
about some common issues they were having with projects at their
company. Complaints included:
In this chapter, we’ll discuss one of the most powerful framing tools in
the DA toolkit: the objectives hierarchy. Proper use of the objectives
hierarchy can help teams and employees align, understand trade-offs, and
understand what the goals are for an endeavor. If the objectives hierarchy is
not addressed, the complaints listed above are likely to appear.
An objectives hierarchy is a schematic representation of a project’s
primary, fundamental, and means (or supporting) objectives, where:
With this figure in mind, and listening to the issues raised by the team in
reaction to the problem statement, you can start to get a feel for which tools
are going to be the most useful for your situation.
Note how one’s perspective can affect one’s viewpoint on objectives and
their relative importance. Much value can be obtained if a team takes the
time to articulate and understand each other’s view on the project’s
objectives.
Developing a good objectives hierarchy with the first client and then
using it with the second client would have resulted in either (1) getting buy-
in on what was important from the second client, or (2) abandoning the
project at the time of changing clients. Either outcome would have been
preferable to what happened.
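One simple way to make such trade-offs explicit, once the objectives are articulated, is a weighted-sum score. The sketch below uses the career-path example’s three objectives (financial, social, professional); the weights and 0-to-10 scores are invented for illustration, not values from the book, and the book treats non-quantitative measures more carefully later:

```python
# Invented weights on the three objectives (they must sum to 1).
weights = {"financial": 0.5, "social": 0.3, "professional": 0.2}

# Invented 0-10 scores for each job offer against each objective.
scores = {
    "Offer A": {"financial": 8, "social": 5, "professional": 6},
    "Offer B": {"financial": 6, "social": 8, "professional": 7},
    "Offer C": {"financial": 9, "social": 4, "professional": 9},
}

# Weighted-sum total for each offer.
totals = {offer: sum(weights[obj] * s for obj, s in vals.items())
          for offer, vals in scores.items()}
best = max(totals, key=totals.get)
print(best)  # → Offer C
```

The value of the exercise is less the final number than the conversation it forces: a team that has to agree on the weights has to articulate and understand each other’s views on the objectives.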
Now that we understand what we want and have explored the trade-offs
between our various goals and objectives, we can begin to look at our
alternatives. The tools we’ll explore include the decision hierarchy, decision
trees, and the strategy table. All three of these tools are very useful and can
be used within the context of an analysis or to add insight to a smaller
problem.
Decision Hierarchy
There are three kinds of decisions we need to consider:
closed decisions,
strategic decisions, and
tactical decisions.
Closed decisions are decisions that have already been made. They are
sometimes referred to as boundary conditions. Closed decisions may be
decisions that have been discussed and decided by the project team or group
already and you do not want to reopen them. Closed decisions may be
decisions that have previously been made by management. Some texts call
these “policy” decisions, but I don’t use this term because teams confuse
policy decisions with corporate policy. The word “closed” is a better
description. It is important, however, to verify that these decisions are
indeed closed with whomever the decision maker is for your problem.
Sometimes executives will challenge what a team presumes to be a closed
decision and instruct the team to explore potential alternatives.
Tactical decisions (in this context) are decisions that are important, but
they can be addressed later. Again, though, it is important to verify with the
decision maker(s) that you can make these decisions later. Decision analysts
have a saying, “Today’s tactical is tomorrow’s strategic,” as it is very likely
that you are going to have to come back and deal with the tactical decisions
sooner or later. This part of the hierarchy is useful for planning, as you can
note on a time line when the decisions will need to be addressed.
Considering our career-path example, our decision hierarchy might
include:
Closed decisions
Strategic decision
Tactical decisions
Decision Trees
You’ve probably figured out that one of my favorite tools is the decision
tree. Trees are simple but powerful and can be used in many ways. In a
decision tree:
Current technology addresses some, but not all, of the needs of the
program, in which case a cost/benefit analysis should be done to
determine whether to proceed or wait. We labeled this outcome the
“sort-of” case; the team deemed this the most likely (and least
desirable) outcome.
This wasn’t how I drew the tree, though. Figure 4.4 is the tree we used
during the meeting.
The insight was that only in the middle “sort of” case would a decision
have to be made.
Also, you don’t have to completely fill out the tree to gain insight. If you
have several uncertainties, the tree can quickly become a thorn bush with
too many branches. Here you can simply draw the decisions and
uncertainties as unconnected events in their relevant sequencing (Figure
4.5). Decision analysts sometimes refer to this kind of a tree as a “skeleton”
tree.
Trees can also be used as part of the analysis toolkit, so we’ll be talking
about decision trees again later.
Alternatives Tables
If you have more than two or three decisions, it becomes difficult to
represent your problem with trees. If, for example, you have five decisions
with three alternatives for each decision, there would be three to the fifth
power, or 243 possible end nodes (combinations of alternatives) in the tree
without including uncertainty! A tree with this many branches is too large a
tree to add insight.
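You can verify the combinatorics directly. In this sketch the five decisions and their three alternatives each are invented stand-ins for whatever your alternatives table contains:

```python
from itertools import product

# Five hypothetical decisions, three alternatives each (names invented).
decisions = {
    "body style": ["coupe", "sedan", "SUV"],
    "engine":     ["I4", "V6", "V8"],
    "drivetrain": ["FWD", "RWD", "AWD"],
    "trim":       ["base", "sport", "luxury"],
    "market":     ["economy", "mid-range", "premium"],
}

# Cartesian product: every possible combination of one alternative per decision.
combinations = list(product(*decisions.values()))
print(len(combinations))  # → 243, i.e., three to the fifth power
```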
To handle this, we list each decision across the page and note the
alternatives available for each decision in a column underneath it. The
easiest way to build an alternatives table is to use a whiteboard, or, if you
want to use a projector and work online, use Excel. Note that since we only
have one decision for our career path example, we don’t need an
alternatives table. Therefore, we’ll illustrate the concept with a new
automobile strategy example.
For this example, you are on a team that has been tasked with developing
a completely new vehicle. Starting with a blank whiteboard, you have many
alternatives available, such as shown in Figure 4.6.
Sometimes it is important for the alternatives to be mutually exclusive
and collectively exhaustive, but in this case, you can have more than one
alternative selected to constitute a product line. For example, a high
performance car could come standard with a V8 and have an option for a
V12 (like the BMW 830/840/850 series did).
Once again we can see the value of the objectives hierarchy.
Understanding what our management wants and what our team’s objectives
are will give us insight into which alternatives logically fit together. Once
we start to think about coherent sets of alternatives, we can develop a
strategy table.
Figure 4.6 Automobile alternatives table
Strategy Tables
Developing a strategy table can be fun. Looking at the decisions and lists of
alternatives in the alternatives table, we brainstorm a list of strategic themes
(e.g., aggressive, economy). Humorous names for the themes can make this
exercise more fun (e.g., “Genghis Khan” for aggressive, “Cheapskate” for
economy). Select the alternative for each decision that is consistent with the
strategic theme. In a team situation, I ask each person on the team to
develop one or two strategies and note their rationale in developing their
strategies. Then we list each strategic theme and quickly go through them
together. Figure 4.7 shows several possible strategies for the automobile
example.
I’ve listed the strategic themes in the left column and then the
alternatives for each of the five decisions in columns to the right. A
“frugality” theme could include a coupe and a sedan. “Haul the family”
could include a minivan, SUV, and station wagon. The strategy table helps
you to formulate strategies that are internally consistent and allows you to
explore your options.
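A strategy table can be mirrored in code as one pick per decision, with a quick consistency check that every pick is actually on the table. The decisions, alternatives, and theme names below are loosely based on the automobile example but are otherwise invented:

```python
# Alternatives available for each decision (an illustrative subset).
decisions = {
    "body style": ["coupe", "sedan", "minivan", "SUV", "station wagon"],
    "engine":     ["I4", "V6", "V8"],
    "market":     ["economy", "mid-range", "premium"],
}

# A strategy is one internally consistent pick per decision.
strategies = {
    "Cheapskate":      {"body style": "sedan",   "engine": "I4", "market": "economy"},
    "Haul the family": {"body style": "minivan", "engine": "V6", "market": "mid-range"},
}

# Sanity check: every pick must be an alternative the table actually offers.
for theme, picks in strategies.items():
    for decision, choice in picks.items():
        assert choice in decisions[decision], f"{theme}: {choice!r} is not on the table"
print("all strategies internally consistent")
```

The check is trivial, but it catches exactly the confusion discussed below: objectives and uncertainties sneaking into a column where only true alternatives belong.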
One way to qualitatively evaluate each strategy is to list its pros, cons,
and hunches. Then list the answer to the question, “If our competitors knew
we were planning this strategy, what would they say about it?” This
evaluation can be used to narrow from many strategies to two or three to
carry into analysis. Also, many times a hybrid strategy ends up being the
strongest.
Keep It Clear!
It is important to make sure that you and your team are clear on objectives,
alternatives, and uncertainties. I help David Skinner teach the EMBA class
at Rice each year. The class is divided into teams, and each team presents a
comprehensive project summary at the end of the semester. Invariably at
least one team will add confusion to their report by adding objectives and
uncertainties to the alternatives contained within their strategy table. Client
teams I’ve worked with have been tempted to do this too.
An alternative represents a true choice that your organization has the
authority to make. An objective is something that you want—it may be
important and something to strive for, but it is not an alternative. An
uncertainty is something that you cannot control. It may be extremely
important, and it may be related to your alternatives and your objectives,
but it is not a choice.
Let’s put this in the context of our career path example. We have one
decision with three alternatives. We choose which job offer we accept.
Our objectives are noted in the previous chapter—we would like a
good financial package, a good social life, and a good professional
environment. The uncertainty is the extent to which, given the choice
we make, we will achieve our objectives.
The reason that I bring this up is that a strategy table won’t work if you
don’t have clarity around the choices that you actually have.
Influence Diagrams
What Do We Know?
We’ve discussed our problem and put it into context. We now understand
our objectives and decisions. It is time to consider uncertainty (although the
decision tree does get us started on the uncertainty discussion). A good way
to discuss uncertainty is to use a tool called the influence diagram.
An influence diagram is a graphical representation of a problem, which
shows:
Most influence diagrams are more complex than this one. Teams starting
out have a tendency to make the influence diagrams too complicated—
including every possible input from an accounting cost sheet. For one of my
first projects, we had a conference room with one entire wall made of
whiteboard, and we filled that wall with the influence diagram!
During the framing phase, it is better to keep the influence diagram
relatively simple. Once we start the analysis and assess subject matter
experts, we’ll find that the influence diagram will change anyway, as
experts have their own way of looking at the risks and uncertainties.
Another tip for using the influence diagram is to note the experts you will
talk with to assess the risks and uncertainties on the influence diagram after
you have drawn it. If you are reviewing the project for your decision maker,
showing the experts that you plan to consult will add credibility to your
project if the decision maker agrees with your selection of subject matter
experts. It is also likely that the decision maker will suggest additional
experts that your team had not considered, which will allow you access to
more expertise and increase the credibility of the analysis.
For our career path example (Figure 5.3), we have one strategic
decision: which job offer to accept. There are three measures-of-value:
financial (we’ll use NPV of pay plus benefits as the financial
measure), social, and professional. We’ll discuss how to handle the
non-quantitative measures-of-value (social and professional) later in
the book (hence they are grayed out in Figure 5.3). The uncertainties
associated with NPV include starting salary, raises and promotions,
bonuses, value of benefits, options, and, in the case of offer C,
probability of success for the start-up company.
Another option (not shown to keep the example simple) is to include
an estimate of cost-of-living associated with each job offer. If cost-of-
living differences are significant, you would need to include this
uncertainty as well. Also, I’ve shown taxes in the influence diagram—
you may elect to use NPV of pre-tax cash flow as your measure-of-
value, or you may decide to use after-tax cash flow.
When you consider Figure 5.3, think how valuable it would be to show
the influence diagram to your significant other as you think through the job
offer decision. The decision tree helps us to frame the decision, but the
influence diagram shows the uncertainties that feed into just the financial
part of the decision! It also gives you insight as to how you might approach
doing the calculations to estimate the NPV of the three alternatives, hence
illustrating the value of the influence diagram as a segue into analysis.
Uncertainty Assessment
Discrete Probability
When hearing or reading the word risk, what comes to your mind? Most
people associate risk with the chance that something bad will happen. Some
decision analysts use the terms risk or chance when the unknown quantity
has a discrete outcome: yes or no, success or failure, zero or one. A coin
flip has a discrete outcome with a 50% chance of landing on either side
(heads or tails) of the (presumably honest) coin. Discrete outcomes are
always assessed as a probability (a number between zero and one, often
expressed as a percentage), never a range.
For example, weather forecasters assign a percentage for the chance of
rain (e.g., 30%). The percentage means that, when weather conditions are
like they are now, on average for any given location within the relevant
area, on three days out of ten there will be measurable precipitation and
on the other seven there will be none. Note that the probability of rain
does not address how much rain is likely to fall. It could be 0.01 inch or
14 inches!
The p50, or median, is the middle of the distribution:
Half of the time, the answer will be lower and half of the time the
answer will be higher than the p50.
It equals the mean when dealing with uniform or normal
distributions.
With the rain example, the median might be 0.25 inches, again, if we have
measurable precipitation.
I’ve shown these assessments (0.05, 0.25, and 1.5 inches) in tree form
(along with a 30% chance of rain, Figure 6.1).
Figure 6.1 Rain uncertainty
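To see how the discrete chance of rain and the conditional amounts combine into an expected value, here is a quick sketch in Python. The 0.3/0.4/0.3 weighting of the p10/p50/p90 branches is Swanson's rule, a common discretization; it is my assumption here, not something specified in the text.

```python
# Probability-weighted expected rainfall from the tree in Figure 6.1.
# The 0.3/0.4/0.3 weights on the p10/p50/p90 branches are Swanson's
# rule, a common discretization -- an assumption, not from the text.
p_rain = 0.30                  # discrete chance of measurable rain
amounts = [0.05, 0.25, 1.50]   # p10, p50, p90 inches, given rain
weights = [0.30, 0.40, 0.30]

e_given_rain = sum(a * w for a, w in zip(amounts, weights))
e_rainfall = p_rain * e_given_rain  # unconditional expected inches
```

Multiplying the conditional expectation by the discrete probability gives the unconditional expected rainfall, exactly as rolling back the tree would.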
Some people prefer to work with the density function; others prefer the
cumulative probability graph (s-curve). I’ve always worked with the s-
curves; therefore, I will primarily focus on them.
In reality, distributions are seldom normal. Consider the rain quantity
distribution in Figure 6.4—it is clearly not a normal distribution.
If there is measurable precipitation, you can see that the curve flattens out
rather quickly. This makes sense—it is normally quite rare (at least in North
America) to get more than four inches of rain within one day. Of course, if
there is a tropical storm or hurricane, the conditions are different and the
weatherman’s assessment would likely be different than the one shown in
the tree (Figure 6.1).
Referring again to Figure 6.4, the p70 is at about 0.5 inches. This means
that only three times in ten similar situations will rainfall exceed 0.5 inches.
Note that Figure 6.4 assumes that there is measurable precipitation. If we
consider that there’s only a 30% chance of measurable precipitation, the
curve would not start until the 70% point on the y-axis.
Figure 6.4 Rainfall cumulative probability distribution
Many naturally occurring distributions are log normal; however, there are
many different statistical distributions. We’ll talk more about how to deal
with different distributions when we talk about modeling and simulation
later in the book.
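An s-curve is easy to construct empirically: sort sampled outcomes and read percentiles off the sorted list. A minimal sketch, with lognormal parameters invented for illustration (not fitted to Figure 6.4):

```python
import random

random.seed(42)

# Sample a lognormal "rainfall given rain" distribution; the mu/sigma
# values are illustrative assumptions, not fitted to Figure 6.4.
samples = sorted(random.lognormvariate(-1.4, 1.0) for _ in range(10_000))

def percentile(sorted_xs, p):
    """Read the p-th percentile off the empirical s-curve."""
    idx = min(len(sorted_xs) - 1, int(p / 100 * len(sorted_xs)))
    return sorted_xs[idx]

p10, p50, p90 = (percentile(samples, p) for p in (10, 50, 90))
# For a lognormal the curve is skewed: p90 - p50 > p50 - p10.
```

Plotting each sorted sample against its cumulative rank produces the s-curve; the asymmetric spread around the p50 is the skew visible in Figure 6.4.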
The way to counter this is to do the best job you can ahead of time and
develop an s-curve as to what you think the other side would accept. You
can float the first number as long as you understand where on the s-curve
you are—you want a number that is credible but low (e.g., the p10 or p20
on the curve). This is a very powerful concept.
It bears repeating, though—always start with the p10 or p90 and finish
with the p50.
Overconfidence
David Skinner and I use a fun exercise with our classes to illustrate
overconfidence. We call this the “Attila the Hun” exercise. We go
through the explanation of uncertainty, ranges, and distributions. Then
we have five “off the wall” questions (e.g., “What is the year Attila the
Hun died?”). Each person in the class writes down their p10, p50, and
p90 estimate of the answer. Then we tally up the results for the five
questions in four columns:
If true p10s and p90s were captured, about 20% of the answers would be
lower than the p10 estimates or higher than the p90 estimates. This never
happens! Usually about half of the results are in columns A and D.
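The exercise's result can be reproduced in simulation. In this hedged sketch (the normal distribution and band widths are my assumptions), a calibrated expert whose p10–p90 band truly covers 80% of outcomes is surprised about 20% of the time, while an expert whose band is half as wide is surprised about half the time:

```python
import random

random.seed(7)
TRUTH_MU, TRUTH_SIGMA = 100.0, 15.0  # hypothetical "true" uncertainty
Z80 = 1.2816  # z-score of the 10th/90th percentiles of a normal

def surprise_rate(width_factor, trials=20_000):
    """Fraction of outcomes falling outside the expert's p10-p90 band."""
    lo = TRUTH_MU - width_factor * Z80 * TRUTH_SIGMA
    hi = TRUTH_MU + width_factor * Z80 * TRUTH_SIGMA
    outside = sum(
        1 for _ in range(trials)
        if not lo <= random.gauss(TRUTH_MU, TRUTH_SIGMA) <= hi
    )
    return outside / trials

calibrated = surprise_rate(1.0)     # true-width band: about 20% surprises
overconfident = surprise_rate(0.5)  # half-width band: far more surprises
```

With the half-width band, roughly half the outcomes land outside the stated range, matching the "about half the results in columns A and D" pattern from the classroom exercise.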
The way we counter this bias is to ask the question, “Let’s start with the
p10 outcome. What things could happen that could lead to a low outcome
for this uncertainty?” You write down all the items the expert can think of.
Then you reverse the question and ask the expert for things that could
happen that could lead to a high outcome for this uncertainty. Once you
have painted this picture, you ask the expert for their p10 estimate and their
p90 estimate. After you have the p10 and the p90 established, you ask for
the p50.
There’s nothing magic about the p10–p50–p90 points on the distribution.
If the expert thinks differently about the situation, capture the expert’s
thinking. For example, one time I was assessing an expert about equipment
costs, and he said, “I don’t know about a p90, but I can tell you what a p70
number would be.” In his mind, he felt comfortable giving an estimate that
the outcome would be higher about 30% of the time.
Immediacy Bias
Yet another potential bias is personal interest. If the expert has a personal
interest in your outcome, they cannot help but be biased. If you are talking
with one of your project engineers about how long it will take to complete
part of a project, the engineer may be tempted to anchor you high and “sand
bag” the task.
These are just a few of the common biases. For further reading on bias, I
recommend Tversky and Kahneman.1
For each uncertainty, as noted when we discussed anchoring bias, start with
either the p10 or p90. Write down the expert’s view of the kinds of things
that could lead to a p10 outcome. Then switch and write down the things
that could lead to the p90 outcome. Then ask the expert, “Given that a
number of the factors that could lead to a p10 outcome happen, what do you
think the outcome would be?” Write down his or her answer and repeat for
the p90. Draw a graph showing these points on the whiteboard. When you
and the expert are satisfied that you have captured his or her state-of-
knowledge, ask what the expert thinks the p50 outcome would be. This is
the outcome where the expert thinks there’s an equal chance of the outcome
ending up being higher or lower than the estimate. Plotting this point gives
you an idea of the expert’s thinking about the distribution. For example, if
the p50 is much closer to the p10 than to the p90, the distribution is skewed
toward the low side, but there’s a small chance of a high outcome.
It is important to write down the basis behind the expert’s thinking.
Sometimes other managers will not agree with the numbers, probably
because of their own experience or they have a different expert. Having
written down the factors leading toward the p10 and p90, the people who
disagree can discuss the factors. It is possible that the original expert did not
consider additional factors that might be important, in which case you can
work with the experts to update the assessment. As true experts discuss the
factors and their experience, in my experience they will almost always build
on each other’s thinking and end up with a better result. Again, document
the subsequent discussions—you may have to iterate several times, but the
assessments should improve in quality as you iterate.
In the career example, we would assess the high, medium, and low
outcomes (i.e., p10/50/90 outcomes) of our potential wages, salary,
bonuses, and benefits for each of the three companies. We are likely to
have very similar factors that would lead to high outcomes or low
outcomes for these uncertainties. The expert may want to consider
them collectively or individually. As noted previously, we write down
the factors that could result in a high (p90) outcome, write down
factors that could lead to a low (p10) outcome, and then write down
the assessments.
In the career path example, we have offers in-hand, so we know the
salary and current benefits. What is not known are future raises,
bonuses, and changes to benefits. This is what we have to assess. For
example, for a given company, future raises could be assessed at 0%,
2%, or 5% per year. Bonuses could be assessed at 0%, 5%, and 15%.
Note that these distributions are not normal distributions—we’ll
discuss distributions again in Chapters 8 and 9.
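As a sketch of what such assessments imply, here is the six-year salary trajectory under each raise scenario; the $60,000 starting salary is a hypothetical number, not from the book:

```python
START_SALARY = 60_000  # hypothetical offer amount; not from the text
RAISE_SCENARIOS = {"p10": 0.00, "p50": 0.02, "p90": 0.05}
YEARS = 6

def salary_path(start, raise_rate, years):
    """Salary in each year, compounding the annual raise."""
    return [start * (1 + raise_rate) ** t for t in range(years)]

totals = {
    label: sum(salary_path(START_SALARY, r, YEARS))
    for label, r in RAISE_SCENARIOS.items()
}
# The spread between the p10 and p90 totals is the uncertainty the
# assessment captures -- note it is skewed toward the upside.
```

Even these modest raise percentages compound into a noticeably asymmetric six-year spread, which is why such distributions are not normal.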
A Special Case: Price
What about the selling price of a product? Is it a known (constant)? Is it a
decision that a company makes? Or is it uncertain? Traditional economics,
of course, notes the price/volume relationship—as you lower a product’s
price, you should see a corresponding increase in sales volume, and vice
versa.
It sounds trite, but my answer to the previous questions is “yes!” Price
can be a constant, it can be a decision, and it is definitely uncertain. For
commodities, of course, price is a true uncertainty, set by traders in
commodity exchanges (e.g., the price of oil, gold, and wheat). However, for
products that are not commodities, I think the correct way to look at price
includes the following:
1. We set a target list price for a product. This is a decision that is likely
based on judgment, experience, market assessment, and (hopefully)
results from a good DA model!
2. Market response to our list price is uncertain. The volume of sales we
will achieve from a specific list price is an uncertainty and should be
represented by a range.
3. The level of discounting off the list price we will have to utilize in
order to stay in business is another uncertainty (with a few exceptions
such as some of Apple’s products, nobody pays list price for anything
today!).
4. The length of time before we have to lower our list price in order to
respond to competitive pressure is another uncertainty. Or, we may
have to eventually raise prices to cover costs if we’re in a period of
rising commodity prices.
5. Ability to sustain a price increase due to increasing commodity prices
is uncertain. Think about the airlines—if fuel prices go up, one airline
will raise their prices and hope the others follow. If the others do not
follow, they’ll have to lower their prices back if they want to keep the
same number of customers.
6. Planned obsolescence is another decision or set of decisions. Consider
electronic and computer products—new models are introduced all too
frequently. This gives the manufacturers an opportunity to reset the list
price and customer expectations with each introduction (if they so
desire). The length of time to develop and produce the next edition of
the product is another uncertainty (your best critical path plan usually
represents somewhere between a p01 and a p05 of the actual start-up
date—we’ll talk more about this later).
We have talked with our experts and updated our influence diagram
accordingly. It is now time to start building a deterministic financial model.
By deterministic, I mean a model that provides one “answer” (or set of
answers, e.g., NPV and internal rate of return of the project) and does not
address uncertainty (yet). Each uncertainty is set to its base value. Once we
have built and validated a deterministic model, we’ll use it to explore which
uncertainties are important (Chapter 8) and then update it for simulation
(Chapter 9). We will also use the deterministic model to populate the end
nodes of the decision tree(s) so that the trees can be rolled back as we
showed in Figure 4.2.
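At its core, the deterministic model is a single pass from inputs to a measure-of-value. A minimal NPV sketch (following Excel's NPV() convention of discounting the first cash flow by one full period):

```python
def npv(rate, cash_flows):
    """Net present value; like Excel's NPV(), the first cash flow is
    discounted one full period."""
    return sum(cf / (1 + rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

# Three equal cash flows of 100 at a 5% discount rate.
value = npv(0.05, [100, 100, 100])
```

This single "answer" is what each end node of the decision tree receives before the tree is rolled back.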
I’m assuming that you use Excel and can find your way around the
program, as most managers develop budgets, projections, and forecasts (or
check work done by somebody on their staff). If you went to business
school, you likely built (or your team built) many financial models to
analyze cases. Therefore, we’ll focus on how to build good models that are
clear, concise, reasonably efficient, and adaptable to probabilistic simulation.
We’ll start with some information about Excel, the terminology we’re
going to use throughout the rest of the book, and some helpful hints on
developing good models.
There is a caveat, though, about this chapter. If you don’t use Excel in
your job, presumably you have support staff who provide this capability. In
this case, I’d suggest skimming this chapter quickly from a viewpoint of
thinking about how you can maintain quality control with respect to the
work that Excel is used to accomplish. You should stop skimming, though,
and start reading with the section titled “Model Validation and Quality
Control” as this is important. I used Excel to prepare many of the tables and
charts, and a reader who is skilled in using Excel can create similar models,
tables, and charts by applying the principles and techniques discussed in this
book. Therefore I am not providing step-by-step instructions for doing this.
About Excel
Excel has an interesting history. Those of us who were early into PC usage
well remember that the dominant spreadsheet program was Lotus 1–2–3.
Lotus 1–2–3 built on an idea first implemented with a program called
“VisiCalc,” which was written for the Apple II and ported to the
Commodore. VisiCalc was a significant breakthrough in computing and was
one of the programs that turned PCs into something useful for business.
Lotus 1–2–3 was an amazing program, both in terms of functionality and
efficient coding—it would run on an IBM PC that contained only 128K of
memory! Most of us used it for economic, financial, and engineering
computations well into the 1990s. However, Lotus was slow to port the
program to the Mac, which left a gap.
In 1985, Microsoft filled the gap with a new program written for the Mac
called Excel. By 1987, the first Windows version was released. A major
upgrade came with Excel 95, developed to correspond with Windows 95.
Visual Basic for Applications (VBA) was added with the Excel 97 version
for Windows and the Excel 98 version for the Mac. By the mid-1990s, Lotus
1–2–3 was rarely used. Subsequent changes have added marginally valuable
features and rearranged the interface into the “ribbon” format.
For a good discussion on Excel’s history (and many other useful items), I
refer you to John Walkenbach’s website http://www.spreadsheet-page.com.
Walkenbach has written over 50 books about Excel and one of his books1
was the most useful VBA tutorial I have found.
Terminology
This may sound dogmatic, but some terminology is appropriate at this point.
Try to avoid the term spreadsheet because the word is ambiguous. Does
“spreadsheet” refer to a page or tab in a file, the file itself, or the
application? I recommend the following terminology (which is consistent
with Microsoft usage):
The $ Operator
The dollar sign operator contained within a cell or range reference (within an
equation) is very important, as it fixes the column or row when you copy and
paste or drag to copy a cell or range within the worksheet. For example:
If you copy (or drag) a reference to the cell “A2” to the right by one
column and down one row, the reference changes in the new cells to
“B2” and “A3,” respectively.
If you copy a reference to “A$2” across one column and down one
row, the reference changes to “B$2” and “A$2,” respectively—the $
keeps the row fixed.
If you copy a reference to “$A2” across one column and down one
row, the reference changes to “$A2” and “$A3” for the new cells,
respectively.
If you copy a reference to “$A$2” across one column and down one
row, the reference does not change, but remains “$A$2.”
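The copy-and-paste rule can be captured in a few lines of code. This is not Excel itself—just a toy model of how the $ operator anchors part of a reference:

```python
import re

def shift_ref(ref, down, right):
    """Shift an A1-style reference as Excel's copy/paste would:
    a $ in front of the column or row anchors that part."""
    col_abs, col, row_abs, row = re.fullmatch(
        r"(\$?)([A-Z]+)(\$?)(\d+)", ref).groups()
    if not col_abs:  # relative column: shift by letters (base 26)
        n = 0
        for ch in col:
            n = n * 26 + ord(ch) - 64
        n += right
        col = ""
        while n:
            n, r = divmod(n - 1, 26)
            col = chr(65 + r) + col
    if not row_abs:  # relative row: shift numerically
        row = str(int(row) + down)
    return f"{col_abs}{col}{row_abs}{row}"
```

For example, shifting "A$2" one column right yields "B$2", while "$A$2" never moves at all.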
Range Names
We defined a “range” as any cell or group of cells. The reason ranges are
important is because you can name them. To name a range, you can either:
Highlight the cell or group of cells you want to name and then type
the name you want to use into the “Name Box” (upper left part of
your screen) and hit “enter” (“return”), or
From the menu, choose “Insert,” then “Name,” and then “Define,”
which brings up a dialog box where you can specify the name you
want to use and the range to which you want to refer. For the newer
“ribbon” versions of Excel, this is accessed by Formulas, and then
Define Names.
The method described in the second bullet point is how you can edit a
range name once you have created it. Also, using the second method, you
can create a constant that is not in any of the worksheets. You can apply the
range name to just one worksheet; however, normal practice is to use the
same range names throughout the workbook file.
Range names are important because they (1) speed up formula entry, (2)
make debugging and usage by other people much easier, and (3) will help
you remember what you did if you come back later (potentially years later)
to update your file. For example, if you have a series of cells constituting
sales contained in the cells B2 through B13, the formula “=sum(sales)” is
easier to quickly understand than “=sum(B2:B13).”
One caution, though—you don’t want to use a range name for every
possible variable in your workbook. If you have too many range names, they
become confusing and lose their utility.
Styles
The Microsoft Office suite allows you to define Styles for files produced by
each application. Like range names, styles can add clarity within Excel—you
can use them to define consistent fonts, colors, shading, number formats, and
lines for a specific type of cells. For example, I always create a style called
“InputBox” that I use for each input within the workbook (the newer
versions of Excel contain a preformatted style for Inputs). This way, places
to enter numbers are clear on the worksheet page.
However, like range names, you want to use a few styles and keep their
significance clear.
Model Architecture
Whether you have a very simple model designed to complete one calculation
or have a very large model containing all the economics for a mega-project,
the principles of good model architecture are the same. This section contains
some key principles of good model architecture.
Modules
Organize your work into modules, with inputs in one module, calculations in
one module, and outputs (reports) and graphs in a third module. If you have
a large model, you’d normally use one worksheet (tab) for the input
modules. There might be several calculation worksheets and several output
reports. If your workbook is simple, your input module might be small (e.g.,
three columns and five rows), with the calculation module below or to the
right.
Hard-wired Inputs
It is important to never “hard-wire” an input into a formula! Always use
separate input cells for each input, including (and especially) constants. A
workbook file with hard-wired inputs is very likely to propagate errors, as:
another person (or you after time has elapsed) won’t know what the
inputs are or where they came from,
hard-wired inputs may have incorrect units of measure, and
if the input changes, it is unlikely you’ll find all the places in the
workbook file where it was used.
Within the input module, leave space for probabilistic inputs (p10–p50–
p90 ranges), as we’ll be adding the assessments after the deterministic model
has been built.
Units
Make sure that you label each input and put in a separate column for units.
Therefore, an input module needs at least three columns: labels, units, and
values.
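The same discipline carries over if you ever move a model into code: keep inputs in one labeled structure, with units, and reference them by name. A sketch with hypothetical values:

```python
# Input "module": label -> (value, units). All values are hypothetical.
INPUTS = {
    "starting_salary": (60_000, "USD/year"),
    "raise_rate":      (0.02,   "fraction/year"),
    "discount_rate":   (0.05,   "fraction/year"),
    "years":           (6,      "years"),
}

def get(name):
    """Fetch an input value by name, like a named range in Excel."""
    return INPUTS[name][0]

# Calculations reference inputs by name -- no hard-wired constants.
total_salary = sum(
    get("starting_salary") * (1 + get("raise_rate")) ** t
    for t in range(get("years"))
)
```

As with range names, the payoff is that changing an input in one place updates every calculation that uses it, and a reader can tell at a glance what each number means.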
Range Names
As noted previously, careful use of range names makes workbook debugging
easier and makes your file more useful later on, both for yourself and for
others.
If you are building a model that you (1) may need to use for a long time, (2)
plan to share with others, or (3) will use infrequently in the future, add a
worksheet (or section within a worksheet) for version control and
documentation. Use this as a log to track the changes that you make and why
you are making them. Excel workbook files can sometimes become corrupt,
in which case you are likely to have to go back to an earlier version and
reconstruct your changes. If you have to come back a year or two later and
update the logic in the model, it will be much easier if you have a history of
how you built it.
The best way to monitor for file corruption is to track the size of the file.
If a 700 kB workbook file suddenly takes up 10 MB, it is corrupt and you
need to delete it and go back to the 700 kB version. The easiest way to cope
with this is to save your work each day with a new date or version number
incorporated into the file name (e.g., MyModelRev21.xls or
MyModel28OCT17.xls).
Excel does allow you to require a password to open a file. This can help
with securing proprietary or confidential information. However, this can be
very problematic if you have to come back to the same model several years
later and cannot remember the password. Also, be sure to back up your work
somewhere other than your computer’s hard drive, which can potentially fail
at any point in time. I use DropBox for this purpose, as it is free (up to a
certain point), secure, and works on most computers, smart phones, and
tablet devices.
We have to decide what interest rate we’re going to use for NPV. Interest
rate is an interesting consideration. For a company, the correct interest rate to
use would be the weighted average cost of capital (WACC). For an
individual, the correct interest rate would be the interest the individual can
achieve with their surplus funds. For many years, decision analysts have
used 10% as a default interest rate. However, since the onset of the Great
Recession in 2007–2008, 10% is probably too high.
As a consultant, I’ve observed several errors in specifying the interest
rate. Sometimes managers will use different interest rates for different
projects that they are comparing. This destroys clarity. The proper way to
compare the relative risk associated with different projects is to compare
their s-curves. I’ve even seen people change interest rates for different parts
of their model.
The problem with setting an interest rate too high is that doing so can
unduly penalize projects with later payouts. This is especially important in the
post-2008 recession era of low interest rate monetary policy. And, if you
delete your long-term payout projects, after a few years, you won’t have any
projects!
However, if your interest rate is too low, you may end up implementing
weak projects that should be deleted from your portfolio.
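The penalty on later payouts is easy to see numerically. In this sketch (cash flows are hypothetical), the late-payout project loses relatively more value as the rate rises:

```python
def npv(rate, cash_flows):
    # Discount the first cash flow by one period, as Excel's NPV() does.
    return sum(cf / (1 + rate) ** t
               for t, cf in enumerate(cash_flows, 1))

early = [100, 100, 100, 0, 0, 0]   # payouts up front (hypothetical)
late  = [0, 0, 0, 100, 100, 100]   # same payouts, deferred three years

# Relative value of the late project shrinks as the rate rises.
ratio_at_5 = npv(0.05, late) / npv(0.05, early)
ratio_at_10 = npv(0.10, late) / npv(0.10, early)
```

At 5% the deferred project keeps about 86% of the early project's value; at 10% it keeps only about 75%. Raise the rate far enough and every long-term project in the portfolio looks weak.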
1. Global variable, interest rate: For this analysis I’ve used 5%, as this
represents a compromise between what we could borrow money for
(assuming a good credit rating) and is probably higher than we could
achieve with a CD or money market fund.
2. Global variable, time of analysis: We have to decide what our time
frame of interest is. For this example, I’ve used 6 years. If we look at a
shorter time frame, we may not capture a good picture of the total
uncertainty associated with our decision. If we look at a longer time
frame, the accuracy of our inputs and assessments likely diminishes
significantly. We always have the option of increasing the time frame to
see if it changes our decision.
Figure 7.1 Career path influence diagram
3. Starting salary: In this scenario, we have job offers in hand, so we
know what the starting salaries are for each offer.
4. Raises and bonuses: In this scenario, the raises and bonuses are difficult
to assess. We are likely to be able to gain anecdotal evidence of past
practices as we interview people within each company. We can also
investigate websites to see if we can gain insight. The health of the
basic economy is an obvious factor affecting the raises that companies
give out. Also, the degree to which the individual succeeds in the job
and performs well should affect both.
I’ve modeled the raises and bonuses as two uncertainties for all the
years. This probably overstates the effect of this uncertainty. The most
likely scenario is that the raises received each year depend on the health
of the company and the overall economy and each year’s raise is mostly
independent relative to the previous year. When we look at simulation,
we can further explore this to see if it changes our decision.
5. Benefits: Most companies report total value of their benefit packages,
including 401(k) or similar programs. The 401(k) program is usually
spelled out, although sometimes the company contribution can depend
on whether the company is profitable. In our model, we do have some
uncertainty in the total benefits package, as it might be increased or
reduced depending on the economy and other factors. Note that we take
care not to double count the 401(k) contribution.
6. Taxes: Taxes are uncertain and can be assessed by examining historical
tax rates and considering the political environment and the state taxes
associated with each job offer. We could use NPV of pre-tax cash flow
as our measure-of-value, but this would not allow good accounting for
the effect of the 401(k) and state income tax. Taxes are not a global
variable due to the effect of state income tax.
7. Measure-of-value: After-tax NPV is our measure-of-value. An
alternative measure-of-value could be total compensation (wages,
salaries, and benefits).
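Pulling items 1 through 7 together, here is a deterministic sketch of the after-tax NPV for one offer. Every number is a hypothetical illustration, and the 401(k) is subtracted before computing taxes and then added back, following the treatment described in the text:

```python
# Deterministic after-tax NPV for one job offer. All inputs are
# hypothetical illustrations, not the book's actual numbers.
RATE, YEARS = 0.05, 6          # global variables (items 1 and 2)

salary0     = 60_000           # known from the offer in hand (item 3)
raise_rate  = 0.02             # p50 assessment of annual raises (item 4)
bonus_rate  = 0.05             # p50 assessment of annual bonus (item 4)
benefits    = 8_000            # value of the benefits package (item 5)
k401_rate   = 0.06             # 401(k) contribution rate (item 5)
tax_rate    = 0.25             # federal + state estimate (item 6)

npv = 0.0
for t in range(YEARS):
    salary  = salary0 * (1 + raise_rate) ** t
    gross   = salary * (1 + bonus_rate) + benefits
    k401    = salary * k401_rate
    taxable = gross - k401                        # 401(k) reduces taxes
    after_tax = taxable * (1 - tax_rate) + k401   # add the 401(k) back
    npv += after_tax / (1 + RATE) ** t            # year 0 undiscounted
```

Running the same pass with each offer's inputs yields the three deterministic NPVs to compare (item 7).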
Note the distinct input style. This makes it easy to see where the
inputs should be entered.
Note the distinct output style used for NPV.
I usually start in cell B2. This leaves an extra row and column, which
makes it easier to add something above or to the left of the work if
needed.
Note that the rows for the inputs correspond to where they are used.
This makes it easier to copy and paste and it also makes debugging
easier.
Since we’re using after-tax NPV as the measure-of-value, we have to
calculate gross income and pre-tax net income. Therefore, the 401(k)
is computed and subtracted to calculate taxes but is added back in to
compute the total. Interest and appreciation of the 401(k) is not
included in this model, as it is highly uncertain and not that relevant
over the six-year time horizon.
Note that the year 2013 is italicized. This is because it is equal to the
start year; subsequent years just add one to the value in cell E2. The
italics serve as a reminder that column E differs from the other
columns and would not be included in using copy and paste if we
decide to add more years to our model.
Note I have rounded to the nearest whole dollar—adding decimals
makes the numbers more difficult to read. I like to use the red font
with parentheses for negative numbers (this is a preference)—it
makes subtractions really stand out.
There is a balance between breaking down the calculations into steps
and combining into what you want. For example, I could have
calculated pre-tax net income directly, leaving out rows 8 and 9.
However, (1) we need the 401(k) results as a component of the after-
tax total anyway, and (2) it is easier to understand and troubleshoot
the calculation with rows 8 and 9 visible.
Tornado Diagrams
For example, with one new product line I worked on, the owner was
very concerned about reusing abandoned equipment from a former
project that had not worked out commercially. When we completed the
tornado diagram, it showed that other uncertainties, such as success
with market and business development, had a much higher impact on
NPV than success or failure with reuse of the old equipment. We
reassigned the engineers that were working on the equipment reuse
issue to business development as a result of the analysis.
Make sure that each uncertainty is set to its base value (p50). Input
each uncertainty’s p10 and p90, one at a time, into our deterministic
model and record the resulting measure-of-value (usually we use
NPV as our measure-of-value for the tornado graph).
In a similar fashion, input each discrete variable as success and
failure, one at a time, into the deterministic model and record the
result.
Compute the absolute value of the difference between the p90 and
p10 resulting measure-of-value for each uncertainty.
Compute the absolute value of the difference between the success
and failure cases for each discrete variable.
Order the differences from maximum to minimum.
Plot the results using a horizontal stacked bar graph.
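The steps above can be sketched in code; the toy model and assessments below are my illustrations, not the book's numbers:

```python
# Tornado-diagram calculation: swing each uncertainty between its p10
# and p90 while holding the others at their p50 (base) values.
def model(inputs):
    # Toy measure-of-value; the weights are illustrative assumptions.
    return 10 * inputs["raises"] + 2 * inputs["bonus"] + inputs["benefits"]

assessments = {            # name: (p10, p50, p90), all hypothetical
    "raises":   (0.00, 0.02, 0.05),
    "bonus":    (0.00, 0.05, 0.15),
    "benefits": (0.90, 1.00, 1.10),
}

base = {name: p50 for name, (p10, p50, p90) in assessments.items()}

swings = {}
for name, (p10, p50, p90) in assessments.items():
    low  = model({**base, name: p10})   # one input at p10, rest at base
    high = model({**base, name: p90})   # one input at p90, rest at base
    swings[name] = abs(high - low)

# Order from maximum to minimum swing -- the tornado's bar order.
tornado = sorted(swings.items(), key=lambda kv: kv[1], reverse=True)
```

The sorted swings are exactly the bar lengths of the tornado, widest at the top.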
We repeat these steps for the remaining three uncertainties and record the
results and the differences as shown in Figure 8.2. Using these numbers, we
can construct a tornado graph for Offer A as shown in Figure 8.3.
Developing the graph in Figure 8.3 involves using a few tricks in Excel:
Using the data in Figure 8.2, insert a horizontal stacked bar graph.
Use the low values as a “ghost” data series—the bars will show up
to the left of those in Figure 8.3.
The other way to develop a tornado graph in Excel is to use the clustered
bar graph and then adjust the overlap for each data series.
The tornado shows us which uncertainties are important and which are not.
In the Offer A example, the results match what we would expect—raises are the most
important uncertainty, followed by bonus, tax rate, and finally benefits.
If you are going to compare graphs (we’ll develop the Offer B and C
graphs next), you’ll have to set the maximum and minimum values for the
x-axis manually so that they all have the same scale. Also, note that the
tornado is more powerful than a traditional ±10% sensitivity graph as it
uses actual distributions rather than an arbitrary ± quantity.
In a similar fashion, we can develop the Offer B graph (Figure 8.4).
There are commercially available packages that will develop tornado
graphs, which allow you to develop the graph more quickly than directly
from Excel. Crystal Ball, a simulation program available from Oracle (that
we will discuss in the next chapter), has the capability of generating tornado
graphs. Figure 8.5 was developed using an Excel add-in. Note that the
effect of the variables (upside vs. downside) is shown by the different
colored bars in Figure 8.5. This can be done directly from Excel by using
the horizontal stacked bar graph with three series: a “ghost” (unseen) series,
a downside series, and an upside series (take care to get the downsides and
upsides correct).
Figure 8.4 Offer B tornado graph
Figure 8.7, when compared with Figures 8.3 and 8.4, has a completely
different profile. There’s a lot of upside potential, but there is also a
significant financial penalty if Company C fails. Knowing the potential
impact of Company C failure, if we selected this job offer, it would be
prudent to monitor Company C’s progress carefully and to spend time
networking to increase the chances of getting another job if Company C
fails. Understanding the potential upside if Company C succeeds would
likely be a powerful motivator!
This tornado (Figure 8.8) had to be tipped on its side to show all the
probability combinations for each resource.
Cumulative Probability
To develop the cumulative probability graph (s-curve), we can either:
run each case in our decision tree after we have assessments for
p10/50/90s for each uncertainty and then solve (roll back) the tree,
or
run a Monte Carlo simulation.
If we have more than two or three uncertainties, the tree method can be
very laborious. There are times when developing and solving the tree as I
outline can yield insight, but most of the time decision analysts prefer to use
Monte Carlo simulation because it can be set up and run in less time than
developing the full decision tree.
A note about measures-of-value is appropriate at this point. I recommend
using NPV as the simulation measure-of-value. This is especially valuable
in comparing projects.
The internal rate of return (IRR) is a popular measure-of-value. However, I
still recommend NPV because (1) the IRR function in Excel will return an error
message if your project’s cash flow ever goes negative again after the initial
investment years (e.g., if you have to make another investment partway through
the project’s life, or the sales price drops partway through), and (2) NPV
s-curves give a good picture of the uncertainty associated with your projects.
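To make the first point concrete, here is a minimal NPV function in Python with a hypothetical cash flow that turns negative partway through the project. That sign-change pattern is exactly what causes IRR trouble (multiple sign changes can produce multiple or no real IRR roots), while NPV stays well defined:

```python
# Minimal NPV sketch. The cash flow is hypothetical and includes a mid-life
# reinvestment (the second negative entry), which trips up IRR but not NPV.
def npv(rate, cash_flows):
    # cash_flows[0] occurs at time 0
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

flows = [-1000, 400, 400, -500, 400, 400, 400]  # note the reinvestment in year 3
print(round(npv(0.10, flows), 2))  # NPV at a 10% discount rate
```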
The best way to understand how to solve the tree is by example. We’ll
use the tree we developed in Chapter 4 for the job offer example, but
updated to include uncertainties. With Offer A, we have four uncertainties
with three outcomes (a p10, p50, and p90) for each.
A simple decision tree can become a complex decision bush with this
approach—we have 3⁴, or 81, possible outcomes for just Offer A! Offer C is
even more complex. Figure 9.1 shows a skeleton tree for Offer A—the
entire tree is too large to display in the book.
This brings up an important use of the tornado graph—if we’re going
to solve the decision tree to develop the cumulative probability graph,
we should use only the top three or four uncertainties, as the ones
lower in the tornado don’t affect the outcome materially.
Rather than plug and chug through 81 different outcomes for Offer A and
another 81 different outcomes for Offer B (and hope that I record the values
with no typographical errors), I reduced the formula for NPV to one line
and then mapped every possible combination of p10/50/90s. Once the
formula works correctly in one row, it works for all the rows and can be
copied and pasted. I use the deterministic model to verify that my “one-line”
calculations are correct. Figure 9.2 illustrates this concept for the first
part of the full decision tree for Offer A. The data are not yet sorted in
Figure 9.2. Note that I include a sequence number so that I can sort and
re-sort if necessary. Also note that a quality control check on your probabilities
is that the sum of the probabilities of all the branches (“probability”
column) has to add up to 1.0—if it doesn’t, you have a math error
somewhere.
A comment on integration of the input distributions is needed at this
point. We’re using the p10/p50/p90 distributions to approximate an entire
distribution of inputs—therefore we use an approximation for the
probability associated with each branch of the tree. For Figure 9.2, I used
the extended Swanson-McGill weighting, which uses 0.30, 0.40, and 0.30
as the probabilities associated with the p10, p50, and p90 values,
respectively.1 An alternative weighting system uses 0.25, 0.50, and 0.25 for
the p10, p50, and p90 values, respectively.2 This method is referred to as
the McNamee–Celona method. For a more thorough discussion of
encoding uncertainties, see Skinner.3 I tend to use the 0.30/0.40/0.30 values,
as they are more appropriate when there’s skew in the input distribution
(which is usually the case).
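As a sketch of how a table like Figure 9.2 can be generated programmatically (in Python rather than Excel), the following enumerates all 3⁴ = 81 branch combinations for four uncertainties with the 0.30/0.40/0.30 extended Swanson-McGill weights and performs the sum-to-1.0 quality-control check:

```python
import itertools

# Enumerate every p10/p50/p90 combination for four uncertainties and weight
# each branch with the extended Swanson-McGill probabilities (0.30/0.40/0.30).
weights = {"p10": 0.30, "p50": 0.40, "p90": 0.30}
num_uncertainties = 4

branches = []
for combo in itertools.product(weights, repeat=num_uncertainties):
    prob = 1.0
    for level in combo:
        prob *= weights[level]
    branches.append((combo, prob))

print(len(branches))     # 3^4 = 81 branches
total = sum(p for _, p in branches)
print(round(total, 10))  # quality-control check: must be 1.0
```

Swapping the weights for 0.25/0.50/0.25 gives the McNamee–Celona version; the sum-to-1.0 check works either way.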
As an aside, with Excel, it is sometimes possible to map all possible
combinations. I once completed a shipping fleet sizing analysis where
we had proposals for two different types of ships from several
shipyards. We were buying seven ships, so I mapped each possible
combination of type of ship and shipyard sourcing. The measures of
value were capital cost and cargo hauling capacity (i.e., capacity
versus cost).
Figure 9.2 Table used to develop cumulative probability graph
There were thousands of combinations, but this is not a problem
with Excel. Once the first formula was developed, I generated a table
similar to Figure 9.2. We were then able to pick a few combinations of
interest and examine those combinations more closely. We’ll discuss
this technique more when we discuss multi-attribute decision analysis
and portfolio analysis.
The advantage of using one of the commercial programs is that you can
set up the simulation much more quickly than you can solve the tree (as
discussed in the previous section). The disadvantage is that the software is a
“black box” and it can be very difficult to debug problems if your results
are not correct.
I prefer to build the logic for the distributions into the workbook model
itself; that way I can debug the calculations and logic as I build the model. I
use Excel’s “RAND()” function to generate random numbers, which are
then used to select p10/50/90 values for the uncertainties and
success/failure cases for the discrete variables. I’d advise you to avoid the
LOOKUP functions, as they are really slow. Set calculation in Excel to manual
and press F9 to watch how the numbers change as you randomly select each
iteration.
After completing and testing the model, I will sometimes use Crystal Ball
as a “player,” that is, I don’t use the input distribution capability but do use
the software to track the measures-of-value. However, it is also easy to set
up a macro that computes a value and writes it for each iteration and then
use the PERCENTILE function on the data to develop the s-curve. To do
this, set up a table with percentages from 1% to 99% (in any increment that
you want) in one column and then use the PERCENTILE function on the
next column (using the first column and a range containing the output that
your macro generated) to compute the values corresponding to each
percentage. Then use Excel to plot cumulative probability versus value.
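A minimal Python version of this workflow, sampling p10/p50/p90 values with the 0.30/0.40/0.30 weights and reading percentiles off the results to build the s-curve (all distribution values below are hypothetical toys, not the job-offer model's):

```python
import random
import statistics

random.seed(42)

def draw(p10, p50, p90):
    # Discrete stand-in for the Excel RAND()-driven lookup: pick one of the
    # three assessed values with the 0.30/0.40/0.30 weights
    return random.choices([p10, p50, p90], weights=[0.30, 0.40, 0.30])[0]

def npv_one_iteration():
    raises = draw(0.02, 0.04, 0.06)   # hypothetical annual raise rate
    bonus  = draw(0.00, 0.05, 0.10)   # hypothetical bonus fraction
    salary = 100_000
    # Toy 5-year NPV at a 10% discount rate (illustrative only)
    return sum(salary * (1 + raises) ** t * (1 + bonus) / 1.10 ** t
               for t in range(1, 6))

results = sorted(npv_one_iteration() for _ in range(10_000))
# s-curve points: the value at each cumulative probability, 1%..99%
quantiles = statistics.quantiles(results, n=100)
print(f"p10 ~ {quantiles[9]:,.0f}, p50 ~ {quantiles[49]:,.0f}, "
      f"p90 ~ {quantiles[89]:,.0f}")
```

The `quantiles` list plays the role of the PERCENTILE column described above; plotting cumulative probability versus those values gives the s-curve.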
For our example, the simulation works really well to capture the
uncertainty around Company C going out of business—we have a 60%
chance of success with a 50% chance of mitigation via obtaining a new job.
We also have an uncertainty on which year the company fails (if it fails)—it
could be as early as year 3, most likely year 4, or it could be all the way out
to the end of our time frame, year 6. When we run the Monte Carlo
simulator and plot the results, Figure 9.4 is the result.
As expected from Offer C’s tornado, there’s significant upside and
significant downside relative to the other two offers.
Every fall for the past several years, I have given a lecture about DA at UT
Austin for my friend Tom Sciance’s Engineering Economics class. Every year
Dr. Sciance has between 20 and 40 hardy souls who get up early to make his
very useful 8 AM class! I use Figure 9.4 in my
lecture, and after explaining cumulative probability and the scenario of
multiple job offers, I ask for a show of hands as to which job offer they
prefer. Usually two or three students opt for Offer B, and the rest of the
class picks Offer C. We talk about their rationale for their selection.
Figure 9.4 Cumulative probability, offers A, B, and C
For several years, I gave “lunch and learns” with a very similar
lecture at Chevron. When I got to Figure 9.4 and solicited a show of
hands for job preference, one or two of the participants would choose
Offer C, but without fail, the rest of the class would opt for Offer B.
However, in 2008, when I was at UT, the class did not pick Offer C!
Almost everybody in the class wanted Offer B. This illustrates how our
preferences and opinions can be affected by external events. The
students’ perspective on upside potential versus downside risk changed
significantly as a result of the Great Recession. In 2009 and 2010, the
class still preferred Offer B. By 2011, the percentage was beginning to
shift back toward Offer C, but still not at pre-recession levels.
One time almost two decades ago, David Skinner and I completed a
major business strategy analysis for a major business unit within a
large corporation. This was the business unit’s first experience with
decision analysis. The managers had a “momentum” (i.e., existing)
business strategy; however, as a result of the framing and analysis a
more powerful strategy was identified and analyzed. When we showed
the leader of the business unit a figure that was similar to Figure 9.4
except that there was less downside associated with the more powerful
strategy than with Offer C, he looked at the graph for a few minutes
and then looked at his staff and said, “It looks like we are changing
gears.” The rest of the meeting focused on planning for the change in
strategy, which ultimately was successful.
The bottom line is that s-curves are a powerful tool, and you can make
better business and personal decisions if you use them. However, be aware
that short-cutting the assessment process can be very dangerous, as you’ll
think your s-curve is better than it really is!
1 See Skinner, Chapter 9, and Leach, Chapter 4 for detailed discussion on how to set up simulations
using Crystal Ball.
CHAPTER 10
Value of Information
Once we have built our decision tree(s) and our model, we can explore the
value of information and the value of control. The traditional way decision
analysts explain these concepts is to relate the value of information to
consulting an all-knowing clairvoyant and the value of control to employing
an all-powerful wizard.
The key concept behind both of these tools is that there is no value of
information or control unless you are willing to change your decision. In
other words, if you have already made up your mind concerning your
decision, more information has no value!
Consider the alarm lights and gauges in your car. They’re designed to
help you make good decisions and take appropriate action. However, if
you’re not willing to act on the information, the cost of these systems is
wasted. One guitar player I used to play music with drove a classic
Mustang. He was driving home one night when the red “check engine”
light came on. Rather than stop and call for assistance, he decided to
just keep driving. Within a few miles, the engine failed catastrophically.
I asked him why he did that, and he replied, “I decided that I wasn’t
going to let a warning light affect what I wanted to do.”
Probably the most significant commercial use of the value of information
tool is in oil and gas exploration. When you’re searching for recoverable
hydrocarbons and then trying to size processing equipment correctly, it is
important to know when to keep obtaining more data (seismic, seismic
reprocessing, and appraisal wells) and when to stop.
Offer A EV = $381,000
Offer B EV = $411,200
Offer C EV = $384,880, but…
Offer C EV = $436,200 if Company C is successful,
Offer C EV = $307,900 if Company C fails,
Offer C EV = $396,100 if Company C fails and another job is found
(assumes the job is equivalent to Offer A), and
Offer C EV = $219,700 if Company C fails and another job can’t be
found.
This is another way of thinking about the decision, but the s-curves
(Figure 9.4) present more clarity as to what the alternatives imply financially
than do the expected values!
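These expected values can be reproduced by rolling back Offer C's chance nodes, using the 60% chance of success and the 50% chance of finding another job given in the text:

```python
# Roll back Offer C's chance nodes using the book's numbers.
p_success = 0.60           # chance Company C succeeds
p_find_job = 0.50          # chance of finding another job if Company C fails
ev_success = 436_200
ev_fail_find = 396_100     # assumes the replacement job is equivalent to Offer A
ev_fail_nofind = 219_700

ev_fail = p_find_job * ev_fail_find + (1 - p_find_job) * ev_fail_nofind
ev_c = p_success * ev_success + (1 - p_success) * ev_fail
print(f"Offer C EV if Company C fails: ${ev_fail:,.0f}")  # $307,900
print(f"Offer C EV: ${ev_c:,.0f}")                        # $384,880
```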
There are two discrete variables in Figure 10.1 for which we would be
interested in knowing the value of information: Company C’s success or
failure and our ability to find another job if Company C fails. We’ll begin
with Company C.
Figure 10.2 shows the decision tree with knowing whether Company C
succeeds or fails before making the job offer decision.
Value of information is calculated by taking the EV of the offers with the
information and subtracting the value of the offers without the information.
From Figure 10.2, the EV with the information is $426,200 (note that
Company C’s success or failure is still uncertain; hence, the EV calculation
accounts for both branches of the first chance node). The strategy of choice
without information is Offer B, which has an EV of $411,200. Therefore, the
value of information is $426,200 minus $411,200, or $15,000.
As this number assumes that the information is inerrant, it represents the
upper limit of what you should pay for information concerning Company C’s
success or failure.
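The same arithmetic, written out:

```python
# Value of perfect information on Company C's success, using the book's numbers.
p_success = 0.60          # chance Company C succeeds
ev_c_success = 436_200    # Offer C EV if Company C succeeds
ev_b = 411_200            # best choice without information (Offer B)

# With perfect information: take Offer C when C will succeed, Offer B otherwise
ev_with_info = p_success * ev_c_success + (1 - p_success) * ev_b
voi = ev_with_info - ev_b
print(f"EV with information: ${ev_with_info:,.0f}")  # $426,200
print(f"Value of information: ${voi:,.0f}")          # $15,000
```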
In a similar fashion, we can estimate the value of knowing whether a job
could be obtained in the event of Company C’s failure (or not). Here we put
the uncertainty of whether we can find another job upon Company C’s
failure ahead of the decision to select among the three offers (Figure 10.3).
In this case, the value of information is $415,680 minus $411,200 or
$4,480. This represents the upper limit of what you should be willing to pay
for information concerning the probability of obtaining another job if
Company C fails.
It is important to note that we’ve been estimating the value of perfect
information. There are methods to estimate the value of imperfect
information, but that is beyond the scope of this book. Suffice it to say that
the value of imperfect information is always less than what we’ve been
estimating in this chapter—perfect information.
Value of Control
The value of control is similar to value of information, but here we can
control the uncertainty to get the outcome that we want. Instead of an all-
knowing clairvoyant, we have access to an all-powerful wizard who can
control an outcome for us. My friend David Skinner always tells his classes,
“The wizard is more powerful than the clairvoyant.” (David even has a
wizard hat that he sometimes wears to make this point!)
Figure 10.2 Value of information, company C’s success or failure
Figure 10.3 Value of information, find another job
If I could leave you with only one thing from this book, it would be for
you to understand your own objectives hierarchy. All the other tools we’re
discussing are predicated on understanding what you (and your significant
stakeholders) want and the trade-offs associated with those objectives.
CHAPTER 11
Going through our career example will illustrate how this works.
Multiattribute Example
When we developed our objectives hierarchy for our job offer example
(Figure 3.2 placed here as Figure 11.1),1 we had three fundamental
objectives: financial, professional, and social. We’ve analyzed our three
alternative job offers quite thoroughly from a financial perspective in the
previous chapters. What we’re going to do now is score each job offer
subjectively with respect to the professional and social fundamental
objectives.
You may recall that the tax treatment for Job C was different—this was
because Job A and Job B were both in our hometown state, which has
higher taxes than Job C, for which we have to move a significant distance
away from friends and family. Hence, Job C scores lower on social. Job A
scores highest on social because their company culture is very collaborative
and professional, and we like their locations; Company A is regarded as one
of the best companies in the world to work for. Company B is similar but
not rated as high on locations.
Then we plot the average of the subjective scores versus expected value
of NPV, which gives us a picture of “beauty” versus financial value (Figure
11.2).
When I show this graph to the students in Austin who picked Job C, I ask
if any of them wants to change their selection. Usually a small percentage
of the students would switch to Job B, but most of the students would stay
with Job C for the upside potential. Speaking of which, let’s add the
p10/50/90 values for each job offer to the graph and see if this presents our
alternatives more clearly (Figure 11.3).
Offer C’s flat s-curve shows up in Figure 11.3 as having off-the-scale
upside and downside potential. You can also see that the p50 value from the
simulation is not the expected value (shown by the circular data point being
offset from the cross mark).
When I’m facilitating an attribute scoring session, I develop headline
bullet points for the rationale of how each person would score an objective.
This is not a novel concept—when a judge on Dancing with the Stars gives
a score, he or she gives a succinct rationale for why they liked or did not
like the performance. By documenting the thinking, we add transparency
and clarity for the decision maker. The decision maker may not agree with
the rationale, but if we document it, he or she can understand it. Just
because it is a subjective judgment doesn’t mean that it is irrational.
Purchasing Decisions
Many decisions can be made solely on financial measures, including large
capital project and product development decisions (in fact, a project
undertaken with the justification that it is “strategic” is likely a poor use
of company money). Merger and acquisition decisions can also be made largely
on financial measures. However, company culture is a very important factor in
whether a merger or acquisition succeeds or fails. Delta Airlines and Chevron,
for example, have generally been successful at addressing corporate culture as
they have acquired companies.
The most common multiattribute decisions most of us will face (whether
business or personal) are purchasing decisions. Normally as you spend
more money, you can achieve more of your subjective benefits (“beauty”).
A curve drawn through the highest “beauty” scores associated with each
financial cost defines what is called the efficient frontier. The points along
this curve are the best that you can achieve at any given cost (see Figure
11.4).
Sometimes, though, you get lucky and your best subjective match is the
lowest (or near the lowest) cost. I had two business situations where this
happened:
My friend Doug Quillen and I were working with a joint venture (JV)
team and were chartered to find office space for a team of 20–30
people. We looked at about 35 office properties for lease in downtown
Houston. Our subjective objectives included convenience to mass
transit, how easy or difficult it would be to build out the property to fit
our needs, and how well or poorly the lease terms, especially timing,
fit our requirements.
Figure 11.4 Typical purchasing situations
We were lucky again—the best match for our needs was among the
lower cost providers. And, we had an excellent outcome—the system
was a little late but was on budget and it performed as required. The
system also passed an audit by an independent accounting firm and
two owner audits. By coincidence, the importance of the third
subjective objective was driven home when one of the potential value-
added resellers went bankrupt while we were going through the
selection process.
Decision Quality
A quality decision, as Spetzler describes, has six characteristics:
appropriate frame,
creative alternatives,
relevant and reliable information,
clear values and trade-offs,
sound reasoning, and
commitment to action.
In a facilitated DA project, I use these six characteristics as a way to help
the team self-assess the degree to which they are on their way to a good
decision (or decisions). Each team member considers the state of their
project subjectively and simply scores their opinion (0 to 100%) as to how
well they have done on each of the six items. We then plot the results on a
“spider” diagram. This quickly points out differences in thinking.
For example, if the project manager scores “appropriate frame” as 90%
but two of the team members score it at 20%, there is something that needs
to be brought out, discussed openly, and resolved.
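A small sketch of this self-assessment check (the names, the scores, and the 30-point "needs discussion" threshold are all hypothetical):

```python
# Decision-quality self-assessment: each person scores an aspect 0-100%,
# and we flag large spreads for open discussion. All values are hypothetical.
scores = {
    "appropriate frame":     {"PM": 90, "member A": 20, "member B": 25},
    "creative alternatives": {"PM": 70, "member A": 65, "member B": 75},
}

flagged = []
for aspect, by_person in scores.items():
    spread = max(by_person.values()) - min(by_person.values())
    if spread > 30:  # hypothetical threshold for "something needs discussing"
        flagged.append((aspect, spread))
        print(f"Discuss: {aspect} (spread {spread} points)")
```

In a real session the scores would go on the spider diagram; the spread check is just a quick way to spot where the diagram will show disagreement.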
Also, note that the subjective opinions will vary depending on the
relative maturity of the project and the analysis. If you have an opportunity
that is early in the development process, you would expect lower scores
than a project that is ready for the final investment decision.
Appropriate Frame
This aspect of decision quality goes all the way back to the left-hand
column of Figure 1.2 (the list of framing tools). Have we correctly defined
the problem, risk, or opportunity? Do we have a good understanding of the
business situation? Do we understand our objectives? Most importantly, are
we solving the correct problem?
There’s another aspect of framing, and that has to do with our
relationship with management. Do we have a clear charter from
management concerning the situation? Does management want a thorough,
unbiased analysis, or are we simply developing documentation to support a
decision that has already been made? If the latter is the case, rather than
waste time doing a lengthy analysis, time and resources would better be
spent on planning.
For a personal decision, sometimes just writing down and agreeing on the
problem (or risk or opportunity) statement at the start can help add clarity
for all parties involved as to what the situation includes and what it does not
include. Sometimes the energy of a group or a team or a family will be
around a completely different topic, in which case you will likely have to
address the other topic first and then come back to the situation that you
want to deal with.
Creative Alternatives
One of the biggest temptations a team has is to analyze minor variations
around the “momentum” strategy, that is, the strategy that they perceive is
what management would do if no analysis were being done. Alternatives
need to be doable and compelling, but need to be robust enough that they
are not just small variations. With many decisions, the alternatives should
be mutually exclusive and collectively exhaustive. This can be another
check on the robustness of the alternatives under consideration.
For personal decisions, the decision trees can really help other people
involved see the range of possible alternatives and catalyze a healthy
discussion on creative alternatives.
Sound Reasoning
This aspect of decision quality is subtler. Obviously if you build a financial
model of a situation, you need to get the equations correct. However, trees
showing the interconnectivity of decisions, risks, and uncertainties
(especially on a time line) can help teams and groups of people think
through a complex or confusing situation clearly. This also helps with
contingency planning.
For personal decisions, the tools we’ve been talking about can help you
think through your situation in a logical and clear way, which can reduce
confusion and ambiguity.
Commitment to Action
To be clear, deciding not to act can be a quality decision. This aspect of
quality has to do with commitment to the decision that is made and more
accurately could be stated “commitment to the decision(s)” (however, this
would be a more awkward statement). This aspect scores our ability to
avoid “analysis paralysis” and simple procrastination. And this aspect of
decision quality does not preclude working on contingency planning so that
there is clarity on changes in our situation that would compel changing the
decision.
Other Considerations
One thing implied but not explicitly stated in Spetzler’s six aspects of
decision quality is moving from deterministic to probabilistic thinking and
analysis. As discussed in Chapter 1, Pat Leach3 (Why Can’t You Just Give
Me The Number?) makes a thorough and compelling case for “knowing the
odds” concerning your business, that is, understanding the ranges of
probabilities associated with your uncertainties and your measures of value.
Spetzler includes this in the relevant and reliable information aspect;
however, I would call this out as an explicit measure of decision quality.
My friend Brian Hagen4 (Problem, Risk, and Opportunity Enterprise
Management) adds two other elements to decision quality: timing (with
options) and choice defensibility. His timing element could arguably be
considered part of correct logic and commitment to action; however, in
some situations timing can be incredibly important and thinking through
how well we understand that timing can be very valuable. “Skeleton”
decision trees (Figure 4.5) can add much clarity to sequencing, timing, and
option discussions.
I agree completely with his choice-defensibility element: if you have
maximized NPV with your decision but would be embarrassed to defend it on,
say, your local TV news or in the Wall Street Journal, you need to go back
and rethink your objectives and likely spend more effort on stakeholder
analysis.
Risk Analysis
Usually a scale of one to five is used for the exercise. For example,
probability could be very unlikely, very low, low, medium, or high.
Consequences are usually orders of magnitude in dollars, for example, $10K or
less, $100K, $1M, $10M, or $100M or greater. Mitigation is usually very easy,
easy, moderately easy, difficult, or very difficult.
Then the risks are compiled based on their relative scoring in a list from
1 to n, where n is the total number of risks identified. The team then
recommends mitigation activities based on the analysis.
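A hypothetical register sketch follows; note the book does not specify how the three scores are combined into a single ranking, so the product used here is just one common convention:

```python
# Hypothetical risk register: (name, probability, consequence, mitigation
# difficulty), each scored 1-5, then ranked 1..n by the product of the scores.
risks = [
    ("Electrical/instrumentation labor crunch", 4, 3, 4),
    ("Permit delay",                            2, 4, 3),
    ("Vendor bankruptcy",                       1, 5, 5),
]

ranked = sorted(risks, key=lambda r: r[1] * r[2] * r[3], reverse=True)
for rank, (name, p, c, m) in enumerate(ranked, start=1):
    print(f"{rank}. {name} (score {p * c * m})")
```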
The important thing about doing the risk analysis is that it needs to be
updated periodically because situations and assumptions change. I
recommend updating a risk analysis at least twice a year.
Another thing to remember is that immediacy bias will affect people’s
opinions markedly in a subjective exercise like this one. If you assess
people’s fear of flying before and after a major airline crash, you’ll find a
markedly different view of flying risk before versus after. Offshore drilling
risk assessments changed significantly after the BP Deepwater Horizon
incident.
Also, try to be as specific as possible concerning the risk items that you
put in your register. For example, “skilled labor availability” is not very
useful. It would be better to note which trades are likely to be short and
when during the project this would have an effect. (By the way, I have
never seen a project that didn’t have an electrical and instrumentation
“crunch” just before startup.)
From a personal standpoint, when discussing risks with your family, the three
considerations (probability, consequences, and mitigation) give you a
framework for talking about risks. But again, be aware of the biases that
influence people’s thinking.
Jack Bess told me a story of one project that was very late and over
budget. He had helped the team with the original cost and schedule
estimate, so he was asked to go back and figure out how to get the
project back on track. When he got there, he asked the project manager
where the initial schedule risk analysis was. The manager looked blank
for a minute and then replied, “Oh, I think it is in the file cabinet in the
hallway…,” and they went looking for it.
If the project manager had been using the model and keeping it updated,
it is unlikely the project would have gotten so far behind. For project cost
and schedule risk, this is even more important than it is for the generic risk
assessments.
One good example from my experience was the West Valley
Demonstration Project, which Westinghouse completed for the Department
of Energy (DOE). This was a commercial nuclear fuel reprocessing plant
that was very poorly operated and eventually abandoned, leaving a
contaminated facility and 600,000 gallons of high-level radioactive waste
behind. The State of New York and ultimately the DOE inherited the clean-up.
I knew the clean-up project manager, Steve Marchetti (after
Westinghouse started the work, he was the Westinghouse leader at the site).
Steve had a detailed schedule, which he reviewed with his staff each
week in detail. They discussed each activity and updated the master
schedule based on the new information and adjusted as necessary.
Every day they met in the morning for a few minutes to update each
other on progress toward the weekly plan. Then each month they did a
complete review—past, current, and future—and re-issued the
schedule. At the time I met with Steve, the project was on schedule
and on budget. He eventually was transferred (and promoted), and I
lost track of him and the Demonstration Project.
Steve didn’t use probabilistic methodology per se, but he had a detailed
model of the project, and they used this model to keep ahead of the work
and head off problems before they happened. It was part of their project
management system.
Another question for capital projects would be, “What confidence level
should we use for the capital authorization and startup date?” Even if we
include the s-curves in the appropriation request, we still need specific
numbers to include. If we pick too high a level, say a p90 or p95, the project
will probably find a way to spend the money and we’ll end up with budgets
drifting significantly upward through time. If we set it too low, say p10, the
project team will give up because they have little chance of meeting the
target. The best strategy is to pick points on the cost and schedule s-curves
that are credible yet realistic, perhaps p40 or p60 values. Also, these targets
should be specific to the project being done and should have a rational line
of reasoning behind them. People don’t respond well to arbitrary and
capricious goals.
The reactors and the other processes worked. The engineers who designed
them didn’t have computers—they used slide rules! The reactors were
operated safely and cost-effectively until the late 1980s, when they were
shut down for political reasons. The other facilities continued to operate for
some time after the reactors shut down. Can you imagine how much it
would cost and how long it would take to build something like this today?
There’s nobody in the world—not even China—who could do this today.
Note that this was the same generation that successfully completed six
manned missions to the moon between 1969 and 1972.
You might ask, what does this have to do with decision or risk analysis?
The answer is that like individuals, teams, groups, and companies, society
as a whole has objectives and trade-offs and has to prioritize. Society as a
whole also has to decide whether to discuss these objectives and trade-offs
transparently and honestly or to allow manipulative politics to control the
discussion.
CHAPTER 14
Other Topics
This chapter discusses a few more tools and items to consider. We’ll start
with discussions about portfolio analysis and then risk analysis. Utility
theory is an interesting decision analysis (DA) concept but is infrequently
necessary. We’ll discuss normative and descriptive viewpoints and
preferences.
The most important topics in this chapter concern DA, ethics, and
personal decision making. We’ll conclude the chapter by disclosing the
“secret” of DA.
Portfolio DA Projects
When hearing the word portfolio, most people think of an investment
portfolio. In DA, portfolio refers to a set of potential alternatives. It could be
a collection of businesses, or a group of oil and gas fields, or a group of
pharmaceutical compounds, or a collection of publicity or advertising
alternatives.
Decision analysts spend considerable time and work helping clients with
portfolio management, which can become quite complex in practice
(especially in the pharmaceutical industry, where the company’s ultimate
survival can depend on properly managing the pipeline of new potential
drugs). In this chapter, we’ll discuss some of the tools and how they can be
relevant for the working manager.
From a DA standpoint, analyzing a portfolio project is very similar to
what we’ve been talking about so far. We’d start the analysis with an
opportunity or problem statement, get reactions to the statement, discuss the
business situation, develop an objectives hierarchy, and talk about closed,
strategic, and tactical decisions. However, the analysis part is a bit different.
The first tool we would use is a plot of probability of success versus
NPV if successful for each element of the portfolio. This is the traditional
way of starting a portfolio analysis. However, sometimes the y-axis is
another measure. For example, if you are looking at a fleet of ships, you
might plot shipping capacity versus present value of the total cost to operate.
Figure 14.1 shows a plot of probability of success versus NPV for a group
of potential projects. If you could only choose four projects, which ones
would you choose?
Also shown in Figure 14.1 are lines of indifference—a $100MM NPV
project with a 50% chance of success should be as desirable as a $200MM
NPV project with a 25% chance of success (assuming similar s-curves for
NPV).
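The arithmetic behind those indifference lines is just expected value: multiply the NPV if successful by the probability of success. A minimal sketch, assuming for illustration that a failed project contributes zero NPV (a real failure may still cost money):

```python
def expected_npv(npv_if_success, p_success):
    """Expected NPV in $MM, assuming a failed project contributes
    zero NPV (an illustrative simplification)."""
    return p_success * npv_if_success

# Both points sit on the same $50MM indifference line:
expected_npv(100, 0.50)  # 50.0
expected_npv(200, 0.25)  # 50.0
```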
Another way to characterize projects within a portfolio is by which
quadrant of Figure 14.1 the project falls in. Upper-right projects are large
and relatively certain; these can be referred to as “pearls.” Large projects
with a lower probability of success would be called “oysters,” as they turn
into pearls if they are successful. Smaller projects with a high probability
of success can be called “bread and butter,” as you need these projects to
stay in business. Small projects with a low probability of success are
termed “white elephants” and should not be pursued unless they are really
inexpensive.
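The quadrant labels translate directly into a simple classification rule. In this sketch the cutoffs ($100MM and a 50% chance of success) are assumptions for illustration only; in practice you would set them from your own portfolio, for example at the medians:

```python
def classify(p_success, npv_if_success, p_cut=0.50, npv_cut=100):
    """Assign a project to a quadrant of the probability-vs-NPV plot.
    The cutoffs are illustrative assumptions, not from the book."""
    big = npv_if_success >= npv_cut
    likely = p_success >= p_cut
    if big and likely:
        return "pearl"             # large and relatively certain
    if big:
        return "oyster"            # large but long odds
    if likely:
        return "bread and butter"  # small but dependable
    return "white elephant"        # small and unlikely

classify(0.80, 250)  # 'pearl'
classify(0.20, 300)  # 'oyster'
classify(0.90, 20)   # 'bread and butter'
classify(0.10, 15)   # 'white elephant'
```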
So I’m biased, but at the end of the day Warren Buffett’s approach of
buying good companies and keeping them seems to work the best.
Utility Theory
One thing that I have observed reading academic publications and attending
conferences is that much attention is devoted to utility theory. Utility theory
uses equations to adjust a project’s NPV to account for its size relative to the
company as a whole. At one conference, two very experienced decision
analysts were asked how often they had come across a real utility theory
problem in their consulting practice. Both answered: one or two in 30 years
of consulting.
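For the curious, a standard choice in the DA literature is the exponential utility function u(x) = 1 − e^(−x/R), where R is the company’s risk tolerance. The “certain equivalent” it produces is the sure amount the gamble is worth to the company. A hedged sketch (the $150MM risk tolerance and the gamble itself are illustrative assumptions):

```python
import math

def certain_equivalent(outcomes, probs, risk_tolerance):
    """Certain equivalent under exponential utility
    u(x) = 1 - exp(-x / R); outcomes in $MM."""
    eu = sum(p * (1 - math.exp(-x / risk_tolerance))
             for x, p in zip(outcomes, probs))
    return -risk_tolerance * math.log(1 - eu)

# 50/50 gamble on $100MM or nothing, R = $150MM (illustrative):
ce = certain_equivalent([100, 0], [0.5, 0.5], 150)
# ce comes out below the $50MM expected value, which is exactly the
# size-relative-to-the-company adjustment utility theory makes.
```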
The rule of thumb that I’ve heard expressed is that if a given project can
affect more than one-fifth of a company’s total value, it needs to be
considered very carefully. Skinner summarizes utility theory quite well in his
book, Introduction to Decision Analysis.1 My (unsolicited) advice to entities
funding more academic work in utility theory is to find something else to
fund. My advice to you is to be aware that this exists but you’re probably not
going to need it.
DA and Ethics
Professor Marianne Jennings of Arizona State University published an
excellent article in Financial Engineering News (which, unfortunately, is no
longer available) in which she discussed the Enron/Arthur Andersen
bankruptcy and collapse. She noted that when Enron first experienced a
quarter with slightly lower-than-expected earnings, they decided to record
an off-balance-sheet transaction to inflate the quarterly income. Their lead
auditor from Arthur Andersen told them they could not do this, but they went
over his head to Andersen headquarters and had his ruling overturned. Their
logic was that earnings would be better the next quarter, at which point
they could either quietly ignore the transaction or correct it with a
footnote in the quarterly report.
However, when the next quarter came, earnings were again lower than
they wanted. At this point, there were only three alternatives available to
Enron:
stretch the earnings like they had done in the previous quarter,
rectify the “error” from the previous quarter and take the earnings hit, or
ignore the previous quarter’s stretch but exercise the discipline to never
do this again.
Unfortunately, they chose the first alternative and kept stretching until they
eventually got caught.
I’ve represented this as a decision tree in Figure 14.3. Each quarter is
represented as a separate decision.
The important point here is that neither Enron nor Arthur Andersen set
out with the objective of cheating until their companies imploded. In a
similar fashion, nobody ever has an objective of becoming an alcoholic by
age 45, or a drug addict by age 30. Few people have the objective of
acquiring a pack-a-day cigarette habit. It starts with an ethical stretch
and proceeds with small decisions until the person is addicted. Nobody
enters a marriage with the objective of cheating on their spouse, but
someone who cheats once and gets away with it finds it much easier to give
in the next time, until they are eventually found out.
Note that with each small decision, the difficulty of rectification or
discipline on the next small decision becomes greater. Eventually you
essentially lose these alternatives, as they become too difficult to do.
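A little arithmetic shows why “keep stretching” is such a bad bet. If each quarter’s stretch carries even a modest, independent chance of exposure (the 10% figure below is purely an assumption for illustration), the cumulative odds of being caught climb relentlessly:

```python
def prob_eventually_caught(p_per_quarter, quarters):
    """Probability of at least one exposure over repeated,
    independent chances of being caught."""
    return 1 - (1 - p_per_quarter) ** quarters

prob_eventually_caught(0.10, 4)   # ~0.34 after one year
prob_eventually_caught(0.10, 12)  # ~0.72 after three years
```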
Figure 14.3 gives you a framework for discussing ethical situations with
your colleagues, bosses, and your family. My sincere advice is that if your
company is in one of these situations, you need to find another job and get
out, as sooner or later they will get caught. Drawing a tree like the one
shown in Figure 14.3 with a person in your family engaged in risky
behaviors might be a useful intervention, depending on how far down the
path to addiction they are. This topic is very important for our families, as
personal decisions are the leading cause of death for people in the United
States.
Personal Decisions
In 2008, Ralph Keeney, a professor at Duke and one of the founding fathers
of DA, published an article in Operations Research titled “Personal
Decisions Are the Leading Cause of Death.”2 He analyzed mortality data from
1900, 1950, and 2000.
Figure 14.3 A series of ethical decisions (the path to addiction)
He states in his summary: “Over one million people prematurely die each
year in the United States due to causes that can be attributed to personal
decisions … For ages from 15 to 64, about 55% of all deaths are attributable
to personal decisions.”3
Dr. Keeney began his analysis with statistical data on the medical causes
of death, from which he determined the actual cause of death (e.g., medical
cause of death: lung cancer; actual cause of death: smoking; personal
decision: smoking). He studied 14 medical causes of death (diseases of the
heart, cancer, stroke, unintended injuries, AIDS, suicide, homicide, etc.)
traced to eight actual causes of death (smoking, overweight, alcoholism,
accidents, etc.) that were attributed to eight personal decisions.
Dr. Keeney then summarized his work into eight rather obvious life-
saving decisions we all can make to improve our chances of avoiding
premature death. Avoid:
smoking (don’t start, quit)
overweight (exercise, diet)
alcoholism (quit)
vehicle accidents (seat belt, slow down)
illicit drugs (don’t start, quit)
suicide
criminal behavior, and
unprotected sex (condom, abstinence).
These are all rather obvious, but Figure 14.3 shows how small decisions
we make every day can eventually lead to a bad outcome. The longer we
proceed down the addiction tree, the more difficult it becomes to rectify the
situation and change our ways.
The Secret of DA
My friend David Skinner always says, “Never tell them the answer.” When
David and I were consulting together, he (almost) never broke this rule.
When I was consulting, I respected David’s guidance and found that his
advice was sound. Sometimes clients would ask me (usually during a heated
debate) what I thought and I would decline to answer. I’m going to break
David’s Rule, though, and tell you the secret of DA. It’s really simple but is
also very powerful.
Actually, David revealed the secret of DA in his book,4 but he discussed it
so subtly that I think most people miss it. The secret of DA revolves around
three questions:
What do you really want?
What can you do?
What do you know?
All through the book I’ve stressed the importance of the objectives
hierarchy. This is the tool that helps you examine what you really want and
the trade-offs associated with getting what you want.
What you can do includes your decisions, alternatives, strategies, and
portfolio selections.
Your state of knowledge (and lack thereof!) is addressed by assessments
of uncertainty, value of information, and value of control.
The key to using the tools I’ve described in this book is to keep these
three questions—three important distinctions—in mind. You can boil down
these questions even further to the three verb ideas in the questions: want,
do, and know.
More Information
If you have further interest in DA, there is an informal group of practitioners
called the Decision Analysis Affinity Group (DAAG) that is made up
primarily of people from industry and was started in the mid-1990s. Their
website is www.DAAG.net and the group meets every year for a two-day
conference. There are no membership dues—you just register and pay for
the conference, which is usually very reasonably priced. Each year between
80 and 120 people attend the conference.
Another DA group is the Society of Decision Professionals (SDP),
www.decisionprofessionals.com. This group started in the mid-2000s. There
is a membership fee associated with SDP, but they sponsor free (to
members) training, webinars, and events. They usually meet just before or
after the DAAG conference each year and are in the process of absorbing
DAAG. Houston has a local SDP chapter that meets once a month, but you
do not have to be an SDP member to attend the Houston meetings.
There is a large DA community of practice in academia. The primary
group that publishes academic DA work is called INFORMS, which stands
for the Institute for Operations Research and the Management Sciences.
According to the INFORMS website, INFORMS is “the largest professional
society in the world for professionals in the field of operations research
(O.R.), management science, and business analytics.” The INFORMS
conferences are much larger than DAAG or SDP.
You may also choose to explore some of the readings noted in the
References section.
Implications for the Manager
Understanding normative and descriptive viewpoints can help you
understand your own thinking and why others may view the world
differently.
I sincerely hope that linking Dr. Keeney’s careful analysis showing that
personal decisions are the leading cause of death (in the United States) with
Figure 14.3, the decision tree of addiction, gives you:
the willpower to deal with (that is, rectify) any situation that you ever
find yourself in relative to Figure 14.3, and
a framework for discussing personal and ethical decisions with
colleagues, bosses, and, most importantly, family members if it
becomes necessary.
Now that we’re at the end of our journey together, my advice is to use the
tools that we’ve talked about to make better business and personal decisions.
A good decision does not guarantee a good outcome (just as a poor decision
does not guarantee a catastrophe), but we can improve our odds of getting
what we want and help others along the way.
I’d be interested in your successes and failures using DA tools in your
business and in your life. Send me an e-mail: dave@decisions-books.com.
And may all your outcomes be good ones, or to paraphrase Mr. Spock, make
good decisions, live long, and prosper.
Notes
Chapter 1
1. Leach (2006), p. 1.
2. Eisenhardt (1989), pp. 543–576.
3. Skinner (1999), pp. 11–13, 16.
4. Howard (1965).
5. Edwards and von Winterfeldt (1986), pp. 560–569.
6. Nutt (1997), pp. 44–52.
7. Nutt (1990), pp. 159–174.
8. Nutt (2002).
9. Skinner (1999), p. 16.
Chapter 2
1. Doyle and Straus (1976).
2. Skinner (2009), pp. 100–102, 238–241.
3. Hagen (2017).
Chapter 3
1. Keeney, Ralph L., Value-Focused Thinking, Harvard University Press (1992), p. 34.
2. Op. cit., Keeney, p. 83.
3. Op. cit., Keeney, p. 35.
4. Op. cit., Keeney, pp. 68–72 typical.
5. Skinner, pp. 31–32.
6. Op. cit., Skinner, p. 103.
7. Skinner (2009), pp. 30–32.
8. Op. cit., Keeney, p. 5.
Chapter 6
1. Tversky and Kahneman (1974).
Chapter 7
1. Walkenbach (1996).
Chapter 9
1. Keefer and Bodily (1983).
2. McNamee, P. and J. Celona, Decision Analysis with Supertree, 2nd Edition, The Scientific Press,
1990.
3. Skinner, 3rd Edition, pp. 190–192.
Chapter 12
1. Spetzler, Winter, and Meyer (2016).
2. Kleinbaum and Kleinbaum (2013).
3. Op. cit., Leach.
4. Op. cit., Hagen.
Chapter 13
1. Leach (2006), pp. 108–110.
2. Op. cit., Hollmann.
3. Merrow (2012).
4. Op. cit., Hollmann, p. 15.
Chapter 14
1. Skinner, 3rd Edition, pp. 210–217.
2. Keeney, Ralph, “Personal Decisions Are the Leading Cause of Death,” Operations Research, Vol.
56, No. 6, November–December 2008, pp. 1335–1347.
3. Op. cit., Keeney, p. 1345.
4. Skinner, Introduction to Decision Analysis, 2nd Edition, pp. 75–76.
References
Doyle, M. and Straus, D. (1976). How to Make Meetings Work. New York, NY: The Berkley
Publishing Group.
Edwards, W. and von Winterfeldt, D. (1986). Decision Analysis and Behavioral Research.
Cambridge, UK: Cambridge University Press.
Eisenhardt, K. M. (1989). Making Fast Strategic Decisions in High-Velocity Environments. Academy
of Management Journal 32, pp. 543–576.
Hagen, B. (2017). Problems, Risks and Opportunities: Enterprise Management. To be published this
year.
Hollmann, J. K. (2016). Project Risk Quantification. Gainesville, FL: Probabilistic Publishing.
Howard, R. A. (1965). Decision Analysis: Practice and Promise. Management Science 34(6), 1988,
pp. 679–695.
Keefer, D. and Bodily, S. E. (1983). Three-point Approximations for Continuous Random Variables.
Management Science 29, pp. 595–609.
Keeney, R. L. (1992). Value-Focused Thinking. Cambridge, MA: Harvard University Press.
Keeney, R. L. (2008). Personal Decisions Are the Leading Cause of Death. Operations Research,
Vol. 56, No. 6, November–December 2008, pp. 1335–1347.
Kleinbaum, R. and Kleinbaum, A. (2013). Creating a Culture of Profitability. Gainesville, FL:
Probabilistic Publishing.
Leach, P. E. (2006). Why Can’t You Just Give Me The Number? Gainesville, FL: Probabilistic
Publishing.
McNamee, P. and Celona, J. (1990). Decision Analysis with Supertree (2nd Edition). San Francisco,
CA: The Scientific Press.
Merrow, E. W. (2012). Oil and gas industry megaprojects: Our recent track record. Oil and Gas
Facilities 1(2), pp. 38–42.
Nutt, P. C. (1990). Preventing Decision Debacles. Technological Forecasting and Social Change 38,
pp. 159–174.
Nutt, P. C. (1997). Better Decision-Making: A Field Study. Business Strategy Review 8(4), pp. 44–52.
Nutt, P. C. (2002). Why Decisions Fail. San Francisco, CA: Berrett-Koehler Publishers, Inc.
Skinner, D. C. (1999). Introduction to Decision Analysis (2nd Edition). Gainesville, FL: Probabilistic
Publishing.
Skinner, D. C. (2009). Introduction to Decision Analysis (3rd Edition). Gainesville, FL: Probabilistic
Publishing.
Spetzler, C., Winter, H., and Meyer, J. (2016). Decision Quality: Value Creation from Better Business
Decisions. Hoboken, NJ: John Wiley & Sons.
Tversky, A. and Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Vol. 2.
Readings on the Principles and Applications of Decision Analysis. Menlo Park, CA: Strategic
Decisions Group.
Walkenbach, J. (1996). Excel for Windows 95 (2nd Edition). Foster City, CA: IDG Books.
Index
Ambiguity, 24
Anchoring bias, 57
Assessment
definition, 51
experts (see Experts assessment)
uncertainty (see Uncertainty assessment)
Bush Grid, 24
Framing, 7
decision quality, 116
facilitation
body language, 15
capturing headlines, 15
listening, 15, 20
number of people, 13–14
people willingness, 14
PowerPoint, 13
problem-solving meetings, 14
whiteboard, 13
influence diagram
issue raising, 17–18
objectives hierarchy (see Objectives hierarchy)
problem/opportunity statement, 15–17
process documentation, 19–20
situation analysis, 18
stakeholder analysis, 18–19
tools, 11
Front end engineering and design (FEED), 28
Fundamental objective, 21
Immediacy bias, 59
Influence diagram (ID)
career path example, 47, 48
components, 45
drawing, 48–49
facilitation, 48–49
framing and analysis, 47
generic, 46
INFORMS, 138
Internal rate of return (IRR), 87–88
Lotus 1-2-3, 66
McNamee–Celona method, 89
Means objective, 21
Monte Carlo simulation, 91–95
Multiattribute decision analysis
advantage
definition, 105
job offer
financial, professional, and social objectives, 107
objectives hierarchy, 106–107
stock market graph, 108
subjective scores vs. expected value, 109
subjective scores vs. financial potential, 109
principles, 106
purchase decisions
beauty vs. cost chart, 110
efficient frontier, 110
merger and acquisition decisions, 110
multiattribute graph, 112
subjective objectives, 111
subjective assessment, 106
Rain uncertainty, 54
Range names, 68–69
Rant time, 17, 20
Risk analysis. See Generic risk analysis
Uncertainty
vs. ambiguity, 24
deterministic model, 65
strategy tables, 42
Uncertainty assessment
continuous distributions variables, 60–61
continuous probability
continuous distribution, 52
expected values, 54
p10, p50, p90 assessment, 53
rain uncertainty, 54
roll back tree, 54
definition, 51
discrete probability, 51–52, 59–60
experts (see Experts assessment)
influence diagram, 59, 63
price, 62–63
ranges and distributions
definition, 54
density function, 55
probability distributions, 55
rainfall cumulative probability distribution, 55, 56
revenue, 63
Utility theory, 132