
Decision Analysis for Managers
A Guide for Making Better Personal and Business Decisions

Second Edition

David Charlesworth
Decision Analysis for Managers: A Guide for Making Better Personal and Business Decisions,
Second Edition

Copyright © Business Expert Press, LLC, 2017.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or
transmitted in any form or by any means—electronic, mechanical, photocopy, recording, or any other
means, except for brief quotations (not to exceed 400 words), without the prior permission of the publisher.

Excel, PowerPoint, Visio, and Word are Microsoft Corporation trademarks. Analytica is a Lumina
Corporation trademark. Crystal Ball is an Oracle Corporation trademark. TreeAge Data is a TreeAge
Software Inc. trademark. @Risk is a Palisade Corporation trademark.

First published in 2013 by


Business Expert Press, LLC
222 East 46th Street, New York, NY 10017
www.businessexpertpress.com

ISBN-13: 978-1-63157-604-1 (paperback)


ISBN-13: 978-1-63157-605-8 (e-book)

Business Expert Press Quantitative Approaches to Decision Making Collection

Collection ISSN: 2163-9515 (print)


Collection ISSN: 2163-9582 (electronic)

Cover and interior design by Exeter Premedia Services Private Ltd., Chennai, India

Second edition: 2017

10 9 8 7 6 5 4 3 2 1

Printed in the United States of America.


Abstract
Everybody has to make decisions—they are unavoidable. Yet we receive
little or no education or training on how to make decisions. Decision
making can be difficult, both in business (hiring, product development,
capacity, bidding, balancing R&D versus compliance spending, etc.) and in
your personal life (dating and marriage, which automobile or house to buy,
whether to change jobs or not, where to vacation, retirement, how to deal
with health problems, etc.).
Decision Analysis (DA) is a time-tested set of tools (mental frameworks)
that will help you and people you work with:

- Clarify and reach alignment on goals and objectives,
- Understand trade-offs associated with reaching your goals,
- Develop and examine alternatives,
- Systematically analyze the effects of risk and uncertainty, and
- Maximize the chances of achieving your goals and objectives.

Success (getting what you want) depends on luck and good decision
making. You can’t control your luck, but you can maximize your odds by
making the best possible decisions.
Broadly speaking, this book organizes and presents otherwise formal
decision-making tools in an intuitively understandable fashion. The
presentation is informal, but the concepts and tools are research-based and
formally accepted. Whether you are a business owner, a manager or team
leader, or a senior professional, these tools will help you make better
decisions in both your personal and your business life.
In the second edition of Decision Analysis for Managers, chapters
focusing on risk analysis and decision quality have been added.

Keywords
business decision making, decision analysis, decision and risk analysis
(D&RA), decision framing, decision making under uncertainty, decision
quality, making decisions, multiattribute decision analysis, personal
decision making, portfolio analysis, project risk, risk, strategic decision
making, trade-offs, uncertainty
Contents

Preface
Acknowledgments

Chapter 1 What Is Decision Analysis?: And Why Should I Care?


Chapter 2 How to Start Framing a DA Problem: How Can We Work Together?
Chapter 3 The Objectives Hierarchy: What Do We Want?
Chapter 4 Decisions and Alternatives: What Can We Do?
Chapter 5 Influence Diagrams: What Do We Know?
Chapter 6 Uncertainty Assessment: The Boundary between Known and Unknown
Chapter 7 Building a Deterministic Model: Time to Run the Numbers
Chapter 8 Tornado Diagrams: Figuring Out What Is Important
Chapter 9 Cumulative Probability: Looking at the Range of Outcomes
Chapter 10 Value of Information: How Much Is It Worth to Know?
Chapter 11 Multiattribute Decision Analysis: There’s More to Life than Money
Chapter 12 Decision Quality: How Well Have We Done?
Chapter 13 Risk Analysis: I Want It Cheap and I Want It Now
Chapter 14 Other Topics: More Things to Think About
Notes
References
Index
Preface

Everybody has to make decisions—they are unavoidable. Yet we receive
little or no education or training on how to make decisions. Business
decisions can be difficult: which people to hire, which product lines or
facilities to expand and which to sell or shut down, which bid or proposal to
accept, which process to implement, how much R&D to invest in, which
projects should receive the highest priority and which should be deferred,
and so on. Even if you make the correct decision, you still have to get buy-
in and commitment from your team, upper management, other managers,
and key stakeholders to successfully implement the decision.
Personal decisions can be even more difficult: which college to attend,
who to date, who to marry, which automobile to buy, which house to buy,
whether to change jobs or not, where to go on vacation, when and where to
retire, making decisions concerning a serious illness or health problem, and
so on.
Decision analysis (DA) is a time-tested set of tools (mental frameworks)
that will help you, your teams, your peers, and your family:

- clarify and reach alignment on goals and objectives,
- understand trade-offs associated with reaching those goals,
- develop and examine alternatives,
- systematically analyze the effect of uncertainty, and
- maximize the chances of achieving your goals and objectives.

Success (getting what you want) depends on luck and good decision
making. You can’t control your luck, but you can maximize your odds by
making the best possible decisions. This book introduces you to tools that
can help you make better personal decisions and that will help teams and
work groups align quickly and make better recommendations and decisions.
It gives you frameworks for gaining insight when the decisions and/or the
people involved in those decisions are difficult. Whether you are a business
owner, a manager or team leader, or a senior professional, these tools will
help both your personal and your business life.
Acknowledgments

It is difficult to list all the people who help a person formulate a set of
experiences and frameworks that constitute a career. I’ve been fortunate to
have had the opportunities to work on challenging and interesting problems
with some rather amazing people. Here are the high points—errors of
omission are sincerely regretted! We’ll start with decision analysis (DA):
Tom Sciance (now an adjunct professor at the University of Texas) was the
visionary leader at Conoco who started both David Skinner (author of
Introduction to Decision Analysis) and me on the DA adventure. SDG was
our early DA mentor and helped us through those first Conoco projects.
Hollis Call of ADA was also a source of advice and encouragement. Gary
Bush, Pat Leach, Bill Haskett, Nick Martino, Paul Wicker, OJ Sanchez,
Stefan Choquette, James Mitchell, Bob DeWolff, Mike Stallsworth, and the
rest of the early DSI (Decision Strategies, Inc.) gang influenced my
thinking about DA considerably and provided helpful advice, ideas, and
collaboration. Professors Ronald Howard (Stanford), Bob Winkler (Duke),
and Bob Clemen (Duke) have been inspiring and encouraging.
In my association with Chevron for over 13 years (consultant and
employee), I’m grateful that I was able to work in a company that is serious
about using state-of-the-art decision science. Frank Koch, Larry Neal, Elton
Allen, Dale Nichols, Jeff Circle, and Craig McKnight have been very
generous with their time, ideas, encouragement, and collaboration.
My underlying management philosophy was influenced most by John
O’Bryan, Dave French, John Lindsay, and Jarl Swanson of the bygone
Louviers (DuPont) training group (and our “gurus,” Ed Klinge and Ted
Keegan). I’m grateful to the teams that I managed at Parkersburg, Savannah
River (now SRNL), Ponca City, and Gainesville for putting up with and
responding to my constant pushing for synergy and a “skunk works”
culture. People I’ve known who led by example and served as role models
and inspiration include Herb Eleuterio, Bob Cook, and Tom Sciance
(DuPont), Todd Wright, Harry Harmon, and Ed Albenesius (SRNL), John
Bailey (PCR), and Sam Trobee, Bill Hauhe, and Doug Quillen (Chevron).
Manny Gonzalez (retired Chevron Alliance Manager) is one of the most
successful entrepreneurs I’ve ever met and was an absolute joy to work
with (his bosses, Matt Palmer, Melody Meyer, and Paul Siegele are to be
commended for letting Manny run).
My last Chevron assignment involved working with JPL. I thank Dean
Wiberg, Larry Bergman, and project managers Jeff Hall, Andrew Aubrey,
Harish Manohara, Kymie Tan, and Hari Nayar for their hard work,
competency, and willingness to use the DA toolkit as a way of working
day-to-day with me.
Heath Lynch and Cole Brinkley from Chevron’s Area 52 read the first
edition thoroughly and asked many excellent questions, which gave me
thoughts as to which topics needed expansion in this edition.
I am very appreciative of Doug Quillen and Paul Wicker for reviewing
and commenting on this manuscript. Paul kept me grounded on DA issues,
and Doug added very valuable perspective and insight. Nancy Winchester,
formerly a Chevron colleague, is especially appreciated for her ability to
see what others miss. Larry Kingrey found inconsistencies and items that
needed clarification; I am grateful for his detailed review of the first edition.
Business Expert Press is to be commended for taking on the project.
I especially appreciate the encouragement and the detailed review and
comment by Don Stengel (Professor, California State University at Fresno).
Finally, I’m thankful to my wife Debbie, who has uncomplainingly
trudged off to DAAG meetings, edited the DA books that we’ve published
together, and provided encouragement and good advice for the many years
we’ve had the good fortune to be together.
As I’m writing this, I can’t help but think of Zig Ziglar (who recently
passed away). I had the chance to meet Zig (after reading several of his
books) in the mid-1980s—he was an inspiration to me and was as gracious
and bright in person as his writings would lead you to believe he would be.
When two of my colleagues and I had lunch with Zig, he told us, “You can
have anything in this world that you want as long as you help enough other
people get what they want.” I’ve had some difficult situations in life when I
wondered if his saying was correct—sometimes it seems like the bad guys
are winning. But with the benefit of experience and hindsight, I think he is
correct, and those are good words to live by. Sometimes when we’re doing
financial analysis and we are focused on trying to maximize net present
value (NPV), it is easy to forget why we’re doing what we’re doing. My
sincere hope is that this book will give you some tools to help you get what
you want, and that it will provoke you to think about why you want what
you want. Enjoy!
CHAPTER 1

What Is Decision Analysis?

And Why Should I Care?

The premise of this book is simple: there is a set of tools and mental
frameworks referred to as decision analysis (DA) that can help you, your
family, and the teams and groups that you work with improve your decision
making. You don’t need to hire expensive consultants to use the tools—the
concepts are not that difficult and the tools are very useful.

Here is the context I’d like you to envision: you and I have just sat
down in first class seats for a three-hour plane trip. You know that I
have a background in both management and DA consultation and you
have asked me how DA could help you in your job as a manager (or
team leader or senior professional). We have a pad of paper to draw
on, and we will talk about concepts, examples, and stories rather than
the mathematics underpinning DA.
Many of the examples we’ll discuss involve personal decisions—my
friend David Skinner (DA practitioner, author, and entrepreneur) and I
discovered while teaching DA at Conoco that people absorb the
concepts more quickly from personal examples than from business
examples.

So get comfortable, and let’s talk about management and DA tools! Since
you have an inquisitive mind and have several questions, we’ll begin with
FAQs (frequently asked questions) about DA:

Why is DA important?
What is DA?
Where did DA come from?
Does DA work?
What is a good decision?
Why hasn’t DA been more widely adopted?
How do the DA tools fit together?

Then we’ll talk about the tools and how you, as a manager, a team
member, or a family member can use them.

Why Is Decision Analysis Important?


Decision analysis is important because your personal and business success
(getting what you want) depends on:

- understanding what you want,
- luck (i.e., uncertainty), and
- good decision making.

Thinking through what is important to you—what you want—sounds
obvious, but many people simply don’t take the time to think through their
objectives, let alone those of others. Sometimes removing ambiguity
concerning your objectives (and those of the people you are working with)
is enough to add clarity and reach a consensus on what to do going forward.

Luck and Uncertainty

We cannot control our luck, but we can use DA to improve the quality of
our decisions and subsequently increase the chances of getting what we
want (see Figure 1.1).
An important distinction DA makes is that good outcomes and good
decisions are correlated, but there is no guarantee of a good outcome as a
result of making a good decision. We cannot control outcomes; all we can
do is control the decisions we make. Improving the quality of our decision
making through time improves our odds, but we are still at the mercy of
luck. My friend Patrick Leach noted that it is ironic that in professional
poker, knowing the odds is just the start of being able to play, whereas in
business, “the players have the skills but do not understand the odds.”1 DA
helps you understand the odds.

Figure 1.1 Luck versus decision quality

When I introduce this concept while teaching DA, I stop and ask the
students for examples of poor decisions that can have good outcomes.
Invariably a student will bring up winning the lottery, as the expected value
(probability of winning times the amount you win) is considerably less than
what you have to pay for the ticket. Lifestyle choices we make fit into this
part of Figure 1.1: some of us can smoke cigarettes all our lives and have no
adverse health effects; others will get cancer or heart disease from making
the decision to smoke.
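
To make the lottery arithmetic concrete, here is a minimal sketch in Python. The odds, jackpot, and ticket price are invented round numbers (roughly Powerball-scale), not figures from the book:

```python
# Expected value of a (hypothetical) lottery ticket.
p_win = 1 / 292_000_000    # assumed probability of hitting the jackpot
jackpot = 100_000_000      # assumed jackpot, in dollars
ticket_price = 2.00        # assumed price of one ticket

# Expected value = probability of winning times the amount you win.
expected_value = p_win * jackpot
print(f"Expected value of a ticket: ${expected_value:.2f}")
print(f"Price of a ticket:          ${ticket_price:.2f}")
# ~$0.34 of expected value for a $2.00 ticket: a poor decision that can
# nevertheless produce a spectacular outcome.
```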
When I ask for an example of a disappointing outcome from a good
decision, usually a wildcat oil well comes up as an example. You can
carefully analyze the seismic data and pick the best spot to drill a well and
still come up with a dry hole. In terms of personal decisions, you can do
careful research on a stock or mutual fund and invest carefully, only to have
some unforeseen factor drive down the value of your investment. Health is
also an example of this part of Figure 1.1: we can eat healthy foods,
exercise appropriately, and still get cancer or heart disease. All we can do is
make the best decisions possible, thereby improving our odds of getting
what we want.
The other point I make when discussing Figure 1.1 is that making poor
decisions, especially with regard to lifestyle choices, will eventually catch
up with you. We’ll talk about the implications of poor decision making
relative to ethics and lifestyle choices at the end of the book. For example,
Ralph Keeney concluded in a recent study that about half of the deaths in
the United States are a direct result of poor lifestyle decisions (we’ll talk
more about this in Chapter 14).
One of the biggest problems we have in business is that managers are
usually rewarded (or penalized) for outcomes rather than the quality of their
decisions. The problem with rewarding luck is that sooner or later luck runs
out. If you are in a position where you are evaluating managers’
performances, long-term success of your company depends on making this
distinction (talent versus luck) as fairly as you can.

Companies Use DA

Another reason that DA is important is that many companies are using DA
as part of their normal business management. If you understand the tools
and go to work for a company using DA, you’ll be able to contribute
quickly. There is strong evidence (as we’ll discuss later in this chapter) that
companies that use DA are more likely to succeed than those that don’t, so
if your company is unwilling to use DA in an industry where the
competitors do use it, you might consider changing jobs if the opportunity
arises.
In fact, one of the most interesting articles I’ve ever read was by a
professor named Kathleen Eisenhardt.2 She studied the decision-making
processes of eight “dot.com” companies in detail and correlated the
processes with financial and performance results. Decision quality varied
from negative to neutral to very high, and financial performance results
tracked with decision quality. Therefore, if your company has poor
decision-making practices (group think, who shouts the loudest, highly
political, etc.), you need to be aware that your competitors are likely to win
in the long run.
Also, please note that DA is not a fad like so many other ideas that have
been advanced by consulting firms and academia. If you’ve been a manager
for very long, you’ve experienced some of these fads, which at best were a
distraction and at worst damaged your company’s strategy and morale. DA
has been successfully used in oil and gas, chemicals, pharmaceuticals,
transportation, and manufacturing for decades and is part of the normal
business practice at many companies.

Consensus and Alignment


Another reason these tools are important is that they are designed to help
groups and teams of people reach consensus and engage in the decision-
making process. It is sometimes critical for a group to buy in to a course of
action and align together to reach a goal. Caution, though: there are other
times when the technical accuracy of the decision is what ultimately drives
success.
Very recently, a team of which I am a member was divided over the role
of a potential new piece of equipment being developed in the laboratory. A
detailed technical program had been laid out and agreed to earlier, but the
role of the new equipment was ambiguous. I drew a simple tree (we’ll
discuss decision trees in a later chapter) on the white board, noting that
there are only three possible outcomes of the technical program now in
place:

- Current technology is adequate to address the needs of the program, in which case the new equipment is interesting but not necessary to complete the mission.
- Current technology won’t adequately address the needs of the program, in which case the current program should go on hold until the new equipment is available.
- Current technology addresses part but not all of the needs of the program, in which case a cost/benefit analysis should be done to determine whether to proceed or wait. We labeled this outcome the “sort-of” case; the team deemed this the most likely (and least desirable) outcome.

Once the team understood the uncertainty (adequacy of current
technology) and the decisions that would logically flow from resolving the
uncertainty, the team aligned and agreed on how to proceed.
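
For readers who like to see the structure written out, here is a minimal sketch of that whiteboard tree in Python. The probabilities are invented placeholders; the team assessed only that the “sort-of” case was the most likely:

```python
# One uncertainty (adequacy of current technology) and the action that
# logically follows from each way it could resolve. Probabilities assumed.
technology_tree = [
    ("adequate",   0.25, "Proceed; new equipment is interesting but not necessary."),
    ("inadequate", 0.15, "Hold the program until the new equipment is available."),
    ("sort-of",    0.60, "Run a cost/benefit analysis: proceed now or wait."),
]

for outcome, probability, action in technology_tree:
    print(f"{outcome:>10} (p = {probability:.2f}): {action}")
```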
What Is Decision Analysis?
We’ve been talking about how important DA is, so it is time to define what
it is that we are talking about. David Skinner3 developed the following
definitions:

- “A decision is a conscious, irrevocable allocation of resources with the purpose of achieving a desired outcome.”
- “Decision analysis is a methodology and set of probabilistic frameworks for facilitating high quality, logical discussions, illuminating difficult decisions, and leading to clear and compelling action by the decision maker.”

Note that in real life, there’s no “control z” undo on the commitment of
our resources. You may subsequently have to reverse a decision, but only
with the loss of money, time, and resources. David’s point about clear and
compelling action is important—DA should always add clarity. Once a
course of action is clear and agreed upon, your analysis is complete, and it
is time to focus on planning and implementation.
Note the adjective probabilistic—there’s a significant component of
statistics incorporated into the analysis tools of DA. Somewhat tongue-in-
cheek, David and I used to use a chart we called the “Four Ds” that noted
the common elements of statistics in statistical process control (defects),
health risk assessment (deaths), fault tree analysis (destruction), and DA
(decisions).

Here is my definition of decision analysis:

Decision analysis is a set of tools (frameworks) that can help people or groups of people:

- clarify and reach alignment on their goals and objectives,
- develop and examine alternatives,
- systematically examine the effect of uncertainty, and
- maximize the probability of achieving their goals and objectives.

My definition is more of a working definition of DA—you first have to
figure out (and get alignment on) what you want, then figure out what
alternatives you really have, and then use probabilistic analysis to handle
uncertainty. The toolset is a bit broader than any of the definitions imply. I
think that clear and compelling action is important but not as important as
maximizing the probability of achieving your goals (you need to have goals
to maximize your chances of achieving them!).
The term decision analysis was coined by Professor Ronald
Howard (not the movie director) of Stanford in the 1960s.4 Since that time,
many books and academic papers have expanded the set of tools referred to
by the term decision analysis.
You may now be curious about the tools. I split the tools into framing and
analysis tools. Framing tools help remove ambiguity concerning the nature
of the problem, the objectives, what is known and unknown, and what
alternatives or sets of alternatives are available. Sometimes just framing a
problem correctly can achieve enough clarity that a team can align on the
decisions that need to be made and proceed with planning and
implementation.
Analysis includes financial modeling, assessment, and examining how
uncertainty affects the potential outcomes for the decision(s) at hand. There
are also several special-purpose tools that we’ll consider near the end of the
book.

Where Did Decision Analysis Come from?


The statistical foundation for DA began in the 1600s when pioneers such as
Thomas Bayes, Abraham de Moivre, Daniel Bernoulli, and Blaise Pascal
developed probability theory and key theorems that are used in the analysis
parts of DA. After World War II, John von Neumann, Oskar Morgenstern,
Robert Schlaifer, Ronald Howard, Howard Raiffa, and Ralph Keeney
developed theories of rational decision making based on probability and
systems analysis. The practical tools we’ll discuss in this book are a result
of the fundamental work developed by these men.
Ward Edwards and Detlof von Winterfeldt included a thorough, concise,
and interesting history of DA in their book, Decision Analysis and
Behavioral Research.5

Does Decision Analysis Work?


There are many academic studies and papers documenting the effectiveness
of various DA tools and approaches, but the work I like best was done by a
professor named Paul Nutt. Dr. Nutt has made a career of studying business
decisions.
For example, he studied 127 major decisions businesses made in North
America in detail.6 He found that the probability of success doubled (from
40% to 80%) when DA processes were utilized. And his definition
of probability of success represented a very low bar—if the decision had
not been reversed within 2 years, he termed it successful whether the
decision was ultimately profitable or not! He also saw dramatic
improvement in understanding, participant buy-in, and use of creative ideas
and achievement of business results. Dr. Nutt’s papers are fascinating—he
has documented dozens of what he calls decision debacles. If you are a
CEO and make a really dumb decision, Dr. Nutt is going to write you up in
one of his papers!7 He also wrote a book on his findings, Why Decisions
Fail.8 The book is a good read, but it seemed to me that he kind of pulled
his punches in the book versus the refreshingly blunt tone of his papers.
Anecdotally, while I was at DSI, my colleagues and I saw large value
added by helping our clients use the DA tools. Many times the teams had
missed an alternative or a hybrid strategy that, after the analysis was
completed, was clearly superior across all ranges of probability to the
strategy that teams would have pursued, absent the analysis.
Every year I work with David Skinner as he is teaching his DA course at
Rice University. He divides the class into teams and each team has to
complete a project, preferably with an actual decision that one of their
companies has to make (or a significant personal decision facing one of the
team members). The teams present to a set of business people at the end of
the class in a simulated DRB (decision review board) setting. Decision
makers always note that going through the DA tools makes the decisions
easier to make because the tools add clarity on the alternatives going
forward and that it is very valuable to understand the ranges of possible
outcomes and not just focus on “the number.”
When you are faced with a difficult personal decision, using the DA tools
can significantly reduce the stress and anxiety associated with the situation
because they give you mental frameworks to systematically address the
ideas, facts, risks, alternatives, uncertainties, goals, trade-offs, and values
that are “swirling” around in your mind. When you involve your spouse and
friends and family in the process, the tools give you a way to engage
systematically without just arguing about the situation. You may still argue,
but you can turn that thinking into useful input, examining the upside and
downside associated with each alternative. And by incorporating this input,
you build consensus and a higher quality decision, hence increasing your
odds of getting what you want.

What Is a Good Decision?


We’ve talked a lot about DA and improving our odds by making high-
quality decisions. But what do we mean by good decisions? Once again,
we’ll turn to a definition from David Skinner:

A good decision is one that is logically consistent with our state of
information and incorporates the possible alternatives with their
associated probabilities and potential outcomes in accordance with our
risk attitude.9

As noted previously, making a good decision does not guarantee a good
outcome—it just improves our odds of getting what we want. In Chapter
12, we’ll discuss decision quality in more detail.

Why Isn’t DA More Widely Used?


At this point, you might be asking yourself why the DA toolbox is not more
widely used than it is. I think there are two major reasons.
First, consulting firms have been used by many companies to implement
DA. This has resulted in many managers coming away with the impression
that DA is very complicated and is best left to the experts. There is a place
for using consultants: training in-house personnel who have an interest in
DA, helping with large, complex problems, and coaching executives. But
you don’t need an expensive consultant to understand the concepts and use
the tools. In fact, using the tools yourself will help you make a better
decision as to when it will add value to bring in a consultant to help with a
problem.
Second, because managers are many times rewarded for good outcomes
(i.e., being lucky), some managers begin to believe that their own hunches
are infallible, hence negating the need for framing or analysis.

One very experienced and successful consultant I know left his
consulting practice. I saw him at a conference and asked him why. He
noted that he had reached the conclusion that DA would never be
widely adopted. I looked at him quizzically, and he responded, “Ego.”
Another consultant friend made a sales pitch to Enron during their
peak years; the Enron executives responded, “We know all there is to
know about uncertainty and don’t need DA.”

That said, the tools are available for our use to improve the odds of
getting what we want. I’m optimistic—I think that as managers experiment
with the DA tools and find out that the tools are useful, DA will
increasingly become a significant part of the way we run our businesses.
There is one caveat, though. All of the DA tools assume purposeful and
transparent motivation and communication. For thinking through situations
where this assumption may not be correct in dealing with other parties, a
Game Theory approach is more appropriate (see, for example, Game
Theory for Business by Paul Papayoanou). If the culture of your own
organization needs significant repair, that is another topic. The DA tools can
help improve an organization’s culture, but only to the extent that
communication is direct and honest.

How Do the DA Tools Fit Together?


Figure 1.2 is a basic outline of the DA tool set. I divide the tools into
framing tools, analysis tools, and special topics. The order we’ll discuss
these tools in the book is the normal order that we would use them in a full
DA.
However, you do not have to use every tool for every situation.
Sometimes all you need to do is draw a simple decision tree to frame the
situation and add enough clarity that the issue is resolved. Sometimes
understanding the trade-offs shown in an objectives hierarchy is enough to
help stakeholders see each other’s positions and resolve an impasse.
The first four framing tools are not explicitly DA tools, but these tools
are needed to begin to frame a complete analysis.

Figure 1.2 Decision analysis tools

Note that the influence diagram is included as both a framing tool and an
analysis tool. This is because the influence diagram serves as a bridge
between framing and analysis.
Also, note that once the course of action becomes clear, it is time to
transition from DA to planning.
CHAPTER 2

How to Start Framing a DA Problem

How Can We Work Together?

I’m sequencing the introduction of the DA tools as if you were doing a
complete analysis. However, you do not have to be in a complete analysis to
use the tools. It is just a convenient way to organize them. Also, this section
in particular is a good way to kick off any project team, whether you plan to
complete an analysis or not.

Facilitation: PowerPoint or Whiteboard?


Let us discuss logistics for a minute. I like to use a whiteboard for framing.
There are whiteboards that will print, or you can simply take a digital photo
of what you’ve developed. Markers come in good bright colors. With a
small group, you can all stand around the whiteboard and work together.
Many people prefer to use PowerPoint and project onto a screen, editing
online as the meeting progresses. This can work well, but be aware that (1)
a dark room with a bright light can put people to sleep, and (2)
“wordsmithing” PowerPoint slides can be really painful and boring for the
group, especially if the person typing (the scribe) doesn’t listen well or is a
slow typist. If you’re going to record online, make sure that you or your
scribe can do it well.

About Facilitation
Another facilitation factor is the number of people you include in your
meetings. There are trade-offs between having everybody included versus
having a manageable team.

One of my hobbies is playing music—I play keyboards, bass, and
some guitar and have played in duos and trios, medium-sized bands,
and large bands. I’ve played rock, jazz, R&B, country, western, and
contemporary Christian. My rule of thumb is that the difficulty of
getting a group of musicians to align goes up by the power of the
number of musicians. Here’s one of the few equations I’ll present:
Difficulty = (Constant)^n, where n is the number of people!
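
Tongue-in-cheek as the formula is, a few lines of Python make the point; the constant of 1.5 is an arbitrary assumption:

```python
# Illustrating Difficulty = (Constant)^n with an assumed constant of 1.5.
C = 1.5
for n in (3, 6, 12, 18):
    print(f"{n:>2} people -> relative difficulty {C**n:>8,.0f}")
# Three people score about 3; eighteen people score about 1,478.
# Alignment gets expensive fast.
```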

Problem-solving meetings are the same way—the more people you have,
the less time for each person to speak and the more difficulty you have
facilitating. However, excluding a key person from your meeting is likely to
generate hurt feelings, set your project back, and may exclude key input
necessary to solve the problem.
Another consideration is the willingness of the people to work together
and try the tools.

When I was consulting, one time I got to the meeting and the room
contained 18 people, most of whom had never met each other before.
The project manager and I had discussed the meeting and had
developed an opportunity statement (see the next section), but I didn’t
realize there were that many people involved. I thought to myself,
“What am I in for?” The group was fabulous, though—they did
everything I asked them to do, and all of them participated with energy
and competence. We ended up with a delightful product that later
withstood a surprise audit by the parent company. One of the things we
did during the two-day workshop was split into smaller groups and
divide up the tasks—this allowed people to work on what they were
interested in.

I’m assuming that you have at least a rudimentary knowledge of how to
set up and run a meeting and understand logistics and ground rules for
helping people work together. I will interweave facilitation tips throughout
the book, but these are specific to the tools that we’ll discuss. If you’d like
more background on effective meetings, there is an excellent “yellow” book
called How To Make Meetings Work by Doyle and Straus that is thorough,
concise, and highly recommended.1
Effective facilitation is a skill that takes quite a bit of practice; the key
skills involved are listening and watching body language (and trying to stay
ahead of the group!). Skinner also presents good checklists for meeting
preparation and ground rules that are targeted toward DA situations.2
Again, the secret of facilitation is good listening—capturing headlines of
people’s thoughts and making sure that everybody gets a chance to speak
up.

Problem/Opportunity Statement
The first element of any decision analysis (even a very simple one) is a
problem statement. (Some companies always refer to these starting points
as “opportunity” statements, which can result in team cynicism if the
situation is truly a problem.)
This should be a concise one-, two-, or three-sentence (or bullet point)
statement as to why you are doing something. If I’m consulting, I meet with
the project manager before the meeting and develop the statement together
with him or her. If I’m convening a team to address the problem, I draft the
problem statement ahead of time and send it to the team as a pre-read.
It is very important to give the team time to discuss the problem
statement and edit it together. Once the team edits the statement together,
they align and own it. You do have to make sure that the edits are correct—
don’t abandon the content, but you have to be willing to adjust to include
others’ perspectives. It is also important to make sure that the statement is
technically correct, given the state of knowledge of your team and
associated subject-matter experts.
As an example statement, let’s use a career path decision as the situation
we have to deal with. Perhaps you or someone in your family is graduating
from college or grad school and has to choose among multiple job offers.
Or perhaps you or one of your friends is working for a company whose
future is limited (poor decision-making practices), so you’ve engaged a
recruiter and have ended up with multiple job offers. Here’s an example
statement:
Opportunity statement: We have to choose between multiple possible
career paths that have become available, specifically three competing
job offers. We want to maximize both financial and lifestyle benefits
for the family unit.

In this case, assuming the job offers are good ones, this is best
characterized as an opportunity rather than a problem!
My thinking concerning this very important DA tool has changed
recently as a result of new perspective….
My friend Brian Hagen, an experienced DA consultant, is in the process
of publishing a book that notes the distinctions associated with problems,
risks, and opportunities.3 This can be powerful, as management typically
approaches these three types of situations differently (and with different
biases). Brian notes that there are usually “tipping point” events associated
with problems, risks, and opportunities and we need to understand where
these tipping points are.
A problem is a situation where some kind of tipping point has occurred
or is certain to occur. My friend Tom Sciance notes that, sooner or later,
“the oxcart ends up in the ditch” and you have to figure out how to get it out
of the ditch. (This could be a literal tipping point!) “Do nothing” may be an
alternative in a problem situation (leaving our oxcart there for somebody
else to haul off), but there are always consequences associated with the do-
nothing alternative (losing the money we spent for the oxcart and maybe
paying a fine if the authorities figure out that it is ours). From a financial
perspective, it is very unlikely that the problem was included in the budget;
hence more money is likely to be required to resolve the situation.
A risk is a potential problem. In our case, there is a risk that if the road is
wet, the oxcart could slide into the ditch. There is a risk that an axle could
break and cause the oxcart to slide into the ditch. Here, the tipping point
hasn’t happened, but it is something that we would mitigate if we could do
so. In this case, we’d add the cost of mitigation to our budget if we deemed
it necessary.
An opportunity is a situation where something positive may be
obtainable by expending time and monetary resources. In our case, perhaps
a new transportation company has started up, which will allow us to hire
them to haul our stuff with their oxcart at a reduced cost.
Understanding whether the situation is a problem, a risk, or an
opportunity at the start of the decision analysis can add very valuable
perspective on which tools you might use and how you approach the
situation.

Issue Raising
Once we have edited our statement, it is time for issue raising. This is a
classic brainstorming exercise—we try to capture everybody’s thoughts in
reaction to the problem as fast as we can. We don’t judge the merit of the
opinion—we just write it down. My former DSI colleagues from Calgary,
James Mitchell and Bob DeWolff, call this “rant time.”
The reason we need to do this is that people cannot move on to exploring
new thoughts (and others’ thoughts) very well if they don’t get a chance to
speak their opinions first. They will sit there and get frustrated while
waiting for a chance to interrupt with their thought(s). Plus, if the facilitator
captures the thoughts accurately, it demonstrates that he/she is listening,
which is key to getting good involvement as you go through the tools. You
can ask clarifying questions if you don’t understand the person’s point. And
you don’t have to capture every word—try to headline the key thoughts.

One project I facilitated had a good problem statement that the team
edited and agreed to. However, during the 2-day workshop, by 10:30
AM on the first day we were still in rant time! It turns out that there
was another project that the team was worried about, and they couldn’t
stay on topic due to this other project. So we called a break. The
project manager and I talked about the other project, and when we
came back we shifted and collaboratively drew the decision trees for
the other project. Once the team worked through the decisions and
uncertainties associated with the second project and got some clarity
about how they were going to approach it, they were comfortable
putting it behind them and going back to the “real” problem.

Back to our multiple-career path example, here are some example issues
that might come out as part of the discussion:
- Which is more important, financial remuneration or lifestyle?
- How do we account for the higher risk (and higher potential reward) of one of the companies?
- Which job includes the most interesting work?
- Which is more important: long-term career or short-term compensation?
- How do we incorporate opinions and preferences of key stakeholders (friends and family) into our decision-making process?

Situation Analysis
So far, we’ve delineated the problem and recorded our team’s reaction to
the situation. At this point, it is appropriate for somebody who has technical
and business knowledge about the problem situation to brief the rest of the
team as to the technical and business situation. If the situation involves a
competitive landscape, the standard tools for examining your position
relative to the competition apply (e.g., Porter’s five forces, SWOT1).

Relative to our example, relevant situation analysis information could
include the cost of living and geographic distances associated with the
various alternatives, lifestyle distinctions between the alternatives, and
professional connectivity associated with each of the alternatives.

Stakeholder Analysis
Stakeholder Analysis can be completed quickly or it may take quite a bit of
time, depending on how many stakeholders are associated with the problem
and how complex their interests are. The simplest way to start this is to
brainstorm a list of stakeholders, combine them into groups, examine the
interests of each stakeholder, assess how powerfully they can influence the
outcome, and describe how the problem affects them.
As this is being done, you are likely to hear comments noting actions that
need to be taken as part of implementation—comments such as, “We need
to be sure that we file for this permit by a certain date, otherwise we’ll be
delayed,” or “We need to meet with the person quickly to determine
whether they’ll support our work or not, as they could derail it.” Be sure to
note these comments on a “bucket list” or action item list so that the
thoughts are captured.

For our career path example, stakeholders could include a spouse,
children, friends, family, bosses, work associates, and so on.
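
If you want something more structured than a brainstormed list, a simple register works. This sketch is in Python; the stakeholders, interests, and the 1-to-5 influence scores are all invented for illustration:

```python
# A hypothetical stakeholder register for the career-path example.
# Field names and the 1-5 influence scale are assumptions, not the book's.
stakeholders = [
    {"who": "Spouse",          "interest": "location, lifestyle, income",  "influence": 5},
    {"who": "Children",        "interest": "schools, friends",             "influence": 4},
    {"who": "Extended family", "interest": "proximity, holidays",          "influence": 3},
    {"who": "Current boss",    "interest": "retention, transition timing", "influence": 2},
]

# Engage the most influential stakeholders first.
for s in sorted(stakeholders, key=lambda s: -s["influence"]):
    print(f"{s['who']:<16} influence={s['influence']}  cares about: {s['interest']}")
```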

Taking a few moments to consider others’ possible views on the problem
can sometimes yield significant insight. If there are significant stakeholders
who could affect your project, coming up with a plan to engage these
people can sometimes make the difference between success and failure (and
can be a separate project unto itself). Social media gives people the
potential to quickly mobilize if there is a group that does not agree with
what you plan to do. The situation with the Dakota Access Pipeline is a
good example of what can happen if you don’t understand who potential
stakeholders are.
These framing steps can sometimes proceed very quickly, but they can
also take quite a bit of time. None of these steps are
classical DA tools; however, they can be essential to developing a good
frame—a good shared awareness—of your problem or your opportunity. In
the next chapter, we’ll discuss our first true DA tool: the objectives
hierarchy.

Documenting the Process


Documenting the work that you do with the DA tools depends on the
problem, risk, or opportunity that you are dealing with. If a simple decision
tree is all that is needed, you may just send a follow-up e-mail with a sketch
or photo of the tree. If you are helping a colleague develop an objectives
hierarchy (Chapter 3), a digital picture of a whiteboard may be all that is
needed.
If you are working on a large project, you normally have a charter from
your management, which leads to the framing meeting and use of at least
some of the tools in the left column of Figure 1.2. At the end of framing,
you will need to review the frame with management, normally in the form
of a PowerPoint presentation. Your management will always have
comments and adjustments to your frame. Once you have completed the
analysis, again, a PowerPoint presentation for management is normal.
Sometimes a written report, called a Decision Support Package (DSP), is
prepared to document the frame and the analysis. The DSP is provided to
management as a pre-read before the final presentation, or can be developed
after the presentation to support advancing the project.
Another useful way to document a large project is to keep an Excel file
with separate worksheets (tabs) for results from using the tools, outstanding
issues, action items, decisions taken by the team, and so on. This file is
updated after each team meeting and can significantly help, especially if
there are multiple stakeholders involved in the meetings. This technique is
also useful when a complex contract is being negotiated between two or
more parties. Each meeting should be followed with an update of the Excel
file.
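
As one way to automate that workbook, here is a minimal sketch using pandas (my choice of tool; the book assumes only Excel itself). The tab names mirror the suggestion above, and the sample rows are invented:

```python
# A multi-tab project log kept in one Excel file, updated after each meeting.
# Requires pandas plus an Excel engine such as openpyxl.
import pandas as pd

issues = pd.DataFrame(
    {"issue": ["Role of new lab equipment is ambiguous"], "status": ["open"]}
)
actions = pd.DataFrame(
    {"action": ["File the permit application"], "owner": ["PM"], "due": ["Q2"]}
)
decisions = pd.DataFrame(
    {"decision": ["Adopted opportunity statement, rev 2"], "meeting": ["Kickoff"]}
)

with pd.ExcelWriter("project_log.xlsx") as writer:
    issues.to_excel(writer, sheet_name="Issues", index=False)
    actions.to_excel(writer, sheet_name="Action Items", index=False)
    decisions.to_excel(writer, sheet_name="Decisions", index=False)
```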

Implications for the Manager


Almost every manager will occasionally have to facilitate. The concept of
“rant time” is very important—only by allowing everybody the chance to
speak and be heard (demonstrably) will people be able to move on and
work on the problem that you want to address.
Losing sight of the business situation and stakeholder interests is all too
common and can be very dangerous to your business. It is all too easy to get
locked in on the silo of your own organization and lose sight of the big
picture.
The key to good facilitation is good listening and then capturing
headlines on a whiteboard, chart pad, or computer. Once people know that
they have been heard, they can move on and process new thoughts and
information.
1 SWOT is an abbreviation for strengths, weaknesses, opportunities, and threats. If you are
unfamiliar with Porter’s five forces, Wikipedia contains a good summary of his thinking.
CHAPTER 3

The Objectives Hierarchy

What Do We Want?

One time when I was consulting, I met with a group of managers talking
about some common issues they were having with projects at their
company. Complaints included:

- no clear agreement on trade-offs,
- misalignment on selection criteria and value measures,
- missing scope or misalignment on project scope, and
- unresolved frame issues.

In this chapter, we’ll discuss one of the most powerful framing tools in
the DA toolkit: the objectives hierarchy. Proper use of the objectives
hierarchy can help teams and employees align, understand trade-offs, and
understand what the goals are for an endeavor. If the objectives hierarchy is
not addressed, the complaints listed above are likely to appear.
An objectives hierarchy is a schematic representation of a project’s
primary, fundamental, and means (or supporting) objectives, where:

- “An objective is … something that one desires to achieve.”1
- A primary objective is the high-level, general objective that we have relative to the problem, risk, or opportunity that we are considering.
- A fundamental objective is “an objective that is both essential and controllable.”2 This definition is useful as a test of the fundamental objectives that the team develops.
- A means objective contributes “to the achievement of the fundamental objectives.”3

Figure 3.1 Objectives hierarchy construction

An objectives hierarchy can also be defined as the combination of a
primary objective, a set of fundamental objectives, and their supporting
means objectives. Interconnections between the various fundamental and
means objectives (boxes) are shown (lines or arrows). An objectives
hierarchy is easily drawn by hand on a whiteboard, on paper, using sticky
notes, or using PowerPoint or Visio. Figure 3.1 shows the basic
construction of the objectives hierarchy.
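
For those who think in code, here is a minimal sketch of the same structure in Python, using the career example developed later in this chapter; the specific means objectives are my own illustrative assumptions, not Figure 3.3 verbatim:

```python
# Primary -> fundamental -> means, as in Figure 3.1. Entries are illustrative.
objectives_hierarchy = {
    "primary": "Obtain the best possible career",
    "fundamental": {
        "Financial":    ["Maximize compensation", "Ensure long-term security"],
        "Professional": ["Do challenging, interesting work", "Build valuable skills"],
        "Social":       ["Live near family and friends", "Fit the working culture"],
    },
}

print(objectives_hierarchy["primary"])
for fundamental, means in objectives_hierarchy["fundamental"].items():
    print(f"  {fundamental}")        # fundamental: essential and controllable
    for m in means:
        print(f"    - {m}")          # means: contributes to the fundamental
```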
Note that defining the objective is within our control. Achieving the
objective is at least partially controllable, that is, your actions determine
whether you achieve the objective or not. However, some objectives are
also likely to include some uncertainty that must be managed along the way.

Where Did the Objectives Hierarchy Come from?


In 1992, Ralph Keeney published the definitive reference on this subject,
Value Focused Thinking,4 in which he described in detail the objectives
hierarchy, how to develop it, and why to use it. Over time, the DA
community adopted Keeney’s objectives hierarchy as part of the framing
process. In fact, Gary Bush and David Skinner (founders, Decision
Strategies) note that some decision problems can be solved completely by
removing ambiguity using the objectives hierarchy and the other framing
tools.5 Skinner makes a very important point, “the key is to only use the
tools and processes needed to gain clarity of action.” Once the course of
action becomes clear and compelling, you’re done with the analysis; it is
time to end the DA phase and move into planning and implementation.6 The objectives hierarchy can add
clarity and understanding of project goals and can reduce ambiguity.

Why Use the Objectives Hierarchy?


The objectives hierarchy is designed to add clarity, help gain consensus, and
then communicate what a project hopes to achieve and why. It helps define
purpose by establishing how success will be achieved and measured. The
objectives hierarchy helps decision makers and team members examine:

- what is important to the success of the endeavor,
- trade-offs between possibly conflicting objectives,
- ambiguities and lack of clarity among goals,
- management’s views of the project’s objectives,
- how stakeholders’ objectives may differ, and
- how to measure success (which, by definition, must consider each fundamental objective).

Without completing the objectives hierarchy, the team bears a significant
risk of solving the wrong problem. Finally, objectives hierarchies assist in
stimulating development of creative and robust alternatives.

Where Does the Objectives Hierarchy Fit into DA?


As already noted, we begin the framing process with a problem statement,
issues raising, and a business/technical situation analysis. The objectives
hierarchy normally follows issues raising, followed by the decision
hierarchy, strategy table and decision trees, and influence diagram. A
stakeholder analysis can be completed either just before or just after the
objectives hierarchy (considering stakeholder interests will improve
objectives hierarchy quality and vice versa). Furthermore, the objectives
hierarchy should be used throughout the process and included in the
management review of the project frame, including:

- Management should confirm or refine the measure(s)-of-value noted on the influence diagram, using the fundamental objectives in the objectives hierarchy as input.
- Any qualitative strategy evaluation should address the fundamental objectives in the objectives hierarchy. Usually the means objectives are addressed in a qualitative evaluation, as they are specific enough to assess. Fundamental objectives are usually more difficult to assess, as they are more abstract and are usually subjective.
- The objectives hierarchy should be used as a check against the final recommendation, especially for a multiattribute decision problem. The question should be asked: “How well does our recommended alternative achieve the objectives that we originally said that we are trying to achieve?”

A good way of illustrating the role of the objectives hierarchy is to
consider what we used to call the Bush Grid (named after decision analyst
and co-founder of Decision Strategies, Gary Bush, not President Bush), in
which you consider the level of uncertainty versus the level of ambiguity.
Ambiguity in this context means the lack of clarity about goals or
conflicting goals (see Figure 3.2).7
Figure 3.2 Uncertainty versus ambiguity

With this figure in mind, and listening to the issues raised by the team in
reaction to the problem statement, you can start to get a feel for which tools
are going to be the most useful for your situation.

Facilitating the Objectives Hierarchy


Using the problem statement, business situation analysis, stakeholder
analysis, and issues raised as a backdrop, ask each participant to write down
what they believe are fundamental and means objectives for the project on
sticky notes, one objective to a note.
If you have a small team, next combine and group the notes onto a large
triangle. To arrange the objectives, the questions “why do we want to do
this?” (leads upward on the triangle) and “how can we do this?” (leads
downward on the triangle) are useful (see Figure 3.1).
If you have a large team, split into subteams and have each subteam
complete their own hierarchy and present it to the larger group. I usually
combine and refine the hierarchy offline and then come back the next day
(if possible) for the original team to edit the diagram together. Also, expect
management to make additions and corrections to the objectives hierarchy
based on their perception of the problem—this is a key part of frame
validation and ambiguity elimination.
It is important to recognize, though, that getting every line drawn from
each objective to every other related objective matters less than getting
the key objectives identified and placed in approximate order.
Inexperienced facilitators sometimes spend too much time forcing a team to
refine an objectives hierarchy, which can burn out the team. Use the team’s
time carefully and don’t hesitate to take refinement offline and come back
to the team for review and comment. It shouldn’t take more than one to two
hours to develop a preliminary objectives hierarchy, even with a large team.
Sometimes the team goes quickly through the preliminary diagram but later
has the energy to spend more time upgrading the refined version—they find
that discussion about trade-offs and value can be very worthwhile.
Occasionally, teams become confused between the two framing
hierarchies: objectives and decisions (see the next chapter). You need to be
able to draw out the distinctions between decisions and objectives; trees
with alternatives for the decisions will help the team understand the
differences.

Example Objectives Hierarchies


This example is one that I use with my DA courses (see Figure 3.3) and is
associated with our multiple career path example. When I lecture on
campus, I put this into the context of graduation and finding a job. When I
lecture at a company, I put this into the context of, “Your assignment is
ending and you are getting ready to go into the job selection process.” Your
primary objective is to obtain the best possible career.
With this approach, there are three fundamental objectives—financial,
professional, and social—that feed into the overall objective of obtaining
the best possible career. These meet the essential and controllable test, in
that if any one of the three fundamental objectives is significantly deficient,
the overall objective is compromised. All the lower two levels are means
objectives that contribute to accomplishing the fundamental objectives.
Keeney sees this particular problem a little bit differently—he cites four
fundamental objectives for a similar situation:

- learning valuable skills,
- determining whether to go to graduate school (or not),
- enhancing chances of acceptance at the best business schools, and
- experiencing a different geographical region and lifestyle.8

Note how one’s perspective can affect one’s viewpoint on objectives and
their relative importance. Much value can be obtained if a team takes the
time to articulate and understand each other’s view on the project’s
objectives.

Figure 3.3 Example objectives hierarchy

Figure 3.3 illustrates the potential trade-offs between objectives. Our
ideal job would, of course, include excellent financial rewards, be
challenging, interesting, and exciting from a professional standpoint, and
would be in a location and have a working environment that fit our needs
exactly. However, the likelihood is that any job decision we have to make
will represent balancing the three fundamental objectives. One job might
pay really well but might be in a location that was remote relative to friends
and family. Another job might be in a great location, but the work itself
might not fit our professional needs.
Sometimes means objectives represent trade-offs between themselves
underneath a fundamental objective. For example, a job might be an
excellent challenge but a poor fit with our skills and interests. Or a
company might have an excellent working environment and collaborative
culture, but their locations don’t fit with our family and social situation.
Other times the trade-off might be between a fundamental objective (e.g.,
financial) and a specific means objective (proximity to family). The power
of the objectives hierarchy is that it gives you a way to map and explore
what you want and the trade-offs associated with achieving what you want.
And then you can use it as a discussion catalyst for people around you who
are important to you.

Objectives Hierarchy and Career Decisions


The objectives hierarchy is a very valuable tool for making career
decisions.

One time I was consulting for a pharmaceutical company. The project
manager and I went out for beer after the workshop was completed. As
we were talking, he said, “Dave, for some reason people have always
come to me for career advice. If they have a job offer from a different
company or are thinking of a transfer within our company, they have
always sought me out. I used to just tell them what I thought, but now I
ask them what is important to them. I ask them questions, and as they
are talking, I draw their objectives hierarchy. Then I suggest that they
take that home and discuss it with their spouse.”
A few months later, I was working on a different project in an oil
company, and one of the managers asked if I could help him for a few
minutes. He asked me, “Can you help me draw some trees?”
We talked about what his problem was, and it turns out he was
contemplating a career change. Remembering the previous
conversation, I asked, “What is your most important objective as you
live your life? What is the most important thing to you?” He thought
for a few minutes, replied, and I pulled out a piece of paper and wrote
down what he said. I then asked him more questions, and we drew his
objectives hierarchy together.
About halfway through, he started to get excited, as he saw that this
would be a mechanism for him to talk with his wife, as they had been
arguing about his possible career change without resolution. Sure
enough, he took it home and asked her what was important to her, and
they integrated their objectives. This gave them clarity, and they
quickly aligned on a course of action they both agreed with.
Unfortunately for the oil company, he ended up leaving and buying a
business that fit his objectives better than continuing on with the
company. But for him and his wife, the course of action they took fit
much better with what they both wanted.

Keep Objectives in Focus


Some of the DA tools help you (or you and your team) gain insight into a
particular problem, and once the course of action becomes clear, you’re
essentially done with the tool (except using it to explain the decision to
other stakeholders). The objectives hierarchy, though, is more evergreen in
nature. It helps people keep focused on what is important and what trade-
offs they’re dealing with as they go through time. We’ll discuss a negative
example and a positive example to illustrate this point.

Losing Sight of What Is Important

I once worked on a project involving changing a rather complex and
low-yield process from batch (put the ingredients into a big tank called
a reactor, stir, and heat—kind of like cooking spaghetti sauce from
scratch) to continuous (slowly feed all the ingredients all the time and
then separate out the product). We were scaling the process up from
laboratory scale to pilot plant scale. The pilot plant work seemed to be
going well, and everybody was excited about the project.
When the program was well over half completed, there was a major
reorganization and a different client organization took over funding
this work. Where the original client was concerned mostly about the
technical results of the work and completing the work in a timely
fashion, the second client was solely concerned with expenditures. The
new client project manager would call at least twice a week to
complain about the spend rate and pressure us to cut costs. To try to
oblige the second client (thinking “the customer is always right”), I de-
staffed the program as much as I could.
Unfortunately, there was a glitch in the process—we started seeing
anomalous results that we did not understand. I should have shut the
process down until we understood what was happening, but I
succumbed to the pressure put on by the second client and kept going,
trying to finish the program cheaply. One night the unit suffered a
detonation and came apart. Fortunately we were operating it remotely,
so nobody was hurt and the emergency shutdown and environmental
protection equipment worked properly.
We completed a thorough investigation and did figure out what caused
the problem. Unfortunately, the second client organization used the
incident to abandon what could have been an excellent process—their
objectives were short-term cash-flow preservation, whereas the first
client organization wanted to use the new technology to improve their
long-term position. As we did not have an objectives hierarchy for the
project, the shift imposed on us by the second client was felt daily but
never expressed directly. The second client did not share the first
client’s fundamental objective of developing the new process to
improve their long-term position with the product. The second client’s
objective was to minimize the cash outlay for a program they inherited
but had no real desire to complete.

Developing a good objectives hierarchy with the first client and then
using it with the second client would have resulted in either (1) getting buy-
in on what was important from the second client, or (2) abandoning the
project at the time of changing clients. Either outcome would have been
preferable to what happened.

Keeping the Objectives in Focus

The positive example involved a large project’s framing meeting just
before the project moved into FEED (front end engineering and
design). During a pre-FEED workshop with the project manager and
his staff of team leaders, I facilitated the objectives hierarchy.
However, when we started building out the team’s hierarchy, the
project manager stopped the group and said, “I’ve been thinking about
this for some time, and I’d like to propose that there are seven
fundamental objectives, all of which must be completed in order to
accomplish our primary goal.” He then wrote the seven objectives
across the whiteboard and explained what he meant by each one. The
team asked him questions as he went. After discussion, the team
leaders agreed with the seven fundamental objectives.
They then were able to categorize many means objectives that they had
come up with underneath the seven fundamental objectives. They were
also able to draw the links between objectives, which gave them
insight as to where, how, and when they were going to have to keep
their teams working with each other.
As FEED progressed, the staff used the objectives hierarchy to make
sure that they were progressing all seven fundamental objectives. And
when FEED was completed, the seven objectives were addressed and
the project moved successfully through FID (final investment
decision) and was completed.

Implications for the Manager


One obvious use of the objectives hierarchy is career counseling (as noted
in my discussion earlier).
Another powerful use of the objectives hierarchy is in negotiation
preparation. If you understand your objectives and take a first pass look at
your counterparty’s objectives, it should help you to achieve a win–win
agreement that both parties like.
The objectives hierarchy is also a powerful planning tool for you and
your family. Understanding each other’s goals and objectives can remove
ambiguity and increase the chances for alignment. It also gives you a
context to more objectively assess whether alternatives you face as a family
will help you achieve your goals or take you further away from them. One
of the best examples of this is my colleague Frank Koch. Prior to retiring,
he and his wife sat down and developed their shared objectives hierarchy,
which was very useful as they figured out where to live and when to retire.
Understanding how objectives and trade-offs are related and connected
can really help teams reach alignment and consensus on what needs to be
done and how things fit together. This is especially important when a group
or team loses focus and productivity starts to decline.
CHAPTER 4

Decisions and Alternatives

What Can We Do?

Now that we understand what we want and have explored the trade-offs
between our various goals and objectives, we can begin to look at our
alternatives. The tools we’ll explore include the decision hierarchy, decision
trees, and the strategy table. All three of these tools are very useful and can
be used within the context of an analysis or to add insight to a smaller
problem.

Decision Hierarchy
There are three kinds of decisions we need to consider:

closed decisions,
strategic decisions, and
tactical decisions.

Closed decisions are decisions that have already been made. They are
sometimes referred to as boundary conditions. Closed decisions may be
decisions that have been discussed and decided by the project team or group
already and you do not want to reopen them. Closed decisions may be
decisions that have previously been made by management. Some texts call
these “policy” decisions, but I don’t use this term because teams confuse
policy decisions with corporate policy. The word “closed” is a better
description. It is important, however, to verify that these decisions are
indeed closed with whoever the decision maker is for your problem.
Sometimes executives will challenge what a team presumes to be a closed
decision and instruct the team to explore potential alternatives.

Considering our career path example, closed decisions might include:
(1) we have three possible career alternatives, any one of which is
acceptable, (2) our significant other will be involved in the decision-
making process, and (3) we will consider both financial and
nonfinancial characteristics of the alternatives.

Strategic decisions are the decisions of interest relative to the current
problem or opportunity. Trees (discussed in the next section) are useful to
show the alternatives associated with each decision. It is important to test
the strategic decisions with decision makers—many times they’ll note a
decision that is missing or one that is really closed (in which case you move
it to the closed decisions part of the decision hierarchy).

It sounds trite, but if we have no alternatives, we have no decisions to
make! One consulting project I once had was a two-day framing
workshop. The team did really well until we got to the strategic
decisions part of the agenda. They brainstormed a whole list of
strategic decisions. However, when we started drawing trees for each
decision, we found that there were no alternatives! Some were closed
decisions (which we verified via teleconference with the decision
maker). Some of the alternatives were technically not possible. We
exhausted the brainstormed list and did not have one strategic decision
left! So we spent the rest of the workshop planning and laying out an
implementation time line. The framing exercise was still very valuable
for the team, though, as they gained alignment on what they were
going to do and why they were going to do it.
For our career path example, the strategic decision is, “which job
offer should we select among the three offers that are available?”

Tactical decisions (in this context) are decisions that are important, but
they can be addressed later. Again, though, it is important to verify with the
decision maker(s) that you can make these decisions later. Decision analysts
have a saying, “Today’s tactical is tomorrow’s strategic,” as it is very likely
that you are going to have to come back and deal with the tactical decisions
sooner or later. This part of the hierarchy is useful for planning, as you can
note on a time line when the decisions will need to be addressed.
Considering our career-path example, our decision hierarchy might
include:

Closed decisions

We have three possible career alternatives, and all are acceptable.
Our significant other will be involved in the decision process.
We will consider both financial and nonfinancial characteristics of
the alternatives.

Strategic decision

Which position is of the highest interest?

Tactical decisions

Should I purchase a new car or truck?
Should I upgrade my wardrobe?
Which moving company should I choose if we relocate?

Many analysts include a triangle as part of the decision hierarchy. There’s
nothing wrong with including a triangle as you draw the hierarchy, but to
me it doesn’t add insight, so I generally don’t draw it.
Some decision analysts (my friend Paul Wicker, for example) don’t use
the closed/strategic/tactical nomenclature. They put decisions on a time line
showing decisions that:

have already been made,
need to be made now (or in the near future), and
will need to be made later on.

Plotting the decisions out with time on the x-axis can sometimes add
valuable insight for the team. This can be done quite quickly on a large
whiteboard.
Sometimes teams struggle with developing robust alternatives and stay
with small variations on the same alternative. One way to address this is,
after the team thinks they have developed all of the alternatives associated
with a decision, ask, “What would we do if none of the alternatives we have
listed were available?” The team will usually have blank looks for a few
minutes, but somebody will usually suggest an out-of-the-box (perhaps
humorous or sarcastic) alternative (which you write down along with the
other alternatives already suggested). This can get the brainstorming
process going, and you may come up with a creative and powerful
alternative that you otherwise would have missed.

Decision Trees
You’ve probably figured out that one of my favorite tools is the decision
tree. Trees are simple but powerful and can be used in many ways. In a
decision tree:

squares are used for decisions,
circles are used to show uncertainties,
triangles are used to show outcomes (payoffs),
lines are used to show connectivity, and
probabilities are noted underneath the lines.

Figure 4.1 shows an example tree.


In this example, we have an opportunity that we have the option of
pursuing. There’s a chance that the opportunity could fail, in which case we
lose money. If the opportunity succeeds, there is uncertainty as to how
much money we could make.
Figure 4.1 Example decision tree

A decision tree is evaluated (or “rolled back”) by summing the product of
each outcome and its associated probability. This gives the expected value
(EV), which is the value that should be achieved if the “game” could be
played hundreds or thousands of times. The important thing to understand
about EV is that you do not get the EV—as my friend David Skinner says,
“you have bought the whole curve,” meaning that there’s a probability
curve associated with the payoffs and any point on that curve is just as
likely to happen as any other point. We’ll discuss probability curves more in
Chapters 6 and 9.
Figure 4.2 shows the example tree with payoff values and the tree solved,
or rolled back. The EV for each uncertainty and the initial decision are
shown in the boxes. For example, in Figure 4.2 starting from the right, 30%
of $1,500 plus 40% of $800 minus 30% of $100 is $740.
This particular tree was developed using a computer program called
TreeAge Data, which is useful for drawing and solving trees. You can also
use Excel. In this example, the EV is positive, so you would be inclined to
pursue the opportunity, recognizing that (1) the deal may fail, in which case
you lose $250, and (2) even if the deal succeeds, you could lose money (as
shown by the low payoff branch).
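
For readers who want to see the rollback arithmetic spelled out, here is a
minimal sketch in Python. The success-branch payoffs and probabilities come
from Figure 4.2; the probability that the deal succeeds at all is not stated
in the text, so the 70% used below is purely a placeholder assumption.

```python
def expected_value(branches):
    """Sum of probability times payoff over the branches of a chance node."""
    return sum(p * payoff for p, payoff in branches)

# Payoff uncertainty, given the deal succeeds (from Figure 4.2).
ev_success = expected_value([(0.30, 1500), (0.40, 800), (0.30, -100)])
print(ev_success)  # 740.0, matching the text

# Rolling back the initial decision: succeed with the payoff EV above,
# or fail and lose $250. The 70% success probability is an assumption.
ev_pursue = expected_value([(0.70, ev_success), (0.30, -250)])
print(ev_pursue)  # 443.0
```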

Once again considering our career-path example, Figure 4.3 shows a
decision tree that captures the uncertainties and the decision we’re facing.
In this example, we have three job offers to choose from. There’s
uncertainty associated with offers A and B—we don’t know how the
companies will perform in the future, which will affect compensation.
Plus we don’t know how good the fit is with each company, which will
determine how fast or slow promotions and new assignments come.
Offer C represents a start-up company, which could do very well or it
could go bankrupt. If the company goes bankrupt, the alternatives are
to either find another job or go back to grad school. We believe that
there is a 60% chance that company C will succeed and a 40% chance
it will fail.

Trees don’t have to be complete or complicated to yield insight. Consider
the situation I described in Chapter 1 where a team was struggling with
how to integrate potential new technology into an existing program. Here
are the three outcomes I noted:

Figure 4.2 Solved or “rolled back” tree


Figure 4.3 Multiple career path/job offer example

Current technology is adequate to address the needs of the program,
in which case the new equipment is interesting but not necessary to
complete the mission.
Current technology won’t adequately address the needs of the
program, in which case the current program should go on hold until
the new equipment is available.
Current technology addresses part but not all the needs of the
program, in which case a cost/benefit analysis should be done to
determine whether to proceed or wait. We labeled this outcome the
“sort-of” case; the team deemed this the most likely (and least
desirable) outcome.

Figure 4.4 Example decision tree

Figure 4.5 Example skeleton tree

This wasn’t how I drew the tree, though. Figure 4.4 is the tree we used
during the meeting.
The insight was that only in the middle “sort of” case would a decision
have to be made.
Also, you don’t have to completely fill out the tree to gain insight. If you
have several uncertainties, the tree can quickly become a thorn bush with
too many branches. Here you can simply draw the decisions and
uncertainties as unconnected events in their relevant sequencing (Figure
4.5). Decision analysts sometimes refer to this kind of a tree as a “skeleton”
tree.
Trees can also be used as part of the analysis toolkit, so we’ll be talking
about decision trees again later.

Alternatives Tables
If you have more than two or three decisions, it becomes difficult to
represent your problem with trees. If, for example, you have five decisions
with three alternatives for each decision, there would be three to the fifth
power, or 243 possible end nodes (combinations of alternatives) in the tree
without including uncertainty! A tree with this many branches is too large
to add insight.
To handle this, we list each decision across the page and note the
alternatives available for each decision in a column underneath it. The
easiest way to build an alternatives table is to use a whiteboard, or, if you
want to use a projector and work online, use Excel. Note that since we only
have one decision for our career path example, we don’t need an
alternatives table. Therefore, we’ll illustrate the concept with a new
automobile strategy example.
For this example, you are on a team that has been tasked with developing
a completely new vehicle. Starting with a blank whiteboard, you have many
alternatives available, such as shown in Figure 4.6.
Sometimes it is important for the alternatives to be mutually exclusive
and collectively exhaustive, but in this case, you can have more than one
alternative selected to constitute a product line. For example, a high
performance car could come standard with a V8 and have an option for a
V12 (like the BMW 830/840/850 series did).
Once again we can see the value of the objectives hierarchy.
Understanding what our management wants and what our team’s objectives
are will give us insight into which alternatives logically fit together. Once
we start to think about coherent sets of alternatives, we can develop a
strategy table.
Figure 4.6 Automobile alternatives table
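
Before moving on to strategy tables, a small sketch may help make the
bookkeeping concrete. The decisions and alternatives below are illustrative
stand-ins, not the actual contents of Figure 4.6; the count reproduces the
three-to-the-fifth-power arithmetic from earlier in this section.

```python
from math import prod

# Five illustrative decisions, three alternatives each (names assumed).
alternatives_table = {
    "body style": ["coupe", "sedan", "SUV"],
    "engine":     ["I4", "V6", "V8"],
    "seats":      ["2", "5", "7"],
    "price tier": ["economy", "mid-range", "luxury"],
    "market":     ["domestic", "export", "both"],
}

# Raw combinations if every alternative could pair with every other:
print(prod(len(alts) for alts in alternatives_table.values()))  # 243

# A strategic theme picks one internally consistent alternative per
# decision -- the essence of one row in a strategy table.
cheapskate = {"body style": "sedan", "engine": "I4", "seats": "5",
              "price tier": "economy", "market": "domestic"}
assert all(cheapskate[d] in alts for d, alts in alternatives_table.items())
```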

Strategy Tables
Developing a strategy table can be fun. Looking at the decisions and lists of
alternatives in the alternatives table, we brainstorm a list of strategic themes
(e.g., aggressive, economy). Humorous names for the themes can make this
exercise more fun (e.g., “Genghis Khan” for aggressive, “Cheapskate” for
economy). Select the alternative for each decision that is consistent with the
strategic theme. In a team situation, I ask each person on the team to
develop one or two strategies and note their rationale in developing their
strategies. Then we list each strategic theme and quickly go through them
together. Figure 4.7 shows several possible strategies for the automobile
example.
I’ve listed the strategic themes in the left column and then the
alternatives for each of the five decisions in columns to the right. A
“frugality” theme could include a coupe and a sedan. “Haul the family”
could include a minivan, SUV, and station wagon. The strategy table helps
you to formulate strategies that are internally consistent and allows you to
explore your options.
One way to qualitatively evaluate each strategy is to list its pros, cons,
and hunches. Then list the answer to the question, “If our competitors knew
we were planning this strategy, what would they say about it?” This
evaluation can be used to narrow from many strategies to two or three to
carry into analysis. Also, many times a hybrid strategy ends up being the
strongest.

Figure 4.7 Automobile strategy table

Keep It Clear!
It is important to make sure that you and your team are clear on objectives,
alternatives, and uncertainties. I help David Skinner teach the EMBA class
at Rice each year. The class is divided into teams, and each team presents a
comprehensive project summary at the end of the semester. Invariably at
least one team will add confusion to their report by adding objectives and
uncertainties to the alternatives contained within their strategy table. Client
teams I’ve worked with have been tempted to do this too.
An alternative represents a true choice that your organization has the
authority to make. An objective is something that you want—it may be
important and something to strive for, but it is not an alternative. An
uncertainty is something that you cannot control. It may be extremely
important, and it may be related to your alternatives and your objectives,
but it is not a choice.

Let’s put this in the context of our career path example. We have one
decision with three alternatives. We choose which job offer we accept.
Our objectives are noted in the previous chapter—we would like a
good financial package, a good social life, and a good professional
environment. The uncertainty is the extent to which, given the choice
we make, we will achieve our objectives.
The reason that I bring this up is that a strategy table won’t work if you
don’t have clarity around the choices that you actually have.

Implications for the Manager


Decision trees are very powerful brainstorming aids. Many times drawing
the decisions and developing the uncertainties can help resolve
disagreements and add insight. The skeleton trees can be very useful to
show the sequence of decisions and uncertainty resolution.
Decision trees are also very useful for discussing family issues and
gaining consensus as to what the real alternatives are and how they fit
together with what is unknown.
Trees can also help you prepare for a negotiating session. Draw the tree
from your perspective, and then draw it from your counterparty’s
perspective. This can help you to frame your discussion and figure out how
to achieve a win–win negotiation outcome.
CHAPTER 5

Influence Diagrams

What Do We Know?

We’ve discussed our problem and put it into context. We now understand
our objectives and decisions. It is time to consider uncertainty (although the
decision tree does get us started on the uncertainty discussion). A good way
to discuss uncertainty is to use a tool called the influence diagram.
An influence diagram is a graphical representation of a problem, which
shows:

decisions as squares or rectangles,
uncertainties as ovals or circles,
calculated variables as double-lined ovals,
measures-of-value as hexagons,
key constants (e.g., interest rate) as text, and
relationships and conditionality as arrows.

Figure 5.1 illustrates the decision, uncertainty, calculated variable,
relationship, and measure-of-value components, which are used to develop
an influence diagram.
Figure 5.1 Influence diagram components

The influence diagram has two uses:

During framing, the influence diagram shows a consensus view of
uncertainties and decisions (and how they fit together).
The influence diagram is the bridge between framing and analysis.
The person who “runs the numbers” (usually referred to as the
modeler) starts with the influence diagram and feeds back to the
team how the uncertainties were assessed and how the model was
built.

A “generic” influence diagram illustrates how the items fit together
(Figure 5.2).
We have a decision as to whether to implement a project or not.
Associated with the project are sales price and volume, which lead to
revenue (note the double lines in the revenue node). The cost component
(cost of goods sold, or COGS) consists of variable cost (hence the arrow
from the volume node) and fixed cost. Revenue less COGS is pretax profit.
We have to estimate taxes, which depend on profit and depreciation. Capital
costs affect depreciation and cash flow, hence the arrow from capital cost to
our measure-of-value, net present value (NPV). Note that the interest rate is
shown as a constant. Also note that this project can fail, hence the
probability of success node.
Therefore this influence diagram shows that we can estimate NPV if we
can assess six uncertainties (sales price, volume, variable cost, fixed cost,
capital costs, and probability of success).
Figure 5.2 “Generic” influence diagram
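
To show how the diagram maps to arithmetic, here is a minimal deterministic
sketch of the Figure 5.2 logic. Every number, the ten-year horizon, the flat
tax rate, and the straight-line depreciation are assumptions chosen only for
illustration; none of them come from the text.

```python
interest_rate = 0.10    # shown as a constant on the diagram
tax_rate = 0.25         # assumption
years = 10              # assumption
capital_cost = 5_000    # assumption
p_success = 0.80        # probability-of-success node (assumption)

price, volume = 12.0, 1_000              # sales price and volume (assumed)
variable_cost, fixed_cost = 4.0, 2_000   # cost inputs (assumed)

revenue = price * volume
cogs = variable_cost * volume + fixed_cost        # COGS node
pretax_profit = revenue - cogs
depreciation = capital_cost / years               # straight-line (assumption)
taxes = tax_rate * (pretax_profit - depreciation)
annual_cash_flow = pretax_profit - taxes

# NPV of the annual cash flows less the up-front capital, then weighted
# by the chance the project works at all.
npv_if_success = sum(annual_cash_flow / (1 + interest_rate) ** t
                     for t in range(1, years + 1)) - capital_cost
expected_npv = p_success * npv_if_success - (1 - p_success) * capital_cost
print(round(expected_npv))
```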

Most influence diagrams are more complex than this one. Teams starting
out have a tendency to make the influence diagrams too complicated—
including every possible input from an accounting cost sheet. For one of my
first projects, we had a conference room with one entire wall made up of
whiteboard, and we filled it with the influence diagram!
During the framing phase, it is better to keep the influence diagram
relatively simple. Once we start the analysis and assess subject matter
experts, we’ll find that the influence diagram will change anyway, as
experts have their own way of looking at the risks and uncertainties.
Another tip for using the influence diagram is to note the experts you will
talk with to assess the risks and uncertainties on the influence diagram after
you have drawn it. If you are reviewing the project for your decision maker,
showing the experts that you plan to consult will add credibility to your
project if the decision maker agrees with your selection of subject matter
experts. It is also likely that the decision maker will suggest additional
experts that your team had not considered, which will allow you access to
more expertise and increase the credibility of the analysis.

For our career path example (Figure 5.3), we have one strategic
decision: which job offer to accept. There are three measures-of-value:
financial (we’ll use NPV of pay plus benefits as the financial
measure), social, and professional. We’ll discuss how to handle the
non-quantitative measures-of-value (social and professional) later in
the book (hence they are grayed out in Figure 5.3). The uncertainties
associated with NPV include starting salary, raises and promotions,
bonuses, value of benefits, options, and, in the case of offer C,
probability of success for the start-up company.
Another option (not shown to keep the example simple) is to include
an estimate of cost-of-living associated with each job offer. If cost-of-
living differences are significant, you would need to include this
uncertainty as well. Also, I’ve shown taxes in the influence diagram—
you may elect to use NPV of pre-tax cash flow as your measure-of-
value, or you may decide to use after-tax cash flow.

When you consider Figure 5.3, think how valuable it would be to show
the influence diagram to your significant other as you think through the job
offer decision. The decision tree helps us to frame the decision, but the
influence diagram shows the uncertainties that feed into just the financial
part of the decision! It also gives you insight as to how you might approach
doing the calculations to estimate the NPV of the three alternatives, hence
illustrating the value of the influence diagram as a segue into analysis.

Figure 5.3 Career path example influence diagram

Facilitating and Drawing the Influence Diagram


The best way to facilitate the influence diagram is for the team to draw it
together on a large whiteboard. This way, different colors can be used and
discussion can take place around the influencing arrows, uncertainties, and
decisions. I would not recommend trying to do it during the meeting with a
software program, as this is likely to be too slow. Online can work only if
your meeting is just you and another person or two and you can run the
software program relatively quickly. If you’re using a whiteboard, take a
digital picture of the end product so that you can transcribe it to the
computer later.
A chart pad with markers can also work well. The disadvantage of the
chart pad is that you can run out of room rather quickly.
I usually use PowerPoint to transcribe the final influence diagrams, as (1)
PowerPoint includes connecting lines and arrows within its toolset, (2) your
influence diagram is likely to end up in a slide deck anyway, and (3) it is
easy to paste from PowerPoint into Word. Note that Visio and Analytica can
also be used to draw influence diagrams, as can Adobe Illustrator™.
Analytica is a powerful simulation program available from Lumina
Systems.
One caution, though—it can take more time than you think to draw a
clear, clean influence diagram. Another caution—you need to be willing to
make your influence diagram “evergreen.” By that I mean that you need to
keep it up to date as models are built and experts are assessed, as it will
change.

Implications for the Manager


The influence diagram is a very powerful brainstorming tool. When there is
a disagreement between different people on how things fit together,
sometimes drawing the influence diagram on a whiteboard together can
help resolve the situation.
The influence diagram can also include the subject matter experts that
you plan to assess. This is very useful to discuss with management—many
times upper management will have experts that they would like to make
sure that you consult.
CHAPTER 6

Uncertainty Assessment

The Boundary between Known and Unknown

At this point, we have completed framing our problem (or risk or
opportunity). We understand our objectives, the strategic decisions we have
to make, and have started to consider the uncertainties associated with our
situation. Before we can start to estimate NPV (or other measure-of-value),
we have to assess the risks and uncertainties we’ve drawn on our influence
diagram.
The process we use to quantify the state-of-knowledge about
uncertainties associated with our problem is called assessment or expert
assessment. In some cases, we can use historical data to augment our
assessments. The quality of our analysis is completely dependent on the
quality of our assessments. Without good assessments, further modeling or
analysis is best termed “GIGO” (garbage in, garbage out).

Discrete Probability
When hearing or reading the word risk, what comes to your mind? Most
people associate risk with the chance that something bad will happen. Some
decision analysts use the term risk or the term chance when the unknown
outcome has a discrete outcome associated with it: yes or no, success or
failure, zero or one. A coin flip has a discrete outcome with a 50% chance
of success for either side (heads or tails) of the (presumably honest) coin.
Discrete outcomes are always assessed as a single probability (a number
between zero and one, usually quoted as a percentage), never a range.
For example, weather forecasters assign a percentage for the chance of
rain (e.g., 30%). What the percentage means is that, when weather
conditions are like they are now, on average for any given location within
the relevant area, on three days out of ten there will be measurable
precipitation and on seven there will be none. Note that the probability of
rain does not
address how much rain is likely to fall. It could be 0.01 inch or 14 inches!

On our career path influence diagram and in the decision tree, we
noted that the start-up company associated with our third job offer
could go out of business. This has a discrete outcome. Either the
company stays in business or it doesn’t.

Another example of a discrete outcome is the probability of success when
drilling an oil or gas well. Either your well hits recoverable hydrocarbons or
it does not—a “dry hole.”
I occasionally use the terms risk or chance to refer to variables with
either a success or failure outcome, that is, variables with a discrete
outcome associated with them. There are times when I’ll revert back to
generic usage of risk, for example, when we discuss how to do a risk
analysis in Chapter 13.
Also note that for some games of chance, the outcome distribution is
discrete but has more potential states than success or failure. For example, a
six-sided die has six possible discrete outcomes, each with a probability of
100%/6 or 16.67%. Other than games of chance, however, most discrete
variables you will come across will have zero or one, success or failure,
heads or tails, as their possible outcomes.

Continuous Probability (Uncertainty)


Decision analysts use the word uncertainty to characterize an outcome that
is not known and that has a continuous distribution of possible outcomes
associated with it. The continuous distribution of possible outcomes is
assessed as a range, as we’ll discuss.
So far in the book, when discussing decision trees and influence
diagrams, I’ve used the term uncertainty to refer to variables with unknown
outcomes. From here forward, though, we’ll make the distinction between
discrete and continuous variables, the latter of which we’ll refer to as
uncertainties.
When we talked about discrete variables, we noted that the probability of
rain does not address how much rain could fall. There is a continuous
distribution of possible outcomes associated with how much rain we will
get if it does rain. Therefore, how much rain falls is an uncertainty.
With an oil well, there are several uncertainties associated with the
quantity of recoverable reserves: initial production rate, total reserves, gas-
to-oil ratio, and so on. Geologists, geophysicists, and reservoir engineers
break this down even further, considering porosity, permeability, depth-of-
play, viscosity of the hydrocarbon, and so on.
There’s common shorthand for uncertainty assessments that you’ll hear
analysts use—a “10–50–90” range (or “p10–p50–p90”). What is being
referred to is as follows:

A “p10” assessment: only 1 time in 10 similar assessments does the
actual result come out lower than the “p10” assessment. With our
rain example, the p10 might be 0.05 inches—only 1 day in 10 when
conditions are like they are now would we have less than 0.05
inches if we have measurable precipitation.
A “p90” assessment: only 1 time in 10 similar assessments does the
actual result come out higher than the “p90.” Again with the rain
example, the p90 might be 1.5 inches—only 1 day in 10 when
conditions are like they are now would we have more than 1.5
inches if we have measurable precipitation.

A “p50” assessment is the median of the distribution:

Half of the time, the answer will be lower and half of the time the
answer will be higher than the “p50.”
It equals the mean when dealing with uniform or normal
distributions.

With the rain example, the median might be 0.25 inches, again, if we have
measurable precipitation.
I’ve shown these assessments (0.05, 0.25, and 1.5 inches) in tree form
(along with a 30% chance of rain, Figure 6.1).
Figure 6.1 Rain uncertainty

Figure 6.2 Rain uncertainty (rolled back)

Many times as we talk with experts about uncertainty, drawing a tree as
shown in Figure 6.1 on the whiteboard can help communicate what we’re
looking for.
If we “roll back” (solve) the tree, it looks as shown in Figure 6.2. The
expected values are 0.51” if it rains and 0.15” overall considering the
probability of precipitation. Does this mean that we can expect either 0.15”
or 0.51”? No—those numbers reflect what we would expect as an average
over hundreds of days when the weather conditions are like they are today.
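
Here is the same rollback in a few lines of Python. The assessments and the
results are from the text; the 25/50/25 weights on the p10/p50/p90 branches
are my inference, since they reproduce the 0.51" and 0.15" figures, not
something the text states.

```python
p_rain = 0.30
branches = [(0.25, 0.05), (0.50, 0.25), (0.25, 1.5)]  # (weight, inches)

ev_if_rain = sum(w * inches for w, inches in branches)
ev_overall = p_rain * ev_if_rain  # the no-rain branch contributes zero

print(round(ev_if_rain, 2), round(ev_overall, 2))  # 0.51 0.15
```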

Ranges and Distributions


We’ve used the word range in talking about uncertainty. A range is the
distribution of possible outcomes associated with an uncertainty. It is
usually expressed in terms of a “10–50–90” or “p10–p50–p90,” but any
probability can be used. A range can be thought of in terms of a density
function (histogram) or cumulative probability curve (s-curve).
We think in terms of the traditional normal distribution—a bell-shaped
symmetrical distribution with standard deviations extending out from the
mean in both directions. Figure 6.3 shows the histogram and the cumulative
distribution for a normal distribution. The p10, p50, and p90 values are
highlighted. Note that the area under the density function is always equal to
one. Hence, the cumulative probability curve is showing you the area under
the corresponding density curve.

Figure 6.3 Probability distributions

Some people prefer to work with the density function; others prefer the
cumulative probability graph (s-curve). I’ve always worked with the s-
curves; therefore, I will primarily focus on them.
In reality, distributions are seldom normal. Consider the rain quantity
distribution in Figure 6.4—it is clearly not a normal distribution.
If there is measurable precipitation, you can see that the curve flattens out
rather quickly. This makes sense—it is normally quite rare (at least in North
America) to get more than four inches of rain within one day. Of course, if
there is a tropical storm or hurricane, the conditions are different and the
weatherman’s assessment would likely be different than the one shown in
the tree (Figure 6.1).
Referring again to Figure 6.4, the p70 is at about 0.5 inches. This means
that only three times in ten similar situations will rainfall exceed 0.5 inches.
Note that Figure 6.4 assumes that there is measurable precipitation. If we
consider that there’s only a 30% chance of measurable precipitation, the
curve would not start until the 70% point on the y-axis.
Figure 6.4 Rainfall cumulative probability distribution

Many naturally occurring distributions are log normal; however, there are
many different statistical distributions. We’ll talk more about how to deal
with different distributions when we talk about modeling and simulation
later in the book.
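
If you want to see how skew shows up in the percentiles, here is a short
sketch comparing a normal and a lognormal distribution. It uses scipy, and
the parameter values are arbitrary, chosen only for illustration.

```python
from scipy import stats

normal = stats.norm(loc=100, scale=20)      # symmetric, mean 100
lognorm = stats.lognorm(s=0.6, scale=100)   # right-skewed, median 100

for name, dist in [("normal", normal), ("lognormal", lognorm)]:
    p10, p50, p90 = dist.ppf([0.10, 0.50, 0.90])
    print(f"{name:9s} p10={p10:6.1f} p50={p50:6.1f} "
          f"p90={p90:6.1f} mean={dist.mean():6.1f}")

# For the normal, the p50 equals the mean and the p10/p90 sit
# symmetrically around it; for the lognormal, the p90 is much farther
# from the p50 than the p10 is, and the mean exceeds the median.
```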

Assessing Experts: Potential Biases


Before starting, you must be clear on whether you are talking with an expert
about a variable with a discrete outcome or a variable that has a continuous
range of possible outcomes (uncertainty). If you’re not clear on the
distinction, you’ll confuse your expert! Again, discrete variables are
assessed as a percentage, and variables with a continuous range of possible
outcomes are assessed as a range.
The first step in assessing an expert is to talk with the expert about what
you want to do and what you want to assess. As you initially talk with the
expert, you want to proceed in a way that minimizes the bias in his or her
assessments.
Be sure that your units of measure are (1) well defined and (2) consistent
with the expert’s thinking. If you’re assessing an expert on the range of
mileage a vehicle can expect to achieve, you need to know whether the
expert thinks in terms of miles per gallon or kilometers per liter. Don’t make
the expert translate the units—you can do that later!
Anchoring Bias

Anchoring is a common bias. If you start your assessment process by
talking about p50, your p10 and p90 estimates will likely end up (1) too
close together and (2) more of a plus/minus from the p50 than a true
p10/p90 estimate.
Always start with the p10 or p90, then consider the opposite end of the
scale, and finish with the p50.
Anchoring bias is also important to consider when you are negotiating on
price for something. Salesmen will always try to anchor you high; you will
try to anchor the opposite side low. There’s an old saying in negotiations,
“He who floats the first number loses.”

I remember one of my former bosses lamenting a trip they made to
purchase a small company that fit into the business strategy well. After
touring the facilities and examining the financials for a couple of days,
my boss’s boss floated a number. The owner of the business thought
for a few minutes and accepted the offer. They knew at that point
they’d paid too much!

The way to counter this is to do the best job you can ahead of time and
develop an s-curve as to what you think the other side would accept. You
can float the first number as long as you understand where on the s-curve
you are—you want a number that is credible but low (e.g., the p10 or p20
on the curve). This is a very powerful concept.

I once facilitated a negotiation where we assessed and developed the s-
curve for our side and a second one for the other side. When we
finished our work, our lead negotiator was smiling like the “cat who
ate the canary.” We asked him why he was smiling, and he said that he
was sure the other side had not done their homework. Because we had
done ours, they were going to leave money on the table by the end of
the negotiation.

It bears repeating, though—always start with the p10 or p90 and finish
with the p50.
Overconfidence

The most common bias is overconfidence. We think we know more than we
really do.

David Skinner and I use a fun exercise with our classes to illustrate
overconfidence. We call this the “Attila the Hun” exercise. We go
through the explanation of uncertainty, ranges, and distributions. Then
we have five “off the wall” questions (e.g., “What is the year Attila the
Hun died?”). Each person in the class writes down their p10, p50, and
p90 estimate of the answer. Then we tally up the results for the five
questions in four columns:

column A: “the Answer” is less than your p10 estimate,
column B: “the Answer” is between your p10 and p50,
column C: “the Answer” is between your p50 and p90, and
column D: “the Answer” is greater than your p90 estimate.

If true p10s and p90s were captured, about 20% of the answers would be
lower than the p10 estimates or higher than the p90 estimates. This never
happens! Usually about half of the results are in columns A and D.
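
Scoring the exercise is simple enough to sketch in code. The question data
below is invented for illustration; only the Attila question (he died in AD
453) comes from the text.

```python
def column(answer, p10, p50, p90):
    """Classify one answer against a respondent's p10/p50/p90 estimates."""
    if answer < p10:
        return "A"
    if answer < p50:
        return "B"
    if answer <= p90:
        return "C"
    return "D"

# (true answer, respondent's p10, p50, p90); values are illustrative.
responses = [(453, 300, 500, 700),
             (453, 430, 450, 470),
             (453, 100, 200, 300)]

tally = {"A": 0, "B": 0, "C": 0, "D": 0}
for answer, p10, p50, p90 in responses:
    tally[column(answer, p10, p50, p90)] += 1

# Well-calibrated estimates put about 10% in A, 40% in B, 40% in C,
# and 10% in D. Overconfident estimates pile up in columns A and D.
print(tally)
```
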
The way we counter this bias is to ask the question, “Let’s start with the
p10 outcome. What things could happen that could lead to a low outcome
for this uncertainty?” You write down all the items the expert can think of.
Then you reverse the question and ask the expert for things that could
happen that could lead to a high outcome for this uncertainty. Once you
have painted this picture, you ask the expert for their p10 estimate and their
p90 estimate. After you have the p10 and the p90 established, you ask for
the p50.
There’s nothing magic about the p10–p50–p90 points on the distribution.
If the expert thinks differently about the situation, capture the expert’s
thinking. For example, one time I was assessing an expert about equipment
costs, and he said, “I don’t know about a p90, but I can tell you what a p70
number would be.” In his mind, he felt comfortable giving an estimate that
the outcome would be higher about 30% of the time.
Immediacy Bias

Another bias is that of immediacy. Airline ticket sales always drop
immediately after a major air crash. Our perception of airline safety drops
after an incident. If you had assessed experts about the probability of a
major offshore drilling disaster before the Deepwater Horizon incident,
you’d have markedly different assessments than, say, a month after the
incident.

Personal Interest Bias

Yet another potential bias is personal interest. If the expert has a personal
interest in your outcome, they cannot help but be biased. If you are talking
with one of your project engineers about how long it will take to complete
part of a project, the engineer may be tempted to anchor you high and
“sandbag” the task.
These are just a few of the common biases. For further reading on bias, I
recommend Tversky and Kahneman.1

Developing the Assessments


When assessing an expert, I like to use a whiteboard. Start by defining what
the uncertainty is; the influence diagram is a good tool to help the expert
understand how the uncertainty fits into the situation. Many times the
expert will correct your influence diagram as you discuss it, as he or she
will have a deeper understanding of the uncertainties than the original team
did. The expert is likely to need to “decompose” your high level ovals into a
more detailed picture. As with the units-of-measure, you need to stay with
the expert’s view and write down their thinking as you progress. Keep the
influence diagram as an “evergreen” document, so that you can use it to
explain to other team members/stakeholders how the thinking evolves as
you talk with experts and specialists.

Assessing Discrete Probabilities


To assess variables with discrete outcomes, ask the expert for the factors
that could lead to success and write down a summary of their thinking.
Next, ask about factors that could lead to failure, and again, write down a
summary of their thinking. Last, ask for their assessment of the probability
of success. Make sure that you settle on one percentage—this is not a range.

Decision analysts sometimes like to use a probability wheel for these
assessments. This is a circle with two colors where you can adjust the
proportion between the colors until the expert is happy with the
proportion. Then you can read the corresponding probability of
success off the back of the wheel.

Think about weather forecasters—they almost always give you the
rationale behind their probability of precipitation. They’ll explain about
high pressure and low pressure areas moving, fronts coming through, the jet
stream, areas of humid or dry air, and so on.

As an example, in our career path case (Figure 5.3), we have a chance
that one of the companies can fail. This would be assessed as a
percentage (chance of success). To assess an expert on this topic, we
would document the factors that would lead the company to succeed
and factors that could lead it to fail. Then we would ask the expert for
their best judgment as to the probability and would write it down. In
this example, we assess that the company has a 60% chance of
success. This means that if we had a group of very similar companies,
we would expect four out of ten of these companies to fail.

Assessing Variables with Continuous Distributions (Uncertainties)

For each uncertainty, as noted when we discussed anchoring bias, start with
either the p10 or p90. Write down the expert’s view of the kinds of things
that could lead to a p10 outcome. Then switch and write down the things
that could lead to the p90 outcome. Then ask the expert, “Given that a
number of the factors that could lead to a p10 outcome happen, what do you
think the outcome would be?” Write down his or her answer and repeat for
the p90. Draw a graph showing these points on the whiteboard. When you
and the expert are satisfied that you have captured his or her state-of-
knowledge, ask what the expert thinks the p50 outcome would be. This is
the outcome where the expert thinks there’s an equal chance of the outcome
ending up being higher or lower than the estimate. Plotting this point gives
you an idea of the expert’s thinking about the distribution. For example, if
the p50 is much closer to the p10 than to the p90, the distribution is skewed
toward the low side, but there’s a small chance of a high outcome.
It is important to write down the basis behind the expert’s thinking.
Sometimes other managers will not agree with the numbers, probably
because of their own experience or they have a different expert. Having
written down the factors leading toward the p10 and p90, the people who
disagree can discuss the factors. It is possible that the original expert did not
consider additional factors that might be important, in which case you can
work with the experts to update the assessment. In my experience, as true
experts discuss the factors, they will almost always build on each other’s
thinking and end up with a better result. Again, document
the subsequent discussions—you may have to iterate several times, but the
assessments should improve in quality as you iterate.

In the career example, we would assess the low, medium, and high
outcomes (i.e., p10/50/90 outcomes) of our potential wages, salary,
bonuses, and benefits for each of the three companies. We are likely to
have very similar factors that would lead to high outcomes or low
outcomes for these uncertainties. The expert may want to consider
them collectively or individually. As noted previously, we write down
the factors that could result in a high (p90) outcome, write down
factors that could lead to a low (p10) outcome, and then write down
the assessments.
In the career path example, we have offers in-hand, so we know the
salary and current benefits. What is not known are future raises,
bonuses, and changes to benefits. This is what we have to assess. For
example, for a given company, future raises could be assessed at 0%,
2%, or 5% per year. Bonuses could be assessed at 0%, 5%, and 15%.
Note that these distributions are not normal distributions—we’ll
discuss distributions again in Chapters 8 and 9.
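
As a sketch of where those assessments lead, the fragment below turns the
raise and bonus ranges from the text (0%/2%/5% and 0%/5%/15%) into ten-year
compensation totals. The $80,000 starting salary, the ten-year horizon, the
undiscounted totals, and the pairing of low-with-low and high-with-high are
all illustrative assumptions.

```python
starting_salary = 80_000   # assumption
years = 10                 # assumption

# (annual raise, annual bonus) per scenario; the rates are from the text,
# but pairing them into joint scenarios is a simplifying assumption.
scenarios = {"p10": (0.00, 0.00), "p50": (0.02, 0.05), "p90": (0.05, 0.15)}

for label, (raise_rate, bonus_rate) in scenarios.items():
    total = sum(starting_salary * (1 + raise_rate) ** t * (1 + bonus_rate)
                for t in range(years))
    print(f"{label}: total nominal compensation = ${total:,.0f}")
```
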
A Special Case: Price
What about the selling price of a product? Is it a known (constant)? Is it a
decision that a company makes? Or is it uncertain? Traditional economics,
of course, notes the price/volume relationship—as you lower a product’s
price, you should see a corresponding increase in sales volume, and vice
versa.
It sounds trite, but my answer to the previous questions is “yes!” Price
can be a constant, it can be a decision, and it is definitely uncertain. For
commodities, of course, price is a true uncertainty, set by traders in
commodity exchanges (e.g., the price of oil, gold, and wheat). However, for
products that are not commodities, I think the correct way to look at price
includes the following:

1. We set a target list price for a product. This is a decision that is likely
based on judgment, experience, market assessment, and (hopefully)
results from a good DA model!
2. Market response to our list price is uncertain. The volume of sales we
will achieve from a specific list price is an uncertainty and should be
represented by a range.
3. The level of discounting off the list price we will have to utilize in
order to stay in business is another uncertainty (with a few exceptions
such as some of Apple’s products, nobody pays list price for anything
today!).
4. The length of time before we have to lower our list price in order to
respond to competitive pressure is another uncertainty. Or, we may
have to eventually raise prices to cover costs if we’re in a period of
rising commodity prices.
5. Ability to sustain a price increase due to increasing commodity prices
is uncertain. Think about the airlines—if fuel prices go up, one airline
will raise their prices and hope the others follow. If the others do not
follow, they’ll have to lower their prices back if they want to keep the
same number of customers.
6. Planned obsolescence is another decision or set of decisions. Consider
electronic and computer products—new models are introduced all too
frequently. This gives the manufacturers an opportunity to reset the list
price and customer expectations with each introduction (if they so
desire). The length of time to develop and produce the next edition of
the product is another uncertainty (your best critical path plan usually
represents somewhere between a p01 and a p05 of the actual start-up
date—we’ll talk more about this later).

Figure 6.5 Price and revenue influence diagram

Sounds complicated, doesn’t it? It gets worse—you have to also consider
the time horizon of monitoring sales volume. If you monitor, say, the
number of units you are selling on too short a time frame, there is so much
variation in the data that you can’t see trends or gain insight. If you use too
large a time frame, first, by the time you analyze your data you’ll be too late
to make adjustments, and second, you are unlikely to see meaningful trends.
The best way is to monitor several time frames until you figure out what
gives you the most insight (e.g., hourly—daily—weekly—monthly—
quarterly—yearly).
Here is a simple influence diagram (Figure 6.5) that considers revenue
generation resulting from a list price decision. You can see the power of the
influence diagram—it makes the previous complex discussion clear.
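
A few lines of code can also make the revenue logic concrete. The demand
and discount ranges below are invented, and the 25/50/25 weighting (the same
discretization used for the rain tree) plus the independence between volume
and discount are simplifying assumptions.

```python
list_price = 100.0  # the decision (illustrative value)

# Uncertain market response: units sold at this list price (assumed).
volume = (600, 1_000, 1_300)     # p10, p50, p90
# Uncertain discounting off the list price (assumed).
discount = (0.05, 0.12, 0.25)    # p10, p50, p90

def ev(p10, p50, p90):
    """Rough expected value using 25/50/25 weights on a p10/p50/p90 range."""
    return 0.25 * p10 + 0.50 * p50 + 0.25 * p90

# Treating volume and discount as independent, so expected values multiply.
expected_revenue = ev(*volume) * list_price * (1 - ev(*discount))
print(round(expected_revenue))  # about 84,338 with these inputs
```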

Implications for the Manager


Even if you are not approaching a problem from a full DA standpoint,
understanding and documenting assessments and using the underlying
thought processes are important. Ignoring bias is a pathway to disaster!
You will be talking with experts and specialists frequently. As you
discuss problems and opportunities and write down the basis for expert
thinking, this will help you (1) get better answers from your experts and (2)
demonstrate to them that you’re listening to them and taking their input
seriously.
On the other side of the scale, if you suspect that personal interest bias is
present (i.e., you’re being sand-bagged), asking the question, “What are the
things that have to happen to achieve the p90 (or the p10) outcome?” can
help to trigger more objective (and possibly even creative) thinking.
Just asking the question “What kinds of things could cause this project to
miss the target date?” and recording the experts’ answers can lead to more
realistic thinking about how long your project will take.
The most important consideration, though, is to keep your models and
assessments updated and use them as part of your management process.
This will give you more power and insight to explore and implement
responses as the business environment changes.
CHAPTER 7

Building a Deterministic Model

Time to Run the Numbers

We have talked with our experts and updated our influence diagram
accordingly. It is now time to start building a deterministic financial model.
By deterministic, I mean a model that provides one “answer” (or set of
answers, e.g., NPV and internal rate of return of the project) and does not
address uncertainty (yet). Each uncertainty is set to its base value. Once we
have built and validated a deterministic model, we’ll use it to explore which
uncertainties are important (Chapter 8) and then update it for simulation
(Chapter 9). We will also use the deterministic model to populate the end
nodes of the decision tree(s) so that the trees can be rolled back as we
showed in Figure 4.2.
I’m assuming that you use Excel and can find your way around the
program, as most managers develop budgets, projections, and forecasts (or
check work done by somebody on their staff). If you went to business
school, you likely built (or your team built) many financial models to
analyze cases. Therefore, we’ll focus on how to build good models that are
clear, concise, reasonably efficient, and adaptable to probabilistic simulation.
We’ll start with some information about Excel, the terminology we’re
going to use throughout the rest of the book, and some helpful hints on
developing good models.
There is a caveat, though, about this chapter. If you don’t use Excel in
your job, presumably you have support staff who provide this capability. In
this case, I’d suggest skimming this chapter quickly from a viewpoint of
thinking about how you can maintain quality control with respect to the
work that Excel is used to accomplish. You should stop skimming, though,
and start reading with the section titled “Model Validation and Quality
Control” as this is important. I used Excel to prepare many of the tables and
charts, and a reader who is skilled in using Excel can create similar models,
tables, and charts by applying the principles and techniques discussed in this
book. Therefore I am not providing step-by-step instructions for doing this.

About Excel
Excel has an interesting history. Those of us who were early into PC usage
well remember that the dominant spreadsheet program was Lotus 1–2–3.
Lotus 1–2–3 built on an idea first implemented with a program called
“VisiCalc,” which was written for the Apple II and ported to the
Commodore. VisiCalc was a significant breakthrough in computing and was
one of the programs that turned PCs into something useful for business.
Lotus 1–2–3 was an amazing program, both in terms of functionality and
efficient coding—it would run on an IBM PC that contained only 128K of
memory! Most of us used it for economic, financial, and engineering
computations well into the 1990s. However, Lotus was slow to port the
program to the Mac, which left a gap.
In 1985, Microsoft filled the gap with a new program written for the Mac
called Excel. By 1987, the first Windows version was released. A major
upgrade came with Excel 95, developed to correspond with Windows 95.
Visual Basic for Applications (VBA) was introduced with Excel 5.0 in 1993 and matured with the Excel 97 version
for Windows and the Excel 98 version for the Mac. By the mid-1990s, Lotus
1–2–3 was rarely used. Subsequent changes have added marginally valuable
features and rearranged the interface into the “ribbon” format.
For a good discussion on Excel’s history (and many other useful items), I
refer you to John Walkenbach’s website http://www.spreadsheet-page.com.
Walkenbach has written over 50 books about Excel and one of his books1
was the most useful VBA tutorial I have found.

Terminology
This may sound dogmatic, but some terminology is appropriate at this point.
Try to avoid the term spreadsheet because the word is ambiguous. Does
“spreadsheet” refer to a page or tab in a file, the file itself, or the
application? I recommend the following terminology (which is consistent
with Microsoft usage):

Use "workbook" (or "workbook file") to describe one complete Excel file.
This file contains worksheets, charts, and macros (i.e., Visual
Basic programs).
The name of the workbook is the name of your Excel file
(e.g., 2018forecast.xls).
A “worksheet” is an individual page in the workbook.
The name of the worksheet is found at the tab on the bottom
of the page. Some refer to the worksheets as “tabs.”
“Charts” are what Excel calls graphs.
Charts can be embedded within a worksheet or a standalone
page.
Charts can be linked to information in the worksheet or
developed from calculations.
A “model” is a workbook that is designed to give you an answer to a
physical or financial problem.
A “cell” is a single square within a worksheet. It contains content
that you type in (a formula, a value, or a label). It may also contain
references to other cells.
A “range” is any cell or group of cells (e.g., “A2” refers to the cell
located in column A and row 2).
An “equation” is how you tell Excel to perform specific calculations
of interest to you.

The $ Operator
The dollar sign operator contained within a cell or range reference (within an
equation) is very important, as it fixes the column or row when you copy and
paste or drag to copy a cell or range within the worksheet. For example:
If you copy (or drag) a reference to the cell “A2” to the right by one
column and down one row, the reference changes in the new cells to
“B2” and “B3,” respectively.
If you copy a reference to "A$2" across one column and down one row, the reference changes to "B$2" in both of the new cells.
If you copy a reference to “$A2” across one column and down one
row, the reference changes to “$A2” and “$A3” for the new cells,
respectively.
If you copy a reference to “$A$2” across one column and down one
row, the reference does not change, but remains “$A$2.”

This operator is very powerful, but it is really easy to get mixed up on fixing the rows or the columns.
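
As a quick illustration, suppose (hypothetically) that row 1 holds several candidate raise percentages, column A holds base salaries, and you want a table projecting each salary under each raise. Enter the following formula in cell B2 and drag it across and down:

=$A2*(1+B$1)

The $A fixes the salary column while the row floats, and the $1 fixes the raise row while the column floats, so a single formula fills the entire table correctly.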

Range Names
We defined a “range” as any cell or group of cells. The reason ranges are
important is because you can name them. To name a range, you can either:

Highlight the cell or group of cells you want to name and then type
the name you want to use into the “Name Box” (upper left part of
your screen) and hit “enter” (“return”), or
From the menu, choose “Insert,” then “Name,” and then “Define,”
which brings up a dialog box where you can specify the name you
want to use and the range to which you want to refer. For the newer
"ribbon" versions of Excel, this is accessed from the Formulas tab, then
Define Name.

The method described in the second bullet point is how you can edit a
range name once you have created it. Also, using the second method, you
can create a constant that is not in any of the worksheets. You can apply the
range name to just one worksheet; however, normal practice is to use the
same range names throughout the workbook file.
Range names are important because they (1) speed up formula entry, (2)
make debugging and usage by other people much easier, and (3) will help
you remember what you did if you come back later (potentially years later)
to update your file. For example, if you have a series of cells constituting
sales contained in the cells B2 through B13, the formula “=sum(sales)” is
easier to quickly understand than “=sum(B2:B13).”
One caution, though—you don’t want to use a range name for every
possible variable in your workbook. If you have too many range names, they
become confusing and lose their utility.
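
For example (the names here are hypothetical), if you name cell B1 "TaxRate" and the range B2:B13 "Sales," the formula

=SUM(Sales)*(1-TaxRate)

reads almost like a sentence, whereas the equivalent =SUM(B2:B13)*(1-B1) forces the reader to chase down what B1 and B2:B13 contain.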

Styles
The Microsoft Office suite allows you to define Styles for files produced by
each application. Like range names, styles can add clarity within Excel—you
can use them to define consistent fonts, colors, shading, number formats, and
lines for a specific type of cells. For example, I always create a style called
“InputBox” that I use for each input within the workbook (the newer
versions of Excel contain a preformatted style for Inputs). This way, places
to enter numbers are clear on the worksheet page.
However, like range names, you want to use a few styles and keep their
significance clear.

When I was consulting, a client gave me one of their models to work on that was very slow in recalculating. I started looking at it and found that
it had over 1400 styles being used! I wrote a VBA (Visual Basic for
Applications) macro to delete styles and added back in about 10 that
were actually useful. The model performed correctly after that
adjustment.
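
For the curious, here is a minimal sketch of the kind of clean-up macro I mean (this version deletes every custom style in the active workbook, so save a copy first and re-create the handful of styles you actually want):

Sub RemoveCustomStyles()
    Dim i As Long
    ' Walk the style collection backward so deletions don't shift the index
    For i = ActiveWorkbook.Styles.Count To 1 Step -1
        If Not ActiveWorkbook.Styles(i).BuiltIn Then
            ActiveWorkbook.Styles(i).Delete
        End If
    Next i
End Sub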

Model Architecture
Whether you have a very simple model designed to complete one calculation
or have a very large model containing all the economics for a mega-project,
the principles of good model architecture are the same. This section contains
some key principles of good model architecture.
Modules
Organize your work into modules, with inputs in one module, calculations in
one module, and outputs (reports) and graphs in a third module. If you have
a large model, you’d normally use one worksheet (tab) for the input
modules. There might be several calculation worksheets and several output
reports. If your workbook is simple, your input module might be small (e.g.,
three columns and five rows), with the calculation module below or to the
right.

Hard-wired Inputs
It is important to never “hard-wire” an input into a formula! Always use
separate input cells for each input, including (and especially) constants. A
workbook file with hard-wired inputs is very likely to propagate errors, as:

another person (or you after time has elapsed) won’t know what the
inputs are or where they came from,
hard-wired inputs may have incorrect units of measure, and
if the input changes, it is unlikely you’ll find all the places in the
workbook file where it was used.

Within the input module, leave space for probabilistic inputs (p10–p50–
p90 ranges), as we’ll be adding the assessments after the deterministic model
has been built.
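
For example, compare =C10*1.3 (what is 1.3, and where else does it appear?) with =C10*OverheadFactor, where OverheadFactor is a hypothetical name for a labeled input cell. If the overhead factor changes, you update one cell instead of hunting through every formula in the workbook.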

Units
Make sure that you label each input and put in a separate column for units.
Therefore, an input module needs at least three columns: labels, units, and
values.

Range Names
As noted previously, careful use of range names makes workbook debugging
easier and makes your file more useful later on, both for yourself and for
others.

Formulas Within a Row


Most calculation sections in a financial model are in the form of a time
series, where each column represents a month, or a quarter, or (most
common) a year. It is important not to change formulas as you progress
across the time series, as you’re likely to forget that you’ve done so and
you’ll generate errors as you copy, paste, edit, extend, and update your work.
Sometimes the first column in your calculation section has to be different,
as it represents a start-up year. The second column and all other columns
have the same equation, which is a function of the result from the preceding
column. In this case, change the style of the number in the first column so
that it is clear that it is different. I usually use italics and a different text color
to emphasize the difference.
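
As a sketch (using hypothetical range names), if the start-up salary sits in cell E5, then cell F5 holds =E5*(1+RaiseRate) and is copied across the remaining years. Every cell after the first has exactly the same formula, so one glance tells you the row is consistent.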

Quick and Dirty

It is a great temptation sometimes to short-cut these principles. However, it doesn't take much more time to set your files up correctly, which will make
everything that you do more useful later. If you use range names, keep your
inputs within a module and record their labels, values, and units, and avoid
hard-wired inputs, you can build files that can be adapted for new problems
in the future.

Version Control, Documentation, and Security

If you are building a model that you (1) may need to use for a long time, (2)
plan to share with others, or (3) will use infrequently in the future, add a
worksheet (or section within a worksheet) for version control and
documentation. Use this as a log to track the changes that you make and why
you are making them. Excel workbook files can sometimes become corrupt,
in which case you are likely to have to go back to an earlier version and
reconstruct your changes. If you have to come back a year or two later and
update the logic in the model, it will be much easier if you have a history of
how you built it.
The best way to monitor for file corruption is to track the size of the file.
If a 700 kB workbook file suddenly takes up 10 MB, it has almost certainly become corrupt, and you need to delete it and go back to the 700 kB version. The easiest way to cope
with this is save your work each day with a new date or version number
incorporated into the file name (e.g., MyModelRev21.xls or
MyModel28OCT17.xls).
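
If you want to automate the dated copies, a minimal VBA sketch (the file name and extension here are illustrative) is:

Sub SaveDatedCopy()
    ' Save a dated backup beside the working file, e.g., MyModel28OCT17.xls
    ActiveWorkbook.SaveCopyAs ActiveWorkbook.Path & "\MyModel" & _
        UCase(Format(Date, "ddmmmyy")) & ".xls"
End Sub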
Excel does allow requiring a password to open the file. This can help you
with securing proprietary or confidential information. However, this can be
very problematic if you have to come back to the same model several years
later and cannot remember the password. Also, be sure to back up your work
somewhere other than your computer’s hard drive, which can potentially fail
at any point in time. I use Dropbox for this purpose, as it is free (up to a
certain point), secure, and works on most computers, smart phones, and
tablet devices.

Large Model Architecture


As noted already, with large models you should have separate worksheets for
inputs, calculations, and reports. You may want to add a separate page for
controls, so that you can conveniently change between cases and
alternatives. For large models, the principles for version control and
documentation are essential.

Model Validation and Quality Control


Several professors and consulting firms have published studies of auditing
Excel files and finding significant errors in the logic and resultant
calculations. In fact, there is an organization that is dedicated to combating
spreadsheet errors, http://www.eusprig.org/. This website has a tab for horror
stories that document the significant negative effects that spreadsheet errors
have on companies, government agencies, and educational institutions.
Until the mid-1980s, engineers and scientists had been trained to use slide
rules, which were good for calculations, but did not carry the decimal with
the calculation. The person had to keep track of the decimals involved in the
calculation themselves. Hence, engineers and scientists developed a feel for
what the results of their calculations should be and went back and checked if
the answer didn’t look right. They would use a mainframe computer
(remember the IBM 360?) for advanced calculations, but would still check
the results by hand with a slide rule. Accountants relied on their adding
machines, and again, checked results coming out of the mainframe—
especially numbers that didn’t look right. With the transition to PCs, we have
sped up our ability to model immeasurably, but we have lost that feel for
the numbers. Before PCs we would frequently ask a peer to check our work
or to independently develop an answer so that we could make sure that we
were correct.
When an Excel workbook file is being used for something important, the
best way I know to achieve quality control is to (1) follow the practices
we’ve been talking about, and (2) have a competent, independent person
build a verification model from scratch without having seen the original
model or calculation result (this may be a good time to call an experienced
consultant—a few days of consulting help might help you avoid a
multimillion dollar mistake!). Provide the independent person with a table of
inputs and a list of what outputs are desired. I’ve done this several times,
both as the “validatee” and the “validator,” and the results (for large models)
have never matched upon first comparison! There are always errors, even
with experienced and meticulous modelers.

Interest Rate and NPV

We have to decide what interest rate we’re going to use for NPV. Interest
rate is an interesting consideration. For a company, the correct interest rate to
use would be the weighted average cost of capital (WACC). For an
individual, the correct interest rate would be the interest the individual can
achieve with their surplus funds. For many years, decision analysts have
used 10% as a default interest rate. However, since the onset of the Great Recession in 2007–2008, 10% is probably too high.
As a consultant, I’ve observed several errors in specifying the interest
rate. Sometimes managers will use different interest rates for different
projects that they are comparing. This destroys clarity. The proper way to
compare the relative risk associated with different projects is to compare
their s-curves. I’ve even seen people change interest rates for different parts
of their model.
The problem with setting an interest rate too high is that doing this can over-penalize projects with later payouts. This is especially important in the
post-2008 recession era of low interest rate monetary policy. And, if you
delete your long-term payout projects, after a few years, you won’t have any
projects!
However, if your interest rate is too low, you may end up implementing
weak projects that should be deleted from your portfolio.
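
One Excel-specific caution while we are on the subject of NPV: Excel's NPV function assumes that the first cash flow in the range occurs one full period in the future. If your model has a time-zero investment, the usual pattern is something like =C5+NPV(IntRate,D5:J5), where C5 holds the year-zero cash flow and IntRate is a named input cell (the cell addresses are illustrative). Feeding the year-zero flow into NPV itself discounts it by one extra period and misstates the result.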

Example Deterministic Model


We’ll use our career path influence diagram as our example deterministic
model (see Figure 7.1).
This influence diagram helps us get started, but it also illustrates the point
that influence diagrams are only a starting point for building a model. As we
dig in, we find that we’re going to have to deal with more detail.
We’ll input the p10/50/90 ranges as part of our deterministic model and
will use the p50 values as the base to calculate a base estimate of NPV. We
have global variables, which apply equally to each of the three job offers,
and uncertainties that are unique to each job offer. Considering our inputs:

1. Global variable, interest rate: For this analysis I've used 5%, as this represents a compromise: it is lower than what we would pay to borrow money (assuming a good credit rating) and probably higher than we could achieve with a CD or money market fund.
2. Global variable, time of analysis: We have to decide what our time
frame of interest is. For this example, I’ve used 6 years. If we look at a
shorter time frame, we may not capture a good picture of the total
uncertainty associated with our decision. If we look at a longer time
frame, the accuracy of our inputs and assessments likely diminishes
significantly. We always have the option of increasing the time frame to
see if it changes our decision.
Figure 7.1 Career path influence diagram
3. Starting salary: In this scenario, we have job offers in hand, so we
know what the starting salaries are for each offer.
4. Raises and bonuses: In this scenario, the raises and bonuses are difficult
to assess. We are likely to be able to gain anecdotal evidence of past
practices as we interview people within each company. We can also
investigate websites to see if we can gain insight. The health of the
basic economy is an obvious factor affecting the raises that companies
give out. Also, the degree to which the individual succeeds in the job
and performs well should affect both.
I’ve modeled the raises and bonuses as two uncertainties for all the
years. This probably overstates the effect of this uncertainty. The most
likely scenario is that the raises received each year depend on the health of the company and the overall economy, and that each year's raise is largely independent of the previous year's. When we look at simulation,
we can further explore this to see if it changes our decision.
5. Benefits: Most companies report total value of their benefit packages,
including 401(k) or similar programs. The 401(k) program is usually
spelled out, although sometimes the company contribution can depend
on whether the company is profitable. In our model, we do have some
uncertainty in the total benefits package, as it might be increased or
reduced depending on the economy and other factors. Note that we take
care not to double count the 401(k) contribution.
6. Taxes: Taxes are uncertain and can be assessed by examining historical
tax rates and considering the political environment and the state taxes
associated with each job offer. We could use NPV of pre-tax cash flow
as our measure-of-value, but this would not allow good accounting for
the effect of the 401(k) and state income tax. Taxes are not a global
variable due to the effect of state income tax.
7. Measure-of-value: After-tax NPV is our measure-of-value. An
alternative measure-of-value could be total compensation (wages,
salaries, and benefits).

Figure 7.2 is a screenshot of our initial deterministic model. Several features of the model bear pointing out:

Note the distinct input style. This makes it easy to see where the
inputs should be entered.
Note the distinct output style used for NPV.
I usually start in cell B2. This leaves an extra row and column, which
makes it easier to add something above or to the left of the work if
needed.
Note that the rows for the inputs correspond to where they are used.
This makes it easier to copy and paste and it also makes debugging
easier.
Since we’re using after-tax NPV as the measure-of-value, we have to
calculate gross income and pre-tax net income. Therefore, the 401(k)
is computed and subtracted to calculate taxes but is added back in to compute the total (a sketch of this calculation chain follows this list). Interest and appreciation of the 401(k) is not
included in this model, as it is highly uncertain and not that relevant
over the six-year time horizon.
Note that the year 2013 is italicized. This is because it is equal to the
start year; subsequent years just add one to the value in cell E2. The
italics serve as a reminder that column E differs from the other
columns and would not be included in using copy and paste if we
decide to add more years to our model.
Note I have rounded to the nearest whole dollar—adding decimals makes the numbers more difficult to read. I like to use the red font
with parentheses for negative numbers (this is a preference)—it
makes subtractions really stand out.
There is a balance between breaking down the calculations into steps
and combining into what you want. For example, I could have
calculated pre-tax net income directly, leaving out rows 8 and 9.
However, (1) we need the 401(k) results as a component of the after-
tax total anyway, and (2) it is easier to understand and troubleshoot
the calculation with rows 8 and 9 visible.

Figure 7.2 Career example—initial deterministic model screenshot

We can use an identical structure of inputs and calculations for both Alternatives A and B—only the input values differ between these alternatives. Alternative C, however, is more complex, and we won't model it until we discuss simulation.
There is a global input section for variables that are used for all three
alternatives.
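
To make the calculation chain concrete, here is a minimal sketch of one year's column (the names are hypothetical placeholders for the model's cell references, and the split between salary, bonus, and benefits follows the double-counting caution in the input list above):

Gross income    = Salary + Bonus + Benefits
401(k)          = Salary * K401Pct
Pre-tax net     = Gross income - 401(k)
Taxes           = Pre-tax net * TaxRate
After-tax total = Pre-tax net - Taxes + 401(k)

The after-tax totals for each year are then discounted at the interest rate to produce NPV.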

Implications for the Manager


As we’ve discussed, Excel is a powerful and useful tool, but poor practices
can lead to significant errors that in turn can lead to bad decisions. It is
especially tempting for time-pressed managers to use Excel to get a “quick
and dirty” answer and ignore good model-building principles. The other
important consideration is quality control of models used for making
decisions. A peer review of the model is a good way to make sure that the
inputs, transformation, and outputs within the model are what they are
supposed to be. Having an independent person develop a high-level model in
parallel with the real one is also useful for quality assurance.
As a manager, quality control of any models your organization uses for
business decisions is extremely important but difficult to achieve. The
techniques discussed in this chapter are a good first step toward achieving
this goal.
CHAPTER 8

Tornado Diagrams

Figuring Out What Is Important


We now have a working deterministic model that gives us a single-point
estimate of our measure-of-value. The next tool we’ll look at is called a
tornado diagram or tornado graph, which is designed to show which
uncertainties are important and which are less important. This is especially
valuable if people you are working with disagree on the assessments you
have—if the uncertainty is low on the tornado, it isn’t worth more time to
discuss the disagreement. But if it is near the top of the tornado, you’ll have
to take the time to work through it.

For example, with one new product line I worked on, the owner was
very concerned about reusing abandoned equipment from a former
project that had not worked out commercially. When we completed the
tornado diagram, it showed that other uncertainties, such as success
with market and business development, had a much higher impact on
NPV than success or failure with reuse of the old equipment. We
reassigned the engineers that were working on the equipment reuse
issue to business development as a result of the analysis.

In addition to showing us which uncertainties are important, there's a tactical benefit to developing the tornado graph. As you develop the data
for the graph, you’ll be switching inputs between their base, p10, and p90
values (as we’ll see later). If your model isn’t working properly, you’re
likely to discover that either your model doesn’t work at all with some of
the inputs or you’ll develop some outputs that just don’t look right. This
will give you an opportunity to correct your deterministic model.
Another important point—don’t make decisions between alternatives
based on the tornado graphs (that is not their purpose). The cumulative
probability graphs we’ll discuss in the next chapter are used to compare
alternatives.

How to Develop a Tornado Graph


To develop a tornado graph, we complete the following steps:

Make sure that each uncertainty is set to its base value (p50). Input
each uncertainty’s p10 and p90, one at a time, into our deterministic
model and record the resulting measure-of-value (usually we use
NPV as our measure-of-value for the tornado graph).
In a similar fashion, input each discrete variable as success and
failure, one at a time, into the deterministic model and record the
result.
Compute the absolute value of the difference between the p90 and
p10 resulting measure-of-value for each uncertainty.
Compute the absolute value of the difference between the success
and failure cases for each discrete variable.
Order the differences from maximum to minimum.
Plot the results using a horizontal stacked bar graph.

This sounds complicated, but an example will illustrate how to do this. Figure 8.1 shows the p10/p50/p90 values for our first job offer.
We plug the p10 of raises, 0%, into our model instead of 2%. This results
in an NPV of $357,105 instead of the base value of $374,388. We then plug
5% into the model. The resulting NPV is $402,034. Then we take the
difference between the two results and record $44,929 for the raises
uncertainty (see Figure 8.2). Then we put 2% back into the model and verify that we once again get the base NPV of $374,388. This step is important as it
provides a quality check that we still have the correct inputs being used by
the model. You can see how it is important not to hard-wire inputs into your
model!
Figure 8.1 Offer A uncertainties

We repeat these steps for the remaining three uncertainties and record the
results and the differences as shown in Figure 8.2. Using these numbers, we
can construct a tornado graph for Offer A as shown in Figure 8.3.
Developing the graph in Figure 8.3 involves using a few tricks in Excel:

Using the data in Figure 8.2, insert a horizontal stacked bar graph.
Use the low values as a “ghost” data series—the bars will show up
to the left of those in Figure 8.3.

Figure 8.2 Offer A calculation results ($ values shown are NPVs)

Figure 8.3 Offer A tornado graph


Set the second data series to the difference between the low and high
values.
Set the labels to the names of your uncertainties.
Set the value at which the vertical axis crosses the horizontal axis to the base value ($374,388).
Add the p10/p90 values by hand by (1) adding data labels to the
second series and (2) manually replacing the values and moving the
text fields to the proper place. Note that the tax rate is reversed
relative to the other uncertainties in Figure 8.3—as the tax rate
increases, NPV decreases.
After the graph is complete, set the ghost data series fill and line to
“no fill” and “no line” (you want to do this last, as you can’t see to
select it after the fill and line are the same color as the background).
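
The helper table behind this ghost-bar technique might look like the following sketch (the raises row uses the numbers from Figure 8.2; the column headings are mine):

Uncertainty    Ghost (=MIN(low, high))    Bar (=ABS(high - low))
Raises         $357,105                   $44,929
...

The ghost column positions the left edge of each visible bar and the bar column supplies its length; hiding the ghost series leaves only the floating bars of the tornado.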

The other way to develop a tornado graph in Excel is to use the clustered
bar graph and then adjust the overlap for each data series.
The tornado shows us which uncertainties are most and least important. In the Offer A example, the results match what we would expect—raises are the most
important uncertainty, followed by bonus, tax rate, and finally benefits.
If you are going to compare graphs (we’ll develop the Offer B and C
graphs next), you’ll have to set the maximum and minimum values for the
x-axis manually so that they all have the same scale. Also, note that the
tornado is more powerful than a traditional ±10% sensitivity graph as it
uses actual distributions rather than an arbitrary ± quantity.
In a similar fashion, we can develop the Offer B graph (Figure 8.4).
There are commercially available packages that will develop tornado
graphs, which allow you to develop the graph more quickly than directly
from Excel. Crystal Ball a simulation program available from Oracle (that
we will discuss in the next chapter), has the capability of generating tornado
graphs. Figure 8.5 was developed using an Excel add-in. Note that the
effect of the variables (upside vs. downside) is shown by the different
colored bars in Figure 8.5. This can be done directly from Excel by using
the horizontal stacked bar graph with three series: a “ghost” (unseen) series,
a downside series, and an upside series (take care to get the downsides and
upsides correct).
Figure 8.4 Offer B tornado graph

Figure 8.5 Offer A tornado graph showing upside and downside

One important limitation of the tornado diagram is that it doesn't show the effect of correlation between the uncertainties. In our career example,
the raises and bonuses are likely correlated with the state of the overall
economy, the health of our industry, and our success in the job. We could
redraw the influence diagram to reflect this. However, it is more practical to
assess the uncertainties that we are using for this problem. If our example is
a launch/no launch decision for a billion dollar product, we would use more
fundamental uncertainties.
The Offer C tornado graph is a little more complicated. If you remember
the situation in Chapters 4 and 5 (see Figure 4.3 for the decision tree),
Offers A and B are from traditional large companies with good benefits and
a track record of good management. Offer C is from a start-up company,
which could be very successful. However, there’s a 40% chance the
company could fail. If the company fails, the plan is to try to get another job
(we assess this probability as 50%). If we can’t get another job, we go back
to grad school. Therefore, we have to add the chance of Company C failing
and the chance of not being able to get another job. Figure 8.6 shows the
Offer C discrete variables and uncertainties.
We have to extend the model to account for the probability of success for
Company C. Using these inputs, we can develop the tornado graph for Offer
C, which is significantly different from those for Offers A and B (see Figure
8.7). The failure case assumes that the mitigation (finding a new job) is also
unsuccessful—we don’t have any way of examining these success or failure
events one at a time versus the base because the probability of mitigation is
irrelevant unless Company C fails.

Figure 8.6 Offer C uncertainties and discrete variables


Figure 8.7 Offer C tornado graph

Figure 8.7, when compared with Figures 8.3 and 8.4, has a completely
different profile. There’s a lot of upside potential, but there is also a
significant financial penalty if Company C fails. Knowing the potential
impact of Company C failure, if we selected this job offer, it would be
prudent to monitor Company C’s progress carefully and to spend time
networking to increase the chances of getting another job if Company C
fails. Understanding the potential upside if Company C succeeds would
likely be a powerful motivator!

Sometimes variations on this approach can add insight. For example, one project I worked on included several different oil and gas fields of
varying size and composition. The owner wanted to know which fields
were most important from an uncertainty viewpoint. I used the model
to calculate NPV of the total project with each field missing and
recorded the result. Then I put each field to its p10 GOR (gas-to-oil
ratio) and to its p10 of BOE (barrel-of-oil equivalent) and recorded the
result. Then I moved the GOR up to the p50, then the BOE to the p50,
then the GOR up to the p90, and so on. Then I calculated the
difference between the p90/p90 cases and the missing field cases and
rank ordered the fields in order of importance.
In this case, for one of the fields (Source C), the NPV for the total
project was lower with the p10/p10 cases than it was for the missing
field case. This means that it would be better to leave the field out of
the project if you knew it was going to come in at the p10/p10 levels.
The final plot showed the effect on project NPV as each field moved
up from the field missing case to increasing reserves and GOR. The
owner was not surprised that Source A was his most important
resource, but as Source B was planned to be developed last, he was
surprised that it had a large effect on NPV (which was using a 10%
discount rate). As a result of this work, the owner decided to re-
examine the timing on Source B and do more appraisal work on
Source C.

Figure 8.8 Portfolio of oil and gas resources

This tornado (Figure 8.8) had to be tipped on its side to show all the
probability combinations for each resource.

Implications for the Manager


A tornado diagram is a useful picture of the relative effects of uncertainties
upon the measure-of-value (usually NPV). Knowing which uncertainties are
important allows you to focus your attention on what is important. Many
times experts will disagree—sometimes passionately—about the ranges that
should be associated with your uncertainties. If—using the widest possible
assessments between the p10 and p90—the uncertainty is low on the
tornado, this can help the experts to more quickly agree on how the
uncertainty should be approached.
The first few times you develop a tornado graph in Excel will take some
time, but after you’ve done two or three, you can set them up reasonably
quickly. And, as we noted earlier, completing the tornado graph can prove
valuable in making sure that your model is working correctly.
CHAPTER 9

Cumulative Probability

Looking at the Range of Outcomes

Our next step is to develop a cumulative probability graph (sometimes called an s-curve) for our decision. This will show us the full range of
outcomes given our assessed inputs. No single outcome on the tree is more
or less likely than any other point—my friend David Skinner expresses this
as, “You’ve bought the whole curve.” Comparing the curves for each
alternative usually gives us insight as to which alternative best fits our
appetite for risk.
There are two ways to develop the cumulative probability graph. We can:

run each case in our decision tree after we have assessments for
p10/50/90s for each uncertainty and then solve (roll back) the tree,
or
run a Monte Carlo simulation.

If we have more than two or three uncertainties, the tree method can be
very laborious. There are times when developing and solving the tree as I
outline can yield insight, but most of the time decision analysts prefer to use
Monte Carlo simulation because it can be set up and run in less time than
developing the full decision tree.
A note about measures-of-value is appropriate at this point. I recommend
using NPV as the simulation measure-of-value. This is especially valuable
in comparing projects.
The internal rate of return (IRR) is a popular measure-of-value. However, (1) the IRR function in Excel can return an error message or a misleading result if your project's cash flow goes negative again after the initial investment years (e.g., if you have to make another investment partway through the project's life or the sales price drops partway through the project's life), and
(2) NPV s-curves can give a good picture of the uncertainty associated with
your projects.

Solving the Decision Tree


Decision analysts are divided as to whether solving the decision tree or
running a Monte Carlo simulation is the better way to develop an s-curve.
Both methods work. If you have time, using both methods has the
advantage of providing a step of quality assurance versus an input or
computational error.
We’ll start with solving the decision tree. To solve the tree, we:

build out the tree using each uncertainty with corresponding p10/50/90 branches,
compute the measure-of-value (usually NPV) and probability for
each end point branch,
build a table of these results,
sort from low to high NPV,
calculate the cumulative probability of each branch, and
plot the cumulative probability versus the sorted values using the x/y
graph in Excel.

The best way to understand how to solve the tree is by example. We’ll
use the tree we developed in Chapter 4 for the job offer example, but
updated to include uncertainties. With Offer A, we have four uncertainties
with three outcomes (a p10, p50, and p90) for each.
A simple decision tree can become a complex decision bush with this
approach—we have 3⁴, or 81, possible outcomes for just Offer A! Offer C is
even more complex. Figure 9.1 shows a skeleton tree for Offer A—the
entire tree is too large to display in the book.
This brings up an important use of the tornado graph—if we’re going
to solve the decision tree to develop the cumulative probability graph,
we should use only the top three or four uncertainties, as the ones
lower in the tornado don’t affect the outcome materially.

Figure 9.1 Skeleton tree, offer A

Rather than plug and chug through 81 different outcomes for Offer A and
another 81 different outcomes for Offer B (and hope that I record the values
with no typographical errors), I reduced the formula for NPV to one line
and then mapped every possible combination of p10/50/90s. Once the
formula works correctly in one row, it works for all the rows and can be
copied and pasted. I use the deterministic model to verify that my “one-
line” calculations are correct. Figure 9.2 illustrates this concept for the first
part of the full decision tree for Offer A. The data are not yet sorted in
Figure 9.2. Note that I include a sequence number so that I can sort and re-
sort if necessary. Also note that a quality control check on your probabilities
is that the sum of the probabilities of all the branches (“probability”
column) has to add up to 1.0—if it doesn’t, you have a math error
somewhere.
A comment on integration of the input distributions is needed at this
point. We’re using the p10/p50/p90 distributions to approximate an entire
distribution of inputs—therefore we use an approximation for the
probability associated with each branch of the tree. For Figure 9.2, I used
the extended Swanson-McGill weighting, which uses 0.30, 0.40, and 0.30
as the probabilities associated with the p10, p50, and p90 values,
respectively.1 An alternative weighting system uses 0.25, 0.50, and 0.25 for
the p10, p50, and p90 values, respectively.2 This method is referred to as
the McNamee–Celona method. For a more thorough discussion of
encoding uncertainties, see Skinner.3 I tend to use the 0.30/0.40/0.30 values,
as they are more appropriate when there’s skew in the input distribution
(which is usually the case).
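
For example, under the 0.30/0.40/0.30 weighting, the branch of the Offer A tree that takes the p10 of raises, the p50 of bonus, the p50 of tax rate, and the p90 of benefits carries a probability of 0.30 × 0.40 × 0.40 × 0.30 = 0.0144, and the 81 branch probabilities computed this way sum to 1.0, which is the quality check noted above.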
As an aside, with Excel, it is sometimes possible to map all possible
combinations. I once completed a shipping fleet sizing analysis where
we had proposals for two different types of ships from several
shipyards. We were buying seven ships, so I mapped each possible
combination of type of ship and shipyard sourcing. The measures of
value were capital cost and cargo hauling capacity (i.e., capacity
versus cost).
Figure 9.2 Table used to develop cumulative probability graph
There were thousands of combinations, but this is not a problem
with Excel. Once the first formula was developed, I generated a table
similar to Figure 9.2. We were then able to pick a few combinations of
interest and examine those combinations more closely. We’ll discuss
this technique more when we discuss multiattribute decision analysis
and portfolio analysis.

It can be quite time-consuming to develop the first "one-line" formula so that it works correctly. For example, the formula for the first year, which is
used in the NPV calculation, in Figure 9.2 is very long, and it took me quite
a bit of time to get it to work.
After sorting the data from low to high NPV, we can develop the
cumulative probability graph, which is shown in Figure 9.3 for both A and
B. Each of the small points on the graph represents one outcome branch of
the decision tree.
Across all ranges of probability, Offer B dominates Offer A. The only
reason that Offer A would ever be taken is if some other attribute overrode
our economic interest (which we’ll talk about in Chapter 11). And, if we did
select Offer A, we now have a good picture of the value that we’re giving
up in exchange for whatever attribute is more important.
As Offer C involves an even larger tree to solve, we’re going to use
simulation to develop its s-curve, which is the next topic.

Using Simulation to Develop S-curves


Two widely used commercial products for running Monte Carlo (random number) simulations are @Risk and Crystal Ball.
(Analytica can also be used to develop s-curves; however, it does not
depend as much on Excel.) These programs use your Excel model as a
calculation engine, and you specify the input distributions for each
uncertainty that you want to use and then specify the measures of value that
you want to examine. Once you’ve run the simulation, you can extract the
data and develop a graph similar to the one in Figure 9.3. As these products
have been extensively written about elsewhere,1 I’m not going to provide
step-by-step instructions on how to operate the software.
Figure 9.3 Cumulative probability curve, offers A and B

The advantage of using one of the commercial programs is that you can
set up the simulation much more quickly than you can solve the tree (as
discussed in the previous section). The disadvantage is that the software is a
“black box” and it can be very difficult to debug problems if your results
are not correct.
I prefer to build the logic for the distributions into the workbook model
itself; that way I can debug the calculations and logic as I build the model. I
use Excel’s “RAND()” function to generate random numbers, which are
then used to select p10/50/90 values for the uncertainties and
success/failure cases for the discrete variables. I’d advise you to avoid the
LOOKUP functions, as they are really slow. Set calculation in Excel to
manual and use F9 to watch how the numbers change as you randomly
select each iteration.
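
Here is a minimal sketch of what I mean (the cell and range names are hypothetical). Put =RAND() in a helper cell named RaiseDraw, then select the raise for each iteration with:

=IF(RaiseDraw<0.3,Raise_p10,IF(RaiseDraw<0.7,Raise_p50,Raise_p90))

This reproduces the 0.30/0.40/0.30 Swanson-McGill weighting discussed earlier. A discrete variable works the same way; for example, =IF(RAND()<0.6,1,0) returns 1 for the 60% chance that Company C succeeds and 0 otherwise.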

The ease-of-use associated with commercial programs can lead to going too fast and making mistakes. Early in my experience with decision analysis, I had two projects for which I used the Monte Carlo simulator to develop s-curves. In both projects, the outcome was
nowhere near the s-curve that I’d generated. In one case, I had done a
careful job of assessing a project leader on costs, but both the leader
and I missed a factor that caused a significant project overrun.
On the other project, the engineer and I both missed a regulatory
risk: the State wrote a new law that completely negated our key
assessments. In fairness to Monte Carlo simulation, the problems were
not with the simulation but with the inputs. But these two experiences
made me slow down and work more carefully. It is so easy to develop
a quick Monte Carlo and get a curve that we can be tempted to go too
fast and miss something important.

After completing and testing the model, I will sometimes use Crystal Ball
as a “player,” that is, I don’t use the input distribution capability but do use
the software to track the measures-of-value. However, it is also easy to set
up a macro that computes a value and writes it for each iteration and then
use the PERCENTILE function on the data to develop the s-curve. To do
this, set up a table with percentages from 1% to 99% (in any increment that
you want) in one column and then use the PERCENTILE function on the
next column (using the first column and a range containing the output that
your macro generated) to compute the values corresponding to each
percentage. Then use Excel to plot cumulative probability versus value.
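
A minimal sketch of such a macro and the follow-up formula (the sheet name "Output" and the range name "NPV" are hypothetical):

Sub RunIterations()
    Dim i As Long
    ' Calculation stays in manual, per the workflow described above
    Application.Calculation = xlCalculationManual
    For i = 1 To 1000
        Application.Calculate   ' draws fresh RAND() values for this iteration
        ' Record this iteration's measure-of-value in column A of the Output sheet
        Worksheets("Output").Cells(i + 1, 1).Value = Range("NPV").Value
    Next i
End Sub

With the 1,000 results in Output!A2:A1001 and the percentages 1% through 99% in a column, =PERCENTILE(Output!$A$2:$A$1001,B2), where B2 holds one of the percentages, returns the NPV at that cumulative probability, ready to plot.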
For our example, the simulation works really well to capture the
uncertainty around Company C going out of business—we have a 60%
chance of success with a 50% chance of mitigation via obtaining a new job.
We also have an uncertainty on which year the company fails (if it fails)—it
could be as early as year 3, most likely year 4, or it could be all the way out
to the end of our time frame, year 6. When we run the Monte Carlo
simulator and plot the results, Figure 9.4 is the result.
As expected from Offer C’s tornado, there’s significant upside and
significant downside relative to the other two offers.

Every fall for the past several years, I give a lecture about DA at UT
Austin for my friend Tom Sciance’s Engineering Economics class.
Every year Dr. Sciance has between 20 and 40 hardy souls who get up
early to make his very useful 8 AM class! I use Figure 9.4 in my
lecture, and after explaining cumulative probability and the scenario of
multiple job offers, I ask for a show of hands as to which job offer they
prefer. Usually two or three students opt for Offer B, and the rest of the
class picks Offer C. We talk about their rationale for their selection.
Figure 9.4 Cumulative probability, offers A, B, and C

For several years, I gave “lunch and learns” with a very similar
lecture at Chevron. When I got to Figure 9.4 and solicited a show of
hands for job preference, one or two of the participants would choose
Offer C, but without fail, the rest of the class would opt for Offer B.
However, in 2008, when I was at UT, the class did not pick Offer C!
Almost everybody in the class wanted Offer B. This illustrates how our
preferences and opinions can be affected by external events. The
students’ perspective on upside potential versus downside risk changed
significantly as a result of the Great Recession. In 2009 and 2010, the
class still preferred Offer B. By 2011, the percentage was beginning to
shift back toward Offer C, but still not at pre-recession levels.

As I mentioned earlier, one advantage of using both Monte Carlo simulation and solving the tree is that it gives you a quality check on your work. The
graphs should be similar, but won’t necessarily be exactly on top of each
other. For Offers A and B, Figure 9.5 shows a comparison of the two
methods. The Monte Carlo is affected by the choices I made for
representing the p10/50/90 input distributions, which may not capture the
downside quite as well as solving the decision tree. (I did use triangular
distributions on several inputs, which won't have quite as much downside.)
Figure 9.5 Tree solution versus Monte Carlo solution

There is a newer simulation technique developed by professor and author Sam Savage called "SIPMath," where SIP stands for stochastic
information packets. This technique uses Excel’s data tables and
predetermined sets of random numbers. This allows simulations to be done
almost instantly and has the potential to allow enterprise-wide consistency
from analysis to analysis. This is an advanced DA topic; however, I
mention it here because some readers may find it to be a topic of interest.
The Probability Management website (ProbabilityManagement.org) has free tutorials and examples
available.

Implications for the Manager


Some major companies require a cumulative probability graph for every
major capital project, as the senior executives want to see the upside
potential and the downside risk for their projects. This practice gives
everybody associated with a project a common language and understanding
about probability and what uncertainties are driving potential outcomes.

One time almost two decades ago, David Skinner and I completed a
major business strategy analysis for a major business unit within a
large corporation. This was the business unit’s first experience with
decision analysis. The managers had a “momentum” (i.e., existing)
business strategy; however, as a result of the framing and analysis a
more powerful strategy was identified and analyzed. When we showed
the leader of the business unit a figure that was similar to Figure 9.4
except that there was less downside associated with the more powerful
strategy than with Offer C, he looked at the graph for a few minutes
and then looked at his staff and said, “It looks like we are changing
gears.” The rest of the meeting focused on planning for the change in
strategy, which ultimately was successful.

The bottom line is that s-curves are a powerful tool, and you can make
better business and personal decisions if you use them. However, be aware
that short-cutting the assessment process can be very dangerous, as you’ll
think your s-curve is better than it really is!
1 See Skinner, Chapter 9, and Leach, Chapter 4 for detailed discussion on how to set up simulations
using Crystal Ball.
CHAPTER 10

Value of Information

How Much Is It Worth to Know?

Once we have built our decision tree(s) and our model, we can explore the
value of information and the value of control. The traditional way decision
analysts explain these concepts is to relate:

the value of information to a "real" clairvoyant or oracle (who can answer a question but will charge us for the information), and
the value of control to a real wizard who can perform one specific
miracle.

The key concept behind both of these tools is that there is no value of
information or control unless you are willing to change your decision. In
other words, if you have already made up your mind concerning your
decision, more information has no value!

Consider the alarm lights and gauges in your car. They’re designed to
help you make good decisions and take appropriate action. However, if
you’re not willing to act on the information, the cost of these systems is
wasted. One guitar player I used to play music with drove a classic
Mustang. He was driving home one night when the red “check engine”
light came on. Rather than stop and call for assistance, he decided to
just keep driving. Within a few miles, the engine failed catastrophically.
I asked him why he did that, and he replied, “I decided that I wasn’t
going to let a warning light affect what I wanted to do.”
Probably the most significant commercial use of the value of information
tool is in oil and gas exploration. When you’re searching for recoverable
hydrocarbons and then trying to size processing equipment correctly, it is
important to know when to keep obtaining more data (seismic, seismic
reprocessing, and appraisal wells) and when to stop.

Estimating the Value of Information


To estimate the value of information, we first complete the analysis as
already discussed, and then we re-draw the decision tree with the risk or
uncertainty (for example, success or failure of the company) as a known quantity before we have to make the decision.
As an example, we’ll use the chance of Company C going out of business
in our job offer situation. Using the results from the simulation, we can fill in
the p10/50/90 outcomes for each branch of Figure 4.3, which is shown in
Figure 10.1.
If we roll back (i.e., solve) the decision tree, we obtain estimates of
expected value (EV). Again, there's nothing "expected" about expected value
(we’ve bought the whole s-curve), but it does allow us to compare cases:

Offer A EV = $381,000
Offer B EV = $411,200
Offer C EV = $384,880, but…
Offer C EV = $436,200 if Company C is successful,
Offer C EV = $307,900 if Company C fails,
Offer C EV = $396,100 if Company C fails and another job is found
(assumes the job is equivalent to Offer A), and
Offer C EV = $219,700 if Company C fails and another job can’t be
found.

This is another way of thinking about the decision, but the s-curves
(Figure 9.4) present more clarity as to what the alternatives imply financially
than do the expected values!
There are two discrete variables in Figure 10.1 for which we would be
interested in knowing the value of information: Company C’s success or
failure and our ability to find another job if Company C fails. We’ll begin
with Company C.

Figure 10.1 Job offer decision tree rolled back

Figure 10.2 shows the decision tree with knowing whether Company C
succeeds or fails before making the job offer decision.
Value of information is calculated by taking the EV of the offers with the
information and subtracting the value of the offers without the information.
From Figure 10.2, the EV with the information is $426,200 (note that
Company C’s success or failure is still uncertain; hence, the EV calculation
accounts for both branches of the first chance node). The strategy of choice
without information is Offer B, which has an EV of $411,200. Therefore, the
value of information is $426,200 minus $411,200, or $15,000.
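Writing out that first chance node makes the arithmetic explicit: with perfect foreknowledge we would take Offer C when Company C will succeed and Offer B when it will fail, so the EV with information is 0.6 × $436,200 + 0.4 × $411,200 = $426,200.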
As this number assumes that the information is inerrant, it represents the
upper limit of what you should pay for information concerning Company C’s
success or failure.
In a similar fashion, we can estimate the value of knowing whether a job
could be obtained in the event of Company C’s failure (or not). Here we put
the uncertainty of whether we can find another job upon Company C’s
failure ahead of the decision to select among the three offers (Figure 10.3).
In this case, the value of information is $415,680 minus $411,200 or
$4,480. This represents the upper limit of what you should be willing to pay
for information concerning the probability of obtaining another job if
Company C fails.
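Again, writing out the arithmetic: if we knew another job could be found, Offer C would be worth 0.6 × $436,200 + 0.4 × $396,100 = $420,160, which beats Offer B; if we knew it could not, Offer B's $411,200 would be the best choice. The EV with information is therefore 0.5 × $420,160 + 0.5 × $411,200 = $415,680.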
It is important to note that we’ve been estimating the value of perfect
information. There are methods to estimate the value of imperfect
information, but that is beyond the scope of this book. Suffice it to say that
the value of imperfect information is always less than what we’ve been
estimating in this chapter—perfect information.

Value of Control
The value of control is similar to value of information, but here we can
control the uncertainty to get the outcome that we want. Instead of an all-
knowing clairvoyant, we have access to an all-powerful wizard who can
control an outcome for us. My friend David Skinner always tells his classes,
“The wizard is more powerful than the clairvoyant.” (David even has a
wizard hat that he sometimes wears to make this point!)
Figure 10.2 Value of information, company C’s success or failure
Figure 10.3 Value of information, find another job

There are three areas where the value of control is useful:

understanding an upper limit for how much to spend to develop technology to mitigate an uncertainty (how much would you be willing to pay a "real" wizard?),
getting an idea of how much would be appropriate to spend hedging
an uncertainty, and
using the concept to brainstorm out-of-the-box ideas on how an
important uncertainty could be hedged or mitigated.

Calculating the value of control is easier than estimating the value of information. It is the difference between the expected value of the controlled case and the expected value of the uncontrolled case.
With the job offer example, the value of the branch with Offer C is
$436,200 if Company C is successful (see Figure 10.1). This represents an
increase of $25,000 over $411,200 for Offer B, which would be the expected
value choice.
If Company C is traded publicly, it might be possible to accept Offer C but
allocate some funds for shorting the stock if you feel that the company may
not succeed (assuming that you are not making your hedging decision based
on insider information, of course). This would be a way to hedge your
choice. The value of control would set an upper limit on how much you
would want to spend for hedging.

Implications for the Manager


The key point from this chapter is that if the decision has already been made,
there’s no value in further analysis, studies, tests, or data. Only if you are
willing to change your mind is there value in more information.
The value of control lets you know the maximum that you would pay to
hedge an uncertainty. It also lets you know the most that you would spend to fund technology that, once developed, could mitigate a chance event or control an uncertainty.
When David Skinner and I teach DA together, if we have time, we
include a fun and interesting exercise around a game of chance with
difficult-to-calculate odds, an auction, and a prize of significant but
uncertain value. This exercise allows us to introduce decision trees, risk
using real money (we auction off the right to play the game and keep
the money from the winner), probability, expected value, certain
equivalent, and value of information in a real-time setting. Near the end
of the exercise, we have a volunteer complete the random part of the
deal (but keep it secret). To illustrate the value of information, we allow
another volunteer to see the outcome (this person gets to be a “real”
clairvoyant). We then let the winner of the auction for the chance to
play the game negotiate a price for the information. This is an
interesting negotiation to watch. Sometimes the auction winner feels
lucky and will only offer a small price (again, they have to pay with
their own money) for the information. Sometimes the clairvoyant gets
greedy and wants so much money for the information that the auction
winner gives up and doesn’t buy the information.
In an executive MBA class a couple of years ago, the “real”
clairvoyant wouldn’t negotiate. She told the auction winner, “I’ll just
give you the information.” Over a couple of decades of playing this
game, neither David nor I had ever seen anybody do this. When we
asked our clairvoyant about her logic, she replied, “My relationship
with the people in here is much more important to me than money. I’m
glad to be able to help him win the prize.”
I was genuinely impressed by our clairvoyant’s understanding of her
own objectives hierarchy, especially under the pressure of a game
situation in front of her peers in an MBA program where almost the
entire curriculum focuses on making money. This young lady never lost
sight of what was important to her.

If I could leave you with only one thing from this book, it would be for
you to understand your own objectives hierarchy. All the other tools we’re
discussing are predicated on understanding what you (and your significant
stakeholders) want and the trade-offs associated with those objectives.
CHAPTER 11

Multiattribute Decision Analysis

There’s More to Life than Money

Many decisions can be made on the basis of a financial measure-of-value (expected NPV, IRR); however, for some decisions, financial objectives are not sufficient. These are referred to as multiattribute decisions.

My definition: multiattribute decision analysis is appropriately
considering non-financial objectives and their associated trade-offs,
which leads to clarity for the decision maker.

Multiattribute DA addresses how to make decisions where there are
multiple factors (attributes) affecting the decision. In this chapter, we’ll
discuss how to appropriately incorporate subjective factors and opinions
into decision making.
The decision analysis community does not agree on how to handle
multiattribute decision analysis. Some analysts believe that all decision
criteria can be reduced to money. Others go to the opposite extreme and
treat the financial measure-of-value as just another attribute. The key is
using the tools in a way that adds clarity for the decision maker, and this
involves both financial (objective) and subjective considerations.
Consider American Idol, Dancing with the Stars, or the Miss America
contest—people can make decisions very quickly using subjective
judgment—they know what they like. Consider going out to bid for a prime
contractor or key piece of complex equipment—companies sometimes
accept the lowest bid, but many times they consider other factors. Consider
buying a house or a vehicle—rarely would you only consider price when
making this decision. You probably wouldn’t take another job for more
money if the company’s culture, the job’s location, your interest in the
work, and professional development were inconsistent with your objectives.
Some consultants and analysts ignore financial analysis and focus solely
on the subjective. Some people won’t buy a house or a car without a complex
rating/weighting/scoring system, with economics added in as just another
attribute to be ranked, weighted, and scored. There is a natural tendency to
use complexity as an attempt to add credibility and validity to what is
subjective. Using a complicated ranking, weighting, and scoring system
when buying a house or a car may give you the illusion that you’re adding
precision and science to your decision making, but at the end of the day,
you’re likely to buy the car or the house that you want (if you can afford
it!). The bottom line is that both economics and subjective fit with our
objectives are important.
I call the subjective assessment beauty. My friend Frank Koch, a retired
Chevron analyst, calls it “net present happiness.” It is subjective. It varies
from person to person. You can’t prove your assessment of beauty—you
can explain it, but another person may see things differently.
There are three overriding principles that should be applied to any
multiattribute decision problem:

Keep the subjective part of the analysis as clear and simple as possible.
Use the objectives hierarchy to decide what attributes to score.
Add clarity by showing the decision maker how much subjective
benefit (beauty) can be gained as more money is spent (or given up).
In the vernacular, this is “bang for the buck.”

Going through our career example will illustrate how this works.

Multiattribute Example
When we developed our objectives hierarchy for our job offer example
(Figure 3.2 placed here as Figure 11.1),1 we had three fundamental
objectives: financial, professional, and social. We’ve analyzed our three
alternative job offers quite thoroughly from a financial perspective in the
previous chapters. What we’re going to do now is score each job offer
subjectively with respect to the professional and social fundamental
objectives.

Figure 11.1 Objectives hierarchy, multiple job offers example

Referring to Figure 11.1, when scoring the professional objective, we’d
want to consider how challenging the job would be, how well it fits in with
our individual skills and interests, and how collaborative the company’s
culture is.
For social, we’d consider locations, proximity to friends and family, the
working environment, and (again) how collaborative the company’s culture
is.
We can use any scale we want to score the professional and social
objectives:

Some people like to score from 1 to 10.
Some people like to score from 1 to 4 (e.g., four stars).
Amazon and Netflix let users rate books and products with 1 to 5
stars. Yelp.com also uses a 1 to 5 stars rating system for restaurants
and service providers.

It doesn’t matter which scale we use—it is a preference. I used 1 to 10 in
our example. Here is where many people want to add weights, asking,
“Which is more important to you, professional or social?” My tongue-in-
cheek answer to that question is “Yes,” meaning that both objectives are
essential. Adding weights to the attributes seldom adds clarity. Keep it
simple—just add the scores and divide by 2 so that we maintain our “1 to
10” mindset. If it is really important to show multiple objectives’ scores,
you can do this with a “stock market” graph, and I’ll show an example of
this later in the chapter (Figure 11.5).
The following table shows our scoring for the three job offers:

          Professional   Social   Subjective Score
Job A          9           10          9.5
Job B          8            8          8
Job C          8            4          6
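A minimal sketch of the scoring arithmetic, using the scores from the table above:

```python
# "Add the scores and divide by 2" for the three job offers, keeping the
# 1-to-10 mindset. Scores are the ones from the table above.
scores = {"Job A": (9, 10), "Job B": (8, 8), "Job C": (8, 4)}

for job, (professional, social) in scores.items():
    subjective = (professional + social) / 2
    print(f"{job}: subjective score = {subjective}")  # 9.5, 8.0, 6.0
```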

You may recall that the tax treatment for Job C was different—this was
because Job A and Job B were both in our hometown state, which has
higher taxes than Job C, for which we have to move a significant distance
away from friends and family. Hence, Job C scores lower on social. Job A
scores highest on social because their company culture is very collaborative
and professional, and we like their locations; Company A is regarded as one
of the best companies in the world to work for. Company B is similar but
not rated as high on locations.
Then we plot the average of the subjective scores versus expected value
of NPV, which gives us a picture of “beauty” versus financial value (Figure
11.2).
When I show this graph to the students in Austin who picked Job C, I ask
if any of them wants to change their selection. Usually a small percentage
of the students would switch to Job B, but most of the students would stay
with Job C for the upside potential. Speaking of which, let’s add the
p10/50/90 values for each job offer to the graph and see if this presents our
alternatives more clearly (Figure 11.3).
Offer C’s flat s-curve shows up in Figure 11.3 as having off-the-scale
upside and downside potential. You can also see that the p50 value from the
simulation is not the expected value (shown by the circular data point being
offset from the cross mark).
When I’m facilitating an attribute scoring session, I develop headline
bullet points for the rationale of how each person would score an objective.
This is not a novel concept—when a judge on Dancing with the Stars gives
a score, he or she gives a succinct rationale for why they liked or did not
like the performance. By documenting the thinking, we add transparency
and clarity for the decision maker. The decision maker may not agree with
the rationale, but if we document it, he or she can understand it. Just
because it is a subjective judgment doesn’t mean that it is irrational.

Figure 11.2 Career path example, subjective score (“beauty”) versus
financial potential

Figure 11.3 Subjective score versus p10/50/90 and expected value

Purchasing Decisions
Many decisions can be made solely on financial measures,
including large capital project and product development decisions (in fact, a
project undertaken with the justification that it is “strategic” is likely a poor
use of company money). Merger and acquisition decisions can usually be made
solely on financial measures; however, company culture is a very important
factor in whether a merger or acquisition succeeds or fails. Delta Airlines
and Chevron, for example, have been generally successful addressing
corporate culture as they have acquired companies.
The most common multiattribute decisions most of us will face (whether
business or personal) are purchasing decisions. Normally as you spend
more money, you can achieve more of your subjective benefits (“beauty”).
A curve drawn through the highest “beauty” scores associated with each
financial cost defines what is called the efficient frontier. The points along
this curve are the best that you can achieve at any given cost (see Figure
11.4).
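A sketch of the frontier logic, with hypothetical (cost, beauty) pairs standing in for real alternatives:

```python
# Efficient frontier sketch: keep an alternative only if no cheaper (or
# equal-cost) alternative has a higher beauty score. The pairs are
# hypothetical (cost in dollars, beauty on a 1-to-10 scale).
alternatives = {
    "A": (250_000, 9.5),
    "B": (210_000, 8.0),
    "C": (180_000, 8.5),
    "D": (150_000, 5.0),
}

def efficient_frontier(alts):
    # Walk the alternatives in order of increasing cost; an alternative is
    # on the frontier if it beats the beauty of everything cheaper.
    frontier, best_beauty = [], float("-inf")
    for name, (cost, beauty) in sorted(alts.items(), key=lambda kv: kv[1][0]):
        if beauty > best_beauty:
            frontier.append(name)
            best_beauty = beauty
    return frontier

print(efficient_frontier(alternatives))  # ['D', 'C', 'A']
```

Anything not on the frontier is dominated: some cheaper alternative scores at least as well.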
Sometimes, though, you get lucky and your best subjective match is the
lowest (or near the lowest) cost. I had two business situations where this
happened:

My friend Doug Quillen and I were working with a joint venture (JV)
team and were chartered to find office space for a team of 20–30
people. We looked at about 35 office properties for lease in downtown
Houston. Our subjective objectives included convenience to mass
transit, how easy or difficult it would be to build out the property to fit
our needs, and how well or poorly the lease terms, especially timing,
fit our requirements.
Figure 11.4 Typical purchasing situations

We were lucky—the least expensive property we found fit our subjective
requirements the best. We were also fortunate that our subjective evaluation
was correct—the JV team’s experience with our office suite was positive
and we developed positive relationships with the landlord and the building
staff.
A few months later, the joint venture’s board of directors instructed us to
put in an independent financial accounting system (the team had been
piggybacking on one of our owners’ systems). We worked with a consultant
and drew up detailed specifications and requirements (especially
considering controls and delegation of authority) and went out for bids in a
two-step process. We went out with a request for proposals to a wide range
of potential providers and interviewed each respondent. Based on the
interviews and initial responses, we narrowed the field and updated our
specifications. Our subjective objectives were:

the potential provider’s proposal fit our requirements out-of-the-box
(that is, no custom programming required),
the reputation and financial strength of the software publisher(s),
and
the long-term viability and cultural fit with the system implementing
company (referred to as a “value-added reseller”).

We were lucky again—the best match for our needs was among the
lower cost providers. And, we had an excellent outcome—the system
was a little late but was on budget and it performed as required. The
system also passed an audit by an independent accounting firm and
two owner audits. By coincidence, the importance of the third
subjective objective was driven home when one of the potential value-
added resellers went bankrupt while we were going through the
selection process.

Another benefit of this approach is that it makes the rationale for
selection very transparent to upper management. In both the examples given
previously, as soon as the board of directors saw the specifics graphically,
as implied by Figure 11.4, the recommendations were almost immediately
accepted. We simply told the story each time while the beauty versus cost
chart was being projected.

Figure 11.5 Example of multiattribute graph

Also, as we’ve already discussed, I do not recommend weighting
attributes. Either what you want (back to the objectives hierarchy) is
important or not. If it is not important, then ignore it. In the case of the
accounting system, all three subjective objectives were critical to the
project’s success. We didn’t average them—we showed them separately.
Figure 11.5 shows a conceptual version similar to what we showed our
board of directors.
Another point to make when selecting vendors is that you should discard
nonconforming bids as soon as you determine that they do not conform.
If, for example, a respondent has an unacceptable safety or environmental
record, you’re unlikely to want to do business with them and there’s no
point in scoring their proposal.

Implications for the Manager


The main points from this chapter are as follows:

Multiattribute decision analysis is necessary for many real-life
decisions.
Keep multiattribute scoring as simple and transparent as possible.
Avoid complex weighting, ranking, and scoring systems.
Tie your subjective attributes back to your objectives hierarchy. If
they don’t match, either your objectives hierarchy or your attributes
are not correct.

My recommendation is to build these principles into your procurement
procedures. The first decision when buying something is “Can we live with
accepting the low bid?” If the answer is yes, you do not need multiattribute
analysis. If the answer is no, you need to work to develop the beauty versus
cost graph.
The principles in this chapter represent a very powerful way to build
transparency and consensus with personal and family decisions. My good
friend Frank Koch went through this kind of process with his wife when
they made retirement decisions. They developed their objectives and then
balanced trade-offs between conforming alternatives (starting with a
strategy table) and cost. Frank and his wife have been in retirement for a
couple of years now and are really happy with how the process worked out.
Another benefit of the process is that when you are at a frustrating point
of personal indecision, you can go back to your objectives hierarchy;
usually the frustration comes from trade-offs and uncertainties associated
with meeting your objectives. Sometimes your planning just has to
incorporate patience until uncertainties are resolved.
1 I’m repeating the objectives hierarchy to emphasize how important it is to use it to develop the
attributes that you score.
CHAPTER 12

Decision Quality

How Well Have We Done?

In Chapter 1, I noted that a good decision does not guarantee a good
outcome and that a poor decision does not always have a poor outcome
(Figure 1.1). Making good decisions improves our odds of getting what we
want. We also noted Skinner’s definition of a good decision:

A good decision is one that is logically consistent with our state of
information and incorporates the possible alternatives with their
associated probabilities and potential outcomes in accordance with our
risk attitude.

This is a useful definition, but it is easier to assess the quality of a decision
by breaking this down into several factors to consider.
In their book Decision Quality: Value Creation from Better Business
Decisions, Carl Spetzler and his co-authors1 describe six observable
characteristics of a high-quality decision:

appropriate frame,
creative alternatives,
relevant and reliable information,
clear values and trade-offs,
sound reasoning, and
commitment to action.
In a facilitated DA project, I use these six characteristics as a way to help
the team self-assess the degree to which they are on their way to a good
decision (or decisions). Each team member considers the state of their
project subjectively and simply scores their opinion (0 to 100%) as to how
well they have done on each of the six items. We then plot the results on a
“spider” diagram. This quickly points out differences in thinking.
For example, if the project manager scores “appropriate frame” as 90%
but two of the team members score it at 20%, there is something that needs
to be brought out, discussed openly, and resolved.
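A minimal sketch of such a spider diagram using Python and matplotlib; the two sets of scores are hypothetical, extending the 90% versus 20% framing example above across all six characteristics:

```python
# A spider (radar) diagram of six decision quality scores for two people.
# The score sets are hypothetical.
import numpy as np
import matplotlib.pyplot as plt

aspects = ["Frame", "Alternatives", "Information",
           "Values/trade-offs", "Reasoning", "Commitment"]
scores = {"Project manager": [90, 70, 80, 75, 85, 60],
          "Team member": [20, 65, 70, 80, 75, 55]}

angles = np.linspace(0, 2 * np.pi, len(aspects), endpoint=False).tolist()
angles += angles[:1]  # repeat the first angle to close each polygon

ax = plt.subplot(polar=True)
for person, vals in scores.items():
    ax.plot(angles, vals + vals[:1], label=person)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(aspects)
ax.set_ylim(0, 100)
ax.legend(loc="lower right")
plt.show()
```

A gap like the one on "Frame" jumps out immediately on the plot, which is the whole point of the exercise.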
Also, note that the subjective opinions will vary depending on the
relative maturity of the project and the analysis. If you have an opportunity
that is early in the development process, you would expect lower scores
than a project that is ready for the final investment decision.

Appropriate Frame
This aspect of decision quality goes all the way back to the left-hand
column of Figure 1.2 (the list of framing tools). Have we correctly defined
the problem, risk, or opportunity? Do we have a good understanding of the
business situation? Do we understand our objectives? Most importantly, are
we solving the correct problem?
There’s another aspect of framing, and that has to do with our
relationship with management. Do we have a clear charter from
management concerning the situation? Does management want a thorough,
unbiased analysis, or are we simply developing documentation to support a
decision that has already been made? If the latter is the case, rather than
waste time doing a lengthy analysis, time and resources would better be
spent on planning.
For a personal decision, sometimes just writing down and agreeing on the
problem (or risk or opportunity) statement at the start can help add clarity
for all parties involved as to what the situation includes and what it does not
include. Sometimes the energy of a group or a team or a family will be
around a completely different topic, in which case you will likely have to
address the other topic first and then come back to the situation that you
want to deal with.
Creative Alternatives
One of the biggest temptations a team has is to analyze minor variations
around the “momentum” strategy, that is, the strategy that they perceive is
what management would do if no analysis were being done. Alternatives
need to be doable and compelling, but also distinct enough that they
are not just small variations. With many decisions, the alternatives should
be mutually exclusive and collectively exhaustive. This can be another
check on the robustness of the alternatives under consideration.
For personal decisions, the decision trees can really help other people
involved see the range of possible alternatives and catalyze a healthy
discussion on creative alternatives.

Relevant and Reliable Information


The vernacular antithesis for this aspect of decision quality is “GIGO”
(garbage in, garbage out). Any numerical analysis or model that we run is
only as good as the information that we use as inputs. This is where an
understanding of bias is so important. Any time we can validate our inputs
with analogs or benchmarks, this can help us immeasurably. One of the
projects I heard presented at David Skinner’s Rice University DA class
handled this one especially well. After they completed their analysis, they
completed benchmarking of several other similar businesses located in
different states (the analysis concerned a Houston-area business) and they
found that the benchmark data fit well within their ranges.
Another aspect of understanding your inputs and assumptions is that you
can monitor the ones that are truly important and perhaps get early
indications as to whether your information is incorrect or the situation is
changing such that your information is no longer valid. Better yet, you can
run cases that show what could happen and examine contingency plans and
scenarios.
For personal decisions, the more you can talk with people with expertise,
experience, and viewpoints that differ from yours, the better your chances
of understanding the biases involved in your situation and whether your
inputs and assumptions make sense.
Clear Values and Trade-Offs
The objectives hierarchy is your best tool for examining values and trade-
offs. One way to stimulate thinking on values is to look at your primary
objective and fundamental objectives in a situation and ask the question,
“why?” This has to be a thorough and honest discussion, though. Many
times I see objectives hierarchies that start with, “maximize shareholder
value.” This is not a compelling objective. Rob and Aviva Kleinbaum make
the point in their book, Creating a Culture of Profitability, that a monetary
objective “can motivate and energize people to a point—it adds horsepower
to your engine, but it should never be behind the steering wheel.”2 They
note that Jack Welch and Warren Buffett both denounce short-term and
purely financial strategies.
If your objectives hierarchy is complete and correct, it will illustrate the
trade-offs involved. Consider the example of career paths that we discussed
in Chapter 3. With competing job opportunities, there are always trade-offs
involved between financial, social, and professional.
For personal decisions, gaining clarity on your values, objectives, and
trade-offs is obviously very important.

Sound Reasoning
This aspect of decision quality is subtler. Obviously if you build a financial
model of a situation, you need to get the equations correct. However, trees
showing the interconnectivity of decisions, risks, and uncertainties
(especially on a time line) can help teams and groups of people think
through a complex or confusing situation clearly. This also helps with
contingency planning.
For personal decisions, the tools we’ve been talking about can help you
think through your situation in a logical and clear way, which can reduce
confusion and ambiguity.

Commitment to Action
To be clear, deciding not to act can be a quality decision. This aspect of
quality has to do with commitment to the decision that is made and more
accurately could be stated “commitment to the decision(s)” (however, this
would be a more awkward statement). This aspect scores our ability to
avoid “analysis paralysis” and simple procrastination. And this aspect of
decision quality does not preclude working on contingency planning so that
there is clarity on changes in our situation that would compel changing the
decision.

Other Considerations
One thing implied but not explicitly stated in Spetzler’s six aspects of
decision quality is moving from deterministic to probabilistic thinking and
analysis. As discussed in Chapter 1, Pat Leach3 (Why Can’t You Just Give
Me The Number?) makes a thorough and compelling case for “knowing the
odds” concerning your business, that is, understanding the ranges of
probabilities associated with your uncertainties and your measures of value.
Spetzler includes this in the relevant and reliable information aspect;
however, I would call this out as an explicit measure of decision quality.
My friend Brian Hagen4 (Problem, Risk, and Opportunity Enterprise
Management) adds two other elements to decision quality: timing (with
options) and choice defensibility. His timing element could arguably be
considered part of correct logic and commitment to action; however, in
some situations timing can be incredibly important and thinking through
how well we understand that timing can be very valuable. “Skeleton”
decision trees (Figure 4.5) can add much clarity to sequencing, timing, and
option discussions.
I agree completely with his element on choice defensibility: if you have
maximized NPV with your decision but would be embarrassed to defend it
on, say, your local TV news or in the Wall Street Journal, you need to go
back and rethink your objectives and likely spend more effort on
stakeholder analysis.

Implications for the Manager


If you are on a large project and decision analysis tools have been used to
help frame and structure decisions associated with the project, you should
expect the facilitator to go through the six items and ask you to score your
perception as to how the work is going. If this doesn’t happen, you can
suggest that the team take a few minutes and go through this exercise.
More importantly, though, if you are facing a significant decision (either
business or personal), the six aspects of decision quality are very useful as a
checklist concerning your decision.
For example, you and your significant other may have thought through a
large purchase using the tools that we have been talking about, but at the
end, both of you just don’t feel good about making the purchase. Going
through the list, you would check “OK” on the first five aspects of decision
quality, but “Not OK” on “commitment to action.” This gives you insight as
to further questions to ask. What is it about this situation that is leading us
to discomfort about taking the action? What would it take to increase our
commitment to action, and is that even possible?
CHAPTER 13

Risk Analysis

I Want It Cheap and I Want It Now

“Generic” Risk Analysis


Risk analysis refers to a qualitative assessment of an endeavor’s risk, where
risk is not defined specifically as we defined it in Chapter 5 but more in line
with the general public’s idea of risk—something bad happening. Many
companies require a risk analysis to be completed as part of their
documentation package for large projects.
The basic process is quite subjective—usually subject matter experts are
gathered together in a facilitated session to conduct the analysis.
Brainstorming is used to develop a list of potential risks. Then for each risk,
the team assesses:

How likely is the risk, that is, what is the probability of it
happening?
If it happens, how severe would the consequences be?
How difficult and/or expensive would it be to mitigate the risk?

Usually a scale of one to five is used for the exercise. For example,
probability could be very low, low, medium, high, or very high.
Consequences are usually orders-of-magnitude dollar amounts, for example,
$10K or less, $100K, $1M, $10M, or $100M or greater. Mitigation is usually
very easy, easy, moderately easy, difficult, or very difficult.
Then the risks are compiled based on their relative scoring in a list from
1 to n, where n is the total number of risks identified. The team then
recommends mitigation activities based on the analysis.
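As a sketch of the compilation step, here is one common convention (multiplying the probability and consequence scores into a priority index; the process itself does not prescribe a formula, and the risks below are hypothetical):

```python
# Compiling a qualitative risk register: each risk carries a 1-5 probability
# score and a 1-5 consequence score; their product is a simple priority
# index. The risks and scores are hypothetical.
risks = [
    ("Electrical/instrumentation labor crunch before startup", 4, 3),
    ("Late delivery of long-lead compressor", 2, 5),
    ("Permit delay", 3, 2),
]

ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)
for rank, (risk, prob, conseq) in enumerate(ranked, start=1):
    print(f"{rank}. {risk} (p={prob}, c={conseq}, score={prob * conseq})")
```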
The important thing about doing the risk analysis is that it needs to be
updated periodically because situations and assumptions change. I
recommend updating a risk analysis at least twice a year.
Another thing to remember is that immediacy bias will markedly affect
people’s opinions in a subjective exercise like this one. If you assess
people’s fear of flying before and after a major airline crash, you’ll find
markedly different views of flying risk. Offshore drilling
risk assessments changed significantly after the BP Deepwater Horizon
incident.
Also, try to be as specific as possible concerning the risk items that you
put in your register. For example, “skilled labor availability” is not very
useful. It would be better to note which trades are likely to be short and
when during the project this would have an effect. (By the way, I have
never seen a project that didn’t have an electrical and instrumentation
“crunch” just before startup.)
From a personal standpoint, when discussing risks with your family, the
three considerations (probability, consequences, and mitigation) give you a
framework for talking about risks. But again, be aware of the biases that
influence people’s thinking.

Managing Cost and Schedule Uncertainties

Project cost and schedule are almost always important uncertainties. My
father, a dairy farmer and carpenter, used to say, “Make your best estimate
of cost and time required for a project…and then double it.” Schedule is
especially onerous, as we usually start with a critical path plan and use that
as our estimate. Projects are almost always late, which usually results in
going over budget as well. My friend Frank Miller at PCR (a chemical
company I worked for) used to say, “Cost, schedule, quality: pick any two.”
With these two uncertainties, we’re really talking about implementation.
With careful planning and detailed follow-up, the probability of cost
overruns and delays can be minimized.
Pat Leach has a good discussion on this topic (see Chapter 10, “Late and
Over Budget—Project Management Issues”) in his book Why Can’t You
Just Give Me The Number?1
A friend of mine, Jack Bess, spent much of his career doing probabilistic
cost and schedule analysis. His experience (and subsequent rule-of-thumb)
was that the critical path outcome of a project was usually between the p01
and the p05 on the s-curve. This is why projects are almost always late!
Jack would develop a detailed influence diagram of the schedule and then
develop a model and run a simulation. He could tell you the likelihood that
a particular task or delivery would end up being on the critical path. Then
you knew which items needed to be watched and which were not likely to
cause a problem.
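A toy sketch of this kind of analysis, assuming three sequential tasks with hypothetical triangular duration distributions; a real model like Jack’s would capture the full network of parallel paths:

```python
# A toy probabilistic schedule model: three sequential tasks, each with a
# triangular duration distribution (low, most likely, high), in days.
# The numbers are hypothetical.
import random

tasks = [(20, 25, 45), (30, 35, 70), (10, 12, 30)]
critical_path_plan = sum(ml for _, ml, _ in tasks)  # 72 days

# Simulate total duration 10,000 times; random.triangular takes
# (low, high, mode).
trials = [sum(random.triangular(lo, hi, ml) for lo, ml, hi in tasks)
          for _ in range(10_000)]
p_meet_plan = sum(t <= critical_path_plan for t in trials) / len(trials)
print(f"Chance of finishing within the critical path plan: {p_meet_plan:.1%}")
```

Because task durations are right-skewed, the sum of the most-likely estimates lands low on the s-curve, which is exactly Jack’s point.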

Jack Bess told me a story of one project that was very late and over
budget. He had helped the team with the original cost and schedule
estimate, so he was asked to go back and figure out how to get the
project back on track. When he got there, he asked the project manager
where the initial schedule risk analysis was. The manager looked blank
for a minute and then replied, “Oh, I think it is in the file cabinet in the
hallway…,” and they went looking for it.

If the project manager had been using the model and keeping it updated,
it is unlikely the project would have gotten so far behind. For project cost
and schedule risk, this is even more important than it is for the generic risk
assessments.
One good example from my experience was the West Valley
Demonstration Project, which Westinghouse completed for the Department
of Energy (DOE). This was a commercial nuclear fuel reprocessing plant
that was very poorly operated and eventually abandoned, leaving a
contaminated facility and 600,000 gallons of high-level radioactive waste
behind. The State of New York and ultimately the DOE inherited the clean-
up. I knew the clean-up project manager, Steve Marchetti (after
Westinghouse started the work, he was the Westinghouse leader at the site).

Steve had a detailed schedule, which he reviewed with his staff each
week in detail. They discussed each activity and updated the master
schedule based on the new information and adjusted as necessary.
Every day they met in the morning for a few minutes to update each
other on progress toward the weekly plan. Then each month they did a
complete review—past, current, and future—and re-issued the
schedule. At the time I met with Steve, the project was on schedule
and on budget. He eventually was transferred (and promoted), and I
lost track of him and the Demonstration Project.

Steve didn’t use probabilistic methodology per se, but he had a detailed
model of the project, and they used this model to keep ahead of the work
and head off problems before they happened. It was part of their project
management system.
Another question for capital projects would be, “What confidence level
should we use for the capital authorization and startup date?” Even if we
include the s-curves in the appropriation request, we still need specific
numbers to include. If we pick too high a level, say a p90 or p95, the project
will probably find a way to spend the money and we’ll end up with budgets
drifting significantly upward through time. If we set it too low, say p10, the
project team will give up because they have little chance of meeting the
target. The best strategy is to pick points on the cost and schedule s-curves
that are challenging yet realistic, perhaps p40 or p60 values. Also, these targets
should be specific to the project being done and should have a rational line
of reasoning behind them. People don’t respond well to arbitrary and
capricious goals.

Project Cost and Schedule Risk Quantification

John Hollmann recently published a book titled Project Risk
Quantification.2 John is a very experienced cost and schedule risk
consultant and is an influential “guru” within the cost engineering
professional association, AACE International (www.aacei.org, formerly the
Association for the Advancement of Cost Engineering). John makes the
case for integrating probabilistic analysis completely with project
management, focusing on cost and schedule. His work builds on
benchmarking work Rand Corporation did under Ed Merrow’s leadership in
the 1980s. Merrow went on to form the Independent Project Analysis
consulting firm, where John worked for many years (IPAGlobal.com).
I heard a talk recently at the Houston SPE (Society of Petroleum
Engineers) that cited Merrow3 in an article in the Oil and Gas Facilities
journal:

78% of recent large upstream oil and gas projects underperformed,
average cost overruns: 33%,
average schedule overruns: 30%, and
percent of underperformers with serious production problems: 64%.

Even with advanced computing and modeling capability, complicated stage-
gate project management systems, and relatively mature technology, we are
not doing well with project execution. Hollmann makes a compelling case
that incorporating historical experience and probabilistic analysis can help
us do a better job of managing project cost and schedule risk. He has
proposed a set of risk quantification methods that are empirically valid and
practical. Hollmann has found that systemic risks—“…artifacts of
system attributes (the internal project system, its maturity, company culture,
complexity, bias, and so on), and the project’s interaction with external
systems (regional culture, political, and regulatory systems)”4—are more
likely to make your project late and over budget than project-specific risks.
In addition to these risks, there are labor and material escalation risks and
currency exchange risks that you must consider and should quantify.
Hollmann proposes an empirical model for quantification of systemic
risk that uses updated Rand study benchmarking and historical data as its
foundation. Going through his process, you may not like the answer you
get, but you’ll know where your project is likely to end up (just as you may
not like the realization that your critical path plan likely has less than a 5%
chance of happening).
Hollmann also notes that large project cost expenditures versus their
original estimates have a bimodal output distribution. Some projects “blow
out,” that is, they are years late and hugely over budget. He has a “tipping
point” methodology to gauge your likelihood of having a project blowout.
When you visit a construction site that is at or past this tipping point, you
see increased labor head count and hours (it is crowded), out-of-sequence
fabrication and construction such that equipment and material are stored
everywhere and hence the workers spend much time searching for stuff, and
(of course), people are not happy about working there. Arguments are
frequent and heated.
I mention Hollmann’s work because if you are taking over as a project
manager for a new project or have oversight over the project manager(s),
the techniques he outlines in his book (1) are consistent with decision
analysis tools and principles, and (2) could significantly help you
understand how to quantify and subsequently manage risk.

Implications for the Manager


The main point to remember about risk analysis and managing projects is to
periodically and fastidiously update your work. On a personal level,
thinking about risk in terms of probability, potential consequences, and
mitigation is a powerful framework to discuss risk with your spouse and
family members.
As an aside here, our entire society is losing functioning capability
relative to the recent past. The advances we’ve made in computerization
seem to be offset by increasing regulation and compliance costs. For
example, in 1952, President Harry Truman asked the DuPont Company to
build and operate the Savannah River Project. DuPont replied that they
would do so and would charge $1 plus $1 per year above actual costs, but
they would build it and operate it as if it were one of the other DuPont
plants. By 1955, DuPont (and the Atomic Energy Commission) had built
and safely started up:

five full-scale very powerful production reactors,
two isotopic separation facilities (these were huge facilities that
were operated from behind eight-foot thick concrete walls; these
facilities were referred to as “canyons”),
a heavy water production facility,
a very large and comprehensive waste treatment area,
a tritium production facility,
a fuel fabrication facility, and
a large laboratory to support the complex from a scientific and
engineering perspective (the laboratory is now SRNL, Savannah
River National Laboratory).

The reactors and the other processes worked. The engineers who designed
them didn’t have computers—they used slide rules! The reactors were
operated safely and cost-effectively until the late 1980s, when they were
shut down for political reasons. The other facilities continued to operate for
some time after the reactors shut down. Can you imagine how much it
would cost and how long it would take to build something like this today?
There’s nobody in the world—not even China—who could do this today.
Note that this was the same generation that successfully completed six
manned missions to the moon between 1969 and 1972.
You might ask, what does this have to do with decision or risk analysis?
The answer is that like individuals, teams, groups, and companies, society
as a whole has objectives and trade-offs and has to prioritize. Society as a
whole also has to decide whether to discuss these objectives and trade-offs
transparently and honestly or to allow manipulative politics to control the
discussion.
CHAPTER 14

Other Topics

More Things to Think About

This chapter discusses a few more tools and items to consider. We’ll start
with a discussion of portfolio analysis. Utility theory is an interesting
decision analysis (DA) concept but is infrequently necessary. We’ll also
discuss normative and descriptive viewpoints and preferences.
The most important topics in this chapter concern DA, ethics, and
personal decision making. We’ll conclude the chapter by disclosing the
“secret” of DA.

Portfolio DA Projects
When hearing the word portfolio, most people think of an investment
portfolio. In DA, portfolio refers to a set of potential alternatives. It could be
a collection of businesses, or a group of oil and gas fields, or a group of
pharmaceutical compounds, or a collection of publicity or advertising
alternatives.
Decision analysts spend considerable time and work helping clients with
portfolio management, which can become quite complex in practice
(especially in the pharmaceutical industry, where the company’s ultimate
survival can depend on properly managing the pipeline of new potential
drugs). In this chapter, we’ll discuss some of the tools and how they can be
relevant for the working manager.
From a DA standpoint, analyzing a portfolio project is very similar to
what we’ve been talking about so far. We’d start the analysis with an
opportunity or problem statement, get reactions to the statement, discuss the
business situation, develop an objectives hierarchy, and talk about closed,
strategic, and tactical decisions. However, the analysis part is a bit different.
The first tool we would use would be to plot probability of success versus
NPV if successful for each element of the portfolio. This is the traditional
way of starting a portfolio analysis. However, sometimes the y-axis is
another measure. For example, if you are looking at a fleet of ships, you
might plot shipping capacity versus present value of the total cost to operate.
Figure 14.1 shows a plot of probability of success versus NPV for a group
of potential projects. If you could only choose four projects, which ones
would you choose?
Also shown in Figure 14.1 are lines of indifference—a $100MM NPV
project with a 50% chance of success should be as desirable as a $200MM
NPV project with a 25% chance of success, since both have an expected
value of 0.50 × $100MM = 0.25 × $200MM = $50MM (assuming similar
s-curves for NPV).
Another way to characterize projects within a portfolio is by which
quadrant of Figure 14.1 the project falls in. Upper right projects are large and
relatively certain—these can be referred to as “pearls.” Large projects with
lower probability of success would be called “oysters” as they turn into
pearls if they are successful. Smaller projects with high probability of
success can be called “bread and butter” as you need these projects to stay in
business. Small projects with low probability of success are termed white
elephants and should not be completed unless they are really inexpensive to
pursue.

Figure 14.1 Probability of success versus potential value for a portfolio


Another useful portfolio tool is to plot cumulative expected value versus
cumulative expected cost. In our hypothetical portfolio, if each project costs
$10MM to implement, we can develop this plot as shown in Figure 14.2.
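A minimal sketch of building that view; the four projects and their numbers are hypothetical, chosen so that two of them sit on the same line of indifference:

```python
# Cumulative expected value versus cumulative cost for a hypothetical
# portfolio. Each entry is (probability of success, NPV in $MM if
# successful); every project costs $10MM, as in the text.
projects = {"P1": (0.50, 100), "P2": (0.25, 200),
            "P3": (0.80, 30), "P4": (0.10, 40)}
cost_per_project = 10  # $MM

# Rank by expected value, then accumulate value and cost in that order.
ranked = sorted(projects.items(),
                key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
cum_ev = cum_cost = 0
for name, (p, npv) in ranked:
    cum_ev += p * npv
    cum_cost += cost_per_project
    print(f"{name}: cumulative EV = ${cum_ev:.0f}MM, "
          f"cumulative cost = ${cum_cost}MM")
```

Note that P1 and P2 contribute the same $50MM of expected value, which is the indifference-line example from above.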
It is appropriate to consider both views of the portfolio (as represented by
Figures 14.1 and 14.2). Sometimes larger projects cost so much that they are
poor choices for the portfolio, while sometimes small, low probability of
success projects are so inexpensive to pursue that they can be included.
If you have p10/50/90 results for the projects in your portfolio, you can
complete tornado graphs and s-curves for the entire portfolio or subsets of
the portfolio. I once had a portfolio project where a more aggressive strategy
than the client’s momentum strategy did not cost much more money but had
much larger potential cash flow. Once the client saw the difference in the s-
curves, they decided to implement the more aggressive portfolio.
Probably the most difficult part of using these tools is recognizing that
you are in a portfolio situation. Once you realize that, the analysis is
relatively easy to do.
I haven’t discussed personal financial portfolio management in this
chapter, and you may be wondering why. My perception is that there is an
overabundance of financial advisors who peddle financial portfolio snake
oil. The essence of their theory is that (1) a diversified portfolio has lower
risk, and (2) you need a mix of investments considering risk on the y-axis
and reward on the x-axis.

Figure 14.2 Portfolio view, decreasing economic benefit

The problem with this theory is best exemplified by mutual funds—the
fund managers take a lot of your money to manage risk versus reward, but
most of them don’t do as well as the S&P 500.

One company I used to work for made a very professional financial
advisor available to us. He went through a very detailed and thorough
financial analysis for my wife and me and recommended a set of investments
to diversify our portfolio and lower our risk. During the Great Recession,
these investments dropped more than the stock portfolio that I had in
my remaining investments.

So I’m biased, but at the end of the day Warren Buffett’s approach of
buying good companies and keeping them seems to work the best.

Utility Theory
One thing that I have observed reading academic publications and attending
conferences is that much attention is devoted to utility theory. Utility theory
uses equations to adjust a project’s NPV to account for its size relative to the
company as a whole. At one conference, two very experienced decision
analysts were asked how often they had come across a real utility theory
problem in their consulting practice. Both men stated, “One or two over 30
years of consulting.”
The rule of thumb that I’ve heard expressed is that if a given project can
affect more than one-fifth of a company’s total value, it needs to be
considered very carefully. Skinner summarizes utility theory quite well in his
book, Introduction to Decision Analysis.1 My (unsolicited) advice to entities
funding more academic work in utility theory is to find something else to
fund. My advice to you is to be aware that this exists but you’re probably not
going to need it.
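For completeness, here is a sketch of one common functional form, the exponential utility function; the form and the numbers are illustrative assumptions (Skinner’s book develops the theory properly):

```python
# Exponential utility sketch: u(x) = 1 - exp(-x/R), where R is the risk
# tolerance in $MM. The gamble below is hypothetical.
import math

R = 100                            # hypothetical corporate risk tolerance, $MM
outcomes = [(0.5, 200), (0.5, -50)]  # (probability, NPV in $MM)

ev = sum(p * x for p, x in outcomes)
expected_utility = sum(p * (1 - math.exp(-x / R)) for p, x in outcomes)
certain_equivalent = -R * math.log(1 - expected_utility)
print(f"EV = {ev:.1f}, certain equivalent = {certain_equivalent:.1f}")
# EV = 75.0, certain equivalent = 11.4
```

The certain equivalent falls far below the expected value when the gamble is large relative to the risk tolerance R, which is precisely the rare situation where utility theory matters: a project that is big relative to the company as a whole.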

Normative Versus Descriptive Viewpoints


A distinction that decision analysts refer to can be very useful: the
distinction between how things should be—normative—versus how things
are—descriptive. In fact, the DA community is divided between
investigating how people should make decisions versus how people actually
do make decisions. The tools I’ve described in this book are designed to help
you learn how to make better decisions, both for yourself and the people you
associate with. Hence the topics I’ve covered are normative in nature.
I have found this distinction to be very valuable in understanding
differences between people. Some people view situations from a “should”
standpoint, that is, how they believe that the situation should be. Others tend
to view situations as they “are,” regardless of how the
situation should be. This distinction is implied by the Myers–Briggs fourth
scale (J versus P) and is also implied by the Critical Parent ego state in
Transactional Analysis.
Both viewpoints are important and necessary—as a manager,
understanding how people are looking at the world can help you resolve
conflict. Usually people on either end of the normative versus descriptive
spectrum get emotional pretty quickly in defending their view, and
understanding that can be quite valuable.

DA and Ethics
Professor Marianne Jennings, Arizona State University, published an
excellent article in Financial Engineering News (which, unfortunately, is no
longer available) in which she discussed the Enron/Arthur Andersen
bankruptcy and collapse. She noted that when Enron first experienced a
quarter with slightly lower-than-expected earnings, they decided to record an
off-balance-sheet transaction to inflate the quarterly income. Their lead
auditor from Arthur Andersen told them they could not do this, but they
went over his head to Andersen headquarters and overturned his ruling.
Their logic was that earnings would be better the next quarter, and they
could either ignore the transaction or issue a footnote in the quarterly report
and correct it.
However, when the next quarter came, earnings were again lower than
they wanted. At this point, there were only three alternatives available to
Enron:

stretch the earnings like they had done in the previous quarter,
rectify the “error” from the previous quarter and take the earnings hit,
or
ignore the previous quarter’s stretch but exercise the discipline to
never do this again.

Unfortunately, they chose the first alternative and kept stretching until they
eventually got caught.
I’ve represented this as a decision tree in Figure 14.3. Each quarter is
represented as a separate decision.
The important point here is that neither Enron nor Arthur Andersen had
the objective of cheating until their companies imploded. In a similar
fashion, nobody ever has an objective of becoming an alcoholic by the time
they reach 45, or a drug addict by the time they are 30. Few people have the
objective of being addicted to a pack-a-day cigarette habit. It starts with
an ethical stretch and proceeds with small decisions until the person is
addicted. Nobody enters into a marriage with the objective of cheating on
their spouse, but if they cheat and get away with it, it can be much easier to
give in to subsequent situations and eventually be found out.
Note that with each small decision, the difficulty of rectification or
discipline on the next small decision becomes greater. Eventually you
essentially lose these alternatives, as they become too difficult to do.
Figure 14.3 gives you a framework for discussing ethical situations with
your colleagues, bosses, and your family. My sincere advice is that if your
company is in one of these situations, you need to find another job and get
out, as sooner or later they will get caught. Drawing a tree like the one
shown in Figure 14.3 with a person in your family engaged in risky
behaviors might be a useful intervention, depending on how far down the
path to addiction they are. This topic is very important for our families, as
personal decisions are the leading cause of death for people in the United
States.

Personal Decisions
In 2008, Ralph Keeney, a professor at Duke and one of the founding fathers
of DA, published an article in Operations Research titled “Personal
Decisions Are the Leading Cause of Death.”2 He analyzed data from 1900,
1950, and 2000 and found that:
Figure 14.3 A series of ethical decisions (the path to addiction)

About 5% of deaths in 1900 could be attributed to personal decisions.
Between 20% and 25% of deaths in 1950 could be attributed to
personal decisions.
At least 45% of deaths in 2000 could be attributed to personal
decisions.

He states in his summary: “Over one million people prematurely die each
year in the United States due to causes that can be attributed to personal
decisions … For ages from 15 to 64, about 55% of all deaths are attributable
to personal decisions.”3
Dr. Keeney began his analysis with statistical data on the medical causes
of death, from which he determined the actual cause of death (e.g., the
medical cause of death might be lung cancer, the actual cause of death
smoking, and the personal decision the choice to smoke). He studied 14
medical causes of death (diseases of the
heart, cancer, stroke, unintended injuries, AIDS, suicide, homicide, etc.)
traced to eight actual causes of death (smoking, overweight, alcoholism,
accidents, etc.) that were attributed to eight personal decisions.
Dr. Keeney then summarized his work into eight rather obvious life-
saving decisions we all can make to improve our chances of avoiding
premature death. Avoid:

smoking (don’t start, quit),
overweight (exercise, diet),
alcoholism (quit),
vehicle accidents (seat belt, slow down),
illicit drugs (don’t start, quit),
suicide,
criminal behavior, and
unprotected sex (condom, abstinence).

These are all rather obvious, but Figure 14.3 shows how small decisions
we make every day can eventually lead to a bad outcome. The longer we
proceed down the addiction tree, the more difficult it becomes to rectify the
situation and change our ways.

The Secret of DA
My friend David Skinner always says, “Never tell them the answer.” When
David and I were consulting together, he (almost) never broke this rule.
When I was consulting, I respected David’s guidance and found that his
advice was sound. Sometimes clients would ask me (usually during a heated
debate) what I thought and I would decline to answer. I’m going to break
David’s Rule, though, and tell you the secret of DA. It’s really simple but is
also very powerful.
Actually, David revealed the secret of DA in his book,4 but he discussed it
so subtly that I think most people miss it. The secret of DA revolves around
three questions:

What do you want?
What can you do?
What do you know?

All through the book I’ve stressed the importance of the objectives
hierarchy. This is the tool that helps you examine what you really want and
the trade-offs associated with getting what you want.
What you can do includes your decisions, alternatives, strategies, and
portfolio selections.
Your state of knowledge (and lack thereof!) is addressed by assessments
of uncertainty, value of information, and value of control.
The key to using the tools I’ve described in this book is to keep these
three questions—three important distinctions—in mind. You can boil down
these questions even further to the three verb ideas in the questions: want,
do, and know.

More Information
If you have further interest in DA, there is an informal group of practitioners
called the Decision Analysis Affinity Group (DAAG) that is made up
primarily of people from industry and was started in the mid-1990s. Their
website is www.DAAG.net and the group meets every year for a two-day
conference. There are no membership dues—you just register and pay for
the conference, which is usually very reasonably priced. Each year between
80 and 120 people attend the conference.
Another DA group is the Society of Decision Professionals (SDP),
www.decisionprofessionals.com. This group started in the mid-2000s. There
is a membership fee associated with SDP, but they sponsor free (to
members) training, webinars, and events. They usually meet just before or
after the DAAG conference each year and are in the process of absorbing
DAAG. Houston has a local SDP chapter that meets once a month, but you
do not have to be an SDP member to attend the Houston meetings.
There is a large DA community of practice in academia. The primary
group that publishes academic DA work is called INFORMS, which stands
for the Institute for Operations Research and the Management Sciences.
According to the INFORMS website, INFORMS is “the largest professional
society in the world for professionals in the field of operations research
(O.R.), management science, and business analytics.” The INFORMS
conferences are much larger than DAAG or SDP.
You may also choose to explore some of the readings noted in the
References section.
Implications for the Manager
Understanding normative and descriptive viewpoints can help you
understand your own thinking and why others may view the world
differently.
I sincerely hope that linking Dr. Keeney’s careful analysis showing that
personal decisions are the leading cause of death (in the United States) with
Figure 14.3, the decision tree of addiction, gives you:

the willpower to deal with, that is, rectify, any situation that you ever
find yourself in relative to Figure 14.3, and
a framework for discussing personal and ethical decisions with
colleagues, bosses, and, most importantly, family members if it
becomes necessary.

Now that we’re at the end of our journey together, my advice is to use the
tools that we’ve talked about to make better business and personal decisions.
A good decision does not guarantee a good outcome (just as a poor decision
does not guarantee a catastrophe), but we can improve our odds of getting
what we want and help others along the way.
I’d be interested in your successes and failures using DA tools in your
business and in your life. Send me an e-mail: dave@decisions-books.com.
And may all your outcomes be good ones, or to paraphrase Mr. Spock, make
good decisions, live long, and prosper.
Notes

Chapter 1
1. Leach (2006), p. 1.
2. Eisenhardt (1989), pp. 543–576.
3. Skinner (1999), pp. 11–13, 16.
4. Howard (1965).
5. Edwards and von Winterfeldt (1986), pp. 560–569.
6. Nutt (1997), pp. 44–52.
7. Nutt (1990), pp. 159–174.
8. Nutt (2002).
9. Skinner (1999), p. 16.

Chapter 2
1. Doyle and Straus (1976).
2. Skinner (2009), pp. 100–102, 238–241.
3. Hagen (2017).

Chapter 3
1. Keeney (1992), p. 34.
2. Op. cit., Keeney, p. 83.
3. Op. cit., Keeney, p. 35.
4. Op. cit., Keeney, pp. 68–72 typical.
5. Skinner (2009), pp. 31–32.
6. Op. cit., Skinner, p. 103.
7. Op. cit., Skinner, pp. 30–32.
8. Op. cit., Keeney, p. 5.

Chapter 6
1. Tversky and Kahneman (1974).
Chapter 7
1. Walkenbach (1996).

Chapter 9
1. Keefer and Bodily (1983).
2. McNamee and Celona (1990).
3. Skinner (2009), pp. 190–192.

Chapter 12
1. Spetzler, Winter, and Meyer (2016).
2. Kleinbaum and Kleinbaum (2013).
3. Op. cit., Leach.
4. Op. cit., Hagen.

Chapter 13
1. Leach (2006), pp. 108–110.
2. Hollmann (2016).
3. Merrow (2012).
4. Op. cit., Hollmann, p. 15.

Chapter 14
1. Skinner (2009), pp. 210–217.
2. Keeney (2008), pp. 1335–1347.
3. Op. cit., Keeney, p. 1345.
4. Skinner (1999), pp. 75–76.
References
Doyle, M. and Straus, D. (1976). How to Make Meetings Work. New York, NY: The Berkley
Publishing Group.
Edwards, W. and von Winterfeldt, D. (1986). Decision Analysis and Behavioral Research.
Cambridge, UK: Cambridge University Press.
Eisenhardt, K. M. (1989). Making Fast Strategic Decisions in High-Velocity Environments. Academy
of Management Journal 32, pp. 543–576.
Hagen, B. (2017). Problems, Risks and Opportunities: Enterprise Management. To be published this
year.
Hollmann, J. K. (2016). Project Risk Quantification. Gainesville, FL: Probabilistic Publishing.
Howard, R. A. (1965). Decision Analysis: Practice and Promise. Management Science 34(6), 1988,
pp. 679–695.
Keefer, D. and Bodily, S. E. (1983). Three-point Approximations for Continuous Random Variables.
Management Science 29, pp. 595–609.
Keeney, R. L. (1992). Value-Focused Thinking. Cambridge, MA: Harvard University Press.
Keeney, R. L. (2008). Personal Decisions Are the Leading Cause of Death. Operations Research,
Vol. 56, No. 6, November–December 2008, pp. 1335–1347.
Kleinbaum, R. and Kleinbaum, A. (2013). Creating a Culture of Profitability. Gainesville, FL:
Probabilistic Publishing.
Leach, P. E. (2006). Why Can’t You Just Give Me The Number? Gainesville, FL: Probabilistic
Publishing.
McNamee, P. and Celona, J. (1990). Decision Analysis with Supertree (2nd Edition). San Francisco,
CA: The Scientific Press.
Merrow, E. W. (2012). Oil and gas industry megaprojects: Our recent track record. Oil and Gas
Facilities 1(2), pp. 38–42.
Nutt, P. C. (1990). Preventing Decision Debacles. Technological Forecasting and Social Change 38,
pp. 159–174.
Nutt, P. C. (1997). Better Decision-Making: A Field Study. Business Strategy Review 8(4), pp. 44–52.
Nutt, P. C. (2002). Why Decisions Fail. San Francisco, CA: Berrett-Koehler Publishers, Inc.
Skinner, D. C. (1999). Introduction to Decision Analysis (2nd Edition). Gainesville, FL: Probabilistic
Publishing.
Skinner, D. C. (2009). Introduction to Decision Analysis (3rd Edition). Gainesville, FL: Probabilistic
Publishing.
Spetzler, C., Winter, H., and Meyer, J. (2016). Decision Quality: Value Creation from Better Business
Decisions. Hoboken, NJ: John Wiley & Sons.
Tversky, A. and Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Reprinted
in Readings on the Principles and Applications of Decision Analysis, Vol. 2. Menlo Park, CA:
Strategic Decisions Group.
Walkenbach, J. (1996). Excel for Windows 95 (2nd Edition). Foster City, CA: IDG Books.
Index
Ambiguity, 24
Anchoring bias, 57
Assessment
definition, 51
experts (see Experts assessment)
uncertainty (see Uncertainty assessment)

Bush Grid, 24

Career decisions, 25–26
Charts, 67
Closed decisions, 33–35
Continuous probability
continuous distribution, 52
expected values, 54
p10, p50, p90 assessment, 53
rain uncertainty, 54
roll back tree, 54
Cost of goods sold (COGS), 46
Cumulative probability graph
decision tree
McNamee–Celona method, 89
Offers A and B curve, 92
“one-line” formula, 91
skeleton tree, Offer A, 88–89
Swanson-McGill weighting, 89–90
internal rate of return, 87–88
Monte Carlo simulation
advantage and disadvantage, 92
vs. decision tree solution, 94–95
Great Recession, 94
Offers A, B, and C curve, 94
PERCENTILE function, 93
RAND() function, 92
@Risk™ and Crystal Ball™, 91
SIPMath, 95

DA. See Decision analysis
Decision
alternatives tables, 39–40
definition, 6
quality (see Decision quality)
strategy tables, 41
trees (see Decision trees)
Decision analysis (DA)
analysis tool, 7, 11
companies, 4
consensus and alignment, 5
consultants, 9–10
DAAG, 137–138
definition, 6
ethics, 133–135
framing tool (see Framing)
good decision
definition, 9
disappointing outcome, 3
INFORMS website, 138
luck
vs. decision quality, 2–3
vs. talent, 4
managers reward, 4, 9
normative vs. descriptive viewpoints, 132–133
origin, 7
personal and business success, 2
personal decisions, 134, 136
poor decisions, good outcomes, 3
portfolio
cumulative expected value vs. cumulative expected cost, 131
decreasing economic benefit, 131
definition, 129
great recession, 132
management, 129
mutual funds, 132
success vs. NPV probability, 130
tornado graphs and s-curves, 131
secret of, 137
Society of Decision Professionals, 138
success probability, 8
Decision Analysis Affinity Group (DAAG), 137–138
Decision quality
appropriate framing, 116
commitment to action, 118–119
creative alternatives, 116–117
good decision, 115
high-quality decision, 115
relevant and reliable information, 117
sound reasoning, 118
spider diagram, 116
timing and choice defensibility, 119
values and trade-offs, 117–118
Decision Support Package (DSP), 20
Decision tree
cumulative probability graph
McNamee–Celona method, 89
Offers A and B curve, 92
“one-line” formula, 91
skeleton tree, Offer A, 88–89
Swanson-McGill weighting, 89–90
deterministic model, 64
example, 36
expected value, 37
family issues, 42
multiple career path/job offer, 37, 38
negotiating session, 43
outcomes, 37–39
skeleton tree, 39
solved/rolled back tree, 37, 38
Deterministic model
career path influence diagram
benefits, 75
interest rate, 74
measure of value, 75
model features, 76–78
p10/50/90 ranges, 74
raises and bonuses, 75
screenshot, 77
starting salary, 75
taxes, 75
time of analysis, 74–75
Excel (see Excel)
Lotus 1-2-3, 66
uncertainty, 65
Discrete probability, 51–52, 59–60
Dollar sign operator, 67–68

Ethical decisions, 133–135
Excel
dollar sign operator, 67–68
history, 66
model architecture
documentation, 71
formulas within a row, 70–71
hard-wired inputs, 70
large, 72
modules, 69–70
quick and dirty, 71
range names, 68
security, 72
units, 70
version control, 71
model validation and quality control, 72–74
range names, 68–69
styles, 69
terminology, 66–67
Expected value (EV), 37, 54
Experts assessment
anchoring bias, 57
immediacy bias, 59
overconfidence, 58
personal interest bias, 59, 64
units of measure, 56

Framing, 7
decision quality, 116
facilitation
body language, 15
capturing headlines, 15
listening, 15, 20
number of people, 13–14
people willingness, 14
PowerPoint, 13
problem-solving meetings, 14
whiteboard, 13
influence diagram
issue raising, 17–18
objectives hierarchy (see Objectives hierarchy)
problem/opportunity statement, 15–17
process documentation, 19–20
situation analysis, 18
stakeholder analysis, 18–19
tools, 11
Front end engineering and design (FEED), 28
Fundamental objective, 21

Game Theory approach, 10
Generic influence diagram, 46
Generic risk analysis
cost management, 122–124
mitigation activities, 121
offshore drilling, 122
project cost, 124–126
schedule risk quantification, 124–126
schedule uncertainties, 122–124
skilled labor availability, 122
Good decision
definition, 9, 115
disappointing outcome, 3

Immediacy bias, 59
Influence diagram (ID)
career path example, 47, 48
components, 45
drawing, 48–49
facilitation, 48–49
framing and analysis, 47
generic, 46
INFORMS, 138
Internal rate of return (IRR), 87–88

Lotus 1-2-3, 66

McNamee–Celona method, 89
Means objective, 21
Monte Carlo simulation, 91–95
Multiattribute decision analysis
advantage
definition, 105
job offer
financial, professional, and social objectives, 107
objectives hierarchy, 106–107
stock market graph, 108
subjective scores vs. expected value, 109
subjective scores vs. financial potential, 109
principles, 106
purchase decisions
beauty vs. cost chart, 110
efficient frontier, 110
merger and acquisition decisions, 110
multiattribute graph, 112
subjective objectives, 111
subjective assessment, 106

Net present value (NPV), 46, 73

Objectives hierarchy
Bush Grid, 24
career counseling, 28–29
career decisions, 25–26
construction, 22
defining and achievement, 22
example, 24–25
facilitation, 23
focus, 28
fundamental objective, 21, 24
losing sight, 26–28
means objective, 21, 24
measure(s)-of-value, 24
negotiation preparation, 31
origin, 22–23
primary objective, 21
purpose, 3
uncertainty vs. ambiguity, 24

Personal interest bias, 59
Primary objective, 21
Probability distributions, 55
Project documentation, 19–20

Rain uncertainty, 54
Range names, 68–69
Rant time, 17, 20
Risk analysis. See Generic risk analysis

s-curve. See Cumulative probability graph
Slide rules, 72, 127
Society of Decision Professionals (SDP), 138
Solved/rolled back tree, 37, 38, 54
Spreadsheet, 66–67
Strategic decisions, 34, 35

Tactical decisions, 34, 35
Tornado diagram/graph
Company C failure, 84, 85
limitation, 83
NPV, 85
Offer A
calculation results, 80–81
Excel tricks, 81–82
uncertainties, 80
upside and downside, 82, 83
Offer B graph, 82, 83
Offer C
Excel graph, 84–85
uncertainties and discrete variables, 84
oil and gas resources portfolio, 86

Uncertainty
vs. ambiguity, 24
deterministic model, 65
strategy tables, 42
Uncertainty assessment
continuous distributions variables, 60–61
continuous probability
continuous distribution, 52
expected values, 54
p10, p50, p90 assessment, 53
rain uncertainty, 54
roll back tree, 54
definition, 51
discrete probability, 51–52, 59–60
experts (see Experts assessment)
influence diagram, 59, 63
price, 62–63
ranges and distributions
definition, 54
density function, 55
probability distributions, 55
rainfall cumulative probability distribution, 55, 56
revenue, 63
Utility theory, 132

Value of control, 100, 103
Value of information
company C success/failure, 100, 101
expected value, 98
finding another job, 100, 102
job offer decision tree rolled back, 98–99
perfect and imperfect information, 100
VisiCalc, 66
Visual Basic for Applications (VBA), 66

Weighted average cost of capital (WACC), 73
Workbook, 67
Worksheet, 67
OTHER TITLES IN QUANTITATIVE APPROACHES TO
DECISION MAKING COLLECTION
Donald N. Stengel, California State University, Fresno, Editor

Business Decision-Making: Streamlining the Process for More Effective Results by Milan
Frankl
Operations Methods: Managing Waiting Line Applications, Second Edition by Kenneth A.
Shaw
Applied Regression and Modeling: A Computer Integrated Approach by Amar Sahay
Data Visualization: Recent Trends and Applications Using Conventional and Big Data,
Volume I by Amar Sahay
The Art of Computer Modeling for Business Analytics: Paradigms and Case Studies by
Gerald Feigin

Announcing the Business Expert Press Digital Library
Concise e-books business students need for classroom and research

This book can also be purchased in an e-book collection by your library as
a one-time purchase,
that is owned forever,
allows for simultaneous readers,
has no restrictions on printing, and
can be downloaded as PDFs from within the library community.

Our digital library collections are a great solution to beat the rising cost of textbooks. E-books can be
loaded into their course management systems or onto students’ e-book readers.
The Business Expert Press digital libraries are very affordable, with no obligation to buy in future
years. For more information, please visit www.businessexpertpress.com/librarians. To set up a trial
in the United States, please email sales@businessexpertpress.com.
