MS&E 252
Decision Analysis I
Problem Session 8
What concepts do we expect you to master for the Final Exam?
· The Foundations of DA
· PIBP, PISP
· Relevance
· The Five Rules
· U-Curves
· The Delta Property
· Sensitivity Analysis
· Information Gathering, Value of Information
· Decision Diagrams
· Probability Encoding
· The Rain Bet
· Medical DA
The Foundations of Decision Analysis
The Decision Analysis Cycle (figure): starting from the Initial Situation, the process moves through Formulation (Structure), Evaluation (Deterministic Analysis, Probabilistic Analysis), and Appraisal to a Decision, with iteration loops back to earlier phases.
How do we evaluate the quality of a Decision?
100% is where additional improvement efforts are not worth their cost.
Decision Quality (rated from 0% to 100%) is a chain of six links:
1. Appropriate Frame
2. Creative, Doable Alternatives
3. Meaningful, Reliable Information
4. Clear Values and Trade-offs
5. Logically Correct Reasoning
6. Commitment to Action
Adapted from: Matheson, The Smart Organization, 1998
The Six Elements of Decision Quality (figure): the Decision Basis — Alternatives, Information, and Preferences — sits within the Frame, is processed with Logic, and leads to Commitment to Action.
PIBP & PISP
We defined PIBP and PISP.

PISP — the least you would accept to part with something you own, ignoring market forces, such that you would be indifferent to whether you parted with it or not.
(Me + Shirt) ~ (Me + $20): "My PISP for the Shirt equals $20."

PIBP — the most you would pay to get something you do not own, ignoring market forces, such that you would be indifferent to whether you got it or not.
(Me) ~ (Me + Watch − $35): "My PIBP for the Watch equals $35."
What is your PISP for a car you own? (figure)
State 1, Before Car Sale: you own the car. State 2, After Car Sale: you hold the sale amount $ instead. Your PISP is the $ that makes the two states indifferent (~).

What is your PIBP for a new car? (figure)
State 1, Before Car Purchase: you hold your money $. State 2, After Car Purchase: you own the new car. Your PIBP is the $ that makes the two states indifferent (~).
We also discussed some issues related to these two concepts.
· Cycle of Ownership
· Wealth Effect
· How should past events affect our PISP and PIBP?
Sunk cost principle: ignore past events and non-recoverable losses of resources unless they affect your thoughts about the present or the future.
Relevance
We introduced the notion of relevance to talk about distinctions.
Probabilistically, A is relevant to B if {A | B, &} is not equal to {A | B', &}.
· In other words, if knowing B tells you something about the probability of A occurring, then A is relevant to B.
· Relevance does not imply causality.
· Relevance is a matter of information, not logic.
A and B could be relevant given &, and yet irrelevant given C and &.
But relevance can be difficult to spot in trees, so we introduced relevance diagrams.
Relevance diagrams can make irrelevance statements... but they cannot make any relevance statements!
You should get used to saying "the diagram shows that there is a possibility of relevance between A and B."
Instead of trees, we can use relevance diagrams.
Probability Tree: {A | &} followed by {B | A, &}. For example, A1 (0.626) leads to B1 (0.714) and B2 (0.286), while A2 (0.374) leads to B1 (0.552) and B2 (0.448).
Relevance Diagram: A → B. No arrows point into A; the arrow pointing from A into B conditions B on A. The arrow from A to B only implies possible relevance.
Relevance diagrams allow us to make strong statements of irrelevance between distinctions.
Probability Tree: A1 (0.626) leads to B1 (0.714) and B2 (0.286); A2 (0.374) leads to the same conditional probabilities, B1 (0.714) and B2 (0.286).
Relevance Diagram: A and B with no arrow between them. The absence of an arrow from A to B asserts irrelevance!
{AB | &} = {A | &}{B | &}
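This factorization can be checked numerically with the tree's numbers; the dictionary encoding below is my own sketch, not course notation.

```python
# Sketch: when the conditional probabilities of B are identical on both
# branches of A, the joint {AB | &} factors as {A | &}{B | &}.
p_A = {"A1": 0.626, "A2": 0.374}
p_B_given_A = {"A1": {"B1": 0.714, "B2": 0.286},
               "A2": {"B1": 0.714, "B2": 0.286}}  # same for A1 and A2

joint = {(a, b): p_A[a] * p_B_given_A[a][b]
         for a in p_A for b in ("B1", "B2")}
p_B = {b: sum(joint[(a, b)] for a in p_A) for b in ("B1", "B2")}

# Irrelevance: every joint probability equals the product of the marginals.
for (a, b), p_ab in joint.items():
    assert abs(p_ab - p_A[a] * p_B[b]) < 1e-12
print(round(p_B["B1"], 3))  # 0.714
```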
Just like we can flip trees, we can flip arrows in relevance diagrams.
A → B: {A | &} {B | A, &}
B → A: {A | B, &} {B | &}
Arrow flipping requires that the two distinctions be conditioned on the same state of information.
We can only flip arrows according to certain rules.
RULE #1: "Add arrows wherever you want, provided you don't create a cycle." Cycles are forbidden whether they involve two nodes or more (the figure shows a three-node cycle among A, B, and C, crossed out).
RULE #2: "You can flip an arrow between A and B if and only if A and B are conditioned on the same state of information."
In other words, any other node C which points to A also points to B, and any other node D which points to B also points to A.
Tip: draw a box around A and B.
RULE #3: "You cannot remove any arrows arbitrarily."
An example of recognizing irrelevance from diagrams.
· Are A and B irrelevant given &?
(figure: a five-node diagram with A, B, C, D, E, redrawn step by step as arrows are flipped)
So we can't conclude! There is a possibility of relevance.
We also learnt how to avoid the Associative Logic Error fallacy.
We say that people make an "Associative Logic Error" when they fall into the trap {A | B, &} = {B | A, &}.
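A tiny numeric example shows how far apart the two quantities can be; the numbers below are illustrative assumptions (in the style of a rare-condition test), not from this slide.

```python
# Sketch: {A | B, &} and {B | A, &} can be wildly different.
# Illustrative numbers: A = "has the condition", B = "tests positive".
p_A = 0.0025           # {A | &}
p_B_given_A = 0.99     # {B | A, &}
p_B_given_notA = 0.01  # {B | A', &}

# Total probability, then Bayes' rule:
p_B = p_A * p_B_given_A + (1 - p_A) * p_B_given_notA
p_A_given_B = p_A * p_B_given_A / p_B

print(round(p_B_given_A, 2))  # 0.99
print(round(p_A_given_B, 2))  # 0.2 -- nowhere near 0.99
```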
The Five Rules of Actional Thought
Do you remember the meaning of the Five Rules of Actional Thought?
· Probability rule
· Order rule
· Equivalence rule
· Substitution rule
· Choice rule
Mnemonic: POE'S Choice
The rules prevent us from becoming a "money pump," or someone who can be rationally convinced to give away all of his or her assets. The rules also dictate how we think about decisions.
U-Curves & the Delta Property
U-curves helped us capture people's preferences in the face of uncertainty. They are a good means for us to assess the values people place on uncertain deals.
We learnt how to use u-curves and u-values to roll back a decision tree.
· Uncertainties: take the e-value of the u-values. For the deal with prospects S (0.4, u-value 0.95) and R (0.6, u-value 0.32): 0.4 × 0.95 + 0.6 × 0.32 = 0.57.
· Decisions: pick the best u-value. Over alternatives O (0.40), P (0.57), and I (0.63): max(0.40, 0.57, 0.63) = 0.63.
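The rollback procedure can be sketched in a few lines; the dict-based tree encoding and node labels are my own assumptions, with the u-values taken from the slide.

```python
# A minimal decision-tree rollback on u-values.
def rollback(node):
    """Return the u-value of a decision-tree node."""
    if node["kind"] == "prospect":
        return node["u"]
    values = [rollback(child) for child in node["children"]]
    if node["kind"] == "uncertainty":
        # uncertainties: take the e-value of the u-values
        return sum(p * v for p, v in zip(node["probs"], values))
    # decisions: pick the best u-value
    return max(values)

# The S/R uncertainty: 0.4 * 0.95 + 0.6 * 0.32 = 0.572 (the slide's 0.57)
weather = {"kind": "uncertainty", "probs": [0.4, 0.6],
           "children": [{"kind": "prospect", "u": 0.95},   # S
                        {"kind": "prospect", "u": 0.32}]}  # R

# The decision over alternatives O, P (the S/R deal), and I
tree = {"kind": "decision",
        "children": [{"kind": "prospect", "u": 0.40},      # O
                     weather,                              # P
                     {"kind": "prospect", "u": 0.63}]}     # I
print(round(rollback(weather), 2))  # 0.57
print(rollback(tree))               # 0.63: choose alternative I
```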
What is the delta property?
Q: You own a deal with prospects x1, x2, x3 and probabilities p1, p2, p3. We will add an amount ∆ to all possible outcomes, so the deal now pays x1 + ∆, x2 + ∆, x3 + ∆ with the same probabilities. What happens to your PISP?
A: If you are a ∆-person, then your PISP should go up by ∆!
If you choose to follow this property, some other nice properties will follow.
· For a delta-person, PIBP of a deal = PISP of that deal.
· Much easier to calculate the value of clairvoyance: VOC = VFC − VNC.
· We can characterize your u-curve by asking a single question.
· Exponential u-curve.
· CE does not depend on initial wealth or on other existing deals.
What types of u-curves satisfy the delta property?
Only two u-functions satisfy the delta property:
· Exponential: u(x) = a + b·r^(−x) = a + b·e^(−x/ρ)
· Straight line: u(x) = a + bx
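A small check that the exponential form really satisfies the delta property; ρ, a, b, and the deal below are illustrative assumptions.

```python
import math

# Sketch: exponential u-curve u(x) = a + b * e^(-x/rho); the delta property
# says shifting every prospect by delta shifts the CE by exactly delta.
rho = 1000.0   # assumed risk tolerance, in $
a, b = 1.0, -1.0

def u(x):
    return a + b * math.exp(-x / rho)

def u_inverse(y):
    return -rho * math.log((y - a) / b)

def certain_equivalent(prospects, probs):
    """CE = u-inverse of the e-value of the u-values."""
    return u_inverse(sum(p * u(x) for x, p in zip(prospects, probs)))

deal = [-100.0, 0.0, 400.0]
probs = [0.25, 0.25, 0.50]
delta = 250.0

ce = certain_equivalent(deal, probs)
ce_shifted = certain_equivalent([x + delta for x in deal], probs)
assert abs(ce_shifted - (ce + delta)) < 1e-9  # CE moved up by exactly delta
```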
A delta person's u-curve can be characterized by a single number.
Definition: r^x = p / (1 − p), where p is the probability that makes you indifferent between 0 and the deal paying x with probability p and −x with probability 1 − p.
Risk Attitude Relations:
· r = 1: Risk Neutral
· r > 1: Risk Averse
· r < 1: Risk Seeking
Property: r_m = r_n^(m/n)
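This property can be verified numerically for an exponential u-curve; the risk tolerance ρ and the amounts m, n are assumed illustrative values.

```python
import math

# Sketch: for a delta person with u(x) = -e^(-x/rho), compute the odds
# p/(1-p) that make you indifferent between 0 and {p: +x, 1-p: -x}.
rho = 500.0  # assumed risk tolerance, in $

def u(x):
    return -math.exp(-x / rho)

def risk_odds(x):
    # Solve p*u(x) + (1-p)*u(-x) = u(0) for p, then return p/(1-p).
    p = (u(0) - u(-x)) / (u(x) - u(-x))
    return p / (1 - p)

m, n = 30.0, 70.0
r_m, r_n = risk_odds(m), risk_odds(n)
assert abs(r_m - r_n ** (m / n)) < 1e-9     # property: r_m = r_n^(m/n)
assert abs(r_m - math.exp(m / rho)) < 1e-9  # here r_x = r^x with r = e^(1/rho)
```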
Sensitivity Analysis
Performing Sensitivity Analysis helps us derive some critical insights into a decision situation: "How does this change when you change that?"
· How does the best decision change as the probability of sun changes?
· How does the certain equivalent change as the probability of sun changes?
· How does the value of clairvoyance change as the probability of sun changes?
(figure: a value axis from $0 to $150 against p from 0 to 1)
Here is a typical example of what Sensitivity Analysis allows you to do: find the CE for each alternative as a function of p.
· Deal A: Red Ball (p) pays $750, White Ball (1 − p) pays $400. CE = p·750 + (1 − p)·400.
· Deal B: Red Ball (p) pays $1,000, White Ball (1 − p) pays $0. CE = p·1000 + (1 − p)·0.
· Deal C: pays $500 for sure. CE = 500.
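The three CE lines can be compared programmatically to locate the breakpoints; setting the CEs equal gives A = C at p = 2/7 and A = B at p = 8/13, which a simple scan confirms (a sketch in my own encoding).

```python
from fractions import Fraction

# Sketch: CE of each alternative as a function of p (risk-neutral e-values).
ce = {
    "A": lambda p: p * 750 + (1 - p) * 400,
    "B": lambda p: p * 1000,
    "C": lambda p: Fraction(500),
}

def best(p):
    """Name of the alternative with the highest CE at this p."""
    return max(ce, key=lambda name: ce[name](p))

# Scan p in steps of 1/1000 to see where the best alternative switches.
switches = []
previous = best(Fraction(0))
for i in range(1, 1001):
    p = Fraction(i, 1000)
    current = best(p)
    if current != previous:
        switches.append((p, previous, current))
        previous = current
print(switches)  # C -> A just past p = 2/7, A -> B just past p = 8/13
assert [s[1:] for s in switches] == [("C", "A"), ("A", "B")]
```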
This is confirmed by our spreadsheet, and also by plotting the sensitivity graph.
(figure: Certain Equivalent, $0 to $1,200, vs. probability of Red Ball, 0.00 to 1.00, showing the lines for Deal A, Deal B, and Deal C, the Best envelope, the value of the deal with Free Clairvoyance, and region labels marking which deal to go for in each range of p)
Here is a graph showing the Value of Clairvoyance as a function of p. We find the same breakpoints as in the previous graph.
(figure: VoC as a Certain Equivalent, $0 to $250, vs. probability of Red Ball, 0.00 to 1.00)
Information Gathering
Information gathering involves using a test to illuminate an unobservable distinction. The Uncertainty is "what we really want to know"; the Test is "what we can observe." In the assessed form, the arrow runs from Uncertainty to "Test"; flipping it gives the inferential form, from "Test" to Uncertainty.
The "assessed" tree determines the joint probabilities from the prior and the likelihood.
Assessed Form: Virus → "Test"
· Prior: {Virus} = 1/400 = 0.0025, {No Virus} = 399/400 = 0.9975.
· Likelihood: given Virus, {"Positive"} = 0.99 and {"Negative"} = 0.01; given No Virus, {"Negative"} = 0.99 and {"Positive"} = 0.01.
· Joint: True Pos = 0.0025 × 0.99 = 0.002475, False Neg = 0.000025, False Pos = 0.009975, True Neg = 0.987525.
{"Pos" | Pos} = Test Sensitivity; {"Neg" | Neg} = Test Specificity.
The "inferential" tree determines the posterior and pre-posterior probabilities from the joint. Let's flip this tree!
Inferential Form: "Test" → Virus
· Pre-posterior: {"Positive"} = 0.002475 + 0.009975 = 0.01245; {"Negative"} = 0.000025 + 0.987525 = 0.98755.
· Posterior given "Positive": {Virus} = 0.002475 / 0.01245 ≈ 0.20, {No Virus} ≈ 0.80.
· Posterior given "Negative": {Virus} = 0.000025 / 0.98755 ≈ 0.000025, {No Virus} ≈ 0.999975.
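The flip can be sketched directly from the assessed numbers; the dictionary layout is my own, with the probabilities from the slides.

```python
# Sketch: flipping the assessed tree (prior x likelihood) into the
# inferential form (pre-posterior x posterior), with the slide's numbers.
results = ("Positive", "Negative")
prior = {"Virus": 1 / 400, "No Virus": 399 / 400}
likelihood = {"Virus": {"Positive": 0.99, "Negative": 0.01},
              "No Virus": {"Positive": 0.01, "Negative": 0.99}}

# Joint probabilities from the assessed form
joint = {(s, t): prior[s] * likelihood[s][t] for s in prior for t in results}

# Pre-posterior: marginal probability of each test result
preposterior = {t: sum(joint[(s, t)] for s in prior) for t in results}

# Posterior {state | "result"} by Bayes' rule
posterior = {t: {s: joint[(s, t)] / preposterior[t] for s in prior}
             for t in results}

print(round(preposterior["Positive"], 6))        # 0.01245
print(round(posterior["Positive"]["Virus"], 2))  # 0.2
```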
When is it worthwhile to obtain information from an imperfect test?
· The test must be observable.
· The test must be relevant to the distinction of interest.
· The test must be material to the decision. That is, the alternative with the highest CE may change based on the results of the test. Even one potential change is enough to make a test material.
· The test must be economic (create more value than it costs to perform).
Need all four, or else the test is a no-go!
Decision Diagrams
Why are decision diagrams important?
· They facilitate communication.
· They grow linearly, unlike trees, which grow exponentially.
· No degrees and no probabilities yet! Therefore no clarity test is required, which enables high-level abstraction.
Decision diagrams contain four kinds of nodes: the Decision Node, the Uncertainty Node, the Deterministic Node (e.g., F(x, y) computed from inputs x and y), and the Value Node.
Four kinds of arrows indicate conditioning.
· Relevance Arrow: A → B. "B is conditioned on A, and may be relevant to A."
· Informational Arrow: C → D2 and D1 → D2. "C and D1 are known when D2 is made."
· Functional Arrow: D → F and E → F. "F is a deterministic function of D and E."
· Influence Arrow: D → C. "My choice in decision D may affect the distribution of uncertainty C."
Probability Encoding
Why do we need probability encoding?
· Probability encoding is a process that will help you elicit the numbers you need for your analysis.
· Therefore, your analysis will only be as good as your probability encoding was.
· But probability encoding can be time-consuming and is full of traps.
What did you learn from the 20 Questions?
(figure: assessed cumulative probability curves, 0% to 100%, for several of the 20 Questions)
"You think you know more than you do." People generally tend to have distributions that are too narrow.
(figure: a histogram of assessed answers illustrating overly narrow distributions)
Technically speaking, we call this a "Central Bias." To avoid it, you can use backcasting. See the number of surprises in the 20 Questions.
"You know more than you think you do." There are many distinctions that can help you think about a distinction of interest. To assess US dog food production, for example, you can think about the US population, the average number of dogs per home, how much food a dog eats, dog food thrown away, consumption of dog food by other pets, etc.
Rain Bets
What do we call the Rain Bet?
· Suppose that you can find two people who have different beliefs about some uncertainty.
· Then you can set up a deal in which they will lose or gain money from the other person depending on the outcome of this uncertainty, and this deal will also look appealing to both of them.
· You can even make money out of them!
We set up a bet that both Dolf and Sung will find appealing.
· Dolf: {R | &_Dolf} = 0.9. He receives $5.10 if Rain and pays $4.90 if No Rain, so his e-value is 0.9 × 5.10 − 0.1 × 4.90 = $4.10.
· Sung: {R | &_Sung} = 0.08. He pays $5.10 if Rain and receives $4.90 if No Rain, so his e-value is 0.92 × 4.90 − 0.08 × 5.10 = $4.10.
Both could actually be willing to pay up to $4.10 to get the deal!
Let us reveal what was hidden behind this bet.
· Dolf: {R | &_Dolf} = p. He receives $m if Rain and pays $(K − m) if No Rain, for an e-value of mp − (1 − p)(K − m).
· Sung: {R | &_Sung} = q < p. He pays $m if Rain and receives $(K − m) if No Rain, for an e-value of −qm + (1 − q)(K − m).
Setting the two e-values equal gives m = K (2 − p − q) / 2.
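A quick sketch confirms that this choice of m gives both players the same positive e-value; with K = $10 it reproduces the Dolf/Sung numbers.

```python
# Sketch: the generalized rain bet. One player believes {Rain} = p, the
# other {Rain} = q < p; the total stake K is split so the first receives
# $m on Rain and pays $(K - m) on No Rain, the other taking the opposite side.
def rain_bet(p, q, K):
    m = K * (2 - p - q) / 2
    ev_p_player = m * p - (1 - p) * (K - m)
    ev_q_player = -q * m + (1 - q) * (K - m)
    return m, ev_p_player, ev_q_player

m, ev_dolf, ev_sung = rain_bet(p=0.9, q=0.08, K=10.0)
print(round(m, 2))        # 5.1  (Dolf's payoff on Rain, as in the slide)
print(round(ev_dolf, 2))  # 4.1  (both e-values equal $4.10)
assert abs(ev_dolf - ev_sung) < 1e-9
assert ev_dolf > 0  # appealing to both players whenever q < p
```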
Medical Decision Analysis
Why did we evaluate micro-probabilities?
· A micromort is a chance of one in a million of dying.
· This unit is usually more appropriate when we cope with life-and-death issues.
· We are interested in evaluating the dollar value per micromort ($/µmt).
We calculated your micromort value by setting up two deals.
· Deal A: 24-hour pain, then cure.
· Deal B: pay a certain amount X, then cure.
· Deal C: Death with probability p, Cure with probability 1 − p.
Adjust X and p until you are indifferent, then form the ratio (X/p)/1,000,000: it is your $-value per micromort.
Why is the value per micromort useful? There is a range of small risks over which these two curves are linear.
(figures: "Payment to Avoid Death Risk" and "Payment to Accept Death Risk," each plotting Payment ($), $1 to $100,000,000 on a log scale, against Micromorts (1-in-a-million chances of death), 0 to 1,000,000)
Here is a simple example showing how you would use the constant range.
· Leo is your typical American; as such, he considers he faces 270 µmt a year from auto accidents.
· His value per micromort in the constant range is about $20.
· How much should he be willing to pay for a full "auto death prevention" policy?
Answer: 20 × 270 = $5,400 a year!
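Leo's numbers, plugged into the micromort formula; the indifference values X and p below are illustrative assumptions, chosen to give the $20/µmt from the slide.

```python
# Sketch: $-value per micromort from an indifference payment X and death
# probability p, then Leo's annual "auto death prevention" policy value.
X = 500.0    # assumed: pay $500 to avoid the risk below
p = 25e-6    # assumed: a 25-in-a-million chance of death (25 micromorts)
value_per_micromort = (X / p) / 1_000_000
print(round(value_per_micromort, 2))  # 20.0 ($/micromort)

annual_micromorts = 270  # a typical American's yearly auto-accident risk
print(round(value_per_micromort * annual_micromorts, 2))  # 5400.0 a year
```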
Summary
Core Decision Analysis Concept Map (figure), linking: Clairvoyant, Clarity, Distinction, Possibility, Probability, Measure, Distribution, Relevance, Relevance Diagram, e-Value, Prospect, Preference Probability, Deal, u-Curve, u-Value, Value Measure, Wizard, Probabilistic Dominance, Certain Equivalent, Delta Property, Value of Clairvoyance, Risk Averse, Decision Diagram, Exponential, Best Decision, Decision, Resource.