
MGMT 3130 Judgment and Decision

Making in Organizations

UNIT 1.3
DECISION MAKING UNDER
UNCERTAINTY

Where are we now?

Decision Analysis:
• Problem definition
• Decisions involving multiple objectives
• Decision-making under uncertainty
• Collecting more information

Heuristics and Biases:
• Two systems of thinking
• Availability heuristic
• Representativeness heuristic
• Anchoring and adjustment
• Embodied cognition
• Bounded awareness

Choice and Preference:
• Prospect theory
• Escalation of commitment
• Time preferences
• Is more choice always better?

Applications:
• Performance appraisal
• Motivating employees by money? Or…?
• Why people become bad apples?
• Going green
UNIT 1.3
DECISION MAKING UNDER UNCERTAINTY

In many decisions, the consequences of the alternatives cannot
be predicted with certainty. What can we do?
• Maximax choice and Maximin choice
• Decision trees with expected monetary value (EMV) criterion
• Decision trees with expected utility (EU) criterion
When decisions involve multiple objectives and uncertainty
• Putting SMART and Decision Trees together
Gathering information to resolve uncertainty
• Revising probability estimation in light of new information
• Value of information

Goodwin, P. & Wright, G. (2009). Decision Analysis for Management Judgment, 4th Edition. UK: John Wiley &
Sons. Chapter 6 (pp. 115-129) and Chapter 9.
Think about it
Each box contains 10 cards.
A dollar amount is printed on each card.
You are asked to choose a box to draw a card and win the
amount of $$ stated on the card.

Box 1: some “$1000”, some “$500”, some “$0”
Box 2: some “$600”, some “$550”, some “$350”

Which box would you prefer? Why?
MAXIMAX CHOICE and MAXIMIN CHOICE

When the decision maker is unable, or unwilling, to estimate probabilities
of different outcomes, s/he might use the following strategies:
• Maximax choice: Maximizing the maximum possible payoff.
• Maximin choice: Maximizing the minimum possible payoff.

Box 1: some “$1000”, some “$500”, some “$0”
Box 2: some “$600”, some “$550”, some “$350”

E.g. In the case above, decision makers will choose
• Box ___ if a maximax choice strategy is used.
• Box ___ if a maximin choice strategy is used.
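The two strategies are easy to state in code. A minimal Python sketch, assuming the box contents from the slide (the function names and the dictionary encoding are my own, not from the text):

```python
# Minimal sketch of maximax and maximin choice. The card counts do not
# matter here: these strategies look only at best/worst possible payoffs.
boxes = {
    "Box 1": [1000, 500, 0],
    "Box 2": [600, 550, 350],
}

def maximax(options):
    """Pick the option whose best possible payoff is highest."""
    return max(options, key=lambda name: max(options[name]))

def maximin(options):
    """Pick the option whose worst possible payoff is highest."""
    return max(options, key=lambda name: min(options[name]))

print(maximax(boxes))  # Box 1 (its $1000 is the largest maximum)
print(maximin(boxes))  # Box 2 (its $350 is the largest minimum)
```

Note that the two strategies can disagree, as they do here: maximax is optimistic, maximin is pessimistic.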
Think about it
Each box contains 10 cards.
A dollar amount is printed on each card.
You are asked to choose a box, and draw a card 1000 times
with replacement. You will gain the average amount of $$ stated
on the 1000 cards.

Box 1: 1 card “$1000”, 8 cards “$500”, 1 card “$0”
Box 2: 3 cards “$600”, 3 cards “$550”, 4 cards “$350”

Which box would you prefer? Why?
SOLVING DECISION TREES
USING THE EXPECTED MONETARY VALUE (EMV) CRITERION
• What is expected monetary value?
• Suppose you play the following game: you flip a coin.
If the coin lands heads, you win $100.
If the coin lands tails, you win nothing.
• Let X be the consequence of one flip.
• The probability of X taking on a particular value xi is p(xi):
p(x1 = $100) = 50%; p(x2 = $0) = 50%
• The expected monetary value is the probability-weighted sum of the
consequences:
EMV(X) = Σi xi * p(xi)
• The expected monetary value of flipping the coin is $50.
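The formula can be checked in a few lines of Python. This is an illustrative sketch; the `emv` helper is my own, not from the text:

```python
# EMV as the probability-weighted sum of monetary outcomes.
def emv(outcomes):
    """outcomes: list of (payoff, probability) pairs whose probabilities sum to 1."""
    return sum(x * p for x, p in outcomes)

coin_flip = [(100, 0.5), (0, 0.5)]  # heads: $100, tails: $0
print(emv(coin_flip))  # 50.0
```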
• Note that EMV is not the most likely value of a single coin flip.
Outcomes are either $100 (if the coin lands heads) or $0 (tails).
$50 is not a possible outcome from a single flip.
• The expected value of an uncertain prospect is the long run average
value of that prospect.
• Suppose you flipped the coin 1000 times. What would you expect the
coin flip outcomes to be? 500 heads and 500 tails.
Your expected total winnings would be:
500 heads * $100/head + 500 tails * $0/tail = $50,000
Your expected average winnings for each flip would be:
$50,000 total / 1000 flips = $50/flip
The expected value of each coin toss is $50.

EMV: Business deal example
• Suppose you are considering a business deal that requires a $60M
investment, and the return will depend on economic conditions.

Potential economic conditions, probabilities, and values:
• Economy goes up (20%): Large return on investment, $160M - $60M = $100M; value * probability = $20M
• Economy stays the same (60%): Small return on investment, $70M - $60M = $10M; value * probability = $6M
• Economy goes down (20%): No return on investment, $0 - $60M = -$60M; value * probability = -$12M
SUM: $14M

EMV(deal) = $14M
Decision Tree
• It is a way to represent graphically decisions that involve uncertainty.
• What are the possible alternatives?
• What are the possible consequences of each alternative?
• What are the chances of occurrence for each consequence?

The business deal example:
• Decision node: its branches represent the available choices.
• Chance node: its branches represent possible outcomes of a chance event.

• Invest (leads to a chance node):
  • Economy goes up (20%) → Large return on investment: $100M
  • Economy stays the same (60%) → Small return on investment: $10M
  • Economy goes down (20%) → No return on investment: -$60M
• Do not invest → No change: $0
Solving a decision tree (using the EMV criterion)
• Start off at the endpoints on the right and move left.
• When you encounter a chance node, calculate the expected monetary
value.
• When you encounter a decision node, choose the branch with the
highest expected monetary value.

The business deal example:
• Invest → chance node: EMV = 20% * $100M + 60% * $10M + 20% * (-$60M) = $14M
• Do not invest → No change: $0
• At the decision node, choose “Invest” ($14M > $0).
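The roll-back procedure above can be written as a short recursive function. A sketch, assuming my own tuple encoding of a tree (the encoding and names are illustrative, not from the text):

```python
# Backward induction ("roll back") on a decision tree with the EMV criterion.
def solve(node):
    """Return the EMV of a tree node.
    A node is a number (terminal payoff), a ("chance", [(prob, child), ...])
    tuple, or a ("decision", [child, ...]) tuple."""
    if isinstance(node, (int, float)):
        return node
    kind, branches = node
    if kind == "chance":
        # Expected value over the possible outcomes.
        return sum(p * solve(child) for p, child in branches)
    # Decision node: take the best available branch.
    return max(solve(child) for child in branches)

# The business deal, in $M.
invest = ("chance", [(0.2, 100), (0.6, 10), (0.2, -60)])
deal = ("decision", [invest, 0])  # invest vs. do not invest

print(round(solve(invest), 2))  # 14.0 (EMV of investing)
print(round(solve(deal), 2))    # 14.0 (investing beats $0)
```

The same function solves any tree built from these node types, including the R&D exercise on the next slide.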
In-class exercise: R&D decision
• You can stop development, and gain/lose nothing ($0).
• Or you can continue development, and:
• If you do not receive the patent (30% chance), you lose your
investment (-$2B).
• If you do receive the patent (70% chance), you can either develop
and market a product based on the technology yourself, or you can
license the technology to another firm.
• If you license the technology, you receive ($23B).
• If you develop the product yourself, your success depends upon
the demand for the product.
• If demand is high (25%), you make $43B.
• If demand is medium (55%), you make $21B.
• If demand is low (20%), you make $3B.
• Draw the decision tree and then solve it.
In-class exercise: R&D decision
[Your answer here]
Think about it
Each box contains 10 cards.
A dollar amount is printed on each card.
You are asked to choose a box, and draw a card ONCE.
You will gain the amount of $$ stated on the card.

Box 1: 1 card “$1000”, 8 cards “$500”, 1 card “$0”
Box 2: 3 cards “$600”, 3 cards “$550”, 4 cards “$350”

Which box would you prefer? Did you follow the EMV criterion?
Not the EMV criterion?
Consider the following two games:

Game 1: Flip a coin. If it lands heads, win $30. If it lands tails, lose $1.
Game 2: Flip a coin. If it lands heads, win $2000. If it lands tails, lose $1900.

Which game would you prefer?

EMV(Game 1) = 50% * $30 + 50% * (-$1) = $14.5
EMV(Game 2) = 50% * $2000 + 50% * (-$1900) = $50

Did you follow the EMV criterion?
The expected utility (EU) criterion

Sometimes, people don’t choose the option that maximizes EMV. Why?
• The EMV criterion is most suitable when the decision can be repeated a
large number of times, so that a long-run average result would be of
relevance.
• In some situations, however, decision makers have only one
opportunity to choose the best course of action. If things go wrong,
there will be no chance of recovering losses. In such circumstances,
some people might prefer a less risky course of action.
• Thus, we need to consider expected utility:
EU(X) = U(x1)*p(x1) + U(x2)*p(x2) + … + U(xn)*p(xn), where the xi are the
possible outcomes of the uncertain event X
• EU takes into account our risk attitudes, which are reflected by our
utility functions.
UTILITY FUNCTIONS

• A utility function relates the objective monetary value of money to our
subjective value of money.
• To derive our utility functions, we first have to understand the concept of
certainty equivalent.
• Certainty Equivalent (CE): the sure amount of money that, to you, is
exactly as desirable as taking the gamble on a chance event.
• E.g. Consider a gamble:
“50% chance to get $1000, 50% chance to get $0.”
What is the maximum amount of money ($x) you will pay to engage in
this gamble?
[i.e. What is the $x such that U($x) = EU(gamble)?]
Defining our utility functions using the method of bisection

• I want to define my utility function over $0 to $1000.
• First thing to do:
• Set the utility of the lowest point in the range to 0, i.e. U($0) = 0.
• Set the utility of the highest point in the range to 1, i.e. U($1000) = 1.

(Chart: utility against money over $0 to $1,000, showing the two anchor
points U($0) = 0 and U($1000) = 1.)
• Then, consider a gamble:
“50% chance to get $1000, 50% chance to get $0.”
What is my certainty equivalent of the gamble?
• In other words, what is $x1 such that:
U($x1) = EU(gamble) = ½ U($1000) + ½ U($0) = ½ *1 + ½ *0 = 0.5
• For me, $x1 is $250.

(Chart: the point U($250) = 0.5 is added between U($0) = 0 and U($1000) = 1.)
• Then, consider a gamble:
“50% chance to get $1000, 50% chance to get $250.”
What is my certainty equivalent of the gamble?
• In other words, what is $x2 such that:
U($x2) = EU(gamble) = ½ U($1000) + ½ U($250) = ½ *1 + ½ *0.5 = 0.75
• For me, $x2 is $600.

(Chart: the point U($600) = 0.75 is added to the curve.)
• Then, consider a gamble:
“50% chance to get $250, 50% chance to get $0.”
What is my certainty equivalent of the gamble?
• In other words, what is $x3 such that:
U($x3) = EU(gamble) = ½ U($250) + ½ U($0) = ½ *0.5 + ½ *0 = 0.25
• For me, $x3 is $70.

(Chart: the point U($70) = 0.25 is added to the curve.)
• Then, consider a gamble:
“50% chance to get $250, 50% chance to get $70.”
What is your certainty equivalent of the gamble?
• In other words, what is $x4 such that:
U($x4) = EU(gamble) = ½ U($250) + ½ U($70) = ½ *0.5 + ½ *0.25 = 0.375
• For me, $x4 is $125.
• Then, join the points together, and this is MY utility function.

(Chart: the curve through U($0) = 0, U($70) = 0.25, U($125) = 0.375,
U($250) = 0.5, U($600) = 0.75, and U($1000) = 1.)
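“Joining the points” amounts to linear interpolation between adjacent elicited points. A sketch in Python, using the example points above (the `utility` helper is my own, not from the text):

```python
# Piecewise-linear utility function built from the bisection points
# elicited on the preceding slides: (money, utility) pairs.
points = [(0, 0.0), (70, 0.25), (125, 0.375), (250, 0.5), (600, 0.75), (1000, 1.0)]

def utility(x):
    """Interpolate linearly between the two elicited points bracketing x."""
    for (x0, u0), (x1, u1) in zip(points, points[1:]):
        if x0 <= x <= x1:
            return u0 + (u1 - u0) * (x - x0) / (x1 - x0)
    raise ValueError("x is outside the elicited range $0-$1000")

print(utility(250))  # 0.5   (an elicited point)
print(utility(425))  # 0.625 (midway between $250 and $600)
```

Any amount in the range can then be assigned a utility, which is what the EU criterion on the following slides needs.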
Define YOUR utility function over the range $0 - $1000

For each 50/50 gamble, record your CE; its utility U(CE) is given:
• $0 / $1000: CE = $x1 = ___ , U(CE) = 0.5
• $x1 / $1000: CE = $x2 = ___ , U(CE) = 0.75
• $0 / $x1: CE = $x3 = ___ , U(CE) = 0.25
• $0 / $x3: CE = $x4 = ___ , U(CE) = 0.125
• $x3 / $x1: CE = $x5 = ___ , U(CE) = 0.375
• $x1 / $x2: CE = $x6 = ___ , U(CE) = 0.625
• $x2 / $1000: CE = $x7 = ___ , U(CE) = 0.875

(Blank chart: utility 0 to 1 against money $0 to $1,000, for plotting your
own points.)
Utility Functions and Risk Attitudes

Our utility functions can be of different shapes: concave, linear, or convex.
The shape of a utility function reflects our risk attitude.

Let’s consider how people with different shapes of utility function
would react to the following gamble:
“Your current wealth is $w; the gamble offers you a 50% chance to win $1000,
and a 50% chance to lose $1000.”
EMV(gamble) = 50%*(w-$1000) + 50%*(w+$1000) = $w = EMV(doing nothing)
EU(gamble) = 50%* U(w-$1000) + 50%* U(w+$1000)
Will EU(gamble) be larger than EU(doing nothing)?
Concave utility function implies risk aversion

EU(gamble) = 50%*U(w-1000) + 50%*U(w+1000)
EU(gamble) < U(w)
EU(gamble) < EU(doing nothing)

Unwillingness to accept a “fair” gamble indicates that you are risk averse.
Risk aversion is implied by the concavity of the utility function.

(Chart: a concave utility curve over wealth; EU(gamble) lies below U(w)
on the chord between U(w-1000) and U(w+1000).)
Linear utility function implies risk neutrality

EU(gamble) = 50%*U(w-1000) + 50%*U(w+1000)
EU(gamble) = U(w)
EU(gamble) = EU(doing nothing)

Indifference to accepting a “fair” gamble indicates that you are risk neutral.
Risk neutrality is implied by a linear utility function.

(Chart: a linear utility function over wealth; EU(gamble) coincides with U(w).)
Convex utility function implies risk seeking

EU(gamble) = 50%*U(w-1000) + 50%*U(w+1000)
EU(gamble) > U(w)
EU(gamble) > EU(doing nothing)

Willingness to accept a “fair” gamble indicates that you are risk seeking.
Risk seeking is implied by the convexity of the utility function.

(Chart: a convex utility curve over wealth; EU(gamble) lies above U(w)
on the chord between U(w-1000) and U(w+1000).)
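The three cases can be verified numerically. A sketch, using square root, identity, and squaring as illustrative concave, linear, and convex utility functions; these particular functions are my choices, not from the slides:

```python
import math

w = 5000  # current wealth; must exceed the $1000 stake

def eu(u):
    """EU of the fair gamble: 50% lose $1000, 50% win $1000."""
    return 0.5 * u(w - 1000) + 0.5 * u(w + 1000)

concave = math.sqrt          # an example concave utility function
linear = lambda x: x         # a linear utility function
convex = lambda x: x ** 2    # an example convex utility function

print(eu(concave) < concave(w))  # True: risk averse, rejects the fair gamble
print(eu(linear) == linear(w))   # True: risk neutral, indifferent
print(eu(convex) > convex(w))    # True: risk seeking, accepts the gamble
```

By Jensen's inequality this ordering holds for any concave/linear/convex utility function, not just these three.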
PRS
You have $w in your pocket. Consider a gamble (X):
50% chance to win $100, 50% chance of winning nothing.
If you are risk averse, what can you say about your certainty equivalent (CE)
for the gamble X relative to its expected monetary value (EMV)?
1. CE(X) < EMV(X)
2. CE(X) = EMV(X)
3. CE(X) > EMV(X)
4. Oh! I don’t know!

Hints:
• Draw a utility function that reflects “risk averse”.
• On the x-axis, identify 3 points: w, w+100, and EMV(gamble).
• On the y-axis, identify EU(gamble).
• On the x-axis, identify CE(gamble).
• What can you say about the EMV and CE?
Solving the PRS question:

SOLVING DECISION TREES
USING THE EXPECTED UTILITY (EU) CRITERION

• Earlier, we talked about using decision trees to solve problems of
maximizing EMV.
• You can also use decision trees in the same way to solve problems of
maximizing EU:
• For each consequence, define its utility.
• When you encounter a chance node, calculate the EU (instead of EMV).
• When you encounter a decision node, choose the option with the
highest EU (instead of EMV).
Solving decision tree for EU: The R&D decision example

• Stop development → U($0B)
• Continue development:
  • No patent awarded (30%) → U(-$2B)
  • Patent awarded (70%):
    • Develop product:
      • High demand (25%) → U($43B)
      • Medium demand (55%) → U($21B)
      • Low demand (20%) → U($3B)
    • License technology → U($23B)
Step 1: For each consequence, define its utility.
The utility function of the decision maker is given as the set of points below:

(-$2B, 0), ($0B, 0.15), ($3B, 0.25), ($21B, 0.78), ($23B, 0.80), ($43B, 1.0)
Steps 2 & 3: When you encounter a chance node, calculate the EU.
When you encounter a decision node, choose the branch with the highest EU.

• Stop development → U($0B) = 0.15
• Continue development (EU = 0.56):
  • No patent awarded (30%) → U(-$2B) = 0
  • Patent awarded (70%), choose the better of:
    • Develop product (EU = 0.729):
      • High demand (25%) → U($43B) = 1.0
      • Medium demand (55%) → U($21B) = 0.78
      • Low demand (20%) → U($3B) = 0.25
    • License technology → U($23B) = 0.8
• At the first decision node, continue development (0.56 > 0.15).
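The roll-back above can be reproduced in a few lines. A sketch using the utilities read off the decision maker's utility function; the variable names are my own:

```python
# Utilities of the R&D consequences, in $B, from the previous slide.
U = {-2: 0.0, 0: 0.15, 3: 0.25, 21: 0.78, 23: 0.80, 43: 1.0}

# Chance node: develop the product yourself.
eu_develop = 0.25 * U[43] + 0.55 * U[21] + 0.20 * U[3]
print(round(eu_develop, 3))  # 0.729

# Decision node after the patent: develop yourself vs. license.
eu_patent = max(eu_develop, U[23])  # licensing (0.8) wins

# Chance node: continue development (patent awarded or not).
eu_continue = 0.30 * U[-2] + 0.70 * eu_patent
print(round(eu_continue, 2))  # 0.56

# First decision node: continue vs. stop.
best = "continue" if eu_continue > U[0] else "stop"
print(best)  # continue (0.56 > 0.15)
```

Note that under EU the firm licenses rather than develops, even though developing has the higher EMV; this is the risk aversion encoded in the utility function.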
Maximizing EMV or EU?
• Scholars argue that the EMV criterion is also appropriate in one-off
decisions.
• Although a single decision may be unique, over time a decision-maker
may make a large number of decisions involving similar monetary
sums, so that returns should still be maximized by consistent
application of the EMV criterion.
• Moreover, large organizations may be able to sustain losses on
projects that represent only a small part of their operations.

DECISIONS INVOLVING MULTIPLE OBJECTIVES AND UNCERTAINTY
(PUTTING SMART AND DECISION TREES TOGETHER)

• We have discussed decision-making with multiple objectives but
no uncertainty – SMART.
• We have discussed decision-making under uncertainty – Decision
Trees with EMV and EU criteria.
• Many decisions involve both multiple objectives and uncertainty.
• We can use our existing tools to solve this kind of problem.
Retreat example:
• Our department is considering organizing a retreat.
• There are two alternatives: “Boat trip” and “Dinner at hotel”.
• Evaluation criteria: “Cost” and “Level of fun”.
• Uncertainty: the level of fun on the boat trip depends on the weather.

Step 1: Draw a decision tree to represent your decision problem.

• Boat trip & ball game on beach:
  • Rainy (10%) → Cost: $50,000; Fun: Low
  • Cloudy (20%) → Cost: $50,000; Fun: Medium
  • Sunny (70%) → Cost: $50,000; Fun: High
• Dinner & dance at a hotel ballroom → Cost: $60,000; Fun: Medium
Step 2: For each evaluation attribute, assign utility scores to reflect the
desirability of outcomes.

• Boat trip & ball game on beach:
  • Rainy (10%) → Cost: $50,000 (1.0); Fun: Low (0.0)
  • Cloudy (20%) → Cost: $50,000 (1.0); Fun: Medium (0.7)
  • Sunny (70%) → Cost: $50,000 (1.0); Fun: High (1.0)
• Dinner & dance at a hotel ballroom → Cost: $60,000 (0.0); Fun: Medium (0.7)

Step 3: Assign a swing weight to each attribute, then normalize the weights.

• Cost: hypothetical worst $60,000; hypothetical best $50,000;
  swing weight 60; normalized weight 60/160 = 0.375
• Fun: hypothetical worst Low; hypothetical best High;
  swing weight 100; normalized weight 100/160 = 0.625
Step 4: Calculate the weighted utility at each branch,
then solve the decision tree using the EU criterion.
(Weights: Cost 0.375, Fun 0.625.)

• Boat trip & ball game on beach (EU = 0.9):
  • Rainy (10%) → $50,000 (1.0), Low (0.0) → weighted utility 0.375
  • Cloudy (20%) → $50,000 (1.0), Medium (0.7) → weighted utility 0.813
  • Sunny (70%) → $50,000 (1.0), High (1.0) → weighted utility 1.000
• Dinner & dance at a hotel ballroom → $60,000 (0.0), Medium (0.7) →
  weighted utility 0.438
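Steps 2–4 can be checked numerically. A sketch, assuming the normalized weights and utility scores above (the `weighted` helper is an illustrative name, not from the text):

```python
# Normalized swing weights from Step 3.
w_cost, w_fun = 0.375, 0.625

def weighted(u_cost, u_fun):
    """Weighted utility of one outcome from its two attribute scores."""
    return w_cost * u_cost + w_fun * u_fun

# Boat trip: expected weighted utility over the weather outcomes.
boat = (0.10 * weighted(1.0, 0.0)    # rainy
        + 0.20 * weighted(1.0, 0.7)  # cloudy
        + 0.70 * weighted(1.0, 1.0)) # sunny

# Hotel dinner: no uncertainty, one outcome.
hotel = weighted(0.0, 0.7)

print(round(boat, 2))   # 0.9
print(round(hotel, 4))  # 0.4375
```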
Step 5: For figures that you are not very sure of (e.g. the utility of an
intermediate outcome, or a swing weight), conduct a sensitivity analysis.

(Chart: EU of the two alternatives plotted against the weight placed on
cost, from 0 to 100.)

Our choice is insensitive to the weight placed on cost.
The boat trip is the best alternative regardless of the weight placed on cost.
Let’s hear what students from the previous semesters said about
the Decision Analysis assignment:

• “The decision tree analysis is very useful for my personal decision
making dilemma.”
• “The decision tree and exploring alternatives actually got me rethink
about the future development of myself after graduation.”
• “The decision tree project allows students to put knowledge into use
and I think it's a very helpful assignment.”
ASSESSING PROBABILITY

Of course, you can use your subjective judgment.
Luckily, sometimes you know the base rate of a chance event, i.e. how
common an outcome is in the population.
Examples
• Mrs. Chen is 35 and is pregnant for the first time. Doctors: 1% chance
that a woman of her age will have a baby with Down syndrome. Mrs.
Chen would like to know whether her baby has Down syndrome.
• You are planning for a boat trip on 26th July. Historical data shows that
10% of days in July are rainy. You wish to know whether 26th July this
year will be a rainy day.
Without further information, your estimate of the probability that an
outcome occurs in your specific case should equal the base rate.
Sometimes, you might be able to obtain more information on your
specific case.
Examples
• Mrs. Chen is 35 and is pregnant for the first time. Doctors: 1% chance
that a woman of her age will have a baby with Down syndrome.
Mrs. Chen can have a Triple Screen test – this test can detect if her
baby has Down syndrome.
• You are planning for a boat trip on 26th July. Historical data shows that
10% of days in July are rainy.
You can consult a weatherman – this expert can tell you if 26th July this
year will be a rainy day.
How should you revise the probability judgment in the light of new
information?

Revising probability judgments in the light of new information

Down syndrome example
• Mrs. Chen is 35 and is pregnant for the first time. Doctors: 1% chance
that a woman of her age will have a baby with Down syndrome.
• Initial probability judgment:
Baby has the disease (1%)
Baby doesn’t have the disease (99%)
• If Mrs. Chen had a 100%-accurate Triple Screen test done, and the test
indicated that the baby doesn’t have Down syndrome:
• Revised probability judgment: “Baby has the disease (0%)”
• Similarly, if the 100%-accurate test indicated that the baby does have
Down syndrome:
• Revised probability judgment: “Baby has the disease (100%)”
In fact, the Triple Screen test is NOT 100% accurate.
• If the baby has the disease, there is a 0.80 probability that the test will
correctly show a positive result.
• If the baby doesn’t have the disease, there is a 0.90 probability that the test
will correctly show a negative result.
Let’s combine the probabilities that characterize the test performance and the
base rate.

Possible outcomes:
• Test indicates Disease; baby has the disease (1%): correct detection
• Test indicates Disease; baby has no disease (99%): false alarm
• Test indicates No Disease; baby has the disease: missed detection
• Test indicates No Disease; baby has no disease: correct

Joint probabilities:
• Test indicates Disease, baby has the disease: 1% * 0.8 = 0.8%
• Test indicates Disease, baby has no disease: 99% * 0.1 = 9.9%
• Test indicates No Disease, baby has the disease: 1% * 0.2 = 0.2%
• Test indicates No Disease, baby has no disease: 99% * 0.9 = 89.1%
Total = 100%
Joint probabilities:
• Test indicates Disease: baby has the disease 0.8%; no disease 9.9%
• Test indicates No Disease: baby has the disease 0.2%; no disease 89.1%

• If now this imperfect test indicates that the baby has Down syndrome, what
should be your estimate of the probability that the baby has the disease?
• That is, given that the test indicated “disease,” what is the probability that the
baby has the disease?

This is a question about conditional probability:
P(A|B) = P(A&B) / P(B)
“The probability of A given B”: the probability of some event A, given the
occurrence of some other event B.
Joint probabilities:
• Test indicates Disease: baby has the disease 0.8%; no disease 9.9%
• Test indicates No Disease: baby has the disease 0.2%; no disease 89.1%

• If now this imperfect test indicates that the baby has Down syndrome, what
should be your estimate of the probability that the baby has the disease?
• That is, given that the test indicated “disease,” what is the probability that the
baby has the disease?
P(baby has the disease | test indicated “disease”)
= P(baby has the disease AND test indicated “disease”) / P(test indicated “disease”)
= 0.8% / (0.8% + 9.9%)
= 7.48%
• Revised probability judgment:
Baby has the disease (7.48%)
Baby doesn’t have the disease (92.52%)
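The revision can be reproduced directly from the base rate and the test's accuracy. A sketch; the variable names are my own:

```python
# Bayesian revision in the Down syndrome example.
p_disease = 0.01              # base rate
p_pos_given_disease = 0.80    # test correctly shows positive
p_pos_given_healthy = 0.10    # test incorrectly shows positive (false alarm)

# Joint probabilities for a positive ("disease") test result.
joint_pos_disease = p_disease * p_pos_given_disease        # 0.008
joint_pos_healthy = (1 - p_disease) * p_pos_given_healthy  # 0.099

# P(disease | positive test) = P(disease AND positive) / P(positive).
p_disease_given_pos = joint_pos_disease / (joint_pos_disease + joint_pos_healthy)
print(round(100 * p_disease_given_pos, 2))  # 7.48 (percent)
```

Even after a positive result the probability stays low, because the 99% base rate of healthy babies generates far more false alarms than correct detections.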
PRS
Joint probabilities:
• Test indicates Disease: baby has the disease 0.8%; no disease 9.9%
• Test indicates No Disease: baby has the disease 0.2%; no disease 89.1%

• If now this imperfect test indicates that the baby does not have Down syndrome,
what should be your estimate of the probability that the baby has the disease?

1) 0.2%
2) 0.22%
3) 7.5%
4) 10.7%
VALUE OF INFORMATION

• Decision makers who face uncertain prospects often gather
information with the intention of reducing uncertainty.
• We have just learnt how we should revise our probability judgment in
light of new information.
• However, in many cases new information is very EXPENSIVE to obtain:
• Carry out a market research
• Conduct scientific tests
• Hire consultants
• The question now is: Is it worth obtaining the information in the first
place? How much should we pay to obtain the information?
• To address this question, we need to calculate the expected value of
information.

Value of Information – Planting Example

• One year ago a farmer suffered serious losses when a virus affected the crop.
• Since then, the farmer has taken measures to eliminate the virus from the soil.
On the basis of scientific evidence, it is estimated that there is a 70% chance that
the elimination was successful.
• Now the farmer must make a decision about what to do next season:

Payoffs under the two virus conditions:
• Grain: $90M if the virus was eliminated (70%); -$20M if the virus was NOT
eliminated (30%)
• Soybeans: $30M in either condition

We are going to examine the EMV of the decision under 3 conditions:
• When no further information is obtained
• When perfect information is obtained
• When imperfect information is obtained
• If the farmer makes a decision without obtaining further information:

• Soybeans → $30M
• Grain:
  • Virus was eliminated (70%) → $90M
  • Virus was NOT eliminated (30%) → -$20M
  EMV = 70% * $90M + 30% * (-$20M) = $57M

If no further information is obtained, the EMV of the decision is $57M
(plant Grain: $57M > $30M).
• Now suppose that the farmer has the option of hiring a laboratory to test
whether the virus is still in the soil. And this test is perfectly accurate.

• Do not test → as before, plant Grain: EMV = $57M
• Test soil (EMV = $72M):
  • Test indicates no virus (70%) → plant Grain: $90M (beats Soybeans: $30M)
  • Test indicates virus (30%) → plant Soybeans: $30M (beats Grain: -$20M)
  EMV = 70% * $90M + 30% * $30M = $72M

If perfect information is obtained, the EMV of the decision is $72M.
• How much would it be worth paying to have the test done?
• If no further information is obtained, the EMV of the decision is $57M.
• If perfect information is obtained, the EMV of the decision is $72M.
• Perform the test if ($72M - cost of test) > $57M,
i.e. if cost of test < $15M.

• Expected Value of Perfect Information (EVPI) =
EMV of the decision if perfect information is obtained -
EMV of the decision if no further information is obtained
• In this case, EVPI = $15M.
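The EVPI calculation can be sketched as follows, assuming the payoff table from the planting example (the dictionary encoding and names are my own):

```python
# Payoffs in $M by crop and virus state, and the state probabilities.
payoffs = {"grain": {"eliminated": 90, "not": -20},
           "soybeans": {"eliminated": 30, "not": 30}}
p = {"eliminated": 0.70, "not": 0.30}

# Without information: commit to the crop with the highest EMV.
emv_no_info = max(sum(p[s] * v[s] for s in p) for v in payoffs.values())

# With perfect information: learn the state first, then pick the best crop.
emv_perfect = sum(p[s] * max(v[s] for v in payoffs.values()) for s in p)

print(round(emv_no_info, 1))                # 57.0
print(round(emv_perfect, 1))                # 72.0
print(round(emv_perfect - emv_no_info, 1))  # 15.0 -> EVPI in $M
```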
• Of course not all information is perfect (not 100% accurate).
• Suppose the information provided by the soil test is imperfect, and data showed
that:
• If the virus is still present, the test has a 90% chance of correctly detecting it.
• If the virus has been eliminated, the test has a 20% chance of incorrectly
indicating that the virus is still present.
• Let’s combine the probabilities that characterize the performance of the soil test,
and the base rate of successful virus elimination:

Joint probabilities:
• Test indicates Present: virus present 30% * 90% = 27%; virus eliminated 70% * 20% = 14%
• Test indicates Eliminated: virus present 30% * 10% = 3%; virus eliminated 70% * 80% = 56%

• Now, we want to know the EMV of the decision, with the option of
gathering this imperfect information.
• Do not test:
  • Soybeans → $30M
  • Grain: Virus was eliminated (70%) → $90M; Virus was NOT eliminated (30%) → -$20M
• Test soil:
  • Test indicates virus (?%):
    • Grain: Virus was eliminated (?%) → $90M; Virus was NOT eliminated (?%) → -$20M
    • Soybeans → $30M
  • Test indicates NO virus (?%):
    • Grain: Virus was eliminated (?%) → $90M; Virus was NOT eliminated (?%) → -$20M
    • Soybeans → $30M
Joint probabilities:
• Test indicates Present: virus present 27%; virus eliminated 14%
• Test indicates Eliminated: virus present 3%; virus eliminated 56%

• What is the probability that the test indicates that the virus is present?
P(test indicates that virus is present) = ____________________
• Given that the test indicates that the virus is present, what is the probability that the virus is present?
P(virus is present | test indicates that virus is present) = ____________________
• Given that the test indicates that the virus is present, what is the probability that the virus has been
eliminated?
P(virus has been eliminated | test indicates that virus is present) = ____________________
• What is the probability that the test indicates that the virus is eliminated?
P(test indicates that virus is eliminated) = ____________________
• Given that the test indicates that the virus has been eliminated, what is the probability that the virus is
present?
P(virus is present | test indicates that virus has been eliminated) = ____________________
• Given that the test indicates that the virus has been eliminated, what is the probability that the virus has
been eliminated?
P(virus has been eliminated | test indicates that virus has been eliminated) = _________________
• Do not test:
  • Soybeans → $30M
  • Grain (EMV = $57M): Virus was eliminated (70%) → $90M;
    Virus was NOT eliminated (30%) → -$20M
• Test soil (EMV = $62.2M):
  • Test indicates virus (41%) → plant Soybeans: $30M
    (Grain EMV = $17.4M: Virus was eliminated (34%) → $90M;
    Virus was NOT eliminated (66%) → -$20M)
  • Test indicates NO virus (59%) → plant Grain (EMV = $84.5M):
    Virus was eliminated (95%) → $90M; Virus was NOT eliminated (5%) → -$20M
    (Soybeans: $30M)

If imperfect information is obtained, the EMV of the decision is $62.2M.


• How much would it be worth paying to have the test done?
• If no further information is obtained, the EMV of the decision is $57M.
• If imperfect information is obtained, the EMV of the decision is $62.2M.
• Perform the test if ($62.2M - cost of test) > $57M,
i.e. if cost of test < $5.2M.

• Expected Value of Imperfect Information (EVII) =
EMV of the decision if imperfect information is obtained -
EMV of the decision if no further information is obtained
• In this case, EVII = $5.2M.
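The EVII calculation can be sketched end to end: Bayes-revise the virus probabilities for each test result, roll back the tree, and compare with the $57M no-information EMV. Following the slides, the posteriors are rounded to two decimals before rolling back; the variable names are my own:

```python
# Base rate and soil-test accuracy.
p_present = 0.30
p_hit, p_false_alarm = 0.90, 0.20

# Probability of each test result.
p_test_pos = p_present * p_hit + (1 - p_present) * p_false_alarm  # 0.41
p_test_neg = 1 - p_test_pos                                       # 0.59

# Posterior P(virus eliminated | test result), rounded as on the slides.
p_elim_pos = round((1 - p_present) * p_false_alarm / p_test_pos, 2)        # 0.34
p_elim_neg = round((1 - p_present) * (1 - p_false_alarm) / p_test_neg, 2)  # 0.95

def emv_grain(p_elim):
    """EMV of planting grain ($M) given P(virus eliminated)."""
    return p_elim * 90 + (1 - p_elim) * (-20)

# At each decision node: grain vs. soybeans ($30M either way).
best_pos = max(emv_grain(p_elim_pos), 30)  # 30: soybeans beat grain's 17.4
best_neg = max(emv_grain(p_elim_neg), 30)  # 84.5: grain wins

emv_test = p_test_pos * best_pos + p_test_neg * best_neg
print(emv_test)  # about 62.2 ($M); EVII = 62.2 - 57 = 5.2
```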
Time for REFLECTION

What idea/concept in this class do you find most useful in future
decision making? How are you going to apply it?
INTENDED LEARNING OUTCOMES FOR UNIT 1.3

By now, you should be able to:
• Solve decisions involving uncertainty using (1) maximax choice
and maximin choice, (2) decision trees with the EMV criterion, and
(3) decision trees with the EU criterion.
• Explain the concept of certainty equivalent, and describe how
shapes of utility functions reflect risk attitudes.
• Discuss the appropriateness of the EMV versus EU criteria.
• Use SMART and Decision Trees together to solve decision problems
that involve multiple objectives and uncertainty.
• Estimate probability based on base rate and new information.
• Calculate EVPI and EVII.
