
MS&E 252

Decision Analysis I
Problem Session 3

Announcements
• Make sure to CC your buddy in any emails to your TA
• Amount of time spent on homework

What concepts do we expect you to master?

• Distinctions
  • Kind, degree
• Probability
  • Background state of information (&)
• Inferential notation
• Probabilistic inference
• Tree flipping
• Relevance
• Relevance Diagrams
• The Rules of Arrow Flipping
• Associative Logic Errors

What are the important distinctions about distinctions?

• Kind: type of grouping (beer drinker, college graduate)
• Degree: separation within a grouping (beer drinker vs. not beer drinker)

Probability tree representation: kinds and degrees.


We encode our uncertainty using probability.

• Uncertainty comes from our lack of knowledge.
• Probability allows us to “speak precisely about our ignorance.”
• Instead of “The probability is …,” say “I assign a probability of … to …”

Your probability changes as your knowledge changes.

Probability Notation

& denotes your background state of information.

{A | &}: the probability of event A occurring given your background state of information.

{A | B, &}: the probability of event A occurring given you know event B occurred, and your background state of information.

{AB | &}: the probability of events A and B both occurring given your background state of information.

We will use Inferential Notation for probabilistic statements.

{A | B, &} = 0.8 means “Given B and my background state of information, I assign a probability of 0.8 to A.”

• Two remarks:
  – We condition A on B when we think about A given that B happened.
  – Always condition probabilities on & (your background state of information).

Probabilistic inference is how we learn
about uncertainties indirectly.

We start with assessed probabilities and, through probabilistic inference, we can come up with probabilities we have not assessed.

Tree-flipping

But what exactly is tree-flipping?

In this example, we first calculate the joint distribution of two distinctions:

{A | &}      {B | A, &}        {AB | &} = {A | &} × {B | A, &}
A1: 0.9      B1 | A1: 0.4      {A1 B1 | &} = (0.9)(0.4) = 0.36
             B2 | A1: 0.6      {A1 B2 | &} = (0.9)(0.6) = 0.54
A2: 0.1      B1 | A2: 0.5      {A2 B1 | &} = (0.1)(0.5) = 0.05
             B2 | A2: 0.5      {A2 B2 | &} = (0.1)(0.5) = 0.05
                                                       Σ = 1.00

We can then “flip” the tree to infer the
unassessed probabilities.

{B | &}      {A | B, &}                  {BA | &} = {B | &} × {A | B, &}
B1: 0.41     A1 | B1: 0.36/0.41 = 0.88   {B1 A1 | &} = 0.36
             A2 | B1: 0.05/0.41 = 0.12   {B1 A2 | &} = 0.05
B2: 0.59     A1 | B2: 0.54/0.59 = 0.92   {B2 A1 | &} = 0.54
             A2 | B2: 0.05/0.59 = 0.08   {B2 A2 | &} = 0.05
                                                     Σ = 1.00

Note that {A1 B1 | &} = {B1 A1 | &}: the joint probabilities are unchanged; only the order of conditioning changes.
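The flip above can be sketched in code. This is a minimal illustration using the assessed probabilities from the example; the dictionary structure is just one convenient representation of the tree.

```python
# Flip a two-distinction probability tree: from {A|&} and {B|A,&}
# to {B|&} and {A|B,&}, via the joint distribution.
p_A = {"A1": 0.9, "A2": 0.1}                    # assessed {A | &}
p_B_given_A = {                                 # assessed {B | A, &}
    "A1": {"B1": 0.4, "B2": 0.6},
    "A2": {"B1": 0.5, "B2": 0.5},
}

# Joint: {AB | &} = {A | &} * {B | A, &}
joint = {(a, b): p_A[a] * p_B_given_A[a][b]
         for a in p_A for b in p_B_given_A[a]}

# Marginal {B | &}: sum the joint over the degrees of A
p_B = {b: sum(joint[(a, b)] for a in p_A) for b in ["B1", "B2"]}

# Flipped conditionals: {A | B, &} = {AB | &} / {B | &}
p_A_given_B = {b: {a: joint[(a, b)] / p_B[b] for a in p_A}
               for b in p_B}

print(round(p_B["B1"], 2))                 # 0.41
print(round(p_A_given_B["B1"]["A1"], 2))   # 0.88
```

The same two lines of arithmetic (marginalize, then divide) are all that “flipping” a tree of any size requires.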

Here is an example
of probabilistic inference.

• I have two coins in my pocket; one is normal (N) and the other is “double-headed” (D).
• I take a coin out of my pocket and flip it – “Heads”!
• What is the probability that I originally chose the double-headed coin?

Original tree (coin first):
  N: 0.5   “H” | N: 0.5 → {“H” N | &} = 0.25   “T” | N: 0.5 → {“T” N | &} = 0.25
  D: 0.5   “H” | D: 1   → {“H” D | &} = 0.50   “T” | D: 0   → {“T” D | &} = 0

Flipped tree (toss result first):
  “H”: 0.75   N | “H”: 0.25/0.75 = 1/3   D | “H”: 0.50/0.75 = 2/3
  “T”: 0.25   N | “T”: 1                 D | “T”: 0

So the probability that I originally chose the double-headed coin, given “Heads,” is 2/3.
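The coin puzzle can be flipped exactly with rational arithmetic. This is a sketch of the same Bayes computation using Python's `fractions` module, so no rounding obscures the 2/3 answer.

```python
from fractions import Fraction as F

# Exact tree flip for the two-coin puzzle.
prior = {"N": F(1, 2), "D": F(1, 2)}       # {coin | &}
p_heads = {"N": F(1, 2), "D": F(1, 1)}     # {"H" | coin, &}

joint_heads = {c: prior[c] * p_heads[c] for c in prior}
p_H = sum(joint_heads.values())            # {"H" | &} = 3/4
posterior_D = joint_heads["D"] / p_H       # {D | "H", &}
print(posterior_D)                         # 2/3
```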

Let’s flip this tree as an exercise!

Original tree:
  A: 3/5    B | A: 7/10  → A B = 21/50    ~B | A: 3/10  → A ~B = 9/50
  ~A: 2/5   B | ~A: 1/10 → ~A B = 2/50    ~B | ~A: 9/10 → ~A ~B = 18/50

Flipped tree:
  B: 23/50    A | B: 21/23           ~A | B: 2/23
  ~B: 27/50   A | ~B: 9/27 = 1/3     ~A | ~B: 18/27 = 2/3

This tree is a little trickier.

Original tree:
  C1: 3/10   D | C1: 1/5 → C1 D = 3/50     D’ | C1: 4/5 → C1 D’ = 12/50
  C2: 2/10   D | C2: 2/5 → C2 D = 4/50     D’ | C2: 3/5 → C2 D’ = 6/50
  C3: 5/10   D | C3: 3/5 → C3 D = 15/50    D’ | C3: 2/5 → C3 D’ = 10/50

Flipped tree:
  D: 22/50    C1 | D: 3/22     C2 | D: 4/22     C3 | D: 15/22
  D’: 28/50   C1 | D’: 12/28   C2 | D’: 6/28    C3 | D’: 10/28
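This flip, too, can be verified with exact fractions. The sketch below uses the assessed probabilities of the three-degree tree and recovers the flipped conditionals.

```python
from fractions import Fraction as F

# Exact flip of the three-degree tree: C in {C1, C2, C3}, D in {D, D'}.
p_C = {"C1": F(3, 10), "C2": F(2, 10), "C3": F(5, 10)}
p_D_given_C = {"C1": F(1, 5), "C2": F(2, 5), "C3": F(3, 5)}

joint_D = {c: p_C[c] * p_D_given_C[c] for c in p_C}
p_D = sum(joint_D.values())                        # {D | &} = 22/50
p_C_given_D = {c: joint_D[c] / p_D for c in p_C}   # e.g. C1 | D = 3/22

p_Dp = 1 - p_D                                     # {D' | &} = 28/50
p_C_given_Dp = {c: (p_C[c] - joint_D[c]) / p_Dp for c in p_C}

print(p_C_given_D["C1"])    # 3/22
print(p_C_given_Dp["C1"])   # 3/7 (= 12/28)
```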


Introducing Relevance

Probabilistically, A is relevant to B if
{A | B, &} is not equal to {A | B’, &}.

• In other words, if knowing B tells you something about the probability of A occurring, then A is relevant to B.

Introducing Relevance

• Relevance, like probability, describes a person’s beliefs about the world, not the world itself.
• Relevance does not imply causality.
• Relevance is a matter of information, not logic.

A and B could be relevant given &, and yet irrelevant given C and &.

Probability trees can help determine whether distinctions are “relevant” to each other.

• If knowing outcome A tells you something about the probability of B, then A is relevant to B.
• Otherwise they are irrelevant.
• A is relevant to B iff {B | A, &} ≠ {B | ~A, &}.

Left tree (relevant):     A: 1/3, B | A: 3/4;   ~A: 2/3, B | ~A: 1/10
Right tree (irrelevant):  A: 1/3, B | A: 3/4;   ~A: 2/3, B | ~A: 3/4
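The relevance test on the two trees above amounts to comparing conditional distributions. A minimal sketch (the helper `relevant` is our own name, not course notation):

```python
# A is relevant to B iff the conditional distribution of B differs
# across the degrees of A.
def relevant(p_B_given_A):
    dists = list(p_B_given_A.values())
    return any(d != dists[0] for d in dists[1:])

# The two trees from the slide, as {degree of A: distribution of B}.
left  = {"A": {"B": 0.75, "~B": 0.25}, "~A": {"B": 0.10, "~B": 0.90}}
right = {"A": {"B": 0.75, "~B": 0.25}, "~A": {"B": 0.75, "~B": 0.25}}

print(relevant(left))   # True  -> A and B are relevant given &
print(relevant(right))  # False -> A and B are irrelevant given &
```

The same comparison extends unchanged to three or more degrees, as the next slide shows.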

How can we recognize relevance using
trees? (3 or more degrees)

A and B are relevant given & if, for one of the degrees of A, the distribution of B differs from the distribution of B for another degree of A. Otherwise, A and B are irrelevant given &.

Example:
  A1: B1: 0.2, B2: 0.3, B3: 0.5
  A2: B1: 0.2, B2: 0.3, B3: 0.5
  A3: B1: 0.8, B2: 0.1, B3: 0.1

Here the distribution of B given A3 differs from the distribution of B given A1 (or A2), so A and B are relevant given &.

How can we recognize relevance using
trees? (3 or more distinctions)

B and C are relevant given distinction A and & if, within some degree of A, the distribution of C differs across the degrees of B:
• either the distribution of C given A1 and B1 differs from the distribution of C given A1 and B2,
-- OR --
• the distribution of C given A2 and B1 differs from the distribution of C given A2 and B2.
Otherwise, B and C are irrelevant given distinction A and &.

(In the example tree, {C1 | B1, A1, &} = {C1 | B2, A1, &} = {C1 | B1, A2, &} = 0.3, but {C1 | B2, A2, &} = 0.6, so B and C are relevant given A and &.)

But sometimes we first need to flip the tree
to determine whether there is relevance.

Are A and B irrelevant given C and &?

Our strategy is to put C and & first in the tree, then put A and B, and then put everything else. Then look at the flipped tree: A and B are relevant given C and & if, for some degree of C, the distribution of B given A differs from the distribution of B given ~A.

One problem with trees is that they grow exponentially.

It would thus be nice to have another tool that would help us reflect on relevance regardless of the size of the tree.

Introducing Relevance Diagrams

Relevance diagrams can make irrelevance statements…

… but they cannot make any relevance statements!

You should get used to saying “the diagram shows that there is a possibility of relevance between A and B.”

Clarifying our notation...

A node represents an uncertainty. Note that nodes are kinds, not degrees, of the uncertain distinction!

Arrows indicate the possibility of relevance.

So when building or assessing a relevance diagram, the biggest statements made are those of irrelevance!

Instead of trees,
we can use relevance diagrams.

Probability Tree:
  A1: 0.626   B1 | A1: 0.714   B2 | A1: 0.286
  A2: 0.374   B1 | A2: 0.552   B2 | A2: 0.448

Relevance Diagram: A → B, with {A | &} assessed at node A and {B | A, &} at node B. No arrows point into A; the arrow from A into B conditions B on A. The arrow from A to B only implies possible relevance.

Relevance diagrams allow us to make strong statements of irrelevance between distinctions.

Probability Tree:
  A1: 0.626   B1 | A1: 0.714   B2 | A1: 0.286
  A2: 0.374   B1 | A2: 0.714   B2 | A2: 0.286

Relevance Diagram: nodes A and B with no arrow between them, so {AB | &} = {A | &}{B | &}. The absence of an arrow from A to B asserts irrelevance!

Diagrams vs. Trees

In the tree, {C1 | B1, A1, &} = {C1 | B1, A2, &} = 0.3 AND {C1 | B2, A1, &} = {C1 | B2, A2, &} = 0.3: once B is known, the distribution of C does not depend on A.

The corresponding diagram (A → B → C, with no arrow from A to C) states: A is irrelevant to C given B and &.

From trees we can make both relevance and irrelevance statements. From diagrams we can only make statements of irrelevance!

Relevance is a matter of
information, not logic.

Example:
– Assume you have two “fair” dice.
– You believe the result of each die toss is irrelevant to the other.

Diagram: nodes Die 1 and Die 2, with no arrow between them.

Adding or taking away information can
change relevance relationships.

But once you know the sum of the two tosses, they are now relevant to each other.

Diagram: Die 1 and Die 2 each point into Sum.

“The two tosses MAY BE relevant to each other given their sum and &.”
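The dice claim can be checked by brute force over the 36 equally likely outcomes. A minimal sketch: unconditionally, knowing Die 1 says nothing about Die 2; conditioned on the sum, it can pin Die 2 down completely.

```python
from itertools import product

# Two fair dice: irrelevant given &, but relevant given their Sum and &.
outcomes = list(product(range(1, 7), repeat=2))   # 36 equally likely pairs

# Unconditional: {Die2 = 6 | Die1 = 1, &} equals {Die2 = 6 | &}
p_d2_6 = sum(1 for d1, d2 in outcomes if d2 == 6) / 36
given_d1_1 = [d2 for d1, d2 in outcomes if d1 == 1]
p_d2_6_given_d1_1 = given_d1_1.count(6) / len(given_d1_1)
print(p_d2_6 == p_d2_6_given_d1_1)   # True: irrelevant given &

# Conditioned on Sum = 7: knowing Die1 = 1 forces Die2 = 6
sum7 = [(d1, d2) for d1, d2 in outcomes if d1 + d2 == 7]
p_d2_6_given_sum7 = sum(1 for d1, d2 in sum7 if d2 == 6) / len(sum7)
p_d2_6_given_sum7_d1_1 = 1.0         # Die1 = 1 and Sum = 7 force Die2 = 6
print(p_d2_6_given_sum7, p_d2_6_given_sum7_d1_1)   # 1/6 vs 1.0: relevant
```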

How are these two tools used?

• We start by building the relevance diagram, as a way to clarify our thoughts and learn which assessments need to be made.
• We then make the necessary probability assessments and build our trees.

Note that it is the irrelevance statements that reduce the number of probability assessments that need to be made.

Irrelevance helps us simplify our
probabilistic thinking.

Bayes’ Rule tells us that:
{ABC | &} = {A | &} × {B | A, &} × {C | B, A, &}

This diagram (A → B → C, with no arrow from A to C) is telling us the same thing with one fewer conditioning term:
{ABC | &} = {A | &} × {B | A, &} × {C | B, &}

Node assessments: {A | &} at A, {B | A, &} at B, {C | B, &} at C.
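The factorization can be sketched numerically. The probabilities below are made-up illustrative values, not from the slides; the point is that a joint built as {A|&}{B|A,&}{C|B,&} automatically satisfies the asserted irrelevance of A to C given B.

```python
from itertools import product

# Joint built from the diagram's factorization (illustrative numbers).
p_A = {0: 0.4, 1: 0.6}
p_B_given_A = {0: {0: 0.3, 1: 0.7}, 1: {0: 0.8, 1: 0.2}}
p_C_given_B = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.9, 1: 0.1}}

joint = {(a, b, c): p_A[a] * p_B_given_A[a][b] * p_C_given_B[b][c]
         for a, b, c in product([0, 1], repeat=3)}
assert abs(sum(joint.values()) - 1.0) < 1e-9

# Check the irrelevance statement: {C | A, B, &} does not depend on A.
for b, c in product([0, 1], repeat=2):
    p_c_a0 = joint[(0, b, c)] / (p_A[0] * p_B_given_A[0][b])
    p_c_a1 = joint[(1, b, c)] / (p_A[1] * p_B_given_A[1][b])
    assert abs(p_c_a0 - p_c_a1) < 1e-9
print("A is irrelevant to C given B and &")
```

Note the saving: the diagram requires assessing {C | B, &} (2 numbers per degree of B) instead of {C | B, A, &} (one distribution per A–B combination).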


Just like we can flip trees, we can flip
arrows in relevance diagrams.

Before flipping: A → B, with {A | &} at A and {B | A, &} at B.
After flipping: A ← B, with {A | B, &} at A and {B | &} at B.

Arrow flipping requires that the two distinctions be conditioned on the same state of information.

We can only flip arrows according to certain rules.

RULE #1
“Add arrows wherever you want, provided you don’t create a cycle. Cycles involving more than three nodes are also not allowed.”

We can only flip arrows according to
certain rules.

RULE #2
“You can flip an arrow between A and B if and only if A and B are conditioned on the same state of information.”

In other words, any other node (here, C and D) which points to A also points to B.

*Tip* – Draw a box around A and B.

We can only flip arrows according to certain rules.

RULE #3
“You cannot remove any arrows arbitrarily.”

Example of diagram manipulation
(arrow-flipping)

Diagram: C and D point into A; D and E point into B; A points into B.

Q: Can we flip the arrow between A and B?

A: No, since A and B are conditioned on different states of information! A has arrows from C and D, but B has arrows from D and E.

Example of diagram manipulation
(arrow-flipping)

A: We need to add arrows from C to B and from E to A.

Now A and B are conditioned on the same state of information, so we can flip the arrow.

Why would we want to
manipulate diagrams?

• We can recognize irrelevance without needing to assess numbers.
• If there is no arrow between nodes A and B given the same state of information S, then A and B are irrelevant given S.

Nodes A and B with no arrow between them: A and B are irrelevant given &.
C pointing into both A and B, with no arrow between A and B: A and B are irrelevant given C and &.

An example of recognizing irrelevance
from diagrams.

Consider the following diagram: C and D point into A; D and E point into B; there is no arrow between A and B.

• Are C and D irrelevant given &?
Yes; they are both conditioned only on & and there is no arrow between them.

An example of recognizing irrelevance
from diagrams.

Consider the same diagram: C and D point into A; D and E point into B.

• Are A and B irrelevant given C, D, E, and &?
Yes; add arrows from C to B and from E to A. A and B are then conditioned on the same state of information (C, D, E, and &), and there is still no arrow between them.

An example of recognizing irrelevance
from diagrams.

• Are A and B irrelevant given &?

Manipulating the diagram according to the rules (flipping arrows, adding the arrows each flip requires) ends with an arrow between A and B. So we can’t conclude! There is a possibility of relevance.

Another example of recognizing
irrelevance from diagrams.

Diagram nodes: Annual Growth, Cost, Market Size, Competition, Revenue, Profit.

• Are Cost and Competition irrelevant given &?
• Are Revenue and Cost irrelevant given Market Size and &?
• Are Profit and Annual Growth irrelevant given &?


What constitutes an
Associative Logic Error?

We say that people make an “Associative Logic Error” when they fall into the trap of assuming

{A | B, &} = {B | A, &}

Examples of Associative Logic Errors

Some can be quite obvious…

Hemophiliacs are virtually all male… but the probability for a male to be a hemophiliac is 1/1000 or less!
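The asymmetry can be made concrete with Bayes’ rule. The prevalence numbers below are illustrative assumptions (chosen to be consistent with the slide’s “1/1000 or less”), not figures from the course:

```python
# Associative logic error: {male | hemophiliac, &} is near 1, yet
# {hemophiliac | male, &} is tiny. The prevalence is an assumed,
# illustrative number.
p_hemo = 5e-4                 # assumed {hemophiliac | &}
p_male_given_hemo = 0.99      # "hemophiliacs are virtually all male"
p_male = 0.5                  # {male | &}

# Bayes: {hemo | male, &} = {male | hemo, &} * {hemo | &} / {male | &}
p_hemo_given_male = p_male_given_hemo * p_hemo / p_male
print(p_hemo_given_male)      # ~0.001, even though {male | hemo, &} ~ 1
```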

Examples of Associative Logic Errors

… But other examples can be very tricky.

Most people who are treated for lung cancer seem to be heavy smokers… and yet the probability of getting lung cancer if you are a heavy smoker is as low as 0.1!
