
Supplement of conditional probability

1. Probability space

Consider a partitioned set

mind = {happy, sad} = {M_h, M_s}

and

weather = {sunny, rainy, foggy} = {W_s, W_r, W_f}

2. The intersection (or joint) probability

Probability table

              sunny   rainy   foggy  |  P(mind)
happy          0.5    0.05    0.05   |   0.6
sad            0.2    0.15    0.05   |   0.4
---------------------------------------------
P(weather)     0.7    0.2     0.1

P(M_h ∩ W_s) = 0.5,  P(M_s ∩ W_f) = 0.05
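A minimal sketch in Python (the notes give no code; the key names "happy", "sunny", etc. are illustrative choices standing in for M_h, W_s, ...): the joint table is a nested dict, and joint probabilities are read off directly.

```python
# Joint probability table P(M ∩ W) as a nested dict.
joint = {
    "happy": {"sunny": 0.5, "rainy": 0.05, "foggy": 0.05},
    "sad":   {"sunny": 0.2, "rainy": 0.15, "foggy": 0.05},
}

# Entries of the table are the joint probabilities:
p_happy_sunny = joint["happy"]["sunny"]  # P(M_h ∩ W_s) = 0.5
p_sad_foggy = joint["sad"]["foggy"]      # P(M_s ∩ W_f) = 0.05
```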

1) Check the legitimacy

1 = P(M_h) + P(M_s) = ∑_{i=1}^{3} P(M_h ∩ W_i) + ∑_{i=1}^{3} P(M_s ∩ W_i)
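The legitimacy check can be sketched numerically (assuming the same hypothetical dict layout as above): the row sums give the marginals P(M_h) and P(M_s), and together they must sum to 1.

```python
# Legitimacy check: all joint probabilities sum to 1;
# row sums give the marginals P(M_h) = 0.6 and P(M_s) = 0.4.
joint = {
    "happy": {"sunny": 0.5, "rainy": 0.05, "foggy": 0.05},
    "sad":   {"sunny": 0.2, "rainy": 0.15, "foggy": 0.05},
}

p_happy = sum(joint["happy"].values())  # P(M_h)
p_sad = sum(joint["sad"].values())      # P(M_s)
total = p_happy + p_sad                 # must equal 1
```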

2) Conditional probability

 You observe the weather condition to infer your state of mind

P(M_h | W_s) = P(M_h ∩ W_s) / P(W_s) = 0.5 / 0.7

 You are confined in a room and cannot see outside. Given that you are
happy, predict the weather.

P(W_s | M_h) = P(W_s ∩ M_h) / P(M_h) = 0.5 / 0.6

P(W_r | M_h) = P(W_r ∩ M_h) / P(M_h) = 0.05 / 0.6

P(W_f | M_h) = P(W_f ∩ M_h) / P(M_h) = 0.05 / 0.6

 Check the legitimacy of the marginal probability


P(B_j) = ∑_{i=1}^{m} P(A_i ∩ B_j) = ∑_{i=1}^{m} P(B_j | A_i) P(A_i)
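A numeric sanity check of this identity on the weather example (same hypothetical dict): P(W_s) computed by summing the joint column must equal the sum of P(W_s | M) P(M) over both moods.

```python
# Law of total probability: P(W_s) two ways must agree.
joint = {
    "happy": {"sunny": 0.5, "rainy": 0.05, "foggy": 0.05},
    "sad":   {"sunny": 0.2, "rainy": 0.15, "foggy": 0.05},
}

# (1) Directly from the joint table: sum the "sunny" column.
p_sunny_joint = sum(row["sunny"] for row in joint.values())

# (2) Via conditionals: sum P(W_s | M) P(M) over moods M.
p_sunny_total = 0.0
for mood, row in joint.items():
    p_mood = sum(row.values())                          # P(M)
    p_sunny_total += (row["sunny"] / p_mood) * p_mood   # P(W_s|M) P(M)
```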

%% Kim’s comment:

Bayes’ rule is very important nowadays; it is indispensable in ML. We should be familiar with this rule.

P(A_i | B) = P(B | A_i) P(A_i) / P(B)

The interpretation may be

- P(A_i | B): given B, the probability of A_i, i.e., the probability of A_i estimated after B is measured == posterior probability

- P(A_i): the probability of the original event A_i == prior probability

- Since

  P(B) = ∑_{j=1}^{n} P(B | A_j) P(A_j)

  it is a marginal probability; it is constant w.r.t. the event A_i (e.g., the probability of sunny).

- P(B | A_i) may be called the conditional marginal probability, i.e., it is a marginal regarding B conditioned on A_i.
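Bayes’ rule on the weather example can be sketched as follows (a minimal illustration, assuming the numbers from the table above): the posterior P(M_h | W_s) is recovered from the likelihood P(W_s | M_h), the prior P(M_h), and the evidence P(W_s).

```python
# Bayes' rule: posterior = likelihood * prior / evidence
def bayes(likelihood, prior, evidence):
    return likelihood * prior / evidence

likelihood = 0.5 / 0.6   # P(W_s | M_h), from the conditional table
prior = 0.6              # P(M_h), marginal of the "happy" row
evidence = 0.7           # P(W_s), marginal of the "sunny" column

posterior = bayes(likelihood, prior, evidence)  # P(M_h | W_s) = 0.5 / 0.7
```

Note that the posterior agrees with P(M_h | W_s) computed directly from the joint table, as it must.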
