Unit 11:
Uncertainty
Outline
• Probability Distribution
• Joint Distribution
• Marginal Distribution
• Conditional Distribution
• Product Rule
• Probabilistic Inference
• Bayes’ Rule
References:
• Chapter 13 in Russell & Norvig
• CS188 Lecture Note: Probability [link]
Uncertainty
There are many sources of uncertainty in the real world:
▪ Medical Diagnosis
• temperature, blood pressure, types of pain -> disease?
▪ Speech Recognition
• sound signals -> sentence?
▪ Tracking objects
• current position, speed, acceleration -> next position?
▪ Genetics
• gene expression data -> gene interactions?
▪ Error correcting codes
• data corrupted with noise -> original message?
▪ … lots more!
Random Variable
A random variable is a variable whose value is determined by a random outcome; its domain is the set of values it can take (e.g., W ∈ {hazy, sunny, rainy}, T ∈ {hot, cold}).
Probability Distribution
P(E) = Σ_{(x1, x2, …, xn) ∈ E} P(x1, x2, …, xn)
The probability of an event E is the sum of the probabilities of the outcomes it contains.
For example: find the probability of the weather (W) being hazy.

P(T, W)
        hazy   sunny   rainy
hot     0.25   0.30    0.05
cold    0.15   0.05    0.20

P(W = hazy) = 0.25 + 0.15 = 0.4
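The marginal can be checked mechanically; a minimal Python sketch (the dictionary layout and names are illustrative, not from the slides):

```python
# Joint distribution P(T, W) from the table above.
joint = {
    ("hot", "hazy"): 0.25, ("hot", "sunny"): 0.30, ("hot", "rainy"): 0.05,
    ("cold", "hazy"): 0.15, ("cold", "sunny"): 0.05, ("cold", "rainy"): 0.20,
}

def marginal_w(joint, w):
    """Marginal P(W = w): sum the joint entries over all temperatures T."""
    return sum(p for (t, wv), p in joint.items() if wv == w)

p_hazy = marginal_w(joint, "hazy")   # 0.25 + 0.15 = 0.4
```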
Conditional Distributions
Suppose the temperature is hot. The conditional distribution P(W | T = hot) is obtained by dividing the "hot" row of the joint table by the marginal P(hot):
P(hot) = 0.25 + 0.30 + 0.05 = 0.6
P(hazy | hot) = 0.25 / 0.6 ≈ 0.42, P(sunny | hot) = 0.30 / 0.6 = 0.5, P(rainy | hot) = 0.05 / 0.6 ≈ 0.08

▪ Example:
P(T, W)
        hazy   sunny   rainy
hot     0.25   0.30    0.05
cold    0.15   0.05    0.20

marginal:
P(hazy) = P(hazy, hot) + P(hazy, cold) = 0.25 + 0.15 = 0.4

conditional:
P(hot | hazy) = P(hot, hazy) / P(hazy) = 0.25 / 0.4 = 0.625

product rule:
P(hazy, hot) = P(hot | hazy) P(hazy) = 0.625 × 0.4 = 0.25
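The marginal, conditional, and product-rule steps can be sketched in Python (the helper names are illustrative, not from the slides):

```python
# Joint distribution P(T, W) from the table above.
joint = {
    ("hot", "hazy"): 0.25, ("hot", "sunny"): 0.30, ("hot", "rainy"): 0.05,
    ("cold", "hazy"): 0.15, ("cold", "sunny"): 0.05, ("cold", "rainy"): 0.20,
}

def p_w(w):
    """Marginal P(W = w)."""
    return sum(p for (t, wv), p in joint.items() if wv == w)

def p_t_given_w(t, w):
    """Conditional P(T = t | W = w) = P(t, w) / P(w)."""
    return joint[(t, w)] / p_w(w)

p_hot_given_hazy = p_t_given_w("hot", "hazy")    # 0.25 / 0.4 = 0.625
# Product rule recovers the joint entry:
p_hazy_hot = p_hot_given_hazy * p_w("hazy")      # 0.625 * 0.4 = 0.25
```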
Example 1
Given the joint distribution table below, compute the following conditional distributions:

X     Y     P(X, Y)
¬x    ¬y    0.1
¬x    y     0.4
x     ¬y    0.3
x     y     0.2

▪ P(x | y) = P(x, y) / P(y) = 0.2 / (0.2 + 0.4) = 0.3333
▪ P(¬x | y) = P(¬x, y) / P(y) = 0.4 / (0.2 + 0.4) = 0.6667
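The same computation as a short Python sketch (the boolean encoding is one possible choice, not from the slides):

```python
# Joint table from Example 1; True/False encode x/¬x and y/¬y.
joint = {
    (False, False): 0.1,   # P(¬x, ¬y)
    (False, True):  0.4,   # P(¬x, y)
    (True,  False): 0.3,   # P(x, ¬y)
    (True,  True):  0.2,   # P(x, y)
}

p_y = joint[(True, True)] + joint[(False, True)]   # P(y) = 0.6
p_x_given_y = joint[(True, True)] / p_y            # 0.2 / 0.6 ≈ 0.3333
p_notx_given_y = joint[(False, True)] / p_y        # 0.4 / 0.6 ≈ 0.6667
```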
There are two production lines (line 1 and line 2) in a factory with the same output capacity, so P(line1) = P(line2) = 0.5.
▪ Given that 30% of the products from line 1 are faulty, what is the probability that a product is faulty and comes from line 1?
P(line1) = 0.5
P(faulty | line1) = 0.3
P(faulty, line1) = P(faulty | line1) P(line1) = (0.3)(0.5) = 0.15
▪ 10% of the products are faulty and come from line 2. Given a product from line 2, what is the probability that the product is faulty?
P(line2) = 0.5
P(faulty, line2) = 0.1
P(faulty | line2) = P(faulty, line2) / P(line2) = 0.1 / 0.5 = 0.20
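Both parts are direct applications of the product rule; a minimal Python sketch (variable names are illustrative):

```python
# Line 1: product rule in the causal direction.
p_line1 = 0.5
p_faulty_given_line1 = 0.3
p_faulty_and_line1 = p_faulty_given_line1 * p_line1    # 0.15

# Line 2: the joint is given; divide to recover the conditional.
p_line2 = 0.5
p_faulty_and_line2 = 0.1
p_faulty_given_line2 = p_faulty_and_line2 / p_line2    # 0.2
```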
Probabilistic Inference
Inference by Enumeration
Given a full joint distribution, we want to find P(Q | e), where:
▪ Query event: the particular value of Q we are asked about
▪ Query variables: Q — the variables whose distribution we want
▪ Evidence variables: E = e — the variables whose values we have observed
▪ Hidden variables: H — all remaining variables (don't care)

The inference process:
1. Select the entries of the joint table consistent with the evidence e.
2. Sum out the hidden variables: P(Q, e) = Σ_h P(Q, h, e).
3. Normalize to get the conditional probability: P(Q | e) = P(Q, e) / Σ_q P(q, e).
4. Get the event: read off the probability of the query event from P(Q | e).
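The four steps can be sketched as a small Python function; the dictionary layout, names, and the weather example are illustrative, not from the slides:

```python
def enumerate_inference(joint, query_var, evidence):
    """P(query_var | evidence) by enumeration over a full joint table.

    joint    : list of (assignment dict, probability) pairs
    evidence : dict of observed variable -> value
    Returns  : dict mapping each query value to its conditional probability
    """
    # Steps 1-2: keep entries consistent with the evidence and
    # sum out the hidden variables by accumulating per query value.
    totals = {}
    for assignment, p in joint:
        if all(assignment[v] == val for v, val in evidence.items()):
            q = assignment[query_var]
            totals[q] = totals.get(q, 0.0) + p
    # Step 3: normalize so the values sum to 1.
    z = sum(totals.values())
    return {q: p / z for q, p in totals.items()}

# Example with the weather table: P(W | T = hot).
joint = [
    ({"T": "hot",  "W": "hazy"},  0.25),
    ({"T": "hot",  "W": "sunny"}, 0.30),
    ({"T": "hot",  "W": "rainy"}, 0.05),
    ({"T": "cold", "W": "hazy"},  0.15),
    ({"T": "cold", "W": "sunny"}, 0.05),
    ({"T": "cold", "W": "rainy"}, 0.20),
]
dist = enumerate_inference(joint, "W", {"T": "hot"})
# dist["sunny"] = 0.30 / 0.60 = 0.5
```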
Probabilistic Inference Example
▪ For example: consider the joint distribution of having a toothache (Toothache), having a cavity in the teeth (Cavity), and the dentist catching (detecting) a damaged part of the teeth (Catch):

            toothache            ¬toothache
            catch    ¬catch      catch    ¬catch
cavity      0.108    0.012       0.072    0.008
¬cavity     0.016    0.064       0.144    0.576

(values from Chapter 13 of Russell & Norvig)
Example 1
Suppose we want to find the probability of a person having no cavity given that he has a toothache.
Query variable: Cavity ∈ {cavity, ¬cavity}
Evidence variable: Toothache = true
Hidden variable: Catch
Find: P(¬cavity | toothache)

1.–2. Select the entries consistent with toothache and sum out Catch:
P(cavity, toothache) = 0.108 + 0.012 = 0.12
P(¬cavity, toothache) = 0.016 + 0.064 = 0.08
3. Normalize:
P(cavity | toothache) = 0.12 / (0.12 + 0.08) = 0.60
P(¬cavity | toothache) = 0.08 / (0.12 + 0.08) = 0.40
4. Get the event: P(¬cavity | toothache) = 0.40

The same steps give P(Catch | cavity) (evidence: Cavity = true, hidden: Toothache):
P(catch, cavity) = 0.108 + 0.072 = 0.18
P(¬catch, cavity) = 0.012 + 0.008 = 0.02
P(catch | cavity) = 0.18 / (0.18 + 0.02) = 0.90
P(¬catch | cavity) = 0.02 / (0.18 + 0.02) = 0.10
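The toothache query can be checked against the full joint table; a Python sketch (the ¬cavity row values are the ones from the table in Russell & Norvig, Chapter 13):

```python
# Full joint P(Cavity, Toothache, Catch), keyed by boolean assignments.
joint = {
    # (cavity, toothache, catch): probability
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

# Fix Toothache = True, sum out Catch, then normalize.
p_cav_tooth   = sum(p for (c, t, k), p in joint.items() if c and t)        # 0.12
p_nocav_tooth = sum(p for (c, t, k), p in joint.items() if (not c) and t)  # 0.08
p_nocav_given_tooth = p_nocav_tooth / (p_cav_tooth + p_nocav_tooth)        # 0.40
```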
Causal and Diagnostic Probability
▪ Most of the time, we have the causal probability:
P(effect | cause)
For example: P(fever | dengue) = the probability that a dengue patient has a fever. This can easily be computed from historical data.
▪ But more often than not, what we really want is the diagnostic probability:
P(cause | effect)
For example: P(dengue | fever) = given that a patient has a fever, what is the probability that he has dengue? A doctor can use this information to help in his diagnosis.
▪ Is it possible to compute the diagnostic probability from the causal probability?
Bayes’ Rule
P(cause | effect) = P(effect | cause) P(cause) / P(effect)

Given:
P(m) = 0.0001
P(s | m) = 0.8
P(s | ¬m) = 0.01
P(m | s) = ?

P(m | s) = P(s | m) P(m) / [P(s | m) P(m) + P(s | ¬m) P(¬m)]
         = (0.8)(0.0001) / [(0.8)(0.0001) + (0.01)(0.9999)]
         = 0.00008 / 0.010079
         ≈ 0.0079
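The computation is a direct application of Bayes' rule with the law of total probability in the denominator; a Python sketch (the slide lists P(s|m) twice with different values, so the second entry is read as P(s | ¬m)):

```python
# Given quantities; the second likelihood is interpreted as P(s | ¬m).
p_m = 0.0001
p_s_given_m = 0.8
p_s_given_not_m = 0.01

# Total probability for the denominator, then Bayes' rule.
p_s = p_s_given_m * p_m + p_s_given_not_m * (1 - p_m)
p_m_given_s = p_s_given_m * p_m / p_s   # ≈ 0.0079
```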
Example 2
Given:

P(W)
W      P
sunny  0.8
rainy  0.2

P(D | W)
D     W      P
wet   sunny  0.1
dry   sunny  0.9
wet   rainy  0.7
dry   rainy  0.3
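The slide does not show the question, but a natural query over these tables is the diagnostic P(W | D = wet), which Bayes' rule gives; a sketch under that assumption (names are illustrative):

```python
# Prior P(W) and conditional P(D | W) from the tables above.
p_w = {"sunny": 0.8, "rainy": 0.2}
p_d_given_w = {("wet", "sunny"): 0.1, ("dry", "sunny"): 0.9,
               ("wet", "rainy"): 0.7, ("dry", "rainy"): 0.3}

# P(wet) = sum over W of P(wet | W) P(W) = 0.1*0.8 + 0.7*0.2 = 0.22
p_wet = sum(p_d_given_w[("wet", w)] * p_w[w] for w in p_w)

# Bayes' rule: P(W | wet) = P(wet | W) P(W) / P(wet)
p_w_given_wet = {w: p_d_given_w[("wet", w)] * p_w[w] / p_wet for w in p_w}
# Observing "wet" raises the probability of rain well above its 0.2 prior.
```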
Bayesian Network