
AGENDA

- Conditional probability
- Independence
- Intro to Bayesian networks

**REMEMBER: PROBABILITY NOTATION LEAVES VALUES IMPLICIT**

P(A ∨ B) = P(A) + P(B) − P(A ∧ B) means

**P(A=a ∨ B=b) = P(A=a) + P(B=b) − P(A=a ∧ B=b) for all a ∈ Val(A) and b ∈ Val(B)**

If it helps: A and B are random variables, while A=a and B=b are events. A random variable stands for the many possible events obtained by assigning it each of its values.

CONDITIONAL PROBABILITY

Axiomatic definition: P(A|B) = P(A ∧ B)/P(B), i.e.

P(A ∧ B) = P(A|B) P(B) = P(B|A) P(A)

P(A|B) is the posterior probability of A given knowledge of B.

CONDITIONAL PROBABILITY

P(A|B) is the posterior probability of A given knowledge of B: "Given that I know B, what do I believe about A?" If a new piece of information C arrives, the agent's new belief (if it obeys the rules of probability) should be P(A|B,C).

CONDITIONAL DISTRIBUTIONS

State       P(state)     State        P(state)
C∧T∧P       0.108        ¬C∧T∧P       0.016
C∧T∧¬P      0.012        ¬C∧T∧¬P      0.064
C∧¬T∧P      0.072        ¬C∧¬T∧P      0.144
C∧¬T∧¬P     0.008        ¬C∧¬T∧¬P     0.576

P(Cavity|Toothache) = P(Cavity ∧ Toothache)/P(Toothache)
= (0.108 + 0.012)/(0.108 + 0.012 + 0.016 + 0.064) = 0.6

Interpretation: After observing Toothache, the patient is no longer an "average" one, and the prior probability (0.2) of Cavity is no longer valid. P(Cavity|Toothache) is calculated by keeping the ratios of the probabilities of the 4 cases where Toothache holds unchanged, and normalizing their sum to 1.
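The conditioning step can be reproduced directly from the eight joint entries. A minimal sketch (the tuple keys and variable names are my own; the probabilities are the ones on the slide):

```python
# Full joint distribution over (Cavity, Toothache, PCatch),
# keyed by truth values (c, t, p); numbers from the slide.
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

# P(Toothache): sum the four states where Toothache holds
p_t = sum(pr for (c, t, p), pr in joint.items() if t)

# P(Cavity | Toothache) = P(Cavity ∧ Toothache) / P(Toothache)
p_c_and_t = sum(pr for (c, t, p), pr in joint.items() if c and t)
print(p_c_and_t / p_t)  # ≈ 0.6
```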

P C. T.144 0. P P(state) 0.072 0. T. P C.108 0. T.576 The patient walks into the dentists door Let D now observe evidence E: Toothache holds with probability 0.064 0. P C. T. P C. T.008 0.016 0. P C.012 0. T.8 (e. T.g.. ´the patient says soµ) How should D update its belief state? .UPDATING THE BELIEF STATE State C. T. P C. P C.

UPDATING THE BELIEF STATE

State       P(state)     State        P(state)
C∧T∧P       0.432        ¬C∧T∧P       0.064
C∧T∧¬P      0.048        ¬C∧T∧¬P      0.256
C∧¬T∧P      0.018        ¬C∧¬T∧P      0.036
C∧¬T∧¬P     0.002        ¬C∧¬T∧¬P     0.144

P(Toothache|E) = 0.8

We want to compute P(C∧T∧P|E) = P(C∧P|T,E) P(T|E). Since E is not directly related to the cavity or the probe catch, we consider that C and P are independent of E given T, hence P(C∧P|T,E) = P(C∧P|T), so:

P(C∧T∧P|E) = P(C∧P∧T) P(T|E)/P(T)

In the table, the rows where T holds are scaled so that they sum to 0.8, and the rows where ¬T holds are scaled so that they sum to 0.2.
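The row scaling used here is a small computation (this kind of update on uncertain evidence is sometimes called Jeffrey's rule); a sketch with my own tuple encoding of the states:

```python
# Belief update on uncertain evidence E: P(Toothache|E) = 0.8.
# Rows where Toothache holds are rescaled to sum to 0.8, the
# remaining rows to 0.2; within each group the ratios are unchanged.
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}
p_t = sum(pr for (c, t, p), pr in joint.items() if t)  # P(Toothache) = 0.2

updated = {s: pr * (0.8 / p_t if s[1] else 0.2 / (1 - p_t))
           for s, pr in joint.items()}
print(updated[(True, True, True)])  # 0.108 scaled by 4 → ≈ 0.432
print(sum(updated.values()))        # ≈ 1.0: still a distribution
```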

ISSUES

If a state is described by n propositions, then a belief state contains 2^n states (possibly, some have probability 0).

- Modeling difficulty: many numbers must be entered in the first place
- Computational issue: memory size and time

INDEPENDENCE OF EVENTS

Two events A=a and B=b are independent if P(A=a ∧ B=b) = P(A=a) P(B=b), hence P(A=a|B=b) = P(A=a). Knowing B=b doesn't give you any information about whether A=a is true.

INDEPENDENCE OF RANDOM VARIABLES

Two random variables A and B are independent if P(A ∧ B) = P(A) P(B), hence P(A|B) = P(A). Knowing B doesn't give you any information about A. [This equality has to hold for all combinations of values that A and B can take on, i.e., all events A=a and B=b are independent.]

SIGNIFICANCE OF INDEPENDENCE

If A and B are independent, then P(A ∧ B) = P(A) P(B) => the joint distribution over A and B can be defined as a product of the distribution of A and the distribution of B. Rather than storing a big probability table over all combinations of A and B, store two much smaller probability tables! To compute P(A=a ∧ B=b), just look up P(A=a) and P(B=b) in the individual tables and multiply them together.
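The "two small tables" idea is one line of code. A sketch with made-up variables and numbers (not from the slides):

```python
# Two small marginal tables instead of one |Val(A)| x |Val(B)| joint table.
p_a = {'sunny': 0.7, 'rainy': 0.3}   # P(A), made-up numbers
p_b = {'heads': 0.5, 'tails': 0.5}   # P(B), made-up numbers

def p_joint(a, b):
    """P(A=a ∧ B=b) = P(A=a) * P(B=b); valid only if A and B are independent."""
    return p_a[a] * p_b[b]

print(p_joint('sunny', 'heads'))  # 0.7 * 0.5 = 0.35
```

With n independent variables of v values each, this stores n·v numbers instead of v^n.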

CONDITIONAL INDEPENDENCE

Two random variables A and B are conditionally independent given C if P(A ∧ B|C) = P(A|C) P(B|C), hence P(A|B,C) = P(A|C). Once you know C, learning B doesn't give you any information about A. [Again, this has to hold for all combinations of values that A, B, C can take on.]

SIGNIFICANCE OF CONDITIONAL INDEPENDENCE

Consider Rainy, Thunder, and RoadsSlippery. Ostensibly, thunder doesn't have anything directly to do with slippery roads… but they happen together more often when it rains, so they are not independent. Still, it is reasonable to believe that Thunder and RoadsSlippery are conditionally independent given Rainy. So if I want to estimate whether or not I will hear thunder, I don't need to think about the state of the roads, just whether or not it's raining!

State       P(state)     State        P(state)
C∧T∧P       0.108        ¬C∧T∧P       0.016
C∧T∧¬P      0.012        ¬C∧T∧¬P      0.064
C∧¬T∧P      0.072        ¬C∧¬T∧P      0.144
C∧¬T∧¬P     0.008        ¬C∧¬T∧¬P     0.576

Toothache and PCatch are independent given Cavity, but this relation is hidden in the numbers! [Homework] Bayesian networks explicitly represent independence among propositions to reduce the number of probabilities defining a belief state.

BAYESIAN NETWORK

Notice that Cavity is the "cause" of both Toothache and PCatch, and represent the causality links explicitly:

Cavity → Toothache,  Cavity → PCatch

Give the prior probability distribution of Cavity and the conditional probability tables of Toothache and PCatch:

P(C∧T∧P) = P(T∧P|C) P(C) = P(T|C) P(P|C) P(C)

P(Cavity) = 0.2
P(Toothache|c):  Cavity 0.6,  ¬Cavity 0.1
P(PCatch|c):     Cavity 0.9,  ¬Cavity 0.2

5 probabilities, instead of 7.

CONDITIONAL PROBABILITY TABLES

P(C∧T∧P) = P(T∧P|C) P(C) = P(T|C) P(P|C) P(C)

P(Cavity) = 0.2

P(t|c):        t      ¬t
  Cavity       0.6    0.4
  ¬Cavity      0.1    0.9

P(p|c):        p      ¬p
  Cavity       0.9    0.1
  ¬Cavity      0.2    0.8

Rows sum to 1. If X takes n values, just store n−1 entries per row.
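The factored form P(C∧T∧P) = P(T|C) P(P|C) P(C) can be checked in a few lines. A sketch assuming the CPT values P(Cavity)=0.2, P(T|C)=0.6/0.1, and P(P|C)=0.9/0.2 (the last value chosen for consistency with the eight-entry joint table earlier in these slides):

```python
# Rebuild the 8-entry joint from 5 numbers:
# P(C ∧ T ∧ P) = P(T|C) P(P|C) P(C), with T and P independent given C.
p_cavity = 0.2
p_tooth = {True: 0.6, False: 0.1}   # P(Toothache=true | Cavity=c)
p_catch = {True: 0.9, False: 0.2}   # P(PCatch=true    | Cavity=c)

def joint(c, t, p):
    pc = p_cavity if c else 1 - p_cavity
    pt = p_tooth[c] if t else 1 - p_tooth[c]
    pp = p_catch[c] if p else 1 - p_catch[c]
    return pc * pt * pp

print(joint(True, True, True))     # 0.2 * 0.6 * 0.9 ≈ 0.108
print(joint(False, False, False))  # 0.8 * 0.9 * 0.8 ≈ 0.576
```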

SIGNIFICANCE OF BAYESIAN NETWORKS

If we know that variables are conditionally independent, we should be able to decompose the joint distribution to take advantage of it. Bayesian networks are a way of efficiently factoring the joint distribution into conditional probabilities, and also of building complex joint distributions from smaller models of probabilistic relationships. But:

- What knowledge does the BN encode about the distribution?
- How do we use a BN to compute probabilities of variables that we are interested in?

BAYES' RULE AND OTHER PROBABILITY MANIPULATIONS

P(A∧B) = P(A|B) P(B) = P(B|A) P(A)
P(A|B) = P(B|A) P(A) / P(B)
P(B) = Σ_a P(B|A=a) P(A=a)

This gives us a way to manipulate distributions: e.g., we can derive P(A|B) and P(B) using only P(B|A) and P(A).
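As a worked example of these two manipulations (total probability, then Bayes' rule), with made-up numbers for a hypothetical disease test:

```python
# P(B) = Σ_a P(B|A=a) P(A=a), then P(A|B) = P(B|A) P(A) / P(B).
p_a = 0.01              # P(A): prior, e.g. disease prevalence (made up)
p_b_given_a = 0.95      # P(B|A): test positive given disease (made up)
p_b_given_not_a = 0.05  # P(B|¬A): false positive rate (made up)

p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)  # total probability
p_a_given_b = p_b_given_a * p_a / p_b                  # Bayes' rule
print(p_a_given_b)  # ≈ 0.161: a positive test is still mostly a false alarm
```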

A MORE COMPLEX BN

Burglary → Alarm ← Earthquake;  Alarm → JohnCalls,  Alarm → MaryCalls

Intuitive meaning of an arc from x to y: "x has direct influence on y" (causes point to effects). The network is a directed acyclic graph.

A MORE COMPLEX BN

Size of the CPT for a node with k parents: 2^k

P(B) = 0.001      P(E) = 0.002

B     E     P(A|B,E)
T     T     0.95
T     F     0.94
F     T     0.29
F     F     0.001

A     P(J|A)        A     P(M|A)
T     0.90          T     0.70
F     0.05          F     0.01

10 probabilities, instead of 31.

WHAT DOES THE BN ENCODE?

P(B∧J) ≠ P(B) P(J), but P(B∧J|A) = P(B|A) P(J|A)

Each of the beliefs JohnCalls and MaryCalls is independent of Burglary and Earthquake given Alarm or ¬Alarm. For example, John does not observe any burglaries directly.

WHAT DOES THE BN ENCODE?

P(B∧J|A) = P(B|A) P(J|A)
P(J∧M|A) = P(J|A) P(M|A)

A node is independent of its non-descendants given its parents. For instance, the reasons why John and Mary may not call if there is an alarm are unrelated, so the beliefs JohnCalls and MaryCalls are independent given Alarm or ¬Alarm.

WHAT DOES THE BN ENCODE?

Burglary and Earthquake are independent: again, a node is independent of its non-descendants given its parents.

LOCALLY STRUCTURED WORLD

A world is locally structured (or sparse) if each of its components interacts directly with relatively few other components. In a sparse world, the CPTs are small and the BN contains far fewer probabilities than the full joint distribution. If the number of entries in each CPT is bounded by a constant, i.e., O(1), then the number of probabilities in a BN is linear in n, the number of propositions, instead of 2^n for the joint distribution.
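The linear-versus-exponential count is easy to check on the burglary network, using the number of parents of each node:

```python
# Number of stored probabilities: 2^k per Boolean node with k Boolean
# parents (one P(X=true|...) per parent assignment), versus 2^n - 1
# for the full joint table over n propositions.
parents = {'B': 0, 'E': 0, 'A': 2, 'J': 1, 'M': 1}  # burglary network

bn_params = sum(2 ** k for k in parents.values())
joint_params = 2 ** len(parents) - 1
print(bn_params, joint_params)  # 10 31
```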

EQUATIONS INVOLVING RANDOM VARIABLES

An equation such as C = A∨B or C = max(A,B) constrains the joint probability P(A,B,C). It is nicely encoded as a causality relationship A → C ← B, with the conditional probability of C given by the equation rather than by a CPT.

NAÏVE BAYES MODELS

P(Cause, Effect1, …, Effectn) = P(Cause) Π_i P(Effecti | Cause)

Cause → Effect1, Effect2, …, Effectn

NAÏVE BAYES CLASSIFIER

P(Class, Feature1, …, Featuren) = P(Class) Π_i P(Featurei | Class)

Class: Spam / Not Spam, English / French / Latin, …
Features: word occurrences

Given features, what class?

P(C|F1, …, Fk) = P(C, F1, …, Fk)/P(F1, …, Fk) = 1/Z P(C) Π_i P(Fi|C)
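A minimal naive Bayes classifier sketch; the classes, feature words, priors, and word probabilities are all made up for illustration:

```python
# Classify by P(C|F1..Fk) = 1/Z * P(C) * Π_i P(Fi|C).
priors = {'spam': 0.4, 'ham': 0.6}        # P(Class), made up
p_word = {                                # P(word | Class), made up
    'spam': {'offer': 0.6, 'meeting': 0.05},
    'ham':  {'offer': 0.1, 'meeting': 0.4},
}

def classify(words):
    scores = {}
    for c in priors:
        score = priors[c]
        for w in words:
            score *= p_word[c][w]         # multiply in P(Fi|C)
        scores[c] = score
    z = sum(scores.values())              # normalization constant Z
    return {c: s / z for c, s in scores.items()}

print(classify(['offer']))  # spam ≈ 0.8, ham ≈ 0.2
```

In practice one sums log-probabilities instead of multiplying, to avoid underflow with many features.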

BUT DOES A BN REPRESENT A BELIEF STATE? IN OTHER WORDS, CAN WE COMPUTE THE FULL JOINT DISTRIBUTION OF THE PROPOSITIONS FROM IT?

CALCULATION OF JOINT PROBABILITY

P(B) = 0.001      P(E) = 0.002

B     E     P(A|B,E)
T     T     0.95
T     F     0.94
F     T     0.29
F     F     0.001

A     P(J|A)        A     P(M|A)
T     0.90          T     0.70
F     0.05          F     0.01

P(J∧M∧A∧¬B∧¬E) = ??

P(J∧M∧A∧¬B∧¬E)
= P(J∧M|A,¬B,¬E) × P(A∧¬B∧¬E)
= P(J|A,¬B,¬E) × P(M|A,¬B,¬E) × P(A∧¬B∧¬E)    (J and M are independent given A)
P(J|A,¬B,¬E) = P(J|A)    (J and B, and J and E, are independent given A)
P(M|A,¬B,¬E) = P(M|A)
P(A∧¬B∧¬E) = P(A|¬B,¬E) × P(¬B|¬E) × P(¬E) = P(A|¬B,¬E) × P(¬B) × P(¬E)    (B and E are independent)
P(J∧M∧A∧¬B∧¬E) = P(J|A) P(M|A) P(A|¬B,¬E) P(¬B) P(¬E)

CALCULATION OF JOINT PROBABILITY

P(J∧M∧A∧¬B∧¬E) = P(J|A) P(M|A) P(A|¬B,¬E) P(¬B) P(¬E)
= 0.9 × 0.7 × 0.001 × 0.999 × 0.998 ≈ 0.00062

CALCULATION OF JOINT PROBABILITY

P(J∧M∧A∧¬B∧¬E) = P(J|A) P(M|A) P(A|¬B,¬E) P(¬B) P(¬E)
= 0.9 × 0.7 × 0.001 × 0.999 × 0.998 ≈ 0.00062

In general: P(x1∧x2∧…∧xn) = Π_{i=1,…,n} P(xi|parents(Xi)), i.e., the BN defines the full joint distribution table.
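The chain-rule product is straightforward to code for this network; a sketch using the CPT numbers from the slides (function and variable names are my own):

```python
# Joint probability of the burglary BN by the chain rule:
# P(x1 ∧ … ∧ xn) = Π_i P(xi | parents(Xi)).
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}  # P(A=true | B, E)
P_J = {True: 0.90, False: 0.05}                     # P(J=true | A)
P_M = {True: 0.70, False: 0.01}                     # P(M=true | A)

def ev(p_true, x):
    """P(X=x) from P(X=true)."""
    return p_true if x else 1 - p_true

def joint(j, m, a, b, e):
    return (ev(P_J[a], j) * ev(P_M[a], m) *
            ev(P_A[(b, e)], a) * ev(P_B, b) * ev(P_E, e))

# P(J ∧ M ∧ A ∧ ¬B ∧ ¬E) = 0.9 * 0.7 * 0.001 * 0.999 * 0.998
print(joint(True, True, True, False, False))  # ≈ 0.00062
```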

Since a BN defines the full joint distribution of a set of propositions, it represents a belief state.

QUERYING THE BN

P(Cavity) = 0.1

C     P(T|C)
T     0.4
F     0.01111

The BN gives P(T|C). What about P(C|T)?

P(Cavity|T=t) = P(Cavity ∧ T=t)/P(T=t)
= P(T=t|Cavity) P(Cavity) / P(T=t)    [Bayes' rule]

Querying a BN is just applying Bayes' rule on a larger scale… algorithms next time.
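This query can be carried out numerically; a sketch using the CPT values shown (P(Cavity)=0.1, P(T=t|C)=0.4, P(T=t|¬C)=0.01111), with P(T=t) obtained by summing out Cavity:

```python
# P(Cavity | T=t) = P(T=t|Cavity) P(Cavity) / P(T=t)
p_c = 0.1                          # P(Cavity)
p_t = {True: 0.4, False: 0.01111}  # P(T=t | Cavity=c)

p_t_marg = p_t[True] * p_c + p_t[False] * (1 - p_c)  # P(T=t), total probability
p_c_given_t = p_t[True] * p_c / p_t_marg             # Bayes' rule
print(p_c_given_t)  # ≈ 0.8
```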

MORE COMPLICATED SINGLY-CONNECTED BELIEF NET

Nodes: Battery, Radio, SparkPlugs, Gas, Starts, Moves

SOME APPLICATIONS OF BN

- Medical diagnosis
- Troubleshooting of hardware/software systems
- Fraud/uncollectible debt detection
- Data mining
- Analysis of genetic sequences
- Data interpretation, computer vision, image understanding

Example: image region labeling, with Region ∈ {Sky, Tree, Grass, Rock} and regions R1–R4 related by "Above".

BN to evaluate insurance risks .

PURPOSES OF BAYESIAN NETWORKS

- Efficient and intuitive modeling of complex causal interactions
- Compact representation of joint distributions: O(n) rather than O(2^n)
- Algorithms for efficient inference with given evidence (more on this next time)
