Lecture-5
Bayesian Networks
Graphical Models
• Graphical models provide an efficient structure to represent dependencies
in probabilistic systems.
• There are two main types of graphical models for probabilistic systems:
• Bayesian Networks are directed graphical models
• Markov Networks (Markov Random Fields) are undirected graphical models
(they can model some dependencies Bayes nets cannot; not discussed here)
• Both types of models can represent different types of dependencies
• Graphical models in probabilistic systems allow the representation
of the interdependencies of random variables
• Structure shows dependency relations
• Inference can use the structure to control the computations
• Graphical models provide a basis for a number of efficient problem
solutions
• Inference of prior and conditional probabilities
• Learning of network structure
Bayesian Networks
Joint Distribution
• Remember a Bayesian Network should be a simple representation
of a system with a large number of probabilistic variables with some
independencies.
• Calculating the joint distribution can be done using the chain rule together with the network's independence assumptions:
P(x₁, …, xₙ) = ∏ᵢ P(xᵢ | parents(Xᵢ))
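As an illustration, this factorization can be evaluated numerically. The sketch below uses a hypothetical three-node network (Rain → Sprinkler, {Rain, Sprinkler} → WetGrass) with made-up CPT values; the network and numbers are assumptions, not from the lecture.

```python
# Hypothetical network: Rain -> Sprinkler, {Rain, Sprinkler} -> WetGrass.
# All CPT values below are illustrative assumptions.
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: {True: 0.01, False: 0.99},    # P(Sprinkler | Rain)
               False: {True: 0.40, False: 0.60}}
P_wet = {(True, True): 0.99, (True, False): 0.90,  # P(Wet=T | Sprinkler, Rain)
         (False, True): 0.80, (False, False): 0.05}

def joint(rain, sprinkler, wet):
    """P(rain, sprinkler, wet) = P(rain) * P(sprinkler|rain) * P(wet|sprinkler,rain)."""
    p_w = P_wet[(sprinkler, rain)]
    return P_rain[rain] * P_sprinkler[rain][sprinkler] * (p_w if wet else 1 - p_w)

print(joint(True, False, True))  # 0.2 * 0.99 * 0.80
```

Because each factor is a single CPT entry, the full joint table never has to be stored; summing `joint` over all 8 assignments returns 1.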
Conditional Independence
Node Ordering
Child Continuous, Parent Discrete
Hybrid Example
P(Cost = c | Harvest = h, Subsidy) — a continuous child (Cost) with one continuous parent (Harvest) and one discrete parent (Subsidy)
Example taken from RN2003 (Russell & Norvig, Artificial Intelligence: A Modern Approach, 2nd ed., 2003)
Linear Gaussian Distribution
So, why did we do all of this?
Inference
Inference by Enumeration
Inference by Enumeration – Example
• What is P(SP | rs, fl)?
• P(SP | rs, fl) = α Σₕ P(SP, rs, fl, h), summing the joint distribution over the hidden variables h
• This requires 4 additions, each term being a product of n factors
• The worst-case complexity is O(2ⁿ) for n Boolean variables
• The example shows "variable elimination", with which the real complexity can be reduced. (The complexity also depends on the sparsity of the network and on which variables are used as evidence, which as query, and which are hidden.)
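The enumeration procedure can be sketched on a small hypothetical network (Rain → Sprinkler, {Rain, Sprinkler} → Wet, with made-up CPT values): sum the joint over the hidden variable, then normalize over the query variable.

```python
# Inference by enumeration on a hypothetical 3-node network.
# Query: P(Sprinkler | Wet = wet); hidden variable: Rain.
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: {True: 0.01, False: 0.99},    # P(Sprinkler | Rain)
               False: {True: 0.40, False: 0.60}}
P_wet = {(True, True): 0.99, (True, False): 0.90,  # P(Wet=T | Sprinkler, Rain)
         (False, True): 0.80, (False, False): 0.05}

def joint(rain, sprinkler, wet):
    p_w = P_wet[(sprinkler, rain)]
    return P_rain[rain] * P_sprinkler[rain][sprinkler] * (p_w if wet else 1 - p_w)

def query_sprinkler(wet):
    # P(Sprinkler | Wet) = alpha * sum_rain P(rain, sprinkler, wet):
    # sum out the hidden variable, then normalize (alpha = 1/z).
    unnorm = {s: sum(joint(r, s, wet) for r in (True, False)) for s in (True, False)}
    z = sum(unnorm.values())
    return {s: p / z for s, p in unnorm.items()}

print(query_sprinkler(True))
```

With all variables Boolean this touches every joint entry, which is exactly the exponential cost the slide mentions; variable elimination avoids recomputing shared factors.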
Approximate Inference
• Simplest method: direct (prior) sampling.
• Ignore any evidence you may have for the nodes.
• Sample each variable in topological order, conditioning on the outcomes of its already-sampled parents. Do this many times (say, M times).
• This results in M N-tuples, e.g., {(x₁⁽ᵐ⁾, …, x_N⁽ᵐ⁾) | 1 ≤ m ≤ M}
• Individual and joint probabilities can now be estimated from how many times out of the M samples an event occurred.
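The steps above can be sketched as follows, again on a hypothetical network (Rain → Sprinkler, {Rain, Sprinkler} → Wet) with made-up CPT values:

```python
import random

# Direct (prior) sampling: sample each node in topological order,
# conditioning on the already-sampled parents. CPTs are illustrative.
random.seed(0)
P_rain = 0.2
P_sprinkler = {True: 0.01, False: 0.40}              # P(Sprinkler=T | Rain)
P_wet = {(True, True): 0.99, (True, False): 0.90,    # P(Wet=T | Sprinkler, Rain)
         (False, True): 0.80, (False, False): 0.05}

def sample_once():
    rain = random.random() < P_rain
    sprinkler = random.random() < P_sprinkler[rain]
    wet = random.random() < P_wet[(sprinkler, rain)]
    return rain, sprinkler, wet

M = 100_000
samples = [sample_once() for _ in range(M)]
# Estimate a probability by its relative frequency among the M tuples:
p_hat = sum(1 for (r, s, w) in samples if w) / M
print(f"P(Wet=true) ~ {p_hat:.3f}")
```

For these assumed CPTs the exact value is about 0.46, and the frequency estimate converges to it as M grows.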
Rejection Sampling in Bayesian Networks
• Recall: rejection sampling was used to sample from a hard-to-sample distribution given an easy one.
• Here it is used to incorporate evidence and thus to estimate conditional probabilities.
• Having the M N-tuples, count how many times the evidence occurred and, out of those, how many times the query occurred (for Boolean variables). The conditional probability is estimated by the ratio of these two counts.
• The problem: if the evidence has a very low probability, almost all samples are rejected, so conditioning on it requires huge sets of samples.
Likelihood Weighting
Markov Chain Monte Carlo
References