CS340 Machine Learning: Gibbs Sampling in Markov Random Fields
Image denoising

Observed: a noisy image y. Goal: the posterior mean reconstruction x̂ = E[x | y, θ].
Ising model

2D grid of {-1,+1} variables; neighboring variables are correlated.

Pairwise potential:

ψ_ij(x_i, x_j) = ( e^{W_ij}   e^{-W_ij} )
                 ( e^{-W_ij}  e^{W_ij}  )

p(x|θ) = (1/Z(θ)) exp[-H(x|θ)],  where  H(x) = -xᵀWx = -Σ_{<ij>} W_ij x_i x_j
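The energy H(x) above can be evaluated directly by summing over horizontal and vertical neighbor pairs of the grid. A minimal sketch, assuming a uniform coupling W_ij = J (the function name and default are mine, not from the slides):

```python
import numpy as np

def ising_energy(x, J=1.0):
    """Energy H(x) = -J * sum_{<ij>} x_i x_j over horizontal and
    vertical neighbor pairs of a 2D grid of {-1,+1} spins."""
    horiz = np.sum(x[:, :-1] * x[:, 1:])   # pairs (r,c)-(r,c+1)
    vert = np.sum(x[:-1, :] * x[1:, :])    # pairs (r,c)-(r+1,c)
    return -J * (horiz + vert)

x = np.array([[1, 1], [1, 1]])   # fully aligned 2x2 grid: 4 neighbor pairs
print(ising_energy(x))           # -4.0
```

Aligned configurations have low energy (high probability), which is what makes neighboring pixels correlated.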
[Figure: samples from the Ising model at a cold temperature (T = 2) and a hot temperature (T = 5)]
Boltzmann distribution

The probability distribution written in terms of clique potentials:

p(x|θ) = (1/Z(θ)) Π_{c∈C} ψ_c(x_c | θ_c) = exp[-(Σ_{c∈C} H_c(x_c) + log Z(θ))]
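For a tiny model the Boltzmann distribution can be computed exactly by brute force, enumerating all states to get Z(θ). A sketch for a hypothetical 3-node chain with pairwise potentials ψ(x_i, x_j) = exp(W x_i x_j) (the value of W here is an arbitrary illustration):

```python
import numpy as np
from itertools import product

# Tiny 3-node chain x1 - x2 - x3 with a hypothetical coupling W.
W = 0.5
cliques = [(0, 1), (1, 2)]

def unnorm(x):
    """Product of clique potentials exp(W * x_i * x_j)."""
    return np.exp(sum(W * x[i] * x[j] for i, j in cliques))

states = list(product([-1, 1], repeat=3))
Z = sum(unnorm(x) for x in states)            # partition function
probs = {x: unnorm(x) / Z for x in states}    # Boltzmann distribution
print(sum(probs.values()))                    # 1.0
```

This enumeration is exponential in the number of nodes, which is exactly why sampling methods such as Gibbs sampling are needed for grids.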
Ising model

2D grid of {-1,+1} variables; neighboring variables are correlated.

Pairwise energy:

H_ij(x_i, x_j) = ( -W_ij   W_ij  )
                 (  W_ij  -W_ij  )

H(x) = -xᵀWx = -Σ_{<ij>} W_ij x_i x_j

p(x|θ) = (1/Z(θ)) exp[-H(x|θ)]
Local evidence

p(x, y) = p(x) p(y|x) = (1/Z) [Π_{<ij>} ψ_ij(x_i, x_j)] [Π_i p(y_i|x_i)]

Gaussian observation model: p(y_i|x_i) = N(y_i | x_i, σ²)
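The joint p(x, y) combines the Ising prior with the Gaussian local evidence. A sketch of the unnormalized log-joint, assuming a uniform coupling J and dropping constants that do not depend on x (function name and defaults are mine):

```python
import numpy as np

def log_unnorm_joint(x, y, J=1.0, sigma=1.0):
    """Unnormalized log p(x, y): Ising prior term J * sum_{<ij>} x_i x_j
    over grid neighbors, plus Gaussian evidence log N(y_i | x_i, sigma^2)
    up to an additive constant."""
    prior = J * (np.sum(x[:, :-1] * x[:, 1:]) + np.sum(x[:-1, :] * x[1:, :]))
    evidence = -0.5 * np.sum((y - x) ** 2) / sigma**2
    return prior + evidence
```

Configurations that are both smooth (aligned neighbors) and close to the observed image y score highest, which is the trade-off the denoiser exploits.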
Gibbs sampling

A way to draw samples from p(x_{1:D} | y, θ) one variable at a time, i.e., by sampling from the full conditionals p(x_i | x_{-i}):

1. x_1^{s+1} ~ p(x_1 | x_2^s, ..., x_D^s)
2. x_2^{s+1} ~ p(x_2 | x_1^{s+1}, x_3^s, ..., x_D^s)
3. x_i^{s+1} ~ p(x_i | x_{1:i-1}^{s+1}, x_{i+1:D}^s)
4. x_D^{s+1} ~ p(x_D | x_1^{s+1}, ..., x_{D-1}^{s+1})

See MacKay, §29.13.
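The systematic sweep above can be sketched generically: resample each coordinate from its full conditional, always conditioning on the freshest values. As a check that the chain targets the right distribution, the sketch below plugs in the (well-known, not from the slides) full conditionals of a standard bivariate normal with correlation rho, where x_i | x_{-i} ~ N(rho * x_other, 1 - rho²):

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_sweep(x, sample_conditional):
    """One systematic-scan Gibbs sweep: resample each coordinate i from
    p(x_i | x_{-i}), using the values updated earlier in the same sweep."""
    for i in range(len(x)):
        x[i] = sample_conditional(i, x)
    return x

rho = 0.9
def sample_conditional(i, x):
    # Full conditional of a standard bivariate normal with correlation rho.
    other = x[1 - i]
    return rng.normal(rho * other, np.sqrt(1 - rho**2))

x = np.zeros(2)
samples = np.array([gibbs_sweep(x, sample_conditional).copy() for _ in range(5000)])
print(np.corrcoef(samples[1000:, 0], samples[1000:, 1])[0, 1])  # close to 0.9
```

Discarding the first 1000 sweeps as burn-in, the empirical correlation of the samples recovers rho, as it should if the chain has the target as its stationary distribution.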
Full conditional for the Ising model

The full conditional of X_i depends only on its neighbors N_i: all potentials ψ_jk over pairs <jk> not involving i cancel between numerator and denominator.

p(X_i = +1 | x_{-i}) = Π_{j∈N_i} ψ_ij(X_i = +1, x_j) / [Π_{j∈N_i} ψ_ij(X_i = +1, x_j) + Π_{j∈N_i} ψ_ij(X_i = -1, x_j)]

With a uniform coupling W_ij = J, so that ψ_ij(x_i, x_j) = e^{J x_i x_j}:

p(X_i = +1 | x_{-i}) = exp[J Σ_{j∈N_i} x_j] / (exp[J Σ_{j∈N_i} x_j] + exp[-J Σ_{j∈N_i} x_j])
                     = σ(2J Σ_{j∈N_i} x_j)

where σ(u) = 1/(1 + e^{-u}) is the sigmoid function.
Run demo
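A sketch of such a denoising demo, combining the sigmoid full conditional above with the Gaussian local evidence: flipping pixel i changes the evidence term by 2y_i/σ², so p(X_i = +1 | rest) = σ(2J Σ_{j∈N_i} x_j + 2y_i/σ²). The parameter defaults and sweep count here are illustrative assumptions, not the course's exact demo settings:

```python
import numpy as np

def gibbs_denoise(y, J=1.0, sigma=1.0, n_sweeps=20, seed=0):
    """Gibbs sampling for Ising image denoising: resample each pixel from
    p(X_i = +1 | rest) = sigmoid(2*J*sum_of_neighbors + 2*y_i/sigma^2)."""
    rng = np.random.default_rng(seed)
    H, W = y.shape
    x = np.where(y > 0, 1, -1)          # initialize at the sign of the noisy image
    for _ in range(n_sweeps):
        for r in range(H):
            for c in range(W):
                nb = 0.0                # sum of the 4-connected neighbors
                if r > 0:     nb += x[r - 1, c]
                if r < H - 1: nb += x[r + 1, c]
                if c > 0:     nb += x[r, c - 1]
                if c < W - 1: nb += x[r, c + 1]
                eta = 2 * J * nb + 2 * y[r, c] / sigma**2
                p_plus = 1.0 / (1.0 + np.exp(-eta))
                x[r, c] = 1 if rng.random() < p_plus else -1
    return x
```

Averaging the states x over sweeps (after burn-in) would approximate the posterior mean E[x | y, θ] from the first slide; the final state shown here is a single posterior sample.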
Gibbs sampling in directed models

p(X_i | x_{-i}) = p(X_i, x_{-i}) / Σ_x p(x, x_{-i})

Write x_{-i} as the parents U_{1:n} of X_i, its children Y_{1:m}, the children's co-parents Z_{1:m}, and the rest R. Then

p(X_i, U_{1:n}, Y_{1:m}, Z_{1:m}, R) = p(X_i | U_{1:n}) [Π_j p(Y_j | X_i, Z_j)] P(U_{1:n}, Z_{1:m}, R)

The factor P(U_{1:n}, Z_{1:m}, R) cancels between numerator and denominator, so the full conditional involves only the Markov blanket:

p(X_i | x_{-i}) ∝ p(X_i | U_{1:n}) Π_j p(Y_j | X_i, Z_j),  where p(Y_j | X_i, Z_j) = p(Y_j | Pa(Y_j))
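This Markov-blanket rule can be made concrete on a tiny chain U → X → Y with binary variables: the full conditional of X multiplies its own CPD by the CPD of its child and renormalizes. The conditional probability tables below are hypothetical numbers for illustration only:

```python
import numpy as np

# Toy chain U -> X -> Y, all binary (hypothetical CPTs).
p_x_given_u = np.array([[0.8, 0.2],    # p(X | U=0)
                        [0.3, 0.7]])   # p(X | U=1)
p_y_given_x = np.array([[0.9, 0.1],    # p(Y | X=0)
                        [0.4, 0.6]])   # p(Y | X=1)

def full_conditional_x(u, y):
    """p(X | U=u, Y=y) ∝ p(X | u) * p(y | X): the node's own CPD times
    the CPDs of its children -- only the Markov blanket appears."""
    unnorm = p_x_given_u[u] * p_y_given_x[:, y]
    return unnorm / unnorm.sum()

print(full_conditional_x(u=0, y=1))   # [0.4 0.6]
```

This is the update a Gibbs sampler for a directed model (as in BUGS) applies at each node in turn.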
Birats
Samples
Boltzmann machines
Ising model where the graph structure is arbitrary, and the weights W are learned by maximum likelihood
Hopfield network
Boltzmann machine with no hidden nodes (fully connected Ising model)
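A Hopfield network run for pattern recall is the zero-temperature limit of Gibbs sampling in this fully connected Ising model: each deterministic update x_i ← sign(Σ_j W_ij x_j) replaces the sigmoid draw. A minimal sketch with one pattern stored by the Hebbian outer-product rule (the pattern and network size are illustrative):

```python
import numpy as np

def hopfield_recall(W, x, n_iters=5):
    """Asynchronous Hopfield updates x_i <- sign(sum_j W_ij x_j):
    the zero-temperature limit of Gibbs sampling in the Ising model."""
    x = x.copy()
    for _ in range(n_iters):
        for i in range(len(x)):
            h = W[i] @ x
            x[i] = 1 if h >= 0 else -1
    return x

# Hebbian storage of one pattern (a hypothetical 6-unit example).
p = np.array([1, -1, 1, 1, -1, -1])
W = np.outer(p, p).astype(float)
np.fill_diagonal(W, 0.0)               # no self-connections

noisy = p.copy()
noisy[0] = -1                          # corrupt one unit
print(hopfield_recall(W, noisy))       # recovers p
```

Each update can only lower the energy H(x) = -xᵀWx, so the dynamics settle into a stored pattern acting as an attractor.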