
Artificial Neural Networks - 15

Dr. Aditya Abhyankar

Last Time
BP + RBF
Dr. Ghongade's lecture

Today
Boltzmann Machine

Boltzmann Machines

States of units are binary valued
State transitions are probabilistic
Weight adjustment reflects the degree of desirability of a state
Represent constrained optimization problems
Training is based on maximization of a consensus function
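The bullets above can be sketched in code. This is a minimal sketch, not from the lecture: the upper-triangular sum for the consensus and the acceptance rule 1/(1 + e^(-ΔC/T)) are assumptions based on the standard Boltzmann machine formulation.

```python
import math
import random

def consensus(w, s):
    """Consensus C(s) = sum over i <= j of w[i][j] * s[i] * s[j],
    for binary unit states s[i] in {0, 1}. Diagonal entries w[i][i]
    are self-connections."""
    n = len(s)
    return sum(w[i][j] * s[i] * s[j] for i in range(n) for j in range(i, n))

def try_flip(w, s, i, T):
    """Propose flipping unit i; accept probabilistically based on the
    change in consensus (state transitions are probabilistic)."""
    proposal = list(s)
    proposal[i] = 1 - proposal[i]
    delta = consensus(w, proposal) - consensus(w, s)
    # Flips that raise the consensus are likely accepted; flips that
    # lower it are occasionally accepted, more often at high temperature T.
    if random.random() < 1.0 / (1.0 + math.exp(-delta / T)):
        return proposal
    return s
```

Repeatedly calling `try_flip` on random units while lowering T drives the net towards a maximum-consensus state.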

Boltzmann Machines Architecture

Algorithm


Setting the weights

These machines are fixed-weight machines
Transitions are made towards maximizing the consensus function
The p and b values decide the state transitions
-p is the penalty for violating the condition that only one unit per row/column be ON
The self-connection weight b is a bonus for a unit being ON
Typically these problems require p > b
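For a constraint of this kind, the fixed weights can be set up as follows. This is an illustrative sketch (the grid layout, the flattening scheme, and the function name are assumptions, not from the slides):

```python
def constraint_weights(n, b, p):
    """Weights for an n x n grid of units U_ij, flattened to index i*n + j.
    The self-connection b rewards a unit being ON; -p penalizes two ON
    units sharing a row or column, so with p > b at most one unit per
    row/column is favored at the consensus maximum."""
    size = n * n
    w = [[0.0] * size for _ in range(size)]
    for i in range(n):
        for j in range(n):
            u = i * n + j
            w[u][u] = b  # bonus for being ON
            for k in range(n):
                if k != j:
                    w[u][i * n + k] = -p  # same row
                if k != i:
                    w[u][k * n + j] = -p  # same column
    return w
```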

Paradox 1
If Uij is OFF and none of the connected units are ON, then turning this unit ON increases the consensus by b.
The likelihood of this change being accepted by the net therefore increases.

Paradox 2
If Ui,j+1 is already ON, then an attempt to turn Uij ON changes the consensus by b - p
Since p > b, we have (b - p) < 0, so accepting the change would reduce the total consensus of the net
The net will therefore tend to reject this change
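With the standard Boltzmann acceptance rule (an assumption here; the slides do not reproduce the formula), "tends to reject" can be quantified:

```latex
A(\Delta C) = \frac{1}{1 + e^{-\Delta C / T}},
\qquad
\Delta C = b - p < 0 \;\Longrightarrow\; A(\Delta C) < \tfrac{1}{2}
```

so the change in Paradox 2 is accepted less than half the time, and almost never as the temperature T is lowered; in Paradox 1, ΔC = b > 0 gives A > 1/2.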

Modified Hebbian Learning

Learns the principal components (eigenvectors corresponding to the eigenvalues of the correlation matrix)
The covariance matrix is given as:
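The equation itself did not survive in these notes; the standard definitions, for an input vector x with mean μ, are:

```latex
C = E\big[(\mathbf{x} - \boldsymbol{\mu})(\mathbf{x} - \boldsymbol{\mu})^{T}\big],
\qquad
R = E\big[\mathbf{x}\,\mathbf{x}^{T}\big]
```

where R is the correlation matrix whose eigenvectors (the principal components) the network learns; for zero-mean inputs the two coincide.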

Modified Hebbian Learning


Weight change:

Final output:
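The weight-change and output equations are missing from these notes. A common formulation of modified Hebbian learning is Oja's rule, sketched below for a single linear unit with output y = w·x; the function names and learning rate are illustrative assumptions:

```python
import numpy as np

def oja_step(w, x, lr=0.001):
    """One step of Oja's rule: w <- w + lr * y * (x - y * w), with y = w . x.
    The subtracted y*w term normalizes plain Hebbian growth, so w converges
    to the dominant eigenvector of the input correlation matrix."""
    y = w @ x
    return w + lr * y * (x - y * w)

def principal_component(X, epochs=100, lr=0.001, seed=0):
    """Repeatedly apply Oja's rule over the rows of X; returns the learned w."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X:
            w = oja_step(w, x, lr)
    return w
```

On data whose variance is concentrated along one direction, the learned w aligns with that direction, i.e. with the principal eigenvector of R.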

Algorithm

Boltzmann Machine with Learning

4:2:4 Architecture
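The learning rule is not reproduced in these notes; assuming the standard Boltzmann machine learning rule is intended, it is:

```latex
\Delta w_{ij} = \eta \left( \langle s_i s_j \rangle_{\text{clamped}} - \langle s_i s_j \rangle_{\text{free}} \right)
```

where the clamped correlations are averaged with the visible units fixed to a training pattern (here the 4-bit input and output of the 4:2:4 encoder) and the free correlations are measured with the whole net running freely at equilibrium.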

Algorithm
