CONTENTS
Definition
History of Hebbian Learning
Associative Memory Networks
Types of Associative Networks
Advantages of Hebbian Learning
DEFINITION
Associative learning is a theory that states that ideas reinforce each other and can be linked to one another. This lesson explains the theory of associative learning and provides some practical, real-life examples of this type of learning.

Hebbian Learning
It was proposed in 1949 by Donald Hebb. Hebb's principle describes how to alter the weight between model neurons: the weight between two neurons increases if the two neurons activate simultaneously, and decreases if they activate separately. If there is no signal correlation, the weight remains unchanged.
In an associative network, Hebbian learning updates the weight Wij between input Xi and output Yj by

ΔWij = C·Xi·Yj
THE HEBB RULE
The formulaic description of Hebbian learning:

ΔWij = C·Xi·Yj

Where,
Wij : the weight from the ith input neuron to the jth output neuron
C : the learning constant
Xi : the input signal
Yj : the output signal

Associative Memory Networks
It is a type of neural network
It works on the basis of pattern association
It performs a parallel search
It is also known as Content-Addressable Memory (CAM)
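The Hebb rule ΔWij = C·Xi·Yj from the section above can be sketched as a single weight update. This is a minimal sketch in plain Python; the learning constant C = 0.1 and the activation values are assumptions chosen for illustration.

```python
# Hebb rule sketch: the weight change is proportional to the product of
# the pre- and post-synaptic activations (delta_w = C * x_i * y_j).
# C = 0.1 is an assumed learning constant for illustration only.

def hebb_update(w, x_i, y_j, c=0.1):
    """Return the weight after one Hebbian update step."""
    return w + c * x_i * y_j

w = 0.0
w = hebb_update(w, 1, 1)    # both neurons active together -> weight grows
w = hebb_update(w, 1, -1)   # activations disagree (bipolar) -> weight shrinks
w = hebb_update(w, 0, 1)    # no correlation (x_i = 0)       -> weight unchanged
print(w)  # -> 0.0: growth and shrinkage cancel, zero input changes nothing
```

Note that the update is purely local: each weight change depends only on the two activations the connection joins, which is why the same rule scales from a single weight to the full associative networks below.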
Architecture Of Associative Networks
Single-layer networks are used.
In a feed-forward net, information flows from the input units to the output units.
In a recurrent net, connections among the nodes form closed loops.
The learning rule used is the Hebb rule.
The data and weight representation can be in binary or bipolar form.

Types of Associative Networks
Auto-Associative Memory
Hetero-Associative Memory
Auto-Associative Networks
This is a single-layer neural network in which the input training vector and the output target vector are the same.
The network stores a set of patterns.
For this network, the Hebb rule is used.
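The behaviour described above can be sketched in plain Python: the Hebb rule stores one bipolar pattern, and recall from a corrupted copy restores it. The stored pattern and the single flipped component are assumptions chosen for illustration, not part of the formal algorithm given below.

```python
# Auto-associative memory sketch: store a bipolar pattern s with the
# Hebb rule w_ij += s_i * s_j, then recall by thresholding the weighted sums.
# The pattern and the noisy probe below are assumptions for illustration.

def sign(v):
    return 1 if v >= 0 else -1

s = [1, -1, 1, -1]          # stored pattern (input vector == target vector)
n = len(s)

# Step 1: weights start at zero; Step 5: w_ij(new) = w_ij(old) + x_i * y_j
w = [[s[i] * s[j] for j in range(n)] for i in range(n)]

# Recall: present a noisy copy of s (second component flipped) ...
x = [1, 1, 1, -1]
y = [sign(sum(w[i][j] * x[j] for j in range(n))) for i in range(n)]
print(y)  # -> [1, -1, 1, -1]: the network restores the stored pattern
```

This also illustrates the content-addressable behaviour mentioned earlier: a partial or corrupted pattern is enough to retrieve the stored one.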
Architecture
The architecture of an auto-associative network has 'n' input training vectors and an equal number 'n' of output target vectors.
Training Algorithm
It uses the Hebb learning rule.
Step 1 − Initialize all the weights to zero as
wij = 0 (i = 1 to n, j = 1 to n)
Step 2 − Perform steps 3-5 for each input vector.
Step 3 − Activate each input unit as follows −
xi = si (i = 1 to n)
Step 4 − Activate each output unit as follows −
yj = sj (j = 1 to n)
Step 5 − Adjust the weights as follows −
wij(new) = wij(old) + xiyj

Hetero-Associative Networks
This is also a single-layer neural network, in which the input training vector and the output target vector are not the same.
It is static in nature.
There are no non-linear or delay operations.
For this network also, the Hebb rule is used.

Architecture
The architecture of a hetero-associative memory network has 'n' input training vectors and 'm' output target vectors.
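Before the step-by-step algorithm, the hetero-associative idea can be sketched in plain Python: the Hebb rule associates an input vector with a different target vector, and presenting the input recalls that target. The vectors and their sizes (n = 4, m = 2) are assumptions chosen for illustration.

```python
# Hetero-associative memory sketch: associate input s (length n) with a
# different target t (length m) using the Hebb rule w_ij += s_i * t_j.
# The s/t pair below is an assumption for illustration only.

def sign(v):
    return 1 if v >= 0 else -1

s = [1, 1, -1, -1]        # input training vector (n = 4)
t = [1, -1]               # output target vector (m = 2)
n, m = len(s), len(t)

# Step 1: weights start at zero; Step 5: w_ij(new) = w_ij(old) + x_i * y_j
w = [[s[i] * t[j] for j in range(m)] for i in range(n)]

# Recall: presenting the trained input reproduces the associated target.
y = [sign(sum(s[i] * w[i][j] for i in range(n))) for j in range(m)]
print(y)  # -> [1, -1]: the stored target vector
```

Unlike the auto-associative case, the weight matrix here is rectangular (n × m), since input and output dimensions differ.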
Training Algorithm
It uses the Hebb or Delta learning rule.
Step 1 − Initialize all the weights to zero as
wij = 0 (i = 1 to n, j = 1 to m)
Step 2 − Perform steps 3-5 for each input vector.
Step 3 − Activate each input unit as follows −
xi = si (i = 1 to n)
Step 4 − Activate each output unit as follows −
yj = sj (j = 1 to m)
Step 5 − Adjust the weights as follows −
wij(new) = wij(old) + xiyj

Advantage
The main advantage of associative learning is that it is used in pattern association and pattern recognition.

Thanks