• It was the pioneering work of Hopfield in the early 1980s that led the way
for designing neural networks with feedback paths.
• The work of Hopfield is seen by many as the starting point for the
implementation of associative memory by using a special structure of
recurrent neural networks.
• The associative memory concept means simply that the network is able to
recognize newly presented (noisy or incomplete) patterns using an already
stored “complete” version of that pattern.
• The new pattern is “attracted” to the stable pattern already stored in the
network memories.
Topology of the Hopfield Network
• A number of processing units configured in a single layer (besides
the input and the output layers) with symmetrical synaptic
connections, i.e., wij = wji.
• In the original work of Hopfield, the output of each unit can take a
binary value (either 0 or 1) or a bipolar value (either −1 or 1).
• This value is fed back to all the input units of the network except to
the one corresponding to that output.
• Assume the state of the network, with dimension n (n neurons),
takes bipolar values.
• The activation rule for each neuron is then given by:
oi = 1 if Σj wij oj > θi, oi = −1 if Σj wij oj < θi, and oi remains
unchanged if Σj wij oj = θi.
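The activation rule above can be sketched in Python as follows; the names `w`, `o`, and `theta` are hypothetical, standing for the weight matrix, the current state vector, and the threshold vector:

```python
def activate(w, o, theta, i):
    # Net input to neuron i: weighted sum of all unit outputs minus its threshold.
    v = sum(w[i][j] * o[j] for j in range(len(o))) - theta[i]
    if v > 0:
        return 1
    if v < 0:
        return -1
    return o[i]  # state is left unchanged when the net input equals the threshold
```

For example, with the four-node weight matrix used later in these notes and the noisy state [1, 1, −1, −1], neuron 2 receives a net input of 3 and therefore switches to +1.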
• The learning algorithm for the Hopfield network is based on the Hebbian
learning rule.
• This is one of the earliest procedures for carrying out unsupervised learning.
• The Hebbian learning rule, also known as the outer product rule of storage,
as applied to a set of q presented patterns pk (k = 1, . . . , q), each with
dimension n (n denotes the number of neuron units in the Hopfield network),
is expressed as:
W = (1/n) Σk pk pk^T − (q/n) I,
i.e., wij = (1/n) Σk pki pkj for i ≠ j, and wii = 0 (subtracting (q/n) I
zeroes the self-connections).
Learning Algorithm (cont.)
• Step 4 (retrieval 2): Continue the process for other presented unknown
patterns by starting again from Step 2.
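The retrieval stage can be sketched as a loop of asynchronous updates that repeats until a full sweep leaves the state unchanged, i.e., until the network settles into a stable state; this is an illustrative sketch, and the names `retrieve` and `max_sweeps` are assumptions:

```python
def retrieve(w, pattern, theta=None, max_sweeps=100):
    # Asynchronous retrieval: apply the activation rule neuron by neuron
    # until one complete sweep changes nothing (a stable state is reached).
    n = len(pattern)
    theta = theta or [0.0] * n
    o = list(pattern)
    for _ in range(max_sweeps):
        changed = False
        for i in range(n):
            v = sum(w[i][j] * o[j] for j in range(n)) - theta[i]
            new = 1 if v > 0 else (-1 if v < 0 else o[i])
            if new != o[i]:
                o[i] = new
                changed = True
        if not changed:
            break
    return o
```

With the four-node weight matrix from the example below, a noisy input such as [1, 1, −1, −1] is attracted to the stored pattern [1, 1, 1, −1].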
Example
We need to store a fundamental pattern (memory) given by the vector B =
[1 1 1 −1]T in a four-node bipolar Hopfield network. We will show that all
16 possible states of the network are attracted to the fundamental
memory given by B or to its complement L = [−1 −1 −1 1]T. Let us
presume that all threshold parameters θi are equal to zero. The weight
matrix is given by W = B B^T − I, where the ratio (1/n = 1/4) was
discarded:

W = [  0   1   1  −1
       1   0   1  −1
       1   1   0  −1
      −1  −1  −1   0 ]
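The example's weight matrix can be reproduced directly as the outer product B B^T with the diagonal zeroed (the 1/n factor discarded, as stated above):

```python
# Stored fundamental pattern from the example.
B = [1, 1, 1, -1]
n = len(B)
# W = B B^T - I for a single stored pattern: off-diagonal outer product,
# zero self-connections.
W = [[B[i] * B[j] if i != j else 0 for j in range(n)] for i in range(n)]
# W = [[ 0,  1,  1, -1],
#      [ 1,  0,  1, -1],
#      [ 1,  1,  0, -1],
#      [-1, -1, -1,  0]]
```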
• Using the energy equation E = −(1/2) Σi Σj wij oi oj (all θi = 0), we
compute the energy value for every state (each state is coded with
entries oi, where oi = 1 or −1 for each i = 1, 2, 3, 4).
• Looking at the energy levels, we find that two potential attractors
emerge: the original fundamental pattern coded by [1, 1, 1, −1]T and
its complement given by [−1, −1, −1, 1]T. Both share the lowest
energy value, −6.
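This claim can be checked by enumerating all 16 bipolar states, scoring each with the energy function E = −(1/2) Σi Σj wij oi oj (thresholds zero), and collecting the minima:

```python
from itertools import product

# Weight matrix from the example (1/n factor discarded).
W = [[0, 1, 1, -1], [1, 0, 1, -1], [1, 1, 0, -1], [-1, -1, -1, 0]]

def energy(o):
    # Hopfield energy with zero thresholds: E = -1/2 * sum_ij w_ij o_i o_j.
    return -0.5 * sum(W[i][j] * o[i] * o[j] for i in range(4) for j in range(4))

# Score all 2^4 = 16 bipolar states and keep the lowest-energy ones.
energies = {o: energy(o) for o in product([-1, 1], repeat=4)}
minima = [o for o, e in energies.items() if e == min(energies.values())]
# minima -> [(-1, -1, -1, 1), (1, 1, 1, -1)], both at energy -6
```

The two minima are exactly the fundamental memory B and its complement L, confirming the energy landscape described above.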
• At the retrieval stage, we use