
ERROR CORRECTION LEARNING - G Naga Venkatesh (2016PD08)
CONTENTS
Learning
Five basic learning rules
Error correction learning
Uses of Error correction learning
LEARNING-OVERVIEW
What is meant by learning?
The ability of the neural network (NN) to learn from its
environment and to improve its performance through
learning.
-The NN is stimulated by an environment
-The NN undergoes changes in its free parameters
-The NN responds in a new way to the environment
DEFINITION OF LEARNING
Learning is a process by which the free parameters of a
neural network are adapted through a process of stimulation
by the environment in which the network is embedded. The
type of the learning is determined by the manner in which
the parameter changes take place.
FIVE BASIC LEARNING RULES
Error-correction learning <- optimum filtering
Memory-based learning <- memorizing the training data
explicitly
Hebbian learning <- neurobiological
Competitive learning <- neurobiological
Boltzmann learning <- statistical mechanics
ERROR CORRECTION
LEARNING
The error correction learning procedure is simple enough in
conception. The procedure is as follows:
During training an input is put into the network and flows
through the network generating a set of values on the output
units. Then the actual output is compared with the desired
target and the error (mismatch) is computed. If the output and
target match, no change is made to the network. However, if the
output differs from the target, a change must be made to some of
the connections.
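The procedure above can be sketched for a single linear output unit. This is a hedged toy example: the input, target, initial weights, and the step size 0.1 are all illustrative assumptions, not values from the slides.

```python
import numpy as np

# Sketch of the error-correction procedure: present an input, compute
# the output, compare it with the desired target, and change the
# connection weights only when the two differ.

rng = np.random.default_rng(0)
w = rng.normal(size=3)          # synaptic weights (assumed 3 inputs)

x = np.array([1.0, 0.5, -0.2])  # example input
d = 1.0                         # desired target

y = w @ x                       # output of the linear unit
error = d - y                   # mismatch between target and output
if error != 0.0:                # output differs -> adjust connections
    w = w + 0.1 * error * x     # small corrective change
```

After one such correction the output moves toward the target; repeating this over many inputs is the training loop the slide describes.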
ERROR CORRECTION
LEARNING
error signal = desired response - output signal

ek(n) = dk(n) - yk(n)

ek(n) actuates a control mechanism to make the output signal yk(n)
come closer to the desired response dk(n) in a step-by-step manner.

Also called the delta rule or Widrow-Hoff rule.


ERROR CORRECTION
LEARNING - CONTINUED
A cost function E(n) = ½ ek²(n) is the instantaneous value of the error
energy. Minimization of the cost function continues until the synaptic
weights reach a steady state, and it leads to a learning rule commonly
referred to as the delta rule or Widrow-Hoff rule. Let wkj(n) denote the
value of the synaptic weight of neuron k excited by element xj(n) of the
signal vector x(n) at time step n. According to the delta rule, the
adjustment Δwkj(n) applied to the synaptic weight at time step n is
defined by,
Δwkj(n) = η ek(n) xj(n) , where η is the learning-rate parameter
The adjustment made to a synaptic weight of a neuron is proportional to the
product of the error signal and the input signal of the synapse in question.
wkj(n+1) = wkj(n) + Δwkj(n)
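A minimal sketch of the delta rule in code. The two training pairs and the value η = 0.05 are illustrative assumptions chosen so the weights converge to an easy-to-check answer:

```python
import numpy as np

# Delta rule for a single linear neuron y = w · x:
#   e(n) = d(n) - y(n)
#   w(n+1) = w(n) + eta * e(n) * x(n)

eta = 0.05                           # learning-rate parameter
w = np.zeros(2)                      # synaptic weights, start at zero

# toy training pairs (input x(n), desired response d(n))
samples = [(np.array([1.0, 0.0]), 0.5),
           (np.array([0.0, 1.0]), -0.5)]

for epoch in range(200):
    for x, d in samples:
        y = w @ x                    # output signal
        e = d - y                    # error signal
        w = w + eta * e * x          # delta-rule adjustment

print(w)                             # converges toward [0.5, -0.5]
```

Because the two inputs are orthogonal, each weight is driven independently toward its target, which makes the convergence easy to verify by hand.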
MAIN FEATURES
Error-correction learning is local.
The learning rate η determines the stability or convergence.
It is a positive constant that determines the rate of learning as we
proceed from one step in the learning process to another.
The delta rule presumes that the error signal is directly measurable.
The choice of η also has a profound influence on the accuracy
and other aspects of the learning process.
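The influence of the learning rate on stability can be seen in a one-weight toy case. The choices x = 1 and d = 1 are assumptions made so that each update simply scales the remaining error by (1 − η):

```python
# One weight, input x = 1, target d = 1: the delta rule gives
#   w <- w + eta * (1 - w)
# so the error after each step is multiplied by (1 - eta).
# Convergence requires |1 - eta| < 1; a too-large eta diverges.

def train(eta, steps=50):
    w = 0.0
    for _ in range(steps):
        e = 1.0 - w          # error signal
        w += eta * e         # delta-rule update
    return abs(1.0 - w)      # remaining error after training

small = train(0.1)           # stable: error shrinks by 0.9 per step
large = train(2.5)           # unstable: error grows by 1.5 per step
```

This is why the slide calls η's choice profound: the same rule converges or diverges depending only on the constant.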
EXAMPLES OF ERROR
CORRECTION LEARNING
The least-mean-square (LMS) algorithm (Widrow and Hoff),
also called the delta rule.

Its generalization, known as the back-propagation (BP)
algorithm.

Error-correction learning is an example of a closed-loop
feedback system.
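The LMS algorithm's "optimum filtering" role can be sketched as system identification: the error signal fed back in a closed loop drives the filter weights toward those of an unknown system. The true coefficients, η, and iteration count below are illustrative assumptions:

```python
import numpy as np

# LMS sketch: adapt filter weights w so that w · x tracks the desired
# response d produced by an unknown linear system. The error e = d - w·x
# is the closed-loop feedback signal that drives the adjustment.

rng = np.random.default_rng(1)
true_w = np.array([0.8, -0.3])       # unknown system coefficients
w = np.zeros(2)                      # adaptive filter estimate
eta = 0.02                           # learning-rate parameter

for _ in range(5000):
    x = rng.normal(size=2)           # input sample
    d = true_w @ x                   # desired response from real system
    e = d - w @ x                    # error signal (feedback)
    w += eta * e * x                 # LMS / delta-rule update
```

With noise-free targets the estimate settles onto the unknown coefficients, which is the filtering behavior the slide alludes to.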
REAL LIFE APPLICATIONS
Automated medical diagnosis:
Neural networks play an important role in medical decision-making,
helping physicians to provide a fast and accurate diagnosis. Many
research papers discuss diagnosis using the back-propagation
algorithm, which is an application of error-correction learning.
With the development of the back-propagation algorithm, it has
become possible to successfully attack problems requiring neural
networks with high degrees of nonlinearity and high precision.
