Supervised Learning - I
3. Weight adjustment
The weights of all the links are adjusted simultaneously, based on the error that is
propagated backwards through the network.
Definition of Learning
Learning is a process by which the free parameters of a neural network are
adapted through a process of stimulation by the environment in which the
network is embedded.
The type of learning is determined by the manner in which the parameter
changes take place.
- Learning and memory are what make humans intelligent.
- Imagine not being able to recall where we were a moment ago, what we
read a couple of hours ago, what we said some time ago, or where we
went to high school.
• Long term memory can be further divided into two: implicit and explicit
memory.
• Explicit and implicit memory are stored at different sites and use a different
logic.
• Explicit memory deals with everyday facts and events. Its storage site is the
medial temporal lobe and the hippocampus.
DEPARTMENT OF INFORMATION TECHNOLOGY 9 11/09/2021
• Explicit memory is defined as the memory that involves conscious
recollection of past experience.
• For example: recollecting a birthday party celebrated three days ago.
• Implicit memories are stored differently depending upon how they are
acquired
• Unsupervised methods help you to find features which can be useful for
categorization.
• It takes place in real time, so all the input data can be analyzed and labeled in
the presence of learners.
• It is easier to get unlabeled data from a computer than labeled data, which
needs manual intervention.
[Figure: example patterns of Class A1 and Class A2 plotted in the input space]
If at iteration p the actual output is Y(p) and the desired output is Yd(p), then the
error is given by:

    e(p) = Yd(p) - Y(p),   where p = 1, 2, 3, . . .
Step 2: Activation
Activate the perceptron by applying inputs x1(p),
x2(p), …, xn(p) and desired output Yd(p). Calculate
the actual output at iteration p = 1:

    Y(p) = step[ Σ (i = 1 to n) xi(p) wi(p) - θ ]

where n is the number of the perceptron inputs,
θ is the threshold, and step is a step activation function.
The step function is one of the simplest kinds of activation functions. We choose
a threshold value, and if the net input y is greater than or equal to the threshold,
the neuron is activated.
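The step activation can be sketched in a few lines (a minimal illustration; the function name and the sample values 0.3, 0.1, 0.2 are only for demonstration — note that here a net input exactly at the threshold counts as firing, consistent with the worked example that follows):

```python
def step(net, theta):
    """Step activation: output 1 if the net input reaches the threshold, else 0."""
    return 1 if net >= theta else 0

print(step(0.3, 0.2))  # 1: net input exceeds the threshold, the neuron fires
print(step(0.1, 0.2))  # 0: below the threshold, the neuron stays inactive
```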
Perceptron’s training algorithm (continued)
Step 3: Weight training
Update the weights of the perceptron:

    wi(p + 1) = wi(p) + Δwi(p)

where Δwi(p) is the weight correction at iteration p.
The weight correction is computed by the delta rule:

    Δwi(p) = α · xi(p) · e(p)

where α is the learning rate.
Step 4: Iteration
Increase iteration p by one, go back to Step 2 and
repeat the process until convergence.
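The training steps above can be sketched as a small loop (a minimal sketch, not the slides' own code; the name train_perceptron, the zero initial weights, and the epoch cap are illustrative assumptions):

```python
def step(net, theta):
    # Step activation: fire when the net input reaches the threshold
    return 1 if net >= theta else 0

def train_perceptron(samples, n_inputs, theta, alpha, max_epochs=100):
    """Perceptron training: activation, error, and delta-rule weight update."""
    w = [0.0] * n_inputs                  # initial weights (zeros here, for illustration)
    for _ in range(max_epochs):
        converged = True
        for x, yd in samples:
            # Step 2: activation - compute actual output Y(p)
            y = step(sum(xi * wi for xi, wi in zip(x, w)), theta)
            e = yd - y                    # error e(p) = Yd(p) - Y(p)
            if e != 0:
                converged = False
                # Step 3: delta rule, wi(p+1) = wi(p) + alpha * xi(p) * e(p)
                for i in range(n_inputs):
                    w[i] += alpha * x[i] * e
        if converged:                     # Step 4: repeat until convergence
            return w
    return w

# Learning the logical AND with theta = 0.2, alpha = 0.1:
AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
print(train_perceptron(AND, 2, theta=0.2, alpha=0.1))  # [0.1, 0.1]
```

Starting from zero weights, the loop converges in two epochs; the final weights classify all four AND patterns correctly.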
Example of perceptron learning: the logical operation AND

Epoch | Inputs x1 x2 | Desired Yd | Initial w1, w2 | Actual Y | Error e | Final w1, w2
------+--------------+------------+----------------+----------+---------+--------------
  1   |    0    0    |     0      |   0.3, -0.1    |    0     |    0    |   0.3, -0.1
      |    0    1    |     0      |   0.3, -0.1    |    0     |    0    |   0.3, -0.1
      |    1    0    |     0      |   0.3, -0.1    |    1     |   -1    |   0.2, -0.1
      |    1    1    |     1      |   0.2, -0.1    |    0     |    1    |   0.3,  0.0
  2   |    0    0    |     0      |   0.3,  0.0    |    0     |    0    |   0.3,  0.0
      |    0    1    |     0      |   0.3,  0.0    |    0     |    0    |   0.3,  0.0
      |    1    0    |     0      |   0.3,  0.0    |    1     |   -1    |   0.2,  0.0
      |    1    1    |     1      |   0.2,  0.0    |    1     |    0    |   0.2,  0.0
  3   |    0    0    |     0      |   0.2,  0.0    |    0     |    0    |   0.2,  0.0
      |    0    1    |     0      |   0.2,  0.0    |    0     |    0    |   0.2,  0.0
      |    1    0    |     0      |   0.2,  0.0    |    1     |   -1    |   0.1,  0.0
      |    1    1    |     1      |   0.1,  0.0    |    0     |    1    |   0.2,  0.1
  4   |    0    0    |     0      |   0.2,  0.1    |    0     |    0    |   0.2,  0.1
      |    0    1    |     0      |   0.2,  0.1    |    0     |    0    |   0.2,  0.1
      |    1    0    |     0      |   0.2,  0.1    |    1     |   -1    |   0.1,  0.1
      |    1    1    |     1      |   0.1,  0.1    |    1     |    0    |   0.1,  0.1
  5   |    0    0    |     0      |   0.1,  0.1    |    0     |    0    |   0.1,  0.1
      |    0    1    |     0      |   0.1,  0.1    |    0     |    0    |   0.1,  0.1
      |    1    0    |     0      |   0.1,  0.1    |    0     |    0    |   0.1,  0.1
      |    1    1    |     1      |   0.1,  0.1    |    1     |    0    |   0.1,  0.1
Threshold: θ = 0.2; learning rate: α = 0.1
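As a cross-check, the epochs of the AND example can be replayed numerically. The sketch below assumes initial weights w1 = 0.3, w2 = -0.1 (the starting point of the worked example) and uses exact rational arithmetic via fractions.Fraction, because ties at the threshold (net input exactly 0.2) are sensitive to floating-point rounding:

```python
from fractions import Fraction as F

def step(net, theta):
    # Ties at the threshold count as firing, as in the worked example
    return 1 if net >= theta else 0

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
theta, alpha = F(2, 10), F(1, 10)      # threshold 0.2, learning rate 0.1
w = [F(3, 10), F(-1, 10)]              # assumed initial weights 0.3, -0.1

for epoch in range(5):                 # five epochs, as in the worked example
    for x, yd in AND:
        y = step(x[0] * w[0] + x[1] * w[1], theta)
        e = yd - y                     # error e(p) = Yd(p) - Y(p)
        w[0] += alpha * x[0] * e       # delta rule
        w[1] += alpha * x[1] * e

print([float(v) for v in w])  # [0.1, 0.1] - the final weights after epoch 5
```

With these final weights and θ = 0.2, the perceptron outputs 1 only for the input (1, 1), i.e. it has learned logical AND.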
THANK YOU