Figure: single-layer two-input perceptron. Inputs x1 and x2, weighted by w1 and w2, feed a linear combiner; a hard limiter with threshold θ produces the output Y.
Step 1: Initialisation
Set initial weights w1, w2, …, wn and threshold θ to random numbers in the range [−0.5, 0.5].
If the error, e(p) = Yd(p) − Y(p), is positive, we need to increase the perceptron output Y(p); if it is negative, we need to decrease Y(p).
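The initialisation step can be sketched in Python as follows (the function name `init_weights`, the fixed seed, and the two-input setup are illustrative assumptions, not from the slides):

```python
import random

def init_weights(n, low=-0.5, high=0.5, seed=0):
    """Draw n initial weights and a threshold uniformly from [low, high]."""
    rng = random.Random(seed)           # seeded for reproducibility (assumption)
    weights = [rng.uniform(low, high) for _ in range(n)]
    threshold = rng.uniform(low, high)
    return weights, threshold

weights, theta = init_weights(2)        # two inputs, as in the figure
```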
Step 2: Activation
Activate the perceptron by applying inputs x1(p), x2(p), …, xn(p) and desired output Yd(p). Calculate the actual output at iteration p:

    Y(p) = step[ Σ(i=1..n) xi(p) wi(p) − θ ]

where n is the number of perceptron inputs and step is the hard-limiter activation function.
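The activation step above can be written as a short Python sketch (function names and the example weights are assumptions for illustration; the example weights happen to realise logical AND with threshold 0.3):

```python
def step(v):
    """Hard limiter: outputs 1 when the net input is non-negative, else 0."""
    return 1 if v >= 0 else 0

def perceptron_output(x, w, theta):
    """Y(p) = step( sum_i x_i(p) * w_i(p) - theta )."""
    net = sum(xi * wi for xi, wi in zip(x, w)) - theta
    return step(net)

print(perceptron_output([1, 1], [0.2, 0.2], 0.3))  # -> 1 (0.4 - 0.3 >= 0)
print(perceptron_output([1, 0], [0.2, 0.2], 0.3))  # -> 0 (0.2 - 0.3 < 0)
```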
Step 3: Weight training
Update the weights:

    wi(p + 1) = wi(p) + Δwi(p)

where Δwi(p) is the weight correction at iteration p. The weight correction is computed by the delta rule:

    Δwi(p) = α · xi(p) · e(p)

where α is the learning rate.
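Putting the steps together, one possible training loop looks like this sketch (the fixed threshold, the learning rate α = 0.1, the initial weights, and the logical-AND training set are assumptions chosen for illustration):

```python
def train_perceptron(samples, w, theta, alpha=0.1, max_epochs=50):
    """Repeat activation and weight training until no sample is misclassified.

    Delta rule: w_i(p+1) = w_i(p) + alpha * x_i(p) * e(p)
    """
    for _ in range(max_epochs):
        converged = True
        for x, y_d in samples:
            # Step 2: actual output Y(p) = step(sum_i x_i w_i - theta)
            y = 1 if sum(xi * wi for xi, wi in zip(x, w)) - theta >= 0 else 0
            e = y_d - y                      # e(p) = Yd(p) - Y(p)
            if e != 0:
                converged = False
                # Step 3: positive e raises Y(p); negative e lowers it
                w = [wi + alpha * xi * e for wi, xi in zip(w, x)]
        if converged:
            break
    return w

# Learning logical AND (training set and starting values are assumptions):
AND = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w = train_perceptron(AND, [0.3, -0.1], theta=0.2)
```

With these starting values the loop converges in a few epochs to weights that classify all four AND patterns correctly.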
January 23, 2022 Neural Networks 6
Perceptron’s training algorithm