
Feed Forward Network

Backpropagation Neural Network Problem

S. Vivekanandan

Cabin: TT 319A
E-Mail: svivekanandan@vit.ac.in
Mobile: 8124274447
[Figure: a 3-3-1 feed-forward network with inputs $x_1, x_2, x_3$ plus a bias unit, hidden units $z_1, z_2, z_3$ (input-to-hidden weights $v_{ij}$, hidden biases $v_{0j}$), and a single output $y$ (hidden-to-output weights $w_{jk}$, output bias $w_{0k}$).]
1. Initialize the weights and biases
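A minimal NumPy sketch of this step is given below; the layer sizes (3 inputs, 3 hidden units, 1 output, matching the figure) and the small uniform range are illustrative assumptions rather than values fixed by the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed layer sizes (3 inputs, 3 hidden units, 1 output), matching the figure above.
n_in, n_hidden, n_out = 3, 3, 1

# Input-to-hidden weights v_ij and hidden biases v_0j.
V  = rng.uniform(-0.5, 0.5, size=(n_in, n_hidden))
v0 = rng.uniform(-0.5, 0.5, size=n_hidden)

# Hidden-to-output weights w_jk and output biases w_0k.
W  = rng.uniform(-0.5, 0.5, size=(n_hidden, n_out))
w0 = rng.uniform(-0.5, 0.5, size=n_out)
```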

2. Feed forward stage

2a. Hidden layer

Find the net input to each hidden unit:
$z_{in_j} = v_{0j} + \sum_i x_i v_{ij}$

Apply the activation function (binary sigmoid):
$z_j = f(z_{in_j}) = \dfrac{1}{1 + e^{-z_{in_j}}}$

2b. Output layer

Find the net input to each output unit:
$y_{in_k} = w_{0k} + \sum_j z_j w_{jk}$

Apply the activation function:
$y_k = f(y_{in_k}) = \dfrac{1}{1 + e^{-y_{in_k}}}$


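The two feed-forward equations translate directly into a short NumPy routine. The sketch below assumes V has shape (inputs x hidden) and W has shape (hidden x outputs); the variable names x, V, v0, W, w0 and the example values are illustrative, not taken from the slides.

```python
import numpy as np

def sigmoid(a):
    """Binary sigmoid f(a) = 1 / (1 + exp(-a))."""
    return 1.0 / (1.0 + np.exp(-a))

def feed_forward(x, V, v0, W, w0):
    """One forward pass through the two-layer network of step 2."""
    z_in = v0 + x @ V     # hidden net input: z_in_j = v_0j + sum_i x_i * v_ij
    z    = sigmoid(z_in)  # hidden activation: z_j = f(z_in_j)
    y_in = w0 + z @ W     # output net input: y_in_k = w_0k + sum_j z_j * w_jk
    y    = sigmoid(y_in)  # output activation: y_k = f(y_in_k)
    return z_in, z, y_in, y

# Example with illustrative values (3 inputs, 3 hidden units, 1 output):
x  = np.array([0.0, 1.0, 1.0])
V  = np.full((3, 3), 0.2); v0 = np.full(3, 0.1)
W  = np.full((3, 1), 0.3); w0 = np.full(1, 0.1)
z_in, z, y_in, y = feed_forward(x, V, v0, W, w0)
```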
3. Backpropagation of error

Output layer error term (for each output unit $k$):
$\delta_k = (t_k - y_k)\, f'(y_{in_k})$, where $f'(x) = f(x)\,(1 - f(x))$

Hidden layer error term (for each hidden unit $j$):
$\delta_{in_j} = \sum_{k=1}^{m} \delta_k\, w_{jk}$

$\delta_j = \delta_{in_j}\, f'(z_{in_j})$
4. Weight Update

4a. Weight correction for the output layer:

$w_{jk}(\text{new}) = w_{jk}(\text{old}) + \Delta w_{jk}$, where $\Delta w_{jk} = \alpha\, \delta_k\, z_j$

Bias correction: $\Delta w_{0k} = \alpha\, \delta_k$

4b. Weight correction for the hidden layer:

$v_{ij}(\text{new}) = v_{ij}(\text{old}) + \Delta v_{ij}$, where $\Delta v_{ij} = \alpha\, \delta_j\, x_i$

Bias correction: $\Delta v_{0j} = \alpha\, \delta_j$

Here $\alpha$ is the learning rate.


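A sketch of the update step, writing $\Delta w_{jk} = \alpha \delta_k z_j$ and $\Delta v_{ij} = \alpha \delta_j x_i$ as outer products so every weight is corrected at once. The learning-rate value alpha = 0.25 is an assumed example, and the variable names follow the notation above.

```python
import numpy as np

def update_weights(V, v0, W, w0, x, z, delta_j, delta_k, alpha=0.25):
    """Apply the step-4 corrections and return the updated weights and biases."""
    # Output layer: w_jk(new) = w_jk(old) + alpha * delta_k * z_j, and w_0k += alpha * delta_k
    W_new  = W  + alpha * np.outer(z, delta_k)
    w0_new = w0 + alpha * delta_k

    # Hidden layer: v_ij(new) = v_ij(old) + alpha * delta_j * x_i, and v_0j += alpha * delta_j
    V_new  = V  + alpha * np.outer(x, delta_j)
    v0_new = v0 + alpha * delta_j

    return V_new, v0_new, W_new, w0_new
```

One pass of steps 2 through 4 over a single training pair is one iteration; training repeats these passes until the output error is acceptably small.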
