Madhavan Mukund
https://www.cmi.ac.in/~madhavan
Acyclic
Input layer, hidden layers, output layer
Assumptions
Hidden neurons are arranged in layers
Each layer is fully connected to the next
Set a weight to zero to remove an edge
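A minimal numpy sketch of these assumptions (layer sizes, weights, and the tanh activation are illustrative choices, not taken from the slides): the network is a sequence of layers, each fully connected to the next, and zeroing an entry of a weight matrix removes the corresponding edge.

```python
import numpy as np

rng = np.random.default_rng(0)

# Input layer of size 3, two hidden layers of size 4, output layer of size 2
layer_sizes = [3, 4, 4, 2]
weights = [rng.standard_normal((n_out, n_in))
           for n_in, n_out in zip(layer_sizes, layer_sizes[1:])]
biases = [np.zeros(n_out) for n_out in layer_sizes[1:]]

def forward(x, weights, biases, act=np.tanh):
    a = x
    for W, b in zip(weights, biases):
        a = act(W @ a + b)   # each layer fully connected to the next
    return a

x = np.array([1.0, -0.5, 0.25])
y = forward(x, weights, biases)

# "Removing" an edge: set its weight to zero
weights[0][0, 0] = 0.0
y2 = forward(x, weights, biases)
```

Because the layers are applied in sequence with no feedback connections, the computation graph is acyclic by construction.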
Layers $\ell \in \{1, 2, \ldots, L\}$
Inputs are connected to the first hidden layer, layer 1
Layer $L$ is the output layer
Layer $\ell$ has $m_\ell$ nodes $1, 2, \ldots, m_\ell$
Node $k$ in layer $\ell$ has bias $b_k^\ell$, output $z_k^\ell$ and activation value $a_k^\ell$
Weight on edge from node $j$ in layer $\ell-1$ to node $k$ in layer $\ell$ is $w_{kj}^\ell$
Let $w_k^\ell = (w_{k1}^\ell, w_{k2}^\ell, \ldots, w_{k m_{\ell-1}}^\ell)$ and $a^{\ell-1} = (a_1^{\ell-1}, a_2^{\ell-1}, \ldots, a_{m_{\ell-1}}^{\ell-1})$
Then $z_k^\ell = w_k^\ell \cdot a^{\ell-1}$
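The dot-product form of $z_k^\ell$ can be checked concretely (the weight and activation values below are illustrative): $w_k^\ell$ collects the weights on the edges into node $k$, and pairing it with the previous layer's activation vector gives the pre-activation in one dot product.

```python
import numpy as np

w_k = np.array([0.5, -1.0, 2.0])      # weights into node k of layer l
a_prev = np.array([1.0, 0.5, 0.25])   # activations a^{l-1}

# z_k = w_k . a_prev = 0.5*1.0 + (-1.0)*0.5 + 2.0*0.25
z_k = np.dot(w_k, a_prev)
print(z_k)  # 0.5
```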
Assume all layers have the same number of nodes
Let $m = \max_{\ell \in \{1, 2, \ldots, L\}} m_\ell$
For any layer $\ell$, for $k > m_\ell$, we set all of $w_{kj}^\ell$, $b_k^\ell$, $z_k^\ell$, $a_k^\ell$ to 0
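A small sketch of this padding trick (the layer widths are illustrative): each layer's vectors are extended with zeros up to the common width $m$, so the "phantom" nodes carry no signal but every layer shares one shape.

```python
import numpy as np

sizes = [3, 4, 2]      # m_l for each layer (illustrative)
m = max(sizes)         # common width m = max over layers

def pad_vec(v, m):
    """Extend a layer vector with zeros up to width m."""
    out = np.zeros(m)
    out[:len(v)] = v
    return out

a = np.array([1.0, 2.0])        # a layer with only 2 real nodes
print(pad_vec(a, m))            # [1. 2. 0. 0.]
```

Since the padded weights and biases are zero, the extra entries stay zero through the dot products and never affect the real nodes.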
Matrix formulation
$$\begin{bmatrix} z_1^\ell \\ z_2^\ell \\ \vdots \\ z_m^\ell \end{bmatrix} = \begin{bmatrix} w_1^\ell \\ w_2^\ell \\ \vdots \\ w_m^\ell \end{bmatrix} \begin{bmatrix} a_1^{\ell-1} \\ a_2^{\ell-1} \\ \vdots \\ a_m^{\ell-1} \end{bmatrix}$$
That is, $z^\ell = W^\ell a^{\ell-1}$, where row $k$ of the matrix $W^\ell$ is $w_k^\ell$
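The stacked equation can be demonstrated with a small matrix-vector product (values are illustrative): row $k$ of the weight matrix is $w_k^\ell$, so one product computes every $z_k^\ell$ at once.

```python
import numpy as np

W = np.array([[0.5, -1.0, 2.0],    # w_1: weights into node 1 of layer l
              [1.0,  0.0, 1.0]])   # w_2: weights into node 2 of layer l
a_prev = np.array([1.0, 0.5, 0.25])

z = W @ a_prev                     # z^l = W^l a^{l-1}
print(z)  # [0.5  1.25]

# Same result as computing each z_k = w_k . a_prev separately
z_rows = np.array([np.dot(row, a_prev) for row in W])
```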