Module 09
An Example of NN using ReLU
Faculty: Engineering
Study Program: Electrical Engineering
The Architecture

[Figure: a 2-2-2 feed-forward network. Inputs i1 and i2 feed hidden neurons h1 and h2 through weights w1..w4; h1 and h2 feed output neurons o1 and o2 through weights w5..w8. Bias b1 feeds the hidden layer and bias b2 feeds the output layer.]
The Forward Pass

[Figure: the same network annotated with its initial values. Inputs: i1 = 0.05, i2 = 0.10. Hidden-layer weights: w1 = 0.15 (i1→h1), w3 = 0.20 (i2→h1), w2 = 0.25 (i1→h2), w4 = 0.30 (i2→h2); bias b1 = 0.35. Output-layer weights: w5 = 0.40 (h1→o1), w7 = 0.50 (h2→o1), w6 = 0.45 (h1→o2), w8 = 0.55 (h2→o2); bias b2 = 0.60. Targets: t_o1 = 0.01, t_o2 = 0.99.]
The Forward Pass

We compute the total net input to each hidden-layer neuron, pass that net input through the activation function (ReLU), then repeat the process for the output-layer neurons.

Total input:
  z_j = net_h_j = Σ_{j=1..n} w_j · a_j + b_j

Activation function (ReLU):
  σ(z_j) = out_h_j = max(0, z_j)

For the hidden layer:
  z_ih1 = w1·i1 + w3·i2 + b1 = 0.15·0.05 + 0.20·0.10 + 0.35 = 0.3775
  z_ih2 = w2·i1 + w4·i2 + b1 = 0.25·0.05 + 0.30·0.10 + 0.35 = 0.3925

  σ(z_ih1) = max(0, 0.3775) = 0.3775
  σ(z_ih2) = max(0, 0.3925) = 0.3925

Both net inputs are positive, so ReLU passes them through unchanged.
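The hidden-layer step can be sketched in plain Python. The weight names and the i2→h1 / i1→h2 pairing follow the slide's formula z_ih1 = w1·i1 + w3·i2 + b1:

```python
def relu(z):
    # ReLU activation: max(0, z)
    return max(0.0, z)

# Inputs, hidden-layer weights, and bias from the example
i1, i2 = 0.05, 0.10
w1, w3 = 0.15, 0.20   # i1 -> h1, i2 -> h1
w2, w4 = 0.25, 0.30   # i1 -> h2, i2 -> h2
b1 = 0.35

z_ih1 = w1 * i1 + w3 * i2 + b1   # net input of h1: 0.3775
z_ih2 = w2 * i1 + w4 * i2 + b1   # net input of h2: 0.3925
out_h1, out_h2 = relu(z_ih1), relu(z_ih2)  # both positive, so unchanged
```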
The Forward Pass

The output-layer neurons repeat the same two steps, taking the hidden-layer activations as their inputs:

  z_ho1 = w5·σ(z_ih1) + w7·σ(z_ih2) + b2 = 0.40·0.3775 + 0.50·0.3925 + 0.60 = 0.94725
  z_ho2 = w6·σ(z_ih1) + w8·σ(z_ih2) + b2 = 0.45·0.3775 + 0.55·0.3925 + 0.60 = 0.98575

Applying the ReLU activation again (both net inputs are positive):

  σ(z_ho1) = max(0, 0.94725) = 0.94725
  σ(z_ho2) = max(0, 0.98575) = 0.98575
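A minimal continuation for the output layer, hard-coding the hidden activations computed in the previous step:

```python
def relu(z):
    # ReLU activation: max(0, z)
    return max(0.0, z)

# Hidden-layer activations from the previous slide
out_h1, out_h2 = 0.3775, 0.3925

# Output-layer weights and bias
w5, w7 = 0.40, 0.50   # h1 -> o1, h2 -> o1
w6, w8 = 0.45, 0.55   # h1 -> o2, h2 -> o2
b2 = 0.60

z_ho1 = w5 * out_h1 + w7 * out_h2 + b2   # net input of o1: 0.94725
z_ho2 = w6 * out_h1 + w8 * out_h2 + b2   # net input of o2: 0.98575
out_o1, out_o2 = relu(z_ho1), relu(z_ho2)
```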
The Cost Function

Mean squared error over the training set:
  MSE = C(w, b) = (1/2n) · Σ_x (t(x) − z(x))²

For this single example, the error at each output neuron is:
  E_o1 = ½ · (t_o1 − σ(z_ho1))² = ½ · (0.01 − 0.94725)² ≈ 0.43922
  E_o2 = ½ · (t_o2 − σ(z_ho2))² = ½ · (0.99 − 0.98575)² ≈ 0.0000090

Total error: E = E_o1 + E_o2 ≈ 0.43923
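The per-output errors can be checked with a few lines, hard-coding the targets and the forward-pass outputs from above:

```python
# Squared-error cost of the example's two outputs
t_o1, t_o2 = 0.01, 0.99            # targets from the slide
out_o1, out_o2 = 0.94725, 0.98575  # ReLU forward-pass outputs

e_o1 = 0.5 * (t_o1 - out_o1) ** 2  # ~0.43922
e_o2 = 0.5 * (t_o2 - out_o2) ** 2  # ~0.0000090
total_error = e_o1 + e_o2
```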
[Figure: the same network redrawn with updated weights w1+ through w8+, the result of the backward pass; biases b1 and b2 are shown unchanged.]
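The updated weights in the figure come from a backward pass the deck does not work through. As an illustration only, here is one gradient-descent update for w5, assuming a learning rate of 0.5 (not stated in the slides) and the chain rule for the squared error with a ReLU output:

```python
# Hypothetical single weight update for w5 (h1 -> o1).
# eta = 0.5 is an assumed learning rate, not taken from the slides.
eta = 0.5
t_o1 = 0.01      # target for o1
out_o1 = 0.94725 # forward-pass output of o1
out_h1 = 0.3775  # hidden activation of h1
z_ho1 = 0.94725  # net input of o1 (positive, so relu'(z_ho1) = 1)

relu_grad = 1.0 if z_ho1 > 0 else 0.0
# Chain rule: dE/dw5 = (out_o1 - t_o1) * relu'(z_ho1) * out_h1
grad_w5 = (out_o1 - t_o1) * relu_grad * out_h1
w5_new = 0.40 - eta * grad_w5
```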
Thank You
Zendi Iklima, ST, S.Kom, M.Sc