Arles Rodríguez
aerodriguezp@unal.edu.co
Facultad de Ciencias
Departamento de Matemáticas
Universidad Nacional de Colombia
Motivation
• Backpropagation is the most difficult part of a
neural network to implement.
• There is a mathematical way to debug the
gradient computation, called gradient
checking.
• It consists of a numerical approximation of the
gradients.
Numerical approximation of gradients
Let's say 𝑓(𝜃) = 𝜃³, with 𝜃 = 1 and 𝜀 = 0.01.

Use the two-sided difference to approximate the derivative:

𝑓′(𝜃) ≈ (𝑓(𝜃 + 𝜀) − 𝑓(𝜃 − 𝜀)) / (2𝜀)

With the given values:

𝑓′(𝜃) ≈ (𝑓(1.01) − 𝑓(0.99)) / (2(0.01))
      = ((1.01)³ − (0.99)³) / (2(0.01))
      = 3.0001

The exact derivative is 𝑓′(𝜃) = 3𝜃² = 3, so the
approximation error is 0.0001.

In the limit this approximation becomes exact:

𝑓′(𝜃) = lim 𝜀→0 (𝑓(𝜃 + 𝜀) − 𝑓(𝜃 − 𝜀)) / (2𝜀)
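The arithmetic above can be checked directly. This is a minimal sketch of the two-sided difference for 𝑓(𝜃) = 𝜃³ at 𝜃 = 1; the function name `f` is just a placeholder for illustration.

```python
def f(theta):
    return theta ** 3

theta, eps = 1.0, 0.01

# Two-sided (central) difference: f'(theta) ~ (f(theta+eps) - f(theta-eps)) / (2*eps)
approx = (f(theta + eps) - f(theta - eps)) / (2 * eps)

# Exact derivative: f'(theta) = 3 * theta^2
exact = 3 * theta ** 2

print(approx)               # close to 3.0001
print(abs(approx - exact))  # approximation error, about 0.0001
```

The two-sided version has error on the order of 𝜀², while the one-sided difference (𝑓(𝜃 + 𝜀) − 𝑓(𝜃)) / 𝜀 has error on the order of 𝜀, which is why gradient checking uses the two-sided form.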
Gradient checking
• Take the parameters and concatenate them
into one big vector 𝜃.
for each i:
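The loop over components can be sketched as follows. This is a minimal illustration, assuming NumPy and a cost function `J` that takes the concatenated parameter vector; the helper name `grad_check` and the quadratic example cost are assumptions, not part of the original slides.

```python
import numpy as np

def grad_check(J, theta, grad, eps=1e-7):
    """Compare an analytic gradient against a numerical approximation.

    J     -- cost function of the concatenated parameter vector theta
    theta -- all parameters concatenated into one big vector
    grad  -- gradient from backpropagation, same shape as theta
    """
    approx = np.zeros_like(theta)
    for i in range(theta.size):
        # Perturb only component i, in both directions
        plus, minus = theta.copy(), theta.copy()
        plus[i] += eps
        minus[i] -= eps
        approx[i] = (J(plus) - J(minus)) / (2 * eps)
    # Relative difference; a small value suggests the backprop gradient is correct
    return np.linalg.norm(approx - grad) / (np.linalg.norm(approx) + np.linalg.norm(grad))

# Example: J(theta) = sum(theta**2) has gradient 2 * theta
theta = np.array([1.0, -2.0, 3.0])
diff = grad_check(lambda t: np.sum(t ** 2), theta, 2 * theta)
print(diff)  # tiny value: the analytic gradient matches the numerical one
```

Perturbing one component at a time keeps every other parameter fixed, so each entry of the approximate gradient isolates one partial derivative of the cost.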