Activation Functions in Neural Networks

PRESENTED BY
LALITHADEVI.B
RC2013003011017
FULL TIME RESEARCH SCHOLAR,
DEPT. OF COMPUTER SCIENCE AND ENGINEERING,
SRM INSTITUTE OF SCIENCE AND TECHNOLOGY.
Overview of Neural Networks
- A neural network is made of interconnected neurons.
- Each neuron computes a weighted sum of its inputs:
  x = (weight * input) + bias
- An activation function is applied to this sum to produce the neuron's output.
- The output is compared with the expected value and the error is calculated.
- Based on this error value, the weights and biases of the neurons are
  updated - this is back-propagation (see the sketch below).
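To make this loop concrete, here is a minimal single-neuron sketch; the data, learning rate, and function names are my own illustrative choices, not from the slides:

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

weight, bias = 0.5, 0.1      # initial parameters
inp, expected = 1.0, 0.8     # one training example
lr = 0.5                     # learning rate (illustrative value)

for step in range(200):
    x = weight * inp + bias          # weighted sum plus bias
    out = sigmoid(x)                 # activation
    error = out - expected           # error is calculated
    # back-propagation: the chain rule gives the gradient of the squared
    # error with respect to the weight and the bias
    grad = error * out * (1 - out)
    weight -= lr * grad * inp
    bias -= lr * grad

print(sigmoid(weight * inp + bias))  # approaches the expected 0.8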
Types of Activation Functions
1. Binary Step Function
f(x) = 1, x>=0
= 0, x<0
def binary_step(x):
    # outputs 1 for non-negative inputs, 0 otherwise
    if x < 0:
        return 0
    else:
        return 1
Derivative(Gradient)
f '(x) = 0, for all x != 0 (undefined at x = 0)
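A matching gradient sketch (the function name is my own). Since the gradient is zero wherever it is defined, back-propagation cannot update weights through this activation:

def binary_step_gradient(x):
    # the step function is flat on both sides of 0
    return 0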
2. Linear Function
f(x) = ax
def linear_function(x):
    # linear activation with slope a = 4
    return 4 * x
Derivative(Gradient)
f '(x) = a
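The corresponding gradient, using the same a = 4 as above (the function name is my own):

def linear_gradient(x):
    # the slope is constant, so the gradient is a = 4 for every input
    return 4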
3. Sigmoid Function
f(x) = 1/(1+e^(-x))
import numpy as np

def sigmoid_function(x):
    # squashes any real input into the range (0, 1)
    z = 1 / (1 + np.exp(-x))
    return z
Derivative(Gradient)
f'(x) = sigmoid(x)*(1-sigmoid(x))
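A sketch of this gradient, reusing sigmoid_function from above; the finite-difference check at the end is my own addition:

def sigmoid_gradient(x):
    s = sigmoid_function(x)
    return s * (1 - s)

# quick numerical check at x = 0.5: both lines print nearly equal values
h = 1e-6
print(sigmoid_gradient(0.5))
print((sigmoid_function(0.5 + h) - sigmoid_function(0.5 - h)) / (2 * h))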
4. Tanh Function
tanh(x) = 2*sigmoid(2x) - 1
tanh(x) = 2/(1+e^(-2x)) - 1
Range = (-1, 1)
def tanh_function(x):
    # rescaled sigmoid: output lies in (-1, 1) and is zero-centred
    z = 2 / (1 + np.exp(-2 * x)) - 1
    return z
Derivative(Gradient)
f '(x) = 1 - tanh(x)^2
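A matching sketch, using NumPy's built-in np.tanh (the function name is my own):

def tanh_gradient(x):
    t = np.tanh(x)
    return 1 - t ** 2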
5. ReLU Function
f(x) = max(0, x)
Range = [0, inf)
def relu_function(x):
    # passes positive inputs through unchanged, zeroes out the rest
    if x < 0:
        return 0
    else:
        return x
Derivative(Gradient)
f '(x) = 1, x>=0
= 0, x<0
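A vectorized sketch of this gradient using np.where (the helper name is my own):

def relu_gradient(x):
    # 1 where x >= 0, else 0; works on scalars and NumPy arrays alike
    return np.where(x >= 0, 1.0, 0.0)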
6. Leaky ReLU Function
f(x) = 0.01x, x<0
     = x, x>=0
def leaky_relu_function(x):
    # small non-zero slope (0.01) for negative inputs keeps the gradient alive
    if x < 0:
        return 0.01 * x
    else:
        return x
Derivative(Gradient)
f '(x) = 1, x>=0
= 0.01, x<0
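And its gradient, again vectorized with np.where (the helper name is my own):

def leaky_relu_gradient(x):
    # 1 for x >= 0, 0.01 for x < 0
    return np.where(x >= 0, 1.0, 0.01)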
7. Softmax Function
f(x_j) = e^(x_j) / Σ_k e^(x_k)

def softmax_function(x):
    # exponentiate each score, then normalise so the outputs sum to 1
    z = np.exp(x)
    z_ = z / z.sum()
    return z_
Suppose we get the outputs [1.2, 0.9, 0.75] from the neurons.
Applying the softmax function over these values gives [0.42, 0.31, 0.27],
which sum to 1 and can be read as probabilities.
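A quick check of that example (the rounding is my own):

print(np.round(softmax_function(np.array([1.2, 0.9, 0.75])), 2))
# -> [0.42 0.31 0.27]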
THANK YOU