
CS-8001 SOFT COMPUTING

ACTIVATION FUNCTION & LEARNING


B.TECH IV YEAR

BY: MS. ARUNA BAJPAI
DEPARTMENT OF COMPUTER SCIENCE & ENGINEERING
ITM GOI
ACTIVATION FUNCTION & LEARNING

Presented By
Aruna Bajpai
Terminology
• Biological NN → Artificial Neural Network
1. Soma → Node
2. Dendrites → Input
3. Synapse → Weights or Interconnections
4. Axon → Output
Model of ANN
• y_in = x1·w1 + x2·w2 + x3·w3 + … + xm·wm = Σ xi·wi (i = 1 to m)

• Output Y = F(y_in)
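A minimal Python sketch of this single-neuron model, assuming a step function as F and arbitrary sample inputs and weights (the function names and values here are illustrative, not from the slides):

```python
# Minimal sketch of the ANN model above: net input y_in = sum(xi * wi),
# followed by an activation function F. All names and values are illustrative.

def net_input(inputs, weights):
    """Weighted sum: y_in = x1*w1 + x2*w2 + ... + xm*wm."""
    return sum(x * w for x, w in zip(inputs, weights))

def step_activation(y_in, threshold=0.0):
    """A simple choice of F: 1 if y_in >= threshold, else 0."""
    return 1 if y_in >= threshold else 0

x = [0.5, -1.0, 2.0]              # sample inputs x1..x3
w = [0.4, 0.3, 0.1]               # sample weights w1..w3
y_in = net_input(x, w)            # 0.2 - 0.3 + 0.2 = 0.1
y = step_activation(y_in)         # 1, since 0.1 >= 0
print(y_in, y)
```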
Difference B/W ANN & BNN
• Processing: BNN is massively parallel, slow but superior to ANN; ANN is massively parallel, fast but inferior to BNN.
• Size: BNN has about 10^11 neurons and 10^15 interconnections; ANN has 10^2 to 10^4 nodes.
• Learning: BNN can tolerate ambiguity; ANN requires very precise, structured and formatted data to tolerate ambiguity.
• Fault tolerance: BNN performance degrades with even partial damage; ANN is capable of robust performance, hence has the potential to be fault tolerant.
• Storage capacity: BNN stores the information in the synapse; ANN stores the information in continuous memory locations.
Activation Functions
• An activation function may be defined as the extra force or effort applied over the net input to obtain the exact output.
• In an ANN, the activation function is applied over the net input of a neuron to obtain its exact output.
• It is also known as a Transfer Function.
Types
1. Binary Step Function
• A binary step function is a threshold-based activation function.
• If the input value is above a certain threshold, the neuron is activated and sends exactly the same signal to the next layer; if it is below the threshold, the neuron is not activated.
Linear Activation Function
• It takes the inputs, multiplies them by the weights of each neuron, and creates an output signal proportional to the input.
• Mathematically, f(x) = x (or f(x) = c·x for some constant c).
Sigmoid Activation Function
• It is of two types, as follows:
• Binary sigmoidal function: This activation function squashes the input into the range 0 to 1. It is positive in nature. It is always bounded, which means its output cannot be less than 0 or more than 1. It is also strictly increasing in nature, which means the larger the input, the higher the output. It can be defined as
• F(x) = sigm(x) = 1 / (1 + exp(-x))
• Bipolar sigmoidal function: This activation function squashes the input into the range -1 to 1. It can be positive or negative in nature. It is always bounded, which means its output cannot be less than -1 or more than 1. It is also strictly increasing in nature, like the binary sigmoid function. It can be defined as
• F(x) = sigm(x) = (1 - exp(-x)) / (1 + exp(-x))
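As a quick check on the two formulas above, a minimal Python sketch (the function names are illustrative):

```python
import math

def binary_sigmoid(x):
    """Binary sigmoid: F(x) = 1 / (1 + exp(-x)), bounded in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def bipolar_sigmoid(x):
    """Bipolar sigmoid: F(x) = (1 - exp(-x)) / (1 + exp(-x)), bounded in (-1, 1)."""
    return (1.0 - math.exp(-x)) / (1.0 + math.exp(-x))

for x in (-5, -1, 0, 1, 5):
    print(x, round(binary_sigmoid(x), 4), round(bipolar_sigmoid(x), 4))
```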
Step Function:
• The step function is one of the simplest kinds of activation functions. Here we consider a threshold value, and if the value of the net input, say y, is greater than or equal to the threshold, then the neuron is activated.
• Mathematically (taking the threshold as 0),
• f(x) = 1, if x >= 0
• f(x) = 0, if x < 0
ReLU:
The ReLU function is the Rectified linear unit. It is
the most widely used activation function. It is defined
as:
f(x) = max(0, x)
• The main advantage of using the ReLU function over other activation functions is that it does not activate all the neurons at the same time. If the input is negative, it is converted to zero and the neuron does not get activated.
Tanh Function
The Tanh function is a modified or scaled-up version of the sigmoid function. In the sigmoid function the value of f(z) is bounded between 0 and 1; however, in the case of Tanh the values are bounded between -1 and 1.
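A minimal Python sketch comparing the linear, step, ReLU and Tanh activations on a few sample inputs (the chosen inputs are illustrative):

```python
import math

def linear(x):
    """Linear activation: output proportional to the input, f(x) = x."""
    return x

def step(x, threshold=0.0):
    """Step activation: 1 if x >= threshold, else 0."""
    return 1 if x >= threshold else 0

def relu(x):
    """ReLU activation: f(x) = max(0, x)."""
    return max(0.0, x)

def tanh(x):
    """Tanh activation: bounded between -1 and 1."""
    return math.tanh(x)

for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    print(x, linear(x), step(x), relu(x), round(tanh(x), 4))
```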
Learning
• A learning rule is a method or a mathematical logic that helps a neural network learn from existing conditions and improve its performance.
• Learning, in an artificial neural network, is the process of modifying the weights of the connections between the neurons of a specified network.
• Learning in ANN can be classified into three
categories namely supervised learning,
unsupervised learning, and reinforcement
learning.
Supervised Learning
• As the name suggests, this type of learning is done under the supervision of a teacher. This learning process is dependent on that teacher.
• During the training of ANN under supervised
learning, the input vector is presented to the
network, which will give an output vector.
• This output vector is compared with the desired
output vector. An error signal is generated, if there
is a difference between the actual output and the
desired output vector.
• On the basis of this error signal, the weights
are adjusted until the actual output is matched
with the desired output.
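A minimal sketch of this error-driven weight adjustment, using a perceptron-style update with a step activation (the learning rate, the AND-gate data, and the function name are illustrative assumptions, not from the slides):

```python
def train_supervised(samples, weights, bias=0.0, lr=0.1, epochs=10):
    """Supervised learning sketch: compare the actual output with the desired
    output and adjust the weights in proportion to the error signal."""
    for _ in range(epochs):
        for inputs, target in samples:
            y_in = sum(x * w for x, w in zip(inputs, weights)) + bias
            output = 1 if y_in >= 0 else 0      # step activation
            error = target - output             # error signal from the teacher
            weights = [w + lr * error * x for x, w in zip(inputs, weights)]
            bias += lr * error
    return weights, bias

# Example: learning the logical AND function from labelled samples
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_supervised(data, weights=[0.0, 0.0])
print(w, b)
```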
Unsupervised Learning
• As the name suggests, this type of learning is done
without the supervision of a teacher. This learning
process is independent.

• During the training of an ANN under unsupervised learning, input vectors of similar type are combined to form clusters.
• When a new input pattern is applied, then the neural
network gives an output response indicating the
class to which the input pattern belongs.

• There is no feedback from the environment as to what the desired output should be or whether it is correct or incorrect. Hence, in this type of learning, the network itself must discover the patterns and features in the input data, and the relation between the input data and the output.
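A minimal sketch of this clustering idea using a winner-takes-all (competitive) update, where each cluster's weight vector is pulled toward the input vectors assigned to it; the learning rate, the initial cluster weights and the data are illustrative assumptions:

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def competitive_learning(patterns, clusters, lr=0.3, epochs=20):
    """Unsupervised learning sketch: no desired output is given; the nearest
    cluster (winner) moves toward each input pattern."""
    for _ in range(epochs):
        for p in patterns:
            winner = min(range(len(clusters)), key=lambda i: euclidean(p, clusters[i]))
            clusters[winner] = [w + lr * (x - w) for x, w in zip(p, clusters[winner])]
    return clusters

# Two loose groups of 2-D input vectors; the cluster weights end up near each group
data = [[0.1, 0.2], [0.0, 0.1], [0.9, 1.0], [1.0, 0.8]]
print(competitive_learning(data, clusters=[[0.5, 0.0], [0.5, 1.0]]))
```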
Reinforcement Learning
• As the name suggests, this type of learning is
used to reinforce or strengthen the network over
some critic information. This learning process is
similar to supervised learning; however, we might have much less information.

• During the training of a network under reinforcement learning, the network receives some feedback from the environment.
• This makes it similar to supervised learning.
However, the feedback obtained here is
evaluative not instructive, which means there
is no teacher as in supervised learning.
• After receiving the feedback, the network performs adjustments of its weights to get better information in the future.
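The slides stress that the feedback here is evaluative (a reward from the environment) rather than instructive. As a rough illustration of that difference, here is a minimal epsilon-greedy sketch in Python in which the learner only ever sees a noisy scalar reward for the action it actually took; the bandit setting, the reward values and all names are illustrative assumptions, not from the slides:

```python
import random

def train_with_rewards(true_rewards, epsilon=0.1, steps=2000):
    """Reinforcement learning sketch: evaluative feedback only. The learner is
    never told the correct action, only the reward of the action it took."""
    estimates = [0.0] * len(true_rewards)    # learned value estimate per action
    counts = [0] * len(true_rewards)
    for _ in range(steps):
        if random.random() < epsilon:                         # explore
            action = random.randrange(len(true_rewards))
        else:                                                 # exploit best estimate
            action = max(range(len(true_rewards)), key=lambda a: estimates[a])
        reward = true_rewards[action] + random.gauss(0, 0.1)  # feedback from environment
        counts[action] += 1
        # Adjust the estimate toward the observed reward (incremental average)
        estimates[action] += (reward - estimates[action]) / counts[action]
    return estimates

print(train_with_rewards([0.2, 0.5, 0.8]))   # the last action's estimate should be highest
```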
Quiz
• Who invented perceptron neural networks?
a) McCulloch-Pitts
b) Widrow
c) Minsky & Papert
d) Rosenblatt
• What is adaline in neural networks?
a) adaptive linear element
b) automatic linear element
c) adaptive line element
d) none of the mentioned
• What was the 2nd stage in perceptron model called?
a) sensory units
b) summing unit
c) association unit
d) output unit
• What is an activation value?
a) weighted sum of inputs
b) threshold value
c) main input to neuron
d) none of the mentioned
• Positive sign of weight indicates?
a) excitatory input
b) inhibitory input
c) can be either excitatory or inhibitory as such
d) none of the mentioned
Thank You
