• Soft Computing:-
• Draws inspiration from natural and biological systems, such as the evolution of the human nervous system and the collective behaviour of ants.
• Biological Neuron:-
• Axons connect to dendrites via synapses.
• Electro-chemical signals propagate from the dendritic input, through the cell body, and down the axon to other neurons.
• A neuron only fires if its input signal exceeds a
certain amount (the threshold) in a short time period.
• Synapses vary in strength:
• Strong connections allow a large signal to pass.
• Weak connections allow only a weak signal.
Chapter 1 Soft Computing 10/31/2023 5
Introduction to Neural Network - Biological Neurons, McCulloch and Pitts models of neuron, Models of ANN, Neural processing, Neural network learning rules, Single layer perceptron classifier, Linearly separable classification
Basic Models of ANN:
• Interconnections
• Learning rules
• Activation function
• Types of learning :-
• Supervised Learning:-
• Unsupervised Learning:-
• Reinforcement Learning:-
• Supervised:
• Adaline, Madaline
• Perceptron
• Back Propagation
• Multilayer perceptron
• Radial Basis Function Networks
• Unsupervised:
• Competitive Learning
• Kohonen self-organizing map
• Learning vector quantization
• Hebbian learning
• Reinforced:
• Positive
• Negative
A simple two-input neuron: inputs x1 and x2 reach input units X1 and X2, are weighted by w1 and w2, and feed the output unit Y:

    yin = x1 w1 + x2 w2
    y = f(yin)
Advantages:
• Simplistic
• Substantial computing power
Disadvantages:
• Weights and thresholds are fixed
• Not very flexible
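As an illustrative sketch (not from the slides), the two-input neuron above can be simulated in a few lines of Python. The weight and threshold values are assumptions chosen so the unit behaves like a McCulloch-Pitts AND neuron:

```python
# Sketch of the two-input neuron: yin = x1*w1 + x2*w2, y = f(yin).
# Weights and threshold are fixed, illustrative values.

def mp_neuron(x1, x2, w1=1, w2=1, threshold=2):
    """Fire (output 1) only if the net input reaches the threshold."""
    yin = x1 * w1 + x2 * w2          # net input
    return 1 if yin >= threshold else 0

# With w1 = w2 = 1 and threshold = 2, the neuron computes logical AND:
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", mp_neuron(x1, x2))
```

Note that the weights and threshold are fixed by hand here, matching the "not very flexible" disadvantage listed above.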
Neural processing
• Refers to the way the brain works: the ability to adapt to changing situations and to improve its function as more information becomes available.
• Recall, retrieval, association and relearning are the basic processes.
• Auto association
• Hetero association
• Classification
(Figures: examples of auto-association and hetero-association.)
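As an illustrative sketch (not from the slides), auto-association can be demonstrated with a Hebbian outer-product memory: the network stores one bipolar pattern and recalls it when presented with that pattern (or a slightly corrupted version). The stored pattern and the use of the Hebb rule here are assumptions:

```python
# Minimal auto-associative memory sketch using the Hebb (outer-product) rule.
# The stored pattern is an illustrative bipolar (+1/-1) assumption.

def outer(v, w):
    """Outer product v w^T as a nested list."""
    return [[vi * wj for wj in w] for vi in v]

def recall(W, x):
    """Apply sign(W x) to recover the stored pattern."""
    net = [sum(W[i][j] * x[j] for j in range(len(x))) for i in range(len(W))]
    return [1 if n >= 0 else -1 for n in net]

pattern = [1, -1, 1, -1]
W = outer(pattern, pattern)      # Hebbian weights: W = p p^T
print(recall(W, pattern))        # auto-association: the input recalls itself
```

Because W x = p (p . x), even an input with one flipped component still recalls the stored pattern; hetero-association would instead use W = q p^T to map input pattern p to a different output pattern q.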
• Notations used in ANN:
• Weights
• Bias
• Threshold
• Learning rate
• Momentum factor
• Vigilance parameter
• Weights: the weight matrix W collects the weight vectors of the network's n units, one row per unit (wi^T holds the m weights wi1 ... wim of unit i):

        | w1^T |   | w11 w12 w13 ... w1m |
        | w2^T |   | w21 w22 w23 ... w2m |
    W = |  :   | = |  :   :   :       :  |
        | wn^T |   | wn1 wn2 wn3 ... wnm |
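As a sketch, storing W row-wise lets the net inputs of all units be computed at once as the matrix-vector product W x. The sizes and numeric values below are illustrative assumptions:

```python
# Weight matrix W stored row-wise: row i holds the weights wi1 ... wim of unit i.
# All values are illustrative assumptions (n = 2 units, m = 3 inputs).

W = [[0.2, -0.1, 0.4],    # w1^T
     [0.5,  0.3, -0.2]]   # w2^T
x = [1.0, 2.0, 1.0]       # one input vector of length m

# Net input of every unit at once: y_in = W x
y_in = [sum(wij * xj for wij, xj in zip(row, x)) for row in W]
print(y_in)
```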
• Weights: the net input yinj to output unit Yj, receiving inputs x1 ... xn through weights w1j ... wnj, is

    yinj = Σ (i = 0 to n) xi wij = x0 w0j + x1 w1j + x2 w2j + ... + xn wnj

• Bias: taking x0 = 1, the weight w0j acts as the bias bj, so the net input can be written as

    yinj = bj + Σ (i = 1 to n) xi wij
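The net-input formula above can be sketched directly in Python; the input, weight, and bias values are illustrative assumptions:

```python
# Sketch of the net input to output unit Yj: yinj = bj + sum_i xi * wij.
# All numeric values are illustrative assumptions.

def net_input(x, w, b):
    """Net input y_in = b + sum_i x_i * w_i for one output unit."""
    return b + sum(xi * wi for xi, wi in zip(x, w))

x = [1, 0, 1]        # inputs x1..xn
w = [2, -1, 3]       # weights w1j..wnj for unit j
b = 1                # bias bj (equivalently w0j with x0 = 1)
print(net_input(x, w, b))   # 1 + 1*2 + 0*(-1) + 1*3 = 6
```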
• Bias: acts like the constant C in the straight-line equation y = mx + C relating input X to output Y; it shifts the unit's output independently of its inputs.
• Threshold (θ):
• One of the transfer/activation functions used in ANN.
• A set value based upon which the final output of the network may be calculated.
• The activation function using the threshold θ can be defined as

    f(net) =  1 if net >= θ
             -1 if net <  θ
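A minimal Python sketch of this bipolar threshold activation; defaulting θ to 0 is an illustrative assumption:

```python
# Sketch of the bipolar threshold activation:
# f(net) = 1 if net >= theta, else -1.
# theta defaults to 0 here as an illustrative assumption.

def f(net, theta=0):
    """Return 1 if the net input reaches the threshold, else -1."""
    return 1 if net >= theta else -1

print(f(0.7))              # net above threshold -> 1
print(f(-0.2))             # net below threshold -> -1
print(f(0.1, theta=0.5))   # same net, higher threshold -> -1
```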
• Learning rate:
• Denoted by α.
• Used to control the amount of weight adjustment at each step of training.
• Ranges from 0 to 1 and determines the rate of learning in each time step.
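As an illustrative sketch (not from the slides), the role of α can be seen in a perceptron-style weight update, w_i ← w_i + α(t − y)x_i; all numeric values here are assumptions:

```python
# Perceptron-style weight update showing the learning rate alpha:
#   w_i <- w_i + alpha * (target - output) * x_i
# All values are illustrative assumptions.

def update_weights(w, x, target, output, alpha=0.1):
    """One weight-adjustment step; alpha scales the size of the correction."""
    return [wi + alpha * (target - output) * xi for wi, xi in zip(w, x)]

w = [0.5, -0.5]
x = [1, 1]
w_new = update_weights(w, x, target=1, output=-1, alpha=0.1)
print(w_new)   # each weight moves by 0.1 * (1 - (-1)) * 1 = 0.2
```

A larger α makes each correction bigger (faster but less stable learning); a smaller α makes training slower but smoother.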
• x: Input Data.
• Two dimensions:
• Suppose we want to separate the point (1,1) from the other points. There exists a line that does this; in fact, there exist infinitely many such lines. So these two "classes" of points are linearly separable.
• If no single line can separate the two classes, they are linearly inseparable.
• n dimensions:
• Neural networks often work in many more dimensions. To separate classes in n dimensions, you need an (n-1)-dimensional "hyperplane".
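As an illustrative sketch, a single-layer perceptron finds such a separating line for the linearly separable AND function; the training loop, learning rate, and epoch count are assumptions, not from the slides:

```python
# Perceptron training on the linearly separable AND function.
# Learning rate, epoch count and zero initial weights are illustrative assumptions.

def predict(w, b, x):
    """Threshold the net input: output 1 if b + sum_i wi*xi >= 0, else 0."""
    return 1 if b + sum(wi * xi for wi, xi in zip(w, x)) >= 0 else 0

def train(samples, alpha=1, epochs=10):
    w, b = [0, 0], 0
    for _ in range(epochs):
        for x, t in samples:
            err = t - predict(w, b, x)            # 0 when the sample is classified correctly
            w = [wi + alpha * err * xi for wi, xi in zip(w, x)]
            b += alpha * err
    return w, b

AND = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(AND)
print([predict(w, b, x) for x, _ in AND])   # learns AND: [0, 0, 0, 1]
```

By contrast, no single line separates the XOR classes, so the same loop would keep cycling on XOR without ever classifying all four points correctly.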