
PERCEPTRON

What is a Neural Network?

 A neural network can be defined as a model of reasoning based on the human brain. The
brain consists of a densely interconnected set of nerve cells, or basic information-processing
units, called neurons.
ANALOGY BETWEEN BIOLOGICAL AND ARTIFICIAL NNs

Biological NN    Artificial NN
Soma             Neuron
Dendrite         Input
Axon             Output
Synapse          Weight
An Artificial Neuron

[Figure: an artificial neuron. Input signals x1, x2, x3, …, xn arrive over weights w1, w2, w3, …, wn; the neuron applies threshold θ and produces output signal Y.]
Single Layer Neural Net

[Figure: a single-layer net. Input signals x1, x2, …, xi, …, xn feed every output neuron 1, 2, …, j, …, m through weights wij (from input i to output neuron j); the output layer produces signals y1, y2, …, yj, …, ym.]
Multilayer Neural Net

[Figure: a multilayer net. Input signals x1, x2, x3, …, xl pass through a middle (hidden) layer Z1, Z2, …, Zn via weights vij, and then to the output layer via weights wij, producing output signals y1, …, ym.]

Input Layer — Middle Layer — Output Layer
CHARACTERISTICS OF NN

 The NNs exhibit mapping capabilities.
 The NNs learn by examples.
 The NNs possess the capability to generalize.
 The NNs are robust systems and are fault tolerant.
 The NNs can process information in parallel, at high speed,
and in a distributed manner.
Activation Function

• The activation function is used to calculate the output
response of a neuron.
• The sum of the weighted input signals is passed through an
activation function to obtain the response.
• For neurons in the same layer, the same activation function is
used.
Activation Functions of a Neuron

[Figure: graphs of the step, sign, sigmoid, and linear activation functions.]

Step function:     Y = 1 if X >= 0, else 0
Sign function:     Y = +1 if X >= 0, else -1
Sigmoid function:  Y = 1 / (1 + e^(-X))
Linear function:   Y = X
Step Activation Function

         1 if x >= θ
f(x) = {
         0 if x < θ

[Figure: the step function plotted for θ = 0 and for θ = 1.]
Sign Activation Function

         +1 if x >= θ
f(x) = {
         -1 if x < θ

[Figure: the sign function plotted for θ = 0.]
Sigmoid Activation Function

Binary sigmoid:  f(x) = 1 / (1 + e^(-x))
Bipolar sigmoid: f(x) = 2 / (1 + e^(-x)) - 1

[Figure: the binary sigmoid rises from 0 to 1; the bipolar sigmoid rises from -1 to +1.]

For the binary sigmoid:
 When x = 0,    f(x) = 1/2
 When x → -∞,  f(x) → 0
 When x → +∞,  f(x) → 1
Activation Function

Find the output of the following net when (i) Step function (θ = 0), (ii) Step function (θ = 0.5), (iii) Step function (θ = 1.0), (iv) Sign function (θ = -0.5), (v) Sign function (θ = 1.0), (vi) Binary sigmoid function, (vii) Bipolar sigmoid function are used.

[Figure: a neuron summing five weighted inputs x1, …, x5; the solution that follows uses inputs 0.2, 0.3, 0.7 and a bias input 1.0 with weights -0.3, -0.1, 0.2 and 0.2.]
Activation Function (Solution)

Find the output of the net when (i) the Step function (θ = 0) is used.

[Figure: the same net, with the output labelled Z.]

           n
z = Step( Σ wi·xi + b )
          i=1

Z = Step(0.2*(-0.3) + 0.3*(-0.1) + 0.7*0.2 + 1.0*0.2)
Z = Step(-0.09 + 0.14 + 0.2)
Z = Step(0.25)
Z = 1
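The arithmetic above can be checked with a few lines of Python (a sketch; following the solution, the constant input 1.0 carries the weight 0.2):

```python
# Inputs and weights read off the solution above
# (the last pair is the bias input 1.0 with weight 0.2).
inputs  = [0.2, 0.3, 0.7, 1.0]
weights = [-0.3, -0.1, 0.2, 0.2]

# Weighted sum: 0.2*(-0.3) + 0.3*(-0.1) + 0.7*0.2 + 1.0*0.2 = 0.25
net = sum(x * w for x, w in zip(inputs, weights))
print(round(net, 2))         # → 0.25

# Step function with theta = 0
z = 1 if net >= 0 else 0
print(z)                     # → 1
```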
TYPES OF ARTIFICIAL NEURAL NETWORK

 Perceptron
 BAM(Bidirectional Associative Memory)
 FAM(Fuzzy Associative Memory )
 Hebb Net
 Hopfield Network
 SOFM (Self-Organizing Feature Map)
APPLICATIONS OF ANN

 Medical Diagnosis
  Assisting doctors with their diagnosis by analyzing the reported
symptoms.
  Input data: patient's personal information, patterns of symptoms,
heart rate, blood pressure, temperature, laboratory results,
treatment procedure, etc.

 Patient Length-of-Stay Forecasts
  Forecast which patients will remain for a specified number of days.
APPLICATIONS OF ANN

 Prediction
  - Use some variables or fields in the database to predict
unknown or future values of other variables of interest.
 Knowledge Discovery
  - Find new relationships and non-obvious trends in the data.
 Quality Control
  - Predict the quality of plastics, paper, and other raw materials.
APPLICATIONS OF ANN

Electronic noses
• Electronic noses have several potential applications in
telemedicine.
• Telemedicine is the practice of medicine over long distances
via a communication link.
• The electronic nose would identify smells in the remote
surgical environment.
• These identified smells would then be electronically
transmitted to another site, where an odor generation system
would recreate them.
• Because the sense of smell can be an important sense to the
surgeon, telesmell would enhance telepresent surgery.
LEARNING OF NEURAL NETWORK

 Learning is the process by which a neural network adapts itself
to a stimulus by making proper parameter adjustments and
producing the desired response.
 Types
 Supervised learning
 Unsupervised learning
SUPERVISED LEARNING

The child learns from a teacher.

Each input vector requires a corresponding target vector.

 Training pair = [input vector, target vector]

[Figure: input X enters a neural network with weights W and produces actual output Y; an error signal generator compares Y with the desired output D and feeds the error signal (D - Y) back to adjust the weights.]
UNSUPERVISED LEARNING

Unsupervised training requires no teacher; input patterns are applied, and the network self-
organizes by adjusting its weights according to a well-defined algorithm. Since no desired
output is specified during the training process, the results are unpredictable in terms of
firing patterns of specific neurons. In this method of training, the input vectors of similar
types are grouped without the use of training data to specify how a typical member of each
group looks or to which group a member belongs. During training the neural network
receives input patterns and organizes these patterns into categories. When a new input
pattern is applied, the neural network provides an output response indicating the class to
which the input pattern belongs. If a class cannot be found for the input pattern, a new
class is generated.
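The grouping behaviour described above can be sketched with a toy one-dimensional example. This is an illustration only: the distance measure and the threshold value are hypothetical choices, not part of any specific network algorithm.

```python
# A minimal sketch of unsupervised grouping: each input pattern is assigned
# to the nearest existing class centre if it is close enough; otherwise a
# new class is created. The threshold here is a hypothetical choice.
def classify(patterns, threshold=1.0):
    centres = []   # one centre per discovered class
    labels = []
    for p in patterns:
        # distance to every existing class centre
        dists = [abs(p - c) for c in centres]
        if dists and min(dists) <= threshold:
            labels.append(dists.index(min(dists)))
        else:
            centres.append(p)            # no class fits: create a new one
            labels.append(len(centres) - 1)
    return labels

print(classify([0.1, 0.2, 5.0, 5.3, 0.15]))  # → [0, 0, 1, 1, 0]
```

The last pattern, 0.15, falls back into the first class even though the two clusters were discovered in between, which mirrors the self-organizing behaviour described in the text.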
The perceptron is the simplest form of a neural network. It
consists of a single neuron with adjustable synaptic weights
and a hard-limiter activation function.

[Figure: inputs x1 and x2 with weights W1 and W2 feed a linear combiner with threshold θ, followed by a hard limiter that produces output Y.]

           n
Y = sign( Σ xi·wi - θ )
          i=1
ROSENBLATT’S PERCEPTRON

Step 1: Initialization
Set initial weights w1, w2, …, wn and threshold θ to random numbers in [-0.5, 0.5].

Step 2: Activation
Activate the perceptron by applying inputs x1(p), x2(p), x3(p), …, xn(p) and desired
output Yd(p). Calculate the actual output at iteration p = 1:

              n
Y(p) = step( Σ xi(p)·wi(p) - θ )
             i=1

where n is the number of the perceptron inputs, and step is the step activation function.

Step 3: Weight training
Update the weights of the perceptron:

wi(p+1) = wi(p) + α · xi(p) · e(p)

where e(p) = Yd(p) - Y(p) is the error and α is the learning rate.

Step 4: Iteration
Increase iteration p by one, go back to Step 2, and repeat the process until convergence.