Introduction to Perceptron
• The perceptron was the first algorithmically described neural
network, introduced by Rosenblatt in 1958.
• It is the simplest neural network algorithm and is used to solve
linearly separable problems.
• The perceptron is a single-layer architecture in which a neuron
receives multiple inputs and processes them to produce an
output.
• The architecture of the perceptron is given in Figure 3.1.
• The learning algorithm adjusts the free parameters of this
neural network (the weights and bias). Rosenblatt proved that if
the patterns (vectors) used to train the perceptron are
drawn from two linearly separable classes, then the
perceptron algorithm converges and positions the decision
surface in the form of a hyperplane between the two classes
(Figure 3.2).
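The computation described above, a weighted sum of the inputs plus a bias passed through a hard-limit (step) activation, can be sketched as follows; the weight, bias, and input values here are illustrative assumptions, not values from the slides.

```python
import numpy as np

def perceptron_output(x, w, b):
    """Single-layer perceptron: weighted sum of inputs plus bias,
    passed through a hard-limit (step) activation."""
    v = np.dot(w, x) + b        # induced local field
    return 1 if v >= 0 else 0   # step activation

# Illustrative values (assumed, not from the slides)
w = np.array([0.5, -0.5])
b = 0.0
y = perceptron_output(np.array([1.0, 0.0]), w, b)  # v = 0.5, so y = 1
```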
[Figure 3.1: perceptron architecture — input layer connected to an output layer]
[Figure 3.2: Decision boundary for (a) linearly separable and (b) non-linearly separable classes]
Perceptron Algorithm
[Figure: single-layer perceptron — input layer and output layer]
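The algorithm details on these slides did not survive extraction. A minimal sketch of the standard perceptron learning rule (w ← w + η(d − y)x, with the bias updated the same way), trained here on assumed AND-gate data rather than the slides' own example:

```python
import numpy as np

def train_perceptron(X, d, eta=1.0, max_epochs=100):
    """Perceptron learning rule: adjust weights and bias by the
    error (desired - actual) until all patterns are classified."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(max_epochs):
        errors = 0
        for x, target in zip(X, d):
            y = 1 if np.dot(w, x) + b >= 0 else 0
            if y != target:
                w = w + eta * (target - y) * x   # weight update
                b = b + eta * (target - y)       # bias update
                errors += 1
        if errors == 0:                          # converged
            break
    return w, b

# Assumed example: AND gate (linearly separable)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d = [0, 0, 0, 1]
w, b = train_perceptron(X, d)
```

Because the AND patterns are linearly separable, the loop is guaranteed to terminate with all four patterns correctly classified.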
Hand Calculation Example of Perceptron Algorithm
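The worked numbers on these slides did not survive extraction. As a substitute, one update step of the learning rule can be traced by hand for assumed values x = (1, 1), d = 0, initial w = (0, 0), b = 0, and η = 1 (these values are illustrative, not the slides' original example):

```python
import numpy as np

# Assumed values for illustration (not the slides' original numbers)
x = np.array([1.0, 1.0])   # input pattern
d = 0                      # desired output
w = np.array([0.0, 0.0])   # initial weights
b, eta = 0.0, 1.0

v = np.dot(w, x) + b       # induced field: v = 0.0
y = 1 if v >= 0 else 0     # step output: y = 1 (misclassified)
e = d - y                  # error: e = -1
w = w + eta * e * x        # updated weights: [-1., -1.]
b = b + eta * e            # updated bias: -1.0
# Re-checking the same pattern now gives v = -3.0, hence y = 0 = d
```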
The Perceptron Convergence Theorem
• The theorem states that for any data set which is linearly
separable, the perceptron learning rule is guaranteed to find a
solution in a finite number of iterations.
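A standard quantitative form of this theorem (Novikoff's bound, covered in e.g. Haykin) makes "finite number of iterations" precise; the notation below is assumed, not taken from the slides. With labels d_i ∈ {−1, +1}:

```latex
\text{If } \|\mathbf{x}_i\| \le R \text{ for all patterns, and there exists }
\mathbf{w}^{*} \text{ with } \|\mathbf{w}^{*}\| = 1 \text{ such that }
d_i\,\mathbf{w}^{*\top}\mathbf{x}_i \ge \gamma > 0 \text{ for all } i,
\text{ then the perceptron makes at most }
\left(\frac{R}{\gamma}\right)^{2} \text{ weight updates.}
```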
References
• Simon S. Haykin, Neural Networks and Learning Machines,
3rd edition. Pearson Education, Upper Saddle River, 2009.
• Sandhya Samarasinghe, Neural Networks for Applied
Sciences and Engineering: From Fundamentals to Complex
Pattern Recognition. CRC Press, 2006.