Neural Networks: A Comprehensive Foundation
• Introduction
• Learning Processes
• Single Layer Perceptrons
• Multilayer Perceptrons
• Reference: Neural Networks: A Comprehensive Foundation
Simon Haykin, 2nd Edition, Prentice Hall
• mroushdy@cis.asu.edu.eg
• Mohamed.Roushdy@fue.edu.eg
• miroushdy@hotmail.com
Overview
• A Neural Network (NN), or Artificial Neural Network (ANN), is a computing paradigm
• The key element of this paradigm is
• the novel structure of the information processing system: a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems
• Development of NNs dates back to the early 1940s
• Minsky and Papert published a book in 1969
• that summed up a general feeling of frustration (against neural networks) among researchers
Overview (Contd.)
• NNs experienced an upsurge in popularity in the late 1980s
• This was the result of the discovery of new techniques and developments, and of general advances in computer hardware technology
• Some NNs are models of biological neural networks
and some are not
Overview (Contd.)
• Historically, much of the inspiration for the field of NNs
came from the desire to produce artificial systems
capable of
• sophisticated, perhaps intelligent, computations similar to
those that the human brain routinely performs, and thereby
possibly to enhance our understanding of the human brain.
Overview (Contd.)
• Most NNs have some sort of training rule. In other words, NNs learn from examples
• (just as children learn to recognize dogs from examples of dogs) and exhibit some capability for generalization beyond the training data
• Neural computing should not be considered a competitor to conventional computing.
• Rather, it should be seen as complementary
• The most successful neural solutions have been those that operate in conjunction with existing, traditional techniques.
Overview (Contd.)
Digital Computers:
• Deductive reasoning: we apply known rules to input data to produce output.
• Computation is centralized, synchronous, and serial.
• Memory is packetted, literally stored, and location addressable.
• Not fault tolerant: one transistor goes and it no longer works.
• Exact.
Neural Networks:
• Inductive reasoning: given input and output data (training examples), we construct rules.
• Computation is collective, asynchronous, and parallel.
• Memory is distributed, internalized, short term, and content addressable.
• Fault tolerant: redundancy and sharing of responsibilities.
The McCulloch-Pitts model
[Figure: a single neuron with inputs x1, …, xn, synaptic weights w1, …, wn, and output y]
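A minimal sketch of this model, assuming the classic formulation: inputs x1, …, xn are weighted by w1, …, wn, summed, and compared against a threshold to produce the binary output y. The function and variable names below are illustrative, not from the slides.

# Sketch of a McCulloch-Pitts neuron: a weighted sum of inputs
# compared against a threshold produces a binary output.
def mcculloch_pitts(x, w, threshold):
    """Return 1 if the weighted input sum reaches the threshold, else 0."""
    s = sum(xi * wi for xi, wi in zip(x, w))
    return 1 if s >= threshold else 0

# Example: with weights (1, 1) and threshold 2 the unit computes logical AND.
print(mcculloch_pitts([1, 1], [1, 1], threshold=2))  # -> 1
print(mcculloch_pitts([1, 0], [1, 1], threshold=2))  # -> 0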
Artificial neurons
Nonlinear generalization of the McCulloch-Pitts neuron:
y = f(x, w)
y is the neuron's output, x is the vector of inputs, and w is the vector of synaptic weights.
Examples:
• Sigmoidal neuron: y = 1 / (1 + e^(-(w^T x + a)))
• Gaussian neuron: y = e^(-||x - w||^2 / (2a^2))
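A short sketch of the two example neurons above, assuming the reconstructed formulas y = 1 / (1 + e^(-(w^T x + a))) and y = e^(-||x - w||^2 / (2a^2)); the function names and test values are illustrative, not from the slides.

import math

def sigmoidal_neuron(x, w, a):
    """y = 1 / (1 + e^(-(w.x + a))): smooth output bounded in (0, 1)."""
    s = sum(xi * wi for xi, wi in zip(x, w)) + a
    return 1.0 / (1.0 + math.exp(-s))

def gaussian_neuron(x, w, a):
    """y = e^(-||x - w||^2 / (2 a^2)): output peaks when x is closest to w."""
    d2 = sum((xi - wi) ** 2 for xi, wi in zip(x, w))
    return math.exp(-d2 / (2.0 * a * a))

# Example: evaluate both neurons on the same input and weight vectors.
x = [0.5, -1.0]
w = [1.0, 2.0]
print(sigmoidal_neuron(x, w, a=0.1))   # value between 0 and 1
print(gaussian_neuron(x, w, a=1.0))    # close to 0, since x is far from w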
Activation Functions