
Multilayer Feedforward Neural Network

Feedforward Neural Network :-
 A feedforward neural network consists of layers of processing units,
each layer feeding its output forward as input to the next layer.
 The feedforward neural network has an input layer, hidden layers and an output
layer. Information always travels in one direction, from the input layer to the
output layer, and never goes backward.
 The simplest feedforward network is shown below.
 The difference between single-layer and multilayer neural networks:
in a single-layer network, the input layer connects directly to the output
layer, while a multilayer network has additional layers, called hidden layers,
between the input and output layers.
 We choose a multilayer network over a single-layer network because
a single-layer network can only learn linear functions, whereas a multilayer
network can learn non-linear functions.
 Multilayer Feedforward Network :- A multilayer feedforward neural
network is an interconnection of perceptrons in which data flow in a
single direction: from the input layer to the hidden layers, and from the
hidden layers to the output layer.
 The number of layers in a neural network is the number of layers of perceptrons.
 A perceptron is a mathematical model of a biological neuron, in which
electrical signals are represented as numerical values.
The multilayer feedforward neural network is shown above. The network
consists of I linear input units indexed by i; the second layer has J non-linear
units indexed by j; and the third layer has K output units indexed by k. Only one
layer of hidden units (perceptrons) is considered. The weights from the input
layer to the hidden layer are denoted Wji, and the weights from the hidden layer
to the output layer are denoted Wkj. A weight is simply the numerical value
attached to each link. By introducing hidden layers of non-linear units, the
network is mainly used to solve classification problems in pattern recognition
tasks; it is also used in forecasting.
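The notation above can be sketched as a forward pass in code. This is a minimal illustration, not from the text: the layer sizes (I = 3, J = 4, K = 2), the random weight initialisation and the choice of sigmoid non-linearity are all assumptions.

```python
import numpy as np

# A minimal sketch of one forward pass through a multilayer feedforward
# network with one hidden layer. Sizes and initial weights are illustrative.
rng = np.random.default_rng(0)

I, J, K = 3, 4, 2                 # input, hidden and output layer sizes (assumed)
W_ji = rng.normal(size=(J, I))    # weights from input units i to hidden units j
W_kj = rng.normal(size=(K, J))    # weights from hidden units j to output units k

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(a):
    h = sigmoid(W_ji @ a)         # non-linear hidden units
    return sigmoid(W_kj @ h)      # output units

a = np.array([0.5, -1.0, 2.0])    # one input pattern
b = forward(a)
print(b.shape)                    # (2,) -- one value per output unit
```

Note that data only ever flow from `a` through `W_ji` to `h` and through `W_kj` to the output; nothing feeds backward.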
Pattern Recognition Tasks by Feedforward Neural Network :-
1. Pattern Association : -

A single-layer feedforward network with linear processing units is used to
realise the pattern association task.
In the diagram above, the input patterns are a1, a2, a3, ... and the corresponding
output patterns are b1, b2, b3, ... When any stored input al is given, it is
associated with the corresponding output bl. Suppose an input pattern is given
that is not one of the stored input patterns; it is then associated with the
output of the nearest stored input, say al, so the output produced is bl. Such
an input behaves as a noisy version of al, and the network may therefore
produce an incorrect output pattern.
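The association idea can be sketched with a linear associator. This is one common realisation, not necessarily the one the text has in mind: the weight matrix is built by the Hebbian outer-product rule, and orthonormal input patterns are assumed so that recall of stored pairs is exact.

```python
import numpy as np

# Pattern association with a single-layer linear network (a sketch).
# W = sum over l of outer(b_l, a_l); with orthonormal inputs, W @ a_l = b_l.
a1 = np.array([1.0, 0.0, 0.0])
a2 = np.array([0.0, 1.0, 0.0])
b1 = np.array([1.0, -1.0])
b2 = np.array([-1.0, 1.0])

W = np.outer(b1, a1) + np.outer(b2, a2)

print(W @ a1)                        # recalls b1 exactly: [ 1. -1.]
# An input that is not a stored pattern is pulled toward the output of
# the nearest stored input, with some residual noise:
a_noisy = np.array([0.9, 0.1, 0.0])  # close to a1
print(W @ a_noisy)                   # approximately b1
```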
2. Pattern Classification :-

A single-layer feedforward network with non-linear processing units is used for
the pattern classification problem. When, in the pattern association problem, some
of the output patterns are identical, the distinct output patterns can be viewed as
class labels and the input patterns corresponding to each class as samples of that
class; this leads to the pattern classification problem. The input patterns usually
come from natural sources such as speech and hand-printed characters, so they may be
corrupted by external noise. A noisy input is still mapped onto one of the distinct
output patterns, hence the network displays an accretive behaviour. This network
overcomes the constraints of number and linear independence on the input patterns,
but it introduces the limitation of linear separability.
Classification problems which are not linearly separable are called hard problems.
In order to overcome the constraint of linear separability in pattern classification
problems, we go for pattern mapping.
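A single-layer classifier of this kind can be sketched as a perceptron with a hard-limiting non-linearity. The learning rule, learning rate and epoch count below are standard illustrative choices, not values from the text. AND is linearly separable and is learned; XOR is a hard problem and no choice of weights for this network can solve it.

```python
import numpy as np

# A single-layer perceptron with a step non-linearity (a sketch of
# pattern classification on a linearly separable problem).
def train_perceptron(X, y, epochs=20, lr=0.1):
    w = np.zeros(X.shape[1] + 1)          # bias in w[0], weights in w[1:]
    for _ in range(epochs):
        for x, t in zip(X, y):
            out = 1 if w[0] + w[1:] @ x > 0 else 0
            w[0] += lr * (t - out)        # perceptron learning rule
            w[1:] += lr * (t - out) * x
    return w

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y_and = np.array([0, 0, 0, 1])            # AND: linearly separable
w = train_perceptron(X, y_and)
preds = [1 if w[0] + w[1:] @ x > 0 else 0 for x in X]
print(preds)                              # [0, 0, 0, 1]
```

Repeating the run with the XOR targets `[0, 1, 1, 0]` never converges, which is exactly the linear-separability limitation described above.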
3. Pattern Mapping :-

A multilayer feedforward network with non-linear processing units in all the
intermediate hidden layers and in the output layer is considered for the pattern
mapping task.
In a pattern mapping problem, both the input and the output patterns are only
samples from the mapping system. Once the system behaviour is captured by the
network, the network can produce a plausible output pattern for a new input
pattern that is not among the training inputs. This output pattern is
approximately an interpolated version of the output patterns corresponding to
the stored inputs close to the given test input pattern; thus, the network
shows an interpolative behaviour. Such a network can solve hard problems, but
there is a difficulty in adjusting the weights of the network to capture the
functional relationship between the input and output pattern pairs.
This problem is solved by the backpropagation learning algorithm, and the
resulting network provides the solution for pattern mapping problems.
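Backpropagation on such a network can be sketched as follows. The hidden-layer size, learning rate, epoch count and random seed are all illustrative assumptions; the target task is XOR, the classic hard problem a single-layer network cannot solve.

```python
import numpy as np

# Backpropagation learning on a multilayer feedforward network (a sketch):
# one hidden layer of sigmoid units learns XOR. All hyperparameters below
# are illustrative choices, not values from the text.
rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])     # XOR targets

W_ji = rng.normal(size=(2, 8))   # input -> hidden weights
W_kj = rng.normal(size=(8, 1))   # hidden -> output weights
b_j = np.zeros(8)                # hidden biases
b_k = np.zeros(1)                # output bias

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 1.0
for _ in range(10000):
    h = sigmoid(X @ W_ji + b_j)              # forward pass: hidden layer
    out = sigmoid(h @ W_kj + b_k)            # forward pass: output layer
    d_out = (out - y) * out * (1 - out)      # error at the output units...
    d_hid = (d_out @ W_kj.T) * h * (1 - h)   # ...propagated back to hidden units
    W_kj -= lr * h.T @ d_out                 # adjust both layers of weights
    b_k -= lr * d_out.sum(axis=0)
    W_ji -= lr * X.T @ d_hid
    b_j -= lr * d_hid.sum(axis=0)

print(out.ravel())   # outputs approach the XOR targets 0, 1, 1, 0
```

The backward sweep, propagating the output error through `W_kj` to the hidden layer, is what lets the network adjust the weights it could not otherwise assign blame to, solving the weight-adjustment difficulty mentioned above.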

THANK YOU
