
Artificial Neural Networks

- Introduction -
Overview

1. Biological inspiration
2. Artificial neurons and neural networks
3. Applications
Biological Neuron
Animals react adaptively to changes in their external and internal environment, using their nervous system to perform these behaviours.
An appropriate model or simulation of the nervous system should be able to produce similar responses and behaviours in artificial systems.
Biological Neuron

Information transmission happens at the synapses.


Artificial neurons

[Diagram: inputs $x_1, x_2, \dots, x_n$ are weighted by $w_1, w_2, \dots, w_n$ and summed; the unit outputs $y$]

$$z = \sum_{i=1}^{n} w_i x_i; \qquad y = H(z)$$

(one possible model)
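
A minimal NumPy sketch of this model (the function name and test values are illustrative, not from the slides):

```python
import numpy as np

def neuron_output(x, w):
    """One possible neuron model: z = sum_i w_i * x_i, then y = H(z)."""
    z = np.dot(w, x)             # weighted sum of the inputs
    return np.heaviside(z, 1.0)  # step function H: 1 if z >= 0, else 0

# Example with three inputs:
print(neuron_output([0.5, -1.0, 2.0], [1.0, 0.5, 0.25]))  # -> 1.0
```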
Artificial neurons

Nonlinear generalization of the neuron:

$$y = f(x, w)$$

where $y$ is the neuron's output, $x$ is the vector of inputs, and $w$ is the vector of synaptic weights.

Examples:

$$y = \frac{1}{1 + e^{-w^T x - a}} \qquad \text{(sigmoidal neuron)}$$

$$y = e^{-\frac{\|x - w\|^2}{2a^2}} \qquad \text{(Gaussian neuron)}$$
Other models: Hopfield networks, backpropagation networks.
From Logical Neurons to Finite Automata
Simple threshold neurons can implement Boolean logic, using the weights and thresholds shown on the slide (a runnable sketch follows the list):

• AND: weights 1 and 1, threshold 1.5
• OR: weights 1 and 1, threshold 0.5
• NOT: weight -1, threshold 0
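
A minimal Python sketch of these logical neurons with the weights and thresholds above; the helper name is illustrative:

```python
import numpy as np

def fires(inputs, weights, threshold):
    """A logical neuron fires (1) when its weighted input sum reaches the threshold."""
    return int(np.dot(weights, inputs) >= threshold)

AND = lambda a, b: fires([a, b], [1, 1], 1.5)  # needs both inputs on
OR  = lambda a, b: fires([a, b], [1, 1], 0.5)  # needs at least one input on
NOT = lambda a:    fires([a],    [-1],  0)     # inverts its single input

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))
print("NOT 0:", NOT(0), "NOT 1:", NOT(1))  # -> 1, 0
```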
Artificial neural networks

[Diagram: inputs on the left feeding a layered network that produces an output]

An artificial neural network is composed of many artificial neurons that are linked together according to a specific network architecture. The objective of the neural network is to transform the inputs into meaningful outputs.
Artificial neural networks

Tasks to be solved by artificial neural networks:

• controlling the movements of a robot based on self-perception and other information (e.g., visual information);
• deciding the category of potential food items (e.g., edible or non-edible) in an artificial world;
• recognizing a visual object (e.g., a familiar face);
• predicting where a moving object will go, so that a robot can catch it.
Neural network mathematics

[Diagram: a network with four inputs, a 4-unit first layer, a 3-unit second layer, and one output unit]

First layer:
$$y_i^1 = f(x_i, w_i^1), \quad i = 1, \dots, 4; \qquad y^1 = (y_1^1, y_2^1, y_3^1, y_4^1)$$

Second layer:
$$y_j^2 = f(y^1, w_j^2), \quad j = 1, 2, 3; \qquad y^2 = (y_1^2, y_2^2, y_3^2)$$

Output:
$$y_{out} = f(y^2, w_1^3)$$
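
To make the composition concrete, here is a minimal NumPy sketch of this forward pass. The slides leave $f$ generic, so a sigmoidal unit is assumed; all weight values are illustrative.

```python
import numpy as np

def f(x, w):
    """Neuron transfer function; a sigmoidal unit is assumed here."""
    return 1.0 / (1.0 + np.exp(-np.dot(w, x)))

def network(x, W1, W2, w3):
    """Forward pass matching the equations above: four first-layer units
    (one per input), three second-layer units, one output unit."""
    y1 = np.array([f([xi], [wi]) for xi, wi in zip(x, W1)])  # y_i^1 = f(x_i, w_i^1)
    y2 = np.array([f(y1, wj) for wj in W2])                  # y_j^2 = f(y^1, w_j^2)
    return f(y2, w3)                                         # y_out = f(y^2, w_1^3)

rng = np.random.default_rng(0)
x = [0.2, -0.4, 0.7, 1.0]
print(network(x, rng.normal(size=4), rng.normal(size=(3, 4)), rng.normal(size=3)))
```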
Neural network mathematics

Neural network: input / output transformation

$$y_{out} = F(x, W)$$

W is the matrix of all weight vectors.


Learning principle for
artificial neural networks
ENERGY MINIMIZATION

We need an appropriate definition of energy for artificial neural networks; having that, we can use mathematical optimisation techniques to find out how to change the weights of the synaptic connections between neurons.

ENERGY = measure of task performance error
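
As a concrete illustration of the principle (a sketch, not the specific method of these slides), the example below defines the energy as the squared error of a single linear neuron and lowers it by gradient descent on the weights; all names and data are illustrative.

```python
import numpy as np

def energy(w, X, t):
    """Energy = squared task error: E(w) = sum_k (t_k - x_k . w)^2."""
    return np.sum((t - X @ w) ** 2)

def train(X, t, lr=0.01, steps=500):
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = -2.0 * X.T @ (t - X @ w)  # gradient dE/dw
        w -= lr * grad                   # step downhill in energy
    return w

X = np.array([[0., 1.], [1., 0.], [1., 1.]])
t = np.array([1., 1., 2.])       # targets consistent with t = x1 + x2
w = train(X, t)
print(w, energy(w, X, t))        # weights near [1, 1], energy near 0
```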


Perceptron application
[Scatter plot: two classes of training points, labelled $y = +1$ and $y = -1$, separated by the line $c_0 + c_1 x_1 + c_2 x_2 = 0$]

$$y = \operatorname{sign}(v), \qquad v = c_0 + c_1 x_1 + c_2 x_2$$

The decision boundary is the line $c_0 + c_1 x_1 + c_2 x_2 = 0$.
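
A minimal sketch of how the coefficients can be learned with the classical perceptron rule, which nudges $c_0, c_1, c_2$ after every misclassified point; the toy data below are illustrative.

```python
import numpy as np

def perceptron_train(X, y, epochs=20):
    """Learn c = [c0, c1, c2] so that sign(c0 + c1*x1 + c2*x2) matches y."""
    c = np.zeros(X.shape[1] + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            v = c[0] + np.dot(c[1:], xi)
            if np.sign(v) != yi:  # misclassified point
                c[0] += yi        # update the bias ...
                c[1:] += yi * xi  # ... and the weights
    return c

X = np.array([[2., 2.], [3., 1.], [-1., -2.], [-2., -1.]])
y = np.array([1, 1, -1, -1])
c = perceptron_train(X, y)
print(c, np.sign(c[0] + X @ c[1:]))  # predictions match y
```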
Multi-Layer Perceptron
• One or more hidden layers
• Sigmoid activation functions

[Diagram: input data feeding a 1st hidden layer, a 2nd hidden layer, and an output layer]
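
A minimal NumPy sketch of the forward pass through such a multi-layer perceptron; the layer sizes and random weights are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mlp_forward(x, layers):
    """Apply each layer (a weight matrix W and bias b) with a sigmoid
    activation, as on the slide."""
    a = np.asarray(x, float)
    for W, b in layers:
        a = sigmoid(W @ a + b)
    return a

rng = np.random.default_rng(1)
layers = [(rng.normal(size=(5, 3)), np.zeros(5)),  # 1st hidden layer
          (rng.normal(size=(4, 5)), np.zeros(4)),  # 2nd hidden layer
          (rng.normal(size=(2, 4)), np.zeros(2))]  # output layer
print(mlp_forward([0.1, 0.5, -0.3], layers))
```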
Multi-Layer Perceptron Application

Structure     | Types of Decision Regions                              | Result
------------- | ------------------------------------------------------ | --------------------------------------------
Single-Layer  | Half plane bounded by a hyperplane                      | [A/B regions separated by a line]
Two-Layer     | Convex open or closed regions                           | [A/B regions bounded by a convex polygon]
Three-Layer   | Arbitrary (complexity limited by the number of nodes)   | [arbitrarily shaped A/B regions]
Conclusion

Neural networks have some disadvantages, such as:

1. Input data often require preprocessing.
2. Results are difficult to interpret when the dimensionality is high.
3. A learning scheme must be chosen for the training phase (supervised or unsupervised).
References
1. http://neuron.eng.wayne.edu/software.html (many useful examples)
2. http://ieee.uow.edu.au/~daniel/software/libneural/BPN_tutorial/BPN_English/BPN_English/BPN_English.html
3. http://www.ai-junkie.com/
4. http://diwww.epfl.ch/mantra/tutorial/english/ (demo: OCR)
