Lecture 2: Single Layer Perceptrons: Kevin Swingler
[Figure: a single McCulloch-Pitts neuron j with inputs I_1, ..., I_n, connection weights w_{j1}, ..., w_{jn}, activation A_j, and output Y_j]
Networks of McCulloch-Pitts Neurons
One neuron can't do much on its own. Usually we will have many neurons, labelled by indices k, i, j, and activation flows between them via synapses with strengths w_{ki}, w_{ij}:
    Y_i = sgn( Σ_{k=1}^n I_{ki} − θ_i )

    I_{ki} = w_{ki} Y_k        I_{ij} = w_{ij} Y_i
[Figure: neuron k sends Y_k across weight w_{ki} as input I_{ki} to neuron i; neuron i collects inputs I_{1i}, I_{2i}, I_{3i}, ..., I_{ki}, and its output Y_i crosses synapse ij as input I_{ij} = w_{ij} Y_i to neuron j]
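For concreteness, here is a minimal Python sketch of a single McCulloch-Pitts neuron (not part of the original slides), taking sgn(x) = 1 for x ≥ 0 and 0 otherwise, as the truth tables later on imply; the function name and example values are illustrative:

    def mp_neuron(inputs, weights, theta):
        """Y = sgn(sum_k w_k * I_k - theta), with sgn(x) = 1 if x >= 0 else 0."""
        activation = sum(w * x for w, x in zip(weights, inputs))
        return 1 if activation - theta >= 0 else 0

    # Two inputs, unit weights, threshold 1.5: fires only when both inputs fire.
    print(mp_neuron([1, 1], [1, 1], 1.5))  # -> 1
    print(mp_neuron([1, 0], [1, 1], 1.5))  # -> 0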
The Perceptron
We can connect any number of McCulloch-Pitts neurons together in any way we like. An arrangement of one input layer of McCulloch-Pitts neurons feeding forward to one output layer of McCulloch-Pitts neurons is known as a Perceptron.
    Y_j = sgn( Σ_{i=1}^n Y_i w_{ij} − θ_j )

[Figure: a Perceptron with input units i = 1, ..., N feeding forward to output units j = 1, ..., M through weights w_{ij}]
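A sketch of the full Perceptron computation, with one output unit per column of the weight matrix (names and sizes are assumptions for illustration, not from the slides):

    def perceptron_layer(inputs, weights, thresholds):
        """Y_j = sgn(sum_i Y_i * w_ij - theta_j); weights[i][j] links input i to output j."""
        outputs = []
        for j, theta in enumerate(thresholds):
            total = sum(inputs[i] * weights[i][j] for i in range(len(inputs)))
            outputs.append(1 if total - theta >= 0 else 0)
        return outputs

    # N = 2 inputs feeding M = 2 output units with their own thresholds.
    w = [[1, 1],
         [1, 1]]
    print(perceptron_layer([1, 0], w, [1.5, 0.5]))  # -> [0, 1]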
Implementing Logic Gates with MP Neurons
We can use McCulloch-Pitts neurons to implement the basic logic
gates (e.g. AND, OR, NOT).
It is well known from logic that we can construct any logical function
from these three basic logic gates.
All we need to do is find the appropriate connection weights and neuron
thresholds to produce the right outputs for each set of inputs.
We shall see explicitly how one can construct simple networks that
perform NOT, AND, and OR.
Implementation of Logical NOT, AND, and OR
NOT:

    in  out
     0   1
     1   0

AND:

    in1  in2  out
     0    0    0
     0    1    0
     1    0    0
     1    1    1

OR:

    in1  in2  out
     0    0    0
     0    1    1
     1    0    1
     1    1    1
Problem: train the network to find the appropriate weights and thresholds so that it classifies the different classes correctly (i.e. forms decision boundaries between the classes).
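As a sanity check (not from the slides), the following sketch verifies one choice of weights and thresholds per gate. The AND/OR values match those derived on the decision-boundary slide below; the NOT parameters (w = -1, θ = -0.5) are a standard choice assumed here for illustration:

    def mp(inputs, weights, theta):
        # Fires (returns 1) exactly when the weighted input sum reaches theta.
        return 1 if sum(w * x for w, x in zip(weights, inputs)) - theta >= 0 else 0

    for i1 in (0, 1):
        print(f"NOT {i1} = {mp([i1], [-1], -0.5)}")      # assumed NOT parameters
    for i1 in (0, 1):
        for i2 in (0, 1):
            print(f"AND {i1},{i2} = {mp([i1, i2], [1, 1], 1.5)}  "
                  f"OR {i1},{i2} = {mp([i1, i2], [1, 1], 0.5)}")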
Decision Surfaces
The decision surface is the surface at which the output of the unit is precisely equal to the threshold, i.e. Σ_i w_i I_i = θ.

In 1-D the surface is just a point:

    I_1 = θ / w_1

with Y = 0 on one side of the point and Y = 1 on the other.
In 2-D, the surface is

    w_1 I_1 + w_2 I_2 − θ = 0

which we can re-write as

    I_2 = −(w_1 / w_2) I_1 + θ / w_2

So, in 2-D the decision boundaries are always straight lines.
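A small worked sketch of the re-written boundary equation (the parameter values here are placeholders, matching the AND gate below):

    # Points on the 2-D decision line I2 = -(w1/w2)*I1 + theta/w2.
    w1, w2, theta = 1.0, 1.0, 1.5

    def boundary_i2(i1):
        return -(w1 / w2) * i1 + theta / w2

    for i1 in (0.0, 0.5, 1.0, 1.5):
        print(i1, boundary_i2(i1))  # traces the straight line I1 + I2 = 1.5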
Decision Boundaries for AND and OR
We can now plot the decision boundaries of our logic gates
AND:

    I1  I2  out
     0   0   0
     0   1   0
     1   0   0
     1   1   1

OR:

    I1  I2  out
     0   0   0
     0   1   1
     1   0   1
     1   1   1

[Figure: the four corners (0,0), (0,1), (1,0), (1,1) plotted in the (I_1, I_2) plane, with a straight decision line separating the out = 1 corners from the out = 0 corners for each gate]

AND: w_1 = 1, w_2 = 1, θ = 1.5
OR:  w_1 = 1, w_2 = 1, θ = 0.5
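The sign of w_1 I_1 + w_2 I_2 − θ tells us which side of the line each input corner falls on; a quick check (illustrative code, not from the slides):

    corners = [(0, 0), (0, 1), (1, 0), (1, 1)]

    for name, theta in [("AND", 1.5), ("OR", 0.5)]:
        for i1, i2 in corners:
            s = 1 * i1 + 1 * i2 - theta          # w1 = w2 = 1 as above
            print(f"{name} ({i1},{i2}): {s:+.1f} -> {1 if s >= 0 else 0}")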
Decision Boundary for XOR
The difficulty in dealing with XOR is rather obvious. We need two straight
lines to separate the different outputs/decisions:
XOR:

    I1  I2  out
     0   0   0
     0   1   1
     1   0   1
     1   1   0

[Figure: the four corners in the (I_1, I_2) plane; no single straight line can separate the out = 1 corners (0,1) and (1,0) from the out = 0 corners (0,0) and (1,1)]
Solution: either change the transfer function so that it has more than one
decision boundary, or use a more complex network that is able to generate
more complex decision boundaries.
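To make the second option concrete, here is one assumed two-layer construction (not given in the lecture; the weights are illustrative): hidden units computing OR and AND, and an output unit that fires for OR-but-not-AND.

    def mp(inputs, weights, theta):
        return 1 if sum(w * x for w, x in zip(weights, inputs)) - theta >= 0 else 0

    def xor(i1, i2):
        h_or = mp([i1, i2], [1, 1], 0.5)         # fires unless both inputs are 0
        h_and = mp([i1, i2], [1, 1], 1.5)        # fires only when both inputs are 1
        return mp([h_or, h_and], [1, -1], 0.5)   # OR but not AND

    for i1 in (0, 1):
        for i2 in (0, 1):
            print(i1, i2, "->", xor(i1, i2))     # 0 0->0, 0 1->1, 1 0->1, 1 1->0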
ANN Architectures
Mathematically, ANNs can be represented as weighted directed graphs. The
most common ANN architectures are:
Single-Layer Feed-Forward NNs: one input layer and one output layer of processing units. No feedback connections (e.g. a Perceptron).

Multi-Layer Feed-Forward NNs: one input layer, one output layer, and one or more hidden layers of processing units. No feedback connections (e.g. a Multi-Layer Perceptron).

Recurrent NNs: any network with at least one feedback connection. It may, or may not, have hidden units.

Further interesting variations include: sparse connections, time-delayed connections, moving windows, ...
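A rough code sketch of the three architectures (layer sizes, weights, and the feedback scheme are made-up placeholders, purely for illustration):

    def layer(inputs, weight_cols, thresholds):
        # One threshold unit per (weight column, threshold) pair.
        return [1 if sum(x * w for x, w in zip(inputs, col)) - t >= 0 else 0
                for col, t in zip(weight_cols, thresholds)]

    def single_layer(x):                 # input layer -> output layer
        return layer(x, [[1, 1]], [0.5])

    def multi_layer(x):                  # input -> hidden -> output, no feedback
        hidden = layer(x, [[1, 1], [1, 1]], [0.5, 1.5])
        return layer(hidden, [[1, -1]], [0.5])

    def recurrent(xs):                   # output fed back as an input next step
        y = 0
        for x in xs:
            y = layer([x, y], [[1, 1]], [0.5])[0]
        return y

    print(single_layer([1, 0]), multi_layer([1, 0]), recurrent([1, 0, 0]))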
Examples of Network Architectures
[Figure: example networks: a single-layer feed-forward network, a multi-layer feed-forward network, and a recurrent network]
Types of Activation/Transfer Function
Threshold Function:

    f(x) = 1   if x ≥ 0
           0   if x < 0

Piecewise-Linear Function:

    f(x) = 1         if x ≥ 0.5
           x + 0.5   if −0.5 < x < 0.5
           0         if x ≤ −0.5

Sigmoid Function:

    f(x) = 1 / (1 + e^(−x))

[Figure: plots of f(x) against x for each of the three functions]
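Direct translations of the three functions into Python (a sketch; only math.exp is needed):

    import math

    def threshold(x):
        return 1.0 if x >= 0 else 0.0

    def piecewise_linear(x):
        if x >= 0.5:
            return 1.0
        if x > -0.5:
            return x + 0.5
        return 0.0

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    for f in (threshold, piecewise_linear, sigmoid):
        print(f.__name__, [round(f(x), 3) for x in (-1.0, 0.0, 1.0)])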
The Threshold as a Special Kind of Weight
The basic Perceptron equation can be simplified if we consider that the
threshold is another connection weight:
    Σ_{i=1}^n I_i w_{ij} − θ_j = I_1 w_{1j} + I_2 w_{2j} + ... + I_n w_{nj} − θ_j
If we define w_{0j} = −θ_j and I_0 = 1, then
    Σ_{i=1}^n I_i w_{ij} − θ_j = I_1 w_{1j} + I_2 w_{2j} + ... + I_n w_{nj} + I_0 w_{0j} = Σ_{i=0}^n I_i w_{ij}
The Perceptron equation then becomes
    Y_j = sgn( Σ_{i=1}^n I_i w_{ij} − θ_j ) = sgn( Σ_{i=0}^n I_i w_{ij} )
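A quick check of this equivalence (illustrative code, not from the slides): augmenting the inputs with I_0 = 1 and the weights with w_0 = −θ reproduces the thresholded output exactly.

    def with_threshold(inputs, weights, theta):
        return 1 if sum(i * w for i, w in zip(inputs, weights)) - theta >= 0 else 0

    def with_bias_weight(inputs, weights, theta):
        aug_i = [1] + list(inputs)           # I_0 = 1
        aug_w = [-theta] + list(weights)     # w_0 = -theta
        return 1 if sum(i * w for i, w in zip(aug_i, aug_w)) >= 0 else 0

    for i1 in (0, 1):
        for i2 in (0, 1):
            assert with_threshold([i1, i2], [1, 1], 1.5) == \
                   with_bias_weight([i1, i2], [1, 1], 1.5)
    print("both formulations agree on all four inputs")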