
MACHINE LEARNING
ARTIFICIAL NEURAL NETWORKS
LEC-3A

Hammad Afzal
Department of CSE
hammad.afzal@mcs.edu.pk
WHAT IS IT?
 An artificial neural network is a crude way of trying to simulate the human brain (digitally)
 Human brain – approx. 10 billion neurons

 Each neuron is connected with thousands of others

 Parts of a neuron
  Cell body
  Dendrites – receive input signals
  Axons – give output
INTRODUCTION
 ANN – made up of artificial neurons
  Digitally modeled biological neurons
 Each input into the neuron has its own weight associated with it
 As each input enters the nucleus (the summing unit), it is multiplied by its weight
INTRODUCTION
 The nucleus sums all these weighted input values, which gives us the activation
 For n inputs and n weights, each weight is multiplied by its input and the products are summed:

a = x1w1 + x2w2 + x3w3 + ... + xnwn
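As a minimal sketch, the weighted sum above can be written in Python; the input and weight values here are made-up illustrative numbers, not from the lecture:

```python
def activation(inputs, weights):
    """Compute a = x1*w1 + x2*w2 + ... + xn*wn."""
    return sum(x * w for x, w in zip(inputs, weights))

# Illustrative values: 1.0*0.2 + 0.5*0.4 + (-1.0)*0.1
a = activation([1.0, 0.5, -1.0], [0.2, 0.4, 0.1])
```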
INTRODUCTION
 If the activation is greater than a threshold value, the neuron outputs a signal (for example, 1)
 If the activation is less than the threshold, the neuron outputs zero

 This is typically called a step function
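A sketch of this thresholded neuron in Python (the weights and threshold below are illustrative, not from the slides):

```python
def step_neuron(inputs, weights, threshold):
    """Output 1 if the activation exceeds the threshold, else 0."""
    a = sum(x * w for x, w in zip(inputs, weights))
    return 1 if a > threshold else 0

# With threshold 1.0: activation 1.2 fires, activation 0.6 does not
fired = step_neuron([1, 1], [0.6, 0.6], 1.0)
quiet = step_neuron([1, 0], [0.6, 0.6], 1.0)
```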


INTRODUCTION
 The combination of summation and thresholding is called a node

[Figure: a single artificial neuron – http://www-cse.uta.edu/~cook/ai1/lectures/figures/neuron.jpg]

 For the step (activation) function, the output is 1 if:

x1w1 + x2w2 + x3w3 + ... + xnwn > T


INTRODUCTION
x1w1 + x2w2 + x3w3 + ... + xnwn > T

x1w1 + x2w2 + x3w3 + ... + xnwn - T > 0

Let w0 = -T and x0 = 1

D = x0w0 + x1w1 + x2w2 + x3w3 + ... + xnwn > 0

Output is 1 if D > 0;
Output is 0 otherwise

w0 is called a bias weight
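The bias-weight trick above can be sketched directly: fold the threshold T into an extra weight w0 = -T on a fixed input x0 = 1, so that firing reduces to checking D > 0 (the example values are illustrative):

```python
def neuron_with_bias(inputs, weights, threshold):
    """Equivalent thresholded neuron using a bias weight w0 = -T, x0 = 1."""
    xs = [1.0] + list(inputs)          # prepend the fixed input x0 = 1
    ws = [-threshold] + list(weights)  # prepend the bias weight w0 = -T
    d = sum(x * w for x, w in zip(xs, ws))
    return 1 if d > 0 else 0

out = neuron_with_bias([1, 1], [0.6, 0.6], 1.0)  # D = -1.0 + 0.6 + 0.6 > 0
```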


TYPICAL ACTIVATION FUNCTIONS

[Figure: graphs of the four activation functions]

Step function:    Y = 1 if X >= 0;  Y = 0 if X < 0
Sign function:    Y = +1 if X >= 0; Y = -1 if X < 0
Sigmoid function: Y = 1 / (1 + e^(-X))
Linear function:  Y = X
Controls when unit is “active” or “inactive”
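The four activation functions can be sketched as:

```python
import math

def step(x):
    """1 for non-negative input, 0 otherwise."""
    return 1 if x >= 0 else 0

def sign(x):
    """+1 for non-negative input, -1 otherwise."""
    return +1 if x >= 0 else -1

def sigmoid(x):
    """Smooth squashing function 1 / (1 + e^(-x))."""
    return 1 / (1 + math.exp(-x))

def linear(x):
    """Identity: output equals input."""
    return x
```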


AN ARTIFICIAL NEURON – SUMMARY SO FAR
 Receives n inputs
 Multiplies each input by its weight
 Applies the activation function to the sum of the results
 Outputs the result

[Figure: http://www-cse.uta.edu/~cook/ai1/lectures/figures/neuron.jpg]
CLASSIFICATION BY BACK-PROPAGATION
 During the learning phase, the network learns by adjusting
the weights so as to be able to predict the correct class label of
the input tuples

A MULTI-LAYER FEED-FORWARD
NEURAL NETWORK
[Figure: a multi-layer feed-forward network – the input vector X enters the input layer, which connects through weights wij to a hidden layer, which in turn connects to the output layer that produces the output vector]
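A minimal sketch of a forward pass through such a network, using sigmoid units; the layer sizes and weights wij below are made up for illustration:

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def layer_forward(inputs, weights, biases):
    """One fully connected layer: each unit applies the sigmoid
    to its weighted sum of the inputs plus a bias."""
    return [sigmoid(sum(x * w for x, w in zip(inputs, ws)) + b)
            for ws, b in zip(weights, biases)]

# Input vector X -> hidden layer (2 units) -> output layer (1 unit)
x = [1.0, 0.5]
hidden = layer_forward(x, [[0.1, 0.2], [0.3, -0.1]], [0.0, 0.0])
output = layer_forward(hidden, [[0.5, 0.5]], [0.0])
```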
NETS WITHOUT HIDDEN LAYERS

 Input layer
 Output layer – one or more output nodes

 Simplest classifier – Single Neuron

Can a single neuron learn a task?


A MOTIVATING EXAMPLE
 Each day you get lunch at the cafeteria.
 Your diet consists of fish, chips, and drink.
 You get several portions of each

 The cashier only tells you the total price of the meal
 After several days, you should be able to figure out the price of each
portion.
 Each meal price gives a linear constraint on the prices of the
portions:

price  x fishw fish  xchipswchips  xdrink wdrink


SOLVING THE PROBLEM
 The prices of the portions are like the weights of a linear neuron.

 We will start with guesses for the weights and then adjust the
guesses to give a better fit to the prices given by the cashier.

w  ( w fish , wchips , wdrink )


THE CASHIER’S BRAIN
[Figure: a linear neuron with the true weights w1 = 150, w2 = 50, w3 = 100 applied to 2 portions of fish, 5 portions of chips, and 3 portions of drink – price of meal = 850]
THE CASHIER’S BRAIN DAY2
[Figure: the same linear neuron with w1 = 150, w2 = 50, w3 = 100 applied to 4 portions of fish, 2 portions of chips, and 1 portion of drink – price of meal = 800]
A MODEL OF THE CASHIER’S BRAIN
WITH ARBITRARY INITIAL WEIGHTS
 Start with arbitrary initial weights: w1 = w2 = w3 = 50
 For the day-1 meal (2 fish, 5 chips, 3 drinks) the predicted price is 2·50 + 5·50 + 3·50 = 500, so the residual error is 850 - 500 = 350
 The learning rule is:

Δwi = η · xi · (y - ŷ)

 With a learning rate η of 1/35, the weight changes are +20, +50, +30
 This gives new weights of 70, 100, 80
 Notice that the weight for chips got worse! (Its true value is 50.)
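This update step can be reproduced in Python; the numbers match the slide (initial weights 50, 50, 50, portions 2, 5, 3, true price 850, learning rate 1/35):

```python
def delta_rule_step(weights, inputs, target, lr):
    """One application of the rule: wi <- wi + lr * xi * (y - y_hat)."""
    y_hat = sum(x * w for x, w in zip(inputs, weights))
    error = target - y_hat
    new_weights = [w + lr * x * error for w, x in zip(weights, inputs)]
    return new_weights, error

new_w, err = delta_rule_step([50, 50, 50], [2, 5, 3], 850, 1 / 35)
# err is 850 - 500 = 350; new_w is (approximately) [70, 100, 80]
```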
A MODEL OF THE CASHIER’S BRAIN
WITH ARBITRARY INITIAL WEIGHTS
 Now apply the new weights w1 = 70, w2 = 100, w3 = 80 to the day-2 meal (4 fish, 2 chips, 1 drink)
 The predicted price of the meal is 4·70 + 2·100 + 1·80 = 560, so the residual error is 800 - 560 = 240
 With the same learning rule and η = 1/35, the weight changes are about +27.4, +13.7, +6.9
 This gives new weights of about 97.4, 113.7, 86.9
 Notice that the weight for chips got worse again – it moved even further from its true value of 50!
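The slides stop after single updates, but the same rule can be iterated over both meals. As a sketch (repeated passes are an extrapolation, not shown in the slides), the predictions converge to the observed prices, though two meals cannot uniquely pin down three portion prices:

```python
def predict(weights, inputs):
    return sum(x * w for x, w in zip(inputs, weights))

def train(weights, data, lr, epochs):
    """Repeatedly apply wi <- wi + lr * xi * (y - y_hat) over all cases."""
    w = list(weights)
    for _ in range(epochs):
        for inputs, target in data:
            error = target - predict(w, inputs)
            w = [wi + lr * xi * error for wi, xi in zip(w, inputs)]
    return w

# Day-1 and day-2 meals from the slides (true prices were 150, 50, 100)
data = [([2, 5, 3], 850), ([4, 2, 1], 800)]
w = train([50, 50, 50], data, lr=1 / 35, epochs=1000)
```

After training, both meal prices are predicted almost exactly, even though the individual weights need not equal the true portion prices.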
