
Artificial Neural Network Basics

Lecture 2

Ali Raza
Topics
• Machine Learning:
– Definition
– Important Tasks
• Neural Network Basics:
– Perceptron
– Binary Classification with Perceptron
– Training Perceptron
What is Machine Learning?
• Field of study that gives computers the ability to learn without being
explicitly programmed (Arthur Samuel, 1956)
• Study of algorithms that improve their performance P at some task T with
experience E (Tom Mitchell, 1998)
– Example: T: play checkers; P: % of games won; E: playing against itself

Well-defined learning task: <P, T, E>


Well Defined Learning Task
• Handwriting Recognition
– Task T: recognizing and classifying handwritten words
within images
– Performance P: percent of words correctly classified
– Training experience E: a database of handwritten words
with given classifications
Machine Learning Tasks
• Supervised Learning
– Classification
– Regression
• Unsupervised Learning
• Reinforcement Learning
Learning Process
Features
Biological Neuron
Perceptron
• Rosenblatt proposed a binary classification method
• Key idea:
– One weight 𝑤𝑖 per input 𝑥𝑖
– Multiply each weight by its input, sum the results, and add a bias term
(𝑥0 = +1)
– If the result is larger than the threshold, return 1; otherwise return 0
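The decision rule above can be sketched in a few lines of Python (the function name and example numbers are illustrative, not from the slides):

```python
# Minimal sketch of Rosenblatt's perceptron decision rule.
def perceptron_predict(x, w, b):
    """Return 1 if the weighted sum plus bias exceeds 0, else 0."""
    s = b + sum(w_i * x_i for w_i, x_i in zip(w, x))
    return 1 if s > 0 else 0

# Weighted sum: 0.5*1.0 + (-0.2)*2.0 + 0.1 = 0.2 > 0, so the output is 1.
print(perceptron_predict([1.0, 2.0], [0.5, -0.2], 0.1))
```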
Sentiment Classification

Feature        Weight
Good            1.0
Great           1.2
Awesome         1.7
Bad            -1.0
Terrible       -2.1
Awful          -3.3
Restaurant      0.0
", the, where", …   0.0

[Figure: each word feature is multiplied by its weight and fed into a sum Σ]

Test example: “Mubeen was great, the food was awesome. But the service was terrible”
Positive score: 1.2 + 1.7 - 2.1 = 0.8 > 0, so output = 1 (positive sentiment)
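The weighted word-count scoring above can be sketched in plain Python (the `sentiment_score` helper and the simplified tokenization are illustrative; the weights come from the feature table):

```python
# Word weights from the slide's feature table.
weights = {"good": 1.0, "great": 1.2, "awesome": 1.7,
           "bad": -1.0, "terrible": -2.1, "awful": -3.3}

def sentiment_score(text):
    # Crude tokenization: lowercase, drop commas/periods, split on spaces.
    tokens = text.lower().replace(",", " ").replace(".", " ").split()
    return sum(weights.get(t, 0.0) for t in tokens)

review = "Mubeen was great, the food was awesome. But the service was terrible"
score = sentiment_score(review)      # 1.2 + 1.7 - 2.1 ≈ 0.8
print(1 if score > 0 else 0)         # 1 -> positive sentiment
```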
Sentiment Classification with Perceptron
Test example: “Mubeen was great, the food was awesome. But the service was terrible”

Feature:                 Good   Great  Awesome  Bad    Terrible  Awful  …
Input x:                  0      1      1        0      1         0     …
Weights of
positive class:           1.0    1.2    1.7     -1.0   -2.1      -3.3   …

Score = Σ 𝑥𝑖𝑤𝑖 = 1.2 + 1.7 - 2.1 = 0.8

The weights are a.k.a. the parameters of the positive class.


Fast Sentiment Classification with Perceptron
Test example: “Mubeen was great, the food was awesome. But the service was terrible”

x = [0, 1, 1, 0, 1, 0, …]ᵀ   (Good, Great, Awesome, Bad, Terrible, Awful, …)
W = [1.0, 1.2, 1.7, -1.0, -2.1, -3.3, …]

𝑓(𝑥, 𝑊) = 𝑊𝑥 + 𝑏,  where 𝑏 is the bias
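The vectorized form above can be sketched with NumPy (the six-word vocabulary and the input vector are a simplification for illustration; the weight values follow the feature table):

```python
import numpy as np

# f(x, W) = Wx + b as a dot product over the word features.
W = np.array([1.0, 1.2, 1.7, -1.0, -2.1, -3.3])  # Good, Great, Awesome, Bad, Terrible, Awful
x = np.array([0, 1, 1, 0, 1, 0])                 # which words appear in the review
b = 0.0

score = W @ x + b   # 1.2 + 1.7 - 2.1 ≈ 0.8
print(score > 0)    # True -> positive sentiment
```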
Image Classification with Perceptron

𝑓(𝑥, 𝑊) = 𝑊𝑥 + 𝑏

where 𝑥 holds the flattened image pixels, 𝑊 is the weights/parameters matrix
with one row per class, 𝑏 is the bias vector, and the output is the vector of
class scores.
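The matrix form can be sketched at the shape level (the sizes here, 4 pixels and 3 classes, are illustrative assumptions):

```python
import numpy as np

# f(x, W) = Wx + b with one row of W per class.
rng = np.random.default_rng(0)
x = rng.random(4)        # flattened image pixels
W = rng.random((3, 4))   # weights/parameters matrix, one row per class
b = np.zeros(3)          # bias vector, one entry per class

scores = W @ x + b       # vector of class scores
print(scores.shape)      # (3,)
```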
Image Classification with Perceptron

Score < 0 → “Not Cat”: the prediction is wrong.

Where did it go wrong, and how do we fix it?

Ans: a learning algorithm.
Training a Perceptron
• Learning algorithm:
– Initialize the weights randomly
– Take one sample 𝑥𝑖 and predict ŷ𝑖
– For wrong predictions, update the weights:
• If the output was ŷ𝑖 = 0 and 𝑦𝑖 = 1, increase the weights
• If the output was ŷ𝑖 = 1 and 𝑦𝑖 = 0, decrease the weights
– Repeat until no errors are made
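The update loop above can be sketched in plain Python (the AND dataset, the zero initialization, and the learning-rate parameter are illustrative choices, not from the slides):

```python
# Perceptron training: on a wrong prediction, move the weights toward the
# input (y = 1) or away from it (y = 0); stop when no errors are made.
def train_perceptron(samples, labels, lr=1.0, max_epochs=100):
    n = len(samples[0])
    w = [0.0] * n          # (slides say random init; zeros keep this demo deterministic)
    b = 0.0
    for _ in range(max_epochs):
        errors = 0
        for x, y in zip(samples, labels):
            y_hat = 1 if b + sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
            if y_hat != y:
                errors += 1
                sign = 1 if y == 1 else -1   # increase or decrease the weights
                w = [wi + lr * sign * xi for wi, xi in zip(w, x)]
                b += lr * sign
        if errors == 0:    # repeat until no errors are made
            break
    return w, b

# Learn logical AND, which is linearly separable:
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
w, b = train_perceptron(X, y)
```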

But how does the algorithm know that a prediction is wrong?
The cost function (loss function) tells it.

What is the cost function? One choice is the squared error over 𝑚 samples:

𝐿(𝑊, 𝑏) = ∑ᵢ₌₁ᵐ (𝑦𝑖 − ŷ𝑖)²

(𝑦𝑖: actual class, ŷ𝑖: predicted class)
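The cost above can be evaluated directly (the helper name and the example labels are illustrative):

```python
# Squared-error cost L(W, b) = sum_{i=1}^{m} (y_i - yhat_i)^2 over m samples.
def squared_error_loss(y_true, y_pred):
    return sum((y - y_hat) ** 2 for y, y_hat in zip(y_true, y_pred))

print(squared_error_loss([1, 0, 1], [1, 1, 0]))  # 0 + 1 + 1 = 2
```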