
DELHI TECHNOLOGICAL UNIVERSITY

Deep Learning and Artificial Neural Networks

Project Report
Project: Classification of Numbers from 0 to 9
VI Semester, 2021-22

Submitted By
❏ ANAS AYUB (2K19/EE/038)

Faculty
❏ Dr. Sudarshan K. Valluru
MATLAB Code for MultiClass Function
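This function, saved as MultiClass.m, performs one pass of on-line backpropagation over the ten training images: a sigmoid hidden layer feeds a softmax output layer, the output delta is simply the error e = d - y (the softmax/cross-entropy combination), and both weight matrices are updated with learning rate alpha = 0.9.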

function [W1, W2] = MultiClass(W1, W2, X, D)
    alpha = 0.9;                         % learning rate
    N = 10;                              % number of training images
    for k = 1:N
        x = reshape(X(:, :, k), 25, 1);  % flatten 5x5 image to 25x1 vector
        d = D(k, :)';                    % one-hot target for image k
        v1 = W1*x;
        y1 = Sigmoid(v1);                % hidden layer output
        v = W2*y1;
        y = Softmax(v);                  % output class probabilities
        e = d - y;
        delta = e;                       % softmax + cross-entropy output delta
        e1 = W2'*delta;                  % error back-propagated to hidden layer
        delta1 = y1.*(1 - y1).*e1;       % apply sigmoid derivative y1(1 - y1)
        dW1 = alpha*delta1*x';
        W1 = W1 + dW1;
        dW2 = alpha*delta*y1';
        W2 = W2 + dW2;
    end
end
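The code above calls two helper functions, Sigmoid and Softmax, that are not listed in this report; they are assumed to be saved as Sigmoid.m and Softmax.m along the following lines (a minimal sketch; the max-subtraction in Softmax is an optional numerical-stability detail):

function y = Sigmoid(x)
    % Element-wise logistic sigmoid, 1/(1 + exp(-x))
    y = 1 ./ (1 + exp(-x));
end

function y = Softmax(x)
    % Softmax over a column vector; subtracting max(x) avoids overflow
    ex = exp(x - max(x));
    y = ex / sum(ex);
end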

Training the Model using MultiClass Function

This code is saved as TestMultiClass.m

clc;
clear all
rng(3);
X = zeros(5, 5, 10);  % ten 5x5 binary images: the digits 1-9 and 0
X(:, :, 1) = [0 1 1 0 0; 0 0 1 0 0; 0 0 1 0 0; 0 0 1 0 0; 0 1 1 1 0];
X(:, :, 2) = [1 1 1 1 0; 0 0 0 0 1; 0 1 1 1 0; 1 0 0 0 0; 1 1 1 1 1];
X(:, :, 3) = [1 1 1 1 0; 0 0 0 0 1; 0 1 1 1 0; 0 0 0 0 1; 1 1 1 1 0];
X(:, :, 4) = [0 0 0 1 0; 0 0 1 1 0; 0 1 0 1 0; 1 1 1 1 1; 0 0 0 1 0];
X(:, :, 5) = [1 1 1 1 1; 1 0 0 0 0; 1 1 1 1 0; 0 0 0 0 1; 1 1 1 1 0];
X(:, :, 6) = [1 1 1 1 1; 1 0 0 0 0; 1 1 1 1 1; 1 0 0 0 1; 1 1 1 1 1];
X(:, :, 7) = [1 1 1 1 1; 0 0 0 0 1; 0 0 0 1 0; 0 0 1 0 0; 0 1 0 0 0];

X(:, :, 8) = [1 1 1 1 1; 1 0 0 0 1; 1 1 1 1 1; 1 0 0 0 1; 1 1 1 1 1];
X(:, :, 9) = [1 1 1 1 1; 1 0 0 0 0; 1 1 1 1 1; 0 0 0 0 1; 1 1 1 1 1];
X(:, :, 10) = [1 1 1 1 1; 1 0 0 0 1; 1 0 0 0 1; 1 0 0 0 1; 1 1 1 1 1];
D = [1 0 0 0 0 0 0 0 0 0;  % one-hot target labels
     0 1 0 0 0 0 0 0 0 0;
     0 0 1 0 0 0 0 0 0 0;
     0 0 0 1 0 0 0 0 0 0;
     0 0 0 0 1 0 0 0 0 0;
     0 0 0 0 0 1 0 0 0 0;
     0 0 0 0 0 0 1 0 0 0;
     0 0 0 0 0 0 0 1 0 0;
     0 0 0 0 0 0 0 0 1 0;
     0 0 0 0 0 0 0 0 0 1];
W1 = 2*rand(50, 25) - 1;   % hidden layer weights, 50x25
W2 = 2*rand(10, 50) - 1;   % output layer weights, 10x50
for epoch = 1:10000        % train
    [W1, W2] = MultiClass(W1, W2, X, D);
end
N = 10;                    % inference
for k = 1:N
    x = reshape(X(:, :, k), 25, 1);
    v1 = W1*x;
    y1 = Sigmoid(v1);
    v = W2*y1;
    y = Softmax(v)         % no semicolon: prints the class probabilities
end

Output

The model achieves 100% accuracy on the training data: every digit is classified correctly.
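This claim can be checked by taking, for each image, the index of the largest softmax probability as the predicted class. A hypothetical snippet (not part of the original script), assuming the workspace variables left by TestMultiClass and that the true label of the k-th image is k:

correct = 0;
for k = 1:10
    x = reshape(X(:, :, k), 25, 1);
    [~, pred] = max(Softmax(W2*Sigmoid(W1*x)));  % predicted class index
    correct = correct + (pred == k);             % image k depicts class k
end
fprintf('Training accuracy: %d%%\n', 100*correct/10);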

Testing with Contaminated Images

This code is saved as RealMultiClass.m


clear all
TestMultiClass;            % run the training script first to obtain W1, W2
X = zeros(5, 5, 10);       % contaminated versions of the training images
X(:, :, 1) = [0 0 1 1 0;0 0 1 1 0;0 1 0 1 0;0 0 0 1 0;0 1 1 1 0];
X(:, :, 2) = [1 1 1 1 0;0 0 0 0 1;0 1 1 1 0;1 0 0 0 1;1 1 1 1 1];
X(:, :, 3) = [1 1 1 1 0;0 0 0 0 1;0 1 1 1 0;1 0 0 0 1; 1 1 1 1 0];
X(:, :, 4) = [1 1 1 1 1;0 1 0 0 0;0 1 1 1 0;0 1 0 1 0;0 1 1 1 0];
X(:, :, 5) = [0 1 1 1 1;0 1 0 0 0;0 1 1 1 0;0 0 0 1 0;1 1 1 1 0];
X(:, :, 6) = [0 1 1 1 0;1 0 0 1 0;0 1 0 1 0;1 0 0 1 0;0 1 1 1 0];
X(:, :, 7) = [1 1 1 1 0;0 0 0 0 1;0 0 0 1 0;1 0 1 0 1;1 1 0 0 0];

X(:, :, 8) = [1 1 1 1 0;0 0 1 0 1;0 1 0 1 0;1 0 0 0 1;1 1 1 1 0];
X(:, :, 9) = [1 1 1 1 1;0 1 0 0 1;0 1 1 1 1;0 0 0 1 1;0 1 1 1 1];
X(:, :, 10) = [1 1 1 1 1;0 1 0 0 1;1 0 0 0 1;1 0 0 1 0;1 1 1 1 1];

N = 10; % inference
for k = 1:N
    x = reshape(X(:, :, k), 25, 1);
    v1 = W1*x;
    y1 = Sigmoid(v1);
    v = W2*y1;
    y = Softmax(v)
end

Output

Contaminating the images reduces classification accuracy, and the model's outputs show clear confusion. For example, on the 1st contaminated image the probabilities are split between 1, 4, and 5; on the 3rd image the model is confused between 2 and 3. Similar confusion appears for the other digits.

Change in Activation Functions for Testing

ReLU Activation Function


This file is saved as RealMultiClassReLU.m
clear all
TestMultiClass; % W1, W2
X = zeros(5, 5, 10);
X(:, :, 1) = [0 0 1 1 0;0 0 1 1 0;0 1 0 1 0;0 0 0 1 0;0 1 1 1 0];
X(:, :, 2) = [1 1 1 1 0;0 0 0 0 1;0 1 1 1 0;1 0 0 0 1;1 1 1 1 1];
X(:, :, 3) = [1 1 1 1 0;0 0 0 0 1;0 1 1 1 0;1 0 0 0 1; 1 1 1 1 0];
X(:, :, 4) = [1 1 1 1 1;0 1 0 0 0;0 1 1 1 0;0 1 0 1 0;0 1 1 1 0];
X(:, :, 5) = [0 1 1 1 1;0 1 0 0 0;0 1 1 1 0;0 0 0 1 0;1 1 1 1 0];
X(:, :, 6) = [0 1 1 1 0;1 0 0 1 0;0 1 0 1 0;1 0 0 1 0;0 1 1 1 0];
X(:, :, 7) = [1 1 1 1 0;0 0 0 0 1;0 0 0 1 0;1 0 1 0 1;1 1 0 0 0];
X(:, :, 8) = [1 1 1 1 0;0 0 1 0 1;0 1 0 1 0;1 0 0 0 1;1 1 1 1 0];
X(:, :, 9) = [1 1 1 1 1;0 1 0 0 1;0 1 1 1 1;0 0 0 1 1;0 1 1 1 1];
X(:, :, 10) = [1 1 1 1 1;0 1 0 0 1;1 0 0 0 1;1 0 0 1 0;1 1 1 1 1];

N = 10; % inference
for k = 1:N
    x = reshape(X(:, :, k), 25, 1);
    v1 = W1*x;
    y1 = ReLU(v1);
    v = W2*y1;
    y = Softmax(v)
end
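The ReLU helper called here (and by DeepReLU below) is also not listed in the report; it is presumably a one-liner saved as ReLU.m:

function y = ReLU(x)
    % Rectified linear unit, element-wise max(x, 0)
    y = max(x, 0);
end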

Output

With the ReLU activation function the output probabilities are noticeably higher for the identified numbers, i.e. ReLU lets the model pick out the digits more confidently and accurately.

Hyperbolic Tangent Activation Function
This file is saved as RealMultiClassTanh.m

clear all
TestMultiClass; % W1, W2
X = zeros(5, 5, 10);
X(:, :, 1) = [0 0 1 1 0;0 0 1 1 0;0 1 0 1 0;0 0 0 1 0;0 1 1 1 0];
X(:, :, 2) = [1 1 1 1 0;0 0 0 0 1;0 1 1 1 0;1 0 0 0 1;1 1 1 1 1];
X(:, :, 3) = [1 1 1 1 0;0 0 0 0 1;0 1 1 1 0;1 0 0 0 1; 1 1 1 1 0];
X(:, :, 4) = [1 1 1 1 1;0 1 0 0 0;0 1 1 1 0;0 1 0 1 0;0 1 1 1 0];
X(:, :, 5) = [0 1 1 1 1;0 1 0 0 0;0 1 1 1 0;0 0 0 1 0;1 1 1 1 0];
X(:, :, 6) = [0 1 1 1 0;1 0 0 1 0;0 1 0 1 0;1 0 0 1 0;0 1 1 1 0];
X(:, :, 7) = [1 1 1 1 0;0 0 0 0 1;0 0 0 1 0;1 0 1 0 1;1 1 0 0 0];
X(:, :, 8) = [1 1 1 1 0;0 0 1 0 1;0 1 0 1 0;1 0 0 0 1;1 1 1 1 0];
X(:, :, 9) = [1 1 1 1 1;0 1 0 0 1;0 1 1 1 1;0 0 0 1 1;0 1 1 1 1];
X(:, :, 10) = [1 1 1 1 1;0 1 0 0 1;1 0 0 0 1;1 0 0 1 0;1 1 1 1 1];

N = 10; % inference
for k = 1:N
    x = reshape(X(:, :, k), 25, 1);
    v1 = W1*x;
    y1 = hyperbolicTangent(v1);
    v = W2*y1;
    y = Softmax(v)
end
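The hyperbolicTangent helper is likewise assumed to be a thin wrapper around MATLAB's built-in tanh, saved as hyperbolicTangent.m:

function y = hyperbolicTangent(x)
    % Element-wise hyperbolic tangent, output in (-1, 1)
    y = tanh(x);
end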

Output

With the hyperbolic tangent activation function the model is more confused than with either sigmoid or ReLU: it cannot identify a single number confidently, assigning only small probabilities to every class for each image.

DeepReLU
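This function, saved as DeepReLU.m, trains a deeper network with three ReLU hidden layers followed by a softmax output. Since the derivative of ReLU is 1 for positive inputs and 0 otherwise, each hidden-layer delta is the back-propagated error masked element-wise by (v > 0). The learning rate is reduced to alpha = 0.01.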

function [W1, W2, W3, W4] = DeepReLU(W1, W2, W3, W4, X, D)
    alpha = 0.01;                        % learning rate
    N = 10;                              % number of training images
    for k = 1:N
        x = reshape(X(:, :, k), 25, 1);  % flatten 5x5 image to 25x1 vector
        v1 = W1*x;
        y1 = ReLU(v1);                   % hidden layer 1
        v2 = W2*y1;
        y2 = ReLU(v2);                   % hidden layer 2
        v3 = W3*y2;
        y3 = ReLU(v3);                   % hidden layer 3
        v = W4*y3;
        y = Softmax(v);                  % output class probabilities
        d = D(k, :)';                    % one-hot target for image k
        e = d - y;
        delta = e;                       % softmax + cross-entropy output delta
        e3 = W4'*delta;
        delta3 = (v3 > 0).*e3;           % ReLU derivative: 1 if v > 0, else 0
        e2 = W3'*delta3;
        delta2 = (v2 > 0).*e2;
        e1 = W2'*delta2;
        delta1 = (v1 > 0).*e1;
        dW4 = alpha*delta*y3';
        W4 = W4 + dW4;
        dW3 = alpha*delta3*y2';
        W3 = W3 + dW3;
        dW2 = alpha*delta2*y1';
        W2 = W2 + dW2;
        dW1 = alpha*delta1*x';
        W1 = W1 + dW1;
    end
end

TestDeepReLU

This code is saved as TestDeepReLU.m

clc;
clear all;
rng(3);
X = zeros(5, 5, 10);
X(:, :, 1) = [0 1 1 0 0; 0 0 1 0 0; 0 0 1 0 0; 0 0 1 0 0; 0 1 1 1 0];
X(:, :, 2) = [1 1 1 1 0; 0 0 0 0 1; 0 1 1 1 0; 1 0 0 0 0; 1 1 1 1 1];
X(:, :, 3) = [1 1 1 1 0; 0 0 0 0 1; 0 1 1 1 0; 0 0 0 0 1; 1 1 1 1 0];
X(:, :, 4) = [0 0 0 1 0; 0 0 1 1 0; 0 1 0 1 0; 1 1 1 1 1; 0 0 0 1 0];
X(:, :, 5) = [1 1 1 1 1; 1 0 0 0 0; 1 1 1 1 0; 0 0 0 0 1; 1 1 1 1 0];
X(:, :, 6) = [1 1 1 1 1; 1 0 0 0 0; 1 1 1 1 1; 1 0 0 0 1; 1 1 1 1 1];
X(:, :, 7) = [1 1 1 1 1; 0 0 0 1 0; 0 0 1 0 0; 0 1 0 0 0; 1 0 0 0 0];
X(:, :, 8) = [1 1 1 1 1; 1 0 0 0 1; 1 1 1 1 1; 1 0 0 0 1; 1 1 1 1 1];
X(:, :, 9) = [1 1 1 1 1; 1 0 0 0 0; 1 1 1 1 1; 0 0 0 0 1; 1 1 1 1 1];
X(:, :, 10) = [1 1 1 1 1; 1 0 0 0 1; 1 0 0 0 1; 1 0 0 0 1; 1 1 1 1 1];
D = [1 0 0 0 0 0 0 0 0 0;  % one-hot target labels
     0 1 0 0 0 0 0 0 0 0;
     0 0 1 0 0 0 0 0 0 0;
     0 0 0 1 0 0 0 0 0 0;
     0 0 0 0 1 0 0 0 0 0;
     0 0 0 0 0 1 0 0 0 0;
     0 0 0 0 0 0 1 0 0 0;
     0 0 0 0 0 0 0 1 0 0;
     0 0 0 0 0 0 0 0 1 0;
     0 0 0 0 0 0 0 0 0 1];
W1 = 2*rand(20, 25) - 1;   % three hidden layers of 20 units each
W2 = 2*rand(20, 20) - 1;
W3 = 2*rand(20, 20) - 1;
W4 = 2*rand(10, 20) - 1;   % output layer, 10 classes
for epoch = 1:10000        % train
    [W1, W2, W3, W4] = DeepReLU(W1, W2, W3, W4, X, D);
end
N = 10;                    % inference
for k = 1:N
    x = reshape(X(:, :, k), 25, 1);
    v1 = W1*x;
    y1 = ReLU(v1);
    v2 = W2*y1;
    y2 = ReLU(v2);
    v3 = W3*y2;
    y3 = ReLU(v3);
    v = W4*y3;
    y = Softmax(v)         % prints the class probabilities
end

Output

With DeepReLU, the training accuracy is likewise almost 100% for every digit.
