Project Report
Project: Classification of Numbers from 0 to 9
VI Semester, 2021-22
Submitted By
❏ ANAS AYUB (2K19/EE/038)
Faculty
❏ Dr. Sudarshan K. Valluru
MATLAB Code for the MultiClass Function
clc;
clear all;
rng(3);
X = zeros(5, 5, 10);
X(:, :, 1) = [0 1 1 0 0; 0 0 1 0 0; 0 0 1 0 0; 0 0 1 0 0; 0 1 1 1 0];
X(:, :, 2) = [1 1 1 1 0; 0 0 0 0 1; 0 1 1 1 0; 1 0 0 0 0; 1 1 1 1 1];
X(:, :, 3) = [1 1 1 1 0; 0 0 0 0 1; 0 1 1 1 0; 0 0 0 0 1; 1 1 1 1 0];
X(:, :, 4) = [0 0 0 1 0; 0 0 1 1 0; 0 1 0 1 0; 1 1 1 1 1; 0 0 0 1 0];
X(:, :, 5) = [1 1 1 1 1; 1 0 0 0 0; 1 1 1 1 0; 0 0 0 0 1; 1 1 1 1 0];
X(:, :, 6) = [1 1 1 1 1; 1 0 0 0 0; 1 1 1 1 1; 1 0 0 0 1; 1 1 1 1 1];
X(:, :, 7) = [1 1 1 1 1; 0 0 0 0 1; 0 0 0 1 0; 0 0 1 0 0; 0 1 0 0 0];
The model achieves 100% accuracy on the training data and classifies all ten digits correctly.
N = 10; % inference over the ten training images
for k = 1:N
  x = reshape(X(:, :, k), 25, 1); % flatten the 5x5 image into a 25x1 column vector
  v1 = W1*x;
  y1 = Sigmoid(v1);    % hidden-layer activation
  v = W2*y1;
  y = Softmax(v)       % class probabilities (no semicolon, so the output is printed)
end
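The forward pass of this shallow network can be sketched in NumPy. The layer sizes here (50 hidden units for a 25-pixel input and 10 output classes) are assumptions for illustration; in the report, W1 and W2 come from the trained MultiClass network, while here they are random placeholders.

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def softmax(v):
    e = np.exp(v - np.max(v))  # subtract the max for numerical stability
    return e / e.sum()

rng = np.random.default_rng(3)
# placeholder weights; assumed shapes: 50 hidden units, 25 inputs, 10 classes
W1 = rng.standard_normal((50, 25))
W2 = rng.standard_normal((10, 50))

x = rng.integers(0, 2, size=(5, 5)).reshape(25, 1).astype(float)  # one 5x5 binary digit image
y1 = sigmoid(W1 @ x)   # hidden-layer activations, each in (0, 1)
y = softmax(W2 @ y1)   # probability distribution over the 10 digit classes
```

The softmax output sums to 1, so the largest entry of `y` can be read directly as the network's most probable digit.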
Output
N = 10; % inference, this time with a ReLU hidden layer
for k = 1:N
  x = reshape(X(:, :, k), 25, 1);
  v1 = W1*x;
  y1 = ReLU(v1);       % hidden-layer activation
  v = W2*y1;
  y = Softmax(v)       % class probabilities (printed)
end
With the ReLU activation function, the output probabilities for the correct classes are noticeably higher. This means ReLU helps the model identify the digits more confidently than the sigmoid activation does.
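One intuition for the higher probabilities, illustrated with a toy NumPy example (the values below are made up, not the report's actual outputs): sigmoid squashes every hidden pre-activation into (0, 1), while ReLU passes positive values through at full magnitude, so the logits reaching the softmax can be larger and the resulting distribution more peaked.

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def relu(v):
    return np.maximum(0.0, v)

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

# the same hidden pre-activations pushed through both activations
v1 = np.array([4.0, -1.0, 2.0])
p_sig = softmax(sigmoid(v1))   # sigmoid compresses into (0, 1): flatter distribution
p_relu = softmax(relu(v1))     # relu keeps positive magnitudes: more peaked distribution
```

Here `p_relu` assigns a larger maximum probability than `p_sig`, mirroring the behavior observed in the report's outputs.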
clear all
TestMultiClass; % run the training script first to obtain W1 and W2
X = zeros(5, 5, 10); % slightly distorted versions of the training digits
X(:, :, 1) = [0 0 1 1 0;0 0 1 1 0;0 1 0 1 0;0 0 0 1 0;0 1 1 1 0];
X(:, :, 2) = [1 1 1 1 0;0 0 0 0 1;0 1 1 1 0;1 0 0 0 1;1 1 1 1 1];
X(:, :, 3) = [1 1 1 1 0;0 0 0 0 1;0 1 1 1 0;1 0 0 0 1; 1 1 1 1 0];
X(:, :, 4) = [1 1 1 1 1;0 1 0 0 0;0 1 1 1 0;0 1 0 1 0;0 1 1 1 0];
X(:, :, 5) = [0 1 1 1 1;0 1 0 0 0;0 1 1 1 0;0 0 0 1 0;1 1 1 1 0];
X(:, :, 6) = [0 1 1 1 0;1 0 0 1 0;0 1 0 1 0;1 0 0 1 0;0 1 1 1 0];
X(:, :, 7) = [1 1 1 1 0;0 0 0 0 1;0 0 0 1 0;1 0 1 0 1;1 1 0 0 0];
X(:, :, 8) = [1 1 1 1 0;0 0 1 0 1;0 1 0 1 0;1 0 0 0 1;1 1 1 1 0];
X(:, :, 9) = [1 1 1 1 1;0 1 0 0 1;0 1 1 1 1;0 0 0 1 1;0 1 1 1 1];
X(:, :, 10) = [1 1 1 1 1;0 1 0 0 1;1 0 0 0 1;1 0 0 1 0;1 1 1 1 1];
N = 10; % inference on the distorted test images
for k = 1:N
  x = reshape(X(:, :, k), 25, 1);
  v1 = W1*x;
  y1 = hyperbolicTangent(v1);  % tanh hidden-layer activation, in (-1, 1)
  v = W2*y1;
  y = Softmax(v)               % class probabilities (printed)
end
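The tanh-based inference on one distorted test image can be sketched in NumPy as below. The weights here are random placeholders standing in for the trained W1 and W2, and the 50-unit hidden layer is an assumption; the final `argmax` step makes explicit how a predicted class is read off the softmax output (MATLAB's 1-based index would be `pred + 1`).

```python
import numpy as np

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

rng = np.random.default_rng(3)
# placeholder weights standing in for the trained network (50 hidden units assumed)
W1 = rng.standard_normal((50, 25))
W2 = rng.standard_normal((10, 50))

x = rng.integers(0, 2, (25, 1)).astype(float)  # one flattened 5x5 test image
y1 = np.tanh(W1 @ x)       # tanh hidden layer, activations in (-1, 1)
y = softmax(W2 @ y1)       # probability for each of the 10 digit classes
pred = int(np.argmax(y))   # index of the most probable class
```

Comparing `pred` against the known label of each test image is how the classification accuracy on the distorted digits would be computed.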
DeepReLU
TestDeepReLU
clc;
clear all;
rng(3);
X = zeros(5, 5, 10);
X(:, :, 1) = [0 1 1 0 0; 0 0 1 0 0; 0 0 1 0 0; 0 0 1 0 0; 0 1 1 1 0];
X(:, :, 2) = [1 1 1 1 0; 0 0 0 0 1; 0 1 1 1 0; 1 0 0 0 0; 1 1 1 1 1];
X(:, :, 3) = [1 1 1 1 0; 0 0 0 0 1; 0 1 1 1 0; 0 0 0 0 1; 1 1 1 1 0];
X(:, :, 4) = [0 0 0 1 0; 0 0 1 1 0; 0 1 0 1 0; 1 1 1 1 1; 0 0 0 1 0];
X(:, :, 5) = [1 1 1 1 1; 1 0 0 0 0; 1 1 1 1 0; 0 0 0 0 1; 1 1 1 1 0];
X(:, :, 6) = [1 1 1 1 1; 1 0 0 0 0; 1 1 1 1 1; 1 0 0 0 1; 1 1 1 1 1];
X(:, :, 7) = [1 1 1 1 1; 0 0 0 1 0; 0 0 1 0 0; 0 1 0 0 0; 1 0 0 0 0];
X(:, :, 8) = [1 1 1 1 1; 1 0 0 0 1; 1 1 1 1 1; 1 0 0 0 1; 1 1 1 1 1];
X(:, :, 9) = [1 1 1 1 1; 1 0 0 0 0; 1 1 1 1 1; 0 0 0 0 1; 1 1 1 1 1];
X(:, :, 10) = [1 1 1 1 1; 1 0 0 0 1; 1 0 0 0 1; 1 0 0 0 1; 1 1 1 1 1];
D = [1 0 0 0 0 0 0 0 0 0;
     0 1 0 0 0 0 0 0 0 0;
     0 0 1 0 0 0 0 0 0 0;
     0 0 0 1 0 0 0 0 0 0;
     0 0 0 0 1 0 0 0 0 0;
     0 0 0 0 0 1 0 0 0 0;
     0 0 0 0 0 0 1 0 0 0;
     0 0 0 0 0 0 0 1 0 0;
     0 0 0 0 0 0 0 0 1 0;
     0 0 0 0 0 0 0 0 0 1]; % one-hot target labels, one row per digit class
The training accuracy is also nearly 100% for every digit with the DeepReLU network.
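The DeepReLU forward pass stacks several ReLU hidden layers before the softmax output. A NumPy sketch is below; the 25 → 20 → 20 → 20 → 10 layer sizes are an assumption for illustration, and the random weights stand in for the values DeepReLU learns during training.

```python
import numpy as np

def relu(v):
    return np.maximum(0.0, v)

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

rng = np.random.default_rng(3)
# placeholder weights; assumed layer sizes 25 -> 20 -> 20 -> 20 -> 10
W1 = rng.standard_normal((20, 25))
W2 = rng.standard_normal((20, 20))
W3 = rng.standard_normal((20, 20))
W4 = rng.standard_normal((10, 20))

x = rng.integers(0, 2, (25, 1)).astype(float)  # one flattened 5x5 digit image
y1 = relu(W1 @ x)      # first hidden layer
y2 = relu(W2 @ y1)     # second hidden layer
y3 = relu(W3 @ y2)     # third hidden layer
y = softmax(W4 @ y3)   # probabilities over the 10 digit classes
```

The extra hidden layers give the network more capacity than the single-hidden-layer MultiClass model, which is consistent with it also reaching near-perfect training accuracy.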