
Faculty of Computing

Artificial Intelligence - CS-370


Fall 2023
BSCS11 C
Lab 12: Back Propagation

Date: 15th Dec 2023


Instructor: Dr. Imran Malik
Lab Eng: Shakeela



Lab 12: Back Propagation
Introduction:

Back propagation is a widely used algorithm in machine learning for training artificial neural
networks. It is a supervised learning algorithm that uses gradient descent to optimize the neural
network's weights and biases. In this lab, we will explore the concept of back propagation and
how it can be used to train a neural network.

Objectives:

The main objectives of this lab are:

• Understand the concept of back propagation.
• Learn how to train a neural network using back propagation.
• Learn how to implement back propagation using Python.
• Apply back propagation to a real-world dataset.

Tools:

Python 3.x, NumPy, Matplotlib, Jupyter Notebook or Google Colab

Description:

We will implement a neural network using Python and NumPy. We will use the MNIST dataset,
which contains a large number of handwritten digit images. We will preprocess the dataset, train
the neural network using back propagation, and evaluate its performance on a test set.

MNIST Dataset:

This dataset contains 70,000 images of handwritten digits, split into 60,000 training examples
and 10,000 test examples. The goal is to correctly classify each digit image into its
corresponding numerical digit (0-9).

The MNIST dataset contains images of size 28x28 pixels. Each pixel value is an integer between
0 and 255, representing the grayscale value of the pixel. The dataset is commonly used as a
benchmark for image recognition tasks in machine learning.

Here is a link to the MNIST dataset: http://yann.lecun.com/exdb/mnist/. However, the MNIST
dataset is also bundled with common machine learning libraries, e.g. PyTorch, TensorFlow, and
Scikit-learn, and can be loaded directly by importing it from any of them.
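
For illustration, one possible way to load and normalize the dataset is sketched below. This is a
minimal sketch assuming TensorFlow/Keras is installed; scikit-learn's fetch_openml("mnist_784")
works similarly.

import numpy as np
from tensorflow.keras.datasets import mnist

# Load the 60,000 training and 10,000 test images with their labels
(x_train, y_train), (x_test, y_test) = mnist.load_data()

# Flatten each 28x28 image into a 784-dimensional vector and scale pixels to [0, 1]
x_train = x_train.reshape(-1, 784).astype(np.float32) / 255.0
x_test = x_test.reshape(-1, 784).astype(np.float32) / 255.0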



Methodology:

1. Prepare the dataset: Load the dataset and split it into training and test sets. Normalize
the input features to bring them to a similar scale.
2. Define the model architecture: Decide on the number of input and output neurons, as
well as the number of hidden layers and neurons in each hidden layer. Also, choose an
appropriate activation function for each layer.
3. Initialize the weights and biases: Assign random values to the weights and biases of the
network.
4. Forward propagation: Pass the input through the network and compute the output.
5. Compute the loss: Calculate the difference between the predicted output and the actual
output using an appropriate loss function.
6. Backward propagation: Compute the gradients of the loss function with respect to the
weights and biases using the chain rule of differentiation.
7. Update the weights and biases: Adjust the weights and biases in the opposite direction
of the gradients to reduce the loss.
8. Repeat steps 4-7 for multiple epochs: Train the model for a fixed number of epochs or
until the loss stops decreasing.
9. Evaluate the model performance: Test the model on the test set and calculate metrics
such as accuracy, precision, and recall.
10. Make predictions: Use the trained model to predict the output for new input data.
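
To make steps 3-7 above concrete, the following is a minimal NumPy sketch of one possible
implementation: a single hidden layer with a sigmoid activation, a softmax output layer, and
cross-entropy loss, trained with mini-batch gradient descent. The variable names (x_train,
y_train, x_test, y_test) follow the loading sketch above; the hidden size, learning rate, and
epoch count are illustrative choices, not prescribed values.

import numpy as np

def one_hot(labels, num_classes=10):
    out = np.zeros((labels.size, num_classes))
    out[np.arange(labels.size), labels] = 1.0
    return out

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)      # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Step 3: initialize weights and biases with small random values
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 784, 64, 10
W1 = rng.normal(0.0, 0.01, (n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.01, (n_hidden, n_out))
b2 = np.zeros(n_out)

lr, epochs, batch_size = 0.1, 5, 64
Y_train = one_hot(y_train)

for epoch in range(epochs):                               # Step 8: repeat for several epochs
    for start in range(0, x_train.shape[0], batch_size):
        X = x_train[start:start + batch_size]
        Y = Y_train[start:start + batch_size]

        # Step 4: forward propagation
        a1 = sigmoid(X @ W1 + b1)
        y_hat = softmax(a1 @ W2 + b2)

        # Step 5: cross-entropy loss, averaged over the mini-batch
        loss = -np.mean(np.sum(Y * np.log(y_hat + 1e-9), axis=1))

        # Step 6: backward propagation via the chain rule
        dz2 = (y_hat - Y) / X.shape[0]                    # gradient of softmax + cross-entropy
        dW2 = a1.T @ dz2
        db2 = dz2.sum(axis=0)
        dz1 = (dz2 @ W2.T) * a1 * (1.0 - a1)              # sigmoid derivative
        dW1 = X.T @ dz1
        db1 = dz1.sum(axis=0)

        # Step 7: gradient-descent update
        W2 -= lr * dW2
        b2 -= lr * db2
        W1 -= lr * dW1
        b1 -= lr * db1

    # Step 9: evaluate accuracy on the test set after each epoch
    test_probs = softmax(sigmoid(x_test @ W1 + b1) @ W2 + b2)
    acc = np.mean(test_probs.argmax(axis=1) == y_test)
    print(f"epoch {epoch + 1}: loss {loss:.4f}, test accuracy {acc:.4f}")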

Tasks:
• Load and preprocess the MNIST dataset using Python.
• Build a neural network using Python and NumPy with
  o different activation functions, i.e., sigmoid, ReLU, and hyperbolic tangent (tanh) on the
    hidden layers and softmax on the output layer (see the sketch after this list);
  o different hidden layer sizes, i.e., 32, 64, 128, and 256;
  o accuracy as the evaluation metric.
• Train and evaluate the performance of the neural network.
• Compare and visualize the results of the different configurations.
• Save the best performing network model using the model.save() method.
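
As a reference for the activation functions listed above, here is a small sketch of how they
(and the derivatives needed in the backward pass) might be written in NumPy; the function names
are illustrative, not a required interface.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(a):          # expects a = sigmoid(z)
    return a * (1.0 - a)

def relu(z):
    return np.maximum(0.0, z)

def relu_grad(z):
    return (z > 0).astype(z.dtype)

def tanh(z):
    return np.tanh(z)

def tanh_grad(a):             # expects a = tanh(z)
    return 1.0 - a ** 2

def softmax(z):               # for the output layer
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

Note that model.save() is a Keras/TensorFlow method; for a network written directly in NumPy,
saving the weight and bias arrays with np.savez is one possible equivalent.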

Deliverables
• Students are required to upload the lab task solution in .ipynb format on LMS.
• The file name must contain your name and CMS ID in the following format:
  Lab_<your CMS ID>_<your name>.ipynb

