Neural Networks and Deep Learning 3-0-0-3

(Revised Syllabus)
Unit I

Introduction to classical machine learning – Bayes' theorem, linear regression, logistic regression, PCA,
linear discriminant analysis. Introduction to neural networks – neuroscience inspiration; hypotheses
and tasks; training data; perceptrons, sigmoid units. Learning in neural networks – output vs. hidden
layers; linear vs. nonlinear networks; linear models (regression) – LMS algorithm – perceptrons –
classification – limitations of linear nets and perceptrons – Multi-Layer Perceptrons (MLP) – activation
functions – linear, softmax, tanh, ReLU; error functions – feed-forward networks.
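As a minimal illustration of the perceptron topic above (the code and names are a sketch of my own, not part of the syllabus): the perceptron learning rule nudges the weights toward any misclassified example, and converges whenever the data are linearly separable – which is exactly the limitation the unit highlights, since XOR is not separable while AND is.

```python
# Sketch of the perceptron learning rule on the (linearly separable) AND function.
def predict(w, b, x):
    """Threshold unit: output 1 when the weighted sum exceeds 0."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

def perceptron_train(data, epochs=20, lr=0.1):
    """Rosenblatt's update: move weights by lr * (target - prediction) * input."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, target in data:
            err = target - predict(w, b, x)
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# AND is linearly separable, so the rule converges to a separating line.
AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = perceptron_train(AND)
```

Running the same loop on the XOR truth table never converges, which motivates the multi-layer perceptrons that close the unit.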

Unit II

Backpropagation – recursive chain rule (backpropagation) – learning weights of a logistic output
neuron – loss functions – learning via gradient descent – optimization – momentum method; adaptive
learning rates – RMSProp – mini-batch gradient descent – bias-variance trade-off – regularization –
overfitting – inductive bias – dropout – generalization. Deep neural networks –
convolutional nets – case studies using Keras/TensorFlow.
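The momentum method listed above can be sketched in a few lines (an illustrative example of my own, not from the syllabus): a velocity term accumulates a decaying sum of past gradients, which damps oscillation and speeds up descent along consistent directions. Here it minimizes the one-dimensional loss f(w) = (w − 3)².

```python
# Sketch of gradient descent with the momentum (heavy-ball) method.
def minimize_with_momentum(grad, w0, lr=0.1, beta=0.9, steps=300):
    """v accumulates a beta-decayed sum of past gradients; w follows v."""
    w, v = w0, 0.0
    for _ in range(steps):
        v = beta * v - lr * grad(w)  # momentum update
        w = w + v
    return w

grad = lambda w: 2.0 * (w - 3.0)  # derivative of (w - 3)^2
w_star = minimize_with_momentum(grad, w0=0.0)
```

RMSProp differs only in the bookkeeping: instead of a velocity it keeps a decaying average of squared gradients and divides the learning rate by its square root, giving each weight its own adaptive step size.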

Unit III

Introduction to deep reinforcement learning – neural nets for sequences – recurrent nets – Long
Short-Term Memory (LSTM); Introduction to deep unsupervised learning – autoencoders – from PCA to
autoencoders.
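The "PCA to autoencoders" topic has a compact demonstration (a sketch of my own, under the assumption of tied encoder/decoder weights – not code from the syllabus): a linear autoencoder with one hidden unit, trained by gradient descent on data lying along a line, drives its reconstruction error to zero exactly when its weight vector aligns with the first principal direction.

```python
# Sketch: linear tied-weight autoencoder with a 1-D code, trained by gradient descent.
def reconstruction_loss(w, data):
    """Squared error of encode-then-decode with tied weights w."""
    total = 0.0
    for x, y in data:
        z = w[0] * x + w[1] * y              # encode to a 1-D code
        rx, ry = w[0] * z - x, w[1] * z - y  # decode and subtract the input
        total += rx * rx + ry * ry
    return total

def train(data, w, lr=0.01, steps=5000):
    """Full-batch gradient descent on the reconstruction error."""
    for _ in range(steps):
        g = [0.0, 0.0]
        for x, y in data:
            z = w[0] * x + w[1] * y
            rx, ry = w[0] * z - x, w[1] * z - y
            # gradients of the squared error w.r.t. the tied weights
            g[0] += 2 * rx * (z + w[0] * x) + 2 * ry * w[1] * x
            g[1] += 2 * rx * w[0] * y + 2 * ry * (z + w[1] * y)
        w = [w[0] - lr * g[0], w[1] - lr * g[1]]
    return w

# Rank-1 data: every point sits on the line y = x, so PCA's first
# principal direction is (1, 1)/sqrt(2) and perfect reconstruction is possible.
data = [(1.0, 1.0), (-1.0, -1.0), (0.5, 0.5), (-0.5, -0.5)]
w = train(data, [0.5, 0.3])
```

Adding a nonlinear activation to the hidden layer is what separates a general autoencoder from this PCA-equivalent special case.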

Text Book:

1. Goodfellow, I., Bengio, Y. and Courville, A. Deep Learning. MIT Press, 2016.

Reference Books:

2. Duda, R.O., Hart, P.E. and Stork, D.G. Pattern Classification. 2nd Edition. Wiley-Interscience,
2001.
3. Theodoridis, S. and Koutroumbas, K. Pattern Recognition. 4th Edition. Academic Press, 2008.
4. Russell, S. and Norvig, P. Artificial Intelligence: A Modern Approach. Prentice Hall Series in
Artificial Intelligence, 2003.
5. Bishop, C.M. Neural Networks for Pattern Recognition. Oxford University Press, 1995.
6. Hastie, T., Tibshirani, R. and Friedman, J. The Elements of Statistical Learning. Springer, 2001.
7. Koller, D. and Friedman, N. Probabilistic Graphical Models. MIT Press, 2009.

Pre-requisites:

15MAT111 Calculus and Matrix Algebra, 15MAT213 Probability and Random Processes

Evaluation Pattern (Lab-based course):

Periodical Test 1 – 20 marks


Periodical Test 2 – 20 marks
Continuous Assessment – 30 marks (Quiz, Tutorial, Assignment, Project)
End Semester – 30 marks