
NPTEL Syllabus

Artificial Neural Networks - Web course


This course has been designed as a graduate-level/final-year undergraduate elective for students of any branch of engineering or science who have basic foundations in matrix algebra and calculus, and preferably (though not essentially) a basic knowledge of optimization.

Students and researchers desirous of working on pattern recognition and
classification, regression and interpolation from sparse observations, and
control and optimization are expected to find this course useful. The course
covers theories and usage of artificial neural networks (ANN) for problems
pertaining to classification (supervised/unsupervised) and regression.
The course starts with some mathematical foundations and the structure of
artificial neurons, which mimic biological neurons in a grossly scaled-down
version. It offers the mathematical basis of learning mechanisms through ANNs.
The course introduces the perceptron, discusses its capabilities and limitations
as a pattern classifier, and later develops the concept of multilayer
perceptrons with back-propagation learning.
As more advanced ANNs, radial basis function networks and support vector
machines are discussed. Competitive learning and self-organizing maps are
presented as unsupervised classifiers. The course also outlines fuzzy neural
networks, used in automated control applications.
The course adequately stresses the analytical aspects of ANNs rather than
applications directly. My expectation is that on completion of the course,
students can apply the concepts to real-life engineering problems.

Pre-requisites:

1. Knowledge of matrix algebra and calculus.

Instructor:

Prof. Somnath Sengupta
Department of Electronics & Electrical Communication Engineering, IIT

Module 1: Introduction and ANN Structure (4 lectures)

1. Biological neurons and artificial neurons.

2. Model of an ANN.

3. Activation functions used in ANNs.

4. Typical classes of network architectures.

Module 2: Mathematical Foundations and Learning (7 lectures)


1. Re-visiting vector and matrix algebra.

2. State-space concepts.

3. Concepts of optimization.

4. Error-correction learning.

5. Memory-based learning.

6. Hebbian learning.

7. Competitive learning.

Module 3: Single-Layer Perceptrons (5 lectures)

1. Structure and learning of perceptrons.

2. Pattern classifier - introduction and Bayes' classifier.


3. Perceptron as a pattern classifier.

4. Perceptron convergence.

5. Limitations of a perceptron.
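The perceptron structure and error-correction learning listed in this module can be sketched in a few lines of Python. This is an illustrative toy example (the logical AND problem, with our own choice of learning rate and data), not code from the course:

```python
# Toy sketch of the perceptron error-correction learning rule.
# Data (logical AND) and hyperparameters are illustrative choices.
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=100):
    """Learn w, b so that sign(w.x + b) matches the labels y in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            # Update only on misclassification (error-correction learning).
            if yi * (np.dot(w, xi) + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
                errors += 1
        if errors == 0:  # convergence: stop once all patterns are correct
            break
    return w, b

# Logical AND: a linearly separable pattern-classification problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = train_perceptron(X, y)
print(np.sign(X @ w + b))  # reproduces the labels y after convergence
```

The perceptron convergence theorem (item 4 above) guarantees that this loop terminates in finitely many updates on linearly separable data; the XOR problem, by contrast, never converges, which is the classic limitation in item 5.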

Module 4: Feedforward ANN (5 lectures)

1. Structures of Multi-layer feedforward networks.

2. Back propagation algorithm.

3. Back propagation - training and convergence.

4. Functional approximation with back propagation.

5. Practical and design issues of back propagation.
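The back-propagation steps above (forward pass, error gradients via the chain rule, gradient-descent weight updates) can be sketched as follows for a small sigmoid network on the XOR task. The network size, learning rate, iteration count, and seed are our own toy assumptions, not values prescribed by the course:

```python
# Illustrative sketch of back-propagation on a 2-4-1 sigmoid network (XOR task).
# All hyperparameters (hidden size, learning rate, iterations, seed) are toy choices.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, losses = 1.0, []
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))
    # Backward pass: local gradients via the chain rule (sigmoid' = s(1-s)).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent weight updates.
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

print(f"MSE: {losses[0]:.3f} -> {losses[-1]:.3f}")  # training error decreases
```

Note that XOR is exactly the problem a single-layer perceptron cannot solve (Module 3), which is why a hidden layer and back-propagation are needed here.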


Module 5: Radial Basis Function Networks (5 lectures)

1. Pattern separability and interpolation.

2. Regularization Theory.

3. Regularization and RBF networks.

4. RBF network design and training.

5. Approximation properties of RBF.

Module 6: Support Vector Machines (5 lectures)

1. Linear separability and optimal hyperplane.

2. Determination of optimal hyperplane.

3. Optimal hyperplane for nonseparable patterns.

4. Design of an SVM.

5. Examples of SVM.

Module 7: Competitive Learning and Self-Organizing ANN (5 lectures)

1. General clustering procedures.

2. Learning Vector Quantization (LVQ).

3. Competitive learning algorithms and architectures.

4. Self organizing feature maps.

5. Properties of feature maps.
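The winner-take-all update at the heart of the competitive-learning algorithms above (move only the winning prototype toward each input) can be sketched as follows; the two-cluster data, prototype count, and learning rate are illustrative assumptions of our own:

```python
# Toy sketch of winner-take-all competitive learning (unsupervised clustering).
# The two Gaussian clusters, prototype count, and learning rate are illustrative.
import numpy as np

rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0.0, 0.1, (20, 2)),   # cluster around (0, 0)
                  rng.normal(1.0, 0.1, (20, 2))])  # cluster around (1, 1)
prototypes = rng.normal(0.5, 0.1, (2, 2))          # two randomly placed prototypes

def quantization_error(protos):
    """Mean distance from each sample to its nearest prototype."""
    return np.mean([np.min(np.linalg.norm(protos - x, axis=1)) for x in data])

err_before = quantization_error(prototypes)
lr = 0.1
for _ in range(50):
    for x in rng.permutation(data):
        # Competition: only the nearest prototype (the winner) is updated.
        winner = np.argmin(np.linalg.norm(prototypes - x, axis=1))
        prototypes[winner] += lr * (x - prototypes[winner])
err_after = quantization_error(prototypes)

print(f"quantization error: {err_before:.3f} -> {err_after:.3f}")
```

LVQ and self-organizing feature maps refine this scheme: LVQ adds supervised label information to the update sign, and SOMs also update the winner's neighbors on a lattice.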

Module 8: Fuzzy Neural Networks (4 lectures)

1. Neuro-fuzzy systems.

2. Background of fuzzy sets and logic.

3. Design of fuzzy systems.

4. Design of fuzzy ANNs.

Total: 40 lectures


References:

1. Simon Haykin, "Neural Networks: A Comprehensive Foundation", Second Edition, Pearson Education Asia.

2. Satish Kumar, "Neural Networks: A Classroom Approach", Tata McGraw Hill, 2004.

3. Robert J. Schalkoff, "Artificial Neural Networks", McGraw-Hill International Editions, 1997.

A joint venture by IISc and IITs, funded by MHRD, Govt of India