
Course Type: DE
Course Code: ECE17105
Name of Course: Machine Learning
L-T-P: 3-0-0
Credits: 6

Course Objective
The objective of the course is to present the fundamental mathematical concepts of Machine Learning (ML) theory
and algorithms, with emphasis on core ideas and algorithms for supervised and unsupervised learning and a brief
look at online machine learning.
Learning Outcomes
Upon successful completion of this course, students will:
 have a broad understanding of the mathematical concepts behind ML theory and algorithms;
 be able to design and analyze ML algorithms for solving real-world problems;
 be prepared to venture into more advanced areas of ML practice and research.

Unit-wise Topics, Lecture Hours and Learning Outcomes
Unit 1 (4 lecture hours)
Topics: Introduction: different types of ML paradigms and tasks; introduction to supervised, unsupervised and
online learning. Review of basic probability: probability space, events, independence of events, Bayes' rule,
random variables, probability distributions. Estimators: MAP and ML estimators.
Learning outcome: The student will have a bird's-eye view of the depth and breadth of ML and will acquire the
basic mathematical tools indispensable for studying ML.
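The ML/MAP distinction from Unit 1 can be illustrated with a minimal sketch (not part of the syllabus): estimating a Bernoulli parameter from coin flips, assuming a Beta(a, b) prior for the MAP estimate.

```python
# Illustrative sketch: ML vs. MAP estimation of a Bernoulli parameter
# theta from coin-flip data, assuming a Beta(a, b) prior for MAP.

def ml_estimate(flips):
    """Maximum-likelihood estimate: the fraction of heads."""
    return sum(flips) / len(flips)

def map_estimate(flips, a=2.0, b=2.0):
    """MAP estimate under a Beta(a, b) prior (mode of the posterior)."""
    heads = sum(flips)
    n = len(flips)
    return (heads + a - 1) / (n + a + b - 2)

flips = [1, 1, 1, 0, 1]           # 4 heads in 5 flips
print(ml_estimate(flips))         # 0.8
print(map_estimate(flips))        # 5/7 ~ 0.714, pulled toward 0.5 by the prior
```

With more data the two estimates coincide; the prior's pull vanishes as n grows.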
Unit 2 (3 lecture hours)
Topics: Basic concepts of ML: loss functions; training and testing; overfitting and model selection;
cross-validation; the curse of dimensionality.
Learning outcome: The student will become familiar with common terms used in the ML community.
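Cross-validation, as covered in Unit 2, can be sketched in plain Python; `fit` and `score` below are placeholder callables standing in for any learning algorithm and evaluation metric.

```python
# Illustrative sketch: k-fold cross-validation in plain Python.

def k_fold_cv(data, k, fit, score):
    """Average validation score over k folds."""
    folds = [data[i::k] for i in range(k)]   # round-robin split into k folds
    scores = []
    for i in range(k):
        val = folds[i]
        train = [x for j, f in enumerate(folds) if j != i for x in f]
        model = fit(train)
        scores.append(score(model, val))
    return sum(scores) / k

# Toy usage: the "model" is just the training mean; the score is
# negative mean squared error on the held-out fold.
data = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
fit = lambda train: sum(train) / len(train)
score = lambda m, val: -sum((x - m) ** 2 for x in val) / len(val)
print(k_fold_cv(data, 3, fit, score))
```

Averaging over folds gives a lower-variance estimate of generalization performance than a single train/test split.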
Unit 3 (3 lecture hours)
Topics: Binary classification: introduction to binary classifiers; the optimal decision rule and Bayes risk;
introduction to the a posteriori probability (the regression function).
Learning outcome: The student will be introduced to the basic yet stimulating problem of binary classification
and will understand the challenges in learning.
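The optimal decision rule of Unit 3 can be made concrete when the priors and class-conditional densities are known; the sketch below (an illustration, not syllabus material) uses two unit-variance Gaussian classes.

```python
# Illustrative sketch: the Bayes optimal decision rule for two Gaussian
# classes with equal variance and known priors.

import math

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_classify(x, prior1, mu0, mu1, sigma=1.0):
    """Predict class 1 iff its a posteriori probability exceeds 1/2."""
    p0 = (1 - prior1) * gaussian_pdf(x, mu0, sigma)
    p1 = prior1 * gaussian_pdf(x, mu1, sigma)
    eta = p1 / (p0 + p1)          # a posteriori probability of class 1
    return 1 if eta > 0.5 else 0

# With equal priors the rule reduces to the midpoint threshold (mu0+mu1)/2.
print(bayes_classify(0.4, 0.5, mu0=0.0, mu1=2.0))  # 0 (below midpoint 1.0)
print(bayes_classify(1.6, 0.5, mu0=0.0, mu1=2.0))  # 1
```

No classifier can beat this rule in expected error; its error is the Bayes risk.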
Unit 4 (6 lecture hours)
Topics: Linear regression: introduction to the regression problem; the linear parametric model; maximum
likelihood and least squares; bias-variance decomposition; Bayesian linear regression.
Learning outcome: The student will have a good understanding of the various linear models for the regression
problem and will also understand the inherent problem of bias and variance in ML models.
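The least-squares fit of Unit 4 can be sketched via the normal equations in NumPy; under Gaussian noise this coincides with the maximum-likelihood solution.

```python
# Illustrative sketch: least-squares fit of the linear parametric model
# y ~ w0 + w1*x via the normal equations (Phi^T Phi) w = Phi^T y.

import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])            # exactly y = 1 + 2x

Phi = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]
w = np.linalg.solve(Phi.T @ Phi, Phi.T @ y)   # solve the normal equations
print(w)   # ~ [1. 2.]
```

In practice `np.linalg.lstsq` is preferred for numerical stability, but the normal equations make the derivation transparent.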
Linear Classification: Discriminant functions; Fisher’s The student will have a good
5 Linear Discriminant: Rosenblatt’s perceptron; 6 understanding of the various linear
Probabilistic classification: Logistic regression models for classification problem
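Logistic regression from Unit 5 admits a short sketch: batch gradient descent on the log-loss for a tiny 1-D dataset (bias term folded into the design matrix).

```python
# Illustrative sketch: logistic regression trained by batch gradient
# descent on a tiny linearly separable 1-D dataset.

import numpy as np

X = np.array([[1.0, -2.0], [1.0, -1.0], [1.0, 1.0], [1.0, 2.0]])  # [bias, x]
y = np.array([0.0, 0.0, 1.0, 1.0])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(2)
lr = 0.5
for _ in range(2000):
    p = sigmoid(X @ w)                  # predicted probabilities
    w -= lr * X.T @ (p - y) / len(y)    # gradient of the average log-loss

preds = (sigmoid(X @ w) > 0.5).astype(int)
print(preds)   # [0 0 1 1]
```

On separable data the weights grow without bound unless regularized; here a fixed iteration budget keeps the sketch finite.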
Unit 6 (3 lecture hours)
Topics: Theory of statistical learning: introduction to empirical risk minimization; introduction to PAC
learning; brief mention of high-probability risk bounds.
Learning outcome: Gives the student a glimpse of statistical learning theory.
Unit 7 (3 lecture hours)
Topics: Artificial neural networks: the multilayer perceptron; the universal approximation theorem; network
training: the gradient descent algorithm; error backpropagation.
Learning outcome: The student will have a good understanding of the basic feedforward architecture of neural
networks.
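Error backpropagation from Unit 7 is just the chain rule applied layer by layer; the sketch below computes backprop gradients for a one-hidden-layer network and verifies them against a numerical gradient, a standard sanity check.

```python
# Illustrative sketch: backpropagation in a one-hidden-layer MLP
# (tanh hidden units, linear output, squared-error loss), checked
# against a finite-difference gradient.

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)           # one input vector
t = 1.0                          # scalar target
W1 = rng.normal(size=(4, 3))     # hidden-layer weights
w2 = rng.normal(size=4)          # output-layer weights

def loss(W1, w2):
    h = np.tanh(W1 @ x)          # hidden activations
    yhat = w2 @ h                # linear output
    return 0.5 * (yhat - t) ** 2

# Backpropagation: propagate the error signal backwards.
h = np.tanh(W1 @ x)
delta_out = (w2 @ h) - t                     # dL/dyhat
grad_w2 = delta_out * h
delta_hid = delta_out * w2 * (1 - h ** 2)    # tanh'(a) = 1 - tanh(a)^2
grad_W1 = np.outer(delta_hid, x)

# Finite-difference check on one entry of W1.
eps = 1e-6
W1p = W1.copy(); W1p[0, 0] += eps
num = (loss(W1p, w2) - loss(W1, w2)) / eps
print(abs(num - grad_W1[0, 0]) < 1e-4)   # True
```

Gradient checking like this is a common debugging step before trusting a hand-written backprop implementation.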
Unit 8 (3 lecture hours)
Topics: Support vector machines: the concept of margins; hard- and soft-margin classifiers; a detour into
optimization and Lagrange multipliers; the concept of kernels.
Learning outcome: The student will acquire a good understanding of ML as an optimization problem.
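The soft-margin SVM of Unit 8 is usually solved via its Lagrangian dual; a simpler route, sketched below for illustration, is subgradient descent on the regularized hinge loss for a linear classifier in 1-D.

```python
# Illustrative sketch: a linear soft-margin SVM trained by subgradient
# descent on  lam/2 * w^2 + mean(max(0, 1 - y*(w*x + b))).

import numpy as np

X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])          # labels in {-1, +1}

w, b, lam, lr = 0.0, 0.0, 0.01, 0.1
for _ in range(1000):
    margins = y * (X[:, 0] * w + b)
    active = margins < 1                      # points inside the margin
    gw = lam * w - np.mean(active * y * X[:, 0])
    gb = -np.mean(active * y)
    w -= lr * gw
    b -= lr * gb

print(np.sign(X[:, 0] * w + b))   # [-1. -1.  1.  1.]
```

Only the "active" points inside the margin contribute to the subgradient, mirroring the role of support vectors in the dual solution.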
Unit 9 (3 lecture hours)
Topics: Mixture models and EM: introduction to clustering; introduction to the Gaussian mixture model (GMM);
introduction to the Expectation-Maximization (EM) procedure.
Learning outcome: This unit will introduce some powerful techniques for unsupervised learning.
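The EM procedure of Unit 9 can be sketched for a two-component 1-D GMM; to keep the example short the component variances are held fixed at 1 and only the means and mixing weights are updated.

```python
# Illustrative sketch: EM for a two-component 1-D Gaussian mixture
# with fixed unit variances.

import math

data = [-2.2, -1.9, -2.1, 1.8, 2.0, 2.2]

def normal(x, mu):
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

mu = [-1.0, 1.0]          # initial means
pi = [0.5, 0.5]           # initial mixing weights
for _ in range(20):
    # E-step: responsibilities gamma[i][k] = P(component k | x_i)
    gamma = []
    for x in data:
        w = [pi[k] * normal(x, mu[k]) for k in range(2)]
        s = sum(w)
        gamma.append([wk / s for wk in w])
    # M-step: re-estimate means and mixing weights
    for k in range(2):
        nk = sum(g[k] for g in gamma)
        mu[k] = sum(g[k] * x for g, x in zip(gamma, data)) / nk
        pi[k] = nk / len(data)

print([round(m, 2) for m in mu])   # [-2.07, 2.0] -- the two cluster means
```

Each iteration provably does not decrease the data log-likelihood, which is the key property of EM.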
Unit 10 (2 lecture hours)
Topics: PCA: the Karhunen-Loève transformation and dimensionality reduction; introduction to the concept of
sparsity.
Learning outcome: Introduces the student to the concept of dimensionality reduction.
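Dimensionality reduction via PCA, from Unit 10, reduces to an eigendecomposition of the sample covariance matrix; the sketch below keeps only the top principal component.

```python
# Illustrative sketch: PCA via eigendecomposition of the sample
# covariance matrix, projecting 2-D data onto its top component.

import numpy as np

X = np.array([[2.0, 1.9], [0.0, 0.2], [-1.0, -1.1], [-1.0, -1.0]])
Xc = X - X.mean(axis=0)                     # center the data
C = Xc.T @ Xc / (len(X) - 1)                # sample covariance
vals, vecs = np.linalg.eigh(C)              # eigenvalues in ascending order
pc1 = vecs[:, -1]                           # top principal component
Z = Xc @ pc1                                # 1-D projection of the data
print(Z.shape)   # (4,)
```

The variance of the projection equals the largest eigenvalue, i.e. PCA keeps the direction of maximum variance.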
Unit 11 (2 lecture hours)
Topics: Introduction to online learning: the online prediction setup; a very brief introduction to the Hedge
prediction algorithm.
Learning outcome: Introduces the student to the exciting field of online prediction and learning.
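The Hedge algorithm of Unit 11 maintains multiplicative weights over experts; a minimal sketch (with expert losses in [0, 1] revealed each round) follows.

```python
# Illustrative sketch: the Hedge (multiplicative weights) algorithm for
# prediction with expert advice.

import math

def hedge(loss_rounds, n_experts, eta=0.5):
    """Return the final weight distribution over the experts."""
    w = [1.0] * n_experts
    for losses in loss_rounds:
        # Each expert's weight shrinks exponentially in its incurred loss.
        w = [wi * math.exp(-eta * li) for wi, li in zip(w, losses)]
    s = sum(w)
    return [wi / s for wi in w]

# Expert 0 is consistently better, so its weight dominates.
rounds = [[0.1, 0.9]] * 10
p = hedge(rounds, 2)
print(p[0] > p[1])   # True
```

The learning rate eta trades off how aggressively weight concentrates on low-loss experts; tuning it yields the classical regret bounds.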

Text Books:
1. “Pattern Recognition and Machine Learning”, Christopher M. Bishop, Springer, 2006. [PRML]
2. “The Elements of Statistical Learning”, Trevor Hastie, Robert Tibshirani and Jerome Friedman, Springer,
2001. [ESL]
3. “Machine Learning: A Probabilistic Perspective”, Kevin P. Murphy, MIT Press, 2012.
4. “A Probabilistic Theory of Pattern Recognition”, Luc Devroye, László Györfi and Gábor Lugosi, Springer,
1996. [PTPR]
5. “Understanding Machine Learning: From Theory to Algorithms”, Shai Shalev-Shwartz and Shai Ben-David,
Cambridge University Press, 2014. [UML]
6. “Prediction, learning, and games”, Nicolò Cesa-Bianchi and Gábor Lugosi, Cambridge University Press,
2006. [PLG]
