
Course Type: DE
Course Code: ECE17105
Name of Course: Machine Learning
L-T-P: 3-0-0
Credit: 6

Course Objective
The objective of the course is to introduce Machine Learning, focusing on the core mathematical concepts of supervised and unsupervised learning, and to show how to apply and analyze some popularly used ML algorithms.
Learning Outcomes
Upon successful completion of this course, students will:

• Be able to understand the mathematical theories and concepts behind ML algorithms
• Be able to design and implement various ML algorithms for a range of real-world applications
• Be able to use commercial ML software based on these algorithms effectively on different datasets.

Unit 1 (2 lecture hours)
Topics to be Covered: Introduction: concept of machine learning; supervised, unsupervised and reinforcement learning; types of ML problems: classification and regression; design of a general learning system; concept of linear and non-linear classifiers.
Learning Outcome: Will be able to identify the type of any given problem and formulate an appropriate ML model; will have a broad idea of the types of applications.

Unit 2 (2 lecture hours)
Topics to be Covered: Evaluation Metrics: practical formulations; training, testing and validation data; overfitting and underfitting; confusion matrix, precision, recall and K-fold cross-validation.
Learning Outcome: Will have an idea of how to evaluate and compare classifiers and ML models at the end of training.

Unit 3 (6 lecture hours)
Topics to be Covered: Bayesian Learning: concepts of conditional independence and conditional probability; the Naïve Bayes classifier; the method of Laplace correction; Gaussian NBC.
Learning Outcome: Practical use of Bayes' theorem; given any dataset (discrete or continuous), will be able to construct the probabilistic model and predict the class of a new feature set.

Unit 4 (6 lecture hours)
Topics to be Covered: Logistic Regression: the log-likelihood function and its gradient descent optimization; batch and stochastic GD; cross-entropy loss; softmax activation function.
Learning Outcome: The concepts of maximum likelihood estimation and gradient-based search in weight space will be clear for deriving linear decision boundaries (an illustrative code sketch follows this outline).

Unit 5 (6 lecture hours)
Topics to be Covered: ANN: use of perceptrons to learn Boolean functions; multilayer perceptrons and non-linearly separable data; back-propagation training.
Learning Outcome: Will be able to update the weights of a multi-layer ANN and understand the concept of how regression

Unit 6 (3 lecture hours)
Topics to be Covered: Principal Component Analysis as an approach to unsupervised learning; feature extraction; dimensionality reduction.
Learning Outcome: Steps of applying PCA to data: computing the covariance matrix, eigen features (an illustrative sketch appears at the end of this outline).

Unit 7 (6 lecture hours)
Topics to be Covered: Support Vector Machines: the concept of maximum margin and support vectors; the use of kernel mapping for non-linear decision boundaries.
Learning Outcome: Hyperplane geometry in developing a discriminative optimal classifier; hard and soft margins; feature mapping to higher dimensions; handling of noisy data.

Unit 8 (3 lecture hours)
Topics to be Covered: Deep Learning: a brief overview of CNN and RNN architectures (if time permits).
Learning Outcome: The motivation of current application domains for using different types of deep learning models.

Unit 9 (4 hours)
Topics to be Covered: Class tests/quizzes/discussion of homework solutions, etc.
Learning Outcome: Continuous evaluation.

Unit 10 (2 hours)
Topics to be Covered: Tutorial classes.
Learning Outcome: Practice of numerical problems.
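Illustrative example (Unit 4): the following is a minimal NumPy sketch of logistic regression trained by batch gradient descent on the cross-entropy loss, in the spirit of the Unit 4 topics. The toy dataset, learning rate, and iteration count are assumptions made purely for illustration and are not part of the course material.

import numpy as np

# Toy, linearly separable data (assumed for illustration only).
X = np.array([[0.5, 1.0], [1.5, 2.0], [3.0, 3.5], [4.0, 5.0]])
y = np.array([0, 0, 1, 1])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Batch gradient descent on the mean cross-entropy (negative log-likelihood) loss.
w = np.zeros(X.shape[1])
b = 0.0
lr = 0.1                              # learning rate (assumed value)
for _ in range(1000):
    p = sigmoid(X @ w + b)            # predicted class probabilities
    grad_w = X.T @ (p - y) / len(y)   # gradient of the loss w.r.t. the weights
    grad_b = np.mean(p - y)           # gradient of the loss w.r.t. the bias
    w -= lr * grad_w
    b -= lr * grad_b

print("learned weights:", w, "bias:", b)
print("predictions:", (sigmoid(X @ w + b) >= 0.5).astype(int))

The same update rule, applied to one randomly drawn sample per step instead of the full batch, gives the stochastic GD variant mentioned in Unit 4.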

Text Book: Tom M. Mitchell, Machine Learning, McGraw Hill Publication (for Units 1-5). Notes, derivations, lecture slides, and illustrative examples will be provided for most of the topics.
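Illustrative example (Unit 6): a minimal NumPy sketch of the PCA steps listed in Unit 6 (centering the data, computing the covariance matrix, eigen-decomposition, projection). The random toy data and the choice of keeping two components are assumptions for illustration only.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))           # toy data matrix: 100 samples, 3 features (assumed)

Xc = X - X.mean(axis=0)                 # 1. centre the data
cov = np.cov(Xc, rowvar=False)          # 2. covariance matrix of the features
eigvals, eigvecs = np.linalg.eigh(cov)  # 3. eigen-decomposition (symmetric matrix)
order = np.argsort(eigvals)[::-1]       # 4. sort components by explained variance
components = eigvecs[:, order[:2]]      # keep the top-2 principal components (assumed choice)
X_reduced = Xc @ components             # 5. project onto the reduced subspace

print("explained variance ratio:", eigvals[order[:2]] / eigvals.sum())
print("reduced shape:", X_reduced.shape)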
