
Korea Advanced Institute of Science and Technology

School of Electrical Engineering

EE331 Introduction to Machine Learning


Spring 2019

Issued: Feb. 21, 2019

Course Information

             Name            Tel.    Email
Lecturer:    Changdong Yoo   T3470   cd_yoo@kaist.ac.kr
TA:          Kim, Junyeong   T5470   junyeong.kim@kaist.ac.kr
             Vu Van Thang    T5470   thangvubk@kaist.ac.kr
             Ju, Minjeong    T5470   jujeong94@kaist.ac.kr
             Ma, Minuk       T5470   akalsdnr@kaist.ac.kr
Secretary:   Dua Jeong       T8070   duajeong@kaist.ac.kr

Text: Course handouts


Web: http://klms.kaist.ac.kr/course/view.php?id=56385

             Day        Time         Room
Lecture:     Wed, Fri   09:00-10:30  E11, #210

Recommended Courses

Recommended prerequisite courses are EE209 (Programming Structure for Electrical Engineering),
MAS250/EE210 (Probability and Statistics / Probability and Random Processes), and MAS212/MAS109
(Linear Algebra / Introduction to Linear Algebra). Recommended courses may be taken concurrently.
Mathematical maturity is expected.

Course Description

This course introduces the principles, algorithms, and applications of machine learning (ML) from
the standpoint of modeling, prediction, and representation learning. It covers concepts such as
generalization, over-fitting, representation, and regularization, along with topics such as the
framework for ML, decision trees, logistic regression, PAC learning, large-margin classifiers,
Bayesian learning, EM, Hidden Markov Models, probability and learning, k-means clustering,
principal component analysis, and reinforcement learning.

Course Organization and Grading Policy

There will be two 75-minute lectures per week. To facilitate coverage of a large quantity of
material, copies of the lecture viewgraphs will be handed out.
There will be about five homework assignments, which will involve MATLAB and Python programming.
Assignments must be turned in by the due date. You will be given a grace period of three days
across the five assignments, which you may use however you please, e.g., three days on one
homework or one day each on three homeworks. Copying another person's work is strictly
forbidden, but collaboration is encouraged. Anyone suspected of copying a homework will receive
no points for that homework.
In addition to the homework assignments, there will be a required term project that involves
(1) constructing a new dataset for regression or multi-class classification, (2) reviewing and
selecting datasets, and (3) implementing an ML algorithm and competing in a bakeoff on five datasets.

Midterm        20
Homework       20
Project        20
Participation  10
Final          30
Total         100

References

1. Tom Mitchell, Machine Learning, McGraw-Hill, 1997.

2. David J.C. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003.

3. Kevin P. Murphy, Machine Learning: A Probabilistic Perspective, MIT Press, 2012.

4. Ethem Alpaydin, Introduction to Machine Learning, The MIT Press, 2010.

5. Bernhard Scholkopf, Alexander J. Smola, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, The MIT Press, Cambridge, Massachusetts, 2002.

6. Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.

7. Trevor Hastie, Robert Tibshirani, The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Springer, 2001.

8. Gareth James, Daniela Witten, An Introduction to Statistical Learning, Springer.

9. Shai Shalev-Shwartz, Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press.

10. Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, Foundations of Machine Learning, Second Edition, MIT Press, 2018.

11. Willi Richert, Luis Pedro Coelho, Building Machine Learning Systems with Python, Packt Publishing.

Tentative Syllabus Schedule

Lect. Date Topic HW


L1 2/27, W introduction to ML
– 3/1, F National Holiday
L2 3/6, W framework for ML: loss, over-fitting, complexity, generalization
L3 3/8, F decision tree
L4 3/13, W perceptron
L5 3/15, F nearest neighbor HW1 out
L6 3/20, W unsupervised learning: k-means clustering
L7 3/22, F unsupervised learning: principal component analysis HW1 due, HW2 out
L8 3/27, W unsupervised learning: density estimation
L9 3/29, F performance, bias-variance decomposition HW2 due
L10 4/3, W logistic regression and maximum likelihood estimation (MLE) HW3 out
L11 4/5, F linear regression and regularization
L12 4/10, W linear regression and regularization
L13 4/12, F optimization HW3 due
– 4/17, W Midterm Week
– 4/19, F Midterm Week
L14 4/24, W neural network HW4 out
L15 4/26, F neural network
L16 5/1, W large-margin classifier
L17 5/3, F large-margin classifier HW4 due
L18 5/8, W probability and learning theory
L19 5/10, F probability and learning theory HW5 out
L20 5/15, W Bayesian Learning
L21 5/17, F Bayesian Learning : Naive Bayes
L22 5/22, W expectation maximization (EM)
L23 5/24, F Hidden Markov Model (HMM)
L24 5/29, W Hidden Markov Model (HMM)
L25 5/31, F Markov Decision Process HW5 due
L26 6/5, W reinforcement learning
L27 6/7, F reinforcement learning
– 6/12, W Final Week
– 6/14, F Final Week
