
Applied Machine Learning Course

Schedule
DATE | MODULE | CHAPTER | TOPIC
2019-10-09 | Module 1: Fundamentals of Programming | Python for Data Science | Keywords and identifiers, comments, indentation and statements, Variables and data types in Python, Standard Input and Output, Operators, Control flow: if else, Control flow: while loop, Control flow: for loop, Control flow: break and continue, Revision Python for Data Science
2019-10-10 | Module 1: Fundamentals of Programming | Python for Data Science: Data Structures | Lists, Tuples part 1, Tuples part 2, Sets, Dictionary, Strings, Revision Python for Data Science: Data Structures
2019-10-11 | Module 1: Fundamentals of Programming | Python for Data Science: Functions | Introduction, Types of functions, Function arguments, Recursive functions, Lambda functions, Modules, Packages, File Handling, Exception Handling, Debugging Python, Assignment-1
2019-10-12 | Module 1: Fundamentals of Programming | Python for Data Science: Functions | Revision Python for Data Science: Functions
2019-10-13 | Module 1: Fundamentals of Programming | Python for Data Science: Numpy | Numpy Introduction, Numerical operations on Numpy, Revision Python for Data Science: Numpy
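A minimal NumPy sketch of the kind of numerical operations this session covers (array creation, vectorized arithmetic, aggregates, matrix-vector products); the arrays and values are illustrative only, not course material.

    import numpy as np

    # Create a 1-D array and a 2-D array
    a = np.array([1.0, 2.0, 3.0, 4.0])
    m = np.arange(12).reshape(3, 4)

    # Element-wise numerical operations are vectorized (no explicit loops)
    print(a * 2 + 1)          # [ 3.  5.  7.  9.]
    print(m.sum(axis=0))      # column-wise sums
    print(m.mean(), m.std())  # aggregate statistics
    print(m @ a)              # matrix-vector product, shape (3,)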
2019-10-14 | Module 1: Fundamentals of Programming | Python for Data Science: Matplotlib | Getting started with Matplotlib, Revision Python for Data Science: Matplotlib
2019-10-15 | Module 1: Fundamentals of Programming | Python for Data Science: Pandas | Getting started with pandas, Data Frame Basics, Key Operations on Data Frames, Revision Python for Data Science: Pandas
2019-10-16 | Module 1: Fundamentals of Programming | Python for Data Science: Computational Complexity | Space and Time Complexity: Find largest number in a list, Binary search, Find elements common in two lists, Find elements common in two lists using a Hashtable/Dict, Revision Python for Data Science: Computational Complexity
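A short sketch of the complexity comparison listed for this session: the quadratic list-scan approach versus a hash-table (set/dict) lookup, plus a logarithmic binary search on a sorted list. The sample inputs are made up for illustration.

    from bisect import bisect_left

    def common_elements_naive(xs, ys):
        """O(len(xs) * len(ys)): scans ys for every element of xs."""
        return [x for x in xs if x in ys]

    def common_elements_hashed(xs, ys):
        """O(len(xs) + len(ys)) on average: build a set (hash table) once."""
        ys_set = set(ys)
        return [x for x in xs if x in ys_set]

    def binary_search(sorted_xs, target):
        """O(log n) lookup in a sorted list."""
        i = bisect_left(sorted_xs, target)
        return i < len(sorted_xs) and sorted_xs[i] == target

    print(common_elements_hashed([1, 2, 3, 4], [3, 4, 5]))  # [3, 4]
    print(binary_search([1, 3, 5, 7, 9], 7))                # True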
2019-10-17 | Module 1: Fundamentals of Programming | SQL | Introduction to Databases, Why SQL?, Execution of an SQL statement, IMDB dataset, Installing MySQL, Load IMDB data, USE, DESCRIBE, SHOW TABLES, SELECT, LIMIT, OFFSET, ORDER BY, DISTINCT, WHERE, Comparison operators, NULL
2019-10-18 | Module 1: Fundamentals of Programming | SQL | Logical Operators, Aggregate Functions: COUNT, MIN, MAX, AVG, SUM, GROUP BY, HAVING, Order of keywords, Join and Natural Join, Inner, Left, Right and Outer joins, Sub Queries/Nested Queries/Inner Queries, DML: INSERT, DML: UPDATE, DELETE, DDL: CREATE TABLE, DDL: ALTER: ADD, MODIFY, DROP
2019-10-19 | Module 1: Fundamentals of Programming | SQL | DDL: DROP TABLE, TRUNCATE, DELETE, Data Control Language: GRANT, REVOKE, Learning resources, Assignment-22: SQL Assignment on IMDB data, Revision SQL
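The SQL sessions use MySQL and the IMDB dataset; as a self-contained illustration of the listed keywords (SELECT, WHERE, GROUP BY, HAVING, ORDER BY, LIMIT), here is a sketch that runs the same style of query against an in-memory SQLite database with made-up rows, since the course data is not included here.

    import sqlite3

    # In-memory SQLite stands in for the MySQL/IMDB setup used in the course
    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.execute("CREATE TABLE movies (title TEXT, year INTEGER, rating REAL)")
    cur.executemany(
        "INSERT INTO movies VALUES (?, ?, ?)",
        [("Movie A", 1999, 8.1), ("Movie B", 2005, 7.4), ("Movie C", 1999, 6.9)],
    )

    # SELECT / WHERE / GROUP BY / HAVING / ORDER BY / LIMIT, as listed in the schedule
    cur.execute("""
        SELECT year, COUNT(*) AS n, AVG(rating) AS avg_rating
        FROM movies
        WHERE rating > 6.5
        GROUP BY year
        HAVING COUNT(*) >= 1
        ORDER BY avg_rating DESC
        LIMIT 5
    """)
    print(cur.fetchall())
    conn.close()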
2019-10-20 | Module 2: Data Science: Exploratory Data Analysis and Data Visualization | Plotting for exploratory data analysis (EDA) | Introduction to IRIS dataset and 2D scatter plot, 3D scatter plot, Pair plots, Limitations of Pair Plots, Histogram and Introduction to PDF (Probability Density Function), Univariate Analysis using PDF, CDF (Cumulative Distribution Function), Mean, Variance and Standard Deviation, Median, Percentiles and Quantiles
2019-10-21 | Module 2: Data Science: Exploratory Data Analysis and Data Visualization | Plotting for exploratory data analysis (EDA) | IQR (Inter Quartile Range) and MAD (Median Absolute Deviation), Box-plot with Whiskers, Violin Plots, Summarizing Plots, Univariate, Bivariate and Multivariate analysis, Multivariate Probability Density, Contour Plot, Exercise: Perform EDA on Haberman dataset, Revision Plotting for exploratory data analysis (EDA)
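A small univariate-EDA sketch in the spirit of these two sessions (histogram as an empirical PDF, empirical CDF, box plot with whiskers), using the scikit-learn copy of the Iris dataset so nothing needs to be downloaded; the figure layout is an assumption, not the course's own plots.

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.datasets import load_iris

    iris = load_iris(as_frame=True).frame
    petal = iris["petal length (cm)"].to_numpy()

    # Histogram as an empirical PDF, plus the empirical CDF and a box plot
    fig, axes = plt.subplots(1, 3, figsize=(12, 3))
    axes[0].hist(petal, bins=20, density=True)
    axes[0].set_title("Histogram / PDF")
    xs = np.sort(petal)
    axes[1].plot(xs, np.arange(1, len(xs) + 1) / len(xs))
    axes[1].set_title("Empirical CDF")
    axes[2].boxplot(petal, vert=True)
    axes[2].set_title("Box plot (IQR, whiskers)")
    plt.tight_layout()
    plt.show()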
2019-10-22 | Module 2: Data Science: Exploratory Data Analysis and Data Visualization | Linear Algebra | Why learn it?, Introduction to Vectors (2-D, 3-D, n-D), Row Vector and Column Vector, Dot Product and Angle between 2 Vectors, Projection and Unit Vector, Equation of a line (2-D), Plane (3-D) and Hyperplane (n-D), Plane Passing through origin, Normal to a Plane, Distance of a point from a Plane/Hyperplane, Half-Spaces, Equation of a Circle (2-D), Sphere (3-D) and Hypersphere (n-D), Equation of an Ellipse (2-D), Ellipsoid (3-D) and Hyperellipsoid (n-D), Square, Rectangle, Hyper Cube, Hyper Cuboid, Revision Questions, Revision Linear Algebra
2019-10-23 | Module 2: Data Science: Exploratory Data Analysis and Data Visualization | Probability and Statistics | Introduction to Probability and Statistics, Population and Sample, Gaussian/Normal Distribution and its PDF (Probability Density Function), CDF (Cumulative Distribution Function) of Gaussian/Normal distribution, Symmetric distribution, Skewness and Kurtosis, Standard normal variate (Z) and standardization, Kernel density estimation, Sampling distribution & Central Limit theorem, Q-Q plot: How to test if a random variable is normally distributed or not?, How distributions are used?, Chebyshev's inequality
2019-10-24 | Module 2: Data Science: Exploratory Data Analysis and Data Visualization | Probability and Statistics | Discrete and Continuous Uniform distributions, How to randomly sample data points (Uniform Distribution), Bernoulli and Binomial Distribution, Log Normal Distribution, Power law distribution, Box-Cox transform, Applications of non-Gaussian distributions, Co-variance, Pearson Correlation Coefficient, Spearman Rank Correlation Coefficient, Correlation vs Causation, How to use correlations?, Confidence interval (C.I.) Introduction
2019-10-25 | Module 2: Data Science: Exploratory Data Analysis and Data Visualization | Probability and Statistics | Computing confidence interval given the underlying distribution, C.I. for mean of a normal random variable, Confidence interval using bootstrapping, Hypothesis testing methodology, Null hypothesis, p-value, Hypothesis Testing Intuition with coin toss example, Resampling and permutation test, K-S Test for similarity of two distributions, Code Snippet: K-S Test, Hypothesis testing: another example, Resampling and Permutation test: another example
2019-10-26 | Module 2: Data Science: Exploratory Data Analysis and Data Visualization | Probability and Statistics | How to use hypothesis testing?, Proportional sampling, Revision Questions, Revision Probability and Statistics
2019-10-27 | Module 2: Data Science: Exploratory Data Analysis and Data Visualization | Dimensionality Reduction and Visualization | What is Dimensionality reduction?, Row Vector and Column Vector, How to represent a data set?, How to represent a dataset as a Matrix, Data Preprocessing: Feature Normalisation, Mean of a data matrix, Data Preprocessing: Column Standardization, Co-variance of a Data Matrix, MNIST dataset (784 dimensional), Code to Load MNIST Data Set, Revision Dimensionality Reduction and Visualization
2019-10-28 | Module 2: Data Science: Exploratory Data Analysis and Data Visualization | Principal Component Analysis | Why learn PCA?, Geometric intuition of PCA, Mathematical objective function of PCA, Alternative formulation of PCA: Distance minimization, Eigen values and Eigen vectors (PCA): Dimensionality reduction, PCA for Dimensionality Reduction and Visualization, Visualize MNIST dataset, Limitations of PCA, PCA Code example, PCA for dimensionality reduction (not visualization), Revision Principal Component Analysis
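A minimal PCA-for-visualization sketch along the lines of this session, assuming scikit-learn's small 8x8 digits dataset as a stand-in for the 784-dimensional MNIST data used in the course.

    import matplotlib.pyplot as plt
    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    X, y = load_digits(return_X_y=True)
    X_std = StandardScaler().fit_transform(X)   # column standardization, as in the schedule

    pca = PCA(n_components=2)
    X_2d = pca.fit_transform(X_std)
    print("explained variance ratio:", pca.explained_variance_ratio_)

    plt.scatter(X_2d[:, 0], X_2d[:, 1], c=y, s=8, cmap="tab10")
    plt.xlabel("PC 1"); plt.ylabel("PC 2")
    plt.title("Digits projected onto the first two principal components")
    plt.show()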
2019-10-29 | Module 2: Data Science: Exploratory Data Analysis and Data Visualization | t-SNE | What is t-SNE?, Neighborhood of a point, Embedding, Geometric intuition of t-SNE, Crowding Problem, How to apply t-SNE and interpret its output, t-SNE on MNIST, Code example of t-SNE, Revision Questions, Revision t-SNE
2019-10-30 | Module 3: Foundations of Natural Language Processing and Machine Learning | Predict rating given product reviews on Amazon | Dataset overview: Amazon Fine Food reviews (EDA), Data Cleaning: Deduplication, Why convert text to a vector?, Bag of Words (BoW), Text Preprocessing: Stemming, Stop-word removal, Tokenization, Lemmatization, uni-gram, bi-gram, n-grams, tf-idf (term frequency-inverse document frequency), Why use log in IDF?, Word2Vec, Avg-Word2Vec, tf-idf weighted Word2Vec
2019-10-31 | Module 3: Foundations of Natural Language Processing and Machine Learning | Predict rating given product reviews on Amazon | Bag of Words (Code Sample), Text Preprocessing (Code Sample), Bi-Grams and n-grams (Code Sample), TF-IDF (Code Sample), Word2Vec (Code Sample), Avg-Word2Vec and TFIDF-Word2Vec (Code Sample), Assignment-2: Apply t-SNE, Revision Predict rating given product reviews on Amazon
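A minimal sketch of the text-featurization steps listed above (Bag of Words, n-grams, stop-word removal, TF-IDF), assuming a few toy review strings rather than the Amazon Fine Food dataset.

    from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

    reviews = [
        "good product, works great",
        "bad product, stopped working",
        "great value, would buy again",
    ]

    # Bag of Words with uni-grams and bi-grams, English stop words removed
    bow = CountVectorizer(ngram_range=(1, 2), stop_words="english")
    X_bow = bow.fit_transform(reviews)
    print(sorted(bow.vocabulary_))

    # TF-IDF weighting of the same vocabulary
    tfidf = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
    X_tfidf = tfidf.fit_transform(reviews)
    print(X_tfidf.shape)   # (3 documents, vocabulary size)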
2019-11-01 | Module 3: Foundations of Natural Language Processing and Machine Learning | Classification and Regression Models: K-Nearest Neighbors | How "Classification" works?, Data matrix notation, Classification vs Regression (examples), K-Nearest Neighbours: Geometric intuition with a toy example, Failure cases of KNN, Distance measures: Euclidean (L2), Manhattan (L1), Minkowski, Hamming, Cosine Distance & Cosine Similarity, How to measure the effectiveness of k-NN?, Test/Evaluation time and space complexity, KNN Limitations, Decision surface for K-NN as K changes, Overfitting and Underfitting
2019-11-02 | Module 3: Foundations of Natural Language Processing and Machine Learning | Classification and Regression Models: K-Nearest Neighbors | Need for Cross validation, K-fold cross validation, Visualizing train, validation and test datasets, How to determine overfitting and underfitting?, Time based splitting, k-NN for regression, Weighted k-NN, Voronoi diagram, Binary search tree
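A short sketch of choosing k for k-NN with k-fold cross validation, as covered in these two sessions; it assumes scikit-learn's bundled breast-cancer dataset purely for illustration.

    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = load_breast_cancer(return_X_y=True)

    # Pick k via 5-fold cross validation; too small a k overfits, too large underfits
    for k in [1, 3, 5, 11, 21, 51]:
        model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=k))
        scores = cross_val_score(model, X, y, cv=5)
        print(f"k={k:3d}  mean CV accuracy={scores.mean():.3f} +/- {scores.std():.3f}")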
2019-11-03 | Module 3: Foundations of Natural Language Processing and Machine Learning | Classification and Regression Models: K-Nearest Neighbors | How to build a kd-tree, Find nearest neighbours using kd-tree, Limitations of kd-tree, Extensions, Hashing vs LSH, LSH for cosine similarity, LSH for euclidean distance, Probabilistic class label, Code Sample: Decision boundary
2019-11-04 | Module 3: Foundations of Natural Language Processing and Machine Learning | Classification and Regression Models: K-Nearest Neighbors | Code Sample: Cross Validation, Question and Answers, Revision Classification and Regression Models: K-Nearest Neighbors
2019-11-05 | Module 3: Foundations of Natural Language Processing and Machine Learning | Classification Algorithms in Various Situations | Introduction, Imbalanced vs balanced dataset, Multi-class classification, k-NN given a distance or similarity matrix, Train and test set differences, Impact of outliers, Local Outlier Factor (Simple solution: Mean distance to k-NN), K-Distance(A), N(A), Reachability-Distance(A, B), Local reachability-density(A), Local Outlier Factor(A), Impact of Scale & Column standardization, Interpretability
2019-11-06 | Module 3: Foundations of Natural Language Processing and Machine Learning | Classification Algorithms in Various Situations | Feature Importance and Forward Feature selection, Handling categorical and numerical features, Handling missing values by imputation, Curse of dimensionality, Bias-Variance tradeoff, Intuitive understanding of bias-variance, Best and worst cases for an algorithm, Question and Answers
2019-11-07 | Module 3: Foundations of Natural Language Processing and Machine Learning | Classification Algorithms in Various Situations | Revision Classification Algorithms in Various Situations
2019-11-08 | Module 3: Foundations of Natural Language Processing and Machine Learning | Performance Measurement of Models | Accuracy, Confusion matrix, TPR, FPR, FNR, TNR, Precision and recall, F1-score, Receiver Operating Characteristic (ROC) curve and AUC, Log-loss, R-Squared/Coefficient of determination, Median absolute deviation (MAD), Distribution of errors, Assignment-3: Apply k-nearest neighbour
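A minimal sketch of the classification metrics listed for this session (confusion matrix, precision, recall, F1, ROC-AUC, log-loss); the dataset and classifier are placeholders chosen only so the snippet is self-contained.

    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import (confusion_matrix, precision_score, recall_score,
                                 f1_score, roc_auc_score, log_loss)
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    clf = LogisticRegression(max_iter=5000).fit(X_tr, y_tr)
    y_pred = clf.predict(X_te)
    y_prob = clf.predict_proba(X_te)[:, 1]

    print("confusion matrix:\n", confusion_matrix(y_te, y_pred))  # rows: true, cols: predicted
    print("precision:", precision_score(y_te, y_pred))
    print("recall   :", recall_score(y_te, y_pred))
    print("F1-score :", f1_score(y_te, y_pred))
    print("ROC-AUC  :", roc_auc_score(y_te, y_prob))
    print("log-loss :", log_loss(y_te, y_prob))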
2019-11-09 | Module 3: Foundations of Natural Language Processing and Machine Learning | Performance Measurement of Models | Revision Performance Measurement of Models
2019-11-10 | Module 3: Foundations of Natural Language Processing and Machine Learning | Naive Bayes | Conditional probability, Independent vs Mutually exclusive events, Bayes Theorem with examples, Exercise problems on Bayes Theorem, Naive Bayes algorithm, Toy example: Train and test stages, Naive Bayes on Text data
2019-11-11 | Module 3: Foundations of Natural Language Processing and Machine Learning | Naive Bayes | Laplace/Additive Smoothing, Log-probabilities for numerical stability, Bias and Variance tradeoff, Feature importance and interpretability, Imbalanced data, Outliers, Missing values, Handling Numerical features (Gaussian NB), Multiclass classification, Similarity or Distance matrix, Large dimensionality, Best and worst cases, Code example, Assignment-4: Apply Naive Bayes
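A small sketch of Naive Bayes on text with Laplace/additive smoothing, as listed above; the review strings and labels are toy data, not the course dataset.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    texts = ["great phone, loved it", "terrible battery, awful",
             "awful screen, hated it", "loved the battery life"]
    labels = [1, 0, 0, 1]   # 1 = positive, 0 = negative

    # alpha=1.0 is the Laplace/additive smoothing term listed in the schedule
    model = make_pipeline(CountVectorizer(), MultinomialNB(alpha=1.0))
    model.fit(texts, labels)
    print(model.predict(["battery was great"]))
    print(model.predict_proba(["battery was great"]))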
2019-11-12 | Module 3: Foundations of Natural Language Processing and Machine Learning | Naive Bayes | Revision Naive Bayes
2019-11-13 | Module 3: Foundations of Natural Language Processing and Machine Learning | Logistic Regression | Geometric intuition of Logistic Regression, Sigmoid function: Squashing, Mathematical formulation of Objective function, Weight vector, L2 Regularization: Overfitting and Underfitting, L1 regularization and sparsity, Probabilistic Interpretation: Gaussian Naive Bayes
2019-11-14 | Module 3: Foundations of Natural Language Processing and Machine Learning | Logistic Regression | Loss minimization interpretation, Hyperparameter search: Grid Search and Random Search, Column Standardization, Feature importance and Model interpretability, Collinearity of features, Test/Run time space and time complexity, Real world cases, Non-linearly separable data & feature engineering, Code sample: Logistic regression, GridSearchCV, RandomSearchCV, Assignment-5: Apply Logistic Regression
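A sketch of hyperparameter search for logistic regression with GridSearchCV, covering the regularization strength and L1 vs L2 penalty mentioned in these sessions; the dataset, grid values and scoring metric are illustrative choices.

    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import GridSearchCV
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = load_breast_cancer(return_X_y=True)

    pipe = Pipeline([
        ("scale", StandardScaler()),              # column standardization
        ("clf", LogisticRegression(max_iter=5000)),
    ])

    # C is the inverse regularization strength; penalty chooses L1 vs L2
    param_grid = {
        "clf__C": [0.01, 0.1, 1, 10],
        "clf__penalty": ["l1", "l2"],
        "clf__solver": ["liblinear"],             # supports both penalties
    }
    search = GridSearchCV(pipe, param_grid, cv=5, scoring="roc_auc")
    search.fit(X, y)
    print(search.best_params_, round(search.best_score_, 3))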
2019-11-15 | Module 3: Foundations of Natural Language Processing and Machine Learning | Logistic Regression | Extensions to Logistic Regression: Generalized linear models, Revision Logistic Regression
2019-11-16 | Module 3: Foundations of Natural Language Processing and Machine Learning | Linear Regression | Geometric intuition of Linear Regression, Mathematical formulation, Real world Cases, Code sample for Linear Regression, Question and Answers, Revision Linear Regression
2019-11-17 | Module 3: Foundations of Natural Language Processing and Machine Learning | Solving Optimization Problems | Differentiation, Online differentiation tools, Maxima and Minima, Vector calculus: Grad, Gradient descent: geometric intuition, Learning rate, Gradient descent for linear regression, SGD algorithm, Constrained Optimization & PCA, Logistic regression formulation revisited
2019-11-18 | Module 3: Foundations of Natural Language Processing and Machine Learning | Solving Optimization Problems | Why L1 regularization creates sparsity?, Assignment 6: Implement SGD for linear regression, Revision Solving Optimization Problems
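A minimal from-scratch sketch of SGD for simple linear regression in the spirit of the assignment listed above; the synthetic data, learning rate and epoch count are arbitrary illustrative choices, not the assignment's specification.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: y = 3*x + 2 + noise
    X = rng.uniform(-1, 1, size=(200, 1))
    y = 3 * X[:, 0] + 2 + 0.1 * rng.standard_normal(200)

    w, b, lr = 0.0, 0.0, 0.1
    for epoch in range(50):
        for i in rng.permutation(len(X)):      # one point at a time: stochastic updates
            x_i, y_i = X[i, 0], y[i]
            err = (w * x_i + b) - y_i          # prediction error
            w -= lr * 2 * err * x_i            # gradient of squared loss w.r.t. w
            b -= lr * 2 * err                  # gradient w.r.t. b

    print(f"learned w={w:.3f}, b={b:.3f}  (true values: 3 and 2)")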
2019-11-19 | Module 4: Machine Learning-II (Supervised Learning Models) | Support Vector Machines | Geometric Intuition, Why we take values +1 and -1 for Support vector planes, Mathematical derivation, Loss function (Hinge Loss) based interpretation, Dual form of SVM formulation, Kernel trick, Polynomial kernel, RBF-Kernel, Domain specific Kernels, Train and run time complexities
2019-11-20 | Module 4: Machine Learning-II (Supervised Learning Models) | Support Vector Machines | nu-SVM: control errors and support vectors, SVM Regression, Cases, Code Sample, Assignment 7: Apply SVM, Revision Support Vector Machines
2019-11-21 | Module 4: Machine Learning-II (Supervised Learning Models) | Decision Trees | Geometric Intuition of decision tree: Axis parallel hyperplanes, Sample Decision tree, Building a decision Tree: Entropy, Building a decision Tree: Information Gain, Building a decision Tree: Gini Impurity, Building a decision Tree: Constructing a DT, Building a decision Tree: Splitting numerical features, Feature standardization, Building a decision Tree: Categorical features with many possible values, Overfitting and Underfitting, Train and Run time complexity, Regression using Decision Trees, Cases, Code Samples, Assignment 8: Apply Decision Trees
2019-11-22 | Module 4: Machine Learning-II (Supervised Learning Models) | Decision Trees | Revision Decision Trees
2019-11-23 | Module 4: Machine Learning-II (Supervised Learning Models) | Ensemble Models | What are ensembles?, Bootstrapped Aggregation (Bagging) Intuition, Random Forest and their construction, Bias-Variance tradeoff, Bagging: Train and Run-time Complexity, Bagging: Code Sample, Extremely randomized trees, Random Tree: Cases, Boosting Intuition, Residuals, Loss functions and gradients, Gradient Boosting, Regularization by Shrinkage, Train and Run time complexity, XGBoost: Boosting + Randomization, AdaBoost: geometric intuition, Stacking models, Cascading classifiers, Kaggle competitions vs Real world, Assignment-9: Apply Random Forests & GBDT
2019-11-24 | Module 4: Machine Learning-II (Supervised Learning Models) | Ensemble Models | Revision Ensemble Models
2019-11-25 | Module 5: Feature Engineering, Productionization and Deployment of ML Models | Featurization and Feature Importance | Introduction, Moving window for Time Series Data, Fourier decomposition, Deep learning features: LSTM, Image histogram, Keypoints: SIFT, Deep learning features: CNN, Relational data, Graph data, Indicator variables, Feature binning
2019-11-26 | Module 5: Feature Engineering, Productionization and Deployment of ML Models | Featurization and Feature Importance | Interaction variables, Mathematical transforms, Model specific featurizations, Feature orthogonality, Domain specific featurizations, Feature slicing, Kaggle Winners solutions, Revision Featurization and Feature Importance
2019-11-27 | Module 5: Feature Engineering, Productionization and Deployment of ML Models | Miscellaneous Topics | Calibration of Models: Need for calibration, Calibration Plots, Platt's Calibration/Scaling, Isotonic Regression, Code Samples, Modeling in the presence of outliers: RANSAC, Productionizing models, Retraining models periodically, A/B testing, Data Science Life cycle, VC dimension, Revision Miscellaneous Topics
2019-11-28 | Module 6: Machine Learning Real World Case Studies | Quora Question Pair Similarity | Business/Real world problem: Problem definition, Business objectives and constraints, Mapping to an ML problem: Data overview, Mapping to an ML problem: ML problem and performance metric, Mapping to an ML problem: Train-test split, EDA: Basic Statistics, EDA: Basic Feature Extraction, EDA: Text Preprocessing, EDA: Advanced Feature Extraction, EDA: Feature analysis, EDA: Data Visualization: t-SNE, EDA: TF-IDF weighted Word2Vec featurization, ML Models: Loading Data, ML Models: Random Model, ML Models: Logistic Regression and Linear SVM, ML Models: XGBoost, Assignments
2019-11-29 | Module 6: Machine Learning Real World Case Studies | Quora Question Pair Similarity | Revision Quora Question Pair Similarity
2019-11-30 | Module 6: Machine Learning Real World Case Studies | Personalized Cancer Diagnosis | Business/Real world problem: Overview, Business objectives and constraints, ML problem formulation: Data, ML problem formulation: Mapping real world to ML problem, ML problem formulation: Train, CV and Test data construction, Exploratory Data Analysis: Reading data & preprocessing, Exploratory Data Analysis: Distribution of Class-labels, Exploratory Data Analysis: "Random" Model, Univariate Analysis: Gene feature, Univariate Analysis: Variation Feature
2019-12-01 | Module 6: Machine Learning Real World Case Studies | Personalized Cancer Diagnosis | Univariate Analysis: Text feature, Machine Learning Models: Data preparation, Baseline Model: Naive Bayes, K-Nearest Neighbors Classification, Logistic Regression with class balancing, Logistic Regression without class balancing, Linear-SVM, Random-Forest with one-hot encoded features, Random-Forest with response-coded features, Stacking Classifier, Majority Voting classifier, Assignment
2019-12-02 | Module 6: Machine Learning Real World Case Studies | Personalized Cancer Diagnosis | Revision Personalized Cancer Diagnosis
2019-12-03 | Module 6: Machine Learning Real World Case Studies | Facebook Friend Recommendation Using Graph Mining | Problem definition, Overview of Graphs: node/vertex, edge/link, directed-edge, path, Data format & Limitations, Mapping to a supervised classification problem, Business constraints & Metrics, EDA: Basic Stats, EDA: Follower and following stats, EDA: Binary Classification Task, EDA: Train and test split
2019-12-04 | Module 6: Machine Learning Real World Case Studies | Facebook Friend Recommendation Using Graph Mining | Feature engineering on Graphs: Jaccard & Cosine Similarities, PageRank, Shortest Path, Connected components, Adar Index, Katz Centrality, HITS Score, SVD, Weight features, Modeling, Assignment
2019-12-05 | Module 6: Machine Learning Real World Case Studies | Facebook Friend Recommendation Using Graph Mining | Revision Facebook Friend Recommendation Using Graph Mining
2019-12-06 | Module 6: Machine Learning Real World Case Studies | Taxi Demand Prediction in New York City | Business/Real world problem Overview, Objectives and Constraints, Mapping to ML problem: Data, Mapping to ML problem: dask dataframes, Mapping to ML problem: Fields/Features, Mapping to ML problem: Time series forecasting/Regression, Mapping to ML problem: Performance metrics, Data Cleaning: Latitude and Longitude data, Data Cleaning: Trip Duration, Data Cleaning: Speed, Data Cleaning: Distance, Data Cleaning: Fare, Data Cleaning: Remove all outliers/erroneous points, Data Preparation: Clustering/Segmentation, Data Preparation: Time binning, Data Preparation: Smoothing time-series data, Data Preparation: Smoothing time-series data (contd.), Data Preparation: Time series and Fourier transforms, Ratios and previous-time-bin values, Simple moving average, Weighted Moving average
2019-12-07 | Module 6: Machine Learning Real World Case Studies | Taxi Demand Prediction in New York City | Exponential weighted moving average, Results, Regression models: Train-Test split & Features, Linear regression, Random Forest regression, XGBoost Regression, Model comparison, Assignment, Revision Taxi Demand Prediction in New York City
2019-12-08 | Module 6: Machine Learning Real World Case Studies | Stack Overflow Tag Predictor | Business/Real world problem, Business objectives and constraints, Mapping to an ML problem: Data overview, Mapping to an ML problem: ML problem formulation, Mapping to an ML problem: Performance metrics, Hamming loss, EDA: Data Loading, EDA: Analysis of tags, EDA: Data Preprocessing, Data Modeling: Multi label Classification, Data preparation, Train-Test Split, Featurization
2019-12-09 | Module 6: Machine Learning Real World Case Studies | Stack Overflow Tag Predictor | Logistic regression: One vs Rest, Sampling data and tags + Weighted models, Logistic regression revisited, Why not use advanced techniques, Assignments, Revision Stack Overflow Tag Predictor
2019-12-10 | Module 6: Machine Learning Real World Case Studies | Microsoft Malware Detection | Problem Definition, Objectives and Constraints, Data Overview, ML Problem, Train and Test Splitting, Exploratory Data Analysis: Class Distribution, Exploratory Data Analysis: Feature Extraction from Byte Files, Exploratory Data Analysis: Multivariate analysis of features from byte files, Train-Test class Distribution, ML models using byte files only: Random Model, K-NN, Logistic regression, Random Forest and XGBoost, Feature Extraction and Multi Threading, File Size Feature, Univariate Analysis, t-SNE Analysis, ML Models on ASM File features, Models on all features: t-SNE, Models on all features: RandomForest and XGBoost, Assignment
2019-12-11 | Module 6: Machine Learning Real World Case Studies | Microsoft Malware Detection | Revision Microsoft Malware Detection
2019-12-12 | Module 7: Data Mining (Unsupervised Learning) and Recommender Systems + Real World Case Studies | Clustering | What is Clustering?, Unsupervised learning, Applications, Metrics for Clustering, K-Means: Geometric intuition, Centroids, K-Means: Mathematical formulation: Objective function, K-Means Algorithm, How to initialize: K-Means++, Failure cases/Limitations, K-Medoids, Determining the right K, Code Samples, Time and space complexity, Revision Clustering
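A minimal K-Means sketch touching the points listed for this session (K-Means++ initialization, determining the right K); the synthetic blobs and the silhouette score as a selection criterion are illustrative assumptions.

    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs
    from sklearn.metrics import silhouette_score

    # Synthetic blobs stand in for a real dataset
    X, _ = make_blobs(n_samples=500, centers=4, cluster_std=0.8, random_state=0)

    # Try several K values and compare inertia and silhouette scores to pick K
    for k in range(2, 7):
        km = KMeans(n_clusters=k, init="k-means++", n_init=10, random_state=0).fit(X)
        sil = silhouette_score(X, km.labels_)
        print(f"K={k}  inertia={km.inertia_:.1f}  silhouette={sil:.3f}")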
2019-12-13 | Module 7: Data Mining (Unsupervised Learning) and Recommender Systems + Real World Case Studies | Hierarchical Clustering | Agglomerative & Divisive, Dendrograms, Agglomerative Clustering, Proximity methods: Advantages and Limitations, Time and Space Complexity, Limitations of Hierarchical Clustering, Code sample, Assignment 10: Apply k-means, agglomerative, DBSCAN Clustering algorithms, Revision Hierarchical Clustering
2019-12-14 | Module 7: Data Mining (Unsupervised Learning) and Recommender Systems + Real World Case Studies | DBSCAN Technique | Density based clustering, MinPts and Eps: Density, Core, Border and Noise points, Density edge and Density connected points, DBSCAN Algorithm, Hyper Parameters: MinPts and Eps, Advantages and Limitations of DBSCAN, Time and Space Complexity, Code samples, Question and Answers, Revision DBSCAN Technique
2019-12-15 | Module 7: Data Mining (Unsupervised Learning) and Recommender Systems + Real World Case Studies | Recommender Systems and Matrix Factorization | Problem formulation: IMDB Movie reviews, Content based vs Collaborative Filtering, Similarity based Algorithms, Matrix Factorization: PCA, SVD, Matrix Factorization: NMF, Matrix Factorization for Collaborative filtering, Matrix Factorization for feature engineering, Clustering as MF
2019-12-16 | Module 7: Data Mining (Unsupervised Learning) and Recommender Systems + Real World Case Studies | Recommender Systems and Matrix Factorization | Hyperparameter tuning, Matrix Factorization for recommender systems: Netflix Prize Solution, Cold Start problem, Word vectors as MF, Eigen-Faces, Code example, Assignment-11: Apply Truncated SVD
2019-12-17 | Module 7: Data Mining (Unsupervised Learning) and Recommender Systems + Real World Case Studies | Recommender Systems and Matrix Factorization | Revision Recommender Systems and Matrix Factorization
2019-12-18 | Module 7: Data Mining (Unsupervised Learning) and Recommender Systems + Real World Case Studies | Amazon Fashion Discovery Engine | Problem Statement: Recommend similar apparel products in e-commerce using product descriptions and Images, Plan of action, Amazon product advertising API, Data folders and paths, Overview of the data and Terminology, Data cleaning and understanding: Missing data in various features, Understand duplicate rows, Remove duplicates: Part 1, Remove duplicates: Part 2, Text Pre-Processing: Tokenization and Stop-word removal, Stemming, Text based product similarity: Converting text to an n-D vector: bag of words, Code for bag of words based product similarity
2019-12-19 | Module 7: Data Mining (Unsupervised Learning) and Recommender Systems + Real World Case Studies | Amazon Fashion Discovery Engine | TF-IDF: featurizing text based on word-importance, Code for TF-IDF based product similarity, Code for IDF based product similarity, Text Semantics based product similarity: Word2Vec (featurizing text based on semantic similarity), Code for Average Word2Vec product similarity, TF-IDF weighted Word2Vec, Code for IDF weighted Word2Vec product similarity, Weighted similarity using brand and color, Code for weighted similarity, Building a real world solution, Deep learning based visual product similarity: ConvNets: How to featurize an image: edges, shapes, parts, Using Keras + Tensorflow to extract features, Visual similarity based product similarity, Measuring goodness of our solution: A/B testing, Exercise: Build a weighted Nearest neighbor model using Visual, Text, Brand and Color, Revision Amazon Fashion Discovery Engine
2019-12-20 | Module 7: Data Mining (Unsupervised Learning) and Recommender Systems + Real World Case Studies | Netflix Movie Recommendation System | Business/Real World Problem: Problem Definition, Objectives and Constraints, Mapping to ML problem: Data Overview, Mapping to ML problem: ML problem formulation, Exploratory Data Analysis: Data preprocessing, Exploratory Data Analysis: Temporal Train-Test split, Exploratory Data Analysis: Preliminary Data Analysis, Exploratory Data Analysis: Sparse matrix representation, Exploratory Data Analysis: Average ratings for various slices, Exploratory Data Analysis: Cold start problem, Computing Similarity matrices: User-User similarity matrix, Computing Similarity matrices: Movie-Movie similarity, Computing Similarity matrices: Does movie-movie similarity work?, ML Models: Surprise library, Overview of the modelling strategy, Data Sampling, Google drive with intermediate files, Featurizations for regression, Data transformation for Surprise, XGBoost with 13 features, Surprise Baseline model
2019-12-21 | Module 7: Data Mining (Unsupervised Learning) and Recommender Systems + Real World Case Studies | Netflix Movie Recommendation System | XGBoost + 13 features + Surprise baseline model, Surprise KNN predictors, Matrix Factorization models using Surprise, SVD++ with implicit feedback, Final models with all features and predictors, Comparison between various models, Assignments, Revision Netflix Movie Recommendation System
2019-12-22 | Module 8: Neural Networks, Computer Vision and Deep Learning | Neural Networks | History of Neural networks and Deep Learning, How Biological Neurons work?, Growth of biological neural networks, Diagrammatic representation: Logistic Regression and Perceptron, Multi-Layered Perceptron (MLP), Notation, Training a single-neuron model
2019-12-23 | Module 8: Neural Networks, Computer Vision and Deep Learning | Neural Networks | Training an MLP: Chain Rule, Training an MLP: Memoization, Backpropagation, Activation functions, Vanishing Gradient problem, Bias-Variance tradeoff, Decision surfaces: Playground, Revision Neural Networks
2019-12-24 | Module 8: Neural Networks, Computer Vision and Deep Learning | Deep Multi Layer Perceptrons | Deep Multi-layer perceptrons: 1980s to 2010s, Dropout layers & Regularization, Rectified Linear Units (ReLU), Weight initialization, Batch Normalization, Optimizers: Hill-descent analogy in 2D, Optimizers: Hill descent in 3D and contours, SGD Recap
2019-12-25 | Module 8: Neural Networks, Computer Vision and Deep Learning | Deep Multi Layer Perceptrons | Batch SGD with momentum, Nesterov Accelerated Gradient (NAG), Optimizers: AdaGrad, Optimizers: Adadelta and RMSProp, Adam, Which algorithm to choose when?, Gradient Checking and clipping, Softmax and Cross-entropy for multi-class classification, How to train a Deep MLP?, Auto Encoders, Word2Vec: CBOW
2019-12-26 | Module 8: Neural Networks, Computer Vision and Deep Learning | Deep Multi Layer Perceptrons | Word2Vec: Skip-gram, Word2Vec: Algorithmic Optimizations, Revision Deep Multi Layer Perceptrons
2019-12-27 | Module 8: Neural Networks, Computer Vision and Deep Learning | Tensorflow and Keras | Tensorflow and Keras overview, GPU vs CPU for Deep Learning, Google Colaboratory, Install TensorFlow, Online documentation and tutorials, Softmax Classifier on MNIST dataset, MLP: Initialization, Model 1: Sigmoid activation, Model 2: ReLU activation, Model 3: Batch Normalization, Model 4: Dropout, MNIST classification in Keras
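A compact sketch of MNIST classification with a Keras MLP touching the pieces listed for this session (ReLU, Batch Normalization, Dropout, softmax output); the layer sizes and training settings are illustrative choices, and running it downloads MNIST via tf.keras.

    import tensorflow as tf

    (x_tr, y_tr), (x_te, y_te) = tf.keras.datasets.mnist.load_data()
    x_tr, x_te = x_tr.reshape(-1, 784) / 255.0, x_te.reshape(-1, 784) / 255.0

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(256, activation="relu", input_shape=(784,)),
        tf.keras.layers.BatchNormalization(),
        tf.keras.layers.Dropout(0.3),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_tr, y_tr, epochs=5, batch_size=128, validation_split=0.1)
    print("test accuracy:", model.evaluate(x_te, y_te, verbose=0)[1])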
2019-12-28 | Module 8: Neural Networks, Computer Vision and Deep Learning | Tensorflow and Keras | Hyperparameter tuning in Keras, Exercise: Try different MLP architectures on MNIST dataset, Revision Tensorflow and Keras
2019-12-29 | Module 8: Neural Networks, Computer Vision and Deep Learning | Convolutional Neural Nets | Biological inspiration: Visual Cortex, Convolution: Edge Detection on images, Convolution: Padding and strides, Convolution over RGB images, Convolutional layer, Max-pooling, CNN Training: Optimization, Example CNN: LeNet [1998]
2019-12-30 | Module 8: Neural Networks, Computer Vision and Deep Learning | Convolutional Neural Nets | ImageNet dataset, Data Augmentation, Convolution Layers in Keras, AlexNet, VGGNet, Residual Network, Inception Network, What is Transfer learning, Code example: Cats vs Dogs, Code Example: MNIST dataset, Assignment: Try various CNN networks on MNIST dataset
2019-12-31 | Module 8: Neural Networks, Computer Vision and Deep Learning | Convolutional Neural Nets | Revision Convolutional Neural Nets
2020-01-01 | Module 8: Neural Networks, Computer Vision and Deep Learning | Long Short-Term Memory (LSTMs) | Why RNNs?, Recurrent Neural Network, Training RNNs: Backprop, Types of RNNs, Need for LSTM/GRU, LSTM, GRUs, Deep RNN, Bidirectional RNN
2020-01-02 | Module 8: Neural Networks, Computer Vision and Deep Learning | Long Short-Term Memory (LSTMs) | Code example: IMDB Sentiment classification, Exercise: Amazon Fine Food reviews LSTM model, Revision Long Short-Term Memory (LSTMs)
2020-01-03 | Module 9: Deep Learning Real World Case Studies | Human Activity Recognition | Human Activity Recognition Problem definition, Dataset understanding, Data cleaning & preprocessing, EDA: Univariate analysis, EDA: Data visualization using t-SNE, Classical ML models, Deep-learning Model, Exercise: Build deeper LSTM models and hyper-param tune them, Revision Human Activity Recognition
2020-01-04 | Module 9: Deep Learning Real World Case Studies | Self Driving Car | Problem Definition, Datasets, Data understanding & Analysis: Files and folders, Dash-cam images and steering angles, Split the dataset: Train vs Test, EDA: Steering angles, Mean Baseline model: simple, Deep-learning model: Deep Learning for regression: CNN, CNN+RNN, Batch load the dataset, NVIDIA's end to end CNN model, Train the model, Test and visualize the output, Extensions, Assignment
2020-01-05 | Module 9: Deep Learning Real World Case Studies | Self Driving Car | Revision Self Driving Car
2020-01-06 | Module 9: Deep Learning Real World Case Studies | Music Generation Using Deep Learning | Real-world problem, Music representation, Char-RNN with abc-notation: Char-RNN model, Char-RNN with abc-notation: Data preparation, Char-RNN with abc-notation: Many to Many RNN, TimeDistributed-Dense layer, Char-RNN with abc-notation: Stateful RNN, Char-RNN with abc-notation: Model architecture, Model training, Char-RNN with abc-notation: Music generation, Char-RNN with abc-notation: Generate tabla music
2020-01-07 | Module 9: Deep Learning Real World Case Studies | Music Generation Using Deep Learning | MIDI music generation, Survey blog, Assignment, Revision Music Generation Using Deep Learning

Applied AI Course Wishes You All The Best

Please mail us at team@appliedaicourse.com if you have any queries.
