ENSEMBLE METHODS
MACHINE LEARNING
Vaishnavi Garg
Sundram Goyal
Introduction
Ensemble methods are machine learning techniques that
combine several base models in order to produce one
optimal predictive model.
Suppose you are deciding whether to invest in Company XYZ, and you consult several sources:
1. Employee of Company XYZ: in the past, he has been right 70% of the time.
2. Financial Advisor of Company XYZ: in the past, he has been right 75% of the time.
3. Stock Market Trader: in the past, he has been right 70% of the time.
4. Market Research team in the same segment: in the past, they have been right 75% of the time.
5. Social Media Expert: in the past, he has been right 65% of the time.
Given this broad spectrum of sources, we can combine
all of their information and make a better-informed decision.
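That intuition can be made concrete. Assuming (as a simplification) that the five sources above err independently of one another, a simple majority vote over their opinions is right more often than any single source. A quick sketch:

```python
from itertools import product

# Historical accuracies of the five sources listed above.
# Independence is an assumption made for this back-of-envelope estimate.
acc = [0.70, 0.75, 0.70, 0.75, 0.65]

# Probability that a majority (at least 3 of 5) of the sources is right:
# enumerate every right/wrong outcome and sum the probabilities.
prob = 0.0
for outcome in product([True, False], repeat=5):
    p = 1.0
    for correct, a in zip(outcome, acc):
        p *= a if correct else (1 - a)
    if sum(outcome) >= 3:
        prob += p

print(round(prob, 3))  # → 0.851
```

The majority vote reaches about 85% accuracy, better than the best single source at 75% — the core idea behind ensembling.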
ENSEMBLE LEARNING
Ensemble learning methods fall into three broad families:
Sequential Ensemble Learning (Boosting)
Parallel Ensemble Learning (Bagging)
Stacking
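Stacking is listed in the taxonomy but not described further in these slides: it trains a meta-learner on the outputs of the base learners. A minimal sketch, where the dataset, the hand-crafted base models, and the lookup-table meta-learner are all invented for illustration:

```python
from collections import Counter, defaultdict

# Toy data: label is 1 iff 5 <= x < 15 (invented for this sketch).
data = [(x, int(5 <= x < 15)) for x in range(20)]

# Two hand-crafted base models, each only partially correct on its own.
base_models = [lambda x: int(x >= 5), lambda x: int(x < 15)]

# Stacking: the meta-learner is trained on the base models' outputs.
# Here it is a simple majority table keyed by the output pattern.
table = defaultdict(Counter)
for x, label in data:
    pattern = tuple(m(x) for m in base_models)
    table[pattern][label] += 1

def stacked_predict(x):
    pattern = tuple(m(x) for m in base_models)
    return table[pattern].most_common(1)[0][0]

accuracy = sum(stacked_predict(x) == y for x, y in data) / len(data)
print(accuracy)
```

Each base model alone scores 75% on this data, but the meta-learner discovers that the label is the AND of the two base outputs and classifies every point correctly.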
Sequential Ensemble Learning (Boosting)
In sequential ensemble methods, the base learners are generated one after
another, with each new learner focusing on its predecessor's mistakes.
1. Generate a random sample from the training data.
2. Train a classifier model 1 on this generated sample and test it on the
whole training data-set.
3. Increase the weight of the misclassified points and generate a new
sample, so the next model concentrates on them.
4. Repeat this process until you get high accuracy from the system.
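The steps above can be sketched as a minimal AdaBoost-style loop over decision stumps. The toy dataset, the stump weak learner, and the number of rounds are invented for this sketch; the weight-update rule follows standard AdaBoost:

```python
import math

# Toy 1-D dataset; labels are +1/-1 (invented for this sketch).
X = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
y = [1, 1, 1, -1, -1, -1, 1, 1, 1, -1]

def stump_pred(x, thr, sign):
    """A decision stump: predict `sign` below the threshold, else `-sign`."""
    return sign if x < thr else -sign

def best_stump(w):
    """Step 2: fit a weak model (lowest weighted error) and test it on all data."""
    best = None
    for thr in range(11):
        for sign in (1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if stump_pred(xi, thr, sign) != yi)
            if best is None or err < best[0]:
                best = (err, thr, sign)
    return best

n = len(X)
w = [1 / n] * n              # Step 1: start from a uniform weighting
stumps = []
for _ in range(3):           # Step 4: repeat the train/reweight loop
    err, thr, sign = best_stump(w)
    err = max(err, 1e-10)
    alpha = 0.5 * math.log((1 - err) / err)   # this stump's vote strength
    # Step 3: boost the weight of misclassified points, then renormalize.
    w = [wi * math.exp(-alpha * yi * stump_pred(xi, thr, sign))
         for xi, yi, wi in zip(X, y, w)]
    total = sum(w)
    w = [wi / total for wi in w]
    stumps.append((alpha, thr, sign))

# Final model: weighted vote of all the weak stumps.
def predict(x):
    score = sum(a * stump_pred(x, t, s) for a, t, s in stumps)
    return 1 if score >= 0 else -1

accuracy = sum(predict(xi) == yi for xi, yi in zip(X, y)) / n
print(accuracy)
```

No single stump can separate this data, but three boosted stumps in combination classify every training point correctly.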
Parallel Ensemble Learning (Bagging)
In parallel ensemble methods, the base learners are generated in
parallel, independently of one another.
Bagging, short for Bootstrap Aggregating, is an approach where you
take random samples of the data (with replacement), build a learning
model on each sample, and average or vote over their predictions.
Algorithms: Random Forest, Bagged Decision Trees, Extra Trees
Bagging Process:
1. The training data is resampled into bootstrap samples D1, D2, ..., Dn.
2. A classification model (C1, C2, ..., Cn) is trained on each sample.
3. Each model produces its own prediction (P1, P2, ..., Pn).
4. Voting over P1, ..., Pn gives the final prediction Pf.
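The process above can be sketched end to end in a few lines. The dataset, the threshold-stump learner, and the number of bootstrap rounds are invented for this sketch:

```python
import random
from collections import Counter

random.seed(0)

# Toy 1-D dataset (invented for this sketch): label 1 iff x >= 10.
data = [(x, 0) for x in range(10)] + [(x, 1) for x in range(10, 20)]

def train_stump(sample):
    """C_i: a threshold classifier fit to one bootstrap sample D_i."""
    best = None
    for thr in range(21):
        err = sum((x >= thr) != (label == 1) for x, label in sample)
        if best is None or err < best[0]:
            best = (err, thr)
    thr = best[1]
    return lambda x: int(x >= thr)

# D_1..D_n: bootstrap samples drawn with replacement; one model per sample.
models = []
for _ in range(25):
    sample = [random.choice(data) for _ in data]
    models.append(train_stump(sample))

# P_1..P_n -> majority vote -> final prediction P_f.
def bagged_predict(x):
    votes = Counter(m(x) for m in models)
    return votes.most_common(1)[0][0]

accuracy = sum(bagged_predict(x) == label for x, label in data) / len(data)
print(accuracy)
```

Each stump sees a slightly different bootstrap sample, so individual thresholds vary; the majority vote averages out those fluctuations, which is exactly the variance reduction bagging is designed for.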
Bagging vs Boosting

                       Bagging     Boosting
Partitioning of data   Random      Higher vote to misclassified samples
Nature                 Parallel    Sequential
Applications
Computer security
Distributed denial of service
Malware Detection
Intrusion detection
Face recognition
Emotion recognition
Fraud detection
Financial decision-making
Medicine
THE END