
ENSEMBLE LEARNING
ENSEMBLE LEARNING

• machine learning paradigm where multiple learners are trained to solve the same problem.
• the idea is to select a whole collection, or ensemble, of hypotheses from the hypothesis space and combine their predictions (one way to combine them is sketched below).
• contains a number of learners, usually called base learners or weak learners.
• Ensembles use multiple trained models.
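A minimal sketch of one way an ensemble combines predictions: three different base learners trained on the same problem and combined by majority vote. The synthetic dataset, the particular base learners, and the use of scikit-learn's VotingClassifier are illustrative assumptions, not something specified in the slides.

```python
# Sketch: several base learners trained on the same problem, combined by vote.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(max_depth=3)),
        ("logreg", LogisticRegression(max_iter=1000)),
        ("nb", GaussianNB()),
    ],
    voting="hard",  # combine the base learners' predictions by majority vote
)
ensemble.fit(X_train, y_train)
print("ensemble accuracy:", ensemble.score(X_test, y_test))
```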
ENSEMBLE LEARNING

• BIAS - error from erroneous assumptions in the learning algorithm; a measure of how flexible the model is. High bias can cause an algorithm to miss the relevant relations between features and target outputs (underfitting).
• VARIANCE - error from sensitivity to small fluctuations in the training set. High variance can cause an algorithm to model the random noise in the training data, rather than the intended outputs (overfitting). A small sketch contrasting the two follows below.
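An illustrative sketch of the bias/variance trade-off, assuming NumPy: polynomials of increasing degree are fit to noisy samples of a sine curve, so the low-degree fit underfits (high bias) and the very high-degree fit chases the noise (high variance). The degrees, sample size, and noise level are arbitrary choices made for this sketch.

```python
# Sketch: underfitting vs. overfitting with polynomial fits of varying degree.
import numpy as np

rng = np.random.default_rng(0)
x_train = np.sort(rng.uniform(0, 1, 30))
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.3, x_train.size)
x_test = np.linspace(0, 1, 200)
y_test = np.sin(2 * np.pi * x_test)

for degree in (1, 4, 15):
    coeffs = np.polyfit(x_train, y_train, degree)            # fit a polynomial
    mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)  # test error
    print(f"degree {degree:2d}: test MSE = {mse:.3f}")

# degree 1 underfits (high bias); degree 15 models the noise (high variance).
```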
APPLICATION OF ENSEMBLE LEARNING
Weak learning

• consists of a learning algorithm that classifies the input data better than random.
• a learner is weak if its error is high, but the error cannot be higher than 0.5
• ε < 0.5
• if its error is not very low, it is still a weak learner (a stump example follows below).
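A sketch of one common weak learner: a one-level decision tree (a decision stump). The synthetic dataset and the use of scikit-learn are assumptions; the point is only that the stump's error rate ε is better than chance, i.e. below 0.5, yet far from zero.

```python
# Sketch: a decision stump as a weak learner with epsilon < 0.5.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=1)
stump = DecisionTreeClassifier(max_depth=1).fit(X, y)   # one split only

epsilon = np.mean(stump.predict(X) != y)   # training error rate
print(f"epsilon = {epsilon:.3f}")          # weak learner: epsilon < 0.5
```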
Weighted training set

• In such a training set, each example has an associated weight wj > 0.
• The higher the weight, the higher the importance attached to that example during the learning of a hypothesis (see the sketch below).
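A minimal sketch of the bookkeeping this changes: with a weighted training set, a hypothesis is judged by its weighted error, so mistakes on heavily weighted examples count for more. The labels, predictions, and weights below are invented for illustration.

```python
# Sketch: weighted error of a hypothesis on a weighted training set.
import numpy as np

y_true = np.array([1, 1, 0, 0, 1])
y_pred = np.array([1, 0, 0, 1, 1])            # hypothesis is wrong on examples 1 and 3
w      = np.array([0.1, 0.2, 0.1, 0.1, 0.5])  # weights w_j > 0, summing to 1

weighted_error = np.sum(w * (y_pred != y_true))   # only the mistakes' weights count
print(f"weighted error = {weighted_error:.2f}")   # 0.30 for these weights
```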
boosting

• machine learning ensemble meta-algorithm for primarily reducing bias, and also variance, in supervised learning; a family of machine learning algorithms that convert weak learners to strong ones.
boosting

• Focus new experts on examples that others get wrong.
• Train experts sequentially.
• Errors of early experts indicate the “hard” examples.
• Focus later classifiers on getting these examples right.
• Combine the whole set in the end.
• Convert many “weak” learners into a complex classifier (a compact sketch of this loop follows below).
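A compact sketch of this sequential procedure in the AdaBoost style, assuming NumPy and scikit-learn decision stumps as the weak learners: each round trains a stump on the current weighted training set, measures its weighted error ε, up-weights the examples it got wrong, and the final classifier is a weighted vote over all stumps. The dataset, the number of rounds, and the ±1 label coding are assumptions made for this sketch.

```python
# Sketch: AdaBoost-style boosting of decision stumps.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
y = np.where(y == 1, 1, -1)                 # code labels as -1/+1 for the update rule

n = len(y)
w = np.full(n, 1.0 / n)                     # start with a uniform weighted training set
stumps, alphas = [], []

for t in range(20):                         # train experts sequentially
    stump = DecisionTreeClassifier(max_depth=1)
    stump.fit(X, y, sample_weight=w)
    pred = stump.predict(X)

    eps = np.clip(np.sum(w[pred != y]), 1e-10, 1 - 1e-10)  # weighted error, eps < 0.5
    alpha = 0.5 * np.log((1 - eps) / eps)                   # expert's vote weight

    # "Hard" examples (those this stump got wrong) receive larger weights.
    w *= np.exp(-alpha * y * pred)
    w /= w.sum()

    stumps.append(stump)
    alphas.append(alpha)

# Combine the whole set in the end: a weighted vote over all stumps.
scores = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
ensemble_pred = np.sign(scores)
print("training accuracy:", np.mean(ensemble_pred == y))
```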
Adaboost (Adaptive boosting)
Adaboost
• Any learning algorithm that returns hypotheses that are probably approximately correct is called a PAC-learning algorithm.
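For comparison with the hand-rolled loop above, the same idea is available off the shelf as scikit-learn's AdaBoostClassifier; the dataset and hyperparameter values below are illustrative assumptions.

```python
# Sketch: AdaBoost via scikit-learn's built-in implementation.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ada = AdaBoostClassifier(n_estimators=50, random_state=0)  # stumps by default
ada.fit(X_train, y_train)
print("test accuracy:", ada.score(X_test, y_test))
```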
Decision list
• Logical expression of a restricted form.
• Consists of a series of tests, each of which is a conjunction of literals.
• If a test succeeds when applied to an example description, the decision list specifies the value to be returned.
• If the test fails, processing continues with the next test in the list.
• Decision lists resemble decision trees, but their overall structure is simpler; in contrast, the individual tests are more complex (a small evaluation sketch follows below).
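A minimal sketch of evaluating a decision list: entries are (test, value) pairs tried in order, where each test stands for a conjunction of literals over the example's attributes, and the first test that succeeds determines the returned value. The attribute names and the example itself are invented for illustration.

```python
# Sketch: evaluating a decision list (ordered tests, first success wins).

def evaluate_decision_list(decision_list, example, default=False):
    for test, value in decision_list:
        if test(example):          # test succeeds -> return its value
            return value
    return default                 # every test failed

# Invented "will wait for a table?" style example.
decision_list = [
    (lambda e: e["patrons"] == "some", True),
    (lambda e: e["patrons"] == "full" and e["hungry"], True),
    (lambda e: True, False),       # final catch-all test
]

example = {"patrons": "full", "hungry": True}
print(evaluate_decision_list(decision_list, example))   # True
```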
