
Initialize weights w = 1/N.

1. Assign each training point equal weight.
2. Calculate error rates for each h (hypothesis).
3. Pick the best h: the one with the smallest error rate.

H(x) = \mathrm{sign}(\alpha_1 h_1(x) + \alpha_2 h_2(x) + \cdots + \alpha_n h_n(x))


Possible weak classifiers: NN, K-NN, Decision Trees, SVM.

Best model:
- Makes forecast errors small.
- Adjusts to emphasize points that were previously misclassified.
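Putting the whole loop together, here is a minimal Python sketch of the round structure these notes walk through (the names adaboost and predict and the +1/-1 label convention are my assumptions, not from the notes):

```python
import math

def adaboost(points, labels, hypotheses, max_rounds=10):
    n = len(points)
    weights = [1.0 / n] * n                      # step 1: equal weights
    ensemble = []                                # list of (alpha, h) pairs
    for _ in range(max_rounds):                  # first stopping criterion
        # step 2: error rate of h = total weight of the points it gets wrong
        def error(h):
            return sum(w for w, x, y in zip(weights, points, labels) if h(x) != y)
        h = min(hypotheses, key=error)           # step 3: best h
        e = error(h)
        if e <= 0.0 or e >= 0.5:                 # second criterion: stop when no
            break                                # usefully-imperfect h is left
        alpha = 0.5 * math.log((1 - e) / e)      # voting power of this h
        ensemble.append((alpha, h))
        # reweight: mass on right answers -> 1/2, mass on wrong answers -> 1/2
        weights = [w / (2 * (1 - e)) if h(x) == y else w / (2 * e)
                   for w, x, y in zip(weights, points, labels)]
    return ensemble

def predict(ensemble, x):
    # H(x) = sign(alpha_1 h_1(x) + ... + alpha_n h_n(x))
    return 1 if sum(a * h(x) for a, h in ensemble) > 0 else -1
```

Given the five labeled points used in the example (their exact coordinates are not on this page), three rounds of this loop would retrace the tables that follow.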

Weight update rule:

w_{new} = \begin{cases} \frac{1}{2}\cdot\frac{w_{old}}{1-e_t}, & \text{if right} \\ \frac{1}{2}\cdot\frac{w_{old}}{e_t}, & \text{if wrong} \end{cases}

Weights    Round 1    New weights round 2    New weights round 3
W_A        0.2        0.125                  0.083
W_B        0.2        0.125                  0.250
W_C        0.2        0.500                  0.333
W_D        0.2        0.125                  0.083
W_E        0.2        0.125                  0.250
Total      1.0        1.000                  1.000
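A quick check that the rule reproduces the table (a sketch; the reweight name is mine, the error rates and misclassified points come from the tables on this page):

```python
def reweight(weights, wrong, eps):
    """Halve the total mass on right answers and on wrong answers separately."""
    return {p: w / (2 * eps) if p in wrong else w / (2 * (1 - eps))
            for p, w in weights.items()}

w1 = {p: 0.2 for p in "ABCDE"}                 # round 1: equal weights
w2 = reweight(w1, wrong={"C"}, eps=0.2)        # x<6 wins round 1 -> C: 0.500, rest: 0.125
w3 = reweight(w2, wrong={"B", "E"}, eps=0.25)  # x<2 wins round 2 -> B,E: 0.250, C: 0.333, A,D: 0.083
print(w2, w3, sum(w3.values()))                # total mass stays 1.0 each round
```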

Classifier    Misclassifies    Error rate 1    Error rate 2    Error rate 3
x < 2         B, E             0.4             0.250           0.500
x < 4         B, C, E          0.6             0.750           0.833
x < 6         C                0.2             0.500           0.333
x > 2         A, C, D          0.6             0.750           0.500
x > 4         A, D             0.4             0.250           0.167
x > 6         A, B, D, E       0.8             0.500           0.667
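Each error rate is just the current total weight of the points a classifier gets wrong. Continuing from the w3 computed in the previous snippet, this reproduces the round-3 column:

```python
# Which points each threshold test misclassifies (from the table above).
mistakes = {"x<2": {"B", "E"}, "x<4": {"B", "C", "E"}, "x<6": {"C"},
            "x>2": {"A", "C", "D"}, "x>4": {"A", "D"}, "x>6": {"A", "B", "D", "E"}}

for name, wrong in mistakes.items():
    print(name, round(sum(w3[p] for p in wrong), 3))
# x<6 -> 0.333, x>4 -> 0.167 (the best in round 3), ...
```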

Best weak classifier

              Round 1       Round 2       Round 3
h             x < 6         x < 2         x > 4
Error rate    0.2           0.250         0.167
alpha         (1/2) ln(4)   (1/2) ln(3)   (1/2) ln(5)

\alpha = \frac{1}{2}\ln\left(\frac{1-e_t}{e_t}\right)
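Plugging each round's error rate into the alpha formula (a quick numeric check of the row above):

```python
import math

for name, eps in [("x<6", 0.2), ("x<2", 0.25), ("x>4", 1 / 6)]:
    print(name, round(0.5 * math.log((1 - eps) / eps), 3))
# x<6 -> 0.693 (= ln(4)/2), x<2 -> 0.549 (= ln(3)/2), x>4 -> 0.805 (= ln(5)/2)
```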

H(x) = \mathrm{sign}\left(\frac{1}{2}\ln(4)\,(x<6) + \frac{1}{2}\ln(3)\,(x<2) + \frac{1}{2}\ln(5)\,(x>4)\right)

Misclassifies:
x < 6: C      \alpha_1 = 0.693
x < 2: B, E   \alpha_2 = 0.549
x > 4: A, D   \alpha_3 = 0.805

These three weights must satisfy the triangle inequality: any two of them added together must be greater than the third one.
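Why that matters here: each point is misclassified by exactly one of the three chosen classifiers, so H gets a point right exactly when the two correct alphas outvote the single wrong one. A quick check with the rounded values:

```python
# Vote margin for each point = (sum of alphas that got it right) - (alpha that got it wrong).
alphas = {"x<6": 0.693, "x<2": 0.549, "x>4": 0.805}
wrong_by = {"C": "x<6", "B": "x<2", "E": "x<2", "A": "x>4", "D": "x>4"}

for point, bad in sorted(wrong_by.items()):
    margin = sum(a for h, a in alphas.items() if h != bad) - alphas[bad]
    print(point, round(margin, 3), margin > 0)   # all margins positive -> H is perfect
```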

Some questions:

What if we chose a classifier with an error rate of 1/2? What's its voting power?
What if you wanted to continue updating using a hypothesis with an error rate of 1/2?
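Working both out from the formulas above: \alpha = \frac{1}{2}\ln\frac{1-1/2}{1/2} = \frac{1}{2}\ln(1) = 0, so such a classifier gets zero voting power. And since \frac{1}{2}\cdot\frac{1}{1-1/2} = \frac{1}{2}\cdot\frac{1}{1/2} = 1, the update multiplies every weight by 1: the weights, and therefore every error rate, never change, and the algorithm would repeat the same round forever.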

Neat facts

Fact 1: 0 < w < 1/2

Fact 2: w_{old} \neq w_{new}

Fact 3: If you have 3 or more disjoint classifiers, which we have in this case, you are guaranteed to make a perfect classifier out of them (see the sketch after the example below).

Misclassifies:
h1: A, B, C
h2: D, E
h3: F, G

Just assign equal weights to all of them.

However, if classifiers make overlapping errors, you are not guaranteed to get a perfect ensemble classifier.
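A minimal sketch of that equal-weight vote on the h1/h2/h3 example (the point and classifier names come from the example; the rest is illustrative):

```python
# Disjoint error sets: each point is misclassified by at most one of the
# three classifiers, so with equal votes it is always outvoted 2 to 1.
mistakes3 = {"h1": {"A", "B", "C"}, "h2": {"D", "E"}, "h3": {"F", "G"}}
points = set().union(*mistakes3.values())

for p in sorted(points):
    votes_wrong = sum(1 for errs in mistakes3.values() if p in errs)
    print(p, "correct" if votes_wrong < 2 else "wrong")   # every point: correct
```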

Fact 4: If one classifier makes a superset of errors, and another classifier makes its errors inside that same superset, the total error of the ensemble classifier will be greater than those of the parts.

Pick best h with smallest error rate.
Is H good enough?
- First criterion: # of rounds.
- Second criterion: no good classifiers left (error rates higher than 0.5).
If H is not good enough, we update weights.
If H is good enough, we return H.

Naive Bayes
Binary Classification
