
Boosting example

The dataset is the same one used to construct the bagging example.

The AdaBoost Algorithm

Step 1: Training records chosen during boosting round 1:

Step 2: k = 3 (the number of boosting rounds)
Step 3: Repeat the procedure until step 14.
Step 4: The following table shows the examples chosen during each boosting round.
Step 5: Split point = average of (0.7, 0.8) = 0.75
Classifier: y = -1 if x <= 0.75, else y = +1
Step 6:

Step 7: The predictions for x = 0.1, 0.2, and 0.3 are wrong.
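A minimal sketch of the round-1 stump in Python, using only the split rule stated in Step 5 (the function and variable names are my own):

```python
# Decision stump from Step 5: predict -1 when x <= 0.75, else +1.
def stump(x, split=0.75):
    return -1 if x <= split else 1

# The ten points x = 0.1, 0.2, ..., 1.0 from the example.
xs = [round(0.1 * i, 1) for i in range(1, 11)]
preds = [stump(x) for x in xs]
print(preds)  # seven -1 predictions (x <= 0.7), three +1 (x >= 0.8)
```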


Error rate = (1/10) [ (0.1 × 1) + (0.1 × 1) + (0.1 × 1) + (0.1 × 0) + (0.1 × 0) + (0.1 × 0) + (0.1 × 0) + (0.1 × 0) + (0.1 × 0) + (0.1 × 0) ] = 0.03
Step 8: Error rate ≤ 0.5? Yes.
Step 12: α = (1/2) ln((1 − 0.03)/0.03) = 1.738
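The error rate and α above can be verified in a few lines (the 1/10 factor and the indicator values come directly from the calculation in the text):

```python
import math

w = [0.1] * 10                          # initial weights, all equal
miss = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]   # 1 where the stump is wrong (x = 0.1, 0.2, 0.3)

eps = sum(wi * mi for wi, mi in zip(w, miss)) / 10   # weighted error with the 1/N factor
alpha = 0.5 * math.log((1 - eps) / eps)              # importance of this classifier
print(round(eps, 2), round(alpha, 3))  # 0.03 1.738
```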

Step 13: Updated weights for the second round:

For x ∈ {0.1, 0.2, 0.3}: updated weight = (0.1/Z_j) · e^1.738
For x ∈ {0.4, 0.5, …, 1.0}: updated weight = (0.1/Z_j) · e^(−1.738)

Setting the sum of the updated weights to 1 gives the value of Z_j:

(0.1/Z_j) [ 3 · e^1.738 + 7 · e^(−1.738) ] = 1

Solving gives Z_j = 1.8289.
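A quick numerical check of Z_j and the resulting normalised weights, using only the numbers already derived above:

```python
import math

alpha = 1.738
up = 0.1 * math.exp(alpha)     # unnormalised weight of a misclassified point
down = 0.1 * math.exp(-alpha)  # unnormalised weight of a correctly classified point

Z = 3 * up + 7 * down          # normalisation constant Z_j
print(round(Z, 4))                            # 1.8289
print(round(up / Z, 4), round(down / Z, 4))   # 0.3109 0.0096
```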
How do we select a bootstrap sample when the selection probabilities differ across data points?
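One common answer, sketched with the standard library: sample with replacement, weighting each point by its current weight. The weight values below are the rounded round-2 weights from this example; `random.choices` accepts weights without requiring them to sum to 1.

```python
import random

xs = [round(0.1 * i, 1) for i in range(1, 11)]
# Round-2 weights: the misclassified points x = 0.1, 0.2, 0.3 carry ~0.3109 each,
# the remaining seven points ~0.0096 each (rounded values).
weights = [0.3109] * 3 + [0.0096] * 7

random.seed(0)  # reproducible draw for illustration
sample = random.choices(xs, weights=weights, k=len(xs))  # weighted bootstrap sample
```

High-weight (previously misclassified) points dominate the draw, which is exactly the behaviour boosting wants in the next round.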
