
Homework 3: Individual Problem

Algorithm selected: J48 (pruned decision tree)

Accuracy (percentage of correctly classified instances):


Vanilla algorithm    With bagging    With boosting
87.5549 %            92.6794 %       92.2401 %

Both bagging and boosting improve on the performance of the base algorithm here.
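
For reference, a minimal sketch of how this kind of comparison can be run programmatically through Weka's Java API, rather than the Explorer GUI. The dataset path data.arff is a hypothetical placeholder, and all learners are left at Weka's defaults:

import java.util.Random;

import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.classifiers.meta.AdaBoostM1;
import weka.classifiers.meta.Bagging;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class EnsembleComparison {
    public static void main(String[] args) throws Exception {
        // Load a dataset (hypothetical path; any ARFF with a nominal class works)
        Instances data = DataSource.read("data.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // Bagging and AdaBoostM1 each wrap a fresh J48 base learner.
        // Weka's defaults are REPTree and DecisionStump, so set J48 explicitly.
        Bagging bagged = new Bagging();
        bagged.setClassifier(new J48());
        AdaBoostM1 boosted = new AdaBoostM1();
        boosted.setClassifier(new J48());

        Classifier[] models = { new J48(), bagged, boosted };
        String[] labels = { "Vanilla J48", "Bagged J48 ", "Boosted J48" };

        for (int i = 0; i < models.length; i++) {
            Evaluation eval = new Evaluation(data);
            // 10-fold cross-validation, matching the Weka Explorer default
            eval.crossValidateModel(models[i], data, 10, new Random(1));
            System.out.printf("%s  %.4f %%%n", labels[i], eval.pctCorrect());
        }
    }
}

Exact percentages depend on the dataset, the number of folds, and the random seed, so a rerun will not necessarily match the figures above to four decimals.
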
Algorithm selected: IBk (k-nearest neighbours)

Accuracy (percentage of correctly classified instances):


Vanilla algorithm    With bagging    With boosting
91.2152 %            90.776 %        91.0688 %
I chose a second algorithm (kNN), for which bagging performs slightly worse than the
vanilla algorithm.
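
(The Java sketch after the J48 results covers this setup as well: swapping each new J48() for a new weka.classifiers.lazy.IBk() base learner gives the corresponding IBk comparison.)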

Bagging usually helps when the classifier is unstable (here, the decision tree) and may hurt
the performance of a stable classifier (here, k-nearest neighbours). By averaging over
bootstrapped datasets it reduces variance but has little effect on bias.
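
A one-equation sketch of this variance argument (a standard result; it assumes each of the $B$ bagged models' predictions at a fixed point has variance $\sigma^2$, with pairwise correlation $\rho$ between models):

\[
\mathrm{Var}\!\left(\frac{1}{B}\sum_{b=1}^{B}\hat f_b(x)\right) = \rho\sigma^2 + \frac{1-\rho}{B}\,\sigma^2,
\qquad
\mathbb{E}\!\left[\frac{1}{B}\sum_{b=1}^{B}\hat f_b(x)\right] = \mathbb{E}\!\left[\hat f_1(x)\right].
\]

As $B$ grows the second term vanishes, so variance drops while the mean, and hence the bias, is unchanged. A stable learner like kNN yields highly correlated models ($\rho \approx 1$), so there is little variance for bagging to remove, and training on bootstrap samples that each omit roughly a third of the distinct instances can even cost a little accuracy.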
