This document compares the performance of different machine learning algorithms (Random Forest, Gradient Boosting, 1D CNN) on an imbalanced classification task, with and without oversampling. It shows that oversampling improves accuracy for all three algorithms, from roughly 75-90% without oversampling to 90-95% with it. A 1D CNN combined with SMOTE-Tomek oversampling achieves the best performance, at 93% accuracy and an F1-score of 0.95.
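The comparison above hinges on rebalancing the training data before fitting. As a minimal sketch of that workflow, the snippet below trains a Random Forest on a hypothetical imbalanced dataset with and without oversampling; it uses plain random duplication of minority rows as a stand-in for SMOTE-Tomek (which instead synthesizes new minority samples and removes Tomek links, e.g. via `imblearn.combine.SMOTETomek`). The dataset, class ratio, and model settings here are illustrative assumptions, not the document's actual setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split

# Hypothetical imbalanced dataset (90/10 class ratio) standing in for the real task.
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

def random_oversample(X, y, seed=0):
    """Duplicate rows until every class matches the majority-class count.
    (Simplified stand-in for SMOTE-Tomek, which synthesizes minority
    samples and removes Tomek links rather than duplicating rows.)"""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    n_max = counts.max()
    idx = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=n_max, replace=True)
        for c in classes
    ])
    return X[idx], y[idx]

for label, (Xf, yf) in [("no oversampling", (X_tr, y_tr)),
                        ("oversampled", random_oversample(X_tr, y_tr))]:
    clf = RandomForestClassifier(random_state=0).fit(Xf, yf)
    pred = clf.predict(X_te)
    print(f"{label}: accuracy={accuracy_score(y_te, pred):.2f} "
          f"F1={f1_score(y_te, pred):.2f}")
```

Note that only the training split is oversampled; resampling the test set would leak duplicated rows and inflate the reported accuracy and F1 figures.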