
BAM1024 - Introduction to Statistical Analysis
Class 10: Evaluation Techniques!
Evaluation Techniques
Haven’t we already done a lot of evaluation in the last couple of weeks? We even had a mid-term!
Confusion Matrix

A confusion matrix is a table used to describe the performance of a classification algorithm.
It highlights both Type I and Type II errors, i.e. False Positives and False Negatives.
It also produces key metrics (computed in the sketch below) such as:
Recall (also Sensitivity) = TPR = TP/(TP+FN)
Precision = TP/(TP+FP)
Accuracy = (TP+TN)/N where N = TP+FP+TN+FN
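
These formulas map directly onto code. Below is a minimal sketch in Python (the slides name no language or library, and the toy labels are hypothetical) that counts the four confusion-matrix cells and computes the three metrics defined above.

```python
import numpy as np

# Hypothetical ground-truth labels and model predictions (1 = positive class)
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0])

# The four cells of the confusion matrix
tp = np.sum((y_pred == 1) & (y_true == 1))  # true positives
fp = np.sum((y_pred == 1) & (y_true == 0))  # false positives (Type I errors)
tn = np.sum((y_pred == 0) & (y_true == 0))  # true negatives
fn = np.sum((y_pred == 0) & (y_true == 1))  # false negatives (Type II errors)

recall    = tp / (tp + fn)                   # Recall / Sensitivity / TPR
precision = tp / (tp + fp)                   # Precision
accuracy  = (tp + tn) / (tp + fp + tn + fn)  # Accuracy, N = TP+FP+TN+FN

print(f"Recall={recall:.2f}  Precision={precision:.2f}  Accuracy={accuracy:.2f}")
```

On these toy arrays all three metrics come out to 0.75; with imbalanced classes they diverge, which is why Accuracy alone can mislead.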
Confusion Matrix
ROC Curve
History
Impact of Thresholds
From Confusion Matrix to ROC Curve
ROC Curve Example
Comparing different models
Choosing between models
AUC Challenges
Alternatives to ROC
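
These slide titles trace the key construction: each decision threshold applied to a model's scores yields one confusion matrix, and therefore one (FPR, TPR) point, and sweeping the threshold traces out the ROC curve. A minimal sketch of that step, with hypothetical labels and scores (the slides present it only as figures):

```python
import numpy as np

# Hypothetical scores (sorted high to low) and their true labels
y_true = np.array([1, 1, 1, 0, 1, 0, 0, 1, 0, 0])
scores = np.array([0.95, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1])

# Each threshold yields one confusion matrix, i.e. one (FPR, TPR) point
for t in [0.25, 0.45, 0.65, 0.85]:
    y_pred = (scores >= t).astype(int)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    print(f"t={t:.2f}  TPR={tp/(tp+fn):.2f}  FPR={fp/(fp+tn):.2f}")
```

Lowering the threshold moves the point up and to the right: more true positives are caught, at the cost of more false positives.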

Questions

What is ROC? Why do we need it?
What is AUC or AUROC?
How can we use ROC to compare different models?
What are the challenges with AUC?
What are F-measures?
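
The questions above can be made concrete with scikit-learn (assumed here; the slides do not name a library). The sketch below scores two hypothetical models, compares them by AUC (the area under the ROC curve, also written AUROC), and computes an F-measure (F1, the harmonic mean of Precision and Recall) as an alternative single-number summary.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score, f1_score

# Synthetic labels plus scores from two hypothetical models
rng = np.random.default_rng(42)
y_true   = rng.integers(0, 2, size=200)
scores_a = np.clip(0.3 * y_true + rng.normal(0.4, 0.2, size=200), 0, 1)  # informative
scores_b = rng.uniform(0, 1, size=200)                                    # near-random

# ROC curve: (FPR, TPR) pairs across all thresholds
fpr, tpr, thresholds = roc_curve(y_true, scores_a)

# AUC summarises the whole curve: 0.5 is chance, 1.0 is perfect,
# so the larger area identifies the better-ranking model.
print(f"AUC model A: {roc_auc_score(y_true, scores_a):.3f}")
print(f"AUC model B: {roc_auc_score(y_true, scores_b):.3f}")

# F1 evaluates one operating point rather than the whole curve,
# which can be preferable when a single threshold will be deployed.
y_pred_a = (scores_a >= 0.5).astype(int)
print(f"F1 model A at threshold 0.5: {f1_score(y_true, y_pred_a):.3f}")
```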
