Evaluating Classification Models
- Accuracy: \(\frac{TP+TN}{TP+TN+FP+FN}\)
- Confusion matrix: cross-tabulates predicted vs. actual classes; the diagonal holds the correctly classified cases. (Conventions vary by tool: yardstick::conf_mat() prints predictions as rows and truth as columns.)
- Recall (aka Sensitivity): \(\frac{TP}{TP+FN}\)
  - “Of the actual positives, how many did the model ‘remember’?”
  - yardstick::recall()
- Specificity: \(\frac{TN}{TN+FP}\)
  - “Of the actual negatives, how many did the model correctly rule out?”
  - yardstick::spec() or yardstick::specificity()
- Precision: \(\frac{TP}{TP+FP}\)
  - “Of the predicted positives, how many are actually positive?”
  - yardstick::precision()
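The metrics above can be sketched on a small hypothetical two-class result. The data frame, class labels, and counts below are made up for illustration; note that yardstick treats the first factor level as the positive ("event") class by default.

```r
library(yardstick)
library(tibble)

# Toy predictions: "Class1" is the positive class (first factor level)
results <- tibble(
  truth    = factor(c("Class1", "Class1", "Class1", "Class2", "Class2", "Class2"),
                    levels = c("Class1", "Class2")),
  estimate = factor(c("Class1", "Class1", "Class2", "Class2", "Class2", "Class1"),
                    levels = c("Class1", "Class2"))
)

# Here TP = 2, FN = 1, TN = 2, FP = 1
conf_mat(results, truth = truth, estimate = estimate)

accuracy(results, truth = truth, estimate = estimate)     # (TP + TN) / total = 4/6
recall(results, truth = truth, estimate = estimate)       # TP / (TP + FN)  = 2/3
specificity(results, truth = truth, estimate = estimate)  # TN / (TN + FP)  = 2/3
precision(results, truth = truth, estimate = estimate)    # TP / (TP + FP)  = 2/3
```

If the positive class is the second factor level instead, pass event_level = "second" to these metric functions.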