5.2 Evaluating Classification Models

  • Accuracy: \(\frac{TP+TN}{TP+TN+FP+FN}\)
    • yardstick::accuracy() (all of these yardstick functions are demonstrated in the sketch after this list)
  • Confusion matrix: rows = predictions, columns = actual (truth), diagonal = correctly classified
    • yardstick::conf_mat()
  • Recall (aka Sensitivity): \(\frac{TP}{TP+FN}\)
    • “Of the actual positives, how many did the model ‘remember’ (correctly predict)?”
    • yardstick::recall()
  • Specificity: \(\frac{TN}{TN+FP}\)
    • “Of the actual negatives, how many did the model correctly identify as negative?”
    • yardstick::spec() or yardstick::specificity()
  • Precision: \(\frac{TP}{TP+FP}\)
    • “What proportion of the predicted positives are actually positive?”
    • yardstick::precision()
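
A minimal sketch pulling these functions together. For illustration it uses the two_class_example data frame that ships with yardstick (columns truth and predicted); substitute your own truth and estimate columns.

```r
library(yardstick)

# Built-in demo predictions: `truth` (actual class) and `predicted`
data("two_class_example")

# Confusion matrix: rows = predictions, columns = truth
conf_mat(two_class_example, truth = truth, estimate = predicted)

# Individual metrics (each returns a one-row tibble)
accuracy(two_class_example,  truth = truth, estimate = predicted)
recall(two_class_example,    truth = truth, estimate = predicted)   # sensitivity
spec(two_class_example,      truth = truth, estimate = predicted)   # specificity
precision(two_class_example, truth = truth, estimate = predicted)

# Or bundle several metrics and compute them in one call
cls_metrics <- metric_set(accuracy, recall, spec, precision)
cls_metrics(two_class_example, truth = truth, estimate = predicted)
```

Note that yardstick treats the first factor level of the truth column as the “event” (positive class) by default; the class-specific metrics (recall, spec, precision) accept an event_level argument to change this.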