A table showing predicted vs actual classifications, revealing true positives, false positives, true negatives, and false negatives. Essential for understanding model error patterns.
A confusion matrix is a table that describes classification model performance in detail. It shows how predictions map to actual outcomes, revealing exactly what types of errors the model makes.
Binary confusion matrix:

                  Predicted
                  Pos    Neg
Actual   Pos      TP     FN
         Neg      FP     TN
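The binary matrix above can be tallied directly from paired label lists. A minimal sketch in plain Python (the helper name `confusion_counts` and the example labels are illustrative, not from any library; scikit-learn offers `sklearn.metrics.confusion_matrix` for the same job):

```python
# Tally a binary confusion matrix from parallel lists of labels.
# confusion_counts is a hypothetical helper for illustration.
def confusion_counts(actual, predicted, positive=1):
    tp = fp = tn = fn = 0
    for a, p in zip(actual, predicted):
        if p == positive:
            if a == positive:
                tp += 1  # predicted positive, actually positive
            else:
                fp += 1  # predicted positive, actually negative
        else:
            if a == positive:
                fn += 1  # predicted negative, actually positive
            else:
                tn += 1  # predicted negative, actually negative
    return tp, fp, tn, fn

actual    = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 1, 0, 1, 0]
print(confusion_counts(actual, predicted))  # (3, 1, 3, 1)
```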
Key components:
- True Positive (TP): predicted positive, actually positive
- False Positive (FP): predicted positive, actually negative (Type I error)
- True Negative (TN): predicted negative, actually negative
- False Negative (FN): predicted negative, actually positive (Type II error)
Derived metrics:
- Accuracy = (TP + TN) / (TP + TN + FP + FN)
- Precision = TP / (TP + FP)
- Recall (sensitivity) = TP / (TP + FN)
- Specificity = TN / (TN + FP)
- F1 score = 2 x Precision x Recall / (Precision + Recall)
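These formulas reduce to a few lines of arithmetic once the four counts are known. A sketch using invented example counts (not from any real model):

```python
# Derived metrics from raw confusion-matrix counts.
tp, fp, tn, fn = 3, 1, 3, 1  # illustrative counts, not real model output

accuracy  = (tp + tn) / (tp + tn + fp + fn)  # share of all predictions that were correct
precision = tp / (tp + fp)                   # of predicted positives, how many were right
recall    = tp / (tp + fn)                   # of actual positives, how many were found
f1        = 2 * precision * recall / (precision + recall)

print(accuracy, precision, recall, f1)  # 0.75 0.75 0.75 0.75
```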
Multi-class extension: for N classes the matrix becomes N x N, with rows for actual classes and columns for predicted classes. The diagonal holds correct predictions; each off-diagonal cell shows how often one specific class was mistaken for another.
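The N x N case can be sketched as a count over (actual, predicted) pairs; the class names below are made up for illustration:

```python
# Multi-class confusion matrix as counts of (actual, predicted) pairs.
from collections import Counter

actual    = ["cat", "dog", "bird", "cat", "dog", "bird", "cat"]
predicted = ["cat", "dog", "cat",  "cat", "bird", "bird", "dog"]

matrix = Counter(zip(actual, predicted))

# Diagonal entries like ("cat", "cat") are correct predictions;
# off-diagonal entries like ("bird", "cat") reveal specific confusions.
print(matrix[("cat", "cat")])   # 2 correct cat predictions
print(matrix[("bird", "cat")])  # 1 bird misclassified as cat
```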
Confusion matrices reveal WHERE the model fails. A single accuracy number hides whether errors are false positives or false negatives, which have very different business implications.
We always examine confusion matrices for Australian business ML projects to understand error patterns and tune models appropriately.
"The confusion matrix reveals that the churn model misclassifies high-value customers more often, informing targeted improvement efforts."