Confusion Matrix
A table showing predicted vs actual classifications, revealing true positives, false positives, true negatives, and false negatives. Essential for understanding model error patterns.
In-Depth Explanation
A confusion matrix is a table that describes classification model performance in detail. It shows how predictions map to actual outcomes, revealing exactly what types of errors the model makes.
Binary confusion matrix:

               Predicted Pos   Predicted Neg
  Actual Pos        TP              FN
  Actual Neg        FP              TN
Key components:
- True Positive (TP): Correctly predicted positive
- False Positive (FP): Incorrectly predicted positive
- True Negative (TN): Correctly predicted negative
- False Negative (FN): Incorrectly predicted negative (an actual positive the model missed)
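The four cells can be tallied directly from paired lists of actual and predicted labels. A minimal sketch (the helper name `binary_confusion` is our own, not a library function):

```python
def binary_confusion(actual, predicted, positive=1):
    """Count TP, FP, TN, FN for a binary classifier's outputs."""
    tp = fp = tn = fn = 0
    for a, p in zip(actual, predicted):
        if p == positive:
            if a == positive:
                tp += 1  # predicted positive, actually positive
            else:
                fp += 1  # predicted positive, actually negative
        else:
            if a == positive:
                fn += 1  # predicted negative, actually positive
            else:
                tn += 1  # predicted negative, actually negative
    return {"TP": tp, "FP": fp, "TN": tn, "FN": fn}

# Example: six predictions against ground truth
actual    = [1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 1, 0]
print(binary_confusion(actual, predicted))
# → {'TP': 2, 'FP': 1, 'TN': 2, 'FN': 1}
```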
Derived metrics:
- Accuracy = (TP + TN) / Total
- Precision = TP / (TP + FP)
- Recall = TP / (TP + FN)
- Specificity = TN / (TN + FP)
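Each derived metric is a ratio over the four counts, so given TP, FP, TN, and FN the whole set falls out in a few lines (a sketch; the function name is our own):

```python
def derived_metrics(tp, fp, tn, fn):
    """Compute the standard metrics from the four confusion-matrix cells."""
    total = tp + fp + tn + fn
    return {
        "accuracy":    (tp + tn) / total,     # all correct / all predictions
        "precision":   tp / (tp + fp),        # correct positives / predicted positives
        "recall":      tp / (tp + fn),        # correct positives / actual positives
        "specificity": tn / (tn + fp),        # correct negatives / actual negatives
    }

# Using the counts from the example above: TP=2, FP=1, TN=2, FN=1
m = derived_metrics(tp=2, fp=1, tn=2, fn=1)
# accuracy 4/6, precision 2/3, recall 2/3, specificity 2/3
```

Note that the ratios divide by zero when a class never occurs or is never predicted; production code should guard those cases.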
Multi-class extension:
- Rows: actual classes
- Columns: predicted classes
- Diagonal: correct predictions
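The multi-class layout above generalises naturally to an N×N grid: increment the cell at (actual row, predicted column) for each example. A small sketch with made-up class labels:

```python
def multiclass_confusion(actual, predicted, classes):
    """Build an N x N confusion matrix: rows = actual, columns = predicted."""
    idx = {c: i for i, c in enumerate(classes)}
    n = len(classes)
    matrix = [[0] * n for _ in range(n)]
    for a, p in zip(actual, predicted):
        matrix[idx[a]][idx[p]] += 1
    return matrix

classes   = ["cat", "dog", "bird"]
actual    = ["cat", "dog", "bird", "cat", "dog"]
predicted = ["cat", "bird", "bird", "dog", "dog"]
M = multiclass_confusion(actual, predicted, classes)
# → [[1, 1, 0], [0, 1, 1], [0, 0, 1]]
# The diagonal M[i][i] holds the correct predictions for each class.
```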
Business Context
How Clever Ops Uses This
We always examine confusion matrices for Australian business ML projects to understand error patterns and tune models appropriately.
Example Use Case
"The confusion matrix reveals that the churn model misclassifies high-value customers more often, informing targeted improvement efforts."
Related Resources
Precision
Of all positive predictions, what proportion was actually positive. High precision means few false positives.
Recall
Of all actual positives, what proportion did the model identify. High recall means few false negatives.
Accuracy
The proportion of correct predictions among total predictions. A basic classification metric.
