The proportion of positive predictions that are true positives, measuring how reliable the model's positive predictions are.
Precision measures how many of the positive predictions were actually correct. It answers: "When the model predicts positive, how often is it right?"
Formula: Precision = True Positives / (True Positives + False Positives)
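The formula above can be sketched as a small stdlib-only Python function (the label convention of 1 = positive and 0 = negative, and the choice to return 0.0 when there are no positive predictions, are assumptions for illustration):

```python
def precision(y_true, y_pred):
    """Precision = TP / (TP + FP), assuming labels are 1 (positive) / 0 (negative)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    # No positive predictions at all: precision is undefined; return 0.0 by convention.
    return tp / (tp + fp) if (tp + fp) else 0.0

# 2 true positives, 1 false positive -> precision = 2/3
print(precision([1, 0, 1, 1, 0, 0], [1, 1, 1, 0, 0, 0]))
```

Note that the false negative at index 3 does not affect the result: precision only looks at the predictions the model chose to make, which is exactly why it pairs with recall.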
When to prioritise precision:
Prioritise precision when acting on a false positive is costly or harmful. A precision-focused model is selective: it predicts positive less often, but is more likely to be right when it does.

Examples:
"A fraud alert system prioritising precision to minimise false alarms that waste investigation resources and annoy legitimate customers."

Trade-off with recall:
Precision and recall typically pull in opposite directions. Raising the decision threshold makes the model more selective, which tends to increase precision but lower recall, because more true positives are missed.

We help Australian businesses understand precision trade-offs, choosing decision thresholds that match the business cost of each error type.
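The threshold trade-off can be illustrated with a short sketch. The scores and labels below are made up for demonstration; the helper returns precision and recall at a given threshold:

```python
def precision_recall(y_true, scores, threshold):
    """Binarise scores at `threshold`, then compute (precision, recall)."""
    preds = [1 if s >= threshold else 0 for s in scores]
    tp = sum(1 for t, p in zip(y_true, preds) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, preds) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, preds) if t == 1 and p == 0)
    prec = tp / (tp + fp) if (tp + fp) else 0.0
    rec = tp / (tp + fn) if (tp + fn) else 0.0
    return prec, rec

# Illustrative fraud scores: higher score = model more confident it is fraud.
y_true = [1, 1, 1, 0, 0, 1, 0, 0]
scores = [0.95, 0.9, 0.8, 0.7, 0.6, 0.55, 0.3, 0.1]

# Low threshold: more alerts fire -> perfect recall, weaker precision.
print(precision_recall(y_true, scores, 0.5))
# High threshold: fewer, surer alerts -> perfect precision, weaker recall.
print(precision_recall(y_true, scores, 0.75))
```

Moving the threshold from 0.5 to 0.75 trades recall for precision: the model raises fewer fraud alerts, misses one real case, but every alert it does raise is correct.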