Accuracy

The proportion of correct predictions among total predictions. A basic classification metric that can be misleading for imbalanced datasets.

In-Depth Explanation

Accuracy measures the percentage of correct predictions. While intuitive, it's often inadequate alone, especially for imbalanced classes.

Accuracy formula: Accuracy = (True Positives + True Negatives) / Total Predictions
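The formula above can be sketched in plain Python. The labels and predictions here are illustrative placeholders, not real data:

```python
# Toy labels and predictions (illustrative only)
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# Count true positives and true negatives
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

# Accuracy = (TP + TN) / total predictions
accuracy = (tp + tn) / len(y_true)
print(accuracy)  # 6 correct out of 8 -> 0.75
```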

When accuracy works well:

  • Balanced classes (similar counts)
  • All errors equally important
  • Simple performance summary needed

When accuracy misleads:

  • Imbalanced classes (a model can score 99% accuracy when only 1% of cases are fraud)
  • Different error costs (false negatives vs false positives)
  • Multi-class with varying importance

Better alternatives:

  • Precision and Recall
  • F1 Score
  • ROC-AUC
  • Confusion matrix analysis
  • Domain-specific metrics
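The imbalanced-class trap is easy to demonstrate. This sketch (with illustrative numbers) scores a baseline classifier that always predicts "not fraud" on a dataset where 1% of records are fraudulent: accuracy looks excellent while recall, one of the alternatives listed above, exposes the failure:

```python
# Illustrative 1%-fraud dataset: 1 = fraud, 0 = not fraud
n_total = 1000
n_fraud = 10  # 1% positive class

y_true = [1] * n_fraud + [0] * (n_total - n_fraud)
y_pred = [0] * n_total  # baseline: always predict "not fraud"

# Confusion-matrix counts
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

accuracy = (tp + tn) / n_total  # 0.99 -- looks impressive
recall = tp / (tp + fn)         # 0.0  -- catches no fraud at all
print(accuracy, recall)
```

In practice, libraries such as scikit-learn provide these metrics ready-made; the point here is only that the two numbers can tell opposite stories on the same predictions.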

Business Context

Don't be fooled by high accuracy. A fraud detector that always says "not fraud" achieves 99% accuracy on data where only 1% of transactions are fraudulent, yet catches nothing. Choose metrics that reflect business impact.

How Clever Ops Uses This

We help Australian businesses select appropriate metrics for their AI projects, ensuring models optimise for actual business outcomes.

Example Use Case

"A model with 95% accuracy on customer churn sounds good, but if only 5% of customers churn, predicting "won't churn" for everyone achieves 95% accuracy too."

Category: data analytics

Need Expert Help?

Understanding is the first step. Let our experts help you implement AI solutions for your business.
