Solved – What does it imply if accuracy and recall are the same

accuracy, machine-learning, precision-recall

I ran a number of machine learning experiments on a binary classification task, measuring precision, recall, and accuracy.

I noticed that my precision is generally quite high, while recall and accuracy always come out as exactly the same number.

I used the following definitions:

$\text{Precision} = \frac{TP}{(TP + FP)}$

$\text{Recall} = \frac{TP}{(TP + FN)}$

$\text{Accuracy} = \frac{(TP + TN)}{(P + N)}$

I have some difficulty interpreting accuracy and recall. What does it mean that these two numbers are always the same in my case?

Best Answer

I suspect that you're measuring the micro-averages of precision, recall, and accuracy over your two classes, rather than treating one class as "positive" and the other as "negative". If so, recall and accuracy will always be equal, because the pooled FP and FN counts are always the same: every misclassified example counts as a false positive for one class and a false negative for the other. (You can check the details here: http://metaoptimize.com/qa/questions/8284/does-precision-equal-to-recall-for-micro-averaging )
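A minimal sketch (with made-up labels) illustrating the point: when TP, FP, and FN are pooled over both classes before dividing, micro-averaged recall, micro-averaged precision, and accuracy all coincide.

```python
# Hypothetical ground truth and predictions for a binary task.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

def counts(y_true, y_pred, cls):
    """TP, FP, FN when `cls` is treated as the positive class."""
    tp = sum(t == cls and p == cls for t, p in zip(y_true, y_pred))
    fp = sum(t != cls and p == cls for t, p in zip(y_true, y_pred))
    fn = sum(t == cls and p != cls for t, p in zip(y_true, y_pred))
    return tp, fp, fn

# Micro-averaging pools the counts over both classes before dividing.
tp0, fp0, fn0 = counts(y_true, y_pred, 0)
tp1, fp1, fn1 = counts(y_true, y_pred, 1)

micro_recall    = (tp0 + tp1) / (tp0 + tp1 + fn0 + fn1)
micro_precision = (tp0 + tp1) / (tp0 + tp1 + fp0 + fp1)
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Each misclassified example is an FP for one class and an FN for the
# other, so the pooled FP and FN totals are equal and all three metrics
# come out identical.
print(micro_recall, micro_precision, accuracy)
```

With these labels there are two misclassified examples out of eight, so all three values are 0.75. By contrast, single-class ("binary") precision and recall use only the positive class's counts and generally differ from accuracy.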