Confusion Matrix – What is the Term for FP / (FP + FN) in a Confusion Matrix?

classification, confusion-matrix, definition, roc

For a confusion matrix, there are a variety of useful rates, ratios and indices. But I cannot find the one I care about:

FP / (FP + FN)

Of course this measure is not defined for a perfect classifier, one in which both false positives and false negatives are zero. But for everything short of perfect, this measure is useful as it shows whether false positives or false negatives dominate.

Is there a name for this measure? Perhaps I missed how it is algebraically equivalent to one of the other measures.

Best Answer

I would call this the proportion of the misclassifications/mistakes that are false positives.

The denominator is the total number of misclassifications/mistakes. Of these mistakes, some are false positives and some are false negatives; the particular numerator here considers the number of false positives.

Hence, dividing the number of false positives in particular by the total number of mistakes gives the proportion of mistakes that are false positives.
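As a minimal sketch of this proportion, the following computes FP / (FP + FN) directly from binary label vectors (the helper name `false_positive_share` is my own, not a standard API):

```python
import numpy as np

def false_positive_share(y_true, y_pred):
    """Proportion of misclassifications that are false positives:
    FP / (FP + FN). Returns nan for a perfect classifier,
    where the measure is undefined (0 / 0)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    fp = np.sum((y_pred == 1) & (y_true == 0))  # predicted positive, actually negative
    fn = np.sum((y_pred == 0) & (y_true == 1))  # predicted negative, actually positive
    if fp + fn == 0:
        return float("nan")  # no mistakes at all: undefined
    return fp / (fp + fn)

y_true = [0, 0, 0, 1, 1, 1, 1, 0]
y_pred = [0, 1, 1, 1, 0, 1, 1, 0]
print(false_positive_share(y_true, y_pred))  # 2 FP, 1 FN -> 2/3
```

A value above 0.5 means false positives dominate the mistakes; below 0.5, false negatives do.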

I do not know of this being one of the common confusion-matrix statistics the way sensitivity and specificity are, but I struggle to think of a term that conveys its meaning more clearly than the one proposed above.
