Solved – Matthews correlation coefficient – what value indicates a good classifier?

classification, correlation, machine learning

I'm comparing the performance of different classifiers on data sets derived from financial markets. I get different accuracy and precision measures, but the Matthews correlation coefficient and the Kappa statistic seldom exceed 0.2. My data sets are quite big (around 20000×170), so a lack of data should not be the problem.

So my question is: what value of MCC and Kappa can be considered a 'good classifier' on such data?
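For reference, here is a minimal sketch of how these two statistics could be computed with scikit-learn; the logistic regression model, the random data of the same 20000×170 shape, and the train/test split are purely hypothetical placeholders, not the setup from the question:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import matthews_corrcoef, cohen_kappa_score
from sklearn.model_selection import train_test_split

# Hypothetical data with the same shape as the data sets in the question.
rng = np.random.default_rng(0)
X = rng.normal(size=(20000, 170))
y = rng.integers(0, 2, size=20000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
y_pred = clf.predict(X_test)

print("MCC:  ", matthews_corrcoef(y_test, y_pred))
print("Kappa:", cohen_kappa_score(y_test, y_pred))
```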

Best Answer

I presume you are from Poland?

I do not think it is possible to answer your question in general. Even statistics books that discuss measures such as AUC say that what counts as a good value is domain specific.

My recent classification models had their MCC in the range 0.2-0.25, with AUC a bit over 0.7.

When you have a probabilistic prediction, MCC can be calculated only after you specify a cutoff point.
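To illustrate, here is a small sketch of that dependence on the cutoff (assuming scikit-learn; the probabilities, labels, and candidate cutoffs below are made up):

```python
import numpy as np
from sklearn.metrics import matthews_corrcoef

def mcc_at_cutoff(y_true, y_prob, cutoff):
    """Turn probabilistic predictions into hard labels at a given cutoff,
    then compute MCC on those labels."""
    y_pred = (np.asarray(y_prob) >= cutoff).astype(int)
    return matthews_corrcoef(y_true, y_pred)

# The same probabilistic model can give quite different MCC values
# depending on the chosen cutoff.
y_true = np.array([0, 0, 1, 1, 1, 0, 1, 0])
y_prob = np.array([0.1, 0.4, 0.35, 0.8, 0.65, 0.2, 0.55, 0.45])
for c in (0.3, 0.5, 0.7):
    print(f"cutoff {c}: MCC = {mcc_at_cutoff(y_true, y_prob, c):.3f}")
```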

But besides measures of classification accuracy, you might have explicit cost/benefit figures for false positives and false negatives, which may shift the cutoff point away from the one chosen from a purely informational point of view.
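For instance, a sketch of how explicit costs could move the cutoff (the cost values, the probabilities, and the `expected_cost` helper are hypothetical, not part of the original answer):

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def expected_cost(y_true, y_prob, cutoff, cost_fp, cost_fn):
    """Total cost of classifying at the given cutoff, with explicit
    costs attached to false positives and false negatives."""
    y_pred = (np.asarray(y_prob) >= cutoff).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    return fp * cost_fp + fn * cost_fn

# Hypothetical costs: a false negative hurts five times more than a false positive.
y_true = np.array([0, 0, 1, 1, 1, 0, 1, 0])
y_prob = np.array([0.1, 0.4, 0.35, 0.8, 0.65, 0.2, 0.55, 0.45])
cutoffs = np.linspace(0.1, 0.9, 9)
costs = [expected_cost(y_true, y_prob, c, cost_fp=1.0, cost_fn=5.0) for c in cutoffs]
print("cost-minimising cutoff:", cutoffs[int(np.argmin(costs))])
```

With asymmetric costs like these, the cost-minimising cutoff typically ends up well below 0.5, even if 0.5 (or the MCC-maximising value) looks best from a purely informational point of view.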