Solved – Combining one class classifiers to do multi-class classification

bayesian-network, classification, machine-learning

I am working on a 3-class classification problem. The classifier I'm using is a Bayesian network, which gives me a classification accuracy of around 60%. When I do two-class classification instead, I get 80% accuracy differentiating between class 0 & class 1 and between class 0 & class 2, but only 60% accuracy between class 1 & class 2. I believe the best way to do 3-class classification in this case would be to combine the two two-class classifiers that reach 80% accuracy. What comes to mind is some sort of weighted averaging scheme over the results of the individual two-class classifiers. I have not solved such a problem in the past and am facing a dilemma as to how I should implement this. Any help/suggestions in this regard would be highly appreciated. Please feel free to suggest other alternatives if you think they may work.
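The weighted-averaging idea described above could be sketched roughly as follows. This is only an illustration, assuming each pairwise (one-vs-one) classifier returns a predicted class and a confidence; the function name, the confidences, and the use of validation accuracy as weights are all hypothetical choices, not a prescribed method.

```python
# Hypothetical sketch: combine one-vs-one predictions by weighted voting.
# Each pairwise classifier is assumed to return (predicted_class, confidence).

def combine_pairwise(predictions, weights):
    """Combine pairwise predictions via accuracy- and confidence-weighted voting.

    predictions: list of (predicted_class, confidence) tuples,
                 one per pairwise classifier.
    weights:     per-classifier weights (e.g. each one's validation accuracy).
    """
    votes = {}
    for (cls, conf), w in zip(predictions, weights):
        # Accumulate a weighted vote for the class this classifier picked.
        votes[cls] = votes.get(cls, 0.0) + w * conf
    # The class with the largest total weighted vote wins.
    return max(votes, key=votes.get)

# Example with made-up numbers: 0-vs-1 says class 0 (conf 0.9),
# 0-vs-2 says class 0 (conf 0.7), 1-vs-2 says class 1 (conf 0.6);
# weights are the accuracies reported in the question (0.8, 0.8, 0.6).
pred = combine_pairwise([(0, 0.9), (0, 0.7), (1, 0.6)], [0.8, 0.8, 0.6])
print(pred)  # -> 0  (class 0: 0.8*0.9 + 0.8*0.7 = 1.28 beats 0.6*0.6 = 0.36)
```

Down-weighting the weaker 1-vs-2 classifier this way keeps its noisier votes from overriding the two stronger classifiers.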

Best Answer

I've done something like this using either of the following:

  1. (a) Given three different classes (e.g. A, B, C), create a target column for each class. Place '1' in the A column if the sample is an A, '0' otherwise; do the same for the B and C columns. These columns become the target fields for three separate binary classifiers (one each for A, B, and C).

(b) Feed the predictions - in addition to any other features - into a third classifier, a multiclass classifier whose target is the tri-level target.
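Steps (a) and (b) above could be sketched like this in plain Python. The labels, features, and base-classifier scores below are fabricated placeholders purely for illustration; in practice the scores would come from whatever three binary classifiers you train on the indicator columns.

```python
# Toy tri-level target and features (made-up data for illustration only).
labels = ['A', 'B', 'C', 'A', 'B']
original_features = [[1.0, 2.0], [0.5, 1.5], [2.0, 0.1], [1.1, 2.2], [0.4, 1.6]]

# (a) One binary indicator column per class; each column is the target
# of its own binary classifier.
targets = {c: [1 if y == c else 0 for y in labels] for c in ('A', 'B', 'C')}

# Suppose the three trained binary classifiers produce these per-sample
# scores [p_A, p_B, p_C] (fabricated numbers):
base_preds = [[0.8, 0.1, 0.1],
              [0.2, 0.7, 0.1],
              [0.1, 0.2, 0.7],
              [0.6, 0.3, 0.1],
              [0.3, 0.6, 0.1]]

# (b) Append the base predictions to the original features; this stacked
# matrix is the input to a final multiclass classifier trained on `labels`.
stacked = [f + p for f, p in zip(original_features, base_preds)]
print(targets['A'])  # -> [1, 0, 0, 1, 0]
print(stacked[0])    # -> [1.0, 2.0, 0.8, 0.1, 0.1]
```

This is essentially stacking: the final classifier learns how much to trust each base classifier's score alongside the raw features.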

  2. Taking the same approach as 1(a), take the predictions and use rule-based logic (or misclassification costs) to separate the class predictions, so you avoid ending up with the same sample being predicted as both A and B, both A and C, etc.
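One simple rule along these lines is sketched below; the thresholding-plus-highest-score rule, the function name, and the scores are all hypothetical, and misclassification costs could replace the raw scores in the tie-break.

```python
# Hypothetical conflict-resolution rule for one-vs-rest binary outputs:
# if several binary classifiers claim a sample, keep the highest-scoring class.

def resolve(scores, threshold=0.5):
    """scores: dict mapping class -> that binary classifier's positive score."""
    fired = {c: s for c, s in scores.items() if s >= threshold}
    if not fired:
        # No classifier fired: fall back to the overall highest score.
        return max(scores, key=scores.get)
    # One or more fired: break the tie by the highest score among them.
    return max(fired, key=fired.get)

# Both the A and B classifiers fire here; the rule keeps A (0.8 > 0.6).
print(resolve({'A': 0.8, 'B': 0.6, 'C': 0.2}))  # -> 'A'
```

The fallback branch matters too: without it, a sample that no binary classifier claims would have no label at all.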