Solved – Multi-class classification always has better results than one-class classification

libsvm, svm

Currently I'm using an SVM to classify test samples into two classes (True and False). When I use multi-class classification, with both true and false samples in my training set, the accuracy is always above 90%. But if I use one-class classification (only true samples in the training set), the accuracy is very low (0%). For one-class classification I use a radial basis function kernel, and I also tried setting different values for nu, but it didn't help.

I'm using the libsvm library for Java, and I scaled the data before training.
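
For reference, my one-class setup looks roughly like the sketch below (a minimal sketch with libsvm's Java API; the feature values, gamma, and nu are placeholders, not my actual data or parameters):

```java
import libsvm.*;

public class OneClassSketch {

    // Convert a dense, already-scaled feature vector into libsvm's sparse format.
    static svm_node[] toNodes(double[] row) {
        svm_node[] nodes = new svm_node[row.length];
        for (int i = 0; i < row.length; i++) {
            nodes[i] = new svm_node();
            nodes[i].index = i + 1;   // libsvm feature indices start at 1
            nodes[i].value = row[i];
        }
        return nodes;
    }

    public static void main(String[] args) {
        // Toy "true" samples only (placeholder values, already scaled).
        double[][] positives = {
            {0.10, 0.20}, {0.15, 0.22}, {0.12, 0.18}, {0.14, 0.25}, {0.11, 0.21}
        };

        svm_problem prob = new svm_problem();
        prob.l = positives.length;
        prob.x = new svm_node[prob.l][];
        prob.y = new double[prob.l];
        for (int i = 0; i < prob.l; i++) {
            prob.x[i] = toNodes(positives[i]);
            prob.y[i] = 1;            // labels are effectively ignored by ONE_CLASS
        }

        svm_parameter param = new svm_parameter();
        param.svm_type = svm_parameter.ONE_CLASS;
        param.kernel_type = svm_parameter.RBF;   // radial basis function kernel
        param.gamma = 1.0 / 2;                   // placeholder: 1 / number of features
        param.nu = 0.1;                          // upper bound on the fraction of outliers
        param.cache_size = 100;
        param.eps = 1e-3;

        String err = svm.svm_check_parameter(prob, param);
        if (err != null) throw new IllegalArgumentException(err);

        svm_model model = svm.svm_train(prob, param);

        // ONE_CLASS prediction returns +1 ("inside the learned region") or -1 ("outlier").
        double pred = svm.svm_predict(model, toNodes(new double[]{0.13, 0.20}));
        System.out.println("prediction: " + pred);
    }
}
```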

So, did I misunderstand something about one-class classification with SVM?
Thank you very much.

Best Answer

You should use a two-class SVM instead of a one-class SVM. One-class classification is usually used to estimate the support of the high-dimensional distribution of the data; the boundary it learns separates that single class from everything else (one class vs. the rest) rather than discriminating between two labeled classes. @cbeleites gives more details about the different situations that arise when classifying two classes.
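
As a minimal sketch (assuming libsvm's Java API, with placeholder hyperparameter values that you should tune by cross-validation), switching to a two-class SVM means training C-SVC on samples carrying both labels:

```java
import libsvm.*;

public class TwoClassSketch {

    // Sparse encoding of one already-scaled feature vector.
    static svm_node[] toNodes(double[] row) {
        svm_node[] nodes = new svm_node[row.length];
        for (int i = 0; i < row.length; i++) {
            nodes[i] = new svm_node();
            nodes[i].index = i + 1;
            nodes[i].value = row[i];
        }
        return nodes;
    }

    // Train a C-SVC model on samples labeled +1 ("true") and -1 ("false").
    public static svm_model trainCSvc(double[][] features, double[] labels) {
        svm_problem prob = new svm_problem();
        prob.l = features.length;
        prob.x = new svm_node[prob.l][];
        prob.y = labels;                           // both classes must be present
        for (int i = 0; i < prob.l; i++) prob.x[i] = toNodes(features[i]);

        svm_parameter param = new svm_parameter();
        param.svm_type = svm_parameter.C_SVC;      // two-class SVM instead of ONE_CLASS
        param.kernel_type = svm_parameter.RBF;
        param.gamma = 1.0 / features[0].length;    // a common starting point: 1 / #features
        param.C = 1.0;                             // C replaces nu; tune C and gamma together
        param.cache_size = 100;
        param.eps = 1e-3;

        String err = svm.svm_check_parameter(prob, param);
        if (err != null) throw new IllegalArgumentException(err);
        return svm.svm_train(prob, param);
    }
}
```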

You may want to refer to Part 2 of the LibSVM documentation (PDF) for a quick overview of the difference between the two formulations.
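
For reference, the two formulations differ roughly as follows (standard forms, with $\phi$ the kernel feature map and $l$ the number of training samples):

$$
\text{C-SVC:}\quad \min_{w,\,b,\,\xi}\ \tfrac{1}{2}\|w\|^2 + C\sum_{i=1}^{l}\xi_i
\quad\text{s.t.}\quad y_i\left(w^\top\phi(x_i)+b\right) \ge 1-\xi_i,\ \ \xi_i \ge 0
$$

$$
\text{one-class SVM:}\quad \min_{w,\,\xi,\,\rho}\ \tfrac{1}{2}\|w\|^2 + \frac{1}{\nu l}\sum_{i=1}^{l}\xi_i - \rho
\quad\text{s.t.}\quad w^\top\phi(x_i) \ge \rho-\xi_i,\ \ \xi_i \ge 0
$$

C-SVC uses the labels $y_i \in \{+1,-1\}$ to place the boundary between the two classes, while the one-class objective never sees a second class, which is why training on true samples alone cannot tell the model where the false samples lie.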