Feature selection for SVM

feature-selection, svm

I have some experience with SVMs, mainly from a basic machine learning course. However, I've never needed to do any form of feature selection before.

The dataset I have right now has around 40 features and about 650 data points. I've read about different feature selection techniques and decided to try recursive feature elimination (RFE).

I've tried the caret package for R and scikit-learn for Python 3, but I'm rather confused about the kernels used. It seems that RFE is mostly done with a linear kernel, so my question is: can you perform RFE with other kernels such as radial (RBF), sigmoid, and polynomial?
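For context, here is a minimal sketch of the linear-kernel setup I mean in scikit-learn; the synthetic data and the number of features to keep are just placeholders for my real problem:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

# Synthetic data standing in for my real dataset (about 650 points, 40 features)
X, y = make_classification(n_samples=650, n_features=40, n_informative=8, random_state=0)

# This works because a linear-kernel SVC exposes coef_, which RFE uses to rank features
selector = RFE(SVC(kernel="linear"), n_features_to_select=10, step=1).fit(X, y)
print(selector.ranking_)

# Swapping in SVC(kernel="rbf") here raises an error: there is no coef_ to rank by
```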

Best Answer

You can read what the authors have to say about your question here. They wrote:

The method of eliminating features on the basis of the smallest change in cost function described in Section [...] can be extended to the non-linear case and to all kernel methods in general (Weston, 2000(b)). One can make computations tractable by assuming no change in the value of the α’s. Thus one avoids having to retrain a classifier for every candidate feature to be eliminated.

So I guess it is possible in principle, but it is not implemented out of the box in scikit-learn or caret.
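If you want to experiment with the idea in the quote yourself, here is a rough sketch (my own, not the authors' code): fit an RBF-kernel SVC, compute the margin term W^2 = sum_ij alpha_i alpha_j y_i y_j K(x_i, x_j) from the dual coefficients, and eliminate the feature whose removal changes W^2 the least while keeping the alphas fixed. The dataset, gamma, C and the eliminate-one-at-a-time schedule are purely illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

def margin_term(dual_coef, K):
    # W^2 = sum_ij alpha_i alpha_j y_i y_j K(x_i, x_j); SVC.dual_coef_ holds y_i * alpha_i
    return (dual_coef @ K @ dual_coef.T).item()

def rbf_rfe_ranking(X, y, gamma=0.1, C=1.0):
    """Order features from first-eliminated (least useful) to last-surviving."""
    remaining = list(range(X.shape[1]))
    eliminated = []
    while len(remaining) > 1:
        clf = SVC(kernel="rbf", gamma=gamma, C=C).fit(X[:, remaining], y)
        Xs = X[np.ix_(clf.support_, remaining)]  # support vectors, current features only
        w2_full = margin_term(clf.dual_coef_, rbf_kernel(Xs, Xs, gamma=gamma))
        # Change in W^2 if each candidate feature were dropped, with alphas held fixed
        deltas = []
        for j in range(len(remaining)):
            Xs_minus = np.delete(Xs, j, axis=1)
            w2_minus = margin_term(clf.dual_coef_, rbf_kernel(Xs_minus, Xs_minus, gamma=gamma))
            deltas.append(abs(w2_full - w2_minus))
        eliminated.append(remaining.pop(int(np.argmin(deltas))))  # smallest change goes first
    return eliminated + remaining

X, y = make_classification(n_samples=200, n_features=10, n_informative=3, random_state=0)
print(rbf_rfe_ranking(X, y))
```

With 40 features and 650 points this loop is cheap; for larger problems you would eliminate features in chunks per iteration (the equivalent of the step parameter in the built-in RFE) rather than one at a time.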
