Solved – Tuning the cost and gamma parameters of an SVM

classification, hyperparameter, kernel trick, svm

I am using R and the e1071 package to tune a C-classification SVM.

My question is: regardless of the kernel type (linear, polynomial, radial basis or sigmoid), is there any good criterion for choosing the range over which the cost and $\gamma$ parameters should vary, and/or for choosing the granularity of the grid (for example, gamma = 10 ^ (1:2) versus gamma = 1:2 versus gamma = 100 ^ (1:2))?
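To make the granularity part concrete, here is a hedged sketch of the kind of grid search in question with tune.svm(); the iris data and the grid values are placeholders, not recommendations. A common pattern is a coarse, log-spaced grid first, then a finer grid around the best cell:

```r
library(e1071)

## Coarse, log-spaced grid covering several orders of magnitude
coarse <- tune.svm(Species ~ ., data = iris,
                   cost = 10^(-1:3), gamma = 10^(-3:1))
coarse$best.parameters

## ...then a finer grid around the best cell, e.g. if it was
## cost = 10, gamma = 0.1:
fine <- tune.svm(Species ~ ., data = iris,
                 cost = 10 * 2^(-2:2), gamma = 0.1 * 2^(-2:2))
fine$best.parameters
fine$best.performance  # cross-validation error of the best pair
```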

I add a second question: can tune.svm() also return the best kernel type?
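(For context: tune.svm() has no kernel argument among its tuning parameters, so one hedged workaround is to loop over kernels yourself and compare the cross-validated error of each, as in this sketch with placeholder data and grids:

```r
library(e1071)

kernels <- c("linear", "polynomial", "radial", "sigmoid")
cv_err <- sapply(kernels, function(k) {
  ## kernel is passed through to svm() as a fixed argument;
  ## the linear kernel simply ignores gamma
  tune.svm(Species ~ ., data = iris, kernel = k,
           cost = 10^(-1:2), gamma = 10^(-2:0))$best.performance
})
cv_err                    # cross-validation error per kernel
names(which.min(cv_err))  # kernel with the lowest error
```)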

Thanks,

Best Answer

Rather than using a grid search, it may be easier to use something like the Nelder-Mead simplex algorithm (which I believe has an R implementation) and use it to minimise the model selection criterion; then you don't need to worry about the limits of the grid at all. Do minimise a continuous criterion though, for example the hinge loss on the test examples or the radius-margin bound, rather than the test error rate, as the error rate is rather noisy and hard to optimise. Don't optimise the hyper-parameters too much, though, as it is possible to get overfitting in model selection just as when fitting the SVM itself.
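A minimal sketch of that idea, assuming R's built-in optim() (whose "Nelder-Mead" method is the simplex algorithm mentioned above), a radial kernel, and a placeholder train/test split; the choice of data and the hinge-loss bookkeeping are my illustration, not a prescribed procedure:

```r
library(e1071)

set.seed(1)
## Toy binary problem (placeholder data): two of the three iris species
d <- subset(iris, Species != "setosa")
d$y <- factor(ifelse(d$Species == "versicolor", "+1", "-1"))
idx   <- sample(nrow(d), 0.7 * nrow(d))
train <- d[idx, ]
test  <- d[-idx, ]

## Mean hinge loss on the held-out set as a function of (log cost, log gamma);
## searching in log-space keeps both parameters positive
hinge <- function(logpar) {
  fit <- svm(y ~ Sepal.Length + Sepal.Width + Petal.Length + Petal.Width,
             data = train, kernel = "radial",
             cost = exp(logpar[1]), gamma = exp(logpar[2]))
  dv <- attr(predict(fit, test, decision.values = TRUE), "decision.values")
  ## positive decision values favour the class named first in the column
  ## label, so flip the sign if that class is "-1"
  f <- if (startsWith(colnames(dv), "+1")) as.vector(dv) else -as.vector(dv)
  ytest <- ifelse(test$y == "+1", 1, -1)
  mean(pmax(0, 1 - ytest * f))
}

## Nelder-Mead over (log cost, log gamma), started at cost = 1, gamma = 0.1
res <- optim(c(log(1), log(0.1)), hinge, method = "Nelder-Mead")
exp(res$par)  # tuned cost and gamma
```

Searching in log-space also matches the exponential grids above: a multiplicative step in cost or gamma becomes an additive step for the optimiser.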