Solved – How to select best parameter for polynomial kernel

Tags: kernel trick, machine learning, python, svm

I am using the LibSVM library for classification. For my problem I am using the polynomial kernel, and I need to select the best parameters (d, the degree of the polynomial kernel, and C, the soft-margin constant). The LibSVM guide suggests grid search for this job. The library ships with a parameter selection tool (grid.py), but it seems to me that it is implemented to tune the parameters of the RBF kernel (gamma). My questions are:

  1. Is there any good method for selecting suitable parameters other than grid search?
  2. Can anybody give me some hints / sample code for selecting parameters for the polynomial kernel? (A sketch follows this list.)
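
For the sample-code part, here is a minimal grid-search sketch. It is not grid.py itself; it assumes scikit-learn (whose SVC class wraps LibSVM), and X and y are placeholders for training data you would load yourself:

```python
# Minimal grid-search sketch for a polynomial-kernel SVM, assuming
# scikit-learn (whose SVC wraps LibSVM); X, y are placeholder training data.
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

param_grid = {
    "C": [2.0 ** k for k in range(-5, 16, 2)],  # soft-margin constant, log2 scale
    "degree": [2, 3, 4, 5],                     # polynomial degree d
}
search = GridSearchCV(SVC(kernel="poly"), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```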

Best Answer

Grid search is a sensible procedure, as @JohnSmith suggests; however, it is not the only stable technique. I generally use the Nelder-Mead simplex algorithm, which I have found to be very reliable and more efficient than grid search, as less time is spent investigating areas of hyper-parameter space that give poor models. If you are a MATLAB user, you can get my implementation of this method here. Nelder-Mead simplex methods are attractive as they don't require gradient information (I suppose you can think of the slope across the simplex as an approximation of the local gradient) and are very easily implemented.
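
As an illustration (not the MATLAB implementation mentioned above), a minimal Python sketch could run Nelder-Mead over log2(C) for each candidate degree, minimising the cross-validation error; since the degree is discrete, only C is handled by the simplex. SciPy and scikit-learn are assumed, and X, y are placeholder data:

```python
# Sketch of Nelder-Mead tuning for a polynomial-kernel SVM.
# Assumes scikit-learn and SciPy; X, y are placeholder training data.
import numpy as np
from scipy.optimize import minimize
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def cv_error(log2_C, degree, X, y):
    # Cross-validation error as the objective; C is searched on a log scale.
    clf = SVC(kernel="poly", degree=degree, C=2.0 ** log2_C[0])
    return 1.0 - cross_val_score(clf, X, y, cv=5).mean()

def tune(X, y, degrees=(2, 3, 4)):
    best = None
    for d in degrees:  # degree is discrete, so loop; Nelder-Mead handles C
        res = minimize(cv_error, x0=[0.0], args=(d, X, y),
                       method="Nelder-Mead")
        if best is None or res.fun < best[0]:
            best = (res.fun, d, 2.0 ** res.x[0])
    return best  # (cv error, degree, C)
```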

Gradient-descent optimisation of the span bound is also a good way of optimising the hyper-parameters; see Chapelle et al. (and investigate the other papers Olivier has written, as there are some real gems).

One advantage of grid search, however, is that its coarseness guards against a common pitfall: it is quite easy to over-fit the cross-validation error (or span bound etc.) when optimising the hyper-parameters, especially if there is little data and there are many hyper-parameters. This means you can get a very poor classifier if you tune the hyper-parameters to convergence, so a coarse grid search can often perform better (I have a paper in preparation on this).
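
One way to see this effect empirically (my own illustration, not from the answer) is nested cross-validation: an outer loop scores the entire tuning procedure, so a coarse grid will often match or beat a fine one on the outer estimate. As before, scikit-learn is assumed and X, y are placeholder data:

```python
# Sketch: compare coarse vs. fine grid search with nested cross-validation.
# The outer cross_val_score treats the whole GridSearchCV tuner as the model,
# giving a less biased estimate of the tuned classifier's performance.
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, cross_val_score

coarse = {"C": [2.0 ** k for k in range(-5, 16, 4)], "degree": [2, 3, 4]}
fine = {"C": [2.0 ** k for k in range(-5, 16)], "degree": [2, 3, 4, 5]}

for name, grid in [("coarse", coarse), ("fine", fine)]:
    tuner = GridSearchCV(SVC(kernel="poly"), grid, cv=5)
    outer = cross_val_score(tuner, X, y, cv=5)
    print(name, outer.mean())
```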
