Solved – The difference between kernels in SVM

kernel trick · machine learning · pattern recognition · svm

Can someone please tell me the difference between the kernels in SVM:

  1. Linear
  2. Polynomial
  3. Gaussian (RBF)
  4. Sigmoid

As we know, the kernel is used to map our input space into a high-dimensional feature space, and in that feature space we then look for a linearly separable boundary.

When is each of them used (under what conditions), and why?
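For concreteness, here is a minimal sketch of how these four kernels can be selected with scikit-learn's `SVC` (assuming scikit-learn is available; the toy dataset and parameter values are purely illustrative):

```python
# Illustrative sketch: try the four kernel choices on the same toy dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=5, random_state=0)

for kernel in ("linear", "poly", "rbf", "sigmoid"):
    clf = SVC(kernel=kernel, gamma="scale")  # degree=3 applies only to "poly"
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{kernel:>8s}  mean 5-fold CV accuracy: {score:.3f}")
```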

Best Answer

The linear kernel is what you would expect: a linear model. I believe the polynomial kernel is similar, but the boundary is a polynomial of some specified (but arbitrary) order

(e.g. order 3: $a = b_1 + b_2 \cdot X + b_3 \cdot X^2 + b_4 \cdot X^3$).
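For reference, the kernel functions themselves are commonly written in the usual libsvm/scikit-learn parameterization, with a scale $\gamma$, an offset $r$, and a degree $d$ as tunable parameters:

$$K_\text{linear}(x, x') = x^\top x', \qquad K_\text{poly}(x, x') = \left(\gamma\, x^\top x' + r\right)^d$$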

The RBF kernel places normal (Gaussian) curves around the data points and sums them, so that the decision boundary can be defined by a level-set condition, e.g. the curve along which the sum exceeds a value such as 0.5 (see the figure below).

[Figure: decision boundary formed by summing Gaussian curves around the data points]
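A minimal sketch of this idea with scikit-learn (assuming it is installed; the moons dataset and the `gamma`/`C` values are only illustrative):

```python
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# Two interleaving half-moons: not linearly separable in the input space.
X, y = make_moons(n_samples=200, noise=0.15, random_state=0)

# K(x, x') = exp(-gamma * ||x - x'||^2): a Gaussian "bump" around each support vector.
clf = SVC(kernel="rbf", gamma=2.0, C=1.0).fit(X, y)

# The decision function is a signed, weighted sum of those bumps plus an offset;
# the decision boundary is the level set where this sum crosses zero.
print(clf.decision_function(X[:5]))
print("support vectors per class:", clf.n_support_)
```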

I am not certain what the sigmoid kernel is, unless it is similar to the logistic regression model, where a logistic function defines the curves according to where the logistic value (modelling a probability) exceeds some threshold, such as 0.5, as in the normal case.
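For what it's worth, the sigmoid kernel is usually defined (e.g. in libsvm/scikit-learn) as

$$K_\text{sigmoid}(x, x') = \tanh\left(\gamma\, x^\top x' + r\right),$$

i.e. a hyperbolic-tangent ("neural network") kernel, which would explain why it is reminiscent of a logistic-style activation.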