Solved – VC dimension of SVM with polynomial kernel in $\mathbb{R}^2$

classification, kernel trick, self-study, svm, vc-dimension

What is the VC dimension of an SVM with the polynomial kernel $k(x,x')=(1+\langle x,x'\rangle_{\mathbb{R}^2})^{2}$ for binary classification in $\mathbb{R}^2$?

It is at least $v$ iff there exists a set of $v$ points such that, for every labeling ($-1$ or $+1$) of these points, there is a separating border that classifies them all correctly.

It is strictly less than $v$ iff for every set of $v$ points, there exists a labeling for which no correct separating border exists.
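
One way to probe these two criteria numerically is to enumerate every labeling of a candidate point set and check whether a nearly hard-margin SVM with this kernel fits each labeling perfectly. Below is a minimal sketch using scikit-learn's `SVC`; the helper `can_shatter` and the six candidate points are just illustrative choices.

```python
from itertools import product

import numpy as np
from sklearn.svm import SVC


def can_shatter(points: np.ndarray) -> bool:
    """Return True if every +1/-1 labeling of `points` is fit perfectly by an
    SVM with the kernel (1 + <x, x'>)^2 (large C ~= hard margin)."""
    for labels in product([-1, 1], repeat=len(points)):
        if len(set(labels)) < 2:
            continue  # constant labelings are trivially separable
        y = np.array(labels)
        # scikit-learn's polynomial kernel is (gamma*<x, x'> + coef0)^degree,
        # so gamma=1, coef0=1, degree=2 gives exactly (1 + <x, x'>)^2.
        clf = SVC(kernel="poly", degree=2, gamma=1.0, coef0=1.0, C=1e9)
        clf.fit(points, y)
        if clf.score(points, y) < 1.0:
            return False  # found a labeling no conic/line border separates
    return True


# Illustrative candidate set: 5 points on the unit circle plus the origin.
angles = np.linspace(0.0, 2.0 * np.pi, 5, endpoint=False)
candidate = np.vstack([np.column_stack([np.cos(angles), np.sin(angles)]),
                       [[0.0, 0.0]]])
print(can_shatter(candidate))
```

Pointing the same check at candidate sets of $7$ points is a way to probe the upper bound empirically.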

In this case, the separating border is a conic section or a line, so any argument based on these curves rather than on the SVM itself is welcome.

For instance, let $x=(x_{1},x_{2}) \in \mathbb{R}^2$. It is mapped to $\mathbb{R}^6$ by some feature map $\phi$, and a hyperplane in that space corresponds to a conic section in the original space. So my guess would be the VC dimension of a linear classifier in $\mathbb{R}^6$, which is 7.


Actually, the answer is 6: the equation of a hyperplane in $\mathbb{R}^d$ is $$\langle w,x\rangle_{\mathbb{R}^d}+b=0,$$ where $w \in \mathbb{R}^d$ and $b \in \mathbb{R}$. Here the first coordinate of $\phi(x)$ is the constant $1$, so its weight already plays the role of the bias $b$; effectively we have a linear classifier with offset on the remaining $5$ feature coordinates, i.e. $w \in \mathbb{R}^5$ rather than $\mathbb{R}^6$, which gives a VC dimension of $5+1=6$.
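
To spell this out with the explicit feature map $\phi(x)=[1,\ \sqrt{2}x_1,\ x_1^2,\ \sqrt{2}x_2,\ x_2^2,\ \sqrt{2}x_1x_2]$ written out in the answer below, the decision function is

$$\langle w,\phi(x)\rangle_{\mathbb{R}^6}+b=(w_1+b)+\sqrt{2}\,w_2\,x_1+w_3\,x_1^{2}+\sqrt{2}\,w_4\,x_2+w_5\,x_2^{2}+\sqrt{2}\,w_6\,x_1x_2,$$

a general polynomial of degree at most $2$ in $(x_1,x_2)$: the bias $b$ and the weight $w_1$ of the constant feature collapse into a single free coefficient, leaving $6$ free coefficients in total.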

Best Answer

If I'm not mistaken:

$$(1 + \langle x,y\rangle)^2 = \big\langle\, [1,\ \sqrt{2}x_1,\ x_1^2,\ \sqrt{2}x_2,\ x_2^2,\ \sqrt{2}x_1x_2],\ [1,\ \sqrt{2}y_1,\ y_1^2,\ \sqrt{2}y_2,\ y_2^2,\ \sqrt{2}y_1y_2]\,\big\rangle,$$

i.e. $k(x,y)=\langle\phi(x),\phi(y)\rangle$ with the feature map $\phi(x)=[1,\ \sqrt{2}x_1,\ x_1^2,\ \sqrt{2}x_2,\ x_2^2,\ \sqrt{2}x_1x_2]$.

So the separating rule is linear in $\mathbb{R}^6$. Consequently, the VC dimension is 7.
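
As a quick sanity check of this expansion, here is a minimal sketch comparing the kernel with the explicit feature map on random inputs (the helper name `phi` is just illustrative):

```python
import numpy as np


def phi(x):
    """Explicit feature map of the kernel (1 + <x, x'>)^2 on R^2."""
    x1, x2 = x
    return np.array([1.0,
                     np.sqrt(2.0) * x1, x1 ** 2,
                     np.sqrt(2.0) * x2, x2 ** 2,
                     np.sqrt(2.0) * x1 * x2])


rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.normal(size=2), rng.normal(size=2)
    assert np.isclose((1.0 + x @ y) ** 2, phi(x) @ phi(y))
print("kernel value matches <phi(x), phi(y)> on 1000 random pairs")
```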