Optimization – Support Vector Machine, Complementary Slackness, and Marginal Hyperplane


One of the complementary slackness conditions for a support vector machine states that
$$\alpha_i ( y_i (\langle w, x_i \rangle + b ) -1 ) = 0,$$ where $\alpha_i$ is the Lagrange multiplier. One can then conclude that if $\alpha_i \neq 0$ ($x_i$ is a support vector), then $y_i (\langle w, x_i \rangle + b ) -1 = 0$ ($x_i$ lies on the marginal hyperplane). My question is whether one can say anything in the other direction: if a point $x_i$ lies on the marginal hyperplane, can one conclude that it is a support vector ($\alpha_i\neq 0$)?

Best Answer

Theoretically, no. You can see this in terms of smoothly adjusting $C$: at some point $x_i$ may lie exactly on the margin yet have $\alpha_i = 0$. Practically, yes. In particular, note that stopping conditions for the optimization allow for some tolerance $\epsilon$, generally stated in terms of the objective, so you could certainly end up with a "small enough" nonzero value of $\alpha_i$ for these points. Practically speaking, when you actually solve the optimization problem you will also see that not all non-bound SVs ($0 < \alpha_i < C$) have classification values exactly $\pm 1$; rather, they are merely close to this value. This is why, e.g., libSVM computes the bias term $b$ by averaging the classification values over these points (see e.g. Section 4.1.5 in the libSVM paper).
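A small numeric sketch of the "theoretically, no" direction, using an assumed toy hard-margin geometry (not taken from the answer): three points all lie on the marginal hyperplanes of the known optimal separator, yet a dual solution exists that satisfies the KKT conditions while assigning one of them $\alpha_i = 0$.

```python
import numpy as np

# Hypothetical toy dataset (assumption: hard-margin, linearly separable).
X = np.array([[1.0, 0.0],    # class +1, on the margin
              [1.0, 1.0],    # class +1, also on the margin
              [-1.0, 0.0]])  # class -1, on the margin
y = np.array([1.0, 1.0, -1.0])

# Known optimal primal solution for this geometry: w = (1, 0), b = 0
# (maximum margin 1/||w|| = 1 between x1 = +1 and x1 = -1).
w = np.array([1.0, 0.0])
b = 0.0

# Margin values y_i(<w, x_i> + b) - 1: zero means "on the marginal hyperplane".
margins = y * (X @ w + b) - 1.0
print(margins)  # all zeros: every point lies on a marginal hyperplane

# A valid dual solution: alpha = (0.5, 0, 0.5) satisfies the KKT conditions
# w = sum_i alpha_i y_i x_i and sum_i alpha_i y_i = 0, even though the
# second point gets alpha = 0 while sitting exactly on the margin.
alpha = np.array([0.5, 0.0, 0.5])
print(np.allclose((alpha * y) @ X, w))  # True: w recovered without point 2
print(np.isclose(alpha @ y, 0.0))      # True: equality constraint holds
```

So "lies on the marginal hyperplane" does not force $\alpha_i > 0$; the point is simply redundant for expressing $w$.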
