I'm trying to use LIBSVM's one-class SVMs for some classification, and after classification I need to extract the following sum (i.e. the quantity the decision function is based on):
$$
\sum_{i=1}^{N} \alpha_i K(\mathbf{z}, \mathbf{z}_i)
$$
I'm not too sure about this, so I just want to check that I have the right idea: could I calculate the sum using the primal variable $\textbf{w}$? That is, for a libsvm model $m$ I would call
w = (m.SVs)' * (m.sv_coef);
and then calculate $\textbf{w}^T \textbf{z}$, where $\textbf{z}$ is the feature vector of the point I'd like to classify. I guess it may just be easier to compute the kernel function explicitly, but if anyone knows a good way to access this information in libsvm I would be very grateful.
Best Answer
Depending on the kernel you use, you may not be able to compute $\mathbf{w}$ (the normal of the separating hyperplane in feature space) explicitly at all. For the RBF kernel, for example, $\mathbf{w}$ lives in an infinite-dimensional feature space.
Instead, what you can compute is the inner product of $\mathbf{w}$ with the test instance $\mathbf{z}$ embedded in feature space, $\phi(\mathbf{z})$, as follows: $\langle \mathbf{w}, \phi(\mathbf{z})\rangle = \sum_{i\in SV} \alpha_i y_i K(\mathbf{x}_i, \mathbf{z})$.
If your question is whether or not you can compute the decision value, you can do that as follows: $$f(\mathbf{z}) = \sum_{i\in SV} \alpha_i y_i K(\mathbf{x}_i, \mathbf{z}) +b.$$
In libsvm, the sv_coef variable contains the products $\alpha_i y_i$.