[Math] Reproducing Kernel Hilbert Spaces for Dummies

functional-analysis, hilbert-spaces, linear-algebra, machine-learning

I am in the middle of a machine learning paper that states that, for a function $f$, imposing the norm constraint $\|f\| = 1$ corresponds to an orthogonal projection onto the direction selected in a reproducing kernel Hilbert space. I do not get this.

I am lacking a solid background in RKHS theory, so can anyone briefly tell me what this is trying to say? How is an equality constraint on the norm related to orthogonality?

Do I need to study functional analysis?

Best Answer

If by "imposing the norm constraint, $|f|=1$, corresponds to an orthogonal projection onto the direction selected in reproducing kernel Hilbert space" you (or the paper) mean "projecting orthogonally from any point to a point with fixed norm is a closest-point projection", then I think this is true of all Hilbert spaces.

Write the vector being projected as $v = a + b$, with $a$ in the subspace onto which we project (the span of the selected direction) and $b$ the residual. By way of contradiction, assume $\langle a, b \rangle \neq 0$, so that $\|v\|^2 = \|a\|^2 + \|b\|^2 + 2 \langle a, b \rangle$. If you draw a picture here, it should be easy to see that you can pick a new $a' = (1+t)a$ in the subspace, with $t$ small and chosen so that $t\langle a, b\rangle > 0$; writing $v = a' + b'$ with $b' = b - t a$, you get $\|b'\|^2 = \|b\|^2 - 2t\langle a, b\rangle + t^2\|a\|^2 < \|b\|^2$ for $t$ small enough, which means $a$ was not a closest-point projection. (Completeness of the Hilbert space is what guarantees a closest point exists in the first place; the perturbation itself only uses linearity.) The only way to avoid this contradiction is to have $\langle a, b \rangle = 0$, q.e.d.
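If it helps to see this in coordinates, here is a minimal numerical sketch (my own illustration, not from the paper), using numpy and $\mathbb{R}^5$ as a stand-in for the Hilbert space, that checks both the orthogonality of the residual and the closest-point property:

```python
import numpy as np

rng = np.random.default_rng(0)

# R^5 as a finite-dimensional stand-in for the Hilbert space.
v = rng.normal(size=5)           # the vector being projected
u = rng.normal(size=5)
u /= np.linalg.norm(u)           # the "selected direction", normalized to ||u|| = 1

# Orthogonal projection of v onto the line spanned by u, and the residual.
a = np.dot(v, u) * u             # a = <v, u> u
b = v - a                        # residual b = v - a

# 1) The residual is orthogonal to the direction (and hence to a).
print(np.dot(a, b))              # ~ 0, up to floating-point error

# 2) Closest-point property: every other point t*u on the line is at least as far from v.
for t in np.linspace(-2.0, 2.0, 9):
    assert np.linalg.norm(v - t * u) >= np.linalg.norm(b) - 1e-12
print("closest-point check passed")
```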
