Solved – Intuition behind RKHS (Reproducing Kernel Hilbert Space)

intuition · kernel trick · machine learning · mathematical-statistics

Why has the Reproducing Kernel Hilbert Space (RKHS) become such an important concept in machine learning in recent times? Is it because it allows us to represent a function as a combination of linear functions?

What areas of mathematics does one need to cover before understanding RKHS?

Best Answer

As the name says, a reproducing kernel Hilbert space is a Hilbert space, so some knowledge of Hilbert spaces/functional analysis comes in handy. But you might as well start with RKHS theory, and then see what you do not understand and what you need to read to cover that.

The usual example of a Hilbert space, $L_2$, has the problem that its members are not functions, but equivalence classes of functions that coincide except on a set of (Lebesgue) measure zero. Such functions always give the same results when integrated, and integration is what $L_2$ spaces are used for. But members of $L_2$ spaces cannot really be evaluated at a point, since you can change the value at a single point without changing the value of any integral.
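To make that concrete, here is a standard example on $[0,1]$: take
$$ f(x) = 0, \qquad g(x) = \begin{cases} 1 & x = \tfrac12 \\ 0 & \text{otherwise} \end{cases}, \qquad \int_0^1 |f - g|^2 \, dx = 0, $$
so $f$ and $g$ are the same element of $L_2[0,1]$ even though $f(\tfrac12) \ne g(\tfrac12)$; "the value at $\tfrac12$" is simply not a well-defined property of that element.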

So in applications where you really want functions that you can evaluate at individual points (as in approximation theory, regression, ...), an RKHS comes in handy, because its defining property is equivalent to the requirement that the evaluation functional $$ E_x(f) = f(x) $$ is continuous in $f$ for each $x$. So you can evaluate the member functions, and replacing $f$ with a nearby function (nearby in the RKHS norm) changes the value at each point only a little. That is the intuition you asked for.
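One way to make "only a little" precise: by the Riesz representation theorem, continuity of $E_x$ gives an element $k_x$ of the space (the reproducing kernel at $x$) with $f(x) = \langle f, k_x \rangle$ for every $f$, and then by Cauchy–Schwarz
$$ |f(x) - g(x)| = |\langle f - g, k_x \rangle| \le \|f - g\| \, \|k_x\| = \|f - g\| \, \sqrt{k(x,x)}, $$
so functions that are close in the RKHS norm are close at every single point, not just in an averaged (integrated) sense.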