Orthogonalisation in Reproducing Kernel Hilbert Space (RKHS) and null space

Tags: reproducing-kernel-hilbert-spaces, stochastic-processes

Let $\mathcal{X}$ be a set equipped with a positive definite kernel
$K$ with values $K(x,\, x')$. Let $\mathcal{K}$ be the corresponding
RKHS, consider a closed linear subspace $\mathcal{F}$ of
$\mathcal{K}$, and let $\mathcal{G}$ be the orthogonal complement of
$\mathcal{F}$ in $\mathcal{K}$. Besides the inner product
$\langle\,, \,\rangle_{\mathcal{K}}$ we can define a new inner product
$\langle\,, \,\rangle_{\widetilde{\mathcal{K}}}$ on $\mathcal{K}$ by

$$
\left\langle h, \, h' \right\rangle_{\widetilde{\mathcal{K}}} :=
\left\langle Ph, \, Ph' \right\rangle_{\mathcal{K}}
$$

where $Ph$ denotes the orthogonal projection of $h$ onto $\mathcal{G}$. Then
$\mathcal{K}$ equipped with $\langle\,,
\,\rangle_{\widetilde{\mathcal{K}}}$
becomes a new RKHS with null
space $\mathcal{F}$. How can we relate the positive semi-definite
kernel $\widetilde{K}$ on $\mathcal{X}$ to the original kernel $K$?

Comments. In the RKHS framework for splines the original kernel is often taken
to be $K(x, \, x') = F(x,\,x') + G(x,\,x')$, where $F$ and $G$ are
positive semi-definite kernels whose RKHSs $\mathcal{F}$ and
$\mathcal{G}$ form a direct sum decomposition of $\mathcal{K}$; in this case
we simply have $\widetilde{K}(x, \, x') = G(x, \, x')$. The null
space $\mathcal{F}$ is then usually finite-dimensional. Rather than
starting from a kernel which is already a direct sum, I would like to start
from a general kernel, hence to achieve some kind of subtraction of
kernels. Of special interest is the case where $\mathcal{F}$ is
finite-dimensional, and an example with closed forms for $K(x, \, x')$
and $\widetilde{K}(x, \, x')$ would be great. The question relates to
universal Kriging applications where $\mathcal{X}$ is a domain in
$\mathbb{R}^d$, $\mathcal{F}$ is spanned by so-called trend
functions and $K$ is a kernel given in closed form: Whittle–Matérn,
squared exponential, …

Best Answer

The identity map of the Hilbert space $\mathscr{H}$ can be written as $I = P + (I-P)$, where $P$ and $I-P$ are the orthogonal projections onto your spaces $\mathscr{F}$ and $\mathscr{G}$. Now set $k_x(\cdot)=K(x,\cdot)$; then, by orthogonality,
$$ K(x,y) = \langle k_x,k_y\rangle = \langle Pk_x,Pk_y\rangle + \langle (I-P)k_x, (I-P)k_y\rangle, $$ and nothing holds you back from simply writing $$ \tilde K(x,y) = \langle (I-P)k_x, (I-P)k_y\rangle = K(x,y) - \langle Pk_x,Pk_y\rangle.$$ Note that the diagonal $\tilde K(x,x)$ is always non-negative, since projections reduce norms, i.e. $\lVert (I-P)k_x\rVert\leq \lVert k_x\rVert.$

Now to your concrete example. The orthogonal projection of the function $k_x$ on the complement of the space spanned by the constant function $1$ is: $$ \tilde{k}_x = k_x - \frac{\langle k_x,1\rangle}{\lVert 1 \rVert^2}1$$ and $$ \tilde K(x,y) = \langle \tilde{k}_x, \tilde{k}_y\rangle = \langle k_x - \frac{\langle k_x,1\rangle}{\lVert 1 \rVert^2}1, k_y - \frac{\langle k_y,1\rangle}{\lVert 1 \rVert^2}1\rangle = \langle {k}_x, {k}_y\rangle - \frac{\langle {k}_x, 1\rangle \langle {k}_y, 1\rangle}{\lVert 1\rVert^2}.$$
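As a sanity check, here is a minimal numerical sketch (my own construction, not part of the original answer). On a finite grid $\{x_1,\dots,x_n\}$ the RKHS of an invertible kernel matrix $K$ has inner product $\langle f, g\rangle = f^\top K^{-1} g$, the representers $k_{x_i}$ are the columns of $K$, and the reproducing property gives $\langle k_x, 1\rangle = 1$, so the deflation above becomes the rank-one subtraction $\widetilde{K} = K - \mathbf{1}\mathbf{1}^\top/(\mathbf{1}^\top K^{-1}\mathbf{1})$. The grid, kernel and lengthscale are arbitrary choices for illustration:

```python
import numpy as np

# Finite-grid sketch: on X = {x_1, ..., x_n} functions are vectors and the
# RKHS inner product of an invertible kernel matrix K is <f, g> = f' K^{-1} g.
x = np.linspace(0.0, 1.0, 10)

# Matern-1/2 (exponential) kernel: closed form and well conditioned.
K = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.5)
Kinv = np.linalg.inv(K)
one = np.ones(len(x))  # the constant trend function spanning F

# Reproducing property: <k_x, 1> = 1(x) = 1, and ||1||^2 = 1' K^{-1} 1,
# so K~(x_i, x_j) = K(x_i, x_j) - 1 / (1' K^{-1} 1): a rank-one subtraction.
Kt = K - np.outer(one, one) / (one @ Kinv @ one)

# The deflated representers k~_x = (I - P) k_x (the columns of Kt) are
# orthogonal to the constant function in the original inner product ...
print(np.max(np.abs(one @ Kinv @ Kt)))   # ~0
# ... and Kt is still a Gram matrix, hence positive semi-definite.
print(np.min(np.linalg.eigvalsh(Kt)))    # >= 0 up to round-off
```

Note that $\widetilde K$ is singular even when $K$ is not: the deflated representers all lie in a hyperplane, reflecting the one-dimensional null space.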

Assume $\tau=1$ for simplicity and note that the inner product of your space is $$ \langle f,g \rangle = f(0)g(0) + f(T)g(T) + \int_0^T f(t)g(t)\,dt + \int_0^T f'(t)g'(t)\,dt $$ which means the two inner products can be evaluated using $$ \langle k_x,1 \rangle = k_x(0) + k_x(T) + \int_0^T k_x(t)\,dt.$$

By using Gram–Schmidt (or, equivalently, by inverting the Gram matrix of a basis), the above extends to projections onto any finite-dimensional subspace.
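Concretely, if $\mathscr{F}$ is spanned by $f_1,\dots,f_m \in \mathscr{H}$ with Gram matrix $G_{ij} = \langle f_i, f_j\rangle$, the reproducing property gives $\langle k_x, f_i\rangle = f_i(x)$, so $\tilde K(x,y) = K(x,y) - f(x)^\top G^{-1} f(y)$ with $f(x) = (f_1(x),\dots,f_m(x))^\top$. Here is a sketch of this on the same finite grid as above (my own illustration, with the trend functions $1$ and $x$ chosen arbitrarily): the basis matrix $F$ has columns $f_i$, so $G = F^\top K^{-1} F$ and the deflated kernel matrix is $\widetilde K = K - F G^{-1} F^\top$.

```python
import numpy as np

# Finite-grid sketch with a two-dimensional null space spanned by the
# (illustrative) Kriging trend functions 1 and x.
x = np.linspace(0.0, 1.0, 10)
K = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.5)  # Matern-1/2 kernel
Kinv = np.linalg.inv(K)

F = np.stack([np.ones_like(x), x], axis=1)  # n x m basis matrix, columns f_i
G = F.T @ Kinv @ F                          # Gram matrix G_ij = <f_i, f_j>

# K~(x, y) = K(x, y) - f(x)' G^{-1} f(y), so on the grid:
Kt = K - F @ np.linalg.solve(G, F.T)

# Every deflated representer is orthogonal to both trend functions ...
print(np.max(np.abs(F.T @ Kinv @ Kt)))  # ~0
# ... and Kt remains positive semi-definite.
print(np.min(np.linalg.eigvalsh(Kt)))   # >= 0 up to round-off
```

The $m = 1$ case with $f_1 = 1$ recovers exactly the rank-one formula above, since then $G = \lVert 1\rVert^2$ is a scalar.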
