[Math] Applying a linear operator to a Gaussian Process results in a Gaussian Process: Proof

Tags: normal-distribution, probability-theory, reference-request, stochastic-processes

In this paper, it is stated without proof or citation that "Differentiation is a linear operation, so the derivative of a Gaussian process remains a
Gaussian process". Intuitively this seems reasonable: a linear combination of Gaussian random variables is again Gaussian, and this is just an extension to the case where, instead of a vector-valued random variable, we have a random variable defined on a function space. But I cannot find a source with a proof, and the details of a proof elude me.

Proof Outline Let $x(t)\sim \mathcal{GP}(m(t),k(t, t^\prime))$ be a Gaussian process with mean function $m(t)$ and covariance function $k(t, t^\prime)$, and $\mathcal{L}$ a linear operator. For any vector $T=(t_1,…,t_n)$, let $x_T=(x(t_1),…,x(t_n))$. Then $x_T\sim \mathcal{N}(m_T,k_{T,T})$. Now consider the stochastic process $u(t)=\mathcal{L}x(t)$. It suffices to show that the finite dimensional distributions of $u(t)$ are Gaussian, but translating the action of the linear operator on $x(t)$ to the finite dimensional case is giving me trouble.
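One way to handle the finite-dimensional step in the differentiation case: for any finite $T=(t_1,\dots,t_n)$ and fixed $h>0$, the vector of difference quotients is a linear image of a Gaussian vector, hence Gaussian, and Gaussianity is preserved under limits in distribution (assuming mean-square differentiability). A sketch:

$$
v_T=\underbrace{\frac{1}{h}\begin{pmatrix}-I_n & I_n\end{pmatrix}}_{A}\begin{pmatrix}x_T\\ x_{T+h}\end{pmatrix},\qquad
\begin{pmatrix}x_T\\ x_{T+h}\end{pmatrix}\sim\mathcal{N}\!\left(\begin{pmatrix}m_T\\ m_{T+h}\end{pmatrix},\begin{pmatrix}k_{T,T} & k_{T,T+h}\\ k_{T+h,T} & k_{T+h,T+h}\end{pmatrix}\right),
$$

so $v_T\sim\mathcal{N}(A\mu, AKA^\top)$, and a limit in distribution of Gaussian vectors whose means and covariances converge is again Gaussian.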

In the case of differentiation, we have $u(t)=\mathcal{L}x(t)=\frac{dx}{dt}=\lim_{h\rightarrow 0}\frac{x(t+h)-x(t)}{h}$. For all $h>0$, the random variable $v(t)=\frac{x(t+h)-x(t)}{h}$ is normal, and by interchanging expectation and the limit, we have

$$
\begin{array}{rcl}
m_u(t)&=&E\left(\lim\limits_{h\to 0}\frac{x(t+h)-x(t)}{h}\right)\\
&=&\lim\limits_{h\to 0}E\left( \frac{x(t+h)-x(t)}{h}\right)\\
&=&\lim\limits_{h\to 0}\frac{m(t+h)-m(t)}{h}\\
&=&m^\prime(t)
\end{array}$$

Of course, we need to verify when this interchange is appropriate. Similarly, we can intuit the covariance function of $u(t)$ has the form

$$
k_u(t,t^\prime)=\frac{\partial^2}{\partial t\,\partial t^\prime }k(t,t^\prime)
$$

but I am having a hard time making the leap from finite approximations to the infinite-dimensional case.
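One way to make the covariance heuristic concrete: the covariance of the difference quotients is a second-order difference of $k$, which converges to the mixed partial when $k$ is twice differentiable. A sketch:

$$
\operatorname{Cov}\!\left(\frac{x(t+h)-x(t)}{h},\,\frac{x(t^\prime+h)-x(t^\prime)}{h}\right)
=\frac{k(t+h,t^\prime+h)-k(t+h,t^\prime)-k(t,t^\prime+h)+k(t,t^\prime)}{h^2}
\;\xrightarrow[h\to 0]{}\;\frac{\partial^2 k}{\partial t\,\partial t^\prime}(t,t^\prime).
$$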

Reference Request If there is any textbook or paper that does more than mention this fact in passing, please let me know.

Best Answer

In the paper Ghosal, Subhashis; Roy, Anindya, Posterior consistency of Gaussian process prior for nonparametric binary regression, Ann. Stat. 34, No. 5, 2413-2429 (2006). ZBL1106.62039., part of the result of Theorem 5 states that

For a Gaussian process η(·) with differentiable sample paths having mixed partial derivatives up to order α, the successive derivative processes $D^w \eta(\cdot)$ are also Gaussian with continuous sample paths. Moreover, the derivative processes are sub-Gaussian with respect to a constant multiple of the Euclidean distance.
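As a numerical sanity check of the derivative-kernel formula, here is a minimal sketch: it assumes a squared-exponential kernel with unit length-scale (an illustrative choice, not from the question), samples the process jointly at points $t$ and $t+h$, forms the difference quotients, and compares their empirical covariance to the analytic mixed partial $\partial^2 k/\partial t\,\partial t^\prime$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative squared-exponential kernel with unit length-scale.
def k(t, s):
    return np.exp(-0.5 * (t - s) ** 2)

# Analytic mixed partial d^2 k / (dt dt') for this kernel:
# with r = t - s it equals (1 - r^2) * exp(-r^2 / 2).
def dk2(t, s):
    r = t - s
    return (1.0 - r ** 2) * np.exp(-0.5 * r ** 2)

t_pts = np.array([0.3, 1.1])  # arbitrary evaluation points
h = 1e-3                      # finite-difference step

# Joint Gaussian vector (x(t1), x(t2), x(t1+h), x(t2+h)).
grid = np.concatenate([t_pts, t_pts + h])
K = k(grid[:, None], grid[None, :]) + 1e-10 * np.eye(4)  # jitter for stability

x = rng.multivariate_normal(np.zeros(4), K, size=200_000, method="cholesky")

# Difference quotients approximate the derivative process u(t).
v = (x[:, 2:] - x[:, :2]) / h

emp = np.cov(v, rowvar=False)              # empirical covariance of u at t_pts
ana = dk2(t_pts[:, None], t_pts[None, :])  # d^2 k / (dt dt') at the same points
print(np.round(emp, 2))
print(np.round(ana, 2))
```

With this many samples the two matrices should agree to roughly two decimal places, consistent with the derivative process being Gaussian with covariance $\partial^2 k/\partial t\,\partial t^\prime$.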
