Probability Theory – Understanding L2 Limit of Gaussian Random Vectors


I am working on an exercise stated as follows:

Suppose the process $X_{t}$ is a Gaussian process, and let $H$ be the Hilbert space generated by $(X_{t})_{t\in\mathbb{R}}$, that is, the space consisting of $L^{2}-$limits of linear combinations of values of $X_{t}$. Prove that every element in $H$ is a Gaussian random variable.

The solution of the exercise said that:

It suffices to prove the $L^{2}-$limit of Gaussian random vectors is still a Gaussian random vector. That is, if $\vec{X}_{t}$, $t=1,2,\cdots$ is a sequence of Gaussian random vectors with mean $\vec{\mu}_{t}$ and covariance matrix $\Sigma_{t}$ such that $\vec{X}_{t}\longrightarrow \vec{X}_{\infty}$ in $L^{2}$, then $\vec{X}_{\infty}$ is a Gaussian random vector.

Then it proved as follows:

Recall that the convergence in $L^{2}$ of $\vec{X}_{t}$ to $\vec{X}_{\infty}$ implies that $\vec{\mu}_{t}=\mathbb{E}\vec{X}_{t}$ converges to $\vec{\mu}_{\infty}=\mathbb{E}\vec{X}_{\infty}$, and that the covariance matrices $\Sigma_{t}$ converge element-wise to the corresponding covariance matrix $\Sigma_{\infty}$. Further, the $L^{2}-$convergence implies the corresponding convergence in probability, and hence by bounded convergence we have
$$\varphi_{\vec{X}_{t}}(\vec{\theta})\longrightarrow \varphi_{\vec{X}_{\infty}}(\vec{\theta})\ \text{for each}\ \vec{\theta}\in\mathbb{R}^{d}.$$
Since
$$\varphi_{\vec{X}_{t}}(\vec{\theta})=e^{i\langle\vec{\theta},\vec{\mu}_{t}\rangle}e^{-\frac{1}{2}\langle\vec{\theta},\Sigma_{t}\vec{\theta}\rangle}$$
for any $t<\infty$, it follows that the same formula holds for $t=\infty$.

It is a well-known fact of linear algebra that the element-wise limit $\Sigma_{\infty}$ of positive semi-definite matrices $\Sigma_{t}$ is necessarily also positive semi-definite.

Thus, in view of the definition of a Gaussian random vector via its characteristic function, we see that the limit $\vec{X}_{\infty}$ is a Gaussian random vector whose parameters are the limits of the corresponding parameters of $\vec{X}_{t}$.
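
For reference, my own unpacking of the first claim (as far as I can tell) goes like this: by Jensen and Cauchy–Schwarz,
$$\big\|\vec{\mu}_{t}-\vec{\mu}_{\infty}\big\|=\big\|\mathbb{E}\big(\vec{X}_{t}-\vec{X}_{\infty}\big)\big\|\le\mathbb{E}\big\|\vec{X}_{t}-\vec{X}_{\infty}\big\|\le\Big(\mathbb{E}\big\|\vec{X}_{t}-\vec{X}_{\infty}\big\|^{2}\Big)^{1/2}\longrightarrow 0,$$
and similarly each entry $\mathbb{E}[X_{t,i}X_{t,j}]$ converges to $\mathbb{E}[X_{\infty,i}X_{\infty,j}]$ by Cauchy–Schwarz, so $\Sigma_{t}\to\Sigma_{\infty}$ element-wise; finally, $\big|e^{i\langle\vec{\theta},\vec{x}\rangle}\big|\le 1$, so convergence in probability of $\vec{X}_{t}$ together with bounded convergence gives $\varphi_{\vec{X}_{t}}(\vec{\theta})\to\varphi_{\vec{X}_{\infty}}(\vec{\theta})$.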

I have three questions about this solution:

Firstly, why does it suffice to show that the $L^{2}-$limit of Gaussian random vectors is still a Gaussian random vector? Is that because we then know that $\langle b,\vec{X}_{\infty}\rangle$ is a Gaussian random variable for all $b$? But where does $b$ live? In $\mathbb{R}^{\infty}$?

Secondly, in the proof, why does $\varphi_{\vec{X}_{t}}(\vec{\theta})\longrightarrow \varphi_{\vec{X}_{\infty}}(\vec{\theta})$ hold for all $\vec{\theta}\in\mathbb{R}^{d}$? If we take the limit, shouldn't it be $\mathbb{R}^{\infty}$?

I believe the first and second questions both come down to one point: how to understand the limit of Gaussian vectors.

So this proof basically shows that the number of entries of the Gaussian vector $\vec{X}_{t}$ does not increase; it is always an $\mathbb{R}^{d}-$valued random vector.

But why and how? If we pass to the limit, why do the entries not change? So the limit is not of the form $\vec{X}_{t}:=(Y_{1},\cdots, Y_{t})\longrightarrow(Y_{1},\cdots, Y_{\infty})=\vec{X}_{\infty}$?

How should one understand Gaussian random vectors and Gaussian processes?

Thank you!

Best Answer

Okay... I think your confusion mostly stems from a bad choice of notation and... maybe a slightly confusing formulation.

Let $(X_t)_{t\geq 0}$ be your Gaussian process. Then, you can consider the vector space

$$ \mathcal{H}_0=\left\{\sum_{i=1}^{M} a_i X_{t_i}|\;M\in \mathbb{N},a_i\in \mathbb{R}, t_i\geq 0\right\} $$ with the $L^2$-norm. That's a subspace of $L^2$, since all of these variables are Gaussian. Then, take the completion $\mathcal{H}=\overline{\mathcal{H}_0}$. The statement is that every $Y\in \mathcal{H}$ is Gaussian. As an even stronger statement, we get that any finite collection $(Y_i)_{1\leq i\leq N}\subseteq \mathcal{H}$ is jointly Gaussian. This is really what your author is proving.
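
Spelling this out (under the usual definition of a Gaussian process, namely that all finite-dimensional distributions are jointly Gaussian), each element of $\mathcal{H}_0$ satisfies
$$\sum_{i=1}^{M} a_i X_{t_i}\ \sim\ \mathcal{N}\!\left(\sum_{i=1}^{M} a_i\,\mathbb{E}X_{t_i},\ \sum_{i,j=1}^{M} a_i a_j \operatorname{Cov}\big(X_{t_i},X_{t_j}\big)\right),$$
so in particular it has a finite second moment and lies in $L^2$.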

Indeed, let $Z_{i,n}\in \mathcal{H}_0$ be such that $\lim_{n\to\infty} Z_{i,n}=Y_i$ in $L^2$. Then each vector $(Z_{i,n})_{1\leq i\leq N}$ is Gaussian and converges to $(Y_i)_{1\leq i\leq N}$ in $L^2$. The dimension here does not explode; it stays fixed. Therefore, it suffices for your author to consider sequences of Gaussian vectors of a fixed finite dimension, because all they want to prove is that $(Y_i)_{1\leq i\leq N}$ is a Gaussian vector. This should answer both of your questions.
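
Concretely, this is just the quoted argument run in the fixed dimension $N$: each $Z_{i,n}$ is a finite linear combination of values of the process, so $(Z_{1,n},\dots,Z_{N,n})$ is a linear image of a jointly Gaussian vector and hence Gaussian, with some mean $\vec{\mu}_{n}$ and covariance $\Sigma_{n}$, and for every fixed $\vec{\theta}\in\mathbb{R}^{N}$
$$\varphi_{(Y_1,\dots,Y_N)}(\vec{\theta})=\lim_{n\to\infty}\exp\!\Big(i\langle\vec{\theta},\vec{\mu}_{n}\rangle-\tfrac{1}{2}\langle\vec{\theta},\Sigma_{n}\vec{\theta}\rangle\Big)=\exp\!\Big(i\langle\vec{\theta},\vec{\mu}_{\infty}\rangle-\tfrac{1}{2}\langle\vec{\theta},\Sigma_{\infty}\vec{\theta}\rangle\Big),$$
where $\vec{\mu}_{\infty}$ and $\Sigma_{\infty}$ are the mean and covariance of $(Y_i)_{1\leq i\leq N}$. Throughout, $\vec{\theta}$ ranges over $\mathbb{R}^{N}$, never over $\mathbb{R}^{\infty}$.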
