Gaussian process uniquely determined by mean and covariance function

Tags: normal-distribution, probability, probability-theory, stochastic-processes

I am studying stochastic processes, and at some point the notion of a Gaussian process appears. The definition is clear; however, when it is said that the process is uniquely determined by its mean function and covariance function, I don't understand the notation that is used.

Let me detail the reasoning that led to my confusion:

Let $X = (X_t)_{t\in\mathbb{T}}$ be a Gaussian process, meaning that every finite linear combination of its coordinates, $Z = \sum_{i=1}^{n}\lambda_i X_{t_i}$, is a Gaussian random variable. It can be shown that $\varphi_{Z}(u)=\mathbb{E}[e^{iuZ}] = e^{iu\mathbb{E}[Z] - \frac{u^2}{2}\mathbb{V}[Z]}$, and since $\mathbb{E}[Z]$ and $\mathbb{V}[Z]$ can be written in terms of the mean function $m_X$ and the covariance function $\gamma_X$, the law of $Z$ is determined by $m_X$ and $\gamma_X$. As this holds for every finite linear combination of coordinates, the law of the Gaussian process is fully determined by its mean and covariance.
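To make this concrete, here is a small numerical sketch (using NumPy, with arbitrary illustrative choices of times $t_i$, weights $\lambda_i$, mean function, and covariance kernel $\gamma(s,t)=e^{-|s-t|}$) checking that the mean and variance of $Z = \sum_i \lambda_i X_{t_i}$ are $\lambda^\top m$ and $\lambda^\top \Gamma \lambda$, i.e. determined by $m$ and $\gamma$ alone:

```python
import numpy as np

# Arbitrary illustrative choices: three times, weights, a mean function,
# and a covariance function gamma(s, t) = exp(-|s - t|).
ts = np.array([0.0, 0.5, 2.0])
lam = np.array([1.0, -2.0, 0.5])                     # weights lambda_i
m = np.sin(ts)                                       # mean vector m(t_i)
Gamma = np.exp(-np.abs(ts[:, None] - ts[None, :]))   # covariance matrix gamma(t_i, t_j)

# Mean and variance of Z = sum_i lambda_i X_{t_i}, read off from m and gamma:
mean_Z = lam @ m
var_Z = lam @ Gamma @ lam

# Monte Carlo check: sample the finite-dimensional marginal of the process
# and compare empirical moments of Z with the formulas above.
rng = np.random.default_rng(0)
X = rng.multivariate_normal(m, Gamma, size=200_000)  # rows ~ (X_{t_1}, X_{t_2}, X_{t_3})
Z = X @ lam
print(mean_Z, Z.mean())  # should agree up to Monte Carlo error
print(var_Z, Z.var())    # should agree up to Monte Carlo error
```

So the law of any such $Z$ is pinned down by the vector $m$ and the matrix $\Gamma$, which are just $m_X$ and $\gamma_X$ evaluated at the chosen times.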

Up to this point, everything is clear. But then they use the notation $X\sim\mathcal{N}(m, \gamma)$ to mean that $X$ is a Gaussian process with mean $m$ and covariance $\gamma$. I don't understand this notation well, since $m$ and $\gamma$ are functions (of $t$) and $X$ is an infinite collection of random variables.

Thank you a lot!

Best Answer

Indeed, $m$ and $\gamma$ are functions. More precisely, $m\colon\mathbb T\to\mathbb R$ is defined for $t\in\mathbb T$ as $m(t)=\mathbb E\left[X_t\right]$ and $\gamma\colon \mathbb T\times \mathbb T\to\mathbb R$ by $\gamma\left(t,t'\right)=\operatorname{Cov}\left(X_t,X_{t'}\right)$.

When $\mathbb T$ is finite, $m$ is simply represented as a vector and $\gamma$ as a matrix, which recovers the familiar multivariate normal notation.
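As an illustration of that last point, here is a short sketch (arbitrary choices: a grid of 50 times, zero mean, and a squared-exponential kernel) in which the function $m$ becomes a vector, the function $\gamma$ becomes a matrix, and $X\sim\mathcal{N}(m,\gamma)$ restricted to the grid is an ordinary multivariate normal that we can sample from:

```python
import numpy as np

ts = np.linspace(0.0, 1.0, 50)  # finite grid of times in T

def m(t):
    """Mean function m(t) = E[X_t] (zero mean, an illustrative choice)."""
    return np.zeros_like(t)

def gamma(s, t):
    """Covariance function gamma(s, t) (squared-exponential kernel, illustrative)."""
    return np.exp(-0.5 * (s - t) ** 2 / 0.1 ** 2)

mean_vec = m(ts)                            # m restricted to the grid: a vector
cov_mat = gamma(ts[:, None], ts[None, :])   # gamma restricted to the grid: a matrix
cov_mat += 1e-9 * np.eye(len(ts))           # tiny jitter for numerical stability

# The finite-dimensional marginal is the multivariate normal N(mean_vec, cov_mat):
rng = np.random.default_rng(1)
paths = rng.multivariate_normal(mean_vec, cov_mat, size=3)
print(paths.shape)  # three sampled paths, one value per grid point
```

Each row of `paths` is one draw of $(X_{t_1},\dots,X_{t_{50}})$, and its law is entirely specified by `mean_vec` and `cov_mat`.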
