[Math] Convergence in distribution of Gaussian processes

convergence-divergence, probability, probability-distributions, probability-theory, stochastic-processes

Assume we are given a sequence $(W_n)$ of Gaussian processes indexed by, say, $\mathbb{R}^p$, with mean zero and covariance functions $R_n$. This means that for each $n$, the finite-dimensional distributions of $W_n$ are multivariate Gaussian with mean zero, and for each $x,y\in\mathbb{R}^p$, $\textrm{Cov}(W_n(x),W_n(y))=R_n(x,y)$.
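Concretely, for any finite collection of points $x_1,\dots,x_k\in\mathbb{R}^p$, this says that $$ (W_n(x_1),\dots,W_n(x_k)) \sim N(0,\Sigma_n), \qquad (\Sigma_n)_{ij} = R_n(x_i,x_j), $$ where $\Sigma_n$ is the corresponding $k\times k$ covariance matrix.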

A priori, each $W_n$ takes its values merely in the space of functions from $\mathbb{R}^p$ to $\mathbb{R}$. To ensure sufficient regularity, assume for definiteness that each $W_n$ in fact takes its values in some Banach space $B$. This could for example be the space of bounded functions, or the space of continuous functions.

My question is this: What are sufficient criteria for weak convergence of the sequence $(W_n)$?

A few comments: As projections onto finitely many coordinates are generally continuous, it is clear that weak convergence of $W_n$ implies convergence of the finite-dimensional distributions. This also means that the candidate limit distribution is uniquely determined by the limits of the finite-dimensional distributions of $W_n$. Therefore, the candidate weak limit $W$ will itself have to be a Gaussian process, taking its values in the same Banach space $B$ as $(W_n)$. What is required for weak convergence of $W_n$ is some notion of tightness, which ideally should be expressed in terms of the relationship between the covariance functions $R_n$ and their limit.
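For instance, if $B$ is one of the function spaces above equipped with the supremum norm, then for fixed $x_1,\dots,x_k$ the projection $$ \pi_{x_1,\dots,x_k}\colon B\to\mathbb{R}^k, \qquad f\mapsto (f(x_1),\dots,f(x_k)), $$ is continuous, so the continuous mapping theorem gives $\pi_{x_1,\dots,x_k}(W_n)\Rightarrow\pi_{x_1,\dots,x_k}(W)$ whenever $W_n\Rightarrow W$ in $B$.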

One might ask why weak convergence, rather than mere convergence of finite-dimensional distributions, is of interest: my own main interest is to ensure convergence of functionals such as the supremum as well, and for this, convergence of finite-dimensional distributions does not suffice in general.
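To illustrate (assuming again that $B$ consists of bounded functions with the supremum norm): the functional $f\mapsto\sup_{x\in\mathbb{R}^p}f(x)$ is $1$-Lipschitz, hence continuous, so weak convergence $W_n\Rightarrow W$ in $B$ yields $$ \sup_{x\in\mathbb{R}^p} W_n(x) \;\Rightarrow\; \sup_{x\in\mathbb{R}^p} W(x) $$ by the continuous mapping theorem, whereas convergence of the finite-dimensional distributions alone gives no control over the supremum.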

Best Answer

The Kolmogorov-Chentsov criterion can be helpful in the continuous case: Let $(X^n)_{n \in \mathbb{N}}$ be a sequence of continuous processes indexed by $\mathbb{R}^d$ with values in a complete separable metric space $(S, \rho)$. If $(X_0^n)_{n \in \mathbb{N}}$ is tight and there exist constants $a,b,K > 0$, independent of $n$, such that $$ E[\rho(X_s^n, X_t^n)^a] \leq K |s-t|^{d+b}, $$ then $(X^n)$ is tight in $C(\mathbb{R}^d, S)$. Moreover, the limiting process is almost surely locally Hölder continuous for every exponent in $(0, \frac{b}{a})$.

Intuitively, you should bound the moments of $\rho(X_s^n, X_t^n)$ in terms of the covariance functions in your sequence. At least in the Brownian case I'm sure you can do that.
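To spell the hint out in the Gaussian setting (a sketch, with index set $\mathbb{R}^p$, so $d = p$ above, $S = \mathbb{R}$ and $\rho$ the usual distance): the increment $W_n(s)-W_n(t)$ is centered Gaussian, so all of its moments are determined by its variance, $$ E\big[(W_n(s)-W_n(t))^2\big] = R_n(s,s) - 2R_n(s,t) + R_n(t,t), \qquad E\big[|W_n(s)-W_n(t)|^{2m}\big] = \frac{(2m)!}{2^m m!}\Big(E\big[(W_n(s)-W_n(t))^2\big]\Big)^m. $$ Hence, if one assumes a uniform Hölder-type bound on the covariances, say $R_n(s,s) - 2R_n(s,t) + R_n(t,t) \leq C|s-t|^{\gamma}$ for all $n$, then choosing $m > p/\gamma$ gives the moment condition with $a = 2m$ and $b = m\gamma - p > 0$; tightness of $(W_n(0))$ only requires $\sup_n R_n(0,0) < \infty$. For Brownian motion ($p=1$, $R(s,t) = s \wedge t$) one has $E[(W_s - W_t)^2] = |s-t|$, so $m = 2$ works, with $a = 4$, $b = 1$ and $E[(W_s - W_t)^4] = 3|s-t|^2$.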

You can see the details in Kallenberg's "Foundations of Modern Probability", Theorem 16.9.
