Convergence in Probability and $L_2$ for Normal Random Variables

Tags: convergence, normal distribution, probability

In an answer here: Convergence of identically distributed normal random variables, the following lemma is mentioned:

Lemma: Let $X_1, X_2, \ldots$ be a sequence of zero-mean normal random
variables defined on the same space with variances $\sigma^2_n$. Then,
$X_n \to X_\infty$ both in probability and in $L_2$ (and hence also in
distribution) if and only if $\sigma^2_n \to \sigma^2 < \infty$. In
this case, the limit $X_\infty$ is also normally distributed with mean
zero and variance $\sigma^2$.

I'd like to find out more about this lemma. Does it have a name?

Best Answer

Unfortunately, the quoted statement in the question is a muddled version of the one I originally intended. Thank you for catching this. I've also updated the statement in the original question. A counterexample to the former is given at the end of this answer.

Here is the intended statement:

Lemma: Let $X_1,X_2,\ldots$ be a sequence of zero-mean normal random variables defined on the same space with variances $\sigma_n^2$. Then, $X_n \to X_\infty$ in probability if and only if $X_n \xrightarrow{\,L_2\,} X_\infty$, in which case $X_\infty \sim \mathcal N(0,\sigma^2)$ where $\sigma^2 = \lim_{n\to\infty} \sigma_n^2$.

Remark: The main points here are that (a) we can "upgrade" convergence in probability to $L_2$ convergence in the case of sequences of normals, (b) we are guaranteed that the distribution of the limit is normal (which is not otherwise obvious) and (c) we get both of the above without specifying anything about the joint distributions of the elements in the sequence.
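As a quick numerical illustration of the remark, here is a minimal sketch under one concrete coupling, $X_n = \sigma_n Z$ for a single fixed standard normal $Z$. The coupling is my own choice for illustration; the lemma itself places no constraint on the joint law.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical coupling, for illustration only (the lemma assumes nothing
# about joint distributions): X_n = sigma_n * Z for one fixed Z ~ N(0, 1),
# so X_n -> Z in probability as sigma_n -> 1.
z = rng.standard_normal(200_000)           # draws of X_infinity = Z

for n in (1, 10, 100, 1000):
    sigma_n = 1.0 + 1.0 / n                # sigma_n^2 -> 1
    x_n = sigma_n * z                      # X_n ~ N(0, sigma_n^2)
    l2 = np.sqrt(np.mean((x_n - z) ** 2))  # empirical L2 distance; exact value is 1/n
    print(f"n={n:4d}  sigma_n^2={sigma_n**2:.4f}  ||X_n - Z||_2 ~= {l2:.4f}")
```

Under this coupling the exact $L_2$ distance is $\|X_n - Z\|_{L_2} = |\sigma_n - 1| = 1/n$, which the empirical values track.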

Proof (sketch): One direction is easy: convergence in $L_2$ always implies convergence in probability. For the other direction: if $X_n \to X_\infty$ in probability, then $$\varphi_n(t) = \mathbb E e^{it X_n} \to \mathbb E e^{it X_{\infty}}$$ by dominated convergence. But $\varphi_n(t) = e^{-t^2 \sigma_n^2 / 2}$, and since the pointwise limit is a characteristic function, hence continuous at $t = 0$, the variances $\sigma_n^2$ must converge to some finite $\sigma^2$ (divergent $\sigma_n^2$ would force the limit to be $\mathbb 1\{t = 0\}$, which is not continuous). In particular $\sup_n \sigma_n^2 < \infty$, and hence the collection $\{X_n^2\}$ is uniformly integrable, as spelled out below. Thus, $X_n \xrightarrow{\,L_2\,} X_\infty$. This also identifies the limit: $\mathbb E e^{it X_\infty} = e^{-t^2 \sigma^2 / 2}$, which is the characteristic function of a normally distributed random variable with mean zero and variance $\sigma^2$.
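To spell out the uniform integrability step (a standard bound, using $X_n^2 \,\mathbb 1\{X_n^2 > K\} \le X_n^4 / K$ together with the Gaussian moment identity $\mathbb E X_n^4 = 3 \sigma_n^4$):
$$\sup_n \mathbb E\left[X_n^2 \,\mathbb 1\{X_n^2 > K\}\right] \le \frac{\sup_n \mathbb E X_n^4}{K} = \frac{3 \sup_n \sigma_n^4}{K} \longrightarrow 0 \quad \text{as } K \to \infty.$$
Uniform integrability of $\{X_n^2\}$ plus convergence in probability then yields $L_2$ convergence by the Vitali convergence theorem.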

Notes

The convergence of the sequence $\sigma_n^2$ and the fact that the limit $X_\infty$ is normally distributed are part of the conclusion, not the hypotheses. By exactly the same argument, we can replace $L_2$ convergence with the more general $L_p$ convergence: the variance determines the distribution here and all moments of a normal are finite, so $\{|X_n|^p\}$ is also uniformly integrable.

From this it is clear that we have the following weaker result on convergence in distribution, which is well-known and given as an exercise in some probability textbooks.

Lemma: Let $X_1,X_2,\ldots$ be a sequence of normal random variables with $X_n \sim \mathcal N(\mu_n, \sigma_n^2)$. Then, $X_n \to X_\infty$ in distribution if and only if $\mu_n \to \mu$ and $\sigma_n^2 \to \sigma^2$ for some finite $\mu$ and $\sigma^2$, in which case $X_\infty \sim \mathcal N(\mu,\sigma^2)$.
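A sketch of the "if" direction, again via characteristic functions: if $\mu_n \to \mu$ and $\sigma_n^2 \to \sigma^2$, then
$$\varphi_n(t) = e^{i \mu_n t - \sigma_n^2 t^2 / 2} \longrightarrow e^{i \mu t - \sigma^2 t^2 / 2} \quad \text{for every } t,$$
and the limit is the characteristic function of $\mathcal N(\mu, \sigma^2)$, so Lévy's continuity theorem gives convergence in distribution. Conversely, if $X_n \to X_\infty$ in distribution, then $\varphi_n \to \varphi_{X_\infty}$ pointwise; continuity of $\varphi_{X_\infty}$ at $t = 0$ rules out $\sigma_n^2 \to \infty$, and one can check that convergence of $e^{i\mu_n t}$ for every $t$ then forces $\mu_n$ to converge as well.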

A nice application of the second lemma is to consider the marginal distribution of the Riemann integral of Brownian motion, $$ I_t = \int_0^t B_s \, \mathrm{d} s \> . $$ By considering the Riemann sums and using the second lemma, we see that $I_t \sim \mathcal N(0, t^3/3)$.
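The variance can also be checked directly via Fubini and $\operatorname{Cov}(B_s, B_u) = \min(s,u)$: $\operatorname{Var}(I_t) = \int_0^t \int_0^t \min(s,u) \, \mathrm{d}u \, \mathrm{d}s = t^3/3$. Here is a minimal Monte Carlo sketch of this application (Python/NumPy; the grid size, path count, and seed are arbitrary choices of mine, not from the original):

```python
import numpy as np

rng = np.random.default_rng(42)

t, n_steps, n_paths = 2.0, 500, 20_000
dt = t / n_steps

# Brownian increments and paths B_s on a uniform grid of [0, t].
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
paths = np.cumsum(increments, axis=1)

# Riemann-sum approximation of I_t = \int_0^t B_s ds, one value per path.
i_t = paths.sum(axis=1) * dt

print("sample mean:     ", i_t.mean())  # ~ 0
print("sample variance: ", i_t.var())   # ~ t^3 / 3
print("t^3 / 3:         ", t**3 / 3)    # 8/3 for t = 2
```

The sample variance should land near $t^3/3 \approx 2.667$ up to Monte Carlo and discretization error.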


A counterexample to the quoted statement in the question can be found by considering $X_\infty \sim \mathcal N(0,1)$ and $X_n = (-1)^n X_\infty$. Here, $\sigma_n^2 = 1$ for all $n$ so $\sigma_n^2 \to 1$, but $X_n$ does not converge to $X_\infty$ in probability or $L_2$.
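To make the failure quantitative: $X_n - X_\infty = \big((-1)^n - 1\big) X_\infty$, so
$$\mathbb E (X_n - X_\infty)^2 = \big((-1)^n - 1\big)^2 = \begin{cases} 0, & n \text{ even}, \\ 4, & n \text{ odd}, \end{cases}$$
which does not tend to $0$; likewise $\mathbb P(|X_n - X_\infty| > \varepsilon) = \mathbb P(2|X_\infty| > \varepsilon) > 0$ along odd $n$.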
