[Math] Sequence of normal random variables converging in distribution

Tags: normal distribution, probability, probability theory, weak-convergence

Let $X_{n}$ be a sequence of normal random variables with mean $\mu_{n}$ and variance $\sigma^{2}_{n}$ for $n \geq 1$. Suppose $X_{n} \rightarrow X$ in distribution, where $X$ is not almost surely equal to any constant $c \in \mathbb{R}$. Show that there exist $\mu \in \mathbb{R}$ and $\sigma^{2} > 0$ such that $X$ is normally distributed with mean $\mu$ and variance $\sigma^{2}$. Does anyone have an idea how to prove this?

Best Answer

A way is to consider characteristic functions. The characteristic function of $X_n$ is $\varphi_n(t):=\exp\left(it\mu_n-\frac{t^2}{2}\sigma_n^2\right)$, and since $X_n\to X$ in distribution, $\varphi_n(t)\to\varphi_X(t)$ for every fixed $t$, where $\varphi_X$ denotes the characteristic function of $X$. We can follow these steps.
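
For completeness, here is a brief sketch of the two standard facts used in that sentence (nothing specific to this problem): the characteristic function of a $\mathcal N(\mu_n,\sigma_n^2)$ variable is
$$\varphi_n(t)=\mathbb E\left[e^{itX_n}\right]=\exp\left(it\mu_n-\frac{t^2}{2}\sigma_n^2\right),$$
and, since $x\mapsto\cos(tx)$ and $x\mapsto\sin(tx)$ are bounded and continuous, convergence in distribution gives $\varphi_n(t)=\mathbb E[\cos(tX_n)]+i\,\mathbb E[\sin(tX_n)]\to\mathbb E[\cos(tX)]+i\,\mathbb E[\sin(tX)]=\varphi_X(t)$ for every $t$.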

  1. The sequence $\left(\sigma_n^2\right)_{n\geqslant 1}$ is bounded. Otherwise, a subsequence would go to infinity; look at what would happen to $\left\lvert \varphi_n(t)\right\rvert$ along this subsequence (spelled out in the sketch after this list).

  2. Extract from $\left(\sigma_n^2\right)_{n\geqslant 1}$ a convergent subsequence (possible since it is bounded by step 1), so that we can assume without loss of generality that $\left(\sigma_n^2\right)_{n\geqslant 1}$ converges, say to $\sigma^2$. This is legitimate because the whole sequence $(X_n)$ already converges to $X$ in distribution, so it suffices to identify the law of $X$ along a subsequence.

  3. The sequence $\left(\mu_n\right)_{n\geqslant 1}$ is bounded: to see this, choose $a\lt b$ such that $\int_a^b\varphi_X(t)\,\mathrm dt\neq 0$ (possible since $\varphi_X$ is continuous and $\varphi_X(0)=1$) and compare $\int_a^b\varphi_n(t)\,\mathrm dt$ with its limit (see the sketch after this list).

  4. It remains to check that $\sigma^2 \neq 0$: if $\sigma^2 = 0$, then $\varphi_X(t)=e^{it\mu}$ for some $\mu$, so $X$ would be almost surely equal to the constant $\mu$, which the assumptions exclude.
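
Here is a rough sketch of the computations behind steps 1 and 3 and of how the pieces combine (the subsequence indices $n_k$ and the interval $[a,b]$ are just the notation from the hints above). For step 1: since $\left\lvert e^{it\mu_n}\right\rvert=1$,
$$\left\lvert\varphi_n(t)\right\rvert=\exp\left(-\frac{t^2}{2}\sigma_n^2\right),$$
so if $\sigma_{n_k}^2\to\infty$ along a subsequence, then $\lvert\varphi_{n_k}(t)\rvert\to 0$ for every $t\neq 0$, hence $\lvert\varphi_X(t)\rvert=0$ for all $t\neq 0$, contradicting the continuity of $\varphi_X$ at $0$, where $\varphi_X(0)=1$. For step 3: if $\lvert\mu_{n_k}\rvert\to\infty$ along a subsequence, then (using $\sigma_{n_k}^2\to\sigma^2$, so that $e^{-t^2\sigma_{n_k}^2/2}\to e^{-t^2\sigma^2/2}$ uniformly on $[a,b]$, together with the Riemann–Lebesgue lemma)
$$\int_a^b\varphi_{n_k}(t)\,\mathrm dt=\int_a^b e^{it\mu_{n_k}}\,e^{-\frac{t^2}{2}\sigma_{n_k}^2}\,\mathrm dt\longrightarrow 0,$$
whereas bounded convergence gives $\int_a^b\varphi_n(t)\,\mathrm dt\to\int_a^b\varphi_X(t)\,\mathrm dt\neq 0$, a contradiction. Finally, with $(\mu_n)$ bounded, pass to a further subsequence along which $\mu_{n_k}\to\mu$; then
$$\varphi_X(t)=\lim_{k\to\infty}\varphi_{n_k}(t)=\exp\left(it\mu-\frac{t^2}{2}\sigma^2\right),$$
so $X\sim\mathcal N(\mu,\sigma^2)$, with $\sigma^2>0$ by step 4.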
