[Math] Mean and Variance Convergence with r.v.

normal-distribution · probability-theory · random-variables · weak-convergence

Let $(X_n)_{n\ge 1}$ be a sequence of Gaussian random variables with respective means $\mu_n \in \mathbb{R}$ and variances $\sigma_n^2 > 0$. Prove that if $X_n$ converges in distribution, then $(\mu_n)$ and $(\sigma_n^2)$ must converge, and identify the limiting distribution.

Best Answer

Hints. Since $(X_n)_n$ converges in distribution (to some random variable $X$), the family of distributions $\{\mathbb{P}_{X_n};\, n \in \mathbb{N}\}$ is tight. From this one can conclude that the sequences $(\mu_n)_n$ and $(\sigma_n^2)_n$ are bounded (see this post). Thus there exist convergent subsequences, i.e.
$$\mu := \lim_{k \to \infty} \mu_{n_k}, \qquad \sigma^2 := \lim_{k \to \infty} \sigma_{n_k}^2. \tag{1}$$

Moreover, the weak convergence of $(X_n)_n$ implies the convergence of the characteristic functions,
$$\mathbb{E}\exp \left( \imath \, \xi \cdot X_n \right) \to \mathbb{E}\exp \left( \imath \, \xi \cdot X \right) \qquad (n \to \infty).$$

Using the explicit formula $\mathbb{E}\exp(\imath \, \xi X_n) = \exp\!\left(\imath \, \xi \mu_n - \tfrac{1}{2} \sigma_n^2 \xi^2\right)$ for the characteristic function of a Gaussian random variable together with $(1)$, we conclude $X \sim N(\mu,\sigma^2)$ along the subsequence. Finally, since the characteristic function of $X$ determines $\mu$ and $\sigma^2$ uniquely, every convergent subsequence of $(\mu_n)_n$ and $(\sigma_n^2)_n$ has the same limit; a bounded sequence all of whose subsequential limits coincide converges, so in fact $\mu_n \to \mu$ and $\sigma_n^2 \to \sigma^2$.
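As a sanity check (not part of the answer above), the pointwise convergence of the Gaussian characteristic functions can be illustrated numerically. The parameter sequences below, $\mu_n = 1 + 1/n$ and $\sigma_n^2 = 2 + 1/n$, are an illustrative choice, not taken from the question:

```python
import numpy as np

def gaussian_cf(xi, mu, sigma2):
    """Characteristic function of N(mu, sigma2): E[exp(i*xi*X)] = exp(i*xi*mu - sigma2*xi^2/2)."""
    return np.exp(1j * xi * mu - 0.5 * sigma2 * xi**2)

# Hypothetical example: mu_n = 1 + 1/n, sigma_n^2 = 2 + 1/n, so the limit is N(1, 2).
xi = np.linspace(-5.0, 5.0, 201)
limit = gaussian_cf(xi, 1.0, 2.0)

for n in (10, 100, 10000):
    phi_n = gaussian_cf(xi, 1.0 + 1.0 / n, 2.0 + 1.0 / n)
    err = np.max(np.abs(phi_n - limit))  # sup-norm error on the grid
    print(f"n={n}: sup |phi_n - phi| = {err:.2e}")
```

The sup-norm error shrinks as $n$ grows, consistent with $\varphi_{X_n}(\xi) \to \varphi_X(\xi)$ pointwise once $\mu_n$ and $\sigma_n^2$ converge.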