Help in understanding the proof that the limit of a sequence of normal random variables is normal

complex-analysis, probability-theory

Let $(X_n)$ be a sequence of random variables with $X_n\sim\mathsf{N}(\mu_n,\sigma^2_n)$ that converges in law to $X$. Then $X\sim\mathsf{N}(\mu,\sigma^2)$, where $\mu=\lim\limits_{n\rightarrow\infty}\mu_n$ and $\sigma^2=\lim\limits_{n\rightarrow\infty}\sigma^2_n$.

There are quite a few posts relating to the above theorem, such as the ones here and here, and while they go into quite some detail, my analysis is quite rusty and I still have some big gaps in understanding, so I would really appreciate it if anyone could help me address the issues I have. Thanks in advance.

Let $\varphi(\cdot)$ denote the characteristic function of a r.v. Convergence in law then implies that $\varphi_{X_n}(t)=e^{it\mu_n-\frac{1}{2}t^2\sigma_n^2}\overset{n}{\underset{\infty}{\longrightarrow}}\varphi_X(t)$ for every $t$.
Therefore $|\varphi_{X_n}(t)|=e^{-\frac{1}{2}t^2\sigma_n^2}\overset{n}{\underset{\infty}{\longrightarrow}}|\varphi_X(t)|$, and from this we need to infer that $(\sigma_n^2)$ and $(\mu_n)$ are convergent sequences.
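To spell out the modulus step, it uses only that $|e^{i\theta}|=1$ for real $\theta$:
$$|\varphi_{X_n}(t)|=\left|e^{it\mu_n}\right|e^{-\frac{1}{2}t^2\sigma_n^2}=e^{-\frac{1}{2}t^2\sigma_n^2}.$$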

I fully understand why $(X_n)$ is a tight sequence (Lévy’s continuity theorem) and why tightness implies that $(\sigma_n^2)$ and $(\mu_n)$ are bounded sequences.
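For the record, a sketch of that implication (with an arbitrary tightness level, say $\varepsilon=0.1$, and $\Phi$ the standard normal cdf): tightness gives $M>0$ with $\sup_n P(|X_n|>M)<0.1$, and then, using $|\mu_n|\le M$ in the second implication,
$$\mu_n>M\;\Rightarrow\;P(X_n>M)\ge P(X_n>\mu_n)=\tfrac{1}{2},\qquad \sigma_n>2M\;\Rightarrow\;P(X_n>M)\ge P(X_n\ge\mu_n+\sigma_n)=1-\Phi(1)\approx 0.159,$$
both contradicting $\sup_n P(|X_n|>M)<0.1$ (the case $\mu_n<-M$ is symmetric), so $|\mu_n|\le M$ and $\sigma_n\le 2M$.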

  1. However, I have seen texts where, for $(\sigma_n^2)$, tightness is not used: it is simply said that $(\sigma_n^2)$ has to be convergent because the limit has to be continuous. My first issue is understanding this argument, as it is implied to be quite obvious but I don't see it, so below I outline my understanding.

    • Continuity Argument: Characteristic functions are (uniformly) continuous and $\varphi_X(0)=1$, hence $\varphi_X$ is non-vanishing in a region around zero. Then $(\sigma_n^2)$ has to be bounded, since $\limsup_n \sigma^2_n=\infty$ would imply $|\varphi_{X_{n_k}}(t)|\rightarrow 0$ along a subsequence at points $t\neq 0$ where $\varphi_{X}(t)\neq 0$. This then permits us to apply logarithms: fixing $t_0\neq 0$ in that region, any convergent subsequence $(\sigma_{n_k}^2)$ satisfies $\sigma_{n_k}^2\overset{k}{\underset{\infty}{\longrightarrow}}-\frac{2}{t_0^2}\ln|\varphi_{X}(t_0)|$, which then implies $\sigma_{n}^2\overset{n}{\underset{\infty}{\longrightarrow}}-\frac{2}{t_0^2}\ln|\varphi_{X}(t_0)|=\sigma^2$ (a worked version of this step appears after this list).
      Is this correct? I am not very confident whether my reasoning on why it is safe to apply logs, i.e. why $\varphi_X\neq 0$ near zero, is correct.
  2. Now by tightness $(\mu_n)$ must be bounded. Then for any convergent subsequence $(\mu_{n_k})$ with limit $\mu$, we have $$e^{it\mu_{n_k}}\overset{k}{\underset{\infty}{\longrightarrow}}e^{\frac{1}{2}t^2\sigma^2}\varphi_X(t)=e^{it\mu}.$$ So if $(\mu_{n_{k'}})$ is another subsequence with limit $\mu'$ then we have $e^{it(\mu-\mu')}=1$ for all $t$, thus $t(\mu-\mu')=2\pi p$ for some $p\in\mathbb{Z}$ depending on $t$, so how do we infer that $p=0$? I know in 1 a choice of $t$ is chosen so that we can be in the domain of the principal Log function, but I'm still a bit hazy on how this uniquely determines $\mu$. Is it because this has to be true for all $t$, so that the carefully chosen $t$ which allows us to apply the principal log forces the value of $\mu$ to be uniquely determined? Any further explanation is greatly needed (see the note after this list).

  3. I really don't know why $\sigma^2\neq 0$. As $\sigma^2=0$ iff $|\varphi_X(t_0)|=1$, why is it that $|\varphi_X(t_0)|\neq 1$?
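Regarding 1, here is the worked version of the log step promised above, under the assumption that $t_0\neq 0$ is fixed with $\varphi_X(t_0)\neq 0$ (such a $t_0$ exists by continuity, since $\varphi_X(0)=1$):
$$-\tfrac{1}{2}t_0^2\sigma_n^2=\ln|\varphi_{X_n}(t_0)|\overset{n}{\underset{\infty}{\longrightarrow}}\ln|\varphi_X(t_0)|\quad\Longrightarrow\quad \sigma_n^2\overset{n}{\underset{\infty}{\longrightarrow}}-\frac{2}{t_0^2}\ln|\varphi_X(t_0)|.$$
Continuity of $\ln$ on $(0,\infty)$ is exactly what makes taking logs safe here; once $|\varphi_X(t_0)|>0$, no subsequence argument is needed for $(\sigma_n^2)$ at all.

Regarding 2, the point is that $e^{it(\mu-\mu')}=1$ holds for every $t$, not just one: for each $t$ there is $p(t)\in\mathbb{Z}$ with $t(\mu-\mu')=2\pi p(t)$, but $t\mapsto t(\mu-\mu')$ is continuous, vanishes at $t=0$, and takes values in the discrete set $2\pi\mathbb{Z}$, so it is identically zero, i.e. $\mu=\mu'$.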

Any feedback is welcomed, thanks.

Best Answer

$e^{-\frac 1 2 t^{2}\sigma_n^{2}} \to |\varphi_X (t)|$ for all $t$. If $t>0$ is sufficiently small then $|\varphi_X (t)| >0$ [by continuity of the characteristic function and the fact that characteristic functions take the value $1$ at $0$]. Fix such a number $t$ and use continuity of the logarithm to conclude that $\sigma_n^{2} \to -\frac 2 {t^{2}} \ln |\varphi_X (t)|$. Hence $(\sigma_n^{2})$ is convergent.

An elementary lemma in complex analysis says that if $(c_n)$ is a sequence of complex numbers such that $e^{itc_n}$ converges for all real numbers $t$, then $(c_n)$ is itself convergent. [In fact it is enough if the convergence holds for all $t$ in some set of positive Lebesgue measure.]
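A quick example (added for illustration) showing why convergence at only a measure-zero set of $t$ is not enough: take $c_n=2\pi n$; then
$$e^{itc_n}=e^{2\pi i nt}=1\quad\text{for all }n\text{ whenever }t\in\mathbb{Z},$$
so $(e^{itc_n})$ converges on all of $\mathbb{Z}$, a set of Lebesgue measure zero, while $c_n\to\infty$.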

This lemma, applied with $c_n=\mu_n$ (note that $e^{it\mu_n}=e^{\frac12 t^{2}\sigma_n^{2}}\varphi_{X_n}(t)$ converges for every $t$), shows that $(\mu_n)$ is also convergent.

Now, letting $n\to\infty$ in $\varphi_{X_n}(t)=e^{i\mu_n t}e^{-t^{2}\sigma_n^{2}/2}$, we see that $\varphi_X (t)=\lim \varphi_{X_n} (t)=e^{i\mu t} e^{-t^{2}\sigma^{2}/2}$, where $\sigma^{2}=\lim \sigma_n^{2}$ and $\mu =\lim \mu_n$. Hence, by uniqueness of characteristic functions, $X$ is normal with mean $\mu$ and variance $\sigma^{2}$.

What happens when $\sigma_n \to 0$? In this case $|\varphi_X(t)|=1$ for all $t$, and this implies that $X$ is a constant random variable. Constants are considered normal with variance $0$.
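For concreteness, a standard example of this degenerate case: if $X_n\sim\mathsf{N}(0,\frac{1}{n})$, then
$$\varphi_{X_n}(t)=e^{-t^{2}/(2n)}\longrightarrow 1,$$
which is the characteristic function of the constant $X=0$, i.e. of $\mathsf{N}(0,0)$ in the degenerate sense.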