[Math] Convergence in probability of iid normal random variables

convergence-divergence, normal distribution, probability theory, random variables

Let $X_1, X_2,\ldots$ be a sequence of iid normal random variables with zero mean and unit variance. I read the following as a trivial example: (1) $X_n \to X_1$ in law, (2) $X_n \not\to X_1$ in probability.

So for the first one, I suppose that since $F_{X_n}(x) = F_{X_1}(x)$ for every $n$, there is nothing to show as $n\to \infty$. Is that right?
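
Just as a sanity check for myself, I ran a small simulation (the choices of $n$, the number of independent copies, and the evaluation points are arbitrary), and the empirical cdf of $X_n$ does look like $\Phi = F_{X_1}$:

```python
import numpy as np
from scipy.stats import norm

# Rough check of (1): for any fixed n, X_n is exactly N(0, 1), so its cdf
# already equals F_{X_1}.  Draw many independent copies of the sequence and
# compare the empirical cdf of X_n with Phi at a few points.
rng = np.random.default_rng(0)
copies, n = 100_000, 50
X = rng.standard_normal((copies, n))   # row i is one realization X_1, ..., X_n

for x in (-1.0, 0.0, 1.5):
    empirical = np.mean(X[:, n - 1] <= x)   # empirical cdf of X_n at x
    print(f"x = {x:+.1f}: empirical F_Xn(x) = {empirical:.4f}, Phi(x) = {norm.cdf(x):.4f}")
```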

I do not know how to show the second statement (that the sequence does not converge in probability). I know the definitions and have some ideas, but I am not sure whether they are correct. If somebody could give an idea of how to prove the second statement, that would be wonderful. Thanks!

Best Answer

You are right about convergence in distribution.

In order to prove that convergence in probability does not hold, notice that for every $n\ge 2$, $X_n-X_1$ is Gaussian with mean zero and variance $2$ (the variances add by independence). Therefore, for every $\delta>0$, $\mathbb P\{|X_n-X_1|>\delta\}=\mathbb P\{|N|>\delta/\sqrt 2\}>0$, where $N$ is normally distributed with mean zero and unit variance. This probability does not depend on $n$, so it does not tend to $0$, and hence $X_n$ does not converge to $X_1$ in probability. The same computation applied to $X_n-X_m$ for $n\neq m$ shows that the sequence is not Cauchy in probability, so it does not converge in probability to any other random variable either.
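
A minimal numerical sketch of this calculation (the threshold $\delta = 0.5$ is an arbitrary illustrative choice): since $X_n-X_1\sim N(0,2)$, the probability $\mathbb P\{|X_n-X_1|>\delta\}=2\bigl(1-\Phi(\delta/\sqrt 2)\bigr)$ is a positive constant independent of $n$, so it cannot shrink to $0$.

```python
import numpy as np
from scipy.stats import norm

delta = 0.5  # arbitrary positive threshold

# Exact value: X_n - X_1 ~ N(0, 2), so P(|X_n - X_1| > delta) = 2 * (1 - Phi(delta / sqrt(2))).
exact = 2 * (1 - norm.cdf(delta / np.sqrt(2)))

# Monte Carlo estimate for comparison; the answer does not depend on which n >= 2 we pick.
rng = np.random.default_rng(1)
x1 = rng.standard_normal(1_000_000)
xn = rng.standard_normal(1_000_000)
estimate = np.mean(np.abs(xn - x1) > delta)

print(f"exact = {exact:.4f}, estimate = {estimate:.4f}")  # both around 0.72, bounded away from 0
```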
