[Math] an intuitive example of convergence in distribution does not imply convergence in probability

probability-distributions, probability-theory, probability-limit-theorems

*Statistical Inference* by Casella and Berger gives a nice example showing that convergence in probability does not imply almost sure convergence. However, it has no good counterexample showing that convergence in distribution does not imply convergence in probability.

After spending some time on this, it seems that if I can find two different random variables with identical CDFs, it should work out. But the more time I spend on it, the more this seems impossible. Convergence in distribution says that $$\lim_{n \to \infty} F_{X_n}(x) = F_X(x)$$ at all points $x$ where $F_X$ is continuous. But if the CDFs are the same, then the PDFs are also the same. How can we have convergence in distribution but not convergence in probability?

Best Answer

Take the sample space $\Omega = \{0,1\}$ with $P(\omega= 0) = P(\omega = 1) = 1/2$.

Let $X_n$ for all $n \in \mathbb{N}$ and $X$ be random variables defined on the same sample space $\Omega$ such that $X_n(0) = 0 = X(1)$ and $X_n(1) = 1 = X(0)$.

Since $|X_n(\omega) - X(\omega)| = 1$ for every $\omega \in \Omega$, we have $P(|X_n - X| \geqslant \varepsilon) = 1$ for any $0 < \varepsilon \leqslant 1$, so $X_n$ fails to converge to $X$ in probability.

However, you can show that $X_n$ and $X$ have the same distribution function,

$$F_X(x) = F_{X_n}(x) = \begin{cases}0, \,\,\,\,\,\,\,\,\, x < 0\\ 1/2, \,\,\, 0 \leqslant x < 1\\ 1, \,\,\,\,\,\,\,\,\, x \geqslant 1 \end{cases}$$

and, hence, $X_n \to X$ in distribution.
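A quick simulation can make this concrete. The sketch below (variable names are illustrative, not from the answer) draws $\omega$ uniformly from $\{0,1\}$ and compares $X_n(\omega) = \omega$ with $X(\omega) = 1 - \omega$: both are Bernoulli$(1/2)$, so their empirical distributions agree, yet the two variables differ by exactly 1 on every sample point.

```python
import random

# Sample space {0, 1} with P(omega = 0) = P(omega = 1) = 1/2.
random.seed(0)
samples = [random.randint(0, 1) for _ in range(100_000)]

xn_values = [w for w in samples]       # X_n(omega) = omega
x_values = [1 - w for w in samples]    # X(omega)  = 1 - omega

# Same distribution: both empirical means are close to 1/2.
print(sum(xn_values) / len(samples))   # approximately 0.5
print(sum(x_values) / len(samples))    # approximately 0.5

# But pointwise they always differ by exactly 1, so
# P(|X_n - X| >= epsilon) = 1 for any 0 < epsilon <= 1.
print(all(abs(a - b) == 1 for a, b in zip(xn_values, x_values)))  # True
```

This is the essence of the counterexample: convergence in distribution only compares the CDFs, while convergence in probability compares the variables pointwise on the same sample space.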
