Convergence in probability – sum of normal and discrete distribution

normal-distribution, probability-theory

Let $X \sim N(0,1)$, and let $Y_n$ be a random variable, independent of $X$, such that $P(Y_n = n) = \frac{1}{n}$ and $P(Y_n = 0) = 1 - \frac{1}{n}$. What does $Z_n = Y_n + X$ converge to in probability?

  1. I can see that $Y_n$ converges in probability to $0$, which makes $Z_n$ converge to $N(0,1)$ in distribution by Slutsky's theorem, but I don't know if we can find the probability limit of $Z_n$.

  2. I think that $E[Z_n] = 1$ and $Var(Z_n) = n$ (?). However, if my reasoning above is correct, the limiting distribution is $N(0,1)$, which has mean $0$ and variance $1$. What explains this difference?
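For concreteness, here is a quick Monte Carlo sanity check of these moments (a minimal numpy sketch of my own, assuming $X$ and $Y_n$ are independent; the seed, $n$, and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 100, 1_000_000

# X ~ N(0,1); Y_n takes the value n with probability 1/n, else 0
x = rng.standard_normal(trials)
y = np.where(rng.random(trials) < 1 / n, n, 0)
z = x + y

print(z.mean())  # about 1: E[Z_n] = E[Y_n] = n * (1/n) = 1
print(z.var())   # about n: Var(X) + Var(Y_n) = 1 + (n - 1) = n
```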

Best Answer

You've made some good observations.

  1. For the first point, it is true that $Y_n$ converges in probability to $0$, and indeed Slutsky's lemma implies the convergence in distribution of $Z_n$ to $X$, i.e. to a standard normal.
    But in fact it is not hard to see that $Z_n$ converges to $X$ in probability, by direct computation: $$\mathbb P\left(|Z_n - X|>\varepsilon\right) = \mathbb P\left(|Y_n|>\varepsilon\right) = \frac1n \to 0$$ for every $\varepsilon > 0$ (the middle equality holds once $n > \varepsilon$).
  2. Your second point is where it gets more interesting: we have just seen that $Z_n$ converges to $X$ in probability, and yet the expectation and variance of $Z_n$ do not converge to those of $X$! How come? As it turns out, it is a well-known result that a sequence of random variables $(U_n)$ converges in $L^1$ (and in particular in expectation) to a random variable $U$ if and only if $(U_n)$ converges to $U$ in probability and is uniformly integrable.
    In our case, $(Z_n)$ converges to $X$ in probability but, as you observed, not in expectation. Hence we deduce that $(Z_n)$ is not uniformly integrable (I invite you to check against the definition that it is indeed not u.i.; the simulation after this list illustrates both phenomena).
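A quick simulation makes both points visible at once: the frequency of $|Z_n - X| > \varepsilon$ decays like $1/n$, while the sample mean of $Z_n$ stays near $1$ throughout (a minimal numpy sketch; the seed, $\varepsilon$, and the values of $n$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
trials, eps = 1_000_000, 0.5

for n in (10, 100, 1000):
    x = rng.standard_normal(trials)
    y = np.where(rng.random(trials) < 1 / n, n, 0)  # Y_n = n w.p. 1/n, else 0
    z = x + y
    # |Z_n - X| = |Y_n|, so the frequency below should be about 1/n,
    # while the sample mean of Z_n stays near E[Z_n] = 1
    print(n, np.mean(np.abs(z - x) > eps), z.mean())
```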

More intuitively, uniformly integrable families of random variables are those for which "most of their mass" is contained in a compact (bounded) set, uniformly over the family; hence they are "better behaved" than merely integrable random variables, as your exercise illustrates well.
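To see the failure of uniform integrability numerically: for any cutoff $K$, once $n > K$ the rare atom $Y_n = n$ alone contributes about $n \cdot \frac1n = 1$ to $\mathbb E\left[|Z_n|\,\mathbf 1_{|Z_n|>K}\right]$, so the supremum over $n$ never drops to $0$. A minimal numpy sketch of this check (the cutoffs and sample sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 1_000_000

# Uniform integrability would require sup_n E[|Z_n| 1_{|Z_n|>K}] -> 0 as K grows.
for K in (10, 100, 1000):
    tails = []
    for n in (10, 100, 1000, 10_000):
        x = rng.standard_normal(trials)
        y = np.where(rng.random(trials) < 1 / n, n, 0)
        z = x + y
        tails.append(np.mean(np.abs(z) * (np.abs(z) > K)))
    # the max over n stays near 1 no matter how large K is
    print(K, max(tails))
```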