Convergence in probability of a sequence of random variables

probability, random-variables

Let $X \sim \text{Ber}(1/2)$. I am asked to show that the sequence of random variables $\{X_n = (1+\frac{1}{n})X\}$ converges in probability to $X$. My attempt:

Let $\epsilon > 0$, take $N = \frac{1}{\epsilon}$, then for all $n > N$ we have
$$X_n = \begin{cases}
0 & \text{w.p. } 1/2 \\
1+\delta & \text{w.p. } 1/2
\end{cases}$$

where $\delta = \frac{1}{n} < \epsilon$ for all $n > N$. Hence $P(|X_n - X| > \epsilon) = 0$ and the result follows. Is this enough to prove it? I am familiar with convergence of sequences of functions from real analysis, but am having a harder time with random variables. Also, I know that almost sure convergence implies convergence in probability. Can I use that fact here?
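The argument can be sanity-checked numerically: since $|X_n - X| = X/n \le 1/n$, the event $\{|X_n - X| > \epsilon\}$ should have empirical probability exactly $0$ once $n > 1/\epsilon$, and probability about $1/2$ when $1/n > \epsilon$. The sketch below (helper name `exceed_fraction` is my own, not from the question) simulates this:

```python
import random

def exceed_fraction(n, eps, trials=10_000, seed=0):
    """Empirical estimate of P(|X_n - X| > eps) for X ~ Ber(1/2), X_n = (1 + 1/n) X."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(trials):
        x = rng.randint(0, 1)       # X ~ Ber(1/2)
        x_n = (1 + 1 / n) * x       # X_n = (1 + 1/n) X
        if abs(x_n - x) > eps:
            exceed += 1
    return exceed / trials

eps = 0.01
N = 1 / eps
for n in [50, int(N) + 1, 10**6]:
    # For n > N we have |X_n - X| = X/n <= 1/n < eps, so the fraction is 0.
    print(n, exceed_fraction(n, eps))
```

For $n = 50$ the deviation $X/n$ exceeds $\epsilon = 0.01$ whenever $X = 1$, so roughly half the trials exceed; for $n > 100$ none do, matching the claim $P(|X_n - X| > \epsilon) = 0$.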

Best Answer

A slight nitpick: you should say "take $N > \frac{1}{\epsilon}$" rather than setting it equal, but that really is just a nitpick. Other than that, your proof looks fine.

As you have guessed, you can also use the fact that almost sure convergence implies convergence in probability. Can you see why it's clear that $X_n \to X$ almost surely as $n \to \infty$?
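One way to explore the hint numerically (without giving the full argument away): fix an outcome, i.e. a realized value $x \in \{0, 1\}$ of $X$, and watch $|X_n - X|$ along that single sample path as $n$ grows. The helper name `gap` below is my own shorthand:

```python
def gap(n, x):
    """|X_n - X| along the sample path where X takes the value x."""
    return abs((1 + 1 / n) * x - x)   # equals x / n

# For each fixed outcome, the gap shrinks as n grows:
for x in (0, 1):
    print(x, [gap(n, x) for n in (1, 10, 100, 1000)])
```

The deviation along every sample path is at most $1/n$, which is the kind of observation the hint is pointing toward.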