Is this proof of almost sure convergence correct?

convergence-divergence, measure-theory, probability-theory, proof-verification, real-analysis

I have a sequence of random variables $\{X_n\}$ such that $P[X_n = 1] = \frac{1}{n}$ and $P[X_n = 0] = 1 - \frac{1}{n}$.

We can see that this sequence converges in probability to $X = 0$ because, for $0 < \epsilon < 1$, $P[|X_n - 0| > \epsilon] = P[X_n = 1] = \frac{1}{n}$, and hence $\lim_{n \to \infty} P[|X_n - 0| > \epsilon] = 0$.
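As a quick sanity check of this computation, here is a small Monte Carlo sketch in Python (the sample size and seed are arbitrary choices, not part of the argument); it estimates $P[X_n = 1]$ for a few values of $n$ and compares the estimate with $1/n$:

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 100_000

# Empirical estimate of P[|X_n - 0| > eps] = P[X_n = 1] for a few values of n.
for n in [10, 100, 1_000, 10_000]:
    # X_n = 1 exactly when a uniform draw on [0, 1) falls below 1/n.
    samples = rng.random(trials) < 1 / n
    print(f"n = {n:>6}: empirical P[X_n = 1] = {samples.mean():.5f}, exact 1/n = {1 / n:.5f}")
```

The estimates shrink toward $0$ at the rate $1/n$, which is exactly the convergence-in-probability statement above.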

My attempt at almost sure convergence:

We have to prove/disprove that $P[\lim_{n \to \infty} X_n = 0] = 1$. One way to look at this is: each $X_n$ is a Bernoulli random variable with PMF $p_n(x) = \left(\frac{1}{n}\right)^{x}\left(\frac{n-1}{n}\right)^{1-x}$, where $x$ takes values in $\{0,1\}$.

As $n \to \infty$, the chance of observing $1$'s decreases with $n$. However, we cannot say for sure that we will always observe $X_n = 0$, no matter how large $n$ is. We could say with certainty that we observe $X_n = 0$ only "at $n = \infty$"; otherwise there is always a positive chance of observing a $1$. Hence the probability of the event $[\lim_{n \to \infty} X_n = 0]$ is zero, and so the above sequence of random variables does not converge to $0$ almost surely.

Is this argument correct? Are there more elegant arguments to prove or disprove almost sure convergence?

Best Answer

As already pointed out in the comments, your argument is not valid. This sequence need not converge almost surely, and if the $X_n$ are independent, then it does not converge almost surely. This is because $\sum_n P\{X_n > \tfrac{1}{2}\} = \sum_n \frac{1}{n} = \infty$. Since the events $\{X_n > \tfrac{1}{2}\} = \{X_n = 1\}$ are independent, the second Borel–Cantelli lemma gives $P[X_n = 1 \text{ infinitely often}] = 1$, so almost every sample path takes the value $1$ infinitely often and therefore cannot converge to $0$. Hence $\{X_n\}$ converges to $0$ with probability $0$.
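To see the Borel–Cantelli conclusion concretely, here is a small simulation sketch in Python (an illustration, not a proof; the horizon $N$ and the seed are arbitrary assumptions). It draws one realization of independent $X_1, \dots, X_N$ with $P[X_n = 1] = 1/n$ and shows that $1$'s keep appearing far out in the sequence, with their count growing roughly like the harmonic sum $\approx \ln N$:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000  # finite horizon; "infinitely often" can only be suggested, not shown

# One realization of independent X_1, ..., X_N with P[X_n = 1] = 1/n.
n = np.arange(1, N + 1)
x = rng.random(N) < 1 / n

ones = np.flatnonzero(x) + 1  # the indices n at which X_n = 1
print("number of 1's observed:", ones.size)   # grows like sum_{n<=N} 1/n ~ ln N
print("largest n with X_n = 1:", ones[-1])    # typically of the same order as N
print("harmonic-sum prediction:", round(float(np.log(N)), 2))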
