Almost Sure Convergence of a Random Variable


I'm having trouble understanding almost sure convergence of random variables.

The definition given on Wikipedia is that a sequence $\{X_n\}$ of random variables defined on a probability space $\left(\Omega,\Sigma,P\right)$ converges almost surely to $X$ if $$P\left(\left\{\omega\in\Omega:\lim_{n\to\infty}X_n(\omega)=X(\omega)\right\}\right)=1.$$

I think I'm misunderstanding this definition, as I don't see how it implies convergence in probability, among other things. Here is the simplest example that confuses me: I read in a different question that a sequence of i.i.d. $\mathrm{Bernoulli}\left(\frac{1}{2}\right)$ random variables does not converge almost surely, and I don't see why.
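To make the claim concrete, here is a quick simulation sketch of a single sample path of i.i.d. $\mathrm{Bernoulli}\left(\frac{1}{2}\right)$ draws (purely illustrative; it assumes numpy, and the variable names are my own). The path keeps switching between $0$ and $1$ arbitrarily far out, so its pointwise limit does not appear to exist:

```python
import numpy as np

# One sample path omega of an i.i.d. Bernoulli(1/2) sequence:
# X_1(omega), X_2(omega), ..., X_N(omega).
rng = np.random.default_rng(0)
N = 10_000
path = rng.integers(0, 2, size=N)  # each X_n(omega) is 0 or 1 with prob 1/2

# If lim_n X_n(omega) existed, the tail of the path would eventually be
# constant.  Check how far out the last switch between 0 and 1 occurs.
last_switch = np.max(np.nonzero(np.diff(path))[0]) + 1
print("last index where the path changes value:", last_switch, "out of", N)
# Typically the path still switches very close to N, so no limit is in sight.
```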

Setting up the probability space,
$$\Omega=\{0,1\},\qquad P(\omega)=\frac{1}{2},\quad \omega=0,1,$$
$$X_n(\omega)=\omega\quad\forall\omega,\qquad X(\omega)=\omega\quad\forall\omega.$$

The limit
$$\lim_{n\to\infty}X_n(\omega)=X(\omega)$$
holds for every $\omega\in\{0,1\}=\Omega$, so the sequence converges almost surely to $X$.
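For what it's worth, here is a minimal sketch of my setup in Python (the function names are just my own illustration): for each fixed $\omega$ the sequence is constant, so the limit certainly exists and equals $X(\omega)$.

```python
# My probability space: Omega = {0, 1}, P(omega) = 1/2 each,
# X_n(omega) = omega for every n, and X(omega) = omega.
Omega = [0, 1]

def X_n(n, omega):
    return omega  # every X_n is the same function of omega

def X(omega):
    return omega

# For each fixed omega the sequence X_1(omega), X_2(omega), ... is constant,
# so its limit is X(omega); the event {lim X_n = X} is all of Omega.
for omega in Omega:
    tail = {X_n(n, omega) for n in range(1, 1000)}
    assert tail == {X(omega)}
print("lim X_n(omega) = X(omega) for every omega in Omega")
```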
Where is my mistake?

Best Answer

As you have written it, we indeed have that $X_n\to X$ almost surely. Your mistake is that this sequence is not i.i.d. Indeed, clearly $X_n=X$ for all $n$, so for $m\neq n$, $$P(X_m=0,X_n=0)=P(X=0)=\frac12\neq \frac14=P(X_m=0)P(X_n=0).$$
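If it helps, here is a rough Monte Carlo check of that computation (a sketch assuming numpy; the setup is only illustrative). The coupled sequence from the question gives a joint probability near $\frac12$, while a genuinely i.i.d. pair gives one near $\frac14$:

```python
import numpy as np

rng = np.random.default_rng(1)
trials = 100_000

# The question's sequence: a single coin flip X per outcome omega, with
# X_n = X for all n, so the event {X_m = 0, X_n = 0} is just {X = 0}.
X = rng.integers(0, 2, size=trials)
coupled_joint = np.mean(X == 0)

# A genuinely i.i.d. Bernoulli(1/2) pair (X_m, X_n) with m != n.
Xm = rng.integers(0, 2, size=trials)
Xn = rng.integers(0, 2, size=trials)
iid_joint = np.mean((Xm == 0) & (Xn == 0))

print("coupled sequence:  P(X_m=0, X_n=0) approx", coupled_joint)  # approx 1/2
print("i.i.d. sequence:   P(X_m=0, X_n=0) approx", iid_joint)      # approx 1/4
```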
