[Math] Proof of Lévy’s zero-one law

convergence-divergence, random-variables, stochastic-analysis, stochastic-calculus, stochastic-processes

Let $(\Omega, \mathcal{F},\mathbb P)$ be a probability space and let $X \in L^1$ be a random variable. Let $(\mathcal{F}_k)_k$ be any filtration, and define $\mathcal{F}_{\infty} := \sigma\bigl(\bigcup_k \mathcal{F}_k\bigr)$, the smallest $\sigma$-algebra containing every $\mathcal{F}_k$. Then

$$E[X|\mathcal{F}_k]\rightarrow E[X|\mathcal{F}_{\infty}],\ \ k\rightarrow\infty$$

both $\mathbb P$-almost surely and in $L^1$.
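To make the statement concrete, here is a minimal simulation sketch (a toy setup, purely for illustration: three fair coin flips $B_1,B_2,B_3$, $X$ their average, and $\mathcal{F}_n$ generated by the first $n$ flips; all names in the code are made up):

```python
# Toy illustration of E[X | F_n] -> E[X | F_infinity]:
# X = (B1 + B2 + B3)/3 with B_i i.i.d. fair coin flips, F_n = sigma(B_1,...,B_n).
# Then E[X | F_n] = (B_1 + ... + B_n + (3 - n)*0.5)/3 for n <= 3, and
# F_infinity = sigma(B_1, B_2, B_3), so E[X | F_infinity] = X.
import random

def sample_path(num_flips=3):
    flips = [random.randint(0, 1) for _ in range(num_flips)]
    x = sum(flips) / num_flips                   # the L^1 random variable X
    cond_exps = []
    for n in range(num_flips + 1):
        known = sum(flips[:n])                   # information available in F_n
        unknown_mean = (num_flips - n) * 0.5     # fair-coin mean of the remaining flips
        cond_exps.append((known + unknown_mean) / num_flips)
    return x, cond_exps

x, path = sample_path()
print("X =", x, " E[X|F_n] for n = 0..3:", path)  # last entry equals X
```

On every sample path the conditional expectations settle to $X = E[X\mid\mathcal{F}_\infty]$, in line with the proposition.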

Could someone give me a proof of this proposition or a reference? Many thanks!

Best Answer

Let $Y_n = E(X\mid \mathcal{F}_n)$. Then $(Y_n)_n$ is a martingale with respect to $(\mathcal{F}_n)_n$, and $$\sup_n E(|Y_n|) = \sup_n E\bigl(|E(X\mid \mathcal{F}_n)|\bigr) \leq \sup_n E\bigl(E(|X|\mid\mathcal{F}_n)\bigr) = E(|X|) < \infty, $$ where the inequality in the middle is due to the conditional Jensen inequality.
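Spelling out the martingale property, it is just the tower property, since $\mathcal{F}_n \subseteq \mathcal{F}_{n+1}$:

$$E(Y_{n+1}\mid \mathcal{F}_n) = E\bigl(E(X\mid\mathcal{F}_{n+1})\mid\mathcal{F}_n\bigr) = E(X\mid\mathcal{F}_n) = Y_n \quad \text{a.s.}$$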

Now the heavy artillery: by Doob's martingale convergence theorem, $Y_\infty := \lim_{n \to \infty} Y_n$ exists almost surely. Moreover, since $|Y_n| \leq E(|X|\mid\mathcal{F}_n)$ (conditional Jensen again), the family $(Y_n)_n$ is uniformly integrable, so the almost-sure convergence upgrades to convergence in $L^1$, and hence in probability. It remains to identify the limit as $E(X\mid\mathcal{F}_\infty)$.
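To see that $Y_\infty = E(X\mid\mathcal{F}_\infty)$, here is a standard sketch of the argument. Fix $m$ and $A \in \mathcal{F}_m$. For $n \geq m$ the tower property gives

$$E(Y_n 1_A) = E\bigl(E(X 1_A \mid \mathcal{F}_n)\bigr) = E(X 1_A).$$

Letting $n \to \infty$ and using the $L^1$ convergence, $E(Y_\infty 1_A) = E(X 1_A)$ for every $A$ in the $\pi$-system $\bigcup_m \mathcal{F}_m$, which generates $\mathcal{F}_\infty$. Since $Y_\infty$ is $\mathcal{F}_\infty$-measurable, a $\pi$-$\lambda$ (monotone class) argument gives $Y_\infty = E(X\mid\mathcal{F}_\infty)$ almost surely.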

You can find Doob's convergence theorem in Williams' "Probability with Martingales", Thm. 11.5. It is a rather important result, based on an upcrossing argument, and it can be extended to continuous-time martingales.
