Sufficient condition for $L^1$ convergence using uniform integrability

measure-theory, probability-theory, solution-verification, uniform-integrability

I would like to prove the following result: let $(X_n)_n$ be a sequence in $L^{1}(\Omega,\mathcal{F}, \mathbb{P})$ that converges almost surely to $X\in L^1$. Then, if $(X_n)_n$ is uniformly integrable, $X_n$ converges to $X$ in $L^1$.

Here is my attempt at the first implication:

By almost sure convergence we have

$$
\mathbb{P}\Bigl(\bigl\{\omega : \forall k\in\mathbb{N}^{*}, \exists N\in\mathbb{N}, \forall n\geq N, \lvert X_n(\omega) - X(\omega)\rvert\leq\tfrac{1}{k}\bigr\}\Bigr) = \mathbb{P}\Bigl(\bigcap_{k\geq 1}\bigl\{\omega : \exists N\in\mathbb{N}, \forall n\geq N, \lvert X_n(\omega) - X(\omega)\rvert\leq\tfrac{1}{k}\bigr\}\Bigr) = 1
$$

If we denote $A_k = \{\omega : \exists N\in\mathbb{N}, \forall n\geq N, \lvert X_n(\omega) - X(\omega)\rvert\leq\frac{1}{k}\}$, we see that for all $k\geq 1$, $\mathbb{P}(A_k)=1$ and $\mathbb{P}(A_{k}^{c})=0$.

Moreover

$$
\lvert X_n – X\rvert \leq \lvert X_n\rvert1_{A_{k}^{c}} + \lvert X\rvert1_{A_{k}^{c}} + \lvert X_n – X\rvert1_{A_{k}}
$$

Now take $\epsilon>0$. By uniform integrability of $(X_n)_n$ and integrability of $X$, we can find $\delta>0$ such that $\mathbb{P}(B)\leq\delta$ implies that $\mathbb{E}(\lvert X_n\rvert1_{B})$ and $\mathbb{E}(\lvert X \rvert1_{B})$ are both less than $\frac{\epsilon}{3}$. Clearly $A_{k}^{c}$, having probability zero, always satisfies this condition. On the other hand, there exists $k$ large enough that $\frac{1}{k}\leq\frac{\epsilon}{3}$, and on $A_k$ there exists $N$ such that for all $n\geq N$ we have $\lvert X_n(\omega) - X(\omega)\rvert\leq\frac{1}{k}\leq\frac{\epsilon}{3}$.

Thus we have

$$
\mathbb{E}(\lvert X_n – X\rvert)\leq 3 \frac{\epsilon}{3} = \epsilon
$$

I would like to know whether what I did is correct and, if not, to have some hints so I can continue working on this.

Thank you very much!

Best Answer

This is the cleanest proof I wrote while doing my graduate coursework, and I have used it ever since.

The first reduction, as also done by Kakashi, is to consider $X_{n}-X$, which is u.i. (uniformly integrable), and then to show that if $Y_{n}\xrightarrow{P} 0$ and $(Y_{n})$ is u.i., then $Y_{n}\xrightarrow{L^{1}}0$.
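To justify the reduction, one can check that $(X_n - X)$ is indeed u.i.; a minimal sketch, using the standard characterization of uniform integrability as $L^1$-boundedness plus uniform absolute continuity of the integrals, might read:

```latex
% Sketch: (X_n) u.i. and X \in L^1 imply (X_n - X) u.i.
% L^1-boundedness:
\sup_n E\,|X_n - X| \;\le\; \sup_n E\,|X_n| + E\,|X| \;<\; \infty.
% Uniform absolute continuity: given \varepsilon > 0, choose \delta > 0
% so that P(B) \le \delta makes both terms below \varepsilon/2; then
E\bigl(|X_n - X|\,\mathbf{1}_B\bigr)
  \;\le\; E\bigl(|X_n|\,\mathbf{1}_B\bigr) + E\bigl(|X|\,\mathbf{1}_B\bigr)
  \;\le\; \tfrac{\varepsilon}{2} + \tfrac{\varepsilon}{2}
  \;=\; \varepsilon .
```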

First fix $\epsilon>0$ and, by uniform integrability, find $M>0$ such that $\sup_{n}E(|Y_{n}|\mathbf{1}_{|Y_{n}|>M})<\epsilon$.

Now, $E(|Y_{n}|)=E(|Y_{n}|\mathbf{1}_{|Y_{n}|>M})+E(|Y_{n}|\mathbf{1}_{|Y_{n}|\leq M})\leq \epsilon+E(|Y_{n}|\mathbf{1}_{|Y_{n}|\leq M})$.

Next, $E(|Y_{n}|\mathbf{1}_{|Y_{n}|\leq M})=E(|Y_{n}|\mathbf{1}_{\{|Y_{n}|\leq M,|Y_{n}|\leq \epsilon\}}) +E(|Y_{n}|\mathbf{1}_{\{|Y_{n}|\leq M,|Y_{n}|> \epsilon\}})$.

Now, since $Y_{n}\xrightarrow{P}0$, we have $P(|Y_{n}|>\epsilon)<\delta$ for all $n>N(\delta,\epsilon)$. Set $\delta=\frac{\epsilon}{M}$.

Hence, we have $E(|Y_{n}|\mathbf{1}_{|Y_{n}|\leq M})\leq \epsilon\cdot P(\{|Y_{n}|\leq M,|Y_{n}|\leq \epsilon\})+ M\cdot P(\{|Y_{n}|\leq M,|Y_{n}|> \epsilon\})$.

The above is less than $\epsilon\cdot 1 +M\cdot\delta=\epsilon+M\cdot\frac{\epsilon}{M}=2\epsilon$ for all $n\geq N(\epsilon,\frac{\epsilon}{M})$ (which depends only on $\epsilon$).

Thus $E(|Y_{n}|)<3\epsilon$ for all $n\geq N(\epsilon)$.
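As a sanity check that the u.i. hypothesis cannot be dropped, here is a small exact computation on $([0,1], \text{Lebesgue})$ contrasting a u.i. sequence with the classic non-u.i. counterexample (function names are mine, chosen for this sketch):

```python
# Both sequences below converge to 0 almost surely on ([0,1], Lebesgue),
# but only the uniformly integrable one converges in L^1:
#   Y_n = n * 1_{(0, 1/n)}  =>  E|Y_n| = n * (1/n) = 1 for every n (not u.i.)
#   Z_n = 1_{(0, 1/n)}      =>  E|Z_n| = 1/n -> 0 (dominated by 1, hence u.i.)
from fractions import Fraction

def E_Y(n: int) -> Fraction:
    """E|Y_n| computed exactly: height n times the length 1/n of (0, 1/n)."""
    return n * Fraction(1, n)

def E_Z(n: int) -> Fraction:
    """E|Z_n| computed exactly: height 1 times the length 1/n of (0, 1/n)."""
    return Fraction(1, n)

for n in (1, 10, 100, 1000):
    print(n, E_Y(n), E_Z(n))
# E|Y_n| stays at 1 for all n, while E|Z_n| shrinks to 0.
```

The sequence $Y_n$ fails uniform integrability because all of its mass escapes above any fixed truncation level $M$, which is exactly what the bound $\sup_n E(|Y_n|\mathbf{1}_{|Y_n|>M})<\epsilon$ in the proof rules out.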

The other direction is also true: if $(X_{n})$ is $L^{1}$-Cauchy, then it is uniformly integrable, and $(X_{n})$ being $L^{1}$-Cauchy obviously implies that it converges in probability to some random variable.
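A sketch of the u.i. claim in the converse (under the assumption that $X_n \to X$ in $L^1$): a finite family $\{X_1,\dots,X_N, X\}$ is always u.i., and for $n > N$ one can bound as follows:

```latex
% Sketch: X_n -> X in L^1 implies (X_n) u.i.
% Pick N with E|X_n - X| < \varepsilon/2 for n > N, and \delta > 0
% working for the finite u.i. family \{X_1, \dots, X_N, X\}. For n > N:
E\bigl(|X_n|\,\mathbf{1}_B\bigr)
  \;\le\; E\,|X_n - X| \;+\; E\bigl(|X|\,\mathbf{1}_B\bigr)
  \;<\; \tfrac{\varepsilon}{2} + \tfrac{\varepsilon}{2}
\quad\text{whenever } P(B) \le \delta.
% Finally take B = \{|X_n| > M\}: by Markov's inequality
% P(|X_n| > M) \le \sup_k E|X_k| / M \le \delta for M large enough.
```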