Uniform integrability of a martingale

martingales, probability, probability-theory, random-variables, uniform-integrability

Let $\left(X_{n}\right)_{n \geq 0}$ be a martingale with $X_{0}=0$. Assume

$$\sum_{n=1}^{\infty} E\left(\left(X_{n}-X_{n-1}\right)^{2}\right)<\infty .$$

Show that the martingale is uniformly integrable.


I have a result that says that a martingale $X=(X_n)_{n\geq 0}$ is bounded in $L^2$ (i.e. $\sup_nE(X_n^2)<\infty$) if and only if

$$\sum_{n=1}^{\infty} E\left(\left(X_{n}-X_{n-1}\right)^{2}\right)<\infty $$

However, this says nothing about whether the martingale is uniformly integrable.

I have a theorem saying that if $\sup_{X\in \mathcal{H}}E(|X|^p)<\infty$ for some $p>1$, then the class of random variables $\mathcal{H}$ is uniformly integrable. Is this applicable here? Thanks.

Best Answer

Yes, this is directly applicable!

Let us restate this useful criterion:

Basically, boundedness in $L^p$ for some $p>1$ (note that $p=1$ does not work) implies that a family of random variables $(X_i)_{i \in \mathbb N}$ is uniformly integrable. More precisely:

Lemma: Let $p>1$ and let $(X_i)_{i \in \mathbb N}$ be a family of random variables such that $$ \sup_{i \in \mathbb N} \mathbb E[|X_i|^p] \leq C < \infty. $$ Then $(X_i)_{i \in \mathbb N}$ is u.i.

Proof: Suppose $\sup_{i \in \mathbb N} \mathbb E[|X_i|^p] \leq C$. On the event $\{|X_i| > K\}$ we have $|X_i|^{p-1}/K^{p-1} \geq 1$, so $$ \mathbb E[|X_i| \mathbb 1_{|X_i| > K}] \leq \mathbb E\bigg[\frac{|X_i|^{p-1}}{K^{p-1}} |X_i| \mathbb 1_{|X_i|>K}\bigg] \leq \frac{1}{K^{p-1}} \mathbb E[|X_i|^p] \leq \frac{C}{K^{p-1}}. $$ Now take the supremum over $i \in \mathbb N$ and let $K \to \infty$; since $p>1$, the right-hand side tends to $0$, which shows that $\{X_i\}_{i \in \mathbb N}$ is u.i.
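As a numerical sanity check (not part of the proof), one can verify the key inequality $\mathbb E[|X| \mathbb 1_{|X|>K}] \leq \mathbb E[|X|^p]/K^{p-1}$ by Monte Carlo; the exponential distribution and the values of $p$ and $K$ below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
p, K = 2.0, 5.0

# Sample |X| from an exponential distribution (arbitrary choice for the demo)
x = rng.exponential(scale=2.0, size=1_000_000)

lhs = np.mean(x * (x > K))            # Monte Carlo estimate of E[|X| 1_{|X|>K}]
rhs = np.mean(x ** p) / K ** (p - 1)  # Monte Carlo estimate of E[|X|^p] / K^{p-1}

print(lhs, rhs)
assert lhs <= rhs  # the truncated first moment is dominated by the L^p bound
```

As $K$ grows, `rhs` decays like $C/K^{p-1}$, uniformly over the family, which is exactly the u.i. statement.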

So you are indeed already done:

Since you already know that your condition implies $\sup_{n} \mathbb E[X_n^2] \leq C < \infty$, the martingale is bounded in $L^2$, and by the lemma above (with $p=2$) this suffices.
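To make the argument concrete, here is a hedged simulation of one example martingale satisfying your hypothesis: $X_n = \sum_{k \leq n} \xi_k/k$ with i.i.d. signs $\xi_k = \pm 1$, so that $\mathbb E[(X_k - X_{k-1})^2] = 1/k^2$ is summable. By orthogonality of martingale increments, $\mathbb E[X_n^2] = \sum_{k \leq n} 1/k^2 \leq \pi^2/6$, which the Monte Carlo estimate should reproduce up to sampling error (the sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps = 10_000, 1_000

# Martingale X_n = sum_{k<=n} xi_k / k with iid signs xi_k = +/-1 and X_0 = 0,
# so E[(X_k - X_{k-1})^2] = 1/k^2, which is summable over k.
k = np.arange(1, n_steps + 1)
xi = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))
X = np.cumsum(xi / k, axis=1)

# By orthogonality of increments, E[X_n^2] = sum_{k<=n} 1/k^2 exactly.
second_moments = np.mean(X ** 2, axis=0)
exact = np.cumsum(1.0 / k ** 2)

print(np.max(np.abs(second_moments - exact)))  # small Monte Carlo error
assert np.max(np.abs(second_moments - exact)) < 0.1
assert second_moments.max() < np.pi ** 2 / 6 + 0.1
```

The estimated second moments stay bounded by $\pi^2/6$ (up to noise), illustrating the $L^2$-boundedness that the lemma converts into uniform integrability.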

Appendix on Boundedness: Let us first state some basic observations:

  1. If $|X_n| \leq M$ then $\mathbb E[|X_n|] \leq M$.
  2. If $M_1 \leq X_n \leq M_2$ then $\mathbb E[|X_n|] \leq |M_1| + |M_2|$ (rewrite the integral in terms of the positive and negative parts of $X_n$).
  3. If $M_1 \leq X_n \leq M_2$ then $\mathbb E[|X_n|^p] \leq (|M_1| + |M_2|)^p$.

For more I advise checking out the following question: [https://math.stackexchange.com/questions/3433896/if-a-random-variable-is-bounded-does-it-mean-its-expectation-is-bounded].

Generally speaking, if $X$ is bounded in $L^q$, then it is also bounded in $L^p$ for any $1 \leq p \leq q$, by applying Hölder's (or Jensen's) inequality on a probability space. One way to build intuition is in terms of second moments: if $E[|X|^2] < \infty$, then not only is $|X|$ finite almost surely, but $X$ also has finite variance; for instance, $X \sim N(0,\sigma^2)$ automatically has finite mean and variance, and the same reasoning applies to a whole family of such random variables. Does this help, or do you have a more specific question about this?
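The monotonicity of $L^p$ norms on a probability space, $\mathbb E[|X|^p]^{1/p} \leq \mathbb E[|X|^q]^{1/q}$ for $p \leq q$, can likewise be spot-checked numerically; the lognormal sample below is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.lognormal(mean=0.0, sigma=1.0, size=1_000_000)  # positive sample

# On a probability space, the L^p norm is nondecreasing in p (Lyapunov/Jensen):
# E[|X|^p]^(1/p) <= E[|X|^q]^(1/q) for p <= q.
p, q = 1.0, 2.0
norm_p = np.mean(x ** p) ** (1 / p)
norm_q = np.mean(x ** q) ** (1 / q)

print(norm_p, norm_q)
assert norm_p <= norm_q
```

This is why $L^2$-boundedness (as in the lemma with $p=2$) in particular controls the first moments of the whole family.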
