Proof of convergence of random variables in $L^p$ via convergence in probability and uniform integrability.

Tags: analysis, measure-theory, probability-theory, self-learning

Consider the following proposition. Part (i) I have no problem with. It's the proof of part (ii) that (because of my lack of knowledge of advanced measure theory) I am having trouble understanding.

[image of the proposition]

The proof goes as follows:

[image of the proof]

Now the problematic issues are as follows:

  1. How does it follow that, since the sequences of means and variances are bounded, we have $ \sup_n \mathbb{E}[|X_n|^q] < \infty$ for all $q$?
  2. How does it follow that $\sup_n \mathbb{E}[|X_n|^q] < \infty \implies \sup_n \mathbb{E}[|X_n-X|^q] < \infty$ for all $q$?
  3. How do we know that the random variables $Y_n$ converge to 0 in probability?
  4. How does boundedness in $L^2$ imply uniform integrability?

The last point (that 4 above implies convergence of $Y_n$ to $0$ in $L^1$) I understand, as it is a standard result in measure theory.

The book which I am studying is Brownian Motion, Martingales, and Stochastic Calculus by Jean-François Le Gall.

Best Answer

  1. This can be proved in many ways. First way: note that $$E(|X_n|^q) = E(|\sigma_nN + m_n|^q) \leq E((\sigma_n|N| + |m_n|)^q).$$ For $a, b \geq 0$, $(a + b)^q \leq (2\max(a, b))^q = 2^q\max(a, b)^q \leq 2^q(a^q + b^q)$. Hence $$E(|X_n|^q) \leq 2^q(\sigma_n^qE(|N|^q) + |m_n|^q) \to 2^q(\sigma^qE(|N|^q) + |m|^q) < \infty,$$ and a sequence with a finite limit is bounded, so $\sup_n E(|X_n|^q) < \infty$. Second way: use the dominated convergence theorem (valid when $\sigma_n \to \sigma > 0$, since the integrands then admit a common integrable dominating function) to conclude that $$E(|X_n|^q) = \int |x|^q \frac{1}{\sqrt{2\pi}\sigma_n}\exp\left(-\frac{(x - m_n)^2}{2\sigma_n^2}\right)dx \to \int |x|^q \frac{1}{\sqrt{2\pi}\sigma}\exp\left(-\frac{(x - m)^2}{2\sigma^2}\right)dx < \infty.$$ (Numerical sanity checks for each of the four points appear after this list.)

  2. This is again due to $(a + b)^q \leq 2^q(a^q + b^q)$ for $a, b \geq 0$: with $a = |X_n|$ and $b = |X|$, $$E(|X_n - X|^q) \leq 2^q\big(E(|X_n|^q) + E(|X|^q)\big),$$ where $\sup_n E(|X_n|^q) < \infty$ by point 1 and $E(|X|^q) < \infty$ by Fatou's lemma along a subsequence converging almost surely.

  3. You know that $X_n \to X$ in $L^2$, so $X_n \to X$ in probability. This means $|X_n - X| \to 0$ in probability. Then, by the continuous mapping theorem for convergence in probability, you get $Y_n = |X_n - X|^p \to 0^p = 0$ in probability.

  4. On a probability space, uniform integrability of a collection $\{X_i : i \in I\}$ is equivalent to the existence of a function $f : [0, \infty) \to [0, \infty)$ such that $f(x)/x \to \infty$ as $x \to \infty$ and $\sup_{i \in I}E(f(|X_i|)) < \infty$ (the de la Vallée-Poussin criterion; see Theorem 6.19 on page 154 of "Probability Theory" by Klenke). Now take $f(x) = x^2$.
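
As a numerical sanity check of point 1 (not part of the original answer), the sketch below uses hypothetical parameter sequences $m_n \to 1$ and $\sigma_n \to 2$, chosen only for illustration, and verifies by Monte Carlo that $E(|X_n|^q)$ stays bounded and sits below the $2^q$ bound:

```python
import numpy as np

rng = np.random.default_rng(0)
q = 6                                   # any fixed moment order
N = rng.standard_normal(10**6)          # N ~ N(0, 1), reused for every n

for n in [1, 2, 5, 10, 100, 1000]:
    m_n = 1.0 + 1.0 / n                 # hypothetical means converging to m = 1
    sigma_n = 2.0 + 1.0 / n             # hypothetical std devs converging to sigma = 2
    X_n = sigma_n * N + m_n             # X_n ~ N(m_n, sigma_n^2)
    moment = np.mean(np.abs(X_n) ** q)  # Monte Carlo estimate of E|X_n|^q
    bound = 2**q * (sigma_n**q * np.mean(np.abs(N) ** q) + abs(m_n) ** q)
    print(f"n={n:5d}  E|X_n|^{q} ~ {moment:10.1f}  2^q bound ~ {bound:10.1f}")
```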
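
Point 2 rests on the same elementary inequality; here is a brute-force check on random nonnegative pairs, again purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
q = 3.5                                 # the inequality holds for every q > 0
a = rng.uniform(0.0, 100.0, size=10**6)
b = rng.uniform(0.0, 100.0, size=10**6)

# (a + b)^q <= (2 max(a, b))^q <= 2^q (a^q + b^q) for a, b >= 0
holds = np.all((a + b) ** q <= 2**q * (a**q + b**q))
print("inequality holds on all sampled pairs:", holds)
```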
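
For point 3, a small simulation under the hypothetical coupling $X_n = \sigma_n N + m_n$ and $X = \sigma N + m$ with a shared $N$ (so that $X_n \to X$ in $L^2$): the estimated probability $P(Y_n > \epsilon)$ with $Y_n = |X_n - X|^p$ should shrink to $0$ as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(2)
p, eps = 3.0, 0.1
m, sigma = 1.0, 2.0
N = rng.standard_normal(10**6)          # shared N couples X_n and X
X = sigma * N + m                       # X ~ N(m, sigma^2)

for n in [1, 10, 100, 1000, 10000]:
    X_n = (sigma + 1.0 / n) * N + (m + 1.0 / n)
    Y_n = np.abs(X_n - X) ** p          # Y_n = |X_n - X|^p
    print(f"n={n:6d}  P(Y_n > {eps}) ~ {np.mean(Y_n > eps):.4f}")
```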
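
Finally, for point 4, the mechanism behind the criterion can be watched directly: $|Y|\,\mathbf{1}_{\{|Y| > K\}} \leq Y^2/K$, so an $L^2$-bounded family has tail expectations that decay uniformly as $K \to \infty$. A sketch using the same hypothetical Gaussian family as above:

```python
import numpy as np

rng = np.random.default_rng(3)
p, m, sigma = 3.0, 1.0, 2.0
N = rng.standard_normal(10**6)
X = sigma * N + m

# L^2-bounded family: Y_n = |X_n - X|^p under the coupling from the previous sketch
family = [np.abs(((sigma + 1.0 / n) * N + m + 1.0 / n) - X) ** p
          for n in [1, 2, 5, 10]]

sup_l2 = max(np.mean(Y**2) for Y in family)   # sup_n E[Y_n^2] is finite
for K in [1.0, 10.0, 100.0, 1000.0]:
    tail = max(np.mean(np.where(Y > K, Y, 0.0)) for Y in family)
    print(f"K={K:7.1f}  sup_n E[Y_n 1{{Y_n>K}}] ~ {tail:.5f}  "
          f"bound E[Y_n^2]/K <= {sup_l2 / K:.5f}")
```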
