[Math] Equivalent condition of uniform integrability of a sequence of random variables

probability-theory · random-variables · uniform-integrability

Here's the definition I have for a sequence of random variables to be uniformly integrable:

$(1)$ A sequence of random variables $X_1, X_2, \ldots$ is uniformly integrable (U.I.) if for every $\epsilon>0$ there exists a $K$ such that for each $n$, $\mathbb{E}[|X_n|I\{|X_n|>K\}]<\epsilon. $
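As a concrete (hypothetical) illustration of definition $(1)$: if every $X_n$ has the same distribution, say $X_n \sim \mathrm{Exp}(1)$, then the tail expectation $\mathbb{E}[|X_n|I\{|X_n|>K\}] = (K+1)e^{-K}$ is the same for every $n$ and tends to $0$ as $K \to \infty$, so the sequence is uniformly integrable. A short Monte Carlo sketch of that tail expectation (the choice of distribution is just an example, not from the original question):

```python
import numpy as np

rng = np.random.default_rng(0)

def tail_expectation(samples, K):
    """Monte Carlo estimate of E[|X| 1{|X| > K}]."""
    x = np.abs(samples)
    return np.mean(np.where(x > K, x, 0.0))

# X ~ Exponential(1): the exact tail expectation is (K + 1) * exp(-K),
# which shrinks to 0 as K grows -- this is what uniform integrability
# requires, uniformly over n (trivially so here, since the law is fixed).
samples = rng.exponential(scale=1.0, size=200_000)
for K in (1.0, 3.0, 6.0):
    est = tail_expectation(samples, K)
    exact = (K + 1.0) * np.exp(-K)
    print(f"K={K}: estimate={est:.4f}, exact={exact:.4f}")
```

The point of the uniformity in $(1)$ is that the same $K$ must work for all $n$ at once; a single fixed distribution makes that automatic, which is why it is only a sanity check, not a proof of anything.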

I've seen that U.I. implies the following:

$(2)$ Given $\epsilon>0$, there is a $\delta>0$ such that if $\mathbb{P}(A)<\delta$, then $\mathbb{E}(|X_n|I_A)<\epsilon $.

In the book I'm reading, the author states a fact about a condition that implies U.I. and in the proof, he just shows that $(2)$ holds. Is it the case that $(1)\iff(2)$?

Also, this is supposed to be pre-measure-theoretic, so if you can answer my question without measure theory that would be ideal. Thanks in advance for your input!

Best Answer

http://www.statslab.cam.ac.uk/~beresty/teach/pmnotes.pdf

See page 42: the author uses $(2)$ as the definition and proves that the two conditions are equivalent. The proof is fairly short, about a third of the page.

It may also be helpful to read the half page before it.
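For reference, the easy direction $(1)\Rightarrow(2)$ can be sketched in a few lines: apply $(1)$ with $\epsilon/2$ to get a $K$, then split on the event $\{|X_n| \le K\}$:

$$\mathbb{E}[|X_n|I_A] = \mathbb{E}[|X_n|I_A I\{|X_n|\le K\}] + \mathbb{E}[|X_n|I_A I\{|X_n|> K\}] \le K\,\mathbb{P}(A) + \mathbb{E}[|X_n|I\{|X_n|>K\}] < K\delta + \frac{\epsilon}{2},$$

so choosing $\delta = \epsilon/(2K)$ gives $\mathbb{E}[|X_n|I_A] < \epsilon$ for every $n$. The converse is where the linked proof does the real work; be aware that when $(2)$ is taken as the definition, it is typically paired with the extra requirement $\sup_n \mathbb{E}|X_n| < \infty$, since $(2)$ alone can hold vacuously on a trivial probability space.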
