Use of law of total expectation without checking integrability

conditional-expectation, expected value, probability theory

$\newcommand{\E}{\mathbb E}$In basic probability classes, people often use the law of total expectation
$$\E[X]=\E[\E[X\mid Y]]$$
without checking integrability of $X$.

I can't get my head around this, because we sometimes use the formula precisely to check the integrability of $X$: we don't even know whether $\E[X\mid Y]$ exists, yet we use it.

Example.
For a simple example where I wanted to do things properly, consider a nonnegative integer-valued integrable random variable $Z$ and i.i.d. nonnegative integrable random variables $X_i$, independent of $Z$. We want to check whether $$\sum_{i=1}^Z X_i$$
is integrable. I can do it like this:
$$\E\left[\sum_{i=1}^Z X_i\right]=\E\left[\E\left[\sum_{i=1}^Z X_i\mid Z\right]\right]=\E\left[\sum_{i=1}^Z \E[X_i]\right]=\E[Z]\E[X_1]$$
But I thought this was cheating, so I considered the truncated variables $X_i\mathbb I_{X_i<K}$ for some constant $K>0$; then we have
\begin{align}
\sum_{i=1}^Z X_i\mathbb I_{X_i<K} \leq ZK
\end{align}

The left-hand side is clearly integrable, since the bound $ZK$ is integrable. So we can apply the same steps as before "legally" to show integrability, and then the monotone convergence theorem (MCT) does the rest of the work:
\begin{align}
\E\left[\sum_{i=1}^Z X_i\right]\stackrel{\text{MCT}}{=}\lim_{K\to\infty}\E\left[\sum_{i=1}^Z X_i\mathbb I_{X_i<K} \right]=\lim_{K\to\infty}\E[X_1\mathbb I_{X_1<K}]\E[Z]\stackrel{\text{MCT}}{=}\E[X_1]\E[Z]
\end{align}
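For completeness, here is how the middle equality can be justified (a sketch, under the same assumptions as above: the $X_i$ are i.i.d. and independent of $Z$). Conditioning on $Z$,
\begin{align}
\E\left[\sum_{i=1}^Z X_i\mathbb I_{X_i<K}\mid Z\right]=\sum_{i=1}^Z \E\left[X_i\mathbb I_{X_i<K}\right]=Z\,\E\left[X_1\mathbb I_{X_1<K}\right],
\end{align}
and taking expectations of both sides, which is now legitimate because everything is dominated by the integrable variable $ZK$, gives $\E\left[\sum_{i=1}^Z X_i\mathbb I_{X_i<K}\right]=\E[Z]\,\E\left[X_1\mathbb I_{X_1<K}\right]$.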

Does something similar always work, which would explain why people just use the formula? I guess it would be tedious to repeat these kinds of steps every time if it always works, hence my question:

Question. Can we actually apply the law of total expectation to show integrability? If the answer is "yes", is there a general proof? If the answer is "no", are there examples where it fails?

Best Answer

All the usual laws of conditional expectation work for nonnegative random variables, integrable or not, provided you allow the value $+\infty$. For instance, if $X$ is nonnegative, $E[X \mid Y]$ always exists, but it could equal $+\infty$ with positive probability. And if $X$ is nonnegative and not integrable, then $E[E[X \mid Y]] = +\infty$ as well (i.e. $E[X \mid Y]$ is nonnegative and not integrable). These cases are often left as exercises, since it is a little tedious to write out the proofs, but you can prove them by considering "cut off" random variables such as $X_n = X 1_{X \le n}$ and using monotone convergence.
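A sketch of that monotone convergence argument (an outline only, with all quantities allowed to take the value $+\infty$): with $X_n = X 1_{X \le n}$,
$$E[E[X \mid Y]] = E\Big[\lim_{n\to\infty} E[X_n \mid Y]\Big] = \lim_{n\to\infty} E[E[X_n \mid Y]] = \lim_{n\to\infty} E[X_n] = E[X],$$
where the first equality uses conditional monotone convergence ($E[X_n \mid Y] \uparrow E[X \mid Y]$ almost surely), the second uses ordinary monotone convergence, and the third is the usual tower property, valid because each $X_n$ is bounded and hence integrable.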

The properties do not necessarily hold for signed random variables that may fail to be integrable. However, you can determine whether a random variable $X$ is integrable by computing $E[|X| \mid Y]$ and checking whether $E[E[|X| \mid Y]] < \infty$; if so, you also know that $E[X] = E[E[X \mid Y]]$.
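Spelled out: since $|X|$ is nonnegative, the identity for nonnegative variables gives
$$E[|X|] = E[E[|X| \mid Y]] \quad \text{in } [0, +\infty],$$
so finiteness of the right-hand side is exactly integrability of $X$, and the ordinary tower property for integrable variables then applies to $X$ itself.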
