If for every $x>0,P(|X_k|>x) \leq P(Y>x)$ then $\lim_k E[X_k|\mathcal{G}]=E[X|\mathcal{G}]$ a.s.

conditional-expectation, measure-theory, probability-theory

Consider on a probability space $(\Omega,\mathcal{F},P),$ a sub-$\sigma$-algebra $\mathcal{G}$ and a sequence of random variables $(X_k)_k$ converging a.s. to $X$. Let $Y \in L^1.$

  1. Prove that if $|X_k| \leq Y$ then $E[X_k|\mathcal{G}]$ converges a.s. to $E[X|\mathcal{G}]$.

  2. Does it follow that $\lim_k E[X_k|\mathcal{G}]=E[X|\mathcal{G}]$ a.s.

    a. if for every $x>0,P(|X_k|>x) \leq P(Y>x)?$

    b. if for every $x>0,P(|X_k|>x|\mathcal{G}) \leq P(Y>x|\mathcal{G})$ a.s. ?

Attempt:

  1. The result follows by applying the conditional Fatou lemma to the nonnegative variables $2Y-|X_k-X|$ (note that $|X| \leq Y$ a.s. in the limit): $E[\liminf_k(2Y-|X_k-X|)|\mathcal{G}] \leq \liminf_k E[2Y-|X_k-X||\mathcal{G}]$ a.s., which implies that $2E[Y|\mathcal{G}]\leq 2 E[Y|\mathcal{G}]-\limsup_kE[|X_k-X||\mathcal{G}]$ a.s., so that $\limsup_kE[|X_k-X||\mathcal{G}]=0$ a.s., concluding the proof.
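
For completeness, the chain of inequalities behind the Fatou step can be written out; this is a sketch, using that $|X_k-X|\to 0$ a.s., so $\liminf_k(2Y-|X_k-X|)=2Y$ a.s.:

```latex
\begin{align*}
2\,\mathbb{E}[Y\mid\mathcal G]
  &= \mathbb{E}\bigl[\liminf_k\,(2Y-\lvert X_k-X\rvert)\,\big|\,\mathcal G\bigr]\\
  &\leqslant \liminf_k \mathbb{E}\bigl[2Y-\lvert X_k-X\rvert\,\big|\,\mathcal G\bigr]
   = 2\,\mathbb{E}[Y\mid\mathcal G]-\limsup_k \mathbb{E}\bigl[\lvert X_k-X\rvert\,\big|\,\mathcal G\bigr]
   \quad\text{a.s.,}
\end{align*}
```

and since $\mathbb{E}[Y\mid\mathcal G]<\infty$ a.s. (as $Y\in L^1$), the $\limsup$ term can be cancelled, giving $\limsup_k \mathbb{E}[\lvert X_k-X\rvert\mid\mathcal G]=0$ a.s.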

Any ideas for part 2 are welcome!

Best Answer

Let $(X_n)_{n\geqslant 1}$ be a sequence of random variables on a probability space $(\Omega,\mathcal F,\mathbb P)$ that converges almost surely to a random variable $X$. Let $\mathcal G$ be a sub-$\sigma$-algebra of $\mathcal F$. Consider the following four conditions:

  • $(C1)$: there exists an integrable random variable $Y$ such that $\lvert X_n\rvert\leqslant Y$ almost surely.
  • $(C2)$: there exists an integrable random variable $Y$ such that for each $x>0$, $\mathbb P\left(\lvert X_n\rvert>x\mid\mathcal G\right)\leqslant \mathbb P\left(Y>x\mid\mathcal G\right)$ almost surely.
  • $(C3)$: there exists an integrable random variable $Y$ such that for each $x>0$, $\mathbb P\left(\lvert X_n\rvert>x\right)\leqslant \mathbb P\left(Y>x\right)$.
  • $(C4)$: the sequence $\left(X_n\right)_{n\geqslant 1}$ is uniformly integrable.

The chain of implications $(C1)\Rightarrow (C2)\Rightarrow (C3) \Rightarrow (C4)$ holds. Now the question is: which of the conditions $(Ci)$, $1\leqslant i\leqslant 4$, guarantee that $$ \tag{*}\mathbb E\left[X_n\mid\mathcal G\right]\to \mathbb E\left[X\mid\mathcal G\right]\mbox{ almost surely.} $$

As shown in the opening post, $(C1)$ is sufficient, and it was established in this thread that $(C4)$ is not sufficient in general.

Interestingly, the example provided in this thread also satisfies the stochastic domination in $(C3)$. To see this, let us write out the counterexample explicitly. Let $\left(A_n\right)_{n\geqslant 1}$ and $\left(B_n\right)_{n\geqslant 1}$ be two mutually independent sequences of events, each consisting of independent events, with $\mathbb P(A_n)=1/n$ and $\mathbb P(B_n)=1/n^2$. Let $Y_n=\mathbf{1}_{A_n}$, $Z_n=n^2\mathbf{1}_{B_n}$ and $X_n=Y_nZ_n$, and let $\mathcal G:= \sigma(A_k,k\geqslant 1)$.

  • By the first Borel-Cantelli lemma, $\mathbb P(\limsup_n B_n)=0$ hence for almost every $\omega$, there exists an $n(\omega)$ such that $X_n(\omega)=0$ for $n\geqslant n(\omega)$. Therefore, $X_n\to X:=0$ almost surely.
  • For a positive $x$, $\mathbb P(\lvert X_n\rvert>x)=\mathbf 1_{(0,n^2)}(x)\,\mathbb P(A_n\cap B_n)=\mathbf 1_{(0,n^2)}(x)\,n^{-3}$, hence $$ \sup_{n\geqslant 1}\mathbb P(\lvert X_n\rvert>x)= \sup_{n\geqslant 1}\mathbf 1_{(0,n^2)}(x)\,n^{-3}, $$ where the supremum is attained at $n=\lfloor \sqrt{x}\rfloor+1$. Hence any random variable $Y$ whose tail behaves like $(\lfloor \sqrt{x}\rfloor+1)^{-3}$ does the job, as such a random variable is integrable.
  • Since $B_n$ is independent of $\mathcal G$, $\mathbb E\left[X_n\mid\mathcal G\right]=n^2\mathbf{1}_{A_n}\mathbb E\left[\mathbf{1}_{B_n}\right]=\mathbf{1}_{A_n}$, and by the second Borel-Cantelli lemma (as $\sum_n \mathbb P(A_n)=\infty$), $\mathbb E\left[X_n\mid\mathcal G\right]$ does not converge to $0$ almost surely.
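
None of the following is from the thread; it is a small numerical sanity check of the counterexample's ingredients, assuming the dominating tail is exactly $\mathbb P(Y>x)=(\lfloor\sqrt x\rfloor+1)^{-3}$. It verifies the telescoping identity $\prod_{n=N}^{M}(1-1/n)=(N-1)/M$ (so the $A_n$ occur infinitely often), bounds the summable tail $\sum_{n\geqslant N} n^{-2}$, and evaluates $\mathbb E[Y]=\int_0^\infty(\lfloor\sqrt x\rfloor+1)^{-3}\,dx=\sum_{k\geqslant 1}(2k-1)/k^3=2\zeta(2)-\zeta(3)<\infty$.

```python
from fractions import Fraction
import math

# Second Borel-Cantelli direction: P(A_n) = 1/n, so sum_n P(A_n) diverges.
# Telescoping: P(no A_n for N <= n <= M) = prod_{n=N}^{M} (1 - 1/n) = (N-1)/M,
# which tends to 0 as M -> infinity, so the A_n occur infinitely often.
N = 10
for M in (100, 1000, 10000):
    p_none = Fraction(1)
    for n in range(N, M + 1):
        p_none *= 1 - Fraction(1, n)
    assert p_none == Fraction(N - 1, M)

# First Borel-Cantelli direction: P(B_n) = 1/n^2 is summable, and the tail
# sum_{n >= N} 1/n^2 (an upper bound on P(some B_n, n >= N)) is small;
# by integral comparison it is below 1/(N-1).
tail = sum(1.0 / n**2 for n in range(N, 10**6))
assert tail < 1.0 / (N - 1)

# Integrability of the dominating Y with P(Y > x) = (floor(sqrt(x)) + 1)**(-3):
# on [m^2, (m+1)^2) the tail is constant, giving
# E[Y] = sum_{k >= 1} (2k - 1)/k^3 = 2*zeta(2) - zeta(3), a finite value.
expected_Y = sum((2 * k - 1) / k**3 for k in range(1, 10**6))
zeta3 = 1.2020569031595943
assert abs(expected_Y - (math.pi**2 / 3 - zeta3)) < 1e-5
print(f"E[Y] = {expected_Y:.6f} (finite), tail bound at N={N}: {tail:.6f}")
```

The exact-arithmetic check with `Fraction` confirms the telescoping product without floating-point error; the two sums are plain partial-sum approximations.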

It remains to see whether $(C2)$ is sufficient. For $R>0$, let $\Phi_R\colon\mathbb R\to \mathbb R$ be the truncation defined by $\Phi_R(t)=-R$ if $t<-R$; $\Phi_R(t)=t$ if $-R\leqslant t\leqslant R$; and $\Phi_R(t)=R$ if $t>R$. Since $\left( \Phi_R(X_n) \right)_{n\geqslant 1}$ converges almost surely to $\Phi_R(X)$ and $\lvert \Phi_R(X_n)\rvert\leqslant R$, part 1 gives $\mathbb E\left[ \Phi_R(X_n)\mid \mathcal G\right]\to \mathbb E\left[ \Phi_R(X)\mid \mathcal G\right]$ almost surely. Moreover, applying the inequality in $(C2)$ with $x=2^j$ and summing over $j\geqslant k$, we can show that $$ \tag{C5} \sup_{n\geqslant 1} \mathbb E\left[\lvert X_n\rvert\mathbf{1}_{\{\lvert X_n\rvert>2^k\}}\mid\mathcal G\right]\to 0\mbox{ a.s. as }k\to \infty. $$ Since $$ \left\lvert \mathbb E\left[X_n\mid\mathcal G\right]-\mathbb E\left[X\mid\mathcal G\right]\right\rvert\leqslant \left\lvert \mathbb E\left[\Phi_{2^k}(X_n)\mid\mathcal G\right]-\mathbb E\left[\Phi_{2^k}(X)\mid\mathcal G\right]\right\rvert+\left\lvert \mathbb E\left[X_n-\Phi_{2^k}(X_n)\mid\mathcal G\right]\right\rvert+\left\lvert \mathbb E\left[X-\Phi_{2^k}(X)\mid\mathcal G\right]\right\rvert $$ and, for any random variable $Z$, $\lvert Z-\Phi_{2^k}(Z)\rvert\leqslant \lvert Z \rvert\mathbf{1}_{\{\lvert Z \rvert>2^k\}}$, it follows that $$ \left\lvert \mathbb E\left[X_n\mid\mathcal G\right]-\mathbb E\left[X\mid\mathcal G\right]\right\rvert\leqslant \left\lvert \mathbb E\left[\Phi_{2^k}(X_n)\mid\mathcal G\right]-\mathbb E\left[\Phi_{2^k}(X)\mid\mathcal G\right]\right\rvert+ \mathbb E\left[\lvert X_n\rvert\mathbf{1}_{\{\lvert X_n \rvert>2^k\}}\mid\mathcal G\right]+ \mathbb E\left[\lvert X \rvert\mathbf{1}_{\{\lvert X \rvert>2^k\}}\mid\mathcal G\right]. $$ Taking $\limsup_n$ and then letting $k\to\infty$ (the last term vanishes a.s. by part 1 applied to $\lvert X\rvert\mathbf{1}_{\{\lvert X\rvert>2^k\}}$, which is dominated by the integrable $\lvert X\rvert$), we conclude that $(C2)$ is sufficient.
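
The dyadic summation behind $(C5)$ can be sketched as follows, decomposing $\{\lvert X_n\rvert>2^k\}$ into the slices $\{2^j<\lvert X_n\rvert\leqslant 2^{j+1}\}$ and then using $(C2)$ at $x=2^j$:

```latex
\begin{align*}
\mathbb E\bigl[\lvert X_n\rvert\mathbf 1_{\{\lvert X_n\rvert>2^k\}}\mid\mathcal G\bigr]
  &\leqslant \sum_{j\geqslant k} 2^{j+1}\,\mathbb P\bigl(\lvert X_n\rvert>2^{j}\mid\mathcal G\bigr)
   \leqslant \sum_{j\geqslant k} 2^{j+1}\,\mathbb P\bigl(Y>2^{j}\mid\mathcal G\bigr)\\
  &\leqslant 4\,\mathbb E\bigl[Y\mathbf 1_{\{Y>2^{k}\}}\mid\mathcal G\bigr]
   \quad\text{a.s.,}
\end{align*}
```

where the last step uses the pointwise bound $\sum_{j\geqslant k} 2^{j+1}\mathbf 1_{\{Y>2^j\}}\leqslant 4Y\mathbf 1_{\{Y>2^k\}}$ (a finite geometric sum when $2^m<Y\leqslant 2^{m+1}$, $m\geqslant k$). The bound is uniform in $n$, and $\mathbb E[Y\mathbf 1_{\{Y>2^k\}}\mid\mathcal G]\to 0$ a.s. as $k\to\infty$ by part 1, since $Y\mathbf 1_{\{Y>2^k\}}\to 0$ a.s. dominated by the integrable $Y$; this yields $(C5)$.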

Actually, the proof shows that the weaker condition $(C5)$ is itself sufficient, and we have $(C1)\Rightarrow (C2)\Rightarrow (C5)$.