Consider, on a probability space $(\Omega,\mathcal{F},P)$, a sub-$\sigma$-algebra $\mathcal{G}$ and a sequence of random variables $(X_k)_k$ converging a.s. to $X$. Let $Y \in L^1.$
- Prove that if $|X_k| \leq Y$ for every $k$, then $E[X_k|\mathcal{G}]$ converges a.s. to $E[X|\mathcal{G}]$.
- Does it follow that $\lim_k E[X_k|\mathcal{G}]=E[X|\mathcal{G}]$ a.s.
a. if for every $x>0$, $P(|X_k|>x) \leq P(Y>x)$?
b. if for every $x>0$, $P(|X_k|>x\mid\mathcal{G}) \leq P(Y>x\mid\mathcal{G})$ a.s.?
Attempt:
- The result follows by applying the conditional Fatou lemma to the nonnegative random variables $2Y-|X_k-X|$: $E[\liminf_k(2Y-|X_k-X|)\mid\mathcal{G}] \leq \liminf_k E[2Y-|X_k-X|\mid\mathcal{G}]$ a.s., which implies that $2E[Y|\mathcal{G}]\leq 2 E[Y|\mathcal{G}]-\limsup_kE[|X_k-X|\mid\mathcal{G}]$ a.s., so that $\limsup_kE[|X_k-X|\mid\mathcal{G}]=0$ a.s. Since $|E[X_k|\mathcal{G}]-E[X|\mathcal{G}]|\leq E[|X_k-X|\mid\mathcal{G}]$, this concludes the proof.
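As a sanity check (not part of the proof), conditional dominated convergence can be seen concretely on a finite probability space, where $E[\,\cdot\mid\mathcal{G}]$ is just a weighted average over the atoms of a partition. The space, partition, and variables below are made up purely for illustration:

```python
from fractions import Fraction

# Uniform measure on an 8-point space; G is generated by a two-atom partition.
omega = range(8)
P = {w: Fraction(1, 8) for w in omega}
partition = [{0, 1, 2, 3}, {4, 5, 6, 7}]

def cond_exp(f, partition, P):
    """E[f|G] as a function on omega: on each atom, the P-weighted average of f."""
    out = {}
    for atom in partition:
        mass = sum(P[w] for w in atom)
        avg = sum(f(w) * P[w] for w in atom) / mass
        for w in atom:
            out[w] = avg
    return out

X = lambda w: Fraction(w)               # the a.s. limit
def X_k(k):                             # X_k -> X pointwise, and |X_k| <= w + 1 =: Y
    return lambda w: Fraction(w) + (Fraction(1, k) if w % 2 == 0 else Fraction(0))

EX = cond_exp(X, partition, P)
for k in (1, 10, 100):
    EXk = cond_exp(X_k(k), partition, P)
    gap = max(abs(EXk[w] - EX[w]) for w in omega)
    print(k, gap)                       # the gap is 1/(2k): it shrinks to 0
```

Here the gap between $E[X_k|\mathcal{G}]$ and $E[X|\mathcal{G}]$ is exactly $1/(2k)$ on each atom, matching the a.s. convergence the proof guarantees.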
Any ideas for part 2. are welcomed!
Best Answer
Let $(X_n)_{n\geqslant 1}$ be a sequence of random variables on a probability space $(\Omega,\mathcal F,\mathbb P)$ that converges almost surely to a random variable $X$, let $Y\in L^1$ be nonnegative, and let $\mathcal G$ be a sub-$\sigma$-algebra of $\mathcal F$. Consider the following four conditions:

- $(C1)$ $|X_n|\leqslant Y$ almost surely for every $n\geqslant 1$;
- $(C2)$ for every $x>0$ and every $n\geqslant 1$, $\mathbb P(|X_n|>x\mid\mathcal G)\leqslant \mathbb P(Y>x\mid\mathcal G)$ almost surely;
- $(C3)$ for every $x>0$ and every $n\geqslant 1$, $\mathbb P(|X_n|>x)\leqslant \mathbb P(Y>x)$;
- $(C4)$ the sequence $(X_n)_{n\geqslant 1}$ is uniformly integrable.
The chain of implications $(C1)\Rightarrow (C2)\Rightarrow (C3) \Rightarrow (C4)$ holds. The question is now: which of the conditions $(Ci)$, $1\leqslant i\leqslant 4$, guarantee that $$ \tag{*}\mathbb E\left[X_n\mid\mathcal G\right]\to \mathbb E\left[X\mid\mathcal G\right]\quad\text{almost surely.} $$
As shown in the opening post, $(C1)$ is sufficient, and it was established in this thread that $(C4)$ is not sufficient in general.
Interestingly, the example provided in this thread also satisfies the stochastic domination in $(C3)$. To see this, let us write the counter-example explicitly. Let $\left(A_n\right)_{n\geqslant 1}$ and $\left(B_n\right)_{n\geqslant 1}$ be two mutually independent sequences of events, both sequences consisting of independent events, and such that $\mathbb P(A_n)=1/n$, $\mathbb P(B_n)=1/n^2$. Let $Y_n=\mathbf{1}_{A_n}$, $Z_n=n^2\mathbf{1}_{B_n}$ and $X_n=Y_nZ_n$. Let $\mathcal G:= \sigma(A_k,k\geqslant 1)$.
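In this example, $Z_n$ is independent of $\mathcal G$ and $Y_n$ is $\mathcal G$-measurable, so $\mathbb E[X_n\mid\mathcal G]=Y_n\,\mathbb E[Z_n]=\mathbf 1_{A_n}\,n^2\,\mathbb P(B_n)=\mathbf 1_{A_n}$, while $X_n\to 0$ almost surely since $\sum_n \mathbb P(B_n)<\infty$. A small exact computation (illustrative only; the function name is made up) confirms the key identity $n^2\,\mathbb P(B_n)=1$ for every $n$:

```python
from fractions import Fraction

def cond_exp_Xn_on_atoms(n):
    """E[X_n|G] in the counterexample: since B_n is independent of G and
    A_n is G-measurable, E[X_n|G] = 1_{A_n} * n^2 * P(B_n).
    Return its exact value on the atom A_n and on its complement."""
    pB = Fraction(1, n * n)             # P(B_n) = 1/n^2
    return n * n * pB, Fraction(0)      # value on A_n, value off A_n

for n in (1, 2, 10, 1000):
    on_An, off_An = cond_exp_Xn_on_atoms(n)
    print(n, on_An, off_An)             # always "n 1 0", i.e. E[X_n|G] = 1_{A_n}
```

Since the $A_n$ are independent with $\sum_n \mathbb P(A_n)=\infty$, the second Borel-Cantelli lemma gives $\mathbf 1_{A_n}=1$ infinitely often almost surely, so $\mathbb E[X_n\mid\mathcal G]$ does not converge to $\mathbb E[X\mid\mathcal G]=0$.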
It remains to see whether $(C2)$ is sufficient. For $R>0$, let $\Phi_R\colon\mathbb R\to \mathbb R$ be the truncation defined by $\Phi_R(t)=-R$ if $t<-R$; $\Phi_R(t)=t$ if $-R\leqslant t\leqslant R$; and $\Phi_R(t)=R$ if $t>R$. Since $\left( \Phi_R(X_n) \right)_{n\geqslant 1}$ converges almost surely to $\Phi_R(X)$ and $\lvert \Phi_R(X_n)\rvert\leqslant R$, the first part yields $\mathbb E\left[ \Phi_R(X_n)\mid \mathcal G\right]\to \mathbb E\left[ \Phi_R(X)\mid \mathcal G\right]$ almost surely.

Moreover, applying the inequality in $(C2)$ with $x=2^j$ and summing over $j\geqslant k$, we can show that $$ \tag{C5} \sup_{n\geqslant 1} \mathbb E\left[\lvert X_n\rvert\mathbf{1}_{\{\lvert X_n\rvert>2^k\}}\mid\mathcal G\right]\to 0\mbox{ a.s. as }k\to \infty. $$

Since $$ \left\lvert \mathbb E\left[X_n\mid\mathcal G\right]-\mathbb E\left[X\mid\mathcal G\right]\right\rvert\leqslant \left\lvert \mathbb E\left[\Phi_{2^k}(X_n)\mid\mathcal G\right]-\mathbb E\left[\Phi_{2^k}(X)\mid\mathcal G\right]\right\rvert+\left\lvert \mathbb E\left[X_n-\Phi_{2^k}(X_n)\mid\mathcal G\right]\right\rvert+\left\lvert \mathbb E\left[X-\Phi_{2^k}(X)\mid\mathcal G\right]\right\rvert $$ and, for any random variable $Z$, $\lvert Z-\Phi_{2^k}(Z)\rvert\leqslant \lvert Z \rvert\mathbf{1}_{\{\lvert Z \rvert>2^k\}}$, it follows that $$ \left\lvert \mathbb E\left[X_n\mid\mathcal G\right]-\mathbb E\left[X\mid\mathcal G\right]\right\rvert\leqslant \left\lvert \mathbb E\left[\Phi_{2^k}(X_n)\mid\mathcal G\right]-\mathbb E\left[\Phi_{2^k}(X)\mid\mathcal G\right]\right\rvert+ \mathbb E\left[\lvert X_n\rvert\mathbf{1}_{\{\lvert X_n \rvert>2^k\}}\mid\mathcal G\right]+ \mathbb E\left[\lvert X \rvert\mathbf{1}_{\{\lvert X \rvert>2^k\}}\mid\mathcal G\right]. $$ Letting $n\to\infty$ for fixed $k$, the first term on the right vanishes almost surely and the second is bounded by the supremum in $(C5)$; letting then $k\to\infty$, the two remaining terms tend to $0$ almost surely, by $(C5)$ and by conditional monotone convergence respectively. Hence $(C2)$ is sufficient.
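The two pointwise facts about the truncation used above, namely $\lvert z-\Phi_R(z)\rvert\leqslant \lvert z\rvert\mathbf{1}_{\{\lvert z\rvert>R\}}$ and that $\Phi_R$ is a clamp (hence $1$-Lipschitz, so $\Phi_R(X_n)\to\Phi_R(X)$ wherever $X_n\to X$), are easy to check numerically; this is only a sketch of those pointwise facts, not of the conditional-expectation argument itself:

```python
def phi(R, t):
    """Truncation at level R: phi_R(t) = max(-R, min(t, R))."""
    return max(-R, min(t, R))

R = 4.0
for z in [-10.0, -4.0, -1.5, 0.0, 3.0, 4.0, 7.5]:
    # |z - phi_R(z)| <= |z| * 1_{|z| > R}, the inequality used in the bound
    lhs = abs(z - phi(R, z))
    rhs = abs(z) * (1.0 if abs(z) > R else 0.0)
    assert lhs <= rhs
print("pointwise bounds verified")
```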
Actually, the proof shows that $(C5)$ is sufficient, and we have $(C1)\Rightarrow (C2)\Rightarrow (C5)$.