Conditional expectation in Markov chain proof

conditional-expectation, markov-chains, probability-theory

In Durrett's Probability: Theory and Examples, when proving the Chapman-Kolmogorov equation, we do the following:

\begin{align*}
P_x(X_{m+n} = z) &= E_x[P_x(X_{m+n} = z \mid \mathcal{F}_m)]\\
&= E_x [ P_{X_m} (X_n = z)] \quad (\text{by Markov Property})\\
&= \sum_y P_x(X_m = y) P_y(X_n = z)
\end{align*}

To get from the first line to the second, I think what he did was
\begin{align*}
P_x(X_{m+n} = z) &= E_x[1_{\{X_{m+n} = z\}}]\\
&= E_x[1_{\{X_{n} = z\}} \circ \theta_m]\\
&= E_x[E_x[1_{\{X_{n} = z\}} \circ \theta_m \mid \mathcal{F}_m]] \quad (?)
\end{align*}

where $\theta_m$ is the shift operator, so that $1_{\{X_{n} = z\}} \circ \theta_m = 1_{\{X_{m+n} = z\}}$.

How can we take the conditional expectation given $\mathcal{F}_m$ when we don't know whether $1_{\{X_{n} = z\}} \circ \theta_m$ is $\mathcal{F}_m$-measurable?

Best Answer

We do not need $1_{\{X_{n} = z\}} \circ \theta_m$ to be $\mathcal F_m$-measurable (and in general it is not). What is used is rather the tower property: $\mathbb E\left[X\right]=\mathbb E\left[\mathbb E\left[X\mid\mathcal G\right]\right]$ for any sub-$\sigma$-algebra $\mathcal G$ and any integrable random variable $X$. This holds without assuming that $X$ is $\mathcal G$-measurable.
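
To make this concrete, here is a sketch of the full chain of equalities, writing $Y = 1_{\{X_n = z\}} \circ \theta_m$ and taking $\mathcal G = \mathcal F_m$, with the Markov property applied exactly as in Durrett's display above:

\begin{align*}
P_x(X_{m+n} = z) &= E_x[Y]\\
&= E_x\big[E_x[Y \mid \mathcal{F}_m]\big] && \text{(tower property; no $\mathcal{F}_m$-measurability of $Y$ needed)}\\
&= E_x\big[E_{X_m}[1_{\{X_n = z\}}]\big] && \text{(Markov property)}\\
&= E_x\big[P_{X_m}(X_n = z)\big]\\
&= \sum_y P_x(X_m = y)\, P_y(X_n = z),
\end{align*}

where the last line uses that $P_{X_m}(X_n = z)$ is a function of $X_m$, which (for a countable state space) takes countably many values $y$.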

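If it helps to see the identity numerically: for a finite-state chain, the Chapman-Kolmogorov equation says the $(m+n)$-step transition matrix is the product of the $m$-step and $n$-step matrices. Below is a minimal sketch in Python/NumPy, with a made-up $3\times 3$ transition matrix chosen purely for illustration.

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1), for illustration only.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

m, n = 2, 3

# m-step and n-step transition probabilities p^m(x, y) and p^n(y, z).
Pm = np.linalg.matrix_power(P, m)
Pn = np.linalg.matrix_power(P, n)

# Chapman-Kolmogorov: p^{m+n}(x, z) = sum_y p^m(x, y) p^n(y, z),
# i.e. the (m+n)-step matrix equals the matrix product Pm @ Pn.
Pmn = np.linalg.matrix_power(P, m + n)
assert np.allclose(Pmn, Pm @ Pn)
print(Pmn)
```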