Markov Property definition via conditional expectation

markov-process

In several textbooks I have seen the following equivalent statement of the Markov property:

Let $\{X_t\}_{t \geq 0}$ be a stochastic process and $\mathcal{F}_u^v = \sigma\{X_s, s \in [u,v]\}$. Then $\{X_t\}_{t \geq 0}$ has the Markov property iff for all $0 \leq t < s$ and every bounded Borel function $g$ we have $$ \mathbf{E}\{g(X_s)\,|\,\mathcal{F}_0^t\} = \mathbf{E}\{g(X_s)\,|\, \mathcal{F}_t^t\}.$$

I wonder whether the condition that $g$ be bounded is really necessary. Does the statement also hold for every Borel function $g$ with $\mathbf{E}|g(X_s)| < \infty$?

Best Answer

Yes. If the equation holds for bounded measurable functions, then in particular it holds for indicator functions and hence for simple functions. For a non-negative measurable $g$, choose simple functions $0 \leq g_n \uparrow g$; the conditional Monotone Convergence Theorem lets you pass to the limit on both sides, so the equation holds for all non-negative measurable $g$. Finally, write $g = g^+ - g^-$. If $\mathbf{E}|g(X_s)| < \infty$, then both $g^+(X_s)$ and $g^-(X_s)$ are integrable, and by linearity of conditional expectation the equation holds for every Borel $g$ with $\mathbf{E}|g(X_s)| < \infty$.
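As a numerical illustration (not a proof), one can check the conclusion on a simple symmetric random walk with the unbounded function $g(x) = x^2$, which satisfies $\mathbf{E}|g(X_s)| < \infty$. For this walk, $\mathbf{E}[X_s^2 \mid \mathcal{F}_t^t] = X_t^2 + (s-t)$ exactly, and the Markov property says the conditional mean given the whole path up to time $t$ should agree with that. A minimal Monte Carlo sketch (all parameter choices here are arbitrary):

```python
import numpy as np

# Simple symmetric random walk: check empirically that
# E[g(X_s) | X_1, ..., X_t] depends only on X_t, with g(x) = x^2 unbounded.
rng = np.random.default_rng(0)
n, t, s = 200_000, 3, 6

steps = rng.choice([-1, 1], size=(n, s))   # i.i.d. +/-1 increments
paths = steps.cumsum(axis=1)               # rows are (X_1, ..., X_s)
g_Xs = paths[:, s - 1] ** 2                # g(X_s) = X_s^2

# Exact value of E[g(X_s) | X_t]: X_s - X_t has mean 0 and variance s - t,
# so E[X_s^2 | X_t] = X_t^2 + (s - t).
worst = 0.0
for prefix in set(map(tuple, paths[:, :t])):
    mask = (paths[:, :t] == prefix).all(axis=1)
    cond_mean = g_Xs[mask].mean()            # mean given the whole prefix
    markov_mean = prefix[-1] ** 2 + (s - t)  # depends on X_t alone
    worst = max(worst, abs(cond_mean - markov_mean))

print(f"max deviation over all prefixes: {worst:.3f}")
```

Each of the $2^3 = 8$ possible prefixes gets about 25,000 samples, so the conditional means given the full history agree with $X_t^2 + (s-t)$ up to small Monte Carlo error, consistent with the extension of the Markov property to integrable $g$.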