Markov Property – Is It Equivalent to Past and Future Independence Given Present

Tags: markov-chains, markov-process, probability, stochastic-processes

Let $(X_n)_{n\geq 0}$ be a stochastic chain.

We say that it has the Markov property if, for all $n$, $$\mathbb{P}(X_n\in \cdot|X_{n-1},…,X_0)=\mathbb{P}(X_n\in \cdot|X_{n-1}),$$ i.e. the distribution of $X_n$ given the whole past $(X_0,…,X_{n-1})$ only depends on $X_{n-1}$.

We say that "past and future are independent given the present" if for all $n,k\geq 0$ and all measurable $f,g\geq 0$ we have $$\mathbb{E}(f(X_0,…,X_{n-1})g(X_{n+1},…,X_{n+k})|X_n)=\mathbb{E}(f(X_0,…,X_{n-1})|X_n)\mathbb{E}(g(X_{n+1},…,X_{n+k})|X_n).$$
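As a sanity check of this definition, the conditional-independence identity can be verified numerically on a small finite-state chain. The sketch below (with a hypothetical two-state chain, initial distribution `mu` and transition matrix `P` chosen only for illustration) checks the identity at $n=1$, $k=1$ with $f$ and $g$ taken to be indicator functions:

```python
import itertools

# Hypothetical two-state Markov chain: initial distribution mu, transition matrix P.
mu = [0.6, 0.4]
P = [[0.7, 0.3],
     [0.2, 0.8]]

def path_prob(path):
    """P(X_0 = path[0], ..., X_m = path[m]) for the chain (mu, P)."""
    p = mu[path[0]]
    for a, b in zip(path, path[1:]):
        p *= P[a][b]
    return p

# Check "past and future are independent given the present" at n = 1, k = 1:
# P(X0=a, X2=c | X1=b) should equal P(X0=a | X1=b) * P(X2=c | X1=b).
for a, b, c in itertools.product(range(2), repeat=3):
    p_b = sum(path_prob((x0, b, x2))
              for x0, x2 in itertools.product(range(2), repeat=2))  # P(X1 = b)
    lhs = path_prob((a, b, c)) / p_b
    rhs = (sum(path_prob((a, b, x2)) for x2 in range(2)) / p_b) \
        * (sum(path_prob((x0, b, c)) for x0 in range(2)) / p_b)
    assert abs(lhs - rhs) < 1e-12

print("conditional independence verified")
```

The same enumeration extends to longer pasts and futures; only the path lengths in the products change.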

It is easy to show that the Markov property implies that "past and future are independent given the present". Is the reverse implication also true (as John Dawkins's answer to the question The "inverse" of the Markov property: conditioning on the future rather than the past seems to suggest)?

Best Answer

Yes. Let $f$ be the indicator function $I_A$, where $A$ is the set $\{x_0\}\times\{x_1\}\times\cdots\times\{x_{n-1}\}$, i.e. the singleton consisting of the vector $(x_0,x_1,\ldots,x_{n-1})\in {\mathbb R}^{n}$. Next, put $g(u_1,\ldots,u_k):=I_C(u_1)$, where $C$ is the singleton $\{z\}$. Then the property that past and future are independent given the present, applied to this $f$ and $g$, asserts that $$H(X_n)=F(X_n)G(X_n),$$ where $$\begin{aligned} H(y)&:=P(X_0=x_0,X_1=x_1,\ldots,X_{n-1}=x_{n-1},X_{n+1}=z\mid X_n=y),\\ F(y)&:=P(X_0=x_0,X_1=x_1,\ldots, X_{n-1}=x_{n-1}\mid X_n=y),\\ G(y)&:=P(X_{n+1}=z\mid X_n=y).\end{aligned}$$

Now fix $y$ with $F(y)>0$ (equivalently, the conditioning event below has positive probability) and rearrange the identity $H(y)=F(y)G(y)$ into the form $$P(X_{n+1}=z\mid X_0=x_0,X_1=x_1,\ldots,X_{n-1}=x_{n-1},X_n=y)=P(X_{n+1}=z\mid X_n=y), $$ which implies the Markov property. Note that we are asserting the Markov property with $n+1$ in place of the $n$ appearing in the OP.
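For completeness, the rearrangement step is just a division by $F(y)$, valid whenever the past event has positive probability given $X_n=y$:

$$P(X_{n+1}=z\mid X_0=x_0,\ldots,X_{n-1}=x_{n-1},X_n=y)=\frac{P(X_0=x_0,\ldots,X_{n-1}=x_{n-1},X_n=y,X_{n+1}=z)}{P(X_0=x_0,\ldots,X_{n-1}=x_{n-1},X_n=y)}=\frac{H(y)}{F(y)}=G(y),$$

since numerator and denominator become $H(y)$ and $F(y)$ after each is divided by $P(X_n=y)$, and $H(y)/F(y)=G(y)$ by the identity above.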