[Math] Independence of past and future states in Markov Chains

independence, markov chains, probability, probability theory

I have seen this statement in a quiz:

Let $X_i$ denote the state at time $i$ in a Markov chain. It is necessarily true
that $X_{i+1}$ and $X_{i-1}$ are uncorrelated.

Apparently, this statement is false, but I can't figure out why. I thought that for Markov chains the past and future states are independent given the present. Did I misunderstand this?

Best Answer

The key phrase is "given the present". If past and future are independent given the present, it doesn't follow that past and future are unconditionally independent, let alone uncorrelated.
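To spell out the distinction in symbols (the notation $a, b, c$ here is mine, not part of the quiz): the Markov property gives conditional independence,

$$P(X_{i+1} = a,\ X_{i-1} = b \mid X_i = c) = P(X_{i+1} = a \mid X_i = c)\, P(X_{i-1} = b \mid X_i = c),$$

whereas "uncorrelated" is a statement about the unconditional joint distribution,

$$\operatorname{Cov}(X_{i-1}, X_{i+1}) = E[X_{i-1} X_{i+1}] - E[X_{i-1}]\, E[X_{i+1}] = 0,$$

and the first does not imply the second.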

For example, consider the simple random walk that takes a step either left or right with equal probability. If you know where I am today, then knowing where I was yesterday won't affect where you think I'll be tomorrow. On the other hand, if you don't know where I am today, then knowing where I was yesterday does affect where you think I'll be tomorrow, since tomorrow I can be at most two steps away from where I was yesterday.
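To make this quantitative, here is a quick covariance computation; the starting point $X_0 = 0$ and the step variables $Z_k$ are assumptions I'm introducing for the sketch. Write the walk as $X_n = Z_1 + \cdots + Z_n$, where the $Z_k$ are i.i.d. with $P(Z_k = +1) = P(Z_k = -1) = \tfrac12$. Then $X_{i+1} = X_{i-1} + Z_i + Z_{i+1}$, and since $Z_i, Z_{i+1}$ are independent of $X_{i-1}$, for $i \ge 2$

$$\operatorname{Cov}(X_{i-1}, X_{i+1}) = \operatorname{Cov}(X_{i-1},\ X_{i-1} + Z_i + Z_{i+1}) = \operatorname{Var}(X_{i-1}) = i - 1 > 0.$$

So $X_{i-1}$ and $X_{i+1}$ are positively correlated, which is exactly why the quiz statement is false.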