Markov property conditioned on a future event

Tags: markov-chains, markov-process, probability

By the definition of the Markov property, we know that

$$P(X_n = x_n \mid X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_n = x_n \mid X_{n-1} = x_{n-1})$$

However, in my problem I have this probability

$$P(X_n = n \mid X_{n+k} = i, X_{n-1} = j)$$

Can I say that this is equivalent to $P(X_n = n \mid X_{n-1} = j)$? Can the future affect the past in Markov chain problems?

Best Answer

Yes, information can go backward in time, so you cannot make the reduction you want here: conditioning on a future state changes the conditional distribution of $X_n$.

For an example, think of the simple random walk started at $0$: if you know that $X_3 = 3$, then it must be that $X_1 = 1$ and $X_2 = 2$; there is no other way to reach $3$ in three steps.
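
To make this concrete, here is a short calculation (a sketch, assuming the walk takes steps $\pm 1$ with probability $\tfrac12$ each and $X_0 = 0$). Without any information about the future,

$$P(X_1 = 1 \mid X_0 = 0) = \tfrac{1}{2},$$

but once the future value $X_3 = 3$ is known,

$$P(X_1 = 1 \mid X_0 = 0,\, X_3 = 3) = \frac{P(X_1 = 1,\, X_3 = 3 \mid X_0 = 0)}{P(X_3 = 3 \mid X_0 = 0)} = \frac{(1/2)^3}{(1/2)^3} = 1,$$

since the only path from $0$ to $3$ in three steps goes through $1$ and $2$. So conditioning on a future event really does change the distribution of the present state.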
