Introduction: Natural language represented as a Markov model
A very simple and practical example would be that of natural language.
First of all, "natural language" is simply the term for any language you speak and have acquired or learned, such as English. When you communicate, you have a set of words $W$ and a set of operations you can perform on these words. You then arrange the words you know in a certain order to produce meaningful units of communication/semantics -- "sentences".
The words in a sentence are obviously not placed arbitrarily; they follow some order. In fact, sentence formation can be modeled well as a Markov chain of order "$k$"$=1$ (reference).
Sentence formation and conditional probability
Now map your six possible states $[1, 6]$ to six particular words. It can be determined experimentally that $P(C \mid B) \geq P(C \mid A, B)$, where $A, B$ is an ordered occurrence of the words $A$ and $B$ in a sentence. The author's claim is that this holds for a natural language because $P(C \mid B)$ already accounts for all possible trigrams of the form "$X\,B\,C$", where $X$ is any word of the language. In computational linguistics, this is also termed the transition probability.
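To make "transition probability" concrete, here is a minimal sketch of how one would estimate $P(w_2 \mid w_1)$ from bigram counts. The corpus and the function name `transition_prob` are illustrative inventions, not anything from the original question:

```python
from collections import defaultdict

# Toy corpus; in practice these counts come from a large text collection.
corpus = "the dog chased the cat and the dog ran".split()

# Count bigrams (w1, w2) and how often each word w1 occurs in first position.
bigram = defaultdict(int)
unigram = defaultdict(int)
for w1, w2 in zip(corpus, corpus[1:]):
    bigram[(w1, w2)] += 1
    unigram[w1] += 1

def transition_prob(w1, w2):
    """Estimate P(w2 | w1) as count(w1 w2) / count(w1 ...)."""
    return bigram[(w1, w2)] / unigram[w1] if unigram[w1] else 0.0

print(transition_prob("the", "dog"))  # 2 of the 3 occurrences of "the" precede "dog"
```

This maximum-likelihood estimate is exactly the order-$1$ Markov assumption: the next word depends only on the current one.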
I'll give you an example of that as well:
Let us, however, consider the backward direction: the probability $p_{k=1} = P(``A\ B")$ that $A$ is the word immediately preceding $B$ in running text. Now let $A$ be "chocolate" and $B$ be "chip". The historical literature of the language will yield some value $p_{k=1}$ for this phrase. Now consider another word $C$, representing "cookie". When you compute the new probability $p_{k=2} = P(``A\ B\ C")$, you find that $p_{k=2} \leq p_{k=1}$, because $P(``A\ B") = \sum_X P(``A\ B\ X")$ and there are more words you can say after "chocolate chip" other than simply "cookie". I have written "$\leq$" because theoretically it is possible that only one such meaningful triplet can be formed in a natural language, although that's rarely ever the case. You could simply replace it with a strict inequality for all practical purposes.
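The inequality $p_{k=2} \leq p_{k=1}$ can be checked on any corpus by comparing $n$-gram frequencies. A minimal sketch with an invented toy corpus (the phrase choices are illustrative, not data from the answer):

```python
from collections import Counter

# Toy corpus; real estimates would come from a large text collection.
words = ("chocolate chip cookie chocolate chip muffin "
         "potato chip snack chocolate chip cookie").split()

n = len(words)
bigrams = Counter(zip(words, words[1:]))
trigrams = Counter(zip(words, words[1:], words[2:]))

# A trigram "A B C" can never occur more often than its prefix bigram "A B",
# since count("A B") sums the counts of "A B X" over every possible word X.
p_k1 = bigrams[("chocolate", "chip")] / (n - 1)   # P("chocolate chip")
p_k2 = trigrams[("chocolate", "chip", "cookie")] / (n - 2)  # P("chocolate chip cookie")

print(p_k1, p_k2)
assert p_k2 <= p_k1
```

Here "chocolate chip" occurs three times but only twice is it followed by "cookie" (once by "muffin"), so the trigram probability is strictly smaller.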
I think this provides a sufficient as well as intuitive example for the problem that you have stated. Let me know if anything in the answer is unclear.
As requested in comments:
In your Monopoly game, the past has caused the present position, and the present position will affect the future; so in that sense, the past affects the future.
But several different possible pasts could have led to the present position, and the future will not be affected by which one of those possible pasts was the actual past. So given the present position, the future will be independent of the actual past.
Yes, information can go backward in time, so you cannot do the reduction that you want to do here.
For an example, think of something like the simple random walk started at $0$: if you know $X_3 = 3$, then it must be that $X_1 = 1$ and $X_2 = 2$; there is no other way to reach $3$ by time $3$.
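This can be checked exhaustively: among all $\pm 1$ walks of length $3$, only one reaches $3$ at time $3$. A short sketch:

```python
from itertools import product

# Enumerate all +/-1 walks of length 3 started at 0 and keep those with X_3 = 3.
paths = []
for steps in product([1, -1], repeat=3):
    x1 = steps[0]
    x2 = x1 + steps[1]
    x3 = x2 + steps[2]
    if x3 == 3:
        paths.append((x1, x2, x3))

print(paths)  # only (1, 2, 3): knowing X_3 = 3 pins down the entire past
```

So conditioning on the future value $X_3 = 3$ determines $X_1$ and $X_2$ exactly, which is the sense in which information flows backward in time.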