A semantic question on Markov chain random variables

markov-chains, probability

The introductory paragraphs of the Wikipedia article say:

A Markov chain is "a stochastic model describing a sequence of
possible events in which the probability of each event depends only on
the state attained in the previous event".[1]

In probability theory and related fields, a Markov process, named
after the Russian mathematician Andrey Markov, is a stochastic process
that satisfies the Markov property[2][3] (sometimes characterized as
"memorylessness"). Roughly speaking, a process satisfies the Markov
property if one can make predictions for the future of the process
based solely on its present state just as well as one could knowing
the process's full history, hence independently from such history;
i.e., conditional on the present state of the system, its future and
past states are independent.

A statement from a linear algebra book I studied from (I won't advertise it) says:

A Markov chain is a sequence of random variables with the property that given
the present state, the future and past states are independent.

My question is:
If the game of Monopoly can be modeled as a Markov chain, I feel there is a contradiction with the book's claim. Future dice outcomes are completely random (the dice are fair), but the territory you occupy with your pawns does affect your future moves. Past moves and the land you occupy do affect future moves, so they are not independent. What am I misunderstanding?

Best Answer

As requested in comments:

In your Monopoly game, the past has caused the present position, and the present position will affect the future, so in that sense the past affects the future.

But several different possible pasts could have led to the present position, and the future will not be affected by which of those possible pasts was the actual one. So, given the present position, the future is independent of the actual past.
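
To make this concrete, here is a minimal sketch (my own illustration, not part of the original answer) of a heavily simplified Monopoly pawn: the state is just the pawn's square on a 40-square board, and each move is the current square plus a fresh roll of two fair dice, ignoring Chance cards, jail, money, and property ownership. The function is deliberately given the whole history, but it only ever reads the last entry, so two different pasts that end on the same square produce the same distribution over the next square.

```python
import random
from collections import Counter

BOARD_SIZE = 40  # simplified Monopoly board: the state is just the pawn's square

def roll_two_dice():
    """Sum of two fair six-sided dice."""
    return random.randint(1, 6) + random.randint(1, 6)

def next_square(history):
    """The next square depends only on the last entry of the history
    (the present square) plus a fresh dice roll; the rest of the history
    is never consulted. That is the Markov property."""
    present = history[-1]
    return (present + roll_two_dice()) % BOARD_SIZE

def next_square_distribution(history, trials=100_000):
    """Empirical distribution of the next square, given a history of squares."""
    counts = Counter(next_square(history) for _ in range(trials))
    return {sq: n / trials for sq, n in sorted(counts.items())}

# Two different pasts that both end on square 10:
history_a = [0, 4, 10]
history_b = [0, 7, 10]

# The two distributions agree (up to sampling noise), because the future
# depends only on the present square, not on which past produced it.
print(next_square_distribution(history_a))
print(next_square_distribution(history_b))
```

In the full game, things like money, owned properties, and jail status also matter for the future, but the same resolution applies: fold everything relevant into the "present state", and then, given that state, the future is still independent of how the game arrived at it.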