Are all Stochastic Processes “Automatically” Markov Chains?

markov-chains, probability, stochastic-processes

I was reading these notes on Stochastic Processes (http://faculty.washington.edu/yenchic/18A_stat516/Lec3_DTMC_p1.pdf), where the following claim is stated:

[Image from the notes: the definition of a Discrete Time Markov Chain via the Markov Property.]

The way this claim is written, it seems to suggest that there might be some Discrete Time Stochastic Process that is not a Markov Chain, namely if the stated condition is not satisfied, i.e. if the process fails to satisfy
$$P(X_n = i \mid X_{n-1} = j, X_{n-2} = i_{n-2}, \dots, X_0 = i_0) = P(X_n = i \mid X_{n-1} = j),$$
so that the distribution of $X_n$ given the entire history depends on more than just $X_{n-1}$. In this case, I believe that this stated condition is sometimes also called the "Markov Property".

However, I tried to think of counterexamples, but I could not come up with a Discrete Time Stochastic Process that does not satisfy the stated condition – suggesting that perhaps every Discrete Time Stochastic Process is "automatically" a Markov Chain.

I am aware that there are higher-order Markov Chains, in which the probability of the chain being in some state at a future time does not depend only on the current state, but rather on some finite history of states. For instance, a "Second Order Markov Chain" is one in which the probability of being in some state depends on the previous two states instead of only the previous state (What is an example of a second-order markov chain?).
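In the same notation as the Markov property above, the second-order condition would read (a sketch based on the standard definition, not on anything specific to the linked notes):
$$P(X_n = i \mid X_{n-1} = i_{n-1}, X_{n-2} = i_{n-2}, \dots, X_0 = i_0) = P(X_n = i \mid X_{n-1} = i_{n-1}, X_{n-2} = i_{n-2}).$$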

  • But in general, can all Discrete Time Stochastic Processes be considered "some type of Markov Chain" (e.g. if not 1st order, then perhaps 2nd order, 3rd order, etc.)?

Is this just a formality, or are there actually some Discrete Time Stochastic Processes that fundamentally cannot be considered Markov Chains?

Thank you!

Best Answer

You could have a process that looks at the history arbitrarily far back. For example, you could have a coin that respects the gambler's fallacy: after a streak of $n$ times coming down one side, it has a $\frac1{n+1}$ probability of landing on that side again, and a $\frac{n}{n+1}$ probability of landing on the other side.
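For concreteness, here is a minimal Python sketch of such a coin; the function name `gamblers_fallacy_coin` and the choice to make the very first flip fair are my own illustrative assumptions, not part of the construction above.

```python
import random

def gamblers_fallacy_coin(num_flips, seed=0):
    """Simulate a coin that 'respects the gambler's fallacy': after a streak
    of n identical outcomes, the next flip repeats that outcome with
    probability 1/(n+1) and switches with probability n/(n+1)."""
    rng = random.Random(seed)
    flips = [rng.choice("HT")]                  # the first flip is taken to be fair (an assumption)
    streak = 1                                  # length of the current streak
    for _ in range(num_flips - 1):
        if rng.random() < 1 / (streak + 1):
            flips.append(flips[-1])             # the streak continues
            streak += 1
        else:
            flips.append("T" if flips[-1] == "H" else "H")   # the streak breaks
            streak = 1
    return flips

print("".join(gamblers_fallacy_coin(40)))
```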

Then no $k^{\text{th}}$ order Markov chain can properly describe it. For any $k$, consider two situations: a streak of length $k$ and a streak of length $k+1$. Both histories agree on the last $k$ flips, so a $k^{\text{th}}$ order Markov chain must assign them the same continuation probability, yet the process gives different probabilities ($\frac1{k+1}$ and $\frac1{k+2}$, respectively) that the streak continues.
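This can also be checked numerically. The (hypothetical) helper below reuses the `gamblers_fallacy_coin` sketch above: it estimates the probability that a streak continues, conditioned on the last $k$ flips all being equal, and splits the samples by whether the flip just before those $k$ also matched. A $k^{\text{th}}$ order Markov chain would have to give both groups the same rate, but the estimates come out near $\frac1{k+1}$ for the first group and at most $\frac1{k+2}$ for the second.

```python
from collections import defaultdict

def continuation_rates(flips, k):
    """Estimate P(next flip repeats the current side), conditioned on the
    last k flips being identical, split by whether the flip just before
    those k also matched (i.e. whether the streak is exactly k or longer)."""
    counts = defaultdict(lambda: [0, 0])        # group -> [continuations, trials]
    for i in range(k, len(flips) - 1):
        window = flips[i - k + 1 : i + 1]       # the last k flips, ending at position i
        if len(set(window)) != 1:
            continue                            # last k flips are not all equal
        longer = flips[i - k] == flips[i]       # does the streak extend past k?
        group = "streak > k" if longer else "streak == k"
        counts[group][1] += 1
        counts[group][0] += int(flips[i + 1] == flips[i])
    return {g: cont / trials for g, (cont, trials) in counts.items()}

# Assumes gamblers_fallacy_coin from the previous sketch is in scope.
flips = gamblers_fallacy_coin(200_000)
print(continuation_rates(flips, k=3))           # roughly {'streak == k': ~0.25, 'streak > k': ~0.19}
```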
