[Math] Markov Process, Markov Chain

markov-chains, markov-process, stochastic-processes

I am trying to explain the differences between the following concepts to someone and I realized I myself am super confused:

Continuous/discrete Markov Process

Continuous/Discrete Markov chains

Markov property: $\mathrm{P}\{X_n=i \mid X_{n-1}=j, X_{n-2}=k, \dots\} = \mathrm{P}\{X_n=i \mid X_{n-1}=j\}$?

I used to think: every process that has the Markov property is a Markov process, every Markov process is a Markov chain, and every Markov chain is a Markov process.

But that seems crazy now that I think about it, because if they are all the same, why are there different names for them?

And are they continuous (discrete) if their parameter set $T$ is continuous (discrete), regardless of their state space?

I want to get to homogeneous Markov chains and processes too, but since I am already confused, and Wikipedia is making me more confused, I would rather get these basic definitions straight first. (Any nice analogy that could help teach them to others would be highly appreciated as well, if any teacher here knows one.)

Thanks a lot

Best Answer

It might help to differentiate the various stochastic process types by both their state space and their time variable. (Note: discrete space/time is also called countable.) So there are 4 types:

  • Discrete spacetime: the process moves from state to state (each of which can be represented by an integer) in discrete steps. For example, imagine a random walk on a graph that takes a step for each $t\in \mathbb{Z}_{\geq 0}$.
  • Discrete-time continuous-space: the process moves in discrete turns, but takes continuous values. For instance, the classic (discrete-time) random walk of unit step-size on $\mathbb{R}^n$.
  • Continuous-time discrete-space: the process moves continuously in time, but in a countable space (e.g. see Continuous-time discrete-space models for animal movement).
  • Continuous spacetime: the time variable is continuous, and the process moves in a continuous space (e.g. $\mathbb{R}^n$). This includes Brownian motion and other Ito processes.
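
To make the first two types concrete, here is a minimal Python sketch (the function names and parameters are my own, for illustration): a type-1 walk on the integers and a type-2 unit-step walk on $\mathbb{R}^2$, both in discrete time.

```python
import math
import random

# Type 1 (discrete time, discrete space): simple random walk on the integers.
def integer_walk(steps):
    x, path = 0, [0]
    for _ in range(steps):
        x += random.choice([-1, 1])   # step left or right with equal probability
        path.append(x)
    return path

# Type 2 (discrete time, continuous space): unit-step walk on R^2.
def unit_step_walk_2d(steps):
    x, y = 0.0, 0.0
    path = [(x, y)]
    for _ in range(steps):
        theta = random.uniform(0.0, 2.0 * math.pi)  # uniformly random direction
        x, y = x + math.cos(theta), y + math.sin(theta)
        path.append((x, y))
    return path

print(integer_walk(10))       # integer-valued states
print(unit_step_walk_2d(3))   # real-valued states
```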

The next part is not so clearly agreed upon in the literature. I will simply state the definitions I am used to seeing.

A Markov process is any stochastic process that satisfies the Markov property. It doesn't matter which of the 4 process types it is.

A Markov chain is a Markov process with a discrete state space (i.e. can be type 1 or 3).

A discrete-time Markov chain (or discrete Markov chain) is a Markov process in discrete time with a discrete state space (i.e. type 1, above).
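
As a sketch of type 1, here is a small discrete-time Markov chain simulated from a transition matrix; the state names and probabilities are made up for illustration:

```python
import random

states = ["sunny", "cloudy", "rainy"]   # hypothetical 3-state weather chain
P = [                                   # row-stochastic transition matrix
    [0.7, 0.2, 0.1],                    # from sunny
    [0.3, 0.4, 0.3],                    # from cloudy
    [0.2, 0.4, 0.4],                    # from rainy
]

def simulate(start, steps):
    i = states.index(start)
    path = [states[i]]
    for _ in range(steps):
        # the next state depends only on the current state i -- the Markov property
        i = random.choices(range(len(states)), weights=P[i])[0]
        path.append(states[i])
    return path

print(simulate("sunny", 10))
```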

A continuous-time Markov chain (or continuous Markov chain) is a Markov process with a discrete state space in continuous time (i.e. of type 3).
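
A minimal sketch of a type-3 process, assuming the standard construction (exponential holding times plus a jump chain); the rate matrix `Q` here is made up:

```python
import random

Q = [                      # hypothetical rate matrix: rows sum to 0,
    [-1.0,  0.6,  0.4],    # off-diagonal entries are jump rates
    [ 0.5, -1.2,  0.7],
    [ 0.3,  0.9, -1.2],
]

def simulate_ctmc(start, t_max):
    i, t = start, 0.0
    trajectory = [(t, i)]
    while True:
        rate = -Q[i][i]                 # total rate of leaving state i
        t += random.expovariate(rate)   # exponential holding time in state i
        if t >= t_max:
            break
        # jump to some j != i with probability Q[i][j] / rate
        others = [j for j in range(len(Q)) if j != i]
        i = random.choices(others, weights=[Q[i][j] for j in others])[0]
        trajectory.append((t, i))
    return trajectory

print(simulate_ctmc(0, t_max=10.0))
```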

A stationary process is a stochastic process whose joint probability distribution does not change when translated in time.

A time-homogeneous Markov chain is one whose transition probabilities do not change in time. So, the probability of going from one state $s_1$ to another state $s_2$, once you are at $s_1$, is always the same (i.e. it doesn't matter when you get there). (Strictly speaking, homogeneity is weaker than stationarity: a homogeneous chain is a stationary process only when it is also started from a stationary initial distribution.)
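
For a time-homogeneous chain the transition matrix $P$ is the same at every step, so a long-run (stationary) distribution $\pi$, when it exists, solves $\pi = \pi P$. A minimal sketch approximating it by power iteration, reusing the made-up matrix from above:

```python
P = [
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.4, 0.4],
]

def step(pi, P):
    # one step of the distribution's evolution: (pi P)_j = sum_i pi_i P[i][j]
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

pi = [1.0, 0.0, 0.0]      # start in state 0 with certainty
for _ in range(200):      # iterate until (approximately) converged
    pi = step(pi, P)
print(pi)                 # approximately the stationary distribution
```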

A discrete-time, time-homogeneous Markov chain is the classic case (and in fact what most people mean when they say Markov chain).
