[Math] How can a Markov chain be written as a measure-preserving dynamical system

dynamical-systems, markov-chains, probability, stochastic-processes

From http://masi.cscs.lsa.umich.edu/~crshalizi/notabene/ergodic-theory.html

irreducible Markov chains with finite state spaces are ergodic
processes, since they have a unique invariant distribution over the
states. (In the Markov chain case, each of the ergodic components
corresponds to an irreducible sub-space.)

By "ergodic processes", I take this to mean the same thing as "ergodic measure-preserving dynamical system", if I am correct.

As far as I know, an ergodic measure-preserving dynamical system is a mapping $\Phi: T \times S \to S$ satisfying a couple of properties, where $S$ is the state space and $T$ is the time space. Sometimes there is a single measure-preserving map on $S$ that generates the whole system by iteration.
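
For concreteness, the two properties usually required (stated here for discrete time $T = \mathbb{Z}$, writing $\Phi_t(\cdot) = \Phi(t, \cdot)$, with $\mu$ a probability measure on $S$) are:

```latex
% Measure preservation: each time-t map leaves mu invariant,
\mu\left(\Phi_t^{-1}(A)\right) = \mu(A)
  \quad \text{for all measurable } A \subseteq S,\ t \in T.

% Ergodicity: every invariant set is trivial,
\Phi_t^{-1}(A) = A \ \text{for all } t \in T
  \;\Longrightarrow\; \mu(A) \in \{0, 1\}.
```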

So I wonder how a Markov chain can be written as a mapping $\Phi: T \times S \to S$, and what the measure-preserving map generating the Markov chain is?

Thanks!

Best Answer

The article talks about a (stationary) Markov chain ${(X_n)}_{n \in \mathbb{Z}}$ in discrete time, with each $X_n$ taking its values in a finite set $E$. The canonical space of the Markov chain is the product set $E^{\mathbb{Z}}$, and the trajectory $X=(\ldots, X_{-1}, X_{0}, X_1, \ldots)$ of the Markov chain is a random variable taking its values in $E^{\mathbb{Z}}$.

Denoting by $\mu$ its distribution (which could be termed the law of the Markov process), $\mu$ is invariant under the classical shift operator $T \colon E^{\mathbb{Z}} \to E^{\mathbb{Z}}$. The Markov chain can then be considered as the dynamical system $(T,\mu)$. In fact, here we only use the fact that ${(X_n)}_{n \in \mathbb{Z}}$ is a stationary process. In the Markov case we can say in addition that the ergodicity of $T$ is equivalent to the irreducibility of ${(X_n)}_{n \in \mathbb{Z}}$.
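
A minimal numerical sketch of the consequence of this construction (my own illustration, not from the article): for an irreducible chain started from its stationary distribution $\pi$, the Birkhoff ergodic theorem for the shift says that time averages along a single trajectory converge to averages under $\pi$. The transition matrix `P` below is an arbitrary example on $E = \{0, 1\}$.

```python
import numpy as np

# Arbitrary irreducible two-state transition matrix (example values).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Stationary distribution pi: solve pi P = pi together with sum(pi) = 1.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
pi = pi / pi.sum()  # renormalize to guard against round-off

rng = np.random.default_rng(0)

# Simulate one long trajectory started from pi, so the process is stationary
# and the shift on trajectory space preserves its law.
n = 100_000
x = np.empty(n, dtype=int)
x[0] = rng.choice(2, p=pi)
for t in range(1, n):
    x[t] = rng.choice(2, p=P[x[t - 1]])

# Ergodicity of the shift: the time average of the indicator of state 0
# along this single trajectory should approximate the space average pi[0].
time_avg = np.mean(x == 0)
print(time_avg, pi[0])
```

For this `P`, the stationary distribution is $\pi = (4/7, 3/7)$, and the printed time average should be close to $4/7 \approx 0.571$.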
