[Math] How to determine if a Markov chain converges to equilibrium

convergence-divergence, markov-chains, matrices, probability, statistics

For example, suppose there is a transition matrix $$ \begin{pmatrix}
0 & 0 & 1 & 0 \\
0 & 0 & 0.5 & 0.5 \\
0.3 & 0.7 & 0 & 0 \\
1 & 0 & 0 & 0 \\
\end{pmatrix}$$
the stationary distribution satisfies $\pi_2 = 1.07692\,\pi_1$, $\pi_3 = 1.53846\,\pi_1$, $\pi_4 = 0.53846\,\pi_1$.
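These ratios can be checked numerically. A minimal sketch using numpy, solving $\pi P = \pi$ together with the normalization $\sum_i \pi_i = 1$:

```python
import numpy as np

# Transition matrix from the question (rows sum to 1).
P = np.array([
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.3, 0.7, 0.0, 0.0],
    [1.0, 0.0, 0.0, 0.0],
])

# A stationary distribution satisfies pi P = pi with sum(pi) = 1.
# Stack the normalization equation onto (P^T - I) pi = 0 and solve
# the resulting overdetermined system by least squares.
A = np.vstack([P.T - np.eye(4), np.ones((1, 4))])
b = np.array([0.0, 0.0, 0.0, 0.0, 1.0])
pi = np.linalg.lstsq(A, b, rcond=None)[0]

print(pi / pi[0])  # ratios relative to pi_1: [1, 1.07692, 1.53846, 0.53846]
```

The exact ratios are $\pi_2 = \frac{14}{13}\pi_1$, $\pi_3 = \frac{20}{13}\pi_1$, $\pi_4 = \frac{7}{13}\pi_1$, which round to the decimals above.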

So a stationary distribution exists.
Does that mean the chain converges to equilibrium?

I am learning this chapter on my own and I am quite confused … Thanks in advance!

Best Answer

The Markov chain converges to a unique equilibrium if there is only one recurrent class and it is aperiodic. In this case the directed graph corresponding to the Markov chain looks like this:

[Figure: directed transition graph of the chain, with states $1,2$ on one side and $3,4$ on the other]
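The period can also be found without drawing the graph: the period of a state is the gcd of all $n$ with a positive $n$-step return probability. A small numerical sketch, reusing the matrix $P$ from the question:

```python
import numpy as np
from math import gcd
from functools import reduce

# Transition matrix from the question.
P = np.array([
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.3, 0.7, 0.0, 0.0],
    [1.0, 0.0, 0.0, 0.0],
])

# The period of state 1 (index 0) is gcd{ n >= 1 : P^n[0,0] > 0 }.
# Scanning a finite range of n is enough for a small chain like this.
return_times = [n for n in range(1, 21)
                if np.linalg.matrix_power(P, n)[0, 0] > 0]
print(reduce(gcd, return_times))  # 2, so the chain is periodic
```

Since the chain is irreducible, all states share this period, so one state suffices.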

The Markov chain is irreducible (i.e. every state communicates with every other state), but it is periodic with period $2$: from states $1$ and $2$ you can only go to states $3$ and $4$, and vice versa. Therefore it does not converge to an equilibrium. In fact, the even powers of the matrix converge to $$ \frac{1}{27} \pmatrix{13 & 14 & 0 & 0\cr 13 & 14 & 0 & 0\cr 0 & 0 & 20 & 7\cr 0 & 0 & 20 & 7\cr}$$ and the odd powers converge to $$ \frac{1}{27} \pmatrix{0 & 0 & 20 & 7\cr 0 & 0 & 20 & 7\cr 13 & 14 & 0 & 0\cr 13 & 14 & 0 & 0\cr}$$
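This two-sided behavior can be observed directly by taking high powers of $P$; a quick sketch using numpy:

```python
import numpy as np

# Transition matrix from the question.
P = np.array([
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.3, 0.7, 0.0, 0.0],
    [1.0, 0.0, 0.0, 0.0],
])

# Because the period is 2, P^n oscillates between two limits rather
# than converging: compare a high even power with a high odd power.
even = np.linalg.matrix_power(P, 100)
odd = np.linalg.matrix_power(P, 101)

print(np.round(27 * even, 4))  # rows approach (13,14,0,0) and (0,0,20,7)
print(np.round(27 * odd, 4))   # the same rows, with the two blocks swapped
```

By power 100 the transient part has decayed far below printing precision, so the output matches the two limit matrices above to many decimal places.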