[Math] Show irreducibility of a Markov chain

markov-chains, probability-theory, stochastic-processes

I need to show that the Markov chain with the transition matrix below is irreducible.
$$P = \begin{bmatrix}
0.2 & 0.5 & 0.1 & 0.1 & 0.1 \\
0.2 & 0.5 & 0.3 & 0 & 0 \\
0.2 & 0 & 0.4 & 0.4 & 0 \\
0.2 & 0 & 0.2 & 0.4 & 0.2 \\
0.2 & 0 & 0 & 0.1 & 0.7
\end{bmatrix}$$

Is it enough for me to say that for $n = 1,2,3,4,5$ we have $\mathbb{P}(X_1 = n \mid X_0 = 1) > 0$ and $\mathbb{P}(X_1 = 1 \mid X_0 = n) > 0$, and hence the chain is irreducible? Is there any other (easier) way to show irreducibility?

Also, is solving $\pi P = \pi$ the fastest/most convenient way to calculate the stationary distribution?

Best Answer

You are right about irreducibility.
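Spelled out, one way to phrase your argument: since $p_{i1} > 0$ for every $i$ (the first column of $P$ is constant $0.2$) and $p_{1j} > 0$ for every $j$ (the first row of $P$ is strictly positive), every state reaches every other state through state $1$:

$$\mathbb{P}(X_2 = j \mid X_0 = i) \;\ge\; p_{i1}\,p_{1j} \;>\; 0 \qquad \text{for all } i,j,$$

so all states communicate and the chain is irreducible.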

You usually find the invariant measure by solving $\pi=\pi P$ together with the normalisation $\sum_i \pi_i = 1$ using linear algebra. The invariant probability $\pi$ will be unique, since your chain is irreducible.
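For concreteness, here is a minimal numerical sketch of that linear-algebra route (NumPy assumed), solving $\pi P = \pi$ together with $\sum_i \pi_i = 1$ by least squares:

```python
import numpy as np

P = np.array([
    [0.2, 0.5, 0.1, 0.1, 0.1],
    [0.2, 0.5, 0.3, 0.0, 0.0],
    [0.2, 0.0, 0.4, 0.4, 0.0],
    [0.2, 0.0, 0.2, 0.4, 0.2],
    [0.2, 0.0, 0.0, 0.1, 0.7],
])

# pi P = pi  <=>  (P^T - I) pi^T = 0; append the normalisation sum(pi) = 1
# to get an overdetermined but consistent system, then solve by least squares.
A = np.vstack([P.T - np.eye(5), np.ones(5)])
b = np.zeros(6)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # approximately [0.2, 0.2, 0.2, 0.2, 0.2]
```

Because the system is consistent, the least-squares solution is the exact stationary distribution (up to floating-point error).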

But your transition matrix is special, so there is a shortcut: the column sums of $P$ are all equal to one. Such a transition matrix is called doubly stochastic, and since the chain is irreducible, its unique invariant probability measure is the uniform one, i.e., $\pi=\left({1\over 5},{1\over 5},{1\over 5},{1\over 5},{1\over 5}\right).$
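To see why the uniform distribution is invariant whenever the columns sum to one, a one-line check:

$$(\pi P)_j = \sum_{i=1}^{5} \frac{1}{5}\, p_{ij} = \frac{1}{5}\sum_{i=1}^{5} p_{ij} = \frac{1}{5} = \pi_j \qquad \text{for every } j.$$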
