Solved – Determine the communication classes for this Markov Chain

markov-process

Say we have a Markov Chain with probability matrix

$$ P = \begin{pmatrix} 0.25 & 0.25 & 0.5 & 0 & 0 \\
0 & 0.66 & 0 & 0.33 & 0 \\
0 & 0.25 & 0.25 & 0.25 & 0.25 \\
0.5 & 0 & 0 & 0.25 & 0.25 \\
0 & 0 & 0 & 0.2 & 0.8 \end{pmatrix} $$

I'm confused. This may seem really basic, but I don't want to leave it to chance in the exam.

What would the communicating classes be?

I would have thought that there is only one, since all the states communicate and hence the chain is irreducible.

But could states 4 and 5 form a class on their own?

Best Answer

I was not familiar with the definition of communicating classes for Markov chains, but found agreement in the definitions given on Wikipedia and on this webpage from the University of Cambridge.

Assume that $\{X_n\}_{n\geq 0}$ is a time-homogeneous Markov chain. Both sources state that a set of states $C$ of a Markov chain is a communicating class if all states in $C$ communicate with one another. However, for two states $i$ and $j$ to communicate, it is only necessary that there exist $n>0$ and $n^{\prime}>0$ such that

$$ P(X_n=i|X_0=j)>0 $$

and

$$ P(X_{n^{\prime}}=j|X_0=i)>0 $$

It is not necessary that $n=n^{\prime} = 1$, as stated by @Varunicarus. As you mentioned, this Markov chain is indeed irreducible, and thus all of its states form a single communicating class, which is in fact the definition of irreducibility given in the Wikipedia entry.
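Since entry $(i,j)$ of $P^n$ is exactly $P(X_n = j \mid X_0 = i)$, these conditions can be checked numerically. Here is a minimal Python/numpy sketch (not part of the original answer) using the matrix from the question; it relies on the standard fact that a chain on $m$ states is irreducible iff every entry of $(I+P)^{m-1}$ is strictly positive, i.e. every state can reach every other state in at most $m-1$ steps.

```python
import numpy as np

P = np.array([
    [0.25, 0.25, 0.50, 0.00, 0.00],
    [0.00, 0.66, 0.00, 0.33, 0.00],
    [0.00, 0.25, 0.25, 0.25, 0.25],
    [0.50, 0.00, 0.00, 0.25, 0.25],
    [0.00, 0.00, 0.00, 0.20, 0.80],
])
m = P.shape[0]

# Entry (i, j) of P^n is P(X_n = j | X_0 = i).  Every entry of
# (I + P)^(m-1) is positive iff each state can reach every other
# state in at most m-1 steps, i.e. iff the chain is irreducible.
reach = np.linalg.matrix_power(np.eye(m) + P, m - 1)
print(np.all(reach > 0))  # True: all five states communicate
```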

For problems with small transition matrices like this one, it is often helpful to draw a directed graph of the Markov chain and check whether you can find a cycle that includes every state. If so, the chain is irreducible and all states form a single communicating class. For larger transition matrices, more theory and/or computer programming will be necessary; see the sketch below.
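As one possible computational route (a sketch under the assumption that scipy is available, not a prescription from the original answer): treat $P$ as the adjacency matrix of a directed graph with an edge $i \to j$ whenever $P_{ij} > 0$; the communicating classes are then exactly the strongly connected components of that graph.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

P = np.array([
    [0.25, 0.25, 0.50, 0.00, 0.00],
    [0.00, 0.66, 0.00, 0.33, 0.00],
    [0.00, 0.25, 0.25, 0.25, 0.25],
    [0.50, 0.00, 0.00, 0.25, 0.25],
    [0.00, 0.00, 0.00, 0.20, 0.80],
])

# Communicating classes are the strongly connected components of the
# directed graph with an edge i -> j whenever P[i, j] > 0.
n_classes, labels = connected_components(
    csr_matrix(P > 0), directed=True, connection="strong"
)
print(n_classes)  # 1 -> the whole state space is one communicating class
print(labels)     # [0 0 0 0 0]: the class label of each state
```

The same approach scales to much larger chains, since strongly connected components can be found in time linear in the number of nonzero transition probabilities.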
