[Math] A few questions about Markov chains

markov-chains, stochastic-processes

Let $\{X_n\}_{n \geq 0}$ be a Markov chain on the states $A$, $B$, $C$ with transition matrix $P$ given by
$$
\begin{array}{c|ccc}
&A &B &C \\
\hline
A &0.2 & 0.2 &0.6\\
B &0 & 0.25 &0.75\\
C &0.3 & 0.3 & 0.4
\end{array}.
$$

How do I find all the stationary distributions?

If the Markov chain starts in state $A$, what is the expected number of steps before it returns to $A$?

And last one 🙂

How many times, on average, does the Markov chain visit state $B$ between two visits to $A$?

I'm finding this topic quite tricky, so I really appreciate the help you guys are giving me 🙂 I'd love it if someone could explain all the steps to me. 🙂

Best Answer

To summarize a discussion in the comments: the OP's textbook is *Introduction to Probability* by Grinstead and Snell. The authors explain how to find the stationary distribution(s) of a Markov chain on finitely many states from its transition matrix. Section 11.3, Ergodic Markov Chains, gives the theory in Theorem 11.8 and explains the practice starting at Example 11.19, continuing with many other examples afterwards.
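As a hedged numerical sketch of the method the textbook describes: this chain is finite and irreducible (you can get from any state to any other), so it has a unique stationary distribution $\pi$ solving $\pi P = \pi$ with $\sum_i \pi_i = 1$. By Kac's formula the expected return time to $A$ is $1/\pi_A$, and by the renewal/cycle argument the mean number of visits to $B$ between two visits to $A$ is $\pi_B/\pi_A$. The snippet below (using NumPy, which goes beyond the textbook's hand computations) solves the linear system directly:

```python
import numpy as np

# Transition matrix from the question (rows A, B, C).
P = np.array([
    [0.2, 0.2,  0.6],   # from A
    [0.0, 0.25, 0.75],  # from B
    [0.3, 0.3,  0.4],   # from C
])

# Stationary distribution: solve pi P = pi together with sum(pi) = 1.
# Stack (P^T - I) with a row of ones and solve the (consistent)
# overdetermined system by least squares.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)             # [0.2, 0.2667, 0.5333], i.e. (3/15, 4/15, 8/15)
print(1 / pi[0])      # expected return time to A: 1/pi_A = 5
print(pi[1] / pi[0])  # mean visits to B between visits to A: 4/3
```

So, under these formulas, $\pi = \left(\tfrac{3}{15}, \tfrac{4}{15}, \tfrac{8}{15}\right)$, the expected return time to $A$ is $5$ steps, and the chain visits $B$ on average $4/3$ times between consecutive visits to $A$.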
