[Math] Markov Chain expected number of visits

canonical-transformation, markov-chains, markov-process, probability-theory, stochastic-processes

\begin{pmatrix}
0&0&.2&.8&0\\
0&0&0&.9&.1\\
.6&0&0&0&.4\\
.2&.8&0&0&0\\
0&.9&.1&0&0
\end{pmatrix}

Question: Suppose the Markov chain with the transition matrix above (states $A, B, C, D, E$) starts at state $C$. What is the expected number of visits to state $B$ before reaching state $A$?

My professor showed several ways to solve problems similar to this one, but I am stuck on this one.

I have tried putting the matrix into canonical form and using that to solve for the $Q$ matrix, but I am running into issues doing that.

Best Answer

What I would do is make the state $A$ absorbing and then calculate the fundamental matrix $N = (I-Q)^{-1}$, where $Q$ is the block of the transition matrix with state $A$ removed (assuming the states are indexed $A, B, C, D, E$):

$$ Q= \begin{bmatrix} 0&0&.9&.1 \\ 0&0&0&.4 \\ .8&0&0&0 \\ .9&.1&0&0 \end{bmatrix} $$

You get

$$N= \frac{5}{447} \begin{bmatrix} 480 & 5 & 432 & 50 \\ \color{red}{180} & 95 & 162 & 56 \\ 384 & 4 & 435 & 40 \\ 450 & 14 & 405 & 140 \end{bmatrix}$$

The entry $N_{ij}$ of this matrix is the expected number of visits to transient state $j$ when starting from transient state $i$ (with $i, j \in \{B, C, D, E\}$) before being absorbed at $A$. So you can read off the answer as the $(C, B)$ entry: $\frac{5\times180}{447} = \frac{300}{149}$.
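As a quick numerical sanity check, the fundamental matrix and the $(C, B)$ entry can be computed directly (a minimal sketch using NumPy, assuming the transient states are ordered $B, C, D, E$ as above):

```python
import numpy as np

# Transient block Q over states B, C, D, E
# (state A made absorbing, its row and column removed).
Q = np.array([
    [0.0, 0.0, 0.9, 0.1],  # B -> D (.9), E (.1)
    [0.0, 0.0, 0.0, 0.4],  # C -> E (.4); the .6 to A is dropped
    [0.8, 0.0, 0.0, 0.0],  # D -> B (.8); the .2 to A is dropped
    [0.9, 0.1, 0.0, 0.0],  # E -> B (.9), C (.1)
])

# Fundamental matrix N = (I - Q)^{-1}; N[i, j] is the expected
# number of visits to state j, starting from state i, before absorption.
N = np.linalg.inv(np.eye(4) - Q)

# Row C (index 1), column B (index 0): expected visits to B from C.
print(N[1, 0])  # 300/149, approximately 2.0134
```

Working in floating point is fine here since $I - Q$ is well-conditioned; the printed value matches $\frac{300}{149}$ to machine precision.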