[Math] Starting from state 0, the mean number of visits to state 2 before returning to state 0 is

markov-chains, probability

The transition probability matrix is
$$
\mathrm P= \begin{pmatrix}
0 & 2/3 & 1/3 \\
1/2 & 0 & 1/2 \\
1/2 & 1/2 & 0 \\
\end{pmatrix}
$$
Starting from state 0, the mean number of visits to state 2 before returning to state 0 is

  1. 5/9
  2. 6/9
  3. 7/9
  4. 8/9
  5. None of the above is correct.

If my path is $0 \to 1 \to 2 \to 0$, its probability is $\frac{2}{3} \cdot \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{6}$, but I don't know how to account for every possible path from state $0$ back to itself. Also, what is the meaning of "the mean number of visits"?

Best Answer

Let $m_i$ be the mean number of visits to state $2$ before returning to state $0$, starting from state $i$. We want to determine $m_{0}$. By first-step analysis, the $m_{i}$ satisfy

\begin{align} m_{0} &= \frac{2}{3} m_{1} + \frac{1}{3} m_{2}, \\ m_{1} &= \frac{1}{2} \cdot 0 + \frac{1}{2} m_{2}, \\ m_{2} &= 1 + \frac{1}{2} \cdot 0 + \frac{1}{2} m_{1}. \end{align}

The interpretation is as follows. Since this is a Markov chain, if I start in state $0$, then with probability $\frac{2}{3}$ I transition to state $1$ and can restart the count from there, and with probability $\frac{1}{3}$ I transition to state $2$ and can restart it from there. The $\frac{1}{2} \cdot 0$ terms indicate that with probability $\frac{1}{2}$ the chain has returned to state $0$, the excursion ends, and no further visits are collected. The $1$ term in the equation for $m_2$ counts the visit to state $2$ itself.
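This is not part of the original answer, but if you want a sanity check, here is a minimal Monte Carlo sketch in Python that simulates this excursion directly (the state labels, the helper name `visits_to_2_in_one_excursion`, and the trial count are my own choices):

```python
import random

# Transition matrix of the chain (rows indexed by state 0, 1, 2).
P = [
    [0,   2/3, 1/3],
    [1/2, 0,   1/2],
    [1/2, 1/2, 0],
]

def visits_to_2_in_one_excursion():
    """Start at state 0; count visits to state 2 until the chain
    first returns to state 0."""
    state = 0
    visits = 0
    while True:
        # Sample the next state according to row `state` of P.
        state = random.choices([0, 1, 2], weights=P[state])[0]
        if state == 2:
            visits += 1
        elif state == 0:  # back at the start: the excursion is over
            return visits

trials = 200_000
m0_hat = sum(visits_to_2_in_one_excursion() for _ in range(trials)) / trials
print(m0_hat)  # Monte Carlo estimate of m_0; compare with the exact value below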

You can probably solve this system of equations.
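For completeness, the system solves explicitly. Substituting $m_1 = \frac{1}{2} m_2$ into the equation for $m_2$ gives $m_2 = 1 + \frac{1}{4} m_2$, so $m_2 = \frac{4}{3}$ and $m_1 = \frac{2}{3}$. Then
$$
m_0 = \frac{2}{3} \cdot \frac{2}{3} + \frac{1}{3} \cdot \frac{4}{3} = \frac{4}{9} + \frac{4}{9} = \frac{8}{9},
$$
so option 4 is correct.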