[Math] Mean return time of a Markov chain

markov-chains · probability · stochastic-processes

A Markov chain has states $0,1,2$ with transition probabilities $$P=\begin{pmatrix} 0.8 & 0.1 & 0.1 \\ 0.3 & 0.5 & 0.2 \\ 0.2 & 0.4 & 0.4 \end{pmatrix}.$$

I am struggling to calculate the mean return time to state $1$, given that the chain starts in state $1$.

Define $T=\min\{n\geq 1 : X_n=1\}$ and $g_i=E(T\mid X_0=i)$. I seek $g_1$. By the law of total expectation and the Markov property, $$g_i=\sum_{j\in\{0,1,2\}}E(T\mid X_0=i,X_1=j)\,P(X_1=j\mid X_0=i)=\sum_{j\in\{0,1,2\}}E(T+1\mid X_0=j)\,p_{ij}=\sum_{j\in\{0,1,2\}}g_j\,p_{ij}+1.$$
So I get the system of equations
\begin{cases}
g_0 = 0.8g_0+0.1g_1+0.1g_2+1 \\
g_1 = 0.3g_0+0.5g_1+0.2g_2+1 \\
g_2 = 0.2g_0+0.4g_1+0.4g_2+1 \\
\end{cases}

which has no solution. What is wrong with my equations?
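
For what it's worth, here is a quick NumPy sketch (with the states numbered $0,1,2$) confirming that this system really is inconsistent: my equations amount to $(I-P)g=\mathbf{1}$, and $I-P$ is singular because every row of $P$ sums to $1$.

```python
import numpy as np

P = np.array([[0.8, 0.1, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.4, 0.4]])

# My system is g = P g + 1, i.e. (I - P) g = 1.
A = np.eye(3) - P
b = np.ones(3)

# Each row of P sums to 1, so each row of I - P sums to 0: A is singular,
# and appending b raises the rank, so the system has no solution.
print(np.linalg.matrix_rank(A))                        # 2
print(np.linalg.matrix_rank(np.column_stack((A, b))))  # 3
```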

Best Answer

For all states $j \ne 1$, if you enter state $j$, you have on average $g_j$ more steps before you get to state $1$. However, if you enter state $1$, you don't have on average $g_1$ more steps before you get to state $1$: if you enter state $1$, you're done!

In other words, $E(T \mid X_0 = i, X_1 = 1)$ should just simplify to $1$, unlike every other instance of $E(T \mid X_0 = i, X_1 = j)$, because if we know that $X_1 = 1$, we know that $T=1$.

So your system of equations should instead be:
\begin{cases}
g_0 = 0.8g_0+0.1\color{red}{\cdot 0}+0.1g_2+1 = 0.8g_0 + 0.1g_2 + 1 \\
g_1 = 0.3g_0+0.5\color{red}{\cdot 0}+0.2g_2+1 = 0.3g_0 + 0.2g_2 + 1 \\
g_2 = 0.2g_0+0.4\color{red}{\cdot 0}+0.4g_2+1 = 0.2g_0 + 0.4g_2 + 1
\end{cases}
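
As a sanity check, here is a short NumPy sketch that solves this corrected system (equivalently $(I-P_0)g=\mathbf{1}$, where $P_0$ is $P$ with the column for state $1$ zeroed out) and cross-checks the result against the standard identity $g_1 = 1/\pi_1$ for the stationary distribution $\pi$ of a positive recurrent chain:

```python
import numpy as np

P = np.array([[0.8, 0.1, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.4, 0.4]])

# Zero the column for state 1: once the chain hits state 1, no further
# steps are added, so the g_1 term drops out of every equation.
P0 = P.copy()
P0[:, 1] = 0.0

g = np.linalg.solve(np.eye(3) - P0, np.ones(3))
print(g)  # [7.  3.9 4. ]  ->  g_1 = 3.9

# Cross-check via the stationary distribution: the mean return time to
# state 1 is 1 / pi_1, where pi solves pi P = pi (eigenvalue 1 of P^T).
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()
print(1.0 / pi[1])  # 3.9
```

Both computations agree: $g_0 = 7$, $g_2 = 4$, and the mean return time you want is $g_1 = 3.9$.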
