[Math] Finding Hitting probability from Markov Chain

markov-chains, probability

I have a Markov chain with states {1,2,3,4,5} which has the following transition matrix:

$$P= \begin{bmatrix} 0.3 & 0 & 0.7 & 0 & 0\\ 0 & 1 & 0 & 0 & 0\\ 0.5 & 0 & 0.5 & 0 & 0\\ 0.2 & 0 & 0 & 0.5 & 0.3\\ 0 &1 & 0 & 0 & 0\\\end{bmatrix}$$

From here, I need to calculate the hitting time, $h_{42}$: the probability that, starting from state 4, the chain ever reaches state 2.

My answer was:

$h_{42} = p_{45} h_{52} + p_{44} h_{42} + p_{41} h_{12}$

$h_{42} = 0.3\, h_{52} + 0.5\, h_{42} + 0$

(the last term vanishes because $h_{12} = 0$: states 1 and 3 form a closed class from which state 2 is unreachable).

From here, I calculated $h_{52}$: since state 5 moves to state 2 with probability 1, $h_{52} = 1$.

Finally, I got:

$0.5\, h_{42} = 0.3$

$h_{42} = 0.3/0.5 = 0.6$, or $3/5$.

Could anyone please tell me whether I have found the solution correctly? (It was my first attempt at finding a hitting time.)

Similarly, if I want to find $h_{41}$, is it:

$h_{41} = 1 - h_{42} = 2/5$?

Best Answer

It seems that you found the probability of the event that the chain hits state $2$ starting from state $4$ in finitely many steps. However, it is not standard to call this probability a "hitting time" (it is typically called the "hitting probability"). Rather, the "hitting time" you are referring to is the random variable $H_2 = \min\{n \geq 0 : X_n = 2\}$ where $\{X_n\}_{n=0}^{\infty}$ is the Markov chain related to your transition matrix $P$; by intuitive convention, we define $H_2 = \infty$ when $\{n \geq 0 : X_n = 2\} = \varnothing$.

To relate $H_2$ and $h_{42}$, let $\Bbb{P}_4$ be the probability measure with $\Bbb{P}_4(X_0 = 4)=1$ (the probability measure when the chain starts in state $4$). Then you found $h_{42} = \Bbb{P}_{4}(H_2 < \infty).$
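As a numerical sanity check (a sketch not from the original post), the vector of hitting probabilities $(h_{i2})_i$ is the minimal non-negative solution of $h_{22} = 1$ and $h_{i2} = \sum_j p_{ij}\, h_{j2}$ for $i \neq 2$, which can be computed by the standard monotone fixed-point iteration. (A direct linear solve of the first-step system would fail here: the closed class $\{1,3\}$ makes the system singular, so the minimal solution must be selected.)

```python
# Sanity check of the hand computation: compute hitting probabilities
# h_{i,target} as the minimal non-negative solution of
#   h[target] = 1,   h[i] = sum_j P[i,j] * h[j]   (i != target),
# via the monotone iteration h <- P h, which converges upward to the
# minimal solution.
import numpy as np

# Transition matrix from the question (states 1..5 -> indices 0..4).
P = np.array([
    [0.3, 0.0, 0.7, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0, 0.0],
    [0.2, 0.0, 0.0, 0.5, 0.3],
    [0.0, 1.0, 0.0, 0.0, 0.0],
])

def hitting_probabilities(P, target, iters=200):
    """Minimal non-negative h with h[target] = 1 and h = P h elsewhere."""
    h = np.zeros(P.shape[0])
    h[target] = 1.0
    for _ in range(iters):
        h = P @ h
        h[target] = 1.0  # re-impose the boundary condition each step
    return h

h2 = hitting_probabilities(P, target=1)  # probabilities of ever hitting state 2
h1 = hitting_probabilities(P, target=0)  # probabilities of ever hitting state 1
print(h2[3])  # h_{42} -> 0.6
print(h1[3])  # h_{41} -> 0.4
```

The output agrees with the hand computation: $h_{42} = 3/5$ and $h_{41} = 2/5$, with $h_{12} = h_{32} = 0$ coming from the closed class $\{1,3\}$.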