The probability a Markov Chain never reaches a state

discrete time, markov chains, transition matrix

Given a Discrete Time Markov Chain and an initial distribution, how do you find the probability the chain will never reach a state?

For example, take the simple DTMC below. Given that it starts in state 0, what is the probability that it never reaches state 2?

$$\begin{bmatrix}1/2&1/2&0\\1/3&1/3&1/3\\0&1&0\end{bmatrix}$$

My attempt: I know how to find the probability of being in a particular state at time $n$, by multiplying the initial distribution by the transition matrix raised to the $n$th power. So I was thinking I would take 1 minus this probability for a sufficiently large $n$ (to approximate the probability the chain eventually reaches that state), but being elsewhere at time $n$ doesn't guarantee the chain has never visited the state.
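(A quick numerical sanity check of the matrix-power idea, assuming NumPy. If state 2 is made absorbing, then "being in state 2 at time $n$" becomes "having reached state 2 by time $n$", which sidesteps the objection above; the modified matrix is my own construction, not from the question.)

```python
import numpy as np

# The question's chain, with state 2's row replaced by absorption,
# so mass in state 2 at time n = probability of hitting 2 by time n.
P_abs = np.array([[1/2, 1/2, 0],
                  [1/3, 1/3, 1/3],
                  [0,   0,   1]])

start = np.array([1.0, 0.0, 0.0])            # chain starts in state 0
hit_by_n = start @ np.linalg.matrix_power(P_abs, 200)
never_hit = 1 - hit_by_n[2]                  # ≈ 0 for this chain
print(never_hit)
```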

I saw this related question: Markov Chain never reaches a state, but it didn't answer the second question.

Best Answer

For finite state chains, the only way this can happen with positive probability is if there exists a state $k$ such that $k$ is reachable from $i$ but $j$ is not reachable from $k$: if you hit $k$ before $j$, then you will never reach $j$. In your example no such $k$ exists (state 2 is reachable from every state), so this probability is just zero.
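The criterion can be checked mechanically with a reachability search over the positive-probability edges. A sketch assuming NumPy, with `reachable_from` a hypothetical helper name:

```python
import numpy as np

# The chain from the question.
P = np.array([[1/2, 1/2, 0],
              [1/3, 1/3, 1/3],
              [0,   1,   0]])

def reachable_from(P, start):
    """Set of states reachable from `start` along positive-probability edges."""
    seen, stack = {start}, [start]
    while stack:
        i = stack.pop()
        for k in np.nonzero(P[i] > 0)[0]:
            if int(k) not in seen:
                seen.add(int(k))
                stack.append(int(k))
    return seen

# States k reachable from 0 that cannot themselves reach 2.
bad = [k for k in reachable_from(P, 0) if 2 not in reachable_from(P, k)]
print(bad)  # [] — no such k, so the never-reach probability is 0
```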

Now let $u$ be a column vector such that $u_i$ is the probability of never reaching $j$ starting from $i$. You want one particular entry $u_{i_0}$, but it is easier to obtain it by computing all the $u_i$ together, or at least $u_i$ for every $i$ reachable from $i_0$. Let $A$ consist of all $k$ such that $j$ is not reachable from $k$. Finally, let $P$ be the transition probability matrix in the row-stochastic convention. Then you have

$$(Pu)_i=u_i \quad i \not \in A \cup \{ j \} \\ u_i=1 \quad i \in A \\ u_j=0.$$

I can give the derivation of this system, but it basically follows from applying the law of total probability to the first step of the chain.
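The system is straightforward to solve numerically. A sketch assuming NumPy; the helper name `never_hit_prob` and the second, 4-state example chain (with an absorbing state 3 from which state 2 is unreachable) are my own illustrations, not part of the answer:

```python
import numpy as np

def never_hit_prob(P, j):
    """u[i] = probability of never reaching state j, starting from i.

    Direct translation of the system above: u_i = (Pu)_i for i outside
    A ∪ {j}, u_i = 1 on A (states from which j is unreachable), u_j = 0.
    """
    n = P.shape[0]
    # Backward reachability: the set of states from which j can be reached.
    can_reach_j = {j}
    changed = True
    while changed:
        changed = False
        for i in range(n):
            if i not in can_reach_j and any(
                P[i, k] > 0 and k in can_reach_j for k in range(n)
            ):
                can_reach_j.add(i)
                changed = True
    A = set(range(n)) - can_reach_j

    M, b = np.eye(n), np.zeros(n)
    for i in range(n):
        if i in A:
            b[i] = 1.0            # u_i = 1 for i in A
        elif i != j:
            M[i, :] -= P[i, :]    # u_i - (Pu)_i = 0
        # i == j: row stays e_j with b[j] = 0, i.e. u_j = 0
    return np.linalg.solve(M, b)

# The chain from the question: A is empty, so u is identically zero.
P = np.array([[1/2, 1/2, 0],
              [1/3, 1/3, 1/3],
              [0,   1,   0]])
print(never_hit_prob(P, 2))       # all (numerically) zero

# A made-up 4-state variant: state 3 is absorbing and cannot reach
# state 2, so the never-reach probability is positive.
P4 = np.array([[1/2, 1/4, 0,   1/4],
               [1/3, 1/3, 1/3, 0],
               [0,   1,   0,   0],
               [0,   0,   0,   1]])
print(never_hit_prob(P4, 2))      # approximately [2/3, 1/3, 0, 1]
```

By hand, for `P4`: the system gives $u_1 = u_0/2$ and $u_0 = u_0/4 \cdot 2 \cdot \tfrac{1}{2} + \tfrac14$, i.e. $u_0 = 2/3$, matching the solver.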