[Math] Markov Chain never reaches a state

markov-chains stochastic-processes

I've gotten stuck on this problem. Here's my transition matrix:

$$P=\begin{bmatrix} 0.4 & 0.1 & 0.5 \\ 0.4 & 0 & 0.6 \\ 0.6 & 0.4 & 0\end{bmatrix}$$
with state space $S = \{1, 2, 3\}$.

I've actually been asked two questions; here is the first one:

When starting in state 1, find the probability that state 3 is visited before state 2.

I realise that this question is similar to the one asked here: Markov Chain Reach One State Before Another. However, as my transition matrix is not diagonalisable, I cannot use the method described in that answer. If anyone could provide a solution to this problem, or an alternative method, it would be massively appreciated.

The second problem is as follows:

When starting in state 1, find the probability that state 2 is never visited.

I don't really have any idea how to approach this problem, other than that I'd assume it will use a similar method to the first one. Again, I'd really appreciate any help with a method for solving this problem, or a solution.

Best Answer

Starting in state $1$, you may stay in that state for some number of steps, but with probability $1$ you will eventually leave. The first step that leaves state $1$ decides which of states $2$ and $3$ is visited first. The conditional probability of moving from state $1$ to state $3$ in one step, given that you do not stay in state $1$, is $0.5/(0.1+0.5) = 5/6$, so that is the probability that state $3$ is visited before state $2$.

As for the second question: the chain is irreducible on a finite state space, so from state $1$ it eventually visits state $2$ with probability $1$, and the probability that state $2$ is never visited is therefore $0$.
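
For completeness, both answers can also be recovered by first-step analysis, which needs no diagonalisation. Write $h_i$ for the probability of hitting state $3$ before state $2$ when starting from state $i$, and condition on the first step:

$$h_2 = 0,\qquad h_3 = 1,\qquad h_1 = 0.4\,h_1 + 0.1\,h_2 + 0.5\,h_3,$$

so $0.6\,h_1 = 0.5$ and $h_1 = 5/6$. The same conditioning applied to $k_i$, the probability of ever visiting state $2$ from state $i$, gives $k_1 = 0.4\,k_1 + 0.1 + 0.5\,k_3$ and $k_3 = 0.6\,k_1 + 0.4$, whose unique solution is $k_1 = k_3 = 1$, confirming that state $2$ is visited with probability $1$.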
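
If you want a numerical sanity check, here is a minimal Monte Carlo sketch in Python (the dictionary encoding of $P$ and the `step` helper are my own illustration, not part of the original post):

```python
import random

# Transition matrix P, stored as {state: [(next_state, probability), ...]}.
P = {1: [(1, 0.4), (2, 0.1), (3, 0.5)],
     2: [(1, 0.4), (3, 0.6)],
     3: [(1, 0.6), (2, 0.4)]}

def step(state):
    """Sample the next state from the transition row of `state`."""
    r, cum = random.random(), 0.0
    for nxt, p in P[state]:
        cum += p
        if r < cum:
            return nxt
    return P[state][-1][0]  # guard against floating-point rounding

trials = 100_000
hits_3_first = 0
for _ in range(trials):
    s = 1
    while s == 1:      # run until the chain first leaves state 1
        s = step(s)
    if s == 3:         # from state 1 the first departure lands in 2 or 3
        hits_3_first += 1

print(hits_3_first / trials)  # typically prints something near 5/6 ≈ 0.8333
```

On a typical run this agrees with the $5/6$ above to two or three decimal places.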