Markov Chains (probability of event not occurring)

Tags: markov-chains, matrices, probability, statistics, stochastic-processes

Given a transition matrix $$P_{3 \times 3} = \begin{pmatrix} 0.3 & 0.7 & 0\\ 0.4 & 0 & 0.6\\ 0 & 0.5 & 0.5 \end{pmatrix}$$

The rows and columns are indexed by the levels, starting with level $0$ at the top left and moving down to level $2$ at the bottom.

Question: If the person starts on level $0$, what is the probability that they will not reach level $2$ even once in the next four years?

I have tried multiple approaches, such as calculating the probability of reaching level $2$ in each year, multiplying those together, and subtracting the result from one, but none of them gives the right answer. Please help.

Best Answer

Here is the question I assume you are trying to ask:

A person starts on level zero among levels $0,1,2$, and can transition between levels every year. The probabilities of each transition are given in the transition matrix $$P_{3 \times 3} = \begin{pmatrix} 0.3& 0.7& 0\\ 0.4 &0 & 0.6\\ 0 &0.5& 0.5 \end{pmatrix}$$

Here $P_{ij}$ denotes the probability of moving from level $i-1$ to level $j-1$ in one year.
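For example, $P_{12} = 0.7$ is the probability of moving from level $0$ to level $1$ in a given year, and $P_{23} = 0.6$ is the probability of moving from level $1$ to level $2$.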

Question: If the person starts on level $0$, what is the probability that they will not reach level $2$ even once in the next four years?

Following an idea from the comments: make level $2$ an absorbing state, i.e. replace the bottom row of $P$ with $(0,0,1)$ to form the matrix $$ Q = \pmatrix{0.3 & 0.7 & 0\\ 0.4 & 0 & 0.6\\ 0 & 0 & 1}. $$ Because level $2$ can no longer be left, being at level $2$ after four steps is equivalent to having reached level $2$ at some point during the four years, so that probability is the $(1,3)$ entry of $Q^4$. A direct computation shows that this entry is $0.7014$. It follows that the desired probability is $1 - 0.7014 = 0.2986$.
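If you want to check the arithmetic, here is a minimal sketch in Python using NumPy (my choice of tool, not part of the original answer) that builds $Q$, raises it to the fourth power, and reads off the $(1,3)$ entry.

```python
import numpy as np

# Transition matrix P from the question.
P = np.array([
    [0.3, 0.7, 0.0],
    [0.4, 0.0, 0.6],
    [0.0, 0.5, 0.5],
])

# Make level 2 absorbing: once reached, it is never left.
Q = P.copy()
Q[2] = [0.0, 0.0, 1.0]

# The (1,3) entry of Q^4 (0-indexed: row 0, column 2) is the probability
# of having reached level 2 at some point within four years.
Q4 = np.linalg.matrix_power(Q, 4)
p_reach = Q4[0, 2]

print(f"P(reach level 2 within 4 years)      = {p_reach:.4f}")      # 0.7014
print(f"P(never reach level 2 in 4 years)    = {1 - p_reach:.4f}")  # 0.2986
```

The printed values agree with the $0.7014$ and $0.2986$ obtained by hand above.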
