Probability a given Markov chain is in two different states at two different times

markov-chains, markov-process, stochastic-processes

Let us say that our state space is $S = \{1, 2, 3, 4\}$.

Now let us say our transition matrix $P$ is given by:
$$\begin{bmatrix}
1/2 & 1/2 & 0 & 0 \\
1/3 & 0 & 1/3 & 1/3 \\
1/6 & 1/6 & 2/3 & 0 \\
1/2 & 1/4 & 1/4 & 1/2
\end{bmatrix}$$

Given that a Markov chain $X_n$ is in state 3 at time 0 (i.e. $X_0 = 3$), what is the probability that it is in state 1 at time 4 and in state 2 at time 5? To rephrase: given that $X_0 = 3$, what is the probability that $X_4 = 1$ and $X_5 = 2$ both occur?

Originally, the way I thought to go about answering this question was to take the row matrix $\pi_0$:

$$\begin{bmatrix}
0 & 0 & 1 & 0 \\
\end{bmatrix}$$

which gives the probability distribution of the Markov chain at time 0. I would then take the first entry of the row vector $\pi_0 P^4$ (the probability of the chain being in state 1 at time 4) and the second entry of the row vector $\pi_0 P^5$ (the probability of the chain being in state 2 at time 5), and multiply these two entries together to get the final answer. However, this would require the two events to be independent, which, given the definition of a Markov process, I am not sure they are. Is this approach correct? If not, how would you solve this problem?

Best Answer

The required probability is the probability of transitioning from state 3 to state 1 in 4 steps (between times 0 and 4), multiplied by the probability of transitioning from state 1 to state 2 in 1 step (between times 4 and 5). We can multiply these probabilities directly because of the Markov property: what happens before time 4 does not affect the probability of moving from state 1 to state 2 between times 4 and 5. In other words, the probability equals $$(P^4)_{31}\cdot P_{12}.$$ Note: here $(P^4)_{31}$ is the 4-step transition probability, i.e. the $(3,1)$ entry of the matrix power $P^4$, not $(P_{31})^4 = (1/6)^4$.
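As a sanity check (not part of the original answer), here is a short NumPy sketch that computes $(P^4)_{31}\cdot P_{12}$ with the matrix from the question, verifies it against a brute-force sum over all length-5 paths, and compares it with the naive "independent events" product proposed in the question. Note that the last row of $P$ as printed sums to $3/2$ rather than $1$, so the numerical value is only illustrative; the path-sum identity holds regardless.

```python
import numpy as np
from itertools import product

# Transition matrix as given in the question (as printed, the last
# row sums to 3/2, so the numbers are illustrative only).
P = np.array([
    [1/2, 1/2, 0,   0  ],
    [1/3, 0,   1/3, 1/3],
    [1/6, 1/6, 2/3, 0  ],
    [1/2, 1/4, 1/4, 1/2],
])

P4 = np.linalg.matrix_power(P, 4)

# Correct answer: 4-step transition 3 -> 1, then 1-step 1 -> 2.
# States are 1-indexed in the question, 0-indexed here.
answer = P4[2, 0] * P[0, 1]

# Sanity check: enumerate every length-5 path (X1, ..., X5) starting
# from X0 = 3 and sum the probabilities of those with X4 = 1, X5 = 2.
joint = 0.0
for path in product(range(4), repeat=5):
    if path[3] == 0 and path[4] == 1:  # X4 = state 1, X5 = state 2
        p, prev = 1.0, 2               # X0 = state 3
        for s in path:
            p *= P[prev, s]
            prev = s
        joint += p
assert np.isclose(joint, answer)

# The naive product from the question treats the events as independent
# and generally gives a different (wrong) number.
pi0 = np.array([0, 0, 1, 0])
naive = (pi0 @ P4)[0] * (pi0 @ np.linalg.matrix_power(P, 5))[1]
print(answer, naive)
```

The assertion confirms the Chapman-Kolmogorov factorization: summing over all intermediate states of the first 4 steps collapses to the single entry $(P^4)_{31}$, after which only the one-step probability $P_{12}$ matters.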