[Math] Calculating probability from Markov Chain

markov-chains, probability

I have a Markov chain with state space $\{1,2,3,4,5\}$ and the following transition matrix:

$$P= \begin{bmatrix} 0.3 & 0 & 0.7 & 0 & 0\\ 0 & 1 & 0 & 0 & 0\\ 0.5 & 0 & 0.5 & 0 & 0\\ 0.2 & 0 & 0 & 0.5 & 0.3\\ 0 &1 & 0 & 0 & 0\\\end{bmatrix}$$

From here, I need to calculate:

1) $P(X_6=1 | X_4=4,X_5=1, X_0=4)$

What I have tried so far: I believe this is the same as $P(X_6=1 \mid X_4=4, X_5=1)$, which gives $$0.2 \cdot 0.3 = 0.06.$$

Is this correct? Please help me if I am wrong here.

2) $P(X_2=3, X_1=3 | X_0=1)$

What I have tried is rewriting this as $P(X_1=3, X_2=3 \mid X_0=1)$, and then I get $$0.7 \cdot 0.5 = 0.35.$$

I think my answer to question 2 is correct, but I am not really sure about question 1.

I would appreciate any help here, please :)

Best Answer

The Markov property says that, given the past, the distribution of the next state depends only on the most recent state observed.
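Formally, for a homogeneous chain with transition matrix $P$ (row-stochastic, as in the question, so $P_{ij}$ is the probability of moving from state $i$ to state $j$):

$$P(X_{n+1}=j \mid X_n=i, X_{n-1}=i_{n-1}, \dots, X_0=i_0) = P(X_{n+1}=j \mid X_n=i) = P_{ij}.$$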

1) $P(X_6=1 \mid X_4=4, X_5=1, X_0=4) = P(X_6=1 \mid X_5=1)$, which is the $1 \to 1$ transition entry (in position $(1,1)$ of the matrix), namely $0.3$. The Markov property tells us that this conditional probability depends only on $X_5=1$. In particular, the extra factor $0.2$ (the $4 \to 1$ step) should not appear in your product, since that step is part of the conditioning, so $0.06$ is not correct.
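If you want a numerical sanity check, here is a minimal Python/NumPy sketch (the simulation routine and state labels are my own, not from the original post) that estimates the conditional probability by rejection sampling and compares it with the $(1,1)$ entry of $P$:

```python
import numpy as np

# Transition matrix from the question; row i is the distribution of the
# next state when the chain is currently in state i+1 (states are 1..5).
P = np.array([
    [0.3, 0.0, 0.7, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0, 0.0],
    [0.2, 0.0, 0.0, 0.5, 0.3],
    [0.0, 1.0, 0.0, 0.0, 0.0],
])

rng = np.random.default_rng(0)

def run_chain(start, steps):
    """Simulate `steps` transitions starting from state `start` (1-based)."""
    path = [start]
    for _ in range(steps):
        path.append(rng.choice(5, p=P[path[-1] - 1]) + 1)
    return path

# Estimate P(X6 = 1 | X0 = 4, X4 = 4, X5 = 1) by keeping only the runs
# that satisfy the conditioning event.
hits, kept = 0, 0
for _ in range(100_000):
    path = run_chain(4, 6)          # X0 = 4 by construction
    if path[4] == 4 and path[5] == 1:
        kept += 1
        hits += (path[6] == 1)

print(hits / kept)   # roughly 0.3
print(P[0, 0])       # the 1 -> 1 entry, exactly 0.3
```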

2) $P(X_2 = 3, X_1 = 3 \mid X_0 = 1) = P(X_2 = 3 \mid X_1 = 3, X_0 = 1)\, P(X_1 = 3 \mid X_0 = 1) = P(X_2 = 3 \mid X_1 = 3)\, P(X_1 = 3 \mid X_0 = 1)$, so this is the probability of transitioning $1 \to 3$ times the probability of transitioning $3 \to 3$. Whether these are the $(1,3)$ and $(3,3)$ entries or the $(3,1)$ and $(3,3)$ entries depends on whether your probability vectors multiply the transition matrix from the left or the right; since every row of your matrix sums to $1$, it is row-stochastic, so the relevant entries are $(1,3) = 0.7$ and $(3,3) = 0.5$, giving $0.7 \cdot 0.5 = 0.35$ as you computed. We only used the definition of conditional probability and the Markov property here.
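A short sketch of the same computation in Python (assuming the row-stochastic convention, i.e. entry $(i,j)$ is the probability of moving from $i$ to $j$); the contrast with the two-step matrix power is my own addition, included only to show that the joint path probability is not the same thing as $P(X_2 = 3 \mid X_0 = 1)$:

```python
import numpy as np

# Transition matrix from the question (rows sum to 1, so row i is the
# distribution of the next state given the current state i+1).
P = np.array([
    [0.3, 0.0, 0.7, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0, 0.0],
    [0.2, 0.0, 0.0, 0.5, 0.3],
    [0.0, 1.0, 0.0, 0.0, 0.0],
])

# P(X1 = 3, X2 = 3 | X0 = 1) = P(1 -> 3) * P(3 -> 3)
print(P[0, 2] * P[2, 2])   # 0.7 * 0.5 = 0.35

# For contrast: the two-step probability P(X2 = 3 | X0 = 1) also allows
# the path 1 -> 1 -> 3, and is the (1, 3) entry of P @ P.
print((P @ P)[0, 2])       # 0.3*0.7 + 0.7*0.5 = 0.56
```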