[Math] Markov Chain Conditional Probability

statistics, stochastic-processes

A Markov chain has the transition probability matrix as follows.
$$
\begin{array}{c|ccc}
\text{From} \backslash \text{To} & 0 & 1 & 2 \\
\hline
0 & 0.6 & 0.3 & 0.1 \\
1 & 0.3 & 0.3 & 0.4 \\
2 & 0.4 & 0.1 & 0.5 \\
\end{array}
$$
Assume that the initial value $X_0$ has the distribution $P[X_0 = 0] = 0.3$, $P[X_0 = 1] = 0.4$, and $P[X_0 = 2] = 0.3$.

Find $P[X_0 = 0, X_1 = 2, X_2 = 1]$

I am just starting to learn how this works. From my understanding, I am finding $P[0.6, 0.4, 0.1]$, right? I need some help understanding how to move around the matrix.

Best Answer

The transition probability matrix gives the probability that $X_n$ is in state $k$ given that at the previous time ($n-1$) you were in state $j$. By the Markov property, the probability you want factors into the initial probability times one transition probability per step: $$P(X_0=0,\,X_1=2,\,X_2=1)=P(X_0=0)\,P(X_1=2\mid X_0=0)\,P(X_2=1\mid X_1=2)=0.3\times 0.1\times 0.1=0.003$$

Note that the factor $0.3$ comes from the initial distribution; the two factors of $0.1$ come from the transition matrix.
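For a quick numerical check, here is a minimal Python sketch of this computation; the matrix `P` and the list `pi0` are just the numbers from the problem statement:

```python
# Transition matrix: P[i][j] = P(X_{n+1} = j | X_n = i)
P = [[0.6, 0.3, 0.1],
     [0.3, 0.3, 0.4],
     [0.4, 0.1, 0.5]]

# Initial distribution: pi0[i] = P(X_0 = i)
pi0 = [0.3, 0.4, 0.3]

# Chain rule for the path 0 -> 2 -> 1:
# P(X_0 = 0) * P(X_1 = 2 | X_0 = 0) * P(X_2 = 1 | X_1 = 2)
prob = pi0[0] * P[0][2] * P[2][1]
print(prob)  # 0.003 (up to floating-point rounding)
```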

The way to work with the transition matrix is: if you are in state $1$, for example, go to the row for state $1$ (in this matrix, the second row); if you then want to go to state $0$, go to the column for state $0$ (the first column). The entry at that row and column is the transition probability.
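This row-then-column lookup generalizes to a path of any length: start with the initial probability of the first state, then multiply by one matrix entry per step. Here is a small sketch, where the helper name `path_prob` is just illustrative:

```python
def path_prob(pi0, P, path):
    """Probability of observing the given sequence of states,
    starting from the initial distribution pi0."""
    prob = pi0[path[0]]               # probability of the starting state
    for i, j in zip(path, path[1:]):  # consecutive pairs of states
        prob *= P[i][j]               # row = current state, column = next state
    return prob

P = [[0.6, 0.3, 0.1],
     [0.3, 0.3, 0.4],
     [0.4, 0.1, 0.5]]
pi0 = [0.3, 0.4, 0.3]

print(path_prob(pi0, P, [0, 2, 1]))  # ~0.003, matching the hand computation
```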
