Markov Chain Transition Probability – P(X1=3|X2=1) Calculation

markov-process, self-study

A Markov chain $\{X_n, n \ge 0\}$ with states 1, 2, 3 has the transition probability matrix

$$P=\begin{bmatrix}0&0.4&0.6\\1&0&0\\0.3&0.3&0.4\end{bmatrix}$$

and initial distribution $A=(0.5,0,0.5)$. What is $$P(X_1=3\mid X_2=1)?$$

(I know a Markov chain property is that the future, given the present, is independent of the past. The question here looks like: given the future, what is the probability of the past? I am wondering whether

$$P(X_1=3|X_2=1)=P(X_1=3)=A_3=0.5$$

or
$$P(X_1=3|X_2=1)=P_{13}=0.6$$
or something else?)

Best Answer

Rather than trying to guess the answer, one can apply probability laws. Since

  1. the transition matrix $\mathsf P$ is made of the probabilities $\mathbb P(X_t=j|X_{t-1}=i)$ as $(i,j)$ entries,

  2. the probability $\mathbb P(X_{t-1}=j|X_{t}=i)$ can be written as $$\mathbb P(X_{t-1}=j|X_{t}=i)=\dfrac{\mathbb P(X_t=i|X_{t-1}=j)\mathbb P(X_{t-1}=j)}{\mathbb P(X_t=i)}$$ by Bayes' theorem,

  3. the probability $\mathbb P(X_t=i)$ can be written as $$\mathbb P(X_t=i)=\sum_{j=1}^3 \mathbb P(X_t=i|X_{t-1}=j)\mathbb P(X_{t-1}=j)$$ by the law of total probability,

the reverse probabilities $\mathbb P(X_{1}=j|X_{2}=i)$ can be derived from the transition matrix and the initial distribution given in the question.
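
For concreteness, here is a minimal numerical sketch of these three steps (in Python with NumPy; the variable names are my own, not part of the question), applied to the matrix $\mathsf P$ and the initial distribution $A$ given above.

```python
import numpy as np

# Transition matrix and initial distribution from the question
P = np.array([[0.0, 0.4, 0.6],
              [1.0, 0.0, 0.0],
              [0.3, 0.3, 0.4]])
A = np.array([0.5, 0.0, 0.5])    # distribution of X_0

# Law of total probability: push the distribution forward one step at a time
p_X1 = A @ P                     # distribution of X_1
p_X2 = p_X1 @ P                  # distribution of X_2

# Bayes' theorem:
# P(X_1 = j | X_2 = i) = P(X_2 = i | X_1 = j) * P(X_1 = j) / P(X_2 = i)
i, j = 1, 3                      # states are labelled 1, 2, 3 in the question
reverse_prob = P[j - 1, i - 1] * p_X1[j - 1] / p_X2[i - 1]
print(reverse_prob)              # P(X_1 = 3 | X_2 = 1)
```

Running this reproduces the value one obtains by carrying out the substitution by hand, and it makes it easy to check the reverse probability for any other pair of states.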