Solved problem: a three-state Markov chain conditioned on two states

markov-chains, markov-process, stochastic-processes

Given a Markov chain with three states, $J = \{1,2,3\}$, and a transition matrix
$$P = \begin{bmatrix} 1/2 & 1/4 & 1/4 \\ 1/3 & 0 & 2/3 \\ 1/2 & 1/2 & 0 \end{bmatrix}$$

Suppose we know that $$P(X_1 = 1) = P(X_1 = 2) = 1/4.$$
Find $$P(X_1 = 3, X_2 = 2, X_3 = 1).$$

For an ordinary joint-probability computation in a Markov chain, for example, this would factor as \begin{align*}P(X_1 = 3, X_2 = 2, X_3 = 1 ) & = P(X_1 = 3) \cdot P(X_2 = 2 \mid X_1 = 3) \cdot P(X_3 = 1 \mid X_2 = 2) \\ & = (?) \times (1/2) \times (1/3).\end{align*}

But here we have a three-state Markov chain conditioned on two states. Is there a general theorem for tackling this kind of problem? I would be glad to know about it.

Best Answer

Using the definition of conditional probability (conditioning on the event $\{X_1=3\}\cap \{ X_2 = 2\}= \{X_1=3,X_2=2\}$), we get $$P(X_1=3,X_2=2,X_3=1)=P(X_3=1\mid X_2=2,X_1=3)\cdot P(X_2=2,X_1=3).$$ By the Markov property, the state of $X_3$ depends only on the state of $X_2$, so the conditioning on $X_1$ can be dropped, and the right side becomes \begin{align*}P(X_1=3,X_2=2,X_3=1) &= P(X_3=1\mid X_2=2)\cdot P(X_2=2,X_1=3) \\ &= \frac13 \cdot P(X_2=2,X_1=3) \\ &= \frac13 \cdot P(X_2=2\mid X_1=3)\cdot P(X_1=3) && (\text{conditional probability again}) \\ &= \frac13 \cdot \frac12 \cdot \bigl(1-P(X_1=1)-P(X_1=2)\bigr) && (\because \{1,2,3\} \text{ are the only possible states}) \\ &= \frac16 \left(1-\frac14-\frac14\right)=\frac1{12}.\end{align*}
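As a sanity check, the chain-rule computation above can be carried out numerically. This is a minimal sketch in NumPy; the variable names (`P`, `pi`) are mine, not from the question, and states $1,2,3$ are coded as indices $0,1,2$:

```python
import numpy as np

# Transition matrix from the question (row i = distribution of next state given state i+1)
P = np.array([[1/2, 1/4, 1/4],
              [1/3, 0,   2/3],
              [1/2, 1/2, 0  ]])

# Initial distribution: P(X1=1) = P(X1=2) = 1/4, so P(X1=3) = 1/2
pi = np.array([1/4, 1/4, 1/2])

# Chain rule + Markov property:
# P(X1=3, X2=2, X3=1) = P(X1=3) * P(X2=2 | X1=3) * P(X3=1 | X2=2)
prob = pi[2] * P[2, 1] * P[1, 0]
print(prob)  # 1/2 * 1/2 * 1/3 = 1/12
```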

So the term you are looking for is $P(X_1=3)$, which can be determined from the probabilities of $X_1$ being $1$ or $2$ given in the question.

In the context of Markov chains, the probabilities $P(X_1=i)$ for $i \in \{1,2,3\}$ determine the initial distribution of the chain (the first state of the chain is usually denoted $X_0$ rather than $X_1$).
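The value $1/12$ can also be checked by Monte Carlo simulation: sample the first state from the initial distribution, then step twice through the transition matrix. A rough sketch (names and sample size are my own choices, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

P = np.array([[1/2, 1/4, 1/4],
              [1/3, 0,   2/3],
              [1/2, 1/2, 0  ]])
pi = np.array([1/4, 1/4, 1/2])  # P(X1=1), P(X1=2), P(X1=3)

n = 100_000
# States 1, 2, 3 are coded as indices 0, 1, 2.
x1 = rng.choice(3, size=n, p=pi)                    # draw X1 from the initial distribution
x2 = np.array([rng.choice(3, p=P[s]) for s in x1])  # one step of the chain
x3 = np.array([rng.choice(3, p=P[s]) for s in x2])  # second step

# Empirical frequency of the path (X1, X2, X3) = (3, 2, 1)
est = np.mean((x1 == 2) & (x2 == 1) & (x3 == 0))
print(est)  # should be close to 1/12 ≈ 0.0833
```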