Calculations with a Markov chain

markov-chains, markov-process, probability, probability-distributions, probability-theory

Let $(X_n)_{n \in \mathbb{N}_0}$ be a Markov chain with the following graph:

[graph of the Markov chain]

(a) Find its probability transition matrix.

(b) Find all stationary (invariant) distributions of the Markov chain.

(c) Let the initial distribution be $X_0 = (0.5, 0.5, 0)$. Calculate the distribution of $X_2$ and $P(X_0 \cdot X_2 = 1)$.

Here's my work:

(a) $P = \begin{bmatrix}0 & 0.5 & 0.5 \\0.5 & 0.5 & 0 \\ 0 & 0 & 1 \end{bmatrix}.$

(b) Since $3$ is the only absorbing state, the stationary distribution has to be $(0, 0, 1).$
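This can be double-checked numerically. A small NumPy sketch (not part of the original solution) verifies that $(0, 0, 1)$ is stationary and, via a rank computation, that the stationary distribution is unique:

```python
import numpy as np

# Transition matrix from part (a)
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.5, 0.0],
              [0.0, 0.0, 1.0]])

pi = np.array([0.0, 0.0, 1.0])
# pi is stationary iff pi P = pi
assert np.allclose(pi @ P, pi)

# Stationary distributions are the probability vectors in the null space
# of (P^T - I); a one-dimensional null space means the stationary
# distribution is unique.
assert np.linalg.matrix_rank(P.T - np.eye(3)) == 2
```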

(c) The distribution of $X_2 = X_0 \cdot P^2 = (\dfrac{1}{4}, \dfrac{3}{8}, \dfrac{3}{8}).$
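The two-step distribution can also be checked numerically; a quick NumPy sketch, using the matrix $P$ from part (a):

```python
import numpy as np

# Transition matrix from part (a)
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.5, 0.0],
              [0.0, 0.0, 1.0]])
pi0 = np.array([0.5, 0.5, 0.0])   # initial distribution

# Distribution at time 2: pi0 P^2
pi2 = pi0 @ np.linalg.matrix_power(P, 2)
assert np.allclose(pi2, [0.25, 0.375, 0.375])   # (1/4, 3/8, 3/8)
```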


I'm not sure how to solve for $P(X_0 \cdot X_2 = 1)$. Is the question looking for $P(X_0 = 1, X_2 = 1)?$

Best Answer

The statement of the question is a little muddled. If the Markov chain is $\big(X_n\big)_{n\in \mathbb{N}_0}$ with the transitions illustrated in the given graph and initial distribution $\pi_0=(0.5, 0.5, 0)$, then $X_0$ must be the state at time $0$: a random variable that takes the value $0$ with probability $0.5$ and the value $1$ with probability $0.5$. It cannot be its own distribution, so using the same symbol "$X_0$" to represent that distribution is a confusing misuse of notation, which makes it puzzling what $P\big(X_0\cdot X_2=1\big)$ is supposed to mean.

However, your interpretation of it is the only one I can see that makes any sense. Taking $\ X_0\ $ to be the state of the chain at time $0$ (rather than the distribution of that state), and $\ X_2\ $ to be its state at time $2$, the only way $\ X_0\cdot X_2\ $ can be $1$ is if $\ X_0\ $ and $\ X_2\ $ are both $1$. By the Markov property, $P\big(X_0\cdot X_2=1\big)=P\big(X_0=1,\,X_2=1\big)=P\big(X_0=1\big)\,P\big(X_2=1\mid X_0=1\big)$, where the conditional probability is the corresponding entry of $P^2$.
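Under that reading, the probability follows from $P^2$. A sketch, assuming the states are labelled $0, 1, 2$ (consistent with $X_0$ taking the values $0$ and $1$ above; if they were instead labelled $1, 2, 3$, a different entry of $P^2$ would apply):

```python
import numpy as np

# Transition matrix from part (a)
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.5, 0.0],
              [0.0, 0.0, 1.0]])

P2 = np.linalg.matrix_power(P, 2)
# Assumption: states labelled 0, 1, 2, so "state 1" is row/column index 1.
# P(X_0 = 1, X_2 = 1) = P(X_0 = 1) * P(X_2 = 1 | X_0 = 1)
p = 0.5 * P2[1, 1]
assert np.isclose(p, 0.25)
```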