Transition matrix and stationary distribution (with proof)

markov-chains · probability · stochastic-processes

Suppose that $\{X_n\}$, $n\ge 0$, is an irreducible Markov chain on $S$ with stationary distribution $\pi = (\pi(i))_{i\in S}$, and let $Y_n = (X_n, X_{n+1})$ be the Markov chain on $S' = \{(i, j) : P_{i,j} > 0\} \subset S\times S$.

a) Let $P'$ be the transition matrix for the Markov chain $\{Y_n\}$, $n\ge 0$. Compute the matrix $P'$ in the special case when $\{X_n\}$, $n\ge 0$, is the Markov chain on $S = \{0, 1\}$ with transition matrix
$$P = \begin{pmatrix}
1/4 & 3/4 \\
1/2 & 1/2 \\
\end{pmatrix}$$

b) Compute the stationary distribution $\pi'$ for the Markov chain $\{Y_n\}$, $n\ge 0$, in the special case of part a).

c) Generalize your answer from part b) by showing that it is always true that $\pi' = (\pi'(i, j))_{(i,j)\in S'}$ given by $\pi'(i, j) = \pi(i)P_{i,j}$ is a stationary distribution for $\{Y_n\}$, $n\ge 0$.

What I have done so far:
a) The transition matrix that I got for $Y_{n}$ is
$$\begin{pmatrix}
1/4 & 3/4 &0 &0\\
0&0 &1/2 &1/2\\
1/4&3/4 &0 &0\\
0&0 &1/2&1/2 \\
\end{pmatrix}$$

where the rows and columns are ordered $00, 01, 10, 11$.
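This construction can be checked numerically. Below is a short Python sketch (my own illustration, not part of the original question) that builds $P'$ from $P$ using the rule $P'_{(i,j),(k,l)} = P_{j,l}$ if $k = j$ and $0$ otherwise:

```python
# Sketch: build the transition matrix P' of the pair chain
# Y_n = (X_n, X_{n+1}) from the two-state matrix P above.
# From state (i, j) the chain can only move to (j, l), with
# probability P[j][l]; all other transitions have probability 0.

P = [[1/4, 3/4],
     [1/2, 1/2]]

# here every entry of P is positive, so S' = {00, 01, 10, 11}
states = [(i, j) for i in range(2) for j in range(2) if P[i][j] > 0]

P_pair = [[P[j][l] if k == j else 0.0
           for (k, l) in states]
          for (i, j) in states]

for row in P_pair:
    print(row)
```

The printed rows match the $4\times 4$ matrix above, in the state order $00, 01, 10, 11$.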

b) From the transition matrix I get four equations for the stationary distribution, which in the end give me $$\pi_{1} = \pi_{2} = \pi_{3} = \pi_{4},$$ and therefore my $\pi = \{1/10, 3/10, 3/10, 3/10\}$ (I am not sure whether this is correct; it feels weird to me).

c) I am having difficulty understanding what it is saying.

Best Answer

To understand what is required in part (c), it is first important to use the same notation as in the body of the question. Let $\pi'=(\pi'(0,0),\,\pi'(0,1),\, \pi'(1,0),\,\pi'(1,1))$ be a stationary distribution for $Y_n$.

(b) You solved $\pi'=P'\pi'$ (treating $\pi'$ as a column vector); this is not the system of equations that determines the stationary distribution. For any stochastic matrix, that system yields a vector with all coordinates equal.

If $\pi'$ is the initial distribution, then the distribution at the next step is $\pi'\cdot P'$, so the equality $\pi'=\pi' P'$ (with $\pi'$ as a row vector) defines the stationary distribution. Here it is $$ (\pi'(0,0),\,\pi'(0,1),\, \pi'(1,0),\,\pi'(1,1)) = \left(0.1,\,0.3,\,0.3,\,0.3\right). $$
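This fixed point can be confirmed numerically. Here is a small Python sketch (my own check, assuming the state order $00, 01, 10, 11$) that iterates the row-vector update $\mu \mapsto \mu P'$ from the uniform distribution; since the pair chain is irreducible and aperiodic, the iterates converge to $\pi'$:

```python
# Left power iteration: mu <- mu * P' (row vector times matrix),
# starting from the uniform distribution on the four pair states.

P_pair = [[0.25, 0.75, 0.0, 0.0],
          [0.0,  0.0,  0.5, 0.5],
          [0.25, 0.75, 0.0, 0.0],
          [0.0,  0.0,  0.5, 0.5]]

mu = [0.25] * 4
for _ in range(200):
    mu = [sum(mu[i] * P_pair[i][j] for i in range(4)) for j in range(4)]

print([round(x, 6) for x in mu])   # [0.1, 0.3, 0.3, 0.3]
```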

Returning to (c): you first need to find the stationary distribution $\pi=(\pi(0),\pi(1))$ of the original chain $X_n$ from $\pi=\pi P$, and then check that $\pi'(i,j)=\pi(i) P_{i,j}$, that is, $$ 0.1=\pi'(0,0) = \pi(0)\cdot P_{0,0}=\pi(0)\cdot\frac14, $$ $$ 0.3=\pi'(0,1) = \pi(0)\cdot P_{0,1}=\pi(0)\cdot \frac34, $$ $$ 0.3=\pi'(1,0) = \pi(1)\cdot P_{1,0}=\pi(1)\cdot \frac12 $$ and $$ 0.3=\pi'(1,1) = \pi(1)\cdot P_{1,1}=\pi(1)\cdot \frac12. $$
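The same arithmetic in a Python sketch (my own check; the value $\pi = (2/5, 3/5)$ is not stated explicitly above, but follows from $\pi = \pi P$ with $\pi(0)+\pi(1)=1$):

```python
# Verify pi = (2/5, 3/5) is stationary for P, then verify the product
# formula pi'(i, j) = pi(i) * P[i][j] reproduces (0.1, 0.3, 0.3, 0.3).

P = [[0.25, 0.75],
     [0.5,  0.5]]
pi = [0.4, 0.6]

# stationarity of pi for P: pi = pi * P
assert all(abs(sum(pi[i] * P[i][j] for i in range(2)) - pi[j]) < 1e-12
           for j in range(2))

# pair probabilities in the order 00, 01, 10, 11
pi_pair = [pi[i] * P[i][j] for i in range(2) for j in range(2)]
print([round(x, 6) for x in pi_pair])   # [0.1, 0.3, 0.3, 0.3]
```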

Then there are two ways to understand what generalization is needed in part (c).

Either you generalize to an arbitrary transition matrix $P=\pmatrix{a & 1-a\\ 1-b & b}$ on the state space $S=\{0,1\}$ and repeat all the steps from the beginning: write $P'$, find its stationary distribution $\pi'$, find the stationary distribution $\pi$ of $X_n$, and check that $\pi'(i,j)=\pi(i) P_{i,j}$ for all $i,j\in\{0,1\}$,

or (which seems more likely to me), suppose that $S$ is an arbitrary finite state space, $P$ is a transition matrix for $X_n$ that admits a stationary distribution $\pi$, and $Y_n=(X_n,X_{n+1})$ is a Markov chain on $S\times S$; then the distribution $\pi'$ given by $\pi'(i,j)=\pi(i) P_{i,j}$ for all $i,j\in S$ is stationary.

In the latter case there is no need to write out $P'$. A stationary distribution has the defining property that if the chain starts in it, it remains in it at every step. So the only thing you need to check is:

If $\mathbb P(X_0=i,X_1=j)=\pi'(i,j)=\mathbb P(X_1=i,X_2=j)$ for all $i,j$, then $\pi'(i,j)=\pi(i) P_{i,j}$.

And this is easily checked: $$ \pi'(i,j)=\mathbb P(X_0=i,X_1=j)= \mathbb P(X_0=i)\,\mathbb P(X_1=j\mid X_0=i) = \mathbb P(X_0=i)\, P_{i,j}, $$ $$ \pi'(i,j)=\mathbb P(X_1=i,X_2=j)= \mathbb P(X_1=i)\,\mathbb P(X_2=j\mid X_1=i)=\mathbb P(X_1=i)\, P_{i,j}, $$ and these probabilities coincide iff $\mathbb P(X_0=i)=\mathbb P(X_1=i)$ for all $i$, i.e. iff $X_n$ runs in its stationary distribution $\mathbb P(X_n=i)=\pi(i)$. And then $$ \pi'(i,j) = \mathbb P(X_0=i)\, P_{i,j} = \pi(i)P_{i,j}. $$
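The general identity can also be sanity-checked numerically beyond the two-state case. The Python sketch below (my own illustration; the random matrix and the power-iteration step are assumptions, not from the answer above) draws a random $3\times 3$ stochastic matrix with positive entries, computes $\pi$, sets $\pi'(i,j)=\pi(i)P_{i,j}$, and verifies that one step of $Y_n$ leaves $\pi'$ unchanged:

```python
import random

# Numerical sanity check of part (c) for a 3-state chain.
random.seed(0)
n = 3

# random stochastic matrix with strictly positive entries
P = []
for _ in range(n):
    row = [random.random() + 0.01 for _ in range(n)]
    s = sum(row)
    P.append([x / s for x in row])

# stationary distribution of X_n by left power iteration (converges since P > 0)
pi = [1 / n] * n
for _ in range(500):
    pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

# candidate stationary distribution for the pair chain Y_n = (X_n, X_{n+1})
pi_pair = {(i, j): pi[i] * P[i][j] for i in range(n) for j in range(n)}

# one step of Y_n: (i, j) -> (j, l) w.p. P[j][l], so the new mass at (k, l)
# is (sum_i pi'(i, k)) * P[k][l]
stepped = {(k, l): sum(pi_pair[(i, k)] for i in range(n)) * P[k][l]
           for k in range(n) for l in range(n)}

assert all(abs(stepped[s] - pi_pair[s]) < 1e-9 for s in pi_pair)
print("pi'(i, j) = pi(i) * P_{i,j} is stationary for Y_n")
```

The check mirrors the proof: the mass flowing into $(k,l)$ is $\bigl(\sum_i \pi(i)P_{i,k}\bigr) P_{k,l} = \pi(k)P_{k,l} = \pi'(k,l)$.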