The transition matrix and stationary distribution of this Markov chain

markov-chains, probability

Let $N$ be a Poisson process with intensity $\lambda=1$. Let $\big(X_n\big)_{n \in \mathbb N_0}$ be a Markov chain (independent of $N$) on the state space $\{1,2,3,4\}$ with transition matrix:

$$P=\begin{pmatrix} 0 & \frac{1}{2} & 0 & \frac{1}{2} \\ \frac{1}{2} & 0 &\frac{1}{2} & 0 \\ 0 & \frac{1}{2} & 0 & \frac{1}{2} \\ \frac{1}{2} & 0 &\frac{1}{2} & 0 \end{pmatrix}$$

and initial distribution $\nu$. Define the random variable $$\tilde{X}_t=X_{N_t}.$$ Determine $P(\tilde{X}_t=k \mid \tilde{X}_0=l)$ for $k,l \in \{1,2,3,4\}$. Show that the process $\big(\tilde{X}_n\big)_{n \in \mathbb N_0}$ is a $(\nu, \tilde{P})$-Markov chain with transition matrix $\tilde{P}$, and determine $\tilde{P}$ and all stationary distributions.
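For intuition about the construction $\tilde{X}_t=X_{N_t}$, here is a minimal Python sketch of how one could simulate it: draw the Poisson count $N_t$ and then run $N_t$ steps of the chain $X$. The function name `simulate_X_tilde` and the zero-based state labels are my own choices for illustration.

```python
import numpy as np

# Transition matrix of the underlying discrete-time chain X (states labelled 0..3 here)
P = np.array([[0.0, 0.5, 0.0, 0.5],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.5, 0.0, 0.5, 0.0]])

rng = np.random.default_rng(0)

def simulate_X_tilde(t, start, lam=1.0):
    """One realization of X~_t = X_{N_t}: draw N_t ~ Poisson(lam*t), then take N_t steps of X."""
    n_arrivals = rng.poisson(lam * t)        # number of Poisson arrivals in [0, t]
    state = start
    for _ in range(n_arrivals):
        state = rng.choice(4, p=P[state])    # one transition of the chain X
    return state

print(simulate_X_tilde(t=2.0, start=0))      # a sample of X~_2 given X~_0 = state 0 (i.e. "1")
```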

My questions:

  1. As far as I understand, the "jump" to the next state is determined by the Poisson process. If there are $k$ arrivals in the interval $[0,t]$, the Markov chain will have jumped to the state $\tilde{X}_t=X_k$. So if, say, $k=3$, then $\tilde{X}_t$ is the state $X_3$, correct? Does this mean that in this case $X_3=3$?

  2. How can I find the transition probabilities and determine the transition matrix? I found this resource from another question (see page 120), but I don't understand how they have arrived at the probabilities. (See also the sketch after this list.)

  3. In order to find the stationary distributions I have to solve the eigenvalue problem $$\pi \tilde{P}=\pi.$$ Do I only have to solve this problem for the eigenvalue $\lambda=1$, or do I have to find all the eigenvalues and corresponding eigenvectors?
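Regarding question 2: since $N$ and $X$ are independent, one standard way to arrive at the probabilities is to condition on the number of arrivals $N_t=n$, which gives
$$P(\tilde{X}_t=k \mid \tilde{X}_0=l)=\sum_{n\ge 0} e^{-\lambda t}\frac{(\lambda t)^n}{n!}\,\big(P^n\big)_{lk}=\Big(e^{\lambda t (P-I)}\Big)_{lk}.$$
The Python snippet below is only a numerical sanity check of this identity; the truncation level `n_max` and the chosen $t$ are arbitrary choices of mine.

```python
import numpy as np
from math import factorial
from scipy.linalg import expm

P = np.array([[0.0, 0.5, 0.0, 0.5],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.5, 0.0, 0.5, 0.0]])

t, lam, n_max = 2.0, 1.0, 60

# Series obtained by conditioning on N_t = n: sum_n e^{-lam t} (lam t)^n / n! * P^n
series = sum(np.exp(-lam * t) * (lam * t) ** n / factorial(n)
             * np.linalg.matrix_power(P, n)
             for n in range(n_max))

# Closed form: the matrix exponential e^{lam * t * (P - I)}
closed_form = expm(lam * t * (P - np.eye(4)))

print(np.allclose(series, closed_form))   # True: the two expressions agree
print(closed_form[0])                     # row l = 1: the probabilities P(X~_t = k | X~_0 = 1)
```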

Best Answer

Some clues: This Markov chain is periodic with period 2. The cyclically alternating subclasses are $C_1=\{1,3\}$ and $C_2=\{2,4\}$, so the chain is not ergodic. [If the chain starts (at step 1, say) with distribution $\nu = (1/2,\, 0,\, 1/2,\, 0)$, i.e. in class $C_1$, then it visits class $C_1$ only at odd-numbered steps.]
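If a numerical check of this clue helps: every single step of $P$ moves between $\{1,3\}$ and $\{2,4\}$, so $P^2$ only transitions within a class. A small sketch (my own illustration, not part of the original answer):

```python
import numpy as np

P = np.array([[0.0, 0.5, 0.0, 0.5],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.5, 0.0, 0.5, 0.0]])

print(np.linalg.matrix_power(P, 2))
# [[0.5 0.  0.5 0. ]
#  [0.  0.5 0.  0.5]
#  [0.5 0.  0.5 0. ]
#  [0.  0.5 0.  0.5]]
# Two steps keep the chain inside its class C1 = {1,3} or C2 = {2,4},
# while one step always switches classes: the period is 2.
```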

However, $P$ is doubly stochastic (its columns, as well as its rows, sum to unity), so the uniform distribution on the four states is a stationary (steady-state) distribution. You can easily verify this by plugging $\pi = (1/4,\, 1/4,\, 1/4,\, 1/4)$ into $\pi\mathbf{P}= \pi.$ Can you find others?
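If you want to check this numerically before working it out by hand, here is a short sketch that verifies $\pi P=\pi$ for the uniform distribution and shows how stationary distributions can be searched for as left eigenvectors of $P$ for the eigenvalue $1$ (which also speaks to question 3); the code is mine, not part of the answer.

```python
import numpy as np

P = np.array([[0.0, 0.5, 0.0, 0.5],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.5, 0.0, 0.5],
              [0.5, 0.0, 0.5, 0.0]])

# Verify that the uniform distribution is stationary: pi P = pi
pi = np.full(4, 0.25)
print(np.allclose(pi @ P, pi))            # True

# Stationary distributions are left eigenvectors of P for eigenvalue 1
# (equivalently, eigenvectors of P.T), normalized to sum to 1 and nonnegative.
eigvals, eigvecs = np.linalg.eig(P.T)
for val, vec in zip(eigvals, eigvecs.T):
    if np.isclose(val, 1.0):
        print(np.real(vec) / np.real(vec).sum())
```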

The terminology used to discuss Markov chains is not exactly standard from one text to another. I hope I have used terminology you can understand.
