Solved – the difference between “limiting” and “stationary” distributions

markov-process

I'm doing a question on Markov chains and the last two parts say this:

  • Does this Markov chain possess a limiting distribution? If your answer is "yes", find the limiting distribution. If your answer is "no", explain why.
  • Does this Markov chain possess a stationary distribution? If your answer is "yes", find the stationary distribution. If your answer is "no", explain why.

What is the difference? Earlier, I thought the limiting distribution was what you work out by diagonalization, $P^n = CA^nC^{-1}$, but that is the $n$-step transition matrix. They calculated the limiting distribution using $\Pi = \Pi P$, which I thought was the stationary distribution.

Which is which then?

Best Answer

From An Introduction to Stochastic Modeling by Pinsky and Karlin (2011):

A limiting distribution, when it exists, is always a stationary distribution, but the converse is not true. There may exist a stationary distribution but no limiting distribution. For example, there is no limiting distribution for the periodic Markov chain whose transition probability matrix is $$ \mathbf{P}=\left\|\begin{matrix}0 & 1\\1 & 0\end{matrix}\right\| $$ but $\pi=\left(\frac{1}{2},\frac{1}{2}\right)$ is a stationary distribution, since $$ \left(\frac{1}{2},\frac{1}{2}\right)\left\|\begin{matrix}0 & 1\\1 & 0\end{matrix}\right\|=\left(\frac{1}{2},\frac{1}{2}\right) $$ (p. 205).

In a prior section, they had already defined a "limiting probability distribution" $\pi$ by

$$\lim_{n\rightarrow\infty}P_{ij}^{(n)}=\pi_j~\mathrm{for}~j=0,1,\dots,N$$

and equivalently

$$\lim_{n\rightarrow\infty}\operatorname{Pr}\{X_n=j|X_0=i\}=\pi_j>0~\mathrm{for}~j=0,1,\dots,N$$ (p. 165).

The example above oscillates deterministically, and so fails to have a limit in the same way that the sequence $\{1,0,1,0,1,\dots\}$ fails to have a limit.
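A quick numerical check (a minimal sketch in NumPy, using the matrix and distribution straight from the example above) makes both halves of the claim visible:

```python
import numpy as np

# The periodic two-state chain from the example above.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Even powers of P are the identity, odd powers are P itself,
# so P^n oscillates and has no limit as n -> infinity.
print(np.linalg.matrix_power(P, 10))  # identity matrix
print(np.linalg.matrix_power(P, 11))  # P again

# Yet pi = (1/2, 1/2) is stationary: pi P = pi.
pi = np.array([0.5, 0.5])
print(pi @ P)  # [0.5 0.5]
```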


They state that a regular Markov chain (one for which some power of the transition matrix has all entries positive) always has a limiting distribution, and prove that it must be the unique nonnegative solution to

$$\pi_j=\sum_{k=0}^N\pi_kP_{kj},~~j=0,1,\dots,N,\\ \sum_{k=0}^N\pi_k=1$$ (p. 168)
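For a small regular chain, these equations can be solved directly as a linear system: stack $\mathbf{P}^{\mathsf{T}} - I$ with a row of ones to impose $\sum_k \pi_k = 1$. The $2\times 2$ transition matrix below is an illustrative example of my own, not one from the book:

```python
import numpy as np

# An illustrative regular chain (all entries of P are positive).
P = np.array([[0.5, 0.5],
              [0.2, 0.8]])

# Solve pi P = pi together with sum(pi) = 1 as one least-squares
# system: (P^T - I) pi = 0, stacked with ones @ pi = 1.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # [2/7, 5/7] ~ [0.2857, 0.7143]

# Because the chain is regular, this stationary distribution is
# also the limiting distribution: every row of P^n approaches pi.
print(np.linalg.matrix_power(P, 50))
```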

Then on the same page as the example, they write

Any set $(\pi_i)_{i=0}^{\infty}$ satisfying (4.27) is called a stationary probability distribution of the Markov chain. The term "stationary" derives from the property that a Markov chain started according to a stationary distribution will follow this distribution at all points of time. Formally, if $\operatorname{Pr}\{X_0=i\}=\pi_i$, then $\operatorname{Pr}\{X_n=i\}=\pi_i$ for all $n=1,2,\dots$.

where (4.27) is the set of equations

$$\pi_i \geq 0, \sum_{i=0}^{\infty} \pi_i=1,~\mathrm{and}~\pi_j = \sum_{i=0}^{\infty} \pi_iP_{ij}.$$

This is precisely the same stationarity condition as above, except now with an infinite number of states.
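The "stationary" property quoted above — a chain started according to $\pi$ follows $\pi$ at all points of time — can also be checked numerically, since the marginal distribution after $n$ steps is $\pi\mathbf{P}^n$. A minimal sketch using the periodic example:

```python
import numpy as np

# The periodic chain and its stationary distribution from the example.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
pi = np.array([0.5, 0.5])

# If Pr{X_0 = i} = pi_i, then Pr{X_n = i} = pi_i for every n,
# even though P^n itself never converges.
for n in range(1, 6):
    marginal = pi @ np.linalg.matrix_power(P, n)
    print(n, marginal)  # always [0.5 0.5]
```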

With this definition of stationarity, the statement on page 168 can be restated as:

  1. The limiting distribution of a regular Markov chain is a stationary distribution.
  2. A regular Markov chain has a unique stationary distribution, and it coincides with the limiting distribution.