[Math] Can a Markov chain have more than one but finitely many stationary distributions?

fixed-points, markov-chains, steady-state, stochastic-matrices, stochastic-processes

Here's my understanding of it: assume we have an $n\times n$ stochastic matrix $P$ that represents our Markov chain, and suppose $x$ and $y$ are stationary distributions for $P$. Then

$Px = x$

$Py = y$

and for a convex combination $ax + by$ (so $a, b \ge 0$ and $a + b = 1$, which guarantees $ax + by$ is again a probability distribution),

$P(ax + by) = aPx + bPy = ax + by,$

meaning that $ax + by$ is a stationary distribution. So $P$ has infinitely many stationary distributions as soon as it has at least two.

Does this mean a Markov chain has either exactly one or infinitely many stationary distributions?
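As a quick numerical sanity check of the argument above, here is a minimal sketch (assuming NumPy, and treating $P$ as column-stochastic so that stationarity reads $Px = x$; the block-diagonal matrix is an illustrative example of a reducible chain, not from the original question):

```python
import numpy as np

# Column-stochastic matrix (columns sum to 1), matching the P x = x
# convention above. Two non-communicating 2x2 blocks, so the chain is
# reducible and has more than one stationary distribution.
P = np.array([
    [0.9, 0.5, 0.0, 0.0],
    [0.1, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.7, 0.2],
    [0.0, 0.0, 0.3, 0.8],
])

def stationary(block):
    """Eigenvector for eigenvalue 1, normalized to sum to 1."""
    vals, vecs = np.linalg.eig(block)
    v = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return v / v.sum()

# One stationary distribution supported on each block.
x = np.concatenate([stationary(P[:2, :2]), [0.0, 0.0]])
y = np.concatenate([[0.0, 0.0], stationary(P[2:, 2:])])

# Every convex combination a*x + (1-a)*y is non-negative, sums to 1,
# and is mapped to itself by P, i.e. it is again stationary.
for a in (0.0, 0.25, 0.5, 1.0):
    z = a * x + (1 - a) * y
    assert np.allclose(P @ z, z)
print("convex combinations verified as stationary")
```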

Best Answer

Yes, the set of stationary distributions will always be convex, for the reason you give.

Some Markov chains (like simple random walk on the integers) have no stationary distribution at all. But every finite Markov chain has at least one stationary distribution, so for finite chains your "one or infinitely many" statement is correct.
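For contrast, here is a similar sketch (same NumPy and column-stochastic assumptions, with an illustrative matrix of my choosing) showing the unique case: for an irreducible finite chain, the eigenvalue $1$ has a one-dimensional eigenspace, so the stationary distribution is unique.

```python
import numpy as np

# An irreducible column-stochastic matrix: all entries positive, so
# every state reaches every other. The eigenspace for eigenvalue 1 is
# then one-dimensional and the stationary distribution is unique.
P = np.array([
    [0.2, 0.3, 0.5],
    [0.4, 0.3, 0.1],
    [0.4, 0.4, 0.4],
])

vals, vecs = np.linalg.eig(P)
ones = np.isclose(vals, 1.0)
print("multiplicity of eigenvalue 1:", ones.sum())  # 1 -> unique pi

pi = np.real(vecs[:, np.argmax(ones)])
pi = pi / pi.sum()  # normalize to a probability distribution
assert np.allclose(P @ pi, pi)
print("unique stationary distribution:", pi)
```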