[Math] The necessary and sufficient condition for a Markov chain to have a limiting distribution

markov-chains, probability, stochastic-processes

I learned that if a Markov chain is ergodic (irreducible, aperiodic, and positive recurrent), then a limiting distribution is guaranteed to exist (ref: http://www.columbia.edu/~ks20/stochastic-I/stochastic-I-MCII.pdf).

It seems that ergodicity is a sufficient but not necessary condition for a limiting distribution: for example, an absorbing Markov chain (which is not irreducible) can have a limiting distribution with all the mass on the absorbing state.
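A quick way to see this numerically (a minimal sketch of my own, not part of the original question): take a small chain with one absorbing state and look at high powers of its transition matrix.

```python
import numpy as np

# Transition matrix of a 3-state chain; state 2 is absorbing, so the
# chain is not irreducible.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.0, 0.0, 1.0]])

# High powers of P give the n-step transition probabilities.
Pn = np.linalg.matrix_power(P, 200)
print(Pn.round(6))
# Every row is approximately [0, 0, 1]: regardless of the starting
# state, the limiting distribution puts all its mass on the absorbing
# state.
```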

So what is the necessary and sufficient condition for a Markov chain to have a limiting distribution?

Best Answer

For a finite Markov chain, the nicest proof that I know of goes through under a weaker assumption: the chain is aperiodic, and for any two states there is a common third state that both can reach with positive probability after some number of steps.
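For a concrete finite chain, the reachability half of this hypothesis is easy to check mechanically. Here is a small sketch (the function name and setup are mine, not the answerer's); aperiodicity would have to be verified separately.

```python
import numpy as np

def every_pair_has_common_reachable_state(P):
    """Check: for all states i, j, is there a state k reachable (with
    positive probability, in some number of steps) from both i and j?"""
    n = len(P)
    # reach[i, k] = 1 iff k is reachable from i in zero or more steps.
    reach = ((P > 0) | np.eye(n, dtype=bool)).astype(int)
    for _ in range(n):  # repeated squaring reaches the full closure
        reach = ((reach @ reach) > 0).astype(int)
    return all((reach[i] & reach[j]).any()
               for i in range(n) for j in range(n))
```

On the absorbing chain above, this returns `True`: every state can reach the absorbing state.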

(The proof starts one copy of the Markov chain from a stationary distribution and another from an arbitrary state, coupling them so that once they are in the same state they stay in the same state. The probability that the two chains have gotten stuck together tends to $1$ over time, so the second chain converges to the stationary distribution.)
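A small simulation of the coupling (my own illustration, on an arbitrary made-up chain): one copy starts from the stationary distribution $\pi$, the other from state $0$; they move independently until they first meet, and the fraction of runs that have coalesced by time $T$ approaches $1$.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small irreducible, aperiodic chain with stationary distribution pi.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
pi = np.array([0.25, 0.50, 0.25])  # check: pi @ P == pi

def coupling_time(T=100):
    """Run X from pi and Y from state 0 until they meet (or time T)."""
    x = rng.choice(3, p=pi)  # X_0 ~ pi, hence X_t ~ pi for every t
    y = 0                    # arbitrary starting state
    for t in range(T):
        if x == y:
            return t         # coupled: from now on they move together
        x = rng.choice(3, p=P[x])
        y = rng.choice(3, p=P[y])
    return None              # not coupled within T steps

runs = [coupling_time() for _ in range(2000)]
print("fraction coupled by t=100:", np.mean([t is not None for t in runs]))
```

Once the chains meet, $Y_t = X_t \sim \pi$, so the distance from the law of $Y_t$ to $\pi$ is bounded by the probability that the chains have not yet met.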

It's easy to see that both conditions are also necessary (two states that cannot reach any common state would converge to different limits, and periodicity makes the distribution oscillate forever), so that answers the question for finite Markov chains.
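To see the necessity of aperiodicity concretely (a toy example of my own, not from the answer): the two-state chain that deterministically swaps states never settles down.

```python
import numpy as np

P = np.array([[0, 1],
              [1, 0]])  # period 2: the chain swaps states every step

for n in (1, 2, 3, 4):
    print(np.linalg.matrix_power(P, n))
# P^n oscillates between P and the identity, so the distribution at
# time n depends on the parity of n and no limiting distribution exists.
```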

For infinite Markov chains, the hypothesis has to be strengthened for the same proof to work: there must exist $N$ and $\epsilon > 0$ such that any two states can both reach some common third state with probability at least $\epsilon$ within $N$ steps. This comes for free in the finite case, but it is not a good hypothesis to take in the infinite case: it fails for many perfectly well-behaved Markov chains. (For example, on a birth–death chain on $\{0, 1, 2, \dots\}$, the only state that $0$ and $2N$ can both reach within $N$ steps is $N$, and the probability of getting there from $0$ decays geometrically in $N$, so no fixed $N, \epsilon$ can work.)