[Math] Finite State Markov Chain Stationary Distribution

markov chains, probability theory

How does one show that any finite-state, time-homogeneous Markov chain has at least one stationary distribution, in the sense that $\pi = \pi Q$, where $Q$ is the transition matrix and $\pi$ is the stationary distribution? My instinct is that it involves eigenvalues and the Perron–Frobenius theorem, but I'm having trouble completing the argument.

Best Answer

There are a number of ways to prove this. In a recent article "What is a stationary measure?", Alex Furman outlines a straightforward proof using only linear algebra. He notes that the existence of an invariant probability measure also follows from the Brouwer fixed point theorem.
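For concreteness, here is a minimal sketch of the standard Brouwer-based argument (this is a generic outline of that approach, not necessarily Furman's exact presentation):

```latex
\documentclass{article}
\usepackage{amsmath,amssymb,amsthm}
\begin{document}

% Standard existence argument via the Brouwer fixed point theorem
\begin{proof}[Sketch]
Let $Q$ be an $n \times n$ row-stochastic matrix and let
\[
  \Delta = \Bigl\{ \pi \in \mathbb{R}^n : \pi_i \ge 0,\ \sum_{i=1}^n \pi_i = 1 \Bigr\}
\]
be the probability simplex. The map $T(\pi) = \pi Q$ sends $\Delta$ into itself:
the entries of $\pi Q$ are nonnegative, and
\[
  \sum_j (\pi Q)_j = \sum_j \sum_i \pi_i Q_{ij}
                   = \sum_i \pi_i \sum_j Q_{ij}
                   = \sum_i \pi_i = 1,
\]
since each row of $Q$ sums to $1$. The set $\Delta$ is a nonempty, compact,
convex subset of $\mathbb{R}^n$, and $T$ is continuous (indeed linear), so the
Brouwer fixed point theorem yields a $\pi \in \Delta$ with $\pi Q = \pi$,
i.e.\ a stationary distribution.
\end{proof}

\end{document}
```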
