Convergence of Stationary Distributions in Markov Chains

Tags: markov-chains, pr.probability, reference-request, stochastic-processes

I am fairly new to the field of stochastic processes and Markov chains, so please excuse my ignorance.

My question is: If we have a sequence of Markov chains such that each one has a stationary distribution $\pi^{(n)}$ and the chains converge in some way to another Markov chain that has stationary distribution $\pi$, can we say that the $\pi^{(n)}$'s converge to $\pi$ (in some way)?

More precisely:
Let $G$ be a simple (i.e., no loops or multiple edges), finite, connected graph. Suppose we have a sequence of Markov chains on $G$, and let $\boldsymbol{P}_1, \boldsymbol{P}_2, \dots$ denote the corresponding transition matrices. Assume that each chain has a stationary distribution $\pi^{(n)}$ (this is guaranteed, for example, when the weight on every edge is positive, since $G$ is connected). Now suppose $\boldsymbol{P}_n\to\boldsymbol{P}$ in some sense (say, entry-wise almost sure convergence, or $\|\boldsymbol{P}_n-\boldsymbol{P}\|\to 0$ for some matrix norm), where $\boldsymbol{P}$ is a stochastic matrix with stationary distribution $\pi$. Can we then conclude that $\pi^{(n)}\to\pi$ in some sense (similar to the sense in which the matrices converge)?

My feeling is that such theorems should exist (perhaps under stronger assumptions). I tried to find results of this kind but was not successful. Can someone give a reference for such results?

Best Answer

Assume the Markov chains are on a finite state space, that $P_n \to P$ entry-wise, and that the limit matrix $P$ is irreducible, so its stationary distribution $\pi$ is unique. The probability vectors $\pi^{(n)}$ live in the simplex, which is compact, so every subsequence of $\pi^{(n)}$ has a further convergent subsequence. Let $\pi^{(n_k)} \to \mu$ be any convergent subsequence. Passing to the limit in $\pi^{(n_k)}P_{n_k}=\pi^{(n_k)}$, continuity of matrix multiplication gives $\mu P=\mu$, and $\mu$ is a probability vector, so by uniqueness $\mu=\pi$. Since every convergent subsequence has the same limit $\pi$ and the simplex is compact, we conclude that $\pi^{(n)} \to \pi$.
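A quick numerical sketch of this conclusion (the matrix $P$, the $O(1/n)$ perturbation scheme, and the least-squares solver are my own illustrative choices, not part of the argument above): we take an irreducible stochastic matrix $P$, perturb it to get matrices $P_n \to P$, and watch $\|\pi^{(n)} - \pi\|_1 \to 0$.

```python
import numpy as np

def stationary(P):
    """Stationary distribution of an irreducible stochastic matrix P:
    solve pi (P - I) = 0 together with the normalization sum(pi) = 1,
    as an overdetermined least-squares system."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Limit chain: a weighted random walk on a triangle (irreducible).
P = np.array([[0.0, 0.5, 0.5],
              [0.3, 0.0, 0.7],
              [0.6, 0.4, 0.0]])
pi = stationary(P)

# Perturbed chains P_n -> P: add O(1/n) noise, then renormalize rows
# so each P_n stays stochastic (and stays irreducible, since entries
# only increase).
rng = np.random.default_rng(0)
for n in [10, 100, 1000, 10000]:
    Pn = P + rng.uniform(0, 1.0 / n, size=P.shape)
    Pn /= Pn.sum(axis=1, keepdims=True)
    pin = stationary(Pn)
    print(n, np.linalg.norm(pin - pi, ord=1))
```

The printed $\ell^1$ distances shrink as $n$ grows, consistent with the subsequence argument above.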
