[Math] When a Markov chain converges to a steady state, what kind of convergence is it?

convergence-divergence, markov chains, matrices, probability theory

Let $A$ be a transition matrix. The steady-state distribution $x$ satisfies $Ax = x$. One can prove that, under certain conditions, $$\lim_{n\rightarrow\infty}A^n q = x,$$
where $q$ is any probability distribution over the states of the Markov chain.
What kind of convergence is this? Is this convergence almost surely, in probability, or in distribution?
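For concreteness, here is a minimal numerical sketch of the limit statement. The $2\times 2$ matrix and the starting distribution below are arbitrary choices for illustration (not from the question), and the code uses the column-stochastic convention implied by $Ax = x$ and $A^n q$, i.e. $A$ acts on column probability vectors.

```python
import numpy as np

# Hypothetical 2-state chain; A is column-stochastic, so A @ q maps a
# probability (column) vector to the next-step distribution, matching Ax = x.
A = np.array([[0.9, 0.2],
              [0.1, 0.8]])
q = np.array([1.0, 0.0])      # start deterministically in state 0

dist = q
for n in range(50):
    dist = A @ dist           # dist = A^n q after n iterations

x = np.array([2/3, 1/3])      # stationary distribution: solves A x = x, sums to 1
print(dist)                   # approximately [0.6667, 0.3333]
print(np.allclose(dist, x))   # True: A^n q -> x
```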

Best Answer

What is being said here is just convergence in distribution, and that fact is actually somewhat vacuous: the statement only specifies the sequence of distributions $A^n q$, not an actual sequence of random variables. A Markov chain does come with a corresponding sequence of random variables $X_0, X_1, \dots$; in particular, given an initial distribution and $\omega \in \Omega$, we obtain a sample path. But that is an entirely different matter from the usual notion of a "steady state" for a Markov chain.

In particular, a Markov chain will typically not converge almost surely; that would mean the sequence $X_n$ converges to a (randomly chosen) state. Since the state space is discrete, this would require the sequence to be eventually constant (for a fixed $\omega$). That certainly does not happen for, say, $A=\begin{bmatrix} 1/2 & 1/2 \\ 1/2 & 1/2 \end{bmatrix}$: no matter how far out you go, there will eventually be another transition, so there is no "last" transition in the sequence.
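To make the contrast concrete, here is a small sketch using the answer's $2\times 2$ matrix (the random seed and path length are arbitrary): the marginal distribution $A^n q$ is uniform after a single step, yet any simulated sample path keeps jumping between the two states and never becomes eventually constant, so there is no almost-sure convergence of $X_n$.

```python
import numpy as np

rng = np.random.default_rng(0)   # arbitrary seed for reproducibility

# The answer's example: from either state, jump to either state with prob. 1/2.
A = np.array([[0.5, 0.5],
              [0.5, 0.5]])

# Distributional picture: A^n q is uniform after one step, so convergence
# in distribution is immediate.
q = np.array([1.0, 0.0])
print(A @ q)                     # [0.5, 0.5]

# Pathwise picture: simulate one sample path X_0, X_1, ...
state = 0
path = [state]
for _ in range(30):
    # column `state` of A is the distribution of the next state
    state = rng.choice(2, p=A[:, state])
    path.append(int(state))
print(path)                      # switches back and forth indefinitely; no a.s. limit
```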