[Math] Stationary distribution in general Markov Chains

markov-chains, pr.probability, reference-request

This is just a reference request for a result which is very general, useful, and should be well known, but for which I've failed to find a good reference to cite.

The problem is to define the "most natural" stationary distribution of a finite Markov chain with a specified initial state. What I mean here by "stationary distribution" is just a solution of $\pi P=\pi$, where $P$ is the transition matrix of the Markov chain.
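For a concrete handle on this definition, here is a minimal numpy sketch (the function name is mine) that finds a solution of $\pi P=\pi$ as a left eigenvector of $P$ for eigenvalue $1$; note that for reducible chains this eigenvalue is degenerate and the solution is not unique, which is exactly the issue below.

```python
import numpy as np

# Minimal sketch: a stationary distribution is a left eigenvector of P
# for eigenvalue 1, normalised to sum to 1.  P is assumed row-stochastic.
def stationary(P):
    eigvals, eigvecs = np.linalg.eig(P.T)   # eigenvectors of P^T = left eigenvectors of P
    i = np.argmin(np.abs(eigvals - 1.0))    # pick the eigenvalue closest to 1
    pi = np.real(eigvecs[:, i])
    return pi / pi.sum()

P = np.array([[0.7, 0.3],
              [0.45, 0.55]])
print(stationary(P))                        # -> [0.6 0.4]
```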

The case that is widely studied and keeps popping up wherever I look is the irreducible aperiodic case, for which we have a unique stationary distribution.

But if you take a Markov chain without these properties, you can still easily compute the stationary distribution that gives the average fraction of time spent in each state in the long run. More precisely, we want the distribution $\pi$ such that for each state $X$, $\pi(X)=\lim_{n\to\infty} \frac{E_n(X)}{n}$, where $E_n(X)$ is the expected number of occurrences of $X$ in the first $n$ steps of the chain.
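To make the limit concrete, here is a small numerical sketch (naming is mine) that estimates $\pi$ by averaging the step distributions, which is just $E_n(X)/n$ written with vectors:

```python
import numpy as np

# Estimate pi(X) = lim E_n(X)/n by averaging the distributions of the
# first n steps, starting from a point mass at the given initial state.
# By linearity of expectation, E_n(X) is the sum over the first n steps
# of the probability of being in X at that step.
def cesaro_distribution(P, start, n=100_000):
    mu = np.zeros(P.shape[0])
    mu[start] = 1.0
    total = np.zeros_like(mu)
    for _ in range(n):
        total += mu          # accumulate the step distributions
        mu = mu @ P          # advance one step
    return total / n
```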

I call a Strongly Connected Component (SCC) "ergodic" if we cannot get out of it, and "transient" otherwise.
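In code, this classification is a condensation-graph computation; a possible sketch with networkx (assuming $P$ is given as a numpy array; the function name is mine):

```python
import networkx as nx
import numpy as np

# Classify the SCCs of the chain's transition graph: an SCC is "ergodic"
# iff it has no outgoing edge in the condensation DAG, i.e. no way out.
def classify_sccs(P):
    G = nx.DiGraph()
    G.add_nodes_from(range(P.shape[0]))
    G.add_edges_from(zip(*np.nonzero(P)))   # edge i -> j iff P[i, j] > 0
    C = nx.condensation(G)                  # DAG whose nodes are the SCCs
    ergodic, transient = [], []
    for c in C.nodes:
        members = sorted(C.nodes[c]["members"])
        (ergodic if C.out_degree(c) == 0 else transient).append(members)
    return ergodic, transient
```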

It suffices to compute the probability $\rho(C)$ of ending up in each ergodic component $C$. Recall that we know the initial state, which is why these probabilities are well defined. Then we are back to the irreducible case when computing the stationary distribution of each component. We then compose these to get the final distribution, using $\rho(C)$ to weight the local distribution of each $C$.
Periodicity (inside an ergodic SCC) is not a problem either: it means that the states can be partitioned into cyclic classes according to the period (like even states and odd states, for instance). Looking at the Markov chain induced on one class gets us back to the irreducible aperiodic case, and from there we easily deduce the distribution on all states.
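The composition step itself is mechanical; a sketch (all names are mine), using the numbers from the small example below:

```python
import numpy as np

# Assemble the global distribution: each ergodic component C carries
# total weight rho(C), split according to its local stationary
# distribution; transient states get weight 0.
def compose(n_states, components, rho):
    pi = np.zeros(n_states)
    for name, (states, local) in components.items():
        pi[states] = rho[name] * np.asarray(local)
    return pi

# Numbers from the small example below: rho({X1}) = 1/4, rho({X2,X3}) = 3/4.
pi = compose(4,
             {"A": ([1], [1.0]), "B": ([2, 3], [0.5, 0.5])},
             {"A": 0.25, "B": 0.75})
print(pi)   # -> [0.    0.25  0.375 0.375]
```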

A small example: start from $X_0$, which goes to $X_1$ with probability $\frac{1}{4}$ and to $X_2$ with probability $\frac{3}{4}$. Then $X_1$ is a sink (it goes to itself with probability 1), $X_2$ goes to $X_3$ with probability 1, and $X_3$ goes back to $X_2$ with probability $1$.
Then the "good" stationary distribution would be $\pi(X_0)=0$, $\pi(X_1)=\frac{1}{4}$, $\pi(X_2)=\pi(X_3)=\frac{3}{8}$.
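As a quick sanity check (my own script), the Cesàro averaging above reproduces these numbers; note that $\{X_2, X_3\}$ has period $2$, so $P^n$ itself does not converge, only the running average does:

```python
import numpy as np

P = np.array([[0, 0.25, 0.75, 0],
              [0, 1,    0,    0],
              [0, 0,    0,    1],
              [0, 0,    1,    0]], dtype=float)

mu = np.array([1.0, 0, 0, 0])    # start in X0
total, n = np.zeros(4), 200_000
for _ in range(n):
    total += mu
    mu = mu @ P
print(total / n)                 # -> approx [0.  0.25  0.375  0.375]
```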

Can someone point me to a reference describing this distribution?

Best Answer

I call a Strongly Connected Component (SCC) "ergodic" if we cannot get out of it, and "transient" otherwise.

It suffices to compute the probability $\rho(C)$ of ending up in each ergodic component $C$.

Although your chain is not technically an absorbing Markov chain (an ergodic component with more than one state contains no absorbing state), you can still use absorbing-Markov-chain concepts and notation to compute those probabilities.

Transform your process into an absorbing Markov chain by replacing every transition inside an ergodic component with a self-loop, so that every state of an ergodic component of the original process becomes an absorbing state of the transformed process.
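A sketch of this transformation (function name mine), given the list of states lying in ergodic components:

```python
import numpy as np

# Make every state of an ergodic component absorbing: replace its row
# by a self-loop taken with probability 1.
def make_absorbing(P, ergodic_states):
    Q = P.copy()
    for s in ergodic_states:
        Q[s, :] = 0.0
        Q[s, s] = 1.0
    return Q
```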

Now use the formulas from the Wikipedia article on absorbing Markov chains to compute the 'canonical form' and the 'fundamental matrix' of the transformed process, and use these to find its absorbing probabilities: in that notation the canonical form is $P=\left(\begin{smallmatrix}Q & R\\ 0 & I\end{smallmatrix}\right)$, the fundamental matrix is $N=(I-Q)^{-1}$, and the absorbing probabilities are the entries of $B=NR$.

The probability $\rho(C)$ is then the sum of the absorbing probabilities of the component's states in the transformed process.
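Putting the last two steps together, a sketch in the Wikipedia notation (variable names mine), run on the example chain from the question:

```python
import numpy as np

# Fundamental-matrix computation for an absorbing chain in canonical
# form [[Q, R], [0, I]]: N = (I - Q)^{-1}, and B = N R holds the
# absorbing probabilities
#   B[i, j] = P(absorbed in absorbing[j] | start at transient[i]).
def absorbing_probabilities(P, transient, absorbing):
    Q = P[np.ix_(transient, transient)]
    R = P[np.ix_(transient, absorbing)]
    N = np.linalg.inv(np.eye(len(transient)) - Q)   # fundamental matrix
    return N @ R

# Example chain with X1, X2, X3 made absorbing; the only transient
# state left is X0.
P_abs = np.array([[0, 0.25, 0.75, 0],
                  [0, 1,    0,    0],
                  [0, 0,    1,    0],
                  [0, 0,    0,    1]], dtype=float)
B = absorbing_probabilities(P_abs, transient=[0], absorbing=[1, 2, 3])
rho_C1 = B[0, 0]             # component {X1}:      0.25
rho_C2 = B[0, 1] + B[0, 2]   # component {X2, X3}:  0.75
```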
