[Math] Conditions for Stationary Distributions in Markov Chains

matrices, stochastic-processes

Durrett's book "Essentials of Stochastic Processes" states on page 55 that:

If the state space S is finite then there is at least one stationary
distribution.

  1. How can I find the stationary distribution, for example, for the 2×2 matrix $[[a,b],[1-a, 1-b]]$? I have tested this with WA, and chains of this form seem to converge to certain probabilities, as here. But if you look at the general case, here, I am quite confused about how to find the general formula. I just know that it exists, but I cannot see any general formula as $n \rightarrow \infty$.

  2. But look here: $[[0,1],[1,0]]$ does not seem to have a stationary distribution! So am I right to say that this chain is dependent on the initial conditions? If $M=[[0,1],[1,0]]$, the chain will not converge to a stationary distribution. But how can I find this out from the matrix itself (not through a series of observations)?

  3. And how can I know when a certain Markov chain depends on its initial conditions? For the matrix above, $\det = a-b$ and the eigenvalues are $\lambda_{1} = 1$ and $\lambda_{2} = a-b$.

  4. Now the Hypertextbook mentions that "behavior, which exhibits sensitive dependence on initial conditions, is said to be chaotic". Look, we have found a case with sensitivity to initial conditions. Is it chaotic?
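To make the experiment in question 1 concrete, here is a minimal Python sketch (numpy and the values $a=0.7$, $b=0.4$ are my choices for illustration, not from the thread), treating the matrix as column-stochastic as written:

```python
import numpy as np

# Column-stochastic matrix from question 1: columns (a, 1-a) and (b, 1-b).
a, b = 0.7, 0.4                    # illustrative values, not from the thread
M = np.array([[a, b],
              [1.0 - a, 1.0 - b]])

# Repeatedly applying M to an initial distribution mimics the WA experiment.
p = np.array([1.0, 0.0])           # start in state 0
for _ in range(100):
    p = M @ p
print(p)                           # approaches the stationary distribution

# The periodic matrix [[0,1],[1,0]] just swaps the two entries each step,
# so the iterates oscillate and never settle down.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
q = np.array([1.0, 0.0])
print(P @ q, P @ (P @ q))          # (0, 1), then (1, 0), forever
```

Running this shows the first chain settling on a fixed vector while the second one keeps flipping between the two states.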

Best Answer

I am not a specialist in invariant distributions, so I hope there will be other answers; it's just too much to write as a comment.

  1. As Sasha wrote, an invariant measure exists: solving $M\pi = \pi$ for the column-stochastic matrix gives $\pi = \left(\frac{b}{1-a+b}, \frac{1-a}{1-a+b}\right)$, which is well defined for all $0\leq a,b\leq1$ except the degenerate case $a=1, b=0$ (the identity matrix, where every distribution is stationary). For $a=0, b=1$ this gives $\pi = [0.5, 0.5]$, and you can easily check that it works.

  2. The chain with $M=[[0,1],[1,0]]$ will not converge to the stationary distribution, since it is periodic with period $2$. Although the chain does not converge, the stationary distribution $\pi = [0.5, 0.5]$ still exists: convergence is a sufficient, but not a necessary, condition for existence. You may want to take a look at the notion of ergodicity here.

  3. The chain will depend on the initial condition in, say, the case $a=1, b=0$: then you have two absorbing states, and wherever you start, you stay there forever.

  4. Finally, chaos does not have a single strict definition that would settle this. To be precise, that statement from the Hypertextbook is not a definition of chaos. You may say that the chain described in 3. exhibits dependence on the initial data, and in that sense you may consider non-ergodic chains "chaotic", since they exhibit dependence on the initial distribution.
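Points 2 and 3 can be checked numerically. A minimal sketch (numpy is my choice; the thread contains no code) that extracts a stationary distribution as the eigenvector for eigenvalue $1$:

```python
import numpy as np

def stationary(M):
    """Stationary distribution of a column-stochastic matrix M:
    the eigenvector for eigenvalue 1, normalised to sum to 1."""
    w, v = np.linalg.eig(M)
    k = np.argmin(np.abs(w - 1.0))  # index of the eigenvalue closest to 1
    pi = np.real(v[:, k])
    return pi / pi.sum()

# Point 2: the periodic chain has a stationary distribution even though
# it never converges to it.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(stationary(P))                # [0.5 0.5]

# Point 3: a = 1, b = 0 gives the identity matrix -- two absorbing states.
# The eigenvalue 1 is repeated, so the stationary distribution is not
# unique, and the chain stays wherever it starts.
I2 = np.eye(2)
print(np.linalg.eigvals(I2))        # [1. 1.]
```

Reading the stationary distribution off the eigendecomposition is exactly the matrix-based check asked for in question 2: it requires no simulation of the chain at all.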