Some Markov chains have more than one stationary distribution

linear algebra, markov chains, statistics

Given a Markov chain transition matrix $P$, I want to find its stationary distribution. According to the definition, if a probability (row) vector $\pi$ satisfies the equation $\pi = \pi P$, then $\pi$ is called a stationary distribution.
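For instance, with the (made-up) two-state transition matrix below, $\pi=\left(\tfrac{5}{6},\tfrac{1}{6}\right)$ satisfies this equation:

$$P=\begin{pmatrix}0.9 & 0.1\\ 0.5 & 0.5\end{pmatrix},\qquad \pi P=\left(\tfrac{5}{6}\cdot 0.9+\tfrac{1}{6}\cdot 0.5,\ \tfrac{5}{6}\cdot 0.1+\tfrac{1}{6}\cdot 0.5\right)=\left(\tfrac{5}{6},\tfrac{1}{6}\right)=\pi.$$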

The above definition is what my professor gave in class. When I learned linear algebra, I was told that a stationary distribution is a left eigenvector of the matrix $P$ (whose rows sum to 1) associated with the eigenvalue 1, rescaled so that its entries sum to 1. I verified that this is correct in R.
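A verification of this kind might look like the following R sketch; the two-state transition matrix here is made up purely for illustration:

```r
# Sketch of the eigenvector check on a made-up 2-state transition matrix.
P <- matrix(c(0.9, 0.1,
              0.5, 0.5), nrow = 2, byrow = TRUE)

# pi = pi P means pi is a *left* eigenvector of P with eigenvalue 1,
# i.e. an ordinary (right) eigenvector of t(P).
e      <- eigen(t(P))
v      <- Re(e$vectors[, which.min(abs(e$values - 1))])
pi_hat <- v / sum(v)   # rescale so the entries sum to 1

pi_hat          # 0.8333 0.1667, i.e. (5/6, 1/6)
pi_hat %*% P    # the same vector back, so pi = pi P holds
```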

However, the eigenvector method runs into a problem when I face the $3 \times 3$ identity matrix. This is an example of a chain with more than one stationary distribution. Using the eigenvector method in R, I find that $(1,0,0), (0,1,0), (0,0,1)$ are all eigenvectors corresponding to eigenvalue 1. In fact, though, any nonnegative row vector whose entries sum to 1 is a stationary distribution, yet the eigenvector method doesn't show this. Why?
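For the identity matrix, a computation of this kind (assuming base R's `eigen` is what the eigenvector method refers to) returns exactly the three standard basis vectors, which matches the observation above:

```r
# The same eigenvector approach applied to the 3x3 identity matrix.
I3 <- diag(3)
eigen(t(I3))$values    # 1 1 1   (eigenvalue 1 with multiplicity 3)
eigen(t(I3))$vectors   # columns (1,0,0), (0,1,0), (0,0,1): only *a basis*
```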

Best Answer

"$(1,0,0),(0,1,0),(0,0,1)$ are all eigenvectors corresponding to eigenvalue 1"

What you have found is not all of the eigenvectors, but a basis for the eigenspace, the vector space spanned by the eigenvectors.

If you want all of the eigenvectors, you need the whole space spanned by these:

$$\{t(1,0,0)+u(0,1,0)+v(0,0,1):t,u,v\in\mathbf{R}\}$$

that is,

$$\{(t,u,v):t,u,v\in\mathbf{R}\}$$

that is,

$$\mathbf{R}^3$$

Notice that for any $\mathbf{x}\in\mathbf{R}^3$ you have $\mathbf{x}I=1\,\mathbf{x}$, so all of these are indeed (left) eigenvectors with eigenvalue 1.

Since you only want distributions (nonnegative entries summing to 1), you actually have

$$\{(t,u,v):t,u,v\ge 0,\ t+u+v=1\}$$

That's infinitely many stationary distributions. Indeed, if I have probability $t$ of being in state 1 and probability $u$ of being in state 2, then after one step I still (since I never change state) have probability $t$ of being in state 1 and probability $u$ of being in state 2. So all of these are stationary distributions: applying a step of the Markov process doesn't change the probability distribution of the state I might be in.
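To see this numerically, here is one arbitrary probability vector chosen for illustration:

```r
# Any probability vector is unchanged by the identity matrix,
# hence is a stationary distribution of this chain.
p <- c(0.2, 0.5, 0.3)    # arbitrary t, u, v with t + u + v = 1
p %*% diag(3)            # returns 0.2 0.5 0.3, i.e. p itself
```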