The steady state of a stochastic matrix if it has two linearly independent eigenvectors corresponding to the eigenvalue $1$

linear-algebra, markov-chains, matrices, stochastic-matrices

A stochastic matrix $A$ is a matrix with the following two properties:

  1. All entries of $A$ are $\geq 0.$
  2. All columns of $A$ add up to $1$.
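As a quick sanity check, the two properties can be verified numerically (a hypothetical 3×3 matrix, using numpy):

```python
import numpy as np

# A hypothetical 3x3 column-stochastic matrix: entries >= 0, columns sum to 1.
A = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.3],
              [0.2, 0.2, 0.4]])

def is_column_stochastic(M):
    """Check the two defining properties: nonnegative entries, unit column sums."""
    return bool(np.all(M >= 0) and np.allclose(M.sum(axis=0), 1.0))

print(is_column_stochastic(A))
```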

It is known that for a stochastic matrix, $\lambda = 1$ is an eigenvalue and all other eigenvalues satisfy $|\lambda| \leq 1$.
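This is easy to observe numerically on the same kind of hypothetical example:

```python
import numpy as np

# The same hypothetical column-stochastic matrix as above.
A = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.3],
              [0.2, 0.2, 0.4]])

eigvals = np.linalg.eigvals(A)
# lambda = 1 is always an eigenvalue, and every eigenvalue has modulus <= 1.
assert np.isclose(eigvals, 1.0).any()
assert np.all(np.abs(eigvals) <= 1 + 1e-12)
```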

The source I'm studying from says that the steady state of $A$ is the eigenvector corresponding to $\lambda = 1$. This comes from the solution to $u_k = A^k u_0$ being $u_k = \sum_i c_i\lambda_i^k x_i$, where each $x_i$ is an eigenvector corresponding to $\lambda_i$ and $c_i \in \mathbb{R}$; as $k \rightarrow \infty$, $u_k \rightarrow c_1 x_1$ (assuming $\lambda_1 = 1$ and $|\lambda_i| < 1$ for all $i > 1$).
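The convergence of $u_k = A^k u_0$ to a $\lambda = 1$ eigenvector can be sketched by simply iterating (hypothetical matrix and starting vector):

```python
import numpy as np

A = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.3],
              [0.2, 0.2, 0.4]])
u0 = np.array([1.0, 0.0, 0.0])   # an arbitrary starting distribution

u = u0
for _ in range(200):
    u = A @ u                    # u_{k+1} = A u_k, so u = A^k u0

# The limit is a steady state: an eigenvector for lambda = 1.
assert np.allclose(A @ u, u)
# Multiplication by a column-stochastic matrix preserves the total mass.
assert np.isclose(u.sum(), 1.0)
```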

My question is: if $\lambda_1 = 1$ has multiple linearly independent eigenvectors, how does the solution to $u_k = A^k u_0$ change? What is the steady state in this case?

Best Answer

If the matrix $A$ is irreducible, then the algebraic multiplicity of $\lambda = 1$ is exactly $1$, and there is (up to scaling) only one left eigenvector, corresponding to the steady-state distribution. (I use left eigenvectors here, i.e. the row-stochastic convention; with the column-stochastic convention of the question, take right eigenvectors instead.) As it turns out, every stochastic matrix has identical algebraic and geometric multiplicity for the eigenvalue $1$.
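A minimal numerical sketch, assuming a hypothetical 2-state irreducible chain; since the question's matrices are column-stochastic, the steady state is computed here as a right eigenvector:

```python
import numpy as np

# Hypothetical irreducible column-stochastic matrix (all states communicate).
A = np.array([[0.1, 0.5],
              [0.9, 0.5]])

eigvals, eigvecs = np.linalg.eig(A)
ones = np.isclose(eigvals, 1.0)
assert ones.sum() == 1            # the eigenvalue 1 is simple

# Normalize the eigenvector into a probability distribution: the steady state.
pi = np.real(eigvecs[:, ones][:, 0])
pi = pi / pi.sum()
assert np.allclose(A @ pi, pi)
```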

If the matrix $A$ fails to be irreducible but the digraph corresponding to the Markov chain associated with $A$ is weakly connected (so that there is a single absorbing communicating class), then the eigenvalue $1$ will still have algebraic multiplicity $1$, and the left eigenvector corresponds to a steady-state distribution (whose support lies over the absorbing communicating class). The fact that the algebraic multiplicity is $1$ is a consequence of the fact that a substochastic matrix cannot have an eigenvalue of $1$.
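To illustrate, a hypothetical column-stochastic chain with one transient state and one absorbing state:

```python
import numpy as np

# Reducible but weakly connected chain: state 0 is transient,
# state 1 is absorbing (a single absorbing class).  Column-stochastic.
A = np.array([[0.5, 0.0],
              [0.5, 1.0]])

eigvals = np.linalg.eigvals(A)
assert np.isclose(eigvals, 1.0).sum() == 1   # algebraic multiplicity still 1

# The substochastic block over the transient states has spectral radius < 1.
assert abs(A[0, 0]) < 1

# The steady state is supported on the absorbing class.
u = np.linalg.matrix_power(A, 100) @ np.array([1.0, 0.0])
assert np.allclose(u, [0.0, 1.0])
```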

Finally, $A$ will have multiple linearly independent eigenvectors associated with $1$ if and only if $A$ is permutation-similar to a block-diagonal matrix with more than one stochastic diagonal block. The dimension of the eigenspace corresponds to the number of weakly connected components of the Markov chain, and the steady-state distributions over each component form a basis for the eigenspace.
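A sketch with two non-interacting components (a hypothetical block-diagonal construction), checking that the eigenspace for $\lambda = 1$ is two-dimensional:

```python
import numpy as np

# Two non-interacting components: block-diagonal column-stochastic matrix.
A1 = np.array([[0.1, 0.5],
               [0.9, 0.5]])
A2 = np.array([[1.0]])            # a second, trivial one-state component
A = np.block([[A1, np.zeros((2, 1))],
              [np.zeros((1, 2)), A2]])

eigvals = np.linalg.eigvals(A)
# One eigenvalue 1 per weakly connected component:
assert np.isclose(eigvals, 1.0).sum() == 2

# Geometric multiplicity = dim null(A - I) = 2 as well.
geo_mult = 3 - np.linalg.matrix_rank(A - np.eye(3))
assert geo_mult == 2
```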

Putting all this together, a stochastic matrix $A$ can always be written (via a permutation similarity) in the block-diagonal form $$ A = \pmatrix{A_1 \\ & \ddots \\ && A_k}, $$ with each $A_j$ (corresponding to the weakly connected components) stochastic of size $n_j \times n_j$. If each $A_j$ is aperiodic, then $A^k u_0$ indeed converges. If we conformably divide the initial vector $u_0$ into $u_0 = (u_0^1,\dots,u_0^k)$ (so that $u_0^j$ is a length-$n_j$ column vector), we find that $A^k u_0 \to (p_1 \vec 1, \dots, p_k \vec 1)$ for some real numbers $p_j$, each depending on $u_0^j$.
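A numerical sketch of this limit, with hypothetical row-stochastic blocks (rows summing to 1, matching the left-eigenvector convention above); within each block the limit is a constant vector $p_j \vec 1$:

```python
import numpy as np

# Two irreducible, aperiodic row-stochastic blocks (rows sum to 1).
A1 = np.array([[0.1, 0.9],
               [0.5, 0.5]])
A2 = np.array([[0.7, 0.3],
               [0.2, 0.8]])
A = np.block([[A1, np.zeros((2, 2))],
              [np.zeros((2, 2)), A2]])

u0 = np.array([0.4, 0.1, 0.3, 0.2])       # conformably split as (u0^1, u0^2)
u = np.linalg.matrix_power(A, 200) @ u0   # A^k u0 for large k

# Each block converges to p_j * ones: the entries within a block agree.
assert np.isclose(u[0], u[1]) and np.isclose(u[2], u[3])
```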