Equilibrium of the Markov matrix

linear-algebra · markov-chains · matrices

Suppose we have the following $2 \times 2$ Markov matrix.

$\begin{pmatrix}
.8 & .1\\
.2 & .9
\end{pmatrix}$

Iterating the update

$\begin{pmatrix}
.8 \\
.2
\end{pmatrix} \cdot a + \begin{pmatrix}
.1\\
.9
\end{pmatrix} \cdot b = \begin{pmatrix}
a'\\
b'
\end{pmatrix}$

starting from $a = b = 150$, the state settles at $a = 100$, $b = 200$; there the entire system is in equilibrium, since $.8 \cdot 100 + .1 \cdot 200 = 100$ and $.2 \cdot 100 + .9 \cdot 200 = 200$.
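A quick numerical check of this convergence (a sketch using NumPy; the matrix and the starting vector $(150, 150)$ are the ones above):

```python
import numpy as np

# The Markov matrix from above (columns sum to 1)
A = np.array([[0.8, 0.1],
              [0.2, 0.9]])

# Start from (150, 150) and apply A repeatedly
v = np.array([150.0, 150.0])
for _ in range(100):
    v = A @ v

print(v)  # approaches the equilibrium (100, 200)
```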

Is there a theorem guaranteeing that a Markov matrix (always?) reaches an equilibrium point?

Best Answer

This (stochastic) matrix $A$ has two eigenvalues, $\lambda_1=1$ and $\lambda_2=0.7$, so it is diagonalizable: $A=P\begin{bmatrix}1&0\\0&0.7\end{bmatrix}P^{-1}=PDP^{-1}$, where $P=\begin{bmatrix}1&-1\\2&1\end{bmatrix}$ (its columns are the corresponding eigenvectors).
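This factorization can be verified numerically (a sketch; $P$ and $D$ are exactly as stated above):

```python
import numpy as np

A = np.array([[0.8, 0.1],
              [0.2, 0.9]])
P = np.array([[1.0, -1.0],
              [2.0,  1.0]])
D = np.diag([1.0, 0.7])

# Check A = P D P^{-1}
reconstructed = P @ D @ np.linalg.inv(P)
print(np.allclose(A, reconstructed))  # True
```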

To get the equilibrium vector, we look at the limit $\lim_{n \to \infty}A^nv$ for an arbitrary starting vector $v$; the equilibrium is the vector $w$ this limit reaches, which satisfies $Aw=w$.

Now $v=c_1v_1+c_2v_2$, where $v_1,v_2$ are eigenvectors corresponding to the eigenvalues $1$ and $0.7$.

Thus $$A^nv=c_1A^nv_1+c_2A^nv_2 =c_1(1)^nv_1+c_2(0.7)^nv_2,$$ so $$\lim_{n \to \infty}A^nv=c_1v_1.$$ Hence the equilibrium vector is a multiple of the first eigenvector (the one for eigenvalue $1$), here $v_1=(1,2)^T$. Since the columns of $A$ sum to $1$, the total of the entries of $v$ is preserved at every step, so starting from a total of $300$ the limit is $100\,(1,2)^T=(100,200)^T$, matching the equilibrium $a=100$, $b=200$ above.
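The equilibrium can also be read off directly from the eigenvector for $\lambda=1$ (a sketch; the eigenvector is rescaled so its entries sum to the starting total of $300$, which is an assumption tied to this example):

```python
import numpy as np

A = np.array([[0.8, 0.1],
              [0.2, 0.9]])

# Eigenvector for the eigenvalue closest to 1
eigvals, eigvecs = np.linalg.eig(A)
w = eigvecs[:, np.argmin(np.abs(eigvals - 1.0))]

# Rescale so the entries sum to the total population, here 300
w = w / w.sum() * 300
print(w)  # the equilibrium vector (100, 200)
```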