Raising non-diagonalizable Markov transition matrix to the power of $n$

exponentiation, markov-process, matrices, matrix-decomposition, stochastic-matrices

Consider an aperiodic and irreducible Markov process with absorbing states. For each non-absorbing state, how can I numerically compute the probability of ending up in each absorbing state?

Assume we have four non-absorbing states, and two absorbing states, with the following transition matrix:

$$A = \begin{pmatrix}
0 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 1 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 1 & 0 & 0 & 0 \\
1 & 0 & 0 & 0 & 1 & 0 \\
0 & 1 & 0 & 0 & 0 & 1
\end{pmatrix}$$

By raising $A$ to the power of $n$, I can calculate the $n$-step transition probability between any two states. Letting $n$ tend to infinity gives the probability of ending up in each absorbing state. In the above example, the first non-absorbing state would end up in the first absorbing state, and all other non-absorbing states in the second absorbing state.
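For a small example like this, the limit can be checked directly, e.g. with NumPy's `matrix_power` (a sketch assuming the column-stochastic convention that the matrix above suggests, i.e. $A_{ij}$ is the probability of moving from state $j$ to state $i$):

```python
import numpy as np

# Transition matrix from the question; columns sum to 1 (column-stochastic:
# A[i, j] is the probability of moving from state j to state i).
A = np.array([
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 1, 0, 0],
    [0, 0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0, 0],
    [1, 0, 0, 0, 1, 0],
    [0, 1, 0, 0, 0, 1],
], dtype=float)

# This small chain is absorbed after at most 3 steps, so A^50 is already
# the limit: column j holds the absorption probabilities of state j.
An = np.linalg.matrix_power(A, 50)
print(An[4:, :4])  # rows of the two absorbing states, non-absorbing columns
```

The printed block confirms the claim: column 1 puts all mass on the first absorbing state, columns 2–4 on the second.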

As the matrix $A$ is non-diagonalizable, I wonder how I could numerically compute $A^n$, with $n$ approaching infinity, in a robust and efficient way.

The above example of $A$ is a simplified version of my typical transition matrix, provided here only for illustration. Typically, my transition matrix is much larger, e.g. involving between 1,000 and 10,000 states, but with only a few links between states. (I have a cell grid allowing movement between adjacent cells, with each cell representing a state.)

Is iterative multiplication my only option for computing $A^n$ numerically in a robust way? If so, what would be a good termination criterion, i.e., what measure could I use to assess the similarity between two iterates?
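One common refinement of iterative multiplication is to square the current power repeatedly (so the exponent doubles each step) and stop when two successive iterates agree entry-wise; a sketch, where the function name and tolerance are my own choices:

```python
import numpy as np

def power_limit(A, tol=1e-12, max_squarings=60):
    """Repeatedly square P (so P runs through A, A^2, A^4, A^8, ...) and
    stop when successive iterates agree in the max-absolute-entry norm.
    60 squarings already reach the exponent 2**60."""
    P = np.asarray(A, dtype=float)
    for _ in range(max_squarings):
        P2 = P @ P
        if np.max(np.abs(P2 - P)) < tol:
            return P2
        P = P2
    raise RuntimeError("no convergence; the chain may have periodic parts")
```

Note that squaring a sparse matrix tends to fill it in, so for 1,000–10,000 grid states it may be cheaper to keep $A$ sparse (e.g. in SciPy's sparse formats) and instead iterate $v \mapsto Av$ separately for each starting distribution of interest.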

Best Answer

$A$ can be written as $A=SJS^{-1}$, where $J=D+N$ with $D$ diagonal and $N$ nilpotent. Such a decomposition is called the Dunford (or Jordan–Chevalley) decomposition and is computed using the Jordan normal form $J$. In this case:

$$A^n=SJ^nS^{-1}$$

Once $S$, $D$ and $N$ are known, you only need powers of the nilpotent matrix $N$ up to its nilpotency index, denoted $k_N$ (the smallest $k$ with $N^k=0$). Since $D$ and $N$ commute, the binomial theorem gives, for $n<k_N$:

$$J^n=(D+N)^n=\sum_{l=0}^n \binom{n}{l} N^lD^{n-l}.$$

For $n\ge k_N$:

$$J^n=\sum_{l=0}^{k_N-1} \binom{n}{l} N^lD^{n-l}.$$

Indeed, $N^l=0$ as soon as $l\ge k_N$.
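As a sketch of this computation on the question's matrix, using SymPy's exact-arithmetic `jordan_form` (the names `D`, `N`, `k_N` mirror the text; the truncation of the binomial sum at the nilpotency index is the point being illustrated):

```python
from sympy import Matrix, binomial, zeros

# Transition matrix from the question
A = Matrix([
    [0, 0, 0, 0, 0, 0],
    [0, 0, 0, 1, 0, 0],
    [0, 0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0, 0],
    [1, 0, 0, 0, 1, 0],
    [0, 1, 0, 0, 0, 1],
])

# Jordan normal form: A = S J S^{-1}
S, J = A.jordan_form()

# Dunford decomposition of J: diagonal part D plus nilpotent part N
D = Matrix.diag(*[J[i, i] for i in range(J.rows)])
N = J - D

# Nilpotency index k_N: smallest k with N^k = 0
k_N = 1
while not (N**k_N).is_zero_matrix:
    k_N += 1

# A^n = S J^n S^{-1}, with J^n given by the truncated binomial sum
n = 10
Jn = zeros(J.rows, J.cols)
for l in range(min(n, k_N - 1) + 1):
    Jn += binomial(n, l) * N**l * D**(n - l)
An = S * Jn * S.inv()

assert (An - A**n).is_zero_matrix
```

For this $A$, the nilpotency index comes out as $k_N=3$ (the longest chain of transient states, $3\to 4\to 2$, has length three), so only $N^0$, $N^1$ and $N^2$ ever contribute.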

The efficiency of this procedure depends on how large $n$ needs to be and how many evaluations you want to make. If only a few evaluations are required, a recursive multiplication algorithm (exponentiation by squaring) should be faster, though greedier in terms of memory.
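A sketch of such a square-and-multiply scheme, which needs only about $\log_2 n$ matrix products (the function name is my own):

```python
import numpy as np

def mat_pow(A, n):
    """Exponentiation by squaring: A^n in O(log n) matrix products."""
    result = np.eye(A.shape[0])
    base = np.asarray(A, dtype=float)
    while n > 0:
        if n & 1:             # this bit of n is set: multiply it in
            result = result @ base
        base = base @ base    # base runs through A, A^2, A^4, A^8, ...
        n >>= 1
    return result
```

The memory cost alluded to above shows up with sparse input: each squaring tends to make `base` denser, whereas step-by-step multiplication by a sparse $A$ stays cheap per step.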
