[Math] dominant eigenvector

eigenvector, linear-algebra, matrices, sp.spectral-theory, tensor-products

Hi, everyone! Is there any efficient way to simplify the following tensor product

$X \otimes X + X^T \otimes X^T$, where $X$ is a square $n \times n$ matrix.

My goal is to compute the dominant eigenvector of $X \otimes X + X^T \otimes X^T$ efficiently. However, the direct approach is computationally expensive. Is it possible to simplify the expression so as to avoid the explicit tensor computation?

For example, to compute the dominant eigenvector of $X \otimes X$, I can compute the dominant eigenvector of $X$ first, denoted by $v$. Then I only need to form $v v^T$, whose vectorization $v \otimes v$ is the dominant eigenvector of $X \otimes X$. However, when it comes to $X \otimes X + X^T \otimes X^T$, I have no idea how to proceed. Could anyone give me some suggestions or a reference? Thank you in advance!
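As a quick numerical sanity check of the $X \otimes X$ fact above (random matrix purely for illustration):

```python
import numpy as np

# If (lam, v) is the dominant eigenpair of X, then v (x) v = vec(v v^T)
# is an eigenvector of X (x) X with eigenvalue lam^2, which has maximal
# modulus |lam|^2 among the eigenvalues lam_i * lam_j of X (x) X.
rng = np.random.default_rng(0)
n = 4
X = rng.standard_normal((n, n))

w, V = np.linalg.eig(X)
i = np.argmax(np.abs(w))
lam, v = w[i], V[:, i]

u = np.kron(v, v)  # = vec(v v^T) with column-stacking vec
residual = np.linalg.norm(np.kron(X, X) @ u - lam**2 * u)
```

The residual is zero up to rounding error.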

Best Answer

In general, there is no simple way to relate the eigenvalues of a sum to the eigenvalues of the original matrices.

What you can do computationally is switch to an iterative algorithm (the Arnoldi method: eigs in Matlab, scipy.sparse.linalg.eigs in Python, or the ARPACK library if you are stuck with Fortran/C++). The computational cost is then $O(M \cdot steps)$, where $steps$ is the number of iterations needed (usually moderate, if the dominant eigenvalue is well-separated and/or you only need a small number of significant digits), and $M$ is the cost of a matrix-vector product with your matrix, which is cheap in your case thanks to the tensor product structure. You will need to code a subroutine for the map $v \mapsto Mv$.
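A minimal sketch of such a subroutine in Python (the function name dominant_eigvec is my own), using scipy.sparse.linalg.LinearOperator together with the column-stacking identity $(A \otimes B)\,\mathrm{vec}(V) = \mathrm{vec}(B V A^T)$, so that each matvec costs $O(n^3)$ instead of the $O(n^4)$ of a dense product with the $n^2 \times n^2$ matrix:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigs

def dominant_eigvec(X):
    """Dominant eigenpair of M = X (x) X + X^T (x) X^T, without forming M.

    By the identity (A (x) B) vec(V) = vec(B V A^T):
        (X   (x) X  ) vec(V) = vec(X   V X^T)
        (X^T (x) X^T) vec(V) = vec(X^T V X)
    """
    n = X.shape[0]

    def matvec(v):
        V = v.reshape(n, n, order="F")   # undo column-stacking vec()
        W = X @ V @ X.T + X.T @ V @ X
        return W.reshape(n * n, order="F")

    M = LinearOperator((n * n, n * n), matvec=matvec, dtype=X.dtype)
    vals, vecs = eigs(M, k=1, which="LM")
    return vals[0], vecs[:, 0]
```

Note that $(X \otimes X + X^T \otimes X^T)^T = X^T \otimes X^T + X \otimes X$, so $M$ is symmetric for real $X$; scipy.sparse.linalg.eigsh accepts the same LinearOperator and is typically the more robust choice in that case.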

(This is generally a good idea anyway if $n$ is large and you only need a small part of the spectrum.)

If the dominant eigenvalue is not well-separated, you have a computationally challenging problem, and you may want to visit your favorite numerical analyst for a more thorough diagnosis.