[Math] Gradients of the Dominant Eigenvalue and Eigenvector

Tags: eigenvalues, eigenvector, linear algebra

How can I compute the partial derivatives of the dominant eigenvalue and eigenvector of a real symmetric matrix $\mathbf{A}$?

In particular, given

$$
\mathbf{v}^* = \arg\max_{\mathbf{v}} \mathbf{v}^{\top}\mathbf{A}\mathbf{v}, \quad \text{subject to } \|\mathbf{v}\|_2 = 1,
$$

how can I find $\partial v_{i}^*/\partial A_{jk}$?

I know the power method is the usual way to compute the dominant eigenvalue and eigenvector. Is there a similar algorithm for computing the gradient?

Unlike another question, I am interested in an efficient computational solution rather than an analytical one.
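For reference, the power iteration mentioned above can be sketched in a few lines of NumPy. Note that power iteration converges to the eigenvalue of largest *magnitude*; this coincides with the $\arg\max$ of the Rayleigh quotient when that eigenvalue is positive (a shift $\mathbf{A} + c\mathbf{I}$ can be applied otherwise).

```python
import numpy as np

def power_method(A, iters=1000, seed=0):
    """Dominant (largest-magnitude) eigenpair of a symmetric matrix
    by power iteration.  Assumes the dominant eigenvalue is simple."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = A @ v
        v /= np.linalg.norm(v)       # renormalize each step
    return v @ A @ v, v              # Rayleigh quotient, eigenvector
```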

Best Answer

See (68) here: if $\lambda$ is a simple eigenvalue with $Av=\lambda v$ and $v$ normalized to unit length, then $$ \frac{\partial v}{\partial A_{jk}} = (\lambda I - A)^+ E^{jk} v, $$ where $E^{jk}$ is the matrix with a $1$ in position $(j,k)$ and $0$ everywhere else, and $(\cdot)^+$ denotes the Moore–Penrose pseudoinverse.

The formula is cheap to evaluate explicitly because it requires only the eigenvector itself and a single pseudoinverse, which can be reused for every entry $(j,k)$. Moreover, if the eigenvalue is simple, then you know that $\lambda I - A$ has rank $n-1$, so you don't have a (potentially difficult) rank decision to make when forming the pseudoinverse.
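As a sanity check, here is a minimal NumPy sketch (mine, not part of the answer) that evaluates the pseudoinverse formula and compares it against a finite difference. Since $E^{jk}v = v_k e_j$, the gradient reduces to $v_k$ times column $j$ of the pseudoinverse, so the full Jacobian costs only one `pinv`.

```python
import numpy as np

# Symmetric test matrix with a known spectral gap (eigenvalues 4, 2, 1, ...)
rng = np.random.default_rng(0)
n = 5
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag([4.0, 2.0, 1.0, 0.5, 0.25]) @ Q.T

# Dominant eigenpair: eigh returns eigenvalues in ascending order
w, V = np.linalg.eigh(A)
lam, v = w[-1], V[:, -1]

# One pseudoinverse, reused for every entry (j, k)
P = np.linalg.pinv(lam * np.eye(n) - A)

def dv_dA(j, k):
    """dv/dA_{jk} via the formula; E^{jk} v = v_k e_j, so this
    is just v[k] times column j of the pseudoinverse."""
    return v[k] * P[:, j]

# Finite-difference check on one entry.  Perturbing A_{jk} alone
# makes the matrix slightly nonsymmetric, so use the general eig.
j, k, h = 1, 3, 1e-6
Ah = A.copy()
Ah[j, k] += h
wh, Vh = np.linalg.eig(Ah)
vh = np.real(Vh[:, np.argmax(wh.real)])
vh /= np.linalg.norm(vh)
if vh @ v < 0:               # eigenvectors are defined up to sign
    vh = -vh
fd = (vh - v) / h
```

The check also confirms that the gradient is orthogonal to $v$ itself, as it must be under the unit-norm constraint.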

See also this related answer for a technique that can produce such formulas.
