[Math] Partial derivative of matrix w.r.t. its eigenvalue and eigenvector

linear algebra, partial derivative

Standard algorithms have been proposed to compute the partial derivatives of an eigenvalue and eigenvector w.r.t. the matrix, e.g. http://www.win.tue.nl/casa/meetings/seminar/previous/_abstract051019_files/Presentation.pdf.

However, as far as I know, no solution has been derived for the partial derivative of the matrix w.r.t. its eigenvalue and eigenvector. Suppose the matrix $A\in \mathbb{R}^{n \times n}$ has eigenvalue $\lambda\in \mathbb{R}$ and eigenvector $X\in \mathbb{R}^{n}$, i.e. $AX = \lambda X$. The problem is to compute $\frac{\partial A}{\partial \lambda}$ and $\frac{\partial A}{\partial X}$. Does anyone have some ideas? Thanks for your help!

Best Answer

Assuming the matrix has a full set of eigenvalues and eigenvectors, you can write $A = X\Lambda X^{-1}$, where $X$ is the (columnwise) matrix of eigenvectors and $\Lambda$ is the diagonal matrix of eigenvalues. In this form, the derivative with respect to an eigenvalue is easy to see: $$ \frac{\partial A}{\partial \lambda_i} = X_i [(X^{-1})^T_i]^T$$ which is the rank-1 outer product between the right eigenvector $X_i$ (the $i$-th column of $X$) and the corresponding (suitably normalized) left eigenvector row (the $i$-th row of $X^{-1}$).
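This is easy to check numerically, since $A$ depends linearly on $\Lambda$ for fixed $X$. A minimal sketch in numpy (the names `X`, `lam`, `i`, `eps` are illustrative, not from the original post) compares a finite-difference derivative against the rank-1 outer product:

```python
import numpy as np

rng = np.random.default_rng(0)
n, i, eps = 4, 1, 1e-7

X = rng.standard_normal((n, n))   # columns = eigenvectors (assumed full set, invertible)
lam = rng.standard_normal(n)      # eigenvalues
Xinv = np.linalg.inv(X)

def A_of(lam):
    """Reassemble A = X diag(lam) X^{-1} from its eigendecomposition."""
    return X @ np.diag(lam) @ Xinv

# Finite-difference derivative of A w.r.t. the i-th eigenvalue
lam_p = lam.copy()
lam_p[i] += eps
fd = (A_of(lam_p) - A_of(lam)) / eps

# Closed-form rank-1 derivative: column i of X times row i of X^{-1}
analytic = np.outer(X[:, i], Xinv[i, :])

print(np.allclose(fd, analytic, atol=1e-5))  # True
```

Because $A$ is exactly linear in each $\lambda_i$, the finite difference matches the outer product up to rounding error for any step size.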

As for the derivative with respect to the eigenvector: what format do you expect the result to take, a rank-3 tensor?
