[Math] Differentiability of eigenvalues and eigenvectors in the non-simple case

eigenvalues, eigenvector, matrix-theory, nonlinear-optimization, sp.spectral-theory

Let $h:\mathbb{R}^n\to\mathbb{R}^m$, $n>1$, be a twice continuously differentiable function and let $J_h:\mathbb{R}^n\to\mathbb{R}^{m\times n}$ be its Jacobian matrix. Consider the functions $A(x):=J_h(x)^\mathtt{T}J_h(x)\in\mathbb{R}^{n\times n}$ and $B(x):=J_h(x)J_h(x)^\mathtt{T}\in\mathbb{R}^{m\times m}$.

I'm interested in sufficient conditions ensuring differentiability of the factors $U(x)$, $\Sigma(x)$, and $V(x)$ in a singular value decomposition $J_h(x)=U(x)\Sigma(x)V(x)^\mathtt{T}$ when there is at least one repeated zero singular value (the rank-deficient case).

The question can be stated equivalently in terms of the eigenvalues and eigenvectors of the symmetric matrices $A$ and $B$: are there sufficient conditions ensuring differentiability of an eigenpair when the eigenvalue is not simple?
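To make the stated equivalence concrete: the squared singular values of $J_h(x)$ are the eigenvalues of $A(x)=J_h^\mathtt{T}J_h$ (and the nonzero ones are also eigenvalues of $B(x)=J_hJ_h^\mathtt{T}$), with the columns of $V$ and $U$ as the corresponding eigenvectors. A minimal numeric check, using an arbitrary random matrix in place of an actual Jacobian:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 3
J = rng.standard_normal((m, n))  # stand-in for J_h(x) at a fixed x

# Thin SVD: J = U @ diag(s) @ Vt, singular values s in descending order.
U, s, Vt = np.linalg.svd(J, full_matrices=False)

# A = J^T J is symmetric with eigenvalues s_i^2 (eigenvectors: columns of V);
# B = J J^T shares the same nonzero eigenvalues (eigenvectors: columns of U).
A = J.T @ J
eigvals_A = np.sort(np.linalg.eigvalsh(A))[::-1]  # descending, to match s

assert np.allclose(eigvals_A, s**2)
```

So any differentiability result for eigenpairs of the symmetric family $A(x)$ transfers directly to $\Sigma(x)$ and $V(x)$, and likewise $B(x)$ to $U(x)$.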

Appreciate any help.

Best Answer

I think Theorem 6.8 on page 122 of Kato, Perturbation Theory for Linear Operators, may help, at least for the question concerning the eigenvalues of the symmetric matrices $A$ and $B$.

Theorem: Let $T(x)$ be a symmetric $N \times N$ matrix whose entries are continuously differentiable functions of $x$ on an interval $I$. Then there exist $N$ continuously differentiable functions $\mu_1(x),\dots,\mu_N(x)$ on $I$ that represent the eigenvalues of $T(x)$, repeated according to multiplicity.
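The subtle point in the theorem is that the smooth functions $\mu_n(x)$ need not coincide with the eigenvalues in sorted order: at a crossing, the sorted eigenvalues can have a kink while a reordered parametrization stays smooth. A minimal sketch with the hypothetical family $T(x)=\operatorname{diag}(x,-x)$, which has a double eigenvalue at $x=0$:

```python
import numpy as np

def T(x):
    # Symmetric matrix family with an eigenvalue crossing at x = 0.
    return np.array([[x, 0.0], [0.0, -x]])

xs = np.linspace(-1.0, 1.0, 5)

# Sorted eigenvalues: (-|x|, |x|). Continuous, but not differentiable
# at the crossing x = 0 (the graphs have a kink there).
sorted_eigs = np.array([np.sort(np.linalg.eigvalsh(T(x))) for x in xs])

# A parametrization as in Kato's theorem: mu_1(x) = x, mu_2(x) = -x.
# Each mu_n is smooth through the crossing, at the price of the two
# branches exchanging their sorted positions at x = 0.
smooth_eigs = np.array([[x, -x] for x in xs])

# Both describe the same eigenvalue sets at every x.
for row_s, row_m in zip(sorted_eigs, smooth_eigs):
    assert np.allclose(row_s, np.sort(row_m))
```

Note that the theorem is stated for a one-dimensional parameter interval $I$; for the original question, with $x\in\mathbb{R}^n$ and $n>1$, one would have to apply it along individual lines or curves in parameter space.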