Spectral Theory – Conditions for Smooth Dependence of Eigenvalues and Eigenvectors

Tags: ca.classical-analysis-and-odes, eigenvector, matrices, polynomials, sp.spectral-theory

Let $A\in\mathcal M_n$ be an $n\times n$ real [symmetric] matrix which depends smoothly on a [finite] set of parameters, $A=A(\xi_1,\ldots,\xi_k)$. We can view it as a smooth function $A:\mathbb R^k\to\mathcal M_n$.

1. What conditions should the matrix $A$ satisfy so that its eigenvalues $\lambda_i(\xi_1,\ldots,\xi_k)$, $i=1,\ldots,n$, depend smoothly on the parameters $\xi_1,\ldots,\xi_k$?

E.g. if the characteristic equation is $\lambda^3-\xi=0$, then the solution $\lambda_1=\sqrt[3]{\xi}$ is not differentiable at $\xi=0$.

2. What additional conditions should the matrix $A$ satisfy so that we can choose a set of eigenvectors $v_i(\xi_1,\ldots,\xi_k)$, $i=1,\ldots,n$, which depend smoothly on the parameters $\xi_1,\ldots,\xi_k$?
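
As an illustration of the setting (not part of the question), here is a minimal Python/NumPy sketch with a made-up smooth family $A(\xi_1,\xi_2)$; it simply evaluates the eigenvalue and eigenvector branches returned by a standard symmetric eigensolver along a curve in parameter space. Whether these branches, or some relabelling of them, can be chosen smoothly is exactly what the question asks.

```python
import numpy as np

def A(xi1, xi2):
    # A made-up smooth family of real symmetric 3x3 matrices (illustrative only).
    return np.array([[xi1,  xi2, 0.0],
                     [xi2, -xi1, xi2],
                     [0.0,  xi2, 1.0 + xi1 ** 2]])

# Evaluate the eigenvalue/eigenvector "branches" along a smooth curve
# (xi1, xi2) = (t, t^2) in parameter space.
for t in np.linspace(-1.0, 1.0, 5):
    w, V = np.linalg.eigh(A(t, t ** 2))   # w ascending, columns of V orthonormal
    print(f"t = {t:+.2f}   eigenvalues = {np.round(w, 4)}")

# np.linalg.eigh labels the eigenvalues by sorting them; the question is whether
# some (possibly different) labelling depends smoothly on the parameters.
```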

Update – important details

  • The domain is simply connected
  • The rank of $A$ can change in the domain
  • The multiplicities of the eigenvalues can change in the domain; the eigenvalues can cross
  • The matrix $A$ is real symmetric
  • $n$ and $k$ are finite

Update 2

  • A relaxation of the conditions of the problem:
    For fixed $p=(\xi_{01},\ldots,\xi_{0k})$, can we find an open neighborhood of $p$ in the domain and a set of conditions ensuring the smoothness of the eigenvalues and the eigenvectors?

Best Answer

The fact that the entries of the matrix are real does seem to help. The state of the art is the following.

  • The spectrum is a continuous function of $\xi$. However, it is not always possible to label the eigenvalues so that each of them is individually a continuous function.
  • When the multiplicities $m_1,\ldots,m_r$ do not change as $\xi$ varies (no crossing of eigenvalues), the eigenvalues are as smooth as the matrix. If the domain is simply connected, the eigenvalues may be labelled so as to be smooth functions.
  • When the entries are analytic functions of a single variable ($k=1$) and the eigenvalues remain real, then the eigenvalues may be labelled so as to be analytic functions. However, in the case of crossings, this nice labelling is not the obvious one (i.e. not $\lambda_1\le\lambda_2\le\cdots$). This becomes false for $k\ge 2$, as shown by the example $$\left(\begin{array}{cc} \xi_1 & \xi_2 \\ \xi_2 & -\xi_1 \end{array}\right)\qquad\qquad (1).$$
  • The situation is not as good concerning the eigenvectors. The following is called Petrowski's example, $$\left(\begin{array}{ccc} 0 & \xi_1 & \xi_1 \\ 0 & 0 & 0 \\ \xi_1 & 0 & \xi_2 \end{array}\right).$$ The eigenvalues are real for every $\xi$, and distinct when $\xi_1\ne0$. The matrix is diagonalisable for every $\xi$, but two eigenvectors have the same limit as $\xi_1\rightarrow0$. (Both this example and example (1) are illustrated numerically in the sketch after this list.)
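
The two examples above can be checked numerically. The following Python sketch (NumPy only; the helper names are ad hoc) shows that for example (1) the ordered eigenvalues are $\pm\sqrt{\xi_1^2+\xi_2^2}$, continuous but not differentiable at the origin, and that in Petrowski's example the eigenvectors attached to the two eigenvalues that merge as $\xi_1\to 0$ become nearly parallel.

```python
import numpy as np

# Example (1): the ordered eigenvalues are +/- sqrt(xi1^2 + xi2^2),
# continuous everywhere but not differentiable at (0, 0).
def eigvals_example1(xi1, xi2):
    A = np.array([[xi1, xi2], [xi2, -xi1]])
    return np.linalg.eigvalsh(A)              # ascending order

for xi1 in (-0.1, -0.01, 0.0, 0.01, 0.1):
    print(xi1, eigvals_example1(xi1, 0.0))    # lower branch = -|xi1|: kink at 0

# Petrowski's example: real eigenvalues, diagonalisable for every xi,
# yet two eigenvectors collapse onto the same direction as xi1 -> 0.
def petrowski(xi1, xi2):
    return np.array([[0.0, xi1, xi1],
                     [0.0, 0.0, 0.0],
                     [xi1, 0.0, xi2]])

xi2 = 1.0
for xi1 in (0.1, 0.01, 0.001):
    w, V = np.linalg.eig(petrowski(xi1, xi2))
    order = np.argsort(np.abs(w))             # the two eigenvalues nearest 0 merge
    v0, v1 = V[:, order[0]], V[:, order[1]]
    overlap = abs(np.vdot(v0, v1)) / (np.linalg.norm(v0) * np.linalg.norm(v1))
    print(f"xi1={xi1}: |<v0,v1>| = {overlap:.6f}")   # tends to 1
```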

If the domain is not simply connected, you may have additional difficulties with the eigenvectors. Take example (1) above, with $\xi$ running over the unit circle $S^1$. When you follow a unit eigenvector $V(\xi)$ continuously, it is flipped (i.e. multiplied by $-1$) after one loop.
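
A quick numerical check of this sign flip (again only a sketch, with ad hoc names): transport a unit eigenvector of example (1) around $S^1$ by choosing at each step the sign that agrees with the previous vector; after one full loop it returns multiplied by $-1$.

```python
import numpy as np

def A(theta):
    # Example (1) with (xi1, xi2) = (cos(theta), sin(theta)) on the unit circle.
    return np.array([[np.cos(theta),  np.sin(theta)],
                     [np.sin(theta), -np.cos(theta)]])

thetas = np.linspace(0.0, 2 * np.pi, 2001)
_, V = np.linalg.eigh(A(thetas[0]))
v = V[:, 1]                      # unit eigenvector of the eigenvalue +1
v_start = v.copy()

for theta in thetas[1:]:
    _, V = np.linalg.eigh(A(theta))
    w = V[:, 1]                  # eigenvector of the larger eigenvalue
    if np.dot(w, v) < 0:         # pick the sign that varies continuously
        w = -w
    v = w

print(np.dot(v, v_start))        # approximately -1: the eigenvector is flipped
```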