If $A$ has distinct eigenvalues, then all matrices close enough to $A$ also have distinct eigenvalues

analysis, eigenvalues-eigenvectors, matrices, solution-verification

EDIT: I have refined the proof as per @TedShifrin's recommendation.

This is problem 2.4.P1 from "Matrix Analysis" 2nd ed. by Horn and Johnson. Let $A\in M_n$ have distinct eigenvalues. Show that there is an $\epsilon > 0$ such that every $B\in M_n$ that satisfies $\lVert A - B \rVert_F^2 = \sum_{i,j=1}^n \lvert a_{ij}-b_{ij}\rvert^2 < \epsilon$ also has distinct eigenvalues. Conclude that the set of matrices with distinct eigenvalues is an open subset of $M_n$. Remark: we're supposed to use Theorem 2.4.9.2 from the book to prove this.

Theorem 2.4.9.2

Let an infinite sequence $A_1,A_2,\ldots\in M_n$ be given and suppose that $\lim_{k\to\infty}A_k = A$ (entrywise convergence). Let $\lambda(A) = [\lambda_1(A) ~\ldots~ \lambda_n(A)]^T$ and $\lambda(A_k) = [\lambda_1(A_k) ~\ldots~ \lambda_n(A_k)]^T$ be given presentations of the eigenvalues of $A$ and $A_k$ respectively, for $k=1,2,\ldots$. Let $S_n = \{\pi ~:~ \pi ~\text{is a permutation of}~ \{1,2,\ldots,n\} \}$. Then for each given $\varepsilon > 0$ there exists a positive integer $N = N(\varepsilon)$ such that
$$
\min_{\pi\in S_n}\max_{i=1,\ldots,n} \lvert \lambda_{\pi(i)}(A_k) - \lambda_i(A)\rvert \leq \varepsilon~\text{for all}~k\geq N.
$$
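For intuition (this is just a numerical sketch of mine, not part of the exercise), one can watch the theorem's matched-eigenvalue distance shrink as $A_k \to A$. The example matrix and the helper `matched_distance` below are illustrative choices, not from the book.

```python
# Numerical illustration of Theorem 2.4.9.2: as A_k -> A entrywise,
# the optimally matched eigenvalue distance tends to 0.
import itertools
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 4.0]])          # distinct eigenvalues 1 and 4
lam_A = np.linalg.eigvals(A)

def matched_distance(lam_B, lam_A):
    """min over permutations pi of max_i |lam_B[pi(i)] - lam_A[i]|."""
    n = len(lam_A)
    return min(
        max(abs(lam_B[p[i]] - lam_A[i]) for i in range(n))
        for p in itertools.permutations(range(n))
    )

rng = np.random.default_rng(0)
for k in range(1, 6):
    A_k = A + rng.standard_normal(A.shape) / 10**k   # perturbations shrinking to 0
    print(k, matched_distance(np.linalg.eigvals(A_k), lam_A))
```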

Attempt

Define $\mathcal{B}_\epsilon(A) = \{ B\in M_n ~|~ \lVert A - B\rVert_F < \epsilon \}$, and assume for the sake of contradiction that for each $\epsilon > 0$ there is a $B\in\mathcal{B}_\epsilon(A)$ with at least one repeated eigenvalue. Let $(\epsilon_k)$ be an arbitrary positive sequence that decreases strictly to $0$, and let $B_k \in \mathcal{B}_{\epsilon_k}(A)$ have a repeated eigenvalue. Note then that $\epsilon_k \to 0$ implies that $\lim_{k\to\infty}B_k = A$ entrywise.

Now let $\pi_k = \arg\min_{\pi\in S_n}\max_{i=1,\ldots,n} \lvert \lambda_{\pi(i)}(B_k) - \lambda_i(A)\rvert$, let $c = \min_{i\neq j}\lvert \lambda_i(A) - \lambda_j(A)\rvert > 0$, and let $\beta_k = \lambda_{\pi_k(i_k)}(B_k) = \lambda_{\pi_k(j_k)}(B_k)$ be a repeated eigenvalue of $B_k$ for some indices $i_k\neq j_k$, $k=1,2,\ldots$. Fix any $\varepsilon < c$. Applying Theorem 2.4.9.2 with tolerance $\varepsilon/2$ gives an $N$ such that $\lvert \beta_k - \lambda_{i_k}(A)\rvert \leq \varepsilon/2$ and $\lvert \beta_k - \lambda_{j_k}(A)\rvert \leq \varepsilon/2$ for all $k \geq N$. But then
$$
c \leq \lvert \lambda_{i_k}(A) - \lambda_{j_k}(A)\rvert\leq \lvert \beta_k - \lambda_{i_k}(A)\rvert + \lvert \beta_k - \lambda_{j_k}(A)\rvert\leq\varepsilon < c,
$$

a contradiction. Since the sequence $(\epsilon_k)$, and consequently the sequence $(B_k)$, was arbitrary, the assumption must fail: there is an $\epsilon > 0$ such that every $B\in\mathcal{B}_\epsilon(A)$ has distinct eigenvalues.

Is this proof correct? My main concerns are whether the strict inequality holds, and whether the contradiction covers all cases.

Best Answer

For the record, here's a much simpler proof. For $A \in \mathbb C^{n\times n}$, consider the following $n\times n$ Hankel matrix.

$H_A=\begin{pmatrix} \text{trace}\big(I_n\big) & \text{trace}\big(A^1\big) & \text{trace}\big(A^2\big) &\cdots &\text{trace}\big(A^{n-1}\big) \\ \text{trace}\big(A^1\big) & \text{trace}\big(A^2\big) & \text{trace}\big(A^3\big) & \cdots & \text{trace}\big(A^{n}\big) \\ \text{trace}\big(A^2\big) & \text{trace}\big(A^3\big)& \text{trace}\big(A^4\big) & \cdots &\text{trace}\big(A^{n+1}\big) \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \text{trace}\big(A^{n-1}\big) & \text{trace}\big(A^{n}\big) &\text{trace}\big(A^{n+1}\big) & \cdots & \text{trace}\big(A^{2n-2}\big) \end{pmatrix}$

$\operatorname{rank} H_A$ equals the number of distinct eigenvalues of $A$ (see the addendum below). Since $A$ has $n$ distinct eigenvalues, this tells us $\det\big(H_A\big) = c\neq 0$. The map

$f: M_n\longrightarrow \mathbb C$
given by $f(X)= \det\big(H_X\big)$ is a polynomial function of the entries of $X$, hence continuous. Applying the definition of continuity with the metric $d\big(A,B\big) = \big\Vert A-B\big\Vert_F$ and selecting $\epsilon := \frac{\vert c\vert}{2}$, we know there is some $\delta\gt 0$ such that
$d\big(A,B\big)\lt \delta \implies\big\vert f(A) -f(B)\big\vert \lt \epsilon = \frac{\vert c\vert}{2}\implies f(B) \neq 0\implies \operatorname{rank} H_B = n\implies B \text{ has } n \text{ distinct eigenvalues}.$
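For readers who want to see this numerically, here is a minimal sketch (the helper `hankel_of_traces` and the example matrices are my own, not part of the answer): it builds $H_X$ from traces of powers and checks that the determinant is nonzero exactly when the eigenvalues are distinct, and that it stays away from $0$ under a small perturbation of $A$.

```python
# Sketch: det(H_X) != 0 iff X has n distinct eigenvalues, and the
# determinant survives a small Frobenius perturbation of A.
import numpy as np

def hankel_of_traces(X):
    """H_X with (i, j) entry trace(X^(i+j)), i, j = 0, ..., n-1."""
    n = X.shape[0]
    traces = [np.trace(np.linalg.matrix_power(X, k)) for k in range(2 * n - 1)]
    return np.array([[traces[i + j] for j in range(n)] for i in range(n)])

A = np.diag([1.0, 2.0, 3.0])                 # three distinct eigenvalues
J = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])              # eigenvalue 2 repeated

print(np.linalg.det(hankel_of_traces(A)))    # 4.0 (up to rounding): nonzero
print(np.linalg.det(hankel_of_traces(J)))    # ~0: repeated eigenvalue
B = A + 1e-3 * np.ones((3, 3))               # small perturbation of A
print(np.linalg.det(hankel_of_traces(B)))    # still clearly nonzero
```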


addendum re: the determinant of the Hankel Matrix
Let $X\in \mathbb C^{n\times n}$

All that is needed for the purposes of this post is
$\operatorname{rank}\big(H_X\big) =n\iff X~\text{has}~n~\text{distinct eigenvalues}$

proof:
$X$ has $n$ eigenvalues (counted with multiplicity). Choose some arbitrary labeling $\lambda_0,\ldots,\lambda_{n-1}$ for them and collect their respective points on the moment curve as the rows of the square Vandermonde matrix below.

$V:=\begin{bmatrix} 1 & \lambda_0 & \lambda_0^2 & \dots & \lambda_0^{n-1}\\ 1 & \lambda_1 & \lambda_1^2 & \dots & \lambda_1^{n-1} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & \lambda_{n-1} & \lambda_{n-1}^{2} & \dots & \lambda_{n-1}^{n-1} \end{bmatrix} =\bigg[\begin{array}{c|c|c|c|c} \mathbf v_0 & \mathbf v_1 &\mathbf v_2 &\cdots & \mathbf v_{n-1} \end{array}\bigg]$

For formal convenience, set $\lambda_k^0:=1$.

Then $H_X=V^T V $ because for $i, j \in \big\{0,1,\dots, n-1\big\}$
$\mathbf e_i^T H_X \mathbf e_j = \text{trace}\big(X^{i+j}\big) =\sum_{k=0}^{n-1} \lambda_k^{i+j} =\sum_{k=0}^{n-1} \lambda_k^{i}\cdot \lambda_k^j=\mathbf v_i^T \mathbf v_j = \mathbf e_i^T V^T V \mathbf e_j $

And since $V$ is a square Vandermonde matrix it is invertible iff all $\lambda_k$ are distinct. Thus if not all eigenvalues are distinct then $\text{rank}\big(H_X\big)=\text{rank}\big(V^T V\big)\leq \text{rank}\big(V\big)\lt n$ and if all eigenvalues are distinct $H_X$ is invertible hence has rank $n$.
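As a quick numerical sanity check of the factorization $H_X = V^T V$ (my own sketch; note the plain transpose, not the conjugate transpose, exactly as in the identity above):

```python
# Check H_X = V^T V numerically for a random X, where row k of V is
# (1, lam_k, ..., lam_k^{n-1}) for some labeling of the eigenvalues of X.
import numpy as np

rng = np.random.default_rng(1)
n = 4
X = rng.standard_normal((n, n))
lam = np.linalg.eigvals(X)                        # an arbitrary labeling

V = np.vander(lam, N=n, increasing=True)          # row k: (1, lam_k, ..., lam_k^{n-1})
H_from_V = V.T @ V                                # plain transpose, as in the proof

H_direct = np.array([[np.trace(np.linalg.matrix_power(X, i + j))
                      for j in range(n)] for i in range(n)])

print(np.allclose(H_from_V, H_direct))            # True up to floating-point error
```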

Alternatively, observe that $\det\big(H_X\big)= \det\big(V\big)^2$, which is the discriminant of the characteristic polynomial of $X$. (This motivates yet another approach: working directly with the resultant of $X$'s characteristic polynomial and that polynomial's derivative, i.e. via the determinant of the associated Sylvester matrix, but the above approach seems more accessible.)
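A small symbolic spot check of this identity, assuming SymPy is available (the $2\times 2$ example matrix is mine):

```python
# det(H_X) equals the discriminant of X's characteristic polynomial.
import sympy as sp

x = sp.symbols('x')
X = sp.Matrix([[0, 1], [-2, 3]])                       # eigenvalues 1 and 2
H = sp.Matrix(2, 2, lambda i, j: (X**(i + j)).trace()) # Hankel matrix of traces
p = X.charpoly(x).as_expr()                            # x**2 - 3*x + 2
print(H.det(), sp.discriminant(p, x))                  # both print 1
```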
