Eigenvalues of symmetric tridiagonal matrices with complex entries

characteristic-polynomial, eigenvalues-eigenvectors, linear-algebra, matrices, tridiagonal-matrices

In this paper the authors proved that, for a real symmetric tridiagonal matrix $T_n$ with $b_i \neq 0$ for all $i$, defined as follows,

$$T_n = \begin{bmatrix}
a_1&b_1&0&0&0&0&0&0&\cdots&0\\
b_1&a_2&b_2&0&0&0&0&0&\cdots&0\\
0&b_2&a_3&b_3&0&0&0&0&\cdots&0\\
0&0&b_3&a_4&b_4&0&0&0&\cdots&0\\
0&0&0&b_4&a_5&b_5&0&0&\cdots&0\\
0&0&0&\ddots&\ddots&\ddots&\ddots&\ddots&\ddots&0\\
0&0&0&0&0&b_{n-4}&a_{n-3}&b_{n-3}&0&0\\
0&0&0&0&0&0&b_{n-3}&a_{n-2}&b_{n-2}&0\\
0&0&0&0&0&0&0&b_{n-2}&a_{n-1}&b_{n-1}\\
0&0&0&0&0&0&0&0&b_{n-1}&a_n\\
\end{bmatrix},
$$

no two successive leading principal submatrices of $T_n$ have a common eigenvalue. The $i$th leading principal submatrix of $T_n$ is denoted by $T_i$ and its characteristic polynomial by $P_i(\lambda) = \det(\lambda I - T_i)$. For example, $T_4$ is as follows:

$$
T_4 = \begin{bmatrix}
a_1&b_1&0&0\\
b_1&a_2&b_2&0\\
0&b_2&a_3&b_3\\
0&0&b_3&a_4
\end{bmatrix}.
$$

In fact, they proved that there is no $x \in \Bbb R$ that is a root of both $P_i(\lambda)$ and $P_{i+1}(\lambda)$. The proof is as follows.


Proof. It is well known that the polynomials $P_j(\lambda)$ satisfy the following recurrence:

\begin{align*}
P_0(\lambda) &= 1,\\
P_1(\lambda) &= \lambda-a_1,\\
P_j(\lambda) &= (\lambda-a_j)P_{j-1}(\lambda) - b_{j-1}^2P_{j-2}(\lambda), \qquad 2 \leq j \leq n.
\end{align*}
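
For readers who want to sanity-check this recurrence numerically, here is a minimal sketch in Python (assuming NumPy is available; the size $n$, the test point $\lambda$, and the random complex entries are arbitrary illustrative choices, not part of the paper):

```python
import numpy as np

# Minimal numerical sanity check of the three-term recurrence (not a proof).
# n, lambda, and the complex entries a_i, b_i are arbitrary illustrative choices.
rng = np.random.default_rng(0)
n = 6
a = rng.normal(size=n) + 1j * rng.normal(size=n)          # diagonal a_1, ..., a_n
b = rng.normal(size=n - 1) + 1j * rng.normal(size=n - 1)  # off-diagonal b_1, ..., b_{n-1}

def T(k):
    """Leading principal k-by-k submatrix T_k."""
    return np.diag(a[:k]) + np.diag(b[:k - 1], 1) + np.diag(b[:k - 1], -1)

lam = 0.3 - 0.7j  # an arbitrary test point lambda

# P_k(lambda) = det(lambda*I - T_k), computed directly from the determinant ...
P_det = [np.linalg.det(lam * np.eye(k) - T(k)) for k in range(1, n + 1)]

# ... and via the recurrence P_0 = 1, P_1 = lambda - a_1,
# P_j = (lambda - a_j) P_{j-1} - b_{j-1}^2 P_{j-2}.
P_rec = [1.0 + 0j, lam - a[0]]
for j in range(2, n + 1):
    P_rec.append((lam - a[j - 1]) * P_rec[-1] - b[j - 2] ** 2 * P_rec[-2])

print(np.allclose(P_det, P_rec[1:]))  # True up to round-off
```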

If $P_1(\lambda) = 0 = P_2(\lambda)$, then $(\lambda - a_2)P_1(\lambda) - b_1^2 = 0$, which implies $b_1 = 0$; this contradicts the restriction on $T_n$ that $b_1 \neq 0$. Similarly, for $2 < j \leq n$, if $P_{j-1}(\lambda) = 0 = P_j(\lambda)$, then the recurrence gives $(\lambda - a_j)P_{j-1}(\lambda) - b_{j-1}^2 P_{j-2}(\lambda) = 0$, and since $b_{j-1} \neq 0$ this forces $P_{j-2}(\lambda) = 0$. Repeating the argument with $j-1$ in place of $j$ gives $P_{j-3}(\lambda) = 0$, and so on, until we end up with $P_1(\lambda) = 0 = P_2(\lambda)$, which implies $b_1 = 0$, a contradiction.


The proof is for real symmetric tridiagonal matrices, which have real eigenvalues. What if the symmetric matrix has some complex entries? In that case some of its eigenvalues may be complex, but I don't see any problem in applying the proof given above to the complex case.

I would like to know whether it is "legal" to apply the proof given above to the complex case. Please note that I am interested in the case where the complex tridiagonal matrices are symmetric rather than Hermitian.

Best Answer

Yes.

More generally, the same conclusion and the same proof are valid if we replace $\Bbb R$ with any commutative ring with identity (assumed throughout this answer), such as $\Bbb C$, $\Bbb Z$, $\Bbb Q_p$, or $\Bbb Z/2022\Bbb Z$.


In the case of $\Bbb C$, the conclusion and the proof are literally the same. As long as you do not use the positivity of numbers (for example, the fact that the square of a nonzero number is positive, or the Archimedean property), a proof for $\Bbb R$ is usually automatically valid for $\Bbb C$. So we could say that the case of $\Bbb C$ is trivially true once we have proved it for $\Bbb R$. (Of course, we should double-check that every relevant concept and every step is valid over $\Bbb C$. I have.)
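
For what it is worth, here is a quick numerical illustration of the complex symmetric (non-Hermitian) case, a minimal sketch assuming NumPy is available; the random entries are arbitrary, and the printed gaps are evidence rather than a proof:

```python
import numpy as np

# Numerical illustration (not a proof): for a random complex symmetric
# (non-Hermitian) tridiagonal matrix, successive leading principal submatrices
# share no eigenvalue, i.e. P_i and P_{i+1} have no common root.
rng = np.random.default_rng(1)
n = 8
a = rng.normal(size=n) + 1j * rng.normal(size=n)
b = rng.normal(size=n - 1) + 1j * rng.normal(size=n - 1)  # all b_i nonzero (a.s.)

def T(k):
    """Leading principal k-by-k submatrix T_k (symmetric, not Hermitian)."""
    return np.diag(a[:k]) + np.diag(b[:k - 1], 1) + np.diag(b[:k - 1], -1)

for i in range(1, n):
    roots_i = np.linalg.eigvals(T(i))         # roots of P_i
    roots_next = np.linalg.eigvals(T(i + 1))  # roots of P_{i+1}
    gap = min(abs(x - y) for x in roots_i for y in roots_next)
    print(f"i = {i}: min distance between roots of P_i and P_(i+1) = {gap:.3e}")
```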


In the case of a general commutative ring $\mathfrak R$, we first need to define all the relevant concepts over $\mathfrak R$, such as the determinant of a matrix over $\mathfrak R$. This is easy to do, although it might take a while to become familiar and comfortable with these concepts. Then all the steps are just as easy as in the case of $\Bbb R$, except in two places.

  • The recurrence for $P_j(\lambda)$ still holds over $\mathfrak R$. It can be proven directly from the definition of the determinant as the algebraic sum of $n!$ products (see the sketch after this list).

  • The Cayley–Hamilton theorem, which says that every square matrix satisfies its own characteristic equation, also holds over $\mathfrak R$. This fact is proven here.

    This theorem ensures that any eigenvalue is a root of the characteristic equation.
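
As a sketch of the first point: treating the entries as indeterminates, the recurrence is a polynomial identity in $\Bbb Z[a_1,\dots,a_n,b_1,\dots,b_{n-1},\lambda]$, so it specializes to any commutative ring with identity. The following SymPy check (with the arbitrary small choice $n = 4$; the symbol $x$ stands for $\lambda$) verifies the identity symbolically:

```python
import sympy as sp

# Sketch: with indeterminate entries, the recurrence is a polynomial identity,
# hence it holds over any commutative ring with identity.  n = 4 is an
# arbitrary small illustrative choice.
n = 4
x = sp.symbols('x')                 # stands for lambda
a = sp.symbols(f'a1:{n + 1}')       # a_1, ..., a_n
b = sp.symbols(f'b1:{n}')           # b_1, ..., b_{n-1}

def T(k):
    """Leading principal k-by-k submatrix with symbolic entries."""
    M = sp.zeros(k, k)
    for i in range(k):
        M[i, i] = a[i]
    for i in range(k - 1):
        M[i, i + 1] = M[i + 1, i] = b[i]
    return M

# P_0 = 1, P_1 = x - a_1, P_j = (x - a_j) P_{j-1} - b_{j-1}^2 P_{j-2}.
P = [sp.Integer(1), x - a[0]]
for j in range(2, n + 1):
    P.append(sp.expand((x - a[j - 1]) * P[-1] - b[j - 2] ** 2 * P[-2]))

# Compare with det(x*I - T_k), expanded as a polynomial.
print(all(sp.expand((x * sp.eye(k) - T(k)).det() - P[k]) == 0
          for k in range(1, n + 1)))  # True
```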
