Eigenvalues – Symmetric Tridiagonal Matrices

Tags: eigenvalues, linear-algebra, matrices, tridiagonal-matrices

Suppose I have the symmetric tridiagonal matrix:

$ \begin{pmatrix}
a & b_{1} & 0 & \cdots & 0 \\
b_{1} & a & b_{2} & & \vdots \\
0 & b_{2} & a & \ddots & 0 \\
\vdots & & \ddots & \ddots & b_{n-1} \\
0 & \cdots & 0 & b_{n-1} & a
\end{pmatrix} $

All of the entries can be taken to be positive real numbers, and all of the diagonal entries are equal to $a$. I know that when the $b_{i}$ are all equal (the uniform case), there are closed-form expressions for the eigenvalues and eigenvectors in terms of cosine and sine functions. Additionally, I know of the recurrence relation for the determinants of the leading principal submatrices:
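For the uniform case the closed form is $\lambda_k = a + 2b\cos\frac{k\pi}{n+1}$ for $k = 1, \dots, n$. A quick numerical sanity check (a sketch using NumPy; the values of $n$, $a$, and $b$ here are arbitrary):

```python
import numpy as np

n, a, b = 6, 2.0, 1.5

# Uniform symmetric tridiagonal matrix: a on the diagonal, b off-diagonal
A = np.diag([a] * n) + np.diag([b] * (n - 1), 1) + np.diag([b] * (n - 1), -1)

# Closed-form eigenvalues: lambda_k = a + 2b*cos(k*pi/(n+1)), k = 1..n
k = np.arange(1, n + 1)
closed_form = a + 2 * b * np.cos(k * np.pi / (n + 1))

numeric = np.linalg.eigvalsh(A)  # returns eigenvalues in ascending order
print(np.allclose(np.sort(closed_form), numeric))  # True
```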

$\det(A_{n}) = a\cdot \det(A_{n-1}) - b_{n-1}^{2}\cdot \det(A_{n-2})$
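This recurrence is easy to verify numerically; a sketch (assuming NumPy, with arbitrarily chosen positive $b_i$):

```python
import numpy as np

rng = np.random.default_rng(0)
n, a = 7, 2.0
b = rng.uniform(0.5, 2.0, n - 1)  # positive off-diagonal entries b_1..b_{n-1}

def A(m):
    """Leading m x m principal submatrix of the tridiagonal matrix."""
    M = np.diag([a] * m)
    for i in range(m - 1):
        M[i, i + 1] = M[i + 1, i] = b[i]
    return M

# det(A_n) = a*det(A_{n-1}) - b_{n-1}^2 * det(A_{n-2})
lhs = np.linalg.det(A(n))
rhs = a * np.linalg.det(A(n - 1)) - b[n - 2] ** 2 * np.linalg.det(A(n - 2))
print(np.isclose(lhs, rhs))  # True
```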

Additionally, since my matrix is real-symmetric, I know that its eigenvalues are real.

Is there anything else I can determine about the eigenvalues? Furthermore, is there a closed-form expression for them?

Best Answer

The type of matrix you have written down is called a Jacobi matrix, and people are still discovering new things about them; their properties basically fill entire bookcases at a mathematics library. One of the reasons is the connection to orthogonal polynomials. Basically, if $\{p_n(x)\}_{n\geq 0}$ is a family of orthogonal polynomials, then they obey a three-term recursion relation of the form $$ b_n p_{n+1}(x) + (a_n- x) p_n(x) + b_{n-1} p_{n-1}(x) = 0. $$ You should be able to recognize the form of your matrix from this.
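As a concrete instance of that recursion: the Chebyshev polynomials of the second kind $U_n$ satisfy it with $a_n = 0$ and $b_n = \tfrac12$, i.e. $U_{n+1}(x) = 2x\,U_n(x) - U_{n-1}(x)$. A small check against the closed form $U_n(\cos t) = \sin((n+1)t)/\sin t$ (a sketch; the evaluation point is arbitrary):

```python
import numpy as np

x = 0.3  # arbitrary point in (-1, 1)

# Three-term recursion: U_{n+1}(x) = 2x * U_n(x) - U_{n-1}(x)
U = [1.0, 2 * x]  # U_0(x), U_1(x)
for _ in range(5):
    U.append(2 * x * U[-1] - U[-2])

# Closed form: U_n(cos t) = sin((n+1) t) / sin(t)
t = np.arccos(x)
closed = [np.sin((n + 1) * t) / np.sin(t) for n in range(len(U))]
print(np.allclose(U, closed))  # True
```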

As far as general properties of the eigenvalues, let me mention two:

  1. The eigenvalues are simple. In fact one has $\lambda_j - \lambda_{j-1} \geq e^{-c n}$, where $c$ is some constant that depends on the $b_j$.

  2. The eigenvalues of $A_n$ and its leading principal submatrix $A_{n-1}$ interlace.
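Both properties are easy to observe numerically. A sketch (assuming NumPy) that checks simplicity and the strict Cauchy interlacing of $A_n$ with its leading $(n-1)\times(n-1)$ principal submatrix:

```python
import numpy as np

rng = np.random.default_rng(1)
n, a = 8, 3.0
b = rng.uniform(0.1, 2.0, n - 1)  # positive off-diagonal entries

A = np.diag([a] * n) + np.diag(b, 1) + np.diag(b, -1)
lam = np.linalg.eigvalsh(A)           # eigenvalues of A_n, ascending
mu = np.linalg.eigvalsh(A[:-1, :-1])  # eigenvalues of A_{n-1}

# 1. Simplicity: all consecutive gaps are strictly positive
print(np.all(np.diff(lam) > 0))  # True

# 2. Interlacing: lam_1 < mu_1 < lam_2 < ... < mu_{n-1} < lam_n
print(np.all((lam[:-1] < mu) & (mu < lam[1:])))  # True
```

The interlacing is strict here precisely because the $b_i$ are positive, which is what forces the eigenvalues to be simple.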
