[Math] Anti-bidiagonal matrix with main anti-diagonal {1,2,3,…} and first sub-anti-diagonal {-1,-2,-3,…} has eigenvalues lambda={1,-2,3,-4,…}

eigenvalues · linear algebra · matrices

Consider the anti-bidiagonal matrix $B_6\in\mathbb{R}^{6\times 6}$, defined along its anti-diagonals as follows

$$
B_6=\begin{bmatrix} & & & & & 6\\
& & & & 5 & -5\\
& & & 4 & -4\\
& & 3 & -3\\
& 2 & -2\\
1 & -1
\end{bmatrix}.
$$

Its eigenvalues are $\lambda(B_6)=\{1,-2,3,-4,5,-6\}$, a fact easily verified numerically, e.g. in MATLAB.
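For reference, here is a minimal NumPy sketch of this verification (offered as an alternative to MATLAB; the constructor name `anti_bidiagonal` is mine, not from the post):

```python
import numpy as np

def anti_bidiagonal(n):
    """B(i, n+1-i) = n+1-i and B(i, n+2-i) = -(n+1-i), in 1-based indexing."""
    B = np.zeros((n, n))
    for i in range(1, n + 1):
        B[i - 1, n - i] = n + 1 - i              # main anti-diagonal: n, n-1, ..., 1
        if i > 1:
            B[i - 1, n + 1 - i] = -(n + 1 - i)   # sub-anti-diagonal: -(n-1), ..., -1
    return B

for n in range(2, 13):
    lam = np.linalg.eigvals(anti_bidiagonal(n))
    assert np.allclose(lam.imag, 0)              # the spectrum comes out real
    claimed = sorted((-1) ** (i + 1) * i for i in range(1, n + 1))
    assert np.allclose(np.sort(lam.real), claimed)
print("pattern holds for n = 2, ..., 12")
```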

Furthermore, one can numerically verify that this pattern persists for larger $n$ (the sketch above checks sizes up to $n=12$). With this, I am looking for a proof of the following statement.

Proposition 1. For a matrix $B_n$ defined as above, the eigenvalues are given by $$\lambda(B_n)=\{(-1)^{i+1}i\}_{i=1}^n\equiv\{1,-2,3,-4,\ldots\}.$$

This has turned out to be surprisingly difficult. It is tempting to predict the eigenvalues by reading off the anti-diagonals, but the matrix is genuinely not triangular, nor does it share many properties with triangular matrices. Indeed, it is easy to construct counterexamples where the eigenvalues do not coincide with the anti-diagonal entries; for instance, $\begin{bmatrix}0 & 2\\ 1 & 0\end{bmatrix}$ has eigenvalues $\pm\sqrt{2}$.

The ramification of this statement is that it characterizes an entire class of graphs as "well-connected", and the corresponding class of linear algebra problems as well-conditioned irrespective of problem size. The fact that the eigenvalues are integers is quite significant, since it implies that the solutions to the integer and linear programming versions of the associated problem coincide, and this can be exploited to prove several graph results. I can give more background if necessary.

Some candidate approaches I have tried:

  1. Recursive characteristic polynomial. It is routine to write down a recursive formula for $\det(B_n - \lambda I_n)$. However, it is difficult to find its roots without resorting to variations on Newton iterations.

  2. Golub-Kahan form. One may reorder the rows and columns to yield a tridiagonal matrix with an (almost) zero main diagonal. More specifically, let $e_i$ denote the $i$-th column of the identity, define a similarity transform $E=\begin{bmatrix} e_1 & e_n & e_2 & e_{n-1} & \ldots\end{bmatrix}$, and consider $\hat{B}=E^TB_nE$; see the sketch after this list. (The diagonal cannot be made exactly zero: $B_n$ has exactly one nonzero entry on its main diagonal, and a permutation similarity merely permutes diagonal entries.) However, the characteristic polynomial of $\hat{B}$ is a mess, and we encounter the same problem when trying to locate its roots.

  3. Matrix inverse. The matrix $B_6$ has the following inverse $$B_{6}^{-1}=\begin{bmatrix}\frac{1}{6} & \frac{1}{5} & \frac{1}{4} & \frac{1}{3} & \frac{1}{2} & 1\\
    \frac{1}{6} & \frac{1}{5} & \frac{1}{4} & \frac{1}{3} & \frac{1}{2}\\
    \frac{1}{6} & \frac{1}{5} & \frac{1}{4} & \frac{1}{3}\\
    \frac{1}{6} & \frac{1}{5} & \frac{1}{4}\\
    \frac{1}{6} & \frac{1}{5}\\
    \frac{1}{6}
    \end{bmatrix}.$$ A proof of this is simply Gaussian elimination of a triangle-like matrix (the sketch after this list also checks it numerically). Thus Proposition 1 can be reworded as $\lambda(B_6^{-1})=\{1,-1/2,1/3,-1/4,1/5,-1/6\}$. It does not appear to be any easier to obtain the eigenvalues of $B^{-1}_n$; indeed, its characteristic polynomial is even harder to write down.

  4. Interpretation as a difference operator. Finally, the matrix can be viewed as a difference operator, and in the limit $n\to\infty$ its eigenfunctions are orthogonal polynomials. This approach yields a proof, but it says nothing about small $n$, e.g. $n=4$ or $5$.
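To make approaches 2 and 3 above concrete, here is a small numerical sketch under the same assumed constructor as before; the names `order` and `Minv` are illustrative. It checks that $\hat{B}$ is tridiagonal with a single surviving nonzero diagonal entry, and that the inverse pattern generalizes as $B_n^{-1}(i,j)=1/(n+1-j)$ for $j\le n+1-i$:

```python
import numpy as np

def anti_bidiagonal(n):
    # Same constructor as in the earlier snippet.
    B = np.zeros((n, n))
    for i in range(1, n + 1):
        B[i - 1, n - i] = n + 1 - i
        if i > 1:
            B[i - 1, n + 1 - i] = -(n + 1 - i)
    return B

n = 6
B = anti_bidiagonal(n)

# Approach 2: perfect-shuffle permutation E = [e_1, e_n, e_2, e_{n-1}, ...].
order, lo, hi = [], 1, n
while lo <= hi:
    order.append(lo)
    if lo != hi:
        order.append(hi)
    lo, hi = lo + 1, hi - 1
E = np.eye(n)[:, [k - 1 for k in order]]
Bhat = E.T @ B @ E

# Tridiagonal, and the diagonal is zero except for the one entry of B_n
# that already sits on the main diagonal (here B(4,4) = -3).
assert np.allclose(np.triu(Bhat, 2), 0) and np.allclose(np.tril(Bhat, -2), 0)
assert np.count_nonzero(np.abs(np.diag(Bhat)) > 1e-12) == 1

# Approach 3: anti-triangular inverse with harmonic entries (1-based formula).
Minv = np.array([[1.0 / (n + 1 - j) if j <= n + 1 - i else 0.0
                  for j in range(1, n + 1)]
                 for i in range(1, n + 1)])
assert np.allclose(np.linalg.inv(B), Minv)
```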

Any advice or suggestions?

Best Answer

This problem is essentially the same as this one. In particular, let $J$ be the anti-diagonal identity matrix, and let $P^{-1}$ be the matrix mentioned in the link above. Then the matrix in the current post is nothing but \begin{equation*} B_n = JP^{-T}J^T. \end{equation*} Since $J^TJ=I$, this is a similarity transform, and we can recover the eigenvectors and eigenvalues of $B_n$ from the derivation for $P$ in the linked post.
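Spelling out that last step (here $v$ is a generic eigenvector, not notation from the linked post): the relation $J^TJ=I$ makes $A\mapsto JAJ^T$ a similarity transform, so $$P^{-T}v=\lambda v\quad\Longrightarrow\quad B_n(Jv)=JP^{-T}J^TJv=JP^{-T}v=\lambda\,(Jv).$$ Hence $\lambda(B_n)=\lambda(P^{-T})=\lambda(P^{-1})$, and each eigenvector $v$ of $P^{-T}$ maps to an eigenvector $Jv$ of $B_n$.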
