[Math] Diagonalization of a bisymmetric matrix

diagonalization, eigenvalues-eigenvectors, matrices, symmetric matrices, toeplitz-matrices

Is there some way to easily diagonalize an $n \times n$ bisymmetric Toeplitz matrix with only zeros on its main diagonal? Direct calculation is out of the question. I need some trick.


Addendum: I don't know why I didn't do it before, but I can write the matrix down.

$$A= \begin{pmatrix}
0 & 1 & 1/2 & 1/3 & \cdots & 1/(n-1) \\
1 & 0 & 1 & 1/2 & \cdots & 1/(n-2) \\
1/2 & 1 & 0 & 1 & \cdots & 1/(n-3) \\
\vdots & & \vdots & \ & \ddots & \vdots \\
1/(i-1) & \cdots & 1/|i-j| &\cdots& & 1/(n-i) \\
\vdots & & \vdots & \ & & \vdots \\
1/(n-1) & \cdots & 1/3 & 1/2 & 1 & 0\\
\end{pmatrix}$$

or, simpler yet

$$A_{i,j}=\begin{cases}
0 & i=j \\
\frac{1}{|i-j|} & i\neq j
\end{cases}
\qquad i,j=1,\dots,n$$
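As a quick numerical sanity check of this definition (the helper name `bisymmetric_toeplitz` is mine), the matrix is symmetric and persymmetric, hence bisymmetric:

```python
import numpy as np

def bisymmetric_toeplitz(n):
    """n x n matrix with zeros on the diagonal and 1/|i-j| elsewhere."""
    i, j = np.indices((n, n))
    d = np.abs(i - j)
    A = np.zeros((n, n))
    A[d > 0] = 1.0 / d[d > 0]
    return A

A = bisymmetric_toeplitz(6)
J = np.fliplr(np.eye(6))            # order-reversing permutation
assert np.allclose(A, A.T)          # symmetric
assert np.allclose(A, J @ A.T @ J)  # persymmetric, hence bisymmetric
```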

Best Answer

A real symmetric matrix has a full basis of orthonormal eigenvectors, so given $A$ as defined in the question, there exists a real orthogonal matrix $U$ such that $U^T A U$ is diagonal. For reasons outlined here, the practical methods for computing eigenvalues and eigenvectors of matrices are iterative rather than "direct" (in the sense of giving an explicit formula).
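In NumPy this orthogonal diagonalization is a single call; a minimal sketch on an arbitrary real symmetric matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
S = rng.standard_normal((5, 5))
A = S + S.T                      # any real symmetric matrix
w, U = np.linalg.eigh(A)         # eigenvalues (ascending) and orthonormal eigenvectors
assert np.allclose(U.T @ U, np.eye(5))        # U is orthogonal
assert np.allclose(U.T @ A @ U, np.diag(w))   # U^T A U is diagonal
```

Under the hood `eigh` does exactly the tridiagonalize-then-iterate procedure described below.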

The premise of the SciComp.SE question linked above is that iterative methods for the real symmetric eigenvalue problem typically begin by reducing the matrix to a similar (real symmetric) tridiagonal form using Householder transformations or, in some cases, Givens rotations, another type of real orthogonal matrix. Orthogonal matrices are desirable not merely because their inverses come cheaply (take the transpose) but because their numerical error properties are advantageous: multiplication by an orthogonal matrix preserves the length of a vector, which also helps control the growth of rounding errors.

While the tridiagonalization process is direct, getting from there to a diagonal matrix is where iterative methods come into play. The main benefit is that instead of working with a "full" matrix, which requires $O(n^2)$ operations to perform matrix-vector multiplication, in tridiagonal form matrix-vector multiplies cost only $O(n)$.
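As an illustration of this reduction, `scipy.linalg.hessenberg` performs the Householder reduction; for a symmetric input the Hessenberg form is exactly the tridiagonal form described above (a sketch on the matrix from the question, built inline):

```python
import numpy as np
from scipy.linalg import hessenberg

n = 6
i, j = np.indices((n, n))
A = np.where(i == j, 0.0, 1.0 / np.maximum(np.abs(i - j), 1))

# Householder reduction: Q is orthogonal and T = Q.T @ A @ Q; for a
# symmetric A the upper Hessenberg form T is in fact tridiagonal.
T, Q = hessenberg(A, calc_q=True)
assert np.allclose(Q.T @ A @ Q, T, atol=1e-10)
assert np.allclose(np.triu(T, 2), 0, atol=1e-10)   # zero above the superdiagonal
assert np.allclose(np.tril(T, -2), 0, atol=1e-10)  # zero below the subdiagonal
# similarity preserves the spectrum
assert np.allclose(np.linalg.eigvalsh(T), np.linalg.eigvalsh(A))
```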

When working with a Toeplitz matrix, matrix-vector multiplication can be done in $O(n \ln n)$ time by embedding the $n \times n$ Toeplitz matrix in a $2n \times 2n$ circulant matrix, padding the vector with zeros and using FFT.
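A sketch of that circulant embedding (the helper name `toeplitz_matvec` is mine; `scipy.linalg.toeplitz` is used only to check against the dense $O(n^2)$ multiply):

```python
import numpy as np
from scipy.linalg import toeplitz

def toeplitz_matvec(c, r, x):
    """O(n log n) product of the Toeplitz matrix with first column c and
    first row r (r[0] == c[0]) with vector x, by embedding it in a
    2n x 2n circulant matrix, zero-padding x, and using the FFT."""
    n = len(x)
    # first column of the circulant embedding: c, a free entry, then
    # the first row reversed (excluding its first element)
    col = np.concatenate([c, [0.0], r[:0:-1]])          # length 2n
    y = np.fft.ifft(np.fft.fft(col) * np.fft.fft(np.r_[x, np.zeros(n)]))
    return y[:n].real

# check on the matrix from the question
n = 8
c = np.r_[0.0, 1.0 / np.arange(1, n)]   # first column: 0, 1, 1/2, ...
x = np.random.default_rng(0).standard_normal(n)
assert np.allclose(toeplitz_matvec(c, c, x), toeplitz(c) @ x)
```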

An approach of that kind preserves the Toeplitz structure of the original matrix, and so can be called a structure-preserving algorithm. Structure-preserving algorithms for bisymmetric eigenproblems have been studied, e.g. Structure Preserving Algorithms for Perplectic Eigenproblems. For now we restrict ourselves to describing the "standard trick" mentioned in my comment for transforming a $2n \times 2n$ bisymmetric matrix into an orthogonally similar matrix with two $n \times n$ blocks on the diagonal.

Let $ \def\iddots{ {\kern3mu\raise1mu{.}\kern3mu\raise6mu{.}\kern3mu\raise12mu{.}}} P = \begin{pmatrix} 0 & \cdots & 1 \\ \vdots & \iddots & \vdots \\ 1 & \cdots & 0 \end{pmatrix}$ be the order-reversing $n \times n$ permutation matrix, clearly symmetric as well as orthogonal. Then define $U = \frac{1}{\sqrt{2}}\begin{pmatrix} I & P \\ -P & I \end{pmatrix}$, an orthogonal $2n \times 2n$ matrix.

The bisymmetric $2n \times 2n$ matrix $A$ has the form $\begin{pmatrix} M & B \\ B^T & PMP \end{pmatrix}$, where $M$ is symmetric and $PB = B^T P$, i.e. $PB$ is symmetric. Then:

$$ U^T A U = \begin{pmatrix} M - BP & 0 \\ 0 & PMP + PB \end{pmatrix} $$

Thus diagonalizing $A$ amounts to diagonalizing each of the half-sized symmetric blocks of this reduced form.
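A numerical sketch of the trick on the matrix from the question (the variable names are mine; `np.fliplr(np.eye(n))` builds $P$):

```python
import numpy as np

n = 5                        # half-size; the full matrix is 2n x 2n
N = 2 * n
i, j = np.indices((N, N))
A = np.where(i == j, 0.0, 1.0 / np.maximum(np.abs(i - j), 1))

P = np.fliplr(np.eye(n))     # order-reversing permutation matrix
I = np.eye(n)
U = np.block([[I, P], [-P, I]]) / np.sqrt(2)

M = A[:n, :n]
B = A[:n, n:]
S = U.T @ A @ U
# the off-diagonal blocks vanish ...
assert np.allclose(S[:n, n:], 0, atol=1e-12)
assert np.allclose(S[n:, :n], 0, atol=1e-12)
# ... and the diagonal blocks are M - BP and PMP + PB
assert np.allclose(S[:n, :n], M - B @ P)
assert np.allclose(S[n:, n:], P @ M @ P + P @ B)
# the spectrum of A is the union of the spectra of the two half-sized blocks
w = np.sort(np.r_[np.linalg.eigvalsh(M - B @ P),
                  np.linalg.eigvalsh(P @ M @ P + P @ B)])
assert np.allclose(w, np.linalg.eigvalsh(A))
```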

Added:

Below are some computations of eigenvalues for small sizes, showing how the eigenvalues interlace as $n$ increases:

    n=6        n=7        n=8        n=9        n=10

                                              3.945471
                                   3.740494  
                        3.512377              1.559856
             3.223818              1.361472  
  2.960929              1.142162              0.604773
             0.930378              0.414335  
  0.620470              0.205781             -0.024785
             0.005202             -0.203630  
 -0.278546             -0.396355             -0.455340
            -0.666005             -0.619805  
 -0.824663             -0.792701             -0.776527
            -0.949686             -0.922321  
 -1.149049             -1.069249             -1.010681
            -1.184892             -1.134022  
 -1.329141             -1.249268             -1.181594
            -1.358814             -1.277188  
                       -1.352748             -1.296923
                                  -1.359335  
                                             -1.364250
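The interlacing visible above is an instance of Cauchy's interlacing theorem, since the $n \times n$ matrix is a leading principal submatrix of the $(n+1) \times (n+1)$ one. A NumPy check (the helper name `eigs` is mine):

```python
import numpy as np

def eigs(n):
    """Ascending eigenvalues of the n x n matrix from the question."""
    i, j = np.indices((n, n))
    A = np.where(i == j, 0.0, 1.0 / np.maximum(np.abs(i - j), 1))
    return np.linalg.eigvalsh(A)

# Cauchy interlacing: the eigenvalues of the n x n principal submatrix
# separate those of the (n+1) x (n+1) matrix.
for n in range(6, 10):
    small, big = eigs(n), eigs(n + 1)
    assert np.all(big[:-1] <= small + 1e-12)
    assert np.all(small <= big[1:] + 1e-12)
```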