Linear Algebra – Finding Lengths of Principal Axes of Ellipsoid

Tags: geometry, linear-algebra, quadratic-forms, spectral-theory

Find the lengths of the principal axes of the ellipsoid $$\sum_{i \leq j} x_ix_j = 1.$$

— Arnold, Trivium 85

My solution is below. I request verification, feedback, or alternate approaches (especially ways to simplify).


Solution: In $\mathbb R^n$, the ellipsoid $\sum_{i \leq j} x_ix_j = 1$ has one axis of length $\sqrt {\frac 8 {n+1}}$ and $n-1$ axes of length $2\sqrt 2$.

Proof: Recall that if $D$ is a diagonal matrix with positive entries, then $$\mathbf x^\top D \mathbf x = 1$$ is an ellipsoid in standard position, whose $i$-th axis has length $\frac 2 {\sqrt {D_{ii}}}$.

Expanding the quadratic form shows that the ellipsoid $\sum_{i \leq j} x_ix_j = 1$ has equation $$\mathbf x^\top S \mathbf x = 1$$ where $$S_{ij} = \begin{cases}1 &\text{ if } i = j\\ \frac 1 2 &\text{ otherwise},\end{cases}$$ since each cross term $x_ix_j$ with $i < j$ picks up the coefficient $S_{ij} + S_{ji} = 1$.
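A quick numerical sanity check, sketched in NumPy with an arbitrary choice of $n = 5$ (mine, not part of the problem), confirms that this $S$ reproduces the quadratic form:

```python
import numpy as np

n = 5  # arbitrary illustrative dimension

# S has 1's on the diagonal and 1/2's off the diagonal.
S = 0.5 * (np.ones((n, n)) + np.eye(n))

rng = np.random.default_rng(0)
x = rng.standard_normal(n)

# x^T S x should equal sum_{i <= j} x_i x_j.
lhs = x @ S @ x
rhs = sum(x[i] * x[j] for i in range(n) for j in range(i, n))
assert np.isclose(lhs, rhs)
```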

Since $S$ is symmetric, the spectral theorem shows that $S$ has $n$ orthonormal eigenvectors with real eigenvalues, and that $S$ decomposes as $S = Q^\top D Q$, with $Q$ orthogonal and $D$ diagonal. The diagonal entries of $D$ are the eigenvalues of $S$.

Since $Q$ is orthogonal, it preserves lengths, and substituting $\mathbf y = Q \mathbf x$ turns the equation into $\mathbf y^\top D \mathbf y = 1$. Consequently, if the eigenvalues of $S$ are $\lambda_i$, then the ellipsoid's axes have lengths $\frac 2 {\sqrt {\lambda_i}}$.
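One can check the claimed axis lengths numerically; the following NumPy sketch (again with $n = 5$ as an arbitrary choice) computes the eigenvalues of $S$ and the resulting lengths $\frac 2 {\sqrt{\lambda_i}}$:

```python
import numpy as np

n = 5
S = 0.5 * (np.ones((n, n)) + np.eye(n))

# eigvalsh returns the eigenvalues of a symmetric matrix in ascending order.
eigvals = np.linalg.eigvalsh(S)
axis_lengths = 2 / np.sqrt(eigvals)

# Expect n - 1 eigenvalues 1/2 (axes of length 2*sqrt(2))
# and one eigenvalue (n + 1)/2 (axis of length sqrt(8/(n + 1))).
assert np.allclose(eigvals[:-1], 0.5)
assert np.isclose(eigvals[-1], (n + 1) / 2)
assert np.allclose(axis_lengths[:-1], 2 * np.sqrt(2))
assert np.isclose(axis_lengths[-1], np.sqrt(8 / (n + 1)))
```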

Inspection shows that the vector $\boldsymbol\ell$ with all components $1$ is an eigenvector of $S$ with eigenvalue $\frac {n+1} 2$. Inspection likewise shows that for $1 \leq i < n$, the vectors $\mathbf m_i$ with components $$(\mathbf m_i)_j = \begin{cases}
1 - \sqrt n - n &\text{ if } j = i \\
1 + \sqrt n &\text{ if } j = n\\
1 &\text{ otherwise}\end{cases}$$

are $n-1$ further eigenvectors, each with eigenvalue $\frac 1 2$ (their components sum to zero). Since they are pairwise orthogonal and orthogonal to $\boldsymbol\ell$, they are linearly independent and account for the remaining axes, which completes the proof.
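These eigenvector claims are easy to verify numerically; here is a NumPy sketch (with an arbitrary $n = 5$) checking the eigenvalue equations and the orthogonality relations:

```python
import numpy as np

n = 5
S = 0.5 * (np.ones((n, n)) + np.eye(n))

# The all-ones vector ell has eigenvalue (n + 1)/2.
ell = np.ones(n)
assert np.allclose(S @ ell, (n + 1) / 2 * ell)

# Build m_1, ..., m_{n-1} with the components given above.
M = []
for i in range(n - 1):
    m = np.ones(n)
    m[i] = 1 - np.sqrt(n) - n   # component j = i
    m[-1] = 1 + np.sqrt(n)      # component j = n
    M.append(m)

for m in M:
    assert np.allclose(S @ m, 0.5 * m)  # eigenvalue 1/2
    assert np.isclose(m @ ell, 0)       # orthogonal to ell
for i in range(n - 1):
    for j in range(i + 1, n - 1):
        assert np.isclose(M[i] @ M[j], 0)  # pairwise orthogonal
```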

Remark: Writing $a$ for the $i$-th component of $\mathbf m_i$, $b$ for the generic components, and $c$ for the $n$-th component, these values were determined by solving $$\begin{aligned} a + (n-2)b + c &= 0 &&\text{since } \mathbf m_i \cdot \boldsymbol\ell = 0,\\ 2ab + (n-3)b^2 + c^2 &= 0 &&\text{since } \mathbf m_i \cdot \mathbf m_j = 0 \text{ when } i \neq j.\end{aligned}$$
Is there a simpler way to determine them? The fact that the rows of $S$ are cyclic permutations of one another suggests some symmetry among its eigenvalues, and we know their sum since $\operatorname{trace} S = n$, but I could not develop this further.

Best Answer

If you are given a matrix $S$ such that all of its rows are permutations of the same multiset of entries, then it automatically follows that the sum along each row of the matrix is constant.

This means that the vector $(1,...,1)'$ is an eigenvector with eigenvalue equal to the common row sum. Here this eigenvalue is $1 + (n-1)/2 = \frac{n+1}{2}.$
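This is again easy to confirm numerically; a small NumPy sketch (with $n = 5$ chosen arbitrarily) checks the constant row sums and the resulting eigenvector:

```python
import numpy as np

n = 5
S = 0.5 * (np.ones((n, n)) + np.eye(n))

# Every row of S is a permutation of (1, 1/2, ..., 1/2),
# so all row sums equal 1 + (n - 1)/2 = (n + 1)/2 ...
assert np.allclose(S.sum(axis=1), (n + 1) / 2)

# ... and therefore (1, ..., 1)' is an eigenvector with that eigenvalue.
ones = np.ones(n)
assert np.allclose(S @ ones, (n + 1) / 2 * ones)
```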

It can also be seen that

$$ S = \frac{1}{2}\mathbf 1_{n\times n} + \frac{1}{2}I$$ where $\mathbf{1}_{n \times n}$ is the $n\times n $ matrix containing only $1$'s. So it is clear that

$$ \det(S - \frac{1}{2}I) = 0$$ and it is not hard to see that the eigenspace for $\lambda =1/2$ has dimension $n-1$.

Indeed, $S - (1/2)I$ is a matrix all of whose entries equal $1/2$. Therefore the system $(S-(1/2)I)x = 0$ reduces to the single equation $$ (1/2)x_1 + ... + (1/2)x_n = 0,$$ whose solution set is an $(n-1)$-dimensional subspace.
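A numerical check of this rank argument (a NumPy sketch, again with an arbitrary $n = 5$):

```python
import numpy as np

n = 5
S = 0.5 * (np.ones((n, n)) + np.eye(n))

# S - (1/2) I has every entry equal to 1/2, hence rank 1 ...
A = S - 0.5 * np.eye(n)
assert np.allclose(A, 0.5 * np.ones((n, n)))
assert np.linalg.matrix_rank(A) == 1

# ... so its nullspace, the eigenspace for lambda = 1/2, has dimension n - 1.
assert n - np.linalg.matrix_rank(A) == n - 1
```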


Alternatively, since $S$ is symmetric, you know that eigenvectors corresponding to different eigenvalues are orthogonal. Once you have found that $(1,...,1)$ is an eigenvector, this suggests that the other eigenspaces will be subspaces of $x_1 + ... + x_n = 0$, i.e. subspaces of the hyperplane with normal vector $(1,...,1)$. This can be quite useful to know if you are trying to "guess" an eigenvector. In this specific example, though, the whole hyperplane is an eigenspace.

Indeed, take $x$ such that $x_1 + ... + x_n = 0$; then $$ (Sx)_j = x_j + \frac{1}{2} \sum_{i \neq j} x_i = x_j + \frac{1}{2}(-x_j) = \frac{1}{2}x_j,$$ which shows that $x$ is an eigenvector with eigenvalue $1/2$.
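The same computation can be checked numerically by projecting a random vector onto the hyperplane $x_1 + ... + x_n = 0$ (a NumPy sketch with an arbitrary $n = 5$):

```python
import numpy as np

n = 5
S = 0.5 * (np.ones((n, n)) + np.eye(n))

# Take any vector with zero component sum ...
rng = np.random.default_rng(1)
x = rng.standard_normal(n)
x -= x.mean()
assert np.isclose(x.sum(), 0)

# ... and verify that S x = (1/2) x.
assert np.allclose(S @ x, 0.5 * x)
```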
