Linear Algebra: Find four unit vectors in $\mathbb{R}^3$ with the same angle between each pair.

linear-algebra, symmetric-matrices

I need to determine the set of all angles $\theta$ such that there exist four distinct unit vectors $\vec{v}_1, \dots, \vec{v}_4 \in \mathbb{R}^3$ with the angle between $\vec{v}_i$ and $\vec{v}_j$ equal to $\theta$ whenever $i\ne j$. I cannot geometrically picture an instance of this, although I can easily see examples for three distinct vectors: given two vectors, you can add a third so that its angle with each of the first two equals the angle between them. But once I have three vectors with equal pairwise angles, I can't see any way to rearrange them or place a fourth so that all pairs share the same angle.

The dot product seems relevant, since any of these vectors dotted with itself is $1$ and dotted with any other vector is $\cos\theta$. I've seen the trick enough times to suspect that I should form the matrix $A^TA$, where the columns of $A$ are the vectors. Then

$$A^TA = \begin{pmatrix}
1 & \cos\theta & \cos\theta & \cos\theta \\
\cos\theta & 1 & \cos\theta & \cos\theta \\
\cos\theta & \cos\theta & 1 & \cos\theta \\
\cos\theta & \cos\theta & \cos\theta & 1
\end{pmatrix}$$

That is to say, 1s down the main diagonal, $\cos\theta$ everywhere else. It's a symmetric matrix, but I'm not sure what to do with it. I could take the determinant but I'm not sure what that would give me.
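To make the three-vector case concrete, here is a quick numerical check (just an illustrative sketch using numpy; the variable names are my own): three unit vectors in the plane, $120^{\circ}$ apart, produce exactly the matrix shape described above.

```python
import numpy as np

# Three unit vectors in the plane, 120 degrees apart; the columns of A are the vectors
theta = 2 * np.pi / 3
A = np.array([[np.cos(k * theta) for k in range(3)],
              [np.sin(k * theta) for k in range(3)]])

G = A.T @ A
print(np.round(G, 3))   # 1s on the diagonal, cos(120 deg) = -0.5 everywhere else
```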

Best Answer

The approach you guessed actually works. With $x:=\cos(\theta)$, an eigendecomposition of the inner product matrix $G=G_{\theta}$ you wrote down (which is typically called the Gram matrix, by the way) is

$$ G= \begin{pmatrix} 1 &1 &0 &0\\ 1 &-1 &1 &0\\ 1 &0 &-1 &1 \\ 1 &0 &0 &-1 \end{pmatrix} \begin{pmatrix} 1+3x & 0 & 0 & 0\\ 0 & 1-x & 0 & 0\\ 0 & 0 & 1-x & 0 \\ 0 & 0 & 0 & 1-x \end{pmatrix} \begin{pmatrix} 1 &1 &0 &0\\ 1 &-1 &1 &0\\ 1 &0 &-1 &1 \\ 1 &0 &0 &-1 \end{pmatrix}^{-1} $$

If there exists a matrix $A\in\mathbb{R}^{3\times 4}$ such that $G=A^{\top}A$, then $G$ can have rank at most $3$. Since the left and right matrices above are independent of $x$ and have full rank, the middle matrix must have rank at most $3$ as well, i.e. at least one of its diagonal entries must vanish. This happens only when $x=1$ (a trivial case I leave to you) or when $x=-1/3$. This already proves the first part:

If a solution exists, we must have $x=-1/3$, i.e. $\theta=\arccos(-1/3)\approx 109.5^{\circ}$.
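If you want to sanity-check the rank argument numerically, here is a small sketch (the helper `gram` and the sample values of $x$ are just illustrative):

```python
import numpy as np

def gram(x, n=4):
    """Gram matrix with 1s on the diagonal and x = cos(theta) everywhere else."""
    return (1 - x) * np.eye(n) + x * np.ones((n, n))

# Eigenvalues are 1 + 3x (once) and 1 - x (three times);
# the rank drops below 4 only at x = -1/3 or x = 1.
for x in (-1/3, 0.0, 0.5, 1.0):
    G = gram(x)
    print(f"x = {x:+.3f}  eigenvalues = {np.round(np.linalg.eigvalsh(G), 3)}"
          f"  rank = {np.linalg.matrix_rank(G)}")
```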

To prove that a solution exists, i.e. a matrix $A$ such that $G_{109.5^{\circ}}=A^{\top}A$, note that for $x=-1/3$ the matrix $G$ is positive semi-definite of rank $3$. Such matrices admit a decomposition $A^{\top}A$ with $A\in \mathbb{R}^{3\times 4}$; you could use the singular value decomposition to prove this. $\Box$
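If you just want to see such an $A$ numerically, one sketch of the SVD route (using numpy, as one possible tool) is:

```python
import numpy as np

x = -1/3
G = (1 - x) * np.eye(4) + x * np.ones((4, 4))   # the Gram matrix at x = -1/3

# G is symmetric positive semi-definite of rank 3, so its SVD has three
# nonzero singular values; keeping only those gives a 3x4 factor A.
U, s, Vt = np.linalg.svd(G)
A = np.diag(np.sqrt(s[:3])) @ Vt[:3]

print(np.allclose(A.T @ A, G))   # True: the columns of A are the four unit vectors
```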

Practically, you can find $A$ as follows:

Write the eigenvector matrix from above as $$ (v_1|V_2), $$ where $v_1=(1,1,1,1)^{\top}$ and $V_2$ is the remaining $4\times 3$ part. Find an orthonormal basis of the column space of $V_2$ to get a new matrix $\tilde{V_2}$ of eigenvectors for the eigenvalue $1-x$. Since $$ M=(v_1/2\,|\,\tilde{V_2}) $$ is an orthogonal matrix, you have $$ G=MDM^{-1}=MDM^{\top}, $$ with $D$ the middle matrix from above. Since the first entry of $D$ is zero at $x=-1/3$, this reduces to $$ G=\tilde{V_2}\tilde{D}\tilde{V_2}^{\top}=(\tilde{V_2}\sqrt{\tilde{D}})(\tilde{V_2}\sqrt{\tilde{D}})^{\top}, $$ where $\tilde{D}=(1-x)I_3$ is the lower-right block of $D$, and you can read off the solution $A:=(\tilde{V_2}\sqrt{\tilde{D}})^{\top}=\sqrt{\tilde{D}}\,\tilde{V_2}^{\top}$.
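Here is a short numerical sketch of that construction (QR is used only as one convenient way to orthonormalize the columns of $V_2$; the variable names are mine):

```python
import numpy as np

x = -1/3
G = (1 - x) * np.eye(4) + x * np.ones((4, 4))

# V2: the last three eigenvector columns from above (they span the (1-x)-eigenspace)
V2 = np.array([[ 1.,  0.,  0.],
               [-1.,  1.,  0.],
               [ 0., -1.,  1.],
               [ 0.,  0., -1.]])

# Orthonormalize the columns of V2, e.g. via a QR decomposition
V2_tilde, _ = np.linalg.qr(V2)

# D_tilde = (1 - x) I_3, so sqrt(D_tilde) V2_tilde^T is just sqrt(1 - x) V2_tilde^T
A = np.sqrt(1 - x) * V2_tilde.T                  # 3x4; its columns are the four unit vectors

print(np.allclose(A.T @ A, G))                   # True
print(np.round(np.linalg.norm(A, axis=0), 3))    # each column has length 1
print(np.round(A[:, 0] @ A[:, 1], 3))            # pairwise dot products are -1/3
```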
