Decomposition of a matrix occurring in the problem of finding $n+1$ vectors in $\mathbb{R}^n$ with pairwise equal inner products

eigenvalues-eigenvectors, linear-algebra, matrix-decomposition, vector-spaces

I was toying with my intuition that there are always $n+1$ unit vectors in $\mathbb{R}^n$ such that every pair $v_i$, $v_j$ ($i\neq j$) encloses the same angle. Since the vectors are normalized, this is equivalent to $$v_i\cdot v_j=\begin{cases} 1,& i=j, \\ m, & i\neq j, \end{cases}$$ with $|m|<1$. So if $V$ is the $n\times (n+1)$ matrix whose columns are the $v_i$, we have $$M:=V^TV=\begin{pmatrix}1 & m & \cdots &m \\ m&1&\cdots&m\\ \vdots &&\ddots&\vdots \\ m&m&\cdots&1 \end{pmatrix}.$$

$M$ is an $(n+1)\times (n+1)$ matrix and must be singular, since $V$ has rank at most $n$ and hence so does $V^TV$. Each row of $M$ sums to $nm+1$, so the row sums form the vector $(nm+1,\,nm+1,\ldots,\,nm+1)$. For $m=-\frac{1}{n}$ this is the null vector (i.e., $M$ annihilates $(1,\ldots,1)^T$), giving the required dot product $m$ as a function of $n$. E.g., for $n=2$ we get $m=-\frac{1}{2}$, corresponding to the expected angle of $120°=\arccos\left(-\frac{1}{2}\right)$.

Now I'm asking: is there a matrix decomposition that lets me recover $V$ from the product $M$? (I know that $V$ is not unique, since I can freely rotate the vectors $v_i$ about an arbitrarily chosen fixed axis without changing their pairwise dot products.) I tried the spectral decomposition of $M$, but that yields $(n+1)$-dimensional vectors, and the $n$ nonzero eigenvalues of $M$ are all equal to $\frac{n+1}{n}$, which means that any linear combination of eigenvectors for that eigenvalue is again an eigenvector, making the dot products between those eigenvectors rather arbitrary.
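A quick numerical sanity check of these observations, using NumPy and the identity $M = (1-m)I + mJ$ (where $J$ is the all-ones matrix):

```python
import numpy as np

# For m = -1/n, the matrix M = (1-m) I + m J should be singular with
# rank n, and its n nonzero eigenvalues should all equal (n+1)/n.
n = 4
m = -1.0 / n
M = (1 - m) * np.eye(n + 1) + m * np.ones((n + 1, n + 1))

eigvals = np.sort(np.linalg.eigvalsh(M))
print(np.linalg.matrix_rank(M))  # n, so M is singular
print(eigvals)                   # one ~0 eigenvalue, the rest (n+1)/n
```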

Best Answer

In ${\mathbb R}^n$, having $n+1$ unit vectors such that the angle between any two of them is the same
implies that the vectors are the position vectors of the vertices of a regular $n$-simplex inscribed in the unit sphere.
That means that the cosine of the angle (your $m$) must be $-1/n$.

$n$ of the vectors will be linearly independent, while the $(n+1)$-th one will necessarily depend on them.
Therefore the $(n+1) \times (n+1)$ matrix $$ M = V^{\,T} V $$ has null determinant.
But an $n \times n$ principal submatrix $M'$ of it will be full-rank, symmetric (Hermitian) and positive definite, being the Gram matrix of $n$ independent vectors.
So it admits a Cholesky decomposition $$ M' = L\,L^{\,T}, $$ which is unique, and therefore we shall have $$ M' = L\,L^{\,T} = V'^{\,T} V' \quad \Rightarrow \quad V' = L^{\,T}, $$ where $V'$ is the $n \times n$ matrix whose columns are $n$ of the (column) vectors.
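This step can be reproduced numerically; a sketch with NumPy, where `np.linalg.cholesky` returns the lower-triangular factor $L$:

```python
import numpy as np

n = 3
m = -1.0 / n
# M': the leading n x n principal submatrix of M, positive definite.
Mp = (1 - m) * np.eye(n) + m * np.ones((n, n))

L = np.linalg.cholesky(Mp)  # lower triangular, with M' = L L^T
Vp = L.T                    # columns of V' are the first n unit vectors
print(np.allclose(Vp.T @ Vp, Mp))  # the Gram matrix is recovered
```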

The additional $(n+1)$-th vector $v$ is then obtained by solving the triangular system $$ V'^{\,T} v = L\,v = - \frac{1}{n}\left( {\begin{array}{*{20}c} 1 \\ 1 \\ \vdots \\ 1 \\\end{array}} \right) $$
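Numerically, $v$ falls out of a single forward substitution; a sketch continuing with NumPy (`np.linalg.solve` suffices since $L$ is invertible; `scipy.linalg.solve_triangular` would additionally exploit the triangular structure):

```python
import numpy as np

n = 3
m = -1.0 / n
Mp = (1 - m) * np.eye(n) + m * np.ones((n, n))
L = np.linalg.cholesky(Mp)        # lower triangular, M' = L L^T

# Solve L v = -(1/n) (1, ..., 1)^T for the last vertex.
v = np.linalg.solve(L, -np.ones(n) / n)

V = np.hstack([L.T, v[:, None]])  # full n x (n+1) matrix of vertices
G = V.T @ V                       # Gram matrix: 1 on diagonal, m off it
print(np.round(G, 10))
```

Note that $v$ comes out automatically as a unit vector, consistent with all $n+1$ vertices lying on the unit sphere.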

For example, for ${\mathbb R}^3$ we get $$ M' = \begin{pmatrix} 1 & -1/3 & -1/3 \\ -1/3 & 1 & -1/3 \\ -1/3 & -1/3 & 1 \end{pmatrix} = L\,L^{\,T} = \begin{pmatrix} 1 & 0 & 0 \\ -\frac{1}{3} & \frac{2\sqrt 2}{3} & 0 \\ -\frac{1}{3} & -\frac{\sqrt 2}{3} & \frac{\sqrt 6}{3} \end{pmatrix} \begin{pmatrix} 1 & -\frac{1}{3} & -\frac{1}{3} \\ 0 & \frac{2\sqrt 2}{3} & -\frac{\sqrt 2}{3} \\ 0 & 0 & \frac{\sqrt 6}{3} \end{pmatrix} $$ and $$ V = \begin{pmatrix} 1 & -\frac{1}{3} & -\frac{1}{3} & -\frac{1}{3} \\ 0 & \frac{2\sqrt 2}{3} & -\frac{\sqrt 2}{3} & -\frac{\sqrt 2}{3} \\ 0 & 0 & \frac{\sqrt 6}{3} & -\frac{\sqrt 6}{3} \end{pmatrix} $$
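As a final check of the explicit $\mathbb{R}^3$ example, the Gram matrix of this $V$ indeed has unit diagonal and $-1/3$ everywhere else (a small NumPy verification):

```python
import numpy as np

s2, s6 = np.sqrt(2.0), np.sqrt(6.0)
# The 3 x 4 matrix V written out above; its columns are the four
# vertices of the regular tetrahedron inscribed in the unit sphere.
V = np.array([
    [1.0, -1/3,    -1/3,  -1/3 ],
    [0.0,  2*s2/3, -s2/3, -s2/3],
    [0.0,  0.0,     s6/3, -s6/3],
])

G = V.T @ V
print(np.round(G, 10))  # 1 on the diagonal, -1/3 off it
```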