[Math] How far is a set of vectors from being orthogonal

geometry, linear algebra, operator-theory

Given some vectors, how many dimensions do you need to add (to their span) before you can find some mutually orthogonal vectors that project down to the original ones?

Or, more formally…

Suppose $v_1,v_2,\ldots,v_k \in \mathbb C^m$. Take $n \ge m$ as small as possible such that if we consider $\mathbb C^m$ as a subspace of $\mathbb C^n$ in the natural way, then there is a projection $\pi$ of $\mathbb C^n$ onto $\mathbb C^m$ such that for some mutually orthogonal vectors $\hat v_1,\hat v_2,\ldots,\hat v_k \in \mathbb C^n$, $\pi(\hat v_i) = v_i$ for each $i$.

Intuitively, this $n$ provides a measure of how "far from orthogonal" the original vectors are. My (deliberately open-ended) question is the following. Does anyone recognize this $n$, or does the idea seem familiar to anyone from any other context? I can't identify it with or connect it to anything else I've encountered, but I'm wondering if it might appear in some other guise in linear algebra, or elsewhere.

Best Answer

Well, you'll never need more than $k-1$ extra dimensions. Let $\hat{v}_1 = v_1 \oplus (1,0,\ldots, 0)$, $\hat{v}_2 = v_2 \oplus (a,1,0,\ldots,0)$, $\hat{v}_3 = v_3 \oplus (b,c,1,0,\ldots, 0)$, etc., and choose $a, b, c, \ldots$ to ensure orthogonality; this is always possible because the orthogonality conditions form a triangular system, which can be solved one vector at a time by forward substitution. But this does not tell us under what conditions we can get by with fewer extra dimensions.
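A minimal NumPy sketch of this triangular construction (the function name `lift_orthogonal` is mine, not from the answer), using the convention $\langle x,y\rangle = \bar{x}^{\top} y$, which is what `np.vdot` computes:

```python
import numpy as np

def lift_orthogonal(V):
    """Append k-1 extra coordinates to the rows v_1, ..., v_k of V so that
    the lifted vectors v_i (+) w_i are mutually orthogonal.

    Triangular construction: w_1 = (1, 0, ..., 0), and w_j has a 1 in
    slot j and unknowns in slots 1..j-1, which the orthogonality
    conditions <w_i, w_j> = -<v_i, v_j> (for i < j) determine by
    forward substitution.  Convention: <x, y> = conj(x)^T y (np.vdot).
    """
    k, m = V.shape
    W = np.zeros((k, k - 1), dtype=complex)
    for j in range(k):
        if j < k - 1:
            W[j, j] = 1.0  # the "1" slot (the last vector has none)
        for i in range(j):
            # W[i, p] = 0 for p > i and W[i, i] = 1, so the condition
            # <w_i, w_j> = -<v_i, v_j> pins down W[j, i]:
            W[j, i] = -np.vdot(V[i], V[j]) - np.vdot(W[i, :i], W[j, :i])
    return np.hstack([V, W])
```

The conditions for the pairs $(i,j)$ with $i > j$ then hold automatically, by conjugate symmetry of the inner product.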

If we write $\hat{v}_i = v_i \oplus w_i$ then we have $\langle \hat{v}_i, \hat{v}_j\rangle = \langle v_i, v_j\rangle + \langle w_i, w_j\rangle$. So the problem is to find vectors $w_i$ which span as low-dimensional a space as possible, such that $\langle w_i, w_j\rangle = -\langle v_i, v_j\rangle$ for all distinct $i$ and $j$. This can be expressed by saying that the Gramian matrix of the $w_i$'s is minus the Gramian matrix of the $v_i$'s, except on the diagonal (where it can be anything, we don't care).

The dimension of the space spanned by the $w_i$'s equals the rank of their Gramian matrix, so the problem is to fill in the diagonal entries in a way that (1) minimizes rank while (2) keeping the matrix positive semidefinite. Any positive semidefinite matrix is the Gramian matrix of some family of vectors, so solving (1) and (2) will give us the vectors $w_i$. In some sense that answers the question, but it is not a very explicit answer.
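One admissible (but not rank-minimizing) instance of this recipe can be sketched in NumPy: fill the diagonal so that the matrix is diagonally dominant, hence positive semidefinite by Gershgorin's theorem, then factor it to recover the $w_i$. The function name `lift_via_gram` and the choice of diagonal are mine, purely for illustration:

```python
import numpy as np

def lift_via_gram(V, diag=None):
    """Gramian recipe for the rows v_1, ..., v_k of V: build the Hermitian
    matrix M with off-diagonal entries M[i, j] = -<v_i, v_j>, fill its
    diagonal so that M is positive semidefinite (here via diagonal
    dominance), factor M = W W^* to recover the w_i, and append them.
    Uses rank(M) extra dimensions.
    """
    G = V.conj() @ V.T                # Gramian of the v_i
    M = np.diag(np.diag(G)) - G       # -G off the diagonal, 0 on it
    if diag is None:
        diag = np.abs(M).sum(axis=1)  # diagonal dominance => PSD (Gershgorin)
    M = M + np.diag(diag)
    evals, evecs = np.linalg.eigh(M)  # M = evecs @ diag(evals) @ evecs^*
    keep = evals > 1e-10 * max(evals.max(), 1.0)
    W = evecs[:, keep] * np.sqrt(evals[keep])
    # the rows of conj(W) have Gramian M, so v_i (+) conj(w_i) are orthogonal
    return np.hstack([V, W.conj()])
```

The diagonally dominant choice typically yields full rank $k$, i.e., one more extra dimension than the triangular construction above; choosing the diagonal so as to minimize the rank of the resulting positive semidefinite matrix is exactly the (non-explicit) minimization problem described in the answer.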
