Invertibility Regarding Inner Product Spaces

inner-products, linear-algebra

Exercise Let $(V, \langle \,\, , \,\,\rangle)$ be an inner product space and let $\mathcal{L} = \{v_1, \dots, v_n\} \subset V$. Prove that $\mathcal{L}$ is linearly independent if and only if

$$A = \begin{bmatrix}
\langle v_1, v_1 \rangle & \langle v_2, v_1 \rangle & \dots & \langle v_n, v_1 \rangle \\
\langle v_1, v_2 \rangle & \langle v_2, v_2 \rangle & \dots & \langle v_n, v_2 \rangle \\
\vdots & \vdots & \ddots & \vdots \\
\langle v_1, v_n \rangle & \langle v_2, v_n \rangle & \dots & \langle v_n, v_n \rangle \\
\end{bmatrix}$$

is invertible.


For the $(\Rightarrow)$ forward direction, if we suppose $\mathcal{L}$ is linearly independent, then we know that if

$$a_1v_1 + \dots + a_nv_n = 0$$

then each $a_i = 0$ for $1 \leq i \leq n$. Now, we know that orthogonality implies linear independence, but linear independence need not imply orthogonality. So it is tricky to see how we will arrive at the conclusion $\det(A) \neq 0$, i.e. that $A$ is invertible.
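For instance (just a small sanity check), take $v_1 = (1,0)$ and $v_2 = (1,1)$ in $\mathbb{R}^2$ with the standard inner product: they are linearly independent but not orthogonal, yet

$$A = \begin{bmatrix} \langle v_1, v_1 \rangle & \langle v_2, v_1 \rangle \\ \langle v_1, v_2 \rangle & \langle v_2, v_2 \rangle \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ 1 & 2 \end{bmatrix}, \qquad \det(A) = 1 \neq 0,$$

so orthogonality does not appear to be what is needed here.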

For the $(\Leftarrow)$ backward direction, we suppose that $A$ is invertible, so $\det(A) \neq 0$. Since $\det(A) \neq 0$, there exist $v_i, v_j \in \{v_1, \dots, v_n\}$ such that

$$\langle v_i, v_j \rangle \cdot \det(A^*) \neq 0,$$

where $A^*$ is the square matrix obtained from $A$ by deleting the $i^{th}$ column and $j^{th}$ row. In other words, at least one term of the $n \times n$ determinant of $A$ is nonzero.
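(Presumably this is the Laplace expansion of $\det(A)$ along the $j^{th}$ row, whose $(j,i)$ entry is $\langle v_i, v_j \rangle$:

$$\det(A) = \sum_{i=1}^{n} (-1)^{i+j} \langle v_i, v_j \rangle \det(A^*),$$

with $A^*$ as above for each $i$; if $\det(A) \neq 0$, then at least one summand $\langle v_i, v_j \rangle \det(A^*)$ is nonzero.)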


Are these approaches headed in the right direction or am I misled? I am unsure how to complete the proof for either direction. Any advice or suggestions are greatly appreciated in advance.

Best Answer

Hint. Let $V=[v_1~\cdots~v_n]$ be the matrix whose columns are the vectors $v_i$. Then the matrix in question is $V^\dagger V$ (assuming your complex inner products are conjugate-linear in the first argument; if we're talking about real inner product spaces, we can simply ignore the conjugation). Notice that $a^\dagger(V^\dagger V)a=\|Va\|^2$.
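A quick numerical sanity check of this hint (just a sketch assuming NumPy is available; the dimensions and the random vectors are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 3                                   # ambient dimension, number of vectors
V = rng.normal(size=(m, n)) + 1j * rng.normal(size=(m, n))   # columns are v_1, ..., v_n
G = V.conj().T @ V                            # Gram matrix: G[i, j] = <v_i, v_j>

a = rng.normal(size=n) + 1j * rng.normal(size=n)
lhs = (a.conj() @ G @ a).real                 # a^dagger (V^dagger V) a
rhs = np.linalg.norm(V @ a) ** 2              # ||Va||^2 = ||a_1 v_1 + ... + a_n v_n||^2
print(np.isclose(lhs, rhs))                   # True

# Generic random columns are linearly independent, and indeed G is invertible.
print(abs(np.linalg.det(G)) > 1e-12)          # True
```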

In more explicit terms (and also coordinate-free if you don't want to write vectors as columns):

$$ \begin{bmatrix} \overline{a_1} & \cdots & \overline{a_n} \end{bmatrix} \begin{bmatrix} \langle v_1,v_1\rangle & \cdots & \langle v_1,v_n\rangle \\ \vdots & \ddots & \vdots \\ \langle v_n,v_1\rangle & \cdots & \langle v_n,v_n\rangle \end{bmatrix} \begin{bmatrix} a_1 \\ \vdots \\ a_n \end{bmatrix} = \|a_1v_1+\cdots+a_nv_n \|^2 $$
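One way to finish from this identity (a sketch; write $G$ for the Gram matrix displayed above, which is the transpose of the $A$ in the question, so one is invertible exactly when the other is):

$$Ga = 0 \;\Longrightarrow\; a^\dagger G a = 0 \;\Longrightarrow\; \|a_1v_1+\cdots+a_nv_n\|^2 = 0 \;\Longrightarrow\; a_1v_1+\cdots+a_nv_n = 0,$$

and conversely $a_1v_1+\cdots+a_nv_n = 0$ gives $(Ga)_i = \langle v_i,\, a_1v_1+\cdots+a_nv_n\rangle = 0$ for every $i$. Hence $G$ (equivalently $A$) has trivial kernel, i.e. is invertible, precisely when the only solution of $a_1v_1+\cdots+a_nv_n=0$ is $a_1=\cdots=a_n=0$, which is exactly linear independence of $\mathcal{L}$.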

In fact, for real inner product spaces, $\det(V^T V)$ (the Gram determinant) is the squared volume $\mathrm{vol}^2$ of the parallelotope spanned by $v_1,\dots,v_n$. When the dimension of the ambient space matches the number of vectors $n$, this reduces to the special case $|\det V|=\mathrm{vol}$, but otherwise it is more general.
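For a concrete illustration of the volume statement (again just a NumPy sketch, with two arbitrarily chosen vectors in $\mathbb{R}^3$), the Gram determinant matches the squared area of the parallelogram they span:

```python
import numpy as np

v1 = np.array([1.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 3.0])
V = np.column_stack([v1, v2])                 # 3x2 matrix of column vectors

gram_det = np.linalg.det(V.T @ V)             # det(V^T V)
area = np.linalg.norm(np.cross(v1, v2))       # area of the spanned parallelogram

print(np.isclose(gram_det, area ** 2))        # True: det(V^T V) = vol^2
```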

This can be generalized even further to an inner product on the exterior power $\Lambda V$, which can be used to calculate the "volume distortion factor" associated with orthogonally projecting one subspace onto another.
