Finding vectors orthonormal to a given vector set and the Gram-Schmidt process

inner-products, orthogonality, orthonormal, vectors

Here is a likely very simple problem that I am confused about: Let $\{v_k\}$ be a set of vectors ($k=1, …, n$). I would like to find a set of vectors $\{q_k\}$ such that

$$\langle q_i| v_j \rangle = \delta_{ij}$$

where $\langle .| . \rangle$ is the inner product (assume standard inner product for simplicity). We do not place any additional restrictions on the properties of $\{q_k\}$.

This question has (in modified form) already been asked on the network: Given a set of non orthogonal functions. Find another set of functions that are orthogonal to the first set. The answer there links to the Gram-Schmidt process as the solution. I think I understand the latter, but I don't understand how it solves the above problem.

Specifically, I do not understand how the vectors $\{u_k\}$ obtained from the Gram-Schmidt process (notation consistent with the Wikipedia article on the process) correspond to the $\{q_k\}$, since the $\{u_k\}$ do not fulfill the required property. This can be seen straightforwardly, since
$$u_1=v_1,$$
such that (assuming the $\{v_k\}$ are already normalized)
$$\langle u_1| v_1 \rangle = 1$$
and
$$\langle u_1| v_2 \rangle = \langle v_1| v_2 \rangle \neq 0 .$$

Also from a conceptual perspective, the two problems look rather different to me. Gram-Schmidt generates an orthonormal basis that spans the same subspace (vectors whose pairwise inner products form the identity matrix), while what I am looking for are vectors whose inner products with the *original* vectors form the identity matrix. The $\{q_k\}$ are likely not orthonormal among themselves in general.

I am probably missing something really simple (physicist here, please be gentle…). Any help would be appreciated. It is not clear to me under what conditions the required set of vectors can be constructed, so assume linear independence or any other properties of $\{v_k\}$ where necessary.

Also it is acceptable if the $\{q_k\}$ lie outside the span of $\{v_k\}$ (or if the vector space and inner product have to be suitably extended for $\{q_k\}$ to exist). E.g. if the $\{v_k\}$ are functions $\{v_k(r)\}$ and the inner product is the $L^2$ inner product, the functions $\{q_k\}$ do not have to be linear combinations of the $\{v_k\}$. In this sense, we are essentially looking for the functions which invert the matrix of the original vectors.

Best Answer

Let $q_i = \sum_{j=1}^n \beta_{ij}v_j$. We want it to satisfy

$$\langle q_i, v_j\rangle= \delta_{ij} $$

$$\langle \sum_{k=1}^n \beta_{ik}v_k, v_j\rangle= \delta_{ij} $$

$$\sum_{k=1}^n \beta_{ik}\langle v_k, v_j\rangle= \delta_{ij} $$

This is a linear system of equations.

That is, define the matrix $A$ whose $(i,j)$-th entry is $\langle v_i, v_j\rangle$ (the Gram matrix of the $\{v_k\}$). If $A$ is invertible, then $\beta_{ij}$ is the $(i,j)$-th entry of $A^{-1}$.

Knowing the $\beta_{ij}$, we can now write down the $q_i$ explicitly.
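As a concrete numerical illustration of the construction above (a minimal sketch with made-up vectors, not part of the original answer):

```python
import numpy as np

# Two linearly independent, non-orthogonal vectors in R^2 (example data)
V = np.array([[1.0, 0.0],
              [1.0, 1.0]])  # rows are v_1, v_2

A = V @ V.T              # Gram matrix: A[i, j] = <v_i, v_j>
beta = np.linalg.inv(A)  # beta[i, j] = (A^{-1})_{ij}
Q = beta @ V             # rows are q_1, q_2, with q_i = sum_j beta_ij v_j

# Check the biorthogonality condition <q_i, v_j> = delta_ij
print(Q @ V.T)           # the 2x2 identity matrix
```

Note that the check `Q @ V.T` is exactly the matrix of inner products $\langle q_i, v_j\rangle$, which the construction forces to be the identity.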

If $\{ v_1, \ldots, v_n\}$ is not linearly independent, then no such $q$ exists.

WLOG, suppose $v_1 = \sum_{k=2}^n c_k v_k$. If $q_1$ existed, then we would have

$$1=\langle q_1, v_1\rangle = \sum_{k=2}^n c_k \langle q_1, v_k \rangle=0,$$

a contradiction.
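Numerically, this obstruction shows up as a singular Gram matrix (a small sketch with made-up vectors):

```python
import numpy as np

# v_1 = v_2 + v_3, so the set is linearly dependent (example data)
V = np.array([[2.0, 1.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 0.0, 0.0]])

A = V @ V.T                       # Gram matrix A[i, j] = <v_i, v_j>
print(np.linalg.matrix_rank(A))   # 2, not 3: A is singular, so no dual set exists
```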


Now, suppose $n < d$ and extend $\{v_1, \ldots, v_n\}$ to a basis $\{v_1, \ldots, v_d\}$ of the whole space, where we pick $v_{n+1}, \ldots, v_d$ to be orthonormal and orthogonal to the first $n$ vectors.

Now, let $q_i = \sum_{j=1}^d \beta_{ij}v_j$,

$$\langle \sum_{k=1}^d \beta_{ik}v_k, v_j\rangle= \delta_{ij} $$

$$\sum_{k=1}^d \beta_{ik}\langle v_k, v_j\rangle= \delta_{ij} $$

which, for $j \le n$, reduces to

$$\sum_{k=1}^n \beta_{ik}\langle v_k, v_j\rangle= \delta_{ij} $$

(the terms with $k > n$ vanish, since those $v_k$ are orthogonal to $v_j$), which is the previous case.
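When $n < d$, one never needs to construct the extending vectors explicitly: the same Gram-matrix formula applied to the $n \times d$ matrix of the original vectors already yields the dual set, and the result coincides with the Moore-Penrose pseudoinverse. A sketch, assuming the $v_k$ are linearly independent rows:

```python
import numpy as np

# Two vectors in R^3 (n = 2 < d = 3), rows of V (example data)
V = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 1.0]])

A = V @ V.T                # n x n Gram matrix
Q = np.linalg.inv(A) @ V   # rows are the dual vectors q_i

print(np.allclose(Q @ V.T, np.eye(2)))       # True: <q_i, v_j> = delta_ij
print(np.allclose(Q, np.linalg.pinv(V).T))   # True: Q^T is the pseudoinverse of V
```

The identity $Q = (VV^T)^{-1}V$ is precisely the transpose of $\operatorname{pinv}(V) = V^T(VV^T)^{-1}$ for full-row-rank $V$, which makes the "invert the matrix of the original vectors" intuition from the question exact.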
