[Math] Set of unit orthogonal vectors in $\mathbb R^n$ forms basis

linear-algebra, real-numbers, vector-spaces

Let $(\mathbb R^n,d)$ be a Euclidean space where $d$ is the Euclidean distance function.

Let two vectors $(a_1,\ldots,a_n)$ and $(b_1,\ldots,b_n)$ be called orthogonal iff $\displaystyle \sum_{i=1}^n a_ib_i = 0$.

Let a vector $(v_1,\ldots,v_n)$ be called a unit vector iff $d\big((0,\ldots,0),(v_1,\ldots,v_n)\big)=1$.

How can one prove that a set $S$ with $n$ elements, such that all $s \in S$ are pairwise orthogonal and have unit length, forms a basis for that vector space?

There was a similar question already, but the answer there is a hint rather than a full solution, and I can't follow it; it seems pointless to expect a reply when responding to a three-year-old comment.

Best Answer

Let us denote the set of $n$ orthonormal vectors in $\mathbb{R}^n$ as follows: $$S = \{e_1, e_2, \ldots e_n\}$$ where $e_i = (e_{1i}, e_{2i}, \ldots e_{ni}) \in \mathbb{R}^n$ and $e_i, e_j$ are orthogonal if $i \neq j$.

Consider the following linear combination: $$0 = a_1e_1 + a_2 e_2 + \ldots + a_ne_n$$ where the $a_i \in \mathbb{R}$ and $0 = (0,0, \ldots, 0)$. If we can show that all $a_i$ are zero, then the vectors in $S$ are linearly independent. Let us use the orthogonality of the vectors in this set to do so. Consider the following inner product (I will denote the inner product by $\cdot$): $$e_i \cdot (a_1e_1 + a_2 e_2 + \ldots + a_ne_n).$$ By linearity, this is equal to $$a_1 (e_i \cdot e_1) + \ldots + a_i (e_i \cdot e_i) + \ldots + a_n (e_i \cdot e_n).$$ Because $e_i$ is orthogonal to every $e_j$ with $j \neq i$, we have $e_i \cdot e_j = 0$ for $j \neq i$. On the other hand, $e_i \cdot e_i$ is non-zero: since the vectors in $S$ are orthonormal, we have $e_i \cdot e_i = 1$ (try to verify this yourself). Hence we find that $$a_1 (e_i \cdot e_1) + \ldots + a_i (e_i \cdot e_i) + \ldots + a_n (e_i \cdot e_n) = 0 + \ldots + a_i + \ldots + 0 = a_i.$$

However, $e_i \cdot (0,0, \ldots, 0) = 0$, so we find that $a_i = 0$, and this holds for every $i \in \{1, \ldots, n\}$. So the vectors in $S$ are linearly independent.
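The coefficient-extraction step above can be checked numerically. This is only an illustrative sketch (not part of the proof): the orthonormal set here comes from a QR decomposition of a random matrix, which is just one convenient way to produce orthonormal vectors.

```python
import numpy as np

# Build an orthonormal set {e_1, ..., e_n} in R^n: the columns of Q
# from a QR decomposition are orthonormal (illustrative choice only).
n = 4
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
e = [Q[:, i] for i in range(n)]

# Form a linear combination v = a_1 e_1 + ... + a_n e_n.
a = np.array([2.0, -1.0, 0.5, 3.0])
v = sum(a_i * e_i for a_i, e_i in zip(a, e))

# Dotting e_i into v should extract a_i, since e_i . e_j = 0 for j != i
# and e_i . e_i = 1.
recovered = np.array([np.dot(e_i, v) for e_i in e])
print(np.allclose(recovered, a))  # True
```

In particular, if $v$ is the zero vector, every extracted coefficient is $e_i \cdot 0 = 0$, which is exactly the linear-independence argument.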

Since the dimension of $\mathbb{R}^n$ is $n$, and $S$ consists of $n$ linearly independent vectors, $S$ forms a basis of this vector space.
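Because $S$ is a basis, every $v \in \mathbb{R}^n$ expands as $v = \sum_i (e_i \cdot v)\, e_i$. A quick numerical sketch of this (again using QR merely as a convenient source of orthonormal vectors):

```python
import numpy as np

# Orthonormal set from QR (illustrative choice of orthonormal vectors).
n = 4
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
e = [Q[:, i] for i in range(n)]

# An arbitrary vector v should be recovered exactly by the expansion
# v = sum_i (e_i . v) e_i, i.e. S spans R^n.
v = rng.standard_normal(n)
expansion = sum(np.dot(e_i, v) * e_i for e_i in e)
print(np.allclose(expansion, v))  # True
```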

$\textbf{Edit:}$ since this proof only uses that the $e_i$ have norm 1 in order to show that $e_i \cdot e_i \neq 0$, the same proof shows that any set of non-zero orthogonal vectors is linearly independent!
