[Math] Are all Vectors of a Basis Orthogonal

Tags: linear-algebra, matrices

Assuming we have a basis for $\mathbb{R}^n$, would any set of linearly independent vectors that forms a basis for $\mathbb{R}^n$ also be mutually orthogonal?

Take the trivial case of $(1,0)$ and $(0,1)$. Any set of linearly independent vectors that forms a basis for $\mathbb{R}^2$ would consist of scalar multiples of these two vectors, hence they have to be orthogonal. Right?

Best Answer

No. The set $\beta=\{(1,0),(1,1)\}$ forms a basis for $\Bbb R^2$ but is not an orthogonal basis. This is why we have Gram-Schmidt!
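A quick numerical sketch (using NumPy and a classical Gram-Schmidt helper, both my own additions here, not part of the answer) confirms that $\beta$ is not orthogonal, and shows what Gram-Schmidt produces from it:

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthogonalize a list of linearly
    independent vectors by subtracting projections onto earlier ones."""
    ortho = []
    for v in vectors:
        w = v - sum((v @ u) / (u @ u) * u for u in ortho)
        ortho.append(w)
    return ortho

beta = [np.array([1.0, 0.0]), np.array([1.0, 1.0])]

# Not orthogonal: the inner product is 1, not 0.
print(beta[0] @ beta[1])  # 1.0

u1, u2 = gram_schmidt(beta)
print(u1 @ u2)  # 0.0 -- Gram-Schmidt recovers an orthogonal basis
```

Here Gram-Schmidt turns $\{(1,0),(1,1)\}$ into the orthogonal basis $\{(1,0),(0,1)\}$.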

More generally, the set $\beta=\{e_1,e_2,\dotsc,e_{n-1},e_1+\dotsb+e_n\}$ forms a non-orthogonal basis for $\Bbb R^n$.
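One can check this family numerically as well; the sketch below (my own illustration, with $n=4$ chosen arbitrarily) verifies that the matrix of these vectors has nonzero determinant, so they form a basis, while two of them fail to be orthogonal:

```python
import numpy as np

n = 4
e = np.eye(n)  # rows are the standard basis vectors e_1, ..., e_n

# beta = {e_1, ..., e_{n-1}, e_1 + ... + e_n}, stacked as rows of a matrix
beta = np.vstack([e[:-1], e.sum(axis=0)])

# Nonzero determinant => the rows are linearly independent, hence a basis.
print(np.linalg.det(beta))  # 1.0

# But e_1 . (e_1 + ... + e_n) = 1 != 0, so the basis is not orthogonal.
print(beta[0] @ beta[-1])   # 1.0
```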

To acknowledge the conversation in the comments, it is true that orthogonality of a set of nonzero vectors implies linear independence. Indeed, suppose $\{v_1,\dotsc,v_k\}$ is an orthogonal set of nonzero vectors and $$ \lambda_1 v_1+\dotsb+\lambda_k v_k=\mathbf 0\tag{1} $$ Then applying $\langle-,v_j\rangle$ to (1) gives $\lambda_j\langle v_j,v_j\rangle=0$; since $v_j\neq\mathbf 0$ we have $\langle v_j,v_j\rangle\neq 0$, so $\lambda_j=0$ for $1\leq j\leq k$.
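The "pair with $v_j$" trick in this proof also gives a formula for coefficients in an orthogonal basis: $\lambda_j=\langle x,v_j\rangle/\langle v_j,v_j\rangle$. A small sketch (the vectors and coefficients below are hypothetical examples of mine, not from the answer):

```python
import numpy as np

# An orthogonal set of nonzero vectors in R^3
v = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 2.0, 0.0]),
     np.array([0.0, 0.0, -3.0])]

# Build x = lambda_1 v_1 + lambda_2 v_2 + lambda_3 v_3 ...
lambdas = np.array([2.0, -1.0, 0.5])
x = sum(l * vi for l, vi in zip(lambdas, v))

# ... and recover each lambda_j by pairing with v_j, as in the proof.
recovered = np.array([(x @ vj) / (vj @ vj) for vj in v])
print(recovered)  # [ 2.  -1.   0.5]
```

In particular, if $x=\mathbf 0$ then every recovered coefficient is $0$, which is exactly the independence argument above.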

The examples provided in the first part of this answer show that the converse to this statement is not true.