Axler “Linear Algebra Done Right” Exercise 6.B.13

Tags: inner-products, linear-algebra, linear-transformations, solution-verification, vector-spaces

This exercise appears in Section 6.B, "Orthonormal Bases," of Linear Algebra Done Right by Sheldon Axler, at a point where inner product spaces, norms, orthogonality, and orthonormal bases have all been introduced.

The unofficial solution manual I am using presents a somewhat involved proof of this exercise, based on induction and the Gram-Schmidt procedure, which I will not reproduce here. However, I arrived at an alternative proof that is, in my opinion, much simpler. I would like to check whether my proof is correct.


$(V, \langle \cdot, \cdot \rangle)$ is an inner product space over the field $\mathbb{F}$, which stands for either $\mathbb{R}$ or $\mathbb{C}$. (Axler, pp. 4, 167)

  1. Suppose $v_1, \dotsc, v_m$ is a linearly independent list in $V$. Show that there exists $w \in V$ such that $\langle w, v_j \rangle > 0$ for all $j \in \{1, \dotsc, m\}$. (Axler, p. 191)

Proof. Let $U = \operatorname{span}(v_1, \dotsc, v_m)$. Consider the linear map $\phi : U \to \mathbb{F}^m$ defined by
$$
\phi(u) = (\langle u, v_1 \rangle, \dotsc, \langle u, v_m \rangle).
$$

We show that $\phi$ is injective. Suppose that $u \in U$ and $\phi(u) = 0$. Then,
$$
\langle u, v_1 \rangle = \dotsb = \langle u, v_m \rangle = 0.
$$

Since $u \in \operatorname{span}(v_1, \dotsc, v_m)$, we may write $u = a_1 v_1 + \dotsb + a_m v_m$ for some $a_1, \dotsc, a_m \in \mathbb{F}$, and then
$$
\langle u, u \rangle = \sum_{j=1}^{m} \overline{a_j} \langle u, v_j \rangle = 0,
$$
so $u = 0$. Hence, $\phi$ is injective. But $\dim U = \dim \mathbb{F}^m = m$, and an injective linear map between finite-dimensional spaces of equal dimension is surjective. Choose $w \in U$ such that
$$
\phi(w) = (1, \dotsc, 1).
$$

Therefore, $\langle w, v_j \rangle = 1 > 0$ for all $j \in \{1, \dotsc, m\}$.
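As an extra sanity check, here is a small numerical sketch (entirely my own illustration, with randomly generated vectors in $\mathbb{R}^7$ and real scalars only): in the basis $v_1, \dotsc, v_m$ of $U$, finding $w$ amounts to solving a linear system whose matrix is the Gram matrix of the list.

```python
# Numerical sanity check (illustrative only, not part of the proof).
# Writing w = c_1 v_1 + ... + c_m v_m, the condition phi(w) = (1, ..., 1)
# becomes G c = (1, ..., 1), where G[i, j] = <v_i, v_j> is the Gram
# matrix, which is invertible because the list is linearly independent.
import numpy as np

rng = np.random.default_rng(0)
m, d = 4, 7
V = rng.standard_normal((m, d))    # rows are v_1, ..., v_m in R^7
                                   # (linearly independent with probability 1)

G = V @ V.T                        # Gram matrix G[i, j] = <v_i, v_j>
c = np.linalg.solve(G, np.ones(m))
w = c @ V                          # w = c_1 v_1 + ... + c_m v_m lies in U

print(V @ w)                       # [1. 1. 1. 1.], i.e. <w, v_j> = 1 > 0
```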

Best Answer

Axler broadened the scope of No. 13 in his 4th edition:

  1. Suppose $v_1,\dots,v_n$ is a linearly independent list in $V$. Show that there exists $w \in V$ such that $\langle v_k, w\rangle = 1$ for all $k \in \{1,\dots,n\}$.
  2. Suppose $v_1,\dots,v_n$ is a basis of $V$. Prove that there exists a basis $u_1,\dots,u_n$ of $V$ such that

$$\langle v_j, u_k\rangle = \begin{cases} 1 &\text{if } j=k \\ 0 &\text{if } j\not= k \end{cases}$$

For the sake of generality, we shall refine the second problem above (Problem 19) as

  1. Suppose $v_1,\dots,v_n$ is a linearly independent list in $V$. Prove that there exists a linearly independent list $u_1,\dots,u_n$ in $V$ such that

$$\langle u_j, v_k\rangle = \begin{cases} 1 &\text{if } j=k \\ 0 &\text{if } j\not= k \end{cases}$$


  1. Take $w = u_1 + \dots + u_n$ with $u_1,\dots,u_n$ from Problem 19; then $\langle v_k, w\rangle = \sum_{j=1}^{n} \overline{\langle u_j, v_k\rangle} = 1$ for every $k$.

  2. Apply the Gram-Schmidt procedure to the list $v_1,\dots,v_n$ to obtain an orthonormal list $e_1,\dots,e_n$.

From the procedure above, one obtains

\begin{align*}
(1) \quad & \operatorname{span}(e_1,\dots,e_j) = \operatorname{span}(v_1,\dots,v_j), \qquad j = 1,\dots,n \\[1ex]
(2) \quad & \langle e_1, v_1\rangle = \Vert v_1\Vert > 0 \\
& \langle e_k, v_k\rangle = \left\Vert v_k - \sum_{j=1}^{k-1} \langle v_k, e_j\rangle\, e_j \right\Vert > 0, \qquad k = 2,\dots,n \\[1ex]
(3) \quad & e_{k+1},\dots,e_n \text{ are orthogonal to } v_k, \qquad k = 1,\dots,n-1
\end{align*}
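To make these properties concrete, here is a small NumPy sketch (purely illustrative: real scalars and randomly generated $v_k$, not part of the proof). Properties $(2)$ and $(3)$ together say that the matrix of pairings $\langle e_k, v_j\rangle$ is upper triangular with positive diagonal entries.

```python
# Illustrative check of properties (2) and (3) with made-up data.
import numpy as np

rng = np.random.default_rng(1)
n, d = 4, 6
V = rng.standard_normal((n, d))          # rows v_1, ..., v_n

E = np.zeros((n, d))                     # rows e_1, ..., e_n
for k in range(n):
    # Gram-Schmidt: subtract projections onto e_1, ..., e_{k-1}, normalize
    r = V[k] - sum((V[k] @ E[j]) * E[j] for j in range(k))
    E[k] = r / np.linalg.norm(r)

P = E @ V.T                              # P[k, j] = <e_k, v_j> (0-based)
print(np.allclose(np.tril(P, -1), 0))    # True -- property (3)
print((np.diag(P) > 0).all())            # True -- property (2)
```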


Let $u_n = \dfrac{1}{\langle e_n, v_n\rangle}\, e_n$.

Define $u_k$ inductively (backward) as follows:

$$u_k = \frac{1}{\langle e_k, v_k\rangle} \left( e_k - \sum_{j=k+1}^{n} \langle e_k, v_j\rangle\, u_j \right), \qquad k = n-1, \dots, 1$$
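Continuing the sketch above (it reuses `V`, `E`, `n` from the previous snippet), the backward recursion can be tested directly before the formal induction below. In matrix form it is exactly back-substitution for the upper-triangular system $P\,U = E$, where $P_{kj} = \langle e_k, v_j\rangle$ and the rows of $U$ are the $u_k$.

```python
# Backward recursion for u_n, u_{n-1}, ..., u_1, then a check of the
# biorthogonality <u_j, v_k> = delta_{jk} established by the induction.
U = np.zeros((n, d))
U[n - 1] = E[n - 1] / (E[n - 1] @ V[n - 1])    # u_n = e_n / <e_n, v_n>
for k in range(n - 2, -1, -1):                 # k = n-1, ..., 1 (1-based)
    s = sum((E[k] @ V[j]) * U[j] for j in range(k + 1, n))
    U[k] = (E[k] - s) / (E[k] @ V[k])

print(np.allclose(U @ V.T, np.eye(n)))         # True: <u_j, v_k> = delta_{jk}
```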


By $(3)$, $e_n$ is orthogonal to $v_1,\dots,v_{n-1}$

$\implies u_n$ is orthogonal to $v_1,\dots,v_{n-1}$, while $\langle u_n, v_n\rangle = \frac{\langle e_n, v_n\rangle}{\langle e_n, v_n\rangle} = 1$

$$\implies \langle u_n, v_j\rangle = \begin{cases} 1 &\text{if } j=n \\ 0 &\text{if } j\not= n \end{cases}$$


Let $\, 1\leq m < n$.

Assume that

$$\langle u_k, v_j\rangle = \delta_{kj} = \begin{cases} 1 &\text{if } j=k \\ 0 &\text{if } j\not= k \end{cases}$$

holds for each $k \in \{m+1, \dots, n\}$.


By $(3)$, $e_m$ is orthogonal to $v_1,\dots,v_{m-1}$. Using the definition of $u_m$ and the inductive hypothesis $\langle u_j, v_k\rangle = \delta_{jk}$ for $j \in \{m+1,\dots,n\}$, we get

\begin{align*}
\langle u_m, v_k\rangle &= \frac{1}{\langle e_m, v_m\rangle}\left( \langle e_m, v_k\rangle - \sum_{j=m+1}^{n} \langle e_m, v_j\rangle\, \delta_{jk} \right) \\
&= \begin{cases}
\frac{1}{\langle e_m, v_m\rangle}\left( 0 - \sum_{j=m+1}^{n} \langle e_m, v_j\rangle \cdot 0 \right) = 0 &\text{if } k \in \{1,\dots,m-1\} \\[1ex]
\frac{1}{\langle e_m, v_m\rangle}\left( \langle e_m, v_m\rangle - \sum_{j=m+1}^{n} \langle e_m, v_j\rangle \cdot 0 \right) = 1 &\text{if } k = m \\[1ex]
\frac{1}{\langle e_m, v_m\rangle}\left( \langle e_m, v_k\rangle - \langle e_m, v_k\rangle \right) = 0 &\text{if } k \in \{m+1,\dots,n\}
\end{cases}
\end{align*}

In other words, $$\langle u_m,v_j\rangle = \begin{cases} 1 &\text{if } j=m \\ 0 &\text{if } j\not= m \end{cases}$$

The (backward) induction is complete.


Now suppose $a_1,\dots,a_n \in \mathbb{F}$ are chosen arbitrarily such that

$$a_1 u_1 + \dots + a_n u_n = 0$$

For each $k \in \{1,\dots,n\}$, applying $\langle\,\cdot\,, v_k\rangle$ to both sides yields $a_k = 0$.

Since $a_1,\dots,a_n \in \mathbb{F}$ were arbitrary, this shows

$$a_1 u_1 + \dots + a_n u_n = 0 \implies a_1 = \dots = a_n = 0,$$

that is, $u_1,\dots,u_n$ is a linearly independent list in $V$.

We have found the sought-after list.

$$\tag*{$\blacksquare$}$$
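As a closing numerical check (again reusing `U`, `V`, `n` from the sketches above, and again purely illustrative): the pairing matrix $(\langle u_j, v_k\rangle)$ is the identity, so the rows of $U$ are independent, and $w = u_1 + \dots + u_n$ settles the broadened No. 13.

```python
# Independence of u_1, ..., u_n, and the Problem 13 conclusion.
print(np.linalg.matrix_rank(U) == n)     # True: u_1, ..., u_n independent
w = U.sum(axis=0)                        # w = u_1 + ... + u_n
print(np.allclose(V @ w, np.ones(n)))    # True: <v_k, w> = 1 for all k
```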