Doubt about solution to Axler’s Linear Algebra Done Right problem

linear-algebra, proof-explanation, solution-verification

I am confused about a solution by Stanford's MATH113 class to a problem in Sheldon Axler's Linear Algebra Done Right, 3rd Ed. I have seen solutions elsewhere (on Slader) that are very similar.

The question (3.A.11, pg 58) is below, where $\mathcal{L}(V, W)$ denotes the set of all linear maps from $V$ to $W$.

Suppose $V$ is finite-dimensional. Prove that every linear map on a subspace of $V$ can be extended to a linear map on $V$. In other words, show that if $U$ is a subspace of $V$ and $S \in \mathcal{L}(U, W)$, then there exists $T \in \mathcal{L}(V, W)$ such that $Tu = Su$ for all $u \in U$.

Below is the solution from Stanford's MATH113 class, Fall 2015. I was unable to find any of the Propositions or Definitions mentioned.

Proof. Suppose $U$ is a subspace of $V$ and $S \in \mathcal{L}(U, W)$. Choose a basis $u_1, \ldots, u_m$ of $U$. Then $u_1, \ldots, u_m$ is a linearly independent list of vectors in $V$ and so can be extended to a basis $u_1, \ldots, u_m, v_1, \ldots, v_n$ of $V$ (by Proposition 2.33). Using Proposition 3.5, we know that there exists a unique linear map $T \in \mathcal{L}(V, W)$ such that
\begin{align}
Tu_i = Su_i \quad &\text{for all} \quad i \in \{1, 2, \ldots, m\} \\
Tv_j = 0 \quad &\text{for all} \quad j \in \{ 1, 2, \ldots, n \} .
\end{align}

Now we prove that $Tu = Su$ for all $u \in U$. Any $u \in U$ can be written as $u = a_1 u_1 + \cdots + a_m u_m$. Since $S \in \mathcal{L}(U, W)$, by Definition 3.2 we have
$$Su = a_1 Su_1 + a_2 S u_2 + \cdots + a_m S u_m.$$
Since $T \in \mathcal{L}(V, W)$, we have
\begin{align}
Tu &= T(a_1 u_1 + \cdots + a_m u_m) \\
&= a_1 T u_1 + a_2 T u_2 + \cdots + a_m T u_m \\
&= a_1 S u_1 + a_2 S u_2 + \cdots + a_m S u_m \\
&= Su.
\end{align}

Therefore we have $Tu = Su$ for all $u \in U$, so we have proved that every linear map on a subspace of $V$ can be extended to a linear map on $V$.
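The construction in the quoted proof can be mirrored numerically. Below is a small sketch on a made-up instance (my own choice of vectors and of the values $Su_i$, not from the solution), with $V = \mathbb{R}^3$, $U = \operatorname{span}\{u_1, u_2\}$, $W = \mathbb{R}^2$; it extends a basis of $U$ to a basis of $V$ and defines $T$ by $Tu_i = Su_i$, $Tv_j = 0$:

```python
import numpy as np

# Hypothetical instance: V = R^3, U = span{u1, u2}, W = R^2.
# S is specified by its values on the basis u1, u2 of U.
u1 = np.array([1.0, 0.0, 1.0])
u2 = np.array([0.0, 1.0, 1.0])
Su1 = np.array([2.0, 0.0])    # chosen value S(u1)
Su2 = np.array([0.0, 3.0])    # chosen value S(u2)

# Extend u1, u2 to a basis of V (Proposition 2.33): append v1.
v1 = np.array([0.0, 0.0, 1.0])
B = np.column_stack([u1, u2, v1])        # basis vectors as columns
assert np.linalg.matrix_rank(B) == 3     # confirms it is a basis of R^3

# Proposition 3.5: define T by T(u_i) = S(u_i), T(v_1) = 0.
# As a matrix, T sends the columns of B to the prescribed values:
# T = [S(u1) | S(u2) | 0] B^{-1}.
T = np.column_stack([Su1, Su2, np.zeros(2)]) @ np.linalg.inv(B)

# Extension property: T(u) = S(u) for every u = a1*u1 + a2*u2 in U.
a1, a2 = 2.0, -1.0
u = a1 * u1 + a2 * u2
assert np.allclose(T @ u, a1 * Su1 + a2 * Su2)
```

Since $T$ is realized here as a single matrix on all of $V$, its linearity is automatic, which mirrors the role Proposition 3.5 plays in the proof.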

The solution proves that $Tu = Su$ for all $u \in U$, but I do not see how that is sufficient to show that $T$ is linear — what about those elements in $V$ that are not in $U$?

Suppose $U'$ is a complementary subspace to $U$ (that is, $U \oplus U' = V$), say $U' = \operatorname{span}(v_1, \ldots, v_n)$, so that any element of $U'$ can be expressed as a linear combination of $v_1, \ldots, v_n$. Now take $a, b \in V$ and $\lambda \in \mathbb{F}$. To prove $T$ is linear, I believe we need to show that
$$ T(\lambda a + b) = \lambda T(a) + T(b) $$
holds, even if (among other combinations) $a \in U$ and $b \in U'$. How (if at all) does the Stanford proof address this? It seems to me that they only consider the case where both $a$ and $b$ are in $U$.

Best Answer

This point is addressed "under the hood" in the statement

Using Proposition 3.5, we know that there exists a unique linear map $T \in \mathcal{L}(V, W)$ such that \begin{align} Tu_i = Su_i \quad &\text{for all} \quad i \in \{1, 2, \ldots, m\} \\ Tv_j = 0 \quad &\text{for all} \quad j \in \{ 1, 2, \ldots, n \} . \end{align}

It is specified that $T$ is a linear map, so we already know that $T$ is linear.

As for what $T$ does to elements not in $U$: a vector $v \notin U$ can be expressed as $$ v = a_1 u_1 + \cdots + a_m u_m + b_1 v_1 + \cdots + b_n v_n, $$ and the fact that $v \notin U$ tells us that at least one of the coefficients $b_j$ is non-zero. Since $Tv_j = 0$ for every $j$, we find that \begin{align} T(v) &= a_1 T(u_1) + \cdots + a_m T(u_m) + b_1 T(v_1) + \cdots + b_n T(v_n) \\ &= a_1 T(u_1) + \cdots + a_m T(u_m). \end{align} Note that they could equivalently have given a proof using a complementary subspace. The complementary subspace corresponding to the map constructed in the proof is $U' = \operatorname{span}\{v_1,\dots,v_n\}$, and $T$ was defined so that $T|_{U'} = 0$.
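The mixed case the question raises ($a \in U$, $b \in U'$) can also be checked concretely. The sketch below uses a made-up instance (my own choice of vectors, not from the solution), with $V = \mathbb{R}^3$, $U = \operatorname{span}\{u_1, u_2\}$, $U' = \operatorname{span}\{v_1\}$, and verifies both $T(\lambda a + b) = \lambda T(a) + T(b)$ and $T|_{U'} = 0$:

```python
import numpy as np

# Hypothetical instance: V = R^3, U = span{u1, u2}, U' = span{v1},
# with T built as in the quoted proof (Tu_i = Su_i, Tv_1 = 0).
u1 = np.array([1.0, 0.0, 1.0])
u2 = np.array([0.0, 1.0, 1.0])
v1 = np.array([0.0, 0.0, 1.0])
Su1 = np.array([2.0, 0.0])    # chosen value S(u1)
Su2 = np.array([0.0, 3.0])    # chosen value S(u2)

B = np.column_stack([u1, u2, v1])
T = np.column_stack([Su1, Su2, np.zeros(2)]) @ np.linalg.inv(B)

# The mixed case: a in U, b in U', lam a scalar.
a = 3.0 * u1 - 2.0 * u2
b = 5.0 * v1
lam = 7.0
assert np.allclose(T @ (lam * a + b), lam * (T @ a) + T @ b)

# T vanishes on U', as the construction dictates.
assert np.allclose(T @ b, np.zeros(2))
```

Because $T$ is one matrix defined on all of $V$, linearity across $U$ and $U'$ needs no separate argument; that is exactly what "there exists a unique linear map" in Proposition 3.5 provides.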
