[Math] How to prove there exists a unique linear map such that $T(e_i) = w_i$ in an infinite-dimensional vector space

linear-algebra, linear-transformations, vector-spaces

Problem: (a) Let $V$ and $W$ be two finite-dimensional vector spaces over a field $F$, and let $\left\{e_1, e_2, \ldots, e_n\right\}$ be a basis for $V$. Then for any choice of vectors $w_1, w_2, \ldots, w_n \in W$ there exists a unique linear map $T: V \rightarrow W$ such that $T(e_i) = w_i$ for $i = 1, 2, \ldots, n$.

(b) Now let $V$ be an infinite-dimensional vector space with basis $\left\{e_i \mid i \in \mathbb{N}\right\}$, and for each $i \in \mathbb{N}$ let $w_i \in W$ be a vector in the vector space $W$. Prove there exists a unique linear map $T: V \rightarrow W$ such that $T(e_i) = w_i$ for $i \in \mathbb{N}$.

I proved (a) as follows, but I don't know how to do (b), or whether it can be deduced easily from (a).

Proof: Let $v \in V$. Then for unique scalars $a_i \in F$ we have \begin{align*} v = \sum_{i=1}^{n} a_i e_i. \end{align*} Now we construct the following map: \begin{align*} T: V \rightarrow W: v \mapsto \sum_{i=1}^{n} a_i w_i. \end{align*} This map is well-defined since the scalars $a_i$ are unique. We now prove this map is linear. Let $x, y \in V$ and $\lambda \in F$. Then \begin{align*} x = \sum_{i=1}^{n} \delta_i e_i \ \text{and} \ y = \sum_{i=1}^{n} \varepsilon_i e_i \end{align*} for some scalars $\delta_i, \varepsilon_i \in F$. It follows that \begin{align*} \lambda x + y = \sum_{i=1}^{n}(\lambda \delta_i + \varepsilon_i)e_i. \end{align*} Taking the image of this gives \begin{align*} T(\lambda x + y) = \sum_{i=1}^{n}(\lambda \delta_i + \varepsilon_i)w_i = \lambda \sum_{i=1}^{n} \delta_i w_i + \sum_{i=1}^{n} \varepsilon_i w_i = \lambda T(x) + T(y). \end{align*} This means the map is linear. Now, to prove uniqueness, suppose there exists another linear map $T': V \rightarrow W$ with $T'(e_i) = w_i$ for each $i = 1, 2, \ldots, n$. Then for each $v \in V$ we have \begin{align*} T'(v) = T'\big(\sum_{i=1}^{n} a_i e_i \big)= \sum_{i=1}^{n} a_i T'(e_i) = \sum_{i=1}^{n}a_i w_i = T\big(\sum_{i=1}^{n} a_i e_i \big) = T(v). \end{align*} Hence $T' = T$, so the constructed map is unique. Q.E.D.
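In coordinates, the finite-dimensional construction is concrete: once bases are fixed, $T$ sends the coordinate vector $(a_1, \ldots, a_n)$ to $\sum_i a_i w_i$. A minimal numerical sketch of part (a), where the particular spaces ($V = F^3$, $W = F^2$ over the reals) and the chosen images $w_i$ are purely illustrative:

```python
# Illustrative setup (not from the problem): V = F^3 with the standard
# basis e_1, e_2, e_3, and W = F^2, both over the reals.
# Pick arbitrary images w_i in W for the basis vectors:
w = [(1.0, 0.0),    # w_1 = T(e_1)
     (2.0, 3.0),    # w_2 = T(e_2)
     (0.0, -1.0)]   # w_3 = T(e_3)

def T(v):
    """The map from the proof: T(v) = sum_i a_i w_i for v = (a_1, a_2, a_3)."""
    return tuple(sum(a * wi[k] for a, wi in zip(v, w)) for k in range(2))

# T satisfies the defining condition T(e_i) = w_i ...
assert T((1.0, 0.0, 0.0)) == w[0]
assert T((0.0, 1.0, 0.0)) == w[1]

# ... and is linear: T(2x + y) = 2 T(x) + T(y).
x, y = (1.0, 2.0, 3.0), (-1.0, 0.0, 4.0)
lhs = T(tuple(2 * a + b for a, b in zip(x, y)))
rhs = tuple(2 * p + q for p, q in zip(T(x), T(y)))
assert lhs == rhs
```

Any linear map agreeing with $T$ on the three basis vectors must agree with it everywhere, which is exactly the uniqueness argument above.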

(b) How should I do this part now? I'm not used to working with infinite-dimensional spaces, and I don't know what I should change in my proof so that $V$ can be infinite-dimensional. I mean, if $V$ is infinite-dimensional, can I still write a vector $v \in V$ as a linear combination of the basis vectors? Help would be appreciated.

Best Answer

Recall that a subset $\beta$ of a vector space $V$ is a basis if every vector in $V$ can be expressed as a linear combination of finitely many vectors in $\beta$ in a unique way.

Let's prove a generalization of your result.

Proposition. Let $\beta=\{v_j\in V:j\in J\}$ be a basis for a vector space $V$ over a field $F$ and let $S=\{w_j\in W:j\in J\}$ be a subset of a vector space $W$ over $F$. Then there exists a unique linear map $T:V\to W$ such that $T(v_j)=w_j$ for $j\in J$.

Proof. For existence, define $T:V\to W$ by the formula $$ T(v)=\lambda_1 w_{j_1}+\dotsb+\lambda_n w_{j_n} $$ where $$ v=\lambda_1 v_{j_1}+\dotsb+\lambda_n v_{j_n}\tag{1} $$ Note that $T$ is well-defined by the uniqueness of the expression (1). To see that $T$ is linear, let $v,w\in V$ and let $\lambda\in F$. By allowing zero coefficients, we may assume $v$ and $w$ are combinations of the same finitely many basis vectors, so write \begin{align*} v &= \lambda_1v_{j_1}+\dotsb+\lambda_n v_{j_n} & w &= \gamma_1v_{j_1}+\dotsb+\gamma_n v_{j_n} \end{align*} Then \begin{align*} T(\lambda v+w) &= T\bigl((\lambda\lambda_1+\gamma_1)v_{j_1}+\dotsb+ (\lambda\lambda_n+\gamma_n)v_{j_n}\bigr) \\ &= (\lambda\lambda_1+\gamma_1)w_{j_1}+\dotsb+ (\lambda\lambda_n+\gamma_n)w_{j_n} \\ &=\lambda(\lambda_1w_{j_1}+\dotsb+\lambda_nw_{j_n})+ \gamma_1 w_{j_1}+\dotsb+\gamma_n w_{j_n} \\ &=\lambda T(v)+T(w) \end{align*} Thus $T$ is linear.
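The key point is that the same recipe works for an infinite basis, because each individual vector still involves only finitely many basis vectors. A sketch of this (the specific choices here are illustrative, not from the problem): take $V$ with basis $\{x^i : i \in \mathbb{N}\}$, represent a vector by the dict of its finitely many nonzero coefficients $\lambda_i$, and take $W = F^2$ with prescribed images $w_i = (i, i^2)$:

```python
# Illustrative setup: V has infinite basis {x^i : i in N}; a vector of V
# is a dict {i: lambda_i} listing its finitely many nonzero coefficients.
# W = F^2, and the prescribed images are w_i = (i, i^2).

def w(i):
    """Prescribed image w_i of the basis vector x^i (illustrative choice)."""
    return (float(i), float(i * i))

def T(v):
    """T(v) = sum of lambda_i * w_i over the finite support of v."""
    s = (0.0, 0.0)
    for i, lam in v.items():
        wi = w(i)
        s = (s[0] + lam * wi[0], s[1] + lam * wi[1])
    return s

# T is defined on every vector even though the basis is infinite:
assert T({3: 1.0}) == (3.0, 9.0)        # T(x^3) = w_3
assert T({}) == (0.0, 0.0)              # T(0) = 0
# T(2*x^1 + x^100) = 2*w_1 + w_100, a finite sum as in (1):
assert T({1: 2.0, 100: 1.0}) == (102.0, 10002.0)
```

Each call to `T` touches only the finite support of its argument, mirroring why expression (1) never requires an infinite sum.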

For uniqueness, suppose that $T$ and $T^\prime$ are two linear maps $V\to W$ satisfying $T(v_j)=w_j$ for $j\in J$. Let $v\in V$ and write $$ v=\lambda_1v_{j_1}+\dotsb+\lambda_nv_{j_n} $$ Then \begin{align*} T(v) &= \lambda_1 w_{j_1}+\dotsb+\lambda_n w_{j_n} \\ &= \lambda_1 T^\prime(v_{j_1})+\dotsb+\lambda_n T^\prime(v_{j_n}) \\ &= T^\prime(\lambda_1v_{j_1}+\dotsb+\lambda_nv_{j_n}) \\ &= T^\prime(v) \end{align*} Hence $T=T^\prime$. $\Box$