Spaces of linear maps and dual space

Tags: dual-spaces, linear-algebra, linear-transformations, solution-verification, vector-spaces

Yesterday I learned about dual spaces when reading about spaces of linear maps. The concept of a linear map and why linear maps form a vector space is clear to me. But there are some details about the dual space and its basis that I could not fully understand.

The text I am reading states the following:

Furthermore, for fixed vector spaces $U$ and $V$ over $K$, the operations of addition and scalar multiplication on the set $\operatorname{Hom}_K(U,V)$ of all linear maps from $U$ to $V$ make $\operatorname{Hom}_K(U,V)$ into a vector space over $K$.

Given a vector space $U$ over a field $K$, the vector space $U^{*} = \operatorname{Hom}_K(U,K)$ plays a special role. It is often called the dual space or the space of covectors of $U$. One can think of coordinates as elements of $U^{*}$. Indeed, suppose that $U$ is finite-dimensional and let $e_{1},\dots,e_{n}$ be a basis of $U$. Every $x \in U$ can be uniquely written as
$$x=\alpha_{1}e_{1}+\cdots+\alpha_{n}e_{n}, \qquad \alpha_{i} \in K.$$
The scalars $\alpha_{1},\dots,\alpha_{n}$ depend on $x$ as well as on the choice of basis, so for each $i$ one can define the coordinate function
$$e^{i}: U \to K, \qquad e^{i}(x)=\alpha_{i}.$$
It is routine to check that each $e^{i}$ is a linear map, and indeed the functions $e^{1},\dots,e^{n}$ form a basis of the dual space $U^{*}$.
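To make this concrete for myself, I wrote a small NumPy sketch (the basis of $\mathbb{R}^{2}$ below is an arbitrary choice of mine, not from the text): if the basis vectors are the columns of a matrix $B$, then the coordinates of $x$ are $B^{-1}x$, so the coordinate function $e^{i}$ is exactly the $i$th row of $B^{-1}$.

```python
import numpy as np

# Columns of B are an (arbitrarily chosen) basis e_1, e_2 of R^2.
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])

# Coordinates alpha of x solve B @ alpha = x, so alpha = B^{-1} @ x;
# hence the coordinate function e^i is the i-th row of B^{-1}.
E = np.linalg.inv(B)

x = np.array([3.0, 4.0])
alpha = E @ x                         # alpha_i = e^i(x)

assert np.allclose(B @ alpha, x)      # x = alpha_1 e_1 + alpha_2 e_2
assert np.allclose(E @ B, np.eye(2))  # e^i(e_j) = delta_ij
```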

Now I have two questions:

1) The text states that $\operatorname{Hom}_K(U,V)$ is a vector space for fixed $U$ and $V$. This is perfectly clear to me, but is it correct that the dual space is $U^{*}=\operatorname{Hom}_K(U,K)$, i.e. that it consists of all linear maps from $U$ to the field $K$? At first I thought this was a typo, but from what I've read on Wikipedia and in other sources the notation seems to be correct. It also seems to make no sense to say that the coordinate functions form a basis for $\operatorname{Hom}_K(U,V)$ in general.

2) It is easy to see that the coordinate functions $e^{1},\dots,e^{n}$ are linear maps, and I have also tried to verify the claim that they form a basis for $U^{*}$. However, I am unsure whether my proof is correct, and I think this is mainly because of my confusion stated in the first question.

My proof goes as follows:

We need to prove that $e^{1},\dots,e^{n}$ are linearly independent and span $U^{*}$.

First note that linear maps are uniquely determined by their action on a basis. Now let $0_{UK}:U \to K$, $0_{UK}(u)=0$ $\forall u$, be the zero map.
To prove linear independence we need to show that

$(*)$ $b_{1}e^{1}+\cdots+b_{n}e^{n}=0_{UK} \implies b_{i}=0$ $\forall i$

or in other words, the only linear combination of $e^{1},\dots,e^{n}$ that gives $0_{UK}$ is the trivial linear combination.
Now assume instead that $b_{i}\neq 0$ for some $i$; then evaluating at $x=e_{i}$ gives $(b_{1}e^{1}+\cdots+b_{n}e^{n})(e_{i})=b_{i}\neq 0$, because $e_{i}$ has $i$th coordinate $1$ and all other coordinates $0$. Hence any non-trivial linear combination $b_{1}e^{1}+\cdots+b_{n}e^{n}$ is not the zero map, which proves $(*)$.
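As a concrete sanity check of this step, here is a small NumPy sketch (the basis of $\mathbb{R}^{4}$ below is a random stand-in for $U$, and the coefficients are an arbitrary choice of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))   # columns: a random basis e_1,...,e_4 of R^4
E = np.linalg.inv(B)              # rows: the coordinate functions e^1,...,e^4

b = np.array([0.0, 5.0, 0.0, 0.0])   # a non-trivial combination: b_2 = 5 != 0
combo = b @ E                        # the functional b_1 e^1 + ... + b_4 e^4

# Evaluated at e_2 the combination returns b_2 = 5, so it is not the zero map.
assert np.isclose(combo @ B[:, 1], 5.0)
```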

To prove that $e^{1},\dots,e^{n}$ span $U^{*}$ we need to prove that every vector $T \in U^{*}$ (i.e. every linear map $T: U \to K$) can be written as a linear combination of $e^{1},\dots,e^{n}$:
$$T=k_{1}e^{1}+\cdots+k_{n}e^{n}$$ for some scalars $k_{i} \in K$, meaning that $T(u)=k_{1}e^{1}(u)+\cdots+k_{n}e^{n}(u)$ for every $u \in U$.
To see this note that
$$\begin{align*}
T(u)&=T(\alpha_{1}e_{1}+\cdots+\alpha_{n}e_{n}) \\
&=\alpha_{1}T(e_{1})+\cdots+\alpha_{n}T(e_{n}) \\
&=T(e_{1})e^{1}(u)+\cdots+T(e_{n})e^{n}(u), \end{align*}$$

where the $T(e_{i})$ are scalars by the definition of $T$; hence $T=T(e_{1})e^{1}+\cdots+T(e_{n})e^{n}$, i.e. $k_{i}=T(e_{i})$.
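Here is a small NumPy sanity check of this expansion (a random basis of $\mathbb{R}^{3}$ stands in for $U$, and a random row vector represents the functional $T$; all names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))   # columns: a random basis e_1, e_2, e_3 of R^3
E = np.linalg.inv(B)              # rows: the coordinate functions e^1, e^2, e^3
t = rng.standard_normal(3)        # a functional T, acting as T(u) = t @ u

u = rng.standard_normal(3)
k = t @ B                         # k_i = T(e_i)

# T(u) agrees with k_1 e^1(u) + k_2 e^2(u) + k_3 e^3(u).
assert np.isclose(t @ u, sum(k[i] * (E[i] @ u) for i in range(3)))
```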

Is my proof correct or have I missed anything?

Thanks very much for any hints and comments.

Best Answer

  • The linear independence can be shown more directly: suppose that $$b_1e^1 + b_2e^2 + \cdots + b_ne^n = \mathbf{0} \quad \textrm{for some } b_1,\dots,b_n \in K,$$ where $\mathbf{0} : U \to K$ is the zero map. Now, for all $i$, $1\leq i\leq n$, using $e^{j}(e_{i}) = \delta_{ij}$, $$\begin{align} b_i = b_ie^i(e_i) &= \sum_{j=1}^n b_j e^j(e_i) \\ &= \Big( \sum_{j=1}^n b_j e^j \Big)(e_i) = \mathbf{0}(e_i) = 0, \end{align}$$ so $b_1 = b_2 = \cdots = b_n = 0$.

  • Also, to check that $e^1,\dots,e^n$ span $U^*$, let $f$ be arbitrary in $U^*$, and observe that for any $x\in U$ written as $x = \alpha_1e_1 + \cdots + \alpha_n e_n$, we have $$\begin{align} f(x) &= \sum_{j=1}^n \alpha_j f(e_j) \\ &= \sum_{j=1}^n f(e_j) \alpha_j \\ &= \sum_{j=1}^n f(e_j) e^j(x) = \Big( \sum_{j=1}^n f(e_j)e^j \Big) (x), \end{align}$$ so $$f = f(e_1)e^1 + f(e_2)e^2 + \cdots + f(e_n)e^n,$$ as verified numerically in the sketch below.
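For a concrete sanity check of this reconstruction, here is a small NumPy sketch (a randomly chosen basis of $\mathbb{R}^{3}$ stands in for $U$, and a random vector represents the functional $f$; none of these names come from the answer itself):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((3, 3))   # columns: a random basis e_1, e_2, e_3 of R^3
E = np.linalg.inv(B)              # rows: the dual basis e^1, e^2, e^3

f = rng.standard_normal(3)        # an arbitrary functional f, f(x) = f @ x

coeffs = f @ B                    # coeffs[j] = f(e_j)
f_rebuilt = coeffs @ E            # the functional sum_j f(e_j) e^j

# f coincides with f(e_1) e^1 + f(e_2) e^2 + f(e_3) e^3.
assert np.allclose(f, f_rebuilt)
```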