[Math] Dual spaces for dummies

change-of-basis, linear-algebra

I'm completely lost on dual spaces right now, and on linear functionals by association. My notes from lecture and my book are completely unhelpful, and I'm finding myself making up solutions to homework from patterns I glean from answers I find online or from the examples in the book.

The current problem I need to do reads:

Define f (some special non-italicised notation for a linear functional, whatever that is — I have no idea how to denote that in text and will just toss it into the \$'d limits) $\in(\Bbb R^2)^*$ by f$(x,y) = 2x + y$ and $T:\Bbb R^2 \rightarrow \Bbb R^2$ by $T(x,y) = (3x + 2y, x)$.

(a) Compute $T^t(f)$.

(b) Compute $[T^t]_{\beta^*}(f)$, where $\beta$ is the standard ordered basis for $\Bbb R^2$ and $\beta^* = \{f_1, f_2\}$ is the dual basis, by finding scalars $a, b, c$, and $d$ such that $T^t(f_1) = af_1 + cf_2$ and $T^t(f_2) = bf_1 + df_2$.

(c) Compute $[T]_\beta$ and $([T]_\beta)^t$, and compare your results with (b).

I have no clue where to even start. I thought maybe I could treat f as some sort of thing I could plug into the transformation (like $f = \{2, 1\}$, then $T(2,1) = (8,2)$), but that doesn't get me anywhere, let alone give me a matrix to transpose. I don't think I understand anything about this topic; I can't find any examples in the book that make sense, my notes are equally cryptic (most of my lecture time is spent frantically scribbling what the professor writes on the board with no concept of what's going on), and I'm not seeing anything online that helps either. Usually there are a decent number of PDFs to search… but not this time.

Best Answer

Let me first give some general context before considering the question at hand. If $V$ is a finite-dimensional real vector space, the dual space of $V$ is the space of all linear maps $f \colon V \rightarrow \mathbb{R}$. Such maps $f$ are called linear functionals - you feed them vectors in $V$ and they spit out scalars. Given a basis $\beta = (v_1, \dots, v_n)$ for $V$, one can construct a basis $\beta^{*} = (f_1, \dots, f_n)$ for the dual space $V^{*}$ that satisfies $f_i(v_j) = \delta_{ij}$ (where $\delta_{ij} = 1$ if $i = j$ and $0$ otherwise). The dual basis $\beta^{*}$ is determined uniquely by the original basis $\beta$. In particular, this shows that if $V$ is $n$-dimensional then so is the dual space $V^{*}$.
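If a numerical picture helps (this is my own illustration, not something from the book): identifying vectors in $\mathbb{R}^n$ with columns and linear functionals with rows, the dual basis of a basis $(v_1, \dots, v_n)$ consists of the rows of the inverse of the matrix whose columns are the $v_j$, because $(B^{-1}B)_{ij} = \delta_{ij}$ is exactly the condition $f_i(v_j) = \delta_{ij}$:

```python
import numpy as np

# A made-up (non-standard) basis of R^2, stored as the columns of B:
# v1 = (1, 0), v2 = (1, 2).
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])

# The rows of B^{-1} represent the dual basis functionals f_1, f_2.
dual = np.linalg.inv(B)

# Check the defining property f_i(v_j) = delta_ij:
print(dual @ B)  # identity matrix (up to rounding)
```

For the standard basis, $B$ is the identity, so the dual basis functionals are just the coordinate projections - which is exactly what we will find below.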

If $T \colon V \rightarrow V$ is a linear map, it induces a linear map $T^{*} \colon V^{*} \rightarrow V^{*}$ between the dual spaces by the formula $T^{*}(f)(v) = f(Tv)$ (that is, the linear functional $T^{*}(f)$ eats a vector $v \in V$, applies $T$ to it and then applies $f$ to the result).


In part $(a)$, you are asked to compute the linear functional $T^{*}(f)$ (you denote it by $T^{t}(f)$ but I think it is best to reserve this notation only for matrices in order to avoid some confusion). Let us try and do that. The expression $T^{*}(f)$ should be a linear functional on $\mathbb{R}^2$ so let us try and feed it with a vector $(x,y)$:

$$ (T^{*}(f))(x,y) = f(T(x,y)) = f(3x + 2y, x) = 2(3x + 2y) + x = 7x + 4y.$$

Thus, if we set $g(x,y) = 7x + 4y$, we see that $T^{*}(f) = g$.
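You can sanity-check this numerically (a quick sketch using my own identifications, not notation from the book): represent $f$ by the row vector $(2, 1)$ and $T$ by its standard matrix, so that $T^{*}(f) = f \circ T$ is the row vector $f \, [T]$:

```python
import numpy as np

f = np.array([2.0, 1.0])    # f(x, y) = 2x + y, as a row vector
T = np.array([[3.0, 2.0],
              [1.0, 0.0]])  # T(x, y) = (3x + 2y, x), as its standard matrix

g = f @ T                   # row vector representing T*(f) = f . T
print(g)                    # [7. 4.], i.e. g(x, y) = 7x + 4y
```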

In part $(b)$, you are asked to compute the matrix representation of the dual operator $T^{*}$ with respect to a given basis $(f_1,f_2)$. The basis $(f_1,f_2)$ is said to be the dual basis to the standard basis $(e_1,e_2)$. Let us try and write $f_1,f_2$ explicitly. A general linear functional $f$ on $\mathbb{R}^2$ has the form $f(x,y) = ax + by$ for some $a,b \in \mathbb{R}$. Writing $f_1(x,y) = ax + by$, we see that it must satisfy

$$ f_1(e_1) = f_1(1,0) = a = 1, \qquad f_1(e_2) = f_1(0,1) = b = 0 $$

and so $f_1(x,y) = x$. Similarly, $f_2(x,y) = y$ and so the dual basis acts on a vector $(x,y)$ simply by returning the coordinates of the vector. Now, in order to compute the matrix representation of $T^{*}$ with respect to the basis $(f_1,f_2)$, we must compute $T^{*}(f_1),T^{*}(f_2)$ and express the result in terms of $f_1, f_2$:

$$ T^{*}(f_1) = a f_1 + c f_2, \qquad T^{*}(f_2) = bf_1 + d f_2. $$

Having done that, we will know that

$$ [T^{*}]_{\beta^{*}} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} $$

(this has nothing to do with dual spaces, it simply follows from the definition of what it means to represent an operator as a matrix with respect to a given basis). In our case,

$$ (T^{*}(f_1))(x,y) = f_1(T(x,y)) = f_1(3x + 2y, x) = 3x + 2y = (3f_1 + 2f_2)(x,y), \\ (T^{*}(f_2))(x,y) = f_2(T(x,y)) = f_2(3x + 2y, x) = x = f_1(x,y) $$

and so $T^{*}(f_1) = 3f_1 + 2f_2, T^{*}(f_2) = f_1$ and

$$ [T^{*}]_{\beta^{*}} = \begin{pmatrix} 3 & 1 \\ 2 & 0 \end{pmatrix}. $$

Finally, for part $(c)$, we need to compute $[T]_{\beta}$ and so we need to compute $T(e_1),T(e_2)$ and express the result in terms of $e_1,e_2$:

$$ T(e_1) = T(1,0) = (3, 1) = 3e_1 + e_2, \\ T(e_2) = T(0,1) = (2, 0) = 2e_1 + 0 \cdot e_2 $$

and we get

$$ [T]_{\beta} = \begin{pmatrix} 3 & 2 \\ 1 & 0 \end{pmatrix}, \left( [T]_{\beta} \right)^{t} = \begin{pmatrix} 3 & 1 \\ 2 & 0 \end{pmatrix}. $$

You might notice that we got $[T^{*}]_{\beta^{*}} = \left( [T]_{\beta} \right)^t$, and in fact one can prove that this always holds when $\beta^{*}$ is the dual basis of $\beta$.
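In coordinates this is easy to check numerically (a sketch under the identifications used above: functionals as row vectors, $T$ as its standard matrix). The $i$-th column of $[T^{*}]_{\beta^{*}}$ holds the coordinates of $T^{*}(f_i)$ in $(f_1, f_2)$, and $T^{*}(f_i)$ is the row vector $e_i^{t}\,[T]$, i.e. the $i$-th row of $[T]$; stacking the rows as columns is precisely transposition:

```python
import numpy as np

T = np.array([[3.0, 2.0],
              [1.0, 0.0]])  # [T]_beta

# Column i of [T*]_{beta*} is the i-th row of T (coordinates of T*(f_i)).
T_star = np.column_stack([np.eye(2)[i] @ T for i in range(2)])

print(T_star)                        # [[3. 1.], [2. 0.]]
print(np.array_equal(T_star, T.T))   # True
```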