[Math] Show that $V^*$, the set of all linear transformations from $V$ to $\mathbb{R}$, is a vector space

linear-algebra, linear-transformations, vector-spaces

$V$ is a vector space, and $V^*$ is the set of all linear transformations from $V$ to $\mathbb{R}$.

a) Show that $V^*$ is a vector space.

b) Suppose $\{v_1,\dots,v_n\}$ is a basis for $V$. For $i = 1,\dots ,n$ define $f_i \in V^*$ by
$f_i(a_1 v_1+\dots+ a_n v_n) = a_i$ for constants $a_i$. Prove that $\{f_1,\dots,f_n\}$ is a basis for $V^*$.

c) Deduce that whenever $V$ is finite-dimensional, $\dim V^* = \dim V$.


Here is my attempt at answers. Please help me point out any inconsistencies or wrong statements.

a) Let $T_1, T_2$ belong to $V^*$.

$(T_1+T_2)A = T_1 A + T_2 A$ by matrix distributive property.

$(cT_1)A = c(T_1 A) = T_1 (cA)$ by associative property of matrices (scalar mult).

Since $V^*$ is a set of matrices, the $0$-vector is a matrix filled with zeros.

This proves that $V^*$ is a subspace of $V$, in turn proving $V^*$ is a vector space.

b) Note that $f$ is uniquely valued for each $v_j$ in $V$, for $j$ in $[1,n]$.

Furthermore, $f_i (v_j) = 1$ for $i=j$, $0$ otherwise.

Consider the linear combination $a_1 f_1 + \cdots + a_n f_n=0$.

Taking the value at every $v_j$, we get $a_j=0$ showing that $\{f_1,\ldots, f_n\}$ is linearly independent.

Let $g$ be a vector in $V^*$. Assume $g$ cannot be written as $a_1 f_1 + \cdots + a_n f_n$. Then the value of $g$ is zero. This shows that every member of $V^*$ is either $0$ or linear combination of $a_1 f_1 + \cdots + a_n f_n$. This proves that $a_1 f_1 + \cdots + a_n f_n$ spans $V^*$.

Thus, $a_1 f_1 + \cdots + a_n f_n$ is a basis for $V^*$.

c) It follows from part (b), since the bases of both spaces have size $n$.

Best Answer

You asked for pointers to inconsistencies or inaccuracies, so I'll do that. The most serious ones are in part (a).


$(T_1+T_2)A=T_1A+T_2A$ is true, but it is not the matrix distributive property which makes it true, since matrix multiplication is not the only linear transformation, and indeed it doesn't even make sense for $V\neq F^n$ (with $F$ being $\Bbb R$ or $\Bbb C$ or a generic field, depending on your definition).

Similarly your other property is true, but it has nothing to do with matrices.

These two properties do not show directly that $V^*$ is a vector space, but rather that it is a subspace of a larger space $W$. Certainly if it is a subspace of $W$ then it is a vector space, but you never mention which larger space $W$ you are working in.

Actually, that's not strictly true; you imply that $W=V$ in your last line, which is false. For instance, if $V=\Bbb R$ and $f$ is defined by $f(x)=x$, then $f$ is not a real number, so it is not even true that $V\subseteq V^*$.
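One way to patch part (a) — a sketch, assuming $W$ is taken to be the space of *all* functions from $V$ to $\mathbb{R}$ with pointwise operations — is to check closure of $V^*$ inside $W$ directly:

```latex
% W = all functions V -> R, with pointwise operations:
%   (S+T)(v) = S(v) + T(v),   (cT)(v) = c\,T(v).
% Closure of V^* under addition, checked on an arbitrary u + \lambda v:
\begin{align*}
(T_1+T_2)(u+\lambda v)
  &= T_1(u+\lambda v) + T_2(u+\lambda v)
     && \text{pointwise definition}\\
  &= T_1(u) + \lambda T_1(v) + T_2(u) + \lambda T_2(v)
     && \text{linearity of } T_1, T_2\\
  &= (T_1+T_2)(u) + \lambda\,(T_1+T_2)(v),
\end{align*}
```

so $T_1+T_2$ is again linear; the check for $cT_1$ is analogous, and the zero function is linear, so $V^*$ is a subspace of $W$.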


Your proof of linear independence is great. Rough around the edges, but logically sound.

Span has some issues. First, it is not true that $0$ cannot be written as $a_1f_1+\cdots+a_nf_n$; set $a_1=\cdots=a_n=0$.

Even if you add the condition that "not all $a_i$ are zero", you still must prove that $0$ is the only linear transformation that cannot be written this way. A priori, this is far from obvious.

As a more helpful comment: the usual method for proving that $\operatorname{span} B = U$ is to say "Suppose $g\in U$ is an arbitrary vector" and then construct explicitly some $a_i$ such that $\sum a_i b_i=g$ (where $b_i\in B$ are basis elements).
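For this particular problem, that construction can be sketched as follows: given $g\in V^*$, set $a_i = g(v_i)$. Then

```latex
\Bigl(\sum_{i=1}^n a_i f_i\Bigr)(v_j)
  = \sum_{i=1}^n g(v_i)\, f_i(v_j)
  = g(v_j) \qquad \text{for } j = 1,\dots,n,
```

so $\sum_i a_i f_i$ and $g$ agree on a basis of $V$; since both are linear, they are equal, and $\{f_1,\dots,f_n\}$ spans $V^*$.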


The third part is perfect.
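As a concrete numerical sanity check (a sketch, assuming $V=\mathbb{R}^2$ so that functionals can be represented as row vectors; the basis matrix `B` below is an arbitrary example, not from the original question): if the columns of `B` are the basis vectors $v_j$, then the dual basis functionals $f_i$ are the rows of $B^{-1}$, because $B^{-1}B = I$ says exactly that $f_i(v_j) = \delta_{ij}$.

```python
import numpy as np

# Columns of B are the basis vectors v_1, v_2 of V = R^2 (example basis).
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Dual basis: row i of inv(B) is the functional f_i, since inv(B) @ B = I.
F = np.linalg.inv(B)

# Check f_i(v_j) = 1 if i == j, 0 otherwise:
print(np.allclose(F @ B, np.eye(2)))  # True

# Any functional g (a row vector) decomposes as g = sum_i g(v_i) f_i:
g = np.array([3.0, -2.0])
coeffs = g @ B               # a_i = g(v_i), evaluated on each basis column
print(np.allclose(coeffs @ F, g))  # True
```

This mirrors the proof: evaluating $g$ on the basis vectors produces the coefficients, and the resulting combination of the $f_i$ reconstructs $g$.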
