Proving that the input space of a surjective linear transformation has a dimension at least as large as the output space

linear-algebra, linear-transformations

Say we have a linear transformation $T : V \to W$. We know that $T$ is surjective or “onto $W$”.

I'm trying to prove that $n = \dim(V) \geq m = \dim(W)$. Intuitively, this makes sense. $V$ would have to be at least as “large” as $W$ in order for every vector in $W$ to be the image of at least one vector in $V$.

I'm having trouble sort of “wrapping up” this proof. Here's what I have.


Let $y \in W$. Since $T$ is surjective, there is some $x \in V$ with $T(x) = y$.

We know $x$ must be a linear combination of some basis for $V$. Say that basis is $\{v_1, \cdots, v_n\}$. $x = c_1v_1 + \cdots + c_nv_n$.

From the properties of a linear transformation we know that: $T(x) = c_1T(v_1) + \cdots + c_nT(v_n) = y$.

This tells me that $y$ is a linear combination of the set $\{T(v_1), \cdots, T(v_n)\}$. This means that some (non-strict) subset of $\{T(v_1), \cdots, T(v_n)\}$ must form a basis for $W$. A basis for $W$ contains exactly $m$ linearly independent vectors, so $n \geq m$.


My gut says this feels incomplete. I feel like I've made an assumption that the set $\{T(v_1), \cdots, T(v_n)\}$ is linearly independent because we know $\{v_1, \cdots, v_n\}$ is, but I haven't shown that. Any hints?

Best Answer

Your idea is good: if $\{v_1,\dots,v_n\}$ is a basis of $V$, then $\{T(v_1),\dots,T(v_n)\}$ is a spanning set for $W$, from which you can extract a basis. That basis has at most $n$ elements, so $\dim W\le\dim V$.
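As a quick numerical sanity check (a sketch, not part of the proof), here is a hypothetical surjective map $T\colon\mathbb{R}^3\to\mathbb{R}^2$ given by a full-row-rank matrix; the images of the standard basis vectors are its columns, and they span $W$:

```python
import numpy as np

# Hypothetical surjective T: R^3 -> R^2, represented by a matrix of full row rank.
T = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])

# Images of the standard basis of V = R^3 are the columns of T.
images = [T[:, i] for i in range(3)]

# These three vectors span W = R^2 exactly when their matrix has rank dim W.
print(np.linalg.matrix_rank(np.column_stack(images)))  # 2 = dim W
```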

How can you extract a basis? Suppose that $\{u_1,\dots,u_r\}$ is a spanning set of the vector space $U$. Then either the set is linearly dependent or independent. In the latter case you're finished. Otherwise, one vector is a linear combination of the others; without loss of generality, it can be taken as $u_r$ and then $\{u_1,\dots,u_{r-1}\}$ is again a spanning set. Repeat until you have to stop because the set you get is linearly independent.
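The same discard-the-redundant-vectors idea can be sketched computationally: keep a vector only if it increases the rank of what you have kept so far, and the survivors form a basis of the span. (This builds the basis up rather than whittling the spanning set down, but it removes exactly the redundant vectors; the function name and example are my own.)

```python
import numpy as np

def extract_basis(vectors):
    """Keep each vector that raises the rank of the set kept so far;
    the kept vectors are linearly independent and span the same space."""
    basis = []
    for v in vectors:
        candidate = basis + [v]
        if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
            basis.append(v)
    return basis

# A spanning set of R^2 with one redundant vector:
spanning = [np.array([1.0, 0.0]),
            np.array([2.0, 0.0]),   # dependent on the first, gets discarded
            np.array([0.0, 1.0])]
basis = extract_basis(spanning)
print(len(basis))  # 2 = dim R^2
```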


The rank-nullity theorem tells us even more: if $T\colon V\to W$ is a linear map, then $$ \dim V=\dim\ker T+\dim\operatorname{im}T $$ (where $\ker T$ is the kernel and $\operatorname{im}T$ is the image). If $T$ is surjective, then $\operatorname{im}T = W$, so $\dim W=\dim V-\dim\ker T\le \dim V$.
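The theorem is easy to verify numerically for any concrete matrix: the rank is $\dim\operatorname{im}T$ and, by rank-nullity, the nullity is the number of columns minus the rank. Using the same kind of example matrix as above (my own choice, not from the question):

```python
import numpy as np

# A surjective T: R^3 -> R^2, i.e. a 2x3 matrix of full row rank.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

dim_V, dim_W = A.shape[1], A.shape[0]
rank = np.linalg.matrix_rank(A)   # dim im T
nullity = dim_V - rank            # dim ker T, by rank-nullity

assert rank == dim_W              # surjective: the image is all of W
assert dim_V == nullity + rank    # rank-nullity theorem
print(dim_V, nullity, rank)       # 3 1 2
```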