[Math] Is $\text{Rank}(T) = \text{Dim}(V)$ all the time?

linear algebra

Theorem
Let $V$ and $W$ be vector spaces and let $T:V \to W$ be linear

If $\beta = \{ v_1,\dots ,v_n \}$ is a basis for $V$
then $$ R(T)=\text{span}(T(\beta))=\text{span}(\{ T(v_1),\dots,T(v_n) \} ) $$
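For a quick illustration of why $T(\beta)$ only spans $R(T)$ and need not be a basis for it (this example is my own, not part of the theorem): take $T:\mathbb{R}^2 \to \mathbb{R}^2$ defined by $T((a,b)) = (a+b, 0)$ and the standard basis $\beta = \{ (1,0), (0,1) \}$. Then $$ R(T)=\text{span}(\{ T((1,0)), T((0,1)) \})=\text{span}(\{ (1,0), (1,0) \})=\text{span}(\{ (1,0) \}), $$ so $\text{Rank}(T) = 1$ even though $\beta$ has $2$ vectors: the images span $R(T)$ but are not linearly independent.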


Dimension Theorem

Let $V$ and $W$ be vector spaces and let $T:V \to W$ be linear

If $V$ is finite dimensional then $\text{Nullity}(T)+\text{Rank}(T)=\text{dim}(V)$
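Two quick sanity checks (examples I am adding, not part of the theorem statement): for the identity map $I:V \to V$ we have $\text{Nullity}(I) = 0$ and $\text{Rank}(I) = \dim(V)$, so $0 + \dim(V) = \dim(V)$; for the zero map $Z:V \to W$ we have $\text{Nullity}(Z) = \dim(V)$ and $\text{Rank}(Z) = 0$, so $\dim(V) + 0 = \dim(V)$.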


My impression is that $\text{Dim}(V)=\text{Rank}(T)$: if $\text{dim}(V)=2$, there are $2$ vectors in the basis, and applying $T$ to that basis should produce a basis for the image of $T$, so $\text{Rank}(T)=\text{dim}(V)$.

I asked the teacher when class was over but was told that this is not the case. I did get an answer from the professor, but I do not know whether he did not understand my question or I did not understand his answer.

Best Answer

Let's look at your specific claim: that if $\dim V$ is $2$, then there are $2$ vectors in a basis of $V$ (call them $v_{1}, v_{2}$) and the images $T(v_{1}), T(v_{2})$ form a basis for the image of $T$. This is a perfectly reasonable-sounding thing to suggest - unfortunately, it's also totally false!

Here's an explicit example for you. Consider the linear transformation $T \colon \mathbb{R}^{2} \to \mathbb{R}^{1}$ given by projection onto the first coordinate, i.e. $T((a, b)) = a$ for all $(a, b) \in \mathbb{R}^{2}$. This is a perfectly good linear transformation, as you can check yourself. The images of the standard basis vectors $(1, 0), (0, 1)$ are given by $T((1, 0)) = 1$ and $T((0, 1)) = 0$. But these are clearly linearly dependent elements of the image of $T$, since any set of vectors that includes the zero vector is linearly dependent.

It might seem like the example above depended on the particular choice of basis, but this isn't the case. It's not hard to show that the linear transformation $T$ defined above is surjective, i.e. the image of $T$ is all of $\mathbb{R}^{1}$. The dimension of $\mathbb{R}^{1}$ (as an $\mathbb{R}$-vector space, of course) is $1$, so any two vectors in $\mathbb{R}^{1}$ are linearly dependent. Thus, the example above would've worked for any basis of $\mathbb{R}^{2}$: the two images would always be linearly dependent. The problem is that we can perfectly reasonably have linear transformations from higher-dimensional vector spaces onto lower-dimensional vector spaces over the same field. If this is the case, then the image of $T$ has dimension bounded by the dimension of the target space, so the rank of $T$ can't be as big as the dimension of $V$.
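In symbols, that last sentence is just the chain $$\text{Rank}(T) = \dim(R(T)) \le \dim(W) < \dim(V),$$ where the middle inequality holds because $R(T)$ is a subspace of $W$, and the last one is the assumption that the target space has smaller dimension.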

This is concretely realized in the form of so-called 'nullity', or having a nontrivial kernel. In the example above, any element of the form $(0, a) \in \mathbb{R}^{2}$ gets sent to $0$ by $T$. I encourage you to play around with examples of linear transformations to get a better understanding of why rank-nullity works the way it does, and to see why your specific claim is false.
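To spell this out for the projection example (just bookkeeping from the definitions): the kernel is $N(T) = \{ (0,b) : b \in \mathbb{R} \}$, which is $1$-dimensional, and the image is all of $\mathbb{R}^{1}$, which is also $1$-dimensional, so $$\text{Nullity}(T) + \text{Rank}(T) = 1 + 1 = 2 = \dim(\mathbb{R}^{2}),$$ exactly as the dimension theorem requires.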

Let me also prove mathers101's claim in the comments. Let $T \colon V \to W$ be a linear transformation between vector spaces with scalar field $F$ (if you have only worked with real or complex vector spaces, just take $F = \mathbb{R}$ or $F = \mathbb{C}$), and suppose for $v_{1}, \ldots, v_{n} \in V$, the images $T(v_{1}), \ldots, T(v_{n})$ are linearly independent in $W$. Then we claim that $v_{1}, \ldots, v_{n}$ are linearly independent elements of $V$. Suppose not, i.e. there exist $a_{1}, \ldots, a_{n} \in F$ not all zero such that $a_{1}v_{1} + \cdots + a_{n}v_{n} = 0$. Then

$$0 = T(0) = T(a_{1}v_{1} + \cdots + a_{n}v_{n}) = a_{1}T(v_{1})+\cdots+a_{n}T(v_{n})$$

and since the $a_{i}$ are not all zero, this contradicts the linear independence of $T(v_{1}), \ldots, T(v_{n})$.
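Note that the converse of this claim fails, and that failure is precisely your situation: in the projection example, $(1, 0)$ and $(0, 1)$ are linearly independent in $\mathbb{R}^{2}$, yet their images $1$ and $0$ are linearly dependent in $\mathbb{R}^{1}$. Independence of the images forces independence of the preimages, but not the other way around.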
