[Math] Relation between the dual space, transpose matrices and rank-nullity theorem

duality-theorems, linear-algebra, matrix-rank, transpose, vector-spaces

Summing up: how can one use linear functionals, transpose matrices, the equality of row and column rank, and annihilators to prove the rank-nullity theorem?


While studying linear algebra, I'm trying to pin down the precise relation between the following concepts: given two vector spaces $V$ and $W$, I can form the dual spaces $V^*$ and $W^*$ by taking all the linear functionals on $V$ and $W$. A basis of each of those spaces lifts to a dual basis (I still have to carry out this proof, but I have an idea of how to do so). If I have a linear transformation $T: V \to W$, I have a natural way of defining a transformation $T^*: W^* \to V^*$ by composition: given a functional $g \in W^*$, I define $f \in V^*$ by:
$$f(\alpha) = g(T\alpha)$$

This way of defining $T^*$ is familiar to me, as it is similar to a construction commonly done on modules over a ring $R$. Now, if I represent $T$ as a matrix on a choice of bases for $V$ and $W$, and $T^*$ as the matrix on the lifted dual bases, then $T^*$ will be the transpose of $T$ (the exchange of rows and columns of $T$). I don't see clearly why this happens. Furthermore, the rank of $T$ equals the rank of $T^*$, which proves that the column rank of a matrix equals its row rank. Given that the rank is the dimension of the image subspace, I think this can be shown by carrying the image of $V$ in $W$ over to the duals.

I also read that there is a relation between the duals and the kernel of a linear transformation (the annihilator?), but that isn't very clear to me either. Using that and the facts above, one can prove the rank-nullity theorem. The reason I'm asking is that I was given a proof of the rank-nullity theorem that doesn't use linear functionals, and it seemed to depend only on $T$ having the same row and column rank (which was proved by clever manipulation of some equations on vectors). That proof didn't give me a satisfactory intuition for the rank-nullity theorem, especially when $V \neq W$. I believe that a proof through linear functionals would be more enlightening.

EDIT: I was following the treatment given in Hoffman and Kunze's Linear Algebra, sections 3.5 through 3.7 (linear functionals, annihilators, and transposes), if that's of interest.

Best Answer

Let $(v_1,\dots,v_n)$ be a basis for $V$, and form a dual basis $(v_1^*,\dots,v_n^*)$ for $V^*$. Similarly let $(w_1,\dots,w_m)$ be a basis for $W$, and form a dual basis $(w_1^*,\dots,w_m^*)$ for $W^*$. Suppose $Tv_j=\sum_{i=1}^ma_{ij}w_i$, so that the matrix for $T$ with respect to these bases is $[a_{ij}]$. Then \begin{align*} (T^*w_j^*)\left(\sum_{i=1}^n b_iv_i\right)&=w_j^*\left(T\sum_{i=1}^n b_iv_i\right)\\&=w_j^*\left(\sum_{i=1}^nb_iTv_i\right)\\&=w_j^*\left(\sum_{i=1}^nb_i\sum_{k=1}^ma_{ki}w_k\right)\\&=\sum_{i=1}^n\sum_{k=1}^mb_ia_{ki}w_j^*(w_k)\\&=\sum_{i=1}^n\sum_{k=1}^mb_ia_{ki}\delta(k=j)\\&=\sum_{i=1}^nb_ia_{ji}\\&=\sum_{i=1}^na_{ji}v_i^*\left(\sum_{r=1}^nb_rv_r\right) \end{align*} and so $T^*w_j^*=\sum_{i=1}^na_{ji}v_i^*$. Hence, the matrix representing $T^*$ with respect to these bases is $[a_{ji}]$, the transpose of the matrix for $T$.
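If it helps, the computation above can be sanity-checked numerically. In coordinates, $w_j^*$ reads off the $j$-th coordinate, so $(T^*w_j^*)(v)=w_j^*(Tv)=(Ab)_j$ where $A$ is the matrix of $T$ and $b$ the coordinate vector of $v$. A minimal sketch with NumPy (the matrix and vector are arbitrary made-up examples):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 5
A = rng.integers(-5, 6, size=(m, n))   # matrix of T : V -> W in the chosen bases
b = rng.integers(-5, 6, size=n)        # coordinates of some v in V

# In coordinates, w_j^* picks out the j-th coordinate, so
# (T^* w_j^*)(v) = w_j^*(T v) = (A b)_j.  The claim is that the j-th
# column of A^T holds the coordinates of T^* w_j^* in the dual basis of V^*.
for j in range(m):
    assert (A @ b)[j] == A.T[:, j] @ b
```

Since the check holds for every $j$ and arbitrary coordinates $b$, the matrix representing $T^*$ is exactly $A^T$.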

The relationship between the dual map and the kernel of $T$ is that $T^*(W^*)$ is the annihilator of $\ker(T)$. Since $(V/U)^*\cong\text{Ann}(U)$ for any subspace $U\subset V$, in particular we have $$ T^*(W^*)=\text{Ann}(\ker(T))\cong (V/\ker(T))^* $$ so that $\dim(\text{range}(T^*))=\dim(V)-\dim(\ker(T))=\dim(\text{range}(T))$ by the rank-nullity theorem, proving that the rank of a matrix is invariant under transposition. You might also find a different way to prove that $\dim(\text{range}(T^*))=\dim(\text{range}(T))$, and from this deduce the rank-nullity theorem.
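These dimension counts can also be checked numerically. The sketch below (using NumPy's SVD on a made-up rank-deficient matrix) verifies that every functional in $\text{range}(T^*)$ annihilates $\ker(T)$, and that the ranks of $A$ and $A^T$ agree with $\dim(V)-\dim(\ker T)$:

```python
import numpy as np

rng = np.random.default_rng(1)
# A deliberately rank-deficient 4x6 matrix, so T : R^6 -> R^4 has a kernel.
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 6))

# Basis of ker(T): right singular vectors for (numerically) zero singular values.
_, s, Vt = np.linalg.svd(A)
kernel = Vt[np.sum(s > 1e-10):]            # rows span ker(T)

# Every functional in range(T*) has the form x -> y.(Ax); it kills ker(T).
y = rng.standard_normal(4)
assert np.allclose(kernel @ (A.T @ y), 0)  # range(T*) lies in Ann(ker(T))

rank = np.linalg.matrix_rank(A)
assert np.linalg.matrix_rank(A.T) == rank      # rank is invariant under transpose
assert rank == A.shape[1] - kernel.shape[0]    # dim range(T) = dim V - dim ker(T)
```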

As an aside, you can prove the rank-nullity theorem very easily by the first isomorphism theorem, which tells you that $\text{range}(T)\cong V/\ker(T)$. Comparing dimensions on both sides of the isomorphism yields $$ \dim(\text{range}(T))=\dim(V/\ker(T))=\dim(V)-\dim(\ker(T)). $$
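The same dimension count can be made concrete through Gaussian elimination: pivot columns count $\dim(\text{range}(T))$ and free columns count $\dim(\ker(T))$, so the two necessarily sum to $\dim(V)$. A small self-contained sketch over the rationals (the matrix is a made-up example with a deliberate row dependency):

```python
from fractions import Fraction

def rref_pivots(rows, ncols):
    """Row-reduce over the rationals; return the list of pivot-column indices."""
    rows = [[Fraction(x) for x in r] for r in rows]
    pivots, r = [], 0
    for c in range(ncols):
        piv = next((i for i in range(r, len(rows)) if rows[i][c] != 0), None)
        if piv is None:
            continue                        # free column: contributes to ker(T)
        rows[r], rows[piv] = rows[piv], rows[r]
        rows[r] = [x / rows[r][c] for x in rows[r]]
        for i in range(len(rows)):
            if i != r and rows[i][c] != 0:
                rows[i] = [a - rows[i][c] * b for a, b in zip(rows[i], rows[r])]
        pivots.append(c)
        r += 1
    return pivots

# T : Q^5 -> Q^3; the third row is the sum of the first two, so rank(T) = 2.
A = [[1, 2, 0, 1, 3],
     [0, 1, 1, 2, 1],
     [1, 3, 1, 3, 4]]
n = 5
pivots = rref_pivots(A, n)
rank = len(pivots)           # dim range(T): one pivot per independent column
nullity = n - rank           # dim ker(T): one basis kernel vector per free column
assert rank + nullity == n   # rank-nullity
print(rank, nullity)         # prints: 2 3
```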