Let $(v_1,\dots,v_n)$ be a basis for $V$, and let $(v_1^*,\dots,v_n^*)$ be the dual basis for $V^*$. Similarly, let $(w_1,\dots,w_m)$ be a basis for $W$, with dual basis $(w_1^*,\dots,w_m^*)$ for $W^*$. Suppose $Tv_j=\sum_{i=1}^ma_{ij}w_i$, so that the matrix for $T$ with respect to these bases is $[a_{ij}]$. Then
\begin{align*}
(T^*w_j^*)\left(\sum_{i=1}^n b_iv_i\right)&=w_j^*\left(T\sum_{i=1}^n b_iv_i\right)\\&=w_j^*\left(\sum_{i=1}^nb_iTv_i\right)\\&=w_j^*\left(\sum_{i=1}^nb_i\sum_{k=1}^ma_{ki}w_k\right)\\&=\sum_{i=1}^n\sum_{k=1}^mb_ia_{ki}w_j^*(w_k)\\&=\sum_{i=1}^n\sum_{k=1}^mb_ia_{ki}\delta_{kj}\\&=\sum_{i=1}^nb_ia_{ji}\\&=\sum_{i=1}^na_{ji}v_i^*\left(\sum_{r=1}^nb_rv_r\right)
\end{align*}
and so $T^*w_j^*=\sum_{i=1}^na_{ji}v_i^*$. Hence, the matrix representing $T^*$ with respect to these bases is $[a_{ji}]$, the transpose of the matrix for $T$.
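If a concrete check helps, here is a minimal numpy sketch (an illustration in the standard bases of $\mathbb R^n$ and $\mathbb R^m$, not part of the proof) confirming that the dual map acts by the transpose:

```python
import numpy as np

# A small numerical sanity check: in the standard bases and their duals,
# the dual map T* should be represented by A^T.
rng = np.random.default_rng(0)
n, m = 4, 3
A = rng.standard_normal((m, n))     # matrix of T : R^n -> R^m

c = rng.standard_normal(m)          # coefficients of a functional phi in (R^m)*
x = rng.standard_normal(n)          # an arbitrary test vector in R^n

# (T* phi)(x) = phi(T x) = c @ (A @ x), and this equals (A.T @ c) @ x,
# so the coefficient vector of T* phi is A.T @ c: the matrix of T* is A.T.
assert np.isclose(c @ (A @ x), (A.T @ c) @ x)
```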
The relationship between the dual map and the kernel of $T$ is that the range of $T^*$, namely $T^*(W^*)$, is the annihilator of $\ker(T)$. Since $(V/U)^*\cong\text{Ann}(U)$ for any subspace $U\subseteq V$, in particular we have
$$
T^*(W^*)=\text{Ann}(\ker(T))\cong (V/\ker(T))^*
$$
so that $\dim(\text{range}(T^*))=\dim\bigl((V/\ker(T))^*\bigr)=\dim(V)-\dim(\ker(T))=\dim(\text{range}(T))$, where the last equality is the rank-nullity theorem. This proves that the rank of a matrix is invariant under transposition. You might also look for a different way to prove that $\dim(\text{range}(T^*))=\dim(\text{range}(T))$, and from this deduce the rank-nullity theorem.
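As a quick computational sanity check (the random matrices here are my own illustration, not part of the argument), you can verify the invariance numerically:

```python
import numpy as np

# Spot-check: the rank of a matrix equals the rank of its transpose.
rng = np.random.default_rng(1)
for _ in range(100):
    m, n = rng.integers(1, 8, size=2)
    A = rng.standard_normal((m, n))
    assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)
```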
As an aside, you can prove the rank-nullity theorem very easily by the first isomorphism theorem, which tells you that $\text{range}(T)\cong V/\ker(T)$. Comparing dimensions on both sides of the isomorphism yields
$$
\dim(\text{range}(T))=\dim(V/\ker(T))=\dim(V)-\dim(\ker(T)).
$$
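You can also check this bookkeeping in exact arithmetic; here is a short sympy sketch with a made-up matrix (my example, not one from the text):

```python
import sympy as sp

# Rank-nullity spot-check in exact arithmetic:
# dim(range T) + dim(ker T) should equal dim V, the number of columns.
A = sp.Matrix([[1, 2, 0, -1],
               [2, 4, 1,  0],
               [3, 6, 1, -1]])
rank = A.rank()                # dim(range T)
nullity = len(A.nullspace())   # dim(ker T), computed independently
assert rank + nullity == A.cols
```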
That's correct: the dimension of the column space equals the dimension of the row space. More precisely, if $A$ is an $m \times n$ matrix, then
$$\operatorname{rank}(A) = \dim(RS(A)) = \dim(CS(A)),$$
where $RS(A)$ and $CS(A)$ denote the row space and column space of $A$, respectively. So when you reduce a matrix to echelon form, the number of pivots must equal $\operatorname{rank}(A)$. What you have done is find a basis $B_1$ for the row space of $A$ and a basis $B_2$ for the column space of $A$. Then you noticed that $B_1$ and $B_2$ each contain 3 linearly independent vectors, from which we can conclude
$$ \dim(RS(A)) = \dim(CS(A)) = 3.$$
Now, since this is a $3 \times 4$ matrix, each row vector lies in $\mathbb{R}^4$ and each column vector in $\mathbb{R}^3$. So no, there is no disagreement in the dimensions; you are actually correct. If you had an $n \times n$ matrix, the row and column vectors would both lie in $\mathbb{R}^n$; however, in your example the matrix isn't square.
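To see this concretely, here is a short sympy sketch with a hypothetical rank-$3$ matrix standing in for the one in your question (which isn't reproduced here):

```python
import sympy as sp

# A hypothetical 3x4 matrix of rank 3: rows live in R^4, columns in R^3,
# yet the row space and column space have the same dimension.
A = sp.Matrix([[1, 0, 2, 1],
               [0, 1, 1, 3],
               [0, 0, 1, 5]])
_, pivots = A.rref()
print(len(pivots))             # 3 pivots, so rank(A) = 3
print(A.rank(), A.T.rank())    # dim CS(A) = dim RS(A) = 3
```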
Best Answer
Hints:
$A=\mathbf v\mathbf w^T\implies\operatorname{rank}A=1$ should be pretty easy to prove directly. Multiply a vector in $\mathbb R^m$ by $A$ and see what you get.
For the other direction, think about what $A$ does to the basis vectors of $\mathbb R^m$ and what this means about the columns of $A$.
Solution
Suppose $A=\mathbf v\mathbf w^T$ with $\mathbf v\in\mathbb R^n$ and $\mathbf w\in\mathbb R^m$ both nonzero. If $\mathbf u\in\mathbb R^m$, then $A\mathbf u=\mathbf v\mathbf w^T\mathbf u=(\mathbf u\cdot\mathbf w)\mathbf v$. Thus $A$ maps every vector in $\mathbb R^m$ to a scalar multiple of $\mathbf v$, and choosing $\mathbf u$ with $\mathbf u\cdot\mathbf w\neq 0$ shows the image is all of $\operatorname{span}\{\mathbf v\}$, hence $\operatorname{rank}A=\dim\operatorname{im}A=1$.
Now assume $\operatorname{rank}A=1$. Then the image of $A$ is a one-dimensional subspace of $\mathbb R^n$, spanned by some fixed nonzero $\mathbf v$; that is, for each $\mathbf u\in\mathbb R^m$ we have $A\mathbf u=k\mathbf v$ for some scalar $k$ depending on $\mathbf u$. In particular, this holds for the standard basis vectors of $\mathbb R^m$, so every column of $A$ is a multiple of $\mathbf v$, say the $i$th column is $w_i\mathbf v$. That is, $$ A=\pmatrix{w_1\mathbf v & w_2\mathbf v & \cdots & w_m\mathbf v}=\mathbf v\pmatrix{w_1&w_2&\cdots&w_m}=\mathbf v\mathbf w^T. $$
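If you want to test both directions numerically, here is a small numpy sketch (assuming, as above, that $\mathbf v$ and $\mathbf w$ are nonzero; the recovery step mirrors the column argument):

```python
import numpy as np

rng = np.random.default_rng(2)
v = rng.standard_normal(4)                 # v in R^n, n = 4
w = rng.standard_normal(3)                 # w in R^m, m = 3

# Forward: an outer product v w^T has rank 1.
A = np.outer(v, w)
assert np.linalg.matrix_rank(A) == 1

# Backward: recover a factorization of a rank-1 matrix. Any nonzero column
# plays the role of v; the coefficients expressing every column as a
# multiple of it form w.
j = int(np.flatnonzero(np.linalg.norm(A, axis=0))[0])   # a nonzero column
v2 = A[:, j]
w2 = v2 @ A / (v2 @ v2)                    # w2[k] with column k = w2[k] * v2
assert np.allclose(np.outer(v2, w2), A)
```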