Let us use upper indices for row(-vector)s and lower indices for column(-vector)s.
Let $x$ be an arbitrary vector, and $[x]$ its coordinates in the standard basis.
We will also consider two other bases $\{v_i\}$, $\{w_i\}$ and later the corresponding dual bases for the dual space of linear functionals (or 1-forms, or covectors).
Let $V=\left([v_1]\mid [v_2]\mid\ldots\mid [v_n]\right)$ be the matrix made of the coordinate column-vectors of the first basis, and similarly $W$ for the second basis $\{w_i\}$.
If the coordinates of a vector in the two bases are related by
$$
[x]_w=T[x]_v\tag{1},
$$
then, identifying the left-most and the right-most side in $V[x]_v=[x]=W[x]_w=W(T[x]_v)=(WT)[x]_v$, we have
$$V=WT\tag{2}$$
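As a quick numerical sanity check, here is a sketch in plain Python with hand-picked $2\times 2$ bases (the values of $V$ and $W$ below are illustrative, and the helper functions are minimal stand-ins for real matrix routines):

```python
# Two bases of R^2 (illustrative values), stored as the columns of V and W.
V = [[1.0, 1.0],
     [0.0, 1.0]]   # v1 = (1,0), v2 = (1,1)
W = [[2.0, 0.0],
     [0.0, 1.0]]   # w1 = (2,0), w2 = (0,1)

def matmul(A, B):
    # naive matrix product of nested lists
    return [[sum(A[i][k]*B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def inv2(M):
    # inverse of a 2x2 matrix by the adjugate formula
    (a, b), (c, d) = M
    det = a*d - b*c
    return [[ d/det, -b/det],
            [-c/det,  a/det]]

# From V = W T (equation (2)) we get T = W^{-1} V.
T = matmul(inv2(W), V)

# Check equation (1): [x]_w = T [x]_v for a sample coordinate vector.
xv = [[3.0], [-1.0]]            # [x]_v
x  = matmul(V, xv)              # standard coordinates [x] = V [x]_v
xw = matmul(inv2(W), x)         # [x]_w = W^{-1} [x]
assert xw == matmul(T, xv)
```

The values are chosen so that all intermediate results are exact in floating point; with arbitrary bases one would compare up to a tolerance rather than with `==`.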
Let $V^{\prime}=\left(\begin{smallmatrix} [v^1]\\ [v^2] \\ \vdots\\ [v^n]\end{smallmatrix}\right)$ be the matrix of stacked row-vectors of the first dual basis, and similarly $W^{\prime}$ for the second dual basis $\{w^i\}$.
Using the dual basis we can write $[x]_v=\left(\begin{smallmatrix} v^1(x)\\ v^2(x) \\ \vdots\\ v^n(x)\end{smallmatrix}\right)=V^{\prime}[x]$ and similarly for $[x]_w$.
From $TV^{\prime}[x]=T[x]_v=[x]_w=W^{\prime}[x]$ we have
$$W^{\prime}=TV^{\prime}\tag{3}$$
Finally, for any dual vector (co-vector) $\alpha$ we can write $[\alpha]=[\alpha]_{w^{\prime}}W^{\prime}=[\alpha]_{w^{\prime}}TV^{\prime}$ and therefore, $$[\alpha]_{v^{\prime}}=[\alpha]_{w^{\prime}}T\tag{4}$$
Equations (1)-(4) tell us in which direction and from which side to apply the transformation matrix to the coordinates of vectors, covectors, and their bases.
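The dual-basis relations (3) and (4) can be checked numerically as well. Below is a sketch in plain Python with hand-picked $2\times 2$ bases (all concrete values are illustrative); it uses the fact that the stacked dual-basis rows satisfy $V^{\prime}V=I$, i.e. $V^{\prime}=V^{-1}$:

```python
# Hand-picked bases of R^2 (illustrative values), stored as columns of V and W.
V = [[1.0, 1.0],
     [0.0, 1.0]]
W = [[2.0, 0.0],
     [0.0, 1.0]]

def matmul(A, B):
    # naive matrix product of nested lists
    return [[sum(A[i][k]*B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def inv2(M):
    # inverse of a 2x2 matrix by the adjugate formula
    (a, b), (c, d) = M
    det = a*d - b*c
    return [[d/det, -b/det], [-c/det, a/det]]

# Dual-basis rows satisfy V'V = I, so V' = V^{-1}; likewise W' = W^{-1}.
Vp = inv2(V)
Wp = inv2(W)
T  = matmul(inv2(W), V)       # T from V = W T, equation (2)

# Equation (3): W' = T V'.
assert Wp == matmul(T, Vp)

# Equation (4): [a]_{v'} = [a]_{w'} T, covector coordinates as row vectors.
aw = [[2.0, -4.0]]            # [a]_{w'}
a  = matmul(aw, Wp)           # standard coordinates [a] = [a]_{w'} W'
av = matmul(a, V)             # [a]_{v'} = [a] V, since [a] = [a]_{v'} V^{-1}
assert av == matmul(aw, T)
```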
Now let $V$ and $W$ denote vector spaces and $T\colon V\to W$ a linear map. Let $(v_1,\dots,v_n)$ be a basis for $V$, with dual basis $(v_1^*,\dots,v_n^*)$ for $V^*$; similarly, let $(w_1,\dots,w_m)$ be a basis for $W$, with dual basis $(w_1^*,\dots,w_m^*)$ for $W^*$. Suppose $Tv_j=\sum_{i=1}^ma_{ij}w_i$, so that the matrix for $T$ with respect to these bases is $[a_{ij}]$. Then
\begin{align*}
(T^*w_j^*)\left(\sum_{i=1}^n b_iv_i\right)&=w_j^*\left(T\sum_{i=1}^n b_iv_i\right)\\&=w_j^*\left(\sum_{i=1}^nb_iTv_i\right)\\&=w_j^*\left(\sum_{i=1}^nb_i\sum_{k=1}^ma_{ki}w_k\right)\\&=\sum_{i=1}^n\sum_{k=1}^mb_ia_{ki}w_j^*(w_k)\\&=\sum_{i=1}^n\sum_{k=1}^mb_ia_{ki}\delta_{jk}\\&=\sum_{i=1}^nb_ia_{ji}\\&=\sum_{i=1}^na_{ji}v_i^*\left(\sum_{r=1}^nb_rv_r\right)
\end{align*}
and so $T^*w_j^*=\sum_{i=1}^na_{ji}v_i^*$. Hence, the matrix representing $T^*$ with respect to these bases is $[a_{ji}]$, the transpose of the matrix for $T$.
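This computation can be mirrored numerically. The sketch below (plain Python, with an illustrative $2\times 3$ matrix $A$) represents $T$ in the standard bases, so the dual bases are just the coordinate functionals, and recovers the matrix of $T^*$ entry by entry:

```python
# T: R^3 -> R^2 given by the matrix A in the standard bases, so the dual
# bases are the standard coordinate functionals. A is an illustrative choice.
A = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0]]      # a_{ij}: i indexes the w's, j indexes the v's
n, m = 3, 2

def apply(M, x):
    # matrix-vector product of nested list and flat list
    return [sum(M[i][k]*x[k] for k in range(len(x))) for i in range(len(M))]

def w_star(j):
    # j-th dual basis functional on R^2: picks out the j-th coordinate
    return lambda y: y[j]

def T_star(f):
    # dual map: (T* f)(x) = f(T x)
    return lambda x: f(apply(A, x))

# Matrix of T*: its (i, j) entry is (T* w_j*)(e_i), where e_i is the
# i-th standard basis vector of R^3. This should be the transpose of A.
e = [[1.0 if k == i else 0.0 for k in range(n)] for i in range(n)]
B = [[T_star(w_star(j))(e[i]) for j in range(m)] for i in range(n)]
assert B == [[A[j][i] for j in range(m)] for i in range(n)]  # B = A^T
```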
The relationship between the dual map and the kernel of $T$ is that $T^*(W^*)$ is the annihilator of $\ker(T)$. Since $(V/U)^*\cong\text{Ann}(U)$ for any subspace $U\subset V$, in particular we have
$$
T^*(W^*)=\text{Ann}(\ker(T))\cong (V/\ker(T))^*
$$
so that $\dim(\text{range}(T^*))=\dim(V)-\dim(\ker(T))=\dim(\text{range}(T))$ by the rank-nullity theorem, proving that the rank of a matrix is invariant under transposition. You might also find a different way to prove that $\dim(\text{range}(T^*))=\dim(\text{range}(T))$, and from this deduce the rank-nullity theorem.
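The invariance of rank under transposition is also easy to observe numerically. Here is a sketch in plain Python (the matrix `A` is an illustrative choice; `rank` is a minimal Gauss-Jordan elimination over the rationals, using exact `Fraction` arithmetic to avoid floating-point pivoting issues):

```python
from fractions import Fraction

def rank(M):
    # row-reduce a copy over the rationals and count the pivot rows
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f*b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def transpose(M):
    return [list(col) for col in zip(*M)]

A = [[1, 2, 3],
     [2, 4, 6],     # a multiple of the first row, so rank(A) = 2
     [0, 1, 1]]
assert rank(A) == rank(transpose(A)) == 2
```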
As an aside, you can prove the rank-nullity theorem very easily by the first isomorphism theorem, which tells you that $\text{range}(T)\cong V/\ker(T)$. Comparing dimensions on both sides of the isomorphism yields
$$
\dim(\text{range}(T))=\dim(V/\ker(T))=\dim(V)-\dim(\ker(T)).
$$
Yes, $S^T$ maps $\Bbb{R} \to \Bbb{R}^3$. If $v$ is the column vector, then for $\lambda \in \Bbb{R}$, the map is given by
$$ S^T(\lambda) = \lambda \, v $$
The map is just scalar multiplication by the column vector. In fact, it is simply matrix multiplication: the product of the column vector (a $3 \times 1$ matrix) with the scalar (a $1 \times 1$ matrix).
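Concretely, a minimal sketch in plain Python (the vector `v` is a hypothetical choice, not from the original question):

```python
# S^T as the map R -> R^3 sending a scalar to (scalar) * v, written as the
# matrix product of a 3x1 matrix with a 1x1 matrix. v is illustrative.
v = [[1.0], [2.0], [3.0]]

def matmul(A, B):
    # naive matrix product of nested lists
    return [[sum(A[i][k]*B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def S_T(lam):
    return matmul(v, [[lam]])   # (3x1) times (1x1) gives a 3x1 result

assert S_T(2.0) == [[2.0], [4.0], [6.0]]
```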