Let $\textsf{T}: \textsf{V}\to \textsf{W}$ be a linear transformation and let $\beta =\{v_1,v_2,\dots,v_n\}$ and $\gamma =\{w_1,w_2,\dots,w_m\}$ be bases for $\textsf{V}$ and $\textsf{W}$, respectively.
To compute the matrix associated with $\textsf{T}$ with respect to $\beta$ and $\gamma$, we always, without exception, proceed the same way: compute each vector $\textsf{T}(v_j)$ and write it as a linear combination of the elements of $\gamma$, like this:
$$\textsf{T}(v_j)=A_{1j}w_1+A_{2j}w_2+\cdots+A_{mj}w_m$$
then, we put the coefficients
$$\begin{pmatrix} A_{1j} \\ A_{2j} \\ \vdots \\ A_{mj} \end{pmatrix}$$
in the $j$-th column of the matrix $A:= [\textsf{T}]_{\beta}^{\gamma}$ (for $j=1,2,\dots,n$).
Now, for your specific doubt, consider $\textsf{T}: \mathbb{R}^2 \to \mathbb{R}^2$ given by $\textsf{T}(a,b)=(-b,a)$ (the particular map doesn't matter) and the bases $\beta=\{(2,1),(3,5)\}$ and $\gamma=\{e_1,e_2\}$ (the standard basis of $\mathbb{R}^2$).
As I said above, we must first compute $\textsf{T}(2,1)$ and write the result as a linear combination of the elements of the second basis. In this case
$$\textsf{T}(2,1)=(-1,2)=(-1)e_1+2e_2$$
As you can see, the coefficients that go in the first column of the matrix look exactly like the vector $(-1,2)$ (maybe this is what you meant). To finish, note also that
$$\textsf{T}(3,5)=(-5,3)=(-5)e_1+3e_2$$
So
$$[\textsf{T}]_{\beta}^{\gamma} =\begin{pmatrix} -1 & -5 \\ 2 & 3 \end{pmatrix}$$
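As a sanity check, the procedure above can be reproduced numerically. Here is a minimal numpy sketch (the helper name `matrix_of` is my own choice): for each $v_j$ it solves for the $\gamma$-coordinates of $\textsf{T}(v_j)$ and stacks them as columns.

```python
import numpy as np

def matrix_of(T, beta, gamma):
    # Column j of [T]_beta^gamma holds the gamma-coordinates of T(v_j),
    # obtained by solving G @ x = T(v_j), where G has the w_i as columns.
    G = np.column_stack(gamma).astype(float)
    return np.column_stack([np.linalg.solve(G, T(v)) for v in beta])

T = lambda v: np.array([-v[1], v[0]])                  # T(a, b) = (-b, a)
beta = [np.array([2.0, 1.0]), np.array([3.0, 5.0])]
gamma = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]   # standard basis

A = matrix_of(T, beta, gamma)
print(A)   # [[-1. -5.]
           #  [ 2.  3.]]
```

Since $\gamma$ is the standard basis here, solving the system is trivial, but the same function works for any basis of the codomain.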
(apologies for my English; I hope this has clarified your doubt a bit)
Best Answer
I'll suppose the rank of a linear transformation $L$ is defined as the dimension of its image $L(V)=\{\, L(v)\mid\,v\in V\}$ and that the rank of an $m\times n$ matrix$~A$ is defined as the dimension of the subspace of$~K^m$ (with $K$ the base field, you may take $K=\Bbb R$) spanned by its columns.
First note that if $\{ v_1,\ldots, v_n\}$ is a basis of$~V$, then the vectors $L(v_1),\ldots, L(v_n)\in W$ span $L(V)$, since $L$ respects linear combinations. Also, the matrix $A$ above defines a linear transformation $L_A:K^n\to K^m$ by $L_A(v)=A\cdot v$. Since column$~j$ of $A$ is just $A\cdot e_j$ where $\{e_1,\ldots,e_n\}$ is the standard basis of $K^n$, we have that the subspace spanned by the columns of$~A$ is the subspace of $K^m$ spanned by $L_A(e_j)$ for $j=1,\ldots,n$, which by the above is the image of$~L_A$. So according to the definitions, the rank of $A$ is the rank of$~L_A$.
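The identification of the column space of $A$ with the image of $L_A$ can be checked numerically; a small sketch (the matrix below is an arbitrary example of mine) comparing the two:

```python
import numpy as np

# An arbitrary 3x3 matrix of rank 2 (third column = first + second).
A = np.array([[1.0, 0.0, 1.0],
              [2.0, 1.0, 3.0],
              [0.0, 1.0, 1.0]])

# L_A(e_j) = A @ e_j is exactly column j of A, so the span of the
# images of the standard basis vectors is the column space of A.
images = np.column_stack([A @ e for e in np.eye(3)])
assert np.array_equal(images, A)

print(np.linalg.matrix_rank(A))   # 2
```

So the rank of the matrix and the rank of the linear transformation $L_A$ are computed from the very same set of spanning vectors.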
Finally the bases of the question define isomorphisms $\phi_v:V\to K^n$ (finding coordinates of any vector in $V$ in the basis $\{ v_1,\ldots, v_n\}$) and $\psi_w:K^m\to W$ (using the $m$-tuple of scalars as coordinates in the basis $\{ w_1,\ldots, w_m\}$ to get a vector of$~W$), and the matrix $A$ of$~L$ with respect to these bases is by definition such that $L=\psi_w\circ L_A\circ\phi_v$ (given $v\in V$ find its coordinates by$~\phi_v$, multiply those by$~A$, and interpret the result as a vector of$~W$ via$~\psi_w$). The isomorphism $\psi_w$ then maps the image of $L_A$ to that of $L$, so the two linear transformations have the same rank, which is also the rank of $A$.
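The factorization $L=\psi_w\circ L_A\circ\phi_v$ can also be illustrated concretely. Reusing the map and bases from the first answer, the following sketch (the names `phi` and `psi` for the coordinate isomorphisms are mine) checks that the composite reproduces $L$ on an arbitrary vector:

```python
import numpy as np

T = lambda v: np.array([-v[1], v[0]])            # L(a, b) = (-b, a)
B = np.column_stack([(2.0, 1.0), (3.0, 5.0)])    # basis vectors of V as columns
G = np.eye(2)                                    # standard basis of W

phi = lambda v: np.linalg.solve(B, v)            # phi_v: vector -> beta-coordinates
psi = lambda c: G @ c                            # psi_w: coordinates -> vector of W
A = np.array([[-1.0, -5.0],                      # [L]_beta^gamma computed earlier
              [ 2.0,  3.0]])

v = np.array([7.0, -4.0])                        # an arbitrary test vector
assert np.allclose(psi(A @ phi(v)), T(v))        # L = psi_w ∘ L_A ∘ phi_v
```

Since `phi` and `psi` are invertible (the basis matrices are nonsingular), composing with them cannot change the dimension of the image, which is the point of the rank argument above.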