By definition of the matrices $A,B$, we have $\psi(v_j) = \sum_k [A]_{kj} v_k$, and $\psi^t(\phi_j) = \sum_k[B]_{kj} \phi_k$.
Then we have $\phi_i(\psi(v_j)) = [A]_{ij}$, and $(\psi^t(\phi_j))(v_i) =[B]_{ij} $.
Since $(\psi^t(\phi_j))(v_i) = \phi_j(\psi(v_i)) = [A]_{ji}$, comparing the two displayed formulas gives $[B]_{ij}=[A]_{ji}$ for all $i,j$; that is, $A=B^T$.
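The identity can be checked numerically. Here is a minimal Python sketch, assuming $V=\Bbb R^3$ with the standard basis $v_i=e_i$ and its dual basis $\phi_i(v)=v_i$; the concrete matrix `A` is an illustrative choice, not part of the question:

```python
# Check A = B^T, with V = R^3, v_i = e_i, and phi_i(v) = v[i].
n = 3
A = [[2, 0, 1],
     [1, 3, 0],
     [0, 1, 4]]          # matrix of psi: psi(v_j) = sum_k A[k][j] v_k

def psi(v):
    return [sum(A[k][j] * v[j] for j in range(n)) for k in range(n)]

def phi(i, v):            # dual basis functional phi_i
    return v[i]

# [B]_{ij} = (psi^t(phi_j))(v_i) = phi_j(psi(v_i))
e = [[1 if k == i else 0 for k in range(n)] for i in range(n)]
B = [[phi(j, psi(e[i])) for j in range(n)] for i in range(n)]

# B is exactly the transpose of A
assert B == [[A[j][i] for j in range(n)] for i in range(n)]
```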
I'll suppose the rank of a linear transformation $L$ is defined as the dimension of its image $L(V)=\{\, L(v)\mid\,v\in V\}$ and that the rank of an $m\times n$ matrix$~A$ is defined as the dimension of the subspace of$~K^m$ (with $K$ the base field, you may take $K=\Bbb R$) spanned by its columns.
First note that if $\{ v_1,\ldots, v_n\}$ is a basis of$~V$, then the vectors $L(v_1),\ldots, L(v_n)\in W$ span $L(V)$, since $L$ respects linear combinations. Also, the matrix $A$ above defines a linear transformation $L_A:K^n\to K^m$ by $L_A(v)=A\cdot v$. Since column$~j$ of $A$ is just $A\cdot e_j$ where $\{e_1,\ldots,e_n\}$ is the standard basis of $K^n$, we have that the subspace spanned by the columns of$~A$ is the subspace of $K^m$ spanned by $L_A(e_j)$ for $j=1,\ldots,n$, which by the above is the image of$~L_A$. So according to the definitions, the rank of $A$ is the rank of$~L_A$.
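Both observations can be tried on a small example. In the Python sketch below, the matrix `A`, the test vector `v`, and the elimination helper `rank` are all illustrative choices:

```python
from fractions import Fraction

def rank(M):
    """Number of pivots after Gaussian elimination over Q (= rank of M)."""
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

A = [[1, 2, 3],
     [0, 1, 1],
     [1, 3, 4]]                  # third column = first + second
cols = [[A[i][j] for i in range(3)] for j in range(3)]  # L_A(e_j) = column j

# L_A(v) = A.v is the combination sum_j v_j * (column j),
# so the columns span the image of L_A
v = [2, -1, 5]
Av = [sum(A[i][j] * v[j] for j in range(3)) for i in range(3)]
assert Av == [sum(v[j] * cols[j][i] for j in range(3)) for i in range(3)]
assert rank(A) == 2              # the image of L_A is 2-dimensional
```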
Finally the bases of the question define isomorphisms $\phi_v:V\to K^n$ (finding coordinates of any vector in $V$ in the basis $\{ v_1,\ldots, v_n\}$) and $\psi_w:K^m\to W$ (using the $m$-tuple of scalars as coordinates in the basis $\{ w_1,\ldots, w_m\}$ to get a vector of$~W$), and the matrix $A$ of$~L$ with respect to these bases is by definition such that $L=\psi_w\circ L_A\circ\phi_v$ (given $v\in V$ find its coordinates by$~\phi_v$, multiply those by$~A$, and interpret the result as a vector of$~W$ via$~\psi_w$). The isomorphism $\psi_w$ then maps the image of $L_A$ to that of $L$, so the two linear transformations have the same rank, which is also the rank of $A$.
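The factorisation $L=\psi_w\circ L_A\circ\phi_v$ can be verified on a toy example; everything below (the spaces, the bases, and the map `L`) is an illustrative assumption:

```python
from fractions import Fraction as F

def L(p):                        # the linear map itself: L(x, y) = (x+y, x-y)
    x, y = p
    return (x + y, x - y)

# With v_1=(1,1), v_2=(1,-1) and w_1=(2,0), w_2=(0,3):
# L(v_1) = (2,0) = 1*w_1 + 0*w_2 ;  L(v_2) = (0,2) = 0*w_1 + (2/3)*w_2
A = [[F(1), F(0)],
     [F(0), F(2, 3)]]            # matrix of L w.r.t. these bases

def phi_v(p):                    # coordinates in the basis v_1, v_2
    x, y = p
    return (F(x + y, 2), F(x - y, 2))

def L_A(c):                      # multiplication by A
    return tuple(A[i][0] * c[0] + A[i][1] * c[1] for i in range(2))

def psi_w(c):                    # c_1 w_1 + c_2 w_2
    return (2 * c[0], 3 * c[1])

p = (5, -3)
assert psi_w(L_A(phi_v(p))) == L(p)   # L = psi_w . L_A . phi_v
```

Since `psi_w` is invertible, composing with it cannot change the dimension of the image, which is exactly the rank statement above.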
Let $\{v_1,\ldots,v_n\}$ and $\{w_1,\ldots,w_n\}$ be any two bases for $V$. Recall that the matrix for a linear transformation $T:V\to V$ is given by $(a_{ij})$, where the entries $a_{ij}$ are determined by $$T(v_j)=a_{1j}w_1+\cdots+a_{nj}w_n.$$ In other words, the matrix for $T$ looks like $$\left[T(v_1)\;\middle|\;\cdots \;\middle|\; T(v_n)\right],$$ where column $j$ holds the coordinates of $T(v_j)$ in the basis $\{w_1,\ldots,w_n\}$. Your goal is for this to be the identity matrix. Thus, you want $$T(v_1)=\begin{bmatrix}1\\0\\\vdots\\0\end{bmatrix}=1\cdot w_1+0\cdot w_2+\cdots +0\cdot w_n=w_1$$ $$\vdots$$ $$T(v_n)=\begin{bmatrix}0\\\vdots\\0\\1\end{bmatrix}=0\cdot w_1+0\cdot w_2+\cdots +1\cdot w_n=w_n$$ Thus, provided $T$ is invertible (so that the images $T(v_1),\ldots,T(v_n)$ of a basis again form a basis), for any basis $B_1=\{v_1,\ldots,v_n\}$ of $V$ you want, letting the other basis $B_2$ just be $\{T(v_1),\ldots,T(v_n)\}$ will make the linear transformation $T$ be represented by the identity matrix.
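A quick Python check of this recipe, assuming $V=\Bbb R^2$, an invertible $T$ given by the hypothetical matrix `M` in the standard basis, and an arbitrary basis `B1`:

```python
from fractions import Fraction as F

M = [[F(2), F(1)],
     [F(1), F(1)]]                       # T in the standard basis (invertible)
B1 = [(F(1), F(0)), (F(1), F(1))]        # any basis v_1, v_2 of V

def apply(M, v):
    return tuple(M[i][0] * v[0] + M[i][1] * v[1] for i in range(2))

B2 = [apply(M, v) for v in B1]           # B_2 = {T(v_1), T(v_2)}

def coords(basis, p):
    """Coordinates of p in a basis of R^2, via the explicit 2x2 inverse."""
    (a, c), (b, d) = basis               # basis vectors as columns [a b; c d]
    det = a * d - b * c
    return ((d * p[0] - b * p[1]) / det, (-c * p[0] + a * p[1]) / det)

# Matrix of T w.r.t. B_1 (domain) and B_2 (codomain):
# column j = coordinates of T(v_j) in B_2
matrix_of_T = [[coords(B2, apply(M, B1[j]))[i] for j in range(2)]
               for i in range(2)]
assert matrix_of_T == [[1, 0], [0, 1]]   # the identity, as claimed
```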