Let $\mathbb B_1=\{v_1, v_2,\ldots,v_n\}$ be the standard basis for $\mathbb R^n$.
Since you are interested in $m\times n$ matrices, consider the standard basis $\mathbb B_2=\{u_1, u_2,\ldots,u_m\}$ for $\mathbb R^m$.
Let $T$ and $S$ be two linear transformations from $\mathbb R^n$ to $\mathbb R^m$.
First assume $[T]_{\mathbb B_1}^{\mathbb B_2}=[S]_{\mathbb B_1}^{\mathbb B_2}=(a_{ij})_{m\times n}$.
So $T(v_j)=\displaystyle \sum_{i=1}^m a_{ij}u_i=S(v_j)\ \forall j\in \{1,2,\ldots,n\}$.
Let $x\in \mathbb R^n$. Then $x=\displaystyle \sum_{j=1}^n c_jv_j$ for some scalars $c_j$.
So $T(x)=T\left(\displaystyle \sum_{j=1}^n c_jv_j\right)=\displaystyle \sum_{j=1}^n c_jT(v_j)=\displaystyle \sum_{j=1}^n c_j\displaystyle \sum_{i=1}^m a_{ij}u_i=\displaystyle \sum_{j=1}^n c_jS(v_j)=S\left(\displaystyle \sum_{j=1}^n c_jv_j\right)=S(x)$
Hence $T=S$.
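A quick numerical sketch of this direction (using NumPy, with an arbitrary example matrix that is not part of the argument): a map built column-by-column from $(a_{ij})$, extended linearly, is exactly $x\mapsto Ax$, so two transformations sharing the matrix agree on every $x$.

```python
import numpy as np

# Hypothetical example matrix (a_ij) with m = 2, n = 3.
A = np.array([[1.0, 2.0, 0.0],
              [3.0, -1.0, 4.0]])
m, n = A.shape

def T(x):
    # T(v_j) = sum_i a_ij u_i, extended linearly: T(x) = sum_j c_j T(v_j).
    return sum(x[j] * A[:, j] for j in range(n))

def S(x):
    # S defined directly by the same matrix.
    return A @ x

x = np.array([2.0, -1.0, 5.0])
assert np.allclose(T(x), S(x))  # same matrix => same transformation
```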
Conversely, assume $A=(a_{ij})_{m\times n}$ and $B=(b_{ij})_{m\times n}$ are two $m\times n$ matrices (with respect to the bases $\mathbb B_1$ and $\mathbb B_2$) corresponding to two linear transformations $T$ and $S$ from $\mathbb R^n$ to $\mathbb R^m$ such that $T=S$.
So $[T]_{\mathbb B_1}^{\mathbb B_2}=(a_{ij})_{m\times n}$ and $[S]_{\mathbb B_1}^{\mathbb B_2}=(b_{ij})_{m\times n}$
$\Rightarrow T(v_j)=\displaystyle \sum_{i=1}^ma_{ij}u_i$ and $S(v_j)=\displaystyle \sum_{i=1}^mb_{ij}u_i \forall j\in \{1,2,...,n\}$
Note that any linear transformation is completely determined by the images of the basis elements. So here $T(v_j)=S(v_j)\ \forall j\in \{1,2,\ldots,n\}$, as $T=S$.
Thus $T(v_j)=\displaystyle \sum_{i=1}^ma_{ij}u_i=S(v_j)=\displaystyle \sum_{i=1}^mb_{ij}u_i \forall j\in \{1,2,...,n\}$
$\Rightarrow \displaystyle \sum_{i=1}^m(a_{ij}-b_{ij})u_i=0 \forall j\in \{1,2,...,n\}$
Since $\mathbb B_2=\{u_1, u_2,\ldots,u_m\}$ is a basis, it is linearly independent. So from the previous line we have
$(a_{ij}-b_{ij})=0 \forall i\in \{1,2,...,m\}\forall j\in \{1,2,...,n\}$
$\Rightarrow a_{ij}=b_{ij} \forall i\in \{1,2,...,m\}\forall j\in \{1,2,...,n\}$
$\Rightarrow A=B$
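The converse can also be sketched numerically: column $j$ of the matrix of a map is its value on the $j$-th standard basis vector, so two equal maps yield equal matrices. The two implementations of the same map below are hypothetical examples, not taken from the question.

```python
import numpy as np

# Two different-looking implementations of the same linear map R^3 -> R^2.
def T(x):
    return np.array([x[0] + 2*x[1], 3*x[0] - x[1] + 4*x[2]])

def S(x):
    return np.array([[1, 2, 0], [3, -1, 4]]) @ np.asarray(x)

def matrix_of(f, n):
    # Column j of [f] is f(v_j), the image of the j-th standard basis vector.
    return np.column_stack([f(np.eye(n)[:, j]) for j in range(n)])

A, B = matrix_of(T, 3), matrix_of(S, 3)
assert np.array_equal(A, B)  # T = S forces a_ij = b_ij, i.e. A = B
```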
Just to be sure that there are no misunderstandings, I will add an answer. Suppose that $\{v_1,\ldots,v_n\}$ is a basis for $V$, and that $\{w_1,\ldots,w_n\}$ is a basis for $W$. Define a map $T:V\to W$ by the linear extension of the assignment $v_i\mapsto w_i$ for all $i\in\{1,\ldots,n\}$. To prove that $T$ is injective, it suffices to show that $\ker(T)=\{0\}$. So suppose that $v=\sum_{i=1}^n\lambda_iv_i\in\ker(T)$; then
\begin{equation}
\begin{split}
0&=T\left(\sum_{i=1}^n\lambda_iv_i\right) \\
&=\sum_{i=1}^n\lambda_iT(v_i) \\
&=\sum_{i=1}^n\lambda_iw_i,
\end{split}
\end{equation}
which implies that $\lambda_i=0$ for all $i\in\{1,\ldots,n\}$, since $\{w_1,\ldots,w_n\}$ is a basis. Surjectivity of $T$ is of course trivial, for if $w=\sum_{i=1}^n\lambda_iw_i\in W$ is given, then $\sum_{i=1}^n\lambda_iv_i\in V$ is such that $T\left(\sum_{i=1}^n\lambda_iv_i\right)=\sum_{i=1}^n\lambda_iw_i=w$.
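In coordinates, the construction above can be checked concretely: the matrix of the extension $v_i\mapsto w_i$ has the $w_i$ as its columns, and it is bijective exactly when those columns form a basis. The basis vectors below are an arbitrary invertible example, not from the answer.

```python
import numpy as np

# A hypothetical basis {w_1, w_2, w_3} of W = R^3 (any invertible column set works).
W = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# T extends v_i -> w_i linearly: T(sum lambda_i v_i) = sum lambda_i w_i = W @ lam.
lam = np.array([2.0, -1.0, 3.0])
assert np.allclose(W @ lam, 2*W[:, 0] - W[:, 1] + 3*W[:, 2])

# Injectivity: ker(T) = {0} iff the columns w_i are linearly independent.
assert np.linalg.matrix_rank(W) == 3

# Surjectivity: any target w is hit by v = W^{-1} w.
w = np.array([5.0, -2.0, 1.0])
v = np.linalg.solve(W, w)
assert np.allclose(W @ v, w)
```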
Best Answer
Hint: You need to show that $S(v) = T(v)$ for all vectors $v\in V$. Write $v = a_1v_1 + \dots+ a_n v_n$, where $a_1, a_2, \dots, a_n$ are scalars and use linearity of $S$ and $T$.