You can formulate change of basis rigorously as follows. Given a linear transformation $S$, let $[S]_{\beta}^{\beta '}$ be the matrix of $S$ with respect to the basis $\beta$ in the domain and $\beta'$ in the codomain. In our case, if $\beta =\{v_1,...,v_n\}$ and $\beta'=\{w_1,...,w_n\}$, then we can write $S(v_j)=\sum_{i}a_{ij}w_i$, and so $[S]_{\beta}^{\beta'}=\begin{bmatrix}a_{11} & a_{12}&\dots&a_{1n} \\a_{21}&a_{22}&\dots&a_{2n}\\\vdots&\vdots&\ddots&\vdots\\a_{n1}&a_{n2}&\dots&a_{nn}\end{bmatrix}$. Matrix multiplication is compatible with composition of linear transformations when the bases are in the "right place": if $S$ and $T$ are linear transformations (from $\mathbb R^n$ to itself) and $\beta,\beta',\beta''$ are bases of $\mathbb R^n$, then $[ST]_{\color{blue}{\beta}}^{\color{green}{\beta ''}}=[S]_{\color{red}{\beta'}}^{\color{green}{\beta''}}[T]_{\color{blue}{\beta}}^{\color{red}{\beta '}}$
Now back to your question. Say you have a linear transformation $A$, and you know its matrix with respect to a certain basis $\beta$, i.e. you are given $[A]_{\beta}^{\beta}$. You seek the matrix of $A$ with respect to another basis $\beta'$, i.e. $[A]_{\beta'}^{\beta'}$. Then by the identity above, letting $I$ be the identity map, we have $[A]_{\beta'}^{\beta'}=[I]^{\beta'}_{\beta}[A]_{\beta}^{\beta}[I]_{\beta'}^{\beta}$. Specifically in your case, $\beta'=\{(1,2),(2,0)\}$, and $[I]_{\beta'}^{\beta}$ is just asking how to write $a\begin{bmatrix}1\\2\end{bmatrix}+b\begin{bmatrix}2\\0\end{bmatrix}$ in terms of something like $f(a,b)\begin{bmatrix}1\\0\end{bmatrix}+g(a,b)\begin{bmatrix}0\\1\end{bmatrix}$. Then,
$$[I]_{\beta'}^{\beta}:\begin{bmatrix}a\\b\end{bmatrix}\mapsto\begin{bmatrix}f(a,b)\\g(a,b)\end{bmatrix}=\begin{bmatrix}a+2b\\2a\end{bmatrix}=\begin{bmatrix}1&2\\2&0\end{bmatrix}\begin{bmatrix}a\\b\end{bmatrix}$$,
i.e. $[I]_{\beta'}^{\beta}=\begin{bmatrix}1&2\\2&0\end{bmatrix}$
And then $[I]_{\beta}^{\beta'}=([I]_{\beta'}^{\beta})^{-1}=\begin{bmatrix}1&2\\2&0\end{bmatrix}^{-1}$, because their product should give you $\begin{bmatrix} 1&0\\0&1 \end{bmatrix}$.
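As a quick numerical sanity check (a sketch using NumPy; `P` below is the $[I]_{\beta'}^{\beta}$ computed above):

```python
import numpy as np

# Change-of-basis matrix [I]_{beta'}^{beta}: its columns are the beta'
# vectors (1,2) and (2,0) written in the standard basis beta.
P = np.array([[1.0, 2.0],
              [2.0, 0.0]])

# [I]_{beta}^{beta'} is its inverse; the product must be the identity.
P_inv = np.linalg.inv(P)
print(P_inv)       # [[0, 0.5], [0.5, -0.25]] up to floating point
print(P @ P_inv)   # the 2x2 identity matrix
```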
But when you go in the reverse direction, you need to write $\begin{bmatrix}1\\0\end{bmatrix}$ and $\begin{bmatrix}0\\1\end{bmatrix}$ in terms of $\begin{bmatrix}1\\2\end{bmatrix}$ and $\begin{bmatrix}2\\0\end{bmatrix}$, so it is the reversed process. Again we start by writing any vector as $a\begin{bmatrix}1\\0\end{bmatrix}+b \begin{bmatrix}0\\1\end{bmatrix}$; the matrix should send this $(a,b)$ to the coefficients with respect to the other basis' vectors, i.e. $f'(a,b)\begin{bmatrix}1\\2\end{bmatrix}+g'(a,b)\begin{bmatrix}2\\0\end{bmatrix}$, which you can find by solving the linear equations or by inverting the matrix.
In short, the "easy to remember method" for change of basis matrix $[I]_{\beta}^{\beta'}$ from $\beta=\{v_1,...,v_n\}$ to $\beta'=\{w_1,...,w_n\}$ is that the $\text{j}^{\text{th}}$ column is the column of unique coefficients to write $v_j=a_{1j}w_1+...+a_{nj}w_n$, i.e. $\begin{bmatrix} a_{1j}\\a_{2j}\\ \vdots\\a_{nj}\end{bmatrix}$
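The recipe above can be sketched in code (NumPy; the matrix `A_std` is an arbitrary illustrative example, not one taken from the question):

```python
import numpy as np

def change_of_basis(old, new):
    """[I]_{old}^{new}: the j-th column holds the unique coefficients
    a_{1j}, ..., a_{nj} expressing old[j] in the basis `new`."""
    # Solve N x = v_j for each old basis vector v_j, where N has the
    # new basis vectors as its columns.
    N = np.column_stack(new)
    return np.column_stack([np.linalg.solve(N, v) for v in old])

std   = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]   # beta (standard)
other = [np.array([1.0, 2.0]), np.array([2.0, 0.0])]   # beta'

P = change_of_basis(other, std)   # [I]_{beta'}^{beta} = [[1,2],[2,0]]
Q = change_of_basis(std, other)   # [I]_{beta}^{beta'} = P^{-1}

A_std   = np.array([[3.0, 1.0],   # some [A]_{beta}^{beta}, chosen only
                    [0.0, 2.0]])  # for illustration
A_other = Q @ A_std @ P           # [A]_{beta'}^{beta'}
```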
It is not a trick.
Fix $z=a+bi \in \mathbb C$ and consider the map $\mu : w \mapsto zw$.
Seeing $\mathbb C$ as a vector space over $\mathbb R$, the matrix of $\mu$ with respect to the basis $1,i$ is exactly
$$\begin{bmatrix}a&-b\\ b&a\end{bmatrix}$$
The map $z \mapsto \mu$ is an injective homomorphism of $\mathbb R$-algebras $\mathbb C \to \text{End}_\mathbb R(\mathbb C) \cong M_2(\mathbb R)$.
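A small check of the homomorphism property (a sketch with NumPy; `M(z)` is the matrix displayed above):

```python
import numpy as np

def M(z):
    """Matrix of w -> z*w on C, viewed as R^2 with basis {1, i}."""
    a, b = z.real, z.imag
    return np.array([[a, -b],
                     [b,  a]])

z, w = 2 + 3j, -1 + 4j

# z -> M(z) respects products: M(z) M(w) = M(zw) ...
assert np.allclose(M(z) @ M(w), M(z * w))

# ... and applying M(z) to the coordinate vector of w yields zw.
zw = M(z) @ np.array([w.real, w.imag])
print(zw)   # [-14, 5], the coordinates of z*w = -14 + 5i
```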
The same construction works for every finite extension of fields $E/F$: the matrix ring $M_n(F)$ contains copies of all extensions of $F$ of degree $n$.
In particular, for instance, $\mathbb Q(\sqrt 2)$ can be given a matrix interpretation in $M_2(\mathbb Q)$. Try it!
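Taking up the suggestion for $\mathbb Q(\sqrt 2)$: since $\sqrt2\cdot\sqrt2=2$, multiplication by $a+b\sqrt2$ sends $1\mapsto a\cdot 1+b\sqrt2$ and $\sqrt2\mapsto 2b\cdot 1+a\sqrt2$, giving the matrix sketched below (exact integer arithmetic, so plain Python lists suffice):

```python
def M(a, b):
    """Matrix of x -> (a + b*sqrt(2))*x on Q(sqrt 2), basis {1, sqrt 2}."""
    return [[a, 2 * b],
            [b, a]]

def matmul(X, Y):
    """2x2 matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# (1 + sqrt 2)(3 - sqrt 2) = 3 - sqrt 2 + 3 sqrt 2 - 2 = 1 + 2 sqrt 2,
# and the matrices multiply the same way:
assert matmul(M(1, 1), M(3, -1)) == M(1, 2)
```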
Since
the matrix that you are after is $$\begin{bmatrix}8&7&4&0\\6&11&0&4\\1&0&7&7\\0&1&6&10\end{bmatrix}.$$