The matrix of a linear transformation is the matrix that turns the coordinate vector of the input into the coordinate vector of the output, with respect to chosen bases. Coordinate vectors are always column vectors in some $\mathbb{R}^n$ (or $\mathbb{C}^n$, as in your case).
For example, in the standard basis $\{\begin{bmatrix}1&0\\0&0\end{bmatrix},\begin{bmatrix}0&1\\0&0\end{bmatrix},\begin{bmatrix}0&0\\1&0\end{bmatrix},\begin{bmatrix}0&0\\0&1\end{bmatrix}\}$ of the space of $2\times 2$ matrices, the coordinate vector of the matrix $\begin{bmatrix}a&b\\c&d\end{bmatrix}$ is the column $\begin{bmatrix}a\\b\\c\\d\end{bmatrix}$.
In your case the transformation is $$T(\begin{bmatrix}a&b\\c&d\end{bmatrix})=\begin{bmatrix}2ia&b+ci\\c+bi&2id\end{bmatrix}$$
When you compute the matrix of this transformation in the standard basis you get $$A_T:=\begin{bmatrix}2i&0&0&0\\0&1&i&0\\0&i&1&0\\0&0&0&2i\end{bmatrix}$$
What this means is that when you multiply the coordinate vector $$\begin{bmatrix}a\\b\\c\\d\end{bmatrix}$$ of some matrix $\begin{bmatrix}a&b\\c&d\end{bmatrix}$ of your space by $A_T$, you get the coordinate vector $$\begin{bmatrix}2ia\\b+ci\\c+bi\\2di\end{bmatrix}=A_T\begin{bmatrix}a\\b\\c\\d\end{bmatrix}$$
of the output of $T$. That is, the coordinate vector of the matrix $\begin{bmatrix}2ai&b+ci\\c+bi&2di\end{bmatrix}$ in the standard basis.
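As a quick numerical sanity check, here is a NumPy sketch: applying $T$ directly to a matrix $M$ (the particular $M$ below is an arbitrary choice) agrees with multiplying $M$'s coordinate vector by $A_T$ and reshaping back.

```python
import numpy as np

# The transformation T applied directly to a 2x2 complex matrix.
def T(M):
    a, b = M[0, 0], M[0, 1]
    c, d = M[1, 0], M[1, 1]
    return np.array([[2j * a,      b + 1j * c],
                     [c + 1j * b,  2j * d]])

# Matrix of T in the standard basis {E11, E12, E21, E22}.
A_T = np.array([[2j, 0,  0,  0],
                [0,  1,  1j, 0],
                [0,  1j, 1,  0],
                [0,  0,  0,  2j]])

# An arbitrary sample matrix; reshape(4) gives its coordinate vector (a, b, c, d).
M = np.array([[1 + 1j, 2],
              [3,      4 - 1j]])
coords = M.reshape(4)

# Multiplying the coordinates by A_T yields the coordinates of T(M).
out_coords = A_T @ coords
assert np.allclose(out_coords.reshape(2, 2), T(M))
```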
You can formulate change of basis rigorously as follows. Given a linear transformation $S$, let $[S]_{\beta}^{\beta '}$ be the matrix of $S$ with respect to the basis $\beta$ in the domain and $\beta'$ in the codomain. In our case, if $\beta =\{v_1,...,v_n\}$ and $\beta'=\{w_1,...,w_n\}$, then we can write $S(v_j)=\sum_{i}a_{ij}w_i$, and so $[S]_{\beta}^{\beta'}=\begin{bmatrix}a_{11} & a_{12}&\dots&a_{1n} \\a_{21}&a_{22}&\dots&a_{2n}\\\vdots&\vdots&\ddots&\vdots\\a_{n1}&a_{n2}&\dots&a_{nn}\end{bmatrix}$. Now matrix multiplication is compatible with composition of linear transformations provided the bases are in the "right place": if $S$ and $T$ are linear transformations from $\mathbb R^n$ to itself and $\beta,\beta',\beta''$ are bases of $\mathbb R^n$, then $[ST]_{\color{blue}{\beta}}^{\color{green}{\beta ''}}=[S]_{\color{red}{\beta'}}^{\color{green}{\beta''}}[T]_{\color{blue}{\beta}}^{\color{red}{\beta '}}$
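The composition identity can be checked numerically. In the sketch below the three bases and the maps $S$, $T$ are arbitrary example choices: each basis is encoded as an invertible matrix whose columns are the basis vectors in standard coordinates, so $[M]_{\mathrm{dom}}^{\mathrm{cod}} = P_{\mathrm{cod}}^{-1} M P_{\mathrm{dom}}$.

```python
import numpy as np

# Columns of each P are the basis vectors written in the standard basis,
# so P converts that basis's coordinates into standard coordinates.
P_b   = np.array([[1., 1.], [0., 1.]])   # beta   (example basis)
P_bp  = np.array([[1., 2.], [2., 0.]])   # beta'  (example basis)
P_bpp = np.array([[2., 1.], [1., 1.]])   # beta'' (example basis)

# S and T given by their matrices in the standard basis (arbitrary examples).
S = np.array([[1., 2.], [3., 4.]])
T = np.array([[0., 1.], [1., 1.]])

def mat(M, dom, cod):
    """[M]_dom^cod: matrix of M w.r.t. basis `dom` in the domain, `cod` in the codomain."""
    return np.linalg.inv(cod) @ M @ dom

# Verify [ST]_beta^beta'' == [S]_beta'^beta'' [T]_beta^beta'.
lhs = mat(S @ T, P_b, P_bpp)
rhs = mat(S, P_bp, P_bpp) @ mat(T, P_b, P_bp)
assert np.allclose(lhs, rhs)
```

The inverse $P_{\mathrm{cod}}^{-1}$ appears because the codomain basis is the one we must *express results in*, mirroring how the middle basis $\beta'$ cancels in the identity.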
Now back to your question. Say you have a linear transformation $A$ and you know its matrix $[A]_{\beta}^{\beta}$ with respect to a certain basis $\beta$, and you seek its matrix $[A]_{\beta'}^{\beta'}$ with respect to another basis $\beta'$. By the identity above, with $I$ the identity map, we should have $[A]_{\beta'}^{\beta'}=[I]^{\beta'}_{\beta}[A]_{\beta}^{\beta}[I]_{\beta'}^{\beta}$. Specifically, in your case $\beta'=\{(1,2),(2,0)\}$, so finding $[I]_{\beta'}^{\beta}$ is just asking how to write $a\begin{bmatrix}1\\2\end{bmatrix}+b\begin{bmatrix}2\\0\end{bmatrix}$ in the form $f(a,b)\begin{bmatrix}1\\0\end{bmatrix}+g(a,b)\begin{bmatrix}0\\1\end{bmatrix}$. Then,
$$[I]_{\beta'}^{\beta}:\begin{bmatrix}a\\b\end{bmatrix}\mapsto\begin{bmatrix}f(a,b)\\g(a,b)\end{bmatrix}=\begin{bmatrix}a+2b\\2a\end{bmatrix}=\begin{bmatrix}1&2\\2&0\end{bmatrix}\begin{bmatrix}a\\b\end{bmatrix},$$
i.e. $[I]_{\beta'}^{\beta}=\begin{bmatrix}1&2\\2&0\end{bmatrix}$
And then $[I]_{\beta}^{\beta'}=([I]_{\beta'}^{\beta})^{-1}=\begin{bmatrix}1&2\\2&0\end{bmatrix}^{-1}$, because their product must be the identity $\begin{bmatrix} 1&0\\0&1 \end{bmatrix}$.
But in the reversed direction, you need to write $\begin{bmatrix}1\\0\end{bmatrix}$ and $\begin{bmatrix}0\\1\end{bmatrix}$ in terms of $\begin{bmatrix}1\\2\end{bmatrix}$ and $\begin{bmatrix}2\\0\end{bmatrix}$, so it is the reversed process. Again, start by writing any vector as $a\begin{bmatrix}1\\0\end{bmatrix}+b \begin{bmatrix}0\\1\end{bmatrix}$; the matrix should send $(a,b)$ to the coefficients with respect to the other basis, i.e. to the $(f'(a,b),g'(a,b))$ for which the vector equals $f'(a,b)\begin{bmatrix}1\\2\end{bmatrix}+g'(a,b)\begin{bmatrix}2\\0\end{bmatrix}$; you can find these by solving the linear equations or by inverting the matrix above.
In short, the "easy to remember method" for the change-of-basis matrix $[I]_{\beta}^{\beta'}$ from $\beta=\{v_1,...,v_n\}$ to $\beta'=\{w_1,...,w_n\}$ is: the $\text{j}^{\text{th}}$ column consists of the unique coefficients needed to write $v_j=a_{1j}w_1+...+a_{nj}w_n$, i.e. the column $\begin{bmatrix} a_{1j}\\a_{2j}\\ \vdots\\a_{nj}\end{bmatrix}$
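Here is a NumPy sketch of the whole recipe for the concrete basis $\beta'=\{(1,2),(2,0)\}$; the matrix $A$ below is an arbitrary example, not taken from the question.

```python
import numpy as np

# beta' = {(1,2), (2,0)}; the columns of P are these vectors in the standard
# basis, so P = [I]_{beta'}^{beta} converts beta'-coordinates to standard ones.
P = np.array([[1., 2.],
              [2., 0.]])
P_inv = np.linalg.inv(P)            # [I]_{beta}^{beta'}
assert np.allclose(P @ P_inv, np.eye(2))

# An arbitrary transformation, given by its matrix in the standard basis.
A = np.array([[3., 1.],
              [0., 2.]])

# [A]_{beta'}^{beta'} = [I]_{beta}^{beta'} [A]_{beta}^{beta} [I]_{beta'}^{beta}
A_bp = P_inv @ A @ P

# Check on a vector: start from beta'-coordinates (a, b) and compare
# "transform in beta'-coordinates" against "transform the actual vector".
ab = np.array([1., 1.])             # coordinates w.r.t. beta'
v = P @ ab                          # the same vector in standard coordinates
assert np.allclose(P @ (A_bp @ ab), A @ v)
```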
Best Answer
Hint:
Notice that each vector in $\beta$ is a linear combination of vectors in $\beta_s$: this expresses $\beta$ in terms of $\beta_s$. So, to find $Q$, simply write out the matrix whose columns represent these linear combinations. The rest is as you say.
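A minimal sketch of the hint, assuming $\beta_s$ is the standard basis of $\mathbb{R}^2$ and taking $\beta=\{(1,2),(2,0)\}$ as an illustrative choice (the question's actual bases may differ):

```python
import numpy as np

# Assumed illustration: beta_s is the standard basis of R^2 and
# beta = {(1, 2), (2, 0)}.
beta = [np.array([1., 2.]), np.array([2., 0.])]

# Each beta-vector's standard coordinates become a column of Q.
Q = np.column_stack(beta)

# Q then converts beta-coordinates into beta_s-coordinates:
coords = np.array([1., 1.])         # coefficients w.r.t. beta
assert np.allclose(Q @ coords, beta[0] + beta[1])
```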