[Math] Finding the change of basis matrix when given two bases as sets of matrices

linear-algebra, matrices, vector-spaces

Find the change of basis matrix between the following bases:

$\alpha = \left\{ \begin{pmatrix} 1 & 1 \\ -1 & 2 \end{pmatrix}, \begin{pmatrix} 3 & 1 \\ -1 & 2 \end{pmatrix}, \begin{pmatrix} 1 & -1 \\ 2 & -3 \end{pmatrix}, \begin{pmatrix} 2 & 1 \\ -1 & 0 \end{pmatrix}\right\}$ and $\beta = \left\{ \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}\right\}$.

I'm not sure how to do this. I'm confused because I'm now given matrices instead of vectors. I know how to do it when I'm given a set of vectors: I just write each one as a linear combination of the other basis and put the coefficients in the columns of a matrix. But what should I do here? If I write $\begin{pmatrix} 1 & 1 \\ -1 & 2 \end{pmatrix}$ as a linear combination of the basis matrices in $\beta$, then I end up with a $4 \times 4$ matrix, which can't be right, can it?

Best Answer

For the transition matrix from $\alpha$ to $\beta$: since $\beta$ is the standard basis of the $2 \times 2$ matrices, the $\beta$-coordinates of a matrix are simply its entries. For instance,
$$ \begin{pmatrix} 1 & 1 \\ -1 & 2 \end{pmatrix} = 1 \cdot \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} + 1 \cdot \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} + (-1) \cdot \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} + 2 \cdot \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}, $$
so the first column of the transition matrix is $(1,1,-1,2)^T$. Finding the other columns in the same way, the transition matrix is
$$ [I]_{\alpha \to \beta} = \begin{pmatrix} 1 & 3 & 1 & 2 \\ 1 & 1 & -1 & 1 \\ -1 & -1 & 2 & -1 \\ 2 & 2 & -3 & 0 \end{pmatrix}. $$
For the other direction, we invert:
$$ [I]_{\beta \to \alpha} = ([I]_{\alpha \to \beta})^{-1}. $$
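If you want to check the arithmetic, here is a minimal NumPy sketch (an illustration of the procedure above, not part of the original answer; the variable names are my own). It flattens each matrix in $\alpha$ into its $\beta$-coordinate vector, stacks those vectors as columns, and inverts the result.

```python
import numpy as np

# The alpha basis from the problem statement, as 2x2 arrays.
alpha = [
    np.array([[1, 1], [-1, 2]]),
    np.array([[3, 1], [-1, 2]]),
    np.array([[1, -1], [2, -3]]),
    np.array([[2, 1], [-1, 0]]),
]

# beta is the standard basis E11, E12, E21, E22, so the beta-coordinates
# of a 2x2 matrix are just its entries read row by row. Each coordinate
# vector becomes one column of the transition matrix.
T_alpha_to_beta = np.column_stack([A.flatten() for A in alpha])
print(T_alpha_to_beta)
# [[ 1  3  1  2]
#  [ 1  1 -1  1]
#  [-1 -1  2 -1]
#  [ 2  2 -3  0]]

# The transition matrix in the other direction is the inverse.
T_beta_to_alpha = np.linalg.inv(T_alpha_to_beta)
```

The inverse exists precisely because the four matrices in $\alpha$ are linearly independent, i.e. they really do form a basis.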
