[Math] Linear transformation: change of basis matrix representation

change-of-basis, linear-algebra, linear-transformations, matrices

I have some difficulties trying to understand the general method to find the matrix representation of a linear transformation with respect to a different basis.

My textbook states that, given the matrix $A_{\beta}$ of the linear operator $L$ in a basis $\beta$, the matrix of $L$ with respect to another basis $\beta'$ is given by

$$ S^{-1}A_{\beta}S,$$

where $S$ is the matrix whose columns are the coordinates of the new basis vectors with respect to the old basis. Nevertheless, I am not able to apply it consistently in the exercises I have done. For instance: suppose

$$ \begin{bmatrix} -1/2 & 5/4\\ 1 & 3/2 \end{bmatrix}$$

is the matrix representation of $L$ with respect to the canonical basis. Now we want to find the matrix in the basis given by $\{ (1,2), (2,0) \}$. The method works, since we have

$$
\begin{bmatrix}
1 & 2\\2 & 0
\end{bmatrix}^{-1}
\begin{bmatrix}
-1/2 & 5/4\\ 1 & 3/2
\end{bmatrix}
\begin{bmatrix}
1 & 2\\2 & 0
\end{bmatrix}
=
\begin{bmatrix}
2 & 1\\0 & -1
\end{bmatrix}
= A_{\beta'}.
$$

Suppose now I am given the matrix in this new basis and I want to find the matrix with respect to the canonical one. If I apply the same method, I do not get the right result. Instead, if I compute $SA_{\beta'}S^{-1}$, it works. What am I doing wrong?
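The asymmetry can be checked numerically. Below is a small sketch (plain Python with exact `fractions` arithmetic; the helper names `matmul` and `inv2` are mine, not from the question) verifying that $S^{-1}A_{\beta}S$ converts the canonical-basis matrix to the $\beta'$ one, while going back requires conjugating in the opposite order, $S A_{\beta'} S^{-1}$:

```python
from fractions import Fraction as F

def matmul(X, Y):
    """Product of two 2x2 matrices (lists of rows)."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(X):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    det = X[0][0] * X[1][1] - X[0][1] * X[1][0]
    return [[ X[1][1] / det, -X[0][1] / det],
            [-X[1][0] / det,  X[0][0] / det]]

# Matrix of L in the canonical basis, and S whose columns
# are the new basis vectors (1,2) and (2,0).
A = [[F(-1, 2), F(5, 4)], [F(1), F(3, 2)]]
S = [[F(1), F(2)], [F(2), F(0)]]

# Canonical basis -> new basis: conjugate as S^{-1} A S.
A_new = matmul(matmul(inv2(S), A), S)
print(A_new == [[2, 1], [0, -1]])   # True

# New basis -> canonical basis: the conjugation flips, S A_new S^{-1}.
A_back = matmul(matmul(S, A_new), inv2(S))
print(A_back == A)                  # True
```

Both directions are conjugations by the same $S$; only the order differs, which is exactly the asymmetry observed in the question.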

Best Answer

You can formulate change of basis rigorously as follows. Given a linear transformation $S$, let $[S]_{\beta}^{\beta '}$ be the matrix of $S$ with respect to the basis $\beta$ in the domain and $\beta'$ in the codomain. In our case, if $\beta =\{v_1,...,v_n\}$ and $\beta'=\{w_1,...,w_n\}$, then we can write $S(v_j)=\sum_{i}a_{ij}w_i$, so that $[S]_{\beta}^{\beta'}=\begin{bmatrix}a_{11} & a_{12}&\dots&a_{1n} \\a_{21}&a_{22}&\dots&a_{2n}\\\vdots&\vdots&\ddots&\vdots\\a_{n1}&a_{n2}&\dots&a_{nn}\end{bmatrix}$. Matrix multiplication is compatible with composition of linear transformations when the bases are in the "right place": if $S$ and $T$ are linear transformations from $\mathbb R^n$ to itself and $\beta,\beta',\beta''$ are bases of $\mathbb R^n$, then $[ST]_{\color{blue}{\beta}}^{\color{green}{\beta ''}}=[S]_{\color{red}{\beta'}}^{\color{green}{\beta''}}[T]_{\color{blue}{\beta}}^{\color{red}{\beta '}}$.


Now back to your question. Say you have a linear transformation $A$, and you know its matrix with respect to a certain basis $\beta$, i.e. you are given $[A]_{\beta}^{\beta}$, and you seek the matrix of $A$ with respect to another basis $\beta'$, i.e. $[A]_{\beta'}^{\beta'}$. By the identity above, with $I$ the identity map, we have $[A]_{\beta'}^{\beta'}=[I]^{\beta'}_{\beta}[A]_{\beta}^{\beta}[I]_{\beta'}^{\beta}$. Specifically, in your case $\beta'=\{(1,2),(2,0)\}$, and finding $[I]_{\beta'}^{\beta}$ is just asking how to write $a\begin{bmatrix}1\\2\end{bmatrix}+b\begin{bmatrix}2\\0\end{bmatrix}$ in the form $f(a,b)\begin{bmatrix}1\\0\end{bmatrix}+g(a,b)\begin{bmatrix}0\\1\end{bmatrix}$. Then $$[I]_{\beta'}^{\beta}:\begin{bmatrix}a\\b\end{bmatrix}\mapsto\begin{bmatrix}f(a,b)\\g(a,b)\end{bmatrix}=\begin{bmatrix}a+2b\\2a\end{bmatrix}=\begin{bmatrix}1&2\\2&0\end{bmatrix}\begin{bmatrix}a\\b\end{bmatrix},$$ i.e. $[I]_{\beta'}^{\beta}=\begin{bmatrix}1&2\\2&0\end{bmatrix}$.

And then $[I]_{\beta}^{\beta'}=([I]_{\beta'}^{\beta})^{-1}=\begin{bmatrix}1&2\\2&0\end{bmatrix}^{-1}$, since their product must be the identity matrix $\begin{bmatrix} 1&0\\0&1 \end{bmatrix}$.

But when you go in the reverse direction, you need to write $\begin{bmatrix}1\\0\end{bmatrix}$ and $\begin{bmatrix}0\\1\end{bmatrix}$ in terms of $\begin{bmatrix}1\\2\end{bmatrix}$ and $\begin{bmatrix}2\\0\end{bmatrix}$, so it is the reversed process. Again, start by writing an arbitrary vector as $a\begin{bmatrix}1\\0\end{bmatrix}+b \begin{bmatrix}0\\1\end{bmatrix}$; the matrix should send $(a,b)$ to the coefficients with respect to the other basis' vectors, i.e. $f'(a,b)\begin{bmatrix}1\\2\end{bmatrix}+g'(a,b)\begin{bmatrix}2\\0\end{bmatrix}$, which you can find either by solving the linear equations or by inverting the matrix.
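The "solving the linear equations" step can be made concrete. Here is a small sketch (Python with exact `fractions` arithmetic; the helper `solve2` is mine, using Cramer's rule) that expresses $e_1$ and $e_2$ in the basis $\{(1,2),(2,0)\}$:

```python
from fractions import Fraction as F

def solve2(M, rhs):
    """Solve the 2x2 system M @ [x, y] = rhs by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    x = (rhs[0] * M[1][1] - M[0][1] * rhs[1]) / det
    y = (M[0][0] * rhs[1] - rhs[0] * M[1][0]) / det
    return [x, y]

# Columns of M are the basis vectors (1,2) and (2,0).
M = [[F(1), F(2)], [F(2), F(0)]]

# Express each canonical vector e_j in that basis:
# solve e_j = a*(1,2) + b*(2,0), i.e. M @ [a, b] = e_j.
col1 = solve2(M, [F(1), F(0)])
col2 = solve2(M, [F(0), F(1)])
print(col1 == [0, F(1, 2)])          # True: e1 = (1/2)*(2,0)
print(col2 == [F(1, 2), F(-1, 4)])   # True: e2 = (1/2)*(1,2) - (1/4)*(2,0)
```

The two solution vectors are exactly the columns of $\begin{bmatrix}1&2\\2&0\end{bmatrix}^{-1}$, as expected.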

In short, the easy-to-remember rule for the change-of-basis matrix $[I]_{\beta}^{\beta'}$ from $\beta=\{v_1,...,v_n\}$ to $\beta'=\{w_1,...,w_n\}$ is that the $\text{j}^{\text{th}}$ column holds the unique coefficients that express $v_j=a_{1j}w_1+...+a_{nj}w_n$, i.e. $\begin{bmatrix} a_{1j}\\a_{2j}\\ \vdots\\a_{nj}\end{bmatrix}$.
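That column-by-column rule translates directly into code. A minimal 2D sketch (Python with exact `fractions`; the function name `change_of_basis_2d` is my own illustration, not standard API) that builds $[I]_{\beta}^{\beta'}$:

```python
from fractions import Fraction as F

def change_of_basis_2d(old, new):
    """Matrix [I]_old^new: column j holds the coefficients a_{1j}, a_{2j}
    such that old[j] = a_{1j}*new[0] + a_{2j}*new[1] (vectors as (x, y))."""
    det = new[0][0] * new[1][1] - new[1][0] * new[0][1]
    cols = []
    for v in old:
        # Solve v = a*new[0] + b*new[1] by Cramer's rule.
        a = (v[0] * new[1][1] - new[1][0] * v[1]) / det
        b = (new[0][0] * v[1] - v[0] * new[0][1]) / det
        cols.append([a, b])
    # Collected coefficient pairs become the columns of the matrix.
    return [[cols[0][0], cols[1][0]], [cols[0][1], cols[1][1]]]

canonical = [(F(1), F(0)), (F(0), F(1))]
beta_new = [(F(1), F(2)), (F(2), F(0))]

# New basis -> canonical: the basis vectors themselves as columns.
print(change_of_basis_2d(beta_new, canonical) == [[1, 2], [2, 0]])  # True
# Canonical -> new basis: the inverse of that matrix.
print(change_of_basis_2d(canonical, beta_new)
      == [[0, F(1, 2)], [F(1, 2), F(-1, 4)]])                       # True
```

Applied to the question's bases, the two directions reproduce $S$ and $S^{-1}$ respectively, which is why conjugating with the factors in the right order matters.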
