[Math] Given a transformation matrix with respect to two bases, find the bases

change-of-basis, linear-algebra, linear-transformations

I'm trying some old exam questions to prepare for my linear algebra exam and there is the following question, which I can't figure out.

Given the following linear transformation
$$ L: \mathbb{R}^4 \rightarrow \mathbb{R}^4 : (x,y,z,w) \mapsto (2x,2y-w,-2x+4y-2w,5z+5w)$$
Find a basis $\alpha$ and a basis $\beta$, both in $\mathbb{R}^4$, such that
$$L_\alpha^\beta =\begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ \end{pmatrix} $$
where $L_\alpha^\beta$ is the matrix representation of L with respect to bases $\alpha$ and $\beta$.

I'm not really sure how to begin with this problem. If we take an arbitrary coordinate vector $(a,b,c,d)$ in $\mathbb{R}^4$ and multiply it by $L_\alpha^\beta$, we get: $$L_\alpha^\beta \cdot \begin{pmatrix} a \\ b \\ c \\ d \end{pmatrix} = \begin{pmatrix} a \\ 0 \\ c \\ d \end{pmatrix},$$

which means that all vectors in the basis $\beta$ will be of that form. Is that even possible? How can I express a vector $(0,1,0,0) \in \mathbb{R}^4$ with respect to that basis? Am I completely misunderstanding something?

Can someone point me in the right direction?

Thank you.

EDIT: I think I've found $v_2 = (0,\frac{1}{2},-1,1)$, but I'm still stuck. I hope this edit bumps the post so someone can help me.

Best Answer

The $i$-th column of the matrix $L_\alpha^\beta$ contains the coordinates of $L(\alpha_i)$ with respect to the basis $\beta$. Since the second column is the zero vector, $\alpha_2$ must be an element of the null space (or kernel) of $L$.
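To find such a kernel vector concretely, you can solve $L(x,y,z,w)=0$ by hand, or check your answer numerically. A minimal NumPy sketch (the matrix `A` below is the standard matrix read off from the formula for $L$; the SVD-based null-space computation is one common approach, not part of the original answer):

```python
import numpy as np

# Standard matrix of L(x,y,z,w) = (2x, 2y-w, -2x+4y-2w, 5z+5w)
A = np.array([[ 2, 0, 0,  0],
              [ 0, 2, 0, -1],
              [-2, 4, 0, -2],
              [ 0, 0, 5,  5]], dtype=float)

# Null space via SVD: the rows of Vt whose singular values are
# (numerically) zero span ker(L).
_, s, Vt = np.linalg.svd(A)
kernel = Vt[s < 1e-10]          # here: a single row, so dim ker(L) = 1

# Rescale so the second entry is 1; this recovers (0, 1, -2, 2).
v = kernel[0] / kernel[0][1]
```

Both $(0,\frac{1}{2},-1,1)$ from the edit and $(0,1,-2,2)$ are scalar multiples of this kernel direction, which is why either choice works.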

Based on your edit, I believe you were already looking in this direction. I'd take $(0,1,-2,2)$, but yours works too. Since you want its image in the second column, take this vector as $\alpha_2$. Extend $\alpha$ to a complete basis of $\mathbb{R}^4$, e.g. by adding standard basis vectors: $$\alpha = \left\{ \begin{pmatrix} 1 \\ 0 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \\ -2 \\ 2 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 0 \\ 1 \\ 0 \end{pmatrix} \right\}$$ Now the trick to the simple form of $L_\alpha^\beta$ is to choose the basis vectors of $\beta$ carefully: use the images of the $\alpha$'s! That is, pick $\beta_j = L(\alpha_j)$ for $j=1,3,4$ — but not for $j=2$, because by our choice of $\alpha_2$ we have $L(\alpha_2) = 0$. Extend $\beta$ to a basis of $\mathbb{R}^4$ by picking, for example, $\beta_2 = (0,0,1,0)^T$.


For $\beta_1$, we find the image of $\alpha_1$: $$\beta_1 = L(\alpha_1) = L \begin{pmatrix} 1 \\ 0 \\ 0 \\ 0 \end{pmatrix} =\begin{pmatrix} 2 \\ 0 \\ -2 \\ 0 \end{pmatrix} $$ Repeat this for $\beta_3$ and $\beta_4$ to get: $$\beta = \left\{ \begin{pmatrix} 2 \\ 0 \\ -2 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 0 \\ 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 2 \\ 4 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 0 \\ 0 \\ 5 \end{pmatrix} \right\}$$
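As a sanity check (not part of the exam solution), you can verify the construction numerically: if $P_\alpha$ and $P_\beta$ are the matrices whose columns are the basis vectors, then $L_\alpha^\beta = P_\beta^{-1} A P_\alpha$, where $A$ is the standard matrix of $L$. A quick NumPy sketch:

```python
import numpy as np

# Standard matrix of L(x,y,z,w) = (2x, 2y-w, -2x+4y-2w, 5z+5w)
A = np.array([[ 2, 0, 0,  0],
              [ 0, 2, 0, -1],
              [-2, 4, 0, -2],
              [ 0, 0, 5,  5]], dtype=float)

# Columns are the alpha_i and beta_j chosen above.
P_alpha = np.array([[1,  0, 0, 0],
                    [0,  1, 1, 0],
                    [0, -2, 0, 1],
                    [0,  2, 0, 0]], dtype=float)
P_beta = np.array([[ 2, 0, 0, 0],
                   [ 0, 0, 2, 0],
                   [-2, 1, 4, 0],
                   [ 0, 0, 0, 5]], dtype=float)

# P_alpha maps alpha-coordinates to standard coordinates;
# P_beta^{-1} maps standard coordinates to beta-coordinates.
M = np.linalg.inv(P_beta) @ A @ P_alpha
```

Here `M` should reproduce the matrix $L_\alpha^\beta$ given in the question, with columns $(1,0,0,0)^T$, $0$, $(0,0,1,0)^T$, $(0,0,0,1)^T$.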


Notice that other choices are possible: $\alpha_2$ is fixed (up to scaling), and you complete $\alpha$ to a basis by adding any three linearly independent vectors. You then choose $\beta_1, \beta_3, \beta_4$ as the images of those three added vectors to obtain the given, simple form of the transformation matrix. These three images span the image of $L$, and you extend $\beta$ to a basis by adding, as $\beta_2$, any vector linearly independent of them.
