From S.L Linear Algebra:
Find the matrix associated with the following linear maps. The vectors
are written horizontally with a transpose sign for typographical
reasons.
(a) $F:\mathbb{R}^4 \rightarrow \mathbb{R}^2$ given by $F\left ((x_1, x_2, x_3, x_4)^T \right)=(x_1, x_2)^T$ (the projection)
Solution Attempt
I will use Theorem 2.1 from the book:
Let $L: K^n \rightarrow K^m$ be a linear map. Then there exists a
unique matrix $A$ such that $L = L_A$.
Hence, for all $X$ we have $L(X)=AX$.
In this case, some matrix $A$ is the matrix associated with the linear map $F$, so that:
$$F(X^T)=AX^T$$
$$F\left ((x_1, x_2, x_3, x_4)^T \right)=A\left ((x_1, x_2, x_3, x_4)^T \right)$$
$$AX^T=A\left ((x_1, x_2, x_3, x_4)^T \right)=(x_1, x_2)^T$$
Solving for $A$:
$$A=\left ((x_1, x_2, x_3, x_4)^T \right)^{-1}(x_1, x_2)^T$$
Now this makes no sense, because $X$ would have to be a non-degenerate square matrix in order to be invertible, but $X$ is just a vector of dimension $4$.
Perhaps I could invert $A$ instead (then I would have to show that $F$ is injective, i.e. has trivial kernel), but I don't see any point in doing this.
Is $A$ truly a matrix associated with $F$? If not, how do I find the matrix that truly is associated with $F$?
P.S.
I also found this in the book:
Let $F: \mathbb{R}^3 \rightarrow \mathbb{R}^2$ be the projection, in
other words the mapping such that $F(x_1, x_2, x_3) = (x_1, x_2)$.
Then the matrix associated with $F$ is:$$\begin{pmatrix} 1 & 0 & 0 \\ 0&1 & 0 \end{pmatrix}$$
I don't know exactly how it was calculated, but the matrix above contains only standard basis vectors, which I believe must mean something.
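Even without knowing how the book derived it, the claim is easy to check numerically. A minimal sketch using NumPy (the test vector below is an arbitrary choice of mine):

```python
import numpy as np

# The book's matrix for the projection R^3 -> R^2
A = np.array([[1, 0, 0],
              [0, 1, 0]])

x = np.array([5, -2, 7])  # arbitrary test vector in R^3

# Multiplying by A keeps exactly the first two coordinates
print(A @ x)  # -> [ 5 -2]
```

The product drops $x_3$ and returns $(x_1, x_2)^T$, which is exactly what the projection $F$ does.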
Thank you!
Best Answer
To begin with, matrix multiplication is not commutative, and not every matrix is invertible. It is a fundamental error to move a matrix to the other side of an equation by "inverting" it; here $X$ is a $4\times 1$ column vector, which is not even square, so it has no inverse at all.
And keep in mind: $\mathbb{R}^2$ and $\mathbb{R}^4$ are not isomorphic, so $F$ cannot be invertible either. A projection discards the last two coordinates, which is exactly why the problem points out that $F$ is a projection.
Now let's calculate:
The first step is to write out what the matrix operation looks like, to check that the product is well defined; this alone clears up most of the confusion.
$$ F(\left ( \begin{matrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{matrix} \right ) )=\left ( \begin{matrix} x_1 \\ x_2 \end{matrix} \right ) $$
i.e.
$$ \left ( \begin{matrix} a_{11} & a_{12} & a_{13} & a_{14} \\ a_{21} & a_{22} & a_{23} & a_{24} \end{matrix} \right ) .\left ( \begin{matrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{matrix} \right ) =\left ( \begin{matrix} x_1 \\ x_2 \end{matrix} \right ) $$
where we introduce the matrix with entries $(a_{ij})$ as the matrix associated with the linear transformation. Since $F$ maps $\mathbb{R}^4$ to $\mathbb{R}^2$, $A$ must be a $2\times 4$ matrix.
When you do the calculations, you'll find:
$$ \begin{cases} a_{11}x_{1} + a_{12}x_{2} + a_{13}x_{3} + a_{14}x_{4} = x_1 \\ a_{21}x_{1} + a_{22}x_{2} + a_{23}x_{3} + a_{24}x_{4} = x_2 \end{cases} $$
i.e.
$$ \begin{cases} (a_{11}-1)x_{1} + a_{12}x_{2} + a_{13}x_{3} + a_{14}x_{4} = 0 \\ a_{21}x_{1} + (a_{22}-1)x_{2} + a_{23}x_{3} + a_{24}x_{4} = 0 \end{cases} $$
But these equations must hold for every choice of $x_1, x_2, x_3, x_4$, so each coefficient must be zero: $(a_{11}-1)=0$ and $(a_{22}-1)=0$, giving $a_{11}=1$ and $a_{22}=1$, while every other $a_{ij}=0$.
Thus, we conclude that the matrix associated with $F$ is:
$$ \left ( \begin{matrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{matrix} \right )$$
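As a sanity check, we can verify numerically that this matrix really implements $F$. A minimal sketch using NumPy (the test vector is an arbitrary choice of mine):

```python
import numpy as np

# The matrix found above for F: R^4 -> R^2
A = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]])

x = np.array([3, 1, 4, 1])  # arbitrary test vector in R^4

# A x returns the first two coordinates, as the projection requires
print(A @ x)  # -> [3 1]
```

Equivalently, note that the columns of $A$ are $F(e_1), F(e_2), F(e_3), F(e_4)$ written for the standard basis vectors $e_i$ of $\mathbb{R}^4$, which explains the pattern the question observed in the book's $3\times 2$ example.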