What’s the relationship between linear transformations and systems of equations?

linear-algebra · linear-transformations · matrices · systems-of-equations

I began watching Gilbert Strang's lectures on Linear Algebra and soon realized that I lacked an intuitive understanding of matrices, especially as to why certain operations (e.g. matrix multiplication) are defined the way they are. Someone suggested to me 3Blue1Brown's video series (https://youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab) and it has helped immensely. However, it seems to me that they present matrices in completely different ways: 3Blue1Brown explains that they represent linear transformations, while Strang depicts matrices as systems of linear equations. What's the connection between these two different ideas?

Furthermore, I understand why operations on matrices are defined the way they are when we think of them as linear maps, but this intuition breaks down when matrices are thought of in other ways. Since matrices are used to represent all sorts of things (linear transformations, systems of equations, data, etc.), why do operations that are seemingly defined for linear maps work the same across all these different contexts?

Best Answer

Not sure if I am addressing what you are really after, but when there is something I don't understand, I always like considering simple examples. Here is the same matrix used in two different contexts. First, as a linear transformation: $$ A = \begin{bmatrix} 2 & 0 \\ 0 & 1 \end{bmatrix} $$ If you input some vector, say $[x_1, x_2] = [1, 1]$, the transformation stretches it in the $x_1$ direction into $[2, 1]$.
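To make the "matrix as transformation" view concrete, here is a minimal plain-Python sketch (the function name `apply` is my own choice; a library such as NumPy would write this as `A @ x`) that applies $A$ to the vector $[1, 1]$ using the usual row-times-vector definition:

```python
def apply(A, x):
    """Return the matrix-vector product A x, with A given as a list of rows."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

# The matrix from the example: stretch by 2 in the x1 direction.
A = [[2, 0],
     [0, 1]]

print(apply(A, [1, 1]))  # the input [1, 1] is stretched to [2, 1]
```

Each entry of the output is the dot product of one row of $A$ with the input vector, which is exactly how the system-of-equations view reads off one equation per row.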

The same matrix appears in a related linear equation: given the stretched vector $[2,1]$, which $x$ produced it? $$ Ax = \begin{bmatrix} 2 \\ 1 \end{bmatrix} $$ In matrix form: $$ \begin{bmatrix} 2 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} 2 \\ 1 \end{bmatrix} $$ And as a system of linear equations: $$ 2x_1 + 0 x_2 = 2 \\ 0x_1 + x_2 = 1 $$
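Going the other way, solving $Ax = [2,1]$ means undoing the transformation. A small sketch, assuming an invertible $2 \times 2$ matrix and using Cramer's rule (the function name `solve_2x2` is hypothetical; a library routine like `numpy.linalg.solve` would handle the general case):

```python
def solve_2x2(M, rhs):
    """Solve M x = rhs for an invertible 2x2 matrix via Cramer's rule."""
    (a, b), (c, d) = M
    e, f = rhs
    det = a * d - b * c  # determinant of M
    if det == 0:
        raise ValueError("matrix is singular; the system has no unique solution")
    # Cramer's rule: replace one column of M with rhs and divide determinants.
    return [(e * d - b * f) / det, (a * f - e * c) / det]

# Which x was stretched into [2, 1]?
print(solve_2x2([[2, 0], [0, 1]], [2, 1]))  # recovers the original vector [1.0, 1.0]
```

Running it recovers $[1, 1]$, the vector we started with: the transformation view and the equation-solving view are the forward and inverse directions of the same map.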
