Relationship between a matrix of vectors and a matrix of equations

linear-algebra, matrices

I'm trying to develop a more in-depth intuition for linear algebra, and I've come across a couple of uses of matrices that seem like they should be related, but I'm not sure how.

Thinking of a matrix as columns of vectors that determine a transformation of the basis, we can place the vectors $\begin{bmatrix}1 \\ 3 \end{bmatrix}$ and $\begin{bmatrix}5 \\ 7 \end{bmatrix}$ into a matrix:

$$
\begin{bmatrix}
1 & 5 \\
3 & 7
\end{bmatrix}
$$

If $\hat{i}$ is the x-axis unit vector and $\hat{j}$ is the y-axis unit vector, then $$\begin{bmatrix}1 \\ 3 \end{bmatrix} = \hat{i} + 3\hat{j} \quad\text{and}\quad \begin{bmatrix}5 \\ 7 \end{bmatrix} = 5\hat{i} + 7\hat{j},$$ so any vector in the space can have its coordinates determined after the transformation.
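For example, the vector $\begin{bmatrix}2 \\ 1 \end{bmatrix}$ lands on

$$2\begin{bmatrix}1 \\ 3 \end{bmatrix} + 1\begin{bmatrix}5 \\ 7 \end{bmatrix} = \begin{bmatrix}7 \\ 13 \end{bmatrix}$$

after the transformation.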

But then we can also use matrices as a way to express systems of linear equations. So if we have:

$$\begin{aligned}
x + 5y &= 0 \\
3x + 7y &= 0
\end{aligned}$$

And they can be expressed as:

$$\begin{bmatrix}1 & 5 \\ 3 & 7 \end{bmatrix} \begin{bmatrix}x \\ y \end{bmatrix} = \begin{bmatrix}0 \\ 0 \end{bmatrix}$$

Is there some relationship between the first transformation matrix and this system of linear equations with the same coefficient values?

Best Answer

There are two fundamental and complementary ways to specify a subspace $V$ of $\mathbb R^n$. The first is as the span of some set of vectors $\{v_1,\dots,v_k\}$, i.e., as the set of all linear combinations $\sum_{i=1}^kc_iv_i$. In matrix form, this is $$\begin{bmatrix}v_1&\cdots&v_k\end{bmatrix}\begin{bmatrix}c_1\\\vdots\\c_k\end{bmatrix}$$ i.e., $V$ is the column space of the matrix with the $v_i$ as its columns. We can view the product $Av$ as specifying a particular element of $A$’s column space, namely, the linear combination of the columns of $A$ with the corresponding components of $v$ as coefficients.
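Reading the matrix in your question this way,

$$\begin{bmatrix}1 & 5 \\ 3 & 7\end{bmatrix}\begin{bmatrix}x \\ y\end{bmatrix} = x\begin{bmatrix}1 \\ 3\end{bmatrix} + y\begin{bmatrix}5 \\ 7\end{bmatrix},$$

so every output of the transformation is a linear combination of the two column vectors; since those columns are linearly independent (the determinant $1\cdot 7 - 5\cdot 3 = -8$ is nonzero), the column space is all of $\mathbb R^2$.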

The other way is to specify a set of vectors $\{w_1,\dots,w_l\}$ that span its orthogonal complement $W=V^\perp$. $V$ is then the subspace that consists of all the vectors that are orthogonal to every $w_i$. This condition can be expressed as the set of homogeneous linear equations $$\begin{align} w_1\cdot(x_1,\dots,x_n)&=0\\ w_2\cdot(x_1,\dots,x_n)&=0 \\ \vdots \\ w_l\cdot(x_1,\dots,x_n)&=0. \end{align}$$ In matrix form, this is $$\begin{bmatrix}w_1^T\\\vdots\\w_l^T\end{bmatrix}\begin{bmatrix}x_1\\\vdots\\x_n\end{bmatrix}=\begin{bmatrix}0\\\vdots\\0\end{bmatrix}.$$ This gives us another view of the product $Av$ as the vector of dot products of $v$ with the rows of $A$. The subspace $V$ is then the null space of the matrix that has the $w_i$ for its rows, and we can see that the row space and null space of a matrix are orthogonal complements.
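In your example, the rows are $w_1=(1,5)$ and $w_2=(3,7)$, and the system $x+5y=0$, $3x+7y=0$ asks for every vector $(x,y)$ that is orthogonal to both rows. Because those two rows are linearly independent and so span all of $\mathbb R^2$, the only such vector is $(0,0)$: the null space is trivial, which is exactly the unique solution of your homogeneous system.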

We can connect the two views through the equation $Ax=b$. The row view shows us that this equation is equivalent to a system of linear equations involving dot products of the rows of $A$ with $x$, while the column view tells us that the matrix equation, and so the system of linear equations, has a solution iff $b$ can be expressed as a linear combination of the columns of $A$, in other words, iff $b$ lies in the column space of $A$.
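As an illustration, with $A=\begin{bmatrix}1 & 5 \\ 3 & 7\end{bmatrix}$ the columns span all of $\mathbb R^2$, so $Ax=b$ is solvable for every $b$; for instance, $b=\begin{bmatrix}6 \\ 10\end{bmatrix}$ is the combination $1\cdot\begin{bmatrix}1 \\ 3\end{bmatrix}+1\cdot\begin{bmatrix}5 \\ 7\end{bmatrix}$, giving the solution $x=y=1$.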

I’ve described the second method in terms of the standard basis of $\mathbb R^n$ and the Euclidean inner (dot) product, but it’s possible to express the same idea in a way that requires neither an inner product nor a choice of basis. Instead of elements of $V$’s orthogonal complement, we can use elements of the dual space that annihilate $V$, i.e., linear functionals $\varphi:\mathbb R^n\to\mathbb R$ such that $\varphi[v]=0$ for all $v\in V$.
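In your example, this amounts to treating each equation as a linear functional, $\varphi_1(x,y)=x+5y$ and $\varphi_2(x,y)=3x+7y$: the solution set of the system is precisely the subspace annihilated by both functionals, and stating it this way makes no reference to a dot product at all.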