[Math] How to find the null space of a transformation matrix

linear-algebra, linear-transformations

I have a transformation matrix $A$ and a vector $\overrightarrow{x}\in \mathbb{R}^3$ such that
$$
A\overrightarrow{x}=
\begin{bmatrix}3\\2\\7\end{bmatrix}
$$
and $\overrightarrow{x}$ has the general solution:
$$
\overrightarrow{x}=
\begin{bmatrix}
1\\0\\0
\end{bmatrix}+
s\begin{bmatrix}
1\\1\\0
\end{bmatrix}+
t\begin{bmatrix}
-1\\0\\1
\end{bmatrix}
$$

I need to find $\operatorname{null}(A)$. I'm wondering if it is as simple as counting the parameters in the general solution, $s$ and $t$, so that $\operatorname{null}(A)=2$.

If it isn't that simple, my thought was to split the general solution up into the standard basis vectors to try to find what $A$ is, then row-reduce it if possible:

$$
\overrightarrow{x}=
\overrightarrow{e_1}+
s(\overrightarrow{e_1}+\overrightarrow{e_2})+
t(-\overrightarrow{e_1}+\overrightarrow{e_3})
$$
$$
\overrightarrow{x}=
(1+s-t)\overrightarrow{e_1}+
s\overrightarrow{e_2}+
t\overrightarrow{e_3}
$$

This brings me to the equation:
$$
A\overrightarrow{x}=
A[(1+s-t)\overrightarrow{e_1}+
s\overrightarrow{e_2}+
t\overrightarrow{e_3}]
=\begin{bmatrix}3\\2\\7\end{bmatrix}
$$
$$
A = [A\overrightarrow{e_1} | A\overrightarrow{e_2} | A\overrightarrow{e_3}]
$$
Which I think comes out to:
$$
A = \begin{bmatrix}
1+s-t & 0 & 0\\
0 & s & 0\\
0 & 0 & t
\end{bmatrix}
$$
but I'm unsure of what's next, or even if my $A$ is correct.

Best Answer

Yep, it's just 2, as long as the two direction vectors in your "general solution" are linearly independent (which they should be if your method for solving it was sound, but it's worth stating explicitly).

Basically, if you have $\vec x = \vec x_0 + \alpha ~ \vec a + \beta ~ \vec b$ such that $A \vec x = \vec y$, then you know that $A \vec a = A \vec b = \vec 0$. Simple proof: since you know the general solution, choose $\alpha = \beta = 0$ to get the particular solution, $A \vec x_0 = \vec y$. Then take $\alpha = 1, \beta = 0$ to find $\vec y = A \vec x = A \vec x_0 + A \vec a = \vec y + A \vec a$; comparing the beginning and end of that chain gives $\vec y = \vec y + A \vec a$, so $A \vec a = \vec 0$. The same argument with $\alpha = 0, \beta = 1$ gives $A \vec b = \vec 0$.

So you don't just have the nullity ($2$): you have the entire kernel/nullspace,
$$\operatorname{ker} A = \left\{ s\begin{bmatrix}1\\1\\0\end{bmatrix} + t\begin{bmatrix}-1\\0\\1\end{bmatrix} ~\middle|~ s, t \in \mathbb{R} \right\}.$$
If that seems super-simple, well, that's why we love linearity: it's super-simple.
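In fact, because $\vec x_0$, $\vec a$, $\vec b$ form a basis of $\mathbb{R}^3$ here, the three facts $A\vec x_0 = \vec y$ and $A\vec a = A\vec b = \vec 0$ determine $A$ completely. Here is a quick symbolic check of that (a sketch with SymPy; the variable names are mine):

```python
import sympy as sp

# Unknown 3x3 matrix with symbolic entries a0..a8
A = sp.Matrix(3, 3, sp.symbols('a0:9'))

x0 = sp.Matrix([1, 0, 0])   # particular solution
a = sp.Matrix([1, 1, 0])    # first kernel direction
b = sp.Matrix([-1, 0, 1])   # second kernel direction
y = sp.Matrix([3, 2, 7])

# Impose A*x0 = y and A*a = A*b = 0: nine linear equations in nine unknowns
eqs = list(A * x0 - y) + list(A * a) + list(A * b)
sol = sp.solve(eqs, list(A))

# The system has a unique solution, so these conditions pin A down
print(A.subs(sol))  # Matrix([[3, -3, 3], [2, -2, 2], [7, -7, 7]])
```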

Finding an actual expression for $A$ will be more complicated in general. First realize that your vectors allow you to express an arbitrary vector as $\vec v = \xi ~ \vec x_0 + \alpha ~ \vec a + \beta ~ \vec b$. This is a mapping between the $\mathbb{R}^3$ that $\vec v$ lives in and a new $\mathbb{R}^3$ that the coefficient vector $[\xi; \alpha; \beta]$ lives in. Transforming from coefficient space to $\vec v$-space is given by the 3x3 matrix
$$ \mathbf C = \left [~ \vec x_0 ~~ \vec a ~~ \vec b ~ \right] = \begin{bmatrix}1 & 1 & -1 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}.$$
You also know how $A$ acts on each of those three vectors ($A \vec x_0 = \vec y$, $A \vec a = A \vec b = \vec 0$), so the composition $A \mathbf C$, which maps coefficients to outputs, is
$$\tilde A = A \mathbf C = \left [~ \vec y ~~ \vec 0 ~~ \vec 0 ~\right] = \begin{bmatrix}3 & 0 & 0 \\ 2 & 0 & 0 \\ 7 & 0 & 0 \end{bmatrix},$$
and therefore
$$A = \tilde A ~ \mathbf C^{-1} = \begin{bmatrix}3 & 0 & 0 \\ 2 & 0 & 0 \\ 7 & 0 & 0 \end{bmatrix} \begin{bmatrix}1 & -1 & 1 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix}3 & -3 & 3 \\ 2 & -2 & 2 \\ 7 & -7 & 7 \end{bmatrix}.$$
It's pretty obvious that the nullity there is 2.
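As a sanity check, you can verify numerically that this $A$ sends every $\vec x$ of the given form to $\vec y$ and has the claimed kernel (a sketch using NumPy; the variable names are mine):

```python
import numpy as np

# Basis-change matrix C = [x0 | a | b] built from the general solution
x0 = np.array([1.0, 0.0, 0.0])
a = np.array([1.0, 1.0, 0.0])
b = np.array([-1.0, 0.0, 1.0])
C = np.column_stack([x0, a, b])

# A maps x0 -> y and a, b -> 0, so A @ C = [y | 0 | 0]
y = np.array([3.0, 2.0, 7.0])
A_tilde = np.column_stack([y, np.zeros(3), np.zeros(3)])

# Recover A = A_tilde @ C^{-1}
A = A_tilde @ np.linalg.inv(C)

# Every x = x0 + s*a + t*b satisfies A @ x = y, for any s, t
s, t = 2.5, -1.3
x = x0 + s * a + t * b
print(np.allclose(A @ x, y))                         # True

# The two direction vectors lie in the kernel, and nullity = 3 - rank = 2
print(np.allclose(A @ a, 0), np.allclose(A @ b, 0))  # True True
print(3 - np.linalg.matrix_rank(A))                  # 2
```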