[Math] Find a basis of $\ker(f)$ for this linear mapping

abstract-algebra, linear-algebra, matrices

$M=\begin{pmatrix}
1 & 3 & -1\\
-1 & 0 & -2\\
1 & 1 & 1
\end{pmatrix}$
is a real matrix and $f: \mathbb{R}^3 \rightarrow \mathbb{R}^3$, $f(x)= Mx$ is a linear mapping. Find a basis of $\ker(f)$.


$\det(M)= 0 \Rightarrow$ the kernel exists (is that necessary to check when the task says "find a basis…"?)
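For completeness, expanding the determinant along the first row:

$$\det(M) = 1\,\bigl(0\cdot 1 - (-2)\cdot 1\bigr) - 3\,\bigl((-1)\cdot 1 - (-2)\cdot 1\bigr) + (-1)\,\bigl((-1)\cdot 1 - 0\cdot 1\bigr) = 2 - 3 + 1 = 0$$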

$\begin{pmatrix}
1 & 3 & -1\\
-1 & 0 & -2\\
1 & 1 & 1
\end{pmatrix}\begin{pmatrix}
x\\
y\\
z
\end{pmatrix}=\begin{pmatrix}
0\\
0\\
0
\end{pmatrix}$

Solving this system (you'll just have to assume I formed it correctly), we get:

$y=z$

$x=-2z$
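(One way to see this: the second equation, $-x - 2z = 0$, gives $x = -2z$ directly; substituting into the third, $x + y + z = 0$, gives $y = z$; the first equation, $x + 3y - z = 0$, is then automatically satisfied since $-2z + 3z - z = 0$.)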

So we have $\ker(f)=\left\{\begin{pmatrix}
-2z\\
z\\
z
\end{pmatrix} \mid z \in \mathbb{R}\right\} \Rightarrow B_{\ker(f)}= \left\{\begin{pmatrix}
-2\\
1\\
1
\end{pmatrix}\right\}$


Is the procedure alright?

Best Answer

About the kernel

As some others have noted, the kernel (also known as the null space) always exists. Why? Well, let's look at the definition of the kernel of a linear map: let $V$ and $W$ be vector spaces. The kernel of a linear map $L: V \to W$ is the set of all elements $v$ of $V$ such that $L(v) = 0$ (where $0$ here denotes the zero vector in $W$). So for any linear map, the kernel contains at least the zero vector, since $L(0) = 0$.
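Why $L(0) = 0$ holds for every linear map, spelled out using linearity (homogeneity with the scalar $0$):

$$L(0_V) = L(0 \cdot 0_V) = 0 \cdot L(0_V) = 0_W$$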

Now, to get a basis of the null space, the kernel must be nontrivial, i.e. $\ker(L) \neq \{ 0 \}$ (although, as was seen above, $0 \in \ker(L)$ for any linear transformation). We can determine this by taking the determinant of the matrix representing the linear map: if the determinant is zero, we know that the null space is nontrivial. Why is that? Well, perhaps an easy way to see it is via the invertible matrix theorem [0]. To use the theorem, the matrix must be square (which we have) and must have coefficients in a field (and the real numbers are a field). From the invertible matrix theorem we then know that the determinant of $M$ is nonzero if and only if the dimension of the null space is zero. Thus, since we know the determinant is zero, we also know that the dimension of the null space is not zero.

Similarly, the invertible matrix theorem also states that if a square matrix $A$ (with coefficients in a field) is row equivalent to $I_n$, then its determinant is nonzero (so you don't even need to take the determinant to determine whether the dimension of the null space is zero; just row reduce). If the matrix is row reducible to $I_n$, the dimension of the null space is zero; otherwise, its dimension is greater than $0$.
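In symbols, the relevant chain of equivalences from the invertible matrix theorem, for a square matrix $A$ with entries in a field, is:

$$A \text{ is row equivalent to } I_n \iff \det(A) \neq 0 \iff \ker(A) = \{0\}$$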

The approach

I believe the approach you took in your answer is correct. To be complete, I'll provide the procedure.

Suppose

$$M=\begin{pmatrix} 1 & 3 & -1\\ -1 & 0 & -2\\ 1 & 1 & 1 \end{pmatrix}$$

is a real matrix and $f: \mathbb{R}^3 \rightarrow \mathbb{R}^3$, $f(x)= Mx$ is a linear mapping. Find a basis of $\ker(f)$.

First, row reduce the matrix to reduced row echelon form (RREF). Thus

$$\mathrm{RREF} (M)=\mathrm{RREF}\begin{pmatrix} 1 & 3 & -1\\ -1 & 0 & -2\\ 1 & 1 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 0 & 2\\ 0 & 1 & -1\\ 0 & 0 & 0 \end{pmatrix}$$
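For completeness, one possible sequence of row operations is:

$$\begin{pmatrix} 1 & 3 & -1\\ -1 & 0 & -2\\ 1 & 1 & 1 \end{pmatrix} \xrightarrow[R_3 \to R_3 - R_1]{R_2 \to R_2 + R_1} \begin{pmatrix} 1 & 3 & -1\\ 0 & 3 & -3\\ 0 & -2 & 2 \end{pmatrix} \xrightarrow{R_2 \to \tfrac{1}{3}R_2} \begin{pmatrix} 1 & 3 & -1\\ 0 & 1 & -1\\ 0 & -2 & 2 \end{pmatrix} \xrightarrow[R_3 \to R_3 + 2R_2]{R_1 \to R_1 - 3R_2} \begin{pmatrix} 1 & 0 & 2\\ 0 & 1 & -1\\ 0 & 0 & 0 \end{pmatrix}$$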

Now we can read off a system of linear equations from the nonzero rows of the RREF, each with leading entry $1$ and each set equal to zero. Thus

$$\begin{align} 1x + 0y + 2z &= 0 \\ 0x + 1y - 1z &= 0 \end{align}$$

Thus we can take $x = -2z$ and $y = z$, with $z$ as a free variable: for each equation to equal zero, $x$ and $y$ must be exactly these multiples of $z$. Thus

$$\ker(f)=\left\{\begin{pmatrix} -2z\\ z\\ z \end{pmatrix} \mid z \in \mathbb{R}\right\} $$
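Equivalently, every element of the kernel is a scalar multiple of a single fixed vector:

$$\begin{pmatrix} x\\ y\\ z \end{pmatrix} = \begin{pmatrix} -2z\\ z\\ z \end{pmatrix} = z\begin{pmatrix} -2\\ 1\\ 1 \end{pmatrix}$$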

Letting $z = 1$ we have an explicit basis:

$$\begin{pmatrix} -2 \\ 1 \\ 1 \end{pmatrix}$$
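As a sanity check (not part of the original argument), the result can be verified symbolically, for example with SymPy:

```python
from sympy import Matrix

# The matrix from the question
M = Matrix([[ 1, 3, -1],
            [-1, 0, -2],
            [ 1, 1,  1]])

print(M.det())        # 0  -> the kernel is nontrivial

# nullspace() returns a list of basis vectors for ker(f)
print(M.nullspace())  # [Matrix([[-2], [1], [1]])]

# Direct check that the candidate basis vector is mapped to the zero vector
v = Matrix([-2, 1, 1])
print(M * v)          # Matrix([[0], [0], [0]])
```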

[0] https://math.dartmouth.edu/archive/m22f06/public_html/imt.pdf
