Solution verification relating determinants and planes alongside solution to a system of equations

cross product, determinant, linear algebra

I have this multipart question here. I did all of it except the last part and I was wondering if I could get a bit of help with that.

Let $$\vec{u}=(u_1,u_2,u_3),\vec{v}=(v_1,v_2,v_3),\vec{w}=(w_1,w_2,w_3)$$
be $3$ vectors in $\mathbb{R}^3$. Define a matrix $A$ whose rows are these three vectors:
$$\begin{bmatrix} u_1 & u_2 & u_3 \\ v_1 & v_2 & v_3\\ w_1 & w_2 & w_3\\
\end{bmatrix}$$

a) Show that $$\det(A)=\vec{u} \cdot (\vec{v} \times \vec{w})$$

This proof was trivial. Simply compute $\vec{v} \times \vec{w} = \begin{vmatrix} i & j & k \\ v_1 & v_2 & v_3\\ w_1 & w_2 & w_3\\
\end{vmatrix}$
and dot the result with $\vec{u}=(u_1,u_2,u_3)$. Compute the determinant (I used cofactor expansion) and you'll get both sides to match.
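A quick numeric sanity check of this identity, in pure Python with arbitrary sample vectors (the specific numbers are mine, not part of the problem):

```python
# Verify det(A) = u . (v x w) for a sample u, v, w.

def det3(m):
    """Determinant of a 3x3 matrix via cofactor expansion along the first row."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def cross(v, w):
    """Cross product v x w in R^3."""
    return (v[1] * w[2] - v[2] * w[1],
            v[2] * w[0] - v[0] * w[2],
            v[0] * w[1] - v[1] * w[0])

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

u, v, w = (1, 2, 3), (4, 5, 6), (7, 8, 10)
assert det3([u, v, w]) == dot(u, cross(v, w))  # both equal -3 here
```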

b) Suppose that $\vec{v},\vec{w}$ are not collinear. Show that $\vec{v} \times \vec{w}$ is normal to the plane given by $\text{Span}\{\vec{v},\vec{w}\}$.

Since $\vec{v}$ and $\vec{w}$ are not collinear, they span a plane, and any vector $\vec{u}$ in that plane can be written as a linear combination $\vec{u}=\alpha\vec{v}+\beta\vec{w}$ for some real numbers $\alpha, \beta$. To show $\vec{v} \times \vec{w}$ is normal to the plane, it suffices to show that $\vec{u}\cdot(\vec{v} \times \vec{w})=0$:

$\vec{u}\cdot(\vec{v} \times \vec{w})$

$=(\alpha\vec{v}+\beta\vec{w})\cdot(\vec{v} \times \vec{w})$

$=\alpha\,\vec{v}\cdot(\vec{v} \times \vec{w})+\beta\,\vec{w}\cdot(\vec{v} \times \vec{w})=0$

Each term is $0$ because, by part (a), $\vec{v}\cdot(\vec{v} \times \vec{w})$ and $\vec{w}\cdot(\vec{v} \times \vec{w})$ are determinants with a repeated row, and such determinants evaluate to $0$.
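A numeric check of this orthogonality, again in pure Python with arbitrary sample vectors: $\vec{v}\times\vec{w}$ dots to zero with $\vec{v}$, with $\vec{w}$, and hence with any combination $\alpha\vec{v}+\beta\vec{w}$.

```python
# Check that v x w is orthogonal to v, to w, and to an arbitrary
# linear combination alpha*v + beta*w.

def cross(v, w):
    return (v[1] * w[2] - v[2] * w[1],
            v[2] * w[0] - v[0] * w[2],
            v[0] * w[1] - v[1] * w[0])

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

v, w = (1, 0, 2), (0, 3, 1)
n = cross(v, w)
assert dot(v, n) == 0 and dot(w, n) == 0

alpha, beta = 5, -2  # arbitrary coefficients
u = tuple(alpha * vi + beta * wi for vi, wi in zip(v, w))
assert dot(u, n) == 0
```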

c) Use parts (a) and (b) to argue that $\vec{u},\vec{v},\vec{w}$ all lie on the same plane if and only if $\det(A)=0$.

By part (a), $\det(A)=0$ if and only if $\vec{u}\cdot(\vec{v} \times \vec{w})=0$, i.e. if and only if $\vec{u}$ is orthogonal to $\vec{v} \times \vec{w}$. By part (b), $\vec{v} \times \vec{w}$ is normal to the plane $\text{Span}\{\vec{v},\vec{w}\}$, and a vector is orthogonal to the normal of a plane through the origin exactly when it lies in that plane. Hence $\vec{u},\vec{v},\vec{w}$ all lie in the same plane if and only if $\det(A)=0$.

d) Suppose that the matrix $A$ row reduces to
$$\begin{bmatrix} 1 & 2 & 0 \\ 0 & 0 & 1\\ 0 & 0 & 0\\
\end{bmatrix}$$

Now consider the system of equations:
\begin{cases}
u_1x_1+u_2x_2+u_3x_3=0 \\
v_1x_1+v_2x_2+v_3x_3=0 \\
w_1x_1+w_2x_2+w_3x_3=0 \\
\end{cases}

Each equation defines a plane in $\mathbb{R}^3$. Describe how the three planes intersect.

The row-reduced matrix gives the equations $x_1+2x_2=0$ and $x_3=0$, with $x_2$ free. The three planes therefore intersect along the line $\vec{x}=t(-2,1,0)$, $t\in\mathbb{R}$.
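A quick check in pure Python that every point on the line $t(-2,1,0)$ satisfies the row-reduced system (each row of the reduced matrix dots to zero with $\vec{x}$):

```python
# Verify that x = t*(-2, 1, 0) solves the row-reduced homogeneous system.

R = [(1, 2, 0), (0, 0, 1), (0, 0, 0)]  # the given row-reduced matrix

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

for t in (-3, 0, 1, 7):          # a few sample parameter values
    x = (-2 * t, t, 0)
    assert all(dot(row, x) == 0 for row in R)
```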

e) Consider the system of equations
\begin{cases}
u_1x_1+u_2x_2+u_3x_3=0 \\
v_1x_1+v_2x_2+v_3x_3=0 \\
w_1x_1+w_2x_2+w_3x_3=0 \\
\end{cases}

Assume only that the vectors $\vec{u}$,$\vec{v}$,$\vec{w}$ are non-zero and $\det(A)=0$ but that you don't know how the matrix row reduces. Each equation defines a plane in $\mathbb{R}^3$. Describe the possible ways that the $3$ planes can intersect and explain why.

This is the only part I am not sure about. I know that when the determinant is $0$, the matrix isn't invertible, so there is no unique solution to the system (I think?). But beyond that, I'm not sure what other kinds of intersection can happen.

Best Answer

You're right, the system cannot have a unique solution. It is homogeneous (all RHSs zero) so it is consistent (the planes all pass through the origin so the origin is in the intersection), so there are infinitely many solutions. But what is the geometry of the solution space?

Think about the rank of $A$. Since $\det A = 0$, the rank cannot be 3.

  • If the rank is 2, then you more or less get the scenario of part (d): a line of intersection. It doesn't matter whether the leading 1s are in columns 1 and 3 as in the matrix in part (d), or in columns 1 and 2, or even in columns 2 and 3.
  • What happens if the rank is 1? Suppose for example that $A$ reduces to $\begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$: What is the solution space in that case?
  • Can the rank be zero? Note that the given vectors are nonzero.
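For the rank-1 bullet above, a small pure-Python sketch (using the answer's example matrix): the single surviving equation $x_1+2x_2+3x_3=0$ is satisfied by a two-parameter family of vectors, i.e. the solution set is an entire plane through the origin, spanned here by $(-2,1,0)$ and $(-3,0,1)$.

```python
# Rank-1 case: the system collapses to one equation x1 + 2*x2 + 3*x3 = 0,
# and every combination s*b1 + t*b2 of two independent solutions solves it.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

row = (1, 2, 3)                     # the single nonzero row
b1, b2 = (-2, 1, 0), (-3, 0, 1)     # two independent solutions
for s in (-1, 0, 2):
    for t in (0, 1, 4):
        x = tuple(s * a + t * b for a, b in zip(b1, b2))
        assert dot(row, x) == 0
```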