[Math] There exists $x^*\neq 0$ such that $Ax^* = 0$ $\iff \det A = 0$ or $\det A \neq 0$

linear algebra, systems of equations

Let $A\in\mathbb{R}^{n\times n}$ and $x\in\mathbb{R}^n$. Which of the
following statements is true? Prove or give a counterexample.

a) There exists $x^*$ such that $Ax^* = 0$ if $\det A = 0$

b) There exists $x^*\neq 0$ such that $Ax^* = 0$ $\iff \det A = 0$

c) There exists $x^*\neq 0$ such that $Ax^* = 0\iff \det A \neq 0$

a)

I know that if $\det A = 0$ we cannot simply multiply by $A^{-1}$ and say $x=0$, so I guess there should exist an $x$ that solves this equation for some matrices $A$. One example is $x=0$. Can I guarantee that there's always a nonzero one? I guess this is the question below, so there's no need to answer it here.
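As a quick numerical sanity check (using NumPy and a small singular matrix chosen only for illustration, not part of the problem): $x=0$ always solves $Ax=0$, and for this particular singular $A$ there is also a nonzero solution.

```python
import numpy as np

# A concrete singular matrix (the second column is twice the first).
A = np.array([[1.0, 2.0],
              [3.0, 6.0]])

print(np.linalg.det(A))         # ~0.0, so A is singular

# The zero vector solves Ax = 0 for any A.
print(A @ np.zeros(2))          # [0. 0.]

# For this particular A there is also a nonzero solution:
# 2*(column 1) - 1*(column 2) = 0.
x_star = np.array([2.0, -1.0])
print(A @ x_star)               # [0. 0.]
```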

b) This is the question I raised above.

c) I know that if $\det A\neq 0$ then we can multiply by the inverse to get $x^*=0$ as a solution. However, I don't know if this is the unique solution. On the other hand, if there is $x\neq 0$ such that $Ax=0$, then all I know is that this is a system with a solution.
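A small numerical sketch of the uniqueness point (again with an invertible matrix I made up for illustration): when $\det A \neq 0$, solving $Ax = 0$ returns only the zero vector.

```python
import numpy as np

# An invertible matrix chosen for illustration (det = -2, nonzero).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

print(np.linalg.det(A))              # ~ -2.0, so A^{-1} exists

# Because A is invertible, Ax = 0 has exactly one solution: x = A^{-1} 0 = 0.
x = np.linalg.solve(A, np.zeros(2))
print(x)                             # [0. 0.]
```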

How are these things proved in linear algebra? I know that there are lots of ways to deal with determinants and systems of equations, but I've never seen a proof of these statements.

UPDATE:

b)

As pointed out in an answer below, $\det A=0\implies$ one of the eigenvalues is $0$; let $v$ be a corresponding eigenvector. Then $Av = \lambda v = 0\cdot v = 0$, so $v$ is our nonzero vector $x^*$ such that $Ax^*=0$.
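A numerical check of this eigenvalue argument, reusing the illustrative singular matrix from part a) (my own example, with NumPy assumed available):

```python
import numpy as np

# The same singular example matrix as in a): det A = 0.
A = np.array([[1.0, 2.0],
              [3.0, 6.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                       # one eigenvalue is (numerically) zero

# The eigenvector v belonging to the zero eigenvalue satisfies Av = 0*v = 0.
v = eigvecs[:, np.argmin(np.abs(eigvals))]
print(v)                             # a nonzero vector
print(A @ v)                         # ~ [0. 0.]
```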

How to prove the converse? One thing I tried: by hypothesis $Ax^*=0$ and $x^*\neq 0$. Suppose $\det A\neq 0$; then we can multiply by the inverse on both sides to get $A^{-1}Ax^* = x^* = 0$, but we assumed $x^*\neq 0$, which is a contradiction. Is this correct?

c)

$\Leftarrow$ is false because b) is true: there exists $x^*\neq 0$ such that $Ax^* = 0 \implies \det A = 0$, and the contrapositive of this says
$\det A \neq 0\implies$ for all $x\neq 0$, $Ax\neq 0$.

How do I prove $\rightarrow$?

Best Answer

(a) True. This is trivial; as you pointed out, you may set $x^* = 0$, i.e. the zero vector.

(c) False. It is (b) with the condition on $\det A$ negated, and (b) is provably true.

(b) True.

There are many ways to prove (b). But in this statement of the form "S1 iff S2", both S1 and S2 are essentially saying the same thing: the columns of $A$ are linearly dependent.

One argument is that $|\det(A)|$ is the volume of the parallelotope formed by the columns of $A$. If $\det(A)=0$, at least one column lies in the proper subspace spanned by the other columns, so the parallelotope's volume degenerates to zero. This implies there exist $x_i$, not all zero, such that $\sum_i x_i A_i = 0 = Ax$, where $A_i$ are the columns of $A$. The proof of the other direction is similar.
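Here is a small concrete instance of this argument, with a matrix I made up purely for illustration: one column lies in the span of the others, the determinant (volume) is zero, and the dependence coefficients give a nonzero $x$ with $Ax=0$.

```python
import numpy as np

# Illustrative 3x3 matrix: the third column equals (column 1) + (column 2),
# so the columns are linearly dependent.
A = np.array([[1.0, 0.0, 1.0],
              [2.0, 1.0, 3.0],
              [0.0, 4.0, 4.0]])

print(np.linalg.det(A))              # ~0: the parallelotope's volume is zero

# The dependence gives coefficients (1, 1, -1), not all zero, with
# 1*A[:,0] + 1*A[:,1] - 1*A[:,2] = 0, i.e. Ax = 0 for a nonzero x.
x = np.array([1.0, 1.0, -1.0])
print(A @ x)                         # [0. 0. 0.]
```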

Another argument is $\det(A)=\prod_i \lambda_i$, where $\lambda_i$ are the eigenvalues of $A$. If $\det(A)=0$, at least one eigenvalue is zero (without loss of generality, let it be $\lambda_1$), with corresponding eigenvector $v_1$. By definition, $v_1 \neq 0$ and $Av_1 = \lambda_1 v_1 = 0$. The proof of the other direction is similar.
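As a numerical illustration of this second argument (same made-up matrix as in the previous sketch): the product of the eigenvalues matches the determinant, and the eigenvector of the zero eigenvalue is a nonzero vector in the kernel.

```python
import numpy as np

# The same 3x3 example as above; it is singular, so one eigenvalue should be ~0.
A = np.array([[1.0, 0.0, 1.0],
              [2.0, 1.0, 3.0],
              [0.0, 4.0, 4.0]])

eigvals, eigvecs = np.linalg.eig(A)

# det(A) equals the product of the eigenvalues; here both are ~0.
print(np.prod(eigvals), np.linalg.det(A))

# The eigenvector v1 for the (numerically) zero eigenvalue is nonzero and A v1 = 0.
v1 = eigvecs[:, np.argmin(np.abs(eigvals))]
print(v1)                            # a nonzero vector
print(A @ v1)                        # ~ [0. 0. 0.]
```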
