We are of course assuming $A$ is an $n\times n$ matrix.
I know there's a proof of it going the other way (invertibility implies a unique solution), but I'm trying to work out a proof going this way.
I'm looking for a proof that only uses the most basic concepts of linear algebra.
I know that:
1) Every matrix can be reduced to reduced row-echelon form (rref).
2) Elementary row operations do not change the solution set of $Ax = b$.
3) If $[A|b]$ row-reduces to $[I|y]$, then the equivalent system $Ix = y$ has the unique solution $x = y$.
I'm not sure how to prove that $[A|b]$ can be reduced to $[I|y]$, however, and that's where the problem lies.
Any help with this would be greatly appreciated.
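For concreteness, here is a small sketch of points 1)–3) (the matrix and right-hand side are my own illustrative example, not part of the question), using exact `Fraction` arithmetic so the elementary row operations are visible:

```python
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (given as a list of rows) to reduced row-echelon
    form using exact Fraction arithmetic. A sketch for small examples."""
    m = [[Fraction(v) for v in row] for row in rows]
    nrows, ncols = len(m), len(m[0])
    pivot_row = 0
    for col in range(ncols):
        # Find a row at or below pivot_row with a non-zero entry in this column.
        pr = next((r for r in range(pivot_row, nrows) if m[r][col] != 0), None)
        if pr is None:
            continue  # no pivot in this column
        m[pivot_row], m[pr] = m[pr], m[pivot_row]       # row swap
        piv = m[pivot_row][col]
        m[pivot_row] = [v / piv for v in m[pivot_row]]  # scale pivot row to 1
        for r in range(nrows):                          # eliminate the column elsewhere
            if r != pivot_row and m[r][col] != 0:
                factor = m[r][col]
                m[r] = [a - factor * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
        if pivot_row == nrows:
            break
    return m

# Augmented matrix [A|b] for the system 2x + y = 5, x + 3y = 10:
aug = [[2, 1, 5], [1, 3, 10]]
print(rref(aug))  # left block is I, last column is the unique solution y = (1, 3)
```

Since only elementary row operations are used, the solution set is preserved at every step, which is exactly fact 2) above.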
Best Answer
Suppose $A$ were not invertible. Then its null space would contain a non-zero vector (if $A$ is not invertible, its rref has a non-pivot column, so $Az = 0$ has a free variable and hence a non-zero solution). Call this vector $z$. Let $x$ be the unique solution of $Ax = b$, and set $y = x - z$, so that $z = x - y$. Since $z$ is non-zero, $x \neq y$.
Then, $$Az = 0 \implies A(x-y) = 0 \implies Ax-Ay = 0 \implies Ax = Ay.$$
Since $Ax = b$, it follows that $Ay = b$ as well. But $x$ was the unique solution to $Ax = b$, so necessarily $y = x$, contradicting $x \neq y$. Hence $A$ must be invertible.
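The contradiction can be checked numerically. A minimal sketch with a concrete singular matrix (my own numbers, chosen for illustration): any solution $x$ of $Ax = b$ yields a second, distinct solution $y = x - z$ whenever $z$ is a non-zero null vector.

```python
def matvec(A, v):
    """Multiply a matrix (list of rows) by a vector (list of numbers)."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[1, 2], [2, 4]]   # singular: the second row is twice the first
z = [2, -1]            # non-zero null vector: A z = 0
x = [1, 1]             # one solution of Ax = b, where b = (3, 6)
y = [xi - zi for xi, zi in zip(x, z)]  # y = x - z = (-1, 2)

print(matvec(A, z))  # [0, 0]
print(matvec(A, x))  # [3, 6]
print(matvec(A, y))  # [3, 6] -- same b, but y != x: the solution is not unique
```

This is exactly the chain $Az = 0 \implies Ax = Ay$ from the answer, run on actual numbers.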