[Math] Showing that if an equation has a unique solution for one variable, then it has unique solutions for all.

matrices

I have a problem and a proposed solution. Please tell me if I'm correct.

Problem: Let $A$ be a square matrix. Show that if the system $AX=B$ has a unique solution for some particular column vector $B$, then it has a unique solution for all $B$.

Solution: If $AX=B$ has a unique solution for some column vector $B$, then $A$ in reduced row echelon form has a pivot in each column, so $A$ can be reduced to $I_n$, where $A$ is $n \times n$. Since the number of equations equals the number of unknowns, we will have an $n \times 1$ column vector of $x_i$'s equal to an $n \times 1$ column vector of $b_i$'s. Hence, varying $B$ is equivalent to varying $X$ and will create a new solution for every change made to $B$.

Thanks!

Best Answer

> Hence, varying $B$ is equivalent to varying $X$ and will create a new solution for every change made to $B$.

This statement is not precise. There are several ways to fix it, depending on how much you know. Do you know what nonsingular matrices are? Do you know that they are invertible, and that if $A$ can be reduced to $I_n$ then it is nonsingular? If you know that $A$ is invertible, then from $AX = B$ you can write $X = A^{-1}B$, so there is exactly one choice for $X$ no matter what $B$ is.
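To spell the invertibility argument out completely (this is just the standard existence-and-uniqueness check): if $X_1$ and $X_2$ both solve the system, then

$$X_1 = A^{-1}(AX_1) = A^{-1}B = A^{-1}(AX_2) = X_2,$$

and existence holds as well, since $X = A^{-1}B$ satisfies $A(A^{-1}B) = (AA^{-1})B = B$. So for every $B$ there is one and only one solution.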

Another way of seeing that the solution is unique (one that doesn't use nonsingularity explicitly) is the following. Since $A$ reduces to $I_n$, when you reduce the augmented matrix $[A \ | \ B]$, do any of the choices you make depend on $B$? Try arguing that no matter what that last column is, reducing the augmented matrix will always yield something with a pivot in each of the first $n$ columns. Thus there will be no free variables in your solution. I suspect that this is what you had in mind with what you wrote, but you should explain it a little further.
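Here is a small sketch of that augmented-matrix argument using SymPy. The $2 \times 2$ matrix below is just an arbitrary invertible example of my own choosing (not from the question); the point is that the right-hand side is left fully symbolic, and the pivots still land in the first $n$ columns:

```python
from sympy import Matrix, symbols

# Toy invertible matrix (my own example); it row-reduces to I_2.
A = Matrix([[1, 2], [3, 4]])

# Augment A with a fully symbolic right-hand side B = (b1, b2).
b1, b2 = symbols("b1 b2")
augmented = A.row_join(Matrix([b1, b2]))

# Row-reduce: the pivot positions cannot depend on b1 or b2,
# since all the entries that determine them come from A.
rref, pivots = augmented.rref()

print(pivots)       # (0, 1): a pivot in each of the first n columns
print(rref.col(2))  # the unique solution, expressed in terms of b1, b2
```

Because there is a pivot in every column of $A$, the last column of the reduced matrix gives the one and only solution, whatever $b_1, b_2$ are.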