[Math] True or false statements about a square matrix

determinant, inverse, linear algebra, proof-verification, systems of equations

Consider the following four statements about an $n \times n$ matrix $A$.

$(i)$ If $\det(A) \neq 0$, then $A$ is a product of elementary matrices.

$(ii)$ The equation $Ax=b$ can be solved using Cramer's rule for any $n \times 1$ matrix $b$.

$(iii)$ If $\det(A)=0$, then either $A$ has a zero row or column, or two equal rows or columns.

$(iv)$ If $B$ is an $l\times n$ matrix, then the rows of $BA$ are linear combinations of the columns of $A$.

How many of the preceding statements are always true?

$(a)$ None.

$(b)$ One.

$(c)$ Two.

$(d)$ Three.

$(e)$ All of them.

My reasoning:

I am pretty sure that $(i)$ is true. Since the determinant of $A$ is not zero, that means that the matrix is invertible and hence, it can be expressed as a product of elementary matrices.
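For instance (my own quick sanity check, if I have the row operations right), one possible factorization is

$\begin{bmatrix}1&2\\0&3\end{bmatrix}=\begin{bmatrix}1&0\\0&3\end{bmatrix}\begin{bmatrix}1&2\\0&1\end{bmatrix},$

where the first factor scales the second row by $3$ and the second adds $2$ times the second row to the first, so both are elementary.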

I don't think $(ii)$ is true. There might be a matrix $b$ such that when I attempt to use Cramer's rule, the denominator is $0$, which is a problem.

I don't think $(iii)$ is true either. A zero row, or two equal rows, would imply that the determinant is zero, but I don't see why the columns would have to behave the same way. Since I don't think the columns are an issue, $(iii)$ is false as well.

I'm not $100\%$ sure about statement $(iv)$, but I don't think it is true either, since I can always express the product $Ax$ in $Ax=b$ as a linear combination with coefficients $x_1,x_2,x_3,\dots$
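To make that last point precise, what I have in mind is the column picture of matrix-vector multiplication (the notation $a_i$ for the $i^{th}$ column of $A$ is mine):

$Ax=x_1a_1+x_2a_2+\dots+x_na_n,$

so multiplying $A$ by a column vector mixes the columns of $A$.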

Thus, only one statement, $(i)$, is true.

Can someone confirm my reasoning for each statement?

Best Answer

For $(ii)$, did you mean to say "...might be a matrix $A$ such that when I attempt to use Cramer's rule..."? The denominator in Cramer's rule is $\det(A)$, which has nothing to do with $b$. If $\det(A)=0$, we can't solve $Ax=b$ using Cramer's rule.
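To spell this out, with $A_i(b)$ denoting $A$ with its $i^{th}$ column replaced by $b$ (this notation is just for this answer), Cramer's rule reads

$x_i=\frac{\det\big(A_i(b)\big)}{\det(A)},$

which only makes sense when $\det(A)\neq 0$. So, for instance, for $A=\begin{bmatrix}1&1\\1&1\end{bmatrix}$ the rule cannot be applied for any $b$, even though $(ii)$ claims it always works.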

$(iii)$ is incorrect, because $\det(A)=0$ only tells us that the rows/columns of $A$ are linearly dependent, or equivalently that there is at least one zero row/column in an echelon form of $A$, not in $A$ itself. A zero row/column or two identical rows/columns would certainly force the determinant to be $0$, but this is not necessary for it. For example, $\det\Big(\begin{bmatrix}1&4\\9&36\end{bmatrix}\Big)=0$.
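Spelling out the arithmetic in that example: $\det\Big(\begin{bmatrix}1&4\\9&36\end{bmatrix}\Big)=1\cdot36-4\cdot9=0$ because the second row is $9$ times the first, yet no row or column is zero and no two rows or columns are equal.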

$(iv)$ is incorrect too. The rows of $BA$ are linear combinations of the rows of $A$, not of its columns. It is easy to see why this is true using block matrix notation:

$BA=\begin{bmatrix}b_{11}&b_{12}&\dots&b_{1n}\\b_{21}&b_{22}&\dots&b_{2n}\\\vdots&\vdots&&\vdots\\b_{l1}&b_{l2}&\dots&b_{ln}\end{bmatrix}\cdot\begin{bmatrix}A_1\\A_2\\\vdots\\A_n\end{bmatrix}=\begin{bmatrix}b_{11}A_1+b_{12}A_2+\dots+b_{1n}A_n\\b_{21}A_1+b_{22}A_2+\dots+b_{2n}A_n\\\vdots\\b_{l1}A_1+b_{l2}A_2+\dots+b_{ln}A_n\end{bmatrix}$

where $A_i$ is the $i^{th}$ row of $A$.
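For a concrete instance (with numbers chosen purely for illustration), take $l=1$ and $n=2$:

$\begin{bmatrix}2&3\end{bmatrix}\begin{bmatrix}1&2\\3&4\end{bmatrix}=\begin{bmatrix}11&16\end{bmatrix}=2\begin{bmatrix}1&2\end{bmatrix}+3\begin{bmatrix}3&4\end{bmatrix},$

a linear combination of the rows of $A$, not of its columns.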
