Why is the following statement about a matrix false?

determinant, linear algebra, matrices

I'm studying from someone else's notes.

The following statement is given as false:
Suppose $A =$

$$
\begin{bmatrix}
2 & 6\\
1 & 3\\
\end{bmatrix}
$$

Then $\det(A)=0$ and the rows and columns are all distinct and not full of zeros.

I don't understand why this is false. The determinant is in fact $0$, and the rows and columns are not all zeros. I'm confused.

Best Answer

I don't know the context, but here's my best guess at what's going on:

My interpretation of the statement is that it makes multiple claims:

  1. $\det(A)=0$
  2. (a) The rows and columns of $A$ are all distinct, and (b) $A$ is not all zeros

(2) is confusing because (2b) follows from (2a): if the rows and columns of $A$ are all distinct, then $A$ cannot be all zeros (a matrix of all zeros does not have distinct rows and columns). So we can drop (2b) and look only at (2a), which gives:

  1. $\det(A)=0$
  2. The rows and columns of $A$ are all distinct

As you pointed out, $\det(A)$ is zero. The statement is then false because the rows and columns are not distinct. "Distinct" is probably meant to mean "linearly independent". The first row is equal to the second times two, and the second column is equal to the first times three, so the rows and columns are not linearly independent; they are redundant, and thus not distinct.
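
To make this concrete, here is the check written out (my own arithmetic, using the usual $ad - bc$ formula for a $2\times2$ determinant):

$$
\det(A) = 2 \cdot 3 - 6 \cdot 1 = 0,
\qquad
\begin{bmatrix} 2 & 6 \end{bmatrix} = 2 \begin{bmatrix} 1 & 3 \end{bmatrix},
\qquad
\begin{bmatrix} 6 \\ 3 \end{bmatrix} = 3 \begin{bmatrix} 2 \\ 1 \end{bmatrix}.
$$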

As TokenToucan and Elliot G have pointed out, the statement is probably meant to show that a matrix with all nonzero entries can still have $\det(A)=0$ because it has two or more linearly dependent rows or columns.
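
This is easy to verify in general for the $2\times2$ case (a one-line check of my own, not from the notes): if the second row is $t$ times the first, then

$$
\det\begin{bmatrix} a & b \\ ta & tb \end{bmatrix} = a(tb) - b(ta) = 0,
$$

no matter what nonzero values $a$, $b$, and $t$ take. The matrix $A$ above is exactly this with $a = 2$, $b = 6$, $t = \tfrac{1}{2}$.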

Note that $2\times2$ matrices have two linearly dependent rows if and only if they have two linearly dependent columns. $3\times3$ and bigger matrices can have two linearly dependent rows (and $\det=0$) without having any two linearly dependent columns; the reverse is also true (just take the transpose). But in general, for square matrices, if the full set of columns is linearly dependent, then the full set of rows is too, since both conditions are equivalent to $\det(A)=0$ and $\det(A)=\det(A^T)$ (see this question).
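
For instance (an example of my own, not from the notes), take

$$
B = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \\ 1 & 1 & 1 \end{bmatrix}.
$$

The second row is twice the first, so $\det(B) = 0$, but no column of $B$ is a multiple of another. Taking $B^T$ gives the reverse situation: two linearly dependent columns with no two rows linearly dependent.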