[Math] Prove that the determinant of a matrix is zero

Tags: determinant, linear-algebra, matrices, proof-writing

Hi, I need some help with this question:

Let $A$ be an $n \times n$ matrix, let $i, j, k$ be pairwise distinct indices with $1 \leq i, j, k \leq n$, and let $\lambda, \mu \in \mathbb{R}$ be arbitrary real numbers. Suppose that $a_k$, the $k$-th row vector of $A$, is equal to $\lambda a_i + \mu a_j$, where $a_i, a_j \in \mathbb{R}^n$ denote the $i$-th and $j$-th row vectors of $A$ respectively. Prove that $\det(A) = 0$.

I think I need to split the matrix into two separate ones, then use the fact that each of these matrices has either a row of zeros or a row that is a multiple of another row, and then use $\det(AB)=\det(A)\det(B)$ to show that one of these matrices has determinant zero, so the whole thing has determinant zero. So I was wondering: is there a way to split the matrix up that suits my method?
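For reference, the split described above can be made precise with multilinearity of the determinant in a single row, rather than with $\det(AB)=\det(A)\det(B)$. A sketch (writing only row $k$ explicitly, with all other rows held fixed):

```latex
\det(A)
= \det\begin{pmatrix} \vdots \\ \lambda a_i + \mu a_j \\ \vdots \end{pmatrix}
= \lambda \det\begin{pmatrix} \vdots \\ a_i \\ \vdots \end{pmatrix}
+ \mu \det\begin{pmatrix} \vdots \\ a_j \\ \vdots \end{pmatrix}
```

In the first summand, row $k$ equals row $i$; in the second, row $k$ equals row $j$. A matrix with two equal rows has determinant zero, so both terms vanish and $\det(A) = 0$.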

Best Answer

Hint: The determinant of a matrix is invariant under the elementary row operation of adding a multiple of one row to another. By performing such row operations, can you create a row with all components equal to zero? What would a row of zeros mean for the determinant of the matrix?
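The row-operation route hinted at above can be sketched as follows (a sketch, using the fact that the operation $R_k \mapsto R_k - \lambda R_i - \mu R_j$ leaves the determinant unchanged):

```latex
R_k \;\mapsto\; R_k - \lambda R_i - \mu R_j
= (\lambda a_i + \mu a_j) - \lambda a_i - \mu a_j
= 0
```

The resulting matrix $A'$ has $\det(A') = \det(A)$ and a zero row, and expanding $\det(A')$ along that row gives $\det(A') = 0$, hence $\det(A) = 0$.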
