[Math] Why a matrix whose columns are not linearly independent is not invertible

determinant, eigenvalues-eigenvectors, matrices

Why, if the columns of a matrix are not linearly independent, is the matrix not invertible?

I have watched this video about eigenvalues and eigenvectors by Sal from Khan Academy, where he says that for $\lambda$ to be an eigenvalue for the matrix $A$, the following must be true

$$A \cdot \vec{v} = \lambda \cdot \vec{v} \\
\vec{0} = \lambda \cdot \vec{v} - A \cdot \vec{v} \\
\vec{0} = (\lambda - A) \cdot \vec{v} \\
\vec{0} = (\lambda \cdot I - A) \cdot \vec{v}$$
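(The step from the third to the fourth line rewrites $\lambda \cdot \vec{v}$ as $\lambda \cdot I \cdot \vec{v}$, since a scalar cannot be subtracted from a matrix; only then can $\vec{v}$ be factored out:)

$$\lambda \cdot \vec{v} - A \cdot \vec{v} = \lambda \cdot I \cdot \vec{v} - A \cdot \vec{v} = (\lambda \cdot I - A) \cdot \vec{v}$$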

He adds that the determinant of $(\lambda \cdot I - A)$ must be $0$, or in other words $(\lambda \cdot I - A)$ is not invertible, or in other words the columns of $(\lambda \cdot I - A)$ are linearly dependent, or the nullspace of $(\lambda \cdot I - A)$ is nontrivial (recall that an eigenvector $\vec{v}$ must be nonzero, which is why the nullspace has to contain more than just $\vec{0}$).
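For concreteness, here is a small example (my own, not from the video) where all of these statements hold at once:

$$A=\begin{bmatrix}2&1\\1&2\end{bmatrix},\qquad \lambda = 1,\qquad \lambda \cdot I - A=\begin{bmatrix}-1&-1\\-1&-1\end{bmatrix}$$

The columns of $\lambda \cdot I - A$ are equal, hence linearly dependent; its determinant is $(-1)(-1)-(-1)(-1)=0$; and $\vec{v}=\begin{bmatrix}1\\-1\end{bmatrix}$ satisfies $(\lambda \cdot I - A)\cdot\vec{v}=\vec{0}$, so the nullspace is nontrivial and $\lambda=1$ is an eigenvalue of $A$ with eigenvector $\vec{v}$.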

Could someone explain these statements to me in more detail? What is the relation between each statement and the others?

I understand some of it, but further clarification would help too.

Best Answer

Here's an answer that completely avoids determinants. Determinants are heinously overrated.

Let $\vec v_1,\vec v_2,\dotsc,\vec v_n$ be the columns of a matrix $A$. That is,
$$
A=
\begin{bmatrix}
\vec v_1 & \vec v_2 & \dotsb & \vec v_n
\end{bmatrix}
$$
Now, suppose the columns of $A$ are not linearly independent. Then there exist scalars $\lambda_1,\lambda_2,\dotsc,\lambda_n$, not all zero, such that
$$
\lambda_1\vec v_1+\lambda_2\vec v_2+\dotsb+\lambda_n\vec v_n=\vec 0\tag{1}
$$
But (1) may be rewritten in matrix form as
$$
\begin{bmatrix}
\vec v_1 & \vec v_2 & \dotsb & \vec v_n
\end{bmatrix}
\begin{bmatrix}
\lambda_1\\ \lambda_2 \\ \vdots\\ \lambda_n
\end{bmatrix}
=\vec 0
$$
Putting
$$
\vec \lambda=
\begin{bmatrix}
\lambda_1\\ \lambda_2 \\ \vdots\\ \lambda_n
\end{bmatrix}
$$
then gives $A\vec\lambda=\vec 0$ where $\vec \lambda\neq\vec 0$. Hence $A$ has a nontrivial nullspace and is thus not invertible: if an inverse $A^{-1}$ existed, multiplying $A\vec\lambda=\vec 0$ on the left by $A^{-1}$ would force $\vec\lambda=\vec 0$, contradicting $\vec\lambda\neq\vec 0$.
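A quick numerical check of the same argument (a minimal NumPy sketch; the $3\times 3$ matrix is an illustrative example I made up, not anything from the answer above):

```python
import numpy as np

# A 3x3 matrix whose third column is the sum of the first two,
# so the columns are linearly dependent.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [2.0, 0.0, 2.0]])

# The dependency v1 + v2 - v3 = 0 gives this nonzero null vector.
lam = np.array([1.0, 1.0, -1.0])

print(A @ lam)           # ~ [0. 0. 0.]  -> nontrivial nullspace
print(np.linalg.det(A))  # ~ 0.0         -> determinant vanishes

try:
    print(np.linalg.inv(A))
except np.linalg.LinAlgError as err:
    print("not invertible:", err)  # inversion fails: the matrix is singular
```

The dependency among the columns directly produces the nonzero null vector, which is exactly the $\vec\lambda$ in the argument above.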