[Math] Intuition: why distinct eigenvalues $\implies$ linearly independent eigenvectors

eigenvalues-eigenvectors, linear-algebra

Suppose you have an $n\times n$ matrix with $n$ distinct (not repeated) eigenvalues.

There is a theorem telling us that the eigenvectors corresponding to these eigenvalues must be linearly independent.

I can basically follow the proof, but I am looking for an intuitive explanation of why this is the case.

Can anyone offer some insight?

Best Answer

I'm really sorry for the hand-made graphs, but I believe they can help.

Think about the opposite question: can linearly dependent eigenvectors have distinct eigenvalues?

In $\mathbb{R}^2$, the only way for two vectors to be dependent is for them to be collinear.

[Hand-drawn figure: two collinear vectors lying on the same line through the origin]

Since both have the same span, it is easy to see why they cannot have distinct eigenvalues: the transformation stretches their common line by a single factor.
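
In symbols (the names $v_1, v_2, c$ are mine, not from the original figures): if $v_2 = c\,v_1$ for some scalar $c \neq 0$, with $Tv_1 = \lambda_1 v_1$ and $Tv_2 = \lambda_2 v_2$, then

$$\lambda_2 v_2 = T v_2 = T(c\,v_1) = c\,Tv_1 = c\,\lambda_1 v_1 = \lambda_1 v_2,$$

and since $v_2 \neq 0$, this forces $\lambda_2 = \lambda_1$.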

Now in $\mathbb{R}^3$, the only way for $3$ vectors to be dependent is to be coplanar.

[Hand-drawn figure: three coplanar vectors lying in a common plane through the origin]

Here it is a little trickier to visualize: once a linear transformation stretches two independent vectors in the plane by distinct factors, the image of every other vector in that plane is already determined, so a third coplanar vector cannot be stretched by yet another factor without distorting the plane.

For example, assume the three coplanar vectors $p_1, p_2, p_3$ are eigenvectors of a transformation $T$, with $p_2$ stretched by $\lambda_2 = 2$, $p_3$ stretched by $\lambda_3 = 3$, and $p_1$ stretched by some third factor $\lambda_1$.

This cannot happen: the image of the plane would be bent rather than flat, so the transformation would not be linear:

[Hand-drawn figure: the plane's image bends out of flatness when the three coplanar vectors are stretched by different factors]
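
The bent plane, in symbols (the coefficients $a, b$ are my own): since $p_1, p_2, p_3$ are coplanar, write $p_3 = a\,p_1 + b\,p_2$ with $a, b \neq 0$ (if one of them is zero, we are back in the collinear case). Applying $T$ to both sides gives

$$\lambda_3\,(a\,p_1 + b\,p_2) = T p_3 = a\,\lambda_1 p_1 + b\,\lambda_2 p_2,$$

and comparing coefficients on the independent vectors $p_1, p_2$ forces $\lambda_3 = \lambda_1$ and $\lambda_3 = \lambda_2$, so the eigenvalues cannot be distinct.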

In both cases, $\mathbb{R}^2$ and $\mathbb{R}^3$, the only way to keep linearity with dependent eigenvectors is to have non-distinct (repeated) eigenvalues.

Hence, since the transformation is linear and the eigenvalues are distinct, the eigenvectors cannot be dependent.

From here you can generalize: in $\mathbb{R}^n$, if there exist $n$ distinct eigenvalues, then there must be $n$ linearly independent eigenvectors.
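
This is where the picture meets the standard proof the question mentions: take a shortest possible dependence $c_1 v_1 + \cdots + c_k v_k = 0$ among eigenvectors with distinct eigenvalues (so every $c_i \neq 0$) and apply $T - \lambda_k I$ to it. The $v_k$ term vanishes, leaving

$$\sum_{i=1}^{k-1} c_i\,(\lambda_i - \lambda_k)\, v_i = 0,$$

a strictly shorter dependence whose coefficients are still nonzero (because $\lambda_i \neq \lambda_k$), which is a contradiction.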
