[Math] Showing every eigenvector is a multiple of a known eigenvector

linear algebra

I have the following question:

Let $A$ be an $n \times n$ matrix. Suppose $A$ has distinct eigenvalues $\lambda_1,…,\lambda_n$ and let $v_1, … , v_n$ be eigenvectors with these eigenvalues. Show that every eigenvector is a multiple of one of the vectors $v_i$. Determine the matrix from the eigenvalues and eigenvectors.

Firstly, I understand that the question is true, but I'm having trouble formulating a decent proof.

Suppose we have an eigenvector $v_k$ such that $v_k \neq v_i$ for all $i \in \left\{1,…,n\right\}$.

Then, as $v_k$ is an eigenvector, it satisfies

$Av_k = \mu v_k$

where $\mu$ is the eigenvalue associated with the eigenvector $v_k$. However, an $n \times n$ matrix has at most $n$ distinct eigenvalues, and $A$ already has the $n$ distinct eigenvalues $\lambda_1,…,\lambda_n$, so $\mu = \lambda_m$ for some $m \in \left\{1,…,n\right\}$.

Hence

$Av_k = \lambda_m v_k$

We also know that:

$Av_m = \lambda_m v_m$

So:

$A(v_k + v_m) = \lambda_m (v_k + v_m)$

It's obvious to me that this can only be true if $v_k$ is a multiple of $v_m$, say $v_k = \alpha v_m$, as this will allow the $\alpha$ to cancel on both sides, but I can't seem to show this rigorously (I get the impression it's a very simple step I'm failing to realise here).

Best Answer

Your reasoning up to $Av_k = \lambda_m v_k$ is correct. To continue, you need to show that $v_k$ is a multiple of $v_m$. Here is one way of doing it. Eigenvectors for distinct eigenvalues are linearly independent, so the $n$ vectors $v_1,…,v_n$ form a basis for $\mathbb{C}^n$, and you can write $v_k = \sum_i \alpha_i v_i$. Applying $A$ gives $Av_k = \sum_i \alpha_i \lambda_i v_i$, while $Av_k = \lambda_m v_k = \sum_i \alpha_i \lambda_m v_i$. Here is where linear independence comes in: subtracting gives $\sum_i \alpha_i (\lambda_i-\lambda_m) v_i = 0$, from which we have $\alpha_i (\lambda_i-\lambda_m) = 0$ for all $i$. Since the eigenvalues are distinct, $\lambda_i \neq \lambda_m$ for $i \neq m$, hence $\alpha_i = 0$ for $i \neq m$ and $v_k = \alpha_m v_m$.
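To see the mechanics on a concrete case (a small $2 \times 2$ example of my own, not from the question): take

$A = \begin{bmatrix} 1 & 0 \\ 0 & 2 \end{bmatrix}, \quad v_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \quad v_2 = \begin{bmatrix} 0 \\ 1 \end{bmatrix}, \quad \lambda_1 = 1, \quad \lambda_2 = 2.$

If $v = \alpha_1 v_1 + \alpha_2 v_2$ satisfies $Av = 2v$, then $Av = \alpha_1 v_1 + 2\alpha_2 v_2$ while $2v = 2\alpha_1 v_1 + 2\alpha_2 v_2$; subtracting gives $\alpha_1 (1-2) v_1 = 0$, so $\alpha_1 = 0$ and $v = \alpha_2 v_2$.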

Here is another way to do it:

Since the eigenvalues are distinct, the eigenvectors are linearly independent, and since there are $n$ of them, they form a basis for $\mathbb{C}^n$. Hence the matrix $V = \begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix}$ is invertible. The $i$th column of $AV$ is $Av_i = \lambda_i v_i$, so $A V = V \Lambda$, where $\Lambda$ is the diagonal matrix with diagonal entries $\lambda_1,…,\lambda_n$. So we have $V^{-1} A V = \Lambda$.
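As a quick numerical sanity check (my own illustration with NumPy; the matrix below is hypothetical, chosen to have distinct eigenvalues $1, 2, 3$):

    import numpy as np

    # A hypothetical 3x3 matrix with distinct eigenvalues 1, 2, 3.
    A = np.array([[2.0, 1.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [1.0, 1.0, 3.0]])

    lam, V = np.linalg.eig(A)   # columns of V are eigenvectors v_1, ..., v_n
    Lam = np.diag(lam)          # Lambda = diag(lambda_1, ..., lambda_n)

    print(np.allclose(A @ V, V @ Lam))                 # True: A V = V Lambda
    print(np.allclose(np.linalg.inv(V) @ A @ V, Lam))  # True: V^{-1} A V = Lambda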

Now suppose $A v = \lambda v$ for some $v \neq 0$. We can write $A V V^{-1} v = \lambda V V^{-1} v$, and premultiplying by $V^{-1}$ gives $\Lambda u = \lambda u$, where $u = V^{-1} v$. Note that $u \neq 0$, since $v \neq 0$ and $V$ is invertible. If we rewrite the equation as $(\Lambda -\lambda I) u = 0$, we see that we must have $\lambda = \lambda_k$ for some $k$ (otherwise the matrix $\Lambda -\lambda I$ would be invertible, which would force $u=0$). Furthermore, since the eigenvalues are distinct, $\lambda_k - \lambda$ is the only diagonal entry of $\Lambda - \lambda I$ that vanishes, so we must have $u = \alpha e_k$, where $e_k$ is the unit vector with zeros everywhere except a one in the $k$th position, and $\alpha \neq 0$. Since $v = V u$, we have $v = \alpha V e_k = \alpha v_k$, a multiple of the $k$th eigenvector.
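The step $u = V^{-1} v = \alpha e_k$ can also be checked numerically (again a sketch of my own, reusing the same hypothetical matrix as above):

    import numpy as np

    A = np.array([[2.0, 1.0, 0.0],      # same hypothetical matrix as above
                  [0.0, 1.0, 0.0],
                  [1.0, 1.0, 3.0]])
    lam, V = np.linalg.eig(A)

    # Take an arbitrary scalar multiple of one eigenvector, here v = 2.5 * v_k.
    k = int(np.argmax(lam))             # index of the eigenvalue 3
    v = 2.5 * V[:, k]

    u = np.linalg.solve(V, v)           # u = V^{-1} v
    print(np.flatnonzero(~np.isclose(u, 0)))  # [k]: only the k-th entry is nonzero
    print(np.isclose(u[k], 2.5))              # True: u = alpha * e_k with alpha = 2.5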

The above also shows that $A = V \Lambda V^{-1}$, which gives $A$ in terms of its eigenvalues and eigenvectors.
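And the reconstruction $A = V \Lambda V^{-1}$ can be verified the same way (same hypothetical example):

    import numpy as np

    A = np.array([[2.0, 1.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [1.0, 1.0, 3.0]])
    lam, V = np.linalg.eig(A)

    # Rebuild A from its eigenvalues and eigenvectors: A = V Lambda V^{-1}.
    A_rebuilt = V @ np.diag(lam) @ np.linalg.inv(V)
    print(np.allclose(A, A_rebuilt))    # True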