Prove or disprove: for $f \in \hom(V)$, if $v, w$ are eigenvectors of $f$ corresponding to different eigenvalues, then $v+w$ is not an eigenvector of $f$


My task is the following:

Let $f : V \to V$ be a linear map and $v, w$ two eigenvectors corresponding to different eigenvalues of $f$.

Prove or disprove that $v+w$ is never an eigenvector of $f$.

So far, the statement has held for every example I checked, so I'm starting to think it can indeed be proven.

My idea

My idea is the following: $v$ and $w$ will obviously be linearly independent, since they belong to different eigenvalues. Consider all the eigenspaces, each corresponding to one of the eigenvalues. From them, we take a maximal linearly independent set of eigenvectors — this guarantees that every eigenvector can be written as a linear combination of vectors in this set.

We now add further vectors to extend this set of eigenvectors to a basis of $V$. We can then represent every eigenvector by coordinates with respect to this basis such that exactly one entry of the coordinate tuple is nonzero and the rest are zero. (This is not entirely true: if an eigenvector comes from a 2-dimensional eigenspace, there might be two nonzero entries, which makes the proof more complicated.)

When we now add vectors from different eigenspaces, we get a vector in which a certain combination of coordinate entries is nonzero, and that combination prevents it from fitting into any of the eigenspaces.

(For example: we take $v$ from the eigenspace where the first coordinate is nonzero, and $w$ from the eigenspace where the second coordinate is nonzero. We get a vector $v+w$ whose first and second coordinates are both nonzero; but since the vector with nonzero first coordinate lies in one eigenspace and the vector with nonzero second coordinate lies in another, there is no eigenspace left to put this vector into, so it is not an eigenvector.)

The question

I've given an informal proof idea above. Is that argument correct? What's a good way to express those thoughts mathematically? Are there easier, more concise ways of proving the statement?

Best Answer

I think this can be done much more simply:

We have

$fv = \mu_v v, \tag 1$

$fw = \mu_w w, \tag 2$

with

$\mu_v \ne \mu_w; \tag 3$

we conclude from (3) that $v$ and $w$ are linearly independent (eigenvectors corresponding to distinct eigenvalues are always linearly independent). Now suppose

$f(v + w) = \mu(v + w) = \mu v + \mu w; \tag 4$

we also have

$f(v + w) = fv + fw = \mu_v v + \mu_w w; \tag 5$

these two equations yield

$\mu v + \mu w = \mu_v v + \mu_w w \Longrightarrow (\mu - \mu_v)v + (\mu - \mu_w)w = 0; \tag 6$

now the linear independence of $v, w$ implies that

$\mu - \mu_v = \mu - \mu_w = 0 \Longrightarrow \mu_v = \mu = \mu_w, \tag 7$

in contradiction to (3); thus $v + w$ cannot be an eigenvector of $f$. $OE\Delta.$
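The contradiction argument above can be checked numerically. Here is a minimal sketch using NumPy, assuming a concrete $2\times 2$ diagonal map with eigenvalues $2$ and $5$ (these particular numbers are illustrative, not taken from the post):

```python
import numpy as np

f = np.diag([2.0, 5.0])   # linear map with eigenvalues mu_v = 2, mu_w = 5
v = np.array([1.0, 0.0])  # eigenvector for eigenvalue 2
w = np.array([0.0, 1.0])  # eigenvector for eigenvalue 5

s = v + w                 # candidate vector v + w = (1, 1)
fs = f @ s                # f(v + w) = mu_v*v + mu_w*w = (2, 5), as in (5)

# If s were an eigenvector, f @ s would be mu * s for a single scalar mu,
# so the componentwise ratios would all be equal. Here they are not:
ratios = fs / s
print(ratios)             # -> [2. 5.]; not constant, so v + w is no eigenvector
```

The non-constant ratio is exactly the failure of equation (4): no single $\mu$ can satisfy $\mu = \mu_v$ and $\mu = \mu_w$ at once.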
