[Math] Does an eigenvalue of a sum of matrices equal the sum of their eigenvalues?

eigenvalues-eigenvectors, linear-algebra, matrices

Let the matrix $A$ be real, square and nonsingular. If $\det(A^2 - 2A + I) = 0$, then $2$ is an eigenvalue of the matrix $A + A^{-1}$.

Solution: The Cayley–Hamilton theorem states that $p(A) = 0$, where $p(\cdot)$ is the characteristic polynomial. Hence, we can replace $A$ with $\lambda$, and after solving the quadratic equation we find that $\lambda = 1$.

Next, I'm aware of the property that $A^{-1}$ has eigenvalue $\lambda^{-1}$, i.e. $1$ is also an eigenvalue of $A^{-1}$.

My question is: what to do now? Simply sum $1+1=2$? I don't see why an eigenvalue of a sum of matrices should be the sum of their eigenvalues. Could you help me figure it out?

Best Answer

Your application of the Cayley–Hamilton theorem is mistaken for several reasons: you don't know the characteristic polynomial here, and the conclusion of the theorem does not give you what you want (namely, information about eigenvalues); moreover, if you did know the characteristic polynomial of$~A$, then you could find the eigenvalues of$~A$ (as its roots) without using the C–H theorem.

However, it happens that you can argue that $\lambda=1$ is an eigenvalue of $A$ here (and that without using that $A$ is invertible), but only due to a rather exceptional circumstance, namely that $X^2-2X+1=(X-1)^2$ has $1$ as its unique root. Here's how: it is given that $A^2-2A+I$ is singular, so it has some nonzero vector $v$ in its kernel. Now the fact that $0=(A^2-2A+I)v=(A-I)(A-I)v$ means that either $(A-I)v=Av-v$ is an eigenvector of$~A$ for $\lambda=1$, namely if it is nonzero, or else (if $Av-v=0$) $v$ itself is an eigenvector of$~A$ for $\lambda=1$. Either way $\lambda=1$ is an eigenvalue of$~A$. (Note that if your polynomial had factored into distinct linear factors, you would only get that one of their roots is an eigenvalue.)

Now with an eigenvector $w$ of $A$ for $\lambda=1$ things are easy: $(A+A^{-1})w=\lambda w+\lambda^{-1}w=2w$ (as you wanted to argue; this part is correct because you are reasoning about a specific eigenvector), so $w$ is also an eigenvector of $A+A^{-1}$, for the eigenvalue$~2$.
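If it helps to see this concretely, here is a quick numerical sanity check (not part of the proof). The matrix $A$ below is a made-up example chosen so that $(A-I)^2=0$, hence $\det(A^2-2A+I)=0$; the vector $v$ is just some nonzero vector in the kernel, and the branch on whether $(A-I)v$ vanishes follows the argument above.

```python
import numpy as np

# Hypothetical example: a nonsingular A with det(A^2 - 2A + I) = 0.
# Here (A - I)^2 = 0, so every nonzero v lies in the kernel of A^2 - 2A + I.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
I = np.eye(2)

print(np.linalg.det(A @ A - 2 * A + I))     # ~ 0, so A^2 - 2A + I is singular

v = np.array([0.0, 1.0])                    # nonzero vector in the kernel
u = (A - I) @ v                             # either an eigenvector for 1, or zero
w = u if np.linalg.norm(u) > 1e-12 else v   # the branch described above

print(A @ w, w)                             # A w = w, so lambda = 1 is an eigenvalue
print((A + np.linalg.inv(A)) @ w, 2 * w)    # (A + A^{-1}) w = 2 w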

Note that you can equally well conclude for any pair of scalars $a,b$ that $aA+bA^{-1}$ has $a+b$ as an eigenvalue (or more generally that any combination of powers of $A$ has as an eigenvalue the corresponding combination of powers of $\lambda=1$), something that a clever manipulation of determinants (as in the answer by M. Vinay) would not give you.
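Again purely as an illustration (the same made-up matrix as in the sketch above, with arbitrary scalars $a,b$ chosen for the example), the eigenvector $w$ found there verifies this more general claim numerically:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # same hypothetical matrix as above
w = np.array([1.0, 0.0])            # eigenvector of A for lambda = 1

a, b = 3.0, -5.0                     # arbitrary illustrative scalars
M = a * A + b * np.linalg.inv(A)
print(M @ w, (a + b) * w)            # both equal (a + b) * w = -2 * w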