Linear Algebra – Prove Eigenvalue Correspondence for Invertible Matrix

determinant, eigenvalues-eigenvectors, linear algebra, matrices

Suppose $A$ is invertible and $\vec{v}$ is an eigenvector of $A$ with eigenvalue $\lambda$. Prove that $\lambda \neq 0$ and that $\vec{v}$ is also an eigenvector of $A^{-1}$. What is the corresponding eigenvalue?

I don't really know where to start with this one. I know that $p(0)=\det(0\cdot I_{n}-A)=\det(-A)=(-1)^{n}\det(A)$, thus if $p(0)=0$ then $\det(A)=0$, so $0$ is an eigenvalue of $A$ and $A$ is not invertible. If $p(0)\neq 0$, then $0$ is not an eigenvalue of $A$ and $A$ is invertible. I'm unsure of how to use this information to prove that $\vec{v}$ is also an eigenvector of $A^{-1}$, or how to find the corresponding eigenvalue.
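As a quick numerical sanity check of the determinant identity above (not part of the proof), here is a sketch for a concrete $2\times 2$ matrix, so $n=2$; the matrix entries are arbitrary choices:

```python
# Check p(0) = det(0*I - A) = det(-A) = (-1)^n * det(A) for n = 2.

def det2(M):
    """Determinant of a 2x2 matrix given as nested lists."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[2.0, 1.0],
     [4.0, 3.0]]                            # det(A) = 2*3 - 1*4 = 2

neg_A = [[-a for a in row] for row in A]    # 0*I - A = -A

# For n = 2, (-1)^n = 1, so det(-A) should equal det(A).
assert det2(neg_A) == (-1) ** 2 * det2(A)
```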

Best Answer

If you have $v\neq 0$ such that $A v = \lambda v$, and $A$ is invertible, then multiplying both sides by $A^{-1}$ gives $v = \lambda A^{-1} v$.

What does that tell you about $\lambda$?

$\lambda$ must be non-zero: if $\lambda = 0$, then $v = \lambda A^{-1} v = 0$, contradicting $v \neq 0$.

How do you use the above to find an eigenvalue corresponding to $v$?

Since $\lambda \neq 0$, we have $A^{-1} v = \frac{1}{\lambda} v$, hence the eigenvalue is $\frac{1}{\lambda}$.
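To make the conclusion concrete, here is a small numerical sketch (the specific matrix and eigenvector are arbitrary choices): $A v = \lambda v$ with $\lambda = 3$, and $A^{-1} v = \frac{1}{\lambda} v$.

```python
# Concrete check: A has eigenvector v = (0, 1) with eigenvalue 3,
# and A^{-1} has the same eigenvector with eigenvalue 1/3.

A = [[2.0, 0.0],
     [1.0, 3.0]]
v = [0.0, 1.0]
lam = 3.0

def matvec(M, x):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [M[0][0] * x[0] + M[0][1] * x[1],
            M[1][0] * x[0] + M[1][1] * x[1]]

def inverse2x2(M):
    """Invert a 2x2 matrix via the adjugate formula."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [[ M[1][1] / det, -M[0][1] / det],
            [-M[1][0] / det,  M[0][0] / det]]

# A v = lam * v  (v is an eigenvector of A with eigenvalue lam)
assert matvec(A, v) == [lam * x for x in v]

# A^{-1} v = (1/lam) * v  (same eigenvector, reciprocal eigenvalue)
A_inv = inverse2x2(A)
assert all(abs(a - x / lam) < 1e-12 for a, x in zip(matvec(A_inv, v), v))
```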