[Math] Showing that $e^\lambda$ is an eigenvalue of $e^A$

linear-algebra, matrices

Let $\lambda$ be an eigenvalue of an $n \times n$ matrix $A$, and let $x$ be an eigenvector belonging to $\lambda$. Show that $e^\lambda$ is an eigenvalue of $e^A$ and that $x$ is an eigenvector of $e^A$ belonging to $e^\lambda$.

A study buddy and I were discussing this and we came up with two different approaches, though we're unsure if our work is correct.

My thoughts:

$Ax = \lambda x$

$e^A = Xe^D X^{-1}$

$A = XDX^{-1}$

where $D$ is a diagonal $n \times n$ matrix (this assumes $A$ is diagonalizable) and $X$ is an invertible $n \times n$ matrix.
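The diagonalization identity above can be checked numerically. A minimal sketch, using a hypothetical diagonalizable matrix chosen only for illustration, comparing $Xe^D X^{-1}$ against the power-series matrix exponential:

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical example: a symmetric (hence diagonalizable) 2x2 matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigendecomposition A = X D X^{-1}: d holds the eigenvalues,
# the columns of X are the corresponding eigenvectors.
d, X = np.linalg.eig(A)

# e^A from the diagonalization: X e^D X^{-1}, where e^D = diag(e^{d_i}).
expA_diag = X @ np.diag(np.exp(d)) @ np.linalg.inv(X)

# e^A computed directly from the power-series definition.
expA_series = expm(A)

print(np.allclose(expA_diag, expA_series))  # True
```

Note that the diagonalization route only works when $A$ is diagonalizable, whereas the series definition of $e^A$ applies to every square matrix.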

My attempt was to do this:

$e^A x = e^\lambda x$

$e^A x - e^\lambda x = 0$

$(e^A - e^\lambda I) x = 0$

$e^A - e^\lambda I = 0$

$\det(e^A - e^\lambda I) = 0$

$\det(e^A) - \det(e^\lambda I) = 0$

$\det(e^A) = e^\lambda$

However, this is probably the wrong approach to this problem. Can anyone please give me a tip / help me out here? $e^D$ will have diagonal entries $e^{\lambda_i}$ corresponding to $A$'s eigenvalues, yes? Should I start from there instead?

Best Answer

Suppose $Av=\lambda v$. Then $A^k v = \lambda^k v$ for every $k \ge 0$, so
$$e^{A}v = \sum_{k=0}^\infty \frac{1}{k!} A^k v = \sum_{k=0}^\infty \frac{\lambda^k}{k!} v = e^\lambda v.$$
(Passing $v$ through the series is justified because the series defining $e^A$ converges and $B \mapsto Bv$ is continuous.) Note that this argument needs no diagonalizability assumption on $A$.
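The answer's conclusion can be sanity-checked numerically. A minimal sketch, using a hypothetical matrix with a known eigenpair ($Av = 3v$ for $v = (1,1)$, chosen only for illustration):

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical example matrix with known eigenpair: A v = 3 v for v = (1, 1).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 1.0])
lam = 3.0
assert np.allclose(A @ v, lam * v)  # confirm (lam, v) is an eigenpair of A

# The claim: e^A v = e^lambda v, i.e. v is an eigenvector of e^A
# with eigenvalue e^lambda.
print(np.allclose(expm(A) @ v, np.exp(lam) * v))  # True
```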