For odd $n$, every $n\times n$ matrix with real entries has at least one real eigenvalue.

linear algebra

Reading a linear algebra textbook, I encountered the following statement:

For odd $n$, every $n\times n$ matrix with real entries has at least one real eigenvalue.

I noticed that in Determinant-free proof that a real $n \times n$ matrix has at least one real eigenvalue when $n$ is odd., proofs that avoid determinants are posted.

It seems to me that, with the trick of the determinant, this problem might be easy. The first thing I came up with is $\det(A_n-\lambda I_n)$, but this alone does not guarantee that there is at least one real eigenvalue. That is, writing the characteristic polynomial as $f_{A}(\lambda) =\lambda^n + a_{n-1} \lambda^{n-1} + \cdots + a_0$ with $a_i \in \mathbb{R}$, we do not immediately know that one of its roots must be real.

What would be a simple proof using determinants?

Best Answer

Setting the characteristic polynomial as $f_{A}(\lambda) =\lambda^n + a_{n-1} \lambda^{n-1} + \cdots + a_0$ with $a_i \in \mathbb{R}$, we know that $f_A$ has at least one real root when $n$ is odd by the intermediate value theorem.

As $\lambda \to -\infty$, $f_A(\lambda) \to - \infty$; as $\lambda \to +\infty$, $f_A(\lambda) \to +\infty$. We know both of these because that is the behavior of $\lambda^n$ for odd $n$ and large $|\lambda|$, and the $\lambda^n$ term eventually dominates the other terms.
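If you want to see this dominance numerically, here is a minimal sketch (not part of the proof). It assumes a random $5\times 5$ real matrix and evaluates the monic characteristic polynomial $f_A(\lambda) = \det(\lambda I - A)$ at a large negative and a large positive $\lambda$; the choice of matrix, the cutoff $M$, and NumPy itself are all my own illustrative choices, not from the original post.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5                                  # odd dimension
A = rng.standard_normal((n, n))        # random real matrix (illustrative)

def f_A(lam):
    """Monic characteristic polynomial evaluated via the determinant."""
    return np.linalg.det(lam * np.eye(n) - A)

M = 1e3                                # a "large" |lambda|
print(f_A(-M) < 0, f_A(M) > 0)         # expected: True True (sign change)
```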

Therefore there must be some sufficiently negative value of $\lambda$ where $f_A(\lambda)<0$, and some sufficiently positive value of $\lambda$ where $f_A(\lambda)>0$. Between these, by the intermediate value theorem, there must be a value of $\lambda$ where $f_A(\lambda)=0$, and this real root of the characteristic polynomial is a real eigenvalue of $A$.
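As a quick illustrative check (again, my own addition, not part of the proof): since non-real eigenvalues of a real matrix come in conjugate pairs, an odd-sized real matrix must have at least one real eigenvalue, and you can observe this directly with NumPy for a random matrix. The dimension, seed, and tolerance below are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 7                                       # odd dimension
A = rng.standard_normal((n, n))             # random real matrix (illustrative)

eigenvalues = np.linalg.eigvals(A)          # length-n complex array
real_ones = eigenvalues[np.abs(eigenvalues.imag) < 1e-9]
print(real_ones)                            # non-empty whenever n is odd
```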