How can we evaluate the characteristic polynomial with a matrix as the parameter

characteristic-polynomial, eigenvalues-eigenvectors, linear-algebra, matrices, polynomials

For any polynomial $p(x) = a_0 + a_1 x + \cdots + a_k x^k$ and any square matrix $A$, $p(A)$ is defined as $p(A) = a_0 I + a_1 A + \cdots + a_k A^k$. Show that if $v$ is any eigenvector of $A$ and $\chi_A(x)$ is the characteristic polynomial of $A$, then $\chi_A(A)v = 0$. Deduce that if $A$ is diagonalisable, then $\chi_A(A)$ is the zero matrix.

I don't get what it means here to apply the characteristic polynomial with the matrix as the parameter. Does it mean subtracting $A$ from each term in $p(A)$?
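As a minimal numerical sketch of the definition above (my own illustration, assuming NumPy and an arbitrarily chosen $2\times 2$ matrix, not part of the original exercise):

```python
import numpy as np

# Evaluate p(A) = a_0*I + a_1*A + ... + a_k*A^k, exactly as in the definition above.
# coeffs[j] holds the coefficient a_j of x**j.
def poly_of_matrix(coeffs, A):
    n = A.shape[0]
    result = np.zeros((n, n))
    power = np.eye(n)            # current power of A, starting with A**0 = I
    for a in coeffs:
        result += a * power      # add a_j * A**j
        power = power @ A        # advance to the next power of A
    return result

# Example: p(x) = 1 + 2x + 3x^2 evaluated at an arbitrary 2x2 matrix.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(poly_of_matrix([1.0, 2.0, 3.0], A))   # = I + 2A + 3A^2
```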

Best Answer

The definition given for $p$ is just showing you how to interpret $\chi_A(A)$: you substitute the matrix $A$ for the variable $x$ in each term, with the constant term $a_0$ becoming $a_0 I$. The characteristic polynomial is usually treated as a scalar function whose roots are the eigenvalues of $A$, but here, instead of a function $\mathbb{R}\to\mathbb{R}$, we treat it as a function $M_{n\times n}(\mathbb{R})\to M_{n\times n}(\mathbb{R})$: we plug in a matrix and obtain a matrix as the result. The polynomial $p$ in the statement is just a generic example of how this substitution works.
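For instance, with a concrete matrix of my own choosing (not from the question):

$$A=\begin{pmatrix}2&1\\0&3\end{pmatrix},\qquad \chi_A(x)=\det(xI-A)=(x-2)(x-3)=x^2-5x+6,$$

$$\chi_A(A)=A^2-5A+6I=\begin{pmatrix}4&5\\0&9\end{pmatrix}-\begin{pmatrix}10&5\\0&15\end{pmatrix}+\begin{pmatrix}6&0\\0&6\end{pmatrix}=\begin{pmatrix}0&0\\0&0\end{pmatrix},$$

which is exactly the behaviour the exercise asks you to prove in general.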
