Matrix of powers of eigenvalues

determinant, eigenvalues-eigenvectors, linear algebra, trace

My question is about part (b) of the problem below.


Problem.
(a) Suppose that $\lambda_1, \dots, \lambda_r$ are distinct, non-zero complex numbers. Show that the matrix $$B = \begin{bmatrix} \lambda_1 & \lambda_2 & \cdots & \lambda_r \\ \lambda_1^2 & \lambda_2^2 & \cdots & \lambda_r^2 \\ \vdots & \vdots & \cdots & \vdots \\ \lambda_1^r & \lambda_2^r & \cdots & \lambda_r^r \end{bmatrix}$$ is invertible.

(b) Suppose that $A$ is a complex square matrix such that the trace of $A^k$ is zero for every $k \in \mathbb{N}$. Show that all eigenvalues of $A$ are zero.


Question. I've proved part (a). In part (b), I've shown that if $\lambda_1, \dots, \lambda_n$ are the eigenvalues of the matrix $A$, then $tr(A^k) = \lambda_1^k + \dots + \lambda_n^k$. My idea now is to select the distinct, nonzero eigenvalues $\lambda_1, \dots, \lambda_r$ (after reindexing) and arrange them into a matrix of the form of the matrix $B$. Assuming that each of these eigenvalues has algebraic multiplicity 1, we get
\begin{align}
\begin{bmatrix} \lambda_1 & \lambda_2 & \cdots & \lambda_r \\ \lambda_1^2 & \lambda_2^2 & \cdots & \lambda_r^2 \\ \vdots & \vdots & \cdots & \vdots \\ \lambda_1^r & \lambda_2^r & \cdots & \lambda_r^r \end{bmatrix} \begin{bmatrix} 1 \\ 1 \\ \vdots \\ 1 \end{bmatrix} = \begin{bmatrix} tr(A) \\ tr(A^2) \\ \vdots \\ tr(A^r) \end{bmatrix}, \tag{1}
\end{align}

where the vector of traces is the zero vector by the hypothesis of the problem. So (1) exhibits a matrix that is supposedly invertible (by part (a)) and yet maps a nonzero vector to zero, a contradiction.
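As a numerical sanity check of (1) under the multiplicity-one assumption, here is a short numpy sketch; the particular eigenvalues and the choice of a diagonal $A$ are arbitrary, made only for illustration:

```python
import numpy as np

# Distinct nonzero eigenvalues, each with algebraic multiplicity 1
# (arbitrary example values).
lams = np.array([1.0, 2.0, -3.0])
A = np.diag(lams)  # a simple matrix realizing these eigenvalues
r = len(lams)

# B[k-1, i] = lams[i]**k for k = 1..r, the matrix from part (a).
B = np.array([[lam ** k for lam in lams] for k in range(1, r + 1)])

# Vector of traces tr(A^1), ..., tr(A^r).
traces = np.array([np.trace(np.linalg.matrix_power(A, k))
                   for k in range(1, r + 1)])

# Equation (1): B times the all-ones vector equals the trace vector.
print(np.allclose(B @ np.ones(r), traces))  # True
```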

The problem I'm having is this: what if some eigenvalue $\lambda_i$ has algebraic multiplicity greater than 1? If I take only the distinct eigenvalues, I can no longer guarantee that the right-hand side is the vector of traces, so it is no longer necessarily the zero vector. On the other hand, if I construct the matrix $B$ using multiple copies of $\lambda_i$, then $B$ is no longer invertible. In short: I don't know how to account for the possibility of nonzero eigenvalues with nontrivial algebraic multiplicity. Lastly, I think the resolution may have something to do with the algebraic multiplicities of the eigenvalues "merging" under powers, as described on stack exchange here, but I'm not sure.

Best Answer

You have the right idea. Assume, by way of contradiction, that $A$ has distinct, nonzero eigenvalues $\lambda_1,\dots,\lambda_r$ with algebraic multiplicities $m_1,\dots,m_r$, respectively. Then \begin{align} \begin{bmatrix} \lambda_1 & \lambda_2 & \cdots & \lambda_r \\ \lambda_1^2 & \lambda_2^2 & \cdots & \lambda_r^2 \\ \vdots & \vdots & \cdots & \vdots \\ \lambda_1^r & \lambda_2^r & \cdots & \lambda_r^r \end{bmatrix} \begin{bmatrix} m_1 \\ m_2 \\ \vdots \\ m_r \end{bmatrix} = \begin{bmatrix} tr(A) \\ tr(A^2) \\ \vdots \\ tr(A^r) \end{bmatrix}, \tag{1} \end{align} since $tr(A^k)=m_1\lambda_1^k+\cdots+ m_r\lambda_r^k$. By hypothesis the right-hand side is the zero vector, and the matrix on the left is invertible by part (a), so $m_i=0$ for $1\leq i \leq r$, which contradicts $m_i \geq 1$.
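To illustrate the system in the answer numerically (the eigenvalues and multiplicities below are arbitrary choices): because $B$ is invertible, the multiplicity vector is uniquely determined by the trace vector, so if every trace were zero, each $m_i$ would be forced to be zero.

```python
import numpy as np

# Arbitrary distinct nonzero eigenvalues, now with multiplicities > 1 allowed.
lams = np.array([2.0, -1.0, 3.0])
mults = np.array([2, 1, 3])
A = np.diag(np.repeat(lams, mults))  # diagonal matrix with repeated eigenvalues

r = len(lams)
# B[k-1, i] = lams[i]**k for k = 1..r, the matrix from part (a).
B = np.array([[lam ** k for lam in lams] for k in range(1, r + 1)])
traces = np.array([np.trace(np.linalg.matrix_power(A, k))
                   for k in range(1, r + 1)])

# tr(A^k) = m_1*lam_1^k + ... + m_r*lam_r^k, i.e. B @ mults = traces,
# so solving the system recovers the multiplicities from the traces.
recovered = np.linalg.solve(B, traces)
print(np.round(recovered))  # [2. 1. 3.]
```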
