[Math] simple relation between the entropy of a matrix and its characteristic polynomial

determinants, entropy, it.information-theory, linear-algebra, matrix-equations

Assume $M$ is an invertible, positive $N \times N$ matrix (full rank, so all eigenvalues are strictly positive). The von Neumann entropy $H$ of a matrix $M$ with eigenvalues $\{ \lambda_n \}$ is

$H[M] = -\sum_{n=1}^N \lambda_n \ln \lambda_n$.
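For concreteness, here is a minimal numerical sketch of this definition (using NumPy; the random positive definite matrix `M` is a hypothetical example, not anything from the question):

```python
import numpy as np

# Hypothetical test matrix: A A^T plus a small shift is positive
# definite, hence invertible with strictly positive eigenvalues.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
M = A @ A.T + 0.1 * np.eye(4)

# Von Neumann entropy H[M] = -sum_n lambda_n * ln(lambda_n).
lam = np.linalg.eigvalsh(M)
H = -np.sum(lam * np.log(lam))
print(H)
```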

In principle, the eigenvalues are encoded in the characteristic polynomial

$\phi_t (M) = \mathrm{det}(tI-M) = \prod_n(t-\lambda_n) = t^{N} + a_{N-1} t^{N-1} + \cdots + a_1 t + a_0$.

The trace $\mathrm{Tr}\, M$ is given, up to sign, by the coefficient $a_{N-1}$ in the characteristic polynomial:

$\lim\limits_{t \to \infty} \dfrac{\phi_t (M)-t^N}{t^{N-1}} = a_{N-1} = -\mathrm{Tr}\, M = -\sum_n \lambda_n$.
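As a sanity check of this coefficient relation, here is a short sketch reusing the hypothetical test matrix from above (`np.poly` returns the coefficients of $\det(tI-M)$ in descending powers of $t$):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
M = A @ A.T + 0.1 * np.eye(4)    # same hypothetical test matrix

# np.poly(M) = [1, a_{N-1}, ..., a_1, a_0] for det(tI - M).
coeffs = np.poly(M)
print(coeffs[1], -np.trace(M))   # a_{N-1} and -Tr M should agree
```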

Is there a similar relationship between the entropy and the characteristic polynomial?

Best Answer

I'm not sure whether this qualifies as simple (or whether it is helpful at all), but taking the logarithmic derivative of $\phi_M(t)=\prod_n(t-\lambda_n)$ (writing $\phi_M(t)$ for the question's $\phi_t(M)$) gives $$ \frac{\phi'_M(t)}{\phi_M(t)}=\sum_n\frac{1}{t-\lambda_n}. $$
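A quick numerical check of this identity (a sketch; the test matrix is again hypothetical, and `np.polyder` differentiates the coefficient array):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
M = A @ A.T + 0.1 * np.eye(4)    # hypothetical positive definite matrix
lam = np.linalg.eigvalsh(M)

p = np.poly(M)                   # coefficients of phi_M(t) = det(tI - M)
dp = np.polyder(p)               # coefficients of phi'_M(t)

t = 10.0                         # any point away from the eigenvalues
print(np.polyval(dp, t) / np.polyval(p, t))   # phi'/phi at t
print(np.sum(1.0 / (t - lam)))                # sum_n 1/(t - lambda_n)
```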

Using the residue theorem, we can write $$ H[M]=\frac{-1}{2\pi i}\oint\frac{\phi'_M(z)}{\phi_M(z)}\,z\log(z)\,dz, $$ where the integral is taken over a closed contour containing all of the eigenvalues of $M$. Each eigenvalue is a simple pole of $\phi'_M/\phi_M$ with residue equal to its multiplicity, so the integral picks up $-\sum_n \lambda_n \log \lambda_n = H[M]$. Since $M$ is positive, the eigenvalues lie on the positive real axis, so we can choose the principal branch of $\log(z)$ (cut along the negative real axis) and keep the contour away from the cut; alternatively, one can work on the Riemann surface of $\log(z)$.
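The contour formula can be verified numerically. Below is a minimal sketch under the same assumptions as before: the test matrix is hypothetical, and the circular contour is built from the eigenvalue range purely so that it encloses all eigenvalues while staying in the right half-plane, away from the principal branch cut.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
M = A @ A.T + 0.1 * np.eye(4)           # hypothetical positive definite matrix
lam = np.linalg.eigvalsh(M)

p = np.poly(M)                          # coefficients of phi_M(z) = det(zI - M)
dp = np.polyder(p)

# Circle enclosing all eigenvalues but staying in Re z > 0, so log(z)
# is analytic on and inside it; the only poles of the integrand inside
# the contour are the eigenvalues of M.
c = (lam.min() + lam.max()) / 2
r = (lam.max() - lam.min()) / 2 + 0.5 * lam.min()
theta = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)
z = c + r * np.exp(1j * theta)
dz = 1j * r * np.exp(1j * theta)        # dz/dtheta along the circle

integrand = np.polyval(dp, z) / np.polyval(p, z) * z * np.log(z)
H_contour = (-1.0 / (2j * np.pi)) * np.sum(integrand * dz) * (2.0 * np.pi / theta.size)

H_direct = -np.sum(lam * np.log(lam))
print(H_contour.real, H_direct)         # should agree to high accuracy
```

On a closed contour the trapezoid rule converges spectrally for smooth integrands, so a few thousand sample points already match the direct eigenvalue computation to many digits.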
