If $B$ commutes with a positive semidefinite matrix $A$, then $B$ commutes with $\sqrt A$

Tags: eigenvalues-eigenvectors, linear-algebra, positive-semidefinite

How can one prove that if $B$ commutes with a positive semidefinite matrix $A$, then $B$ commutes with $\sqrt A$?

The notes I am following argue as follows: first, if $A$ is positive semidefinite, then
$$\sqrt A=\text{polynomial of } A=P\begin{pmatrix}\sqrt{\lambda_1}&&\\&\sqrt{\lambda_2}&\\&&\ddots\end{pmatrix}P^T.$$

From this we can deduce that if $B$ commutes with a positive semidefinite matrix $A$, then $B$ commutes with $\sqrt A$, using the fact that if $B$ commutes with $A$, then $B$ commutes with every polynomial in $A$. But I don't quite understand what a polynomial of a matrix is, nor how to show that if $B$ commutes with $A$, then $B$ commutes with polynomials in $A$.

Best Answer

A polynomial of a matrix means a "substitution" of the matrix for the indeterminate $x$ into a polynomial in $x$. E.g. if $f(x)=c_0+c_1x+c_2x^2+\cdots+c_mx^m$ is a polynomial with complex coefficients, then $f(A)$ means the sum $c_0I+c_1A+c_2A^2+\cdots+c_mA^m$.
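As a minimal numerical sketch of this substitution (the specific polynomial and matrix below are illustrative, assuming NumPy):

```python
import numpy as np

# Evaluate f(x) = 2 + 3x + x^2 at a matrix A:
# f(A) = 2I + 3A + A^2, with the identity I replacing the constant term.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
coeffs = [2.0, 3.0, 1.0]  # c_0, c_1, c_2

# matrix_power(A, 0) is the identity, so the constant term is handled uniformly.
fA = sum(c * np.linalg.matrix_power(A, k) for k, c in enumerate(coeffs))
```

Here $A^2=\begin{pmatrix}5&4\\4&5\end{pmatrix}$, so $f(A)=2I+3A+A^2=\begin{pmatrix}13&7\\7&13\end{pmatrix}$.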

Let $f$ be any polynomial such that $f(\lambda_i)=\sqrt{\lambda_i}$ for every $i$ (e.g. take $f$ as a Lagrange interpolation polynomial). Then $f(A)=f(P\Lambda P^{-1})=Pf(\Lambda)P^{-1}=P\sqrt{\Lambda}P^{-1}=\sqrt{A}$.
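This construction can be checked numerically. The sketch below (assuming NumPy; the matrix and the helper `lagrange_eval` are illustrative) interpolates $\sqrt{\cdot}$ at the distinct eigenvalues of a small symmetric PSD matrix and evaluates the interpolant at the matrix itself:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # symmetric PSD, eigenvalues 1 and 3

eigvals = np.linalg.eigh(A)[0]
lams = np.unique(np.round(eigvals, 12))  # distinct eigenvalues only

def lagrange_eval(x_nodes, y_nodes, M):
    """Evaluate the Lagrange interpolant through (x_i, y_i) at the matrix M."""
    n = M.shape[0]
    I = np.eye(n)
    result = np.zeros_like(M)
    for i, (xi, yi) in enumerate(zip(x_nodes, y_nodes)):
        term = yi * I
        for j, xj in enumerate(x_nodes):
            if j != i:
                term = term @ (M - xj * I) / (xi - xj)
        result += term
    return result

# f interpolates sqrt at the eigenvalues, so f(A) = sqrt(A).
sqrtA = lagrange_eval(lams, np.sqrt(lams), A)
```

Squaring `sqrtA` recovers `A`, confirming that $f(A)$ really is a square root of $A$.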

Now, if $B$ commutes with $A$, then $B$ also commutes with all nonnegative integer powers of $A$: $BA^k=(BA)A^{k-1}=ABA^{k-1}=A(BA)A^{k-2}=AABA^{k-2}=\cdots=A^kB$. Hence $B$ commutes with all linear combinations of nonnegative integer powers of $A$, i.e. $B$ commutes with all polynomials in $A$. Since $\sqrt{A}$ is a polynomial in $A$, the conclusion follows.
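The whole argument can be sanity-checked numerically. In the sketch below (assuming NumPy; the choice of $A$, and of $B$ as a polynomial in $A$, is illustrative), $\sqrt A$ is built from the spectral decomposition and then checked to commute with $B$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random PSD matrix A, and a B that commutes with A
# (any polynomial in A does, e.g. B = A^2 + 2A + I).
X = rng.standard_normal((4, 4))
A = X @ X.T
B = A @ A + 2 * A + np.eye(4)

# sqrt(A) via the spectral decomposition A = P diag(lam) P^T.
lam, Pmat = np.linalg.eigh(A)
sqrtA = Pmat @ np.diag(np.sqrt(lam)) @ Pmat.T

# B commutes with A, hence with every polynomial in A, in particular sqrt(A).
assert np.allclose(B @ A, A @ B)
assert np.allclose(B @ sqrtA, sqrtA @ B)
```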
