Linear Algebra – If $A\ge0$ and $AB$ is Self-Dual, then $|\langle ABx,x\rangle| \le \Vert B\Vert_2 \langle Ax,x\rangle$

inner-products, linear-algebra, matrix-norms

Claim to prove (exercise 91.7 of Halmos 1958):

Let $A$ and $B$ be linear transformations of a finite-dimensional real or complex inner product space. If $A$ is positive semi-definite and $AB$ is self-dual (i.e. symmetric in the real case, Hermitian in the complex case), then $|\langle ABx,x\rangle| \le \Vert B\Vert_2 \cdot \langle Ax,x\rangle$ for every vector $x$.

Note: The absolute value in the claim above was omitted by mistake in the first version of the question. The proof in the accepted answer still works mutatis mutandis.
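A quick numerical sanity check of the claim (a sketch assuming `numpy`; the construction $B=A^{-1}S$ with $S$ Hermitian is just one convenient way to make $AB=S$ self-adjoint, not the general case):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
for _ in range(1000):
    # random positive definite A (the invertible case of the claim)
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    A = M @ M.conj().T + 1e-3 * np.eye(n)
    # random Hermitian S; setting B = A^{-1} S makes AB = S self-adjoint
    H = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    S = (H + H.conj().T) / 2
    B = np.linalg.solve(A, S)
    x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    lhs = abs(np.vdot(x, A @ B @ x))                      # |<ABx, x>|
    rhs = np.linalg.norm(B, 2) * np.vdot(x, A @ x).real   # ||B||_2 <Ax, x>
    assert lhs <= rhs + 1e-8 * (1 + rhs)                  # small slack for rounding
```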


I really haven't made much progress (and I feel like it should be fairly simple), but here are my observations, for what they're worth. Each path seems like a dead end…

Clearly $|\langle ABx,x\rangle| \le \Vert B\Vert \Vert x\Vert \Vert Ax\Vert$ and $\langle Ax,x\rangle \le \Vert x\Vert \Vert Ax\Vert$.
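Both follow from Cauchy–Schwarz: since $A$ is self-adjoint,
$$
|\langle ABx,x\rangle| = |\langle Bx,Ax\rangle| \le \Vert Bx\Vert\,\Vert Ax\Vert \le \Vert B\Vert\,\Vert x\Vert\,\Vert Ax\Vert,
$$
and $\langle Ax,x\rangle\le\Vert Ax\Vert\,\Vert x\Vert$ directly.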

By the spectral theorem, $A$ and $AB$ each have an orthonormal basis of eigenvectors with real eigenvalues. Say $\{e_i\}_{i=1}^n$ are orthonormal eigenvectors of $A$ with eigenvalues $\sigma_i\ge0$, and $\{f_i\}_{i=1}^n$ are orthonormal eigenvectors of $AB$ with eigenvalues $\lambda_i\in\Bbb R$. Then I can write, for example,
$$
\langle ABx, x \rangle = \sum_{i} \lambda_i|\langle x, f_i\rangle|^2 = \sum_{i,j} \lambda_i\langle x, f_i\rangle\langle f_i, e_j\rangle\langle e_j, x\rangle,
$$

$$
\langle Ax, x \rangle = \sum_{i} \sigma_i|\langle x, e_i\rangle|^2,
$$

or several other variations of the above. The trouble with this approach is connecting it to $\Vert B\Vert$. I do have a characterization of what $B$ must look like for $AB$ to be self-dual. Let $T$ be the unitary (orthogonal, in the real case) matrix given by $Te_i = f_i$, and write $B=CT^{-1} = CT^*$, i.e. $C=BT$. We have
$$
\langle Ce_i,e_j\rangle
= \langle Bf_i,e_j\rangle
= \frac{\lambda_i}{\sigma_j}\langle f_i,e_j\rangle,
\quad\textrm{if }\sigma_j\ne0.
$$
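(The middle equalities here and in the next display come from moving $A$ across the inner product; e.g.
$$
\sigma_j\langle Bf_i,e_j\rangle
= \langle Bf_i,Ae_j\rangle
= \langle ABf_i,e_j\rangle
= \lambda_i\langle f_i,e_j\rangle,
$$
using $A=A^*$ and $ABf_i=\lambda_i f_i$.)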

Also,
$$
\langle e_i,Te_j\rangle
= \langle e_i,f_j\rangle
= \frac{\sigma_i}{\lambda_j}\langle e_i,Bf_j\rangle
= 0
\quad\textrm{if }\sigma_i=0\textrm{ and }\lambda_j\ne0.
$$

These are conditions on the entries of the matrices $C$ and $T$, and together with $T^{-1}=T^*$, they completely characterize the possible matrices $B$.
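As a sanity check on the spectral expansions above, here is a sketch in `numpy` (real case; again using $B=A^{-1}S$ with $S$ symmetric as one hypothetical way to make $AB$ self-dual, and the convention that $\langle x,e_i\rangle$ is `e[:, i] @ x`):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + 1e-3 * np.eye(n)                 # positive definite A
S = rng.standard_normal((n, n)); S = (S + S.T) / 2
B = np.linalg.solve(A, S)                      # so AB = S is symmetric
sigma, e = np.linalg.eigh(A)                   # columns e[:, i] are the e_i
lam, f = np.linalg.eigh(A @ B)                 # columns f[:, i] are the f_i
x = rng.standard_normal(n)
# <ABx, x> = sum_i lambda_i |<x, f_i>|^2
assert np.isclose(x @ (A @ B) @ x, np.sum(lam * (f.T @ x) ** 2))
# <Ax, x> = sum_i sigma_i |<x, e_i>|^2
assert np.isclose(x @ A @ x, np.sum(sigma * (e.T @ x) ** 2))
```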

Best Answer

It is equivalent to prove that $\Vert B\Vert A-AB$ is positive semidefinite: it is self-dual (both $\Vert B\Vert A$ and $AB$ are), so positive semidefiniteness says exactly that $\langle ABx,x\rangle\le\Vert B\Vert\langle Ax,x\rangle$ for all $x$, and replacing $B$ by $-B$ then gives the other half of the absolute value.

If $A$ is invertible, then it is equivalent to show that $A^{1/2}(\|B\|I -B)A^{1/2}$ has all nonnegative eigenvalues, and for that it is sufficient that $\operatorname{Re}\langle(\|B\|I-B)x,x\rangle\ge 0$ for every $x$, which is true for easy reasons (Cauchy–Schwarz).
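To spell the similarity out: $\Vert B\Vert A - AB = A(\Vert B\Vert I - B)$, and conjugating by $A^{1/2}$,
$$
A^{-1/2}\bigl(\Vert B\Vert A - AB\bigr)A^{1/2} = A^{1/2}\bigl(\Vert B\Vert I - B\bigr)A^{1/2},
$$
so the two sides have the same (real) eigenvalues. If $A^{1/2}(\Vert B\Vert I - B)A^{1/2}v = \mu v$ and $x = A^{1/2}v$, then $\mu\Vert v\Vert^2 = \langle(\Vert B\Vert I - B)x,x\rangle$, whose real part is at least $\Vert B\Vert\,\Vert x\Vert^2 - |\langle Bx,x\rangle| \ge 0$ by Cauchy–Schwarz; hence $\mu\ge0$.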

If $A$ is not invertible, you can restrict to its range and reduce to the previous case. Take an eigendecomposition $A=QDQ^*$ and set $C=Q^*BQ$; then $DC=Q^*(AB)Q$ is self-adjoint, i.e. $DC=C^*D$, and since $Q$ is unitary $\Vert C\Vert=\Vert B\Vert$, so the claim for $A,B$ becomes the same claim for $D,C$, with $D$ diagonal. Suppose WLOG that the zero eigenvalues are all at the bottom of $D$, and that there are $k$ of them. Since $DC$ has its last $k$ rows equal to zero and is self-adjoint, its last $k$ columns are zero as well, and thus $C = \begin{pmatrix}E&0\\*&*\end{pmatrix}$, where the zero submatrix has dimension $(n-k)\times k$. As a consequence, if $F$ is the maximal non-singular principal diagonal submatrix of $D$ and $y$ is the truncation of $x$ to its first $n-k$ components, then $$x^*DCx = y^*FEy, \qquad \|C\|\, x^*Dx \ge \|E\|\, y^*Fy$$ (a submatrix has norm at most that of the whole matrix), where now $F$ is invertible and $FE$ is self-adjoint, so we have reduced to the previous case.
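For what it's worth, the singular case can be acted out numerically (a sketch assuming `numpy`, real case; the block construction of $C$ below is one hypothetical way to produce a singular example with the $\begin{pmatrix}E&0\\*&*\end{pmatrix}$ structure):

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 6, 2                                    # k zero eigenvalues in D
f = rng.uniform(0.5, 2.0, n - k)
D = np.diag(np.concatenate([f, np.zeros(k)]))  # D = diag(F, 0), F > 0 diagonal
S = rng.standard_normal((n - k, n - k)); S = (S + S.T) / 2
E = np.diag(1 / f) @ S                         # F E = S symmetric, so DC is too
C = np.block([[E, np.zeros((n - k, k))],
              [rng.standard_normal((k, n - k)), rng.standard_normal((k, k))]])
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # random orthogonal Q
A, B = Q @ D @ Q.T, Q @ C @ Q.T                # A >= 0 singular, AB self-adjoint
assert np.allclose(A @ B, (A @ B).T)
for _ in range(200):
    x = rng.standard_normal(n)
    assert abs(x @ A @ B @ x) <= np.linalg.norm(B, 2) * (x @ A @ x) + 1e-9
```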
