Vanishing trace implies convergence to zero

convergence-divergence, eigenvalues-eigenvectors, trace

Let $A_n, B_n$ be sequences of $k \times k$ matrices such that $A_n$ converges to a positive semi-definite matrix $A$ and $B_n$ converges to a positive definite matrix $B$ (w.r.t. the Frobenius norm). Suppose that $\mathrm{Tr}(A_n'B_n) \to 0$ as $n\to \infty$. I want to show that $A_n$ converges to zero in this case.

I can see intuitively that this must be the case because the trace defines an inner product on matrices. Hence $\langle A_n,\,B_n\rangle \to 0$ together with $B_n \to B>0$ should imply $A_n \to 0$.
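For reference, the inner product meant here is the Frobenius pairing, which reads entrywise as

$$\langle X,\,Y\rangle \;=\; \mathrm{Tr}(X'Y) \;=\; \sum_{i=1}^{k}\sum_{j=1}^{k} X_{ij}\,Y_{ij}.$$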

I would like to find an elementary proof of this.

Thanks a lot for your help.

Best Answer

Note: if $A$ and $B$ are symmetric, the trace of $AB$ is the sum of $\langle Ae,\,Be\rangle$ over any orthonormal basis $(e)$. In particular, if $B$ is positive definite (with minimum eigenvalue $\lambda$) and $A$ is positive semidefinite, then, taking an orthonormal basis of eigenvectors of $A$, each term satisfies $\langle Ae,\,Be\rangle \ge \lambda\,\langle Ae,\,e\rangle$, and hence $\mathrm{Tr}(AB) \ge \lambda\,\mathrm{Tr}(A)$.
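Spelled out: if $(e_1,\dots,e_k)$ is an orthonormal basis with $Ae_i = \mu_i e_i$ and $\mu_i \ge 0$, then

$$\mathrm{Tr}(AB) \;=\; \sum_{i=1}^{k}\langle Ae_i,\,Be_i\rangle \;=\; \sum_{i=1}^{k}\mu_i\,\langle e_i,\,Be_i\rangle \;\ge\; \lambda\sum_{i=1}^{k}\mu_i \;=\; \lambda\,\mathrm{Tr}(A),$$

using $\langle e_i,\,Be_i\rangle \ge \lambda$ for unit vectors $e_i$.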

So, assuming as in the note that the matrices are symmetric (so that $\mathrm{Tr}(A_n'B_n) = \mathrm{Tr}(A_nB_n)$), let $a_n \ge 0$ tend to zero such that $A_n + a_nI$ is positive semidefinite; one can take $a_n = \max(0, -\lambda_{\min}(A_n))$, which goes to zero because eigenvalues depend continuously on the matrix and $\lambda_{\min}(A) \ge 0$. We also know that there are $r > 0$ and $N \geq 0$ such that $B_n - rI$ is positive semidefinite for $n \geq N$ (take $r$ smaller than the least eigenvalue of $B$). Applying the note to the pair $A_n + a_nI$, $B_n$ shows that $\mathrm{Tr}(A_n) \to 0$, as spelled out below; thus $\mathrm{Tr}(A) = 0$, and a positive semidefinite matrix with zero trace is zero, so $A = 0$ and $A_n \to 0$.
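In detail, for $n \ge N$ the note (with $B_n$ having minimum eigenvalue at least $r$) gives

$$r\,\mathrm{Tr}(A_n + a_nI) \;\le\; \mathrm{Tr}\big((A_n + a_nI)B_n\big) \;=\; \mathrm{Tr}(A_nB_n) + a_n\,\mathrm{Tr}(B_n) \;\longrightarrow\; 0,$$

since $\mathrm{Tr}(A_nB_n) \to 0$ by hypothesis, $a_n \to 0$, and $\mathrm{Tr}(B_n) \to \mathrm{Tr}(B)$ is bounded. As $\mathrm{Tr}(A_n + a_nI)$ is nonnegative and equals $\mathrm{Tr}(A_n) + k\,a_n$, this forces $\mathrm{Tr}(A_n) \to 0$.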
