[Math] Cauchy-Schwarz matrix inequality for random vectors

Tags: inequality, matrices, probability, statistics

If $X$ and $Y$ are random scalars, then Cauchy-Schwarz says that
$$| \mathrm{Cov}(X,Y) | \le \mathrm{Var}(X)^{1/2}\mathrm{Var}(Y)^{1/2}.$$
If $X$ and $Y$ are random vectors, is there a way to bound the covariance matrix $\mathrm{Cov}(X,Y)$ in terms of the matrices $\mathrm{Var}(X)$ and $\mathrm{Var}(Y)$?

In particular, is it true that
$$\mathrm{Cov}(X,Y) \le \mathrm{Var}(X)^{1/2}\mathrm{Var}(Y)^{1/2},$$
where the square roots are Cholesky decompositions, and the inequality is read as meaning that the right hand side minus the left hand side is positive semidefinite?

Best Answer

There is a generalization of the Cauchy-Schwarz inequality due to Tripathi [1], which states that \begin{equation} \mathrm{Var}(Y) \ge \mathrm{Cov}(Y,X)\,\mathrm{Var}(X)^{-1}\,\mathrm{Cov}(X,Y) \end{equation} in the sense that the difference is positive semidefinite (here $\mathrm{Var}(X)$ is assumed invertible). Tripathi mentions that a student asked him about this inequality and that he could not find any reference for it in the literature (in 1998!).
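Tripathi's bound can be checked numerically. The sketch below (assuming NumPy; the dimensions, coefficient matrix `A`, and noise level are all illustrative, not from the answer) builds sample covariance blocks for random vectors $X \in \mathbb{R}^3$, $Y \in \mathbb{R}^2$ and verifies that $\mathrm{Var}(Y) - \mathrm{Cov}(Y,X)\,\mathrm{Var}(X)^{-1}\,\mathrm{Cov}(X,Y)$ has no negative eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sample: X in R^3, and Y in R^2 partially dependent on X.
n = 10_000
X = rng.standard_normal((n, 3))
A = rng.standard_normal((2, 3))           # arbitrary mixing matrix
Y = X @ A.T + 0.5 * rng.standard_normal((n, 2))

# Joint sample covariance of (X, Y), then extract the blocks.
S = np.cov(np.hstack([X, Y]), rowvar=False)
var_X = S[:3, :3]       # Var(X), 3x3
var_Y = S[3:, 3:]       # Var(Y), 2x2
cov_YX = S[3:, :3]      # Cov(Y, X), 2x3

# Tripathi's bound: Var(Y) - Cov(Y,X) Var(X)^{-1} Cov(X,Y) should be PSD.
gap = var_Y - cov_YX @ np.linalg.solve(var_X, cov_YX.T)
eigvals = np.linalg.eigvalsh(gap)
print(np.all(eigvals >= -1e-10))  # True (up to numerical round-off)
```

Note that `gap` is exactly the Schur complement of the $\mathrm{Var}(X)$ block in the joint covariance matrix of $(X,Y)$, which is why it inherits positive semidefiniteness from the joint covariance.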

[1]: G. Tripathi, "A matrix extension of the Cauchy-Schwarz inequality", Economics Letters, vol. 63, no. 1, pp. 1–3, Apr. 1999, doi: 10.1016/S0165-1765(99)00014-2.