Solved – Covariance between Linear Combinations of random vectors

covariance, random variable

Given a random vector $x\sim N(0, \Sigma)$ of dimension $p$ and matrices $A$ and $B$ (both $m\times p$), what is $Cov(Ax, Bx)$?

It seems to me that the covariance should be $A\Sigma B^T$, but I am second-guessing this because the result is not guaranteed to be symmetric positive definite.

Am I missing something?

Best Answer

\begin{align}
Cov(Ax, Bx) &= E[(Ax-E(Ax))(Bx-E(Bx))^T]\\
&= E[A(x-E(x))(x-E(x))^TB^T] \\
&= AE[(x-E(x))(x-E(x))^T]B^T \\
&= A\Sigma B^T
\end{align}
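A quick Monte Carlo sanity check of this identity (a sketch using NumPy; the dimensions $p$, $m$, the sample size, and the particular $\Sigma$, $A$, $B$ below are arbitrary choices, not anything from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
p, m, n = 4, 3, 200_000  # dimension of x, rows of A and B, number of samples

# Build an arbitrary symmetric positive definite Sigma and arbitrary A, B.
L = rng.standard_normal((p, p))
Sigma = L @ L.T + p * np.eye(p)
A = rng.standard_normal((m, p))
B = rng.standard_normal((m, p))

# Draw samples x ~ N(0, Sigma); each row of X is one realization of x.
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

AX, BX = X @ A.T, X @ B.T  # rows are Ax and Bx respectively

# Empirical cross-covariance of Ax and Bx.
emp = (AX - AX.mean(0)).T @ (BX - BX.mean(0)) / (n - 1)

# Should be small and shrink as n grows.
print(np.max(np.abs(emp - A @ Sigma @ B.T)))
```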

While a covariance matrix is symmetric positive semidefinite, a cross-covariance matrix doesn't have to be.
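For instance, even with $\Sigma = I$ in two dimensions, $Cov(Ax, Bx) = AB^T$ can fail to be symmetric (a minimal illustration with hand-picked $A$ and $B$):

```python
import numpy as np

Sigma = np.eye(2)
A = np.array([[1.0, 0.0],
              [0.0, 1.0]])
B = np.array([[0.0, 1.0],
              [0.0, 0.0]])

C = A @ Sigma @ B.T   # Cov(Ax, Bx) for x ~ N(0, I)
print(C)              # [[0. 0.], [1. 0.]] -- not symmetric
```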

See property $4$ on the cross-covariance matrix page; it gives the result you are interested in.