Solved – Covariance of a linear and quadratic form of a multivariate normal

covariance · multivariate analysis · normal distribution

Does anyone know of an explicit matrix expression for the covariance of a linear and quadratic form? That is,

$\mathrm{Cov}[\mathbf{a' y},\mathbf{y' Hy}]$
where $\mathbf{y}\sim \mathcal N(\boldsymbol{\mu},\boldsymbol{\Sigma})$.

I'm particularly interested in the case where
$\boldsymbol{\mu}=\mathbf{0}$, and I think this simplifies (without the normal assumption) to

$\mathbb E[(\mathbf{a'y})(\mathbf{y'Hy})]$. Since this involves cubic terms it probably isn't going to be simple.
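The simplification is indeed valid: since $\mathbb E[\mathbf{a'y}] = \mathbf{a'}\boldsymbol{\mu} = 0$ when $\boldsymbol{\mu}=\mathbf{0}$, the product-of-means term in the covariance drops out:

$$\mathrm{Cov}[\mathbf{a'y},\mathbf{y'Hy}] = \mathbb E[(\mathbf{a'y})(\mathbf{y'Hy})] - \mathbb E[\mathbf{a'y}]\,\mathbb E[\mathbf{y'Hy}] = \mathbb E[(\mathbf{a'y})(\mathbf{y'Hy})].$$

Note that this step only uses $\mathbb E[\mathbf{a'y}]=0$, so it holds without the normal assumption, as you say.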

Best Answer

This is straightforward in the case you're interested in (${\boldsymbol \mu} = 0$) without using matrix algebra.

To clarify the notation: ${\bf y} = (y_{1}, \ldots, y_{n})'$ is a multivariate normal random vector, ${\bf a} = (a_{1}, \ldots, a_{n})'$ is a fixed vector (so ${\bf a}'$ is a row vector), and ${\bf H}$ is an $n \times n$ matrix with entries $\{ h_{jk} \}_{j,k=1}^{n}$. By bilinearity of covariance, you can rewrite this covariance as $$ {\rm cov}({\bf a}'{\bf y}, {\bf y}' {\bf H} {\bf y}) = {\rm cov} \left( \sum_{i=1}^{n} a_i y_i, \sum_{j=1}^{n} \sum_{k=1}^{n} h_{jk} y_{j} y_{k} \right) = \sum_{i,j,k} {\rm cov}( a_i y_i, h_{jk} y_{j} y_{k} ). $$

When ${\boldsymbol \mu} = \mathbf{0}$, each term in the sum is $0$ because $${\rm cov}( a_i y_i, h_{jk} y_{j} y_{k} ) = a_i h_{jk} \left[ E(y_i y_j y_k) - E(y_i) E(y_j y_k) \right] = 0.$$ The second term is zero because $E(y_i) = 0$. The first term is zero because the third-order central moments of a multivariate normal random vector are $0$, which can be seen more clearly by examining each case:

  • when $i,j,k$ are distinct, then $E(y_i y_j y_k)=0$ by Isserlis' Theorem
  • when $i\neq j = k$, we have $E(y_i y_j y_k) = E(y_i y_{j}^2)$. From the conditional distribution of the multivariate normal, $E(y_i \mid y_j) = y_j \cdot \Sigma_{ij}/\Sigma_{jj}$, so $E(y_{i} y_{j}^2 \mid y_{j}) = y_{j}^3 \cdot \Sigma_{ij}/\Sigma_{jj}$. Therefore, by the law of total expectation, $$E(y_i y_{j}^2) = E\left( E(y_{i} y_{j}^2 \mid y_{j}) \right) = E(y_{j}^3 \, \Sigma_{ij}/\Sigma_{jj} ) = E(y_{j}^3) \cdot \Sigma_{ij}/\Sigma_{jj} = 0, $$ where $E(y_{j}^3) = 0$ because $y_j$ is symmetrically distributed with mean $0$, which implies that $y_{j}^3$ is also symmetrically distributed with mean $0$.
  • when $i=j=k$, $E(y_i y_j y_k) = E(y_{i}^3) = 0$ by the same rationale just given.
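As a quick sanity check of the result, here is a Monte Carlo sketch (with arbitrary illustrative choices of ${\bf a}$, ${\bf H}$, and $\boldsymbol\Sigma$; none come from the question) showing that the sample covariance of ${\bf a}'{\bf y}$ and ${\bf y}'{\bf H}{\bf y}$ is near zero when $\boldsymbol\mu = \mathbf 0$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 1_000_000  # dimension and number of Monte Carlo draws

# Arbitrary example inputs: a vector a, a (not necessarily symmetric) H,
# and a positive-definite covariance matrix Sigma.
a = np.array([1.0, -2.0, 0.5])
H = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 2.0]])
A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)

# Draw y ~ N(0, Sigma) and form the linear and quadratic statistics.
y = rng.multivariate_normal(np.zeros(n), Sigma, size=m)
lin = y @ a                                # a'y for each draw
quad = np.einsum('ij,jk,ik->i', y, H, y)   # y'Hy for each draw

cov = np.cov(lin, quad)[0, 1]
print(cov)  # close to 0, up to Monte Carlo error
```

The `einsum` call computes $\sum_{j,k} h_{jk} y_j y_k$ for each draw, mirroring the double sum in the derivation above.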