Does anyone know of an explicit matrix expression for the covariance of a linear and quadratic form? That is,
$\mathrm{Cov}[\mathbf{a' y},\mathbf{y' Hy}]$
where $\mathbf{y}\sim \mathcal N(\boldsymbol{\mu},\boldsymbol{\Sigma})$.
I'm particularly interested in the case where
$\boldsymbol{\mu}=\mathbf{0}$, and I think this simplifies (without the normality assumption) to
$\mathbb E[(\mathbf{a'y})(\mathbf{y'Hy})]$, since $\mathbb E[\mathbf{a'y}]=0$ in that case. Because this involves third-order terms, it probably isn't going to be simple.
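As a quick sanity check, the claimed simplification (and the eventual answer that the covariance vanishes when $\boldsymbol{\mu}=\mathbf{0}$) can be probed by Monte Carlo. The particular $\mathbf{a}$, $\mathbf{H}$, and $\boldsymbol{\Sigma}$ below are illustrative choices, not from the question:

```python
import numpy as np

rng = np.random.default_rng(0)
a = np.array([1.0, -2.0, 0.5])            # illustrative coefficient vector
H = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, -1.0],
              [0.5, 0.0, 3.0]])            # illustrative (not symmetric) H
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])        # illustrative positive-definite covariance

# Draw y ~ N(0, Sigma) and compare the sample covariance of a'y and y'Hy
# with the sample mean of (a'y)(y'Hy); both should be close to 0.
y = rng.multivariate_normal(np.zeros(3), Sigma, size=1_000_000)
lin = y @ a                                # a'y for each draw
quad = np.einsum('ti,ij,tj->t', y, H, y)   # y'Hy for each draw
cov = np.cov(lin, quad)[0, 1]
cross = (lin * quad).mean()                # estimate of E[(a'y)(y'Hy)]
print(cov, cross)
```

Up to sampling noise, `cov` and `cross` agree (because the sample mean of `lin` is nearly zero) and both hover around zero, which matches the normal-case answer below.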
Best Answer
This is straightforward in the case you're interested in (${\boldsymbol \mu} = 0$) without using matrix algebra.
To clarify the notation: ${\bf y} = (y_{1}, \ldots, y_{n})'$ is a multivariate normal random vector, ${\bf a} = (a_{1}, \ldots, a_{n})'$ is a vector of constants (so ${\bf a}'$ is a row vector), and ${\bf H}$ is an $n \times n$ matrix with entries $\{ h_{jk} \}_{j,k=1}^{n}$. By definition (see e.g. page 3 here) you can rewrite this covariance as $$ {\rm cov}({\bf a}'{\bf y}, {\bf y}' {\bf H} {\bf y}) = {\rm cov} \left( \sum_{i=1}^{n} a_i y_i, \sum_{j=1}^{n} \sum_{k=1}^{n} h_{jk} y_{j} y_{k} \right) = \sum_{i,j,k} {\rm cov}( a_i y_i, h_{jk} y_{j} y_{k} ), $$ where the second equality follows from the bilinearity of covariance.

When ${\boldsymbol \mu} = 0$, each term in the sum is $0$ because $$ {\rm cov}( a_i y_i, h_{jk} y_{j} y_{k} ) = a_i h_{jk} \left[ E(y_i y_j y_k) - E(y_i) E(y_j y_k) \right] = 0. $$ The second term is zero because $E(y_i) = 0$. The first term is zero because all third-order central moments of a multivariate normal vector vanish: when ${\boldsymbol \mu} = 0$, ${\bf y}$ and $-{\bf y}$ have the same distribution, so $E(y_i y_j y_k) = E\left[(-y_i)(-y_j)(-y_k)\right] = -E(y_i y_j y_k)$, which forces $E(y_i y_j y_k) = 0$.
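The key fact, that every third-order moment $E(y_i y_j y_k)$ vanishes when $\boldsymbol{\mu}=\mathbf{0}$, is easy to verify numerically as well. This sketch estimates all $n^3$ third-order moments at once for an illustrative $\boldsymbol{\Sigma}$ of my choosing:

```python
import numpy as np

rng = np.random.default_rng(1)
Sigma = np.array([[2.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])   # illustrative covariance matrix
n = Sigma.shape[0]

y = rng.multivariate_normal(np.zeros(n), Sigma, size=1_000_000)

# Estimate every third-order moment E[y_i y_j y_k] as a sample average;
# the symmetry argument says all n^3 of them should be near 0.
third = np.einsum('ti,tj,tk->ijk', y, y, y) / len(y)
print(np.abs(third).max())
```

The largest estimated moment is only Monte Carlo noise away from zero, consistent with the symmetry argument above.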