Solved – Variance of the product of a random matrix and a random vector

mathematical-statistics, probability

If $X$ and $Y$ are independent random variables, then the variance
of the product $XY$ is given by

$
\mathbb{V}\left(XY\right)=\left\{ \mathbb{E}\left(X\right)\right\} ^{2}\mathbb{V}\left(Y\right)+\left\{ \mathbb{E}\left(Y\right)\right\} ^{2}\mathbb{V}\left(X\right)+\mathbb{V}\left(X\right)\mathbb{V}\left(Y\right)
$
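As a quick sanity check, this scalar identity can be verified by simulation. The normal distributions and parameters below are arbitrary choices for illustration, not anything from the question:

```python
import numpy as np

# Monte Carlo check of V(XY) for independent X and Y.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.normal(loc=2.0, scale=3.0, size=n)   # E[X] = 2,  V(X) = 9
y = rng.normal(loc=-1.0, scale=2.0, size=n)  # E[Y] = -1, V(Y) = 4

empirical = np.var(x * y)
# E[X]^2 V(Y) + E[Y]^2 V(X) + V(X) V(Y) = 16 + 9 + 36 = 61
theoretical = 2.0**2 * 4.0 + (-1.0)**2 * 9.0 + 9.0 * 4.0

print(empirical, theoretical)
```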

If $\mathbf{X}$ is an $m\times m$ random matrix and $\mathbf{y}$ is an independent $m\times 1$ random vector, what is the variance of the product $\mathbf{X}\mathbf{y}$?

My Attempt

$
\mathbb{V}\left(\mathbf{X}\mathbf{y}\right)=\mathbb{E}\left(\mathbf{X}\right)\mathbb{V}\left(\mathbf{y}\right)\left\{ \mathbb{E}\left(\mathbf{X}\right)\right\} ^{\prime}+\left\{ \mathbb{E}\left(\mathbf{y}\right)\otimes\mathbf{I}_{m}\right\} ^{\prime}\mathbb{V}\left\{ \textrm{vec}\left(\mathbf{X}\right)\right\} \left\{ \mathbb{E}\left(\mathbf{y}\right)\otimes\mathbf{I}_{m}\right\} +\mathbb{V}\left\{ \textrm{vec}\left(\mathbf{X}\right)\right\} \left\{ \mathbb{V}\left(\mathbf{y}\right)\otimes\mathbf{I}_{m}\right\}
$

I know this is not right; at least the last term is wrong. I'd highly appreciate it if you could give me the right identity or point me to a reference. Thanks in advance for your help and time.

Best Answer

I'll assume that the elements of $\mathbf{y}$ are i.i.d. and likewise for the elements of $\mathbf{X}$. This is important, though, so be forewarned!

  1. Each diagonal element of the covariance matrix is the variance of $\mathbf{x}_i^\text{T}\mathbf{y} = \sum_{j=1}^m x_{ij}y_j$, a sum of $m$ i.i.d. products of independent random variables, so it equals $m\,\mathbb{V}(x_{ij}y_j)$, where $\mathbb{V}(x_{ij}y_j)$ is given by the scalar identity at the top of your question.

  2. The off-diagonal elements all equal zero, as the rows of $\mathbf{X}$ are independent. To see this, without loss of generality assume $\mathbb{E}x_{ij} = \mathbb{E}y_i = 0$ for all $i,j$. Define $\mathbf{x}_i$ as the $i^{\text{th}}$ row of $\mathbf{X}$, transposed to be a column vector. Then:

    $\text{Cov}(\mathbf{x}_i^\text{T}\mathbf{y},\,\mathbf{x}_j^\text{T}\mathbf{y}) = \mathbb{E}\left[(\mathbf{x}_i^\text{T}\mathbf{y})(\mathbf{x}_j^\text{T}\mathbf{y})\right] = \mathbb{E}\,\mathbf{y}^{\text{T}}\mathbf{x}_i\mathbf{x}_j^\text{T}\mathbf{y} = \mathbb{E}_y\,\mathbb{E}_x\,\mathbf{y}^{\text{T}}\mathbf{x}_i\mathbf{x}_j^\text{T}\mathbf{y}$

    Note that $\mathbf{x}_i\mathbf{x}_j^\text{T}$ is a matrix, the $(p,q)^\text{th}$ element of which equals $x_{ip}x_{jq}$. When $i \ne j$, the expectation with respect to $x$ of $\mathbf{y}^{\text{T}}\mathbf{x}_i\mathbf{x}_j^\text{T}\mathbf{y}$ equals 0 for any $\mathbf{y}$, as each element is just the expectation of the product of two independent r.v.s with mean 0 times $y_py_q$. Consequently, the entire expectation equals 0.
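Both points can be checked by a small simulation under the stated i.i.d. assumption, using zero-mean entries as in point 2. The distributions, dimensions, and sample size below are arbitrary choices: the empirical covariance of $\mathbf{X}\mathbf{y}$ should be close to $m\,\mathbb{V}(x_{ij}y_j)\,\mathbf{I}_m$.

```python
import numpy as np

# Simulation check: X has i.i.d. zero-mean entries, y is independent of X
# with i.i.d. zero-mean entries. Cov(Xy) should be ~ m * V(x_ij * y_j) * I_m.
rng = np.random.default_rng(1)
m, n = 4, 200_000

X = rng.normal(0.0, 1.5, size=(n, m, m))   # V(x_ij) = 2.25
y = rng.normal(0.0, 0.5, size=(n, m, 1))   # V(y_j)  = 0.25

z = (X @ y).squeeze(-1)         # n draws of the m-vector Xy
cov = np.cov(z, rowvar=False)   # empirical m x m covariance matrix

# With zero means, V(x_ij * y_j) = V(x_ij) * V(y_j) = 0.5625,
# so the diagonal should be near m * 0.5625 = 2.25, off-diagonals near 0.
print(np.round(cov, 3))
```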
