[Math] Expectation and variance of a matrix-valued random variable

expectation, matrices, probability, random variables, variance

Suppose I have a discrete matrix-valued random variable $X$, that is, I have defined a set of fixed matrices $\{Y_i\}_{i=1}^n$, and the random variable $X = Y_i$ with probability $\frac{1}{n}$. Is there any coherent theory for investigating the expectation and variance of this r.v. $X$?

It seems that a reasonable(?) definition of $\mathbb{E}[X]$ is

\begin{align*}
\mathbb{E}[X] = \frac{1}{n} \sum_{i=1}^n Y_i
\end{align*}

which produces a matrix as the expectation. But I have no idea how the variance should be interpreted. Should the usual definition be used?

\begin{align*}
Var(X) = \mathbb{E}[(X-\mathbb{E}[X])^2] = \frac{1}{n}\sum_{i=1}^n (Y_i - \mathbb{E}[X])^2
\end{align*}

What does the square even mean in this case?

Searching turns up a lot of literature on the statistics of random matrices whose individual entries are random variables, but I was not able to find anything on the situation outlined above. Any pointers would be greatly appreciated!
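
For concreteness, here is a small numpy sketch of the entrywise expectation I have in mind (the $2 \times 2$ matrices below are just made-up examples):

```python
import numpy as np

# Hypothetical example: n = 3 fixed 2x2 matrices Y_i, each chosen with probability 1/n.
Ys = np.array([
    [[1.0, 2.0], [3.0, 4.0]],
    [[0.0, 1.0], [1.0, 0.0]],
    [[2.0, 2.0], [2.0, 2.0]],
])

# Proposed expectation: the entrywise average (1/n) * sum_i Y_i, itself a 2x2 matrix.
E_X = Ys.mean(axis=0)
print(E_X)
```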

Best Answer

I'm giving you a quick answer because I'm lazy, but I hope this helps you!

Treat the $m \times n$ matrix as a big vector of dimension $mn$; that is, treat $X$ as a random vector $X: \Omega \to \mathbb{R}^{mn}$. Now look at this wiki page: https://en.wikipedia.org/wiki/Multivariate_random_variable

In short, $E(X)$ will be a matrix of the same dimensions, with $E(X)_{ij} := E(X_{ij}) \in \mathbb{R}$. Following the wiki page, the covariance of your random variable will be an $mn \times mn$ matrix.
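
If it helps, here is a minimal numpy sketch of this idea (the $2 \times 2$ matrices are made-up examples, not taken from the question): flatten each $Y_i$ into a length-$mn$ vector, take the entrywise mean, and form the $mn \times mn$ covariance matrix of the flattened samples.

```python
import numpy as np

# Hypothetical example: n = 3 equally likely 2x2 matrices (m = 2 rows, 2 columns, so mn = 4).
Ys = np.array([
    [[1.0, 2.0], [3.0, 4.0]],
    [[0.0, 1.0], [1.0, 0.0]],
    [[2.0, 2.0], [2.0, 2.0]],
])

n_samples, n_rows, n_cols = Ys.shape
vecs = Ys.reshape(n_samples, n_rows * n_cols)   # each matrix flattened into a length-mn vector

# Entrywise expectation, reshaped back into matrix form.
E_X = vecs.mean(axis=0).reshape(n_rows, n_cols)

# mn x mn covariance matrix E[(vec(X) - E[vec(X)])(vec(X) - E[vec(X)])^T],
# averaging with weight 1/n since the outcomes are equally likely.
centered = vecs - vecs.mean(axis=0)
cov = centered.T @ centered / n_samples

print(E_X.shape, cov.shape)   # (2, 2) (4, 4)
```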
