Well, in the $1 \times 1$ case, a matrix is positive semi-definite precisely when its single entry is a non-negative number, and a random variable $X$ has zero variance if and only if it is a.s. constant. (If you don't know what ‘a.s.’ means, you may ignore it throughout this discussion.) Indeed, assuming $\mathbb{E}[X] = 0$ (which is no loss of generality), we have
$$\operatorname{Var} X = \int_{\Omega} X^2 \mathrm{d} \mathbb{P}$$
and since the integrand is non-negative, this is zero if and only if the integrand is a.s. zero, i.e. if and only if $X = 0$ a.s. In the case where $\mathbb{E}[X] \ne 0$, we have (by linearity) $\operatorname{Var} X = 0$ if and only if $X = \mathbb{E}[X]$ a.s.
In general, if you have an $n \times n$ symmetric matrix $V$, there is an orthogonal matrix $Q$ such that $Q V Q^{\sf T}$ is a diagonal matrix $D$, and $V$ is positive semi-definite if and only if the diagonal entries of $D$ are all non-negative. But if $V$ is the covariance matrix of $\mathbf{X}$, then $D$ is the covariance matrix of $Q \mathbf{X}$, so $V$ is positive semi-definite but not positive definite if and only if some diagonal entry of $D$ is zero, i.e. some component of $Q \mathbf{X}$ has zero variance and is therefore a.s. constant. This happens if and only if some non-trivial linear combination of the components of $\mathbf{X}$ is ‘fully correlated’, to use your phrasing.
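The correspondence between a singular covariance matrix and an a.s.-constant linear combination can be seen numerically; here is a small sketch in NumPy (the data and variable names are my own illustration, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random vector whose third component is a linear
# combination of the first two, so the covariance is singular.
x1 = rng.normal(size=10_000)
x2 = rng.normal(size=10_000)
X = np.stack([x1, x2, x1 + x2])       # rows are components, columns are samples

V = np.cov(X)                          # 3x3 sample covariance matrix
eigvals, Q = np.linalg.eigh(V)         # V = Q diag(eigvals) Q^T, Q orthogonal

# The smallest eigenvalue is numerically zero, and the corresponding
# linear combination of the components has (numerically) zero variance:
v = Q[:, 0]
print(abs(eigvals[0]) < 1e-8)          # eigenvalue ~ 0
print(np.var(v @ X) < 1e-8)            # v . X is a.s. constant
```

Here `eigh` plays the role of the orthogonal diagonalisation in the paragraph above: the null eigenvector picks out the degenerate direction.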
I am unsure how we should properly resolve this question.
As we have discussed at length in the comments, only square matrices can be positive semidefinite. Therefore, if $p\neq n$, there is no way that the product matrix $W=UV$ can be positive semidefinite, because it will also be non-square.
For grins, let's assume that $p=n$. Under what conditions is $W=UV$ positive semidefinite? This requires that $x^HUVx\geq 0$ for all complex vectors $x$. Equivalently, it holds if and only if $Q=W+W^H=UV+VU^H$ has nonnegative eigenvalues (and is therefore PSD itself). If $U$ and $V$ are real, then you can relax the Hermitian transposes to real transposes and consider only real vectors $x$.
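This symmetric-part criterion is easy to test numerically in the real case; the following is a sketch with NumPy (the function name and test matrices are my own):

```python
import numpy as np

def symmetric_part_psd(W, tol=1e-10):
    """x^T W x >= 0 for all real x  <=>  W + W^T is PSD."""
    return bool(np.all(np.linalg.eigvalsh(W + W.T) >= -tol))

rng = np.random.default_rng(1)
n = 4
A = rng.normal(size=(n, n))
V = A @ A.T                      # PSD (generically PD) by construction

# With U = I, W = UV = V is PSD, so the check passes;
# with U = -I, W = -V is negative definite, so it fails.
print(symmetric_part_psd(np.eye(n) @ V))     # True
print(symmetric_part_psd(-np.eye(n) @ V))    # False
```

Note that for a generic $U$ the check will usually fail, which is exactly why the question of characterising the admissible $U$ is non-trivial.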
Now for one special case, I know the answer. If $V$ is positive definite---i.e., not just PSD but nonsingular---and we require $W$ to be positive definite as well, then the answer follows from Lyapunov's theorem applied to linear systems:
- The eigenvalues of $U$ must have positive real part.
What about the more relaxed cases? That is, what if $V$ is only positive semidefinite? What if $W$ is only required to be positive semidefinite? I am afraid I do not know. I'm sure people who study Lyapunov's theorem for linear systems in some depth know...
Best Answer
The answer is affirmative. Every positive semidefinite matrix $C$ can be orthogonally diagonalised as $QD^2Q^T$, where $Q$ is a real orthogonal matrix and $D$ is a nonnegative diagonal matrix. Let $\mathbf{Z}$ be a random vector following the standard multivariate normal distribution $N(0,I_n)$. It is straightforward to verify that $C$ is the covariance matrix of $\mathbf{X}=QD\mathbf{Z}$: since $\mathbf{Z}$ has covariance $I_n$, the covariance of $QD\mathbf{Z}$ is $(QD)I_n(QD)^T=QD^2Q^T=C$.
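The construction above can be carried out numerically; here is a sketch in NumPy (the particular matrix $C$ and the clipping of tiny negative eigenvalues are my own choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Any PSD matrix C, here built as B B^T.
B = rng.normal(size=(3, 3))
C = B @ B.T

# Orthogonal diagonalisation C = Q D^2 Q^T.
eigvals, Q = np.linalg.eigh(C)
D = np.diag(np.sqrt(np.clip(eigvals, 0.0, None)))  # clip guards rounding error

# X = Q D Z then has covariance (QD) I (QD)^T = Q D^2 Q^T = C exactly:
A = Q @ D
print(np.allclose(A @ A.T, C))   # True
```

In practice one would draw samples as `A @ rng.standard_normal(3)`; the identity `A @ A.T == C` is what guarantees the resulting vector has covariance $C$.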