The idea is that the singular value decomposition,
$$\mathbf A=\mathbf U\mathbf \Sigma\mathbf V^\top$$
and the eigendecomposition
$$\mathbf A=\mathbf Q\mathbf D\mathbf Q^\top$$
of a symmetric matrix are essentially one and the same: the singular values are the absolute values of the eigenvalues, and for a symmetric positive semidefinite matrix the two decompositions coincide exactly.
Thus, if one wants the Moore-Penrose pseudoinverse of $\mathbf A$, either decomposition could be used. (However, an SVD routine generally wouldn't exploit the nice structure of a symmetric matrix, so it would spend more computational effort than is actually needed; use the eigendecomposition instead.)
The idea is that, letting $\mathbf A^\dagger$ be the Moore-Penrose pseudoinverse, we have the property
$$\mathbf A^\dagger=\mathbf Q\mathbf D^\dagger\mathbf Q^\top$$
where $\mathbf D^\dagger$ is (usually) computed via the following procedure: take $d_1$ to be the largest eigenvalue in absolute value, and let $\varepsilon$ be machine epsilon. Reciprocate any entry of $\mathbf D$ whose absolute value exceeds $\varepsilon\cdot d_1$, and set all other entries to zero.
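The procedure above can be sketched in NumPy (the matrix $\mathbf A$ here is my own small example, chosen to have an exact rank deficiency):

```python
import numpy as np

# A small symmetric PSD matrix with an exact rank deficiency (rank 2).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 0.0]])

# Eigendecomposition A = Q D Q^T; eigh exploits the symmetry.
d, Q = np.linalg.eigh(A)

# Cutoff from the text: machine epsilon times the largest eigenvalue.
tol = np.finfo(A.dtype).eps * np.abs(d).max()

# Reciprocate eigenvalues above the cutoff, zero out the rest.
d_pinv = np.zeros_like(d)
mask = np.abs(d) > tol
d_pinv[mask] = 1.0 / d[mask]

A_pinv = Q @ np.diag(d_pinv) @ Q.T

# Agrees with NumPy's SVD-based pseudoinverse routine.
assert np.allclose(A_pinv, np.linalg.pinv(A))
```

Note that `np.linalg.eigh` is the symmetry-aware routine; this is exactly the "cheaper than SVD" path described above.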
I am unsure how we should properly resolve this question.
As we have discussed at length in the comments, only square matrices can be positive semidefinite. Therefore, if $p\neq n$, there is no way that the product matrix $W=UV$ can be positive semidefinite, because it will also be non-square.
For grins, let's assume that $p=n$. Under what conditions is $W=UV$ positive semidefinite? This requires that $x^HUVx\geq 0$ for all complex vectors $x$. Equivalently, this holds if and only if $Q=W+W^H=UV+VU^H$ has nonnegative eigenvalues (and is therefore PSD itself). If $U$, $V$ are real, then you can relax the Hermitian transposes to real transposes, and consider only real vectors $x$.
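To see that this condition can genuinely fail, here is a small numerical sketch (the matrices are my own example with $p=n=2$): even with $U$ PSD and $V$ positive definite, $W=UV$ need not be PSD.

```python
import numpy as np

U = np.array([[1.0, 1.0],
              [1.0, 1.0]])   # PSD (eigenvalues 0 and 2)
V = np.array([[1.0, 0.0],
              [0.0, 2.0]])   # positive definite

W = U @ V
Q = W + W.T                  # test x^T W x >= 0 via eigenvalues of W + W^T
eigs = np.linalg.eigvalsh(Q)

# det(Q) = -1 < 0, so Q has a negative eigenvalue and W is not PSD.
assert eigs.min() < 0
```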
Now for one special case, I know the answer. If $V$ is positive definite---i.e., not just PSD but nonsingular---and we require $W$ to be positive definite as well, then the answer follows from Lyapunov's theorem applied to linear systems:
- The eigenvalues of $U$ must have positive real part.
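This special case can be illustrated numerically via the Lyapunov equation (a sketch using SciPy; the particular matrices and the spectral shift are my own assumptions):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

rng = np.random.default_rng(2)
n = 3

# U with all eigenvalues having positive real part (shift the spectrum right).
U = rng.standard_normal((n, n)) + 5.0 * np.eye(n)
assert (np.linalg.eigvals(U).real > 0).all()

# Solve the Lyapunov equation (-U) V + V (-U)^T = -I, i.e. U V + V U^T = I.
V = solve_continuous_lyapunov(-U, -np.eye(n))
V = 0.5 * (V + V.T)  # symmetrize against roundoff

# Lyapunov's theorem: since -U is stable, V is symmetric positive definite.
assert (np.linalg.eigvalsh(V) > 0).all()

# W = U V then satisfies W + W^T = I, so x^T W x > 0 for all nonzero real x.
W = U @ V
assert np.allclose(W + W.T, np.eye(n))
```

Here the stability of $-U$ (equivalently, the positive-real-part condition on $U$) is exactly what guarantees the positive definite solution $V$.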
What about the more relaxed cases? That is, what if $V$ is only positive semidefinite? What if $W$ is only required to be positive semidefinite? I am afraid I do not know. I'm sure people who study Lyapunov's theorem for linear systems in some depth know...
Best Answer
Note that if a matrix is positive semi-definite ($\lambda_i\ge 0$) and invertible ($\lambda_i\neq 0$), then $\lambda_i> 0$, so the matrix is positive definite.
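A quick numerical sanity check of this observation (the matrix is my own example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # symmetric, eigenvalues 1 and 3

eigs = np.linalg.eigvalsh(A)
assert (eigs >= 0).all()             # positive semidefinite
assert not np.isclose(np.linalg.det(A), 0.0)  # invertible: no zero eigenvalue
assert (eigs > 0).all()              # hence positive definite
```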