I am unsure how we should properly resolve this question.
As we have discussed at length in the comments, only square matrices can be positive semidefinite. Therefore, if $p\neq n$, there is no way that the product matrix $W=UV$ can be positive semidefinite, because it will also be non-square.
For grins, let's assume that $p=n$. Under what conditions is $W=UV$ positive semidefinite? This requires that $x^HUVx\geq 0$ for all complex vectors $x$. Equivalently, this holds if and only if the Hermitian part $Q=W+W^H=UV+VU^H$ has nonnegative eigenvalues (and is therefore PSD itself). If $U$, $V$ are real, then you can relax the Hermitian transposes to real transposes and consider only real vectors $x$.
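Here is a quick numeric sanity check of that criterion (a numpy sketch of my own; the particular $U$, $V$ are arbitrary illustrative choices):

```python
import numpy as np

# Sketch: for square U, V, the quadratic form x^H W x is nonnegative for
# all x exactly when the Hermitian part Q = W + W^H is PSD.
U = np.array([[2.0, 1.0],
              [0.0, 1.0]])
V = np.eye(2)            # a simple positive definite choice
W = U @ V
Q = W + W.conj().T       # Hermitian part (times 2)

# Q is PSD iff all of its eigenvalues are nonnegative.
eigs = np.linalg.eigvalsh(Q)
print(np.all(eigs >= -1e-12))  # True for this U, V
```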
Now for one special case, I know the answer. If $V$ is positive definite---i.e., not just PSD but nonsingular---and we require $W$ to be positive definite as well, then the answer follows from Lyapunov's theorem applied to linear systems:
- The eigenvalues of $U$ must have positive real part.
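One concrete instance consistent with the Lyapunov condition (my own example, not a proof; $U$ and $V$ are arbitrary choices with the stated properties):

```python
import numpy as np

# With V positive definite, positive definiteness of the Hermitian part
# of W = UV forces every eigenvalue of U to have positive real part.
U = np.array([[1.0, -2.0],
              [0.5,  1.0]])         # eigenvalues 1 ± i, real parts > 0
V = np.array([[2.0, 0.0],
              [0.0, 1.0]])          # positive definite

W = U @ V
herm_part_eigs = np.linalg.eigvalsh(W + W.conj().T)
print(np.all(herm_part_eigs > 0))              # W is positive definite here
print(np.all(np.linalg.eigvals(U).real > 0))   # consistent with Lyapunov
```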
What about the more relaxed cases? That is, what if $V$ is only positive semidefinite? What if $W$ is only required to be positive semidefinite? I am afraid I do not know. I'm sure people who study Lyapunov's theorem for linear systems in some depth know...
Observe that the $2$-norm of a matrix is
$$\|A\|_2=\sup_{\|x\|=1}\sqrt{\langle x,A^*Ax\rangle}=\sup_{\|x\|=1}\sqrt{\|Ax\|^2}=\|A\|,$$
i.e. the 2-norm and the operator norm coincide. Knowing only $\|B\|\leq\|A\|$, I would not expect to be able to control $\|CBC\|$ in terms of $\|CAC\|$. The following example shows that you need some type of extra condition to get a bound:
$$
A=\begin{pmatrix}
1&-1\\
-1&1
\end{pmatrix},
\qquad
C=\begin{pmatrix}
1&1\\
1&1
\end{pmatrix}.
$$
For these matrices, $AC=0$ and hence $CAC=0$, yet for instance $B=I$ satisfies $\|B\|\leq\|A\|=2$ while $CBC=C^2=2C\neq 0$, so no bound of the desired form can hold.
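The counterexample, checked numerically (the choice $B=I$ is mine, to make the failure explicit):

```python
import numpy as np

A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])
C = np.array([[1.0, 1.0],
              [1.0, 1.0]])
B = np.eye(2)                        # ||B|| = 1 <= 2 = ||A||

# ord=2 gives the spectral (operator) norm.
print(np.linalg.norm(C @ A @ C, 2))  # 0.0 -> nothing can bound ||CBC||
print(np.linalg.norm(C @ B @ C, 2))  # 4.0, since CBC = C^2 = 2C
```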
Case 1: If we assume that $C$ is invertible, then
$$
\|CAC\|\geq \|C^{-1}\|^{-1}\|AC\|=\|C^{-1}\|^{-1}\|(AC)^*\|=\|C^{-1}\|^{-1}\|CA\| \geq \|C^{-1}\|^{-2}\|A\|,
$$
so
$$
\|CBC\|\leq \|C\|^2\|B\|\leq\|C\|^2\|A\|\leq\|C\|^2\|C^{-1}\|^2\|CAC\|.
$$
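A numeric check of the Case 1 bound on one example of my own (PSD $A$, $B$ with $\|B\|\leq\|A\|$ and invertible $C$ chosen arbitrarily):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])           # ||A|| ~ 3.62
B = np.eye(2)                        # ||B|| = 1 <= ||A||
C = np.array([[2.0, 0.0],
              [0.0, 1.0]])           # invertible

nrm = lambda M: np.linalg.norm(M, 2)   # spectral norm
lhs = nrm(C @ B @ C)
rhs = nrm(C)**2 * nrm(np.linalg.inv(C))**2 * nrm(C @ A @ C)
print(lhs <= rhs)  # True
```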
Case 2: Suppose $A$ is invertible, so that $I\leq \|A^{-1}\|\,A$. Observe that $B\leq\|B\|I\leq \|B\|\|A^{-1}\|A$, which implies $CBC\leq \|B\|\|A^{-1}\|CAC$; since $0\leq X\leq Y$ implies $\|X\|\leq\|Y\|$ for PSD matrices, it follows that
$$
\|CBC\|\leq \|B\|\|A^{-1}\|\|CAC\|.
$$
Note that this holds even without the condition $\|B\|\leq\|A\|$!
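To illustrate the last remark, here is a check on matrices of my own choosing where $\|B\|>\|A\|$ and the Case 2 bound still holds:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 1.0]])           # invertible, ||A|| = 2
B = np.array([[5.0, 0.0],
              [0.0, 3.0]])           # ||B|| = 5 > ||A||
C = np.array([[1.0, 1.0],
              [1.0, 2.0]])           # PSD

nrm = lambda M: np.linalg.norm(M, 2)   # spectral norm
lhs = nrm(C @ B @ C)
rhs = nrm(B) * nrm(np.linalg.inv(A)) * nrm(C @ A @ C)
print(lhs <= rhs)  # True
```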
For $A$ symmetric positive semi-definite, we have
$$\rho(A)^2\quad\leq\quad \sum_{k=1}^n \lambda_k^2 \quad =\quad\operatorname{trace}(A^2),$$
where $\lambda_1,\ldots,\lambda_n$ are the eigenvalues of $A$ and $\rho(A)$ its spectral radius.
EDIT: In the book Matrix Analysis by R. Horn and C. Johnson, the bound $$ \operatorname{trace}(A^2) \geq \frac{\operatorname{trace}(A)^2}{\operatorname{rank}(A)}$$ appears (see Exercise 13, p. 175), which is in turn slightly better than $\frac{\operatorname{trace}(A)^2}{n}$ :).
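Both inequalities can be sanity-checked numerically; here is a sketch on a PSD matrix of my own construction:

```python
import numpy as np

# Build a symmetric PSD matrix A = M^T M.
M = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 1.0]])
A = M.T @ M

rho = max(abs(np.linalg.eigvalsh(A)))   # spectral radius
tr_A2 = np.trace(A @ A)
rank = np.linalg.matrix_rank(A)

print(rho**2 <= tr_A2 + 1e-12)                   # rho(A)^2 <= trace(A^2)
print(tr_A2 >= np.trace(A)**2 / rank - 1e-12)    # Horn-Johnson bound
```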