$$\begin{array}{ll} \text{minimize} & f(x) + g(y)\\ \text{subject to} & xy \ge a\\ & x \ge 0\\ & y \ge 0\end{array}$$
where both $f$ and $g$ are convex quadratic functions and $a > 0$. The feasible region is convex and, since $a > 0$, also LMI-representable
$$\{ (x,y) \in \mathbb R^2 : x \geq 0 \land y \geq 0 \land x y \geq a \} = \left\{ (x,y) \in \mathbb R^2 : \begin{bmatrix} x & \sqrt{a}\\ \sqrt{a} & y\end{bmatrix} \succeq \mathrm O_2 \right\}$$
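This set equality can be checked numerically. A minimal sketch (the value $a = 2$ and the sample points are arbitrary choices for illustration):

```python
import numpy as np

a = 2.0  # an arbitrary value with a > 0, for illustration

def in_lmi(x, y):
    """True iff [[x, sqrt(a)], [sqrt(a), y]] is positive semidefinite."""
    M = np.array([[x, np.sqrt(a)], [np.sqrt(a), y]])
    return np.linalg.eigvalsh(M).min() >= -1e-9  # small tolerance for rounding

def in_region(x, y):
    """True iff (x, y) satisfies x >= 0, y >= 0, x*y >= a."""
    return x >= 0 and y >= 0 and x * y >= a

# the two descriptions agree on a few sample points
samples = [(1.0, 3.0), (2.0, 1.0), (0.5, 0.5), (-1.0, -3.0), (4.0, 0.6)]
agree = all(in_lmi(x, y) == in_region(x, y) for x, y in samples)
```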
Hence, the original optimization problem can be rewritten as follows
$$\begin{array}{ll} \text{minimize} & f(x) + g(y)\\ \text{subject to} & \begin{bmatrix} x & \sqrt{a}\\ \sqrt{a} & y\end{bmatrix} \succeq \mathrm O_2\end{array}$$
Introducing optimization variables $s, t \in \mathbb R$, we rewrite the optimization problem in epigraph form
$$\begin{array}{ll} \text{minimize} & s + t\\ \text{subject to} & f(x) \leq s\\ & g(y) \leq t\\ & \begin{bmatrix} x & \sqrt{a}\\ \sqrt{a} & y\end{bmatrix} \succeq \mathrm O_2\end{array}$$
Let $f$ and $g$ be
$$f (x) := f_0 + f_1 x + f_2 x^2 \qquad\qquad\qquad g (y) := g_0 + g_1 y + g_2 y^2$$
where $f_2, g_2 > 0$ (to ensure convexity). Inequality constraints $f(x) \leq s$ and $g(y) \leq t$ can be written in LMI form, as follows
$$\begin{bmatrix} 1 & \sqrt{f_2} \, x\\ \sqrt{f_2} \, x & s - f_0 - f_1 x\end{bmatrix} \succeq \mathrm O_2$$
$$\begin{bmatrix} 1 & \sqrt{g_2} \, y\\ \sqrt{g_2} \, y & t - g_0 - g_1 y\end{bmatrix} \succeq \mathrm O_2$$
These LMIs also impose the inequalities $s - f_0 - f_1 x \geq 0$ and $t - g_0 - g_1 y \geq 0$, but these are redundant: since the lines $s = f_0 + f_1 x$ and $t = g_0 + g_1 y$ are tangent (at $x = 0$ and $y = 0$) to the graphs of the convex functions $f$ and $g$, respectively, they are implied by $f(x) \leq s$ and $g(y) \leq t$.
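The Schur-complement equivalence behind the first of these LMIs can be sanity-checked numerically. A minimal sketch (the coefficients $f_0, f_1, f_2$ are arbitrary choices with $f_2 > 0$):

```python
import numpy as np

# arbitrary example coefficients with f2 > 0, chosen for illustration
f0, f1, f2 = 1.0, -2.0, 3.0

def f(x):
    return f0 + f1 * x + f2 * x**2

def lmi_holds(x, s):
    """True iff [[1, sqrt(f2)*x], [sqrt(f2)*x, s - f0 - f1*x]] is PSD."""
    M = np.array([[1.0, np.sqrt(f2) * x],
                  [np.sqrt(f2) * x, s - f0 - f1 * x]])
    return np.linalg.eigvalsh(M).min() >= -1e-9

# by the Schur complement, the LMI holds exactly when s >= f(x)
ok = all(lmi_holds(x, f(x) + 0.1) and not lmi_holds(x, f(x) - 0.1)
         for x in [-1.0, 0.0, 2.0])
```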
Hence, we obtain a semidefinite program (SDP) in variables $x, y, s, t \in \mathbb R$
$$\begin{array}{ll} \text{minimize} & s + t\\ \text{subject to} & \begin{bmatrix} 1 & \sqrt{f_2} \, x & & & & \\ \sqrt{f_2} \, x & s - f_0 - f_1 x & & & & \\ & & 1 & \sqrt{g_2} \, y & & \\ & & \sqrt{g_2} \, y & t - g_0 - g_1 y & & \\ & & & & x & \sqrt{a}\\ & & & & \sqrt{a} & y\end{bmatrix} \succeq \mathrm O_6\end{array}$$
which can be solved numerically using any SDP solver.
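As a sanity check on the formulation, the original problem can also be solved for a concrete instance with a general-purpose solver. The sketch below uses SciPy's SLSQP rather than an SDP solver, on the assumed instance $f(x) = x^2$, $g(y) = y^2$, $a = 1$, whose optimum is $x = y = 1$ by symmetry and AM-GM:

```python
from scipy.optimize import minimize

# assumed concrete instance: f(x) = x^2, g(y) = y^2, a = 1;
# by symmetry and the AM-GM inequality the optimum is x = y = 1, value 2
res = minimize(
    lambda z: z[0]**2 + z[1]**2,        # f(x) + g(y)
    x0=[2.0, 2.0],                      # a feasible starting point
    bounds=[(0.0, None), (0.0, None)],  # x >= 0, y >= 0
    constraints=[{"type": "ineq", "fun": lambda z: z[0] * z[1] - 1.0}],  # xy >= 1
)
x_opt, y_opt = res.x
```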
Let $X_0 = \sum_{i=1}^n \lambda_i v_i v_i^T$ be the eigenvalue decomposition of the symmetric matrix $X_0$, with $\lambda_1 \geq \lambda_2 \geq \dots \geq \lambda_n$. Assume $\lambda_n < 0$, as otherwise $X_0$ is positive semidefinite and its projection onto the positive semidefinite cone is $X_0$ itself. Then we have
\begin{equation}
v_i^T X_0 v_i = \sum_{j=1}^n \lambda_j v_i^T v_j v_j^T v_i = \lambda_i
\end{equation}
and
\begin{equation}
||X_0||_2 = \max\{\lambda_1, -\lambda_n\} = \max\{v_1^T X_0 v_1, -v_n^T X_0 v_n\} = \max\{\sup_{||v||_2=1} v^T X_0 v, -\inf_{||v||_2=1} v^T X_0 v\}
\end{equation}
Now let $X$ be any symmetric positive semidefinite matrix. We have
\begin{equation}
||X - X_0||_2 \geq \sup_{||v||_2 = 1} v^T (X - X_0) v \geq v_n^T (X - X_0) v_n = v_n^T X v_n - v_n^TX_0v_n \geq -\lambda_n
\end{equation}
Now define $X := \sum_{i=1}^n \max\{\lambda_i, 0\} v_i v_i^T$; then $||X - X_0||_2 = -\lambda_n$, so the lower bound above is attained. Therefore, this $X$ is a projection of $X_0$ onto the positive semidefinite cone (in the spectral norm).
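The eigenvalue-clipping construction can be verified numerically. A minimal sketch (the random $5 \times 5$ test matrix is an arbitrary choice; note that `numpy.linalg.eigh` returns eigenvalues in ascending order, so `lam[0]` plays the role of $\lambda_n$):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
X0 = (A + A.T) / 2                      # symmetric test matrix

lam, V = np.linalg.eigh(X0)             # eigenvalues in ascending order
X = (V * np.maximum(lam, 0.0)) @ V.T    # clip negative eigenvalues to zero

min_eig_X = np.linalg.eigvalsh(X).min()   # X should be PSD
dist = np.linalg.norm(X - X0, ord=2)      # spectral-norm distance to X0
```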
Best Answer
Minimizing $ {\left\| X \right\|}_{F} $ is equivalent to minimizing $ {\left\| X \right\|}_{F}^{2} $, which is equivalent to minimizing $ {\left\| x \right\|}_{2}^{2} $, where $ x = \operatorname{vec} \left( X \right) $ is the vectorization of $ X $.
Now you can write your problem as:
$$\begin{align*} \arg \min_{x} \quad & \frac{1}{2} {\left\| C x - d \right\|}_{2}^{2} \\ \text{subject to} \quad & A x \leq b \end{align*}$$
where $ C = I $ and $ d = \boldsymbol{0} $.
Now all you need is a linear least squares solver that supports linear inequality constraints.
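A minimal sketch of that last step: SciPy's `lsq_linear` handles only bound constraints, so the sketch below instead minimizes the least-squares objective under $Ax \leq b$ with the general-purpose SLSQP solver. The toy data $A$, $b$ is an assumption for illustration; with $C = I$ and $d = 0$ this is just the projection of the origin onto the halfspace $x_1 + x_2 \geq 2$, namely $x = (1, 1)$:

```python
import numpy as np
from scipy.optimize import minimize

# toy constraint (assumed for illustration): x1 + x2 >= 2, written as A x <= b
A = np.array([[-1.0, -1.0]])
b = np.array([-2.0])

# minimize (1/2)||C x - d||^2 with C = I, d = 0, i.e. (1/2)||x||^2
res = minimize(
    lambda x: 0.5 * x @ x,
    x0=np.zeros(2),
    # SLSQP's "ineq" convention is fun(x) >= 0, so b - A x >= 0 means A x <= b
    constraints=[{"type": "ineq", "fun": lambda x: b - A @ x}],
)
```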