Convex optimization under a positive semidefinite constraint involving projections

convex-optimization, optimization, positive-semidefinite, quadratic-programming

I am interested in finding the operator $X$ that is smallest in the Frobenius norm (also called the Hilbert-Schmidt norm):

$$\begin{array}{ll}
\text{minimize:} & \lVert X \rVert_F^2\\
\text{subject to:} & P \,\left( X + M \right) \, P \succeq 0
\end{array}$$

where $P$ is an orthogonal projection. Both $X$ and $M$ are Hermitian, so the whole expression $P(X+M)P$ is Hermitian as well, and positive semidefiniteness is therefore well defined for it.
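A quick numerical sanity check of that claim (a sketch with NumPy, using randomly generated data): if $P$ is an orthogonal projection and $X$, $M$ are Hermitian, then $P(X+M)P$ is Hermitian, so its eigenvalues are real and the semidefinite constraint makes sense.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Orthogonal projection onto a random 2-dimensional subspace: P = Q Q^T.
Q, _ = np.linalg.qr(rng.standard_normal((n, 2)))
P = Q @ Q.T
assert np.allclose(P @ P, P) and np.allclose(P, P.T)  # P^2 = P = P^H

def random_hermitian(n):
    A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (A + A.conj().T) / 2

X, M = random_hermitian(n), random_hermitian(n)
E = P @ (X + M) @ P
print(np.allclose(E, E.conj().T))  # Hermitian -> True
print(np.allclose(np.linalg.eigvalsh(E).imag, 0))  # real eigenvalues -> True
```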

  1. What kind of problem is this?

  2. It looks like a quadratic program. Is that correct?

  3. Is it convex, or can it be made convex (or relaxed to an SDP), so that I can be sure any solution found is a global minimum over the feasible set?

  4. Will it also be efficient?

Best Answer

I presume that $P$ and $M$ are given input matrices. Then this is a linear SDP (the constraint is an LMI), which is a convex problem. Because of the semidefinite constraint, it is not a quadratic program.

More specifically, don't square the norm in the objective: minimizing $\lVert X \rVert_F$ yields the same minimizer as minimizing $\lVert X \rVert_F^2$. The unsquared norm can then be converted to a second-order cone constraint via an epigraph formulation, so the problem has one second-order cone constraint and one linear SDP constraint. It can be formulated via CVX, YALMIP, CVXPY, CVXR, or a similar tool, and solved with a (linear) SDP solver such as Mosek, SDPT3, or SeDuMi, among others.
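Concretely, the epigraph reformulation introduces a scalar $t$ and moves the norm into a second-order cone constraint:

$$\begin{array}{ll}
\text{minimize:} & t\\
\text{subject to:} & \lVert X \rVert_F \le t\\
& P \,\left( X + M \right) \, P \succeq 0
\end{array}$$

This is equivalent to the original problem, and modeling tools perform this reformulation automatically when you write the unsquared norm in the objective.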

CVX code (automatically does epigraph reformulation):

cvx_begin sdp
variable X(n,n) hermitian    % n must be defined beforehand
minimize(norm(X,'fro'))      % CVX performs the epigraph reformulation
P*(X+M)*P >= 0               % interpreted as an LMI in sdp mode
cvx_end

This formulation allows X to be complex. If you want X to be real symmetric, use symmetric instead of hermitian in the variable declaration.
