[Math] Convex Optimization – Minimizing the Frobenius Norm of a Matrix with Linear Inequality Constraint of its Vectorization

Tags: convex-optimization, least-squares, linear-algebra, optimization

I have the following optimization problem:

Minimize $\|X\|_F$ subject to $Ax\le b$

Where $X$ is a matrix in $\mathbb{R}^{n\times n}$ of variables, $x = \operatorname{vec}(X) \in \mathbb{R}^{n^2}$ is the vector of those variables, $A\in \mathbb{R}^{n^2\times n^2}$, $b\in\mathbb{R}^{n^2}$, and $\|\cdot\|_F$ is the Frobenius norm $\|X\|_F = \sqrt{\operatorname{tr}(XX^T)}$.

If I understand correctly, this is a convex optimization problem and can be solved with convex optimization tools, but my searches did not turn up an explicit treatment of this problem, and I believe I'm missing obvious tricks.

What is the standard way of solving this problem using convex optimization? Are there any transformations which make this, e.g., a semidefinite program? A quadratic program? etc.

Best Answer

Minimizing $ {\left\| X \right\|}_{F} $ is equivalent to minimizing $ {\left\| X \right\|}_{F}^{2} $, which in turn is equivalent to minimizing $ {\left\| x \right\|}_{2}^{2} $ where $ x = \operatorname{vec} \left( X \right) $, i.e., the vectorization operator applied to $ X $.
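The identity $\|X\|_F = \|\operatorname{vec}(X)\|_2$ is easy to check numerically; here is a quick sanity check with NumPy on a random matrix (the seed and size are arbitrary):

```python
import numpy as np

# Verify that ||X||_F equals ||vec(X)||_2 for a random matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4))

fro = np.linalg.norm(X, "fro")      # Frobenius norm of the matrix
vec2 = np.linalg.norm(X.ravel())    # Euclidean norm of vec(X)

assert np.isclose(fro, vec2)
```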

Now you can write your problem as:

$$\begin{align*} \arg \min_{x} \quad & \frac{1}{2} {\left\| C x - d \right\|}_{2}^{2} \\ \text{subject to} \quad & A x \leq b \end{align*}$$

Where $ C = I $ and $ d = \boldsymbol{0} $.

Now all you need is to utilize a linear least squares solver which supports linear inequality constraints.
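As a concrete sketch of the reduction above, the quadratic program $\min_x \tfrac{1}{2}\|x\|_2^2$ subject to $Ax \le b$ can be solved with SciPy's general-purpose SLSQP solver (a dedicated QP solver would be preferable for large problems). The matrices `A` and `b` below are made-up example data, not from the question:

```python
import numpy as np
from scipy.optimize import minimize

# Small hypothetical instance: n = 2, so x = vec(X) has length n^2 = 4.
rng = np.random.default_rng(1)
n = 2
A = rng.standard_normal((n**2, n**2))   # example constraint matrix
b = rng.standard_normal(n**2)           # example right-hand side

# Objective 1/2 ||x||_2^2 (i.e., C = I, d = 0), with gradient x.
fun = lambda x: 0.5 * x @ x
jac = lambda x: x

# SLSQP expects inequality constraints in the form g(x) >= 0,
# so A x <= b is rewritten as b - A x >= 0.
cons = {"type": "ineq",
        "fun": lambda x: b - A @ x,
        "jac": lambda x: -A}

res = minimize(fun, x0=np.zeros(n**2), jac=jac,
               constraints=[cons], method="SLSQP")

X_opt = res.x.reshape(n, n)             # reshape back into the matrix X
assert np.all(A @ res.x <= b + 1e-6)    # constraints hold up to tolerance
```

If $b \ge 0$ componentwise, the unconstrained minimizer $x = 0$ is already feasible and optimal; otherwise the solver finds the point of minimum norm in the feasible polyhedron.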
