The minimum of the Frobenius norm in the intersection of positive semidefinite cones

Tags: eigenvector, linear algebra, matrices, oc.optimization-and-control

For scalar variables $x$, we have a simple solution for the following problem.
\begin{eqnarray}
\min_x && \alpha(x-a)^2+\beta(x-b)^2 \\
\mathrm{s.t.} && x\leq a \\
&& x\leq b
\end{eqnarray}
where $\alpha,\beta>0$.

The optimal solution $x=\min(a,b)$ is straightforward and independent of $\alpha,\beta$.
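As a quick numerical sanity check (my own illustration, not part of the original argument), the short numpy sketch below scans the feasible set for a few choices of $\alpha,\beta$ and confirms that the minimizer stays at $\min(a,b)$; the values of `a`, `b` and the weight pairs are arbitrary.

```python
import numpy as np

def scalar_objective(x, a, b, alpha, beta):
    """Objective of the scalar problem: alpha*(x-a)^2 + beta*(x-b)^2."""
    return alpha * (x - a) ** 2 + beta * (x - b) ** 2

a, b = 1.0, -2.0
for alpha, beta in [(1.0, 1.0), (0.1, 5.0), (7.0, 0.3)]:
    # Feasible set is x <= min(a, b); scan a grid of feasible points.
    grid = np.linspace(min(a, b) - 10.0, min(a, b), 10001)
    best = grid[np.argmin(scalar_objective(grid, a, b, alpha, beta))]
    print(alpha, beta, best)  # best is ~ min(a, b) = -2 in every case
```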

However, in the case of real symmetric matrix variables the problem seems much harder, because the relation $\leq$ in the constraints has to be replaced by the partial order $\preceq$, defined as
$$
A\preceq B\Leftrightarrow A-B\preceq0
$$
where $A\preceq0$ means $A$ is negative semi-definite.
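In practice, $A\preceq B$ can be tested numerically by checking that the largest eigenvalue of $A-B$ is nonpositive (up to a tolerance). A minimal sketch; the helper name `loewner_leq` is chosen here just for illustration.

```python
import numpy as np

def loewner_leq(A, B, tol=1e-10):
    """Return True if A ⪯ B, i.e. A - B is negative semidefinite.

    For symmetric input this is equivalent to the largest eigenvalue
    of A - B being <= 0 (up to a numerical tolerance).
    """
    diff = A - B
    diff = 0.5 * (diff + diff.T)          # symmetrize against round-off
    return np.linalg.eigvalsh(diff).max() <= tol

A = np.array([[0.0, 0.0], [0.0, -1.0]])
B = np.array([[1.0, 0.0], [0.0, 0.0]])
print(loewner_leq(A, B))   # True:  A - B = diag(-1, -1) ⪯ 0
print(loewner_leq(B, A))   # False: B - A = diag(1, 1) is positive definite
```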

The problem above is then reformulated with matrix variables as follows; assume that all matrices in this problem are real symmetric.
\begin{eqnarray}
\min_X && \alpha\|X-A\|_F^2+\beta\|X-B\|_F^2 \\
\mathrm{s.t.} && X\preceq A \\
&& X\preceq B
\end{eqnarray}
where $\alpha,\beta\geq0$.

If $A$ and $B$ can be diagonalized by the same orthogonal matrix $U$, the problem reduces to a problem on the eigenvalues, since
$$
\alpha\|X-A\|_F^2+\beta\|X-B\|_F^2=\alpha\|\tilde{X}-\Lambda\|_F^2+\beta\|\tilde{X}-\Theta\|_F^2
$$
and
\begin{eqnarray}
X\preceq A\Leftrightarrow\tilde{X}\preceq\Lambda \\
X\preceq B\Leftrightarrow\tilde{X}\preceq\Theta
\end{eqnarray}
where $\tilde{X}=U^\top XU$, $\Lambda=U^\top AU$, $\Theta=U^\top BU$.
Since $\Lambda$ and $\Theta$ are diagonal, the problem decomposes into a sum of independent scalar problems of the form above, one per diagonal entry, each solved separately.
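A small numpy sketch of this commuting case (my own illustration, not from the question): build $A$ and $B$ with a common eigenbasis, take the eigenvalue-wise minimum, and verify both constraints numerically.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Random common orthogonal eigenbasis U via QR of a random matrix.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
lam = rng.standard_normal(n)      # eigenvalues of A
theta = rng.standard_normal(n)    # eigenvalues of B
A = U @ np.diag(lam) @ U.T
B = U @ np.diag(theta) @ U.T

# Eigenvalue-wise solution in the common basis: x_i = min(lam_i, theta_i).
X = U @ np.diag(np.minimum(lam, theta)) @ U.T

# Feasibility: largest eigenvalues of X - A and X - B should be <= 0.
print(np.linalg.eigvalsh(X - A).max() <= 1e-10)  # True
print(np.linalg.eigvalsh(X - B).max() <= 1e-10)  # True
```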

However, how can the problem be solved when $A$ and $B$ have different eigenvectors? Is the solution still independent of $\alpha$ and $\beta$?

Could anyone help me with this question or give some suggestions? Thank you very much!

Best Answer

I looked at the problem again and saw that it can be simplified nicely.

In summary:

  1. If $\alpha, \beta > 0$, the solution is independent of $\alpha$ and $\beta$.
  2. The solution can be computed easily (a single eigendecomposition is needed).

Details

Introduce the variables $X=B+C$ and $Z=A-B$, and optimize over the symmetric matrix $C$ instead. With this change of variables, the problem becomes
\begin{eqnarray*}
\min_{C} && \alpha\|C-Z\|^2 + \beta\|C\|^2 \\
&& C \preceq Z,\quad C \preceq 0.
\end{eqnarray*}
Let $Z=U\Lambda U^T$, set $D=U^TCU$, and use the unitary invariance of the Frobenius norm to replace the problem by
\begin{eqnarray*}
\min_{D} && \alpha\|D-\Lambda\|^2 + \beta\|D\|^2 \\
&& D \preceq \Lambda,\quad D \preceq 0.
\end{eqnarray*}
At an optimum $D$ must be diagonal: replacing $D$ by its diagonal part keeps both constraints satisfied (the diagonal of a negative semidefinite matrix has nonpositive entries, so $\mathrm{diag}(D)\preceq 0$ and $\mathrm{diag}(D)-\Lambda=\mathrm{diag}(D-\Lambda)\preceq 0$) and removes the off-diagonal terms from the objective. The problem then splits entry-wise, and the scalar case gives
\begin{equation*}
d_{ii} = \min(0, \lambda_{ii}).
\end{equation*}
Thus, eventually, we obtain $X=B+UDU^T$ as the optimal answer.
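A minimal numpy sketch of this recipe, assuming the derivation above; the function name `closest_common_lower_bound` is mine, and the feasibility checks use an ad-hoc tolerance.

```python
import numpy as np

def closest_common_lower_bound(A, B):
    """Minimizer of alpha*||X-A||_F^2 + beta*||X-B||_F^2 s.t. X ⪯ A, X ⪯ B.

    Following the change of variables X = B + C, Z = A - B: eigendecompose
    Z, clip its eigenvalues at zero from above, and shift back by B.
    The result does not depend on alpha, beta (for alpha, beta > 0).
    """
    Z = A - B
    Z = 0.5 * (Z + Z.T)                    # symmetrize against round-off
    eigval, U = np.linalg.eigh(Z)
    D = np.diag(np.minimum(eigval, 0.0))   # d_ii = min(0, lambda_ii)
    return B + U @ D @ U.T

# Example with A, B that do NOT commute.
rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n)); A = 0.5 * (A + A.T)
B = rng.standard_normal((n, n)); B = 0.5 * (B + B.T)
X = closest_common_lower_bound(A, B)

# Feasibility check: X - A and X - B should be negative semidefinite.
print(np.linalg.eigvalsh(X - A).max() <= 1e-10)  # True
print(np.linalg.eigvalsh(X - B).max() <= 1e-10)  # True
```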
