Minimize trace($MX$) with $M$ rank-deficient and $X$ positive semidefinite

convex-optimization, linear-algebra, matrices, relaxations, semidefinite-programming

I have an optimization problem of the following form:

$$\min_{X\succeq0} \mathrm{trace\;} MX$$

under the linear constraint $\mathrm{diag}(X) = \mathrm{Id}$ and the non-convex constraint $\mathrm{rank}(X) = 1$. The matrix $M$ is square and rank-deficient.

The convex relaxation of this problem corresponds to dropping the rank-1 constraint, and merely keeping $X$ positive semidefinite.

I tried running a standard SDP solver (Mosek) on this problem, but it yields a matrix $X$ which, despite satisfying the linear constraint and being positive semidefinite, is not of rank one. Instead, it is typically of rank $n - \mathrm{rank}\, M$, where $n$ is the number of rows of $X$.
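For reference, here is a minimal sketch that reproduces the setup (illustrative sizes and a synthetic $M$; it uses cvxpy with whatever default SDP solver is installed rather than Mosek, so exact numbers will differ):

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, r = 8, 3                        # r = rank of M; sizes are illustrative
A = rng.standard_normal((n, r))
M = A @ A.T                        # synthetic symmetric, rank-deficient M (PSD here for simplicity)

X = cp.Variable((n, n), PSD=True)  # relaxation: X PSD, rank constraint dropped
problem = cp.Problem(cp.Minimize(cp.trace(M @ X)),
                     [cp.diag(X) == 1])   # diag(X) = Id
problem.solve()

# Numerical rank of the relaxed optimum
eigvals = np.linalg.eigvalsh(X.value)
num_rank = int(np.sum(eigvals > 1e-6 * eigvals.max()))
print("numerical rank of relaxed solution:", num_rank)  # typically n - rank(M), not 1
```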

Can you explain why I am getting this result?

Best Answer

You removed the rank-one constraint, so there is no reason for the solver to return a rank-one matrix. However, for your problem you may not need a solver at all.

Let $M$ be an $n\times n$ matrix. Observe that $X\succeq 0$ and $\mathrm{rank}(X)=1$ imply that $X=xx^T$ for some vector $x\in\mathbb{R}^n$, and then $\mathrm{trace}\; MX=x^TMx$. Also, $\mathrm{diag}(X)=\mathrm{Id}$ implies $x_i^2=1$, where $x=[x_1,\dots,x_n]^T$. Thus, your optimization problem becomes
$$\min_{x\in\mathbb{R}^n}\; x^TMx \quad \text{s.t.}\quad x_i\in\{-1,1\}\ \ \forall i.$$

Assume $M$ is symmetric; this loses no generality, since replacing $M$ by $\tfrac{1}{2}(M+M^T)$ leaves $x^TMx$ unchanged. Then $M$ has an eigendecomposition
$$M=\sum_{i=1}^{n}\lambda_i v_i v_i^T$$
where $\lambda_1\leq\dots\leq\lambda_n$ are the eigenvalues and the $v_i$ are corresponding orthonormal eigenvectors. Thus, we have
$$x^TMx=\sum_{i=1}^{n}\lambda_i(x^Tv_i)^2.$$
Also, note that, since the $v_i$ are orthonormal and each $x_i^2=1$, we have $\sum_{i=1}^{n}(x^Tv_i)^2=\|x\|^2=n$. Given this, can you try to show that $x=\mathrm{sign}(v_1)$? Here $\mathrm{sign}(\cdot)$ is $+1$ or $-1$ according to whether the corresponding entry of the argument vector is positive or negative.
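As a sanity check on this hint, here is a small numpy sketch (synthetic $M$, illustrative sizes): it rounds the eigenvector of the smallest eigenvalue to a $\pm1$ vector and, for small $n$, compares the result against a brute-force search over all sign patterns.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, r = 8, 3                         # small n so brute force below is feasible
A = rng.standard_normal((n, r))
M = A @ A.T                         # synthetic symmetric, rank-deficient M

eigvals, eigvecs = np.linalg.eigh(M)  # eigenvalues in ascending order
v1 = eigvecs[:, 0]                    # eigenvector of the smallest eigenvalue
x = np.sign(v1)
x[x == 0] = 1                         # break ties for zero entries arbitrarily
print("sign(v_1) rounding:", x @ M @ x)

# Exhaustive search over {-1, 1}^n for comparison (exponential; small n only).
best = min(np.array(s) @ M @ np.array(s)
           for s in itertools.product((-1, 1), repeat=n))
print("brute-force optimum:", best)
```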
