Write the SDP form of the optimization problem

convex-optimization, optimization, semidefinite-programming

I want to write the SDP form of the optimization problem below, but I am not sure whether my reformulation is correct.

$$\begin{array}{rl}
\min\limits_{\mathbf F_k,\rho_k} & \sum\limits_{k=1}^K \operatorname{tr}(\mathbf F_k)\\
\text{s.t.} & 0\le\rho_k\le1\\
& \dfrac{\mathbf h_k^H \mathbf F_k \mathbf h_k}{\bar r_k}-\sum\limits_{j \neq k}\mathbf h_k^H \mathbf F_j \mathbf h_k \ge \sigma^2_{a_k}+\dfrac{\sigma^2_{d_k}}{\rho_k}\\
& \mathbf F_k\succeq0\\
& \sum\limits_{j=1}^{K}\mathbf h_k^H \mathbf F_j \mathbf h_k + \sigma^2_{a_k} \ge \dfrac{\hat P}{1-\rho_k}
\end{array}$$

And the SDP I wrote is
$$\begin{array}{rl}
\min\limits_{\mathbf F_k,\rho_k} & \sum\limits_{k=1}^K \mathbf f_k^H \mathbf f_k\\
\text{s.t.} & 0\le\rho_k\le1\\
& \dfrac{\mathbf h_k^H \mathbf F_k \mathbf h_k}{\bar r_k}-\sum\limits_{j \neq k}\mathbf h_k^H \mathbf F_j \mathbf h_k \ge \sigma^2_{a_k}+\dfrac{\sigma^2_{d_k}}{\rho_k}\\
& \mathbf F_k\succeq0\\
& \sum\limits_{j=1}^{K}\mathbf h_k^H \mathbf F_j \mathbf h_k + \sigma^2_{a_k} \ge \dfrac{\hat P}{1-\rho_k}
\end{array}$$

  1. Every $\mathbf h_k$ is a $4 \times 1$ vector.

  2. $\mathbf F_k = \mathbf f_k \mathbf f_k^H$, where $\mathbf f_k$ is a beamforming vector; every $\mathbf f_k$ is a $4 \times 1$ vector.

My reference for SDP is "Convex Optimization" by Boyd and Vandenberghe, Section 4.6.2 (pp. 168-169), but I don't know whether the SDP I wrote is right or wrong. Can anyone help me?

Best Answer

The first is already an SDP, but not an LMI (due to the nonlinear terms w.r.t. $\rho_k$). To get an LMI, you use a Schur complement to convert $A - B^T C^{-1} B \succeq 0$ (for $C \succ 0$) into $\begin{pmatrix}A & B^T\\B & C\end{pmatrix}\succeq 0$.
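As a sketch of how that applies here: write $a_k := \dfrac{\mathbf h_k^H \mathbf F_k \mathbf h_k}{\bar r_k}-\sum_{j \neq k}\mathbf h_k^H \mathbf F_j \mathbf h_k - \sigma^2_{a_k}$, which is affine in the $\mathbf F_j$. Then the SINR constraint $a_k \ge \sigma^2_{d_k}/\rho_k$ with $\rho_k>0$ is equivalent to the LMI

$$\begin{pmatrix} a_k & \sigma_{d_k}\\ \sigma_{d_k} & \rho_k \end{pmatrix}\succeq 0,$$

and the last constraint $\sum_{j=1}^{K}\mathbf h_k^H \mathbf F_j \mathbf h_k + \sigma^2_{a_k} \ge \hat P/(1-\rho_k)$ with $\rho_k<1$ becomes

$$\begin{pmatrix} \sum\limits_{j=1}^{K}\mathbf h_k^H \mathbf F_j \mathbf h_k + \sigma^2_{a_k} & \sqrt{\hat P}\\ \sqrt{\hat P} & 1-\rho_k \end{pmatrix}\succeq 0.$$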

The second is also an SDP, but a rank-constrained one. If you are asking for a tractable way to constrain $F$ to be a product $ff^H$ where $f$ is a vector, that is not possible: it is a nonconvex, extremely hard rank constraint. The whole idea here is that someone has started with a nonconvex quadratic program and relaxed it to a convex semidefinite program by dropping the condition $F = ff^H$.
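For illustration only, here is a minimal CVXPY sketch of the relaxed problem. All numerical data below (channels, noise powers, targets) are hypothetical placeholders, and `inv_pos` keeps the $1/\rho_k$ and $1/(1-\rho_k)$ terms in DCP form:

```python
import cvxpy as cp
import numpy as np

# Hypothetical problem data for illustration only (may be infeasible)
K, N = 3, 4                        # K users, N antennas (h_k is N x 1)
rng = np.random.default_rng(0)
h = [rng.standard_normal(N) + 1j * rng.standard_normal(N) for _ in range(K)]
r_bar = [2.0] * K                  # SINR targets \bar{r}_k
sig_a2 = [0.1] * K                 # antenna noise powers \sigma_{a_k}^2
sig_d2 = [0.05] * K                # processing noise powers \sigma_{d_k}^2
P_hat = 0.5                        # harvested-power threshold \hat{P}

# Relaxed variables: F_k Hermitian PSD, rank constraint F_k = f_k f_k^H dropped
F = [cp.Variable((N, N), hermitian=True) for _ in range(K)]
rho = cp.Variable(K)

cons = []
for k in range(K):
    # Affine quadratic forms h_k^H F_j h_k (real-valued for Hermitian F_j)
    qf = [cp.real(h[k].conj() @ F[j] @ h[k]) for j in range(K)]
    cons += [
        F[k] >> 0,
        rho[k] >= 0, rho[k] <= 1,
        # SINR constraint: affine LHS >= convex RHS (DCP-valid)
        qf[k] / r_bar[k] - sum(qf[j] for j in range(K) if j != k)
            >= sig_a2[k] + sig_d2[k] * cp.inv_pos(rho[k]),
        # Harvesting constraint: affine LHS >= convex RHS
        sum(qf) + sig_a2[k] >= P_hat * cp.inv_pos(1 - rho[k]),
    ]

prob = cp.Problem(cp.Minimize(sum(cp.real(cp.trace(F[k])) for k in range(K))), cons)
prob.solve(solver=cp.SCS)
print(prob.status, prob.value)
```

If the optimal $\mathbf F_k$ happens to be rank one, a beamformer $\mathbf f_k$ can be recovered from its scaled leading eigenvector; otherwise rank-reduction or randomization techniques are typically applied.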
