[Math] Minimizing quadratic function subject to quadratic equality constraint

convex-optimization, non-convex-optimization, optimization, qcqp

Given an $N \times N$ positive (semi)definite matrix $\mathbf{A}$, a vector $\mathbf{b} \in \Bbb C^N$, and a scalar $c > 0$,

$$\begin{array}{ll} \underset{\mathbf{x} \in \mathbb{C}^N}{\text{minimize}} & \mathbf{x}^H\mathbf{A}\mathbf{x} + 2 \Re\left\{ \mathbf{b}^H \mathbf{x} \right\}\\ \text{subject to} & \mathbf{x}^H\mathbf{x}=c\end{array}$$

How do I solve this optimization problem?

Best Answer

This is not a convex optimization problem as written, because $x^Hx=c$ is not a convex constraint. It describes the surface of a hypersphere, which is not convex.

The constraint $x^Hx\leq c$, on the other hand, would be convex. If that relaxation is acceptable to you, then just solve that instead; it's a simple convex problem (a quadratic objective over a ball) that standard solvers handle. If you need equality, then one thing you can do is solve the convex form first. If its global minimum happens to achieve $x^Hx=c$, then you're done; the solution is the same for your problem and for the convex, relaxed problem (see the sketch below for a concrete version of that check). Otherwise, you need to do something else...
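For concreteness, here is a minimal numerical sketch of that check, assuming $A$ is Hermitian positive definite so the unconstrained minimizer $x_0=-A^{-1}b$ exists (the function name and structure are mine, purely illustrative):

```python
import numpy as np

def relaxed_minimizer(A, b, c):
    """Minimal sketch of the 'solve the relaxation first' check.

    Assumes A is Hermitian positive definite, so the unconstrained
    minimizer of x^H A x + 2 Re{b^H x} is x0 = -A^{-1} b.
    """
    x0 = -np.linalg.solve(A, b)
    nrm2 = np.real(np.vdot(x0, x0))          # x0^H x0
    if nrm2 <= c:
        # x0 is feasible for x^H x <= c, hence it solves the relaxation.
        # It also solves the equality-constrained problem only in the
        # knife-edge case nrm2 == c.
        return x0, np.isclose(nrm2, c)
    # Otherwise the relaxed optimum lies on the boundary x^H x = c and
    # coincides with the equality-constrained solution; use the
    # lambda-search described below to find it.
    return None, False
```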

What else can you do? Well, it turns out that the $x^Hx \leq c$ version of the problem can be solved globally even if $A$ is not positive semidefinite, in which case the problem is non-convex. It is known as the trust region subproblem and has been studied extensively; Google is your friend here. So I think what you may be able to do is adapt the standard Lagrange multiplier technique, used for the trust region subproblem, to your case. The Lagrangian is

$$L(x,\lambda) = x^HAx + 2\,\Re\{b^Hx\} - \lambda( c - x^H x )$$

where $\lambda$ is the Lagrange multiplier. The optimality conditions are

$$\begin{gathered} 2 ( A + \lambda I ) x + 2 b = 0 \\ x^H x = c \end{gathered}$$ So when $A+\lambda I$ is invertible, $$x=-(A+\lambda I)^{-1}b, \quad x^Hx = \| (A+\lambda I)^{-1} b \|_2^2$$ Now you're reduced to searching over the scalar variable $\lambda$ for the value which achieves $x^Hx=c$. Once you have that, then you have your $x$.
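If you want to see that scalar search concretely, here is a NumPy sketch (the function name is mine; it assumes the trust-region "easy case," i.e. $b$ has a nonzero component along an eigenvector of $A$'s smallest eigenvalue, so that $\|(A+\lambda I)^{-1}b\|_2^2$ actually crosses $c$ on $(-\lambda_{\min}(A),\infty)$):

```python
import numpy as np

def norm_constrained_min(A, b, c, tol=1e-12, max_iter=200):
    """Sketch of the scalar search over lambda described above.

    Solves x = -(A + lam*I)^{-1} b with lam chosen so that x^H x = c.
    Assumes A is Hermitian and that b is not orthogonal to the eigenspace
    of A's smallest eigenvalue (the trust-region 'easy case').
    """
    w, U = np.linalg.eigh(A)                 # A = U diag(w) U^H, w ascending
    beta2 = np.abs(U.conj().T @ b) ** 2      # |u_i^H b|^2

    def norm_sq(lam):
        # ||(A + lam*I)^{-1} b||^2 = sum_i |u_i^H b|^2 / (w_i + lam)^2
        return float(np.sum(beta2 / (w + lam) ** 2))

    lo = -w[0] + 1e-12                       # just right of the pole at -lambda_min
    hi = lo + 1.0
    while norm_sq(hi) > c:                   # grow the bracket until ||x||^2 <= c
        hi = lo + 2.0 * (hi - lo)

    for _ in range(max_iter):                # norm_sq is strictly decreasing here,
        mid = 0.5 * (lo + hi)                # so plain bisection suffices
        if norm_sq(mid) > c:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol * max(1.0, abs(hi)):
            break

    lam = 0.5 * (lo + hi)
    x = -np.linalg.solve(A + lam * np.eye(len(b)), b)
    return x, lam
```

The eigendecomposition is just a convenience: it makes each evaluation of $x^Hx$ as a function of $\lambda$ an $O(N)$ sum, and since that function is strictly decreasing to the right of $-\lambda_{\min}(A)$, bisection (or Newton's method on the so-called secular equation) converges quickly.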

Now just to be clear, I don't have an existence proof for you. If you successfully find a $\lambda$, then you should be good. I just don't know if this will always work. For the standard trust region subproblem, it does.

EDIT: I realized that perhaps you're daunted by the complex variables here. Well, there's no need to be; this is easily translated to a real equivalent:

$$\begin{array}{ll} \underset{\bar{x}}{\text{minimize}} & \bar{x}^T \bar{A} \bar{x} + 2 \bar{b}^T \bar{x} \\ \text{subject to} & \bar{x}^T\bar{x} = c \end{array}$$ where $$\bar{x}\triangleq\begin{bmatrix} \Re\{x\} \\ \Im\{x\} \end{bmatrix} \quad \bar{A}\triangleq\begin{bmatrix} \Re\{A\} & -\Im\{A\} \\ \Im\{A\} & \Re\{A\} \end{bmatrix} \quad \bar{b}\triangleq\begin{bmatrix} \Re\{b\} \\ \Im\{b\} \end{bmatrix} $$
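If you want to convince yourself of this mapping numerically, here is a throwaway sketch with random data (nothing here is specific to your problem; it just verifies that the real and complex objectives and constraints agree):

```python
import numpy as np

# Quick numerical sanity check of the real reformulation above.
rng = np.random.default_rng(0)
N = 4
M = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
A = M @ M.conj().T                                   # Hermitian test matrix
b = rng.standard_normal(N) + 1j * rng.standard_normal(N)
x = rng.standard_normal(N) + 1j * rng.standard_normal(N)

A_bar = np.block([[A.real, -A.imag],
                  [A.imag,  A.real]])
b_bar = np.concatenate([b.real, b.imag])
x_bar = np.concatenate([x.real, x.imag])

obj_complex = np.real(x.conj() @ A @ x) + 2 * np.real(b.conj() @ x)
obj_real = x_bar @ A_bar @ x_bar + 2 * b_bar @ x_bar
assert np.isclose(obj_complex, obj_real)              # objectives agree
assert np.isclose(np.vdot(x, x).real, x_bar @ x_bar)  # constraints agree
```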
