[Math] Maximizing quadratic form on the hypercube

eigenvalues, non-convex-optimization, quadratic-forms

I want to maximize a quadratic form $\mathbf x^T\mathbf Q\mathbf x$ and also want to find out which vector $\mathbf x$ maximizes the quadratic form when

  1. $\mathbf Q$ is an $n\times n$ positive definite matrix, and
  2. $\mathbf x$ is a vertex of the $n$-dimensional hypercube $[-1,1]^n$; that is, $\mathbf x = (x_1,\cdots,x_n) \in \{-1,+1\}^n$, the hypercube being the convex hull of all such sign vectors.

In my setting, $\mathbf Q$ has only two types of eigenvalues:
$$
\text{eigenvalues of $\mathbf Q$} =
\{
\underbrace{1,\cdots,1}_{k\quad\text{copies}},
\underbrace{\frac{1}{n+1},\cdots,\frac{1}{n+1}}_{n-k \quad\text{copies}}
\}
$$
where $n$ is the size of the matrix $\mathbf Q$.

As $\mathbf Q$ is positive definite, the function $\mathbf x^T\mathbf Q\mathbf x$ is strictly convex, and hence its maximum over any polytope $\mathcal P$ is attained at a vertex of $\mathcal P$.
So I can formulate my optimization problem as a quadratic program:

$$
\begin{array}{rl}
\text{maximize} & \mathbf x^T\mathbf Q\mathbf x\\
\text{subject to} & -1 \leq x_i \leq 1 \quad\text{for $i\in\{1,\cdots,n\}$}
\end{array}
$$
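
As a sanity check, the sketch below brute-forces all $2^n$ vertices on a small instance. The Hadamard construction is my own convenient choice, not part of the problem data: it realizes the spectrum above with a top eigenspace spanned by sign vectors (and it requires $n$ to be a power of two), so the value $n$ is attained for this particular $\mathbf Q$.

```python
# Sanity-check sketch (my own construction, not part of the problem data):
# build Q with k eigenvalues 1 and n-k eigenvalues 1/(n+1) using a Hadamard
# eigenbasis, so the top eigenspace is spanned by sign vectors, then
# brute-force every vertex of the hypercube. Requires n to be a power of 2.
import itertools
import numpy as np
from scipy.linalg import hadamard

n, k = 4, 2
V = hadamard(n) / np.sqrt(n)               # orthonormal basis of sign vectors
lams = np.array([1.0] * k + [1.0 / (n + 1)] * (n - k))
Q = V @ np.diag(lams) @ V.T                # positive definite, as required

best_val, best_x = -np.inf, None
for signs in itertools.product((-1.0, 1.0), repeat=n):
    x = np.array(signs)
    val = float(x @ Q @ x)
    if val > best_val:
        best_val, best_x = val, x

print(best_val, best_x)                    # prints 4.0 (= n) and a maximizer
```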

My Goal

  • I want to prove that the maximum of $\mathbf x^T\mathbf Q\mathbf x$ is $n$, and
  • I want to find a vector $\mathbf x\in \{-1,+1\}^n$ that maximizes the quadratic form. (There can be many maximizers, but any one will do.)

Unfortunately, I don't have much background in quadratic programming.
There seem to be a handful of solution methods out there, such as interior point, active set, and gradient projection, just to name a few. But I don't know which one is the best fit for my overly simple quadratic program.

Question: Which solution method will be the best approach for my problem?


Intuition behind my guess

(In case you wonder why I'm guessing that the maximum of the quadratic form is $n$.)

As $\mathbf x$ is a vertex of the hypercube in my setting, I can regard the quadratic form as a scaled version of the Rayleigh quotient
$R(\mathbf Q, \mathbf x) =\frac{\mathbf x^T \mathbf Q \mathbf x}{\mathbf x^T \mathbf x}$:

\begin{equation*}
\mathbf x^T \mathbf Q \mathbf x
=
n \frac{\mathbf x^T \mathbf Q \mathbf x}{\mathbf x^T \mathbf x}
=
nR(\mathbf Q, \mathbf x)
\end{equation*}

since $\mathbf x\in\{-1,+1\}^n$ ensures that $\mathbf x^T\mathbf x = n$.

In this formulation, I know the maximum of the Rayleigh quotient because

  • $R(\mathbf Q, \mathbf x) \leq \lambda_{max}(\mathbf Q) = 1$, and
  • $R(\mathbf Q, \mathbf v) = \lambda_{max}(\mathbf Q) = 1$ when $\mathbf v$ is the dominant eigenvector of $\mathbf Q$; that is, $\mathbf v$ is an eigenvector of $\mathbf Q$ corresponding to the largest eigenvalue $\lambda_{max}(\mathbf Q)=1$.

Therefore, the maximum of the scaled Rayleigh quotient $\mathbf x^T \mathbf Q\mathbf x = n R(\mathbf Q, \mathbf x)$ over all nonzero $\mathbf x$ is $n$, so $n$ is an upper bound for my quadratic program.

But this approach does not guarantee that the dominant eigenvector $\mathbf v$ is a scaled version of a vector in $\{-1,+1\}^n$, so $\mathbf v$ itself cannot serve as an answer to my quadratic program.
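
That said, rounding $\mathbf v$ entrywise to signs at least produces a feasible vertex. The sketch below is my own heuristic with no optimality guarantee: it yields a lower bound on the maximum, and it matches the bound $n$ exactly when the top eigenspace contains that sign vector.

```python
# Heuristic sketch (my own assumption, not a guaranteed method): round the
# dominant eigenvector of Q to the nearest hypercube vertex. The result is
# always feasible, so x @ Q @ x is a lower bound on the true maximum.
import numpy as np

def sign_rounding(Q):
    w, U = np.linalg.eigh(Q)         # eigenvalues in ascending order
    v = U[:, -1]                     # dominant eigenvector
    x = np.where(v >= 0, 1.0, -1.0)  # nearest vertex of {-1, +1}^n
    return x, float(x @ Q @ x)
```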

Best Answer

This MAXQP problem is hard: it includes MAXCUT as a special case---see this paper by M. Charikar on MAXQP. Having $Q$ be positive definite does not really help (take the MAXQP problem in Charikar's paper and, since $x_i \in \{\pm1\}$ forces $\mathbf x^T\mathbf x = n$, add a suitable multiple of the identity to turn the problem with a symmetric matrix into one with a symmetric positive definite matrix; this only shifts the objective by a constant).

That said, chasing the cited paper and the papers that cite it, you should be able to find a wealth of literature to help you solve your problem approximately (including semidefinite programming relaxations).
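
For concreteness, here is a minimal sketch of one such semidefinite relaxation with random-hyperplane rounding; the cvxpy modeling and the rounding loop are my own illustrative choices, not a method prescribed by Charikar's paper. The idea is to relax $\mathbf x\mathbf x^T$ to a PSD matrix $X$ with unit diagonal, solve, and round.

```python
# Illustrative sketch (my own choices of library and rounding; not from the
# cited paper): lift x x^T to a PSD matrix X with diag(X) = 1, solve the SDP
# for an upper bound, then round random hyperplane projections to +/-1.
import cvxpy as cp
import numpy as np

def sdp_relax_and_round(Q, trials=100):
    n = Q.shape[0]
    X = cp.Variable((n, n), symmetric=True)
    prob = cp.Problem(cp.Maximize(cp.trace(Q @ X)),
                      [X >> 0, cp.diag(X) == 1])
    prob.solve()                              # prob.value >= true maximum
    w, U = np.linalg.eigh(X.value)            # factor X = L @ L.T
    L = U @ np.diag(np.sqrt(np.clip(w, 0.0, None)))
    best_x, best_val = None, -np.inf
    for _ in range(trials):
        x = np.sign(L @ np.random.randn(n))   # random hyperplane rounding
        x[x == 0] = 1.0                       # break ties toward +1
        val = float(x @ Q @ x)
        if val > best_val:
            best_x, best_val = x, val
    return best_x, best_val                   # feasible vertex, lower bound
```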
