Maximization of quadratic form subject to a set constraint

constraints, hessian-matrix, linear-algebra, quadratic-forms

Given a quadratic form $x^TAx$, where the matrix $A$ is symmetric and $x^Tx = 1$, we can deduce that the maximum of the quadratic form is the largest eigenvalue of $A$ (and the corresponding eigenvector is the direction that maximizes it).
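(To see this, write the spectral decomposition $A = Q\Lambda Q^T$ and set $y = Q^Tx$, so that $y^Ty = x^Tx = 1$; then

$$ x^TAx = y^T\Lambda y = \sum_i \lambda_i y_i^2 \le \lambda_{\max}, $$

with equality when $x$ is the leading eigenvector.)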

What if instead I have a set constraint $x \in D$, where $D$ is a finite set of vectors? I want to choose the element of $D$ that maximizes the quadratic form for a given symmetric matrix $A$ (the elements of $D$ have unit norm, and $A$ is a Hessian matrix).

How can I find the element of $D$ that maximizes the quadratic form?

Thank you.

Best Answer

You can brute-force this problem directly. Given a function $f\colon\mathbb R^N \to\mathbb R$, $\theta\mapsto f(\theta)$, with Hessian $H=\frac{\partial^2 f}{\partial \theta^2}$, the quadratic form $x^T H x$ can be computed without explicitly forming $H$:

$$ x^T H x = x^T \frac{\partial^2 f}{\partial \theta^2}\, x = x^T\frac{\partial}{\partial \theta} \Big(\frac{\partial f}{\partial \theta}\cdot x\Big) $$

Once you implement $f$ in a framework that supports automatic differentiation, such as PyTorch or TensorFlow, you can compute this quantity efficiently via

      z = grad(f, theta).dot(x)    # directional derivative (df/dtheta) . x
    xHx = grad(z, theta).dot(x)    # second backward pass yields x^T H x
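
Concretely, here is a minimal runnable sketch in PyTorch (the function f, the dimension, and the candidate x below are placeholders for illustration):

    import torch

    def quad_form(f, theta, x):
        # x^T H x without materializing the Hessian: two backward passes
        g, = torch.autograd.grad(f(theta), theta, create_graph=True)
        z = g.dot(x)                          # (df/dtheta) . x, a scalar
        hx, = torch.autograd.grad(z, theta)   # H x
        return hx.dot(x)                      # x^T H x

    theta = torch.randn(10, requires_grad=True)  # hypothetical evaluation point
    f = lambda t: (t ** 4).sum()                 # any twice-differentiable scalar function
    x = torch.randn(10)
    x = x / x.norm()                             # unit-norm candidate
    print(quad_form(f, theta, x))

The create_graph=True in the first call is what lets the second backward pass differentiate through the gradient.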

At no point in this process is the matrix $H$ explicitly constructed. If you have enough RAM, you can also stack all the candidate vectors $x$ and evaluate everything in a single sweep (using an appropriate tensor contraction instead of the inner product .dot).
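
For example, something along those lines should be possible with torch.func from PyTorch 2.x, which can vectorize the per-candidate Hessian-vector products; the f, theta, and candidate stack X below are again hypothetical, and I have not benchmarked this:

    import torch
    from torch.func import grad, vmap

    # hypothetical setup: f, theta, and a stack X of K unit-norm candidates from D
    f = lambda t: (t ** 4).sum()
    theta = torch.randn(10)
    X = torch.randn(20, 10)
    X = X / X.norm(dim=1, keepdim=True)

    def xhx(x):
        z = lambda t: grad(f)(t).dot(x)   # directional derivative (df/dt) . x
        return grad(z)(theta).dot(x)      # second grad gives x^T H x

    vals = vmap(xhx)(X)        # quadratic forms for all K candidates at once
    best = X[vals.argmax()]    # the element of D maximizing x^T H x

With the quadratic forms computed in one vectorized call, the maximization over $D$ reduces to a single argmax.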