[Math] Which optimization method when Hessian is singular

Tags: nonlinear optimization, optimization

I am trying to optimize a nonlinear function of four variables whose Hessian matrix is always singular (pairs of columns/rows are collinear). I wanted to use a Newton method until I checked the determinant of the Hessian and found it to be zero. The gradient-based methods implemented in NLopt work by constructing local approximations of the function to be optimized: http://ab-initio.mit.edu/wiki/index.php/NLopt_Algorithms. They are:

  • MMA (Method of Moving Asymptotes) and CCSA
  • SLSQP
  • Low-storage BFGS
  • Preconditioned truncated Newton
  • Shifted limited-memory variable-metric

Is it worth investigating NLopt and calling it to solve my problem? For example, is the BFGS algorithm relevant when the Hessian is known to be singular? Will the positive-definite approximation of the Hessian be "close" to singular and therefore unusable?
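
For concreteness, one toy function with this kind of structure (an illustrative example, not my actual objective) is

$$f(x_1, x_2, x_3, x_4) = (x_1 - x_2)^2 + (x_3 - x_4)^2,$$

whose constant Hessian

$$\nabla^2 f = 2\begin{pmatrix} 1 & -1 & 0 & 0 \\ -1 & 1 & 0 & 0 \\ 0 & 0 & 1 & -1 \\ 0 & 0 & -1 & 1 \end{pmatrix}$$

has rank 2: columns 1 and 2 are collinear, as are columns 3 and 4, so the determinant is zero.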

Best Answer

I tried L-BFGS from the NLopt library and it converged much more quickly than my gradient-descent solver. So a singular Hessian does not imply that L-BFGS will not work.
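
For reference, here is a minimal sketch of calling L-BFGS through NLopt's C API on a toy objective of the kind described in the question; the specific objective, starting point, and tolerance are illustrative assumptions, not taken from the original problem.

```c
#include <stdio.h>
#include <nlopt.h>

/* Toy objective with an exactly singular Hessian:
   f(x) = (x0 - x1)^2 + (x2 - x3)^2.
   The Hessian has rank 2; columns 0/1 and 2/3 are collinear. */
static double objective(unsigned n, const double *x, double *grad, void *data)
{
    double a = x[0] - x[1];
    double b = x[2] - x[3];
    if (grad) {
        grad[0] =  2.0 * a;
        grad[1] = -2.0 * a;
        grad[2] =  2.0 * b;
        grad[3] = -2.0 * b;
    }
    return a * a + b * b;
}

int main(void)
{
    nlopt_opt opt = nlopt_create(NLOPT_LD_LBFGS, 4); /* L-BFGS, 4 variables */
    nlopt_set_min_objective(opt, objective, NULL);
    nlopt_set_xtol_rel(opt, 1e-8);               /* illustrative tolerance */

    double x[4] = {1.0, -1.0, 2.0, 0.5};         /* illustrative start point */
    double minf;
    if (nlopt_optimize(opt, x, &minf) < 0)
        printf("nlopt failed\n");
    else
        printf("minimum %g at (%g, %g, %g, %g)\n",
               minf, x[0], x[1], x[2], x[3]);

    nlopt_destroy(opt);
    return 0;
}
```

Note that L-BFGS never inverts the true Hessian: it builds a positive-definite quasi-Newton approximation from gradient differences, which helps explain why singularity of the true Hessian is not fatal here.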