Global minimum of a nonlinear least-squares problem

convex optimization, convexity, nonlinear optimization, oc.optimization-and-control

We have a continuous, differentiable function $f(\cdot)$ that maps $R^n$ to $R^n$, and we are trying to solve the nonlinear least-squares problem:

Minimize $J(x)=\Vert f(x)-z\Vert^2$

subject to box constraints: $l_i \leq x_i \leq u_i$.

The function $f(\cdot)$ has the beautiful property that

$\forall x_1, x_2 \in R^n, x_1\neq x_2$, $(x_1-x_2)^T(f(x_1)-f(x_2))>0$.

If $n=1$, this property says $f(\cdot)$ is strictly increasing, and the least-squares problem is easily solved to a global minimum.

But in our real case, $n=10$. Using a standard quasi-Newton method together with gradient projection, we are able to solve this problem to a local minimum, and the local minima we find usually appear to be global. That is why we are investigating whether they are indeed globally optimal.
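For concreteness, the box-constrained quasi-Newton step can be reproduced with SciPy's L-BFGS-B. The map $f$ below (a symmetric positive-definite linear part plus a componentwise $\tanh$) is a toy strictly monotone stand-in of our own devising, not the real ten-dimensional $f$:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 10

# Toy strictly monotone map (an illustrative assumption, not the real f):
# f(x) = A x + tanh(x) with A symmetric positive definite.
M = rng.standard_normal((n, n))
A = M @ M.T + 2 * n * np.eye(n)

def f(x):
    return A @ x + np.tanh(x)

z = f(rng.uniform(-0.5, 0.5, n))   # target with a known preimage in the box
l, u = -np.ones(n), np.ones(n)     # box constraints

def J(x):
    r = f(x) - z
    return r @ r

def grad_J(x):
    # Jacobian of f is A + diag(sech^2(x)); gradient of J is 2 Jf^T (f(x) - z)
    Jf = A + np.diag(1.0 / np.cosh(x) ** 2)
    return 2.0 * Jf.T @ (f(x) - z)

res = minimize(J, x0=np.zeros(n), jac=grad_J,
               method="L-BFGS-B", bounds=list(zip(l, u)))
print(res.fun)  # near 0: the known preimage lies inside the box
```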

Global optimality of a local minimum usually requires the objective function $J(x)$ to be convex or pseudo-convex; it can still hold when $J(x)$ is quasi-convex. Since $f(\cdot)$ is a complicated map from $R^{10}$ to $R^{10}$, we ran numerical verifications based on random sampling and found that $J(x)$ appears to be neither convex nor pseudo-convex, but quasi-convex.
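The random-sampling verification can be sketched as follows: quasi-convexity means that $J$ on any segment never exceeds the larger of the two endpoint values. The toy monotone map $f$ below is again an illustrative assumption standing in for the real one:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10
M = rng.standard_normal((n, n))
A = M @ M.T + 2 * n * np.eye(n)      # makes f strictly monotone
f = lambda x: A @ x + np.tanh(x)     # toy stand-in for the real f
z = f(rng.uniform(-0.5, 0.5, n))
J = lambda x: np.sum((f(x) - z) ** 2)

violations = 0
for _ in range(10_000):
    x1 = rng.uniform(-1, 1, n)       # random points in the box
    x2 = rng.uniform(-1, 1, n)
    lam = rng.uniform()
    xm = lam * x1 + (1 - lam) * x2   # random point on the segment
    # Quasi-convexity: J on the segment never exceeds the endpoint max.
    if J(xm) > max(J(x1), J(x2)) * (1 + 1e-9):
        violations += 1
print(violations)                    # 0 on this toy instance
```

Of course, sampling can only refute quasi-convexity, never prove it, which is why we are after an analytical argument.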

So the questions are: 1) Can we prove, from the property of $f(\cdot)$ stated above, that $J(x)$ is quasi-convex? 2) If not, can we still solve this problem to global optimality via non-global optimization techniques?

Thanks!!!

Best Answer

Any function $f$ that satisfies $(x-y)^T(f(x)-f(y)) \ge 0$ is a monotone map; when it is additionally cyclically monotone, it is the subgradient map of a convex function. If $f$ is continuous and monotone, then the solution set $S=\{x:f(x)=0\}$ is convex.

The global solution of monotone equations is a well-studied problem that admits very specialized algorithms. One such algorithm, described by Solodov & Svaiter, is guaranteed to converge to a solution $x\in S$, provided one exists; moreover, if $f$ is differentiable, convergence is superlinear. More recent algorithms are closer to the nonlinear least-squares procedure you describe, and allow the system of equations to be solved without an explicit merit-function line search; see Zhou & Li.
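As a sketch of the hyperplane-projection idea behind these methods, here is a derivative-free variant in the spirit of Solodov & Svaiter's projection methods. The toy strictly monotone map in the demo is an illustrative assumption, and the box constraints are omitted; a constrained variant would additionally project each iterate onto the box.

```python
import numpy as np

def hyperplane_projection(f, x0, tol=1e-6, max_iter=50_000,
                          beta=1.0, rho=0.5, sigma=1e-4):
    """Derivative-free projection method for monotone equations f(x) = 0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = f(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx                          # simplest search direction
        t = beta
        while True:                      # backtracking line search
            y = x + t * d
            Fy = f(y)
            if -Fy @ d >= sigma * t * (d @ d):
                break
            t *= rho
        # Monotonicity gives f(y)^T (x* - y) <= 0 for every solution x*,
        # so the hyperplane {w : f(y)^T (w - y) = 0} separates x from
        # the solution set; project x onto it.
        x = x - ((Fy @ (x - y)) / (Fy @ Fy)) * Fy
    return x

# Demo on a toy strictly monotone map (an illustrative assumption).
rng = np.random.default_rng(0)
n = 10
M = rng.standard_normal((n, n))
A = M @ M.T + 2 * n * np.eye(n)
f = lambda x: A @ x + np.tanh(x)
z = f(rng.uniform(-0.5, 0.5, n))
g = lambda x: f(x) - z               # shifted map, still strictly monotone
x_hat = hyperplane_projection(g, np.zeros(n))
print(np.linalg.norm(g(x_hat)))      # small residual
```

Note that solving $f(x)=z$ fits this framework directly, since $g(x)=f(x)-z$ inherits the monotonicity of $f$.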

References

Solodov, Michael V., and Benar F. Svaiter. "A globally convergent inexact Newton method for systems of monotone equations." Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods. Springer US, 1998. 355-369.

Zhou, Wei-Jun, and Dong-Hui Li. "A globally convergent BFGS method for nonlinear monotone equations without any merit functions." Mathematics of Computation 77.264 (2008): 2231-2240.
