Using the information from previous comments, the solution was as follows:
The problem is an optimization of a separable objective function: a sum of n terms, each depending on a single variable. Each term can be approximated by a polynomial of degree d. The problem data are stored in an m×2n matrix as pairs of (x, y) columns, where n is the number of separable functions and m is the number of sample points.
In my application, n = 100 and d = 3. The problem includes one linear constraint, sum(x) <= Qdlm, which is applied only when Qdlm is given. Each variable is also bounded below by zero (non-negative values) and starts from an initial value of 50.
The gradient and Hessian are computed analytically from the polynomial derivatives.
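For a cubic p(x) = a3*x^3 + a2*x^2 + a1*x + a0, the derivatives needed are p'(x) = 3*a3*x^2 + 2*a2*x + a1 and p''(x) = 6*a3*x + 2*a2. A small sketch (with a made-up coefficient vector) of how polyder produces these coefficient vectors in MATLAB:

```matlab
p = [2 -1 4 7];       % example cubic: 2x^3 - x^2 + 4x + 7
dp  = polyder(p);     % [6 -2 4]  -> coefficients of p'(x)
d2p = polyder(dp);    % [12 -2]   -> coefficients of p''(x)
```

Applying polyder twice is the reliable way to get the second derivative; the two-argument form polyder(a, b) differentiates the product a*b instead.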
These are my functions for this application:
d = 3;                                 % polynomial degree
n = 100;                               % number of separable functions
pexp = cell(n, 1);
for i = 1:n
    % columns 2*i-1 and 2*i hold the x and y samples of function i
    pexp{i} = polyfit(M(:, 2*i - 1), M(:, 2*i), d);
end
Optimization auxiliary function:
function [x, tx, fval, exitflag, elapsedTime] = optimization_nlcf(pexp, n, Qdlm)
if ~isempty(Qdlm)
    A = ones(1, n);                    % linear constraint: sum(x) <= Qdlm
    b = Qdlm;
else
    A = [];
    b = [];
end
x0 = 50*ones(n, 1);                    % starting point
lb = zeros(n, 1);                      % non-negativity bounds
fobj = @(x) optimization_nlc(x, pexp, n);
fhes = @(x, lambda) hessian_nlc(x, lambda, pexp, n);
options = optimoptions('fmincon', 'Algorithm', 'interior-point', ...
    'SpecifyObjectiveGradient', true, 'HessianFcn', fhes, ...
    'OptimalityTolerance', 1e-3, 'StepTolerance', 1e-3, 'Display', 'iter-detailed');
tic
[x, fval, exitflag] = fmincon(fobj, x0, A, b, [], [], lb, [], [], options);
elapsedTime = toc;
tx = sum(x);
My optimization function is:
function [fobj, grad] = optimization_nlc(x, pexp, n)
% Negated sum of polynomial values: fmincon minimizes, so maximizing the
% fitted polynomials means minimizing their negative
fobj = 0;
for i = 1:n
    fobj = fobj - polyval(pexp{i}, x(i));
end
if nargout > 1
    grad = zeros(n, 1);
    for i = 1:n
        grad(i) = -polyval(polyder(pexp{i}), x(i));   % first derivative
    end
end
My Hessian function:
function hess = hessian_nlc(x, ~, pexp, n)
h = zeros(n, 1);
for i = 1:n
    % Second derivative: apply polyder twice. Note that polyder(p, 2)
    % differentiates the product 2*p, which is NOT the second derivative.
    h(i) = -polyval(polyder(polyder(pexp{i})), x(i));
end
hess = spdiags(h, 0, n, n);            % sparse diagonal Hessian
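As a usage sketch, the pieces above can be wired together as follows. The data matrix M and the capacity value Qdlm here are synthetic, made up purely for illustration:

```matlab
% Synthetic data: n concave curves sampled at m points each
m = 20; n = 100; d = 3;
M = zeros(m, 2*n);
for i = 1:n
    xi = linspace(0, 100, m)';           % x samples in column 2*i-1
    M(:, 2*i - 1) = xi;
    M(:, 2*i) = -0.001*xi.^2 + (i/10)*xi; % y samples in column 2*i
end

% Fit one polynomial per separable function
pexp = cell(n, 1);
for i = 1:n
    pexp{i} = polyfit(M(:, 2*i - 1), M(:, 2*i), d);
end

Qdlm = 3000;                             % illustrative total-capacity limit
[x, tx, fval, exitflag, elapsedTime] = optimization_nlcf(pexp, n, Qdlm);
```

Passing Qdlm = [] instead drops the linear constraint, leaving only the non-negativity bounds.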
I hope this simple application helps others implement their own objective functions.