MATLAB: Optimization with very flat objective function

Tags: gradient, lsqnonlin, Optimization Toolbox

I'm attempting to optimize a smooth, unconstrained function with 77 unknown parameters. I have vectorized the residual calculation across observations and am using lsqnonlin to solve the problem.
tic
% Legacy option names via optimset; levenberg-marquardt requires empty bounds.
opt = optimset('TolX', 1e-6, 'TolFun', 1e-6, ...
    'MaxFunEvals', 100000, 'MaxIter', 5000, ...
    'DiffMinChange', 1e-2, ...
    'Algorithm', 'levenberg-marquardt', 'UseParallel', false);
f = @(parameters) ProbDiff(parameters, y, X);   % returns the residual vector
[EstBetaLS, resnorm, residual] = lsqnonlin(f, Beta_init, [], [], opt);
toc
I run this optimization on data I simulated by setting all unknown parameters equal to 1.
I currently get the following message when running the optimization: "Local minimum found. Optimization completed because the size of the gradient is less than 1e-4 times the selected value of the function tolerance."
However, the parameter values found are very sensitive to the initial guess and never converge to the true value of 1. This leads me to believe the objective function is very flat.
What can I do to address this problem? In particular, will supplying the gradient help?
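If it helps, my understanding is that supplying the gradient here would mean returning the Jacobian of the residual vector from ProbDiff as a second output and turning on the corresponding option, roughly like this (just a sketch; the Jacobian computation itself depends on my model):
% Sketch: assumes ProbDiff can be modified to return [F, J],
% where J(i,k) = dF_i / dparameters(k).
opt = optimset(opt, 'Jacobian', 'on');          % tell lsqnonlin a Jacobian is supplied
f = @(parameters) ProbDiff(parameters, y, X);   % now returns [F, J]
[EstBetaLS, resnorm, residual] = lsqnonlin(f, Beta_init, [], [], opt);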

Best Answer

You can try scaling the function by multiplying it by a large value. You can also try different option values, such as a tighter TolFun. And, although you said that you vectorized the calculation, you did not set the Vectorized option.
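For example, a scaled run might look roughly like this (the factor 1e3 and the tightened tolerances are only illustrative, not recommendations for your particular problem):
scale = 1e3;                                    % illustrative scale factor
f_scaled = @(parameters) scale * ProbDiff(parameters, y, X);
opt2 = optimset(opt, 'TolX', 1e-10, 'TolFun', 1e-10);
% Note: resnorm is now scale^2 times the unscaled sum of squares.
[EstBetaLS, resnorm, residual] = lsqnonlin(f_scaled, Beta_init, [], [], opt2);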
You didn't say whether the result has a small residual. It is possible that your problem has many points that lead to essentially the same residual, so that your simulation value of ones(1,77) is not the unique solution.
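One rough way to check (a sketch using the standard lsqnonlin outputs; the singular-value ratio is only a heuristic for flat directions):
[EstBetaLS, resnorm, residual, exitflag, output, lambda, J] = ...
    lsqnonlin(f, Beta_init, [], [], opt);
resnorm                     % near zero means the fit itself is good
s = svd(full(J));           % lsqnonlin returns the Jacobian as a sparse matrix
s(end)/s(1)                 % a tiny ratio suggests nearly flat, poorly identified directions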
Alan Weiss
MATLAB mathematical toolbox documentation