MATLAB: Calibration problem – how to use fminsearch

Tags: calibration, fminsearch

I am trying to do an automatic calibration to find an optimal value for f such that the simulated values (= 1 + a(t)*f) are "equal" to the observed values.
However, when I define a sum-of-squared-errors function and minimize it with fminsearch, the value obtained is clearly not optimal, as shown in the attached figure.
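For reference, that objective can be written directly as an anonymous function. This is a minimal sketch, assuming the a(t) series and the observations are stored as column vectors a and obs (the names and the starting guess of 1 are assumptions, not from the post):
sse  = @(f) sum((obs - (1 + a*f)).^2);   % sum of squared errors for a candidate f
fOpt = fminsearch(sse, 1);               % Nelder-Mead search from the assumed initial guess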
I have also tried fminsearch on the Nash-Sutcliffe efficiency, but this gave the same result, as did using lsqnonneg. I even tried fitting only the 10 percent highest values (since the scatter plots suggest a bias at the peaks), but this gave an even worse result.
I think the main problem is that there may not be an exact one-to-one match in time, i.e., peaks in the simulated series sometimes occur one time step earlier or later than in the observed series. How can I solve this? I tried minimizing the error between moving averages of the observed and simulated series to account for this possible time lag (see the sketch below), but again no better calibration was obtained, so I am running out of ideas.
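The moving-average variant could be set up the same way. A sketch under the same assumptions, with an arbitrarily chosen window length w (movmean requires R2016a or later):
w = 5;                                                           % smoothing window length (assumed value)
sseMA  = @(f) sum((movmean(obs, w) - movmean(1 + a*f, w)).^2);   % SSE on the smoothed series
fOptMA = fminsearch(sseMA, 1);                                   % same search on the smoothed objective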

Best Answer

It looks linear, with a zero intercept, so the nonlinear optimisation routines seem to be overkill.
I would estimate ‘f’ as:
f = x(:)\OBS(:);   % least-squares slope of OBS on x, with no intercept term
The (:) operations guarantee column vectors, which are necessary for the backslash operator to work in this application.
See if that works.
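One caveat: if x here is the a(t) series and the model really is OBS = 1 + a(t)*f, the known unit intercept should be moved to the left-hand side first so that the zero-intercept fit applies. A sketch under that assumption:
f = a(:)\(OBS(:) - 1);   % least-squares slope of (OBS - 1) on a, no intercept term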