Bad fitting result using lsqcurvefit

Hi all,
I am trying to fit some data using lsqcurvefit. The following is my code:
Data = ...
[2.00734170000 0.864877830000
2.16226580000 0.686643940000
2.31458660000 0.552897860000
2.50475180000 0.426715470000
2.69761580000 0.331211350000
3.01613550000 0.219841800000
3.30361980000 0.150251110000
3.50929470000 0.111653630000
3.78701120000 0.068782100000];
t = Data(:,1);
y = Data(:,2);
plot(t, y, 'ro');
F = @(x,xdata)x(1)*(1-xdata/x(2)).^x(3);
x0 = [5 5 5];
[x, resnorm, exitflag, output] = lsqcurvefit(F,x0,t,y);
hold on
plot(t, F(x,t))
hold off
This code was written following the example code, but instead of the fitted parameters I always get the following message:
Local minimum found.
Optimization completed because the size of the gradient is less than the default value of the function tolerance.
Also, different initial guesses of x0 can give quite different results. Could someone give me some suggestions on this?
Thank you, Qingjie
  2 Comments
Matt J on 28 Mar 2014
The code you've shown with x0=[5 5 5] produces what looks like a very good fit. What don't you like about it?
Qingjie on 28 Mar 2014
Yeah, this guess works well, but if I try other guesses it turns out badly. As Star Strider said, this is a heuristic exploration. I am not sure whether there is a method that can avoid this exploration of the initial guess.


Accepted Answer

Star Strider on 28 Mar 2014
It’s working as it should. The semicolon (;) at the end of the lsqcurvefit line suppresses the output; without it, you would see:
x =
13.6296e+000 3.0383e+003 4.1870e+003
There are probably several local minima, and the function will converge on the local minimum closest to the initial parameter estimates. If you know the parameters should be in specific ranges, you can specify those ranges in the arguments lb and ub.
Nonlinear regression can be a heuristic exploration. It can sometimes take a few different starting estimates to get meaningful final parameter estimates.
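For example, here is a minimal sketch of a bounded call; the bounds below are only illustrative placeholders, not values taken from this problem:
lb = [0 0 0];                                     % illustrative lower bounds (assumed)
ub = [100 1e4 1e4];                               % illustrative upper bounds (assumed)
[x, resnorm] = lsqcurvefit(F, x0, t, y, lb, ub);  % lb/ub are the documented bound arguments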

More Answers (3)

Matt J on 28 Mar 2014
Edited: Matt J on 28 Mar 2014
As Star Strider said, this is a heuristic exploration. I am not sure whether there is a method that can avoid this exploration of the initial guess.
For the model you've shown, you can take some of the heuristics out of it by analyzing the log of your model function log(F)
logF = log(x1) + x3*log(1 - xdata/x2);
Since log(F) depends linearly on log(x1) and x3, the least squares estimates of those parameters are determined once x2 is fixed. You can therefore write the least squares function in terms of a single variable x2 as follows,
function [logF,x] = logFlsq(x2,t,y)
% For a fixed x2, solve the linear least squares problem in log(x1) and x3,
% returning the residual norm of the log-model fit and the untransformed parameters.
Q = [ones(size(t)), log(1 - t(:)/x2)];   % design matrix: columns for log(x1) and x3
z = Q\log(y);                            % closed-form linear least squares solution
logF = norm(Q*z - log(y));               % residual norm as a function of x2
% Untransform the parameters
x(1) = exp(z(1));
x(2) = x2;
x(3) = z(2);
end
If you plot this over a range of x2,
logF=@(x2) arrayfun(@(z) logFlsq(z,t,y),x2);
x2=linspace(3,100,1e4);
plot(x2,logF(x2));
you will see that it is minimized at about x2=12. The fit becomes very insensitive to x2 at that point.
By calling logFlsq() with 2 output arguments, you can find the minimizing values for x1 and x3 corresponding to x2=12,
>> [~,x0]=logFlsq(12,t,y),
x0 =
8.3133 12.0000 12.5592
Feeding this as your initial guess x0 and your original F(x) to lsqcurvefit leads to a very different result
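For reference, the call here is simply the original one restarted from the new x0 (a minimal sketch):
x = lsqcurvefit(F, x0, t, y)   % no semicolon, so the result is displayed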
x =
13.2006 457.1577 622.1261
but not a very different result for the curve. Both x and x0 fit the data very closely, at least visually
plot(t, F(x,t),t,F(x0,t),'-.',t,y,'*'),
legend('Final Fit','Initial Fit', 'Data');
So, it appears that the fitting problem is highly ill-conditioned.
  1 Comment
Matt J on 28 Mar 2014
Edited: Matt J on 28 Mar 2014
So, it appears that the fitting problem is highly ill-conditioned.
One reason this might happen is if your data really is of the form exp(-t), or approximately so. Then, because of the identity
exp(-t) = lim_{n -> inf} (1 - t/n)^n
the fit will be optimized with x(1) = 1 and x(2) = x(3) -> infinity.
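A quick numeric illustration of that limit (the values of t and n below are just for demonstration):
t_chk = linspace(0, 4, 9);
n = 1e4;
max(abs(exp(-t_chk) - (1 - t_chk/n).^n))   % already small, and it shrinks further as n grows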



John D'Errico on 28 Mar 2014
The example I use is to compare optimization schemes to a blind person, set down on the earth in a random place, and asked to find the lowest spot on earth. (Yes, I'll allow them scuba equipment if needed.)
My blind person is capable only of searching the local vicinity using a cane, walking in a downhill direction at any point in time.
Now, suppose I were to put them down at an arbitrary place on earth. Eventually, through diligent search and sometimes a long walk, they will come to a point where it is impossible to go any lower. In any direction they look, the ground slopes uphill, so they are done. A local minimum has been found. Do you really expect them to be able to find the lowest spot? Do you expect the solution found this way to be consistent when they are placed in another spot to start? Of course not!
In general, GOOD starting values are imperative. And there are better schemes to use for certain problems. I would suggest, for example, that a good scheme might be my own fminspleas, as found on the File Exchange. It reduces the search space to a set of only two nonlinear parameters, so you need not choose a starting value for x(1).
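If it helps, here is a rough sketch of how such a call might look, assuming the File Exchange calling form [INLP,ILP] = fminspleas(funlist, NLPstart, xdata, ydata); check the submission's documentation before relying on it, and note the starting values here are only illustrative:
% x1 is the linear coefficient; x2 and x3 are the nonlinear parameters
funlist = {@(c,xdata) (1 - xdata/c(1)).^c(2)};   % model basis, scaled by the linear x1
nlp0 = [12 12];                                  % illustrative starting guesses for [x2 x3]
[INLP, ILP] = fminspleas(funlist, nlp0, t, y);
x_est = [ILP(1), INLP(1), INLP(2)]               % assembled as [x1 x2 x3]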
Finally, I'm not at all sure why it is that you think this to be a good model for your problem, but that is your issue, not one I can help with at this point.
  1 Comment
Matt J on 28 Mar 2014
Edited: Matt J on 28 Mar 2014
It reduces the search space to a set of only two nonlinear parameters so you need not choose a starting value for x(1).
If you apply fminspleas to the log() of the model function, it reduces the search space to only the single parameter x(2). The result can then be used as an initial guess when applying whatever fitting tool you like to the original model function.
The initial guess for x(2) can be obtained by evaluating/searching the reduced objective function over a range of x(2) values (see my Answer).
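A rough sketch of that idea, under the same assumed fminspleas syntax as in the previous answer (illustrative only; verify against the File Exchange documentation):
% log(F) = log(x1) + x3*log(1 - xdata/x2): linear in log(x1) and x3, nonlinear only in x2
funlist = {@(c,xdata) ones(size(xdata)), @(c,xdata) log(1 - xdata/c(1))};
[INLP, ILP] = fminspleas(funlist, 12, t, log(y));   % fit log(y); x2 started at 12 (illustrative)
x0 = [exp(ILP(1)), INLP(1), ILP(2)]                 % untransform to [x1 x2 x3]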



Alex Sha on 6 Sep 2019
Packages such as Baron, LingoGlobal, Antigone, and 1stOpt don't need starting-value guesses any more, and they give much better outcomes. Have a try.
