curve fitting exponential function with two terms
Update: I need help fitting this set of points nicely with a two-term exponential function.
% Curve Fit
x = [6500 6350 6000 5400 4500];
y = [0 0.25 0.5 0.75 1.0];
theFit=fit(x' , y', 'exp2')
10 Comments
Matt J
on 11 Nov 2023
You already asked essentially the same question and accepted an Answer here,
This does not look exponential to me. Or did you mean x is a function of y? (That doesn't look exponential, either.)
Also, fitting two terms of an exponential, with only 5 data points, seems very ill-advised.
Can you tell us exactly what functional form you want to use?
x = [6500 6350 5800 4900 4500];
y = [0 0.25 0.5 0.75 1.0];
scatter(x,y)
Matt J
on 11 Nov 2023
Looks appropriate to me...
Use a different variable name here,
exp = [6500 6350 5800 4900 4500];
otherwise, you cannot use exp() as the exponential function later.
Additionally, to obtain curveY, it is sufficient to do,
curveY = theFit(curveX);
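Both points can be sketched together like this (the names cost and utility are illustrative; assumes the Curve Fitting Toolbox):

```matlab
% Rename the data vector so the built-in exp() is not shadowed.
cost = [6500 6350 5800 4900 4500];
utility = [0 0.25 0.5 0.75 1.0];
theFit = fit(utility', cost', 'exp2');   % fit cost as a function of utility
curveX = linspace(0, 1);
curveY = theFit(curveX);                 % cfit objects can be evaluated directly
plot(curveX, curveY, utility, cost, '*')
```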
Image Analyst
on 11 Nov 2023
Moved: Matt J
on 11 Nov 2023
Did you reverse your x and y???
Briana Canet
on 11 Nov 2023
Moved: Matt J
on 11 Nov 2023
Sorry, but I don't believe you. When I swap x and y, the fit looks great.
% Curve Fit
y = [6500 6350 6000 5400 4500];
x = [0 0.25 0.5 0.75 1.0];
theFit=fit(x' , y', 'exp2')
plot(theFit , x , y)
Answers (3)
You should normalize your x data
% Curve Fit
x = [6500 6350 6000 5400 4500];
x=(x-mean(x))/std(x);
y = [0 0.25 0.5 0.75 1.0];
Also, I would recommend downloading fminspleas from the File Exchange
and using it to generate an initial guess for fit():
e = @(a,xd) exp(a*xd);
flist = {@(p,xd) e(p(1),xd), @(p,xd) e(p(2),xd)};
[bd, ac] = fminspleas(flist, [-1,1], x, y);   % nonlinear rates, then linear coefficients
theFit = fit(x', y', 'exp2', 'StartPoint', [ac(1), bd(1), ac(2), bd(2)])
plot(theFit,x,y)
5 Comments
Matt J
on 11 Nov 2023
Well, you can undo the normalization after the fit is performed. It's a 1-1 change of variables.
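The change of variables can be written out explicitly. With z = (x - mu)/s, a term a*exp(b*z) equals a*exp(-b*mu/s) * exp((b/s)*x), so the de-normalized coefficients are b' = b/s and a' = a*exp(-b'*mu). A quick numerical check (mu, s, a, b are illustrative values, not fitted ones):

```matlab
mu = 5750; s = 800;     % illustrative normalization constants
a = 2; b = -0.5;        % illustrative coefficients fitted in z-space
bp = b/s;               % de-normalized rate
ap = a*exp(-bp*mu);     % de-normalized amplitude (uses the updated rate)
x = 4500:500:6500;
z = (x - mu)/s;
max(abs(a*exp(b*z) - ap*exp(bp*x)))   % agrees to round-off
```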
Briana Canet
on 12 Nov 2023
x = [6500 6350 6000 5400 4500];
xt = (x-mean(x))/std(x);                      % z-score normalization
y = [0 0.25 0.5 0.75 1.0];
theFit = fit(xt', y', 'exp2')
theFit.b = theFit.b/std(x);                   % undo the scaling of the rate
theFit.a = theFit.a*exp(-theFit.b*mean(x));   % undo the shift (uses updated b)
theFit.d = theFit.d/std(x);
theFit.c = theFit.c*exp(-theFit.d*mean(x));
theFit
plot(theFit,x,y)
Matt J
on 12 Nov 2023
I mentioned in the comments that I needed to change my points (noticed an error in my work).
All answers in this thread have been demonstrated using your new points.
You can also use fit()'s normalizer,
x = [6500 6350 6000 5400 4500];
y = [0 0.25 0.5 0.75 1.0];
theFit=fit(x',y','exp2','Normalize','on')
plot(theFit,x,y)
2 Comments
Briana Canet
on 12 Nov 2023
Edited: Briana Canet
on 12 Nov 2023
If you need to explicitly manipulate the coefficients and fit function, you'll have to do the normalization manually:
% Curve Fit
x = [6500 6350 6000 5400 4500];
xmu=mean(x);
xstd=std(x);
y = [0 0.25 0.5 0.75 1.0];
theFit=fit((x-xmu)'/xstd,y','exp2');
% Monthly Cost
cost = x;
costUtility = y;
% Plot Utility Points
figure;
plot(cost,costUtility,'*');
xlim([4500 6500]);ylim([0 1.25]);
yticks([costUtility 1.25]);
grid on;
xlabel('Monthly Cost ($)');
ylabel('Utility');
legend('Utility Points');
% Add utility curve fit
coeff=num2cell(coeffvalues(theFit));
[a,b,c,d]=deal(coeff{:});
curveX = linspace(4500,6500);
X=(curveX-xmu)/xstd;
curveY = a*exp(b*X) + c*exp(d*X);
hold on;
plot(curveX,curveY,'Color','b');
legend('Utility Points','Utility Curve Fit');
Taking the fitting function as y = a*exp(b*x) + c*exp(d*x)
and using the data below directly:
x = [6500 6350 5800 4900 4500];
y = [0 0.25 0.5 0.75 1.0];
The unique stable result should be:
Sum Squared Error (SSE): 0.00221359211819696
Root of Mean Square Error (RMSE): 0.0210408750682901
Correlation Coef. (R): 0.998229714370904
R-Square: 0.996462562653016
Parameter Best Estimate
--------- -------------
a 11.4185972844776
b -0.000545792212445247
c -1.22298024843855E-22
d 0.00759142468815435

If one more parameter is added, so that the fitting function becomes y = a*exp(b*x) + c*exp(d*x) + e, the result will be a perfect fit:
Sum Squared Error (SSE): 4.41584921368883E-29
Root of Mean Square Error (RMSE): 2.97181736103982E-15
Correlation Coef. (R): 1
R-Square: 1
Parameter Best Estimate
--------- -------------
a -2.47788945923639E-15
b 0.00505297753885221
c 332.002937639918
d -0.00141137023194644
e 0.420769231934917
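In MATLAB, 'exp2' has no constant term, so this model needs a custom fittype (a sketch assuming the Curve Fitting Toolbox; the start point is a rough guess based on the estimates above):

```matlab
x = [6500 6350 5800 4900 4500];
y = [0 0.25 0.5 0.75 1.0];
% Custom two-term exponential with an additive offset e.
ft = fittype('a*exp(b*x) + c*exp(d*x) + e', 'independent', 'x');
theFit = fit(x', y', ft, 'StartPoint', [0 0.005 300 -0.0014 0.4])
```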

11 Comments
The unique stable result should be:
I don't know about unique, but the condition number below suggests that it may not be stable, assuming it's a least squares fit.
syms a b c d; syms x y [5,1]
f([a,b,c,d])= sum( ( a*exp(b*x) + c*exp(d*x) - y ).^2 )/2;
Hfun=matlabFunction(hessian(f));
aa = 11.4185972844776 ;
bb = -0.000545792212445247;
cc = -1.22298024843855E-22;
dd = 0.00759142468815435 ;
xx = [6500 6350 5800 4900 4500];
yy = [0 0.25 0.5 0.75 1.0];
args=num2cell([aa,bb,cc,dd,xx,yy]);
Hess=Hfun(args{:});
cond(Hess)
Alex Sha
on 12 Nov 2023
For this problem, the objective value of the least squares fit (SSE) is 0.00221359211819697; the corresponding parameters are:
1:
Sum Squared Error (SSE): 0.00221359211819697
Root of Mean Square Error (RMSE): 0.0210408750682901
Correlation Coef. (R): 0.998229714367005
R-Square: 0.996462562645233
Parameter Best Estimate
--------- -------------
a 11.4185970800621
b -0.000545792208439445
c -1.22298133472063E-22
d 0.00759142455222768
or 2: (actually the same as 1; only the order of the parameters differs)
Sum Squared Error (SSE): 0.00221359211819697
Root of Mean Square Error (RMSE): 0.0210408750682901
Correlation Coef. (R): 0.9982297143666
R-Square: 0.996462562644425
Parameter Best Estimate
--------- -------------
a -1.22297963729857E-22
b 0.007591424764864
c 11.4185972678769
d -0.000545792211987959
There may be other solutions, but they should be local solutions, not global ones. In other words, if there is any other solution with an SSE value less than 0.00221359211819697, please post it!
There may be other solutions, but they should be local solutions, not global ones
Why?
If there is any other solution with an SSE value less than 0.00221359211819697, please post it!
Like I said, I don't know if there are other global solutions. I'm just not sure it is stable.
Alex Sha
on 12 Nov 2023
The definition of a globally optimal solution is one for which no other feasible solution has a better objective value. So until a better result is found, I can confidently say that the result I provided is the global optimal solution and all others are local optima, at least so far.
Alex Sha
on 12 Nov 2023
I ran it ten times (more if necessary) with random start values for each parameter, and the result of every run was the same. If this is not "stable", what is?
Matt J
on 12 Nov 2023
That is a good sign, but you would probably also have to perturb the x,y inputs by small amounts to see if the solution remains approximately the same.
I have to assume, though, that the software is doing some sort of data-prenormalization. We can see from the OP's experience that random initialization doesn't reach the same solution when you don't do this.
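The perturbation test suggested above can be sketched like this (assumes the Curve Fitting Toolbox; the noise scale is arbitrary):

```matlab
x = [6500 6350 5800 4900 4500];
y = [0 0.25 0.5 0.75 1.0];
rng(0)                                           % reproducible noise
for k = 1:5
    yk = y + 1e-3*randn(size(y));                % small perturbation of the data
    fk = fit(x', yk', 'exp2', 'Normalize', 'on');
    disp(coeffvalues(fk))                        % compare coefficient drift
end
```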
Alex Sha
on 13 Nov 2023
@Matt J If you have tried out the 1stOpt software, maybe you'll change your views and opinions!
The code looks like the example below. It is very simple; end users do not need to do anything else, such as the data pre-normalization you mentioned:
Function y=a*exp(b*x) + c*exp(d*x);
Data;
x = [6500 6350 5800 4900 4500];
y = [0 0.25 0.5 0.75 1.0];
In less than half a second, the output will be:
Sum Squared Error (SSE): 0.00221359211819697
Root of Mean Square Error (RMSE): 0.0210408750682901
Correlation Coef. (R): 0.998229714368578
R-Square: 0.996462562648373
Parameter Best Estimate
--------- -------------
a -1.22298041267836E-22
b 0.00759142466699956
c 11.4185972934291
d -0.000545792212517027
Users do not need to worry about anything else, such as the initial start values of the parameters; every run outputs the same results.
Taking one more example, if the fitting function becomes y = a*exp(b*x) + c*exp(d*x) + e/x:
Function y=a*exp(b*x) + c*exp(d*x) + e/x;
Data;
x = [6500 6350 5800 4900 4500];
y = [0 0.25 0.5 0.75 1.0];
Multi-run will produce two solutions:
1:
Sum Squared Error (SSE): 2.37582717939609E-30
Root of Mean Square Error (RMSE): 6.89322446957314E-16
Correlation Coef. (R): 1
R-Square: 1
Parameter Best Estimate
--------- -------------
a -5.46715403508488E-16
b 0.00527771888944181
c 1272.58844177386
d -0.00179769933992128
e 2743.69929384194
2:
Sum Squared Error (SSE): 0
Root of Mean Square Error (RMSE): 0
Correlation Coef. (R): 1
R-Square: 1
Parameter Best Estimate
--------- -------------
a -25.5545324787999
b -0.000174029056569443
c -1.10229181033535E-13
d 0.00449303003927482
e 57050.1019982362
What this means is that the start values for each run are not fixed, but are most likely random.
Matt J
on 13 Nov 2023
If you have tried out the 1stOpt software, maybe you'll change your views and opinions!
I wouldn't change my opinions because that was never my opinion. My speculation was that the software was doing some sort of normalization internally, not the user.
Alex Sha
on 13 Nov 2023
It would be a good suggestion for MathWorks, although it is not clear how 1stOpt handles such problems internally.
Matt J
on 13 Nov 2023
MathWorks' fit() routine does have an internal normalization step which can be enabled,
However, if 1stOpt does something similar, it appears to be smart enough to post-transform the parameters and undo the effect of the data normalization. fit() does not do that.