Problem in Curve fitting

maryam amiri on 20 May 2021
Edited: Matt J on 20 May 2021
Hi,
I have a set of data [x,y] that I want to fit with a function F(v,x), where v contains six free parameters.
x=[70,75,80,83,90,100]; y=[1,1,0.97,0.95,0.9,0];
I found the best-fit curve with cftool for this data set (a polynomial of degree 5), but the result is different when I use lsqcurvefit:
v0=[0,0,0,0,0,0];
fun = @(v,x)v(1)*x.^5 + v(2)*x.^4 + v(3)*x.^3 + v(4)*x.^2 + v(5)*x + v(6);
x=[70,75,80,83,90,100];y=[1,1,0.97,0.95,0.9,0];
v=lsqcurvefit(fun,v0,x,y);
times = linspace(x(1),x(end));
plot(x,y,'ko',times,fun(v,times),'b-')
This is the result:
It seems lsqcurvefit did not fit the curve to the points.
Any idea why it does not work for me?

Accepted Answer

Matt J on 20 May 2021
Edited: Matt J on 20 May 2021
Although polyfit is the better tool here, both polyfit and lsqcurvefit will be challenged by the scaling of your xdata, which makes the problem highly ill-conditioned. Rescaling helps considerably, as shown below:
v0=[0,0,0,0,0,0];
fun = @(v,x)v(1)*x.^5 + v(2)*x.^4 + v(3)*x.^3 + v(4)*x.^2 + v(5)*x + v(6);
x=[70,75,80,83,90,100];y=[1,1,0.97,0.95,0.9,0];
x=(x-mean(x))/std(x);   % rescale x to zero mean and unit standard deviation
[v,fval,~,exitflag]=lsqcurvefit(fun,v0,x,y)
Local minimum found. Optimization completed because the size of the gradient is less than the value of the optimality tolerance.
v = 1×6
-0.0381 -0.0852 -0.0060 0.0310 -0.0643 0.9500
fval = 5.8273e-19
exitflag = 1
times = linspace(x(1),x(end));
plot(x,y,'ko',times,fun(v,times),'b-')
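As noted above, polyfit is the simpler tool for a plain polynomial fit, and its three-output form centers and scales x for you. A minimal sketch (my addition, using the same data as above, not part of the original answer):
x = [70,75,80,83,90,100]; y = [1,1,0.97,0.95,0.9,0];
[p,S,mu] = polyfit(x,y,5);                 % mu = [mean(x); std(x)], applied internally by polyfit
times = linspace(x(1),x(end));
plot(x,y,'ko', times, polyval(p,times,S,mu), 'b-')   % evaluate with the same centering/scaling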
  1 Comment
Matt J on 20 May 2021
Another way to see the need for scaling is to look at its effect on the condition number of the Vandermonde matrix:
x=[70,75,80,83,90,100];
cond(vander(x)),
ans = 2.2962e+15
cond(vander((x-mean(x))/std(x)))
ans = 230.1084
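For context (my own sketch, not part of the original comment), the same conditioning issue appears if the degree-5 fit is written as a linear solve against the Vandermonde matrix; with six points the system is square, so backslash interpolates the data exactly, but the raw-variable coefficients inherit the huge condition number shown above:
x = [70,75,80,83,90,100]; y = [1,1,0.97,0.95,0.9,0];
V = vander(x);                       % columns are x.^5, x.^4, ..., x.^0
v_raw = V \ y(:)                     % coefficients in the raw variable
xs = (x - mean(x))/std(x);
v_scaled = vander(xs) \ y(:)         % coefficients in the rescaled variable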


More Answers (2)

Walter Roberson on 20 May 2021
fun = @(v,x)v(1)*x.^5 + v(2)*x.^4 + v(3)*x.^3 + v(4)*x.^2 + v(5)*2 + v(6);
The term v(5)*2 in your original code should be v(5)*x.
  6 Comments
Walter Roberson on 20 May 2021
However, it stops when it thinks the residual is good enough, or if the step sizes get too small.
It is a convex problem
Your v = -0.0381 -0.0852 -0.0060 0.0310 -0.0643 0.9500 has two sign changes, so the function itself is not convex.
Matt J on 20 May 2021
Edited: Matt J on 20 May 2021
However, it stops when it thinks the residual is good enough, or if the step sizes get too small.
Yes, the ill-conditioning of the problem does cause one of these lsqcurvefit stopping criteria to be triggered prematurely, and where it stops will indeed depend on the initial point.
Your v = -0.0381 -0.0852 -0.0060 0.0310 -0.0643 0.9500 has two sign changes, so the function itself is not convex.
Yes, the polynomial being fitted is surely not convex as a function of x as we can also see from the plots. However, the least squares objective is convex as a function of v, which is why, in theory, lsqcurvefit should be globally convergent for this problem.
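To illustrate the stopping-criteria point (my own sketch, not from the original comment), you can tighten lsqcurvefit's tolerances on the unscaled problem; the solver then runs longer, but the ill-conditioned Jacobian still limits how accurate the fit can get:
x = [70,75,80,83,90,100]; y = [1,1,0.97,0.95,0.9,0];
fun = @(v,x) v(1)*x.^5 + v(2)*x.^4 + v(3)*x.^3 + v(4)*x.^2 + v(5)*x + v(6);
opts = optimoptions('lsqcurvefit', ...
    'StepTolerance',1e-14, 'FunctionTolerance',1e-14, ...
    'OptimalityTolerance',1e-14, 'MaxFunctionEvaluations',1e5);
[v,resnorm,~,exitflag] = lsqcurvefit(fun,zeros(1,6),x,y,[],[],opts)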

Sign in to comment.


Girijashankar Sahoo on 20 May 2021
Check again; I get your result with the same code.
  3 Comments
Girijashankar Sahoo on 20 May 2021
v =
-0.0000 0.0000 -0.0019 0.0778 -19.4167 -9.7083
maryam amiri on 20 May 2021
v=[ -0.0000 0.0000 -0.0019 0.0778 -19.4167 -9.7083];
x=[75:100];
y = v(1)*x.^5 + v(2)*x.^4 + v(3)*x.^3 + v(4)*x.^2 + v(5)*x + v(6);
z=plot(x,y,'b');
It still has a problem.
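One possible contributor (my own observation, not stated in the thread): those coefficients were copied from the default short display, and at x near 100 the x.^5 term is about 1e10, so a coefficient rounding error as small as 5e-5 hidden behind the printed -0.0000 can shift the curve by roughly 5e5. A quick magnitude check:
delta = 5e-5;          % largest coefficient error that still prints as -0.0000
delta * 100^5          % resulting error in y at x = 100 (about 5e5)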

