
Thread Subject:
lsqcurvefit bounds influence fitting result

Subject: lsqcurvefit bounds influence fitting result

From: Philipp Steffen

Date: 17 Mar, 2011 17:37:05

Message: 1 of 3

Hello,
I have been using lsqcurvefit for a while to fit measured data to a Gaussian function with an initial constant part, in order to extract parameters that describe both the constant part and the Gaussian shape of the curve.
I have now realized that the fit is quite sensitive to the upper and lower bounds. I need the lower bound to be > 0. In principle I don't need an upper bound, but the results of the fit vary quite drastically depending on whether I set ub=[], ub=[1,1,1], or ub=[10,10,10].
For my dataset, for example, ub=[10,10,10] yields the parameters
beta = 0.0000 0.1995 1.3786 (resnorm = 0.1626)
with ub=[1,1,1] I get:
beta = 0.4395 0.2434 1.0000 (resnorm = 0.2531)
and with ub=[]:
beta = 2.6643 0.4244 0.6430 (resnorm = 1.4990)
Clearly the fit using ub=[10,10,10] is the best. Why is this? Why do the bounds influence the fit in this way?
I appreciate any hints and comments and hope that someone can help me solve this issue. Basically, I don't want to fix the bounds to particular values, since I would like to keep the function as flexible as possible.

Here is my code:

% radius: 48 double values (xdata)
% RP:     48 double values (ydata)

beta0(1) = 0.4;    % initial guess: transition radius
beta0(2) = 0.79;   % initial guess: constant level for radius < beta0(1)
beta0(3) = 0.8;    % initial guess: width of the Gaussian part
beta = lsqcurvefit(@func, beta0, radius, RP, [realmin, realmin, realmin], []);

Function:
function yhat = func(b, radius)
% Piecewise model: constant level b(2) for radius < b(1), then a Gaussian
% rise from b(2) toward 1 with width b(3).
yhat = zeros(length(radius), 1);   % preallocate as a column vector
for i = 1:length(radius)
    if radius(i) < b(1)
        yhat(i) = b(2);
    else
        yhat(i) = 1 - (1 - b(2))*exp(-0.5*(((radius(i) - b(1))^2)/(b(3)^2)));
    end
end
end


Thanks a lot,

Philipp

Subject: lsqcurvefit bounds influence fitting result

From: Alan Weiss

Date: 18 Mar, 2011 14:06:44

Message: 2 of 3

Without analyzing your particular function, I can say that many, perhaps
most, realistic fitting problems have multiple local minima. Changing the
bounds causes lsqcurvefit to converge to a different local minimum. If
you want a global minimum, you have to take extra steps.

If you have a Global Optimization Toolbox license, you can use
MultiStart to search for a global minimum very easily. Even without
MultiStart, you can write a program of your own to take a variety of
start points and collect the results.
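
For example, a simple home-grown multistart loop could look like the
sketch below. It assumes func, radius, and RP from your post; the number
of start points and the range they are sampled from are only illustrative
choices:

nStarts = 20;                          % illustrative number of random starts
lb = [realmin, realmin, realmin];      % same lower bound as in the original call
bestBeta = [];
bestRes = Inf;
for k = 1:nStarts
    start = 5*rand(1,3);               % random start point in (0,5)^3; range chosen arbitrarily
    [b, res] = lsqcurvefit(@func, start, radius, RP, lb, []);
    if res < bestRes                   % keep the parameters with the smallest resnorm so far
        bestRes = res;
        bestBeta = b;
    end
end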

Good luck,

Alan Weiss
MATLAB mathematical toolbox documentation


Subject: lsqcurvefit bounds influence fitting result

From: Philipp Steffen

Date: 19 Mar, 2011 11:53:05

Message: 3 of 3

Hi Alan,

thank you for your reply. I do not have the Global Optimization Toolbox.
In the meantime I found a solution for this particular problem.
I rewrote my function so that I can use fminsearch to minimize the sum of squared residuals of the fit to the data. The only constraint I need is that the first parameter cannot be negative. My function now artificially sets the residual sum of squares to 10 whenever that constraint is violated, which keeps fminsearch from returning such parameters as the result.
In my hands this seems to work reasonably well.
Best,

Philipp


options = optimset('Display', 'none');
data = [radius, RP];                   % 48-by-2: radius values and measured RP
[beta, ssrs] = fminsearch(@(b) fitfunc(b, data), beta0, options);

function ssrs = fitfunc(beta, data)
%FITFUNC Sum of squared residuals, used with fminsearch to find the
%best set of parameters beta.
curve = func(beta, data(:,1));         % model prediction at each radius
ssrs = sum((curve - data(:,2)).^2);    % residual sum of squares
if beta(1) <= 0                        % penalize a negative first parameter
    ssrs = 10;                         % large enough that fminsearch avoids this region
end
end


