
Thread Subject:
accuracy of lsqcurvefit

Subject: accuracy of lsqcurvefit

From: Chia-Lung Hsieh

Date: 1 Oct, 2012 15:29:08

Message: 1 of 3

Hi,

How can we know the accuracy of the fit obtained with lsqcurvefit?

For example,

x = lsqcurvefit(fun,x0,xdata,ydata);

How good is the value x that is found by lsqcurvefit? What is the uncertainty of x?

Thanks in advance!

Subject: accuracy of lsqcurvefit

From: Alan_Weiss

Date: 3 Oct, 2012 15:04:08

Message: 2 of 3

On 10/1/2012 11:29 AM, Chia-Lung Hsieh wrote:
> Hi,
>
> How can we know the accuracy of the fit obtained with lsqcurvefit?
>
> For example,
>
> x = lsqcurvefit(fun,x0,xdata,ydata);
>
> How good is the value x that is found by lsqcurvefit? What is the
> uncertainty of x?
>
> Thanks in advance!

You might mean one of two different things.

1. Is the result a global minimum? There is no guarantee. You might need
to start lsqcurvefit from a variety of points.

2. What is a confidence region for the resulting estimate, assuming it
is the global minimum? This is related to the inverse of the matrix of
second derivatives (the Hessian). Sorry, I don't have the details at my
fingertips, but you can probably find a discussion by googling
"nonlinear least squares estimates" or some such thing.

Alan Weiss
MATLAB mathematical toolbox documentation

Subject: accuracy of lsqcurvefit

From: John D'Errico

Date: 4 Oct, 2012 00:13:09

Message: 3 of 3

Alan_Weiss <aweiss@mathworks.com> wrote in message <k4hk56$bu6$1@newscl01ah.mathworks.com>...
> On 10/1/2012 11:29 AM, Chia-Lung Hsieh wrote:
> > Hi,
> >
> > How can we know the accuracy of the fit obtained with lsqcurvefit?
> >
> > For example,
> >
> > x = lsqcurvefit(fun,x0,xdata,ydata);
> >
> > How good is the value x that is found by lsqcurvefit? What is the
> > uncertainty of x?
> >
> > Thanks in advance!
>
> You might mean one of two different things.
>
> 1. Is the result a global minimum? There is no guarantee. You might need
> to start lsqcurvefit from a variety of points.
>
> 2. What is a confidence region for the resulting estimate, assuming it
> is the global minimum? This is related to the inverse of the matrix of
> second derivatives (the Hessian). Sorry, I don't have the details at my
> fingertips, but you can probably find a discussion by googling
> "nonlinear least squares estimates" or some such thing.
>
> Alan Weiss
> MATLAB mathematical toolbox documentation

There are several "accuracies" one can think about.

First of all, we have the accuracy of the floating point
doubles involved. No matter what you do in MATLAB,
you are bounded by this. Arithmetic with doubles is
simply not perfectly accurate. But I don't think this is
the gist of the question posed by the OP.
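
(For instance, double precision carries a relative accuracy of
about eps = 2.2e-16, and simple decimal values are not exact:

eps(1)              % ans = 2.2204e-16
0.1 + 0.2 == 0.3    % returns false, because neither side is exact

But again, this is rarely the limiting factor in a curve fit.)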

Next, we have convergence of the algorithm itself. As
an optimizer, lsqcurvefit is a basic tool that tries to
reduce the sum of squares of the residuals. The problem
is that any optimizer simply steps along, trying to reduce
that objective based on what it has already seen and
whatever gradient information it can obtain. So lsqcurvefit
will look at the objective and stop when no improvement
appears available. This does NOT mean it has converged
to a global minimizer.

So there is a tolerance applied to the objective, as
well as a tolerance on the parameters. lsqcurvefit stops
when it believes it can do no better within the context
of those tolerances. Perhaps this is what you are thinking
about. Some users simply crank down the tolerances,
expecting that this means the solution can be made
arbitrarily good. That is often a waste of CPU time.
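
For reference, those tolerances are set through an options
structure, along the lines of (the specific values here are
arbitrary, just for illustration):

options = optimset('TolFun',1e-10,'TolX',1e-10,'MaxFunEvals',5000);
x = lsqcurvefit(fun,x0,xdata,ydata,[],[],options);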

Finally, there is the issue of parameter uncertainty. Given
the data provided, how well can we estimate the
parameters? Can we generate confidence intervals
around them based on statistical assumptions? This is
usually solved using the Hessian matrix, but there are
other methods (jackknife or bootstrap methods come
to mind here). I describe the Hessian matrix approach
in my Optimization tips and tricks document on the File
Exchange, but any text that discusses nonlinear
regression modeling should tell you how to do this.
So look at texts like Draper & Smith, or Seber & Wild.
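
As a rough sketch of the Jacobian-based (linearized) version
(it assumes independent Gaussian errors with constant variance),
ask lsqcurvefit for the residuals and the Jacobian and work
from there:

[x,resnorm,residual,~,~,~,J] = lsqcurvefit(fun,x0,xdata,ydata);
J = full(J);                            % the Jacobian may come back sparse
dof = numel(residual) - numel(x);       % degrees of freedom
mse = resnorm/dof;                      % estimate of the error variance
covx = mse*inv(J'*J);                   % approximate parameter covariance
se = sqrt(diag(covx));                  % standard errors of the parameters
ci = [x(:) - 1.96*se, x(:) + 1.96*se];  % crude 95% intervals (large-sample
                                        % z value; use a t value for small samples)

If you have the Statistics Toolbox, nlparci(x,residual,'jacobian',J)
does this more carefully.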

John
