lsqcurvefit cost/optimization function

Steve bourke on 13 Jul 2016
Answered: John D'Errico on 15 Jul 2016
In the lsqcurvefit function, is there a way to change the output 'resnorm' to be a different cost or optimization function, such as the absolute value of the difference, or the log of that (instead of the square of the difference)?

Answers (2)

Star Strider on 13 Jul 2016
The residual (the ‘raw’ difference between the fitted regression and the data) is the third output from lsqcurvefit. You can do whatever operations on it you want.
For example:
[x,resnorm,residual,exitflag,output] = lsqcurvefit(___);   % residual is the third output
abs_rsd = abs(residual);                                   % absolute value of the residuals
log_abs_rsd = log(abs(residual));                          % log of the absolute residuals
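As a minimal runnable sketch of the same idea, assuming a hypothetical exponential model and made-up data (the model, data, and starting point below are illustrative, not from the original question):
model = @(b,x) b(1).*exp(b(2).*x);                         % hypothetical model y = b1*exp(b2*x)
xdata = (0:0.5:5)';                                        % illustrative data, not from the question
ydata = 2.*exp(0.3.*xdata) + 0.1.*randn(size(xdata));
b0 = [1 0.1];                                              % starting guess
[b,resnorm,residual] = lsqcurvefit(model, b0, xdata, ydata);
abs_rsd     = abs(residual);                               % absolute residuals
log_abs_rsd = log(abs(residual));                          % log of the absolute residuals
sum_abs     = sum(abs(residual));                          % L1 cost evaluated at the least-squares fit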
  2 Comments
Steve bourke on 15 Jul 2016
Thank you for your response; however, what we want to know is whether we can change the chi-squared function it uses to optimize the fit.
Star Strider on 15 Jul 2016
My pleasure.
Not to my knowledge.
If it’s not among the available options in the options structure, you can’t change it without hacking the code. I don’t recommend that even if it’s possible.
You can always write your own nonlinear curve-fitting routines. Having done that myself in FORTRAN back in the early 1980s, I don’t recommend it.



John D'Errico on 15 Jul 2016
No. There is no way to change the lsqcurvefit code to use a different measure of error. Ok, no way except for rewriting lsqcurvefit.
The point is, lsqcurvefit uses algorithms that are specific to a sum of SQUARES of residuals. lsqcurvefit is not a general optimizer that you can simply tell to use a different metric.
If that is your goal, you could in theory use a different tool, perhaps fminunc or some other totally general optimizer. Even that is subject to significant problems, however. For example, a sum of absolute values results in a non-differentiable objective function, which could cause fminunc to fail to converge.
So IF you truly needed to use a different objective, then you would be best off using an optimizer that would not be subject to such a failure. That might mean fminsearch, or perhaps a genetic algorithm, or some other stochastic scheme like a particle swarm method.
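To make that concrete, here is a minimal sketch of minimizing a sum of absolute residuals with fminsearch; the exponential model, data, and starting point are assumptions for illustration, not part of the original thread:
model  = @(b,x) b(1).*exp(b(2).*x);                        % hypothetical model y = b1*exp(b2*x)
xdata  = (0:0.5:5)';                                       % illustrative data
ydata  = 2.*exp(0.3.*xdata) + 0.1.*randn(size(xdata));
objfcn = @(b) sum(abs(ydata - model(b,xdata)));            % L1 objective instead of a sum of squares
b0 = [1 0.1];                                              % starting guess
[b_l1,fval] = fminsearch(objfcn, b0);                      % Nelder-Mead simplex, derivative-free
Because fminsearch is derivative-free, the non-smooth absolute-value objective does not pose the differentiability problem that a gradient-based solver such as fminunc can run into.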
