Code covered by the BSD License  

Fminspleas (version 4.0)
File ID: #10093 | File Size: 4.93 KB

22 Feb 2006 (Updated )

Efficient nonlinear regression fitting using a constrained, partitioned least squares overlay to fminsearch.


File Information
Description

I need to thank Duane Hanselman for suggesting this great idea.

Fminspleas is a simple nonlinear least squares tool that fits regression models of the form

Y = a1*f1(X,C) + a2*f2(X,C) + ... + an*fn(X,C)

X can be any array, so the tool works on multidimensional problems, and C is the set of intrinsically nonlinear parameters only. Each function f1, f2, etc., must return a column vector of the same length as Y.

Because the optimization (in this case, fminsearch) need only work on the intrinsically nonlinear parameters, far fewer function evaluations are required. The example I give in the help took only 32 function evaluations to estimate 2 linear parameters plus 1 nonlinear parameter, versus over 300 evaluations had I just called fminsearch directly.
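The partitioned (separable) least squares idea can be sketched in a few lines. The sketch below is illustrative Python rather than fminspleas itself (which is MATLAB), fitting an assumed model y = a1 + a2*exp(c*x): the outer optimizer searches only over the nonlinear parameter c, while the linear coefficients a1 and a2 fall out of an inner linear solve, so the search space shrinks from three parameters to one.

```python
# Illustrative sketch of partitioned least squares (not the fminspleas
# source): fit y = a1 + a2*exp(c*x), where c is the only intrinsically
# nonlinear parameter. The outer optimizer sees c alone; a1 and a2 are
# recovered by linear least squares inside the objective.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 + 3.0 * np.exp(-1.5 * x) + 0.01 * rng.standard_normal(x.size)

def linear_fit(c):
    # Design matrix: basis functions evaluated at the current c.
    A = np.column_stack([np.ones_like(x), np.exp(c * x)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = A @ coef - y
    return coef, resid @ resid  # coefficients, sum of squared residuals

# The outer search works on c only, with bound constraints on it.
res = minimize_scalar(lambda c: linear_fit(c)[1],
                      bounds=(-10.0, 0.0), method='bounded')
c_hat = res.x
(a1_hat, a2_hat), _ = linear_fit(c_hat)
print(c_hat, a1_hat, a2_hat)  # all three land near -1.5, 2.0, 3.0
```

The bound constraint on c mirrors fminspleas's bound constraints on the nonlinear parameters; weights would enter by scaling the rows of A and y before the inner solve.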

Fminspleas now allows you to specify bound constraints on the nonlinear parameters only. I'll see about adding linear parameter constraints if there are requests.

Finally, fminspleas allows the user to supply a set of non-negative weights to the regression.

E-mail me with any problems or bugs.

Acknowledgements

Fminsearchbnd, Fminsearchcon, Fminsearch Interface, and Optimization Tips And Tricks inspired this file.

MATLAB release MATLAB 7.0.1 (R14SP1)
Comments and Ratings (12)
30 Aug 2012 Antonui  
23 Aug 2011 John D'Errico

Do you mean that the linear parameters are to be the same for both terms? Then your model is simply

Y = a*(f1(X,C) + f2(X,C))

No explicit constraint is needed. Simply define one function as above.
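A quick illustration of this point (plain Python with made-up basis functions, not fminspleas syntax): summing the two basis columns before the linear solve enforces the shared coefficient automatically.

```python
# Sketch: when the same linear coefficient multiplies both terms,
# combine the basis functions and fit a single coefficient.
import numpy as np

x = np.linspace(0.0, 1.0, 20)
c = -1.0                      # nonlinear parameter, held fixed here
f1 = np.exp(c * x)            # illustrative basis functions
f2 = np.cos(c * x)

y = 2.0 * (f1 + f2)           # data from y = a*(f1 + f2) with a = 2
A = (f1 + f2)[:, None]        # one combined column, one unknown
a_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
print(a_hat)                  # recovers a = 2
```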

23 Aug 2011 Matthew

Is it possible to constrain the linear parameters? E.g. have a function Y = a*f1(X,C) + a*f2(X,C)?

12 Jan 2011 Francis Esmonde-White

This is an elegant and extremely useful optimization function.

29 Dec 2010 Matt J

There was a posting in the NG that looks like it could benefit from this tool, with a few modifications.

http://www.mathworks.com/matlabcentral/newsreader/view_thread/299811

For one, the tool would need to allow an additional term depending only on the intrinsically non-linear parameters

Y = f0(X,C)+ a1*f1(X,C) + a2*f2(X,C) + ... + an*fn(X,C)

It looks like nearly the same methodology would accommodate this.

Secondly, it might be good to have the option, rather than passing the individual fi(X,C) as separate functions, of allowing the complete matrix

F(X,C)=[f0(X,C), f1(X,C),...,fn(X,C)]

to be passed. For large n, there may be MATLAB-savvy vectorized ways of generating the complete matrix whereas generating column-by-column could be slow. The above NG post gives one such case.
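Both suggestions are straightforward to sketch (illustrative Python; these are proposed extensions, not current fminspleas features): the coefficient-free f0 term moves to the left-hand side of the linear solve, and the basis columns are built in one vectorized step.

```python
# Sketch of the two proposed extensions above: an offset term f0(X,C)
# with no linear coefficient, and vectorized generation of all basis
# columns as a single matrix F(X,C). Basis functions are illustrative.
import numpy as np

x = np.linspace(0.0, 2.0, 40)
c = 0.5  # nonlinear parameter, held fixed for the linear step
# One vectorized call builds [f0, f1, f2] as columns of F.
F = np.column_stack([np.sin(c * x), np.ones_like(x), np.exp(-c * x)])
y = F[:, 0] + 1.0 * F[:, 1] + 4.0 * F[:, 2]   # f0 + a1*f1 + a2*f2

# f0 carries no coefficient: subtract it from Y, fit the remaining columns.
coef, *_ = np.linalg.lstsq(F[:, 1:], y - F[:, 0], rcond=None)
print(coef)  # recovers [1., 4.]
```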

27 Apr 2010 Jonas

This is a beautiful piece of code. In addition to doing very well what it does, it is very well documented. Thus, it is easily modified, for example to allow robust fitting.

27 Apr 2010 John D'Errico

Ok, let me explain why simulated annealing is generally a rather poor choice for nonlinear regression modeling. Understanding the tools that you will use is important, else you use the wrong tool for the wrong problem.

Simulated annealing is a stochastic optimizer. I've written such tools as well as having used them. Broadly, it uses the process of cooling to an annealed state as a metaphor for optimization. The optimization tool varies the parameters in a random process, with the objective function viewed as the energy state of the system. Eventually, the system moves to a minimal energy state, if enough time is allowed for convergence. Of course, this is only a probabilistic statement, so there is no assurance that a simulated annealing schedule will succeed in convergence to a global minimizer. In fact and in practice, it can leave you in an arbitrarily bad location.

Worse, simulated annealing, like any Monte Carlo optimizer, will not quickly yield any degree of tight tolerances on your result. It can easily take a huge number of function evaluations to give an adequate result.

Contrast this to fminspleas, which uses a partitioned least squares scheme to give many digits of accuracy for often only a few dozen function evaluations, because many problems reduce to a single nonlinear parameter to be solved for. Because of the variable partitioning, you need only supply starting values for those nonlinear parameters. This makes it even easier to use this code than it is to use a simulated annealer, since you need not worry about starting values for many of the parameters. As well, this code is far more robust to poor starting values than are many other nonlinear regression tools.

In very rare cases, simulated annealing might converge to a solution that fminspleas will miss. That is always a possibility with ANY optimizer. The vast majority of the time, fminspleas will converge to a far better solution, faster than simulated annealing.

So while LPS is free to use simulated annealing if he so prefers, I would suggest it reflects a lack of understanding of the tools he uses.

27 Apr 2010 LPS

I have found through my work that this function can fail on a variety of fitting problems in data analysis. A more general function is available here, using Simulated Annealing:

http://www.mathworks.de/matlabcentral/fileexchange/10548

15 Feb 2010 Andre Guy Tranquille  
23 May 2006 Benson Tsui

When I tried to use f = {1, @(xdata, coef) exp(xdata*coef)}, it gave an 'identifier' error.
Can you tell me what causes the problem?

03 Mar 2006 Duane Hanselman

This is the best approach for solving optimization problems with a mixture of linear and nonlinear terms: solve for the linear terms using linear least squares embedded inside a nonlinear optimizer for the nonlinear terms. This function is difficult to describe with simple help text alone, but well worth the effort required to use it.

22 Feb 2006 John D'Errico

I've already added constraints to the nonlinear parameters in the new version going up, plus cleaned up the documentation a bit.

Updates
23 Feb 2006

Cleaned up the help, added constraints on the nonlinear parameters.

20 Mar 2006

Version 1.1 - cleaned up the help and added another example with a second nonlinear parameter

12 Apr 2006

Removed the (mistaken) requirement of the Optimization Toolbox

08 May 2006

Inclusion of weights as an option

24 May 2006

Release 2.1: repaired an error in the example
