4.4 (11 ratings) | 15 downloads (last 30 days) | File size: 4.93 KB | File ID: #10093 | Version: 1.0



John D'Errico


22 Feb 2006 (Updated)

Efficient nonlinear regression fitting using a constrained, partitioned least squares overlay to fminsearch


File Information

I need to thank Duane Hanselman for suggesting this great idea.

Fminspleas is a simple nonlinear least squares tool that fits regression models of the form

Y = a1*f1(X,C) + a2*f2(X,C) + ... + an*fn(X,C)

X can be any array, so fminspleas works on multidimensional problems, and C is the set of intrinsically nonlinear parameters only. Each of f1, f2, etc. must return a column vector of the same length as Y.

Because the optimization (in this case, fminsearch) need only work on the intrinsically nonlinear parameters, far fewer function evaluations are required. The example I give in the help took only 32 function evaluations to estimate 2 linear parameters plus 1 nonlinear parameter, versus over 300 evaluations had I just called fminsearch directly.
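As a minimal sketch of the call pattern (modeled on the example in the help; the synthetic data, starting value, and output names here are illustrative, so check them against the shipped help text):

```matlab
% Model: y = a1 + a2*exp(-c1*x), with two linear parameters (a1, a2)
% and one intrinsically nonlinear parameter (c1)
x = rand(100,1);
y = 2 + 5*exp(-1.5*x) + randn(100,1)/20;     % synthetic data

% Each funlist entry is a constant or a function handle of (C, xdata)
funlist = {1, @(c,xdata) exp(-c(1)*xdata)};

% A starting value is needed only for the nonlinear parameter
[INLP, ILP] = fminspleas(funlist, 1, x, y);
% INLP holds the nonlinear estimate, ILP the two linear coefficients
```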

Fminspleas now allows you to specify bound constraints on the nonlinear parameters only. I'll see about adding linear parameter constraints if there are requests.

Finally, fminspleas allows the user to supply a set of non-negative weights to the regression.
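A hedged sketch of how bounds and weights might be passed (the positions of the optional arguments follow my reading of the help and should be verified against it):

```matlab
x = rand(100,1);
y = 2 + 5*exp(-1.5*x);
funlist = {1, @(c,xdata) exp(-c(1)*xdata)};
w = linspace(1, 0.25, numel(y)).';           % non-negative weights
% nonlinear parameter bounded to [0, 10]; w supplied as the weight vector
[INLP, ILP] = fminspleas(funlist, 1, x, y, 0, 10, w);
```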

E-mail me with any problems or bugs.


Fminsearchbnd, Fminsearchcon, Fminsearch Interface, and Optimization Tips And Tricks inspired this file.

MATLAB release: MATLAB 7.0.1 (R14SP1)
Comments and Ratings (22)
06 Jul 2016 zhang

31 May 2016 elham

11 Feb 2016 Matt J



Hard to comment without seeing the code you ran. However, it's surprising that fminsearch gave you anything reasonable when applied to a problem with 200+ variables. fminsearch (Nelder-Mead) is intended for low-dimensional problems; empirically it has been found to work, but only for small dimensions (up to, say, 6 unknowns).

11 Feb 2016 Alessandro De Sanctis

@ Matt J,

Thanks a lot.


Another issue:

I found optimal linear values completely different from those obtained using fminsearch (for which I provided starting values). Moreover, the likelihood of the probit I obtained using the optimal non-linear values from fminspleas is slightly lower than the one obtained using the values from fminsearch.

10 Feb 2016 Matt J



Suppose A is an Nx200 matrix associated with your 200 linear parameters. Then a 2-line way of creating the funlist is

Ac = num2cell(A, 1);
funlist = arrayfun(@(i) @(inlp,xdata) Ac{i}, 1:length(Ac), 'uni', 0);

Then append to funlist additional functions for your 2 nonlinear parameters.
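A sketch of the full recipe, with the appending step spelled out (the exponential form of the two nonlinear terms and the sizes are illustrative only):

```matlab
A = randn(500, 200);                      % illustrative Nx200 matrix
Ac = num2cell(A, 1);                      % split A into its 200 columns
% one funlist entry per linear column; each ignores inlp and xdata
funlist = arrayfun(@(i) @(inlp,xdata) Ac{i}, 1:length(Ac), 'uni', 0);
% append entries that actually use the 2 nonlinear parameters
funlist{end+1} = @(inlp,xdata) exp(-inlp(1)*xdata);
funlist{end+1} = @(inlp,xdata) exp(-inlp(2)*xdata);
```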

10 Feb 2016 Matt J


Alessandro's comment is pertinent to my comments of Dec. 29, 2010.

10 Feb 2016 Alessandro De Sanctis

Thanks for the code.

Is there a simple way to estimate a model in which, in addition to 2 non-linear parameters, there are almost 200 linear parameters? Do I need to create 200 more functions? (Most of them are dummy variables.)

08 Feb 2016 vivkrs


22 Jul 2015 John


If the data can be represented as:

y = a0 + a1*sin(c1*x+c2) + a2*sin(c3*x+c4) + a3*sin(c5*x+c6);

Should the funlist be:

funlist = {1, @(c,xdata) sin(c(1)*xdata+c(2)), @(c,xdata) sin(c(3)*xdata+c(4)), @(c,xdata) sin(c(5)*xdata+c(6))};


Or can it be a simpler formula?

13 Mar 2015 Florian

Great piece of code!

Actually, constraints for the linear parameters would be a valuable extension. I am fitting a Gaussian with an additional offset:
y = a1 + a2 * exp( -(c1-x)^2 / (4*c2^2) )

Sometimes a2 becomes negative, which is not a good solution for my data, as the "peak" then becomes a "dip". Apart from that, it works perfectly!

30 Aug 2012 Antonui

23 Aug 2011 John D'Errico


Do you mean that the linear parameters are to be the same for both terms? Then your model is simply

Y = a*(f1(X,C) + f2(X,C))

No explicit constraint is needed. Simply define one function as above.
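For a model of that shape, the single combined entry could look like this (f1 and f2 here are illustrative stand-ins for the actual basis functions):

```matlab
% Two basis functions that are to share one linear coefficient a
f1 = @(x,c) exp(-c(1)*x);
f2 = @(x,c) exp(-c(2)*x);
% One funlist entry => one shared linear parameter for both terms
funlist = {@(c,xdata) f1(xdata,c) + f2(xdata,c)};
```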

23 Aug 2011 Matthew

Is it possible to constrain the linear parameters? E.g. have a function Y = a*f1(X,C) + a*f2(X,C)?

12 Jan 2011 Francis Esmonde-White

This is an elegant and extremely useful optimization function.

29 Dec 2010 Matt J


There was a posting in the newsgroup that looks like it could benefit from this tool, with a few modifications.

For one, the tool would need to allow an additional term depending only on the intrinsically non-linear parameters:

Y = f0(X,C) + a1*f1(X,C) + a2*f2(X,C) + ... + an*fn(X,C)

It looks like nearly the same methodology would accommodate this.

Secondly, it might be good to have the option, rather than passing the individual fi(X,C) as separate functions, to allow the complete matrix

F(X,C) = [f0(X,C), f1(X,C), ..., fn(X,C)]

to be passed. For large n, there may be MATLAB-savvy vectorized ways of generating the complete matrix, whereas generating it column by column could be slow. The above newsgroup post gives one such case.
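A sketch of the kind of vectorized construction this suggests, assuming an exponential basis fi(X,C) = exp(-ci*X) (the basis choice and sizes are illustrative):

```matlab
x = rand(500,1);            % N observations
c = [0.5, 1.0, 2.0, 4.0];   % n nonlinear parameters, as a row vector
% One matrix product builds the whole N-by-n basis matrix at once:
% column i equals exp(-c(i)*x), with no loop over columns
F = exp(-x * c);            % (500x1)*(1x4) -> 500x4
```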

27 Apr 2010 Jonas


This is a beautiful piece of code. In addition to doing very well what it does, it is very well documented. Thus, it is easily modified, for example to allow robust fitting.

27 Apr 2010 John D'Errico


Ok, let me explain why simulated annealing is generally a rather poor choice for nonlinear regression modeling. Understanding the tools you use is important; otherwise you may apply the wrong tool to the problem.

Simulated annealing is a stochastic optimizer; I've both written and used such tools. Broadly, it uses the physical process of cooling to an annealed state as a metaphor for optimization: the parameters are varied randomly, with the objective function viewed as the energy state of the system. Given enough time, the system moves toward a minimal energy state. That is only a probabilistic statement, however, so there is no assurance that a simulated annealing schedule will converge to a global minimizer. In practice, it can leave you in an arbitrarily bad location.

Worse, simulated annealing, like any Monte Carlo optimizer, will not quickly yield any degree of tight tolerances on your result. It can easily take a huge number of function evaluations to give an adequate result.

Contrast this to fminspleas, which uses a partitioned least squares scheme to give many digits of accuracy for often only a few dozen function evaluations, because many problems reduce to a single nonlinear parameter to be solved for. Because of the variable partitioning, you need only supply starting values for those nonlinear parameters. This makes it even easier to use this code than it is to use a simulated annealer, since you need not worry about starting values for many of the parameters. As well, this code is far more robust to poor starting values than are many other nonlinear regression tools.

In very rare cases, simulated annealing might converge to a solution that fminspleas will miss. That is always a possibility with ANY optimizer. The vast majority of the time, fminspleas will converge to a far better solution, faster than simulated annealing.

So while LPS is free to use simulated annealing if he so prefers, I would suggest it reflects a lack of understanding of the tools he uses.

27 Apr 2010 LPS


I have found through my work that this function can fail on a variety of fitting problems in data analysis. A more general function, using simulated annealing, is available here:

15 Feb 2010 Andre Guy Tranquille

23 May 2006 Benson Tsui

When I tried to use f = {1, @(xdata, coef) exp(xdata*coef)};
it gives the 'identifier' error.
Can you tell me what causes the problem?

03 Mar 2006 Duane Hanselman

The best approach for solving optimization problems where there is a mixture of linear and nonlinear terms: solve for the linear terms using linear least squares embedded inside a nonlinear optimizer for the nonlinear terms. This function is difficult to describe with simple help text alone, but it is well worth the effort required to use it.
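The embedding Duane describes can be sketched in a few lines of plain MATLAB (an illustration of the idea, not the file's actual implementation; the model and data are made up):

```matlab
% Fit y = a1 + a2*exp(-c*x): fminsearch searches over c alone, and the
% inner backslash solve recovers the best linear (a1, a2) for each c.
x = rand(50,1);
y = 3 + 4*exp(-2*x);

A = @(c) [ones(size(x)), exp(-c*x)];       % design matrix for a given c
resid = @(c) norm(y - A(c) * (A(c) \ y));  % residual after inner LS solve
chat = fminsearch(resid, 1);               % 1-D search over c only
ahat = A(chat) \ y;                        % recover the linear parameters
```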

22 Feb 2006 John D'Errico

I've already added constraints to the nonlinear parameters in the new version going up, plus cleaned up the documentation a bit.

23 Feb 2006

Cleaned up the help, added constraints on the nonlinear parameters.

20 Mar 2006

Version 1.1: cleaned up the help and added another example with a second nonlinear parameter

12 Apr 2006

Removed the (mistaken) requirement of the Optimization Toolbox

08 May 2006

Inclusion of weights as an option

24 May 2006

Release 2.1: repaired an error in the example
