Writing the function list for "Batched partitioned nonlinear least squares"
Hi,
What is the function list for the following formula using "Batched partitioned nonlinear least squares":
y = a * exp(-x/b);
I want to estimate the values of a and b using a nonlinear least squares fit (two parameters) for each pixel of a 256×256 image, and I want to use this optimization function to speed up the fitting.
Thanks
Accepted Answer
John D'Errico on 15 Apr 2017 (edited 15 Apr 2017)
Ok. I'll answer this, even though I refused to answer it when asked directly just the other day. I'm sorry, but I refuse to do e-mail consulting. That always turns into a problem. :)
To start with, IF you are asking this question, it is because you want to use my batchpleas code from the File Exchange. But then the very first thing you should do is look at the demos, which would explain a lot of this.
Yes, you can linearize the complete model by using a log transform. But that implicitly assumes that the noise structure is log-normally distributed, so proportional noise. If this is what you have, then just avoid the whole mess, and log your data!
But assuming you really have additive Gaussian (Normal) noise, then while you can use the log solution to compute good starting values for the parameters, the only viable approach is a nonlinear least squares fit.
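For instance, the log-linear starting values can be sketched like this (the data vectors here are made up purely for illustration):

```matlab
% Log transform: log(y) = log(a) - x/b, which is linear in x.
% A line fit then yields starting values for the nonlinear fit.
x = (0:0.1:5)';                % example abscissa (assumed)
y = 2.5 * exp(-x/1.7);         % example data, noise-free for clarity
p = polyfit(x, log(y), 1);     % p(1) = slope = -1/b, p(2) = intercept = log(a)
b0 = -1/p(1);                  % starting value for b
a0 = exp(p(2));                % starting value for a
```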
The model
y = a * exp(-x/b)
has TWO unknown parameters. One of them (a) is conditionally linear: IF you knew the value of b, it would be trivial to solve for the value of a that minimizes the error, because once b is given, solving for a becomes a linear least squares problem. You then optimize over the value of b alone; we would call b an intrinsically NONLINEAR parameter. Once the b is found that yields the minimum error metric for the LINEAR sub-problem, it is also known that the pair (a, b) minimizes the two-variable problem. In fact, the partitioned scheme is less likely to fall into local minima, or to diverge.
The nice thing about the partitioned least squares approach is that it has only ONE nonlinear parameter. So the search process is ONE dimensional, whereas traditional nonlinear least squares would require a TWO-dimensional search space here. The savings are immense in terms of complexity and robustness. And not needing to provide a starting value for a is very nice.
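As a sketch of the partitioned idea (this is NOT the batchpleas implementation, just the concept on one curve, with made-up data): for any fixed b, the best a has a closed form, so only b needs a search.

```matlab
% Partitioned (separable) least squares for y = a*exp(-x/b).
x = (0:0.1:5)';
y = 2.5 * exp(-x/1.7) + 0.01*randn(size(x));   % synthetic noisy data
bhat = fminbnd(@(b) residnorm(b, x, y), 0.1, 10);  % 1-D search over b only
f = exp(-x/bhat);
ahat = (f'*y) / (f'*f);        % recover the conditionally linear a

function r = residnorm(b, x, y)
  % Sum of squared residuals after the inner linear solve for a.
  f = exp(-x/b);               % basis vector for this fixed b
  a = (f'*y) / (f'*f);         % linear least squares for a
  r = norm(y - a*f)^2;         % error metric minimized over b
end
```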
You can read about partitioned nonlinear least squares in texts like Seber & Wild , although if my memory serves, they called it separable nonlinear least squares, or something like that. It is just a name.
You might also find Bates & Watts a useful read, as I think it was from a paper of theirs that I first learned these methods. A quick check shows there is indeed a section on conditionally linear parameters.
As far as batching goes, this is the technique I developed to solve MANY small and similar nonlinear least squares problems at once. All of the sub-problems must employ the same model. You provide all of the data, plus an index vector indicating which sub-problem each point belongs to.
The idea behind batching is that you can use the sparse Jacobian capability in tools from the Optimization Toolbox to make the problem solvable much more efficiently than simply looping over the individual datasets. In my experience (and limited memory, since I developed these ideas long ago), batch sizes of a few dozen to several hundred were viable, offering a speedup of perhaps an order of magnitude in throughput.
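The sparsity structure is simple: each residual depends only on the nonlinear parameter of its own sub-problem. A sketch of the pattern (the index vector and solver option here are assumptions for illustration; batchpleas builds this for you):

```matlab
% 100 sub-problems of 20 points each, for illustration.
batchindex = repelem((1:100)', 20);
n = numel(batchindex);
nbatch = max(batchindex);
% One nonzero per row: point i touches only parameter batchindex(i).
Jpattern = sparse(1:n, batchindex, 1, n, nbatch);
% A solver like lsqnonlin can exploit this via its JacobPattern option.
opts = optimoptions('lsqnonlin', 'JacobPattern', Jpattern);
```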
I assume you are trying to use my batchpleas code, available for download from the File Exchange. It does much of the work for you. That code needs the function list input as:
funlist = {@(b,xdata) exp(-xdata/b)};
Put it in a cell array as I did; that is necessary in case there are multiple terms in each sub-problem. Thus, suppose you needed a constant term: that would be a separate cell in funlist.
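For example, if the model were y = a1 + a2*exp(-x/b), the function list would gain a constant cell while still having only ONE nonlinear parameter (check the demo for the exact convention; this follows the style of my fminspleas/batchpleas examples):

```matlab
% Two conditionally linear coefficients (a constant term and the
% exponential term), one nonlinear parameter b.
funlist = {1, @(b,xdata) exp(-xdata/b)};
```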
This should be sufficient information to learn the methods described, as well as to use batchpleas. Note that batchpleas has a fairly extensive example provided, with almost the same model as you are wanting to use. (Ok, the model I put in the example had an additive constant term too.) See batchpleas_demo.html.
John D'Errico on 15 Apr 2017
Thanks Star.
By the way, I've also posted fminspleas on the FEX, which uses the same methodology with fminsearch as the underlying optimizer, but it does not solve batched sets of problems.
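For a single curve, a minimal fminspleas call might look like the sketch below (if memory serves on the calling syntax; verify against the FEX documentation before relying on it):

```matlab
% Fit y = a*exp(-x/b) to one dataset: supply the function list,
% a starting value for the nonlinear parameter b, and the data.
funlist = {@(b,xdata) exp(-xdata/b)};
[INLP, ILP] = fminspleas(funlist, 1, x, y);
% INLP holds the nonlinear parameter(s) (b); ILP the linear one(s) (a).
```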