Writing the function list for "Batched partitioned nonlinear least squares"

Hi,
What is the function list for the following formula using "Batched partitioned nonlinear least squares":
y = a * exp(-x/b);
I want to estimate the values of a and b using a nonlinear least squares fit (two parameters) for every pixel of a 256×256 image, and I want to use this optimization function to speed up the fitting.
Thanks

Accepted Answer

John D'Errico on 15 Apr 2017
Edited: John D'Errico on 15 Apr 2017
Ok. I'll answer this, even though I refused to answer it when asked directly just the other day. I'm sorry, but I refuse to do e-mail consulting. That always turns into a problem. :)
To start with, IF you are asking this question, it is because you want to use my batchpleas code from the File Exchange. But then the very first thing you should do is look at the demos, which would explain a lot of this.
Yes, you can linearize the complete model by using a log transform. But that implicitly assumes the noise is log-normally distributed, i.e., proportional (multiplicative) noise. If that is what you have, then just avoid the whole mess and log your data!
But assuming you really have additive Gaussian (Normal) noise, then while you can use the log solution to compute good starting values for the parameters, the only viable solution is to use a nonlinear least squares.
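As a quick sketch of that starting-value idea (assuming column vectors x and y with strictly positive y):

```matlab
% Log-linearize the model: log(y) = log(a) - x/b, then fit a line.
p = polyfit(x, log(y), 1);   % p(1) = -1/b, p(2) = log(a)
b0 = -1/p(1);                % starting value for the nonlinear parameter b
a0 = exp(p(2));              % starting value for the linear parameter a
```

These estimates are biased when the noise is additive, but they are usually plenty good enough to start a nonlinear solver.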
The model
y = a * exp(-x/b)
has TWO unknown parameters. One of them (a) is conditionally linear: IF you knew the value of b, then it is trivial to solve for the value of a that minimizes the sum of squared residuals. Again, once b is given, solving for a becomes a linear least squares problem. Then you optimize over the value of b alone. We would call b an intrinsically NONLINEAR parameter. Once a value of b has been found that yields a minimum error metric for the LINEAR subproblem, it is known that the resulting pair (a, b) also minimizes the two-variable problem. In fact, the partitioned scheme is less likely to fall into local minima, or to diverge without success.
The nice thing about the partitioned least squares approach is it only has ONE nonlinear parameter. So the search process is a ONE dimensional search, whereas traditional nonlinear least squares would require a TWO-dimensional search space here. The savings is immense in terms of complexity and robustness to problems. And not needing to provide a starting value for a is very nice.
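The partitioned idea can be sketched in a few lines of plain MATLAB (this is an illustration, not batchpleas itself; the search bracket passed to fminbnd is an arbitrary assumption, and x, y are column vectors):

```matlab
% For any trial b, the linear parameter a falls out of a linear solve
% (backslash), so the outer search runs over b alone: a 1-D problem.
rss  = @(b) norm(exp(-x/b) * (exp(-x/b) \ y) - y)^2;  % residual sum of squares
bhat = fminbnd(rss, 0.1, 100);    % one-dimensional search over b
ahat = exp(-x/bhat) \ y;          % recover the linear parameter at the end
```

Note that no starting value for a ever appears; only b needs a bracket (or a start point, for other optimizers).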
You can read about partitioned nonlinear least squares in texts like Seber & Wild, although if my memory serves, they called it separable nonlinear least squares, or something like that. It is just a name.
You might also find Bates & Watts a useful read, as I think it was from a paper of theirs that I first learned these methods. A quick check shows there is indeed a section on conditionally linear parameters.
As far as batching goes, this is the technique I developed to solve MANY small and similar nonlinear least squares problems at once. All of the subproblems must employ the same model. You provide all of the data, and an indicator vector to indicate which sub-problem each point belongs to.
The idea behind batching is you can use the sparse Jacobian matrix capability in tools from the optimization toolbox to make the problem solvable much more efficiently than simply looping over the individual datasets. In my experience (and limited memory, since this was long ago that I developed these ideas), I found that batch sizes of a few dozen to several hundred were viable, offering a speedup of perhaps an order of magnitude in throughput.
I assume you are trying to use my batchpleas code, found for download from the file exchange. It does much of the work for you. That code will need an input function as:
funlist = {@(b,xdata) exp(-xdata/b)};
Put the function in a cell array as I did; that is necessary in case there are multiple conditionally linear terms in each subproblem. Thus, suppose you needed a constant term. That would be a separate cell in funlist.
This should be sufficient information to learn the methods described, as well as to use batchpleas. Note that batchpleas has a fairly extensive example provided, with almost the same model as you are wanting to use. (Ok, the model I put in the example had an additive constant term too.) See batchpleas_demo.html.
  4 Comments
John D'Errico on 15 Apr 2017
Thanks Star.
By the way, I've also posted fminspleas on the FEX, which uses the same methodology with fminsearch as the underlying optimizer, but it does not solve batched sets of problems.
Jor Jor on 2 May 2017
Hi again John,
I have tried your advice. When I attempted to fit the complete image (256×256×5 pixels) as one batch, it took nearly 7 minutes. Then I tried doing the fitting row by row, as follows:
img = myDdata(:,:,96,:); % 4-D slice: 5 images to be fitted into one (256 x 256 x 1 x 5)
m = 5;                   % samples (images) per pixel
n = size(img,1);         % 256 pixels per row
batchindex = kron(1:n, ones(1,m));  % subproblem index for each data point
% Same 5 x-values for every pixel in a row (repmat replaces the loop)
Xdata = repmat([0.181 2.501 4.821 7.141 9.461], n, 1);
Xdata = reshape(Xdata',[],1);
tic
INLPstart = 2;
funlist = {@(c,X) exp(-X/c)};
batchsize = 250;
INLPall = cell(n,1);  % store the fitted b per row (was overwritten each pass)
ILPall  = cell(n,1);  % store the fitted a per row
for j = 1:n
    Ydata = img(j,:,1,:);
    Ydata = reshape(Ydata, n, m);   % (data9 in the original was a typo for myDdata)
    Ydata = reshape(Ydata',[],1);
    [INLP,ILP] = batchpleas(Xdata,Ydata,batchindex,funlist,INLPstart,batchsize);
    INLPall{j} = INLP;
    ILPall{j}  = ILP;
end
toc
Using the above approach, the time was reduced to two minutes.
My questions:
1- Is it possible to make the batch fitting faster? Please note that I have tried doing the fit in a loop with an initial guess of the parameters, and the total time was around 3-4 minutes per image.
2- How can I get the goodness of fit per pixel?
3- I want to weight the fit as follows: 65% of the weight goes to the first and second images, split evenly, and the remaining 35% to the other three images (as I mentioned before, I have a set of five images used in the fitting to produce one image). How can I pass the weights to the batch in order to accomplish this weighted fit?
Any advice is appreciated, because I have a large dataset and time is very crucial in this case.
Best Regards,
