How to speed up multi-parameter nonlinear fitting in MATLAB?

Hello all! I am searching for a faster way to do nonlinear fitting of the intensity fluctuation at each pixel across a large quantity of image datasets.
I know the fitting function is always a single Gaussian, with varying mean, standard deviation, and magnitude.
I am sorry that I am not that good at MATLAB, and so far the built-in functions for nonlinear fitting with multiple parameters have all turned out to be quite slow. Could any of you please give me some suggestions on how to speed up the fitting process? BTW, I am using MATLAB R2013b.
Thank you very much for your help in advance!
  1 Comment
fei YANG on 8 Mar 2015
Sorry, I forgot to mention: I have tried nlinfit and lsqcurvefit, given the Gaussian function form, starting values, and tolerances. The fitting is inside a for loop, as each iteration gives me the magnitude, mean, and standard deviation of one data group. So, any suggestions on how to speed this up? Both fitting choices seem to take a huge amount of time... Thank you very much again!
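For reference, my loop looks roughly like this (a minimal sketch; names like traces are placeholders for my actual data, with one pixel's intensity trace per column):

  % Per-pixel Gaussian fit in a plain for loop (the slow baseline).
  gauss = @(p, x) p(1) * exp(-(x - p(2)).^2 / (2*p(3)^2));  % p = [magnitude, mean, std]
  x = (1:size(traces, 1)).';
  nPixels = size(traces, 2);
  params = zeros(nPixels, 3);
  opts = optimset('Display', 'off');
  for k = 1:nPixels
      y = traces(:, k);
      p0 = [max(y), sum(x.*y)/sum(y), 10];   % crude starting guess
      params(k, :) = lsqcurvefit(gauss, p0, x, y, [], [], opts);
  end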


Answers (1)

John D'Errico on 8 Mar 2015
Edited: John D'Errico on 8 Mar 2015
So, you want to do a large number of curve fits. Should you be surprised that it takes some serious effort? Just wanting something to run faster is nice. As the saying goes, however: if wishes were horses, beggars would ride.
If you want more throughput, I have shown how to do what I call batch modeling, where you solve multiple problems at once. It is described in my optimization tips and tricks document on the File Exchange. However, you need to recognize that to get a speedup here, you will need some skill in the use of sparse matrices, with the Optimization Toolbox, and with MATLAB in general. A rough sketch of the setup follows.
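Here is a minimal sketch of one way to do it (my assumptions, not the code from the document: traces holds one pixel's trace per column, and a common crude width is a tolerable first guess). All pixels are solved in one lsqnonlin call, with the block-diagonal Jacobian sparsity declared via JacobPattern so the solver can exploit it:

  function params = batchGaussFit(traces)
  % Sketch of batch modeling: every pixel fit in ONE sparse lsqnonlin call.
  [nSamples, nPixels] = size(traces);
  x = (1:nSamples).';
  p0 = zeros(3, nPixels);                      % [magnitude; mean; std] per pixel
  p0(1, :) = max(traces, [], 1);
  p0(2, :) = (x.' * traces) ./ sum(traces, 1);
  p0(3, :) = 10;                               % crude common width guess
  % Residual block k depends only on pixel k's 3 parameters:
  Jpat = kron(speye(nPixels), ones(nSamples, 3));
  opts = optimset('JacobPattern', Jpat, 'Display', 'off');
  p = lsqnonlin(@(p) resid(p, x, traces), p0(:), [], [], opts);
  params = reshape(p, 3, nPixels).';           % one row per pixel
  end

  function r = resid(p, x, traces)
  % Stacked residuals for every pixel; columns of p are [magnitude; mean; std].
  p = reshape(p, 3, []);
  model = bsxfun(@times, p(1, :), ...
      exp(-bsxfun(@rdivide, bsxfun(@minus, x, p(2, :)).^2, 2*p(3, :).^2)));
  r = model(:) - traces(:);
  end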
Next, in that same tips and tricks document (as well as in my fminspleas tool on the File Exchange) I show how to use a partitioned least squares scheme to speed up the convergence of nonlinear least squares problems.
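In miniature, for a single trace y, the partitioned idea looks like this (a sketch; fminspleas handles the details more carefully). The magnitude enters the model linearly, so for any candidate (mean, std) it is a one-line linear solve, and the nonlinear search runs over only two parameters instead of three:

  x = (1:numel(y)).';
  basis = @(q) exp(-(x - q(1)).^2 / (2*q(2)^2));   % q = [mean, std]
  sse = @(q) norm(y - basis(q) * (basis(q) \ y))^2;
  q = fminsearch(sse, [sum(x.*y)/sum(y), 10]);     % 2-D search instead of 3-D
  mag = basis(q) \ y;                              % magnitude recovered linearly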
You could use tools like the Parallel Computing Toolbox. That too can yield some speedup, IF you have that toolbox, AND you can use it effectively, AND you have sufficient processors to gain speed.
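Because the per-pixel fits are independent, the plain loop shown in the comments above drops into parfor almost unchanged (a sketch; assumes the Parallel Computing Toolbox and reuses the hypothetical traces layout):

  gauss = @(p, x) p(1) * exp(-(x - p(2)).^2 / (2*p(3)^2));
  x = (1:size(traces, 1)).';
  opts = optimset('Display', 'off');
  params = zeros(size(traces, 2), 3);
  parpool;                          % R2013b syntax; matlabpool on older releases
  parfor k = 1:size(traces, 2)
      y = traces(:, k);
      p0 = [max(y), sum(x.*y)/sum(y), 10];
      params(k, :) = lsqcurvefit(gauss, p0, x, y, [], [], opts);
  end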
Finally, you can use tools like GPU processors to gain some speed, but here too you will need the proper tools and the knowledge of how to use them.
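For example (a sketch only; assumes the Parallel Computing Toolbox with a supported GPU, and the same hypothetical traces array): a crude vectorized grid search over (mean, std) for every pixel at once. The winners are rough, but they make good starting values for exact fits on the CPU.

  nSamples = size(traces, 1);
  nPixels = size(traces, 2);
  T = gpuArray(traces);
  x = gpuArray((1:nSamples).');
  bestSSE = gpuArray(inf(1, nPixels));
  best = zeros(2, nPixels);                  % [mean; std] per pixel, on the CPU
  for mu = 1:5:nSamples
      for sig = [2 5 10 20]                  % candidate widths; tune to the data
          g = exp(-(x - mu).^2 / (2*sig^2)); % one shared basis vector
          a = (g.' * T) / (g.' * g);         % optimal magnitude, all pixels at once
          sse = sum((T - g*a).^2, 1);
          upd = sse < bestSSE;
          bestSSE(upd) = sse(upd);
          cpuUpd = gather(upd);
          best(1, cpuUpd) = mu;
          best(2, cpuUpd) = sig;
      end
  end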
In all cases, you need to invest some effort to get that speedup. If things were easy, they would already be done for you. You will also need to make sure you have good starting values for each sub-problem, as that can aid in both robustness and in speed of convergence.
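One cheap source of per-pixel starting values (a sketch; assumes each column of traces is a nonnegative, roughly background-free profile) is to use moment estimates:

  x = (1:size(traces, 1)).';
  w = bsxfun(@rdivide, traces, sum(traces, 1));          % normalize each column
  mu0  = x.' * w;                                        % first moment: mean
  sig0 = sqrt(sum(bsxfun(@minus, x, mu0).^2 .* w, 1));   % second moment: std
  mag0 = max(traces, [], 1);                             % peak height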
  2 Comments
John D'Errico on 8 Mar 2015
Funny, I wrote a tool for this some years ago but never posted it. I probably decided that not many people were asking for it.
So I have just now uploaded batchedpleas.m; in a quick test it yields a 10:1 speedup over a set of 10000 subproblems, each with 3 parameters to estimate.
fei YANG on 9 Mar 2015
Thanks a lot! I will give it a try. Yes, you are right; I will put in more effort to figure out how to speed it up.

