- lsqcurvefit is better suited to a problem of that form.
- Are you supplying your own derivative calculations via the GradObj option (and the Hessian option, if applicable)? You should, since the analytical derivatives are easy here. lsqcurvefit has similar options, e.g. Jacobian.
- How are you initializing the optimization? Because your model is log-linear, the initial guess generated below will likely be more effective than random guessing.
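As a sketch of the first two points, an objective function that also returns the analytic gradient for fminunc might look like this (the function name ssqobj is illustrative, not from the original answer; save it as ssqobj.m):

```matlab
function [f, g] = ssqobj(p, Y, X1, X2)
% Objective and analytic gradient for fminunc with the 'GradObj','on' option.
% p = [a; b1; b2]; Y, X1, X2 are column vectors.
e = exp(p(1) + p(2)*X1 + p(3)*X2);   % model values exp(a + b1*X1 + b2*X2)
r = Y - e;                           % residuals
f = sum(r.^2);                       % sum of squared residuals
if nargout > 1                       % gradient: -2 * J' * r
    g = -2 * [sum(r.*e); sum(r.*e.*X1); sum(r.*e.*X2)];
end
end
```

It would be called along the lines of `p = fminunc(@(p) ssqobj(p,Y,X1,X2), x0, optimset('GradObj','on'))`, where x0 is the initial guess computed below.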
minimize a function fast
Hi!
I am trying to minimize a function of the following form: sum((Y - exp(a + b_1*X1 + b_2*X2)).^2)
Y, X1, and X2 are all vectors, and I am trying to find the a, b_1, and b_2 that minimize this function. It's basically a nonlinear regression. So far I have always used fminunc, but it is very slow. I need to do this many times, so my program runs for more than a day. I appreciate your help. Thank you!
Accepted Answer
Matt J on 23 Oct 2014 (edited 24 Oct 2014)
n = numel(Y);
x0 = [ones(n,1), X1(:), X2(:)] \ log(Y(:)); % x0 = [a; b_1; b_2]: linear least-squares fit to log(Y), requires Y > 0
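That x0 can then be passed to lsqcurvefit with an analytic Jacobian. A sketch (the function name expmodel and the packing of X1 and X2 into one xdata matrix are illustrative; save as expmodel.m):

```matlab
function [F, J] = expmodel(p, X)
% Model F = exp(a + b1*X1 + b2*X2) and its Jacobian, for lsqcurvefit
% with the 'Jacobian','on' option. X = [X1(:) X2(:)], p = [a; b1; b2].
F = exp(p(1) + p(2)*X(:,1) + p(3)*X(:,2));
if nargout > 1
    J = [F, F.*X(:,1), F.*X(:,2)];   % columns: dF/da, dF/db1, dF/db2
end
end
```

Called as, e.g., `p = lsqcurvefit(@expmodel, x0, [X1(:) X2(:)], Y(:), [], [], optimset('Jacobian','on'))`. Because lsqcurvefit exploits the least-squares structure, this is typically much faster than a generic fminunc run on the same problem.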