Using fmincon for a very costly objective function

I would like to find the minimum of an objective function that has 43 variables and takes around 20 seconds to evaluate (another minimization problem has to be solved inside the objective function, which is why each evaluation takes so long). It is not essential that I get the best, most accurate answer - only that I find some sort of minimum reasonably quickly. So, I am looking for help in doing this.
So far I have tried increasing the DiffMinChange value, but fmincon seems to ignore it. I am also using the active-set algorithm, because I read somewhere that it is the most efficient of the four options. I do have a nonlinear constraint function, as well as upper and lower bounds.
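(For reference, the kind of call described above looks roughly like the following; myObjective, myNonlcon, x0, lb, and ub are placeholders for my actual problem, and the DiffMinChange value shown is only an example.)
options = optimoptions('fmincon','Algorithm','active-set','DiffMinChange',1e-3);
[x,fval] = fmincon(@myObjective,x0,[],[],[],[],lb,ub,@myNonlcon,options);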
Any help with this would be greatly appreciated - thanks! Danny

Accepted Answer

Alan Weiss
Alan Weiss on 9 Sep 2013
Danny, while I do not understand everything you are doing, here are my thoughts based on what you said in this forum.
1. The sqp algorithm is usually superior to the active-set algorithm, in that it is more robust and it respects bounds.
2. The interior-point algorithm is probably the most robust algorithm, and usually you should try it first, resorting to the others only if it proves unsatisfactory.
3. It is often crucial to choose a good starting point for the minimization. In particular, if you have an inner loop containing other minimizations, you might want to use the answer from one minimization as the starting value for the next.
4. Unlike Shashank, I would never attempt to use fminsearch or any of its relatives on a problem with more than 4 or 5 dimensions. It is almost always better to try to tune fmincon than to use a much poorer algorithm. You could try patternsearch, but it would likely be slower than fmincon (when properly tuned).
5. The best thing you might be able to do for fmincon would be to work out analytic gradients for the objective and nonlinear constraints, or some good approximation to them, and supply them instead of letting fmincon take finite difference steps; see the sketch below.
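For example, a minimal sketch of supplying gradients to the interior-point algorithm (myObjective, myNonlcon, x0, lb, and ub are placeholders; in newer MATLAB releases these option names are SpecifyObjectiveGradient and SpecifyConstraintGradient):
options = optimoptions('fmincon', ...
    'Algorithm','interior-point', ...
    'GradObj','on', ...      % myObjective returns [f, gradf]
    'GradConstr','on');      % myNonlcon returns [c, ceq, gradc, gradceq]
[x,fval] = fmincon(@myObjective,x0,[],[],[],[],lb,ub,@myNonlcon,options);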
Good luck,
Alan Weiss
MATLAB mathematical toolbox documentation
  1 Comment
Danny
Danny on 9 Sep 2013
This is all very good advice. I wish I had an objective gradient! I have switched to using the interior-point algorithm, however.
Thanks for all of your help!


More Answers (2)

Shashank Prasanna
Shashank Prasanna on 8 Sep 2013
Edited: Shashank Prasanna on 8 Sep 2013
Hi Danny, since you are optimizing within the objective function, this may be a non-smooth problem.
That means FMINCON may not be the best solver.
Do you have constraints in your optimization? If you don't, give FMINSEARCH a try; it is derivative-free and may perform much faster.
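For example, a minimal unconstrained sketch (myObjective and x0 are placeholders):
% Nelder-Mead simplex search: no gradients, bounds, or constraints
[x,fval] = fminsearch(@myObjective, x0);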
  4 Comments
Shashank Prasanna
Shashank Prasanna on 9 Sep 2013
Are you able to provide your objective function? You mentioned that you are minimizing something within the objective function, which makes it a changing objective function. FMINCON is not particularly suited to such problems, which tend to be non-smooth. #1: it is recommended to try a derivative-free solver. #2: the usual tricks involve the convergence criteria; you can play around with the options, but if your objective is dynamic this helps little. #3: the objective function is evaluated and also tested for nonlinear constraint violation, which adds to the total execution time.
Please share your objective function and nonlinear constraint function so that more eyes can look at them and give you better suggestions. One standard suggestion is to reduce the objective function evaluation time itself - try using the MATLAB profiler (see the sketch below).
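For example, a minimal profiling sketch (myObjective and x0 are placeholders):
profile on
myObjective(x0);    % one representative 20-second evaluation
profile viewer      % see which lines dominate the run time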
Matt J
Matt J on 9 Sep 2013
Edited: Matt J on 9 Sep 2013
"I know that the objective function is theoretically smooth."
It sounds like you have a function f(x,y) and you are trying to use fmincon to minimize an objective g(y) of the form
g(y) = min_x f(x,y)
So, it is g(y) that you know for a fact is theoretically smooth? And you're basing that on more than just the smoothness of f(x,y)?
Note that smoothness of f(x,y) is not enough to ensure the smoothness of g(y), in general. For example,
f(x,y) = y^3*x^2 - 2*y*x
is smooth, but the corresponding g(y),
g(y) = 0,     if y = 0
g(y) = -1/y,  otherwise
is not smooth.
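A quick numerical check of this example (a sketch; it restricts attention to y > 0 so that the inner minimum over x is finite):
f = @(x,y) y.^3.*x.^2 - 2*y.*x;           % smooth in (x,y)
g = @(y) -1./y;                           % analytic inner minimum for y > 0
y = 0.05;
[~, gnum] = fminsearch(@(x) f(x,y), 1);   % numeric inner minimization over x
fprintf('numeric g(%g) = %.4f, analytic = %.4f\n', y, gnum, g(y))
% As y -> 0+, g(y) = -1/y blows up while g(0) = 0, so g is not even
% continuous at y = 0, let alone smooth.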



Sadjad Yazdani
Sadjad Yazdani on 30 Jul 2017
Dear Danny, random-search algorithms were designed for exactly this kind of problem, where you do not have the gradient of the objective function, especially when you want the global optimum or the objective is computationally expensive. I suggest that you try GA or PSO (see the sketch below).
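For example, a rough sketch using the Global Optimization Toolbox (myObjective, myNonlcon, lb, and ub are placeholders; ga accepts the nonlinear constraints, while particleswarm handles only bounds):
nvars = 43;                                     % number of design variables
opts  = optimoptions('ga','UseParallel',true);  % parallel evaluation helps a costly objective
[x,fval] = ga(@myObjective,nvars,[],[],[],[],lb,ub,@myNonlcon,opts);
% Bound-constrained alternative:
[x,fval] = particleswarm(@myObjective,nvars,lb,ub);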
