First-order optimality is not close to zero

Hi,
I have a question about first-order optimality.
My objective is to minimize total cost over 640 decision variables. The variables fall into two sets, and both are scaled along with the objective function (e.g., the first set ranges from 100 to 1000, the second set is scaled from 0–0.8 to 0–80, and the objective function is scaled from 1e11–1e12 to 100–1000).
According to the documentation, it seems that using a large 'InitBarrierParam' may help reach a better solution. Since I believe my problem is large, I set the parameter to 1e10.
In that case, the first-order optimality measure is very close to zero. In contrast, when I set the parameter very small in order to reduce the initial step norm when rerunning the problem from the final solution, the optimality measure went up to 1e7.
When I checked the gradient at the candidate local solution, I found that some entries are very large (about 1e7) in both cases mentioned above. My understanding is that the entries of the gradient should be close to zero.
I also found that when the solver gets close to the large-optimality-measure solution, the step norm becomes very small, which causes the solver to stop.
A similar thread I found says this is likely to happen if the Hessians of the problem are very large.
So my questions are
1. Why does changing 'InitBarrierParam' change the first-order optimality measure so much?
2. Is it possible to have such large gradient values at a minimum?
3. How can I make the solver take larger steps so that it does not stop because the step size falls below TolX while the first-order optimality measure remains large?
I am using fmincon with the interior-point algorithm, the 'fin-diff-grads' Hessian option, and the 'cg' subproblem algorithm, with analytical gradients supplied for both the objective and the constraints. The gradients were derived using matlabFunction from Symbolic Math Toolbox. My problem has a nonlinear objective function with nonlinear constraints.
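For reference, a setup along these lines might look like the sketch below. The function handles myObj and myCon, and the bounds lb and ub, are placeholders I made up, not from the original problem.

```matlab
% Hypothetical sketch of the solver setup described above.
% myObj returns [f, gradf]; myCon returns [c, ceq, gradc, gradceq].
options = optimoptions('fmincon', ...
    'Algorithm','interior-point', ...
    'GradObj','on', ...               % analytical objective gradient supplied
    'GradConstr','on', ...            % analytical constraint gradients supplied
    'Hessian','fin-diff-grads', ...   % Hessian-times-vector from finite differences of gradients
    'SubproblemAlgorithm','cg', ...
    'InitBarrierParam',1e10);         % large initial barrier parameter, as described
[x,fval,exitflag,output,lambda] = ...
    fmincon(@myObj, x0, [],[],[],[], lb, ub, @myCon, options);
```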
Thanks!!

Answers (1)

Alan Weiss on 5 May 2015
Instead of answering your question, I would like to make a recommendation. You already know how to use Symbolic Math Toolbox for calculating gradients. I suggest that you use it for calculating the Hessian as well, along the lines of this example. Then, obviously, change your options to not use fin-diff-grad and 'cg', but to use the analytic Hessian function.
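The workflow I am suggesting might look something like the following sketch. The objective and constraint here are toy illustrations, not your actual problem; for the interior-point algorithm, the supplied Hessian should be the Hessian of the Lagrangian, not of the objective alone.

```matlab
% Sketch: generate an analytic Hessian of the Lagrangian with Symbolic
% Math Toolbox, then supply it to fmincon's interior-point algorithm.
% f and c below are illustrative placeholders.
syms x1 x2 lam real
f = x1^4 + x2^4 + x1*x2;     % illustrative objective
c = x1^2 + x2^2 - 1;         % illustrative inequality constraint, c <= 0
L = f + lam*c;               % Lagrangian for the single constraint
H = hessian(L, [x1; x2]);    % symbolic Hessian of the Lagrangian
hessfun = matlabFunction(H, 'Vars', {[x1; x2], lam});

% fmincon calls hessianfcn(x, lambda); lambda.ineqnonlin holds the
% multipliers for the nonlinear inequality constraints.
hessianfcn = @(x,lambda) hessfun(x, lambda.ineqnonlin(1));
options = optimoptions('fmincon', ...
    'Algorithm','interior-point', ...
    'GradObj','on','GradConstr','on', ...
    'Hessian','user-supplied', ...
    'HessFcn',hessianfcn);
```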
To answer one small question, it is possible for gradients to be large at a solution if the gradients are on a constraint boundary. In that case, you will have active constraints, and nonzero Lagrange multipliers.
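A tiny illustration of that point (this toy problem is mine, not from the question): minimize f(x) = -x on [0, 1]. The minimum sits on the upper bound x = 1, where the gradient of f is -1, not zero; the first-order optimality condition is instead grad f + lambda = 0 with bound multiplier lambda = 1.

```matlab
% At the solution x = 1, grad f = -1 is nonzero, but the Lagrange
% multiplier on the active upper bound balances it, so the first-order
% optimality measure (gradient of the Lagrangian) is near zero.
[x,~,~,~,lambda] = fmincon(@(x) -x, 0.5, [],[],[],[], 0, 1);
% x is near 1 and lambda.upper is near 1.
```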
Alan Weiss
MATLAB mathematical toolbox documentation
  1 Comment
Siwanon Thampibul on 5 May 2015
Thank you for your reply. I tried calculating the Hessians using Symbolic Math Toolbox. However, it took more than a night; in fact, I terminated the calculation before the toolbox finished generating the Hessian m-files. I think the calculation is very slow because my objective function is very complicated and I did not write my code in a way that is easy for MATLAB to handle, as I am very new to coding.

