fmincon solution does not differ from initial guess if I provide gradient

Hello everyone,
I am currently facing a problem when providing the analytic gradient of the objective function to be minimised.
I need to minimise a functional depending on a state variable vector of length 3*N+4, with N on the order of 10. I also need some linear equality constraints to be satisfied, which I impose by providing a suitable matrix Aeq and vector beq.
I noticed that the solution is not satisfactory, hence I decided to provide the analytic gradient of the functional. In order to do so, I define my objective function as
function [f,g]=objective_function(x,parameters)
where f is the scalar objective value and g is the gradient vector of size 3*N+4. In my main script, in order for my objective function to depend only on x, I define the function handle
function_to_minimise=@(x) objective_function(x,given_parameters)
where given_parameters are defined in the main script.
Of course, I also insert the following option
options=optimoptions('fmincon','GradObj','on');
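For concreteness, here is a minimal sketch of this setup (the quadratic objective and the names H, x0, Aeq, beq are placeholders, not my actual code):
% objective_function.m -- returns the objective and, when requested, its gradient
function [f,g] = objective_function(x,parameters)
    f = 0.5*(x.'*(parameters.H*x));   % placeholder scalar objective
    if nargout > 1                    % fmincon asks for g only when it needs it
        g = parameters.H*x;           % gradient vector of length 3*N+4
    end
end
% In the main script:
function_to_minimise = @(x) objective_function(x,given_parameters);
options = optimoptions('fmincon','GradObj','on');
[xsol,fval,exitflag] = fmincon(function_to_minimise,x0,[],[],Aeq,beq,[],[],[],options);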
If I don't provide the gradient, I obtain a solution that is not satisfactory, in the sense that it is not consistent with the expected results and the theory.
I noticed that if I provide the gradient, the fmincon routine stops after 1 or 2 iterations and the result is the same as the initial guess. To check whether the analytic gradient is correct, I compared it with the numerical gradient computed by fmincon for some well-known cases, and the results agree, so I concluded that the analytic gradient I provide is correct.
Notes:
  • Since I only have linear equality constraints, I do not have to provide the gradient of the constraints.
  • I tried to optimise the functional with all the algorithms available in fmincon.
Is there something that I am missing which might cause this problem?
P.S. I hope I gave enough detail and that my explanation is clear enough.

Accepted Answer

Bruno Luong on 9 Apr 2024 at 9:13
To get more info, turn the option 'CheckGradients' on, and check the exitflag (the third output) of fmincon.
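For example (a sketch; x0, Aeq, and beq are whatever you already pass in):
options = optimoptions('fmincon','SpecifyObjectiveGradient',true,'CheckGradients',true);
[x,fval,exitflag] = fmincon(function_to_minimise,x0,[],[],Aeq,beq,[],[],[],options);
disp(exitflag)   % 1: first-order optimality met, 2: step below StepTolerance, 0: iteration limit reached
With CheckGradients set to true, fmincon compares your analytic gradient against a finite-difference approximation and reports any mismatch.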
Bruno Luong on 10 Apr 2024 at 8:28
"Is there a way to solve this problem with fmincon?"
No. fmincon is a solver that assumes a C1 objective function and constraints; you cannot relax that assumption.
However, if you have a term whose derivative jumps, you might tweak it to make it C1. For example, replace abs(x) by
x.^2 ./ sqrt(x.^2 + epsilon) to make the function smooth around x = 0.
Or replace a step function (if/else) by a logistic function (a "soft" logical).
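A sketch of both replacements (epsilon and k are smoothing parameters you have to tune):
epsilon = 1e-6;                                   % smoothing parameter for abs
smooth_abs = @(x) x.^2 ./ sqrt(x.^2 + epsilon);   % C1 replacement for abs(x)
k = 100;                                          % steepness of the soft step
soft_step  = @(x) 1 ./ (1 + exp(-k*x));           % logistic replacement for (x > 0)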


More Answers (1)

Torsten on 9 Apr 2024 at 9:22
Use
SpecifyObjectiveGradient
instead of 'GradObj','on' if you use "optimoptions" and not "optimset". From the documentation:
Gradient for the objective function defined by the user. See the description of fun to see how to define the gradient in fun. The default, false, causes fmincon to estimate gradients using finite differences. Set to true to have fmincon use a user-defined gradient of the objective function. To use the 'trust-region-reflective' algorithm, you must provide the gradient, and set SpecifyObjectiveGradient to true.
For optimset, the name is GradObj and the values are 'on' or 'off'. See Current and Legacy Option Names.
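That is, something like:
options = optimoptions('fmincon','SpecifyObjectiveGradient',true);
% legacy equivalent with optimset: options = optimset('GradObj','on');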
