Single iteration with lsqnonlin (or fsolve), only compute new X0

I want lsqnonlin (or fsolve) to carry out only one iteration, i.e. compute the new X, and then stop. No further function evaluations.
Ideally I don't even want it to compute the objective function at the new X, but I definitely do not want extra function evaluations for the Jacobian or the first-order optimality conditions at the new guess for X.
(My question is similar to an earlier question of mine:
... but now function evaluations are even more expensive, and I want to use lsqnonlin. I also don't know how to update X myself (which is easy for a Newton-Raphson step if you know the Jacobian), so the suggestions made there don't help me in this case.)

5 Comments

Why not just use MaxIterations=1?
Do you just want to compute a jacobian at the start point? What does this possibly gain you? And why are you asking the very same question you already asked? Does asking it multiple times mean you will get better answers? What does one step of lsqnonlin possibly gain you?
@Matt J Because then it does not stop after computing the new X value. It will evaluate the function at the new X and, in addition, compute the first-order optimality, which means more function evaluations. Ideally I want zero function evaluations at the new X value (because I want to do those outside the optimizer).
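To illustrate the point with a cheap toy residual (just a stand-in for my real, expensive model, so the exact counts are only indicative), the output structure reports the evaluations that MaxIterations=1 still spends at the new X:
% Toy illustration (assumed stand-in residual): even with MaxIterations=1,
% lsqnonlin evaluates the objective at the new x and for the optimality
% check, which shows up in output.funcCount.
f = @(x) (x-10).^2 + 1;                              % cheap stand-in residual
opts = optimoptions('lsqnonlin','MaxIterations',1,'Display','iter');
[xnew,~,~,~,output] = lsqnonlin(f, 2, [], [], opts);
disp(output.funcCount)   % total objective evaluations, including those at the new x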
@John D'Errico Did you read my explanation of why the answer to my previous question does not help? (The solution there was a workaround that does not apply to my current problem.)
But you lose all information about the Jacobian at the iteration point, and it will take much more effort to recompute it in the next call to lsqnonlin than to simply continue the iterations.
@Torsten I use projection methods, and I want to update the grid before taking the next iteration, so all information gathered at the new X with the old grid could be useless (especially if the step in X is relatively large). I first want to update the grid, and only then do any further evaluations.
May I conclude it is not possible? Or at least not with a simple command?


 Accepted Answer

This seems to be a feasible workaround. The important thing to realize is that even though the iterative display says Func-count = 2, the second call to the objective function does no significant work, because the externally scoped stopflag has been raised by that point.
doOptimization()
                                        Norm of      First-order
 Iteration  Func-count     Resnorm        step        optimality
     0           1            4225                     1.04e+03
     1           2               0       4.0625              0
Local minimum found.
Optimization completed because the size of the gradient is less than the value of the optimality tolerance.
x = 6.0625
res = 0
function doOptimization
    clc
    [stopflag,r0,J0] = deal(0);
    opts = optimoptions('lsqnonlin','Display','iter',...
        'SpecifyObjectiveGradient',true,'MaxIterations',0);
    [x,res] = lsqnonlin(@resid, 2, [], [], opts)
    function [r,J] = resid(x)
        if ~stopflag
            r = (x-10)^2+1;  J = 2*(x-10);  % normal evaluation of residual and Jacobian
            r0 = zeros(size(r));            % important that these be zero (but unclear why)
            J0 = zeros(size(J));
            stopflag = 1;
        else                                % do no work on later calls
            r = r0;  J = J0;
        end
    end
end
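As a side note, if (per Torsten's point above) you also want to keep the residual and Jacobian that were computed at the start point, a small variation of the same idea is to store the real values in outer-scope variables during the first call and return zeros afterwards. This is only a sketch with an assumed toy residual and made-up names (oneStepKeepJac, r_at_x0, J_at_x0):
function [x, r_at_x0, J_at_x0] = oneStepKeepJac(x0)
    % Sketch of the same stopping trick, keeping the expensive first evaluation.
    [stopflag, r_at_x0, J_at_x0] = deal(0);
    opts = optimoptions('lsqnonlin','Display','iter',...
        'SpecifyObjectiveGradient',true,'MaxIterations',0);
    x = lsqnonlin(@resid, x0, [], [], opts);
    function [r,J] = resid(x)
        if ~stopflag
            r = (x-10)^2 + 1;  J = 2*(x-10);  % toy residual, replace with the real model
            r_at_x0 = r;  J_at_x0 = J;        % keep the expensive evaluation
            stopflag = 1;
        else
            r = zeros(size(r_at_x0));         % dummy zeros so the solver stops cleanly
            J = zeros(size(J_at_x0));
        end
    end
end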

4 Comments

Thank you so much Matt!!
You need to make the algorithm think it has found the optimum. That's why the residual has to be zero (or at least within the optimality tolerance).
For completeness, I tweaked your setup a bit so it stops after making the new guess for x (and without using an analytical gradient):
function lsqnonlin_1iter
    clc
    eva = 0;   % counter for the number of real residual evaluations
    opts = optimoptions('lsqnonlin','Display','iter','MaxIterations',0);
    [x,res] = lsqnonlin(@resid, 2, [], [], opts)
    function r = resid(x)
        x                        % show where the solver evaluates
        if eva < 2
            eva
            r = (x-10)^2+1;      % real (expensive) residual
            eva = eva+1;
        else                     % do no work
            disp('Not doing anything')
            r = 0;               % when you change this to r=100, lsqnonlin returns x=2 instead
        end
    end
end
You need to make the algorithm think it has found the optimum. That's why the residual has to be zero (or at least within the optimality tolerance).
Yes, and I find that strange. With MaxIterations=0, you would think any final function evaluations would be needed only to populate the resnorm and Jacobian output arguments. I don't see why the solver would care what the actual values are.
Anyway, it works great! Super happy!
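For anyone who wants to double-check that the trick really avoids extra evaluations of the real residual, here is a minimal variant (toy residual and the function name lsqnonlin_1iter_check are assumptions) that compares the solver-reported function count with a counter of real evaluations:
function lsqnonlin_1iter_check
    % Sketch: same stopping trick, but also request the output structure so the
    % solver-reported funcCount can be compared with the number of real evaluations.
    eva = 0;
    opts = optimoptions('lsqnonlin','Display','iter','MaxIterations',0);
    [x,~,~,~,output] = lsqnonlin(@resid, 2, [], [], opts);
    fprintf('new x = %g, solver funcCount = %d, real evaluations = %d\n', ...
        x, output.funcCount, eva);
    function r = resid(x)
        if eva < 2
            r = (x-10)^2 + 1;    % toy residual, stands in for the real model
            eva = eva + 1;
        else
            r = 0;               % dummy zero so the solver thinks it is done
        end
    end
end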

