FMINUNC CHECK GRADIENT FAILS
Hello everyone,
I'm trying to minimize this function with fminunc by running:
[ygrad, cost] = tvd_sim_grad(x, lam, Nit,t);
where x is a 4096x1 double and lam, Nit, t are 1x1 doubles.
function [xden,fval] = tvd_sim_grad(y, lam, Nit,t)
rng default % For reproducibility
ycut=double(abs(y)-t>0); % outlier reduction: clip to t, the variance from robust covariance estimation
yind=find(ycut==1);
y(yind)=t;
y=y+1; % necessary to get out of the neighborhood of zero
y0=y;
ObjectiveFunction = @(y) tvd_sim2(y,y0,lam);
options = optimoptions('fminunc','MaxIter',Nit,'ObjectiveLimit',0,'MaxFunEvals',Inf,'TolFun',1e-20,...
'TolX',1e-20,'UseParallel',false,'SpecifyObjectiveGradient',true,'CheckGradients',true,...
'FinDiffRelStep',1e-10,'DiffMinChange',0,'DiffMaxChange',Inf,'Diagnostics','off','Algorithm','quasi-newton',...
'HessUpdate','bfgs','FinDiffType','central','HessianFcn',[],...
'PlotFcns','optimplotfval','Display','final-detailed');
[xden,fval] = fminunc(ObjectiveFunction,y,options);
xden= xden-1; % zero realignment
end
function [TVD,mygrad] = tvd_sim2(x,y, lam)
TVD=1/2.*sum(abs((y-x).^2)) + lam.*sum(abs(diff(diff(-y./(1-x.*y-x.^2)))));
f=@(x) 1/2.*sum(abs((y-x).^2)) + lam.*sum(abs(diff(diff(-y./(1-x.*y-x.^2)))));
mygrad=gradient(f(x'));mygrad=mygrad';
end
This is a modification of total variation denoising that I created to make the function itself differentiable (the original is not differentiable in its second term). The function is differentiable over all real space except at 0, so I made the appropriate modification to the dataset to avoid zeros; the data is now concentrated around the value 1.
When I use :
'SpecifyObjectiveGradient',false
I obtain these great results (the red line is xden):

But when I use :
'SpecifyObjectiveGradient',true
it performs 0 iterations and fails, returning:
Optimization stopped because the objective function cannot be decreased in the
current search direction. Either the predicted change in the objective function,
or the line search interval is less than eps.
'CheckGradients',true
gives me:
Objective function derivatives:
Maximum relative difference between supplied
and finite-difference derivatives = 33382.1.
Supplied derivative element (1012,1): 0.480282
Finite-difference derivative element (1012,1): -33381.7
CheckGradients failed.
____________________________________________________________
Error using validateFirstDerivatives (line 102)
CheckGradients failed:
Supplied and finite-difference derivatives not within 1e-06.
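For context, this check can be reproduced by hand: compare the gradient returned by tvd_sim2 against central differences of its objective value at the starting point. The following is a sketch only; delta and the loop structure are my assumptions, reusing tvd_sim2 and the clipped/shifted y from the code above.

```matlab
% Manual version of what CheckGradients does: compare the supplied
% gradient against a central-difference approximation at the point y.
[~, g_supplied] = tvd_sim2(y, y0, lam);   % gradient returned by the objective
delta = 1e-6;                             % assumed finite-difference step
g_fd = zeros(numel(y), 1);
for i = 1:numel(y)
    yp = y; yp(i) = yp(i) + delta;        % perturb coordinate i up
    ym = y; ym(i) = ym(i) - delta;        % perturb coordinate i down
    g_fd(i) = (tvd_sim2(yp, y0, lam) - tvd_sim2(ym, y0, lam)) / (2*delta);
end
max_rel_diff = max(abs(g_supplied - g_fd) ./ max(1, abs(g_fd)))
```

A large max_rel_diff at some element (as reported for element 1012 above) means the supplied gradient does not match the objective's actual slope there.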
How can I obtain the results above while supplying the gradient, and why does it fail?
Thanks !
13 Comments
Emiliano Rosso
on 19 Dec 2022
Edited: Emiliano Rosso
on 19 Dec 2022
Torsten
on 19 Dec 2022
What is
size(mygrad)
in tvd_sim2 ?
where x is 4096x1 double & lam, Nit, t are 1x1 double.
I suggest posting them in a .mat file
Why can't I use gradient() on an anonymous function to obtain a numerical gradient?
Because that's not one of its capabilities, as you should see from the documentation.
Torsten
on 19 Dec 2022
"gradient" in your case computes
[f(2)-f(1);0.5*(f(3)-f(1));...;0.5*(f(end)-f(end-2));f(end)-f(end-1)]
where
f = f(x')
I doubt this is what you want.
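This is easy to verify on a small vector (illustrative values, not from the post): gradient() takes central differences of the stored values, which is not the gradient of a scalar objective with respect to x.

```matlab
% gradient() on a plain numeric vector: one-sided differences at the
% endpoints, central differences of the *values* in the interior.
v = [1 4 9 16 25];
g = gradient(v)
% g = [3 4 6 8 9], i.e. [v(2)-v(1), (v(3)-v(1))/2, (v(4)-v(2))/2, ...]
```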
Emiliano Rosso
on 19 Dec 2022
Edited: Emiliano Rosso
on 19 Dec 2022
Emiliano Rosso
on 19 Dec 2022
Edited: Emiliano Rosso
on 20 Dec 2022
It does run, although shouldn't it be
mygrad(i,1)=(f(xp)-f(xm))/(2*delta);
instead of
mygrad(i,1)=(f(xp)-f(xm))./2*delta;
?
But not well, I guess.
Emiliano Rosso
on 20 Dec 2022
Edited: Emiliano Rosso
on 20 Dec 2022
Your function is not differentiable everywhere since it contains "abs" expressions.
Did you change
mygrad(i,1)=(f(xp)-f(xm))./2*delta;
to
mygrad(i,1)=(f(xp)-f(xm))/(2*delta);
?
Does it run better with the corrected derivative?
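The corrected line belongs in a loop like the sketch below; the enclosing loop and the value of delta are assumptions, since the full code was not posted. Note that the uncorrected version, (f(xp)-f(xm))./2*delta, multiplies by delta/2 instead of dividing by 2*delta, so the resulting "gradient" is off by a factor of delta^2.

```matlab
% Central-difference approximation of the gradient of scalar f at x.
delta = 1e-6;                 % assumed step size
n = numel(x);
mygrad = zeros(n, 1);
for i = 1:n
    xp = x; xp(i) = xp(i) + delta;   % forward-perturbed point
    xm = x; xm(i) = xm(i) - delta;   % backward-perturbed point
    mygrad(i,1) = (f(xp) - f(xm)) / (2*delta);   % corrected formula
end
```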
Emiliano Rosso
on 20 Dec 2022
Torsten
on 20 Dec 2022
A one-sided finite-difference approximation of the derivative, instead of a centered one, will halve the number of function calls...
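As a sketch of that suggestion (variable names continue from the central-difference loop above and are assumptions): the baseline value f(x) is computed once, so only n perturbed evaluations are needed instead of 2n, at the cost of first-order rather than second-order accuracy.

```matlab
% One-sided (forward) difference: n+1 function calls instead of 2n.
f0 = f(x);                    % single baseline evaluation
for i = 1:n
    xp = x; xp(i) = xp(i) + delta;          % perturb coordinate i
    mygrad(i,1) = (f(xp) - f0) / delta;     % forward difference
end
```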
Emiliano Rosso
on 21 Dec 2022
Edited: Emiliano Rosso
on 21 Dec 2022
Answers (0)