Gradient Problem: FMINUNC requires two input arguments.

Hello, I have a problem with an optimization routine: when I try to run the code, it gives this error:
Error using fminunc (line 168)
FMINUNC requires two input arguments.
Error in JournalReplicationUnconstrained_rt_xt_mlf2 (line 8)
opts = optimset(fminunc, 'GradObj','on','Algorithm','trust-region');
dataX = xlsread('Xt_dividends_trial3(ln)');
dataM = xlsread('Mt_performance_Variable');
dataR = xlsread('Rt_log_returns');
d_VW = dataX(:,2); % Dividend yields for Value weighted index
MVW = dataM(:,2); % performance variable for Value weighted index
RVW = dataR(:,2); % log returns for Value weighted index
N = length(dataX); dt = 1;
opts = optimset(fminunc, 'GradObj','on','Algorithm','trust-region');
objfun = @(theta) mlf4(theta, RVW, MVW, d_VW, N, dt);
theta0 = [-.0577;.0135;.0524; .011; .0092; .016; .15];
[theta, fv] = fminunc(objfun, theta0, opts)
By the way, I need to keep theta0 unchanged.
Function:
function f = mlf4(theta, RVW, MVW, d_VW, N, dt)
    % Unpack parameters
    sigmaX1 = theta(1);
    sigmaX2 = theta(2);
    sigmaS1 = theta(3);
    alpha   = theta(4);
    mu0     = theta(5);
    mu1     = theta(6);
    phi     = theta(7);
    % K = [sigmaS1^2, sigmaS1*sigmaX1*mu1; ...
    %      sigmaS1*sigmaX1*mu1, ((sigmaX1 + sigmaX2)*mu1)^2];
    % Constant term
    f0 = N*0.5*log((2*pi)^2*mu1^2*sigmaS1^2*sigmaX2^2 + 2*sigmaX1*mu1^2*sigmaS1^2*sigmaX2);
    % Accumulate the quadratic-form terms over the observations
    f1 = 0;
    for j = 2:N
        A = [RVW(j) - ((1-phi)*(mu0+mu1*d_VW(j-1))+phi*MVW(j-1))*dt; ...
             mu1*(d_VW(j)-d_VW(j-1)*(1-alpha*dt))];
        f1 = f1 + 0.5*A'*[(sigmaX1 + sigmaX2)^2/(sigmaS1^2*sigmaX2^2 + 2*sigmaX1*sigmaS1^2*sigmaX2), ...
                          -sigmaX1/(mu1*sigmaS1*sigmaX2^2 + 2*mu1*sigmaS1*sigmaX1*sigmaX2); ...
                          -sigmaX1/(mu1*sigmaS1*sigmaX2^2 + 2*mu1*sigmaS1*sigmaX1*sigmaX2), ...
                          1/(mu1^2*sigmaX2^2 + 2*sigmaX1*mu1^2*sigmaX2)]*A;
    end
    f = f0 + f1;
end
Thanks in advance.

Answers (1)

Torsten on 1 Apr 2015
Use
options = optimoptions('fminunc','GradObj','on','Algorithm','trust-region');
theta0 = [-.0577, .0135, .0524, .011, .0092, .016, .15];
and supply the gradient in mlf4.
Best wishes
Torsten.
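
For reference, a minimal sketch of the corrected driver script. It assumes mlf4 has been extended to also return the gradient as a second output (see the comment below); everything else is taken from the question. The solver name must be passed as a string: writing optimset(fminunc, ...) actually calls fminunc with no arguments, which is what produced the original error.

dataX = xlsread('Xt_dividends_trial3(ln)');
dataM = xlsread('Mt_performance_Variable');
dataR = xlsread('Rt_log_returns');
d_VW = dataX(:,2);   % Dividend yields for Value weighted index
MVW  = dataM(:,2);   % performance variable for Value weighted index
RVW  = dataR(:,2);   % log returns for Value weighted index
N = length(dataX); dt = 1;
% Solver name as a string; 'GradObj','on' tells fminunc that the objective
% returns the gradient as a second output (required by 'trust-region').
opts = optimoptions('fminunc', 'GradObj', 'on', 'Algorithm', 'trust-region');
objfun = @(theta) mlf4(theta, RVW, MVW, d_VW, N, dt);
theta0 = [-.0577; .0135; .0524; .011; .0092; .016; .15];   % kept unchanged
[theta, fv] = fminunc(objfun, theta0, opts)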
Armagan Ozbilge on 1 Apr 2015
Hello,
Thank you for your help. However, what do you mean by supplying the gradient in mlf4? Something like
function [f, fGrad] = mlf4(theta, RVW, MVW, d_VW, N, dt)
.
.
.
fGrad = jacobian(f, theta);
or something else? If so, how can I put this gradient into fminunc?
Thanks in advance.
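
For context: with 'GradObj' set to 'on', fminunc calls the objective with two output arguments, so mlf4 would need the signature function [f, fGrad] = mlf4(theta, RVW, MVW, d_VW, N, dt) and return a 7-by-1 vector of hand-derived partial derivatives as fGrad; the symbolic jacobian function is not involved. A generic, hypothetical sketch of the pattern, using the Rosenbrock function as a stand-in because the derivatives of mlf4 with respect to the seven parameters have to be worked out by hand:

function [f, g] = rosen(x)
    % Objective value
    f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
    % Gradient, computed only when the solver requests a second output
    if nargout > 1
        g = [-400*(x(2) - x(1)^2)*x(1) - 2*(1 - x(1));
              200*(x(2) - x(1)^2)];
    end
end

% Called as:
opts = optimoptions('fminunc', 'GradObj', 'on', 'Algorithm', 'trust-region');
[x, fval] = fminunc(@rosen, [-1; 2], opts);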

