
Minimization with Gradient and Hessian

This example shows how to solve a nonlinear minimization problem with an explicit tridiagonal Hessian matrix H(x).

The problem is to find x to minimize

$$f(x)=\sum_{i=1}^{n-1}\Bigl((x_i^2)^{(x_{i+1}^2+1)}+(x_{i+1}^2)^{(x_i^2+1)}\Bigr),$$

where n = 1000.
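
As a quick sanity check on the formula (a hypothetical snippet, not part of the shipped example), you can evaluate f directly at the starting point used in Step 2. Every $x_i^2$ equals 1 there, so each of the $n-1$ terms equals 2, giving $f = 2(n-1) = 1998$:

n = 1000;
x = -ones(n,1); x(2:2:n) = 1;   % the starting point xstart from Step 2
i = 1:n-1;
f0 = sum((x(i).^2).^(x(i+1).^2+1) + (x(i+1).^2).^(x(i).^2+1))   % f0 = 1998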

Step 1: Write a file brownfgh.m that computes the objective function, the gradient of the objective, and the sparse tridiagonal Hessian matrix.

The file is lengthy, so it is not included here. View the code with the command

type brownfgh
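
For reference, the following is a minimal sketch of what such a file can look like. It is not the shipped brownfgh (the name brownsketch is made up for illustration); it evaluates the same objective, its analytic gradient, and the sparse tridiagonal Hessian, computing the derivatives only when the caller requests them through nargout:

function [f,g,H] = brownsketch(x)
% Sketch of a brownfgh-style objective: value, gradient, and sparse
% tridiagonal Hessian of the function above (illustration, not brownfgh)
x = x(:);                                   % ensure a column vector
n = length(x);
i = 1:n-1;
u = x(i); v = x(i+1);                       % adjacent pairs x_i, x_{i+1}
A = (u.^2).^(v.^2+1);                       % first summand of each term
B = (v.^2).^(u.^2+1);                       % second summand
f = sum(A + B);
if nargout > 1                              % gradient requested
    g = zeros(n,1);
    g(i)   = g(i)   + 2*u.*(v.^2+1).*(u.^2).^(v.^2) + 2*u.*B.*log(v.^2);
    g(i+1) = g(i+1) + 2*v.*A.*log(u.^2) + 2*v.*(u.^2+1).*(v.^2).^(u.^2);
end
if nargout > 2                              % Hessian requested
    d = zeros(n,1);                         % diagonal of H
    d(i)   = d(i)   + 2*(v.^2+1).*(1+2*v.^2).*(u.^2).^(v.^2) ...
                    + 2*B.*log(v.^2).*(1+2*u.^2.*log(v.^2));
    d(i+1) = d(i+1) + 2*A.*log(u.^2).*(1+2*v.^2.*log(u.^2)) ...
                    + 2*(u.^2+1).*(1+2*u.^2).*(v.^2).^(u.^2);
    e = 4*u.*v.*((u.^2).^(v.^2).*(1+(v.^2+1).*log(u.^2)) ...
               + (v.^2).^(u.^2).*(1+(u.^2+1).*log(v.^2)));  % off-diagonal
    H = sparse(i(:),i(:)+1,e,n,n);          % superdiagonal entries
    H = H + H' + spdiags(d,0,n,n);          % symmetric tridiagonal H
end

Guarding the derivative computations with nargout keeps calls that need only the function value cheap; with the options set in the next paragraph, the trust-region algorithm requests all three outputs.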

Because brownfgh computes the gradient and Hessian values as well as the objective function, use optimoptions to indicate that this information is available in brownfgh by setting the SpecifyObjectiveGradient and HessianFcn options.

Step 2: Call a nonlinear minimization routine with a starting point xstart.

n = 1000;
xstart = -ones(n,1);
xstart(2:2:n,1) = 1;
options = optimoptions(@fminunc,'Algorithm','trust-region',...
    'SpecifyObjectiveGradient',true,'HessianFcn','objective');
[x,fval,exitflag,output] = fminunc(@brownfgh,xstart,options);

This 1000-variable problem is solved in about 7 iterations and 7 conjugate gradient iterations with a positive exitflag indicating convergence. The final function value and the measure of optimality at the solution x are both close to zero. For fminunc, the first-order optimality measure is the infinity norm of the gradient of the function, which is zero at a local minimum:

fval,exitflag,output.firstorderopt

fval =

   2.8709e-17


exitflag =

     1


ans =

   4.7948e-10
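
To see the connection concretely, you can evaluate the gradient at the returned solution yourself (assuming brownfgh is on the path) and compare its infinity norm with the reported optimality measure:

[~,g] = brownfgh(x);    % gradient at the computed solution
norm(g,Inf)             % same quantity as output.firstorderopt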
