
The goal in this problem is to minimize the nonlinear function

$$f(x) = 1 + \sum_{i=1}^{n} \left|\left(3-2x_i\right)x_i - x_{i-1} - x_{i+1} + 1\right|^{p} + \sum_{i=1}^{n/2} \left|x_i + x_{i+n/2}\right|^{p},$$

subject to $-10.0 \le x_i \le 10.0$, where $n = 800$ (with $n$ a multiple of 4), $p = 7/3$, and $x_0 = x_{n+1} = 0$.

The `tbroyfg.m` file computes the function value and gradient. This file is long and is not included here. You can see the code for this function using the command

```matlab
type tbroyfg
```

The sparsity pattern of the Hessian matrix has been predetermined and stored in the file `tbroyhstr.mat`. The sparsity structure for the Hessian of this problem is banded, as you can see in the following `spy` plot.

```matlab
load tbroyhstr
spy(Hstr)
```

In this plot, the center stripe is itself a five-banded matrix. The following plot shows the matrix more clearly:

```matlab
spy(Hstr(1:20,1:20))
```

Use `optimoptions` to set the `HessPattern` option to `Hstr`. When a problem as large as this one has an obvious sparsity structure, not setting the `HessPattern` option requires a huge amount of unnecessary memory and computation, because `fmincon` attempts to use finite differencing on a full Hessian matrix of 640,000 nonzero entries.
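As a rough illustration of the savings (a sketch; the exact nonzero count depends on the pattern stored in `tbroyhstr.mat`), you can compare the number of entries finite differencing must estimate with and without the pattern:

```matlab
load tbroyhstr                   % Hstr: 800-by-800 sparsity pattern
n = size(Hstr,1);
full_entries   = n^2             % 640,000 entries without HessPattern
sparse_entries = nnz(Hstr)       % only the structural nonzeros with HessPattern
```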

You must also set the `SpecifyObjectiveGradient` option to `true` using `optimoptions`, since the gradient is computed in `tbroyfg.m`. Then execute `fmincon` as shown in Step 2.

```matlab
fun = @tbroyfg;
load tbroyhstr      % Get Hstr, structure of the Hessian
n = 800;
xstart = -ones(n,1);
xstart(2:2:n) = 1;
lb = -10*ones(n,1);
ub = -lb;
options = optimoptions('fmincon','SpecifyObjectiveGradient',true,'HessPattern',Hstr,...
    'Algorithm','trust-region-reflective');
[x,fval,exitflag,output] = ...
    fmincon(fun,xstart,[],[],[],[],lb,ub,[],options);
```

The `exitflag`, `fval`, first-order optimality measure (`output.firstorderopt`), and number of iterations (`output.iterations`) are:

```matlab
exitflag,fval,output.firstorderopt,output.iterations

exitflag =
     3
fval =
  270.4790
ans =
    0.0163
ans =
     7
```

For bound constrained problems, the first-order optimality is the infinity norm of `v.*g`, where `v` is defined as in Box Constraints, and `g` is the gradient.
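As a sketch of how that measure is formed (on hypothetical data, not the `tbroyfg` run; the piecewise definition of `v` here follows the Coleman–Li scaling described in Box Constraints, which is an assumption of this sketch):

```matlab
% Small synthetic bound-constrained example: gradient g at a point x
% with bounds lb <= x <= ub.
x  = [ 0.5; -10.0;  3.0];
lb = [-10 ; -10  ; -10 ];
ub = [ 10 ;  10  ;  10 ];
g  = [ 0.2;  1.5 ; -0.1];
v = zeros(size(x));
v(g <  0) = x(g <  0) - ub(g <  0);   % distance to upper bound where g < 0
v(g >= 0) = x(g >= 0) - lb(g >= 0);   % distance to lower bound where g >= 0
firstorderopt = norm(v.*g, Inf)       % here 2.1000
```

Note that the second component sits at its lower bound with a positive gradient, so its contribution `v(2)*g(2)` vanishes: an active bound with a gradient pushing into it does not count against optimality.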

Because of the five-banded center stripe, you can improve the solution by using a five-banded preconditioner instead of the default diagonal preconditioner. Using the `optimoptions` function, reset the `PrecondBandWidth` option to `2` and solve the problem again. (The bandwidth is the number of upper (or lower) diagonals, not counting the main diagonal.)
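To make the bandwidth convention concrete (a small standalone sketch, not part of the example): with `PrecondBandWidth` set to `2`, the preconditioner draws on five diagonals of the Hessian, the main diagonal plus two above and two below. You can extract such a band with `spdiags`:

```matlab
H = sparse(magic(6));                      % any 6-by-6 test matrix
band = spdiags(spdiags(H,-2:2), -2:2, 6, 6);   % keep diagonals -2..2 only
full(band)                                 % zero outside the five diagonals
```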

```matlab
fun = @tbroyfg;
load tbroyhstr      % Get Hstr, structure of the Hessian
n = 800;
xstart = -ones(n,1);
xstart(2:2:n,1) = 1;
lb = -10*ones(n,1);
ub = -lb;
options = optimoptions('fmincon','SpecifyObjectiveGradient',true,'HessPattern',Hstr, ...
    'Algorithm','trust-region-reflective','PrecondBandWidth',2);
[x,fval,exitflag,output] = ...
    fmincon(fun,xstart,[],[],[],[],lb,ub,[],options);
```

The number of iterations increases by two, but the first-order optimality measure is reduced by more than two orders of magnitude:

```matlab
exitflag,fval,output.firstorderopt,output.iterations

exitflag =
     3
fval =
  270.4790
ans =
   7.5340e-05
ans =
     9
```
