
This example illustrates how `GlobalSearch` finds a global minimum efficiently, and how `MultiStart` finds many more local minima.

The objective function for this example has many local minima and a unique global minimum. In polar coordinates, the function is

*f*(*r*,*t*) = *g*(*r*)*h*(*t*),

where

$$\begin{array}{c}g(r)=\left(\mathrm{sin}(r)-\frac{\mathrm{sin}(2r)}{2}+\frac{\mathrm{sin}(3r)}{3}-\frac{\mathrm{sin}(4r)}{4}+4\right)\frac{{r}^{2}}{r+1}\\ h(t)=2+\mathrm{cos}(t)+\frac{\mathrm{cos}\left(2t-\frac{1}{2}\right)}{2}\text{.}\end{array}$$

The global minimum is at *r* = 0, with objective
function 0. The function *g*(*r*)
grows approximately linearly in *r*, with a repeating
sawtooth shape. The function *h*(*t*)
has two local minima, one of which is global.
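To see the sawtooth landscape before optimizing, you can plot the surface over a modest range. This is a sketch: the range [–20, 20] and the grid spacing are arbitrary choices, and it assumes the `sawtoothxy` function defined in the next step is on your MATLAB path.

```matlab
% Visualize the many local minima of the objective
% (assumes sawtoothxy.m, defined below, is on the path)
[X,Y] = meshgrid(-20:0.5:20);
Z = sawtoothxy(X,Y);            % sawtoothxy is vectorized, so this evaluates elementwise
surf(X,Y,Z,'EdgeColor','none')
xlabel('x'); ylabel('y'); zlabel('f(x,y)')
```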

Write a function file to compute the objective:

```matlab
function f = sawtoothxy(x,y)
[t,r] = cart2pol(x,y); % change to polar coordinates
h = cos(2*t - 1/2)/2 + cos(t) + 2;
g = (sin(r) - sin(2*r)/2 + sin(3*r)/3 - sin(4*r)/4 + 4) ...
    .*r.^2./(r+1);
f = g.*h;
end
```

Create the problem structure. Use the `'sqp'` algorithm for `fmincon`:

```matlab
problem = createOptimProblem('fmincon',...
    'objective',@(x)sawtoothxy(x(1),x(2)),...
    'x0',[100,-50],'options',...
    optimoptions(@fmincon,'Algorithm','sqp','Display','off'));
```

The start point is `[100,-50]` instead of `[0,0]`, so `GlobalSearch` does not start at the global solution.

Add an artificial bound, and validate the problem structure by running `fmincon`:

```matlab
problem.lb = -Inf([2,1]);
[x,fval] = fmincon(problem)

x =
   45.7236 -107.6515

fval =
  555.5820
```

Create the `GlobalSearch` object, and set iterative display:

```matlab
gs = GlobalSearch('Display','iter');
```

Run the solver:

```matlab
rng(14,'twister') % for reproducibility
[x,fval] = run(gs,problem)

 Num Pts                 Best      Current    Threshold       Local      Local
Analyzed  F-count        f(x)      Penalty      Penalty        f(x)   exitflag       Procedure
       0      200       555.6                                 555.6          0   Initial Point
     200     1479   1.547e-15                             1.547e-15          1   Stage 1 Local
     300     1580   1.547e-15    5.858e+04        1.074                          Stage 2 Search
     400     1680   1.547e-15     1.84e+05         4.16                          Stage 2 Search
     500     1780   1.547e-15    2.683e+04        11.84                          Stage 2 Search
     600     1880   1.547e-15    1.122e+04        30.95                          Stage 2 Search
     700     1980   1.547e-15    1.353e+04        65.25                          Stage 2 Search
     800     2080   1.547e-15    6.249e+04        163.8                          Stage 2 Search
     900     2180   1.547e-15    4.119e+04        409.2                          Stage 2 Search
     950     2375   1.547e-15          477        589.7         387          2   Stage 2 Local
     952     2439   1.547e-15        368.4          477       250.7          2   Stage 2 Local
    1000     2487   1.547e-15    4.031e+04        530.9                          Stage 2 Search

GlobalSearch stopped because it analyzed all the trial points.

3 out of 4 local solver runs converged with a positive local solver exit flag.

x =
   1.0e-07 *
    0.0414    0.1298

fval =
   1.5467e-15
```

You can get different results, since `GlobalSearch` is stochastic. The solver found three local minima, and it found the global minimum near `[0,0]`.
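To inspect all the local minima that `GlobalSearch` found, you can request additional outputs from `run`; the fifth output is a vector of `GlobalOptimSolution` objects, sorted by objective value. This is a sketch that reruns the solver, so the exact values can differ from the display above:

```matlab
% Rerun, capturing every distinct local solution found
[x,fval,exitflag,output,solutions] = run(gs,problem);
localvals = [solutions.Fval]   % one objective value per local solution, best first
```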

Next, run the same problem using `MultiStart` to find many local minima. Write a function file to compute the objective (the same `sawtoothxy` as in the `GlobalSearch` example):

```matlab
function f = sawtoothxy(x,y)
[t,r] = cart2pol(x,y); % change to polar coordinates
h = cos(2*t - 1/2)/2 + cos(t) + 2;
g = (sin(r) - sin(2*r)/2 + sin(3*r)/3 - sin(4*r)/4 + 4) ...
    .*r.^2./(r+1);
f = g.*h;
end
```

Create the problem structure. Use the `fminunc` solver with the `Algorithm` option set to `'quasi-newton'`. The reasons for these choices are:

- The problem is unconstrained. Therefore, `fminunc` is the appropriate solver; see Optimization Decision Table in the Optimization Toolbox™ documentation.
- The default `fminunc` algorithm requires a gradient; see Choosing the Algorithm in the Optimization Toolbox documentation. Therefore, set `Algorithm` to `'quasi-newton'`.

```matlab
problem = createOptimProblem('fminunc',...
    'objective',@(x)sawtoothxy(x(1),x(2)),...
    'x0',[100,-50],'options',...
    optimoptions(@fminunc,'Algorithm','quasi-newton','Display','off'));
```

Validate the problem structure by running it:

```matlab
[x,fval] = fminunc(problem)

x =
    8.4420 -110.2602

fval =
  435.2573
```

Create a default `MultiStart` object:

```matlab
ms = MultiStart;
```

Run the solver with 50 local solver runs, recording the local minima:

```matlab
% rng(1) % uncomment to obtain the same result
[x,fval,eflag,output,manymins] = run(ms,problem,50)

MultiStart completed some of the runs from the start points.

10 out of 50 local solver runs converged with a positive local solver exit flag.

x =
  -73.8348 -197.7810

fval =
  766.8260

eflag =
     2

output =
                funcCount: 8583
         localSolverTotal: 50
       localSolverSuccess: 10
    localSolverIncomplete: 40
    localSolverNoSolution: 0
                  message: 'MultiStart completed some of the runs from the start po...'

manymins =
  1x10 GlobalOptimSolution array with properties:

    X
    Fval
    Exitflag
    Output
    X0
```

You can get different results, since `MultiStart` is stochastic. The solver did not find the global minimum near `[0,0]`. It found 10 distinct local minima.

Plot the function values at the local minima:

```matlab
histogram([manymins.Fval],10)
```

Plot the function values at the three best points:

```matlab
bestf = [manymins.Fval];
histogram(bestf(1:3),10)
```

`MultiStart` started `fminunc` from start points with components uniformly distributed between –1000 and 1000. `fminunc` often got stuck in one of the many local minima. `fminunc` exceeded its iteration limit or function evaluation limit 40 times.
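One way to give `fminunc` a better chance of landing in the basin of the global minimum is to draw the random start points from a smaller box by passing a `RandomStartPointSet` to `run`. This is a sketch; the bound of 100 is an arbitrary choice, and the run remains stochastic:

```matlab
% Draw 50 start points with components uniform in [-100,100] instead of [-1000,1000]
pts = RandomStartPointSet('NumStartPoints',50,'ArtificialBound',100);
[x,fval] = run(ms,problem,pts)
```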
