# Pattern Search Options

This example shows how to create and manage options for the pattern search function `patternsearch` using the `optimoptions` function in the Global Optimization Toolbox.

## Setting Up a Problem for Pattern Search

`patternsearch` finds a constrained minimum of a function. It attempts to solve problems of the form: min F(X) subject to A*X <= B and Aeq*X = Beq (linear constraints), C(X) <= 0 and Ceq(X) = 0 (nonlinear constraints), and LB <= X <= UB (bounds).

In this example, the `patternsearch` solver is used to optimize an objective function subject to some linear equality and inequality constraints.

The pattern search solver takes at least two input arguments: the objective function and a start point. We use the objective function `lincontest7`, defined in the MATLAB file `lincontest7.m`, and pass a function handle to it as the first argument. The second argument is the start point. Because `lincontest7` is a function of six variables, the start point must have six elements.

```
X0 = [2 1 0 9 1 0];
objectiveFcn = @lincontest7;
```

We also create the constraint matrices for the problem.

```
Aineq = [-8 7 3 -4 9 0];
Bineq = 7;
Aeq = [7 1 8 3 3 3;
       5 0 5 1 5 8;
       2 6 7 1 1 8;
       1 0 0 0 0 0];
Beq = [84 62 65 1];
```

Next we run the `patternsearch` solver.

```
[X1,Fval,Exitflag,Output] = patternsearch(objectiveFcn,X0,Aineq,Bineq,Aeq,Beq);
fprintf('The number of iterations was : %d\n', Output.iterations);
fprintf('The number of function evaluations was : %d\n', Output.funccount);
fprintf('The best function value found was : %g\n', Fval);
```

```
Optimization terminated: mesh size less than options.MeshTolerance.
The number of iterations was : 278
The number of function evaluations was : 1889
The best function value found was : 2189.03
```

## Adding Visualization

If you are interested in visualizing the performance of the solver at run time, you can use the provided plot functions or create your own. `patternsearch` can accept one or more plot functions, selected through the `optimoptions` function; by default, no plot functions are called while `patternsearch` is running. Here we create options that select two plot functions: `psplotbestf`, which plots the best objective function value at every iteration, and `psplotfuncount`, which plots the number of times the objective function is evaluated at each iteration.

```
opts = optimoptions(@patternsearch,'PlotFcn',{@psplotbestf,@psplotfuncount});
```
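Besides the built-in plot functions, you can write your own. The sketch below assumes the documented custom plot function interface, in which `patternsearch` passes an `optimvalues` structure (with fields such as `iteration` and `meshsize`) and a `flag` string at each stage of the run; the function name `myMeshPlot` is our own choice. It plots the mesh size on a log scale at each iteration:

```matlab
function stop = myMeshPlot(optimvalues, flag)
% myMeshPlot - illustrative custom plot function: plot the mesh size
% (log scale) at each iteration. Returning stop = true halts the solver.
stop = false;
switch flag
    case 'init'   % called before the first iteration
        semilogy(optimvalues.iteration, optimvalues.meshsize, '.b');
        xlabel('Iteration'); ylabel('Mesh size');
        title('Mesh size per iteration');
        hold on
    case 'iter'   % called after each iteration
        semilogy(optimvalues.iteration, optimvalues.meshsize, '.b');
    case 'done'   % called after the final iteration
        hold off
end
end
```

A custom plot function is passed in the same way as a built-in one, for example `opts = optimoptions(@patternsearch,'PlotFcn',@myMeshPlot);`.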

We run the `patternsearch` solver again. Since the problem has no bound constraints or nonlinear constraints, we pass empty arrays (`[]`) for the seventh, eighth, and ninth arguments.

```
[X1,Fval,ExitFlag,Output] = patternsearch(objectiveFcn,X0,Aineq,Bineq, ...
Aeq,Beq,[],[],[],opts);
```

Optimization terminated: mesh size less than options.MeshTolerance.

## Mesh Parameters

**Initial mesh size**

The pattern search algorithm uses a set of rational basis vectors to generate search directions and searches along those directions using the current mesh size. By default, the solver starts with an initial mesh size of 1. To set the initial mesh size to 10 instead, use `optimoptions`:

```
options = optimoptions(@patternsearch,'InitialMeshSize',10);
```

**Mesh scaling**

A mesh can be scaled to improve the minimization of a badly scaled optimization problem. Mesh scaling rotates the pattern to better align it with the problem and scales along the search directions. The `ScaleMesh` option is on (`true`) by default but can be turned off if the problem is well scaled. In general, if the problem is badly scaled, leaving this option set to `true` may help reduce the number of function evaluations. Because our objective function is known to be well scaled, we set `ScaleMesh` to `false`.

```
opts = optimoptions(@patternsearch,'ScaleMesh',false);
```

**Mesh accelerator**

Direct search methods require many function evaluations compared to derivative-based optimization methods. The pattern search algorithm can quickly find the neighborhood of an optimum point but may be slow in locating the minimum itself; this is the cost of not using derivatives. The `patternsearch` solver can reduce the number of function evaluations using an accelerator. When the accelerator is on (`opts.AccelerateMesh = true`), the mesh size contracts rapidly after some minimum mesh size is reached. This option is recommended only for smooth problems; otherwise you may lose some accuracy. The accelerator is off (`false`) by default. Here we set `AccelerateMesh` to `true` because we know that our objective function is smooth.

Set `AccelerateMesh` to `true` and set `ScaleMesh` to `false`.

```
opts = optimoptions(@patternsearch,'AccelerateMesh',true,'ScaleMesh',false);
```

Run the `patternsearch` solver.

```
[X2,Fval,ExitFlag,Output] = patternsearch(objectiveFcn,X0,Aineq,Bineq, ...
    Aeq,Beq,[],[],[],opts);
fprintf('The number of iterations was : %d\n', Output.iterations);
fprintf('The number of function evaluations was : %d\n', Output.funccount);
fprintf('The best function value found was : %g\n', Fval);
```

```
Optimization terminated: mesh size less than options.MeshTolerance.
The number of iterations was : 233
The number of function evaluations was : 1308
The best function value found was : 2189.03
```

Turning the accelerator on reduced the number of function evaluations.

## Stopping Criteria and Tolerances

**What are MeshTolerance, StepTolerance and FunctionTolerance?**

`MeshTolerance` is a tolerance on the mesh size: if the mesh size falls below `MeshTolerance`, the solver stops. `StepTolerance` is the minimum tolerance on the change from the current point to the next point. `FunctionTolerance` is the minimum tolerance on the change in function value from the current point to the next point.
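As an illustrative sketch (the tolerance values here are arbitrary, not recommendations), these tolerances can be set when creating options:

```matlab
% Illustrative values only: tighten the stopping tolerances
opts2 = optimoptions(@patternsearch, ...
    'MeshTolerance',1e-8, ...      % stop once the mesh size drops below 1e-8
    'FunctionTolerance',1e-8);     % tolerance on the change in function value
```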

We can set `StepTolerance` to 1e-7 when creating options with `optimoptions`, or update it directly in our previously created options `opts` using dot notation.

```
opts.StepTolerance = 1e-7;
```

## Search Methods in Pattern Search

The pattern search algorithm can perform an additional search at every iteration, specified by the `SearchFcn` option. When a search function is provided, that search is performed before the mesh search, commonly called the poll and specified by the `PollFcn` option. If the search is successful, the poll is skipped; if the search method fails to improve the current point, the poll method is performed.

The toolbox provides five different search methods. These include `searchga` and `searchneldermead`, which are two different optimization algorithms. It is recommended to use these methods only for the first iteration, which is the default; using them repeatedly at every iteration might not improve the results and could be computationally expensive. On the other hand, `searchlhs`, which generates Latin hypercube points, can be used at every iteration, or perhaps every 10 iterations. Other choices for search methods include poll methods such as positive basis N+1 or positive basis 2N.

A recommended strategy is to use the positive basis N+1 (which requires at most N+1 points to create a pattern) as a search method and the positive basis 2N (which requires 2N points to create a pattern) as a poll method. We update our options to use `positivebasisnp1` as the search method. Since the positive basis 2N is the default `PollFcn`, we do not need to set that option. Additionally, we select the plot functions used in the first part of this example to compare the performance of the solver.

```
opts = optimoptions(opts,'SearchFcn',@positivebasisnp1, ...
    'PlotFcn',{@psplotbestf, @psplotfuncount});
```

Run the `patternsearch` solver.

```
[X5,Fval,ExitFlag,Output] = patternsearch(objectiveFcn,X0,Aineq,Bineq,Aeq,Beq, ...
    [],[],[],opts);
fprintf('The number of iterations was : %d\n', Output.iterations);
fprintf('The number of function evaluations was : %d\n', Output.funccount);
fprintf('The best function value found was : %g\n', Fval);
```

```
Optimization terminated: mesh size less than options.MeshTolerance.
The number of iterations was : 54
The number of function evaluations was : 662
The best function value found was : 2189.03
```

With these option settings, the total number of function evaluations decreased further.