There are seven Global Optimization Toolbox solvers:

- `ga` (genetic algorithm)
- `GlobalSearch`
- `MultiStart`
- `patternsearch`, also called direct search
- `particleswarm`
- `simulannealbnd` (simulated annealing)
- `gamultiobj`, which is not a minimizer; see Multiobjective Optimization

Choose an optimizer based on problem characteristics and on the type of solution you want. Solver Characteristics contains more information to help you decide which solver is likely to be most suitable.

Desired Solution | Smooth Objective and Constraints | Nonsmooth Objective or Constraints |
---|---|---|
Explanation of "Desired Solution" | Choosing Between Solvers for Smooth Problems | Choosing Between Solvers for Nonsmooth Problems |
Single local solution | Optimization Toolbox™ functions; see Optimization Decision Table in the Optimization Toolbox documentation | `fminbnd`, `patternsearch`, `fminsearch`, `ga`, `particleswarm`, `simulannealbnd` |
Multiple local solutions | `GlobalSearch`, `MultiStart` | |
Single global solution | `GlobalSearch`, `MultiStart`, `patternsearch`, `ga`, `simulannealbnd` | `patternsearch`, `ga`, `particleswarm`, `simulannealbnd` |
Single local solution using parallel processing | `MultiStart`, Optimization Toolbox functions | `patternsearch`, `ga`, `particleswarm` |
Multiple local solutions using parallel processing | `MultiStart` | |
Single global solution using parallel processing | `MultiStart` | `patternsearch`, `ga`, `particleswarm` |

To understand the meaning of the terms in "Desired Solution," consider the example

*f*(*x*) = 100*x*^2(1 – *x*)^2 – *x*,

which has local minima `x1` near 0 and `x2` near 1:

Code for generating the figure
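The original figure is not reproduced here; a minimal sketch that plots the objective and marks the two minima might look like this (the plotting range is an assumption):

```matlab
% Sketch: plot the example objective and mark its two local minima.
fun = @(x)100*x.^2.*(1 - x).^2 - x;   % vectorized for plotting
fplot(fun,[-0.5 1.5])
hold on
xmins = [0.0051 1.0049];              % minima found by fminsearch below
plot(xmins,fun(xmins),'ro')
xlabel('x'); ylabel('f(x)')
hold off
```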

The minima are located at:

```matlab
x1 = fminsearch(@(x)(100*x^2*(x - 1)^2 - x),0)

x1 =
    0.0051

x2 = fminsearch(@(x)(100*x^2*(x - 1)^2 - x),1)

x2 =
    1.0049
```

**Description of the Terms**

Term | Meaning |
---|---|
Single local solution | Find one local solution, a point *x* where the objective function *f*(*x*) is a local minimum. For more details, see Local vs. Global Optima. In the example, both `x1` and `x2` are local solutions. |
Multiple local solutions | Find a set of local solutions. In the example, the complete set of local solutions is `{x1,x2}`. |
Single global solution | Find the point *x* where the objective function *f*(*x*) is a global minimum. In the example, the global solution is `x2`. |

**Single Global Solution**

1. Try `GlobalSearch` first. It is most focused on finding a global solution, and has an efficient local solver, `fmincon`.
2. Try `MultiStart` second. It has efficient local solvers, and can search a wide variety of start points.
3. Try `patternsearch` third. It is less efficient, since it does not use gradients. However, `patternsearch` is robust and is more efficient than the remaining local solvers.
4. Try `particleswarm` fourth, if your problem is unconstrained or has only bound constraints. Usually, `particleswarm` is more efficient than the remaining solvers, and can be more efficient than `patternsearch`.
5. Try `ga` fifth. It can handle all types of constraints, and is usually more efficient than `simulannealbnd`.
6. Try `simulannealbnd` last. It can handle problems with no constraints or bound constraints. `simulannealbnd` is usually the least efficient solver. However, given a slow enough cooling schedule, it can find a global solution.
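As a starting point, the first recommendation can be sketched as follows; the example reuses the objective from above, and the start point and bounds are assumptions for illustration:

```matlab
% Sketch: GlobalSearch on the example objective.
gs = GlobalSearch;
problem = createOptimProblem('fmincon', ...
    'objective',@(x)100*x^2*(1 - x)^2 - x, ...
    'x0',0.5,'lb',-1,'ub',2);     % start point and bounds assumed
[xg,fg] = run(gs,problem);        % xg should be near the global minimum x2
```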

**Multiple Local Solutions.** `GlobalSearch` and `MultiStart` both provide multiple local solutions. For the syntax to obtain multiple solutions, see Multiple Solutions. `GlobalSearch` and `MultiStart` differ in the following characteristics:

- `MultiStart` can find more local minima. This is because `GlobalSearch` rejects many generated start points (initial points for local solution). Essentially, `GlobalSearch` accepts a start point only when it determines that the point has a good chance of obtaining a global minimum. In contrast, `MultiStart` passes all generated start points to a local solver. For more information, see GlobalSearch Algorithm.
- `MultiStart` offers a choice of local solver: `fmincon`, `fminunc`, `lsqcurvefit`, or `lsqnonlin`. The `GlobalSearch` solver uses only `fmincon` as its local solver.
- `GlobalSearch` uses a scatter-search algorithm for generating start points. In contrast, `MultiStart` generates points uniformly at random within bounds, or allows you to provide your own points.
- `MultiStart` can run in parallel. See How to Use Parallel Processing.
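Collecting multiple local solutions with `MultiStart` can be sketched like this, again reusing the example objective (the number of start points and the bounds are assumptions):

```matlab
% Sketch: MultiStart finds several local minima of the example objective.
ms = MultiStart;
problem = createOptimProblem('fmincon', ...
    'objective',@(x)100*x^2*(1 - x)^2 - x, ...
    'x0',0.5,'lb',-1,'ub',2);                 % assumed start point and bounds
[x,f,exitflag,output,solutions] = run(ms,problem,50);  % 50 start points
% solutions is a vector of GlobalOptimSolution objects; each element
% holds one distinct local minimum and the start points that led to it.
```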

Choose the applicable solver with the lowest number. For problems with integer constraints, use `ga`.

1. Use `fminbnd` first on one-dimensional bounded problems only. `fminbnd` provably converges quickly in one dimension.
2. Use `patternsearch` on any other type of problem. `patternsearch` provably converges, and handles all types of constraints.
3. Try `fminsearch` next for low-dimensional unbounded problems. `fminsearch` is not as general as `patternsearch` and can fail to converge. For low-dimensional problems, `fminsearch` is simple to use, since it has few tuning options.
4. Try `particleswarm` next on unbounded or bound-constrained problems. `particleswarm` has little supporting theory, but is often an efficient algorithm.
5. Try `ga` next. `ga` has little supporting theory and is often less efficient than `patternsearch` or `particleswarm`. It handles all types of constraints. `ga` is the only solver that handles integer constraints.
6. Try `simulannealbnd` last for unbounded problems, or for problems with bounds. `simulannealbnd` provably converges only for a logarithmic cooling schedule, which is extremely slow. `simulannealbnd` takes only bound constraints, and is often less efficient than `ga`.
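The main recommendation for nonsmooth problems, `patternsearch`, can be sketched on the example objective as follows (the start point and bounds are assumptions):

```matlab
% Sketch: patternsearch with bound constraints on the example objective.
% Signature: patternsearch(fun,x0,A,b,Aeq,beq,lb,ub)
objfun = @(x)100*x^2*(1 - x)^2 - x;
x0 = 0.5; lb = -1; ub = 2;                    % assumed values
[x,fval] = patternsearch(objfun,x0,[],[],[],[],lb,ub);
```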

Solver | Convergence | Characteristics |
---|---|---|
`GlobalSearch` | Fast convergence to local optima for smooth problems. | Deterministic iterates; gradient-based; automatic stochastic start points; removes many start points heuristically |
`MultiStart` | Fast convergence to local optima for smooth problems. | Deterministic iterates; can run in parallel (see How to Use Parallel Processing); gradient-based; stochastic or deterministic start points, or a combination of both; runs all start points; choice of local solver: `fmincon`, `fminunc`, `lsqcurvefit`, or `lsqnonlin` |
`patternsearch` | Proven convergence to a local optimum; slower than gradient-based solvers. | Deterministic iterates; can run in parallel (see How to Use Parallel Processing); no gradients; user-supplied start point |
`particleswarm` | No convergence proof. | Stochastic iterates; can run in parallel (see How to Use Parallel Processing); population-based; no gradients; automatic start population, user-supplied population, or a combination of both; only bound constraints |
`ga` | No convergence proof. | Stochastic iterates; can run in parallel (see How to Use Parallel Processing); population-based; no gradients; allows integer constraints (see Mixed Integer Optimization); automatic start population, user-supplied population, or a combination of both |
`simulannealbnd` | Proven to converge to a global optimum for bounded problems with a very slow cooling schedule. | Stochastic iterates; no gradients; user-supplied start point; only bound constraints |
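For instance, `simulannealbnd` takes only a user-supplied start point and optional bounds, which a minimal call reflects (the start point and bounds below are assumptions):

```matlab
% Sketch: simulannealbnd on the example objective.
% Signature: simulannealbnd(fun,x0,lb,ub)
objfun = @(x)100*x^2*(1 - x)^2 - x;
[x,fval] = simulannealbnd(objfun,0.5,-1,2);   % assumed x0, lb, ub
```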

Explanation of some characteristics:

- Convergence — Solvers can fail to converge to any solution when started far from a local minimum. When started near a local minimum, gradient-based solvers converge to a local minimum quickly for smooth problems. `patternsearch` provably converges for a wide range of problems, but the convergence is slower than that of gradient-based solvers. Both `ga` and `simulannealbnd` can fail to converge in a reasonable amount of time for some problems, although they are often effective.
- Iterates — Solvers iterate to find solutions. The steps in the iteration are iterates. Some solvers have deterministic iterates. Others use random numbers and have stochastic iterates.
- Gradients — Some solvers use estimated or user-supplied derivatives in calculating the iterates. Other solvers do not use or estimate derivatives, but use only objective and constraint function values.
- Start points — Most solvers require you to provide a starting point for the optimization. One reason they require a start point is to obtain the dimension of the decision variables. `ga` does not require any starting points, because it takes the dimension of the decision variables as an input. `ga` can generate its start population automatically.
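This property of `ga` shows up in its calling syntax, which takes the number of variables rather than a start point; a sketch with an integer constraint (the objective, bounds, and integer index below are hypothetical):

```matlab
% Sketch: ga with the number of variables as input and an integer
% constraint on x(1).
% Signature: ga(fun,nvars,A,b,Aeq,beq,lb,ub,nonlcon,intcon)
objfun = @(x)(x(1) - 2)^2 + (x(2) - 0.5)^2;   % hypothetical objective
nvars = 2;
lb = [0 0]; ub = [5 5];                        % hypothetical bounds
intcon = 1;                                    % x(1) must be an integer
[x,fval] = ga(objfun,nvars,[],[],[],[],lb,ub,[],intcon);
```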

Compare the characteristics of Global Optimization Toolbox solvers to Optimization Toolbox solvers.

Solver | Convergence | Characteristics |
---|---|---|
`fmincon`, `fminunc`, `fseminf`, `lsqcurvefit`, `lsqnonlin` | Proven quadratic convergence to local optima for smooth problems. | Deterministic iterates; gradient-based; user-supplied starting point |
`fminsearch` | No convergence proof; counterexamples exist. | Deterministic iterates; no gradients; user-supplied start point; no constraints |
`fminbnd` | Proven convergence to local optima for smooth problems, slower than quadratic. | Deterministic iterates; no gradients; user-supplied start point; one-dimensional problems only |

All these Optimization Toolbox solvers:

- Have deterministic iterates
- Start from one user-supplied point
- Search just one basin of attraction

`GlobalSearch` and `MultiStart` are objects. What does this mean for you?

- You create a `GlobalSearch` or `MultiStart` object before running your problem.
- You can reuse the object for running multiple problems.
- `GlobalSearch` and `MultiStart` objects are containers for algorithms and global options. You use these objects to run a local solver multiple times. The local solver has its own options.
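Object reuse can be sketched like this; the two objectives and start points below are hypothetical:

```matlab
% Sketch: create one MultiStart object and reuse it on two problems.
ms = MultiStart;
p1 = createOptimProblem('fmincon','objective',@(x)x^2,'x0',1);
p2 = createOptimProblem('fmincon','objective',@(x)(x - 3)^2,'x0',0);
xa = run(ms,p1,20);   % same ms object, first problem, 20 start points
xb = run(ms,p2,20);   % reused on a second problem
```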

For more information, see the Object-Oriented Programming documentation.
