patternsearch

Find minimum of function using pattern search

Syntax

x = patternsearch(fun,x0)
x = patternsearch(fun,x0,A,b)
x = patternsearch(fun,x0,A,b,Aeq,beq)
x = patternsearch(fun,x0,A,b,Aeq,beq,LB,UB)
x = patternsearch(fun,x0,A,b,Aeq,beq,LB,UB,nonlcon)
x = patternsearch(fun,x0,A,b,Aeq,beq,LB,UB,nonlcon,options)
x = patternsearch(problem)
[x,fval] = patternsearch(fun,x0, ...)
[x,fval,exitflag] = patternsearch(fun,x0, ...)
[x,fval,exitflag,output] = patternsearch(fun,x0, ...)

Description

patternsearch finds the minimum of a function using a pattern search.

x = patternsearch(fun,x0) finds a local minimum, x, of the objective function computed by the function handle fun. For details on writing fun, see Compute Objective Functions. x0 is a real vector specifying an initial point for the pattern search algorithm.

    Note   To write a function that patternsearch can call and that takes parameters in addition to the independent variables, see Passing Extra Parameters in the Optimization Toolbox™ documentation.
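For example, the following sketch uses a hypothetical quadratic objective; the extra parameter a is held fixed by capturing it in an anonymous function, as described in Passing Extra Parameters:

a = 3;                                   % extra parameter, fixed during the search (hypothetical)
fun = @(x) (x(1) - a)^2 + (x(2) + 1)^2;  % anonymous function handle that captures a
x0 = [0 0];                              % initial point, a real vector
x = patternsearch(fun,x0)                % unconstrained pattern search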

x = patternsearch(fun,x0,A,b) finds a local minimum x to the function fun, subject to the linear inequality constraints represented in matrix form by A*x ≤ b. See Linear Inequality Constraints.

If the problem has m linear inequality constraints and n variables, then

  • A is a matrix of size m-by-n.

  • b is a vector of length m.

x = patternsearch(fun,x0,A,b,Aeq,beq) finds a local minimum x to the function fun, starting at x0, and subject to the constraints

A*x ≤ b
Aeq*x = beq,

where Aeq*x = beq represents the linear equality constraints in matrix form. See Linear Equality Constraints. If the problem has r linear equality constraints and n variables, then

  • Aeq is a matrix of size r-by-n.

  • beq is a vector of length r.

If there are no inequality constraints, pass empty matrices, [], for A and b.
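As a sketch (with a hypothetical objective function), a single linear inequality x1 + x2 ≤ 2 and a single linear equality x1 - x2 = 1 could be passed as follows:

fun = @(x) x(1)^2 + x(2)^2;    % hypothetical objective
A   = [1 1];   b   = 2;        % linear inequality: x1 + x2 <= 2
Aeq = [1 -1];  beq = 1;        % linear equality:   x1 - x2  = 1
x = patternsearch(fun,[1 0],A,b,Aeq,beq)

% With only the equality constraint, pass [] for A and b:
x = patternsearch(fun,[1 0],[],[],Aeq,beq)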

x = patternsearch(fun,x0,A,b,Aeq,beq,LB,UB) defines a set of lower and upper bounds on the design variables, x, so that a solution is found in the range LB ≤ x ≤ UB. See Bound Constraints. If the problem has n variables, LB and UB are vectors of length n. If LB or UB is empty (or not provided), it is automatically expanded to -Inf or Inf, respectively. If there are no inequality or equality constraints, pass empty matrices for A, b, Aeq, and beq.
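For instance, the following sketch (hypothetical objective) bounds only the first variable and leaves the second unbounded:

fun = @(x) (x(1) + 2)^2 + x(2)^2;   % hypothetical objective; unconstrained minimum at (-2,0)
lb  = [0 -Inf];                     % lower bounds: x1 >= 0, x2 unbounded below
ub  = [10 Inf];                     % upper bounds: x1 <= 10, x2 unbounded above
x = patternsearch(fun,[1 1],[],[],[],[],lb,ub)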

x = patternsearch(fun,x0,A,b,Aeq,beq,LB,UB,nonlcon) subjects the minimization to the constraints defined in nonlcon, a nonlinear constraint function. The function nonlcon accepts x and returns the vectors C and Ceq, representing the nonlinear inequalities and equalities, respectively. patternsearch minimizes fun such that C(x) ≤ 0 and Ceq(x) = 0. (Set LB = [] and UB = [] if no bounds exist.)
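As a sketch, a constraint function might keep x inside the unit disk; the file name mycon.m and the objective below are hypothetical:

function [C,Ceq] = mycon(x)     % constraint function, saved as mycon.m (hypothetical)
C   = x(1)^2 + x(2)^2 - 1;      % nonlinear inequality: C(x) <= 0 keeps x in the unit disk
Ceq = [];                       % no nonlinear equality constraints
end

fun = @(x) (x(1) - 2)^2 + (x(2) - 2)^2;   % hypothetical objective
x = patternsearch(fun,[0 0],[],[],[],[],[],[],@mycon)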

x = patternsearch(fun,x0,A,b,Aeq,beq,LB,UB,nonlcon,options) minimizes fun with the default optimization parameters replaced by values in options. The structure options can be created using psoptimset.
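For example, the following sketch (hypothetical objective) turns on iterative display and tightens the mesh tolerance; Display and TolMesh are standard psoptimset options:

fun = @(x) x(1)^2 + x(2)^2;                              % hypothetical objective
options = psoptimset('Display','iter','TolMesh',1e-8);   % iterative display, tighter mesh tolerance
x = patternsearch(fun,[3 3],[],[],[],[],[],[],[],options)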

x = patternsearch(problem) finds the minimum for problem, where problem is a structure containing the following fields:

  • objective — Objective function

  • X0 — Starting point

  • Aineq — Matrix for linear inequality constraints

  • bineq — Vector for linear inequality constraints

  • Aeq — Matrix for linear equality constraints

  • beq — Vector for linear equality constraints

  • lb — Lower bound for x

  • ub — Upper bound for x

  • nonlcon — Nonlinear constraint function

  • solver — 'patternsearch'

  • options — Options structure created with psoptimset

  • rngstate — Optional field to reset the state of the random number generator

Create the structure problem by exporting a problem from the Optimization app, as described in Importing and Exporting Your Work in the Optimization Toolbox documentation.

    Note   problem must have all the fields as specified above.
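You can also assemble the structure at the command line. The following is a minimal sketch using the field names listed above and a hypothetical objective (rngstate is optional and omitted here):

problem.objective = @(x) (x(1) - 1)^2 + x(2)^2;   % hypothetical objective
problem.X0        = [0 0];                        % starting point
problem.Aineq     = [];   problem.bineq = [];     % no linear inequality constraints
problem.Aeq       = [];   problem.beq   = [];     % no linear equality constraints
problem.lb        = [];   problem.ub    = [];     % no bounds
problem.nonlcon   = [];                           % no nonlinear constraints
problem.solver    = 'patternsearch';
problem.options   = psoptimset('Display','final');
[x,fval] = patternsearch(problem)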

[x,fval] = patternsearch(fun,x0, ...) returns the value of the objective function fun at the solution x.

[x,fval,exitflag] = patternsearch(fun,x0, ...) returns exitflag, which describes the exit condition of patternsearch. Possible values of exitflag and the corresponding conditions are

Exit Flag  Meaning

1     Without nonlinear constraints — Magnitude of the mesh size is less than the specified tolerance and constraint violation is less than TolCon.

      With nonlinear constraints — Magnitude of the complementarity measure (defined after this table) is less than sqrt(TolCon), the subproblem is solved using a mesh finer than TolMesh, and the constraint violation is less than TolCon.

2     Change in x and the mesh size are both less than the specified tolerance, and the constraint violation is less than TolCon.

3     Change in fval and the mesh size are both less than the specified tolerance, and the constraint violation is less than TolCon.

4     Magnitude of step smaller than machine precision and the constraint violation is less than TolCon.

0     Maximum number of function evaluations or iterations reached.

-1    Optimization terminated by an output function or plot function.

-2    No feasible point found.

In the nonlinear constraint solver, the complementarity measure is the norm of the vector whose elements are ci*λi, where ci is the nonlinear inequality constraint violation and λi is the corresponding Lagrange multiplier.
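The following sketch (hypothetical objective) shows one way to branch on exitflag after a run:

fun = @(x) x(1)^2 + x(2)^2;      % hypothetical objective
[x,fval,exitflag] = patternsearch(fun,[1 1]);
if exitflag > 0
    disp('patternsearch converged to a solution.')
elseif exitflag == 0
    disp('Function-evaluation or iteration limit reached.')
else
    disp('Stopped by an output or plot function, or no feasible point was found.')
end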

[x,fval,exitflag,output] = patternsearch(fun,x0, ...) returns a structure output containing information about the search. The output structure contains the following fields:

  • function — Objective function.

  • problemtype — String describing the type of problem, one of:

    • 'unconstrained'

    • 'boundconstraints'

    • 'linearconstraints'

    • 'nonlinearconstr'

  • pollmethod — Polling technique.

  • searchmethod — Search technique used, if any.

  • iterations — Total number of iterations.

  • funccount — Total number of function evaluations.

  • meshsize — Mesh size at x.

  • maxconstraint — Maximum constraint violation, if any.

  • rngstate — State of the MATLAB® random number generator, just before the algorithm started. You can use the values in rngstate to reproduce the output when you use a random search method or random poll method. See Reproduce Results, which discusses the identical technique for ga.

  • message — Reason why the algorithm terminated.
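The fields listed above can be inspected directly after a run; a minimal sketch with a hypothetical objective:

fun = @(x) x(1)^2 + x(2)^2;                       % hypothetical objective
[x,fval,exitflag,output] = patternsearch(fun,[1 1]);
output.problemtype     % type of problem solved, e.g. 'unconstrained'
output.iterations      % total number of iterations
output.funccount       % total number of function evaluations
output.message         % reason the algorithm terminated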

    Note   patternsearch does not accept functions whose inputs are of type complex. To solve problems involving complex data, write your functions so that they accept real vectors, by separating the real and imaginary parts.

Examples

Given the following constraints

    [  1   1 ]   [ x1 ]       [ 2 ]
    [ -1   2 ]   [ x2 ]   ≤   [ 2 ],      x1 ≥ 0,   x2 ≥ 0,
    [  2   1 ]                [ 3 ]

the following code finds the minimum of the function, lincontest6, that is provided with your software:

A = [1 1; -1 2; 2 1];    % linear inequality constraint matrix
b = [2; 2; 3];           % right-hand side of A*x <= b
lb = zeros(2,1);         % lower bounds: x1 >= 0, x2 >= 0
[x,fval,exitflag] = patternsearch(@lincontest6,[0 0],...
                                    A,b,[],[],lb)
Optimization terminated: mesh size less than 
                         options.TolMesh.

x =
    0.6667    1.3333

fval =
   -8.2222

exitflag =
     1

