Find minimum of unconstrained multivariable function using derivative-free method
Finds the minimum of a problem specified by
$$\underset{x}{\mathrm{min}}f(x)$$
where f(x) is a function that returns a scalar, and x is a vector or a matrix.
x = fminsearch(fun,x0)
x = fminsearch(fun,x0,options)
x = fminsearch(problem)
[x,fval] = fminsearch(...)
[x,fval,exitflag] = fminsearch(...)
[x,fval,exitflag,output] = fminsearch(...)
fminsearch finds the minimum of a scalar function of several variables, starting at an initial estimate. This is generally referred to as unconstrained nonlinear optimization.
x = fminsearch(fun,x0) starts at the point x0 and returns a value x that is a local minimizer of the function described in fun. x0 can be a scalar, vector, or matrix. fun is a function handle; see Function Handles.
Parameterizing Functions in the MATLAB^{®} Mathematics documentation explains how to pass additional parameters to your objective function fun. See also Example 2 and Example 3 below.
x = fminsearch(fun,x0,options) minimizes with the optimization parameters specified in the structure options. You can define these parameters using the optimset function. fminsearch uses these options structure fields:
Display — Level of display.
FunValCheck — Check whether objective function values are valid.
MaxFunEvals — Maximum number of function evaluations allowed, a positive integer. The default is 200*numberOfVariables.
MaxIter — Maximum number of iterations allowed, a positive integer. The default is 200*numberOfVariables.
OutputFcn — Specify one or more user-defined functions that an optimization function calls at each iteration, either as a function handle or as a cell array of function handles. The default is none ([]).
PlotFcns — Plots various measures of progress while the algorithm executes; select from predefined plots or write your own. Pass a function handle or a cell array of function handles. The default is none ([]). See Plot Functions in MATLAB Mathematics for more information.
TolFun — Termination tolerance on the function value, a positive scalar. The default is 1e-4.
TolX — Termination tolerance on x, a positive scalar. The default is 1e-4.
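For instance, several of the fields above can be set together in a single optimset call; the objective function, starting point, and option values below are illustrative, not defaults:

```matlab
% Display progress at every iteration, tighten both termination
% tolerances, and raise the function-evaluation budget
options = optimset('Display','iter', ...
                   'TolX',1e-8, 'TolFun',1e-8, ...
                   'MaxFunEvals',500);

% Minimize an illustrative quadratic with these settings
[x,fval] = fminsearch(@(x)(x(1)-3)^2 + (x(2)+1)^2, [0,0], options);
```

Any field not set in the options structure keeps its default value.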
x = fminsearch(problem) finds the minimum for problem, where problem is a structure with the following fields:
objective — Objective function
x0 — Initial point for x
solver — 'fminsearch'
options — Options structure created using optimset
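As a sketch, the structure form of a call can be assembled field by field; the objective function and values here are illustrative:

```matlab
problem.objective = @(x)(x(1)-3)^2 + (x(2)+1)^2;  % objective function
problem.x0        = [0,0];                        % initial point for x
problem.solver    = 'fminsearch';                 % required solver name
problem.options   = optimset('TolX',1e-8);        % options structure

x = fminsearch(problem);
```

This form is convenient when the same problem definition is built programmatically or reused across calls.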
[x,fval] = fminsearch(...) returns in fval the value of the objective function fun at the solution x.
[x,fval,exitflag] = fminsearch(...) returns a value exitflag that describes the exit condition of fminsearch:

1 — The function converged to a solution x.
0 — Maximum number of function evaluations or iterations was reached.
-1 — The algorithm was terminated by the output function.
[x,fval,exitflag,output] = fminsearch(...) returns a structure output that contains information about the optimization in the following fields:

algorithm — 'Nelder-Mead simplex direct search'
funcCount — Number of function evaluations
iterations — Number of iterations
message — Exit message
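A quick way to inspect these diagnostics after a run; the objective function here is illustrative:

```matlab
[x,fval,exitflag,output] = fminsearch(@(x) x(1)^2 + x(2)^2, [1,1]);

exitflag            % 1 if the function converged to a solution
output.iterations   % number of iterations taken
output.funcCount    % number of function evaluations
output.message      % human-readable exit message
```

Checking exitflag before using x guards against silently accepting a result that merely hit the iteration or evaluation limit.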
fun is the function to be minimized. It accepts an input x and returns a scalar f, the objective function evaluated at x. The function fun can be specified as a function handle for a function file:
x = fminsearch(@myfun, x0)
where myfun is a function file such as
function f = myfun(x)
f = ...    % Compute function value at x
or as a function handle for an anonymous function, such as
x = fminsearch(@(x)sin(x^2), x0);
Other arguments are described in the syntax descriptions above.
The Rosenbrock banana function is a classic test example for multidimensional minimization:
$$f(x)=100{\left({x}_{2}-{x}_{1}^{2}\right)}^{2}+{\left(1-{x}_{1}\right)}^{2}.$$
The minimum is at (1,1) and has the value 0. The traditional starting point is (-1.2,1). The anonymous function shown here defines the function and returns a function handle called banana:
banana = @(x)100*(x(2)-x(1)^2)^2+(1-x(1))^2;
Pass the function handle to fminsearch:

[x,fval] = fminsearch(banana,[-1.2, 1])
This produces
x =
    1.0000    1.0000

fval =
   8.1777e-10
This indicates that the minimizer was found to at least four decimal places with a value near zero.
If fun is parameterized, you can use anonymous functions to capture the problem-dependent parameters. For example, suppose you want to minimize the objective function myfun defined by the following function file:
function f = myfun(x,a)
f = x(1)^2 + a*x(2)^2;
Note that myfun has an extra parameter a, so you cannot pass it directly to fminsearch. To optimize for a specific value of a, such as a = 1.5, first assign the value to a:
a = 1.5; % define parameter first
Call fminsearch with a one-argument anonymous function that captures that value of a and calls myfun with two arguments:
x = fminsearch(@(x) myfun(x,a),[0,1])
You can modify the first example by adding a parameter a to the second term of the banana function:
$$f(x)=100{\left({x}_{2}-{x}_{1}^{2}\right)}^{2}+{\left(a-{x}_{1}\right)}^{2}.$$
This changes the location of the minimum to the point [a,a^2].
To minimize this function for a specific value of a, for example a = sqrt(2), create a one-argument anonymous function that captures the value of a:
a = sqrt(2);
banana = @(x)100*(x(2)-x(1)^2)^2+(a-x(1))^2;
Then the statement
[x,fval] = fminsearch(banana, [-1.2, 1], ...
    optimset('TolX',1e-8));
seeks the minimum [sqrt(2), 2] to an accuracy higher than the default on x.
fminsearch can often handle discontinuity, particularly if it does not occur near the solution. fminsearch may only give local solutions.
fminsearch only minimizes over the real numbers; that is, x must consist only of real numbers and f(x) must return only real numbers. When x has complex variables, they must be split into real and imaginary parts.
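One way to handle a complex unknown z is to optimize over its real and imaginary parts as a two-element real vector; the objective function here is illustrative:

```matlab
% Minimize |z - (1+2i)|^2 over complex z by writing z as [real, imag]
f = @(v) abs(complex(v(1),v(2)) - (1+2i))^2;

v = fminsearch(f, [0,0]);     % v(1) approaches 1, v(2) approaches 2
z = complex(v(1), v(2));      % reassemble the complex solution
```

The objective must still return a real scalar, which abs(...)^2 guarantees.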
[1] Lagarias, J. C., J. A. Reeds, M. H. Wright, and P. E. Wright, "Convergence Properties of the Nelder-Mead Simplex Method in Low Dimensions," SIAM Journal on Optimization, Vol. 9, No. 1, pp. 112-147, 1998.