1-D minimization using backtracking


[a,gX,perf,retcode,delta,tol] = srchbac(net,X,Pd,Tl,Ai,Q,TS,dX,gX,perf,dperf,delta,TOL,ch_perf)


srchbac is a linear search routine. It searches in a given direction to locate the minimum of the performance function in that direction. It uses a technique called backtracking.

[a,gX,perf,retcode,delta,tol] = srchbac(net,X,Pd,Tl,Ai,Q,TS,dX,gX,perf,dperf,delta,TOL,ch_perf) takes these inputs,

net - Neural network
X - Vector containing current values of weights and biases
Pd - Delayed input vectors
Tl - Layer target vectors
Ai - Initial input delay conditions
Q - Batch size
TS - Time steps
dX - Search direction vector
gX - Gradient vector
perf - Performance value at current X
dperf - Slope of performance value at current X in direction of dX
delta - Initial step size
TOL - Tolerance on search
ch_perf - Change in performance on previous step
and returns

a - Step size that minimizes performance
gX - Gradient at new minimum point
perf - Performance value at new minimum point
retcode - Return code that has three elements. The first two elements correspond to the number of function evaluations in the two stages of the search. The third element is a return code. These have different meanings for different search algorithms. Some might not be used in this function.

 0 - Normal
 1 - Minimum step taken
 2 - Maximum step taken
 3 - Beta condition not met

delta - New initial step size, based on the current step size
tol - New tolerance on search

Parameters used for the backtracking algorithm are

alpha - Scale factor that determines sufficient reduction in perf
beta - Scale factor that determines a sufficiently large step size
low_lim - Lower limit on change in step size
up_lim - Upper limit on change in step size
maxstep - Maximum step length
minstep - Minimum step length
scale_tol - Parameter that relates the tolerance tol to the initial step size delta, usually set to 20

The defaults for these parameters are set in the training function that calls them. See traincgf, traincgb, traincgp, trainbfg, and trainoss.
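For example, once a network's training function is set to traincgf, these search parameters can be inspected or adjusted through net.trainParam. A minimal sketch (the specific values shown here are illustrative, not the documented defaults):

```matlab
% Assumes a network whose training function is traincgf.
net = newff([0 5],[2 1],{'tansig','logsig'},'traincgf');

% Line search parameters used by srchbac live in net.trainParam.
net.trainParam.alpha     = 0.001; % sufficient-reduction scale factor
net.trainParam.beta      = 0.1;   % sufficiently-large-step scale factor
net.trainParam.low_lim   = 0.1;   % lower limit on change in step size
net.trainParam.up_lim    = 0.5;   % upper limit on change in step size
net.trainParam.scale_tol = 20;    % relates tol to the initial step size delta
```
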

Dimensions for these variables are

Pd - No-by-Ni-by-TS cell array, where each element P{i,j,ts} is a Dij-by-Q matrix
Tl - Nl-by-TS cell array, where each element P{i,ts} is a Vi-by-Q matrix
Ai - Nl-by-LD cell array, where each element Ai{i,k} is an Si-by-Q matrix

where

Ni  = net.numInputs
Nl  = net.numLayers
LD  = net.numLayerDelays
Ri  = net.inputs{i}.size
Si  = net.layers{i}.size
Vi  = net.targets{i}.size
Dij = Ri * length(net.inputWeights{i,j}.delays)


Here is a problem consisting of inputs p and targets t to be solved with a network.

p = [0 1 2 3 4 5];
t = [0 0 0 1 1 1];

A two-layer feed-forward network is created. The network's input ranges from [0 to 5]. The first layer has two tansig neurons, and the second layer has one logsig neuron. The traincgf network training function and the srchbac search function are to be used.

Create and Test a Network

net = newff([0 5],[2 1],{'tansig','logsig'},'traincgf');
a = sim(net,p)

Train and Retest the Network

net.trainParam.searchFcn = 'srchbac';
net.trainParam.epochs = 50;
net.trainParam.show = 10;
net.trainParam.goal = 0.1;
net = train(net,p,t);
a = sim(net,p)

Network Use

You can create a standard network that uses srchbac with newff, newcf, or newelm.

To prepare a custom network to be trained with traincgf using the line search function srchbac,

  1. Set net.trainFcn to 'traincgf'. This sets net.trainParam to traincgf’s default parameters.

  2. Set net.trainParam.searchFcn to 'srchbac'.
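These two steps can be written as follows (a minimal sketch; assumes an existing network object net):

```matlab
% Configure an existing network object net to train with traincgf
% using the srchbac line search.
net.trainFcn = 'traincgf';            % also resets net.trainParam to traincgf defaults
net.trainParam.searchFcn = 'srchbac'; % select backtracking line search
```

Note that the order matters: assigning net.trainFcn resets net.trainParam, so set searchFcn (and any other training parameters) afterward.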

The srchbac function can be used with any of the following training functions: traincgf, traincgb, traincgp, trainbfg, trainoss.

More About


Backtracking Search

The backtracking search routine srchbac is best suited to use with the quasi-Newton optimization algorithms. It begins with a step multiplier of 1 and then backtracks until an acceptable reduction in the performance is obtained.

On the first step it uses the value of performance at the current point and a step multiplier of 1. It also uses the value of the derivative of performance at the current point to obtain a quadratic approximation to the performance function along the search direction. The minimum of the quadratic approximation becomes a tentative optimum point (under certain conditions), and the performance at this point is tested. If the performance is not sufficiently reduced, a cubic interpolation is obtained and the minimum of the cubic interpolation becomes the new tentative optimum point. This process is continued until a sufficient reduction in the performance is obtained.
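The core of this procedure can be sketched as follows. This is an illustrative simplification, not the actual srchbac implementation: f is any scalar performance function of the step multiplier, the sufficient-decrease factor alpha is assumed, and the quadratic step is reused on every iteration where the real routine would switch to cubic interpolation through the two most recent trial points.

```matlab
% Illustrative backtracking sketch (not the actual srchbac code).
% f      - handle returning performance at step multiplier a
% f0, d0 - performance and directional derivative at a = 0 (d0 < 0)
alpha = 1e-4;                  % assumed sufficient-reduction factor
a = 1;                         % start with a full step
fa = f(a);
while fa > f0 + alpha*a*d0     % sufficient-decrease (Armijo) test
    % Minimizer of the quadratic through f0, d0, and f(a):
    a_new = -d0*a^2 / (2*(fa - f0 - d0*a));
    % Safeguard: keep the new step within a fixed fraction of a,
    % playing the role of the low_lim/up_lim parameters above.
    a = min(max(a_new, 0.1*a), 0.5*a);
    fa = f(a);
end
```

The safeguard line is what makes this "backtracking": each trial step is forced to shrink, so the loop terminates once the Armijo condition is met or the minimum step length is reached.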

The backtracking algorithm is described in Dennis and Schnabel. It is used as the default line search for the quasi-Newton algorithms, although it might not be the best technique for all problems.


srchbac locates the minimum of the performance function in the search direction dX, using the backtracking algorithm described on pages 126 and 328 of Dennis and Schnabel's book, referenced below.


Dennis, J.E., and R.B. Schnabel, Numerical Methods for Unconstrained Optimization and Nonlinear Equations, Englewood Cliffs, NJ: Prentice-Hall, 1983.


Introduced before R2006a
