
# srchbac

1-D minimization using backtracking

## Syntax

```
[a,gX,perf,retcode,delta,tol] = srchbac(net,X,Pd,Tl,Ai,Q,TS,dX,gX,perf,dperf,delta,TOL,ch_perf)
```

## Description

`srchbac` is a linear search routine. It searches in a given direction to locate the minimum of the performance function in that direction. It uses a technique called backtracking.

`[a,gX,perf,retcode,delta,tol] = srchbac(net,X,Pd,Tl,Ai,Q,TS,dX,gX,perf,dperf,delta,TOL,ch_perf)` takes these inputs,

| Input | Description |
| --- | --- |
| `net` | Neural network |
| `X` | Vector containing current values of weights and biases |
| `Pd` | Delayed input vectors |
| `Tl` | Layer target vectors |
| `Ai` | Initial input delay conditions |
| `Q` | Batch size |
| `TS` | Time steps |
| `dX` | Search direction vector |
| `gX` | Gradient vector |
| `perf` | Performance value at current `X` |
| `dperf` | Slope of performance value at current `X` in direction of `dX` |
| `delta` | Initial step size |
| `TOL` | Tolerance on search |
| `ch_perf` | Change in performance on previous step |

and returns

| Output | Description |
| --- | --- |
| `a` | Step size that minimizes performance |
| `gX` | Gradient at new minimum point |
| `perf` | Performance value at new minimum point |
| `retcode` | Return code with three elements. The first two elements are the number of function evaluations in the two stages of the search. The third element is a return code, whose meaning varies between search algorithms (some codes are not used in this function): `0` = normal, `1` = minimum step taken, `2` = maximum step taken, `3` = beta condition not met |
| `delta` | New initial step size, based on the current step size |
| `tol` | New tolerance on search |

Parameters used for the backtracking algorithm are

| Parameter | Description |
| --- | --- |
| `alpha` | Scale factor that determines sufficient reduction in `perf` |
| `beta` | Scale factor that determines a sufficiently large step size |
| `low_lim` | Lower limit on change in step size |
| `up_lim` | Upper limit on change in step size |
| `maxstep` | Maximum step length |
| `minstep` | Minimum step length |
| `scale_tol` | Parameter that relates the tolerance `tol` to the initial step size `delta`; usually set to 20 |
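The `alpha` and `beta` parameters control the two standard acceptance tests for a trial step size. The following Python sketch is illustrative only (not the toolbox code); the default values shown are assumptions, not the toolbox defaults:

```python
def sufficient_decrease(perf_a, perf_0, dperf_0, a, alpha=1e-4):
    """alpha condition (sufficient reduction): performance at step size a
    must drop at least in proportion to a and the initial slope dperf_0,
    which is negative along a descent direction."""
    return perf_a <= perf_0 + alpha * a * dperf_0

def sufficient_step(dperf_a, dperf_0, beta=0.1):
    """beta condition (sufficiently large step): the slope at the new point
    must have flattened enough relative to the initial slope, ruling out
    steps that are too small."""
    return dperf_a >= beta * dperf_0
```

A step is accepted when both conditions hold; return code `3` above corresponds to the beta condition failing.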

The defaults for these parameters are set in the training function that calls them. See `traincgf`, `traincgb`, `traincgp`, `trainbfg`, and `trainoss`.

Dimensions for these variables are

| Variable | Dimensions |
| --- | --- |
| `Pd` | `No`-by-`Ni`-by-`TS` cell array; each element `Pd{i,j,ts}` is a `Dij`-by-`Q` matrix |
| `Tl` | `Nl`-by-`TS` cell array; each element `Tl{i,ts}` is a `Vi`-by-`Q` matrix |
| `Ai` | `Nl`-by-`LD` cell array; each element `Ai{i,k}` is an `Si`-by-`Q` matrix |

where

- `Ni = net.numInputs`
- `Nl = net.numLayers`
- `LD = net.numLayerDelays`
- `Ri = net.inputs{i}.size`
- `Si = net.layers{i}.size`
- `Vi = net.targets{i}.size`
- `Dij = Ri * length(net.inputWeights{i,j}.delays)`

## Examples

Here is a problem consisting of inputs `p` and targets `t` to be solved with a network.

```
p = [0 1 2 3 4 5];
t = [0 0 0 1 1 1];
```

A two-layer feed-forward network is created. The network's input range is `[0 5]`. The first layer has two `tansig` neurons, and the second layer has one `logsig` neuron. The `traincgf` network training function and the `srchbac` search function are to be used.

### Create and Test a Network

```
net = newff([0 5],[2 1],{'tansig','logsig'},'traincgf');
a = sim(net,p)
```

### Train and Retest the Network

```
net.trainParam.searchFcn = 'srchbac';
net.trainParam.epochs = 50;
net.trainParam.show = 10;
net.trainParam.goal = 0.1;
net = train(net,p,t);
a = sim(net,p)
```

## Network Use

You can create a standard network that uses `srchbac` with `newff`, `newcf`, or `newelm`.

To prepare a custom network to be trained with `traincgf`, using the line search function `srchbac`,

1. Set `net.trainFcn` to `'traincgf'`. This sets `net.trainParam` to `traincgf`’s default parameters.

2. Set `net.trainParam.searchFcn` to `'srchbac'`.

The `srchbac` function can be used with any of the following training functions: `traincgf`, `traincgb`, `traincgp`, `trainbfg`, `trainoss`.

## More About


### Backtracking Search

The backtracking search routine `srchbac` is best suited for use with the quasi-Newton optimization algorithms. It begins with a step multiplier of 1 and then backtracks until an acceptable reduction in the performance is obtained. On the first step it uses the value of performance at the current point and a step multiplier of 1. It also uses the value of the derivative of performance at the current point to obtain a quadratic approximation to the performance function along the search direction. The minimum of the quadratic approximation becomes a tentative optimum point (under certain conditions), and the performance at this point is tested. If the performance is not sufficiently reduced, a cubic interpolation is obtained, and the minimum of the cubic interpolation becomes the new tentative optimum point. This process continues until a sufficient reduction in the performance is obtained.
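The procedure above can be sketched as follows. This is an illustrative Python reimplementation of the general backtracking technique, not the toolbox code; the parameter names `alpha`, `low_lim`, and `up_lim` mirror the backtracking parameters listed earlier, and the default values are assumptions:

```python
def backtrack(f, x, fx, g, d, alpha=1e-4, low_lim=0.1, up_lim=0.5, min_lam=1e-8):
    """Backtracking line search along direction d from point x.
    Starts with a full step lam = 1 and shrinks it until the
    sufficient-decrease condition
        f(x + lam*d) <= f(x) + alpha*lam*slope
    holds. The first reduction minimizes a quadratic model of f along d;
    subsequent reductions minimize a cubic model through the two most
    recent trial points."""
    trial = lambda lam: f([xi + lam * di for xi, di in zip(x, d)])
    slope = sum(gi * di for gi, di in zip(g, d))  # directional derivative (< 0)
    lam, f_new = 1.0, trial(1.0)
    lam_prev = f_prev = None
    while f_new > fx + alpha * lam * slope:
        if lam_prev is None:
            # minimizer of the quadratic model m(l) = fx + slope*l + c*l^2
            lam_tmp = -slope / (2.0 * (f_new - fx - slope))
        else:
            # minimizer of the cubic model through the last two trial points
            r1 = f_new - fx - lam * slope
            r2 = f_prev - fx - lam_prev * slope
            a3 = (r1 / lam**2 - r2 / lam_prev**2) / (lam - lam_prev)
            b3 = (-lam_prev * r1 / lam**2 + lam * r2 / lam_prev**2) / (lam - lam_prev)
            if a3 == 0.0:
                lam_tmp = -slope / (2.0 * b3)
            else:
                disc = max(b3 * b3 - 3.0 * a3 * slope, 0.0)
                lam_tmp = (-b3 + disc ** 0.5) / (3.0 * a3)
        lam_prev, f_prev = lam, f_new
        # keep the shrink factor within [low_lim, up_lim] of the current step
        lam = min(max(lam_tmp, low_lim * lam), up_lim * lam)
        if lam < min_lam:
            break  # minimum step reached without sufficient decrease
        f_new = trial(lam)
    return lam, f_new

# Example: minimize f(v) = v1^2 + v2^2 from x = [2, 0] along d = -gradient
f = lambda v: v[0] ** 2 + v[1] ** 2
x, g = [2.0, 0.0], [4.0, 0.0]
d = [-4.0, 0.0]
lam, f_new = backtrack(f, x, f(x), g, d)  # lam = 0.5 lands exactly at the minimum
```

On the example, the full step overshoots the minimum, the quadratic model proposes the multiplier 0.5, and that step satisfies the sufficient-decrease condition, so the search stops after a single backtrack.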

The backtracking algorithm is described in Dennis and Schnabel. It is used as the default line search for the quasi-Newton algorithms, although it might not be the best technique for all problems.

## Algorithms

`srchbac` locates the minimum of the performance function in the search direction `dX`, using the backtracking algorithm described on pages 126 and 328 of Dennis and Schnabel's book, noted below.

## References

Dennis, J.E., and R.B. Schnabel, *Numerical Methods for Unconstrained Optimization and Nonlinear Equations*, Englewood Cliffs, NJ: Prentice-Hall, 1983.

## See Also

`traincgf` | `traincgb` | `traincgp` | `trainbfg` | `trainoss`

#### Introduced before R2006a
