One-dimensional minimization using backtracking
[a,gX,perf,retcode,delta,tol] = srchbac(net,X,Pd,Tl,Ai,Q,TS,dX,gX,perf,dperf,delta,tol,ch_perf)
srchbac is a line search routine. It searches in a given direction to locate the minimum of the performance function in that direction. It uses a technique called backtracking.

[a,gX,perf,retcode,delta,tol] = srchbac(net,X,Pd,Tl,Ai,Q,TS,dX,gX,perf,dperf,delta,tol,ch_perf) takes these inputs,
net 
Neural network 
X 
Vector containing current values of weights and biases 
Pd 
Delayed input vectors 
Tl 
Layer target vectors 
Ai 
Initial input delay conditions 
Q 
Batch size 
TS 
Time steps 
dX 
Search direction vector 
gX 
Gradient vector 
perf 
Performance value at current X 
dperf 
Slope of performance value at current X in direction of dX 
delta 
Initial step size 
tol 
Tolerance on search 
ch_perf 
Change in performance on previous step 
and returns
a 
Step size that minimizes performance 
gX 
Gradient at new minimum point 
perf 
Performance value at new minimum point 
retcode 
Return code that has three elements. The first two elements correspond to the number of function evaluations in the two stages of the search. The third element is a return code. These have different meanings for different search algorithms. Some might not be used in this function. 
0 Normal  
1 Minimum step taken  
2 Maximum step taken  
3 Beta condition not met  
delta 
New initial step size, based on the current step size 
tol 
New tolerance on search 
Parameters used for the backtracking algorithm are
alpha 
Scale factor that determines sufficient reduction in perf 
beta 
Scale factor that determines sufficiently large step size 
low_lim 
Lower limit on change in step size 
up_lim 
Upper limit on change in step size 
maxstep 
Maximum step length 
minstep 
Minimum step length 
scale_tol 
Parameter that relates the tolerance tol to the initial step size delta 
The defaults for these parameters are set in the training function that calls them. See traincgf, traincgb, traincgp, trainbfg, and trainoss.
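The roles of alpha and beta can be illustrated with the standard sufficient-decrease and curvature tests from line search theory. The Python sketch below is only an illustration of those two tests; the function names and default values are hypothetical, not part of the toolbox.

```python
def sufficient_decrease(perf_new, perf, dperf, a, alpha=1e-4):
    """Sufficient-decrease (Armijo) test: a step of size a must reduce
    performance by at least alpha * a * |slope|.

    perf  : performance at the current point
    dperf : slope along the search direction (negative for descent)
    """
    return perf_new <= perf + alpha * a * dperf


def beta_condition(dperf_new, dperf, beta=0.9):
    """Curvature test: the slope at the new point must have flattened
    enough relative to the starting slope, rejecting overly small steps."""
    return dperf_new >= beta * dperf
```

For example, starting from perf = 1 with slope dperf = -2, a unit step that lands at perf_new = 0 passes the sufficient-decrease test, while a step that leaves performance unchanged fails it.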
Dimensions for these variables are

Pd 
No x Ni x TS cell array 
Each element Pd{i,j,ts} is a Dij x Q matrix 
Tl 
Nl x TS cell array 
Each element Tl{i,ts} is a Vi x Q matrix 
Ai 
Nl x LD cell array 
Each element Ai{i,k} is an Si x Q matrix 
where
Ni = net.numInputs 
Nl = net.numLayers 
LD = net.numLayerDelays 
Ri = net.inputs{i}.size 
Si = net.layers{i}.size 
Vi = net.targets{i}.size 
Dij = Ri * length(net.inputWeights{i,j}.delays) 
Here is a problem consisting of inputs p and targets t to be solved with a network.

p = [0 1 2 3 4 5];
t = [0 0 0 1 1 1];
A two-layer feedforward network is created. The network's input ranges from [0 to 5]. The first layer has two tansig neurons, and the second layer has one logsig neuron. The traincgf network training function and the srchbac search function are to be used.

net = newff([0 5],[2 1],{'tansig','logsig'},'traincgf');
a = sim(net,p)
net.trainParam.searchFcn = 'srchbac';
net.trainParam.epochs = 50;
net.trainParam.show = 10;
net.trainParam.goal = 0.1;
net = train(net,p,t);
a = sim(net,p)
You can create a standard network that uses srchbac with newff, newcf, or newelm.

To prepare a custom network to be trained with traincgf, using the line search function srchbac,
1. Set net.trainFcn to 'traincgf'. This sets net.trainParam to traincgf's default parameters.
2. Set net.trainParam.searchFcn to 'srchbac'.
The srchbac function can be used with any of the following training functions: traincgf, traincgb, traincgp, trainbfg, trainoss.
srchbac locates the minimum of the performance function in the search direction dX, using the backtracking algorithm described on pages 126 and 328 of Dennis and Schnabel's book, noted below.
Dennis, J.E., and R.B. Schnabel, Numerical Methods for Unconstrained Optimization and Nonlinear Equations, Englewood Cliffs, NJ: Prentice-Hall, 1983.
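The core backtracking loop can be sketched as follows. This is a simplified Python illustration under the same parameter names as above, not the toolbox code: the real routine also uses cubic interpolation and the full set of step-size safeguards (maxstep, beta condition, and so on), which are omitted here.

```python
def backtrack(f, x, dx, perf, dperf, delta=1.0,
              alpha=1e-4, low_lim=0.1, up_lim=0.5,
              minstep=1e-6, maxiter=25):
    """Find a step size a along dx giving sufficient decrease in f.

    perf  : f(x), performance at the current point
    dperf : slope of f at x along dx (must be negative, a descent direction)
    """
    a = delta
    for _ in range(maxiter):
        trial = [xi + a * di for xi, di in zip(x, dx)]
        new_perf = f(trial)
        # Sufficient-decrease (Armijo) test, scaled by alpha:
        if new_perf <= perf + alpha * a * dperf:
            return a, new_perf
        # Quadratic interpolation for the next trial step, clipped so the
        # step shrinks by a factor between low_lim and up_lim:
        denom = 2.0 * (new_perf - perf - a * dperf)
        a_quad = -dperf * a * a / denom if denom > 0 else up_lim * a
        a = min(max(a_quad, low_lim * a), up_lim * a)
        if a < minstep:
            break
    trial = [xi + a * di for xi, di in zip(x, dx)]
    return a, f(trial)
```

Minimizing f(v) = v^2 from x = 1 along dx = -1 (so perf = 1 and dperf = -2) accepts the full step a = 1 immediately; an overshooting direction such as dx = -3 triggers one interpolation step before a shorter step is accepted.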