Scaled conjugate gradient backpropagation
net.trainFcn = 'trainscg'
[net,tr] = train(net,...)
trainscg is a network training function that updates weight and bias values according to the scaled conjugate gradient method.
net.trainFcn = 'trainscg' sets the network trainFcn property.
[net,tr] = train(net,...) trains the network with trainscg.
Training occurs according to trainscg training parameters, shown here with their default values:
net.trainParam.epochs | 1000 | Maximum number of epochs to train |
net.trainParam.show | 25 | Epochs between displays (NaN for no displays) |
net.trainParam.showCommandLine | false | Generate command-line output |
net.trainParam.showWindow | true | Show training GUI |
net.trainParam.goal | 0 | Performance goal |
net.trainParam.time | inf | Maximum time to train in seconds |
net.trainParam.min_grad | 1e-6 | Minimum performance gradient |
net.trainParam.max_fail | 6 | Maximum validation failures |
net.trainParam.sigma | 5.0e-5 | Determines the change in weight for the second derivative approximation |
net.trainParam.lambda | 5.0e-7 | Parameter for regulating the indefiniteness of the Hessian |
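For context, sigma and lambda are the two parameters of Møller's scaled conjugate gradient step. A minimal sketch of how they enter the algorithm (this is not the toolbox implementation; grad, w, and p are hypothetical names for the gradient function, the weight vector, and the current search direction):

% One-sided finite-difference approximation of the Hessian-direction product
s = (grad(w + sigma*p) - grad(w)) / sigma;   % sigma scales the probe step
% Scaled curvature along p; lambda regulates the indefiniteness of the Hessian
delta = p'*s + lambda*(p'*p);

When delta is not positive, lambda is increased until the scaled curvature becomes positive, which keeps the step well defined even where the Hessian is indefinite.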
You can create a standard network that uses trainscg with feedforwardnet or cascadeforwardnet. To prepare a custom network to be trained with trainscg:

1. Set net.trainFcn to 'trainscg'. This sets net.trainParam to trainscg's default parameters.
2. Set net.trainParam properties to desired values.

In either case, calling train with the resulting network trains the network with trainscg.
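For example, a custom network can be switched to trainscg and trained with non-default parameters like this (the parameter values are illustrative, not recommendations, and x and t stand for your own input and target data):

net.trainFcn = 'trainscg';        % select scaled conjugate gradient training
net.trainParam.epochs = 500;      % stop after at most 500 epochs
net.trainParam.min_grad = 1e-7;   % tighter gradient tolerance than the 1e-6 default
[net,tr] = train(net,x,t);        % tr records the training progress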
Here is a problem consisting of inputs p and targets t to be solved with a network.
p = [0 1 2 3 4 5]; t = [0 0 0 1 1 1];
A two-layer feed-forward network with two hidden neurons and this training function is created.
net = feedforwardnet(2,'trainscg');
Here the network is trained and then tested on the same inputs.
net = train(net,p,t); a = net(p)
See help feedforwardnet and help cascadeforwardnet for other examples.