bttderiv
(To be removed) Backpropagation through time derivative function
bttderiv will be removed in a future release. For more information,
see Transition Legacy Neural Network Code to dlnetwork Workflows.
For advice on updating your code, see Version History.
Syntax
bttderiv('dperf_dwb',net,X,T,Xi,Ai,EW)
bttderiv('de_dwb',net,X,T,Xi,Ai,EW)
Description
This function calculates derivatives using the chain rule from a network’s performance back through the network, and in the case of dynamic networks, back through time.
bttderiv('dperf_dwb',net,X,T,Xi,Ai,EW) takes these arguments,
net | Neural network
X | Inputs, an RxQ matrix (or NxTS cell array of RixQ matrices)
T | Targets, an SxQ matrix (or MxTS cell array of SixQ matrices)
Xi | Initial input delay states (optional)
Ai | Initial layer delay states (optional)
EW | Error weights (optional)
and returns the gradient of performance with respect to the network’s weights and biases. Here R and S are the numbers of input and output elements and Q is the number of samples. For dynamic (time-series) data, N and M are the numbers of input and output signals, Ri and Si are the numbers of elements of each input and output signal, and TS is the number of time steps.
bttderiv('de_dwb',net,X,T,Xi,Ai,EW) returns the Jacobian of
errors with respect to the network’s weights and biases.
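The two modes are related through the definition of the performance function. For mean squared error, perf = mean(e.^2) with e = t - y, so the chain rule gives dperf/dwb = (2/N) * sum(e .* de/dwb), i.e. the gradient can be assembled from the error Jacobian. The sketch below illustrates this identity; the orientation of the returned Jacobian and the toolbox’s sign and normalization conventions are assumptions, so treat it as a consistency check rather than an exact reproduction of the returned gradient:

```matlab
[x,t] = simplefit_dataset;
net = feedforwardnet(20);
net = train(net,x,t);

e = t - net(x);                       % errors (1xQ for this dataset)
jwb = bttderiv('de_dwb',net,x,t);     % Jacobian of errors w.r.t. weights/biases
gwb = bttderiv('dperf_dwb',net,x,t);  % gradient of performance

% For perf = mse(e), dperf/dwb = (2/N) * Jwb' * e.
% Shapes and sign convention assumed; compare g_check against gwb.
g_check = (2/numel(e)) * (jwb' * e');
```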
Examples
Here a feedforward network is trained and both the gradient and Jacobian are calculated.
[x,t] = simplefit_dataset;
net = feedforwardnet(20);
net = train(net,x,t);
y = net(x);
perf = perform(net,t,y);
% Gradient of performance with respect to the weights and biases
gwb = bttderiv('dperf_dwb',net,x,t)
% Jacobian of the errors with respect to the weights and biases
jwb = bttderiv('de_dwb',net,x,t)
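The example above uses a static network, where no backpropagation through time is needed. For a dynamic network, the delay states Xi and Ai come into play. The following sketch shows the same calculation for a NARX time-series network, using the standard toolbox functions narxnet, preparets, and the simplenarx_dataset example data (the network sizes and delay ranges here are illustrative choices):

```matlab
% Example time-series data shipped with Deep Learning Toolbox
[X,T] = simplenarx_dataset;

% NARX network with input and feedback delays 1:2 and 10 hidden neurons
net = narxnet(1:2,1:2,10);

% Shift inputs/targets and extract the initial delay states
[Xs,Xi,Ai,Ts] = preparets(net,X,{},T);
net = train(net,Xs,Ts,Xi,Ai);

% Gradient of performance, backpropagated through the network and through time
gwb = bttderiv('dperf_dwb',net,Xs,Ts,Xi,Ai)
```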
Version History
Introduced in R2010b

See Also
Time Series Modeler | fitrnet (Statistics and Machine Learning Toolbox) | fitcnet (Statistics and Machine Learning Toolbox) | trainnet | trainingOptions | dlnetwork