Gradient descent weight and bias learning function
[dW,LS] = learngd(W,P,Z,N,A,T,E,gW,gA,D,LP,LS)
info = learngd('code')
learngd is the gradient descent weight and bias learning function.
[dW,LS] = learngd(W,P,Z,N,A,T,E,gW,gA,D,LP,LS) takes several inputs:
W  S x R weight matrix (or S x 1 bias vector)
P  R x Q input vectors (or ones(1,Q))
Z  S x Q weighted input vectors
N  S x Q net input vectors
A  S x Q output vectors
T  S x Q layer target vectors
E  S x Q layer error vectors
gW  S x R gradient with respect to performance
gA  S x Q output gradient with respect to performance
D  S x S neuron distances
LP  Learning parameters, none, LP = []
LS  Learning state, initially should be = []
and returns
dW  S x R weight (or bias) change matrix
LS  New learning state
Learning occurs according to learngd's learning parameter, shown here with its default value.
LP.lr  0.01  Learning rate 
info = learngd('code') returns useful information for each supported code string:
'pnames'  Names of learning parameters 
'pdefaults'  Default learning parameters 
'needg'  Returns 1 if this function uses gW or gA 
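These info strings act as a small query interface on the function itself. A minimal Python sketch of the same idea (learngd_info is a hypothetical re-implementation for illustration, not a toolbox API):

```python
def learngd_info(code):
    """Mimic learngd's info-string queries (illustrative, not toolbox code)."""
    info = {
        'pnames':    ['lr'],        # names of learning parameters
        'pdefaults': {'lr': 0.01},  # default learning parameters
        'needg':     1,             # 1 because learngd uses the gradient gW
    }
    return info[code]
```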
Here you define a random gradient gW for a weight going to a layer with three neurons from an input with two elements. Also define a learning rate of 0.5.
gW = rand(3,2); lp.lr = 0.5;
Because learngd only needs these values to calculate a weight change (see "Algorithm" below), use them to do so.
dW = learngd([],[],[],[],[],[],[],gW,[],[],lp,[])
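Apart from gW and lp, learngd ignores all of its arguments here: the change it returns is simply the learning rate times the gradient. The same computation as a NumPy sketch (the function name and random setup are illustrative, not the toolbox code):

```python
import numpy as np

def learngd_update(gW, lr):
    """Gradient descent weight change: dW = lr * gW, as learngd computes."""
    return lr * gW

rng = np.random.default_rng(0)
gW = rng.random((3, 2))       # random gradient: 3 neurons, 2 input elements
dW = learngd_update(gW, 0.5)  # learning rate 0.5, as in the example above
```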
You can create a standard network that uses learngd with newff, newcf, or newelm. To prepare the weights and the bias of layer i of a custom network to adapt with learngd,
Set net.adaptFcn to 'trains'. net.adaptParam automatically becomes trains's default parameters.
Set each net.inputWeights{i,j}.learnFcn to 'learngd'. Set each net.layerWeights{i,j}.learnFcn to 'learngd'. Set net.biases{i}.learnFcn to 'learngd'. Each weight and bias learning parameter property is automatically set to learngd's default parameters.
To allow the network to adapt,
Set net.adaptParam properties to desired values.
Call adapt with the network.
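Put together, adapting with learngd amounts to: compute the performance gradient for the current sample, scale it by the learning rate, and add the result to the weights. A NumPy sketch of one such step for a single linear layer under mean squared error (all names here are illustrative, not toolbox APIs):

```python
import numpy as np

def adapt_step(W, b, p, t, lr=0.01):
    """One incremental gradient-descent step for a linear layer.

    W: (S, R) weights, b: (S,) biases, p: (R,) input, t: (S,) target.
    For mean squared error the performance gradient with respect to W
    is e * p' and with respect to b is e, so the rule dW = lr * gW
    reduces to the classic LMS update.
    """
    a = W @ p + b          # layer output
    e = t - a              # layer error
    gW = np.outer(e, p)    # gradient of performance w.r.t. W
    gb = e                 # gradient of performance w.r.t. b
    return W + lr * gW, b + lr * gb

# Repeated adaptation drives the layer toward the target mapping.
W, b = np.zeros((1, 2)), np.zeros(1)
for _ in range(200):
    W, b = adapt_step(W, b, np.array([1.0, 2.0]), np.array([3.0]), lr=0.05)
```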
See help newff or help newcf for examples.