Outstar weight learning function
[dW,LS] = learnos(W,P,Z,N,A,T,E,gW,gA,D,LP,LS)
info = learnos('code')
learnos is the outstar weight learning function.
[dW,LS] = learnos(W,P,Z,N,A,T,E,gW,gA,D,LP,LS) takes several inputs,
W | S-by-R weight matrix (or S-by-1 bias vector) |
P | R-by-Q input vectors (or ones(1,Q)) |
Z | S-by-Q weighted input vectors |
N | S-by-Q net input vectors |
A | S-by-Q output vectors |
T | S-by-Q layer target vectors |
E | S-by-Q layer error vectors |
gW | S-by-R gradient with respect to performance |
gA | S-by-Q output gradient with respect to performance |
D | S-by-S neuron distances |
LP | Learning parameters, none, LP = [] |
LS | Learning state, initially should be = [] |
and returns
dW | S-by-R weight (or bias) change matrix |
LS | New learning state |
Learning occurs according to learnos’s learning parameter, shown here
with its default value.
LP.lr - 0.01 | Learning rate |
info = learnos('code') returns useful
information for each code character vector:
'pnames' | Names of learning parameters |
'pdefaults' | Default learning parameters |
'needg' | Returns 1 if this function uses gW or gA |
Here you define a random input P, output A, and
weight matrix W for a layer with a two-element input and three neurons. Also
define the learning rate LR.
p = rand(2,1); a = rand(3,1); w = rand(3,2); lp.lr = 0.5;
Because learnos only needs these values to calculate a weight change
(see “Algorithm” below), use them to do so.
dW = learnos(w,p,[],[],a,[],[],[],[],[],lp,[])
To prepare the weights and the bias of layer i of a custom network to
learn with learnos,
1. Set net.trainFcn to 'trainr'.
(net.trainParam automatically becomes trainr’s default
parameters.)
2. Set net.adaptFcn to 'trains'.
(net.adaptParam automatically becomes trains’s default
parameters.)
3. Set each net.inputWeights{i,j}.learnFcn to
'learnos'.
4. Set each net.layerWeights{i,j}.learnFcn to
'learnos'. (Each weight learning parameter property is automatically set
to learnos’s default parameters.)
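The preparation steps above, collected into one script (net is assumed to be an existing custom network, and the indices i and j stand for each weight going to layer i):

```
net.trainFcn = 'trainr';                      % net.trainParam gets trainr's defaults
net.adaptFcn = 'trains';                      % net.adaptParam gets trains's defaults
net.inputWeights{i,j}.learnFcn = 'learnos';   % repeat for each input weight to layer i
net.layerWeights{i,j}.learnFcn = 'learnos';   % repeat for each layer weight to layer i
```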
To train the network (or enable it to adapt),
1. Set net.trainParam (or
net.adaptParam) properties to desired values.
2. Call train (adapt).
learnos calculates the weight change dW for a given
neuron from the neuron’s input P, output A, and learning
rate LR according to the outstar learning rule:
dw = lr*(a-w)*p'
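The rule above can be cross-checked with a short numpy sketch (an illustration only, not part of the toolbox). Note that the multiplications in dw = lr*(a-w)*p' act elementwise: a is expanded across the columns of w, and p' across its rows, so dw(i,j) = lr*(a(i)-w(i,j))*p(j).

```python
import numpy as np

# Illustrative shapes matching the example above:
# a two-element input and three neurons.
rng = np.random.default_rng(0)
p = rng.random((2, 1))   # input vector, R-by-1
a = rng.random((3, 1))   # output vector, S-by-1
w = rng.random((3, 2))   # weight matrix, S-by-R
lr = 0.5                 # learning rate LP.lr

# Outstar rule: each column of w is pulled toward a,
# scaled by the learning rate and that column's input element.
dw = lr * (a - w) * p.T

print(dw.shape)          # same shape as w: (3, 2)
```

The broadcasting does the index bookkeeping: (a - w) is S-by-R, and multiplying by the 1-by-R row p.T scales column j by p(j).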
Grossberg, S., Studies of the Mind and Brain, Dordrecht, Holland, Reidel Press, 1982