In the Neural Network Toolbox, how can I set different trainParam values for each layer of the network?

In the Neural Network Toolbox, how can I set different trainParam values for each layer of the network? For example, using the "simpleclassInputs" dataset in the "Neural Net Pattern Recognition" app, I modified the net to add a second hidden layer (hiddenLayerSize = [7 7]). I am using the training function 'trainlm'. I would like to set "mu", "mu_dec", and "mu_inc" to different values for each hidden layer.
How can I do this?
  2 Comments
Greg Heath
Greg Heath on 12 Jan 2013
For pattern recognition or classification with c classes, it is recommended to use
1. Columns of the c-dimensional unit matrix for targets
2. TRAINSCG for the training function.
I think I remember a verification demo in the documentation.
So my question is
Why are you doing this on a classification problem?
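The recommendation above can be sketched as follows (a minimal example with hypothetical class labels; `ind2vec` and `patternnet` are standard toolbox functions, and the inputs here are made-up illustration data):

```matlab
% Hypothetical class labels for c = 3 classes
labels = [1 2 3 2 1 3];
x = rand(4, numel(labels));        % example inputs: 4 features, 6 samples

% 1. Targets as columns of the c-dimensional unit matrix (one column per sample)
t = full(ind2vec(labels));         % 3-by-6 matrix of 0/1 columns

% 2. TRAINSCG as the training function (also patternnet's default)
net = patternnet(10, 'trainscg');  % one hidden layer of 10 neurons
net = train(net, x, t);
```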
Larry
Larry on 12 Jan 2013
The use of TRAINLM came from what MathWorks used in their tutorial-style demo in the Apps: Neural Net Pattern Recognition. But thanks, I will try TRAINSCG.


Accepted Answer

Greg Heath
Greg Heath on 12 Jan 2013
From examples of regression using fitnet([5 5]) and classification using patternnet([5 5]) I've deduced the following:
1. The learning rate, lr, and momentum constant, mc, can be specified for each set of layer weights between layer i and layer j.
2. For fitnet with default TRAINLM, the training properties listed in net.trainParam are global: they apply to the whole network and cannot be set per set of layer weights. In particular, mu, mu_dec, mu_inc and mu_max.
3. For patternnet with default TRAINSCG, the training properties listed in net.trainParam are likewise global. In particular, sigma and lambda.
For example
clear all, clc
[x, t] = simpleclass_dataset;
[I, N] = size(x)
[O, N] = size(t)
net = patternnet([5 5]);
rng(0)
net = configure(net, x, t)
% ---SNIP
layerWeights = net.layerWeights
% layerWeights =
%
% [] [] []
% [1x1 nnetWeight] [] []
% [] [1x1 nnetWeight] []
layerWeights21 = layerWeights{2,1}
% layerWeights21 =
%
% Neural Network Weight
%
% delays: 0
% initFcn: (none)
% initSettings: .range
% learn: true
% learnFcn: 'learngdm'
% learnParam: .lr, .mc
% size: [5 5]
% weightFcn: 'dotprod'
% weightParam: (none)
% userdata: (your custom info)
%
layerWeights21learnParam = layerWeights21.learnParam
% layerWeights21learnParam =
%
% Function Parameters for 'learngdm'
%
% Learning Rate lr: 0.01
% Momentum Constant mc: 0.9
% So, the specifications have the form
%
% net.layerWeights{2,1}.learnParam.lr = lr21;
% net.layerWeights{3,2}.learnParam.lr = lr32;
%
% net.layerWeights{2,1}.learnParam.mc = mc21;
% net.layerWeights{3,2}.learnParam.mc = mc32;
trainParam = net.trainParam
% trainParam =
%
%
% Function Parameters for 'trainscg'
%
% Show Training Window Feedback showWindow: true
% Show Command Line Feedback showCommandLine: false
% Command Line Frequency show: 25
% Maximum Epochs epochs: 1000
% Maximum Training Time time: Inf
% Performance Goal goal: 0
% Minimum Gradient min_grad: 1e-006
% Maximum Validation Checks max_fail: 6
% Sigma sigma: 5e-005
% Lambda lambda: 5e-007
%
% Resulting in specifications of the form
net.trainParam.sigma = sigma0;
net.trainParam.lambda = lambda0;
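Putting the two mechanisms together, a minimal end-to-end sketch (the parameter values here are illustrative, not recommendations):

```matlab
[x, t] = simpleclass_dataset;
net = patternnet([5 5]);            % two hidden layers -> layerWeights{2,1} and {3,2}
net = configure(net, x, t);

% Per-weight learning parameters (learngdm): these CAN differ between layers
net.layerWeights{2,1}.learnParam.lr = 0.02;
net.layerWeights{2,1}.learnParam.mc = 0.8;
net.layerWeights{3,2}.learnParam.lr = 0.005;
net.layerWeights{3,2}.learnParam.mc = 0.95;

% Global training parameters (trainscg): one value for the whole net
net.trainParam.sigma  = 1e-4;
net.trainParam.lambda = 1e-6;

[net, tr] = train(net, x, t);
```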
Hope this helps.
Thank you for formally accepting my answer.
Greg.
  2 Comments
Larry
Larry on 12 Jan 2013
Thank you Greg. Very helpful.
Yesterday, I also found that it is possible to "turn off" all the learning in a layer by:
net.layerWeights{2,1}.learn = 0;
However, this approach appears to have a bug (which I am investigating with MathWorks). Currently, at least from what I could tell, learning can be turned off only after the net is trained, not before. So I am using the following as a temporary workaround:
% Train the network
net.trainParam.epochs = 1;
[net, tr] = train(net, inputs, targets);
% BUG WORKAROUND: re-initialize
net = initlay(net);
% Turn off the learning for layer 2
net.layerWeights{2,1}.learn = 0;
% Re-train the network
net.trainParam.epochs = 1000;
[net, tr] = train(net, inputs, targets);
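One way to confirm that the workaround actually froze layer 2's incoming weights is to snapshot them before the second training run and compare afterwards (a sketch, assuming the workaround above has just been run up to the point where learning is turned off; `inputs` and `targets` are the same variables used there):

```matlab
% Snapshot the weights that should stay fixed
frozenLW = net.LW{2,1};

% Re-train with learning disabled for layerWeights{2,1}
net.trainParam.epochs = 1000;
[net, tr] = train(net, inputs, targets);

% If the learn flag is honored, these weights should be unchanged
isequal(frozenLW, net.LW{2,1})
```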
Greg Heath
Greg Heath on 13 Jan 2013
Interesting.
Did you get an error message when trying to turn off learning after configure but before train?
Did you ask whether there are any other properties that can be changed after train but not after configure?
Also, what if you train for one epoch and then configure? Are there then any properties that cannot be changed?


More Answers (0)
