MATLAB Answers


In the Neural Network Toolbox, how can I set different trainParam values for each layer of the network?

Asked by Larry
on 10 Jan 2013

In the Neural Network Toolbox, how can I set different trainParam values for each layer of the network? For example, using the "simpleclassInputs" example dataset in the "Neural Net Pattern Recognition" app, I modified the net to add a second hidden layer (hiddenLayerSize = [7 7]). I am using the training function 'trainlm'. I would like to set "mu", "mu_dec", and "mu_inc" to different values for each hidden layer.

How can I do this?


For pattern recognition or classification with c classes, it is recommended to use

1. Columns of the c-dimensional unit matrix for the targets

2. TRAINSCG as the training function.
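As a sketch of point 1, the unit-matrix target columns can be built with ind2vec (the class indices below are made up for illustration):

```matlab
% Hypothetical labels for N = 6 samples and c = 3 classes
classInd = [ 1 3 2 2 1 3 ];
t = full(ind2vec(classInd))   % 3x6; column i is the unit vector for class classInd(i)
```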

I think I remember a verification demo in the documentation.

So my question is

Why are you doing this on a classification problem?

The use of TRAINLM came from what MathWorks used in their tutorial-style demo in APPS: Neural Net Pattern Recognition. But thanks, I will try TRAINSCG.
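For reference, switching to TRAINSCG is a one-liner either way (a sketch; the [7 7] matches the layer sizes mentioned above):

```matlab
net = patternnet([ 7 7 ], 'trainscg');   % specify the training function at construction
% or, on an existing net:
net.trainFcn = 'trainscg';
```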


1 Answer

Answer by Greg Heath
on 12 Jan 2013
 Accepted Answer

From examples of regression using fitnet([5 5]) and classification using patternnet([5 5]), I've deduced the following:

1. The learning rate, lr, and momentum constant, mc, can be specified for each set of layer weights between layer i and layer j.

2. For fitnet with default TRAINLM, the training properties listed in net.trainParam are fixed for all layer weights. In particular, mu, mu_dec, mu_inc and mu_max.

3. For patternnet with default TRAINSCG, the training properties listed in net.trainParam are fixed for all layer weights. In particular, sigma and lambda.

For example

    clear all, clc
    [x,t] = simpleclass_dataset;
    [ I N ] = size(x)
    [ O N ] = size(t)
    net = patternnet([ 5 5 ]);
    net = configure(net,x,t)
    % ---SNIP
    layerWeights = net.layerWeights
    %  layerWeights =
    %                 []                  []    []
    %   [1x1 nnetWeight]                  []    []
    %                 []    [1x1 nnetWeight]    []
    layerWeights21 = layerWeights{2,1}
    % layerWeights21 =
    %   Neural Network Weight
    %           delays: 0
    %          initFcn: (none)
    %     initSettings: .range
    %            learn: true
    %         learnFcn: 'learngdm'
    %       learnParam: .lr, .mc
    %             size: [5 5]
    %        weightFcn: 'dotprod'
    %      weightParam: (none)
    %         userdata: (your custom info)
    layerWeights21learnParam = layerWeights21.learnParam
    % layerWeights21learnParam =
    %   Function Parameters for 'learngdm'
    %   Learning Rate     lr: 0.01
    %   Momentum Constant mc: 0.9
    % So, the specifications have the form
    % net.layerWeights{2,1}.learnParam.lr = lr21 ;
    % net.layerWeights{3,2}.learnParam.lr = lr32 ;
    % net.layerWeights{2,1}.learnParam.mc = mc21 ;
    % net.layerWeights{3,2}.learnParam.mc = mc32 ;
    trainParam = net.trainParam
    % trainParam =
    %   Function Parameters for 'trainscg'
    %   Show Training Window Feedback   showWindow: true
    %   Show Command Line Feedback showCommandLine: false
    %   Command Line Frequency                show: 25
    %   Maximum Epochs                      epochs: 1000
    %   Maximum Training Time                 time: Inf
    %   Performance Goal                      goal: 0
    %   Minimum Gradient                  min_grad: 1e-006
    %   Maximum Validation Checks         max_fail: 6
    %   Sigma                                sigma: 5e-005
    %   Lambda                              lambda: 5e-007
    % Resulting in specifications of the form
    net.trainParam.sigma = sigma0 ;
    net.trainParam.lambda = lambda0 ;
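Putting the two deductions together, a minimal end-to-end sketch (the numeric values are arbitrary placeholders, not recommendations):

```matlab
[ x, t ] = simpleclass_dataset;
net = patternnet([ 5 5 ]);
net = configure(net, x, t);
% Learning parameters CAN be set per set of layer weights
% (arbitrary illustrative values):
net.layerWeights{2,1}.learnParam.lr = 0.02;
net.layerWeights{2,1}.learnParam.mc = 0.85;
net.layerWeights{3,2}.learnParam.lr = 0.005;
net.layerWeights{3,2}.learnParam.mc = 0.95;
% Training-function parameters apply to the whole net, not per layer:
net.trainParam.sigma  = 5e-5;
net.trainParam.lambda = 5e-7;
[ net, tr ] = train(net, x, t);
```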

Hope this helps.

Thank you for formally accepting my answer.



Thank you, Greg. Very helpful.

Yesterday, I also found that it is possible to "turn off" all the learning in a layer by:

net.layerWeights{2,1}.learn = 0;

However, this approach appears to have a bug (which I am investigating with MathWorks). Currently, at least from what I could tell, learning can be turned off only after the net is trained, not before. So I am using the following as a temporary workaround:

    % Train the Network
    [net,tr] = train(net,inputs,targets);

    % BUG WORKAROUND: Re-Initialize
    net = initlay(net);

    % Turn off the learning for Layer 2
    net.layerWeights{2,1}.learn = 0;

    % Re-Train the Network
    [net,tr] = train(net,inputs,targets);
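Once the bug is fixed, the intended sequence would presumably be the following (an untested sketch of how the property is meant to be used, per the documentation of net.layerWeights{i,j}.learn):

```matlab
net = patternnet([ 7 7 ]);
net = configure(net, inputs, targets);
net.layerWeights{2,1}.learn = 0;   % freeze these weights BEFORE training
[net,tr] = train(net, inputs, targets);
```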


Did you get an error message when trying to turn off learning after configure but before train?

Did you ask whether there are any other properties that can be changed after train, but not after configure?

Also, what if you train for one epoch and then configure? Are there any properties that cannot be changed at that point?

