How can I freeze specific weights of Neural network model?
Utkarsh on 18 Jun 2020
If you wish to assign those weights at the beginning and keep them constant, you can set the 'WeightLearnRateFactor' property of those layers to 0. This factor multiplies the global learning rate to give the learning rate for that layer's weights, so a factor of 0 means the weights are never updated during training. (To freeze the biases as well, also set 'BiasLearnRateFactor' to 0.)
convolution2dLayer(3,1,'Padding',[1 1 1 1],'WeightLearnRateFactor',0); % weights keep their initial values during training
You may refer to the convolution2dLayer documentation to learn more about such properties.
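The same idea extends to freezing an entire pretrained network for transfer learning. A minimal sketch (assuming the Deep Learning Toolbox; the use of alexnet and the isprop check are illustrative):

```matlab
% Zero the learn-rate factors of every layer that has weights,
% freezing the pretrained features while later layers stay trainable.
net = alexnet;                  % or your pre-trained network
layers = net.Layers;            % copy, since Layers is read-only
for i = 1:numel(layers)
    if isprop(layers(i),'WeightLearnRateFactor')
        layers(i).WeightLearnRateFactor = 0;
        layers(i).BiasLearnRateFactor = 0;
    end
end
% 'layers' can now be used to build a new network for training,
% e.g. after replacing the final layers for your own task.
```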
Alternatively, if you want to fix certain weights in a trained network, assign those values to the layers directly after training. Note that the Layers property of a trained network is read-only, so copy the layer array, modify it, and reassemble the network:
net = alexnet;                         % or your pre-trained network
layers = net.Layers;                   % copy the layer array (Layers is read-only)
layers(2).Weights = randn(11,11,3,96); % layer 2 is the first conv layer in AlexNet; use the index and weight array for your network
net = assembleNetwork(layers);         % rebuild the network with the new weights