
Thread Subject:
Neural Network Normalisation Issues

Subject: Neural Network Normalisation Issues

From: David

Date: 4 Oct, 2013 13:30:07

Message: 1 of 6

Hi,

I have created a neural network using the Neural Network Toolbox. I would like to export the weight and bias values to use in a neural network implemented outside of MATLAB. However, when I export the network weights and bias values I can't then use them to reproduce the same results, even within MATLAB via an M-file.

I'll explain what I have so far. I have a neural net called ANN. It takes a 128x3000 matrix as input (3000 samples, each with 128 points). The network has 10 perceptrons in the hidden layer and one in the output layer; the transfer functions are all tansig.

I extract the network's weight and bias values as follows:

ANN_Weights_input=ANN.IW{1,1}; % 10x128 matrix
ANN_Weights_hidden=ANN.LW{2,1}; % 1x10 matrix
ANN_Bias_input=ANN.b{1,1}; % 10x1 column vector
ANN_Bias_hidden=ANN.b{2,1}; % 1x1 scalar

I am assuming the weights for the input-hidden layer are arranged so that each row represents a perceptron and each column corresponds to an input, and that each element of the bias vector corresponds to a perceptron. It's important that this is correct, because if I've interpreted it incorrectly the result will obviously be wrong!

Once I have these values, for each hidden-layer perceptron I multiply the inputs (1-128) by the weights (row = perceptron number, column = input number) and accumulate a running total, then add the bias and apply the tansig function to the result. I then multiply the hidden-layer outputs by the corresponding hidden-output weights, sum them, add the output bias, and apply tansig once more to obtain the neural network output.
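In code, that procedure looks roughly like this for a single 128x1 input column (x is just an illustrative name; the weight/bias variables are the ones extracted above):

```matlab
% Forward pass for one sample x (128x1), written out as loops
h = zeros(10,1);                                  % hidden-layer outputs
for i = 1:10
    a = 0;
    for j = 1:128
        a = a + ANN_Weights_input(i,j) * x(j);    % sum weights*inputs
    end
    h(i) = tansig(a + ANN_Bias_input(i));         % add bias, apply tansig
end
o = 0;
for j = 1:10
    o = o + ANN_Weights_hidden(1,j) * h(j);       % hidden -> output layer
end
y = tansig(o + ANN_Bias_hidden);                  % network output
```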

The problem is that the result is always around -1, whereas when I run the sim(ANN, Input) or output = ANN(Input) command the result is what I expect. I have read that the toolbox applies some normalization to the data by default. I have tried disabling this by setting

ANN.inputs{1}.processFcns = {};
ANN.outputs{2}.processFcns = {};

and retraining, but the results I get are the same. I'm convinced that something is being done to the data that I don't know about, and until I find it and either apply it to my input data or disable it, my results won't match.

Please help! (Sorry for the long post; I wanted to give as much info as possible.)

David

 

Subject: Neural Network Normalisation Issues

From: David

Date: 4 Oct, 2013 16:51:11

Message: 2 of 6

Just to add: I've also tried leaving the normalisation in place and applying it to my data, so before I use the data I normalise it. Checking net.inputs{1}.processFcns I see that the network uses removeconstantrows and mapminmax; net.outputs{2}.processFcns is the same. So I apply these to my data in the following way:

[target1,ts1]=removeconstantrows(target); % Normalise target data used to train network
[target_norm,ts2]=mapminmax(target1);

[Input1,ps1]=removeconstantrows(Input); % Normalise input data
[Input_norm,ps2]=mapminmax(Input1);

I then implement the network as described previously. Once I get an output I attempt to undo the normalisation using

OP1=mapminmax('reverse',output,ts2);
OP=removeconstantrows('reverse',OP1,ts1);

But I'm not getting anything sensible out of that either.

Any help greatly appreciated

David

Subject: Neural Network Normalisation Issues

From: Greg Heath

Date: 4 Oct, 2013 20:50:09

Message: 3 of 6

"David " <david.martin.hind@gmail.com> wrote in message <l2mfsv$5gg$1@newscl01ah.mathworks.com>...
> Hi,
>
> I have created a neural network using the neural network toolbox. I would like to export the weights and bias values to use in a neural network implemented outside of Matlab. However when I export the network weights and bias values I can't then use these to reproduce the same results, even in Matlab through a m-file.

By default, the biases and weights operate on normalized inputs to prevent truncation errors and saturation and produce normalized outputs.

Therefore, you must normalize inputs to the trained net and unnormalize the outputs.
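For example, something along these lines (assuming the default removeconstantrows/mapminmax processing): the key point is to 'apply' the settings stored in the trained net, not to recompute them from the data with a fresh mapminmax call. manual_forward here is a hypothetical name for your hand-coded net:

```matlab
% Reuse the settings stored in the trained net
xs = net.inputs{1}.processSettings;       % one settings struct per processFcn
xn = mapminmax('apply', x, xs{end});      % assuming mapminmax is the last one
yn = manual_forward(xn);                  % your hand-coded forward pass
ts = net.outputs{2}.processSettings;
y  = mapminmax('reverse', yn, ts{end});   % undo the output normalisation
```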

You can find some examples by searching ANSWERS using

tsettings

Hope this helps

Greg

Subject: Neural Network Normalisation Issues

From: David

Date: 6 Oct, 2013 13:51:11

Message: 4 of 6

"Greg Heath" <heath@alumni.brown.edu> wrote in message <l2n9m1$en$1@newscl01ah.mathworks.com>...
> [earlier messages snipped]

Thanks Greg,

I think I am normalising the inputs and de-normalising the outputs as suggested, but I'm still not getting results comparable to the sim(net, Samps) command. I've included my code below.

code:

clc;
clearvars j i H_Perceptron O_Perceptron LSQ_norm tsettings Samps_norm isettings net_Weights_input net_Weights_hidden net_Bias_input net_Bias_hidden

%[LSQ_normo,ts1]=removeconstantrows(LSQ2);
[LSQ_norm,tsettings]=mapminmax(LSQ2);

%[Samps_normo,ps1]=removeconstantrows(Samps);
[Samps_norm,isettings]=mapminmax(Samps);

% Import the net weight and bias data. NB the
% network must be in the workspace
net_Weights_input=net.IW{1,1};
net_Weights_hidden=net.LW{2,1};
net_Bias_input=(net.b{1,1})';
net_Bias_hidden=(net.b{2,1})';

% Find Size of net layers
[hidden,input] = size(net_Weights_input);
[output,hidden] = size(net_Weights_hidden);


for i=1:1:hidden
    H_Perceptron(i) = 0;
end

O_Perceptron = 0;


for iteration=1:1:3000
    
    for i=1:1:hidden
        for j=1:1:input
            H_Perceptron(i)=H_Perceptron(i)+(Samps_norm(j,iteration)*net_Weights_input(i,j)); % sum weights*inputs
        end
        H_Perceptron(i)=H_Perceptron(i)+net_Bias_input(1,i); % add bias
    end
    
    for i=1:1:hidden
        H_Perceptron(i)=tansig(H_Perceptron(i)); % apply tansig transfer function
    end
        
   % Output layer, only 1 output perceptron
    for j=1:1:hidden
        O_Perceptron=O_Perceptron+(H_Perceptron(j)*net_Weights_hidden(1,j)); % sum weights*hidden perceptron outputs
    end
   
    O_Perceptron=O_Perceptron+net_Bias_hidden(1,1); % add bias
    O_Perceptron=tansig(O_Perceptron); % apply tansig transfer function
       
    OP_data=mapminmax('reverse',O_Perceptron,tsettings); %de-normalise output

    fprintf('%f\n',OP_data); % Display result
    result(iteration)=OP_data; % Store result
       
end
% end of code

As you can see (in the commented-out lines), I've also tried the removeconstantrows normalisation/de-normalisation. Currently net.inputs{1}.processFcns returns

ans =

    'mapminmax'

The output returns the same ans. Am I able to upload files on here? That would have been easier! I could also have uploaded the workspace contents to look at.

David

Subject: Neural Network Normalisation Issues

From: Greg Heath

Date: 6 Oct, 2013 18:42:05

Message: 5 of 6

"David " <david.martin.hind@gmail.com> wrote in message <l2rpsf$lrc$1@newscl01ah.mathworks.com>...
> [quoted code snipped]

Please use matrix multiplication instead of loops.
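For example, your whole loop collapses to two matrix products over the full 128x3000 batch (a sketch using the variables from your post; bsxfun broadcasts the bias vector across columns):

```matlab
% Vectorized forward pass for all 3000 samples at once
H = tansig(bsxfun(@plus, net.IW{1,1} * Samps_norm, net.b{1}));  % 10x3000
Y = tansig(bsxfun(@plus, net.LW{2,1} * H,          net.b{2}));  % 1x3000
result = mapminmax('reverse', Y, tsettings);                    % de-normalise
```

This also sidesteps the accumulator bookkeeping entirely, since nothing carries over between samples.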

Test your code on one of the MATLAB nndatasets so that we can compare answers.

help nndatasets

If there are any problems, post relevant code and error messages.

Hope this helps.

Greg

Subject: Neural Network Normalisation Issues

From: David

Date: 7 Oct, 2013 11:09:06

Message: 6 of 6

"Greg Heath" <heath@alumni.brown.edu> wrote in message <l2satt$jcf$1@newscl01ah.mathworks.com>...
> [earlier messages snipped]

Hi,

Problem Solved!

You were of course right, Greg, about the normalization, which was the root cause. I had also made a silly mistake in my code: the hidden and output perceptron accumulators were not being reset at the beginning of every loop!
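For anyone finding this later, the corrected loop ends up looking roughly like this (same variables as the code posted earlier; the fix is the two zeroing lines at the top of each pass):

```matlab
for iteration = 1:3000
    H_Perceptron = zeros(1, hidden);   % <-- reset every sample (the fix)
    O_Perceptron = 0;                  % <-- likewise
    for i = 1:hidden
        for j = 1:input
            H_Perceptron(i) = H_Perceptron(i) + Samps_norm(j,iteration)*net_Weights_input(i,j);
        end
        H_Perceptron(i) = tansig(H_Perceptron(i) + net_Bias_input(i));
    end
    for j = 1:hidden
        O_Perceptron = O_Perceptron + H_Perceptron(j)*net_Weights_hidden(1,j);
    end
    O_Perceptron = tansig(O_Perceptron + net_Bias_hidden);
    result(iteration) = mapminmax('reverse', O_Perceptron, tsettings);
end
```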

Thanks Again

David
