"Greg Heath" <heath@alumni.brown.edu> wrote in message <l2satt$jcf$1@newscl01ah.mathworks.com>...
> "David " <david.martin.hind@gmail.com> wrote in message <l2rpsf$lrc$1@newscl01ah.mathworks.com>...
> > "Greg Heath" <heath@alumni.brown.edu> wrote in message <l2n9m1$en$1@newscl01ah.mathworks.com>...
> > > "David " <david.martin.hind@gmail.com> wrote in message <l2mfsv$5gg$1@newscl01ah.mathworks.com>...
> > > > Hi,
> > > >
> > > > I have created a neural network using the neural network toolbox. I would like to export the weights and bias values to use in a neural network implemented outside of Matlab. However when I export the network weights and bias values I can't then use these to reproduce the same results, even in Matlab through a mfile.
> > >
> > > By default, the biases and weights operate on normalized inputs to prevent truncation errors and saturation and produce normalized outputs.
> > >
> > > Therefore, you must normalize inputs to the trained net and unnormalize the outputs.
> > >
> > > You can find some examples by searching ANSWERS using
> > >
> > > tsettings
> > >
> > > Hope this helps
> > >
> > > Greg
> >
> > Thanks Greg,
> >
> > I think I am doing the normalisation of the inputs and denormalisation of the outputs as suggested, but I'm still not getting results comparable to the sim(net, Samps) command. I've included my code below.
> >
> > code:
> >
> > clc;
> > clearvars j i H_Perceptron O_Perceptron LSQ_norm tsettings Samps_norm isettings net_Weights_input net_Weights_hidden net_Bias_input net_Bias_hidden
> >
> > %[LSQ_normo,ts1]=removeconstantrows(LSQ2);
> > [LSQ_norm,tsettings]=mapminmax(LSQ2);
> >
> > %[Samps_normo,ps1]=removeconstantrows(Samps);
> > [Samps_norm,isettings]=mapminmax(Samps);
> >
> > % Import the net weight and bias data. NB the
> > % network must be in the workspace
> > net_Weights_input=net.IW{1,1};
> > net_Weights_hidden=net.LW{2,1};
> > net_Bias_input=(net.b{1,1})';
> > net_Bias_hidden=(net.b{2,1})';
> >
> > % Find Size of net layers
> > [hidden,input] = size(net_Weights_input);
> > [output,hidden] = size(net_Weights_hidden);
> >
> >
> > for i=1:1:hidden
> >     H_Perceptron(i) = 0;
> > end
> >
> > O_Perceptron = 0;
> >
> >
> > for iteration=1:1:3000
> >
> >     for i=1:1:hidden
> >         for j=1:1:input
> >             H_Perceptron(i)=H_Perceptron(i)+(Samps_norm(j,iteration)*net_Weights_input(i,j)); % sum weights*inputs
> >         end
> >         H_Perceptron(i)=H_Perceptron(i)+net_Bias_input(1,i); % add bias
> >     end
> >
> >     for i=1:1:hidden
> >         H_Perceptron(i)=tansig(H_Perceptron(i)); % apply tansig transfer function
> >     end
> >
> >     % Output layer, only 1 output perceptron
> >     for j=1:1:hidden
> >         O_Perceptron=O_Perceptron+(H_Perceptron(j)*net_Weights_hidden(1,j)); % sum weights*hidden perceptron outputs
> >     end
> >
> >     O_Perceptron=O_Perceptron+net_Bias_hidden(1,1); % add bias
> >     O_Perceptron=tansig(O_Perceptron); % apply tansig transfer function
> >
> >     OP_data=mapminmax('reverse',O_Perceptron,tsettings); % denormalise output
> >
> >     fprintf('%f\n',OP_data); % Display result
> >     result(iteration)=OP_data; % Store result
> >
> > end
> > % end of code
> >
> > As you can see, I've also tried the removeconstantrows normalisation/denormalisation. Currently net.inputs{1}.processFcns returns
> >
> > ans =
> >
> > 'mapminmax'
> >
> > The output returns the same answer. Am I able to upload files on here? That would have been easier, and I could also have uploaded the workspace contents for you to look at.
> >
> > David
>
> Please use matrix multiplication instead of loops.
>
> Test your code on one of the MATLAB nndatasets so that we can compare answers.
>
> help nndatasets
>
> If there are any problems, post relevant code and error messages.
>
> Hope this helps.
>
> Greg
Hi,
Problem Solved!
You were of course right, Greg, about the normalization, which was the root cause. I had also made a silly mistake in my code: the hidden and output perceptron accumulators were not being reset at the beginning of each loop iteration!
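For anyone who finds this thread later, here is a vectorised sketch of the forward pass along the lines Greg suggested (matrix multiplication instead of loops). It assumes a single hidden layer with tansig transfer functions on both layers, as in my code above; note that for an exact match with sim(net, Samps), the mapminmax settings saved at training time should be applied to the new inputs, rather than recomputed from them:

```matlab
% Vectorised forward pass (single hidden layer, tansig on both layers,
% as in the loop version above).
Samps_norm = mapminmax('apply', Samps, isettings); % normalise inputs with saved settings

IW = net.IW{1,1};   b1 = net.b{1};   % input-to-hidden weights and bias
LW = net.LW{2,1};   b2 = net.b{2};   % hidden-to-output weights and bias

N = size(Samps_norm, 2);                     % number of samples
H = tansig(IW*Samps_norm + b1*ones(1, N));   % hidden layer outputs
O = tansig(LW*H + b2*ones(1, N));            % output layer outputs

result = mapminmax('reverse', O, tsettings); % denormalise outputs
```

This also handles all 3000 samples in one pass, so there is nothing to reset between iterations.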
Thanks Again
David
