MATLAB neural network: strange simulation performance
Hello, and thank you for giving me the chance to ask this question. I am a relative beginner in MATLAB and neural network concepts, trying to research a prediction system for some economic and demographic data based on a neural network.
With a double input matrix and a double output matrix (both manually normalized), after a long session of training and comparing performances (lower is better) of different function combinations, I finally created five neural networks with the following sets of MATLAB functions:
#  creation  training  init    perform  learning  transfer
1  newcf     trainlm   initnw  mse      learngd   satlin
2  newcf     trainlm   initnw  msne     learngdm  compet
3  newelm    trainlm   initwb  mse      learnhd   purelin
4  newff     trainlm   initnw  mse      learngd   purelin
5  newff     trainlm   initnw  mse      learnwh   tansig
The best training performance (perf) for each of them varies from xxxE-30 to xxxE-32.
But still, after simulating these networks on each single column of the input matrix, I get the expected output in only 60% of the cases, while the other 40% are totally wrong.
I get exactly the same 60%/40% split between good and bad simulation results for all of the above networks, although the bad columns differ from net to net.
Can something like this happen? What do you think could be wrong? Maybe the perf value reached during training is not enough to judge whether a neural network is good enough? Or maybe I misunderstood something in the concepts?
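A quick way to check whether the training perf tells the whole story is to compare it against the error on the columns that train() held out. A minimal sketch, assuming inp and outp are plain double matrices and using the division record tr that train returns:

```matlab
% Compare training-set vs. held-out error (sketch; assumes matrices inp, outp
% and a network my_net with a data division function such as 'divideblock')
[my_net, tr] = train(my_net, inp, outp);
Yall = sim(my_net, inp);                      % simulate all columns at once
e = outp - Yall;                              % per-element errors
mseTrain = mean(mean(e(:, tr.trainInd).^2));  % error on training columns
mseTest  = mean(mean(e(:, tr.testInd).^2));   % error on held-out columns
% a tiny mseTrain next to a much larger mseTest is a sign of overfitting
```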
Thank you in advance
P.S. The approximate (pseudo)code (MATLAB R2010) of what I did is given below:
inp = [
0.1300  0.0300  0.0300  0.0300 -0.0100  0.0300  0.0900  0.0100  0.0600  0.0700;
0.0500 -0.0400 -0.0400 -0.0400 -0.0100  0.0100  0.1400  0.0900  0.0600 -0.1700;
............
];
outp = [
 0.0427 -0.1071  0.0605 -0.0637 -0.0410  0.2566 -0.0551 -0.0902 -0.2483  0.1543;
-0.0249  0.0192 -0.1199 -0.3748  0.3212  0.5490 -0.1655 -0.1213 -1.0236  0.4678;
 0.1000 -0.1000  0.4000  0.2000 -0.3000 -0.9000 -0.6000 -0.7000  0.2000  0.4000;
...........
];
[out_r, out_c] = size(outp); % outp is a plain double matrix
[inp_r, inp_c] = size(inp);
%------------ for 2 layers:
biasConnect = [1;1];
inputConnect = [1; 0];
layerConnect = [0 0; 1 0];
outputConnect = [0 1];
%-------------or for 3 layers:
% biasConnect = [1;1;1];
% inputConnect = [1; 0; 0];
% layerConnect = [0 0 0; 1 0 0; 0 1 0];
% outputConnect = [0 0 1];
my_net = network(1, 2, biasConnect, inputConnect, layerConnect, outputConnect); % create a custom neural network
my_net.inputs{1}.size = inp_r;  % number of elements in an input vector
my_net.outputs{2}.size = out_r; % number of elements in an output vector (outputs come from the last layer)
my_net = newff(inp, outp, [inp_r out_r]); % create a feed-forward backpropagation network (this replaces the custom network built above)
% or for 3 layers: my_net = newff(inp, outp, [inp_r round(inp_r/2) out_r]);
my_net.layers{1}.size = inp_r; % 1st layer size
my_net.layers{1}.transferFcn = 'purelin'; % transfer function
my_net.layers{1}.initFcn = 'initnw';
%-------------------- Functions ---------------------------------%
my_net.divideFcn = 'divideblock';
my_net.plotFcns = {'plotperform','plottrainstate'};
my_net.initFcn = 'initlay';   % layer init function
my_net.performFcn = 'mse';    % performance function
my_net.trainFcn = 'trainlm';  % training function
my_net.adaptFcn = 'trains';   % adapt function; per-weight learning functions (learngd, learngdm) are set via learnFcn, not adaptFcn
%------------------- set a few training params and train the net
my_net.trainParam.epochs = 100;
my_net.trainParam.goal = 1.0000e-030;
my_net.trainParam.max_fail = 3;
my_net.trainParam.mu = 1.0000e-03;
my_net.trainParam.mu_inc = 10;
my_net.trainParam.mu_dec = 0.1000;
my_net.trainParam.mu_max = 1e10;
my_net.trainParam.showWindow = false;
my_net.trainParam.showCommandLine = false;
my_net.trainParam.show = 0;
[my_net, tr] = train(my_net, inp, outp); % train the network
%--- After the best my_net is found, I perform a simulation for each column:
Y = sim(my_net, inp(:, i)) % for each i-th column of inp, expecting Y = outp(:, i)
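The per-column check above can be written as a loop; a sketch, assuming inp and outp are double matrices and a tolerance tol (a hypothetical value, chosen for the normalized data):

```matlab
% Count the columns whose simulated output misses the target
tol = 0.05;                         % assumed tolerance for a "good" column
nBad = 0;
for i = 1:inp_c
    Y = sim(my_net, inp(:, i));               % simulate one column
    if max(abs(Y - outp(:, i))) > tol         % worst-case deviation per column
        nBad = nBad + 1;
    end
end
fprintf('bad columns: %d of %d (%.0f%%)\n', nBad, inp_c, 100*nBad/inp_c);
```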
Accepted Answer
Greg Heath
on 27 Apr 2013
What version do you have?
If this is classification/pattern recognition, use patternnet, newpr (OLD), or newff (VERY OLD).
If this is regression/curve fitting, use fitnet, newfit (OLD), or newff (VERY OLD).
1. Use ALL of the defaults as in the help examples e.g.,
help fitnet
2. Apply to the MATLAB nndataset that is most similar to yours.
help nndatasets
3. Apply to your data.
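Steps 1-3 above could look like this; a minimal sketch, assuming fitnet is available (R2010b or newer) and borrowing simplefit_dataset from nndatasets:

```matlab
% Steps 1+2: fitnet with all defaults, applied to a standard toolbox dataset
[x, t] = simplefit_dataset;     % inputs and targets from nndatasets
net = fitnet(10);               % default fitting network with 10 hidden neurons
[net, tr] = train(net, x, t);   % default dividerand train/val/test split
y = net(x);                     % simulate
perf = perform(net, t, y)       % default mse performance
% Step 3: swap in your own inp/outp matrices for x and t
```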
If you have any problems, post a reply with error messages and/or code.
Hope this helps.
Greg