Neural Network generated code gives me different results than my own code

Hi!
The neural network script generated by MATLAB takes some series X and splits it according to the parameters training-set ratio, validation-set ratio and test-set ratio; in my case this ratio is 0.55/0.15/0.3. However, I would like to write a script that predicts the same number of elements but receives only the training set and a number h, the horizon (how many values the neural network should predict). Because I give the neural network only the training set (instead of the whole series), keeping the same proportions means I should split the training set with ratio 0.78/0.22/0.0 (there is no test set, so the test ratio is 0). The code of my function is below (also in the attached file).
function y_predykcja = matlabNeuralNetworkScript(training_set, horizon)
% Train an open-loop NAR network on training_set, then feed its own
% predictions back to forecast 'horizon' future values.
T = tonndata(training_set,true,false);
trainFcn = 'trainlm';   % Levenberg-Marquardt
feedbackDelays = 1:2;
hiddenLayerSize = 10;
net = narnet(feedbackDelays,hiddenLayerSize,'open',trainFcn);
net.input.processFcns = {'removeconstantrows','mapminmax'};
[x,xi,ai,t] = preparets(net,{},{},T);
net.divideParam.trainRatio = 0.78;
net.divideParam.valRatio = 0.22;
net.divideParam.testRatio = 0;
net.trainParam.showWindow = false;
net.performFcn = 'mse';   % mean squared error
% Train the network
[net,tr,Ys,Es,Xf,Af] = train(net,x,t,xi,ai,'useParallel','no');
y = net(x,xi,ai);
% Iterated prediction: feed the network its own output for 'horizon' steps
y_predykcja = zeros(1,horizon);
for i = 1:horizon
    Xnew = net(x,Xf,Af);
    Xf = [Xf Xnew];
    Xf = Xf(1,2:3);   % keep only the last two delay states
    y_predykcja(1,i) = cell2mat(Xf(1,2));
end
end
My solution works... but not as well as the script generated by MATLAB. For example, I use the series from load ice_dataset. If I use ntstool, where the whole series is divided with ratio 0.55/0.15/0.3, I get an MSE of 0.02. When I split this data into a training_set (0.7 of the whole set) and then use my script, I get an MSE of 2. If I use a sine series, the MSE of the MATLAB script is 1.4e-10, while in my case it is 1.4e-8. Could anybody explain why? How can I fix my script to get the expected accuracy?
Best regards, Jan

Accepted Answer

Greg Heath on 2 Sep 2015 (edited 2 Sep 2015)
I don't quite follow your logic. You seem to be trying to mimic the effect of using a closed loop configuration. I'm not sure of the validity. However, it seems that the last loop should be something like
Xnew = Ys; Xinew = Xf; Ainew = Af; Ypred = {[]};
for i = 1:horizon
    [Ynew,Xfnew,Afnew] = net(Xnew,Xinew,Ainew);
    Ypred = [Ypred Ynew];
    Xnew = Ynew; Xinew = Xfnew; Ainew = Afnew;
end
Hope this helps.
Greg
P.S. I did not test this
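For comparison, here is a minimal sketch of the documented narnet closed-loop multistep-prediction pattern (closeloop seeded with the final states of the open-loop run). It assumes the net, x, xi, ai produced by the training code in the question and a horizon h; it illustrates the same idea as the loop above and is not tested against that code.
% Run the trained open-loop net over the training data and keep the final
% input and layer delay states.
[y,xf,af] = net(x,xi,ai);
% Close the loop, seeding it with those final states, and predict h steps
% ahead; a NAR net has no external inputs, hence the empty cell array.
[netc,xic,aic] = closeloop(net,xf,af);
yPred = netc(cell(0,h),xic,aic);
y_predykcja = cell2mat(yPred);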
  2 Comments
Jan Kostrzewa on 4 Sep 2015
You are right. In the last loop I try to mimic a closed-loop configuration. I like to write everything "my own style" to understand everything ;)
However, my question concerned the code before this last loop. I ran several simulations and they showed that the difference in results comes from calculating the MSE differently. In ntstool, MATLAB always feeds the true values as inputs, even when they come from the test set, whereas I feed the network its own predicted values to predict new values. Because I predict over the whole horizon this way, the error accumulates :)
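To illustrate the difference described here, a small sketch (assuming the net and T from the question) comparing the one-step-ahead error that ntstool reports with the error of the fully closed loop, where predictions are fed back:
% One-step-ahead (open-loop) error: the true past values are always fed back.
[x,xi,ai,t] = preparets(net,{},{},T);
yOpen = net(x,xi,ai);
mseOpen = perform(net,t,yOpen);
% Multi-step (closed-loop) error: the network's own predictions are fed back,
% so errors accumulate along the horizon and the MSE is typically much larger.
netc = closeloop(net);
[xc,xic,aic,tc] = preparets(netc,{},{},T);
yClosed = netc(xc,xic,aic);
mseClosed = perform(netc,tc,yClosed);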
