Narnet: How does the prediction work?

I am getting started with neural networks and wanted to understand the prediction in the Global Ice example from MATLAB's NAR network documentation. I used the wizard
nnstart
and created the neural network net. Then I had a look at the generated code. There seems to be a way to predict future Global Ice values by calling
closeloop(net)
Anyway, I do not understand how to get the predicted data (let's say I want to predict 5 timesteps). Or is closeloop only relevant for NARX networks? Until now I used
removedelay
to predict one value, then fed the network my "old data" plus the newly generated value in order to predict the one after that (and so on). I am very unsure about this method.
This seems like a dumb beginner question, but searching the web, and this forum in particular, did not help either, as many questions were unanswered or not answered precisely (e.g. http://www.mathworks.com/matlabcentral/answers/9424 has some bad errors).

Accepted Answer

Greg Heath on 8 Jul 2013
Edited: Greg Heath on 8 Jul 2013
% N = length of the ICE time series; nncorr is from the Neural Network
% Toolbox (the Signal Processing Toolbox's xcorr would also work).
sigthresh95 = zeros(1,500);
for I = 1:500 % or any other large number you like
    % 1. Autocorrelation function of randn(1,N), white noise of length N
    n = randn(1,N);
    autocorrn = nncorr(n, n, N-1, 'biased'); % 2*N-1 lags, lag 0 at index N
    % 2. Sort the 2*N-1 absolute values and find the value at index
    %    M = floor(0.95*(2*N-1))
    sortabs = sort(abs(autocorrn));
    M = floor(0.95*(2*N-1));
    % 3. Store the value in sigthresh95(I)
    sigthresh95(I) = sortabs(M);
end
sigthresh = mean(sigthresh95)
Obtain the autocorrelation function of the ICE time series and find the significant positive lags where abs(autocorrt(N+1:end)) > sigthresh.
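For example (a minimal sketch, assuming the toolbox's ice_dataset as the series and nncorr as above; the variable names are illustrative):
y = cell2mat(ice_dataset);                   % row vector of length N
N = length(y);
zt = (y - mean(y)) / std(y);                 % zero-mean, unit-variance
autocorrt = nncorr(zt, zt, N-1, 'biased');   % 2*N-1 lags, lag 0 at index N
siglags = find(abs(autocorrt(N+1:end)) > sigthresh) % significant positive lags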
If you want to predict m timesteps ahead
1. Design an openloop narnet (a minimal code sketch for steps 1-7 follows this list) with
a. divideFcn = divideblock
b. Either all feedback delays in 1:m
c. Or all significant feedback delays in the range 1:m.
2. The length of the delay buffer will be m even if you use only the
significant subset of 1:m. Consequently, the active series that
preparets produces as network input has length N-m.
3. Use closeloop to get a feedback net that will drive itself given m
consecutive values in the delay buffer.
4. Test the closeloop net on the design (training + validation) data.
5. If closeloop performance is poor, use train to improve the closeloop
design with the initial weights being those of the openloop design.
6. Use the last m points in the design data to predict the test data.
7. If that is successful, use the last m points of the test data to predict into the wild blue yonder.
8. For a given m you can vary the number of hidden nodes, H, and the train/val/test division ratios to improve results. For each combination try 10 different sets of random initial weights.
9. Finally, you can shorten prediction time by using the REMOVEDELAY function.
10. If you search ANSWERS you may be able to find one of my narxnet examples that uses closeloop and removedelay. I don't remember whether I posted any narnet examples with those extensions.
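Putting steps 1-7 together, a minimal sketch (assuming the toolbox's ice_dataset as the target; m, H, k, and all variable names here are illustrative choices, not code from the answer above):
T = ice_dataset;                        % cell-array target series
m = 5; H = 10;                          % assumed horizon and hidden layer size
neto = narnet(1:m, H);                  % 1. openloop design with delays 1:m
neto.divideFcn = 'divideblock';         %    contiguous train/val/test blocks
[Xo, Xoi, Aoi, To] = preparets(neto, {}, {}, T);
[neto, tro] = train(neto, Xo, To, Xoi, Aoi);
netc = closeloop(neto);                 % 3. feedback net that drives itself
[Xc, Xci, Aci, Tc] = preparets(netc, {}, {}, T);
[Yc, Xcf, Acf] = netc(Xc, Xci, Aci);    % 4. test closedloop on the known data
% 5. if closedloop performance is poor, retrain; closeloop has already
%    copied the openloop weights in as the initial weights
[netc, trc] = train(netc, Xc, Tc, Xci, Aci);
[Yc, Xcf, Acf] = netc(Xc, Xci, Aci);    % rerun to refresh the final states
% 6./7. a closedloop narnet has no external input, so an empty cell with
% k columns plus the final delay states Xcf/Acf yields k new predictions
k = 5;                                  % assumed number of steps to predict
Yfuture = netc(cell(0, k), Xcf, Acf)
The final states Xcf/Acf play the role of "the last m points": they seed the delay buffer, and each predicted value is fed back to produce the next one.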
Hope this helps.
Thank you for formally accepting my answer.
Greg
  1 Comment
Vincent
Vincent on 15 Jul 2013
Edited: Vincent on 15 Jul 2013
I tried to understand the steps; could you be more precise about Step 7? That is exactly where I'm stuck. I mean:
  1. How can I determine the number of cycles the network should run (= the number of steps to predict)?
  2. How can I even input "initial" data after the loop is closed?
Am I right that, for prediction, I first have to close the loop? And that removedelay is not really necessary?
My code looks like this for now:
targetSeries = tonndata(inputData,false,false);
net = narnet(1:4,10); % delays and neuron count arbitrary so far
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
[X,Xi,Ai,t] = preparets(net,{},{},targetSeries);
[net,tr] = train(net,X,t,Xi,Ai);
netc = closeloop(net);
[Xc,Xic,Aic,Tc] = preparets(netc,{},{},targetSeries);
% NOW it's getting interesting:
outputc = netc(Xc,Xic,Aic);
The two questions above relate to the last line.
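One possible continuation of that last line, sketched with hypothetical names: ask the network call for its final delay states, then feed an empty input with one column per step to predict.
[outputc, Xf, Af] = netc(Xc, Xic, Aic);       % also return the final states
numPredict = 5;                               % chosen number of future steps
predicted = netc(cell(0, numPredict), Xf, Af) % closed loop feeds itself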
