Good performance, poor output with NARNET

I use narnet for multistep-ahead prediction, and my performance values are quite good:
NMSEs = 0.2560
openLoopPerformance = 0.2418
NMSEc = 0.9366
closedLoopPerformance = 0.8844
The open-loop output fits well, but the closed-loop output does not:
What is the problem? Thank you for your help.

Accepted Answer

Greg Heath
Greg Heath on 16 Jun 2015
I'm really surprised by these posts. Just within the past year or so I've posted several times regarding:
1. How to identify significant auto- and cross-correlation lags so that an effective subset can be chosen for time-series prediction.
2. How to use a double-loop approach to obtain multiple random-initial-weight and random-data-division designs for each of a range of values for the number of hidden nodes.
3. What to do when the closed-loop performance is significantly worse than the open-loop performance.
4. How to predict time-series outputs beyond the time of known targets.
Just searching with the word "closeloop" over the last 13 months in the NEWSGROUP yields these somewhat useful posts:
29 May 2015 NARNET TUTORIAL ON MULTISTEP AHEAD PREDICTIONS
27 May 2015 NARXNET CODE FOR MULTISTEP AHEAD PREDICTIONS
7 May 2015 HELP NARXNET BUGS, COMMENTS and SUGGESTIONS.
24 Jul 2014 perform using mse is giving me a abnormal value
17 Jul 2014 How to train my narxnet with untrained data??
If you have any questions, I recommend using the MATLAB example data obtained from the commands
help nndatasets
doc nndatasets
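For reference, a minimal open-loop-then-closed-loop workflow on one of those example datasets might look like the sketch below. This is not Greg's exact code; the feedback delays (1:2) and the hidden layer size (10) are illustrative assumptions, not tuned values.

```matlab
% Minimal narnet workflow on the simplenar_dataset example series.
% Lag set 1:2 and 10 hidden nodes are illustrative assumptions.
T = simplenar_dataset;                 % cell array of scalar targets
net = narnet(1:2, 10);
net.divideFcn = 'divideblock';         % keep train/val/test blocks contiguous

% Open-loop (one-step-ahead) training
[Xs, Xi, Ai, Ts] = preparets(net, {}, {}, T);
net = train(net, Xs, Ts, Xi, Ai);
Yo  = net(Xs, Xi, Ai);
ts  = cell2mat(Ts);  yo = cell2mat(Yo);
NMSEs = mean((ts - yo).^2) / var(ts, 1)   % open-loop normalized MSE

% Closed-loop (multistep-ahead) evaluation of the same network:
% the network's own outputs are fed back in place of the true targets
netc = closeloop(net);
[Xc, Xci, Aci, Tc] = preparets(netc, {}, {}, T);
Yc  = netc(Xc, Xci, Aci);
tc  = cell2mat(Tc);  yc = cell2mat(Yc);
NMSEc = mean((tc - yc).^2) / var(tc, 1)   % closed-loop normalized MSE
```

A large gap between NMSEs and NMSEc, as in the original question, is exactly the symptom points 2 and 3 above address.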
Hope this helps.
Greg
  2 Comments
Peta
Peta on 16 Jun 2015
I hope it's OK that I ask this here even though it's not my thread.
I'm a little confused by what you said in point 2 about trying "random data division designs". I've read many of your posts and guides, and from them I had gotten the impression that you usually tell people to override the default random data division function and use 'divideblock' instead. Are you now advocating 'dividerand', or do you mean something other than the division function when you talk about "data division designs"?
Greg Heath
Greg Heath on 20 Jun 2015
Edited: Greg Heath on 20 Jun 2015
Sorry for the confusion.
I recommend 'divideblock' for time-series prediction so that correlations at the chosen lags remain significant compared to those of random noise.
In that case the only randomness comes from the initial weights.
However, for regression/curve-fitting and classification/pattern-recognition I favor accepting the default 'dividerand' and exploiting the two-fold randomness to mitigate both unfortunate initial weights and unfortunate data ordering.
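In code, overriding the default division on an existing network is a one-line change; the 70/15/15 ratios below are the toolbox defaults, written out only for illustration:

```matlab
% Assumes net is an already-created network (e.g. from narnet or fitnet).
net.divideFcn = 'divideblock';       % contiguous blocks instead of random indices
net.divideParam.trainRatio = 0.70;   % default ratios, shown explicitly
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
```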
Thanks for the question.
Greg


More Answers (2)

Greg Heath
Greg Heath on 13 Jun 2015
None of these results are acceptable.
A good goal is NMSE << 1. For example, 0.005 or 0.01.
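For concreteness, the normalization Greg typically uses divides the mean-squared error by the (biased) target variance, so NMSE = 1 corresponds to the naive constant-mean predictor. A sketch, with t and y standing for the target and output row vectors:

```matlab
% NMSE: mean-squared error normalized by the biased target variance.
% NMSE << 1  -> the model explains most of the target variance.
% NMSE  = 1  -> no better than always predicting mean(t).
NMSE = mean((t - y).^2) / var(t, 1);
```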
  1 Comment
Gondos Gellert
Gondos Gellert on 13 Jun 2015
I tried to choose, by trial and error, the smallest subset of the smallest lags that yields a satisfactory result, but NMSEs was never significantly less than ~0.256.
How can I improve this?



Peta
Peta on 13 Jun 2015
I'm having exactly the same problem with my narnet closed-loop predictions. If I train narnets on, for example, the simplenar_dataset example data, everything works perfectly, but as soon as I move on to a real-life time series the prediction collapses just like in your image. And that happens even after I have tried ~30 different significant feedback delays, ~20 different hidden-node sizes, and 10 different weight initializations for each net. Even when my open net shows NMSEs = 0.0029, the closed net from the same training jumps to around NMSEc = 0.6 and the prediction becomes a worthless straight line.
Unbelievably frustrating stuff, so it's good to know I'm not alone, and I will make sure to follow what happens in this thread. What data are you using, by the way: is it something from MathWorks or your own time series?
