Neural Network - Multi Step Ahead Prediction
Hi all, please, I need your help!
I've read all the posts here about Time Series Forecasting but still can't figure it out! I'm drained. :-(
I have a NARX neural network with 10 hidden neurons and 2 delays. As input I have a 510x5 matrix (called Inputx) and as target a 510x1 vector (called Target).
I want to forecast 10 days ahead, but it's really not working...
I tried the following code, but I'm stuck now. :-(
Would you mind helping me? Some code would be awesome. :-(
inputSeries = tonndata(Inputx,false,false);     % 510x5 matrix -> 1x510 cell array of 5x1 columns
targetSeries = tonndata(Target,false,false);    % 510x1 vector -> 1x510 cell array of scalars
netc = closeloop(net);                          % net must already be a trained open-loop narxnet
netc.name = [net.name ' - Closed Loop'];
[xc,xic,aic,tc] = preparets(netc,inputSeries,{},targetSeries);
yc = netc(xc,xic,aic);                          % closed-loop (multi-step) simulation over the known inputs
2 Comments
Oleg Komarov
on 2 Sep 2011
Two things: please change the title of your post to something useful, and format the code: http://www.mathworks.com/matlabcentral/answers/13205-tutorial-how-to-format-your-question-with-markup#answer_18099
Constantine
on 21 Nov 2014
With respect to the accepted answer by Lucas García, I find that the predicted data only agrees with the actual data as well as his does every once in a while.
1. It's important to clear the variables (e.g. 'clear all') before running the fit. Re-running without clearing the variables leads to much worse fits.
2. Much better fits result from using a bigger delay, like 5, instead of the delay of 2 in his example, or from adding additional training data, such as the first or second time derivatives of the training data (see the sketch below). Of course, doing this makes the fit considerably slower.
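For anyone who wants to try point 2, here is a minimal sketch of that augmentation (my code, not Constantine's; it assumes the Inputx and Target variables from the question and uses simple differences as derivative proxies):
% Sketch: augment the inputs with first and second differences (derivative proxies)
dX  = [zeros(1,size(Inputx,2)); diff(Inputx)];   % first differences, padded to keep 510 rows
d2X = [zeros(1,size(Inputx,2)); diff(dX)];       % second differences
InputxAug = [Inputx, dX, d2X];                   % 510x15 augmented input matrix
inputSeries  = tonndata(InputxAug,false,false);
targetSeries = tonndata(Target,false,false);
net = narxnet(1:5,1:5,10);                       % larger delay of 5, as suggested above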
Accepted Answer
Lucas García
on 7 Sep 2011
Edited: Lucas García
on 3 Sep 2015
Hi Jack,
When using narxnet, the trained network performs only a one-step-ahead prediction. Therefore, you need to use closeloop to turn the network into its parallel configuration and then perform the multi-step-ahead prediction.
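As a minimal illustration of that conversion (assuming net is an already trained open-loop narxnet):
netc = closeloop(net); % feedback now comes from the network's own predictions
view(net)              % series-parallel (open loop): measured targets feed the tapped delays
view(netc)             % parallel (closed loop): predicted outputs feed the tapped delays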
Take a look at this example of a multi-step-ahead prediction of N steps. It uses the dataset magdata.mat, which is available in the Neural Network Toolbox. The last N inputs will be used for performing the multi-step-ahead prediction, and the results will be validated against the original targets. I hope the comments help you understand.
Edited in September 2015 to simplify step 5
%% 1. Importing data
S = load('magdata');
X = con2seq(S.u);
T = con2seq(S.y);
%% 2. Data preparation
N = 300; % Multi-step ahead prediction
% Input and target series are divided in two groups of data:
% 1st group: used to train the network
inputSeries = X(1:end-N);
targetSeries = T(1:end-N);
% 2nd group: this is the new data used for simulation. inputSeriesVal will
% be used for predicting new targets. targetSeriesVal will be used for
% network validation after prediction
inputSeriesVal = X(end-N+1:end);
targetSeriesVal = T(end-N+1:end); % This is generally not available
%% 3. Network Architecture
delay = 2;
neuronsHiddenLayer = 10;
% Network Creation
net = narxnet(1:delay,1:delay,neuronsHiddenLayer);
%% 4. Training the network
[Xs,Xi,Ai,Ts] = preparets(net,inputSeries,{},targetSeries);
net = train(net,Xs,Ts,Xi,Ai);
view(net)
Y = net(Xs,Xi,Ai);
% Performance for the series-parallel implementation, only
% one-step-ahead prediction
perf = perform(net,Ts,Y);
%% 5. Multi-step ahead prediction
[Xs1,Xio,Aio] = preparets(net,inputSeries(1:end-delay),{},targetSeries(1:end-delay));
[Y1,Xfo,Afo] = net(Xs1,Xio,Aio);
[netc,Xic,Aic] = closeloop(net,Xfo,Afo);
[yPred,Xfc,Afc] = netc(inputSeriesVal,Xic,Aic);
multiStepPerformance = perform(netc,targetSeriesVal,yPred);
view(netc)
figure;
plot([cell2mat(targetSeries),nan(1,N);
nan(1,length(targetSeries)),cell2mat(yPred);
nan(1,length(targetSeries)),cell2mat(targetSeriesVal)]')
legend('Original Targets','Network Predictions','Expected Outputs')
24 Comments
Nils
on 25 May 2020
"inputSeriesVal = X(end-N+1:end)"
What could be done if these values are not available?
Chris P
on 2 Aug 2020
I'm wondering the same thing as Nils. What if this is implemented in real time and we don't have these inputs planned in advance? For instance, if we are trying to run an optimal control scheme to determine the best input values to achieve a desired trajectory, or if some of the NN inputs are uncontrollable but necessary to get a good model prediction.
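One hedged workaround when the future exogenous inputs are not known in advance is to step the closed-loop network one sample at a time and feed it an assumed input, for example holding the last known value. This is only a sketch (not part of Lucas's answer) and reuses the variables netc, Xic, Aic, inputSeries and N from the example above:
xLast = inputSeries{end};            % last known exogenous input
Xi = Xic; Ai = Aic;                  % delay states carried over from the open-loop run
yPred = cell(1,N);
for k = 1:N
    xAssumed = {xLast};              % naive assumption: hold the input at its last known value
    [yk,Xi,Ai] = netc(xAssumed,Xi,Ai);
    yPred(k) = yk;                   % store the one-step closed-loop prediction
end
In a control setting, xAssumed would instead come from the controller (the planned or optimized input for that step).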
More Answers (5)
Mark Hudson Beale
on 9 Sep 2011
Here is an example that may help. A NARX network is trained on series inputs X and targets T, then the simulation is picked up at the end of X using continuation input data X2 with a closed-loop network. The final states after open-loop simulation with X are used as the initial states for closed-loop simulation with X2.
% DESIGN NETWORK
[x,t] = simplenarx_dataset;
net = narxnet;
[X,Xi,Ai,T] = preparets(net,x,{},t);
net = train(net,X,T,Xi,Ai);
view(net)
% SIMULATE NETWORK FOR ORIGINAL SERIES
[Y,Xf,Af] = sim(net,X,Xi,Ai);
% CONTINUE SIMULATION FROM FINAL STATES XF & AF WITH ADDITIONAL
% INPUT DATA USING CLOSED LOOP NETWORK.
% Closed Loop Network
netc = closeloop(net);
view(netc)
% 10 More Steps for the first (now only) input
X2 = num2cell(rand(1,10));
% Initial input states for closed loop continuation will be the
% first input's final states.
Xi2 = Xf(1,:);
% Initial 2nd layer states for closed loop continuation will be the
% processed second input's final states. Initial 1st layer states
% will be zeros, as they have no delays associated with them.
Ai2 = cell2mat(Xf(2,:));
for i=1:length(net.inputs{2}.processFcns)
  fcn = net.inputs{2}.processFcns{i};
  settings = net.inputs{2}.processSettings{i};
  Ai2 = feval(fcn,'apply',Ai2,settings);
end
Ai2 = mat2cell([zeros(10,2); Ai2],[10 1],ones(1,2));
% Closed loop simulation on X2 continues from open loop state after X.
Y2 = sim(netc,X2,Xi2,Ai2);
4 Comments
WT
on 1 Mar 2015
May I know what this "Ai2 = mat2cell([zeros(10,2); Ai2],[10 1],ones(1,2));" means?
Thank You
IOANNIS4
on 5 Aug 2015
Please can someone explain this part a little bit more:
Xi2 = Xf(1,:);
Ai2 = cell2mat(Xf(2,:));
for i=1:length(net.inputs{2}.processFcns)
  fcn = net.inputs{2}.processFcns{i};
  settings = net.inputs{2}.processSettings{i};
  Ai2 = feval(fcn,'apply',Ai2,settings);
end
Ai2 = mat2cell([zeros(10,2); Ai2],[10 1],ones(1,2));
Y2 = sim(netc,X2,Xi2,Ai2);
Please, you would really help us. Kind regards, Ioannis
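An editorial sketch of what that last mat2cell call builds (my illustration, assuming the default narxnet with 10 hidden neurons, 1 output neuron and 2 delays): [zeros(10,2); Ai2] is an 11x2 matrix, and mat2cell splits its rows into blocks of 10 and 1 while keeping each of the 2 columns separate, giving the 2x2 cell array of initial layer states that sim expects (one row per layer, one column per delay):
Ai2demo = mat2cell([zeros(10,2); rand(1,2)],[10 1],ones(1,2));
size(Ai2demo)       % [2 2]: layers x delays
size(Ai2demo{1,1})  % 10x1 zero states for the hidden layer (no delayed values needed there)
size(Ai2demo{2,1})  % 1x1 state for the output layer: a processed feedback value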
Greg Heath
on 25 Mar 2014
When the loop is closed, the net should be retrained with the original data, with initial weights equal to the final weights of the open-loop configuration.
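A minimal sketch of that workflow (my code, not Greg's; closeloop keeps the trained weights in netc.IW, netc.LW and netc.b, so retraining the closed-loop network simply continues from the open-loop solution):
netc = closeloop(net);                             % weights are copied from the trained open-loop net
[Xc,Xic,Aic,Tc] = preparets(netc,inputSeries,{},targetSeries);
netc = train(netc,Xc,Tc,Xic,Aic);                  % closed-loop training starts from those weights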
1 Comment
Mario Viola
on 26 Feb 2021
Just one question: how can I access the final weights from the open-loop configuration? And how do I set them in the new closed-loop configuration?
mladen
on 25 Oct 2013
Be aware that predicting outputs this way (similar to a cascade realization of a linear system) is highly sensitive to parameter estimation errors, because they propagate through the process Mark Hudson Beale mentioned. This is especially apparent in hard, multiple-steps-ahead problems.
Parallel realizations (simultaneous output estimation, for instance 10 outputs of the neural network for the next 10 time steps) tend to be less sensitive to these errors. I have implemented this with my own code, which is always prone to error :) So my sub-question is:
Is there some specific way to prepare my data for training with some MATLAB function?
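As far as I know there is no single built-in function for that layout, but the sliding-window matrices can be built in a few lines. A sketch (my code; the horizon H, the window length d and the use of fitnet are assumptions, and y stands for your numeric target series):
H = 10;                              % prediction horizon: 10 outputs = 10 steps ahead
d = 5;                               % number of past samples used as features
y = y(:)';                           % make sure the series is a row vector
nSamples = numel(y) - d - H + 1;
X = zeros(d,nSamples);
T = zeros(H,nSamples);
for k = 1:nSamples
    X(:,k) = y(k:k+d-1)';            % past d values as inputs
    T(:,k) = y(k+d:k+d+H-1)';        % next H values as targets
end
net = fitnet(10);
net = train(net,X,T);                % one network call predicts all H steps at once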
Murat Akdag
on 28 Mar 2015
I'm trying to understand narnet but still can't solve my problem. I've been looking for help in the MATLAB help section on this page: http://www.mathworks.com/help/nnet/ug/multistep-neural-network-prediction.html?searchHighlight=narnet%20multistep . I try the same code, but I get an error on >> [netc,xi,ai] = closeloop(net,xf,af); : too many arguments. I just need one working sample of a narnet that can predict 12 steps ahead. I tried to do it with the GUI in the MATLAB NN section, but that only predicts 1 step ahead with the removedelay command. I need 12 steps ahead. Thanks for your help.
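The three-output closeloop(net,xf,af) syntax is not available in older releases, which would explain the "too many arguments" error. A hedged workaround that only uses the one-argument form (a sketch; y stands for your numeric target series, and continuing beyond the end of the data in an old release can be done with the manual state construction shown in Mark Hudson Beale's answer above):
T = con2seq(y);
net = narnet(1:2,10);                       % 2 feedback delays, 10 hidden neurons
[Xs,Xi,Ai,Ts] = preparets(net,{},{},T);
net = train(net,Xs,Ts,Xi,Ai);
netc = closeloop(net);                      % one-argument form, accepted by older releases
[Xc,Xic,Aic] = preparets(netc,{},{},T);     % preparets builds the closed-loop delay states
Yc = netc(Xc,Xic,Aic);                      % multi-step (closed-loop) prediction over the series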
4 Comments
Greg Heath
on 11 Jul 2017
Epochs are the number of loops the training goes through while trying to minimize the objective function ...
but you knew that already because you have GOOGLE & WIKIPEDIA,
RIGHT?
Did you find this?
https://www.quora.com/What-is-epochs-in-machine-learning
hugo kuribayashi
on 15 Apr 2015
Considering all these examples, how can I calculate the MAPE error instead of MSE?
1 Comment
Greg Heath
on 11 Jul 2017
Edited: Greg Heath
on 12 Jul 2017
You mean "in addition to" ?
1. Learn with MSE or MSEREG
2. Report your findings with whatever floats your boat.
3. I prefer NMSE [0 1] for regression and time series, and PCTERR [0 1] for classification and pattern recognition
(;>)
Greg
P.S. Be aware of the shortcomings of MAPE and its attempted modifications:
https://en.wikipedia.org/wiki/Mean_absolute_percentage_error
Hope this helps.
Greg
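For completeness, a sketch of both metrics computed from a target/output pair such as Ts and Y in the accepted answer (my code; both series are flattened from cell arrays first):
t = cell2mat(Ts);  y = cell2mat(Y);             % flatten the cell arrays to numeric rows
nmse = mean((t-y).^2) / mean((t-mean(t)).^2);   % 0 is perfect, 1 is no better than the mean
mape = mean(abs((t-y)./t)) * 100;               % in percent; undefined if any target is zero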