How to forecast next 20 points with NARX neural net?
Hello people! Please, can anyone help me? I have been reading and trying for a month... Tired! : (
How can I forecast the next 20 future points (the horizon) of a target using a NARX neural network?
My original data has 3000 known past time series points (input and target). For now I am using a MATLAB example data set.
Below is my example code, generated with the NTSTOOL GUI and MATLAB's simplenarx data set, but I don't know what to do after it runs.
Thank you very much in advance. God bless you all!
Eric (MATLAB newbie)
% Solve an Autoregression Problem with External Input with a NARX Neural Network
% Script generated by Neural Time Series app
% Created 14-Feb-2016 17:06:00
%
% This script assumes these variables are defined:
%
% simplenarxInputs - input time series.
% simplenarxTargets - feedback time series.
load simplenarx_dataset
X = simplenarxInputs;
T = simplenarxTargets;
% Choose a Training Function
trainFcn = 'trainbr'; % Bayesian Regularization backpropagation.
% Create a Nonlinear Autoregressive Network with External Input
inputDelays = 1:2;
feedbackDelays = 1:2;
hiddenLayerSize = 10;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize,'open',trainFcn);
% Choose Input and Feedback Pre/Post-Processing Functions
% Settings for feedback input are automatically applied to feedback output
% Customize input parameters at: net.inputs{i}.processParam
% Customize output parameters at: net.outputs{i}.processParam
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer
% states. Using PREPARETS allows you to keep your original time series data
% unchanged, while easily customizing it for networks with differing
% numbers of delays, with open loop or closed loop feedback modes.
[x,xi,ai,t] = preparets(net,X,{},T);
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'divideblock'; % Divide data into contiguous blocks
net.divideMode = 'time'; % Divide up every sample
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse'; % Mean Squared Error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate', 'ploterrhist', ...
'plotregression', 'plotresponse', 'ploterrcorr', 'plotinerrcorr'};
% Train the Network
[net,tr] = train(net,x,t,xi,ai);
% Test the Network
y = net(x,xi,ai);
e = gsubtract(t,y);
performance = perform(net,t,y)
% View the Network
view(net)
% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, ploterrhist(e)
%figure, plotregression(t,y)
%figure, plotresponse(t,y)
%figure, ploterrcorr(e)
%figure, plotinerrcorr(x,e)
% Closed Loop Network
% Use this network to do multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
netc = closeloop(net);
netc.name = [net.name ' - Closed Loop'];
view(netc)
[xc,xic,aic,tc] = preparets(netc,X,{},T);
yc = netc(xc,xic,aic);
closedLoopPerformance = perform(net,tc,yc)
% Multi-step Prediction
% Sometimes it is useful to simulate a network in open-loop form for as
% long as there is known output data, and then switch to closed-loop form
% to perform multistep prediction while providing only the external input.
% Here all but 5 timesteps of the input series and target series are used
% to simulate the network in open-loop form, taking advantage of the higher
% accuracy that providing the target series produces:
numTimesteps = size(x,2);
knownOutputTimesteps = 1:(numTimesteps-5);
predictOutputTimesteps = (numTimesteps-4):numTimesteps;
X1 = X(:,knownOutputTimesteps);
T1 = T(:,knownOutputTimesteps);
[x1,xio,aio] = preparets(net,X1,{},T1);
[y1,xfo,afo] = net(x1,xio,aio);
% Next the network and its final states will be converted to
% closed-loop form to make five predictions with only the five inputs
% provided.
x2 = X(1,predictOutputTimesteps);
[netc,xic,aic] = closeloop(net,xfo,afo);
[y2,xfc,afc] = netc(x2,xic,aic);
multiStepPerformance = perform(net,T(1,predictOutputTimesteps),y2)
% Alternate predictions can be made for different values of x2, or further
% predictions can be made by continuing simulation with additional external
% inputs and the last closed-loop states xfc and afc.
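% --- Sketch (not part of the generated script): forecast the next 20 targets
% beyond the last known data point. This assumes values for the future
% external input are available; here they are simply held at the last known
% input value as a placeholder, so replace xFuture with real future inputs.
numPredict = 20;                                % desired forecast horizon
[xAll,xiAll,aiAll] = preparets(net,X,{},T);     % open-loop pass over ALL known data
[~,xfAll,afAll] = net(xAll,xiAll,aiAll);        % final input and layer delay states
[netc20,xic20,aic20] = closeloop(net,xfAll,afAll);
xFuture = repmat(X(1,end),1,numPredict);        % placeholder future external inputs
yFuture = netc20(xFuture,xic20,aic20)           % 20 predicted future target values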
% Step-Ahead Prediction Network
% For some applications it helps to get the prediction a timestep early.
% The original network returns predicted y(t+1) at the same time it is
% given y(t+1). For some applications such as decision making, it would
% help to have predicted y(t+1) once y(t) is available, but before the
% actual y(t+1) occurs. The network can be made to return its output a
% timestep early by removing one delay so that its minimal tap delay is now
% 0 instead of 1. The new network returns the same outputs as the original
% network, but outputs are shifted left one timestep.
nets = removedelay(net);
nets.name = [net.name ' - Predict One Step Ahead'];
view(nets)
[xs,xis,ais,ts] = preparets(nets,X,{},T);
ys = nets(xs,xis,ais);
stepAheadPerformance = perform(nets,ts,ys)
1 Comment
Greg Heath
on 15 Feb 2016
1. Why do you find it useful to completely ignore the less confusing documentation example codes in
help narxnet
and
doc narxnet?
2. These codes take full advantage of default properties which do not have to be explicitly assigned.
3. After reproducing the documentation examples I recommend searching both the NEWSGROUP and ANSWERS for my relevant posts
narxnet greg
narxnet greg tutorial
Please let me know if you think it is better to peruse these in reverse calendar order.
Hope this helps.
Thank you for formally accepting my answer
Greg
Accepted Answer
Greg Heath
on 15 Feb 2016
0. I ran your code with the RNG initialization statement
rng('default')
just before the training command. The results were
performance = 2.4306e-13
closedLoopPerformance = 2.8027e-13
multiStepPerformance = 2.0576e-13
stepAheadPerformance = 2.4306e-13
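In the script that corresponds to something like this minimal sketch (the rng call is the only change, added so the random weight initialization is reproducible):
rng('default')                    % reset the random number generator
[net,tr] = train(net,x,t,xi,ai);  % train exactly as before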
1. Why do you find it useful to completely ignore the less confusing documentation example codes in
help narxnet
and
doc narxnet?
2. These codes are MUCH easier to understand because they take full advantage of default properties which do not have to be explicitly assigned (see the sketch below).
3. After reproducing the documentation examples I recommend searching both the NEWSGROUP and ANSWERS for my relevant posts
narxnet greg
narxnet greg tutorial
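For reference, the documentation-style code boils down to roughly this minimal sketch, using defaults throughout (the exact example in doc narxnet may differ slightly by release):
[X,T] = simplenarx_dataset;
net = narxnet(1:2,1:2,10);            % default delays and hidden layer size
[x,xi,ai,t] = preparets(net,X,{},T);
net = train(net,x,t,xi,ai);
y = net(x,xi,ai);
perf = perform(net,t,y)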
Please let me know if you think it is better to peruse these in reverse calendar order.
Hope this helps.
Thank you for formally accepting my answer
Greg
4 Comments
Greg Heath
on 20 Feb 2016
Thanks.
Plotting the series reveals quite different statistics before and after time ~ 350.
You need TWO models. If you cross or glaze your eyes you should see a triangular waveform:
linear before 350 and constant afterwards.
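A quick way to see this is to plot the target series, for example (a minimal sketch, assuming the targets are in a 1-by-N cell array T as in the script above):
plot(cell2mat(T))             % flatten the cell array of targets and plot it
xlabel('t'), ylabel('target')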
C'est la vie!
Greg