
Thread Subject:
Help with mismatching slopes in a time series prediction with Narxnet

Subject: Help with mismatching slopes in a time series prediction with Narxnet

From: Luca I

Date: 24 Dec, 2012 00:30:08

Message: 1 of 4

Hi,
I'm a student at Politecnico di Bari, and I'm trying to predict wind speed with a NARX neural network, using previous wind speed data plus temperature, pressure, wind direction and other exogenous variables. I think it is the best net for this task.
I have a problem: after training and simulating the net, I noticed in the response plot that there is a mismatch in slope between predictions and targets.
The predictions are shifted to the right by about 1.5 or 2 timesteps, and the error obviously increases.
I have read parts of the guide, and I suspect that the MAPMINMAX data normalization causes this mismatch.
Can someone help me remove this mismatch? (And, if possible, suggest other corrections to the script for better performance.)
-----------------------------------------------------------
Here is the net script:

input_train=mapminmax('apply',input_train,Periodo1_MMSettings);
target_train=mapminmax('apply',target_train,Periodo1_WSMMSettings);

inputSeries=input_train;
targetSeries=target_train;

inputSeries=tonndata(inputSeries,true,false);
targetSeries=tonndata(targetSeries,true,false);

inputDelays = 1:8;
feedbackDelays = 1:8;
hiddenLayerSize = 10;
WINDnet = narxnet(inputDelays,feedbackDelays,hiddenLayerSize);
WINDnet.layers{1}.transferFcn= 'tansig'; % (or 'logsig', but then I would have to change the MAPMINMAX range to 0.1-0.9)
WINDnet.layers{2}.transferFcn= 'tansig';

WINDnet.inputs{1}.processFcns = {'removeconstantrows' 'mapminmax'};
WINDnet.inputs{2}.processFcns = {'removeconstantrows' 'mapminmax'};

[inputs,inputStates,layerStates,targets] = preparets(WINDnet,inputSeries,{},targetSeries);

WINDnet.divideFcn = 'dividerand';
WINDnet.divideMode = 'value'; %%(what is the difference with 'time'?)
WINDnet.divideParam.trainRatio = 70/100;
WINDnet.divideParam.valRatio = 15/100;
WINDnet.divideParam.testRatio = 15/100;

WINDnet.trainFcn = 'trainrp' ; % Resilient Backpropagation
WINDnet.trainParam.epochs = 100;
WINDnet.trainParam.lr = 0.01; % note: 'lr' is a gradient-descent parameter and may not apply to 'trainrp'

WINDnet.performFcn = 'mse'; % Mean squared error

WINDnet.plotFcns = {'plotperform','plottrainstate','plotresponse', 'plotregression' ...
  'ploterrcorr', 'plotinerrcorr'};
% Train the Network
[WINDnet,tr] = train(WINDnet,inputs,targets,inputStates,layerStates);

% Test the Network
outputs = WINDnet(inputs,inputStates,layerStates);
errors = gsubtract(targets,outputs);
performance = perform(WINDnet,targets,outputs)

% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(targets,tr.trainMask);
valTargets = gmultiply(targets,tr.valMask);
testTargets = gmultiply(targets,tr.testMask);
trainPerformance = perform(WINDnet,trainTargets,outputs)
valPerformance = perform(WINDnet,valTargets,outputs)
testPerformance = perform(WINDnet,testTargets,outputs)

%---Revert to real data--%

outputsMAT=cell2mat(outputs);
targetsMAT=cell2mat(targets);
targetsMAT = mapminmax('reverse',targetsMAT,Periodo1_WSMMSettings);
outputsMAT = mapminmax('reverse',outputsMAT,Periodo1_WSMMSettings);
e=(outputsMAT-targetsMAT);
RMSE=sqrt(mse(e));
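
To quantify the apparent lag, one simple check is a shifted correlation between output and target. This is just a minimal sketch in base MATLAB; it assumes outputsMAT and targetsMAT from the "Revert to real data" block above are 1-by-N row vectors:

% Sketch: estimate by how many steps the output lags the target.
maxLag = 5;
r = zeros(1,maxLag+1);
for k = 0:maxLag
    c = corrcoef(outputsMAT(1+k:end), targetsMAT(1:end-k));
    r(k+1) = c(1,2);   % correlation when the output is assumed k steps late
end
[~,iBest] = max(r);
fprintf('Correlation is highest for a lag of %d step(s).\n', iBest-1);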


--------------------------------------------------------------------------------

Merry Xmas to everybody...
Thanks in advance for your support... and sorry if my English is not so good.

Luca I.

Subject: Help with mismatching slopes in a time series prediction with Narxnet

From: Luca I

Date: 24 Dec, 2012 00:42:08

Message: 2 of 4

Sorry, there is an error in the script: the 2nd layer transfer function is 'purelin', not 'tansig'!

WINDnet.layers{2}.transferFcn= 'purelin'

Subject: Help with mismatching slopes in a time series prediction with Narxnet

From: Greg Heath

Date: 30 Dec, 2012 19:34:09

Message: 3 of 4

"Luca I" <luca_iannone@hotmail.com> wrote in message <kb88d0$k5$1@newscl01ah.mathworks.com>...
> Sorry, there is an error in the script: the 2nd layer transfer function is 'purelin', not 'tansig'!
>
> WINDnet.layers{2}.transferFcn= 'purelin'

It is very difficult to help without data. Can you demonstrate your problem either by applying your code to accessible MATLAB example or demo data, or by posting a picture (target and output vs. time) of your problem?

How did you determine the number of lags ?
.... significant auto and cross correlation peaks?
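
For example, one rough way to check the target autocorrelation in base MATLAB (just a sketch; here t stands for the target series as a row vector):

% Sketch: find lags with significant target autocorrelation.
t = t - mean(t);                               % remove the mean first
N = numel(t);
maxLag = 20;
acf = zeros(1,maxLag);
for k = 1:maxLag
    acf(k) = (t(1:N-k)*t(1+k:N)') / (t*t');    % normalized autocorrelation at lag k
end
sigBound = 2/sqrt(N);                          % approximate 95% significance bound
significantLags = find(abs(acf) > sigBound)    % candidate feedbackDelays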

How did you determine the number of hidden layers ?
.... trial and error or a comparison of the number of training equations
and unknown weights?
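
A rough sketch of that comparison (the numbers below are placeholders, not values taken from your data):

% Sketch: training equations vs. unknown weights for a NARX net.
I  = 5;      % number of exogenous input series (placeholder)
O  = 1;      % number of target series
H  = 10;     % hidden nodes (hiddenLayerSize)
N  = 5000;   % number of training timesteps (placeholder)
ID = 8;      % number of input delays
FD = 8;      % number of feedback delays
Neq = N*O;                            % training equations
Nw  = (I*ID + O*FD + 1)*H + (H+1)*O;  % unknown weights
ratio = Neq/Nw                        % want this comfortably greater than 1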

Greg

Subject: Help with mismatching slopes in a time series prediction with Narxnet

From: Mehdi

Date: 2 Jan, 2013 04:39:09

Message: 4 of 4

"Greg Heath" <heath@alumni.brown.edu> wrote in message <kbq4vh$ii3$1@newscl01ah.mathworks.com>...
> "Luca I" <luca_iannone@hotmail.com> wrote in message <kb88d0$k5$1@newscl01ah.mathworks.com>...
> > Sorry, there is an error in the script: the 2nd layer transfer function is 'purelin', not 'tansig'!
> >
> > WINDnet.layers{2}.transferFcn= 'purelin'
>
> It is very difficult to help without data. Can you demonstrate your problem either by applying your code to accessible MATLAB example or demo data, or by posting a picture (target and output vs. time) of your problem?
>
> How did you determine the number of lags ?
> .... significant auto and cross correlation peaks?
>
> How did you determine the number of hidden layers ?
> .... trial and error or a comparison of the number of training equations
> and unknown weights?
>
> Greg

I trained the NARX in its open-loop form successfully, then closed the loop and applied the same training data to look at the responses. The closed-loop responses are awful, and I don't know why (the open loop was trained with plenty of data, e.g. 100000 samples, and reached mse=10e-7).
To solve the problem I tried to train the closed loop with 'trainlm', but because of the large training set the training was very slow. I reduced the data to 1000 pairs; training starts, but after some iterations it stops with "Maximum MU reached". I tested the resulting network on the same data set I used for training, but the responses were awful again.
Any comment will be very helpful.
clc;
load outputs.mat
load inputs.mat
inputDelays = 0:2;
feedbackDelays = 1:2;
hiddenLayerSize = 27;
net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize);
net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};
net.trainFcn = 'trainlm'; % Levenberg-Marquardt
net.trainParam.min_grad=1e-10;
net.trainParam.max_fail=11;
net.trainParam.show=1;
net.trainParam.epochs = 100;
net.trainParam.goal = 1e-3;
net.trainParam.mu_max = 1e10;
%net.efficiency.memoryReduction = 2;
InputSeries= tonndata(inputs0,false,false);
OutputSeries= tonndata(outputs0,false,false);
[Inputs,inputStates,layerStates,targets] = preparets(net,InputSeries,{},OutputSeries);
net.divideFcn = 'dividerand'; % Divide data randomly
net.divideMode = 'value'; % Divide up every value
net.divideParam.trainRatio = 90/100;
net.divideParam.valRatio = 10/100;
%net.divideParam.testRatio = 15/100;
net.plotFcns = {'plotperform','plottrainstate','plotresponse', ...
  'ploterrcorr', 'plotinerrcorr'};
[net,tr] = train(net,Inputs,targets,inputStates,layerStates);
view(net)
% Closed Loop Network
%netc = closeloop(net);
%inputSeries= tonndata(inputs,false,false);
%targetSeries= tonndata(outputs,false,false);
%netC=netc;
%netC.name = [net.name ' - Closed Loop'];
%view(netC)
%[inputsC,inputStatesC,layerStatesC,targetsC] = preparets(netC,inputSeries,{},targetSeries);
%yC = netC(inputsC,inputStatesC,layerStatesC);
%closedLoopPerformance = perform(netC,targetsC,yC);
%netC.trainFcn = 'trainlm'; % Levenberg-Marquardt
%netC.trainParam.min_grad=1e-10;
%netC.trainParam.max_fail=21;
%netC.trainParam.show=1;
%netC.trainParam.epochs = 10000;
%netC.trainParam.goal = 1e-7;
%netC.trainParam.mu_max = 1e10;
%net.trainParam.mem_reduc = 1;
%[netC,trC] = train(netC,inputsC,targetsC,inputStatesC,layerStatesC);
%gensim(netC)
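
For reference, one way to quantify how much worse the closed loop is than the open loop on the same data, before any closed-loop retraining (a sketch reusing the variables defined above):

% Sketch: compare open-loop and closed-loop performance of the trained net.
yOpen = net(Inputs,inputStates,layerStates);
openPerf = perform(net,targets,yOpen)

netc = closeloop(net);                                    % feed predictions back as inputs
[xc,xic,aic,tc] = preparets(netc,InputSeries,{},OutputSeries);
yClosed = netc(xc,xic,aic);
closedPerf = perform(netc,tc,yClosed)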

I have tried many things (e.g. running up to 1000 times, different numbers of hidden units and delays), but no success yet.
Give me your email and I will send you the inputs and outputs.
My email: mehdi.bgh@gmail.com
Mehdi
