

Subject: On Designing a Feedback Time-Series Neural Network for Operational Deployment

From: Greg Heath

Date: 17 Oct, 2013 12:49:39

Message: 1 of 2

% To obtain a time-series feedback net for operational deployment,
% save
%
% 1. The CLOSELOOP net
% 2. The final contents of the delay buffer Xf, Af obtained from
% either openloop training
% [ net tr Ys Es Xsf Asf ] = train(net, Xs,Ts,Xi,Ai);
% or closeloop training
% [ netc tr Ycs Ecs Xcsf Acsf ] = train(netc, Xcs,Tcs,Xci,Aci);
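The role of the saved delay buffers can be sketched outside the toolbox. The following plain-Python toy (not Neural Network Toolbox code; the names and the linear "model" are illustrative stand-ins) shows why a feedback net needs the final delay-line contents Xf/Af as initial conditions before it can predict the first operational step:

```python
# Hypothetical sketch of operational deployment of a feedback (NARX-style)
# model: the saved delay buffers supply the recent inputs and outputs that
# the first closed-loop prediction depends on.
from collections import deque

def predict_next(model, x_delays, y_delays):
    """One closed-loop step: the prediction is fed back into y_delays."""
    y_next = model(list(x_delays), list(y_delays))
    y_delays.append(y_next)          # feedback: output becomes a future input
    return y_next

# Toy stand-in for the trained net: weighted sum of the two delay lines.
toy = lambda xd, yd: 0.5 * sum(xd) + 0.5 * yd[-1]

# Analogue of Xf and Af: final delay-buffer contents saved after training.
x_buf = deque([0.2, 0.4], maxlen=2)   # last two inputs  (like Xf)
y_buf = deque([1.0, 1.1], maxlen=2)   # last two outputs (like Af)

y1 = predict_next(toy, x_buf, y_buf)  # first deployed prediction
```

Without the saved buffers, the first predictions would be computed from arbitrary (typically zero) initial states.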

% Further advice
%
% 1. NN design is a trial and error process because,
% a. default trn/val/tst data division is random
% b. default weight/bias initialization is random
% c. default input-delay, feedback-delay and number of hidden
% node assignments may be inadequate
% 2. Since multiple designs are usually needed, you might as well
% begin by using as many defaults as is reasonable.
% 3. The default data division function 'dividerand' is not reasonable
% for time-series because it destroys input and feedback correlations
% with the target. Replace it with 'divideblock' or 'divideind'
% 4. Direct closeloop training tends to be very slow
% 5. However, just closing the loop on an openloop design may
% result in very poor closeloop performance
% 6. Test the closeloop design on the original openloop design data.
% If the performance is poor, continue with direct training on the
% closeloop design. This is much faster than beginning with a direct
% closeloop design.
% 7. To help clarify the above points, the following simpleseries_dataset
% example ignores
% a. Optimization of the delay specifications using the target
% autocorrelation function and the target/input crosscorrelation function.
% b. Multiple loop designs to optimize the number of hidden nodes and
% choice of random initial weights
% c. Individual performances of the training, validation and test data
% subsets.
% One or more of the above will be necessary to improve the design
% below. These topics are addressed in many of my other posts.
% 8. You may wish to see how the simple code below works on others
% in the MATLAB neural net feedback time-series database:
% help nndatasets
%
% simpleseries_dataset - Simple time-series prediction dataset.
% simplenarx_dataset - Simple time-series prediction dataset.
% exchanger_dataset - Heat exchanger dataset.
% maglev_dataset - Magnetic levitation dataset.
% ph_dataset - Solution PH dataset.
% pollution_dataset - Pollution mortality dataset.
% refmodel_dataset - Reference model dataset
% robotarm_dataset - Robot arm dataset
% valve_dataset - Valve fluid flow dataset.
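Point 3 above can be made concrete with a small index-splitting sketch (plain Python, not toolbox code; the 0.70/0.15/0.15 ratios match the toolbox defaults): 'divideblock' keeps each subset contiguous in time, while 'dividerand' scatters consecutive time steps across train/val/test, breaking the serial correlations.

```python
# Illustrative comparison of 'divideblock' vs 'dividerand' style index
# division for a length-20 time series.
import random

N = 20
idx = list(range(N))
n_trn = int(0.70 * N)                 # 14 training points
n_val = int(0.15 * N)                 # 3 validation points

# divideblock analogue: three contiguous segments, correlations intact.
trn_block = idx[:n_trn]
val_block = idx[n_trn:n_trn + n_val]
tst_block = idx[n_trn + n_val:]

# dividerand analogue: same subset sizes, but indices drawn at random, so
# neighboring time steps land in different subsets.
random.seed(0)
shuffled = idx[:]
random.shuffle(shuffled)
trn_rand = sorted(shuffled[:n_trn])
```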
 
close all, clear all, clc

  [X,T] = simpleseries_dataset;
  net = narxnet; % Preliminary designs with defaults
  %net = narxnet(ID,FD,H); % For followup designs
  net.divideFcn = 'divideblock';
  [ Xs Xi Ai Ts ] = preparets(net, X,{},T );
  ts = cell2mat(Ts);
  MSE00s = mean(var(ts',1)) % 0.0444 = MSE normalization reference

  rng(4151941)
  [ net tr Ys Es Xsf Asf ] = train( net, Xs, Ts, Xi, Ai );
   % Ys = net(Xs,Xi,Ai);
   % Es = gsubtract(Ts,Ys);
   view(net)
   % Google: wikipedia/R-squared
   R2s = 1 - perform(net,Ts,Ys)/MSE00s % 0.6644
       
   netc = closeloop(net);
   view(netc)
   [Xcs,Xci,Aci,Tcs] = preparets(netc,X,{},T);
   Ycs = netc( Xcs, Xci, Aci );
   R2cs1 = 1 - perform(netc,Tcs,Ycs)/MSE00s % 0.4744
   if R2cs1 < 0.95*R2s
      [ netc tr Ycs Ecs Xcf Acf ] = train( netc, Xcs, Tcs, Xci, Aci );
      view(netc)
      R2cs2 = 1 - perform(netc,Tcs,Ycs)/MSE00s % 0.501
   end
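The normalization used above, R2 = 1 - MSE/MSE00s, is worth spelling out: MSE00s is the mean target variance, i.e. the MSE of the naive model that always predicts the target mean. The following plain-Python stand-in (not toolbox code) reproduces the arithmetic behind perform(net,...)/MSE00s:

```python
# R^2 = 1 - MSE/MSE00: values near 1 mean the net far outperforms the naive
# constant-mean predictor; values <= 0 mean it does not beat it at all.
def r_squared(targets, outputs):
    n = len(targets)
    mse = sum((t - y) ** 2 for t, y in zip(targets, outputs)) / n
    mean_t = sum(targets) / n
    mse00 = sum((t - mean_t) ** 2 for t in targets) / n  # biased var, var(t,1)
    return 1.0 - mse / mse00

t = [1.0, 2.0, 3.0, 4.0]
y_good = [1.1, 1.9, 3.1, 3.9]    # close fit -> R^2 near 1
y_mean = [2.5] * 4               # naive constant predictor -> R^2 = 0
```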

Hope this helps.

Greg
  

Subject: On Designing a Feedback Time-Series Neural Network for Operational Deployment

From: Greg Heath

Date: 18 Oct, 2013 05:29:06

Message: 2 of 2

"Greg Heath" <heath@alumni.brown.edu> wrote in message <l3omd3$bgc$1@newscl01ah.mathworks.com>...
> % To obtain a time-series feedback net for operational deployment,
> % save
> %
> % 1. The CLOSELOOP net
> % 2. The final contents of the delay buffer Xf, Af obtained from
> % either openloop training
> % [ net tr Ys Es Xsf Asf ] = train(net, Xs,Ts,Xi,Ai);
> % or closeloop training
> % [ netc tr Ycs Ecs Xcsf Acsf ] = train(netc, Xcs,Tcs,Xci,Aci);
>
> % Further advice
> %
> % 1. NN design is a trial and error process because,
> % a. default trn/val/tst data division is random
> % b. default weight/bias initialization is random
> % c. default input-delay, feedback-delay and number of hidden
> % node assignments may be inadequate
> % 2. Since multiple designs are usually needed, you might as well
> % begin by using as many defaults as is reasonable.
> % 3. The default data division function 'dividerand' is not reasonable
> % for time-series because it destroys input and feedback correlations
> % with the target. Replace it with 'divideblock' or 'divideind'
> % 4. Direct closeloop training tends to be very slow
> % 5. However, just closing the loop on an openloop design may
> % result in very poor closeloop performance
> % 6. Test the closeloop design on the original openloop design data.
> % If the performance is poor, continue with direct training on the
> % closeloop design. This is much faster than beginning with a direct
> % closeloop design.
> % 7. To help clarify the above points, the following simpleseries_dataset
> % example ignores
> % a. Optimization of the delay specifications using the target
> % autocorrelation function and the target/input crosscorrelation function.
> % b. Multiple loop designs to optimize the number of hidden nodes and
> % choice of random initial weights
> % c. Individual performances of the training, validation and test data
> % subsets.
> % One or more of the above will be necessary to improve the design
> % below. These topics are addressed in many of my other posts.
> % 8. You may wish to see how the simple code below works on others
> % in the MATLAB neural net feedback time-series database:
> % help nndatasets
> %

close all, clear all, clc
format short
tic

% Choose a dataset
%
% [X,T] = refmodel_dataset ; % Reference model dataset 1 1 2000
% [X,T] = simplenarx_dataset; %Simple time-series prediction 1 1 100
% [X,T] = valve_dataset; % Valve fluid flow dataset. 1 1 1800

% [X,T] = maglev_dataset; % Magnetic levitation dataset. 1 1 4001
% [X,T] = robotarm_dataset; % Robot arm dataset 1 1 1463
% [X,T] = simpleseries_dataset; %Simple time-series prediction 1 1 100

% [X,T] = pollution_dataset; % Pollution mortality dataset. 8 3 508
% [X,T] = ph_dataset; % Solution PH dataset. 2 1 2001
% [X,T] = exchanger_dataset; % Heat exchanger dataset. 1 1 4000
>
> net = narxnet; % Preliminary designs with defaults
> %net = narxnet(ID,FD,H); % For followup designs
> net.divideFcn = 'divideblock';
> [ Xs Xi Ai Ts ] = preparets(net, X,{},T );
> ts = cell2mat(Ts);
> MSE00s = mean(var(ts',1)) % 0.0444 = MSE normalization reference
>
> rng(4151941)
> [ net tr Ys Es Xsf Asf ] = train( net, Xs, Ts, Xi, Ai );
> % Ys = net(Xs,Xi,Ai);
> % Es = gsubtract(Ts,Ys);
> view(net)
> % Google: wikipedia/R-squared
> R2s = 1 - perform(net,Ts,Ys)/MSE00s % 0.6644
>
> netc = closeloop(net);
> view(netc)
> [Xcs,Xci,Aci,Tcs] = preparets(netc,X,{},T);
> Ycs = netc( Xcs, Xci, Aci );
> R2cs1 = 1 - perform(netc,Tcs,Ycs)/MSE00s % 0.4744
> if R2cs1 < 0.95*R2s
> [ netc tr Ycs Ecs Xcf Acf ] = train( netc, Xcs, Tcs, Xci, Aci );
> view(netc)
> R2cs2 = 1 - perform(netc,Tcs,Ycs)/MSE00s % 0.501
> end

 Time = toc

summary = [ Time R2s R2cs1 R2cs1-R2s R2cs2 R2cs2-R2cs1 ]
%            Time      R2s    R2cs1  R2cs1-R2s   R2cs2  R2cs2-R2cs1
% ref       12.22   1.0000   0.9996    -0.0004  0.9996       0
% narx       5.99   0.9949   0.9933    -0.0016  0.9933       0
% valve     11.38   0.9245   0.9061    -0.0184  0.9061       0
% maglev   174.31   1.0000   0.8203    -0.1797  0.8203       0
% robot     62.65   1.0000   0.9494    -0.0506  0.9699  0.0205
% series    10.66   0.6644   0.4744    -0.1900  0.5041  0.0297
% pollutn   35.98   0.6981   0.3780    -0.3201  0.5422  0.1642
% ph       163.38   0.9654  -0.2551    -1.2205  0.9627  1.2179
% exch     564.29   0.9335  -0.2813    -1.2149  0.9384  1.2197

% Since only one trial was run for each of the nine data sets, an accurate
% and precise summary is not possible. However, it is possible to see the
% variety of results.
%
% 1. Closing the loop on an openloop design never improves performance:
%     R2cs1 <= R2s
% 2. Performance degradation just from closing the loop can range from
%     insignificant (ref and narx) to disastrous (ph and exch).
% 3. Closeloop performance enhancement via direct design can range from
%     negligible (ref, narx, valve and maglev) to very significant (ph and exch).
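The degradation in conclusion 2 has a simple mechanism: in open-loop (series-parallel) mode the net sees measured past outputs, but in closed-loop mode its own predictions are fed back, so small one-step errors compound. A toy AR(1) example in plain Python (the dynamics and coefficients are invented for illustration) makes this visible:

```python
# Open-loop vs closed-loop error for a slightly mis-estimated AR(1) model
# y[t] = a*y[t-1]: feeding predictions back lets the error accumulate.
a = 0.9                       # true dynamics
a_hat = 0.88                  # slightly wrong estimate
y = [1.0]
for _ in range(10):
    y.append(a * y[-1])       # true series

# Open loop: each prediction uses the *measured* previous value.
open_err = [abs(a_hat * y[t - 1] - y[t]) for t in range(1, len(y))]

# Closed loop: each prediction uses the *predicted* previous value.
yc = [y[0]]
for _ in range(10):
    yc.append(a_hat * yc[-1])
closed_err = [abs(yc[t] - y[t]) for t in range(1, len(y))]
```

The open-loop error stays bounded by the one-step model error, while the closed-loop error grows with the horizon, which is why a net that looks good in open loop can collapse once the loop is closed.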

Hope this helps.

Greg

 
