elmannet

Elman neural network

Syntax

elmannet(layerdelays,hiddenSizes,trainFcn)

Description

Elman networks are feedforward networks (feedforwardnet) with the addition of layer recurrent connections with tap delays.

With the availability of full dynamic derivative calculations (fpderiv and bttderiv), the Elman network is no longer recommended except for historical and research purposes. For more accurate learning, try time delay (timedelaynet), layer recurrent (layrecnet), NARX (narxnet), and NAR (narnet) neural networks.

Elman networks with one or more hidden layers can learn any dynamic input-output relationship arbitrarily well, given enough neurons in the hidden layers. However, Elman networks use simplified derivative calculations (staticderiv, which ignores delayed connections), which makes their learning less reliable.
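
For example, the recurrent structure is visible in the network object's layer connection properties. A minimal sketch (layerConnect and layerWeights are standard network object properties):

net = elmannet(1:2,10);          % two-layer Elman network with tap delays 1:2
net.layerConnect                 % logical matrix of layer-to-layer connections
net.layerWeights{1,1}.delays     % tap delays on the recurrent hidden-layer connection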

elmannet(layerdelays,hiddenSizes,trainFcn) takes these arguments,

layerdelays    Row vector of increasing 0 or positive delays (default = 1:2)

hiddenSizes    Row vector of one or more hidden layer sizes (default = 10)

trainFcn       Training function (default = 'trainlm')

and returns an Elman neural network.
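
For illustration, an equivalent call that passes all three arguments explicitly (the values shown are the documented defaults), followed by a variation with two hidden layers; trainbr (Bayesian regularization) is used here only as an example of another training function:

net = elmannet(1:2,10,'trainlm');      % delays 1:2, 10 hidden neurons, Levenberg-Marquardt training
net = elmannet(1:4,[12 8],'trainbr');  % delays 1:4, two hidden layers of 12 and 8 neurons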

Examples

Here an Elman neural network is used to solve a simple time series problem.

[X,T] = simpleseries_dataset;          % load simple time series inputs X and targets T
net = elmannet(1:2,10);                % Elman network with delays 1:2 and 10 hidden neurons
[Xs,Xi,Ai,Ts] = preparets(net,X,T);    % shift data and create initial input/layer delay states
net = train(net,Xs,Ts,Xi,Ai);          % train the network
view(net)                              % display the network diagram
Y = net(Xs,Xi,Ai);                     % simulate the trained network
perf = perform(net,Ts,Y)               % measure performance (mean squared error by default)
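
Because the Elman network is retained mainly for historical and research purposes, the following is a minimal sketch of the same workflow using the recommended layer recurrent network (layrecnet) on the same data set; only the network constructor changes:

[X,T] = simpleseries_dataset;          % same simple time series data
net = layrecnet(1:2,10);               % layer recurrent network, same delays and hidden size
[Xs,Xi,Ai,Ts] = preparets(net,X,T);    % prepare shifted inputs and delay states
net = train(net,Xs,Ts,Xi,Ai);          % train the network
Y = net(Xs,Xi,Ai);                     % simulate the trained network
perf = perform(net,Ts,Y)               % mean squared error performance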

See Also

feedforwardnet | preparets | timedelaynet | layrecnet | narxnet | narnet
