# idNeuralNetwork

Multilayer neural network mapping function for nonlinear ARX models and Hammerstein-Wiener models (requires Statistics and Machine Learning Toolbox or Deep Learning Toolbox)

*Since R2023b*

## Description

An `idNeuralNetwork` object creates a neural network function and is a nonlinear mapping object for estimating nonlinear ARX models and Hammerstein-Wiener models. This mapping object lets you create neural networks using the regression networks of Statistics and Machine Learning Toolbox™ and the deep and shallow networks of Deep Learning Toolbox™.

Mathematically, `idNeuralNetwork` is a function that maps *m* inputs *X*(*t*) = [*x*_{1}(*t*),*x*_{2}(*t*),…,*x*_{m}(*t*)]^{T} to a single scalar output *y*(*t*) using the following relationship:

$$y(t)={y}_{0}+X{(t)}^{T}PL+S(X{(t)}^{T}Q)$$

Here:

- *X*(*t*) is an *m*-by-1 vector of inputs, or *regressors*.
- *y*_{0} is the output offset, a scalar.
- *P* and *Q* are *m*-by-*p* and *m*-by-*q* projection matrices, respectively.
- *L* is a *p*-by-1 vector of weights.
- *S*(.) represents a neural network object of one of the following types:
  - `RegressionNeuralNetwork` (Statistics and Machine Learning Toolbox) object: network object created using `fitrnet` (Statistics and Machine Learning Toolbox)
  - `dlnetwork` (Deep Learning Toolbox) object: deep learning network object
  - `network` (Deep Learning Toolbox) object: shallow network object created using a command such as `feedforwardnet` (Deep Learning Toolbox)
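As a numeric sketch of this relationship (the dimensions, the random values, and the `tanh` stand-in for *S*(.) are illustrative assumptions, not the estimated network):

```matlab
% Illustrative sketch of the idNeuralNetwork mapping, not the estimated object.
% Sizes m, p, q, weights W, and the tanh stand-in for S(.) are assumptions.
m = 3; p = 2; q = 4;            % number of regressors and projection sizes
X  = rand(m,1);                 % regressor vector X(t)
y0 = 0.5;                       % output offset
P  = rand(m,p); Q = rand(m,q);  % projection matrices
L  = rand(p,1);                 % linear weight vector
W  = rand(q,1);                 % hypothetical output weights of the network
S  = @(z) tanh(z)*W;            % stand-in for the nonlinear network S(.)
y  = y0 + X.'*P*L + S(X.'*Q)    % scalar output y(t)
```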

See Examples for more information.

Use `idNeuralNetwork` as the output value, or, for multiple-output systems, one of the output values in the `OutputFcn` property of an `idnlarx` model or the `InputNonlinearity` and `OutputNonlinearity` properties of an `idnlhw` object. For example, specify `idNeuralNetwork` when you estimate an `idnlarx` model with the following command.

```matlab
sys = nlarx(data,regressors,idNeuralNetwork)
```

When `nlarx` estimates the model, it essentially estimates the parameters of the `idNeuralNetwork` function. You can use a similar approach when you specify input or output nonlinearities using the `nlhw` command. For example, specify `idNeuralNetwork` as both the input and output nonlinearity with the following command.

```matlab
sys = nlhw(data,orders,idNeuralNetwork,idNeuralNetwork)
```
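You can also mix mapping objects when only one side of the model needs a network. This sketch assumes `idPiecewiseLinear` (another available mapping object) for the output nonlinearity; `data` and `orders` are placeholders you supply:

```matlab
% Neural network on the input side, piecewise-linear map on the output side.
sys = nlhw(data, orders, idNeuralNetwork, idPiecewiseLinear);
```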

## Creation

### Syntax

### Description

#### Create Regression Network or Deep Learning Network

`NW = idNeuralNetwork` creates an `idNeuralNetwork` object `NW` that uses a single hidden layer of ten rectified linear unit (ReLU) activations.

The specific type of network that `NW` represents depends on the toolboxes you have access to.

- If you have access to Statistics and Machine Learning Toolbox, then `idNeuralNetwork` uses `fitrnet` (Statistics and Machine Learning Toolbox) to create a `RegressionNeuralNetwork` (Statistics and Machine Learning Toolbox)-based map.
- If Statistics and Machine Learning Toolbox is not available but you have access to Deep Learning Toolbox, then `idNeuralNetwork` uses `dlnetwork` (Deep Learning Toolbox) to create a deep learning network map.

For `idnlhw` models, the number of inputs to the network is 1. For `idnlarx` models, the number of inputs is unknown, as this number is determined during estimation. `NW` also uses a parallel linear function and an offset element.

For multiple-output nonlinear ARX or Hammerstein-Wiener models, create a separate `idNeuralNetwork` object for each output. Each element of the output function must represent a single-output network object.

`NW = idNeuralNetwork(LayerSizes)` uses `numel(LayerSizes)` layers. The *i*th element of `LayerSizes` specifies the number of *activations* in the corresponding *i*th layer.

`NW = idNeuralNetwork(LayerSizes,Activations)` specifies the types of activation to use in each layer. The combination of the `Activations` specification and the available toolboxes determines which type of neural network `NW` uses.
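For example, the following sketch creates a network with two hidden layers; the activation names `"relu"` and `"tanh"` are assumed to be among the supported `Activations` values:

```matlab
% Two hidden layers: 10 activations in the first, 5 in the second,
% with an assumed activation type for each layer.
NW = idNeuralNetwork([10 5], ["relu" "tanh"]);
```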

`NW = idNeuralNetwork(LayerSizes,Activations,UseLinearFcn)` specifies whether `NW` uses a linear function as a subcomponent.

`NW = idNeuralNetwork(LayerSizes,Activations,UseLinearFcn,UseOffset)` specifies whether `NW` uses an offset term.
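As a sketch, the flags below disable both optional subcomponents so that the mapping reduces to the network term alone (the activation name is an assumption):

```matlab
% One hidden layer of 10 activations; no parallel linear function, no offset.
NW = idNeuralNetwork(10, "tanh", false, false);
```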

`NW = idNeuralNetwork(___,Network=type)` forces the use of either a regression neural network from Statistics and Machine Learning Toolbox or a deep network from Deep Learning Toolbox. Specify `type` as `"RegressionNeuralNetwork"` or `"dlnetwork"`. The specification in this syntax overrides the default automatic activation-based selection of network type described in `Activations`. Setting `type` to `"auto"` is equivalent to using the default selection.

You can use this syntax with any of the previous input-argument combinations.
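For instance, this sketch forces a deep learning network map, assuming Deep Learning Toolbox is installed:

```matlab
% Request a dlnetwork-based map explicitly, overriding automatic selection.
NW = idNeuralNetwork(10, "tanh", Network="dlnetwork");
```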

#### Use Existing Shallow Neural Network

`NW = idNeuralNetwork(shallownet)` creates `NW` using the `network` (Deep Learning Toolbox) object `shallownet`.

`shallownet` is typically the output of `feedforwardnet` (Deep Learning Toolbox), `cascadeforwardnet` (Deep Learning Toolbox), or `linearlayer` (Deep Learning Toolbox).
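A minimal sketch, assuming Deep Learning Toolbox is available:

```matlab
% Create a shallow feedforward network with one hidden layer of 8 neurons,
% then wrap it in an idNeuralNetwork mapping object.
net = feedforwardnet(8);
NW  = idNeuralNetwork(net);
```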

`NW = idNeuralNetwork(shallownet,[],UseLinearFcn)` specifies whether `NW` uses a linear function as a subcomponent.

`NW = idNeuralNetwork(shallownet,[],UseLinearFcn,UseOffset)` specifies whether `NW` uses an offset term.

### Input Arguments

## Properties

## Examples

## Algorithms

The learnable parameters of the `idNeuralNetwork` function are determined during estimation of nonlinear ARX and Hammerstein-Wiener models, using the `nlarx` and `nlhw` commands, respectively.

The software initializes these parameters using the following steps:

1. Determine the linear function coefficients *L* and the offset *y*_{0}, if in use and free, by performing a least-squares fit to the data.
2. Initialize the learnable parameters of the network function by fitting the residues of the linear and offset terms from step 1. The initialization scheme depends upon the type of the underlying network:
   - For `RegressionNeuralNetwork` (Statistics and Machine Learning Toolbox) networks, use `fitrnet` (Statistics and Machine Learning Toolbox).
   - For `dlnetwork` (Deep Learning Toolbox) networks, perform initialization by training the network using the solver specified in `NW.EstimationOptions`.
   - For `network` (Deep Learning Toolbox) networks, perform initialization by training the network using the solver specified in `NW.EstimationOptions`.

After initialization, the software updates the parameters using a nonlinear least-squares optimization solver (see `SearchMethod` in `nlarxOptions` and `SearchOptions` in `nlhwOptions`) to minimize the chosen objective:

- For nonlinear ARX models, the objective is either prediction-error minimization or simulation-error minimization, depending on whether the `Focus` option in `nlarxOptions` is `"prediction"` or `"simulation"`.
- For Hammerstein-Wiener models, the objective is simulation-error-norm minimization.
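For example, a sketch of configuring the nonlinear ARX objective and search method (`data` and `regressors` are placeholders you supply; `"simulation"` and `"lm"` are documented option values):

```matlab
% Minimize simulation error using a Levenberg-Marquardt search.
opt = nlarxOptions(Focus="simulation", SearchMethod="lm");
sys = nlarx(data, regressors, idNeuralNetwork, opt);
```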

See `nlarxOptions` and `nlhwOptions` for more information on how to configure the objective and search method.

## Version History

**Introduced in R2023b**

## See Also

`nlarx` | `idnlarx` | `nlhw` | `idnlhw` | `RegressionNeuralNetwork` (Statistics and Machine Learning Toolbox) | `fitrnet` (Statistics and Machine Learning Toolbox) | `dlnetwork` (Deep Learning Toolbox) | `network` (Deep Learning Toolbox) | `feedforwardnet` (Deep Learning Toolbox) | `trainingOptions` (Deep Learning Toolbox) | `evaluate`

### Topics

- Available Mapping Functions for Nonlinear ARX Models
- Available Nonlinearity Estimators for Hammerstein-Wiener Models
- List of Deep Learning Layers (Deep Learning Toolbox)
- Define Custom Training Loops, Loss Functions, and Networks (Deep Learning Toolbox)
- Train and Apply Multilayer Shallow Neural Networks (Deep Learning Toolbox)