idNeuralNetwork
Multilayer neural network mapping function for nonlinear ARX models and Hammerstein-Wiener models (requires Statistics and Machine Learning Toolbox or Deep Learning Toolbox)
Since R2023b
Description
An idNeuralNetwork object creates a neural network function and
is a nonlinear mapping object for estimating nonlinear ARX models and Hammerstein-Wiener
models. This mapping object lets you create neural networks using the regression networks of
Statistics and Machine Learning Toolbox™ and the deep and shallow networks of Deep Learning Toolbox™.
Mathematically, idNeuralNetwork is a function that maps m inputs X(t) = [x1(t),x2(t),…,xm(t)]T to a single scalar output y(t) using the following relationship:
y(t) = y0 + X(t)T P L + S(X(t)T Q)
A numeric sketch of this mapping appears after the list of network types below.
Here:
X(t) is an m-by-1 vector of inputs, or regressors.
y0 is the output offset, a scalar.
P and Q are m-by-p and m-by-q projection matrices, respectively.
L is a p-by-1 vector of weights.
S(.) represents a neural network object of one of the following types:
RegressionNeuralNetwork (Statistics and Machine Learning Toolbox) object — Network object created using fitrnet (Statistics and Machine Learning Toolbox)
dlnetwork (Deep Learning Toolbox) object — Deep learning network object
network (Deep Learning Toolbox) object — Shallow network object created using a command such as feedforwardnet (Deep Learning Toolbox)
Additionally, you can use a cascade-correlation neural network, a network type that is implemented with deep networks and that lets you create a network without specifying the network size in advance.
See Examples for more information.
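The following MATLAB sketch illustrates the structure of the mapping above with hypothetical dimensions, made-up values, and a simple stand-in for the network S(.); it is not estimation code.
m = 3; p = 2; q = 2;               % number of regressors and projection ranks (hypothetical)
X  = [0.5; -1.2; 0.3];             % m-by-1 regressor vector X(t)
y0 = 0.1;                          % scalar output offset
P  = randn(m,p); Q = randn(m,q);   % projection matrices
L  = randn(p,1);                   % p-by-1 weight vector
S  = @(z) sum(tanh(z));            % placeholder for the neural network output
y  = y0 + X'*P*L + S(X'*Q)         % scalar output y(t)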
Use idNeuralNetwork as the output value, or, for multiple-output systems,
one of the output values in the OutputFcn property of an idnlarx model or the InputNonlinearity and
OutputNonlinearity properties of an idnlhw object. For example, specify idNeuralNetwork when you
estimate an idnlarx model with the following
command.
sys = nlarx(data,regressors,idNeuralNetwork)
When nlarx estimates the model, it essentially estimates the parameters of the idNeuralNetwork function. You can use a similar approach when you specify input or output nonlinearities using the
nlhw command. For example, specify idNeuralNetwork as both the
input and output nonlinearities with the following command.
sys = nlhw(data,orders,idNeuralNetwork,idNeuralNetwork)
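For context, a minimal end-to-end sketch for a nonlinear ARX model follows. It assumes the iddata1 example data set that ships with System Identification Toolbox; adjust the orders and data for your application.
load iddata1 z1                    % example estimation data (iddata object z1)
NW  = idNeuralNetwork;             % default mapping: one layer of 10 ReLU activations
sys = nlarx(z1, [2 2 1], NW);      % nonlinear ARX model with na = 2, nb = 2, nk = 1
compare(z1, sys)                   % assess the fit to the estimation data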
Creation
Syntax
Description
Create Regression Network or Deep Learning Network
NW = idNeuralNetwork creates an idNeuralNetwork object NW that uses a single
hidden layer of ten rectified linear unit (ReLU) activations.
The specific type of network that NW represents depends on the
toolboxes you have access to.
If you have access to Statistics and Machine Learning Toolbox, then idNeuralNetwork uses fitrnet (Statistics and Machine Learning Toolbox) to create a RegressionNeuralNetwork (Statistics and Machine Learning Toolbox)-based map.
If Statistics and Machine Learning Toolbox is not available but you have access to Deep Learning Toolbox, then idNeuralNetwork uses dlnetwork (Deep Learning Toolbox) to create a deep learning network map.
For idnlhw models, the number of inputs to the network is 1. For
idnlarx models, the number of inputs is unknown, as this number is
determined during estimation. NW also uses a parallel linear
function and an offset element.
For multiple-output nonlinear ARX or Hammerstein-Wiener models, create a separate
idNeuralNetwork object for each output. Each element of the
output function must represent a single-output network object.
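For example, a sketch for a hypothetical two-output nonlinear ARX model might look like the following; it assumes data and regressors are already defined for a two-output system.
NW1 = idNeuralNetwork;                       % mapping for output 1 (default network)
NW2 = idNeuralNetwork(5);                    % mapping for output 2 (one hidden layer with 5 units)
sys = nlarx(data, regressors, [NW1; NW2]);   % one mapping object per output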
NW = idNeuralNetwork(LayerSizes) uses the length of LayerSizes as the number of
layers. LayerSizes is a row vector of positive integers, where the
ith element specifies the number of activations in the corresponding ith
layer.
NW = idNeuralNetwork(LayerSizes,Activations) specifies the types of activation to use in each layer. The combination of the
Activations specification and the available toolboxes determines
which type of neural network NW uses.
NW = idNeuralNetwork(LayerSizes,Activations,UseLinearFcn) specifies whether NW uses a linear function as a
subcomponent.
NW = idNeuralNetwork(LayerSizes,Activations,UseLinearFcn,UseOffset) specifies whether NW uses an offset term.
NW = idNeuralNetwork(___,Name=Value) creates a neural network object with properties specified by one or more name-value
arguments.
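The following sketches show these syntaxes with example values; the activation names shown ("relu", "tanh", "sigmoid") are examples rather than an exhaustive list.
NW1 = idNeuralNetwork([10 5]);                        % two hidden layers with 10 and 5 activations
NW2 = idNeuralNetwork([10 5], ["relu" "tanh"]);       % specify an activation type per layer
NW3 = idNeuralNetwork(10, "sigmoid", false, false);   % disable the parallel linear function and offset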
Create Cascade-Correlation Neural Network
Creating cascade-correlation neural networks requires Deep Learning Toolbox software. You can use cascade-correlation neural networks to estimate nonlinear ARX models, but not Hammerstein-Wiener models.
NW = idNeuralNetwork("cascade-correlation") creates a cascade-correlation neural network where the network determines the number of
layers during training. Each of these layers has one unit. All units are connected to
all previous layers and inputs. For more information on these networks, see Cascade-Correlation Neural Networks.
NW = idNeuralNetwork("cascade-correlation",Activations) specifies the types of activation to use in each layer.
NW = idNeuralNetwork("cascade-correlation",Activations,UseLinearFcn) specifies whether NW uses a linear function as a
subcomponent.
NW = idNeuralNetwork("cascade-correlation",Activations,UseLinearFcn,UseOffset) specifies whether NW uses an offset term.
NW = idNeuralNetwork(___,Name=Value) creates a cascade-correlation neural network object with properties specified by one or
more name-value arguments.
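A sketch of using a cascade-correlation network for nonlinear ARX estimation follows; it assumes the iddata1 example data set and that Deep Learning Toolbox is installed.
load iddata1 z1                                   % example estimation data
NWcc = idNeuralNetwork("cascade-correlation");    % let training determine the number of layers
sys  = nlarx(z1, [2 2 1], NWcc);                  % estimate a nonlinear ARX model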
Use Existing Shallow Neural Network
NW = idNeuralNetwork(shallownet) creates NW using the network (Deep Learning Toolbox) object shallownet.
shallownet is typically the output of feedforwardnet (Deep Learning Toolbox), cascadeforwardnet (Deep Learning Toolbox), or linearlayer (Deep Learning Toolbox).
NW = idNeuralNetwork(shallownet,[],UseLinearFcn) specifies whether NW uses a linear function as a
subcomponent.
NW = idNeuralNetwork(shallownet,[],UseLinearFcn,UseOffset) specifies whether NW uses an offset term.
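A sketch of wrapping an existing shallow network follows; it assumes Deep Learning Toolbox is installed.
net = feedforwardnet([5 3]);                  % shallow network with two hidden layers (5 and 3 neurons)
NW  = idNeuralNetwork(net);                   % use the shallow network as the mapping object
NW2 = idNeuralNetwork(net, [], true, false);  % keep the parallel linear function, drop the offset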
Input Arguments
Name-Value Arguments
Properties
Examples
Algorithms
The learnable parameters of the idNeuralNetwork function are determined
during estimation of the nonlinear ARX and Hammerstein-Wiener models, using the nlarx and nlhw commands, respectively.
The software initializes these parameters using the following steps:
Determine the linear function coefficients L and the offset y0, if in use and free, by performing a least-squares fit to the data.
Initialize the learnable parameters of the network function by fitting the residuals of the linear and offset terms from step 1. The initialization scheme depends on the type of the underlying network:
For RegressionNeuralNetwork (Statistics and Machine Learning Toolbox) networks, use fitrnet (Statistics and Machine Learning Toolbox).
For dlnetwork (Deep Learning Toolbox) networks, perform initialization by training the network using the solver specified in NW.EstimationOptions. For cascade-correlation neural networks, perform initialization by training a network using the options specified in NW.EstimationOptions and nlarxOptions.
For network (Deep Learning Toolbox) networks, perform initialization by training the network using the solver specified in NW.EstimationOptions.
After initialization, the software updates the parameters using a nonlinear least-squares
optimization solver (see SearchMethod in nlarxOptions
and SearchOptions in nlhwOptions) to minimize the chosen
objective, as follows:
For nonlinear ARX models, the objective is either prediction-error minimization or simulation-error minimization, depending on whether the Focus option in nlarxOptions is "prediction" or "simulation".
For Hammerstein-Wiener models, the objective is simulation-error-norm minimization.
See nlarxOptions and nlhwOptions for more information on how to configure the objective and search
method.
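For example, the following sketch selects a simulation-error objective and a Levenberg-Marquardt search for a nonlinear ARX estimation; data and regressors are assumed to be defined.
opt = nlarxOptions(Focus="simulation", SearchMethod="lm");   % simulation-error objective, LM search
sys = nlarx(data, regressors, idNeuralNetwork, opt);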
Version History
Introduced in R2023b
See Also
nlarx | idnlarx | nlarxOptions | nlhw | idnlhw | RegressionNeuralNetwork (Statistics and Machine Learning Toolbox) | fitrnet (Statistics and Machine Learning Toolbox) | dlnetwork (Deep Learning Toolbox) | network (Deep Learning Toolbox) | feedforwardnet (Deep Learning Toolbox) | trainingOptions (Deep Learning Toolbox) | evaluate
Topics
- Available Mapping Functions for Nonlinear ARX Models
- Cascade-Correlation Neural Networks
- Available Nonlinearity Estimators for Hammerstein-Wiener Models
- List of Deep Learning Layers (Deep Learning Toolbox)
- Define Custom Training Loops, Loss Functions, and Networks (Deep Learning Toolbox)
- Train and Apply Multilayer Shallow Neural Networks (Deep Learning Toolbox)
- Predict EV Battery Temperature Using Cascade-Correlation Model


