# CompactRegressionNeuralNetwork

Compact neural network model for regression

## Description

`CompactRegressionNeuralNetwork` is a compact version of a `RegressionNeuralNetwork` model object. The compact model does not include the data used for training the regression model. Therefore, you cannot perform some tasks, such as cross-validation, using the compact model. Use a compact model for tasks such as predicting the response values of new data.

## Creation

Create a `CompactRegressionNeuralNetwork` object from a full `RegressionNeuralNetwork` model object by using `compact`.
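For instance, assuming you have already trained a full model `Mdl` (for example, with `fitrnet`), a minimal sketch:

```matlab
% Assumes Mdl is a full RegressionNeuralNetwork object, e.g. returned by fitrnet
compactMdl = compact(Mdl);   % CompactRegressionNeuralNetwork without training data
```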

## Properties


### Neural Network Properties

#### LayerSizes

Sizes of the fully connected layers in the neural network model, returned as a positive integer vector. The ith element of `LayerSizes` is the number of outputs in the ith fully connected layer of the neural network model.

`LayerSizes` does not include the size of the final fully connected layer. This layer always has one output.

Data Types: `single` | `double`

#### LayerWeights

Learned layer weights for the fully connected layers, returned as a cell array. The ith entry in the cell array corresponds to the layer weights for the ith fully connected layer. For example, `Mdl.LayerWeights{1}` returns the weights for the first fully connected layer of the model `Mdl`.

`LayerWeights` includes the weights for the final fully connected layer.

Data Types: `cell`

#### LayerBiases

Learned layer biases for the fully connected layers, returned as a cell array. The ith entry in the cell array corresponds to the layer biases for the ith fully connected layer. For example, `Mdl.LayerBiases{1}` returns the biases for the first fully connected layer of the model `Mdl`.

`LayerBiases` includes the biases for the final fully connected layer.

Data Types: `cell`
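As a rough illustration of how `LayerSizes`, `LayerWeights`, and `LayerBiases` fit together, the sketch below reproduces a forward pass by hand. It assumes every hidden layer uses the `'relu'` activation, that each `LayerWeights{k}` is stored as an outputs-by-inputs matrix, and that `x` is a column vector of predictor values preprocessed the same way as the training data; for real predictions, use `predict` instead.

```matlab
% Hand-rolled forward pass (illustrative sketch only; use predict in practice).
% Assumes 'relu' hidden activations and x is a preprocessed column vector.
a = x;
for k = 1:numel(Mdl.LayerWeights)-1
    a = Mdl.LayerWeights{k}*a + Mdl.LayerBiases{k};   % fully connected layer
    a = max(a,0);                                     % ReLU activation
end
% The final fully connected layer has one output and no activation ('none')
yhat = Mdl.LayerWeights{end}*a + Mdl.LayerBiases{end};
```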

#### Activations

Activation functions for the fully connected layers of the neural network model, returned as a character vector or cell array of character vectors with values from this table.

| Value | Description |
| --- | --- |
| `'relu'` | Rectified linear unit (ReLU) function — performs a threshold operation on each element of the input, where any value less than zero is set to zero; that is, $f(x)=\begin{cases}x, & x\ge 0\\ 0, & x<0\end{cases}$ |
| `'tanh'` | Hyperbolic tangent (tanh) function — applies the `tanh` function to each input element |
| `'sigmoid'` | Sigmoid function — performs the operation $f(x)=\frac{1}{1+e^{-x}}$ on each input element |
| `'none'` | Identity function — returns each input element without performing any transformation; that is, $f(x)=x$ |

• If `Activations` contains only one activation function, then it is the activation function for every fully connected layer of the neural network model, excluding the final fully connected layer, which does not have an activation function (`OutputLayerActivation`).

• If `Activations` is a cell array of activation functions, then the ith element is the activation function for the ith fully connected layer of the neural network model.

Data Types: `char` | `cell`
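The four activation functions in the table can be written as elementwise anonymous functions, for example:

```matlab
% Elementwise versions of the activation functions listed above
relu     = @(x) max(x,0);          % 'relu'
tanhFcn  = @(x) tanh(x);           % 'tanh'
sigmoid  = @(x) 1./(1 + exp(-x));  % 'sigmoid'
identity = @(x) x;                 % 'none'
```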

#### OutputLayerActivation

Activation function for the final fully connected layer, returned as `'none'`.

### Data Properties

#### PredictorNames

Predictor variable names, returned as a cell array of character vectors. The order of the elements of `PredictorNames` corresponds to the order in which the predictor names appear in the training data.

Data Types: `cell`

#### CategoricalPredictors

Categorical predictor indices, returned as a vector of positive integers. Assuming that the predictor data contains observations in rows, `CategoricalPredictors` contains index values corresponding to the columns of the predictor data that contain categorical predictors. If none of the predictors are categorical, then this property is empty (`[]`).

Data Types: `double`

#### ExpandedPredictorNames

Expanded predictor names, returned as a cell array of character vectors. If the model uses encoding for categorical variables, then `ExpandedPredictorNames` includes the names that describe the expanded variables. Otherwise, `ExpandedPredictorNames` is the same as `PredictorNames`.

Data Types: `cell`

#### ResponseName

Response variable name, returned as a character vector.

Data Types: `char`

#### ResponseTransform

Response transformation function, returned as `'none'`. The software does not transform the raw response values.

## Object Functions


| Function | Description |
| --- | --- |
| `lime` | Local interpretable model-agnostic explanations (LIME) |
| `partialDependence` | Compute partial dependence |
| `plotPartialDependence` | Create partial dependence plot (PDP) and individual conditional expectation (ICE) plots |
| `shapley` | Shapley values |
| `loss` | Loss for regression neural network |
| `predict` | Predict responses using regression neural network |

## Examples


Reduce the size of a full regression neural network model by removing the training data from the model. You can use a compact model to improve memory efficiency.

Load the `patients` data set. Create a table from the data set. Each row corresponds to one patient, and each column corresponds to a diagnostic variable. Use the `Systolic` variable as the response variable, and the rest of the variables as predictors.

```
load patients
tbl = table(Age,Diastolic,Gender,Height,Smoker,Weight,Systolic);
```

Train a regression neural network model using the data. Specify the `Systolic` column of `tbl` as the response variable. Specify to standardize the numeric predictors.

```
Mdl = fitrnet(tbl,"Systolic","Standardize",true)
```
```
Mdl = 
  RegressionNeuralNetwork
           PredictorNames: {'Age'  'Diastolic'  'Gender'  'Height'  'Smoker'  'Weight'}
             ResponseName: 'Systolic'
    CategoricalPredictors: [3 5]
        ResponseTransform: 'none'
          NumObservations: 100
               LayerSizes: 10
              Activations: 'relu'
    OutputLayerActivation: 'linear'
                   Solver: 'LBFGS'
          ConvergenceInfo: [1×1 struct]
          TrainingHistory: [1000×7 table]
```

`Mdl` is a full `RegressionNeuralNetwork` model object.

Reduce the size of the model by using `compact`.

```
compactMdl = compact(Mdl)
```
```
compactMdl = 
  CompactRegressionNeuralNetwork
               LayerSizes: 10
              Activations: 'relu'
    OutputLayerActivation: 'linear'
```

`compactMdl` is a `CompactRegressionNeuralNetwork` model object. `compactMdl` contains fewer properties than the full model `Mdl`.

Display the amount of memory used by each neural network model.

```
whos("Mdl","compactMdl")
```
```
  Name            Size   Bytes  Class                                                  Attributes

  Mdl             1x1    72818  RegressionNeuralNetwork
  compactMdl      1x1     5995  classreg.learning.regr.CompactRegressionNeuralNetwork
```

The full model is larger than the compact model.
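To use the compact model, pass it data with the same predictor variables as the training table. A short sketch using the table from this example:

```matlab
% Predict responses with the compact model; predict ignores the Systolic
% column of tbl, so the training table can stand in for new data here.
yhat = predict(compactMdl,tbl);
mse  = loss(compactMdl,tbl,"Systolic");   % mean squared error
```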