regressionLayer

Regression output layer

Description

A regression layer computes the half-mean-squared-error loss for regression tasks.

layer = regressionLayer returns a regression output layer for a neural network as a RegressionOutputLayer object.

Predict responses of a trained regression network using predict. Normalizing the responses often helps to stabilize and speed up training of neural networks for regression. For more information, see Train Convolutional Neural Network for Regression.
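For illustration only (this sketch is not part of the page; XTrain, YTrain, XTest, layers, and options are placeholders assumed to exist in your workspace), one common pattern is to normalize the responses before training and undo the normalization after prediction:

% Normalize the responses to zero mean and unit variance (assumed numeric vector YTrain).
mu = mean(YTrain);
sigma = std(YTrain);
YTrainNorm = (YTrain - mu)./sigma;

% Train on the normalized responses (layers and options assumed to be defined elsewhere).
net = trainNetwork(XTrain,YTrainNorm,layers,options);

% Predict and map the predictions back to the original response scale.
YPred = predict(net,XTest).*sigma + mu;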

layer = regressionLayer(Name,Value) sets the optional Name and ResponseNames properties using name-value pairs. For example, regressionLayer('Name','output') creates a regression layer with the name 'output'. Enclose each property name in single quotes.
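As a further sketch (not an example from this page; the response names 'x' and 'y' are arbitrary placeholders), you can set both properties at once:

% Create a regression output layer with a name and placeholder response names.
layer = regressionLayer('Name','output','ResponseNames',{'x','y'});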

Examples

Create a regression output layer with the name 'routput'.

layer = regressionLayer('Name','routput')
layer = 
  RegressionOutputLayer with properties:

             Name: 'routput'
    ResponseNames: {}

   Hyperparameters
     LossFunction: 'mean-squared-error'

The default loss function for regression is mean-squared-error.

Include a regression output layer in a Layer array.

layers = [ ...
    imageInputLayer([28 28 1])
    convolution2dLayer(12,25)
    reluLayer
    fullyConnectedLayer(1)
    regressionLayer]
layers = 
  5x1 Layer array with layers:

     1   ''   Image Input         28x28x1 images with 'zerocenter' normalization
     2   ''   2-D Convolution     25 12x12 convolutions with stride [1  1] and padding [0  0  0  0]
     3   ''   ReLU                ReLU
     4   ''   Fully Connected     1 fully connected layer
     5   ''   Regression Output   mean-squared-error
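As an illustrative follow-on, not shown on this page, the layer array above could be trained on the digit rotation-angle data used in Train Convolutional Neural Network for Regression; the training options below are arbitrary placeholder choices:

% Load sample images and their rotation-angle responses (Deep Learning Toolbox sample data).
[XTrain,~,YTrain] = digitTrain4DArrayData;

% Train the network defined by the layers array (options chosen for illustration only).
options = trainingOptions('sgdm','MaxEpochs',5,'Verbose',false);
net = trainNetwork(XTrain,YTrain,layers,options);

% Predict rotation angles for the training images.
YPred = predict(net,XTrain);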

Input Arguments

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

Example: regressionLayer('Name','output') creates a regression layer with the name 'output'

Name — Layer name

Layer name, specified as a character vector or a string scalar. For Layer array input, the trainnet, trainNetwork, assembleNetwork, layerGraph, and dlnetwork functions automatically assign names to layers with the name "".

The RegressionOutputLayer object stores this property as a character vector.

Data Types: char | string

ResponseNames — Names of responses

Names of the responses, specified as a cell array of character vectors or a string array. At training time, the software automatically sets the response names according to the training data. The default is {}.

Data Types: cell

Output Arguments

layer — Regression output layer

Regression output layer, returned as a RegressionOutputLayer object.

More About

Regression Output Layer

A regression layer computes the half-mean-squared-error loss for regression tasks. For typical regression problems, a regression layer must follow the final fully connected layer.

For a single observation, the mean-squared-error is given by:

\text{MSE} = \frac{\sum_{i=1}^{R} (t_i - y_i)^2}{R},

where R is the number of responses, t_i is the target output, and y_i is the network’s prediction for response i.

For image and sequence-to-one regression networks, the loss function of the regression layer is the half-mean-squared-error of the predicted responses, not normalized by R:

\text{loss} = \frac{1}{2} \sum_{i=1}^{R} (t_i - y_i)^2.

For image-to-image regression networks, the loss function of the regression layer is the half-mean-squared-error of the predicted responses for each pixel, not normalized by R:

\text{loss} = \frac{1}{2} \sum_{p=1}^{HWC} (t_p - y_p)^2,

where H, W, and C denote the height, width, and number of channels of the output, respectively, and p indexes linearly into each element (pixel) of t and y.

For sequence-to-sequence regression networks, the loss function of the regression layer is the half-mean-squared-error of the predicted responses for each time step, not normalized by R:

\text{loss} = \frac{1}{2S} \sum_{i=1}^{S} \sum_{j=1}^{R} (t_{ij} - y_{ij})^2,

where S is the sequence length.

When training, the software calculates the mean loss over the observations in the mini-batch.
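As a purely illustrative check of these formulas (not code from this page; the values of T and Y below are placeholders), you can compute the per-observation loss directly from targets and predictions:

% Half-mean-squared-error for a single observation with R responses (placeholder values).
T = [1.0 2.0 3.0];            % target responses
Y = [1.1 1.8 3.2];            % predicted responses
loss = 0.5*sum((T - Y).^2)    % 0.5*(0.01 + 0.04 + 0.04) = 0.045

% For a mini-batch stored as N-by-R matrices Tbatch and Ybatch, the training
% loss is the mean of the per-observation losses:
% lossBatch = mean(0.5*sum((Tbatch - Ybatch).^2, 2));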

Version History

Introduced in R2017a