
leakyReluLayer

Leaky Rectified Linear Unit (ReLU) layer

Description

A leaky ReLU layer performs a threshold operation, where any input value less than zero is multiplied by a fixed scalar [1].

This operation is equivalent to:

f(x) = \begin{cases} x, & x \geq 0 \\ \text{scale} \cdot x, & x < 0 \end{cases}
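
As a quick numeric illustration, here is a minimal sketch in plain MATLAB (independent of the layer object) that applies the operation element-wise to sample inputs:

scale = 0.01;                   % default multiplier used by leakyReluLayer
x = [-2 -0.5 0 0.5 2];
y = max(x,0) + scale*min(x,0)   % returns [-0.0200 -0.0050 0 0.5000 2.0000]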

Creation

Description

layer = leakyReluLayer returns a leaky ReLU layer.

layer = leakyReluLayer(scale) returns a leaky ReLU layer with a scalar multiplier for negative inputs equal to scale.


layer = leakyReluLayer(___,'Name',Name) returns a leaky ReLU layer and sets the optional Name property.

Properties


Leaky ReLU

Scale

Scalar multiplier for negative input values, specified as a numeric scalar. The default value is 0.01.

Example: 0.4
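
For example, create a layer with a custom scale and read the property back:

layer = leakyReluLayer(0.4);
layer.Scale    % returns 0.4000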

Layer

Name

Layer name, specified as a character vector or a string scalar. For Layer array input, the trainnet, trainNetwork, assembleNetwork, layerGraph, and dlnetwork functions automatically assign names to layers with the name "".

The LeakyReLULayer object stores this property as a character vector.

Data Types: char | string

NumInputs

This property is read-only.

Number of inputs to the layer, returned as 1. This layer accepts a single input only.

Data Types: double

InputNames

This property is read-only.

Input names, returned as {'in'}. This layer accepts a single input only.

Data Types: cell

NumOutputs

This property is read-only.

Number of outputs from the layer, returned as 1. This layer has a single output only.

Data Types: double

OutputNames

This property is read-only.

Output names, returned as {'out'}. This layer has a single output only.

Data Types: cell
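
As a quick check, you can inspect these read-only properties on any leaky ReLU layer; the values match the descriptions above:

layer = leakyReluLayer;
layer.NumInputs     % returns 1
layer.InputNames    % returns {'in'}
layer.NumOutputs    % returns 1
layer.OutputNames   % returns {'out'}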

Examples


Create Leaky ReLU Layer

Create a leaky ReLU layer with the name 'leaky1' and a scalar multiplier for negative inputs equal to 0.1.

layer = leakyReluLayer(0.1,'Name','leaky1')
layer = 
  LeakyReLULayer with properties:

     Name: 'leaky1'

   Hyperparameters
    Scale: 0.1000

Include Leaky ReLU Layer in Layer Array

Include a leaky ReLU layer in a Layer array.

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16)
    batchNormalizationLayer
    leakyReluLayer
    
    maxPooling2dLayer(2,'Stride',2)
    convolution2dLayer(3,32)
    batchNormalizationLayer
    leakyReluLayer
    
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer]
layers = 
  11x1 Layer array with layers:

     1   ''   Image Input             28x28x1 images with 'zerocenter' normalization
     2   ''   2-D Convolution         16 3x3 convolutions with stride [1  1] and padding [0  0  0  0]
     3   ''   Batch Normalization     Batch normalization
     4   ''   Leaky ReLU              Leaky ReLU with scale 0.01
     5   ''   2-D Max Pooling         2x2 max pooling with stride [2  2] and padding [0  0  0  0]
     6   ''   2-D Convolution         32 3x3 convolutions with stride [1  1] and padding [0  0  0  0]
     7   ''   Batch Normalization     Batch normalization
     8   ''   Leaky ReLU              Leaky ReLU with scale 0.01
     9   ''   Fully Connected         10 fully connected layer
    10   ''   Softmax                 softmax
    11   ''   Classification Output   crossentropyex
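
To confirm the layer's scaling behavior on data, one option is to build a small dlnetwork around the layer and pass an all-negative input through it. This is a minimal sketch; the input size and scale here are illustrative:

layersCheck = [
    imageInputLayer([28 28 1],'Normalization','none')
    leakyReluLayer(0.1,'Name','leaky')];
net = dlnetwork(layersCheck);
X = dlarray(-ones(28,28,1,1),'SSCB');   % every element is -1
Y = predict(net,X);
extractdata(Y(1))                       % returns -0.1000, since -1 is scaled by 0.1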


References

[1] Maas, Andrew L., Awni Y. Hannun, and Andrew Y. Ng. "Rectifier nonlinearities improve neural network acoustic models." In Proc. ICML, vol. 30, no. 1. 2013.

Extended Capabilities

C/C++ Code Generation
Generate C and C++ code using MATLAB® Coder™.

GPU Code Generation
Generate CUDA® code for NVIDIA® GPUs using GPU Coder™.

Version History

Introduced in R2017b