Custom Training Loops

Customize deep learning training loops and loss functions for image networks

If the trainingOptions function does not provide the training options that you need for your task, or if you have a loss function that the trainnet function does not support, then you can define a custom training loop. For models that cannot be specified as networks of layers, you can define the model as a function. To learn more, see Define Custom Training Loops, Loss Functions, and Networks.
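The following sketch shows the typical shape of a custom training loop. It assumes you have already created a dlnetwork object net (ending in a softmax operation) and a minibatchqueue object mbq that returns mini-batches of images X and one-hot encoded targets T; the modelLoss helper and the hyperparameter values are illustrative only.

% Minimal custom training loop sketch (hypothetical network and data).
numEpochs = 10;
learnRate = 0.01;
averageGrad = [];
averageSqGrad = [];
iteration = 0;

for epoch = 1:numEpochs
    shuffle(mbq);
    while hasdata(mbq)
        iteration = iteration + 1;
        [X,T] = next(mbq);

        % Evaluate the model loss and gradients using dlfeval,
        % which enables automatic differentiation inside modelLoss.
        [loss,gradients] = dlfeval(@modelLoss,net,X,T);

        % Update the network parameters using the Adam solver.
        [net,averageGrad,averageSqGrad] = adamupdate(net, ...
            gradients,averageGrad,averageSqGrad,iteration,learnRate);
    end
end

function [loss,gradients] = modelLoss(net,X,T)
    % Forward pass; net is assumed to output class probabilities.
    Y = forward(net,X);
    loss = crossentropy(Y,T);
    gradients = dlgradient(loss,net.Learnables);
end

The key design point is that dlgradient must be called inside a function that is evaluated by dlfeval; this is what traces the forward computation so that gradients with respect to net.Learnables can be computed.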

Functions

dlnetwork - Deep learning neural network
trainingProgressMonitor - Monitor and plot training progress for deep learning custom training loops (Since R2022b)
minibatchqueue - Create mini-batches for deep learning (Since R2020b)
dlarray - Deep learning array for customization
dlgradient - Compute gradients for custom training loops using automatic differentiation
dljacobian - Jacobian matrix deep learning operation (Since R2024b)
dldivergence - Divergence of deep learning data (Since R2024b)
dllaplacian - Laplacian of deep learning data (Since R2024b)
dlfeval - Evaluate deep learning model for custom training loops
crossentropy - Cross-entropy loss for classification tasks
indexcrossentropy - Index cross-entropy loss for classification tasks (Since R2024b)
l1loss - L1 loss for regression tasks (Since R2021b)
l2loss - L2 loss for regression tasks (Since R2021b)
huber - Huber loss for regression tasks (Since R2021a)
mse - Half mean squared error
dlconv - Deep learning convolution
dltranspconv - Deep learning transposed convolution
fullyconnect - Sum all weighted input data and apply a bias
batchnorm - Normalize data across all observations for each channel independently
crosschannelnorm - Cross-channel square-normalize using local responses (Since R2020a)
groupnorm - Normalize data across grouped subsets of channels for each observation independently (Since R2020b)
instancenorm - Normalize across each channel for each observation independently (Since R2021a)
layernorm - Normalize data across all channels for each observation independently (Since R2021a)
avgpool - Pool data to average values over spatial dimensions
maxpool - Pool data to maximum value
maxunpool - Unpool the output of a maximum pooling operation
relu - Apply rectified linear unit activation
leakyrelu - Apply leaky rectified linear unit activation
gelu - Apply Gaussian error linear unit (GELU) activation (Since R2022b)
softmax - Apply softmax activation to channel dimension
sigmoid - Apply sigmoid activation
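As an illustration of how the operation functions above compose when a model is defined as a function rather than a network of layers, the sketch below passes a batch of images through convolution, pooling, and fully connected operations on formatted dlarray data. All weights, biases, and sizes here are hypothetical placeholders, not a recommended architecture.

% Sketch of a model defined as a function of deep learning operations.
X = dlarray(rand(28,28,1,16,"single"),"SSCB");   % batch of 16 grayscale images

% Convolution: 5-by-5 filters, 1 input channel, 8 filters, scalar bias of 0.
weights = dlarray(rand(5,5,1,8,"single"));
Y = dlconv(X,weights,0,Padding="same");          % 28-by-28-by-8-by-16
Y = relu(Y);
Y = avgpool(Y,2,Stride=2);                       % 14-by-14-by-8-by-16

% Fully connected: flatten spatial and channel dimensions to 10 classes.
fcWeights = dlarray(rand(10,14*14*8,"single"));
fcBias = zeros(10,1,"single");
Y = fullyconnect(Y,fcWeights,fcBias);            % 10-by-16, format "CB"
Y = softmax(Y);                                  % class probabilities

Because X carries the "SSCB" format labels, each operation can infer which dimensions are spatial, channel, and batch, and softmax is applied along the channel dimension automatically.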

Topics

Custom Training Loops

Automatic Differentiation

Featured Examples