
polynomialLearnRate

Polynomial learning rate schedule

Since R2024b

    Description

    A polynomial learning rate schedule drops the learning rate using a power law.

    Tip

    For most tasks, setting the NumSteps argument to the number of training steps can yield better results. To easily use a polynomial learning rate schedule that drops the learning rate over the length of the training process, specify the LearnRateSchedule argument of the trainingOptions function as "polynomial".

    Creation

    Description

    schedule = polynomialLearnRate creates a polynomialLearnRate object with default property values.


    schedule = polynomialLearnRate(Name=Value) specifies optional properties using one or more name-value arguments. For example, Power=2 specifies an exponent of 2 in the power law.


    Properties


    InitialFactor — Initial scaling factor
    1 (default) | positive scalar

    Initial scaling factor, specified as a positive scalar.

    The software starts by scaling the base learning rate by InitialFactor and transitions toward scaling the learning rate by FinalFactor.

    Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64

    FinalFactor — Final scaling factor
    1e-05 (default) | positive scalar

    Final scaling factor, specified as a positive scalar.

    The software starts by scaling the base learning rate by InitialFactor and transitions toward scaling the learning rate by FinalFactor (see the sketch below).

    Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64
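
    For example, this sketch (with assumed example values) pairs a schedule with the InitialLearnRate training option, which sets the base learning rate that the schedule scales.

    % Sketch with assumed example values: the schedule scales the base
    % learning rate set by the InitialLearnRate training option.
    schedule = polynomialLearnRate(InitialFactor=0.5,FinalFactor=1e-3);

    options = trainingOptions("adam", ...
        InitialLearnRate=0.001, ...
        LearnRateSchedule=schedule);

    % Training starts with an effective learning rate of about
    % 0.001*0.5 = 5e-4 and transitions toward 0.001*1e-3 = 1e-6.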

    Power — Exponent of the power law
    1 (default) | positive scalar

    Exponent of the power law, specified as a positive scalar.

    Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64

    NumSteps — Number of steps to drop the learning rate
    30 (default) | finite positive integer

    Number of steps to drop the learning rate, specified as a finite positive integer.

    If FrequencyUnit is "iteration", then the software updates the learning rate each iteration for NumSteps iterations. If FrequencyUnit is "epoch", then the software updates the learning rate each epoch for NumSteps epochs.

    Tip

    For most tasks, setting the NumSteps argument to the number of training steps can yield better results. To easily use a polynomial learning rate schedule that drops the learning rate over the length of the training process, specify the LearnRateSchedule argument of the trainingOptions function as "polynomial".

    Because polynomialLearnRate objects are finite learning rate schedules, NumSteps must be a finite positive integer.

    Data Types: single | double | int8 | int16 | int32 | int64 | uint8 | uint16 | uint32 | uint64

    FrequencyUnit — Frequency unit
    "epoch" (default) | "iteration"

    Frequency unit, specified as "epoch" or "iteration".

    If FrequencyUnit is "iteration", then the software updates the learning rate each iteration for NumSteps iterations. If FrequencyUnit is "epoch", then the software updates the learning rate each epoch for NumSteps epochs.
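
    For example, the following sketch (with assumed values) creates a schedule that updates the learning rate every iteration for 1000 iterations and passes it to trainingOptions.

    % Sketch with assumed values: drop the learning rate each iteration
    % for 1000 iterations rather than each epoch.
    schedule = polynomialLearnRate( ...
        FrequencyUnit="iteration", ...
        NumSteps=1000);

    options = trainingOptions("adam",LearnRateSchedule=schedule);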

    Examples


    Create a polynomial learning rate schedule with the default settings.

    schedule = polynomialLearnRate
    schedule = 
      polynomialLearnRate with properties:
    
        InitialFactor: 1
          FinalFactor: 1.0000e-05
                Power: 1
        FrequencyUnit: "epoch"
             NumSteps: 30
    
    

    Specify this schedule as a training option.

    options = trainingOptions("adam",LearnRateSchedule=schedule);

    For most tasks, setting the NumSteps argument to the number of training steps can yield better results. As an alternative to creating a polynomialLearnRate object, you can drop the learning rate over the length of the training process by specifying the LearnRateSchedule argument of the trainingOptions function as "polynomial".

    options = trainingOptions("adam",LearnRateSchedule="polynomial");

    Create a polynomial learning rate schedule that drops the learning rate over 100 epochs, starting with an initial scaling factor of 0.5.

    schedule = polynomialLearnRate( ...
        NumSteps=100, ...
        InitialFactor=0.5)
    schedule = 
      polynomialLearnRate with properties:
    
        InitialFactor: 0.5000
          FinalFactor: 1.0000e-05
                Power: 1
        FrequencyUnit: "epoch"
             NumSteps: 100
    
    

    Specify this schedule as a training option.

    options = trainingOptions("adam",LearnRateSchedule=schedule);

    Algorithms

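    One common power-law formulation that is consistent with the properties above computes the learning rate scaling factor at step k = 0, 1, ..., NumSteps as

    factor(k) = FinalFactor + (InitialFactor - FinalFactor)*(1 - k/NumSteps)^Power

    and holds the factor at FinalFactor after NumSteps steps. The software multiplies the base learning rate by this factor. This is a sketch of the general technique, not necessarily the exact update that the software uses. The following MATLAB snippet evaluates and plots this assumed curve using the default property values.

    % Evaluate an assumed polynomial decay curve (illustration only).
    % These values match the default polynomialLearnRate properties.
    initialFactor = 1;
    finalFactor = 1e-5;
    pw = 1;
    numSteps = 30;

    k = 0:numSteps;
    scaleFactor = finalFactor + (initialFactor - finalFactor).*(1 - k/numSteps).^pw;

    plot(k,scaleFactor)
    xlabel("Step")
    ylabel("Learning Rate Scaling Factor")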

    Version History

    Introduced in R2024b