StepScheduler#

class composer.optim.StepScheduler(step_size, gamma=0.1)[source]#

Decays the learning rate discretely at fixed intervals.

See also

This scheduler is based on StepLR from PyTorch.

Decays the learning rate by a factor of gamma at regular intervals of length step_size.

Specifically, the learning rate multiplier \(\alpha\) can be expressed as:

\[\alpha(t) = \gamma^{\text{floor}(t / \rho)} \]

where \(\rho\) represents the time between changes to the learning rate (the step size), and \(\gamma\) represents the multiplicative decay factor.
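
As a concrete check of the formula, the following sketch evaluates the multiplier with \(\rho = 30\) (e.g., epochs) and \(\gamma = 0.1\). This is an illustration of the decay rule only, not Composer's internal implementation:

    import math

    def step_multiplier(t: float, step_size: float, gamma: float = 0.1) -> float:
        # alpha(t) = gamma ** floor(t / rho), with rho = step_size
        return gamma ** math.floor(t / step_size)

    # With step_size = 30 and gamma = 0.1:
    assert math.isclose(step_multiplier(10, 30), 1.0)   # before the first drop
    assert math.isclose(step_multiplier(45, 30), 0.1)   # after one drop
    assert math.isclose(step_multiplier(65, 30), 0.01)  # after two drops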

Parameters
  • step_size (str | Time) – Time between changes to the learning rate.

  • gamma (float) – Multiplicative decay factor. Default = 0.1.
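
A minimal usage sketch with Composer's Trainer, which drops the learning rate by 10x every 30 epochs. It assumes that model, optimizer, and train_dataloader are already constructed; time strings such as "30ep" are parsed as Time values:

    from composer import Trainer
    from composer.optim import StepScheduler

    # Multiply the learning rate by 0.1 every 30 epochs.
    scheduler = StepScheduler(step_size="30ep", gamma=0.1)

    trainer = Trainer(
        model=model,                        # assumed: a ComposerModel
        train_dataloader=train_dataloader,  # assumed: a torch DataLoader
        optimizers=optimizer,               # assumed: a torch optimizer
        schedulers=scheduler,
        max_duration="90ep",
    )
    trainer.fit()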