MultiStepScheduler

class composer.optim.MultiStepScheduler(milestones, gamma=0.1)

Decays the learning rate discretely at fixed milestones.

See also

This scheduler is based on MultiStepLR from PyTorch.

Decays the learning rate by a factor of gamma each time a milestone in milestones is reached.

Specifically, the learning rate multiplier \(\alpha\) can be expressed as:

\[\alpha(t) = \gamma^x\]

Where \(x\) represents the number of milestones that have been reached by time \(t\), and \(\gamma\) represents the multiplicative decay factor. For example, with milestones at epochs 10 and 20 and gamma=0.1, the multiplier is 1 before epoch 10, 0.1 from epoch 10 until epoch 20, and 0.01 thereafter.

Parameters
  • milestones (list[str | Time]) – Times at which the learning rate should change.

  • gamma (float) – Multiplicative decay factor. Default: 0.1.
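
A minimal usage sketch, assuming Composer's Time string format (e.g. '10ep' for epoch 10) and the Trainer's schedulers argument:

```python
from composer.optim import MultiStepScheduler

# Decay the learning rate by 10x at epochs 10 and 20:
#   epochs  0-9  -> multiplier 1.0
#   epochs 10-19 -> multiplier 0.1
#   epochs 20+   -> multiplier 0.01
scheduler = MultiStepScheduler(milestones=['10ep', '20ep'], gamma=0.1)

# Pass the scheduler to the Trainer; model, train_dataloader, and
# optimizer are placeholders assumed to be defined elsewhere:
# trainer = Trainer(
#     model=model,
#     train_dataloader=train_dataloader,
#     optimizers=optimizer,
#     schedulers=scheduler,
#     max_duration='30ep',
# )
# trainer.fit()
```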