MultiStepScheduler

class composer.optim.MultiStepScheduler(milestones, gamma=0.1)

Decays the learning rate discretely at fixed milestones.

See also

This scheduler is based on torch.optim.lr_scheduler.MultiStepLR from PyTorch.

Decays the learning rate by a factor of gamma whenever a time milestone in milestones is reached.

Specifically, the learning rate multiplier α can be expressed as:

α(t) = γ^x

where x is the number of milestones that have been reached, and γ is the multiplicative decay factor.
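
For example, with γ = 0.1 and milestones at epochs 10 and 30 (illustrative values, not defaults of the API), the multiplier is a simple function of the current epoch, as this minimal sketch shows:

    gamma = 0.1
    milestone_epochs = [10, 30]  # illustrative milestones

    def multiplier(epoch: int) -> float:
        # x counts how many milestones have been reached by this epoch
        x = sum(epoch >= m for m in milestone_epochs)
        return gamma ** x

    multiplier(5)   # 1.0:  no milestone reached yet
    multiplier(20)  # 0.1:  one milestone (epoch 10) reached
    multiplier(40)  # 0.01: both milestones reached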

Parameters
  • milestones (list[str | Time]) – Times at which the learning rate should change.

  • gamma (float) – Multiplicative decay factor. Default: 0.1.
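
A minimal usage sketch, assuming milestones are given as Composer Time strings (e.g. "10ep" for 10 epochs); the surrounding Trainer setup is elided:

    from composer.optim import MultiStepScheduler

    # Drop the learning rate by 10x at epochs 10 and 30.
    scheduler = MultiStepScheduler(milestones=["10ep", "30ep"], gamma=0.1)

    # The scheduler is typically passed to Composer's Trainer, e.g.:
    # trainer = Trainer(..., schedulers=scheduler)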