StepScheduler

class composer.optim.StepScheduler(step_size, gamma=0.1)

Decays the learning rate discretely at fixed intervals.

See also

This scheduler is based on StepLR from PyTorch (torch.optim.lr_scheduler.StepLR).

Decays the learning rate by a factor of gamma periodically, with a frequency determined by step_size.

Specifically, the learning rate multiplier \alpha can be expressed as:

\alpha(t) = \gamma^{\lfloor t / \rho \rfloor}

where \rho represents the time between changes to the learning rate (the step size) and \gamma represents the multiplicative decay factor.
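
To make the schedule concrete, here is a minimal sketch of the multiplier formula in plain Python; the helper name step_lr_multiplier and the integer time unit are illustrative, not part of Composer:

    import math

    def step_lr_multiplier(t: int, step_size: int, gamma: float = 0.1) -> float:
        # alpha(t) = gamma ** floor(t / step_size)
        return gamma ** math.floor(t / step_size)

    # With step_size=30 and gamma=0.1:
    #   t in [0, 30)   -> multiplier 1.0
    #   t in [30, 60)  -> multiplier 0.1
    #   t in [60, 90)  -> multiplier 0.01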

Parameters
  • step_size (str | Time) – Time between changes to the learning rate.

  • gamma (float) – Multiplicative decay factor. Default = 0.1.
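
For example, the following sketch constructs a scheduler that multiplies the learning rate by 0.1 every 30 epochs. The "30ep" time string follows Composer's Time syntax; passing the scheduler to the Trainer via its schedulers argument is shown only as a hedged illustration:

    from composer.optim import StepScheduler

    # Multiply the learning rate by gamma=0.1 every 30 epochs.
    scheduler = StepScheduler(step_size="30ep", gamma=0.1)

    # Typically supplied when constructing Composer's Trainer, e.g.:
    # trainer = Trainer(..., schedulers=scheduler, max_duration="90ep")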