StepScheduler
- class composer.optim.StepScheduler(step_size, gamma=0.1)
Decays the learning rate discretely at fixed intervals.
See also

This scheduler is based on StepLR from PyTorch.

Decays the learning rate by a factor of gamma periodically, with a frequency determined by step_size.

Specifically, the learning rate multiplier \(\alpha\) can be expressed as:

\[\alpha(t) = \gamma ^ {\text{floor}(t / \rho)}\]

where \(\rho\) represents the time between changes to the learning rate (the step size), and \(\gamma\) represents the multiplicative decay factor.
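To make the decay schedule concrete, here is a minimal sketch that evaluates the multiplier formula directly; the helper name step_multiplier and the standalone loop are illustrative assumptions, not part of Composer's API.

```python
import math

def step_multiplier(t: float, step_size: float, gamma: float = 0.1) -> float:
    # alpha(t) = gamma ** floor(t / rho), where rho = step_size.
    # Assumes t and step_size are expressed in the same time unit
    # (e.g. both in epochs, or both in batches).
    return gamma ** math.floor(t / step_size)

# With step_size=2 and gamma=0.1, the multiplier drops by 10x every
# 2 time units: 1.0, 1.0, 0.1, 0.1, 0.01, 0.01, ...
for t in range(6):
    print(t, step_multiplier(t, step_size=2, gamma=0.1))
```

In Composer itself, the scheduler would typically be constructed with the signature shown above, e.g. StepScheduler(step_size='2ep', gamma=0.1), and passed to the Trainer; the '2ep' time-string form (2 epochs) is an assumption based on Composer's Time conventions.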