MultiStepScheduler#
- class composer.optim.MultiStepScheduler(milestones, gamma=0.1)[source]#
Decays the learning rate discretely at fixed milestones.
See also

This scheduler is based on MultiStepLR from PyTorch.

Decays the learning rate by a factor of gamma whenever a time milestone in milestones is reached.

Specifically, the learning rate multiplier \(\alpha\) can be expressed as:
\[\alpha(t) = \gamma ^ x\]

Where \(x\) represents the number of milestones that have been reached, and \(\gamma\) represents the multiplicative decay factor.
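As an illustration of the formula above, the sketch below first reimplements the multiplier \(\alpha(t) = \gamma^x\) in plain Python, then constructs the scheduler itself. The milestone values used here ('10ep' meaning 10 epochs, '30ep', and the integer epoch timeline) are assumptions chosen for the example, not part of the class definition.

```python
from composer.optim import MultiStepScheduler

# Standalone sketch of the multiplier alpha(t) = gamma ** x, where x is the
# number of milestones already reached at time t (t measured in epochs here).
def alpha(t, milestones, gamma=0.1):
    x = sum(1 for m in milestones if t >= m)  # milestones reached so far
    return gamma ** x

milestones = [10, 30]  # assumed milestone epochs, for illustration only
for epoch in (0, 9, 10, 29, 30, 50):
    print(f"epoch {epoch:2d}: alpha = {alpha(epoch, milestones)}")
# epoch  0: alpha = 1.0   (no milestone reached yet)
# epoch 10: alpha = 0.1   (first milestone reached)
# epoch 30: alpha ~ 0.01  (second milestone reached)

# Constructing the scheduler itself; milestones are given as Composer
# time strings such as '10ep' (10 epochs).
scheduler = MultiStepScheduler(milestones=['10ep', '30ep'], gamma=0.1)
```

In a full training setup, the scheduler object would typically be passed to Composer's Trainer via its schedulers argument rather than evaluated by hand as above.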