MultiStepScheduler
class composer.optim.MultiStepScheduler(milestones, gamma=0.1)
Decays the learning rate discretely at fixed milestones.
See also
This scheduler is based on MultiStepLR from PyTorch.

Decays the learning rate by a factor of gamma whenever a time milestone in milestones is reached.

Specifically, the learning rate multiplier $\alpha$ can be expressed as:

$$\alpha(t) = \gamma^{x}$$

where $x$ represents the number of milestones that have been reached, and $\gamma$ represents the multiplicative decay factor.
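For illustration, here is a minimal sketch showing a typical construction of this scheduler alongside a standalone helper that mirrors the decay rule above. The milestone values ('30ep', '60ep') and the lr_multiplier helper are illustrative assumptions, not part of Composer's API:

```python
import math

from composer.optim import MultiStepScheduler

# Decay the learning rate by 10x at epochs 30 and 60.
# Milestones are Composer time strings ('30ep' = 30 epochs); the
# scheduler would then typically be passed to the Trainer via its
# `schedulers` argument.
scheduler = MultiStepScheduler(milestones=['30ep', '60ep'], gamma=0.1)

# A hypothetical helper (not Composer's implementation) that mirrors
# the formula alpha(t) = gamma ** x, where x is the number of
# milestones already reached at time t.
def lr_multiplier(t, milestones=(30, 60), gamma=0.1):
    x = sum(1 for m in milestones if t >= m)
    return gamma ** x

assert lr_multiplier(0) == 1.0                # before any milestone
assert lr_multiplier(30) == 0.1               # first milestone reached
assert math.isclose(lr_multiplier(60), 0.01)  # both milestones reached
```

Note that the decay is discrete: the multiplier stays constant between milestones and drops by a factor of gamma each time one is crossed.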