StepScheduler
class composer.optim.StepScheduler(step_size, gamma=0.1)
Decays the learning rate discretely at fixed intervals.
See also
This scheduler is based on StepLR from PyTorch.

Decays the learning rate by a factor of gamma periodically, with a frequency determined by step_size.
Specifically, the learning rate multiplier can be expressed as:

\alpha(t) = \gamma^{\lfloor t / \rho \rfloor}

Where \rho represents the time between changes to the learning rate (the step size), and \gamma represents the multiplicative decay factor.
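The decay formula can be illustrated with a minimal sketch in plain Python (this computes the multiplier directly; it is not the library's implementation, and the helper name `step_multiplier` is hypothetical):

```python
import math

def step_multiplier(t, step_size, gamma=0.1):
    # alpha(t) = gamma ** floor(t / step_size)
    # t: current time (e.g. epoch index), step_size: interval between decays
    return gamma ** math.floor(t / step_size)

# With step_size=2 and the default gamma=0.1, the multiplier stays at 1.0
# for the first two steps, then drops by a factor of 10 every two steps.
base_lr = 1.0
for epoch in range(6):
    lr = base_lr * step_multiplier(epoch, step_size=2)
    print(f"epoch {epoch}: lr multiplier = {lr:.4f}")
```

Each discrete drop happens only at multiples of step_size, which is what makes the decay "discrete at fixed intervals" rather than continuous.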