ConstantWithWarmupScheduler
class composer.optim.ConstantWithWarmupScheduler(t_warmup, alpha=1.0, t_max='1dur', scale_warmup=False)
Maintains a fixed learning rate, with an initial warmup.
This scheduler is based on ConstantLR from PyTorch, with an added warmup.

Starts with a linear warmup over t_warmup time, then simply maintains a learning rate factor of 1 for the entire training duration. However, both the factor and the duration of this scheduler can be configured.

Specifically, the learning rate multiplier α can be expressed as:

$$
\alpha(t) = \begin{cases}
    t / t_{warmup}, & \text{if } t < t_{warmup} \\
    \alpha, & \text{if } t_{warmup} \leq t < t_{max} \\
    1.0, & \text{otherwise}
\end{cases}
$$

Where α represents the learning rate multiplier to maintain while this scheduler is active, and t_max represents the duration of this scheduler.
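To make the piecewise definition concrete, here is a minimal plain-Python sketch of the multiplier; this is illustrative only, not the library's implementation, and it assumes t, t_warmup, and t_max are given in the same time units:

```python
def constant_with_warmup_multiplier(t: float, t_warmup: float,
                                    t_max: float, alpha: float = 1.0) -> float:
    """Illustrative sketch of the multiplier alpha(t) defined above."""
    if t < t_warmup:
        return t / t_warmup  # linear warmup
    if t < t_max:
        return alpha         # constant factor while the scheduler is active
    return 1.0               # after t_max, the multiplier reverts to 1

# With a 100-step warmup and a 1000-step duration:
assert constant_with_warmup_multiplier(50, 100, 1000) == 0.5
assert constant_with_warmup_multiplier(500, 100, 1000) == 1.0
```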
Warning
By default, initial warmup time is not scaled according to any provided scale schedule ratio. To change this behavior, set scale_warmup=True.
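As a usage sketch, the scheduler can be passed to Composer's Trainer via its schedulers argument; my_model and my_train_dataloader below are hypothetical placeholders for your own model and dataloader:

```python
from composer import Trainer
from composer.optim import ConstantWithWarmupScheduler

# Warm up linearly over the first epoch, then hold the learning
# rate constant for the remainder of training.
scheduler = ConstantWithWarmupScheduler(t_warmup='1ep')

trainer = Trainer(
    model=my_model,                        # hypothetical placeholder: a ComposerModel
    train_dataloader=my_train_dataloader,  # hypothetical placeholder: a DataLoader
    schedulers=scheduler,
    max_duration='10ep',
)
trainer.fit()
```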