LinearScheduler#
- class composer.optim.LinearScheduler(alpha_i=1.0, alpha_f=0.0, t_max='1dur')[source]#
Adjusts the learning rate linearly.
See also
This scheduler is based on LinearLR from PyTorch.

Warning
Note that the defaults for this scheduler differ from the defaults for LinearLR. The PyTorch scheduler, by default, linearly increases the learning rate multiplier from 1.0 / 3 to 1.0, whereas this implementation, by default, linearly decreases the multiplier from 1.0 to 0.0.

Linearly adjusts the learning rate multiplier from alpha_i to alpha_f over t_max time.

Specifically, the learning rate multiplier α can be expressed as:

α(t) = α_i + (α_f − α_i) × τ

Given τ, the fraction of time elapsed (clipped to the interval [0, 1]), as:

τ = t / t_max

Where α_i represents the initial learning rate multiplier, α_f represents the learning rate multiplier to decay to, and t_max represents the duration of this scheduler.
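The multiplier formula above can be sketched as a small standalone function. This is an illustration of the math only, not the scheduler's actual implementation; the function name `linear_multiplier` and its plain-float arguments are hypothetical (the real scheduler accepts Composer `Time` strings such as `'1dur'` for `t_max`).

```python
def linear_multiplier(t: float, t_max: float,
                      alpha_i: float = 1.0, alpha_f: float = 0.0) -> float:
    """Hypothetical sketch of the linear LR multiplier.

    tau is the fraction of time elapsed, clipped to [0, 1];
    the multiplier moves linearly from alpha_i to alpha_f.
    """
    tau = min(max(t / t_max, 0.0), 1.0)  # clip elapsed fraction to [0, 1]
    return alpha_i + (alpha_f - alpha_i) * tau

# With the scheduler's defaults (alpha_i=1.0, alpha_f=0.0), the
# multiplier decays from 1.0 at t=0 to 0.0 at t=t_max.
```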