LinearScheduler
class composer.optim.LinearScheduler(alpha_i=1.0, alpha_f=0.0, t_max='1dur')
Adjusts the learning rate linearly.
See also
This scheduler is based on LinearLR from PyTorch.

Warning
Note that the defaults for this scheduler differ from the defaults for LinearLR. The PyTorch scheduler, by default, linearly increases the learning rate multiplier from 1.0 / 3 to 1.0, whereas this implementation, by default, linearly decreases the multiplier from 1.0 to 0.0.
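To make the contrast concrete, here is a minimal sketch constructing both schedulers with their respective defaults. The model and optimizer are placeholders, and the PyTorch defaults shown are those documented for torch.optim.lr_scheduler.LinearLR:

```python
import torch
from composer.optim import LinearScheduler

model = torch.nn.Linear(4, 4)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# PyTorch default: the multiplier ramps *up* from 1/3 to 1.0
# over `total_iters` steps.
pytorch_sched = torch.optim.lr_scheduler.LinearLR(
    optimizer, start_factor=1.0 / 3, end_factor=1.0, total_iters=5
)

# Composer default: the multiplier decays *down* from 1.0 to 0.0
# over the full training duration ('1dur').
composer_sched = LinearScheduler(alpha_i=1.0, alpha_f=0.0, t_max='1dur')
```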
Linearly adjusts the learning rate multiplier from alpha_i to alpha_f over t_max time. Specifically, the learning rate multiplier \(\alpha\) can be expressed as:
\[\alpha(t) = \alpha_i + (\alpha_f - \alpha_i) \times \tau \]

Given \(\tau\), the fraction of time elapsed (clipped to the interval \([0, 1]\)), as:
\[\tau = t / t_{max} \]

where \(\alpha_i\) represents the initial learning rate multiplier, \(\alpha_f\) represents the learning rate multiplier to decay to, and \(t_{max}\) represents the duration of this scheduler.
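As a minimal sketch of the formula above in plain Python (independent of Composer; the function name and time units are hypothetical, chosen only for illustration):

```python
def linear_multiplier(t: float, t_max: float,
                      alpha_i: float = 1.0, alpha_f: float = 0.0) -> float:
    """Compute alpha(t) = alpha_i + (alpha_f - alpha_i) * tau."""
    # Fraction of time elapsed, clipped to the interval [0, 1].
    tau = min(max(t / t_max, 0.0), 1.0)
    return alpha_i + (alpha_f - alpha_i) * tau

# With the defaults (alpha_i=1.0, alpha_f=0.0) the multiplier falls linearly:
assert linear_multiplier(0, 100) == 1.0    # start of training
assert linear_multiplier(50, 100) == 0.5   # halfway through
assert linear_multiplier(100, 100) == 0.0  # end of training
```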