ExponentialScheduler
- class composer.optim.ExponentialScheduler(gamma, decay_period='1ep')
Decays the learning rate exponentially.
See also
This scheduler is based on ExponentialLR from PyTorch.

Exponentially decays the learning rate such that it decays by a factor of gamma every decay_period of time.

Specifically, the learning rate multiplier \(\alpha\) can be expressed as:

\[\alpha(t) = \gamma ^ {t / \rho}\]

Where \(\rho\) represents the decay period, and \(\gamma\) represents the multiplicative decay factor.
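As a minimal usage sketch, the scheduler is typically passed to the Composer Trainer, which applies the multiplier at each optimization step. The model, dataloader, and optimizer below are assumed to be defined elsewhere, and the specific gamma, decay_period, and max_duration values are illustrative only:

```python
from composer import Trainer
from composer.optim import ExponentialScheduler

# Halve the learning rate once per epoch: alpha(t) = 0.5 ** (t / 1 epoch).
scheduler = ExponentialScheduler(gamma=0.5, decay_period='1ep')

# Hypothetical training setup; model, train_dataloader, and optimizer
# are assumed to be constructed elsewhere.
trainer = Trainer(
    model=model,
    train_dataloader=train_dataloader,
    optimizers=optimizer,
    schedulers=scheduler,
    max_duration='10ep',
)
trainer.fit()
```

With these example settings, the multiplier after three epochs would be \(0.5^3 = 0.125\) of the base learning rate.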