# ExponentialScheduler

`class composer.optim.ExponentialScheduler(gamma, decay_period='1ep')`

Decays the learning rate exponentially.

This scheduler is based on `ExponentialLR` from PyTorch.

The learning rate decays by a factor of `gamma` every `decay_period` of training time.

Specifically, the learning rate multiplier $\alpha$ can be expressed as:

$$\alpha(t) = \gamma^{t / \rho}$$

where $\rho$ represents the decay period and $\gamma$ represents the multiplicative decay factor.
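The multiplier formula above can be sketched in plain Python. This is a minimal illustration of the math, not the library's implementation: time is expressed here as a plain float number of time units, whereas the real scheduler works with Composer's `Time` abstraction.

```python
def exponential_multiplier(t: float, gamma: float, decay_period: float = 1.0) -> float:
    """Return the learning-rate multiplier alpha(t) = gamma ** (t / decay_period)."""
    return gamma ** (t / decay_period)

# With gamma=0.5 and a decay period of 1 epoch, the multiplier halves
# each epoch: 1.0 at t=0, 0.5 at t=1, 0.25 at t=2.
print(exponential_multiplier(2.0, gamma=0.5))  # → 0.25
```

Note that the decay is continuous: at fractional multiples of the decay period the multiplier takes intermediate values (for example, $\gamma^{0.5}$ after half a decay period), rather than stepping only at period boundaries.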

Parameters
• gamma (float) – Multiplicative decay factor.

• decay_period (str | Time, optional) – Decay period. Default: "1ep".