ConstantScheduler#

class composer.optim.ConstantScheduler(alpha=1.0, t_max='1dur')[source]#

Maintains a fixed learning rate.

This scheduler is based on ConstantLR from PyTorch.

The default settings for this scheduler simply maintain a learning rate factor of 1 for the entire training duration. However, both the factor and the duration of this scheduler can be configured.

Specifically, the learning rate multiplier \(\alpha\) can be expressed as:

\[\alpha(t) = \begin{cases} \alpha, & \text{if } t < t_{max} \\ 1.0, & \text{otherwise} \end{cases} \]

Where \(\alpha\) represents the learning rate multiplier to maintain while this scheduler is active, and \(t_{max}\) represents the duration of this scheduler.

Parameters
  • alpha (float) – Learning rate multiplier to maintain while this scheduler is active. Default = 1.0.

  • t_max (str | Time) – Duration of this scheduler. Default = "1dur".
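The piecewise formula above can be sketched as a small standalone function. This is a hypothetical re-implementation of the multiplier logic only, not the actual `composer.optim.ConstantScheduler` class; it assumes `t` and `t_max` are both expressed as fractions of the total training duration (the real scheduler accepts `Time` strings such as `"1dur"`).

```python
def constant_multiplier(t: float, alpha: float = 1.0, t_max: float = 1.0) -> float:
    """Learning-rate multiplier at training progress t (fraction of total duration).

    Hypothetical sketch of ConstantScheduler's formula:
    alpha while t < t_max, then 1.0 for the remainder of training.
    """
    return alpha if t < t_max else 1.0

# With the defaults (alpha=1.0, t_max covering the whole run), the
# multiplier is 1.0 everywhere, i.e. the base learning rate is unchanged.
```

For example, `constant_multiplier(t, alpha=0.5, t_max=0.5)` would halve the base learning rate for the first half of training and restore it afterward.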