composer.algorithms.functional.scale_scheduler
- composer.algorithms.functional.scale_scheduler(scheduler: torch.optim.lr_scheduler._LRScheduler, ssr: float, orig_max_epochs: Optional[int] = None)[source]
Makes a learning rate schedule take a different number of epochs.
See `ScaleSchedule` for more information.
- Parameters
scheduler –
A learning rate schedule object. Must be one of:
- torch.optim.lr_scheduler.CosineAnnealingLR
- torch.optim.lr_scheduler.CosineAnnealingWarmRestarts
- torch.optim.lr_scheduler.ExponentialLR
- torch.optim.lr_scheduler.MultiStepLR
- torch.optim.lr_scheduler.StepLR
ssr – the factor by which to scale the duration of the schedule. For example, 0.5 makes the schedule take half as many epochs and 2.0 makes it take twice as many epochs.
orig_max_epochs – the current number of epochs spanned by `scheduler`. Used along with `ssr` to determine the new number of epochs `scheduler` should span.
- Raises
ValueError – If `scheduler` is not an instance of one of the above types.
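To illustrate the idea behind scale-schedule ratios, the sketch below shows what scaling a MultiStepLR-style schedule by `ssr` amounts to: the epoch milestones and the total epoch budget are each multiplied by the same factor. The helper `scale_milestones` is hypothetical, for illustration only, and is not Composer's actual implementation.

```python
# Hypothetical sketch of schedule scaling -- NOT Composer's actual code.
# For a MultiStepLR-style schedule, scaling by ``ssr`` multiplies each
# epoch milestone (and the total epoch count) by the same factor.

def scale_milestones(milestones, orig_max_epochs, ssr):
    """Return scaled (milestones, max_epochs) for a scale-schedule ratio ``ssr``."""
    if ssr <= 0:
        raise ValueError("ssr must be positive")
    scaled_milestones = [int(round(m * ssr)) for m in milestones]
    scaled_max_epochs = int(round(orig_max_epochs * ssr))
    return scaled_milestones, scaled_max_epochs

# ssr=0.5 halves the schedule's duration: a 60-epoch run with LR drops
# at epochs 30 and 45 becomes a 30-epoch run with correspondingly
# earlier drops.
print(scale_milestones([30, 45], 60, 0.5))
```

With `ssr=2.0` the same helper stretches the schedule to twice as many epochs, which matches the behavior described for the `ssr` parameter above.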