composer.optim.optimizer_hparams#
Hyperparameters for optimizers.
Hparams
These classes are used with yahp for YAML-based configuration.
- AdamHparams – Hyperparameters for the Adam optimizer.
- AdamWHparams – Hyperparameters for the AdamW optimizer.
- DecoupledAdamWHparams – Hyperparameters for the DecoupledAdamW optimizer.
- DecoupledSGDWHparams – Hyperparameters for the DecoupledSGDW optimizer.
- OptimizerHparams – Base class for optimizer hyperparameter classes.
- RAdamHparams – Hyperparameters for the RAdam optimizer.
- RMSpropHparams – Hyperparameters for the RMSprop optimizer.
- SGDHparams – Hyperparameters for the SGD optimizer.
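For example, a config-driven workflow might look like the following. This is a minimal sketch, assuming yahp's Hparams.create classmethod accepts a data dict and a cli_args flag; the field names come from the class signatures documented below.

```python
from composer.optim.optimizer_hparams import AdamWHparams

# Equivalent to a YAML snippet such as:
#   lr: 1.0e-4
#   weight_decay: 0.01
hparams = AdamWHparams.create(
    data={"lr": 1.0e-4, "weight_decay": 0.01},
    cli_args=False,  # do not parse sys.argv
)
print(hparams.lr)  # 0.0001
```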
- class composer.optim.optimizer_hparams.AdamHparams(lr=0.001, betas=<factory>, eps=1e-08, weight_decay=0.0, amsgrad=False)[source]#
Bases: composer.optim.optimizer_hparams.OptimizerHparams
Hyperparameters for the Adam optimizer. See Adam for documentation.
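A minimal construction sketch; the initialize_object(param_group=...) call is an assumption about the shared hparams interface (it is not documented on this page) and would return the wrapped torch.optim.Adam instance:

```python
import torch

from composer.optim.optimizer_hparams import AdamHparams

model = torch.nn.Linear(8, 2)  # placeholder model
hparams = AdamHparams(lr=1e-3, weight_decay=0.0, amsgrad=False)
# Assumed helper on the shared OptimizerHparams interface:
optimizer = hparams.initialize_object(param_group=model.parameters())
```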
- class composer.optim.optimizer_hparams.AdamWHparams(lr=0.001, betas=<factory>, eps=1e-08, weight_decay=0.01, amsgrad=False)[source]#
Bases: composer.optim.optimizer_hparams.OptimizerHparams
Hyperparameters for the AdamW optimizer. See AdamW for documentation.
- class composer.optim.optimizer_hparams.DecoupledAdamWHparams(lr=0.001, betas=<factory>, eps=1e-08, weight_decay=0.01, amsgrad=False)[source]#
Bases: composer.optim.optimizer_hparams.OptimizerHparams
Hyperparameters for the DecoupledAdamW optimizer. See DecoupledAdamW for documentation.
- Parameters
  - lr (float, optional) – See DecoupledAdamW.
  - betas (float, optional) – See DecoupledAdamW.
  - eps (float, optional) – See DecoupledAdamW.
  - weight_decay (float, optional) – See DecoupledAdamW.
  - amsgrad (bool, optional) – See DecoupledAdamW.
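Note that, as with AdamWHparams above, weight_decay defaults to 0.01 here rather than 0.0; DecoupledAdamW applies that decay decoupled from the learning rate, which is its difference from the stock AdamW update.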
- class composer.optim.optimizer_hparams.DecoupledSGDWHparams(lr, momentum=0.0, weight_decay=0.0, dampening=0.0, nesterov=False)[source]#
Bases: composer.optim.optimizer_hparams.OptimizerHparams
Hyperparameters for the DecoupledSGDW optimizer. See DecoupledSGDW for documentation.
- Parameters
  - lr (float) – See DecoupledSGDW.
  - momentum (float, optional) – See DecoupledSGDW.
  - weight_decay (float, optional) – See DecoupledSGDW.
  - dampening (float, optional) – See DecoupledSGDW.
  - nesterov (bool, optional) – See DecoupledSGDW.
- class composer.optim.optimizer_hparams.OptimizerHparams[source]#
Bases: yahp.hparams.Hparams, abc.ABC
Base class for optimizer hyperparameter classes.
Optimizer parameters that are added to TrainerHparams (e.g. via YAML or the CLI) are initialized in the training loop.
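The built-in classes on this page all follow the same shape: a dataclass of optimizer arguments plus a pointer to the optimizer class being wrapped. A hypothetical subclass, sketched for illustration only (the hp.optional field helper is yahp's; the optimizer_object hook name is an assumption, not a documented API):

```python
from dataclasses import dataclass
from typing import Type

import torch
import yahp as hp

from composer.optim.optimizer_hparams import OptimizerHparams


@dataclass
class AdagradHparams(OptimizerHparams):
    """Hypothetical hyperparameters for the torch.optim.Adagrad optimizer."""

    lr: float = hp.optional(doc="learning rate", default=0.01)
    weight_decay: float = hp.optional(doc="weight decay", default=0.0)

    @property
    def optimizer_object(self) -> Type[torch.optim.Optimizer]:
        # Assumed hook: the optimizer class this hparams dataclass wraps.
        return torch.optim.Adagrad
```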
- class composer.optim.optimizer_hparams.RAdamHparams(lr=0.001, betas=<factory>, eps=1e-08, weight_decay=0.0)[source]#
Bases: composer.optim.optimizer_hparams.OptimizerHparams
Hyperparameters for the RAdam optimizer. See RAdam for documentation.
- class composer.optim.optimizer_hparams.RMSpropHparams(lr, alpha=0.99, eps=1e-08, momentum=0.0, weight_decay=0.0, centered=False)[source]#
Bases: composer.optim.optimizer_hparams.OptimizerHparams
Hyperparameters for the RMSprop optimizer. See RMSprop for documentation.
- class composer.optim.optimizer_hparams.SGDHparams(lr, momentum=0.0, weight_decay=0.0, dampening=0.0, nesterov=False)[source]#
Bases: composer.optim.optimizer_hparams.OptimizerHparams
Hyperparameters for the SGD optimizer. See SGD for documentation.
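Unlike the Adam-family classes, lr has no default for SGDHparams (nor for RMSpropHparams or DecoupledSGDWHparams above), so it must be supplied explicitly, whether in Python or in the YAML config. A minimal sketch:

```python
from composer.optim.optimizer_hparams import SGDHparams

# lr is a required field, so SGDHparams() with no arguments raises a TypeError.
hparams = SGDHparams(lr=0.1, momentum=0.9, nesterov=True)
```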