composer.algorithms.sam.sam#
Functions

- Converts …

Classes

- Base class for algorithms.
- Enum to represent events in the training loop.
- An interface to record training data.
- Adds sharpness-aware minimization (Foret et al., 2020) by wrapping an existing optimizer with a SAMOptimizer.
- Wraps an optimizer with sharpness-aware minimization (Foret et al., 2020).
- The state of the trainer.
Attributes

- Optional
- annotations
- log
- class composer.algorithms.sam.sam.SAM(rho=0.05, epsilon=1e-12, interval=1)[source]#

  Bases: composer.core.algorithm.Algorithm

  Adds sharpness-aware minimization (Foret et al., 2020) by wrapping an existing optimizer with a SAMOptimizer.

  - Parameters
    - rho (float, optional) – The neighborhood size parameter of SAM. Must be greater than 0. Default: 0.05.
    - epsilon (float, optional) – A small value added to the gradient norm for numerical stability. Default: 1e-12.
    - interval (int, optional) – SAM will run once per interval steps. A value of 1 will cause SAM to run every step. Steps on which SAM runs take roughly twice as long to complete. Default: 1.
- class composer.algorithms.sam.sam.SAMOptimizer(base_optimizer, rho=0.05, epsilon=1e-12, interval=1, **kwargs)[source]#

  Bases: torch.optim.optimizer.Optimizer

  Wraps an optimizer with sharpness-aware minimization (Foret et al., 2020). See SAM for details.

  Implementation based on https://github.com/davda54/sam
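The two-phase update that a SAM-style optimizer performs can be sketched in plain Python. This is an illustrative sketch only, not the composer or davda54/sam implementation: `sam_step`, `grad_fn`, and the toy quadratic loss below are hypothetical names introduced for this example, and plain SGD stands in for the wrapped base optimizer.

```python
import math

def sam_step(params, grad_fn, lr=0.1, rho=0.05, epsilon=1e-12):
    # First step: move to the approximate worst-case point in the
    # rho-neighborhood, i.e. perturb by rho * g / (||g|| + epsilon).
    g = grad_fn(params)
    norm = math.sqrt(sum(gi * gi for gi in g)) + epsilon
    perturbed = [p + rho * gi / norm for p, gi in zip(params, g)]
    # Second step: evaluate the gradient at the perturbed point, but
    # apply the (here: plain SGD) update to the ORIGINAL parameters.
    g_sharp = grad_fn(perturbed)
    return [p - lr * gi for p, gi in zip(params, g_sharp)]

# Toy objective f(x, y) = x^2 + y^2 with gradient (2x, 2y).
grad = lambda p: [2 * p[0], 2 * p[1]]
w = [1.0, 1.0]
for _ in range(50):
    w = sam_step(w, grad)
# w is now close to the minimizer at the origin.
```

Note that each SAM step computes two gradients (one at the perturbed point, one to find it), which is why the docstring above warns that steps on which SAM runs take roughly twice as long.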