# composer.algorithms.sam.sam#

SAM algorithm and optimizer class.

Classes

- SAM — Adds sharpness-aware minimization (Foret et al., 2020) by wrapping an existing optimizer with a SAMOptimizer.
- SAMOptimizer — Wraps an optimizer with sharpness-aware minimization (Foret et al., 2020).
class composer.algorithms.sam.sam.SAM(rho=0.05, epsilon=1e-12, interval=1)[source]#

Adds sharpness-aware minimization (Foret et al., 2020) by wrapping an existing optimizer with a SAMOptimizer. SAM can improve model generalization and provide robustness to label noise.

Runs on INIT.

Parameters
• rho (float, optional) – The neighborhood size parameter of SAM. Must be greater than 0. Default: 0.05.

• epsilon (float, optional) – A small value added to the gradient norm for numerical stability. Default: 1e-12.

• interval (int, optional) – SAM will run once every interval steps. A value of 1 causes SAM to run on every step. Steps on which SAM runs take roughly twice as long to complete. Default: 1.
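The rho and epsilon parameters map directly onto SAM's two-step update: first ascend to the approximate worst point within a rho-neighborhood of the current weights, then take the base-optimizer step using the gradient measured there. A minimal NumPy sketch of one such step, using plain SGD as the base optimizer (sam_update and the toy loss are illustrative names, not part of Composer's API):

```python
import numpy as np

def sam_update(w, grad_fn, lr=0.1, rho=0.05, epsilon=1e-12):
    """One simplified SAM step (after Foret et al., 2020)."""
    g = grad_fn(w)
    # Ascent direction: scaled gradient; epsilon guards against a zero norm.
    e = rho * g / (np.linalg.norm(g) + epsilon)
    # Gradient evaluated at the perturbed ("sharpest") point.
    g_sharp = grad_fn(w + e)
    # Base-optimizer (SGD) step, applied from the ORIGINAL weights.
    return w - lr * g_sharp

# Toy quadratic loss L(w) = 0.5 * ||w||^2, whose gradient is w itself.
w = np.array([1.0, -2.0])
w_new = sam_update(w, lambda w: w, lr=0.1)
```

Because the gradient is re-evaluated at the perturbed point, each SAM step costs roughly two forward/backward passes, which is why interval > 1 trades some of SAM's benefit for speed.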

Example

```python
from composer.algorithms import SAM
from composer.trainer import Trainer

algorithm = SAM(rho=0.05, epsilon=1.0e-12, interval=1)
trainer = Trainer(
    model=model,
    algorithms=[algorithm],
)
```
class composer.algorithms.sam.sam.SAMOptimizer[source]#

Bases: torch.optim.optimizer.Optimizer

Wraps an optimizer with sharpness-aware minimization (Foret et al., 2020). See SAM for details.