RandAugment#

class composer.algorithms.RandAugment(severity=9, depth=2, augmentation_set='all')[source]#

Randomly applies a sequence of image data augmentations to an image.

This algorithm (Cubuk et al., 2019) runs on Event.INIT to insert a dataset transformation. It is a no-op if the algorithm has already been applied to State.train_dataloader.dataset.

See the Method Card for more details.

Example

from composer.algorithms import RandAugment
from composer.trainer import Trainer

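# model, train_dataloader, eval_dataloader, and optimizer are assumed to be
# defined elsewhere (e.g., a ComposerModel and torch DataLoaders).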
randaugment_algorithm = RandAugment(
    severity=9,
    depth=2,
    augmentation_set="all"
)
trainer = Trainer(
    model=model,
    train_dataloader=train_dataloader,
    eval_dataloader=eval_dataloader,
    max_duration="1ep",
    algorithms=[randaugment_algorithm],
    optimizers=[optimizer]
)
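
The algorithm form above lets the Trainer insert the augmentation into the training dataset automatically. If you would rather attach it to a dataset yourself, Composer also exposes a torchvision-compatible transform; the sketch below assumes it is importable as RandAugmentTransform from composer.algorithms and accepts the same severity, depth, and augmentation_set arguments (check your Composer version).

from torchvision import datasets, transforms

from composer.algorithms import RandAugmentTransform

randaugment_transform = RandAugmentTransform(
    severity=9,
    depth=2,
    augmentation_set="all"
)

# Compose with the usual preprocessing and build the dataset as normal.
composed_transform = transforms.Compose([
    randaugment_transform,
    transforms.ToTensor(),
])
train_dataset = datasets.CIFAR10(
    root="data",
    train=True,
    download=True,
    transform=composed_transform,
)
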
Parameters
  • severity (int, optional) – Severity of augmentation operators (between 1 and 10). M in the original paper. Default: 9.

  • depth (int, optional) – Depth of augmentation chain. N in the original paper. Default: 2.

  • augmentation_set (str, optional) –

    Must be one of the following options, as described in augmentation_primitives.augmentation_sets:

    • "all"

      Uses all augmentations from the paper.

    • "safe"

      Like "all", but excludes transforms that are part of the ImageNet-C/CIFAR10-C test sets

    • "original"

      Like "all", but some of the implementations are identical to the original Github repository, which contains implementation specificities for the augmentations "color", "contrast", "sharpness", and "brightness". The original implementations have an intensity sampling scheme that samples a value bounded by 0.118 at a minimum, and a maximum value of \(intensity \times 0.18 + .1\), which ranges from 0.28 (intensity = 1) to 1.9 (intensity 10). These augmentations have different effects depending on whether they are < 0 or > 0 (or < 1 or > 1). "all" uses implementations of "color", "contrast", "sharpness", and "brightness" that account for diverging effects around 0 (or 1).

    Default: "all".