# composer.algorithms.ghost_batchnorm.ghost_batchnorm#


Functions

• apply_ghost_batchnorm — Replace batch normalization modules with ghost batch normalization modules.

Classes

• Algorithm — Base class for algorithms.

• Event — Enum to represent training loop events.

• GhostBatchNorm — Replaces batch normalization modules with Ghost Batch Normalization modules that simulate the effect of using a smaller batch size.

• GhostBatchNorm1d

• GhostBatchNorm2d

• GhostBatchNorm3d

• Logger — An interface to record training data.

• Optimizer — Base class for all optimizers.

• State — The state of the trainer.


class composer.algorithms.ghost_batchnorm.ghost_batchnorm.GhostBatchNorm(ghost_batch_size=32)[source]#

Replaces batch normalization modules with Ghost Batch Normalization modules that simulate the effect of using a smaller batch size.

Works by splitting the input into chunks of ghost_batch_size samples and running batch normalization on each chunk separately. dim=0 is assumed to be the sample axis.

Runs on INIT.

Parameters

ghost_batch_size (int, optional) – Size of sub-batches to normalize over. Default: 32.
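To illustrate the chunking behavior described above, here is a minimal pure-Python sketch (not Composer's implementation, which operates on torch tensors): the batch is split along dim=0 into chunks of ghost_batch_size samples, and each chunk is normalized with its own mean and variance.

```python
import math

def ghost_batchnorm_sketch(batch, ghost_batch_size=32, eps=1e-5):
    """Illustrative sketch only: normalize each chunk of `ghost_batch_size`
    scalar samples independently, using per-chunk mean and variance."""
    out = []
    for start in range(0, len(batch), ghost_batch_size):
        chunk = batch[start:start + ghost_batch_size]
        mean = sum(chunk) / len(chunk)
        var = sum((x - mean) ** 2 for x in chunk) / len(chunk)
        out.extend((x - mean) / math.sqrt(var + eps) for x in chunk)
    return out

# With ghost_batch_size=2, samples [1, 2] and [3, 4] are normalized
# separately, each chunk ending up with zero mean and unit variance.
result = ghost_batchnorm_sketch([1.0, 2.0, 3.0, 4.0], ghost_batch_size=2)
```

Because each ghost batch uses its own statistics, the normalization noise matches that of training with the smaller batch size, which is the regularization effect the algorithm targets.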

composer.algorithms.ghost_batchnorm.ghost_batchnorm.apply_ghost_batchnorm(model, ghost_batch_size=32, optimizers=None)[source]#

Replace batch normalization modules with ghost batch normalization modules.

Ghost batch normalization modules split their input into chunks of ghost_batch_size samples and run batch normalization on each chunk separately. dim=0 is assumed to be the sample axis.

Parameters
• model (Module) – The model to modify in-place.

• ghost_batch_size (int, optional) – Size of sub-batches to normalize over. Default: 32.

• optimizers (Optimizer | Sequence[Optimizer], optional) –

Existing optimizers bound to model.parameters(). Any optimizer already constructed with model.parameters() must be specified here so that it continues to optimize the correct parameters after the replacement.

If the optimizer(s) are constructed after calling this function, it is safe to omit this parameter; they will see the correct model parameters.
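The reason pre-constructed optimizers must be passed in can be seen with a toy sketch. ToyModule and ToyOptimizer below are hypothetical stand-ins, not Composer or PyTorch classes: module surgery replaces parameter objects, so an optimizer built beforehand would otherwise keep updating stale parameters the model no longer uses.

```python
class ToyModule:
    """Hypothetical stand-in for an nn.Module with one parameter."""
    def __init__(self):
        self.weight = [0.0]  # stands in for an nn.Parameter
    def parameters(self):
        return [self.weight]

class ToyOptimizer:
    """Hypothetical stand-in for a torch.optim.Optimizer."""
    def __init__(self, params):
        self.param_groups = [{"params": list(params)}]

model = ToyModule()
opt = ToyOptimizer(model.parameters())  # constructed BEFORE surgery

# "Surgery": replace the parameter object, as module replacement does.
old_weight = model.weight
model.weight = [0.0]  # a brand-new object

# The optimizer still holds the stale object, not the model's new one:
assert opt.param_groups[0]["params"][0] is old_weight
assert opt.param_groups[0]["params"][0] is not model.weight
```

Passing the optimizer to the surgery function lets it swap the stale references in param_groups for the new parameters, which is why already-constructed optimizers must be listed in the optimizers argument.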

Returns

The modified model.

Example

```python
import composer.functional as cf
from torchvision import models

model = models.resnet50()
cf.apply_ghost_batchnorm(model)
```