apply_ghost_batchnorm

composer.functional.apply_ghost_batchnorm(model, ghost_batch_size=32, optimizers=None)

Replace batch normalization modules with ghost batch normalization modules.

Ghost batch normalization modules split their input into chunks of ghost_batch_size samples and run batch normalization on each chunk separately. dim=0 is assumed to be the sample axis.
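A minimal sketch of the idea is shown below. It is an illustration only, not Composer's implementation; the class name NaiveGhostBatchNorm2d and the reuse of a single BatchNorm2d across chunks are assumptions made for brevity.

import torch
import torch.nn as nn

class NaiveGhostBatchNorm2d(nn.Module):
    # Illustrative only: reuses one BatchNorm2d over chunks of the input,
    # so in training mode each chunk is normalized with its own batch statistics.
    def __init__(self, num_features, ghost_batch_size=32):
        super().__init__()
        self.ghost_batch_size = ghost_batch_size
        self.bn = nn.BatchNorm2d(num_features)

    def forward(self, x):
        # dim=0 is the sample axis; each chunk holds at most ghost_batch_size samples.
        chunks = x.split(self.ghost_batch_size, dim=0)
        return torch.cat([self.bn(chunk) for chunk in chunks], dim=0)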

Parameters
  • model (Module) – The model to modify in-place.

  • ghost_batch_size (int, optional) – Size of sub-batches to normalize over. Default: 32.

  • optimizers (Optimizer | Sequence[Optimizer], optional) –

    Existing optimizers bound to model.parameters(). Any optimizer that was already constructed from model.parameters() must be passed here so that it optimizes the correct parameters after the replacement (see the second example below).

    If the optimizers are constructed after calling this function, it is safe to omit this parameter; they will see the correct model parameters.

Returns

The number of modules modified.

Example

import composer.functional as cf
from torchvision import models
model = models.resnet50()
# Replace every BatchNorm module in the model with ghost batch normalization.
cf.apply_ghost_batchnorm(model)
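
If an optimizer has already been constructed from model.parameters(), pass it via optimizers so that it tracks the parameters of the replacement modules. A minimal sketch (the SGD hyperparameters and ghost_batch_size value are illustrative):

import composer.functional as cf
import torch
from torchvision import models

model = models.resnet50()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
# Pass the existing optimizer so it optimizes the replacement modules' parameters.
cf.apply_ghost_batchnorm(model, ghost_batch_size=16, optimizers=opt)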