- composer.functional.apply_ghost_batchnorm(model, ghost_batch_size=32, optimizers=None)
Replace batch normalization modules with ghost batch normalization modules.
Ghost batch normalization modules split their input into chunks of
`ghost_batch_size` samples and run batch normalization on each chunk separately.
`dim=0` is assumed to be the sample axis.
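To illustrate the chunking behavior described above, here is a minimal NumPy sketch of ghost batch normalization. This is not the library's implementation: it omits the learnable scale/shift parameters and running statistics that a real batch-norm layer maintains, and the function name `ghost_batch_norm` is hypothetical.

```python
import numpy as np

def ghost_batch_norm(x, ghost_batch_size, eps=1e-5):
    """Normalize each chunk of `ghost_batch_size` samples independently.

    Simplified sketch: no affine parameters and no running statistics,
    unlike a real BatchNorm module.
    """
    chunks = []
    for start in range(0, len(x), ghost_batch_size):
        chunk = x[start:start + ghost_batch_size]
        mean = chunk.mean(axis=0)  # statistics computed per chunk,
        var = chunk.var(axis=0)    # not over the full batch
        chunks.append((chunk - mean) / np.sqrt(var + eps))
    return np.concatenate(chunks, axis=0)

x = np.random.randn(64, 8)                    # batch of 64 samples, 8 features
y = ghost_batch_norm(x, ghost_batch_size=32)  # normalized as two chunks of 32
```

Each chunk ends up with (approximately) zero mean and unit variance computed from its own samples, which adds noise to the batch statistics relative to full-batch normalization.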
Parameters:
- model (Module) – The model to modify in-place.
- ghost_batch_size (int, optional) – Size of sub-batches to normalize over. Default: 32.
- optimizers (Optimizer | Sequence[Optimizer], optional) – Existing optimizers bound to model.parameters(). All optimizers that have already been constructed with model.parameters() must be specified here so that they will optimize the correct parameters. If the optimizer(s) are constructed after calling this function, it is safe to omit this parameter; those optimizers will see the correct model parameters.

Returns:
The modified model.
```python
import composer.functional as cf
from torchvision import models

model = models.resnet50()
cf.apply_ghost_batchnorm(model)
```