- class composer.Algorithm(*args, **kwargs)
Base class for algorithms.
Algorithms are pieces of code which run at specific events (see Event) in the training loop. Algorithms modify the trainer's State, generally with the effect of improving the model's quality or increasing the efficiency and throughput of the training loop.
- Algorithms must implement the following two methods: match() and apply(). A minimal end-to-end sketch appears at the end of this section.
- abstract apply(event, state, logger)
Applies the algorithm to make an in-place change to the State.
Can optionally return an exit code to be stored in a Trace. This exit code is made accessible for debugging.
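As an illustration, a minimal sketch of an apply() override. The algorithm name (DisableDropout), the choice of Event.INIT, and the dropout-zeroing behavior are all hypothetical, used only to show an in-place change to the State; the import path assumes a recent Composer release.

```python
import torch.nn

from composer.core import Algorithm, Event


class DisableDropout(Algorithm):
    """Hypothetical algorithm: zero out dropout probabilities in place."""

    def match(self, event, state):
        # Run once, when the trainer is initialized.
        return event == Event.INIT

    def apply(self, event, state, logger):
        # In-place change to the trainer's State: mutate the model it holds.
        for module in state.model.modules():
            if isinstance(module, torch.nn.Dropout):
                module.p = 0.0
        # Optionally return an exit code to be stored in a Trace;
        # returning None is also fine.
        return None
```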
- property backwards_create_graph
Whether this algorithm requires the backwards pass to be differentiable. Defaults to False.
If it returns True, create_graph=True will be passed to torch.Tensor.backward(), which will result in the graph of the gradient also being constructed. This allows the computation of second-order derivatives.
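For example, a hypothetical algorithm that inspects second-order quantities could override the property as sketched below; the class name and the choice of Event.AFTER_BACKWARD are assumptions for illustration, not part of Composer's API.

```python
from composer.core import Algorithm, Event


class SecondOrderProbe(Algorithm):
    """Hypothetical algorithm that needs a differentiable backward pass."""

    @property
    def backwards_create_graph(self):
        # Ask the trainer to call .backward(create_graph=True) so that
        # gradients of gradients can be computed in apply().
        return True

    def match(self, event, state):
        return event == Event.AFTER_BACKWARD

    def apply(self, event, state, logger):
        ...  # compute a second-order quantity from the retained graph
```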
- property find_unused_parameters
Indicates whether this algorithm may cause some model parameters to be unused. Defaults to False.
For example, it is used to tell torch.nn.parallel.DistributedDataParallel (DDP) that some parameters will be frozen during training, and hence it should not expect gradients from them. All algorithms which do any kind of parameter freezing should override this function to return True.
DeepSpeed integration with this function returning True is not tested. It may not work as expected.
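A sketch of a freezing-style algorithm under stated assumptions: FreezeAtEpoch, its epoch argument, and the "encoder" name filter are all hypothetical; the point is only that a parameter-freezing algorithm overrides find_unused_parameters to return True.

```python
from composer.core import Algorithm, Event


class FreezeAtEpoch(Algorithm):
    """Hypothetical algorithm that freezes part of the model mid-training."""

    def __init__(self, epoch: int = 10):
        self.epoch = epoch

    @property
    def find_unused_parameters(self):
        # Frozen parameters produce no gradients, so DDP must be told
        # not to expect them during the gradient all-reduce.
        return True

    def match(self, event, state):
        return event == Event.EPOCH_START and state.timestamp.epoch == self.epoch

    def apply(self, event, state, logger):
        # Illustrative freeze: stop training every parameter whose name
        # contains "encoder" (an arbitrary example substring).
        for name, param in state.model.named_parameters():
            if "encoder" in name:
                param.requires_grad = False
```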
- abstract match(event, state)
Determines whether this algorithm should run, given the current Event and State. Returns True if the algorithm should run on this event.
Examples: To only run on a specific event (e.g., on Event.BEFORE_LOSS), override match as shown below:
>>> class MyAlgorithm:
...     def match(self, event, state):
...         return event == Event.BEFORE_LOSS
>>> MyAlgorithm().match(Event.BEFORE_LOSS, state)
True
To run based on some value of a State attribute, override match as shown below:
>>> class MyAlgorithm:
...     def match(self, event, state):
...         return state.timestamp.epoch > 30
>>> MyAlgorithm().match(Event.BEFORE_LOSS, state)
False
See State for accessible attributes.
- static required_on_load()
Return True to indicate this algorithm is required when loading from a checkpoint which used it.
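Putting the pieces together, a minimal end-to-end sketch. ScaleLoss, its scale argument, and the loss-scaling behavior are hypothetical and serve only to show where match(), apply(), and required_on_load() fit; the import path assumes a recent Composer release.

```python
from composer.core import Algorithm, Event


class ScaleLoss(Algorithm):
    """Hypothetical algorithm that scales the loss after it is computed."""

    def __init__(self, scale: float = 0.5):
        self.scale = scale

    def match(self, event, state):
        # Run once per batch, immediately after the loss is computed.
        return event == Event.AFTER_LOSS

    def apply(self, event, state, logger):
        # In-place change to the trainer's State.
        state.loss = state.loss * self.scale

    @staticmethod
    def required_on_load():
        # The algorithm changes optimization, so it should be re-applied
        # when resuming from a checkpoint that was trained with it.
        return True
```

An algorithm written this way is passed to the Trainer through its algorithms argument, e.g. Trainer(..., algorithms=[ScaleLoss(0.5)]).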