Composer (Beta)

MosaicML Composer contains a library of ML training efficiency methods and a modular approach to composing them when training deep neural networks. We aim to ease the transition from research to industry through reproducible code and rigorous benchmarking. With Composer, speedup and accuracy-boosting methods can be easily composed into complete training recipes.

The library features:

  • Implementations of 20+ efficiency methods curated from the research community

  • A standardized approach to implementing and composing efficiency methods, extended from two-way callbacks (Howard et al., 2020)

  • Easy access to our methods, either directly in your own training loops or through the MosaicML trainer
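The two-way callback pattern underlying the second bullet can be sketched in a few lines: callbacks both observe *and* mutate shared trainer state at named events, which is what lets independent efficiency methods compose. The names below (`State`, `Callback`, `WarmupLR`, `fit`) are illustrative only, not Composer's actual API.

```python
# Minimal sketch of the two-way callback pattern (Howard et al., 2020).
# All class and function names here are hypothetical, not Composer's API.

class State:
    """Shared mutable training state passed to every callback."""
    def __init__(self, lr):
        self.lr = lr


class Callback:
    """A callback handles only the events it defines methods for."""
    def run_event(self, event, state):
        handler = getattr(self, event, None)
        if handler is not None:
            handler(state)


class WarmupLR(Callback):
    """An example 'method' that rewrites state: linearly ramps up the LR."""
    def __init__(self, target_lr, warmup_steps):
        self.target_lr = target_lr
        self.warmup_steps = warmup_steps
        self.step = 0

    def before_batch(self, state):
        self.step += 1
        state.lr = self.target_lr * min(1.0, self.step / self.warmup_steps)


def fit(callbacks, n_batches, state):
    """A toy training loop: fires events; callbacks modify state in place."""
    for _ in range(n_batches):
        for cb in callbacks:
            cb.run_event("before_batch", state)
        # ... forward/backward pass would run here, reading state.lr ...
        for cb in callbacks:
            cb.run_event("after_batch", state)
    return state


state = fit([WarmupLR(target_lr=0.1, warmup_steps=4)],
            n_batches=2, state=State(lr=0.0))
print(state.lr)  # 0.05 after 2 of 4 warmup steps
```

Because every method touches the same `state` object through the same event interface, multiple methods can be stacked in one list without knowing about each other.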


MosaicML Composer is currently in beta, so the API is subject to change.


MosaicML exists to make ML training more efficient. We believe large-scale ML should be available to everyone, not just large companies.

The ML community is overwhelmed by the plethora of new algorithms in the literature and open source repositories. Integrating new methods into existing code is often difficult due to reproducibility issues (Pineau et al., 2020) and implementation complexity. In addition, methods should be characterized according to their effect on time-to-train and their interactions with the underlying systems.

For more details on our philosophy, see our Methodology and our founder’s blog.

We hope to contribute to the amazing community around ML Systems and ML Training efficiency.


Our documentation is organized into a few sections:

  • Getting Started covers installation, a quick tour, and how to use Composer.

  • Core covers the core components of the library.

  • composer contains the library’s API reference.

  • Methods Library details our implemented efficiency methods.
