composer.utils
Helper utilities.
Functions
Add a transform to a dataset's collection of transforms.
Indexes into the batch given the key.
Indexes into the batch given the key and sets the element at that index to value.
Build an object store given the backend name and kwargs.
Collect and print system information when sys.excepthook() is called.
Converts a flat dictionary with slash-separated keys to a nested dictionary.
Takes in a nested dict and converts it to a flat dict with keys separated by slashes.
Helper function to create a scheduler according to a specified interval.
Create a symlink file, which can be followed by get_file().
Disable environment report generation on exception.
Enable environment report generation on exception.
Ensure that the given folder does not have any files conflicting with the filename format string.
Ensure that the given folder is empty.
Converts x into a tuple.
Export a model for inference.
Helper method for exporting a model for inference.
Takes in the local symbol table and recursively grabs any hyperparameters.
Returns the checkpoint path from a symlink file.
Format format_str with the run_name, distributed variables, and extra_format_kwargs.
Format format_str with the run_name, distributed variables, timestamp, and extra_format_kwargs.
Query Composer-pertinent system information as a dict.
Obtain the compressor that supports the format of the given file.
Takes a string or Device and returns the corresponding Device object.
Get a file from a local folder, URL, or object store.
Get a free socket port to use as MASTER_PORT.
Gets the full filename of the save filename.
Dynamically import a Python object (e.g., a class or function).
Whether the filename is for a directly compressed pt file.
Determines whether the module needed for training on HPUs (Gaudi, Gaudi2) is installed.
Whether model is an instance of a DeepSpeedEngine.
Whether model is an instance of a FullyShardedDataParallel module.
Whether Composer is running in an IPython/Jupyter Notebook.
Returns whether name has a tar-like extension.
Determines whether the module needed for training on TPUs (torch_xla) is installed.
Load a checkpoint from a local file, URI, or cloud object store into state.
Applies map_fn to each element in collection.
Automatically creates an ObjectStore from supported URI formats.
Automatically creates a RemoteUploaderDownloader from supported URI formats.
Set model.eval() for context duration, restoring model status at end.
Uses urllib.parse to parse the provided URI.
Format a string with a partial set of arguments.
Generate a system information report.
Decorator to retry a function with backoff and jitter.
Load a torch checkpoint, catching errors due to backwards compatibility issues.
Checkpoint the training state.
Upload a tiny text file to test whether the credentials are set up correctly.
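
A minimal sketch combining a few of the URI and file helpers listed above. The bucket name, object path, and local destination are hypothetical, and exact keyword arguments may differ between Composer versions:

```python
from composer.utils import (
    ensure_tuple,
    get_file,
    maybe_create_object_store_from_uri,
    parse_uri,
)

# parse_uri splits a URI into (backend, bucket, path); this URI is made up.
backend, bucket, path = parse_uri('s3://my-bucket/checkpoints/latest-rank0.pt')

# Build the matching ObjectStore for the URI (returns None for plain local paths).
object_store = maybe_create_object_store_from_uri('s3://my-bucket/checkpoints/latest-rank0.pt')

# Fetch the object (or a local file / URL) to a local destination.
get_file(path, destination='/tmp/latest-rank0.pt', object_store=object_store, overwrite=True)

# ensure_tuple normalizes scalars, sequences, and None into a tuple.
assert ensure_tuple('a') == ('a',)
assert ensure_tuple(None) == ()
```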
Classes
Base class for data compression CLI tools.
Abstract class for implementing eval clients, such as LambdaEvalClient.
Enum class for the supported export formats.
Configuration for Fully Sharded Data Parallelism (FSDP).
Utility for uploading to and downloading from a Google Cloud bucket using the Google Cloud Storage SDK.
Class used to convert an iterator of bytes into a file-like binary stream object.
Utility for creating a client for and invoking an AWS Lambda.
Utility for uploading to and downloading from object (blob) stores, such as Amazon S3.
Utility for creating a client for and invoking local evaluations.
Utility class for uploading and downloading artifacts from MLflow.
Utility for uploading to and downloading from an OCI bucket.
Abstract class for implementing object stores, such as LibcloudObjectStore and S3ObjectStore.
Configuration for parallelism.
Enum class for different parallelism types in the device mesh.
An enumeration.
Class for uploading a file to an object store asynchronously.
Utility for uploading to and downloading from an S3-compatible bucket using boto3.
Utility for uploading to and downloading from a server via SFTP.
Base class for Enums containing string values.
Configuration for tensor parallelism (TP).
Utility class for uploading and downloading data from Databricks Unity Catalog (UC) Volumes.
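
A brief, hypothetical sketch of the common ObjectStore interface shared by the store classes above; the bucket and object names are made up, and credentials are assumed to already be configured for the chosen backend:

```python
from composer.utils import S3ObjectStore

# Every ObjectStore implementation exposes the same small interface:
# upload_object, download_object, get_object_size, and get_uri.
store = S3ObjectStore(bucket='my-bucket')  # hypothetical bucket

# Upload a local file under an object name (key) in the bucket.
store.upload_object(object_name='checkpoints/ep1.pt', filename='/tmp/ep1.pt')

# Download it back to a different local path.
store.download_object(object_name='checkpoints/ep1.pt', filename='/tmp/ep1-copy.pt', overwrite=True)

# Build the canonical URI for the stored object, e.g. 's3://my-bucket/checkpoints/ep1.pt'.
print(store.get_uri('checkpoints/ep1.pt'))
```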
Exceptions
Handles errors for external packages that might not be installed.
Custom exception class to signify transient errors.
A custom deprecation warning class that includes version information. |
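
For illustration, a sketch of how these exception types are typically used together with the retry decorator from the Functions list; the optional dependency (wandb) and the retry settings below are assumptions, not part of this listing:

```python
from composer.utils import MissingConditionalImportError, ObjectStoreTransientError, retry

# Wrap an optional import so users get an actionable "install the extra" error.
try:
    import wandb  # example optional dependency
except ImportError as e:
    raise MissingConditionalImportError(extra_deps_group='wandb', conda_package='wandb') from e

# ObjectStoreTransientError marks retryable failures; retry() re-invokes the
# function with backoff and jitter when that exception type is raised.
@retry(ObjectStoreTransientError, num_attempts=3)
def download_with_retries(store, object_name, filename):
    store.download_object(object_name, filename, overwrite=True)
```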