composer.core.types#
Reference for common types used throughout the composer library.
- composer.core.types.Batch#
Union type covering the most common representations of batches. A batch of data can be represented in several formats, depending on the application.
- Type
BatchPair | BatchDict | Tensor
- composer.core.types.BatchPair#
Commonly used in computer vision tasks. The object is assumed to contain exactly two elements, where the first represents inputs and the second represents targets.
- composer.core.types.BatchDict#
Commonly used in natural language processing tasks.
- Type
Dict[str, Tensor]
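A minimal sketch of the three batch representations. The tensor shapes and dictionary key names below are illustrative assumptions, not part of the type definitions:

```python
import torch

# A BatchPair: a two-element sequence of (inputs, targets),
# as commonly produced by computer-vision data loaders.
batch_pair = (torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,)))

# A BatchDict: a mapping from field names to tensors, as commonly
# produced by NLP data loaders (key names here are arbitrary examples).
batch_dict = {
    "input_ids": torch.randint(0, 30000, (8, 128)),
    "attention_mask": torch.ones(8, 128, dtype=torch.long),
}

# A bare Tensor is also a valid Batch.
batch_tensor = torch.randn(8, 16)
```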
- composer.core.types.PyTorchScheduler#
Alias for the base class of PyTorch learning rate schedulers, such as
torch.optim.lr_scheduler.ConstantLR.
- Type
torch.optim.lr_scheduler._LRScheduler
- composer.core.types.JSON#
JSON Data.
- composer.core.types.Dataset[source]#
Alias for
torch.utils.data.Dataset.
- Type
Dataset[Batch]
- exception composer.core.types.BreakEpochException[source]#
Bases:
Exception
Raising this exception will immediately end the current epoch.
If you're wondering whether you should use this, the answer is no.
- class composer.core.types.DataLoader(*args, **kwargs)[source]#
Bases:
Protocol
Protocol for custom DataLoaders compatible with
torch.utils.data.DataLoader.
- dataset#
Dataset from which to load the data.
- Type
Dataset
- num_workers#
How many subprocesses to use for data loading.
0
means that the data will be loaded in the main process.
- Type
int
- pin_memory#
If
True
, the data loader will copy Tensors into CUDA pinned memory before returning them.
- Type
bool
- drop_last#
If
len(dataset)
is not evenly divisible by
batch_size
, whether the last batch is dropped (if True) or truncated (if False).
- Type
bool
- prefetch_factor#
Number of samples loaded in advance by each worker.
2
means there will be a total of 2 *
num_workers
samples prefetched across all workers.
- Type
int
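A standard torch.utils.data.DataLoader already exposes all of the attributes this protocol documents. A minimal sketch with a toy dataset:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# A toy map-style dataset of 10 (input, target) pairs.
dataset = TensorDataset(torch.randn(10, 4), torch.randint(0, 2, (10,)))

loader = DataLoader(
    dataset,
    batch_size=4,
    num_workers=0,      # load in the main process
    pin_memory=False,   # no CUDA pinning needed here
    drop_last=True,     # drop the final partial batch of 2 samples
)

# The protocol's documented attributes are plain attributes on the loader.
assert loader.dataset is dataset
assert loader.drop_last is True
# 10 samples / batch_size 4 -> 2 full batches once the remainder is dropped.
assert len(loader) == 2
```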
- class composer.core.types.MemoryFormat(value)[source]#
Bases:
composer.utils.string_enum.StringEnum
Enum class to represent different memory formats.
See
torch.memory_format
for more details.
- CONTIGUOUS_FORMAT#
Default PyTorch memory format representing a tensor allocated with consecutive dimensions sequential in allocated memory.
- CHANNELS_LAST#
This is also known as NHWC. Typically used for images with 2 spatial dimensions (i.e., Height and Width) where channels next to each other in indexing are next to each other in allocated memory. For example, if C[0] is at memory location M_0 then C[1] is at memory location M_1, etc.
- CHANNELS_LAST_3D#
This can also be referred to as NTHWC. Same as
CHANNELS_LAST
but for videos with 3 spatial dimensions (i.e., Time, Height and Width).
- PRESERVE_FORMAT#
A way to tell operations to make the output tensor to have the same memory format as the input tensor.
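The memory formats above correspond to the torch.memory_format values of the same names. A minimal sketch showing that converting to channels-last changes only the stride order, not the logical shape:

```python
import torch

# A 4-D tensor in the default NCHW (contiguous) layout.
x = torch.randn(2, 3, 4, 5)
assert x.is_contiguous(memory_format=torch.contiguous_format)

# Reorder the underlying storage to channels-last (NHWC).
# The logical shape is unchanged; only the strides differ.
y = x.contiguous(memory_format=torch.channels_last)
assert y.is_contiguous(memory_format=torch.channels_last)
assert y.shape == x.shape
```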