Timestamp#

class composer.Timestamp(iteration=0, epoch=0, batch=0, sample=0, token=0, epoch_in_iteration=0, token_in_iteration=0, batch_in_epoch=0, sample_in_epoch=0, token_in_epoch=0, total_wct=None, iteration_wct=None, epoch_wct=None, batch_wct=None)[source]#

Timestamp represents a snapshot of the current training progress.

The timestamp measures training progress in terms of iterations, epochs, batches, samples, tokens, and wall-clock time. Timestamps are not updated in place; each update returns a new instance.

See the Time Guide for more details on tracking time during training.
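
For example, advancing a timestamp returns a new instance and leaves the original unchanged (values here are illustrative):

>>> from composer import Timestamp
>>> ts = Timestamp(epoch=1, batch=10)
>>> ts2 = ts.to_next_batch(samples=128)
>>> ts2.batch.value, ts2.sample.value
(11, 128)
>>> ts.batch.value  # the original timestamp is unchanged
10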

Parameters
  • iteration (int | Time[int], optional) – The iteration.

  • epoch (int | Time[int], optional) – The epoch.

  • batch (int | Time[int], optional) – The batch.

  • sample (int | Time[int], optional) – The sample.

  • token (int | Time[int], optional) – The token.

  • epoch_in_iteration (int | Time[int], optional) – The epoch in the iteration.

  • token_in_iteration (int | Time[int], optional) – The token in the iteration.

  • batch_in_epoch (int | Time[int], optional) – The batch in the epoch.

  • sample_in_epoch (int | Time[int], optional) – The sample in the epoch.

  • token_in_epoch (int | Time[int], optional) – The token in the epoch.

  • total_wct (timedelta, optional) – The total wall-clock duration.

  • iteration_wct (timedelta, optional) – The wall-clock duration of the current iteration.

  • epoch_wct (timedelta, optional) – The wall-clock duration of the current epoch.

  • batch_wct (timedelta, optional) – The wall-clock duration of the last batch.

property batch#

The total batch count.

property batch_in_epoch#

The batch count in the current epoch (resets to 0 at the beginning of every epoch).

property batch_wct#

The wall-clock duration (in seconds) for the last batch.

copy(iteration=None, epoch=None, batch=None, sample=None, token=None, epoch_in_iteration=None, token_in_iteration=None, batch_in_epoch=None, sample_in_epoch=None, token_in_epoch=None, total_wct=None, iteration_wct=None, epoch_wct=None, batch_wct=None)[source]#

Create a copy of the timestamp.

Any specified values will override the existing values in the returned copy.

Parameters
  • iteration (int | Time[int], optional) – The iteration.

  • epoch (int | Time[int], optional) – The epoch.

  • batch (int | Time[int], optional) – The batch.

  • sample (int | Time[int], optional) – The sample.

  • token (int | Time[int], optional) – The token.

  • epoch_in_iteration (int | Time[int], optional) – The epoch in the iteration.

  • token_in_iteration (int | Time[int], optional) – The token in the iteration.

  • batch_in_epoch (int | Time[int], optional) – The batch in the epoch.

  • sample_in_epoch (int | Time[int], optional) – The sample in the epoch.

  • token_in_epoch (int | Time[int], optional) – The token in the epoch.

  • total_wct (timedelta, optional) – The elapsed duration from the beginning of training.

  • iteration_wct (timedelta, optional) – The wall-clock duration of the current iteration.

  • epoch_wct (timedelta, optional) – The wall-clock duration of the current epoch.

  • batch_wct (timedelta, optional) – The wall-clock duration of the last batch.

Returns

Timestamp – A new timestamp instance, created from a copy, but with any specified values overriding the existing values.
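
For example (illustrative values; unspecified fields carry over from the original):

>>> from composer import Timestamp
>>> ts = Timestamp(epoch=2, batch=50)
>>> ts2 = ts.copy(batch_in_epoch=0)
>>> ts2.epoch.value, ts2.batch.value, ts2.batch_in_epoch.value
(2, 50, 0)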

property epoch#

The total epoch count.

property epoch_in_iteration#

The epoch count in the current iteration (resets to 0 at the beginning of every iteration).

property epoch_wct#

The wall-clock duration (in seconds) for the current epoch.

get(unit)[source]#

Returns the current time in the specified unit.

Parameters

unit (str | TimeUnit) – The desired unit.

Returns

Time – The current time, in the specified unit.
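
For example, units may be given as TimeUnit values or their string abbreviations (e.g. 'ep' for epochs, 'ba' for batches; values here are illustrative):

>>> from composer import Timestamp
>>> ts = Timestamp(epoch=3, batch=75)
>>> ts.get('ep').value
3
>>> ts.get('ba').value
75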

property iteration#

The total iteration count.

property iteration_wct#

The wall-clock duration (in seconds) for the current iteration.

property sample#

The total sample count.

property sample_in_epoch#

The sample count in the current epoch (resets to 0 at the beginning of every epoch).

to_next_batch(samples=0, tokens=0, duration=None)[source]#

Create a new Timestamp, advanced to the next batch.

Equivalent to:

>>> timestamp.copy(
...     batch=timestamp.batch + 1,
...     batch_in_epoch=timestamp.batch_in_epoch + 1,
...     sample=timestamp.sample + samples,
...     sample_in_epoch=timestamp.sample_in_epoch + samples,
...     token=timestamp.token + tokens,
...     token_in_epoch=timestamp.token_in_epoch + tokens,
...     total_wct=timestamp.total_wct + duration,
...     iteration_wct=timestamp.iteration_wct + duration,
...     epoch_wct=timestamp.epoch_wct + duration,
...     batch_wct=duration,
... )
Timestamp(...)

Note

For accurate time tracking during distributed training, the samples and tokens should be the totals across all ranks for the given batch. This method will not accumulate these counts automatically. If per-rank sample and token counts are provided, they will differ across ranks, which could lead to inconsistent behavior by Algorithm or Callback instances that use these counts.
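
For example, per-rank counts can be summed with an all-reduce before advancing the timestamp. A minimal sketch using torch.distributed directly (assuming the process group is already initialized; local_samples and local_tokens are hypothetical per-rank counts):

>>> import torch
>>> import torch.distributed as dist
>>> counts = torch.tensor([local_samples, local_tokens])  # per-rank counts (hypothetical)
>>> dist.all_reduce(counts, op=dist.ReduceOp.SUM)  # sum across all ranks
>>> total_samples, total_tokens = counts.tolist()
>>> timestamp = timestamp.to_next_batch(samples=total_samples, tokens=total_tokens)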

Parameters
  • samples (int | Time, optional) – The number of samples trained in the batch. Defaults to 0.

  • tokens (int | Time, optional) – The number of tokens trained in the batch. Defaults to 0.

  • duration (timedelta, optional) – The duration to train the batch.
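
A minimal usage example (illustrative values; in distributed training, samples and tokens should already be totals across ranks, per the note above):

>>> import datetime
>>> from composer import Timestamp
>>> ts = Timestamp()
>>> ts = ts.to_next_batch(samples=256, tokens=1024, duration=datetime.timedelta(seconds=0.5))
>>> ts.batch.value, ts.sample.value, ts.token.value
(1, 256, 1024)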

to_next_epoch(tokens=0, duration=None)[source]#

Create a new Timestamp, advanced to the next epoch.

Equivalent to:

>>> timestamp.copy(
...     epoch=timestamp.epoch + 1,
...     epoch_in_iteration=timestamp.epoch_in_iteration + 1,
...     token_in_iteration=timestamp.token_in_iteration + tokens,
...     batch_in_epoch=0,
...     sample_in_epoch=0,
...     token_in_epoch=0,
...     total_wct=timestamp.total_wct + duration,
...     iteration_wct=timestamp.iteration_wct + duration,
...     epoch_wct=datetime.timedelta(seconds=0),
...     batch_wct=datetime.timedelta(seconds=0),
... )
Timestamp(...)

Parameters
  • tokens (int | Time, optional) – The number of tokens trained in the batch. Defaults to 0.

  • duration (timedelta, optional) – The duration to train the batch.
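
For example, advancing to the next epoch resets the in-epoch counters (illustrative values):

>>> from composer import Timestamp
>>> ts = Timestamp().to_next_batch(samples=128)
>>> ts.sample_in_epoch.value
128
>>> ts = ts.to_next_epoch()
>>> ts.epoch.value, ts.sample_in_epoch.value
(1, 0)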

to_next_iteration(duration=None)[source]#

Create a new Timestamp, advanced to the next iteration.

Equivalent to:

>>> timestamp.copy(
...     iteration=timestamp.iteration + 1,
...     epoch_in_iteration=0,
...     token_in_iteration=0,
...     batch_in_epoch=0,
...     sample_in_epoch=0,
...     token_in_epoch=0,
...     total_wct=timestamp.total_wct + duration,
...     iteration_wct=datetime.timedelta(seconds=0),
...     epoch_wct=datetime.timedelta(seconds=0),
...     batch_wct=datetime.timedelta(seconds=0),
... )
Timestamp(...)
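
Parameters

duration (timedelta, optional) – The duration to train the iteration.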

property token#

The total token count.

property token_in_epoch#

The token count in the current epoch (resets to 0 at the beginning of every epoch).

property token_in_iteration#

The token count in the current iteration (resets to 0 at the beginning of every iteration).

property total_wct#

The wall-clock duration (in seconds) from the beginning of training.