format_name_with_dist_and_time

composer.utils.format_name_with_dist_and_time(format_str, run_name, timestamp, **extra_format_kwargs)
Format `format_str` with the `run_name`, distributed variables, `timestamp`, and `extra_format_kwargs`.

In addition to the variables specified via `extra_format_kwargs`, the following format variables are available:

| Variable | Description |
| --- | --- |
| `{run_name}` | The name of the training run. See `Logger.run_name`. |
| `{rank}` | The global rank, as returned by `get_global_rank()`. |
| `{local_rank}` | The local rank of the process, as returned by `get_local_rank()`. |
| `{world_size}` | The world size, as returned by `get_world_size()`. |
| `{local_world_size}` | The local world size, as returned by `get_local_world_size()`. |
| `{node_rank}` | The node rank, as returned by `get_node_rank()`. |
| `{epoch}` | The total epoch count, as returned by `epoch()`. |
| `{batch}` | The total batch count, as returned by `batch()`. |
| `{batch_in_epoch}` | The batch count in the current epoch, as returned by `batch_in_epoch()`. |
| `{sample}` | The total sample count, as returned by `sample()`. |
| `{sample_in_epoch}` | The sample count in the current epoch, as returned by `sample_in_epoch()`. |
| `{token}` | The total token count, as returned by `token()`. |
| `{token_in_epoch}` | The token count in the current epoch, as returned by `token_in_epoch()`. |
| `{total_wct}` | The total training duration in seconds, as returned by `total_wct()`. |
| `{epoch_wct}` | The epoch duration in seconds, as returned by `epoch_wct()`. |
| `{batch_wct}` | The batch duration in seconds, as returned by `batch_wct()`. |
For example, assume that the current epoch is `0`, the batch is `0`, and the rank is `0`. Then:

```python
>>> from composer.utils import format_name_with_dist_and_time
>>> format_str = '{run_name}/ep{epoch}-ba{batch}-rank{rank}.{extension}'
>>> format_name_with_dist_and_time(
...     format_str,
...     run_name='awesome_training_run',
...     timestamp=state.timestamp,
...     extension='json',
... )
'awesome_training_run/ep0-ba0-rank0.json'
```
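Under the hood, this kind of substitution is ordinary Python `str.format` on the template, with the distributed and timestamp values supplied as keyword arguments. A minimal standalone sketch of that idea, using hypothetical stand-in values for what a real run would pull from the distributed environment and the training `Timestamp` (this is an illustration, not Composer's actual implementation):

```python
# Template using the same placeholders as the example above.
format_str = '{run_name}/ep{epoch}-ba{batch}-rank{rank}.{extension}'

# Hypothetical stand-ins: in a real run these would come from
# get_global_rank() and the trainer's timestamp counters.
variables = {
    'run_name': 'awesome_training_run',
    'rank': 0,    # get_global_rank()
    'epoch': 0,   # total epoch count
    'batch': 0,   # total batch count
}

# Extra keyword arguments (here, 'extension') are merged in,
# mirroring **extra_format_kwargs.
name = format_str.format(**variables, extension='json')
print(name)  # awesome_training_run/ep0-ba0-rank0.json
```

Any placeholder left unmatched by the supplied variables would raise a `KeyError`, so the template must only reference variables that are actually provided.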