- class composer.metrics.LanguageCrossEntropy(vocab_size, dist_sync_on_step=False, ignore_index=-100)#
Torchmetric that computes cross entropy on language modeling outputs.
- Adds metric state variables:
  - sum_loss (float): The sum of the per-example loss in the batch.
  - total_items (float): The number of batches to average across.
- compute()#
Aggregates the state over all processes to compute the metric.
Returns: loss – The loss averaged across all batches, as a Tensor.
- update(output, target)#
Updates the internal state with results from a new batch.
output (Mapping | Tensor) – The output from the model: either a Tensor, or a Mapping that contains the loss or the model logits.
target (Tensor) – A Tensor of ground-truth values to compare against.
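To make the update/compute aggregation concrete, here is a minimal pure-Python sketch of the same bookkeeping: `update()` accumulates `sum_loss` and `total_items` while skipping targets equal to `ignore_index`, and `compute()` returns the running average. The class name `ToyLanguageCrossEntropy` and the list-based inputs are hypothetical illustrations; the real metric subclasses `torchmetrics.Metric` and operates on PyTorch tensors.

```python
import math

class ToyLanguageCrossEntropy:
    """Hypothetical sketch of the metric's state aggregation
    (the real class is a torchmetrics Metric over tensors)."""

    def __init__(self, ignore_index=-100):
        self.ignore_index = ignore_index
        self.sum_loss = 0.0     # running sum of per-example losses
        self.total_items = 0    # running count to average across

    def update(self, logits, targets):
        # logits: list of per-example logit rows; targets: list of class ids
        for row, tgt in zip(logits, targets):
            if tgt == self.ignore_index:
                continue  # ignored (e.g. padding) targets add no loss
            # cross entropy = log(sum_j exp(logit_j)) - logit_target
            log_z = math.log(sum(math.exp(x) for x in row))
            self.sum_loss += log_z - row[tgt]
            self.total_items += 1

    def compute(self):
        # average loss over everything seen so far
        return self.sum_loss / self.total_items

metric = ToyLanguageCrossEntropy()
metric.update([[2.0, 0.0], [0.0, 2.0]], [0, -100])
print(metric.compute())
```

The second example above uses the `ignore_index` sentinel, so only the first example contributes to the average, mirroring how padding tokens are excluded from language-modeling loss.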