LanguageCrossEntropy
- class composer.metrics.LanguageCrossEntropy(vocab_size, dist_sync_on_step=False, ignore_index=-100)[source]
Torchmetric that computes cross entropy on language modeling outputs.
- Adds metric state variables:
sum_loss (float): The sum of the per-example loss in the batch.
total_items (float): The number of target items (tokens not equal to ignore_index) to average across.
- Parameters
vocab_size (int) – The size of the tokenizer vocabulary.
dist_sync_on_step (bool, optional) – Synchronize metric state across processes at each forward() before returning the value at the step. Default: False.
ignore_index (int, optional) – The class index to ignore when computing the loss. Default: -100.
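A minimal usage sketch follows. The tensor shapes, the random data, and the vocabulary size are illustrative assumptions, not values prescribed by the API:

```python
import torch

from composer.metrics import LanguageCrossEntropy

# Hypothetical sizes: 4 sequences of length 8 over a 100-token vocabulary.
vocab_size = 100
metric = LanguageCrossEntropy(vocab_size)

logits = torch.randn(4, 8, vocab_size)         # raw model outputs
target = torch.randint(0, vocab_size, (4, 8))  # ground-truth token ids
target[0, :2] = -100                           # positions matching ignore_index are skipped

metric.update(logits, target)  # accumulate sum_loss and total_items
loss = metric.compute()        # mean cross entropy over non-ignored tokens
print(loss)
```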
- compute()[source]
Aggregate the state over all processes to compute the metric.
- Returns
loss – The loss averaged across all batches as a Tensor.
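In terms of the state variables listed above, the aggregation reduces to dividing the accumulated loss by the item count. A sketch of that reduction, where only the state names come from the docstring and the body is an assumption about the arithmetic:

```python
import torch

def compute_sketch(sum_loss: torch.Tensor, total_items: torch.Tensor) -> torch.Tensor:
    # Average the accumulated (already summed) loss over all items seen so far.
    return sum_loss / total_items

# e.g. a summed loss of 12.0 over 6 items yields a mean loss of 2.0
print(compute_sketch(torch.tensor(12.0), torch.tensor(6)))
```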
- update(output, target)[source]
Updates the internal state with results from a new batch.
- Parameters
output (Mapping | Tensor) – The output from the model: either the logits Tensor directly, or a Mapping that contains the loss or the model logits.
target (Tensor) – A Tensor of ground-truth values to compare against.
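Continuing the sketch above, update() can also receive the model output as a Mapping rather than a raw Tensor; the 'logits' key below is an assumption about the expected Mapping layout:

```python
# Passing the model output as a Mapping instead of a raw Tensor.
outputs = {'logits': logits}  # 'logits' key name is an assumed convention
metric.update(outputs, target)
loss = metric.compute()
```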