LanguageCrossEntropyLoss

class composer.models.nlp_metrics.LanguageCrossEntropyLoss(dist_sync_on_step=False)

Bases: torchmetrics.metric.Metric

Hugging Face-compatible cross-entropy loss.

Parameters

dist_sync_on_step (bool) – Synchronize metric state across processes at each forward() before returning the value at the step.

State:

sum_loss (float): the sum of the per-example loss in the batch.

total_batches (float): the number of batches to average across.

compute() → Tensor

Aggregate the state over all processes to compute the metric.

Returns

loss (Tensor) – The loss averaged across all batches.

update(output: Union[Mapping, Tensor], target: Tensor) → None

Updates the internal state with results from a new batch.

Parameters
  • output (Union[Mapping, Tensor]) – The output from the model: either a Tensor of logits, or a Mapping that contains the loss or the model logits.

  • target (Tensor) – A Tensor of ground-truth values to compare against.
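The accumulation that the State section describes can be sketched in plain Python. This is a minimal, dependency-free illustration of the update/compute pattern (sum the batch losses, count the batches, divide on compute), not the actual Composer implementation, which operates on PyTorch tensors and synchronizes state across processes; the class and method names below mirror the documented API but are otherwise hypothetical.

```python
import math
from typing import List


class LanguageCrossEntropySketch:
    """Sketch of the metric's accumulation logic (not the real implementation)."""

    def __init__(self) -> None:
        self.sum_loss = 0.0       # running sum of per-batch losses
        self.total_batches = 0.0  # number of batches seen so far

    def update(self, logits: List[List[float]], target: List[int]) -> None:
        """Add one batch's mean cross-entropy loss to the running state."""
        batch_loss = 0.0
        for row, t in zip(logits, target):
            # Cross entropy per example: -log(softmax(row)[t])
            z = sum(math.exp(x) for x in row)
            batch_loss += -math.log(math.exp(row[t]) / z)
        self.sum_loss += batch_loss / len(target)
        self.total_batches += 1.0

    def compute(self) -> float:
        """Return the loss averaged across all batches seen."""
        return self.sum_loss / self.total_batches
```

With uniform logits over two classes, each example's loss is -log(0.5) = ln 2, so after any number of such batches compute() returns ln 2. The real metric follows the same update-then-compute lifecycle inherited from torchmetrics.Metric, with dist_sync_on_step controlling whether state is synchronized across processes on every forward().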