Perplexity

class composer.models.nlp_metrics.Perplexity(dist_sync_on_step=False)[source]

Bases: composer.models.nlp_metrics.LanguageCrossEntropyLoss

Subclasses LanguageCrossEntropyLoss to implement perplexity, which is the exponential of the cross-entropy loss.

If an algorithm modifies the loss function so that the loss is no longer provided directly in the model's output, this metric can be expensive, because the loss ends up being computed twice: once during training and again from the logits when the metric is updated.
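
As a minimal, self-contained sketch of this subclassing pattern (not Composer's actual implementation; the class names and the update() signature below are hypothetical stand-ins), the parent metric accumulates a summed cross-entropy and a token count, and the perplexity subclass only overrides compute():

```python
import torch
import torch.nn.functional as F


class CrossEntropySketch:
    """Hypothetical stand-in for LanguageCrossEntropyLoss: accumulates
    token-level cross-entropy across batches."""

    def __init__(self) -> None:
        self.sum_loss = torch.tensor(0.0)
        self.total_tokens = torch.tensor(0.0)

    def update(self, logits: torch.Tensor, targets: torch.Tensor) -> None:
        # logits: (num_tokens, vocab_size); targets: (num_tokens,).
        # If the training loss is not reusable, the loss is recomputed
        # here from the logits -- the "computed twice" cost noted above.
        self.sum_loss += F.cross_entropy(logits, targets, reduction="sum").detach()
        self.total_tokens += targets.numel()

    def compute(self) -> torch.Tensor:
        # Mean cross-entropy per token.
        return self.sum_loss / self.total_tokens


class PerplexitySketch(CrossEntropySketch):
    """Perplexity is exp(mean cross-entropy), so only compute() changes."""

    def compute(self) -> torch.Tensor:
        return torch.exp(super().compute())
```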

compute() → Tensor[source]

Returns torch.exp() of the loss accumulated by LanguageCrossEntropyLoss, i.e. the perplexity.
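
Continuing the hypothetical sketch above, a usage example that accumulates a few batches and then exponentiates the averaged loss:

```python
torch.manual_seed(0)
metric = PerplexitySketch()
for _ in range(3):  # simulate three evaluation batches
    logits = torch.randn(16, 100)           # 16 tokens, vocab size 100
    targets = torch.randint(0, 100, (16,))
    metric.update(logits, targets)
print(metric.compute())  # exp of the mean cross-entropy over all 48 tokens
```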