# Perplexity

class composer.metrics.Perplexity(dist_sync_on_step=False)[source]

Subclasses HFLanguageCrossEntropyLoss to implement perplexity.

If an algorithm modifies the loss function such that the loss is no longer provided directly in the model output, this metric can be expensive, since it must recompute the loss a second time.

compute()[source]

Returns torch.exp() of the accumulated LanguageCrossEntropyLoss, i.e. the perplexity.
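The relationship between the two metrics can be sketched without any framework code: perplexity is the exponential of the mean per-token negative log-likelihood (the cross-entropy). This is an illustrative sketch, not Composer's implementation; the function name `perplexity` and the plain-list input are assumptions for the example.

```python
import math

def perplexity(nll_per_token):
    # Perplexity = exp(mean per-token negative log-likelihood),
    # i.e. the exponential of the cross-entropy loss.
    return math.exp(sum(nll_per_token) / len(nll_per_token))

# A model that assigns probability 1/4 to every observed token has
# cross-entropy ln(4) per token, so its perplexity is approximately 4.
print(perplexity([math.log(4)] * 10))
```

Intuitively, a perplexity of 4 means the model is, on average, as uncertain as if it were choosing uniformly among 4 tokens at each step.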