composer.models.huggingface#

A wrapper class that converts 🤗 Transformers models to composer models.

Classes

HuggingFaceModel

A wrapper class that converts 🤗 Transformers models to composer models.

class composer.models.huggingface.HuggingFaceModel(model, tokenizer=None, use_logits=False, metrics=None)[source]#

Bases: composer.models.base.ComposerModel

A wrapper class that converts 🤗 Transformers models to composer models.

Parameters
  • model (PreTrainedModel) – A 🤗 Transformers model.

  • tokenizer (PreTrainedTokenizer, optional) – Tokenizer used to prepare the dataset and validate model inputs during training. Default: None.

  • use_logits (bool, optional) – If True, the model's output logits will be used to calculate validation metrics. Else, metrics will be inferred from the HuggingFaceModel directly. Default: False.

  • metrics (list[Metric], optional) – A list of torchmetrics to apply to the output of validate. Default: None.
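To illustrate what the use_logits flag controls, here is a minimal, dependency-free sketch (not Composer's actual implementation): with use_logits=True, validation metrics receive the logits entry of the model's output rather than the raw output. The update_metric helper and CountingMetric stand-in are hypothetical names invented for this example.

```python
# Hypothetical sketch of the use_logits behavior; not Composer's code.
def update_metric(outputs, labels, metric, use_logits):
    # With use_logits=True, feed the output logits to the metric;
    # otherwise pass the raw model output through unchanged.
    preds = outputs["logits"] if use_logits else outputs
    metric.update(preds, labels)

class CountingMetric:
    """Minimal stand-in for a torchmetrics Metric (accuracy over argmax)."""
    def __init__(self):
        self.correct = 0
        self.total = 0

    def update(self, logits, labels):
        # argmax over each (num_classes,) row of the (batch, num_classes) logits
        for row, label in zip(logits, labels):
            pred = max(range(len(row)), key=row.__getitem__)
            self.correct += int(pred == label)
            self.total += 1

    def compute(self):
        return self.correct / self.total

metric = CountingMetric()
outputs = {"logits": [[0.1, 0.9], [0.8, 0.2]]}
update_metric(outputs, [1, 0], metric, use_logits=True)
print(metric.compute())  # 1.0 (both argmax predictions match the labels)
```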

Warning

This wrapper is designed to work with 🤗 datasets that define a labels column.

Example:

import transformers
from composer.models import HuggingFaceModel

hf_model = transformers.AutoModelForSequenceClassification.from_pretrained('bert-base-uncased', num_labels=2)
model = HuggingFaceModel(hf_model)
get_model_inputs()[source]#

Returns a set of inputs that the model expects in the forward pass. If an algorithm wants to interact with the model inputs (for instance, popping the labels for a custom loss fn, or adding attention head masks for head pruning), it must access self.set_model_inputs().

Returns

model_inputs – The set of keys that are expected in the Mapping used to compute the forward pass.
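One common way to derive such a set of expected inputs is to inspect the model's forward signature. The sketch below is an illustration of that idea, not Composer's code; TinyModel and the standalone get_model_inputs function are hypothetical names for this example.

```python
import inspect

class TinyModel:
    """Hypothetical stand-in for a 🤗 Transformers model."""
    def forward(self, input_ids, attention_mask=None, labels=None):
        return {"loss": 0.0}

def get_model_inputs(model):
    # Read the parameter names off the bound forward method;
    # `self` is excluded automatically for bound methods.
    sig = inspect.signature(model.forward)
    return set(sig.parameters.keys())

print(get_model_inputs(TinyModel()))
# {'input_ids', 'attention_mask', 'labels'} (in arbitrary set order)
```

An algorithm could check this set before, say, adding head-mask keys to a batch, so it never feeds the model an argument its forward pass does not accept.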