AdapterHub

Task Adapters

Pre-trained models: distilbert, bert, xlm-roberta, roberta

MultiNLI

Multi-Genre Natural Language Inference is a large-scale, crowdsourced entailment classification task. Given a pair of sentences, the goal is to predict whether the second sentence is an entailment, contradiction, or neutral with respect to the first one.
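To make the three-way label set concrete, here is a small illustration; the sentence pairs below are invented for this example, not drawn from the corpus:

```python
# Three-way NLI label set used by MultiNLI.
LABELS = ("entailment", "neutral", "contradiction")

# (premise, hypothesis, gold label) — illustrative examples only.
examples = [
    ("A man is playing a guitar on stage.",
     "A person is making music.", "entailment"),
    ("A man is playing a guitar on stage.",
     "The man is a famous musician.", "neutral"),
    ("A man is playing a guitar on stage.",
     "Nobody is playing an instrument.", "contradiction"),
]

for premise, hypothesis, label in examples:
    print(f"{label:13s} | P: {premise} | H: {hypothesis}")
```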
nli/multinli@ukp roberta-large
1 version Architecture: pfeiffer

Adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 10 epochs).

nli/multinli@ukp roberta-base
1 version Architecture: pfeiffer

Pfeiffer Adapter trained on Multi-NLI.

nli/multinli@ukp distilbert-base-uncased
1 version Architecture: pfeiffer

Adapter for distilbert-base-uncased in Pfeiffer architecture trained on the MultiNLI dataset for 15 epochs with early stopping and a learning rate of 1e-4.

nli/multinli@ukp bert-base-uncased
1 version Architecture: pfeiffer

Adapter in Pfeiffer architecture trained on the MultiNLI task for 20 epochs with early stopping and a learning rate of 1e-4. See https://arxiv.org/pdf/2007.07779.pdf.

nli/multinli@ukp distilbert-base-uncased
1 version Architecture: houlsby

Adapter for distilbert-base-uncased in Houlsby architecture trained on the MultiNLI dataset for 15 epochs with early stopping and a learning rate of 1e-4.

nli/multinli@ukp bert-base-uncased
1 version Architecture: houlsby

Adapter in Houlsby architecture trained on the MultiNLI task for 20 epochs with early stopping and a learning rate of 1e-4. See https://arxiv.org/pdf/2007.07779.pdf.
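The adapters listed above can be loaded with the adapter-transformers library using the `nli/multinli@ukp` identifier shown on this page. The sketch below assumes the `AutoModelWithHeads` interface from that library; the exact API surface varies between library versions:

```python
# Sketch: loading the MultiNLI adapter listed above into bert-base-uncased.
# Requires the adapter-transformers package (which patches `transformers`);
# falls back gracefully if it is unavailable or the hub cannot be reached.
ADAPTER_ID = "nli/multinli@ukp"  # identifier as shown on this page
BASE_MODEL = "bert-base-uncased"

loaded = False
try:
    from transformers import AutoModelWithHeads, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
    model = AutoModelWithHeads.from_pretrained(BASE_MODEL)

    # Downloads the adapter weights and prediction head from AdapterHub
    # and activates them for inference.
    adapter_name = model.load_adapter(ADAPTER_ID)
    model.active_adapters = adapter_name
    loaded = True
except Exception as exc:  # library missing or no network access
    print(f"Could not load adapter ({exc}); install adapter-transformers")
```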
