AdapterHub

Task Adapters

Pre-trained models: bert, xlm-roberta, distilbert, roberta

RTE

Recognizing Textual Entailment (RTE) is a binary entailment task similar to MNLI, but with much less training data.
nli/rte@ukp (distilbert-base-uncased)
1 version | Architecture: pfeiffer

Adapter for distilbert-base-uncased in Pfeiffer architecture trained on the RTE dataset for 15 epochs with early stopping and a learning rate of 1e-4.
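Adapters listed here can be pulled directly from the Hub by their identifier. The snippet below is a minimal sketch assuming the adapter-transformers library (AdapterHub's fork of HuggingFace Transformers); the identifier and config name are taken from the entry above, and the exact API may differ across library versions.

    from transformers import AutoModelWithHeads, AdapterConfig

    # Load the base model with adapter support (adapter-transformers fork).
    model = AutoModelWithHeads.from_pretrained("distilbert-base-uncased")

    # Resolve the Pfeiffer adapter config and pull nli/rte@ukp from the Hub.
    config = AdapterConfig.load("pfeiffer")
    adapter_name = model.load_adapter("nli/rte@ukp", config=config)

    # Activate the adapter so it is used in the forward pass.
    model.set_active_adapters(adapter_name)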

nli/rte@ukp (bert-base-uncased)
1 version | Architecture: pfeiffer

Adapter for bert-base-uncased in Pfeiffer architecture trained on the RTE task for 20 epochs with early stopping and a learning rate of 1e-4. See https://arxiv.org/pdf/2007.07779.pdf.

nli/rte@ukp (distilbert-base-uncased)
1 version | Architecture: houlsby

Adapter for distilbert-base-uncased in Houlsby architecture trained on the RTE dataset for 15 epochs with early stopping and a learning rate of 1e-4.

nli/rte@ukp (bert-base-uncased)
1 version | Architecture: houlsby

Adapter for bert-base-uncased in Houlsby architecture trained on the RTE task for 20 epochs with early stopping and a learning rate of 1e-4. See https://arxiv.org/pdf/2007.07779.pdf.

nli/rte@ukp (roberta-base)
1 version | Architecture: pfeiffer

Pfeiffer adapter for roberta-base trained on the RTE dataset.
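Once an adapter is loaded, inference on a sentence pair works like ordinary sequence classification. The sketch below again assumes adapter-transformers; since the entries above list no bundled prediction head, a binary classification head is added manually for illustration (the head name "rte", the example sentences, and the label order are assumptions, and the head must be trained or loaded for predictions to be meaningful).

    from transformers import AutoModelWithHeads, AutoTokenizer, AdapterConfig

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

    # Load the Pfeiffer RTE adapter under the local name "rte".
    model.load_adapter("nli/rte@ukp",
                       config=AdapterConfig.load("pfeiffer"),
                       load_as="rte")

    # Add an (untrained) two-label head for illustration and activate
    # the adapter; with a matching name the head is activated as well.
    model.add_classification_head("rte", num_labels=2)
    model.set_active_adapters("rte")

    # Score a premise/hypothesis pair.
    inputs = tokenizer("A man is sleeping.", "The man is awake.",
                       return_tensors="pt")
    logits = model(**inputs).logits  # shape (1, 2)
    print(logits.argmax(dim=-1).item())  # label index; mapping depends on the head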
