AdapterHub

Task Adapters

Pre-trained model: bert

RTE

Recognizing Textual Entailment (RTE) is a binary entailment task similar to MNLI, but with much less training data: given a premise and a hypothesis, the model predicts whether the premise entails the hypothesis (e.g., "A soccer game with multiple males playing." entails "Some men are playing a sport.").
  Website
nli/rte@ukp (bert-base-uncased)
1 version · Architecture: pfeiffer

An adapter in the Pfeiffer architecture, trained on the RTE task for 20 epochs with early stopping and a learning rate of 1e-4. See https://arxiv.org/pdf/2007.07779.pdf.
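
Adapters listed with an `@ukp` identifier can be pulled straight from the Hub. The sketch below is a minimal example assuming the adapter-transformers package that AdapterHub originally shipped with; class and method names may differ in the newer `adapters` library.

```python
# Minimal sketch: load the Pfeiffer-architecture RTE adapter from AdapterHub.
# Assumes the adapter-transformers package (pip install adapter-transformers).
from transformers import AutoModelWithHeads, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

# Download the RTE adapter weights and activate them for the forward pass.
adapter_name = model.load_adapter("nli/rte@ukp", config="pfeiffer")
model.set_active_adapters(adapter_name)
```

Passing config="houlsby" instead selects the Houlsby-architecture version of the same adapter, listed below.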

nli/rte@ukp (bert-base-uncased)
1 version · Architecture: houlsby

An adapter in the Houlsby architecture, trained on the RTE task for 20 epochs with early stopping and a learning rate of 1e-4. See https://arxiv.org/pdf/2007.07779.pdf.

AdapterHub/bert-base-uncased-pf-rte (bert-base-uncased)
Hosted on huggingface.co

Adapter `AdapterHub/bert-base-uncased-pf-rte` for bert-base-uncased: an [adapter](https://adapterhub.ml) for the `bert-base-uncased` model that was trained on the...
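
Because this version is hosted on the Hugging Face Hub and bundles a classification head, it can be used for inference directly. A minimal sketch under the same adapter-transformers assumptions as above; the sentence pair is illustrative, and the index-to-label mapping should be checked against the loaded head.

```python
# Minimal sketch: load the Hub-hosted RTE adapter (with its bundled
# classification head) and run a single entailment prediction.
import torch
from transformers import AutoModelWithHeads, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

# source="hf" tells adapter-transformers to fetch from the Hugging Face Hub.
adapter_name = model.load_adapter("AdapterHub/bert-base-uncased-pf-rte", source="hf")
model.set_active_adapters(adapter_name)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."
inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
# RTE is binary; consult the head's label map before reading a class
# name off a particular index.
print(logits.argmax(dim=-1))
```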


Brought to you with ❤️ by the AdapterHub Team