AdapterHub

Task Adapters

Pre-trained model: bert

MRPC

The Microsoft Research Paraphrase Corpus (MRPC) consists of sentence pairs automatically extracted from online news sources, with human annotations indicating whether the sentences in each pair are semantically equivalent.
sts/mrpc@ukp (bert-base-uncased)
1 version, Architecture: houlsby

An adapter in the Houlsby architecture, trained on the MRPC task for 20 epochs with early stopping and a learning rate of 1e-4. See https://arxiv.org/pdf/2007.07779.pdf.

sts/mrpc@ukp (bert-base-uncased)
1 version, Architecture: pfeiffer

An adapter in the Pfeiffer architecture, trained on the MRPC task for 20 epochs with early stopping and a learning rate of 1e-4. See https://arxiv.org/pdf/2007.07779.pdf.
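Both versions can be loaded by their Hub identifier. A minimal sketch, assuming the adapter-transformers library (AdapterHub's fork of Hugging Face Transformers), where the `config` argument selects between the Houlsby and Pfeiffer versions listed above:

```python
from transformers import AutoModelWithHeads, AutoTokenizer

# Base model that both sts/mrpc@ukp versions were trained on.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

# Fetch the MRPC adapter from the Hub; config="houlsby" or
# config="pfeiffer" picks which of the two versions is downloaded.
adapter_name = model.load_adapter("sts/mrpc@ukp", config="pfeiffer")

# Route all forward passes through the loaded adapter.
model.set_active_adapters(adapter_name)
```

Since no prediction head is listed for these versions, a task head may still need to be added before inference, e.g. with `model.add_classification_head("mrpc", num_labels=2)` from the same library.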

AdapterHub/bert-base-uncased-pf-mrpc (bert-base-uncased)
Hosted on huggingface.co

An [adapter](https://adapterhub.ml) for the `bert-base-uncased` model that was trained on the...
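The huggingface.co-hosted variant is loaded the same way, except that the identifier is resolved against the Hugging Face Hub rather than the AdapterHub index. Again a sketch under the adapter-transformers assumption:

```python
from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

# source="hf" resolves the identifier on huggingface.co instead of
# the AdapterHub index.
adapter_name = model.load_adapter(
    "AdapterHub/bert-base-uncased-pf-mrpc", source="hf"
)
model.set_active_adapters(adapter_name)
```

Here the adapter identifier doubles as the repository name on the Hugging Face Hub, which is why no `@org` suffix or `config` argument is needed.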
