AdapterHub

Task Adapters

Pre-trained model: bart (shortcut names: facebook/bart-base, facebook/bart-large)

MRPC

The Microsoft Research Paraphrase Corpus (MRPC) consists of sentence pairs automatically extracted from online news sources, with human annotations indicating whether the sentences in each pair are semantically equivalent.
sts/mrpc@ukp (facebook/bart-base)
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 16

Adapter for bart-base in the Pfeiffer architecture, trained on the MRPC dataset for 15 epochs with early stopping and a learning rate of 1e-4.

sts/mrpc@ukp (facebook/bart-base)
1 version · Architecture: houlsby · Non-linearity: swish · Reduction factor: 16

Adapter for bart-base in the Houlsby architecture, trained on the MRPC dataset for 15 epochs with early stopping and a learning rate of 1e-4.
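
As a minimal sketch, either adapter above can be pulled by its Hub identifier and activated on the base model. This assumes the `adapter-transformers` fork of Hugging Face Transformers (`pip install adapter-transformers`) and network access; the exact API surface may differ between library versions.

```python
# Sketch: loading one of the MRPC adapters listed above.
# Identifiers ("sts/mrpc@ukp", "facebook/bart-base") come from this listing;
# the library calls assume the adapter-transformers fork of Transformers.
from transformers import BartModel

model = BartModel.from_pretrained("facebook/bart-base")

# Fetch the Pfeiffer-architecture adapter from AdapterHub; pass
# config="houlsby" instead to fetch the Houlsby variant.
adapter_name = model.load_adapter("sts/mrpc@ukp", config="pfeiffer")

# Activate the adapter so its bottleneck layers run on every forward pass.
model.set_active_adapters(adapter_name)
```

With the adapter active, only the small bottleneck modules (reduction factor 16, i.e. hidden size / 16) differ from the frozen bart-base weights.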


Brought to you with ❤️ by the AdapterHub Team