AdapterHub

Task Adapters

Pre-trained model: gpt2

MRPC

The Microsoft Research Paraphrase Corpus consists of sentence pairs automatically extracted from online news sources, with human annotations indicating whether the sentences in each pair are semantically equivalent.
sts/mrpc@ukp gpt2
1 version | Architecture: pfeiffer | Non-linearity: relu | Reduction factor: 16 | Head:

Adapter for gpt2 in Pfeiffer architecture trained on the MRPC dataset for 10 epochs with a learning rate of 1e-4.

sts/mrpc@ukp gpt2
1 version | Architecture: houlsby | Non-linearity: swish | Reduction factor: 16 | Head:

Adapter for gpt2 in Houlsby architecture trained on the MRPC dataset for 10 epochs with a learning rate of 1e-4.
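The hyperparameters listed on the cards above describe the adapter's bottleneck layout: with GPT-2's hidden size of 768 and a reduction factor of 16, each adapter projects down to 768 / 16 = 48 units, applies the non-linearity (relu for the Pfeiffer adapter, swish for the Houlsby one), projects back up, and adds a residual connection. A minimal NumPy sketch of one such bottleneck block (shapes and non-linearities taken from the cards; the weight values here are random placeholders, not the trained adapter weights):

```python
import numpy as np

def adapter_block(h, W_down, W_up, nonlinearity):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual."""
    z = h @ W_down                      # (seq, 768) -> (seq, 48)
    z = nonlinearity(z)
    return h + z @ W_up                 # back to (seq, 768), plus residual

relu = lambda x: np.maximum(x, 0.0)         # Pfeiffer adapter above
swish = lambda x: x / (1.0 + np.exp(-x))    # Houlsby adapter: swish(x) = x * sigmoid(x)

hidden, reduction = 768, 16                 # GPT-2 hidden size, reduction factor 16
bottleneck = hidden // reduction            # 48

rng = np.random.default_rng(0)
W_down = rng.normal(scale=0.02, size=(hidden, bottleneck))
W_up = np.zeros((bottleneck, hidden))       # zero-init up-projection: adapter starts as a no-op

h = rng.normal(size=(5, hidden))            # hidden states for 5 tokens
out = adapter_block(h, W_down, W_up, relu)
assert out.shape == (5, hidden)
assert np.allclose(out, h)                  # with zero-init W_up the block is the identity
```

The Houlsby variant inserts such a block twice per transformer layer and uses swish, while the Pfeiffer variant inserts it once with relu; both share the same bottleneck shape at reduction factor 16.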
