
Task Adapters

Pre-trained model: roberta (shortcut names: roberta-base, roberta-large)

MRPC

The Microsoft Research Paraphrase Corpus (MRPC) consists of sentence pairs automatically extracted from online news sources, with human annotations indicating whether the sentences in each pair are semantically equivalent.
  Website
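
The dataset itself can be inspected independently of any adapter. The following is a minimal sketch using the Hugging Face `datasets` library (the standard GLUE/MRPC loader on the Hub, not something provided by AdapterHub); column and split names follow that loader.

```python
# Sketch: loading and inspecting MRPC via the `datasets` library.
from datasets import load_dataset

mrpc = load_dataset("glue", "mrpc")   # splits: train / validation / test
example = mrpc["train"][0]
print(example["sentence1"])
print(example["sentence2"])
print(example["label"])               # 1 = paraphrase, 0 = not a paraphrase
```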
sts/mrpc@ukp (roberta-base)
1 version · Architecture: pfeiffer

A Pfeiffer adapter trained on the MRPC dataset.
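
Adapters listed with an AdapterHub identifier such as `sts/mrpc@ukp` can be pulled in with the adapter-transformers library. A minimal sketch, assuming the `AutoModelWithHeads` API from adapter-transformers (newer releases of the `adapters` package use `AutoAdapterModel` instead):

```python
# Sketch: loading the sts/mrpc@ukp adapter into roberta-base for inference.
from transformers import AutoModelWithHeads, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelWithHeads.from_pretrained("roberta-base")

# Resolve the adapter from AdapterHub and activate it.
adapter_name = model.load_adapter("sts/mrpc@ukp", config="pfeiffer")
model.set_active_adapters(adapter_name)

inputs = tokenizer("The company bought the startup.",
                   "The startup was acquired by the company.",
                   return_tensors="pt")
outputs = model(**inputs)
```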

sts/mrpc@ukp (roberta-base)
1 version · Architecture: houlsby

MRPC adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 30 epochs).

sts/mrpc@ukp (roberta-large)
1 version · Architecture: houlsby

MRPC adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 30 epochs).

sts/mrpc@ukp (roberta-large)
1 version · Architecture: pfeiffer

MRPC adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 30 epochs).

AdapterHub/roberta-base-pf-mrpc (roberta-base)
Hosted on huggingface.co

# Adapter `AdapterHub/roberta-base-pf-mrpc` for roberta-base An [adapter](https://adapterhub.ml) for the `roberta-base` model that was trained on the...
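
For adapters hosted on the Hugging Face Hub, loading follows the pattern shown in AdapterHub model cards. A sketch, again assuming the adapter-transformers package; `source="hf"` tells `load_adapter` to resolve the identifier from huggingface.co rather than from AdapterHub:

```python
# Sketch: loading the Hub-hosted MRPC adapter into roberta-base.
from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("roberta-base")
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-mrpc", source="hf")
model.active_adapters = adapter_name
```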

Paper
