AdapterHub

Task Adapters


RACE

RACE is a large-scale reading comprehension dataset with more than 28,000 passages and nearly 100,000 questions. The dataset is collected from English examinations in China designed for middle school and high school students, and can serve as training and test sets for machine comprehension.
Website · 🤗 huggingface.co
rc/race@ukp · distilbert-base-uncased
1 version · Architecture: pfeiffer

Adapter for distilbert-base-uncased in the Pfeiffer architecture, trained on the RACE dataset for 15 epochs with early stopping and a learning rate of 1e-4.

AdapterHub/bert-base-uncased-pf-race · bert-base-uncased
Hosted on huggingface.co

Adapter `AdapterHub/bert-base-uncased-pf-race` for bert-base-uncased: An [adapter](https://adapterhub.ml) for the `bert-base-uncased` model that was trained on the...

AdapterHub/roberta-base-pf-race · roberta-base
Hosted on huggingface.co

Adapter `AdapterHub/roberta-base-pf-race` for roberta-base: An [adapter](https://adapterhub.ml) for the `roberta-base` model that was trained on the...
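Any of the adapters listed above can be attached to its base model at load time. The snippet below is a minimal sketch, assuming the `adapters` library (the successor to `adapter-transformers`) is installed; it loads the `AdapterHub/bert-base-uncased-pf-race` entry from the Hugging Face Hub, and the same pattern applies to the roberta-base and distilbert variants with their respective base models and identifiers.

```python
# Minimal sketch, assuming the `adapters` library is installed (pip install adapters).
from adapters import AutoAdapterModel
from transformers import AutoTokenizer

# Load the base model the adapter was trained for.
model = AutoAdapterModel.from_pretrained("bert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Download the RACE adapter (and its bundled head, if any) from the Hub
# and activate it; the base model weights remain unchanged.
adapter_name = model.load_adapter(
    "AdapterHub/bert-base-uncased-pf-race", set_active=True
)
print(f"Active adapter: {adapter_name}")
```

Only the adapter (and head) weights are downloaded; in the Pfeiffer configuration these are small bottleneck modules inserted into each transformer layer, so the full base model does not need to be stored separately per task.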
