AdapterHub

Task Adapters

Pre-trained models (all architectures): distilbert, bert, xlm-roberta, roberta

BoolQ

BoolQ is a reading comprehension dataset of naturally occurring yes/no questions, which turn out to be unexpectedly challenging.
qa/boolq@ukp distilbert-base-uncased
1 version · Architecture: pfeiffer

Adapter for distilbert-base-uncased in Pfeiffer architecture trained on the BoolQ dataset for 15 epochs with early stopping and a learning rate of 1e-4.

qa/boolq@ukp roberta-base
1 version · Architecture: pfeiffer

Pfeiffer Adapter trained on the BoolQ task.

qa/boolq@ukp bert-base-uncased
1 version · Architecture: pfeiffer

Pfeiffer Adapter trained on the BoolQ task.
