AdapterHub

Task Adapters

Pre-trained model: bert (bert-base-uncased)

BoolQ

BoolQ is a reading comprehension dataset of naturally occurring yes/no questions, which prove unexpectedly challenging.
Website | 🤗 huggingface.co
qa/boolq@ukp (bert-base-uncased)
1 version | Architecture: pfeiffer

Pfeiffer Adapter trained on the BoolQ task.

AdapterHub/bert-base-uncased-pf-boolq (bert-base-uncased)
huggingface.co

# Adapter `AdapterHub/bert-base-uncased-pf-boolq` for bert-base-uncased

An [adapter](https://adapterhub.ml) for the `bert-base-uncased` model that was trained on the...
