AdapterHub

Task Adapters

Pre-trained model: bert

SQuAD 2.0

SQuAD 2.0 combines the 100,000 questions in SQuAD 1.1 with over 50,000 unanswerable questions written adversarially by crowdworkers to look similar to answerable ones. To do well on SQuAD 2.0, systems must not only answer questions when possible, but also determine when no answer is supported by the paragraph and abstain from answering.

Website · 🤗 huggingface.co
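To make the abstention requirement concrete, here is a minimal sketch of the SQuAD 2.0 data format. The context, questions, and field values below are illustrative, not taken from the dataset; the key point is that unanswerable questions are marked with `is_impossible` and carry an empty `answers` list, so a system should abstain on them.

```python
# Illustrative SQuAD 2.0-style entry: answerable questions carry answer
# spans; unanswerable ones set "is_impossible" and have no answers.
example = {
    "context": "The Normans gave their name to Normandy, a region in France.",
    "qas": [
        {
            "question": "Who gave their name to Normandy?",
            "is_impossible": False,
            "answers": [{"text": "The Normans", "answer_start": 0}],
        },
        {
            "question": "When did the Normans leave Normandy?",
            "is_impossible": True,
            "answers": [],  # a SQuAD 2.0 system should abstain here
        },
    ],
}

def should_abstain(qa):
    """Return True when the paragraph supports no answer for this question."""
    return qa["is_impossible"]

print([should_abstain(qa) for qa in example["qas"]])  # → [False, True]
```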
qa/squad2@ukp (bert-base-uncased)
1 version · Architecture: pfeiffer

Adapter for bert-base-uncased in Pfeiffer architecture trained on the SQuAD 2.0 dataset for 15 epochs with early stopping and a learning rate of 1e-4.

qa/squad2@ukp (bert-base-uncased)
1 version · Architecture: houlsby

Adapter for bert-base-uncased in Houlsby architecture trained on the SQuAD 2.0 dataset for 15 epochs with early stopping and a learning rate of 1e-4.

AdapterHub/bert-base-uncased-pf-squad_v2 (bert-base-uncased)
huggingface.co

# Adapter `AdapterHub/bert-base-uncased-pf-squad_v2` for bert-base-uncased An [adapter](https://adapterhub.ml) for the `bert-base-uncased` model that was trained on the...
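The adapters listed above can be attached to a plain `bert-base-uncased` checkpoint. The following is a hedged sketch, assuming the `adapters` package (the successor to adapter-transformers) is installed; the function name `load_squad2_adapter` is ours, and downloading the base model and adapter weights requires network access.

```python
# Sketch: attach the Hub-hosted SQuAD 2.0 adapter to bert-base-uncased.
# Identifiers are taken from the listing above; the loading calls assume
# the `adapters` package API.
ADAPTER_ID = "AdapterHub/bert-base-uncased-pf-squad_v2"
BASE_MODEL = "bert-base-uncased"

def load_squad2_adapter():
    # Imported here because the package (and a weights download) is
    # only needed when this function is actually called.
    from adapters import AutoAdapterModel

    model = AutoAdapterModel.from_pretrained(BASE_MODEL)
    name = model.load_adapter(ADAPTER_ID)  # fetches adapter weights + QA head
    model.set_active_adapters(name)        # route forward passes through it
    return model

if __name__ == "__main__":
    model = load_squad2_adapter()  # downloads weights on first run
```

Only the small adapter weights and prediction head are task-specific; the frozen base model can be shared across many such adapters.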


Brought to you with ❤️ by the AdapterHub Team