AdapterHub
Task Adapters

Pre-trained model: bart
Shortcut names: facebook/bart-base, facebook/bart-large

SQuAD 2.0

SQuAD2.0 combines the 100,000 questions in SQuAD1.1 with over 50,000 unanswerable questions written adversarially by crowdworkers to look similar to answerable ones. To do well on SQuAD2.0, systems must not only answer questions when possible, but also determine when no answer is supported by the paragraph and abstain from answering.
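In the SQuAD 2.0 JSON format, each unanswerable question is flagged with an `is_impossible` field and carries an empty answer list, so a system must learn to predict "no answer" for such items. A minimal sketch of that distinction, with hypothetical example entries (the question and answer text are invented for illustration):

```python
# Hypothetical SQuAD 2.0-style QA entries (text invented for illustration).
answerable = {
    "question": "What color is the sky?",
    "is_impossible": False,
    "answers": [{"text": "blue", "answer_start": 12}],
}
unanswerable = {
    "question": "What color is the sea?",
    "is_impossible": True,
    "answers": [],  # no answer is supported by the paragraph
}

def should_abstain(example: dict) -> bool:
    """A SQuAD 2.0 system must abstain exactly when no answer exists."""
    return example["is_impossible"]

print(should_abstain(unanswerable))  # True
print(should_abstain(answerable))    # False
```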
Website: 🤗 huggingface.co
Adapter: qa/squad2@lohfink-rossi
Model: facebook/bart-large
1 version
Architecture: lohfink-rossi-leaveout
Non-linearity: relu
Reduction factor: 16
Head:
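The configuration above describes a bottleneck adapter: a small feed-forward block with a down-projection, a relu non-linearity, and an up-projection, added residually to the hidden states inside a transformer layer. A minimal sketch of the forward pass, assuming bart-large's hidden size of 1024 (the layer-placement details of the custom Lohfink-Rossi-Leaveout architecture are not reproduced here):

```python
import numpy as np

HIDDEN = 1024                      # bart-large hidden size
REDUCTION = 16                     # reduction factor from the adapter config
BOTTLENECK = HIDDEN // REDUCTION   # 64-dimensional bottleneck

rng = np.random.default_rng(0)
W_down = rng.normal(scale=0.02, size=(HIDDEN, BOTTLENECK))
W_up = rng.normal(scale=0.02, size=(BOTTLENECK, HIDDEN))

def adapter_forward(h: np.ndarray) -> np.ndarray:
    """Bottleneck adapter: down-project, relu, up-project, residual add."""
    z = np.maximum(h @ W_down, 0.0)  # relu non-linearity
    return h + z @ W_up              # residual connection

x = rng.normal(size=(8, HIDDEN))     # hidden states for 8 tokens
y = adapter_forward(x)
print(y.shape)                       # (8, 1024): shape is preserved
```

With reduction factor 16 the block adds roughly 2 × 1024 × 64 ≈ 131k weights per adapter (plus biases in a real implementation), a small fraction of full fine-tuning.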

Adapter for bart-large using a custom architecture (Lohfink-Rossi-Leaveout), trained on the SQuAD 2.0 dataset for 15 epochs with a cosine-with-restarts learning rate scheduler and a learning rate of 0.001.
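A cosine-with-restarts schedule anneals the learning rate from its initial value toward zero along a cosine curve, then resets it and repeats. A minimal sketch, assuming the stated initial learning rate of 0.001 and a hypothetical restart period (the cycle length actually used for this adapter is not given in the card):

```python
import math

BASE_LR = 1e-3      # initial learning rate from the description
CYCLE_STEPS = 100   # hypothetical restart period (not stated in the card)

def cosine_with_restarts(step: int) -> float:
    """Cosine annealing from BASE_LR toward 0, restarting every CYCLE_STEPS."""
    t = step % CYCLE_STEPS  # position within the current cycle
    return 0.5 * BASE_LR * (1.0 + math.cos(math.pi * t / CYCLE_STEPS))

print(cosine_with_restarts(0))    # 0.001 at the start of a cycle
print(cosine_with_restarts(50))   # 0.0005 at mid-cycle
print(cosine_with_restarts(100))  # restarts back to 0.001
```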

Paper

Brought to you with ❤️ by the AdapterHub Team