AdapterHub

Task Adapters

Pre-trained model: roberta (shortcut names: roberta-base, roberta-large)

SQuAD 2.0

SQuAD2.0 combines the 100,000 questions in SQuAD1.1 with over 50,000 unanswerable questions written adversarially by crowdworkers to look similar to answerable ones. To do well on SQuAD2.0, systems must not only answer questions when possible, but also determine when no answer is supported by the paragraph and abstain from answering.
Website · 🤗 huggingface.co
qa/squad2@ukp · roberta-base
1 version · Architecture: pfeiffer

Adapter for roberta-base in Pfeiffer architecture trained on the SQuAD 2.0 dataset for 15 epochs with early stopping and a learning rate of 1e-4.

qa/squad2@ukp · roberta-base
1 version · Architecture: houlsby

Adapter for roberta-base in Houlsby architecture trained on the SQuAD 2.0 dataset for 15 epochs with early stopping and a learning rate of 1e-4.
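The two entries above differ only in the adapter architecture passed at load time. A minimal sketch of loading one of them, assuming the adapter-transformers fork of HuggingFace Transformers is installed (the heavy model download is kept behind a main guard):

```python
# Sketch: loading the qa/squad2@ukp adapter into roberta-base.
# Assumes the adapter-transformers library is installed; identifiers
# are taken from this page, everything else is illustrative.
HUB_ID = "qa/squad2@ukp"

def load_squad2_adapter(architecture: str = "pfeiffer"):
    """Load roberta-base and activate the SQuAD 2.0 adapter.

    architecture: "pfeiffer" or "houlsby", matching the two
    entries listed on this page.
    """
    # Import inside the function so the sketch can be read/imported
    # without the library present.
    from transformers import AutoModelWithHeads

    model = AutoModelWithHeads.from_pretrained("roberta-base")
    # config selects which of the two published variants is fetched
    name = model.load_adapter(HUB_ID, config=architecture)
    model.set_active_adapters(name)
    return model

if __name__ == "__main__":
    model = load_squad2_adapter("pfeiffer")  # or "houlsby"
```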

AdapterHub/roberta-base-pf-squad_v2 · roberta-base
huggingface.co

# Adapter `AdapterHub/roberta-base-pf-squad_v2` for roberta-base An [adapter](https://adapterhub.ml) for the `roberta-base` model that was trained on the...
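This third entry is hosted on the Hugging Face Hub rather than the original AdapterHub repository, so it is loaded with a `source="hf"` hint. A hedged sketch, again assuming adapter-transformers is installed and deferring the download to the main guard:

```python
# Sketch: loading the Hub-hosted variant of the SQuAD 2.0 adapter.
# The repo id comes from this page; the rest is illustrative.
HF_REPO_ID = "AdapterHub/roberta-base-pf-squad_v2"

def load_hub_adapter():
    """Load roberta-base with the Hub-hosted SQuAD 2.0 adapter."""
    from transformers import AutoModelWithHeads

    model = AutoModelWithHeads.from_pretrained("roberta-base")
    # source="hf" tells adapter-transformers to resolve the id on
    # the Hugging Face Hub instead of the AdapterHub index.
    name = model.load_adapter(HF_REPO_ID, source="hf")
    model.set_active_adapters(name)
    return model

if __name__ == "__main__":
    model = load_hub_adapter()
```

Unlike the `qa/squad2@ukp` entries, this repo bundles a prediction head, so the loaded model can be used for extractive QA directly.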

Paper

Brought to you with ❤️ by the AdapterHub Team