AdapterHub

Task Adapters


SQuAD 1.1

The Stanford Question Answering Dataset (SQuAD) is a reading comprehension dataset consisting of questions posed by crowdworkers on a set of Wikipedia articles, where the answer to every question is a segment of text (a span) from the corresponding reading passage. In SQuAD 1.1 every question is answerable from its passage; unanswerable questions were only introduced with SQuAD 2.0.
Website · 🤗 huggingface.co
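
As published on the Hugging Face Hub, each SQuAD record pairs a question with a context passage and one or more answer spans. A minimal sketch for inspecting the layout with the `datasets` library (the field names below follow the Hub's `squad` dataset):

```python
from datasets import load_dataset

# Pull the SQuAD 1.1 validation split from the Hugging Face Hub.
squad = load_dataset("squad", split="validation")

# Every record is a plain dict with the fields:
#   id: str, title: str, context: str, question: str,
#   answers: {"text": list[str], "answer_start": list[int]}
print(squad[0].keys())
```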
qa/squad1@ukp · bert-base-uncased
1 version · Architecture: pfeiffer

Adapter for bert-base-uncased in the Pfeiffer architecture, trained on the SQuAD 1.1 dataset for 15 epochs with early stopping and a learning rate of 3e-4.

qa/squad1@ukp · distilbert-base-uncased
1 version · Architecture: houlsby

Adapter for distilbert-base-uncased in the Houlsby architecture, trained on the SQuAD 1.1 dataset for 15 epochs with early stopping and a learning rate of 1e-4.

qa/squad1@ukp · roberta-base
1 version · Architecture: pfeiffer

Adapter for roberta-base in the Pfeiffer architecture, trained on the SQuAD 1.1 dataset for 15 epochs with early stopping and a learning rate of 1e-4.

qa/squad1@ukp · bert-base-uncased
1 version · Architecture: houlsby

Adapter for bert-base-uncased in the Houlsby architecture, trained on the SQuAD 1.1 dataset for 15 epochs with early stopping and a learning rate of 3e-4.

qa/squad1@ukp · distilbert-base-uncased
1 version · Architecture: pfeiffer

Adapter for distilbert-base-uncased in the Pfeiffer architecture, trained on the SQuAD 1.1 dataset for 15 epochs with early stopping and a learning rate of 1e-4.

qa/squad1@ukp · roberta-base
1 version · Architecture: houlsby

Adapter for roberta-base in the Houlsby architecture, trained on the SQuAD 1.1 dataset for 15 epochs with early stopping and a learning rate of 1e-4.
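
Any of the adapters above can be loaded by its AdapterHub identifier. As a minimal sketch, assuming the legacy `adapter-transformers` package (the drop-in fork of `transformers` that AdapterHub originally shipped; the newer `adapters` package exposes the same calls via `AutoAdapterModel`):

```python
# Requires: pip install adapter-transformers (a fork of transformers)
from transformers import AutoModelWithHeads, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

# "qa/squad1@ukp" is the identifier shown on the cards above;
# config="pfeiffer" selects the Pfeiffer variant of the adapter.
adapter_name = model.load_adapter("qa/squad1@ukp", config="pfeiffer")
model.set_active_adapters(adapter_name)
```

Because only the adapter weights are downloaded (a few megabytes rather than a full model checkpoint), switching between the Pfeiffer and Houlsby variants is just a matter of loading a different `config`.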

AdapterHub/roberta-base-pf-squad · roberta-base
Hosted on huggingface.co

# Adapter `AdapterHub/roberta-base-pf-squad` for roberta-base An [adapter](https://adapterhub.ml) for the `roberta-base` model that was trained on the...

AdapterHub/bert-base-uncased-pf-squad · bert-base-uncased
Hosted on huggingface.co

# Adapter `AdapterHub/bert-base-uncased-pf-squad` for bert-base-uncased An [adapter](https://adapterhub.ml) for the `bert-base-uncased` model that was trained on the...

calpt/adapter-bert-base-squad1 · bert-base-uncased
Hosted on huggingface.co

# BERT-base Adapter for SQuAD 1.1 Imported from https://adapterhub.ml/adapters/ukp/bert-base-uncased_qa_squad1_houlsby/.
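
The Hub-hosted `-pf-` adapters above ship with a question-answering prediction head, so they can be used for inference directly. A sketch assuming the current `adapters` package (`pip install adapters`); the question and context strings are made-up inputs:

```python
import torch
from adapters import AutoAdapterModel
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoAdapterModel.from_pretrained("roberta-base")

# Load the adapter (including its QA head) from the Hugging Face Hub.
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-squad")
model.set_active_adapters(adapter_name)

# SQuAD-style span extraction: take the highest-scoring start/end positions.
question = "What is SQuAD?"
context = "SQuAD is a reading comprehension dataset based on Wikipedia articles."
inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
start = outputs.start_logits.argmax()
end = outputs.end_logits.argmax()
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]))
```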
