AdapterHub

Task Adapters

Adapters are available for the following pre-trained model architectures: bert, bart, xlm-roberta, distilbert, gpt2, roberta, mbart.

Commonsense QA

To investigate question answering with prior knowledge, we present CommonsenseQA: a challenging new dataset for commonsense question answering. To capture common sense beyond associations, we extract from ConceptNet (Speer et al., 2017) multiple target concepts that have the same semantic relation to a single source concept.
Links: Website · 🤗 huggingface.co
comsense/csqa@ukp (bert-base-uncased)
1 version · Architecture: pfeiffer

Pfeiffer Adapter trained on Commonsense QA.

comsense/csqa@ukp (roberta-base)
1 version · Architecture: pfeiffer

Pfeiffer Adapter trained on Commonsense QA.

AdapterHub/bert-base-uncased-pf-commonsense_qa (bert-base-uncased)
huggingface.co

# Adapter `AdapterHub/bert-base-uncased-pf-commonsense_qa` for bert-base-uncased An [adapter](https://adapterhub.ml) for the `bert-base-uncased` model that was trained on the...

AdapterHub/roberta-base-pf-commonsense_qa (roberta-base)
huggingface.co

# Adapter `AdapterHub/roberta-base-pf-commonsense_qa` for roberta-base An [adapter](https://adapterhub.ml) for the `roberta-base` model that was trained on the...
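A listed adapter can be loaded and activated on top of its base model. The sketch below uses the `adapters` library (the successor to `adapter-transformers`); it assumes the `adapters` and `transformers` packages are installed and that the adapter is downloaded from the Hugging Face Hub:

```python
# Minimal sketch: load a Commonsense QA adapter onto bert-base-uncased.
# Assumes the `adapters` and `transformers` packages are installed.
from adapters import AutoAdapterModel
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoAdapterModel.from_pretrained("bert-base-uncased")

# Fetch the Pfeiffer adapter (with its prediction head) from the Hub
# and make it the active adapter for forward passes.
adapter_name = model.load_adapter(
    "AdapterHub/bert-base-uncased-pf-commonsense_qa", source="hf"
)
model.set_active_adapters(adapter_name)
```

Only the small adapter weights are fetched on top of the frozen base model, so switching between the task adapters listed above is cheap.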
