AdapterHub

Task Adapters


Commonsense QA

To investigate question answering with prior knowledge, we present CommonsenseQA: a challenging new dataset for commonsense question answering. To capture common sense beyond associations, we extract from ConceptNet (Speer et al., 2017) multiple target concepts that have the same semantic relation to a single source concept.
comsense/csqa@ukp (roberta-base)
1 version · Architecture: pfeiffer
Pfeiffer Adapter trained on Commonsense QA.

comsense/csqa@ukp (bert-base-uncased)
1 version · Architecture: pfeiffer
Pfeiffer Adapter trained on Commonsense QA.
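The listings above only name the adapters and their base models. As a rough usage sketch (an assumption, not shown on this page), such an adapter is typically loaded with the adapter-transformers library, using the identifier and config string from the roberta-base listing above:

    # Minimal sketch (assumption): loading the comsense/csqa@ukp adapter.
    # Requires the adapter-transformers fork installed in place of transformers.
    from transformers import AutoModelWithHeads, AutoTokenizer

    # Base model the first adapter listed above was trained on.
    model = AutoModelWithHeads.from_pretrained("roberta-base")
    tokenizer = AutoTokenizer.from_pretrained("roberta-base")

    # Download the Pfeiffer adapter from AdapterHub and activate it.
    adapter_name = model.load_adapter("comsense/csqa@ukp", config="pfeiffer")
    model.set_active_adapters(adapter_name)

The same calls with bert-base-uncased as the base model would load the second adapter listed above.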
