AdapterHub

Task Adapters

Pre-trained model: bert

Available architectures: gpt2, bert, xlm-roberta, roberta, distilbert, bart, mbart

Available shortcut names: bert-base-uncased, bert-base-multilingual-cased, aubmindlab/bert-base-arabert, bert-base-multilingual-uncased, bert-base-cased, malteos/scincl

MultiRC

The Multi-Sentence Reading Comprehension dataset (MultiRC, Khashabi et al., 2018) is a true/false question-answering task. Each example consists of a context paragraph, a question about that paragraph, and a list of possible answers to that question which must be labeled as true or false. The paragraphs are drawn from seven domains including news, fiction, and historical text.
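The task structure described above can be sketched as a minimal Python record. The field and label names here are illustrative, not the official dataset schema:

```python
# One MultiRC-style example: a context paragraph, a question about it,
# and candidate answers each labeled true (1) or false (0).
# Note that more than one answer can be true for the same question.
example = {
    "paragraph": "Alice moved to Paris in 2010. She opened a bakery two years later.",
    "question": "When did Alice open her bakery?",
    "answers": [
        {"text": "In 2012", "label": 1},  # true: 2010 plus two years
        {"text": "In 2010", "label": 0},  # false: that is when she moved
    ],
}

# For classification, each (paragraph, question, answer) triple is
# typically scored independently as a true/false decision.
pairs = [
    (example["paragraph"], example["question"], a["text"], a["label"])
    for a in example["answers"]
]
```

Each triple in `pairs` is one binary classification instance, which is the form a sentence-pair classification head (such as the one in the adapter below) consumes.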
Website

AdapterHub/bert-base-uncased-pf-multirc (base model: bert-base-uncased, hosted on huggingface.co). Head:

# Adapter `AdapterHub/bert-base-uncased-pf-multirc` for bert-base-uncased An [adapter](https://adapterhub.ml) for the `bert-base-uncased` model that was trained on the...
