AdapterHub

Task Adapters

Pre-trained model: roberta (roberta-base)
Available architectures: bert, xlm-roberta, distilbert, gpt2, bart, roberta, mbart
Available shortcut names for roberta: roberta-base, roberta-large

QNLI

Question Natural Language Inference (QNLI) is a version of SQuAD that has been converted to a binary classification task. The positive examples are (question, sentence) pairs that contain the correct answer; the negative examples are (question, sentence) pairs from the same paragraph that do not contain the answer.
nli/qnli@ukp (roberta-base)
1 version · Architecture: pfeiffer

QNLI adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 15 epochs).

nli/qnli@ukp (roberta-base)
1 version · Architecture: houlsby

QNLI adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 15 epochs).
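The two `nli/qnli@ukp` entries above differ only in their adapter architecture. A minimal sketch of attaching one of them to `roberta-base`, assuming the adapter-transformers fork of HuggingFace Transformers is installed and that the AdapterHub identifier still resolves:

```python
# Sketch: load the QNLI adapter from AdapterHub into roberta-base.
# Assumes the adapter-transformers fork of HuggingFace Transformers.
from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("roberta-base")

# Pick the architecture variant listed above: "pfeiffer" or "houlsby".
adapter_name = model.load_adapter("nli/qnli@ukp", config="pfeiffer")
model.set_active_adapters(adapter_name)
```

Because both adapters ship with a classification head, the loaded model can be used for QNLI prediction without adding a new head.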

AdapterHub/roberta-base-pf-qnli (roberta-base)
Hosted on huggingface.co

Adapter `AdapterHub/roberta-base-pf-qnli` for roberta-base: an [adapter](https://adapterhub.ml) for the `roberta-base` model that was trained on the...
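A hedged sketch of using this Hub-hosted version for inference, again assuming adapter-transformers; the (question, sentence) pair is illustrative and not taken from the adapter's documentation:

```python
# Sketch: load the HF-hosted QNLI adapter and classify a (question, sentence) pair.
# Assumes the adapter-transformers fork of HuggingFace Transformers.
import torch
from transformers import AutoModelWithHeads, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelWithHeads.from_pretrained("roberta-base")

adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-qnli", source="hf")
model.set_active_adapters(adapter_name)

question = "What causes precipitation to fall?"  # illustrative example pair
sentence = "Precipitation forms as water vapor condenses and falls under gravity."
inputs = tokenizer(question, sentence, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Binary QNLI decision; the index-to-label mapping comes from the adapter's head config.
print(logits.argmax(dim=-1).item())
```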
