AdapterHub
Task Adapters

Pre-trained model: bart
Shortcut names: facebook/bart-base, facebook/bart-large

MultiNLI

Multi-Genre Natural Language Inference is a large-scale, crowdsourced entailment classification task. Given a pair of sentences, the goal is to predict whether the second sentence is an entailment, contradiction, or neutral with respect to the first one.
  Website
nli/multinli@ukp for facebook/bart-base
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 16 · Head:

Adapter for bart-base in the Pfeiffer architecture, trained on the MultiNLI dataset for 15 epochs with early stopping and a learning rate of 1e-4.
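For illustration, the snippet below sketches how an adapter like this one can be loaded into bart-base. It assumes the adapter-transformers package and follows the usual AdapterHub loading pattern; the adapter identifier nli/multinli@ukp and config name come from the card above, while the presence of a bundled classification head and the exact class names (e.g. AutoModelWithHeads) are assumptions and may differ between library versions.

```python
# Minimal sketch: loading the Pfeiffer MultiNLI adapter into bart-base
# (assumes the adapter-transformers package; class names may differ in newer releases).
import torch
from transformers import AutoModelWithHeads, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")
model = AutoModelWithHeads.from_pretrained("facebook/bart-base")

# Download the adapter from AdapterHub and activate it.
adapter_name = model.load_adapter("nli/multinli@ukp", config="pfeiffer")
model.set_active_adapters(adapter_name)

# Classify a premise/hypothesis pair (assumes the adapter ships with an NLI head;
# the mapping of logits to entailment/neutral/contradiction depends on the head config).
premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."
inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())
```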

nli/multinli@ukp for facebook/bart-base
1 version · Architecture: houlsby · Non-linearity: swish · Reduction factor: 16 · Head:

Adapter for bart-base in the Houlsby architecture, trained on the MultiNLI dataset for 15 epochs with early stopping and a learning rate of 1e-4.
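Loading the Houlsby variant follows the same pattern; only the requested adapter config changes. Again, this is a hedged sketch under the same assumptions as the snippet above, not the card's own code:

```python
# Same flow as in the previous snippet (reuses the bart-base model and tokenizer),
# but requesting the Houlsby variant (swish non-linearity, reduction factor 16).
adapter_name = model.load_adapter("nli/multinli@ukp", config="houlsby")
model.set_active_adapters(adapter_name)
```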

Paper
