AdapterHub

Task Adapters

Pre-trained model: bert-base-multilingual-cased

MultiNLI

Multi-Genre Natural Language Inference is a large-scale, crowdsourced entailment classification task. Given a pair of sentences, the goal is to predict whether the second sentence is an entailment, contradiction, or neutral with respect to the first one.
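To make the three-way labeling concrete, here are illustrative premise/hypothesis pairs, one per label (these examples are hypothetical stand-ins, not drawn from the MultiNLI corpus):

```python
# Illustrative premise/hypothesis pairs for the three NLI labels
# (hypothetical examples, not taken from MultiNLI itself).
examples = [
    ("A soccer game with multiple males playing.",
     "Some men are playing a sport.", "entailment"),
    ("A man is driving down a lonely road.",
     "The man is sleeping at home.", "contradiction"),
    ("An older and a younger man are smiling.",
     "Two men are laughing at a joke.", "neutral"),
]
for premise, hypothesis, label in examples:
    print(f"{label:13s} premise={premise!r} hypothesis={hypothesis!r}")
```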
  Website
nli/multinli@kabirahuja2431 (bert-base-multilingual-cased)
2 versions | Architecture: pfeiffer | Non-linearity: relu | Reduction factor: 2 | Head:

A Pfeiffer adapter stacked on top of a language adapter for the NLI task. Trained on the English MultiNLI data for 5 epochs with a batch size of 64. Version 2 performs better for cross-lingual transfer.
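The stacking described above can be sketched numerically: at each transformer layer, hidden states pass through the language adapter's bottleneck first, then the task adapter's. A minimal NumPy sketch, assuming BERT-base's hidden size of 768 and this adapter's reduction factor of 2 with ReLU (the weights here are random stand-ins, not the trained parameters):

```python
import numpy as np

rng = np.random.default_rng(0)
hidden = 768               # bert-base-multilingual-cased hidden size
bottleneck = hidden // 2   # reduction factor: 2

def make_adapter():
    """Random stand-in weights for one Pfeiffer bottleneck adapter."""
    return (rng.normal(scale=0.02, size=(hidden, bottleneck)),
            rng.normal(scale=0.02, size=(bottleneck, hidden)))

def adapter(h, W_down, W_up):
    """Down-project, apply ReLU, up-project, add residual connection."""
    return h + np.maximum(h @ W_down, 0.0) @ W_up

lang_adapter = make_adapter()   # stand-in for a language adapter
task_adapter = make_adapter()   # stand-in for the MultiNLI task adapter

h = rng.normal(size=(4, hidden))                         # a batch of token states
out = adapter(adapter(h, *lang_adapter), *task_adapter)  # stack: language, then task
print(out.shape)  # (4, 768)
```

Because each adapter is residual, stacking leaves the hidden dimension unchanged, so any number of adapters can be composed this way.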

Paper

Brought to you with ❤️ by the AdapterHub Team