AdapterHub
Task Adapters

Pre-trained model: gpt2

RTE

Recognizing Textual Entailment is a binary entailment task similar to MNLI, but with much less training data.
nli/rte@ukp (gpt2), 1 version
Architecture: houlsby, non-linearity: swish, reduction factor: 16

Adapter for gpt2 in the Houlsby architecture, trained on the RTE dataset for 10 epochs with a learning rate of 1e-4.

nli/rte@ukp (gpt2), 1 version
Architecture: pfeiffer, non-linearity: relu, reduction factor: 16

Adapter for gpt2 in the Pfeiffer architecture, trained on the RTE dataset for 10 epochs with a learning rate of 1e-4.
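The fields listed on each card (architecture, non-linearity, reduction factor) describe the adapter's bottleneck module: the hidden state is projected down by the reduction factor, passed through the non-linearity, projected back up, and added to the input via a residual connection. A minimal numpy sketch of this forward pass, with illustrative random weights in place of the trained parameters (the names `bottleneck_adapter`, `w_down`, and `w_up` are ours, not AdapterHub's):

```python
import numpy as np

def swish(x):
    # swish(x) = x * sigmoid(x), the non-linearity of the Houlsby card above
    return x / (1.0 + np.exp(-x))

def bottleneck_adapter(hidden, w_down, w_up):
    # Down-project, apply the non-linearity, up-project, then add the
    # residual connection: the core of a Houlsby/Pfeiffer-style adapter.
    return hidden + swish(hidden @ w_down) @ w_up

hidden_size = 768                              # gpt2's hidden size
reduction_factor = 16                          # as listed on the cards
bottleneck = hidden_size // reduction_factor   # 48-dimensional bottleneck

rng = np.random.default_rng(0)
w_down = rng.normal(scale=0.02, size=(hidden_size, bottleneck))
w_up = rng.normal(scale=0.02, size=(bottleneck, hidden_size))

hidden = rng.normal(size=(4, hidden_size))     # a batch of 4 token states
out = bottleneck_adapter(hidden, w_down, w_up)
print(out.shape)  # (4, 768): same shape as the input
```

Because only `w_down` and `w_up` are trained, each adapter adds roughly `2 * hidden_size * bottleneck` parameters per layer, a small fraction of the frozen gpt2 weights.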


Brought to you with ❤️ by the AdapterHub Team