AdapterHub

Task Adapters

Pre-trained model: roberta-base

RTE

Recognizing Textual Entailment (RTE) is a binary entailment task similar to MNLI, but with much less training data.
  Website

nli/rte@ukp (for roberta-base)
1 version · Architecture: pfeiffer · Head:

Pfeiffer Adapter trained on RTE.

AdapterHub/roberta-base-pf-rte (for roberta-base, hosted on huggingface.co) · Head:

# Adapter `AdapterHub/roberta-base-pf-rte` for roberta-base An [adapter](https://adapterhub.ml) for the `roberta-base` model that was trained on the...

Paper

Brought to you with ❤️ by the AdapterHub Team