AdapterHub
Task Adapters

Pre-trained model: xlm-roberta (shortcut names: xlm-roberta-base, xlm-roberta-large)

WMT21

The WMT21 shared task on quality estimation. Training language pairs: high-resource English--German (En-De) and English--Chinese (En-Zh); medium-resource Russian--English (Ru-En), Romanian--English (Ro-En), and Estonian--English (Et-En); and low-resource Sinhala--English (Si-En) and Nepali--English (Ne-En).
  Website
Gregor/xlm-roberta-base-wmt21-qe (base model: xlm-roberta-base, hosted on huggingface.co)

# Adapter `Gregor/xlm-roberta-base-wmt21-qe` for xlm-roberta-base An [adapter](https://adapterhub.ml) for the xlm-roberta-base model that was trained on the...

Gregor/xlm-roberta-large-wmt21-qe (base model: xlm-roberta-large, hosted on huggingface.co)

# Adapter `Gregor/xlm-roberta-large-wmt21-qe` for xlm-roberta-large An [adapter](https://adapterhub.ml) for the xlm-roberta-large model that was trained on the...
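The two adapters above could be loaded roughly as follows. This is a minimal sketch assuming the `adapters` library's documented `AutoAdapterModel.load_adapter` interface; the `load_qe_model` helper and the `BASE_MODELS` mapping are illustrative, not part of the listing.

```python
# Map each listed adapter ID to its matching base model,
# as shown in the catalog entries above.
BASE_MODELS = {
    "Gregor/xlm-roberta-base-wmt21-qe": "xlm-roberta-base",
    "Gregor/xlm-roberta-large-wmt21-qe": "xlm-roberta-large",
}


def load_qe_model(adapter_id: str):
    """Load the base model, attach the WMT21 QE adapter, and activate it.

    Hypothetical helper: requires the `adapters` package and network
    access to huggingface.co to download weights.
    """
    from adapters import AutoAdapterModel  # assumed installed

    model = AutoAdapterModel.from_pretrained(BASE_MODELS[adapter_id])
    name = model.load_adapter(adapter_id, source="hf")
    model.set_active_adapters(name)
    return model
```

For example, `load_qe_model("Gregor/xlm-roberta-base-wmt21-qe")` would return an xlm-roberta-base model with the QE adapter active.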

Paper

Brought to you with ❤️ by the AdapterHub Team