AdapterHub

Task Adapters

Pre-trained model: All architectures (bert, bart, xlm-roberta, distilbert, gpt2, roberta, mbart)

WMT21

The WMT21 shared task on quality estimation. Training language pairs: high-resource English-German (En-De) and English-Chinese (En-Zh); medium-resource Russian-English (Ru-En), Romanian-English (Ro-En), and Estonian-English (Et-En); and low-resource Sinhalese-English (Si-En) and Nepalese-English (Ne-En).
  Website
Gregor/bert-base-multilingual-cased-wmt21-qe (for bert-base-multilingual-cased)
Hosted on huggingface.co

# Adapter `Gregor/bert-base-multilingual-cased-wmt21-qe` for bert-base-multilingual-cased

An [adapter](https://adapterhub.ml) for the bert-base-multilingual-cased model that was trained on the...

Gregor/xlm-roberta-base-wmt21-qe (for xlm-roberta-base)
Hosted on huggingface.co

# Adapter `Gregor/xlm-roberta-base-wmt21-qe` for xlm-roberta-base

An [adapter](https://adapterhub.ml) for the xlm-roberta-base model that was trained on the...

Gregor/xlm-roberta-large-wmt21-qe (for xlm-roberta-large)
Hosted on huggingface.co

# Adapter `Gregor/xlm-roberta-large-wmt21-qe` for xlm-roberta-large

An [adapter](https://adapterhub.ml) for the xlm-roberta-large model that was trained on the...
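As a pointer for how these adapters are typically used, here is a minimal sketch that loads one of the listed adapters with the `adapters` library. The model and adapter names come from the listings above; the specific calls and the example sentence pair are assumptions, so check the individual model cards for the exact, up-to-date instructions.

```python
# Minimal sketch (assumed usage pattern, not taken from the model cards):
# load a WMT21 QE adapter from the Hugging Face Hub with the `adapters` library.
from transformers import AutoTokenizer
from adapters import AutoAdapterModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoAdapterModel.from_pretrained("bert-base-multilingual-cased")

# Load the adapter weights listed above and activate them for inference.
adapter_name = model.load_adapter("Gregor/bert-base-multilingual-cased-wmt21-qe")
model.set_active_adapters(adapter_name)

# Quality estimation scores a (source, translation) pair; this En-De pair is illustrative.
inputs = tokenizer("This is a test.", "Das ist ein Test.", return_tensors="pt")
outputs = model(**inputs)
```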
