AdapterHub
Task Adapters

Pre-trained model: bert

WMT21

The WMT21 shared task on quality estimation. Training language pairs: high-resource English–German (En-De) and English–Chinese (En-Zh); medium-resource Russian–English (Ru-En), Romanian–English (Ro-En), and Estonian–English (Et-En); and low-resource Sinhalese–English (Si-En) and Nepalese–English (Ne-En).
  Website
Gregor/bert-base-multilingual-cased-wmt21-qe (for bert-base-multilingual-cased), hosted on huggingface.co

# Adapter `Gregor/bert-base-multilingual-cased-wmt21-qe` for bert-base-multilingual-cased

An [adapter](https://adapterhub.ml) for the bert-base-multilingual-cased model that was trained on the...

Paper
