AdapterHub
Task Adapters
Pre-trained model: bert (bert-base-multilingual-uncased)
WMT16 English-Romanian
The English-Romanian translation dataset from the shared task of the First Conference on Machine Translation (WMT16).
No task adapters are available for mt/wmt16_en_ro on bert-base-multilingual-uncased.
Add your adapter to AdapterHub. It's super awesome!
Get started