AdapterHub
Language Adapters
Pre-trained model: bert
myv/wiki
Language modeling for the Erzya language on Wikipedia.
Adapter ID: myv/wiki@ukp
Model: bert-base-multilingual-cased
Versions: 1
Architecture: pfeiffer
Non-linearity: gelu
Reduction factor: 2
Head:
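The card describes a Pfeiffer-style bottleneck adapter: each adapted layer down-projects the hidden state by the reduction factor, applies the non-linearity, up-projects back, and adds a residual connection. The forward pass can be sketched in plain numpy; the hidden size of 768 matches bert-base-multilingual-cased, but the weight names and random initialization here are illustrative only, not the actual trained parameters hosted on the Hub.

```python
import numpy as np

HIDDEN = 768          # hidden size of bert-base-multilingual-cased
REDUCTION_FACTOR = 2  # from the adapter card above
BOTTLENECK = HIDDEN // REDUCTION_FACTOR  # 384

def gelu(x):
    # tanh approximation of GELU, the non-linearity listed on the card
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

rng = np.random.default_rng(0)
# Illustrative randomly initialized projections; the real adapter's weights
# would be loaded from the Hub, not created here.
W_down = rng.normal(0.0, 0.02, size=(HIDDEN, BOTTLENECK))
W_up = rng.normal(0.0, 0.02, size=(BOTTLENECK, HIDDEN))

def bottleneck_adapter(h):
    # Down-project, apply the non-linearity, up-project,
    # then add the residual connection around the block.
    return h + gelu(h @ W_down) @ W_up

h = rng.normal(size=(4, HIDDEN))   # a batch of 4 token states
out = bottleneck_adapter(h)
print(out.shape)  # (4, 768)
```

Because only the two small projection matrices are trained, the adapter adds roughly 2 × 768 × 384 parameters per layer, a small fraction of the frozen base model.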