AdapterHub
Language Adapters
kv/wiki
Language modeling for the Komi language on Wikipedia.

kv/wiki@ukp (bert-base-multilingual-cased, 1 version)
Architecture: pfeiffer
Non-linearity: gelu
Reduction factor: 2
Head:
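The configuration above describes a Pfeiffer-style bottleneck adapter: hidden states are down-projected by the reduction factor, passed through the non-linearity (here GELU), up-projected back to the model dimension, and added to the input via a residual connection. A minimal NumPy sketch of that forward pass, assuming a hidden size of 768 (as in bert-base-multilingual-cased) and illustrative random weights — not the actual trained kv/wiki parameters:

```python
import numpy as np

def gelu(x):
    # tanh approximation of the GELU non-linearity
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

class BottleneckAdapter:
    """Pfeiffer-style adapter: down-project, GELU, up-project, residual add."""

    def __init__(self, hidden_size=768, reduction_factor=2, seed=0):
        rng = np.random.default_rng(seed)
        bottleneck = hidden_size // reduction_factor  # 768 // 2 = 384
        # Small random init, stand-in for trained adapter weights
        self.w_down = rng.normal(0.0, 0.02, (hidden_size, bottleneck))
        self.w_up = rng.normal(0.0, 0.02, (bottleneck, hidden_size))

    def __call__(self, h):
        # Residual connection keeps the frozen backbone's representation
        return h + gelu(h @ self.w_down) @ self.w_up

adapter = BottleneckAdapter(hidden_size=768, reduction_factor=2)
h = np.ones((1, 768))          # one token's hidden state
out = adapter(h)
print(out.shape)               # (1, 768) — same shape as the input
```

With a reduction factor of 2 the bottleneck keeps half the model dimension, so this adapter is comparatively large; higher reduction factors (e.g. 16) yield smaller, cheaper adapters.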