Language Adapters
Pre-trained model: all architectures (bert, distilbert, roberta, xlm-roberta, bart, gpt2, mbart, xmod)
am/wiki
Language modeling for the Amharic language on Wikipedia.
am/wiki@ukp
Pre-trained model: bert-base-multilingual-cased
1 version
Architecture: pfeiffer
Non-linearity: relu
Reduction factor: 16
Head:
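
Loading an adapter like this into its base model usually takes only a few lines. The snippet below is a minimal sketch using the adapter-transformers library; it assumes that library is installed and that the identifier am/wiki@ukp and the pre-trained model name resolve exactly as listed above.

# Sketch: load the Amharic language adapter into multilingual BERT.
from transformers import AutoModelWithHeads, AutoTokenizer

model_name = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelWithHeads.from_pretrained(model_name)

# Download the am/wiki adapter from AdapterHub ("ah") and activate it.
adapter_name = model.load_adapter("am/wiki@ukp", source="ah")
model.set_active_adapters(adapter_name)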