AdapterHub
Language Adapters


myv/wiki

A language modeling adapter for the Erzya language (ISO code: myv), trained on Erzya Wikipedia.

  •   Adapter: myv/wiki@ukp (1 version)
  •   Base model: bert-base-multilingual-cased
  •   Architecture: pfeiffer
  •   Non-linearity: gelu
  •   Reduction factor: 2
  •   Head: (none)
  •   Paper
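
The configuration fields above compress the adapter's key hyperparameters: the pfeiffer architecture inserts a single bottleneck module into each transformer layer, gelu is the bottleneck non-linearity, and a reduction factor of 2 halves the hidden size (768 for bert-base-multilingual-cased) inside the bottleneck. The following PyTorch sketch illustrates that bottleneck structure; it is schematic rather than the library's actual implementation, and the hidden size of 768 is assumed from the base model.

    import torch
    import torch.nn as nn

    class BottleneckAdapter(nn.Module):
        """Schematic Pfeiffer-style bottleneck adapter.

        Down-projects the hidden states, applies the configured
        non-linearity (gelu), up-projects back, and adds a residual
        connection. With reduction factor 2, the bottleneck is half
        the transformer hidden size.
        """

        def __init__(self, hidden_size: int = 768, reduction_factor: int = 2):
            super().__init__()
            bottleneck_size = hidden_size // reduction_factor
            self.down_proj = nn.Linear(hidden_size, bottleneck_size)
            self.non_linearity = nn.GELU()
            self.up_proj = nn.Linear(bottleneck_size, hidden_size)

        def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
            residual = hidden_states
            x = self.down_proj(hidden_states)
            x = self.non_linearity(x)
            x = self.up_proj(x)
            return x + residual

    # Example: adapt a batch of BERT-sized hidden states.
    adapter = BottleneckAdapter(hidden_size=768, reduction_factor=2)
    out = adapter(torch.randn(1, 12, 768))  # shape preserved: (1, 12, 768)

Only the small down- and up-projection matrices are trained, which is why a single adapter for a low-resource language like Erzya stays compact compared with fine-tuning the full multilingual model.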

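A minimal sketch of loading the adapter with the adapters library (the successor to adapter-transformers, which AdapterHub builds on); the myv/wiki@ukp identifier comes from the card above, but how it resolves, and the exact API surface, depend on the library version.

    from transformers import AutoTokenizer
    from adapters import AutoAdapterModel

    # Start from the base model the adapter was trained on.
    tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
    model = AutoAdapterModel.from_pretrained("bert-base-multilingual-cased")

    # Load the Erzya language adapter and activate it for all forward passes.
    adapter_name = model.load_adapter("myv/wiki@ukp")
    model.set_active_adapters(adapter_name)

    # Encode an Erzya sentence through the adapted model.
    inputs = tokenizer("Шумбрат!", return_tensors="pt")
    outputs = model(**inputs)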