Language Adapters

Pre-trained model: bert

ru/wiki

Language modeling for the Russian language on Wikipedia. Russian is an East Slavic language, which is an official language in Russia, Belarus, Kazakhstan, Kyrgyzstan, as well as being widely used throughout Eastern Europe, the Baltic states, the Caucasus and Central Asia.
ru/wiki@ukp (bert-base-multilingual-cased)
2 versions · Architecture: pfeiffer · Non-linearity: gelu · Reduction factor: 2

Pfeiffer adapter trained with Masked Language Modelling on Russian Wikipedia articles for 250k steps with a batch size of 64.

ru/wiki@ukp (bert-base-multilingual-cased)
2 versions · Architecture: houlsby · Non-linearity: gelu · Reduction factor: 2

Houlsby adapter trained with Masked Language Modelling on Russian Wikipedia articles for 250k steps with a batch size of 64.
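
Either adapter can be loaded into its base model with a few lines of code. The snippet below is a minimal sketch assuming the adapter-transformers library (AdapterHub's fork of Hugging Face Transformers); class names and defaults may vary across versions. Swap "pfeiffer" for "houlsby" to fetch the second variant.

    from transformers import AutoModel, AutoTokenizer, AdapterConfig

    tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
    model = AutoModel.from_pretrained("bert-base-multilingual-cased")

    # Match the configuration listed above: pfeiffer architecture,
    # gelu non-linearity, reduction factor 2.
    config = AdapterConfig.load("pfeiffer", non_linearity="gelu", reduction_factor=2)

    # Download the adapter weights from AdapterHub and activate them
    # for every forward pass.
    adapter_name = model.load_adapter("ru/wiki@ukp", config=config)
    model.set_active_adapters(adapter_name)

    # Encode Russian text with the language adapter applied.
    # ("An example Russian sentence.")
    inputs = tokenizer("Пример русского предложения.", return_tensors="pt")
    outputs = model(**inputs)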


Brought to you with ❤️ by the AdapterHub Team