AdapterHub

Language Adapters

Pre-trained model: bert

Architectures: bert, distilbert, roberta, xlm-roberta, bart, gpt2, mbart, xmod

Shortcut names: bert-base-multilingual-cased, bert-base-uncased, aubmindlab/bert-base-arabert, bert-base-multilingual-uncased, malteos/scincl, allenai/scibert_scivocab_uncased, CAMeL-Lab/bert-base-arabic-camelbert-msa

zh/wiki

Language modeling for the Chinese language on Wikipedia. Chinese is a family of East Asian analytic languages that form the Sinitic branch of the Sino-Tibetan languages.
zh/wiki@ukp (bert-base-multilingual-cased)
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 2 · Head: none

Pfeiffer adapter trained with masked language modelling on Chinese Wikipedia articles for 250k steps with a batch size of 64.
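
To use one of these adapters, load it on top of the pre-trained model and activate it. A reduction factor of 2 means the adapter bottleneck is half the model's hidden size (768 → 384 for bert-base-multilingual-cased). Below is a minimal sketch using the adapters library, the successor to adapter-transformers; how the "zh/wiki@ukp" identifier resolves against the Hub can differ between library versions, so check the current documentation:

    # Minimal sketch: load the zh/wiki Pfeiffer language adapter onto mBERT.
    # Assumes the `adapters` library (successor to adapter-transformers);
    # how "zh/wiki@ukp" is resolved may vary across library versions.
    from transformers import AutoTokenizer, AutoModelForMaskedLM
    import adapters

    model_name = "bert-base-multilingual-cased"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForMaskedLM.from_pretrained(model_name)

    # Add adapter support to the plain Hugging Face model.
    adapters.init(model)

    # Download the adapter weights and activate them for the forward pass.
    adapter_name = model.load_adapter("zh/wiki@ukp", config="pfeiffer")
    model.set_active_adapters(adapter_name)

Because the adapter was trained with masked language modelling and ships without its own head, pairing it with the base model's masked-LM head (as above) lets it fill in masked Chinese tokens directly; for downstream tasks it is typically stacked under a task adapter instead.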

zh/wiki@ukp (bert-base-multilingual-cased)
2 versions · Architecture: pfeiffer · Non-linearity: gelu · Reduction factor: 2 · Head: none

Pfeiffer adapter trained with masked language modelling on Chinese Wikipedia articles for 250k steps with a batch size of 64.

zh/wiki@ukp (bert-base-multilingual-cased)
2 versions · Architecture: houlsby · Non-linearity: gelu · Reduction factor: 2 · Head: none

Houlsby adapter trained with masked language modelling on Chinese Wikipedia articles for 250k steps with a batch size of 64.
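
The Houlsby architecture inserts two adapter modules per transformer layer (one after the attention sub-layer and one after the feed-forward sub-layer), whereas the Pfeiffer architecture inserts a single module per layer, making it smaller and slightly faster at the same reduction factor. A hedged sketch of selecting this variant via an explicit config object, again assuming the adapters library:

    # Sketch: select the Houlsby variant with an explicit config object.
    # HoulsbyConfig is the adapters-library config class; the values mirror
    # the entry above. adapter-transformers also accepted a version=...
    # argument on load_adapter for entries with multiple versions.
    from transformers import AutoModelForMaskedLM
    import adapters
    from adapters import HoulsbyConfig

    model = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")
    adapters.init(model)

    config = HoulsbyConfig(non_linearity="gelu", reduction_factor=2)
    adapter_name = model.load_adapter("zh/wiki@ukp", config=config)
    model.set_active_adapters(adapter_name)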

Paper

Brought to you with ❤️ by the AdapterHub Team