AdapterHub
Language Adapters

Pre-trained model: xlm-roberta-base

jv/wiki

Language modeling for the Javanese language on Wikipedia. Javanese is the language of the Javanese people from the central and eastern parts of the island of Java, in Indonesia.
jv/wiki@ukp (xlm-roberta-base)
Versions: 1
Architecture: pfeiffer
Non-linearity: relu
Reduction factor: 2
Head:

Pfeiffer adapter trained with masked language modelling on Javanese Wikipedia articles for 100k steps with a batch size of 64.
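The card's "pfeiffer" architecture with relu non-linearity and reduction factor 2 describes a bottleneck module inserted into the transformer layers: hidden states are down-projected by the reduction factor, passed through ReLU, projected back up, and added residually. A minimal NumPy sketch of that computation (the weights here are random placeholders, not the trained adapter parameters, and layer norm and the exact insertion point are omitted):

```python
import numpy as np

rng = np.random.default_rng(0)

hidden_size = 768               # xlm-roberta-base hidden dimension
bottleneck = hidden_size // 2   # reduction factor 2 -> 384

# Placeholder weights; in the real adapter these are the trained parameters.
W_down = rng.normal(0.0, 0.02, (hidden_size, bottleneck))
b_down = np.zeros(bottleneck)
W_up = rng.normal(0.0, 0.02, (bottleneck, hidden_size))
b_up = np.zeros(hidden_size)

def pfeiffer_adapter(x):
    """Bottleneck adapter: down-project, relu, up-project, residual add."""
    h = np.maximum(x @ W_down + b_down, 0.0)  # relu non-linearity
    return h @ W_up + b_up + x                # residual connection

x = rng.normal(size=(1, 16, hidden_size))     # (batch, seq_len, hidden)
out = pfeiffer_adapter(x)
print(out.shape)  # (1, 16, 768)
```

At xlm-roberta-base's hidden size of 768, a reduction factor of 2 gives a 384-dimensional bottleneck, so each adapter layer adds roughly 2 x 768 x 384 weights, a small fraction of the frozen base model.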

Paper

Brought to you with ❤️ by the AdapterHub Team