AdapterHub
Language Adapters

Pre-trained model: xlm-roberta (shortcut names: xlm-roberta-base, xlm-roberta-large)

en/wiki

Language modeling for the English language on Wikipedia. English is a West Germanic language that was first spoken in early medieval England and eventually became a global lingua franca.
en/wiki@ukp (xlm-roberta-large)
1 version · Architecture: pfeiffer · non-linearity: relu · reduction factor: 2 · Head:

Pfeiffer adapter trained with masked language modeling on English Wikipedia articles for 250k steps with a batch size of 64.

en/wiki@ukp (xlm-roberta-base)
1 version · Architecture: pfeiffer · non-linearity: relu · reduction factor: 2 · Head:

Pfeiffer adapter trained with masked language modeling on English Wikipedia articles for 250k steps with a batch size of 64.
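The "pfeiffer" architecture named on these cards is a bottleneck adapter: each transformer layer gains a small module that down-projects the hidden state, applies the listed non-linearity (ReLU), up-projects, and adds the result back through a residual connection. The reduction factor sets the bottleneck width (hidden size divided by 2). Below is a minimal NumPy sketch of that forward pass, not the actual implementation used to train these adapters; the hidden size of 1024 (xlm-roberta-large) and the random weights are illustrative assumptions.

```python
import numpy as np

HIDDEN = 1024    # xlm-roberta-large hidden size (assumed for illustration)
REDUCTION = 2    # reduction factor from the adapter card

rng = np.random.default_rng(0)

class PfeifferAdapter:
    """Bottleneck adapter: down-project, ReLU, up-project, add residual."""

    def __init__(self, hidden, reduction):
        bottleneck = hidden // reduction  # 1024 // 2 = 512
        self.w_down = rng.normal(0.0, 0.02, (hidden, bottleneck))
        self.w_up = rng.normal(0.0, 0.02, (bottleneck, hidden))

    def __call__(self, h):
        z = np.maximum(h @ self.w_down, 0.0)  # relu non-linearity
        return h + z @ self.w_up              # residual connection

adapter = PfeifferAdapter(HIDDEN, REDUCTION)
h = rng.normal(size=(4, HIDDEN))  # a batch of 4 hidden states
out = adapter(h)
print(out.shape)  # (4, 1024) -- same shape as the input
```

Because only the two small projection matrices are trained while the pre-trained model stays frozen, an adapter with reduction factor 2 adds roughly `2 * hidden^2 / 2` parameters per layer instead of fine-tuning the full model.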


Brought to you with ❤️ by the AdapterHub Team