AdapterHub

Language Adapters


qu/wiki

Language modeling for Quechua on Wikipedia. Quechua, usually called Runasimi ("people's language") by its speakers, is an indigenous language family spoken by the Quechua peoples, who live primarily in the Peruvian Andes.
qu/wiki@ukp for bert-base-multilingual-cased
2 versions · Architecture: houlsby · Non-linearity: gelu · Reduction factor: 2 · Head: none

Houlsby adapter trained with masked language modelling on Quechua Wikipedia articles for 50k steps with a batch size of 64.
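Adapters from this listing can be loaded directly into the matching base model. The snippet below is a minimal loading sketch assuming the adapter-transformers fork of Hugging Face Transformers, where AutoAdapterModel, load_adapter, and set_active_adapters are available; entry points differ slightly in the newer adapters package.

```python
# Sketch: load the Houlsby qu/wiki language adapter into mBERT.
# Assumes the adapter-transformers fork of Transformers.
from transformers import AutoTokenizer, AutoAdapterModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoAdapterModel.from_pretrained("bert-base-multilingual-cased")

# Resolve the qu/wiki@ukp adapter from the Hub and activate it.
adapter_name = model.load_adapter("qu/wiki@ukp", config="houlsby")
model.set_active_adapters(adapter_name)
```

The same pattern applies to the other entries below; only the base model checkpoint and the config name (houlsby or pfeiffer) change.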

qu/wiki@ukp for bert-base-multilingual-cased
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 2 · Head: none

Pfeiffer adapter trained with masked language modelling on Quechua Wikipedia articles for 50k steps with a batch size of 64.

qu/wiki@ukp for bert-base-multilingual-cased
2 versions · Architecture: pfeiffer · Non-linearity: gelu · Reduction factor: 2 · Head: none

Pfeiffer adapter trained with masked language modelling on Quechua Wikipedia articles for 50k steps with a batch size of 64.

qu/wiki@ukp for xlm-roberta-base
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 2 · Head: none

Pfeiffer adapter trained with masked language modelling on Quechua Wikipedia articles for 50k steps with a batch size of 64.

qu/wiki@ukp for xlm-roberta-large
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 2 · Head: none

Pfeiffer adapter trained with masked language modelling on Quechua Wikipedia articles for 50k steps with a batch size of 64.
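Because these language adapters ship without a prediction head, a quick sanity check is to pair the loaded adapter with the base model's own masked-language-modelling head. The sketch below again assumes the adapter-enabled model classes from the adapter-transformers fork; the masked Quechua sentence is purely illustrative.

```python
# Sketch: masked-token prediction with the Pfeiffer qu/wiki adapter on XLM-R.
# Assumes the adapter-transformers fork, whose model classes (including
# AutoModelForMaskedLM) expose load_adapter / set_active_adapters.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")

adapter_name = model.load_adapter("qu/wiki@ukp", config="pfeiffer")
model.set_active_adapters(adapter_name)

# Illustrative Quechua sentence with one masked token.
text = "Runasimi huk <mask> kan."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Top-5 predictions for the masked position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_pos].topk(5, dim=-1).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```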


Brought to you with ❤️ by the AdapterHub Team