
Language Adapters

Filter by pre-trained model: bert, xlm-roberta, distilbert, gpt2, bart, roberta, mbart

sw/wiki

Language modeling for the Swahili language on Wikipedia. Swahili, also known by its native name Kiswahili, is a Bantu language and the first language of the Swahili people.
sw/wiki@ukp · bert-base-multilingual-cased
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 2

A Pfeiffer adapter trained with masked language modelling on Swahili Wikipedia articles for 100k steps with a batch size of 64.
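
For reference, a minimal sketch of loading this adapter with the adapter-transformers library; exact class names and defaults may differ across library versions:

    from transformers import AutoTokenizer, AutoModelWithHeads

    # Load the base model the adapter was trained on.
    tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
    model = AutoModelWithHeads.from_pretrained("bert-base-multilingual-cased")

    # Download the Swahili language adapter from the Hub and activate it.
    adapter_name = model.load_adapter("sw/wiki@ukp", config="pfeiffer")
    model.set_active_adapters(adapter_name)

The same pattern applies to the xlm-roberta-base and xlm-roberta-large entries below, substituting the matching base checkpoint in both from_pretrained calls.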

sw/wiki@ukp · xlm-roberta-large
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 2

A Pfeiffer adapter trained with masked language modelling on Swahili Wikipedia articles for 100k steps with a batch size of 64.

sw/wiki@ukp · xlm-roberta-base
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 2

A Pfeiffer adapter trained with masked language modelling on Swahili Wikipedia articles for 100k steps with a batch size of 64.

sw/wiki@ukp · bert-base-multilingual-cased
2 versions · Architecture: houlsby · Non-linearity: gelu · Reduction factor: 2

A Houlsby adapter trained with masked language modelling on Swahili Wikipedia articles for 100k steps with a batch size of 64.

sw/wiki@ukp · bert-base-multilingual-cased
2 versions · Architecture: pfeiffer · Non-linearity: gelu · Reduction factor: 2

A Pfeiffer adapter trained with masked language modelling on Swahili Wikipedia articles for 100k steps with a batch size of 64.
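
Several entries above list more than one version. A minimal sketch of pinning a specific one, assuming the version argument of load_adapter in adapter-transformers; the version string "2" is illustrative, not taken from this listing:

    from transformers import AutoModelWithHeads

    model = AutoModelWithHeads.from_pretrained("bert-base-multilingual-cased")

    # `version` pins one of the published adapter versions; omitting it
    # loads the default version for this model and config.
    adapter_name = model.load_adapter("sw/wiki@ukp", config="pfeiffer", version="2")
    model.set_active_adapters(adapter_name)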
