AdapterHub

Language Adapters


xmf/wiki

Language modeling for the Mingrelian language on Wikipedia. Mingrelian (also known as Megrelian) is a Kartvelian language spoken in Western Georgia (the regions of Samegrelo and Abkhazia), primarily by the Mingrelians.
xmf/wiki@ukp · xlm-roberta-large
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 2

Pfeiffer adapter trained with masked language modelling on Mingrelian Wikipedia articles for 50k steps with a batch size of 64.
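Any entry on this page can be pulled into a model with a single load_adapter call. The snippet below is a minimal sketch assuming the adapter-transformers package (pip install adapter-transformers), which this hub targets; the identifier xmf/wiki@ukp and the base model name are taken from the entry above, everything else is illustrative.

```python
# Minimal sketch: load the Mingrelian language adapter into xlm-roberta-large.
# Assumes adapter-transformers v3, which patches Hugging Face transformers
# with adapter support.
from transformers import AutoTokenizer
from transformers.adapters import AutoAdapterModel

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-large")
model = AutoAdapterModel.from_pretrained("xlm-roberta-large")

# Fetch xmf/wiki@ukp from the AdapterHub repository ("ah" source) and activate it.
adapter_name = model.load_adapter("xmf/wiki@ukp", source="ah")
model.set_active_adapters(adapter_name)
```

From here the model behaves like a regular masked-LM backbone with the language adapter injected, so a task adapter can be stacked on top in the usual MAD-X fashion.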

xmf/wiki@ukp · bert-base-multilingual-cased
2 versions · Architecture: pfeiffer · Non-linearity: gelu · Reduction factor: 2

Pfeiffer adapter trained with masked language modelling on Mingrelian Wikipedia articles for 50k steps with a batch size of 64.

xmf/wiki@ukp · xlm-roberta-base
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 2

Pfeiffer adapter trained with masked language modelling on Mingrelian Wikipedia articles for 50k steps with a batch size of 64.

xmf/wiki@ukp · bert-base-multilingual-cased
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 2

Pfeiffer adapter trained with masked language modelling on Mingrelian Wikipedia articles for 50k steps with a batch size of 64.

xmf/wiki@ukp · bert-base-multilingual-cased
2 versions · Architecture: houlsby · Non-linearity: gelu · Reduction factor: 2

Houlsby adapter trained with masked language modelling on Mingrelian Wikipedia articles for 50k steps with a batch size of 64.
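The Architecture, non-linearity, and reduction factor fields in each entry map one-to-one onto an adapter config. As a hedged sketch (again assuming adapter-transformers; the local adapter name xmf_wiki is hypothetical, not a Hub identifier), the configuration of this Houlsby entry could be recreated for training like so:

```python
# Sketch of recreating the adapter configuration described above.
# Assumes adapter-transformers v3; the name "xmf_wiki" is illustrative.
from transformers.adapters import AutoAdapterModel, HoulsbyConfig

model = AutoAdapterModel.from_pretrained("bert-base-multilingual-cased")

# Houlsby adapters sit after both the attention and feed-forward blocks
# (Pfeiffer adapters only after feed-forward); reduction_factor=2 halves
# the hidden size inside the adapter bottleneck.
config = HoulsbyConfig(reduction_factor=2, non_linearity="gelu")
model.add_adapter("xmf_wiki", config=config)

# Freeze the backbone so only the adapter weights receive gradients.
model.train_adapter("xmf_wiki")
```

A reduction factor of 2 is unusually low (16 is a common default), which fits the language-modelling use case here: language adapters need more capacity than typical task adapters.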

Paper

Brought to you with ❤️ by the AdapterHub Team