AdapterHub

Language Adapters

Pre-trained models: bert, xlm-roberta, distilbert, gpt2, bart, roberta, mbart

jv/wiki

Language modeling for the Javanese language on Wikipedia. Javanese is the language of the Javanese people from the central and eastern parts of the island of Java, in Indonesia.
jv/wiki@ukp (bert-base-multilingual-cased)
2 versions · Architecture: houlsby · Non-linearity: gelu · Reduction factor: 2

Houlsby adapter trained with masked language modelling on Javanese Wikipedia articles for 100k steps with a batch size of 64.
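
A minimal sketch of loading this adapter, assuming the adapter-transformers library that AdapterHub was built around (class and argument names may differ between releases; the config string is taken from the card above):

    from transformers import AutoModelWithHeads

    # Load multilingual BERT with adapter support
    # (AutoModelWithHeads comes from adapter-transformers, not vanilla transformers).
    model = AutoModelWithHeads.from_pretrained("bert-base-multilingual-cased")

    # Download the Javanese language adapter from AdapterHub;
    # config="houlsby" selects the Houlsby variant listed above.
    adapter_name = model.load_adapter("jv/wiki@ukp", config="houlsby")

    # Activate the adapter so it is applied on every forward pass.
    model.set_active_adapters(adapter_name)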

jv/wiki@ukp (bert-base-multilingual-cased)
2 versions · Architecture: pfeiffer · Non-linearity: gelu · Reduction factor: 2

Pfeiffer adapter trained with masked language modelling on Javanese Wikipedia articles for 100k steps with a batch size of 64.

jv/wiki@ukp (bert-base-multilingual-cased)
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 2

Pfeiffer adapter trained with masked language modelling on Javanese Wikipedia articles for 100k steps with a batch size of 64.

jv/wiki@ukp (xlm-roberta-base)
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 2

Pfeiffer adapter trained with masked language modelling on Javanese Wikipedia articles for 100k steps with a batch size of 64.

jv/wiki@ukp (xlm-roberta-large)
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 2

Pfeiffer adapter trained with masked language modelling on Javanese Wikipedia articles for 100k steps with a batch size of 64.
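
Once loaded, an adapter like this can be exercised through a masked language modelling head. A minimal sketch, again assuming the adapter-transformers API; the Javanese example sentence is made up for illustration:

    import torch
    from transformers import AutoModelForMaskedLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
    model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")

    # Fetch the Pfeiffer-style Javanese adapter and activate it.
    adapter_name = model.load_adapter("jv/wiki@ukp", config="pfeiffer")
    model.set_active_adapters(adapter_name)

    # Hypothetical Javanese sentence: "A cat is a small <mask>."
    text = "Kucing iku <mask> cilik."
    inputs = tokenizer(text, return_tensors="pt")

    with torch.no_grad():
        logits = model(**inputs).logits

    # Decode the highest-scoring token at the masked position.
    mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
    print(tokenizer.decode(logits[0, mask_pos].argmax(dim=-1)))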


Brought to you with ❤️ by the AdapterHub Team