AdapterHub

Language Adapters

Pre-trained model: all architectures (bert, xlm-roberta, distilbert, gpt2, bart, roberta, mbart)

ko/wiki

Language modeling for Korean on Wikipedia. Korean is an East Asian language spoken by about 77 million people.
ko/wiki@ukp (bert-base-multilingual-cased)
2 versions | Architecture: pfeiffer | Non-linearity: gelu | Reduction factor: 2

A Pfeiffer adapter trained with masked language modelling on Korean Wikipedia articles for 250k steps with a batch size of 64.
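As a minimal usage sketch, the adapter can be loaded into the base model with the adapter-transformers library (the class and method names below, AutoModelWithHeads, load_adapter, and set_active_adapters, come from that library; install with `pip install adapter-transformers`):

```python
# Minimal sketch: loading the ko/wiki Pfeiffer adapter with adapter-transformers.
from transformers import AutoModelWithHeads, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelWithHeads.from_pretrained("bert-base-multilingual-cased")

# Download the adapter from the Hub; "ko/wiki@ukp" is the identifier shown above,
# and config="pfeiffer" selects the Pfeiffer variant of the two versions.
adapter_name = model.load_adapter("ko/wiki@ukp", config="pfeiffer")

# Activate it so every forward pass routes through the adapter weights.
model.set_active_adapters(adapter_name)
```

Only the small adapter weights are downloaded; the frozen bert-base-multilingual-cased weights come from the regular model hub.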

ko/wiki@ukp (bert-base-multilingual-cased)
2 versions | Architecture: houlsby | Non-linearity: gelu | Reduction factor: 2

A Houlsby adapter trained with masked language modelling on Korean Wikipedia articles for 250k steps with a batch size of 64.
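For intuition, "Reduction factor: 2" means the adapter's bottleneck dimension is half the transformer's hidden size; the Houlsby architecture inserts such a block after both the attention and feed-forward sublayers, while the Pfeiffer architecture inserts one only after the feed-forward block. The sketch below (hypothetical module and variable names, PyTorch assumed, not the library's actual code) shows the shared down-project / non-linearity / up-project structure:

```python
# Illustrative sketch of a bottleneck adapter block, not AdapterHub's implementation.
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    def __init__(self, hidden_size: int = 768, reduction_factor: int = 2):
        super().__init__()
        bottleneck = hidden_size // reduction_factor  # 384 for mBERT with factor 2
        self.down = nn.Linear(hidden_size, bottleneck)
        self.non_linearity = nn.GELU()  # "Non-linearity: gelu" on the cards above
        self.up = nn.Linear(bottleneck, hidden_size)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Residual connection keeps the frozen base model's representation intact;
        # only the small down/up projections are trained.
        return hidden_states + self.up(self.non_linearity(self.down(hidden_states)))
```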

Paper
