AdapterHub

Task Adapters

Pre-trained model: bert

Japanese

NER on Wikipedia documents.
wikiann/ja@ukp | bert-base-multilingual-cased
5 versions | Architecture: pfeiffer | Non-linearity: gelu | Reduction factor: 16 | Head:
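
As a minimal sketch of how this adapter can be loaded, assuming the adapter-transformers library (the Python library this hub targets); the identifier and hyperparameters are taken from the card above:

```python
from transformers import AutoModelWithHeads, AutoTokenizer

# Base model named on the card.
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelWithHeads.from_pretrained("bert-base-multilingual-cased")

# Fetch the adapter by its hub identifier. The pfeiffer architecture,
# gelu non-linearity and reduction factor 16 travel with the adapter's
# saved config, so none of that needs to be specified here.
ner_adapter = model.load_adapter("wikiann/ja@ukp")
model.set_active_adapters(ner_adapter)
```

With a reduction factor of 16 and bert-base's hidden size of 768, each adapter bottleneck has 768 / 16 = 48 dimensions.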

A task adapter stacked on top of a language adapter, MAD-X 2.0 style: the language adapters in the last layer (layer 11) are deleted.
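
The stacked setup might be reproduced as in the sketch below, again assuming adapter-transformers. The language-adapter identifier ja/wiki@ukp is an assumption (this card does not name the language adapter), and leave_out is the library's mechanism for dropping an adapter from specific layers:

```python
import transformers.adapters.composition as ac
from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("bert-base-multilingual-cased")

# Language adapter first. leave_out=[11] skips the last transformer
# layer, matching the MAD-X 2.0 note above. "ja/wiki@ukp" is an assumed
# identifier for a Japanese Wikipedia language adapter.
lang_adapter = model.load_adapter("ja/wiki@ukp", leave_out=[11])

# This card's task adapter, stacked on top of the language adapter.
ner_adapter = model.load_adapter("wikiann/ja@ukp")

# Forward passes now run the language adapter, then the NER adapter,
# in each layer; in layer 11 only the NER adapter remains.
model.active_adapters = ac.Stack(lang_adapter, ner_adapter)
```

Deleting the language adapter from the last layer follows MAD-X 2.0, which reported that dropping it there improves cross-lingual transfer of the stacked task adapter.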
