AdapterHub

Task Adapters

Pre-trained model: xlm-roberta-base

MLKI_EP

Enhances phrase-level cross-lingual entity alignment in language models; suitable for knowledge graph tasks.
Website
mlki/ep@mlki · xlm-roberta-base
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 16 · Head:

Part of a knowledge adapter set for multilingual knowledge graph integration. This adapter enhances cross-lingual entity alignment at the phrase level. We trained it with alignments from Wikidata...
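The config above (architecture pfeiffer, reduction factor 16, ReLU non-linearity) describes a bottleneck adapter: hidden states are projected down by the reduction factor, passed through the non-linearity, projected back up, and added to the input via a residual connection. A minimal NumPy sketch, assuming the standard Pfeiffer formulation and the 768-dimensional hidden size of xlm-roberta-base (the weight shapes and random initialization here are illustrative, not the trained adapter's values):

```python
import numpy as np

# Assumed dimensions: xlm-roberta-base hidden size and the listed reduction factor.
HIDDEN = 768
REDUCTION = 16
BOTTLENECK = HIDDEN // REDUCTION  # 768 / 16 = 48 bottleneck dimensions

rng = np.random.default_rng(0)
W_down = rng.normal(0.0, 0.02, (HIDDEN, BOTTLENECK))  # down-projection
W_up = rng.normal(0.0, 0.02, (BOTTLENECK, HIDDEN))    # up-projection

def adapter(h):
    """Pfeiffer-style bottleneck: down-project, ReLU, up-project, residual."""
    z = np.maximum(h @ W_down, 0.0)  # ReLU non-linearity from the config
    return h + z @ W_up              # residual keeps the frozen model's signal

h = rng.normal(size=(4, HIDDEN))     # a batch of 4 token representations
out = adapter(h)
print(out.shape)   # adapter preserves the hidden size
```

Only `W_down` and `W_up` are trained, which is why the adapter download is small relative to the frozen xlm-roberta-base backbone.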

Paper

Brought to you with ❤️ by the AdapterHub Team