AdapterHub
Task Adapters
Pre-trained model: roberta-large
MLKI_TP
Enhances phrase-level factual triple knowledge in language models, making it suitable for knowledge graph tasks.
Website
No task adapters are available for mlki/tp on roberta-large.
Add your adapter to AdapterHub; it's super awesome!
Get started