AdapterHub
Language Adapters
Pre-trained model: distilbert (distilbert-base-uncased)
ko/cc100
Language modeling for the Korean language on the CC-100 corpus.
No language adapters are available for ko/cc100 with distilbert-base-uncased.
Add your adapter to AdapterHub; it's super awesome!
Get started