AdapterHub
Task Adapters
Pre-trained model:
gpt2
lm/poem
The adapter is trained to learn the structure of poems in the English language.
lm/poem@ukp
gpt2
1 version
Architecture: pfeiffer
Non-linearity: relu
Reduction factor: 16
Head:
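The configuration above describes a Pfeiffer-style bottleneck adapter: inside each transformer layer, the hidden states are down-projected by the reduction factor, passed through the non-linearity, projected back up, and added to the input via a residual connection. A minimal plain-Python sketch of that block, assuming gpt2's hidden size of 768 (the weights and helper names here are illustrative assumptions, not the AdapterHub implementation):

```python
# Sketch of a Pfeiffer-style bottleneck adapter block (illustrative only).
HIDDEN_SIZE = 768                                 # gpt2 hidden size
REDUCTION_FACTOR = 16                             # from the adapter config
BOTTLENECK = HIDDEN_SIZE // REDUCTION_FACTOR      # 768 // 16 = 48

def relu(xs):
    # Non-linearity from the config above
    return [max(0.0, x) for x in xs]

def matvec(w, xs):
    # w is a (rows x len(xs)) weight matrix; returns w @ xs
    return [sum(wi * xi for wi, xi in zip(row, xs)) for row in w]

def adapter_forward(hidden, w_down, w_up):
    # Down-project to the bottleneck, apply ReLU, up-project,
    # then add the residual connection back to the input.
    down = relu(matvec(w_down, hidden))
    up = matvec(w_up, down)
    return [h + u for h, u in zip(hidden, up)]

# Toy check: with zero weights the block reduces to the residual path,
# so the hidden states pass through unchanged.
hidden = [0.5] * HIDDEN_SIZE
w_down = [[0.0] * HIDDEN_SIZE for _ in range(BOTTLENECK)]
w_up = [[0.0] * BOTTLENECK for _ in range(HIDDEN_SIZE)]
out = adapter_forward(hidden, w_down, w_up)
```

Only the two projection matrices (768x48 and 48x768 per layer) are trained, which is why adapters are far smaller than the full model.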