AdapterHub — Explore
Task Adapters
Pre-trained model: gpt2
EmoContext
In this dataset, given a textual dialogue, i.e. an utterance along with two previous turns of context, the goal is to infer the underlying emotion of the utterance by choosing from four emotion classes: Happy, Sad, Angry, and Others.
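Since the task feeds a classifier an utterance plus two turns of context, the three turns are typically flattened into a single input sequence. The sketch below illustrates this with plain Python; the field names, separator token, and example values are illustrative assumptions, not the official dataset schema.

```python
# Hypothetical EmoContext-style example: two context turns, a target
# utterance, and one of four emotion labels for that final utterance.
EMOTION_CLASSES = ["happy", "sad", "angry", "others"]

def flatten_dialogue(turn1: str, turn2: str, utterance: str,
                     sep: str = " </s> ") -> str:
    """Join the two context turns and the target utterance into one
    string, so a standard sequence-classification head can predict
    the emotion of the final utterance."""
    return sep.join([turn1, turn2, utterance])

example = {
    "turn1": "I just got the job!",
    "turn2": "That is wonderful, congrats!",
    "turn3": "Thanks, I cannot stop smiling :)",
    "label": "happy",  # one of the four classes above
}

text = flatten_dialogue(example["turn1"], example["turn2"], example["turn3"])
print(text)
```

A fine-tuned adapter for gpt2 on this task would consume such flattened sequences and emit one of the four class labels.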
No task adapters are currently available for emotion/emo with gpt2.
Add your adapter to AdapterHub, it's super awesome!
Get started