Pre-trained model: `distilbert-base-uncased`

Pfeiffer adapter trained on HellaSwag.

An [adapter](https://adapterhub.ml) for the `distilbert-base-uncased` model in the Pfeiffer architecture, trained on the HellaSwag dataset for up to 15 epochs with early stopping and a learning rate of 1e-4.
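A minimal loading sketch using the [`adapters`](https://github.com/adapter-hub/adapters) library (the successor to `adapter-transformers`) might look as follows. Note that the Hub ID `AdapterHub/distilbert-base-uncased-pf-hellaswag` is an assumption inferred from the naming scheme of the related adapters listed below, not a confirmed identifier.

```python
from adapters import AutoAdapterModel

# Load the base model with support for adapters and prediction heads.
model = AutoAdapterModel.from_pretrained("distilbert-base-uncased")

# Load the HellaSwag adapter; the Hub ID is assumed from the pf- naming scheme.
adapter_name = model.load_adapter("AdapterHub/distilbert-base-uncased-pf-hellaswag")

# Activate the adapter so it is applied in every forward pass.
model.set_active_adapters(adapter_name)
```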
Related [adapters](https://adapterhub.ml) trained on the same task:

- `AdapterHub/bert-base-uncased-pf-hellaswag` for `bert-base-uncased`
- `AdapterHub/roberta-base-pf-hellaswag` for `roberta-base`
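For reference, a hypothetical sketch of the reported training setup (Pfeiffer adapter, up to 15 epochs with early stopping, learning rate 1e-4) with the `adapters` library is shown below. The adapter and head names, batch size, sequence length, and early-stopping patience are illustrative assumptions; only the epoch count, early stopping, and learning rate come from the card.

```python
from adapters import AdapterTrainer, AutoAdapterModel
from datasets import load_dataset
from transformers import AutoTokenizer, EarlyStoppingCallback, TrainingArguments

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def preprocess(batch):
    # Pair each context with its four candidate endings.
    contexts = sum([[ctx] * 4 for ctx in batch["ctx"]], [])
    endings = sum(batch["endings"], [])
    enc = tokenizer(contexts, endings, truncation=True,
                    max_length=128, padding="max_length")  # length assumed
    # Regroup the flat encodings into shape (num_examples, 4, seq_len).
    enc = {k: [v[i : i + 4] for i in range(0, len(v), 4)] for k, v in enc.items()}
    enc["labels"] = [int(label) for label in batch["label"]]
    return enc

ds = load_dataset("hellaswag")
train = ds["train"].map(preprocess, batched=True, remove_columns=ds["train"].column_names)
val = ds["validation"].map(preprocess, batched=True, remove_columns=ds["validation"].column_names)

model = AutoAdapterModel.from_pretrained("distilbert-base-uncased")
model.add_adapter("hellaswag", config="seq_bn")  # "seq_bn" is the Pfeiffer architecture
model.add_multiple_choice_head("hellaswag", num_choices=4)  # HellaSwag has 4 endings
model.train_adapter("hellaswag")  # freeze the base model; train only the adapter

args = TrainingArguments(
    output_dir="hellaswag-adapter",
    learning_rate=1e-4,
    num_train_epochs=15,
    per_device_train_batch_size=16,  # assumed; not stated in the card
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,     # required for early stopping
)

trainer = AdapterTrainer(
    model=model,
    args=args,
    train_dataset=train,
    eval_dataset=val,
    callbacks=[EarlyStoppingCallback(early_stopping_patience=3)],  # patience assumed
)
trainer.train()
```

`AdapterTrainer` updates only the adapter and head weights, so a checkpoint stays small compared to full fine-tuning of the base model.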