AdapterHub

Task Adapters

Pre-trained models: distilbert, bert, xlm-roberta, roberta

CoLA

The Corpus of Linguistic Acceptability (CoLA) in its full form consists of 10,657 sentences from 23 linguistics publications, expertly annotated for acceptability (grammaticality) by their original authors. The public version provided here contains 9,594 sentences belonging to the training and development sets, and excludes 1,063 sentences belonging to a held-out test set.
lingaccept/cola@ukp roberta-base
1 version · Architecture: houlsby

Adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 30 epochs).

lingaccept/cola@ukp roberta-large
1 version · Architecture: pfeiffer

Adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 30 epochs).

lingaccept/cola@ukp distilbert-base-uncased
1 version · Architecture: houlsby

Adapter for distilbert-base-uncased in Houlsby architecture trained on the CoLA dataset for 15 epochs with early stopping and a learning rate of 1e-4.

lingaccept/cola@ukp bert-base-uncased
1 version · Architecture: houlsby

Adapter in Houlsby architecture trained on the CoLA task for 20 epochs with early stopping and a learning rate of 1e-4. See https://arxiv.org/pdf/2007.07779.pdf.

lingaccept/cola@ukp roberta-large
1 version · Architecture: houlsby

Adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 30 epochs).

lingaccept/cola@ukp bert-base-uncased
1 version · Architecture: pfeiffer

Adapter in Pfeiffer architecture trained on the CoLA task for 20 epochs with early stopping and a learning rate of 1e-4. See https://arxiv.org/pdf/2007.07779.pdf.

lingaccept/cola@ukp distilbert-base-uncased
1 version · Architecture: pfeiffer

Adapter for distilbert-base-uncased in Pfeiffer architecture trained on the CoLA dataset for 15 epochs with early stopping and a learning rate of 1e-4.

lingaccept/cola@ukp roberta-base
1 version · Architecture: pfeiffer

Adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 30 epochs).
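Any of the adapters listed above can be pulled down by its hub identifier. The sketch below, assuming the AdapterHub `adapters` library (`pip install adapters`; API details may differ across library versions), shows how one might load the `lingaccept/cola@ukp` adapter for a given base model. The `load_cola_adapter` helper name is our own; the base-model/architecture table simply mirrors this listing.

```python
# Base models and adapter architectures available for lingaccept/cola@ukp,
# as listed on this page.
COLA_ADAPTERS = {
    "roberta-base": ["houlsby", "pfeiffer"],
    "roberta-large": ["houlsby", "pfeiffer"],
    "bert-base-uncased": ["houlsby", "pfeiffer"],
    "distilbert-base-uncased": ["houlsby", "pfeiffer"],
}


def load_cola_adapter(base_model: str = "roberta-base"):
    """Download the CoLA adapter (with head) for `base_model` and activate it.

    Requires network access; the hub resolves the matching adapter weights
    for the chosen base model.
    """
    # Imported lazily so the sketch can be read without the package installed.
    from adapters import AutoAdapterModel

    model = AutoAdapterModel.from_pretrained(base_model)
    # load_adapter fetches the adapter weights and returns the adapter name.
    adapter_name = model.load_adapter("lingaccept/cola@ukp")
    model.set_active_adapters(adapter_name)
    return model
```

With the adapter active, a forward pass through the model yields acceptability predictions from the adapter's classification head while the base model's weights stay frozen.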
