
Task Adapters


CoLA

The Corpus of Linguistic Acceptability (CoLA) in its full form consists of 10,657 sentences from 23 linguistics publications, expertly annotated for acceptability (grammaticality) by their original authors. The public version provided here contains 9,594 sentences belonging to the training and development sets and excludes 1,063 sentences belonging to a held-out test set.
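For readers who want to inspect the data itself, CoLA ships as the `cola` subset of the GLUE benchmark on the Hugging Face Hub. A minimal sketch, assuming the `datasets` package is installed:

```python
from datasets import load_dataset

# CoLA is distributed as the "cola" subset of the GLUE benchmark.
cola = load_dataset("glue", "cola")

# The public train and validation splits together hold the 9,594 sentences
# mentioned above; the held-out test split ships without public labels.
print(cola)
print(cola["train"][0])  # {'sentence': ..., 'label': 0 or 1, 'idx': ...}
```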
lingaccept/cola@ukp gpt2
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 16

Adapter for gpt2 in Pfeiffer architecture trained on the CoLA dataset for 10 epochs with a learning rate of 1e-4.
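As a usage sketch (not part of the listing), an adapter like this one can be pulled straight from the Hub with the adapter-transformers package; `AutoModelWithHeads` and `load_adapter(..., source="ah")` are that library's documented entry points, and the identifier below is the one shown in this entry:

```python
from transformers import AutoTokenizer, AutoModelWithHeads

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelWithHeads.from_pretrained("gpt2")

# Resolve "lingaccept/cola@ukp" against AdapterHub (source="ah") and
# activate the downloaded adapter for the forward pass.
adapter_name = model.load_adapter("lingaccept/cola@ukp", source="ah")
model.set_active_adapters(adapter_name)

inputs = tokenizer("The boy quickly ran home.", return_tensors="pt")
outputs = model(**inputs)
```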

lingaccept/cola@ukp distilbert-base-uncased
1 version · Architecture: pfeiffer

Adapter for distilbert-base-uncased in Pfeiffer architecture trained on the CoLA dataset for 15 epochs with early stopping and a learning rate of 1e-4.

lingaccept/cola@ukp bert-base-uncased
1 version · Architecture: houlsby

Adapter in Houlsby architecture trained on the CoLA task for 20 epochs with early stopping and a learning rate of 1e-4. See https://arxiv.org/pdf/2007.07779.pdf.
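The architecture tags in these entries name adapter configurations: Pfeiffer inserts one bottleneck module after each layer's feed-forward block, while Houlsby inserts two per layer (after the attention block and after the feed-forward block). A hedged sketch of training a fresh adapter with the hyperparameters listed above, assuming adapter-transformers' config classes:

```python
from transformers import AutoModelWithHeads, HoulsbyConfig, PfeifferConfig

model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

# reduction_factor=16 and the non-linearity mirror the metadata shown
# in the entries on this page.
config = HoulsbyConfig(reduction_factor=16, non_linearity="swish")
# config = PfeifferConfig(reduction_factor=16, non_linearity="relu")

model.add_adapter("cola", config=config)
model.add_classification_head("cola", num_labels=2)
model.train_adapter("cola")  # freeze the base model, train only the adapter
```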

lingaccept/cola@ukp roberta-base
1 version · Architecture: pfeiffer

Adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 30 epochs).
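The "extension" these roberta entries mention is not published with the listing; a rough, hypothetical reconstruction using stock Hugging Face `TrainingArguments` (checkpointing every epoch and keeping the best checkpoint by Matthews correlation, CoLA's standard metric) might look like:

```python
from transformers import TrainingArguments

# Hypothetical stand-in for the unpublished run_glue.py extension:
# evaluate and checkpoint each epoch, then reload the best of the
# 30 checkpoints at the end of training.
args = TrainingArguments(
    output_dir="cola-roberta-adapter",
    num_train_epochs=30,
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="eval_matthews_correlation",
)
```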

lingaccept/cola@ukp roberta-base
1 version · Architecture: houlsby

Adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 30 epochs).

lingaccept/cola@ukp facebook/bart-base
1 version · Architecture: houlsby · Non-linearity: swish · Reduction factor: 16

Adapter for bart-base in Houlsby architecture trained on the CoLA dataset for 15 epochs with early stopping and a learning rate of 1e-4.

lingaccept/cola@ukp roberta-large
1 version · Architecture: pfeiffer

Adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 30 epochs).

lingaccept/cola@ukp bert-base-uncased
1 version · Architecture: pfeiffer

Adapter in Pfeiffer architecture trained on the CoLA task for 20 epochs with early stopping and a learning rate of 1e-4. See https://arxiv.org/pdf/2007.07779.pdf.

lingaccept/cola@ukp distilbert-base-uncased
1 version · Architecture: houlsby

Adapter for distilbert-base-uncased in Houlsby architecture trained on the CoLA dataset for 15 epochs with early stopping and a learning rate of 1e-4.

lingaccept/cola@ukp facebook/bart-base
1 version · Architecture: pfeiffer · Non-linearity: relu · Reduction factor: 16

Adapter for bart-base in Pfeiffer architecture trained on the CoLA dataset for 15 epochs with early stopping and a learning rate of 1e-4.

lingaccept/cola@ukp roberta-large
1 version · Architecture: houlsby

Adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 30 epochs).

AdapterHub/bert-base-uncased-pf-cola bert-base-uncased
Hosted on huggingface.co

# Adapter `AdapterHub/bert-base-uncased-pf-cola` for bert-base-uncased

An [adapter](https://adapterhub.ml) for the `bert-base-uncased` model that was trained on the...
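Unlike the `@ukp` entries above, this adapter lives on the Hugging Face Hub, so it is loaded with `source="hf"`; a minimal sketch under the same adapter-transformers assumption:

```python
from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("bert-base-uncased")
# source="hf" resolves the identifier on the Hugging Face Hub rather
# than on adapterhub.ml.
adapter_name = model.load_adapter(
    "AdapterHub/bert-base-uncased-pf-cola", source="hf"
)
model.set_active_adapters(adapter_name)
```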

AdapterHub/roberta-base-pf-cola roberta-base
Hosted on huggingface.co

# Adapter `AdapterHub/roberta-base-pf-cola` for roberta-base

An [adapter](https://adapterhub.ml) for the `roberta-base` model that was trained on the...
