Task Adapters

Pre-trained model: roberta (shortcut names: roberta-base, roberta-large)

CoLA

The Corpus of Linguistic Acceptability (CoLA) in its full form consists of 10,657 sentences from 23 linguistics publications, expertly annotated for acceptability (grammaticality) by their original authors. The public version provided here contains the 9,594 sentences belonging to the training and development sets and excludes the 1,063 sentences belonging to a held-out test set.
  Website
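
This public version corresponds to the CoLA task in the GLUE benchmark and can be loaded, for example, with the Hugging Face `datasets` library; a minimal sketch:

```python
# The GLUE distribution of CoLA: the training and validation splits carry
# acceptability labels; the held-out test split ships without gold labels.
from datasets import load_dataset

cola = load_dataset("glue", "cola")
print(cola)                # split names and sizes
print(cola["train"][0])    # {'sentence': ..., 'label': 0 or 1, 'idx': ...}
```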
lingaccept/cola@ukp (roberta-base)
1 version · Architecture: pfeiffer · Head: included

Adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 30 epochs).
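
For reference, loading one of these adapters onto `roberta-base` might look like the following. This is a minimal sketch assuming the adapter-transformers `AutoAdapterModel` API (the standalone `adapters` package imports the same class from `adapters` instead of `transformers`); verify names against your installed version.

```python
# Minimal sketch: load the CoLA adapter (with its classification head) from
# the Hub and activate it for the forward pass. The identifier and config
# name ("pfeiffer") follow the entry above.
from transformers import AutoAdapterModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoAdapterModel.from_pretrained("roberta-base")

adapter_name = model.load_adapter("lingaccept/cola@ukp", config="pfeiffer")
model.set_active_adapters(adapter_name)

# Score a sentence for linguistic acceptability.
inputs = tokenizer("The boy quickly ran home.", return_tensors="pt")
outputs = model(**inputs)
prediction = outputs.logits.argmax(dim=-1)  # label order (0 = unacceptable) assumed, not verified
print(prediction)
```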

lingaccept/cola@ukp (roberta-base)
1 version · Architecture: houlsby · Head: included

Adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 30 epochs).

lingaccept/cola@ukp (roberta-large)
1 version · Architecture: houlsby · Head: included

Adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 30 epochs).

lingaccept/cola@ukp (roberta-large)
1 version · Architecture: pfeiffer · Head: included

Adapter (with head) trained using the `run_glue.py` script with an extension that retains the best checkpoint (out of 30 epochs).

AdapterHub/roberta-base-pf-cola (roberta-base)
Hosted on huggingface.co

Adapter `AdapterHub/roberta-base-pf-cola` for roberta-base: an [adapter](https://adapterhub.ml) for the `roberta-base` model that was trained on the...
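
Because this adapter is hosted on huggingface.co rather than on adapterhub.ml, loading it goes through the Hugging Face Hub. A minimal sketch, again assuming the adapter-transformers API (`source="hf"` may be unnecessary or unsupported in other versions):

```python
# Sketch of loading the HF-hosted adapter; source="hf" resolves the
# identifier against huggingface.co instead of adapterhub.ml.
from transformers import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("roberta-base")
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-cola", source="hf")
model.set_active_adapters(adapter_name)
```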
