Description

Adapter AdapterHub/roberta-base-pf-wnut_17 for roberta-base

An adapter for the roberta-base model that was trained on the wnut_17 dataset and includes a prediction head for tagging.

This adapter was created for usage with the adapter-transformers library.

Usage

First, install adapter-transformers:

pip install -U adapter-transformers

Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support.

Now, the adapter can be loaded and activated like this:

from transformers import AutoModelWithHeads

# Load the base model with adapter and prediction-head support
model = AutoModelWithHeads.from_pretrained("roberta-base")
# Download the adapter (including its tagging head) from the Hub
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-wnut_17", source="hf")
# Activate the adapter so it is used in the forward pass
model.active_adapters = adapter_name
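
Continuing from the snippet above (with the adapter already active), inference is a standard token-classification forward pass. The following is a minimal sketch, not part of the original card: the tokenizer checkpoint and the example tweet (taken from the dataset paper's abstract) are assumptions, and the predicted ids index into the WNUT-17 tag set shipped with the head.

import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
inputs = tokenizer("so.. kktny in 30 mins?!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, seq_len, num_labels)
# Highest-scoring tag id per (sub-)token
predicted_ids = logits.argmax(dim=-1)[0].tolist()
print(predicted_ids)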

Architecture & Training

The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, the training configurations for all tasks can be found in that repository.

Evaluation results

Refer to the paper for more information on results.

Citation

If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":

@inproceedings{poth-etal-2021-pre,
    title = "{W}hat to Pre-Train on? {E}fficient Intermediate Task Selection",
    author = {Poth, Clifton  and
      Pfeiffer, Jonas  and
      R{"u}ckl{'e}, Andreas  and
      Gurevych, Iryna},
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-main.827",
    pages = "10585--10605",
}

Properties

Pre-trained model: roberta-base
Adapter type: Pfeiffer bottleneck
Prediction Head: Yes
Task: Named Entity Recognition
Dataset: wnut_17
Architecture

The adapter uses the following configuration: a Pfeiffer-style bottleneck (a single down- and up-projection inserted after each layer's feed-forward block) with a reduction factor of 16 and a ReLU non-linearity.

{
  "adapter_residual_before_ln": false,
  "cross_adapter": false,
  "inv_adapter": null,
  "inv_adapter_reduction_factor": null,
  "leave_out": [],
  "ln_after": false,
  "ln_before": false,
  "mh_adapter": false,
  "non_linearity": "relu",
  "original_ln_after": true,
  "original_ln_before": true,
  "output_adapter": true,
  "reduction_factor": 16,
  "residual_before_ln": true
}
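
For readers who want to train a comparable adapter from scratch, the same configuration can be expressed in code. This is a minimal sketch, assuming adapter-transformers v2/v3 where PfeifferConfig is importable; the adapter name and num_labels=13 (the size of the WNUT-17 BIO tag set: 6 entity types plus O) are illustrative choices, not values from this card.

from transformers import AutoModelWithHeads
from transformers.adapters import PfeifferConfig

model = AutoModelWithHeads.from_pretrained("roberta-base")

# Mirrors the JSON above: bottleneck after the feed-forward block only,
# ReLU non-linearity, hidden size reduced by a factor of 16
config = PfeifferConfig(reduction_factor=16, non_linearity="relu")
model.add_adapter("wnut17", config=config)
# Token-classification head; WNUT-17 uses 13 BIO tags (assumption)
model.add_tagging_head("wnut17", num_labels=13)
# Freeze the base model and train only the adapter and head
model.train_adapter("wnut17")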

Citations

Task
@inproceedings{derczynski-etal-2017-results,
    title = "Results of the {WNUT}2017 Shared Task on Novel and Emerging Entity Recognition",
    author = "Derczynski, Leon  and
      Nichols, Eric  and
      van Erp, Marieke  and
      Limsopatham, Nut",
    booktitle = "Proceedings of the 3rd Workshop on Noisy User-generated Text",
    month = sep,
    year = "2017",
    address = "Copenhagen, Denmark",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/W17-4418",
    doi = "10.18653/v1/W17-4418",
    pages = "140--147",
    abstract = "This shared task focuses on identifying unusual, previously-unseen entities in the context of emerging discussions.
                Named entities form the basis of many modern approaches to other tasks (like event clustering and summarization),
                but recall on them is a real problem in noisy text - even among annotators.
                This drop tends to be due to novel entities and surface forms.
                Take for example the tweet {``}so.. kktny in 30 mins?!{''} {--} even human experts find the entity {`}kktny{'}
                hard to detect and resolve. The goal of this task is to provide a definition of emerging and of rare entities,
                and based on that, also datasets for detecting these entities. The task as described in this paper evaluated the
                ability of participating entries to detect and classify novel and emerging named entities in noisy text.",
}