from transformers import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("bert-base-uncased")
model.load_adapter("AdapterHub/bert-base-uncased-pf-conll2003_pos", source="hf")

Description

Adapter AdapterHub/bert-base-uncased-pf-conll2003_pos for bert-base-uncased

An adapter for the bert-base-uncased model that was trained on the pos/conll2003 dataset and includes a prediction head for tagging.

This adapter was created for use with the adapter-transformers library.

Usage

First, install adapter-transformers:

pip install -U adapter-transformers

Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support.

Now, the adapter can be loaded and activated like this:

from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("bert-base-uncased")
adapter_name = model.load_adapter("AdapterHub/bert-base-uncased-pf-conll2003_pos", source="hf")
# Activate the adapter (and its prediction head) for all forward passes
model.active_adapters = adapter_name
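
For illustration, here is a minimal inference sketch. The example sentence and the tokenizer setup are additions for this sketch, not part of the official card, and the exact output attributes can vary across adapter-transformers versions:

import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
inputs = tokenizer("The quick brown fox jumps over the lazy dog", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, sequence_length, num_pos_tags)
pred_ids = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, pred in zip(tokens, pred_ids):
    # The id-to-POS-tag mapping ships with the prediction head's config
    print(token, pred.item())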

Architecture & Training

The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found in that repository.
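
As a rough sketch of what that training setup amounts to in adapter-transformers (the adapter/head names and the label count of the CoNLL-2003 POS tag set are assumptions for illustration; see the repository for the actual configuration):

from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("bert-base-uncased")
model.add_adapter("conll2003_pos")                      # new bottleneck adapter
model.add_tagging_head("conll2003_pos", num_labels=47)  # token-level tagging head
model.train_adapter("conll2003_pos")                    # freeze BERT weights; train only adapter + head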

Evaluation results

Refer to the paper for more information on results.

Citation

If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":

@inproceedings{poth-etal-2021-pre,
    title = "{W}hat to Pre-Train on? {E}fficient Intermediate Task Selection",
    author = {Poth, Clifton  and
      Pfeiffer, Jonas  and
      R{"u}ckl{'e}, Andreas  and
      Gurevych, Iryna},
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2021",
    address = "Online and Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.emnlp-main.827",
    pages = "10585--10605",
}

Properties

Pre-trained model
  bert-base-uncased
Adapter type
  Pfeiffer bottleneck adapter
Prediction Head
  Yes
Task
  Part-Of-Speech Tagging
Dataset
  CoNLL-2003

Architecture

{
  "adapter_residual_before_ln": false,
  "cross_adapter": false,
  "inv_adapter": null,
  "inv_adapter_reduction_factor": null,
  "leave_out": [],
  "ln_after": false,
  "ln_before": false,
  "mh_adapter": false,
  "non_linearity": "relu",
  "original_ln_after": true,
  "original_ln_before": true,
  "output_adapter": true,
  "reduction_factor": 16,
  "residual_before_ln": true
}
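
This is the Pfeiffer configuration: a single bottleneck adapter after each transformer block's feed-forward output (output_adapter, no mh_adapter), with a ReLU non-linearity and a reduction factor of 16, i.e. a bottleneck of 768 / 16 = 48 dimensions for BERT-base. A hedged sketch of recreating the same configuration in adapter-transformers (these values also match the PfeifferConfig defaults; the adapter name is a placeholder):

from transformers.adapters import PfeifferConfig

config = PfeifferConfig(reduction_factor=16, non_linearity="relu")
model.add_adapter("my_pos_adapter", config=config)  # "my_pos_adapter" is a placeholder name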

Citations

Task
@inproceedings{sangIntroductionCoNLL2003Shared2003,
  title = {Introduction to the {{CoNLL}}-2003 Shared Task: {{Language}}-Independent Named Entity Recognition},
  booktitle = {Proceedings of the Seventh Conference on Natural Language Learning, {{CoNLL}} 2003, Held in Cooperation with {{HLT}}-{{NAACL}} 2003, Edmonton, Canada, May 31 - June 1, 2003},
  author = {Tjong Kim Sang, Erik F. and De Meulder, Fien},
  editor = {Daelemans, Walter and Osborne, Miles},
  year = {2003},
  pages = {142--147},
  publisher = {{Association for Computational Linguistics}}
}