
# Requires the adapter-transformers library: pip install adapter-transformers
from transformers import AdapterConfig, AutoAdapterModel

model = AutoAdapterModel.from_pretrained("facebook/mbart-large-cc25")
config = AdapterConfig.load("pfeiffer", non_linearity="relu", reduction_factor=2)
model.load_adapter("mt/wmt16_en_ro@ukp", config=config)

Description

Adapter for mbart-large-cc25 in the Pfeiffer architecture with reduction factor 2, trained on the WMT16 English-Romanian (en-ro) translation task. It was trained for 10 epochs with early stopping and a learning rate of 1e-4. After post-processing following https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/romanian_postprocessing.md, it achieves a BLEU score of 36.3.
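For reference, a minimal end-to-end usage sketch, assuming the adapter-transformers library and that the bundled prediction head supports generation; the example sentence and generation settings are illustrative, not taken from this card:

# A minimal sketch, assuming the adapter-transformers library is installed;
# the example sentence and generation settings are illustrative only.
from transformers import AdapterConfig, AutoAdapterModel, MBartTokenizer

tokenizer = MBartTokenizer.from_pretrained("facebook/mbart-large-cc25")
model = AutoAdapterModel.from_pretrained("facebook/mbart-large-cc25")

config = AdapterConfig.load("pfeiffer", non_linearity="relu", reduction_factor=2)
adapter_name = model.load_adapter("mt/wmt16_en_ro@ukp", config=config)
model.set_active_adapters(adapter_name)  # route the forward pass through the adapter

tokenizer.src_lang = "en_XX"  # mBART-25 language code for the English source
inputs = tokenizer("The weather is nice today.", return_tensors="pt")
output_ids = model.generate(
    **inputs,
    decoder_start_token_id=tokenizer.lang_code_to_id["ro_RO"],  # decode into Romanian
)
print(tokenizer.batch_decode(output_ids, skip_special_tokens=True))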

Properties

Pre-trained model: facebook/mbart-large-cc25
Adapter type:
Prediction Head: Yes
Task: Machine Translation

Architecture

Name: pfeiffer
Non-linearity: relu
Reduction factor: 2
{
  "ln_after": false,
  "ln_before": false,
  "mh_adapter": false,
  "output_adapter": true,
  "adapter_residual_before_ln": false,
  "non_linearity": "relu",
  "original_ln_after": true,
  "original_ln_before": true,
  "reduction_factor": 2,
  "residual_before_ln": true
}
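
In this configuration, the adapter is inserted only after each block's feed-forward sub-layer (output_adapter: true, mh_adapter: false), and reduction_factor sets the bottleneck width relative to the model's hidden size. A toy sketch of that arithmetic; the hidden size of 1024 is a property of mbart-large-cc25, not stated on this card:

# Illustration: the reduction factor divides the hidden size to give the
# adapter's bottleneck dimension (down-project, ReLU, up-project).
hidden_size = 1024       # mbart-large-cc25 hidden size
reduction_factor = 2     # from the config above
bottleneck_size = hidden_size // reduction_factor
print(bottleneck_size)   # 512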

Author

Name: Clifton Poth

Versions

Identifier   Comment   Score
1            DEFAULT   36.3

Citations

Architecture
@misc{pfeiffer2020adapterfusion,
  title={AdapterFusion: Non-Destructive Task Composition for Transfer Learning},
  author={Jonas Pfeiffer and Aishwarya Kamath and Andreas Rücklé and Kyunghyun Cho and Iryna Gurevych},
  year={2020},
  eprint={2005.00247},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}