# Adapter `Gregor/xlm-roberta-base-wmt21-qe` for xlm-roberta-base

An adapter for the `xlm-roberta-base` model that was trained on the quality_estimation/wmt21 dataset and includes a prediction head for classification.
This adapter was created for use with the `adapter-transformers` library.
## Usage

First, install `adapter-transformers`:

```bash
pip install -U adapter-transformers
```

_Note: `adapter-transformers` is a fork of `transformers` that acts as a drop-in replacement with adapter support._
Now, the adapter can be loaded and activated like this:

```python
from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("xlm-roberta-base")
adapter_name = model.load_adapter("Gregor/xlm-roberta-base-wmt21-qe", source="hf")
model.active_adapters = adapter_name
```
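Once the adapter is active, the model can score a source sentence against its machine translation. The following is a minimal sketch, not part of the original card: the example sentence pair is made up, and the exact shape and scale of the head output depend on the prediction head the adapter ships with.

```python
import torch
from transformers import AutoModelWithHeads, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelWithHeads.from_pretrained("xlm-roberta-base")
adapter_name = model.load_adapter("Gregor/xlm-roberta-base-wmt21-qe", source="hf")
model.active_adapters = adapter_name

# Quality estimation scores a (source, translation) pair, so both
# sentences are encoded together as a single input.
inputs = tokenizer(
    "The weather is nice today.",   # source sentence (hypothetical example)
    "Das Wetter ist heute schön.",  # machine-translated sentence (hypothetical example)
    return_tensors="pt",
)

with torch.no_grad():
    outputs = model(**inputs)

print(outputs.logits)  # raw output of the adapter's prediction head
```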
## Adapter Configuration

The adapter uses the following configuration:

```json
{
  "adapter_residual_before_ln": false,
  "cross_adapter": false,
  "inv_adapter": null,
  "inv_adapter_reduction_factor": null,
  "leave_out": [],
  "ln_after": false,
  "ln_before": false,
  "mh_adapter": false,
  "non_linearity": "gelu",
  "original_ln_after": true,
  "original_ln_before": true,
  "output_adapter": true,
  "reduction_factor": 8,
  "residual_before_ln": true
}
```
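This corresponds to a bottleneck adapter inserted only after the feed-forward block of each layer (a Pfeiffer-style configuration) with a GELU non-linearity and a bottleneck reduction factor of 8. As a sketch under these assumptions, a fresh adapter with the same configuration could be set up for training like this; the adapter and head names are arbitrary, and `num_labels=1` is an assumption about a regression-style QE head, not something stated in the card:

```python
from transformers import AutoModelWithHeads, AdapterConfig

model = AutoModelWithHeads.from_pretrained("xlm-roberta-base")

# Pfeiffer-style bottleneck config matching the JSON above:
# output adapter only, GELU activation, hidden_size / 8 bottleneck.
config = AdapterConfig.load("pfeiffer", non_linearity="gelu", reduction_factor=8)

model.add_adapter("wmt21_qe", config=config)
model.add_classification_head("wmt21_qe", num_labels=1)  # assumption: scalar QE score
model.train_adapter("wmt21_qe")  # freezes the base model, trains only the adapter
```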
## Citation

```bibtex
@article{fomicheva2020mlqepe,
  title   = {{MLQE-PE}: A Multilingual Quality Estimation and Post-Editing Dataset},
  author  = {Marina Fomicheva and Shuo Sun and Erick Fonseca and Fr\'ed\'eric Blain and Vishrav Chaudhary and Francisco Guzm\'an and Nina Lopatina and Lucia Specia and Andr\'e F.~T.~Martins},
  year    = {2020},
  journal = {arXiv preprint arXiv:2010.04480}
}
```