from transformers import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("xlm-roberta-large")
model.load_adapter("Gregor/xlm-roberta-large-wmt21-qe", source="hf")

Description

Adapter Gregor/xlm-roberta-large-wmt21-qe for xlm-roberta-large

An adapter for the xlm-roberta-large model that was trained on the quality_estimation/wmt21 dataset and includes a prediction head for classification.

This adapter was created for use with the adapter-transformers library.

Usage

First, install adapter-transformers:

pip install -U adapter-transformers

Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support.

Now, the adapter can be loaded and activated like this:

from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("xlm-roberta-large")
adapter_name = model.load_adapter("Gregor/xlm-roberta-large-wmt21-qe")
model.active_adapters = adapter_name
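
With the adapter active, inference runs as a regular transformers forward pass. The sketch below continues the snippet above; the input format and the meaning of the head output are assumptions (sentence-level quality estimation typically scores a source sentence paired with its machine translation), not details stated on this card:

from transformers import AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-large")

# Assumed input format: the source sentence and its machine translation
# encoded together as a text pair.
source = "Das ist ein Test."
translation = "This is a test."
inputs = tokenizer(source, translation, return_tensors="pt")

with torch.no_grad():
    output = model(**inputs)

# Assumed: the classification head returns the predicted quality score(s).
print(output.logits)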

Properties

Pre-trained model: xlm-roberta-large
Adapter type:
Prediction Head: Yes
Task: Quality Estimation
Dataset: quality_estimation/wmt21

Architecture

{
  "adapter_residual_before_ln": false,
  "cross_adapter": false,
  "inv_adapter": null,
  "inv_adapter_reduction_factor": null,
  "leave_out": [],
  "ln_after": false,
  "ln_before": false,
  "mh_adapter": false,
  "non_linearity": "gelu",
  "original_ln_after": true,
  "original_ln_before": true,
  "output_adapter": true,
  "reduction_factor": 8,
  "residual_before_ln": true
}
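
Read together, "mh_adapter": false and "output_adapter": true place a single bottleneck adapter after each layer's feed-forward block (Pfeiffer-style placement), with the hidden size reduced by a factor of 8 and a GELU non-linearity inside the bottleneck. As a sketch, an equivalent configuration could be rebuilt in adapter-transformers like this (the adapter name "wmt21_qe" is illustrative, and model is the object from the usage snippet above):

from transformers.adapters import AdapterConfig

# Mirrors the JSON above: no adapter after multi-head attention,
# one bottleneck adapter after the feed-forward output, hidden size
# divided by 8, GELU inside the bottleneck.
config = AdapterConfig(
    mh_adapter=False,
    output_adapter=True,
    reduction_factor=8,
    non_linearity="gelu",
)
model.add_adapter("wmt21_qe", config=config)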

Citations

Task
@article{fomicheva2020mlqepe,
   title={{MLQE-PE}: A Multilingual Quality Estimation and Post-Editing Dataset},
   author={Marina Fomicheva and Shuo Sun and Erick Fonseca and Fr\'ed\'eric Blain and Vishrav Chaudhary and Francisco Guzm\'an and Nina Lopatina and Lucia Specia and Andr\'e F.~T.~Martins},
   year={2020},
   journal={arXiv preprint arXiv:2010.04480}
}