Task Adapters

Pre-trained model: mbart (facebook/mbart-large-cc25)

WMT16 English-Romanian

The English-Romanian translation dataset from the shared task of the First Conference on Machine Translation (WMT16).
Adapter: mt/wmt16_en_ro@ukp
Base model: facebook/mbart-large-cc25
Versions: 1
Architecture: pfeiffer
Non-linearity: relu
Reduction factor: 2
Head:

Adapter for mbart-large-cc25 in the Pfeiffer architecture with reduction factor 2, trained on the WMT16 English-Romanian translation task. Trained for 10 epochs with early stopping and a learning rate...
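
A minimal usage sketch with the adapter-transformers library (the AdapterHub fork of Hugging Face Transformers) is shown below. The adapter id mt/wmt16_en_ro@ukp and the base model come from this page; the example sentence, language codes, and generation settings are illustrative assumptions, not part of the adapter card.

# Minimal sketch, assuming `pip install adapter-transformers`.
# The input sentence and generation settings below are illustrative.
from transformers import MBartForConditionalGeneration, MBartTokenizer

model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-cc25")
tokenizer = MBartTokenizer.from_pretrained(
    "facebook/mbart-large-cc25", src_lang="en_XX", tgt_lang="ro_RO"
)

# Download the Pfeiffer adapter from AdapterHub and activate it.
adapter_name = model.load_adapter("mt/wmt16_en_ro@ukp", source="ah")
model.set_active_adapters(adapter_name)

# Translate English to Romanian; mBART expects the target language
# code as the first decoder token.
inputs = tokenizer("The weather is nice today.", return_tensors="pt")
outputs = model.generate(
    **inputs, decoder_start_token_id=tokenizer.lang_code_to_id["ro_RO"]
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])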

