AdapterHub

Task Adapters

Pre-trained model: all architectures (bert, bart, xlm-roberta, distilbert, gpt2, roberta, mbart)

WMT16 English-Romanian

The English-Romanian translation dataset from the shared task of the First Conference on Machine Translation (WMT16).
  Website
Adapter: mt/wmt16_en_ro@ukp
Base model: facebook/mbart-large-cc25
1 version | Architecture: pfeiffer | Non-linearity: relu | Reduction factor: 2 | Head:

Adapter for mbart-large-cc25 in the Pfeiffer architecture with reduction factor 2, trained on the WMT16 Romanian-English translation task. Trained for 10 epochs with early stopping and a learning rate...
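
For reference, loading this adapter onto its base model typically looks like the sketch below, using the adapter-transformers library that backs AdapterHub. The exact API can vary between library versions, and the mBART language codes and example sentence are illustrative assumptions, not part of the adapter card.

    # Minimal sketch; assumes the adapter-transformers fork of HuggingFace
    # Transformers is installed (pip install adapter-transformers).
    from transformers import MBartForConditionalGeneration, MBartTokenizer

    # Load the base model and tokenizer listed on the card.
    model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-cc25")
    tokenizer = MBartTokenizer.from_pretrained(
        "facebook/mbart-large-cc25",
        src_lang="en_XX",  # assumed source language code for English
        tgt_lang="ro_RO",  # assumed target language code for Romanian
    )

    # Download the adapter from AdapterHub and activate it.
    adapter_name = model.load_adapter("mt/wmt16_en_ro@ukp")
    model.set_active_adapters(adapter_name)

    # Translate a sample sentence (example input is an assumption).
    inputs = tokenizer("The committee approved the proposal.", return_tensors="pt")
    outputs = model.generate(
        **inputs,
        # For mbart-large-cc25, decoding conventionally starts with the
        # target-language token.
        decoder_start_token_id=tokenizer.lang_code_to_id["ro_RO"],
    )
    print(tokenizer.batch_decode(outputs, skip_special_tokens=True))

Only the adapter weights are downloaded from the Hub; the base model's parameters stay frozen and unchanged, which is the point of the adapter setup described above.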
