Pre-trained adapters:

- Adapter in Houlsby architecture, trained on the QQP task for 20 epochs with early stopping and a learning rate of 1e-4. See https://arxiv.org/pdf/2007.07779.pdf.
- Adapter in Pfeiffer architecture, trained on the QQP task for 20 epochs with early stopping and a learning rate of 1e-4. See https://arxiv.org/pdf/2007.07779.pdf.
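Both adapters were trained with the same hyperparameters (20 epochs, early stopping, learning rate 1e-4), differing only in the adapter architecture. The sketch below is a minimal, assumed reconstruction of such a run using the `adapter-transformers` library; the dataset preprocessing, early-stopping patience, and output paths are illustrative and not taken from the original training script.

```python
# Minimal training sketch (assumed setup, not the original script):
# Pfeiffer adapter on GLUE/QQP, 20 epochs, early stopping, lr 1e-4.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelWithHeads,
    TrainingArguments,
    AdapterTrainer,
    EarlyStoppingCallback,
)

dataset = load_dataset("glue", "qqp")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def encode(batch):
    # QQP is a question-pair classification task
    return tokenizer(
        batch["question1"], batch["question2"],
        truncation=True, padding="max_length", max_length=128,
    )

dataset = dataset.map(encode, batched=True)
dataset = dataset.rename_column("label", "labels")
dataset.set_format("torch", columns=["input_ids", "attention_mask", "labels"])

model = AutoModelWithHeads.from_pretrained("bert-base-uncased")
model.add_adapter("qqp", config="pfeiffer")          # or config="houlsby"
model.add_classification_head("qqp", num_labels=2)
model.train_adapter("qqp")                           # freeze all weights except the adapter

args = TrainingArguments(
    output_dir="qqp_adapter",                        # illustrative path
    learning_rate=1e-4,
    num_train_epochs=20,
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,                     # needed for early stopping
    metric_for_best_model="eval_loss",
)

trainer = AdapterTrainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    callbacks=[EarlyStoppingCallback(early_stopping_patience=3)],  # patience is assumed
)
trainer.train()
model.save_adapter("qqp_adapter/final", "qqp")
```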
# Adapter `AdapterHub/bert-base-uncased-pf-qqp` for bert-base-uncased

An [adapter](https://adapterhub.ml) for the `bert-base-uncased` model that was trained on the QQP task.
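A minimal usage sketch, assuming the `adapter-transformers` library (which extends `transformers` with `AutoModelWithHeads` and `load_adapter`); the example question pair is illustrative.

```python
from transformers import AutoTokenizer, AutoModelWithHeads

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelWithHeads.from_pretrained("bert-base-uncased")

# Load the adapter (with its classification head) from the Hugging Face Hub
# and make it the active adapter for the forward pass.
adapter_name = model.load_adapter("AdapterHub/bert-base-uncased-pf-qqp", source="hf")
model.active_adapters = adapter_name

# Score a question pair for duplication (illustrative input).
inputs = tokenizer(
    "How do I learn Python?",
    "What is the best way to learn Python?",
    return_tensors="pt",
)
outputs = model(**inputs)
print(outputs.logits)
```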