Pre-trained models:
Adapter for gpt2 in the Houlsby architecture, trained on the SST-2 dataset for 10 epochs with a learning rate of 1e-4.
Adapter for gpt2 in the Pfeiffer architecture, trained on the SST-2 dataset for 10 epochs with a learning rate of 1e-4.