AdapterHub
Task Adapters


StackExchange QA Similarity

StackExchange QA similarity determines whether two questions, or a question-answer pair, from StackExchange forums are related (e.g., to find duplicate questions or relevant answers).
  Website
sts/stackexchange@ukp (bert-base-uncased)
Versions: 141 · Architecture: pfeiffer · Reduction factor: 12 · Head: none

These are our adapters from the MultiCQA paper (https://arxiv.org/abs/2010.00980), trained on the individual StackExchange forums (see "Versions") using self-supervised training signals from unlabeled questions.
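As a sketch, an adapter like this is typically loaded onto its base model with the adapter-transformers library. The identifier sts/stackexchange@ukp and the base model bert-base-uncased come from the card above; the exact class and method names may differ across library versions, so treat the loading snippet as an illustration rather than a fixed API.

```python
# Sketch: using the Hub identifier "sts/stackexchange@ukp" (from the card above).
# The loading calls at the bottom assume the adapter-transformers API and are
# commented out because they require network access.

def parse_adapter_id(adapter_id: str):
    """Split a Hub identifier like 'sts/stackexchange@ukp' into
    (task, subtask, organization)."""
    task_part, org = adapter_id.split("@")
    task, subtask = task_part.split("/")
    return task, subtask, org

print(parse_adapter_id("sts/stackexchange@ukp"))  # ('sts', 'stackexchange', 'ukp')

# Downloading and activating the adapter (hypothetical usage, needs network):
# from transformers import AutoAdapterModel
# model = AutoAdapterModel.from_pretrained("bert-base-uncased")
# name = model.load_adapter("sts/stackexchange@ukp", config="pfeiffer")
# model.set_active_adapters(name)
```

The Pfeiffer architecture with reduction factor 12 listed above is the adapter configuration these checkpoints were trained with, so the same config would be passed when loading.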

Paper

Brought to you with ❤️ by the AdapterHub Team