AdapterHub

Task Adapters

Pre-trained model: distilbert-base-uncased

IMDb

The IMDb dataset contains 50K movie reviews for natural language processing and text analytics. It is a benchmark for binary sentiment classification with substantially more data than previous datasets: 25,000 highly polar movie reviews for training and 25,000 for testing. The task is to predict whether a review is positive or negative using classification or deep learning algorithms.
Website: 🤗 huggingface.co
sentiment/imdb@ukp
Pre-trained model: distilbert-base-uncased
1 version | Architecture: pfeiffer | Head: 

Adapter for distilbert-base-uncased in the Pfeiffer architecture, trained on the IMDb dataset for 15 epochs with early stopping and a learning rate of 1e-4.
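A Pfeiffer adapter is a small bottleneck module (down-projection, non-linearity, up-projection, and a residual connection) inserted into each transformer layer, so only the adapter weights are trained while the pre-trained model stays frozen. A minimal NumPy sketch of one such bottleneck; the hidden size matches DistilBERT (768), while the reduction factor of 16 and the random initialization are illustrative assumptions, not the trained adapter's actual weights:

```python
import numpy as np

HIDDEN = 768                # DistilBERT hidden size
BOTTLENECK = HIDDEN // 16   # reduction factor 16 (illustrative choice)

rng = np.random.default_rng(0)
W_down = rng.normal(0.0, 0.02, (HIDDEN, BOTTLENECK))  # down-projection
W_up = rng.normal(0.0, 0.02, (BOTTLENECK, HIDDEN))    # up-projection

def pfeiffer_adapter(h: np.ndarray) -> np.ndarray:
    """Bottleneck adapter: down-project, ReLU, up-project, add residual."""
    z = np.maximum(h @ W_down, 0.0)  # down-projection + ReLU
    return h + z @ W_up              # up-projection + residual connection

x = rng.normal(size=(4, HIDDEN))     # hidden states for 4 tokens
out = pfeiffer_adapter(x)
print(out.shape)                     # (4, 768): same shape as the input
```

Because of the residual connection, the adapter can start close to the identity and learn only a task-specific correction, which is what makes training cheap relative to full fine-tuning.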
