Pre-trained model:
Pfeiffer adapter trained with masked language modelling (MLM) on Chinese Wikipedia articles for 250k steps with a batch size of 64.
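
As a rough usage sketch, such an adapter could be loaded and activated with the `adapters` library as shown below. The base checkpoint (`bert-base-multilingual-cased`) and the adapter identifier are placeholders, since neither is specified in this card.

```python
# Minimal sketch, assuming the `adapters` library; the base checkpoint and
# adapter identifier below are placeholders, not taken from this card.
from transformers import AutoTokenizer
from adapters import AutoAdapterModel

base_model = "bert-base-multilingual-cased"            # assumed base model
adapter_id = "path/or-hub-id-of-the-zh-wiki-adapter"   # placeholder identifier

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoAdapterModel.from_pretrained(base_model)

# Load the Pfeiffer-configuration adapter and activate it for inference.
adapter_name = model.load_adapter(adapter_id)
model.set_active_adapters(adapter_name)
```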