Pre-trained models:
- Houlsby adapter trained with masked language modelling (MLM) on Javanese Wikipedia articles for 100k steps with a batch size of 64.
- Pfeiffer adapter trained with masked language modelling (MLM) on Javanese Wikipedia articles for 100k steps with a batch size of 64.
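
For context, here is a minimal sketch of how adapters like these are typically created and trained with the adapter-transformers library. The base model name, adapter name, and Hub identifier below are illustrative assumptions, not the published checkpoints.

```python
# Sketch only: assumes the adapter-transformers library
# (pip install adapter-transformers); all names are placeholders.
from transformers import AutoAdapterModel, AutoTokenizer
from transformers.adapters import HoulsbyConfig, PfeifferConfig

base = "bert-base-multilingual-cased"  # assumed base model
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoAdapterModel.from_pretrained(base)

# Add a fresh bottleneck adapter of either architecture:
model.add_adapter("jv_mlm", config=HoulsbyConfig())      # Houlsby: two bottlenecks per layer
# model.add_adapter("jv_mlm", config=PfeifferConfig())   # Pfeiffer: one bottleneck per layer

# Freeze the base model so only the adapter weights are updated,
# then attach a masked language modelling head for training.
model.train_adapter("jv_mlm")
model.add_masked_lm_head("jv_mlm")

# Loading a published adapter instead (identifier hypothetical):
# adapter_name = model.load_adapter("jv/wiki@ukp")
# model.set_active_adapters(adapter_name)
```

The difference between the two architectures is where the bottleneck modules sit: the Houlsby configuration inserts one after both the attention and feed-forward sublayers of each transformer block, while the Pfeiffer configuration inserts one only after the feed-forward sublayer, roughly halving the added parameters.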