GigaBERT: Zero-shot Transfer Learning from English to Arabic

Multilingual pre-trained Transformers, such as mBERT and XLM-RoBERTa, have been shown to enable effective cross-lingual zero-shot transfer. However, their performance on Arabic information extraction (IE) tasks is not very well studied. In this paper, we pre-train a customized bilingual BERT, dubbed GigaBERT, that is designed specifically for Arabic NLP and English-to-Arabic zero-shot transfer learning. Our best model significantly outperforms mBERT in both the supervised and zero-shot transfer settings.
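To make the transfer setting concrete, here is a minimal sketch of the zero-shot protocol: fine-tune the bilingual encoder on labeled English examples only, then evaluate directly on Arabic input with no Arabic labels ever seen during training. It uses a toy sentence-classification task for brevity (the paper's experiments cover IE tasks such as NER), and the checkpoint identifier is an assumption based on the authors' released models, not something stated here.

```python
# Toy sketch of English-to-Arabic zero-shot transfer: fine-tune on
# English examples only, then evaluate directly on Arabic, relying on
# the shared bilingual encoder. Sentence classification is used for
# brevity; the paper's experiments are on IE tasks such as NER.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "lanwuwei/GigaBERT-v4-Arabic-and-English"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID, num_labels=2)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# --- Fine-tune on labeled ENGLISH data only (toy labels) ---
english_texts = ["The meeting is in Cairo.", "I loved this movie."]
english_labels = torch.tensor([0, 1])
batch = tokenizer(english_texts, padding=True, return_tensors="pt")
optimizer.zero_grad()
loss = model(**batch, labels=english_labels).loss
loss.backward()
optimizer.step()

# --- Evaluate on ARABIC input, zero-shot: no Arabic labels used above ---
arabic_texts = ["الاجتماع في القاهرة"]
model.eval()
with torch.no_grad():
    logits = model(**tokenizer(arabic_texts, padding=True,
                               return_tensors="pt")).logits
print(logits.argmax(dim=-1))  # zero-shot predictions on Arabic
```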

Links: PDF - Abstract

Code:

https://github.com/lanwuwei/GigaBERT
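As a quick start, the sketch below loads a released GigaBERT checkpoint through the Hugging Face transformers library and encodes one English and one Arabic sentence with the shared bilingual vocabulary. The checkpoint name is an assumption; consult the repository README for the exact identifiers.

```python
# Minimal sketch: loading a GigaBERT checkpoint with Hugging Face
# transformers and encoding mixed English/Arabic input. The checkpoint
# name below is an assumption, not confirmed by this page.
from transformers import AutoTokenizer, AutoModel

MODEL_ID = "lanwuwei/GigaBERT-v4-Arabic-and-English"  # assumed name

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

# One English and one Arabic sentence, tokenized with the shared
# bilingual vocabulary and encoded in a single batch.
inputs = tokenizer(
    ["GigaBERT handles English.", "ويتعامل مع العربية أيضا"],
    padding=True,
    return_tensors="pt",
)
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```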

Keywords: Arabic - transfer - zero-shot - English - pre-training
