Transfer Learning and Distant Supervision for Multilingual Transformer Models: A Study on African Languages

Multilingual transformer models like mBERT and XLM-RoBERTa have achieved strong results across many NLP tasks and a variety of languages. However, recent work has shown that results from high-resource languages do not easily transfer to realistic, low-resource scenarios. In this work, we study trends in performance for different amounts of available resources for the three African languages Hausa, isiXhosa and Yorùbá on both NER and topic classification. We show that, in combination with transfer learning or distant supervision, these models can achieve with as little as 10 or 100 labeled sentences the same performance as baselines with much more supervised training data.
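As an illustration of the low-resource setting described above, below is a minimal sketch (not the paper's code) of fine-tuning XLM-RoBERTa for topic classification on a handful of labeled sentences using the Hugging Face transformers library; the texts, labels, and hyperparameters are placeholder assumptions.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "xlm-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Hypothetical 10-sentence training set (e.g. Hausa news sentences with topic labels).
texts = ["..."] * 10                # replace with real labeled sentences
labels = torch.tensor([0, 1] * 5)   # illustrative binary topic labels

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for epoch in range(20):             # small data, so a few more epochs than usual
    optimizer.zero_grad()
    out = model(**batch, labels=labels)
    out.loss.backward()
    optimizer.step()

The same pretrained checkpoint could analogously be fine-tuned for NER via token classification; for the authors' actual experimental setup, see the repository linked under Code below.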


Code:

https://github.com/uds-lsv/transfer-distant-transformer-african

Keywords: languages - models - transfer - transformer - performance
