Cross-lingual Transfer Learning for Pre-trained Contextualized Language Models

The pre-trained contextualized language model (PrLM) has made a significant impact on NLP. However, training PrLMs in languages other than English can be impractical for two reasons: other languages often lack corpora sufficient for training powerful PrLMs, and pre-training a new model from scratch for every language is computationally expensive. To handle the symbol order and sequence length differences between languages, we propose an intermediate "TRILayer" structure that learns from these differences and creates a better transfer in our primary translation direction. We also showcase an embedding alignment that adversarially adapts a PrLM's non-contextualized embedding space and the TRILayer structure to learn a text transformation network across languages, which addresses the vocabulary difference between languages. Experiments on both language understanding and structure parsing tasks show the proposed framework significantly outperforms language models trained from scratch with limited data in both performance and efficiency. Moreover, despite an insignificant performance loss compared to pre-training from scratch in resource-rich scenarios, our transfer learning framework is significantly more economical.
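
The adversarial adaptation of the non-contextualized embedding space mentioned above can be pictured with a minimal, hypothetical sketch: a frozen source-language embedding table taken from an existing PrLM, a trainable target-language embedding table, and a small discriminator trained to tell the two apart while the target embeddings learn to fool it. All names, sizes, and hyperparameters below are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of adversarial embedding alignment between two languages.
# EMB_DIM, vocabulary sizes, and the discriminator shape are assumptions.
import torch
import torch.nn as nn

EMB_DIM = 768                 # dimensionality of the PrLM's embeddings (assumed)
N_SRC, N_TGT = 30000, 30000   # source / target vocabulary sizes (assumed)

# Frozen source-language embeddings from the pre-trained PrLM, and trainable
# target-language embeddings we want to pull into the same space.
src_emb = nn.Embedding(N_SRC, EMB_DIM)
src_emb.weight.requires_grad_(False)
tgt_emb = nn.Embedding(N_TGT, EMB_DIM)

# Discriminator: predicts whether a vector comes from the source embedding space.
disc = nn.Sequential(
    nn.Linear(EMB_DIM, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, 1),
)

bce = nn.BCEWithLogitsLoss()
d_opt = torch.optim.Adam(disc.parameters(), lr=1e-4)
g_opt = torch.optim.Adam(tgt_emb.parameters(), lr=1e-4)

for step in range(1000):
    src_ids = torch.randint(0, N_SRC, (256,))
    tgt_ids = torch.randint(0, N_TGT, (256,))
    src_vecs = src_emb(src_ids)
    tgt_vecs = tgt_emb(tgt_ids)

    # 1) Train the discriminator to separate source vectors from target vectors.
    d_loss = bce(disc(src_vecs), torch.ones(256, 1)) + \
             bce(disc(tgt_vecs.detach()), torch.zeros(256, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Train the target embeddings to fool the discriminator, pushing the two
    #    non-contextualized embedding spaces toward a shared distribution.
    g_loss = bce(disc(tgt_vecs), torch.ones(256, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```

This kind of discriminator-based objective aligns the two embedding spaces without requiring parallel vocabularies; in the proposed framework it would be combined with the TRILayer transfer structure rather than used in isolation.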

Links: PDF - Abstract

Code:

None

Keywords: languages - framework - language - proposed - training
