A universal cross-language representation leads to better multilingual translation performance. For English-centric directions, mCOLT achieves competitive or even better performance than the strong pre-trained model mBART. For non-English directions, the model achieves an average improvement of 10+ BLEU over the multilingual baseline.
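The name mCOLT (multilingual COntrastive Learning for Translation) suggests the universal cross-language representation is learned with a contrastive objective that pulls parallel sentences together in embedding space while pushing non-parallel pairs apart. As a hedged sketch only (not the authors' actual implementation, and using hypothetical embedding inputs), an InfoNCE-style loss over a batch of parallel sentence embeddings could look like this:

```python
import numpy as np

def info_nce_loss(src_emb, tgt_emb, temperature=0.1):
    """InfoNCE-style contrastive loss over a batch of parallel sentence
    embeddings: each source sentence should score highest against its own
    translation (the diagonal of the similarity matrix) and lower against
    every other target in the batch."""
    # L2-normalize so dot products become cosine similarities
    src = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    tgt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    sim = src @ tgt.T / temperature  # (batch, batch) similarity logits
    # row-wise log-softmax; the diagonal entries are the positive pairs
    log_probs = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

In such a setup, well-aligned source/target embeddings yield a loss near zero, while mismatched pairs yield a loss close to log(batch_size); the temperature controls how sharply the positives must dominate.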

Author(s) : Xiao Pan, Mingxuan Wang, Liwei Wu, Lei Li

Links : PDF - Abstract

Code :

Keywords : mCOLT - multilingual - English
