Multilingual representations embed words from many languages into a single semantic space such that words with similar meanings are close to each other regardless of the language. These embeddings have been widely used in various settings, such as cross-lingual transfer, where a natural language processing (NLP) model trained on one language is deployed to another language. While cross-lingual transfer techniques are powerful, they carry gender bias from the source to the target languages. We propose several ways of quantifying bias in multilingual representations from both the intrinsic and extrinsic perspectives, and we further provide recommendations for using multilingual word representations in downstream tasks.
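
To make the "intrinsic perspective" concrete, below is a minimal, hedged sketch of one common intrinsic bias measure for word embeddings: scoring a word by the difference of its cosine similarity to male- and female-definitional anchor words. This is a generic projection-style metric, not necessarily the exact metric proposed in the paper, and the toy vectors and words used here are hypothetical stand-ins for a real multilingual embedding space.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def intrinsic_bias(word_vec, male_vec, female_vec):
    """Positive score: closer to the male anchor; negative: closer to the female anchor."""
    return cosine(word_vec, male_vec) - cosine(word_vec, female_vec)

# Toy 4-dimensional embeddings standing in for an aligned multilingual space
# (real vectors would come from e.g. fastText/MUSE alignments or a multilingual encoder).
emb = {
    "he":        np.array([ 1.0, 0.1, 0.0, 0.2]),
    "she":       np.array([-1.0, 0.1, 0.0, 0.2]),
    "doctor":    np.array([ 0.4, 0.8, 0.1, 0.3]),
    "enfermera": np.array([-0.5, 0.7, 0.2, 0.1]),  # "nurse" (Spanish), hypothetical vector
}

for w in ("doctor", "enfermera"):
    print(w, round(intrinsic_bias(emb[w], emb["he"], emb["she"]), 3))
```

In this toy example the occupation words pick up opposite-signed scores, which is the kind of cross-lingual asymmetry an intrinsic bias analysis would aggregate over many word pairs; extrinsic measures would instead evaluate bias in a downstream task built on the transferred model.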

Links: PDF - Abstract

Code:

None

Keywords: multilingual - language - bias - provide - words
