Previous works on cross-lingual summarization mainly focus on using pipeline methods or on training an end-to-end model on translated parallel data. However, directly learning cross-lingual summarization is a big challenge for the model, as it requires learning to understand different languages and learning how to summarize at the same time. In this paper, we propose to ease training by jointly learning to align and summarize. We design relevant loss functions to train this framework and propose several methods to enhance the isomorphism and cross-lingual transfer between languages. Experimental results show that our model outperforms competitive models in most cases. In addition, we show that our model even has the ability to generate cross-lingual summaries without access to any cross-lingual corpus.
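
As a rough illustration of what "jointly learning to align and summarize" could look like in training code, the sketch below combines a token-level summarization loss with an auxiliary alignment loss in a single weighted objective. The class name `JointAlignSummarizeLoss`, the alignment targets, and the weight `lambda_align` are illustrative assumptions, not the paper's actual loss design.

```python
import torch
import torch.nn as nn

class JointAlignSummarizeLoss(nn.Module):
    """Sketch of a joint objective: summarization loss + weighted alignment loss."""

    def __init__(self, lambda_align: float = 0.5, pad_id: int = 0):
        super().__init__()
        self.lambda_align = lambda_align                     # trade-off between the two objectives
        self.summ_loss = nn.CrossEntropyLoss(ignore_index=pad_id)
        self.align_loss = nn.CrossEntropyLoss(ignore_index=-100)

    def forward(self, summary_logits, summary_targets,
                alignment_logits, alignment_targets):
        # Standard token-level cross-entropy for the generated summary.
        l_summ = self.summ_loss(
            summary_logits.view(-1, summary_logits.size(-1)),
            summary_targets.view(-1),
        )
        # Auxiliary loss encouraging cross-lingual alignment, modeled here
        # as predicting aligned token indices (an assumption for illustration).
        l_align = self.align_loss(
            alignment_logits.view(-1, alignment_logits.size(-1)),
            alignment_targets.view(-1),
        )
        # Joint objective: summarization term plus weighted alignment term.
        return l_summ + self.lambda_align * l_align
```

In practice the alignment signal could come from attention distributions or bilingual embeddings; the sketch only shows how the two objectives can be optimized together as a weighted sum.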

Code: None

Keywords: cross-lingual summarization, learning, model
