How to generate summaries of different styles without requiring corpora in the target styles, or training separate models? We present two novel methods that can be deployed during summary decoding on any pre-trained Transformer-based summarization model. (1) Decoder state adjustment instantly modifies decoder final states with externally trained style scorers, to iteratively refine the output against a target style. (2) Word unit prediction constrains the word usage to impose strong lexical control during generation. In experiments of summarizing with simplicity control, both automatic evaluation and human judges find that our models produce outputs in simpler language.
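To make the two decoding-time mechanisms concrete, here is a minimal sketch in plain Python. It is not the paper's implementation: the style scorer is an illustrative logistic model over a toy decoder state, and the weights, step size, and vocabulary are all hypothetical. The first function shows the core of decoder state adjustment (iteratively nudging the final decoder state along the scorer's gradient toward the target style); the second shows word unit prediction as hard lexical control (only word units judged allowable may be emitted at a step).

```python
import math

def style_score(state, weights):
    # Hypothetical externally trained style scorer: a logistic model
    # giving the probability that the state decodes to the target style.
    z = sum(s * w for s, w in zip(state, weights))
    return 1.0 / (1.0 + math.exp(-z))

def adjust_state(state, weights, steps=10, lr=0.5):
    # Decoder state adjustment: gradient-ascend the style score so the
    # final decoder state is iteratively refined toward the target style.
    state = list(state)
    for _ in range(steps):
        p = style_score(state, weights)
        # d sigmoid(w.h) / dh = p * (1 - p) * w
        grad = [p * (1.0 - p) * w for w in weights]
        state = [s + lr * g for s, g in zip(state, grad)]
    return state

def constrained_argmax(logits, allowed):
    # Word unit prediction as strong lexical control: restrict the
    # next-token choice to word units predicted as allowed.
    return max((t for t in logits if t in allowed), key=lambda t: logits[t])

if __name__ == "__main__":
    w = [0.8, -0.3, 0.5]          # toy scorer weights (assumption)
    h0 = [0.1, 0.2, -0.1]         # toy final decoder state (assumption)
    h1 = adjust_state(h0, w)
    print(style_score(h1, w) > style_score(h0, w))   # adjustment raises the style score

    logits = {"utilize": 2.0, "use": 1.5}
    print(constrained_argmax(logits, {"use"}))       # lexical control picks "use"
```

In a real Transformer decoder, the adjustment would be applied to the final hidden states before the output projection, and the allowed-unit mask would be added to the logits before softmax; the scalar arithmetic above only illustrates the control flow.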

Author(s) : Shuyang Cao, Lu Wang

