SeqMix creates new synthetic examples by softly combining input/output sequences from the training set. We connect this approach to existing techniques such as SwitchOut and word dropout, and show that these techniques are all approximating variants of a single objective. SeqMix consistently yields approximately 1.0 BLEU improvement on five different translation datasets over strong Transformer baselines. On tasks that require strong compositional generalization, such as SCAN, SeqMix also offers further improvements.
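The core idea of softly combining two training pairs can be illustrated with a small sketch. This is a hypothetical simplification, not the paper's exact formulation: it draws a mixing weight from a Beta distribution and, at each position, keeps the token from one parent pair with that probability (the function name `seqmix` and all parameters here are illustrative assumptions).

```python
import random

def seqmix(src_a, tgt_a, src_b, tgt_b, alpha=0.1, seed=None):
    """Illustrative sketch of sequence-level mixed sample augmentation.

    Draws lam ~ Beta(alpha, alpha), then at each position keeps the token
    from the first parent pair with probability lam, otherwise takes it
    from the second. Padding/length alignment is simplified by truncating
    to the shorter parent.
    """
    rng = random.Random(seed)
    lam = rng.betavariate(alpha, alpha)

    def mix(a, b):
        n = min(len(a), len(b))
        return [a[i] if rng.random() < lam else b[i] for i in range(n)]

    # Source and target are mixed so that input and output stay paired.
    return mix(src_a, src_b), mix(tgt_a, tgt_b), lam

# Usage: mix two toy translation pairs into one synthetic pair.
src, tgt, lam = seqmix(["a", "b", "c"], ["x", "y"],
                       ["d", "e", "f"], ["u", "v"], seed=0)
```

With a small `alpha`, the Beta draw concentrates near 0 or 1, so most synthetic examples stay close to one of the two parents, which keeps the mixed pairs fluent.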

Author(s) : Demi Guo, Yoon Kim, Alexander M. Rush

