This work revisits the task of training sequence tagging models with limited resources using transfer learning. We investigate several approaches introduced in recent work and propose a new loss that relies on sentence reconstruction from normalized embeddings. We show improved results on the CoNLL02 NER and UD 1.2 POS datasets and demonstrate the method's power for low-resource transfer learning, achieving a 0.6 F1 score in Dutch using only one sample from that language.
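The abstract does not spell out the loss, but the idea of reconstructing a sentence from L2-normalized token embeddings can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the function name `reconstruction_loss`, the tied vocabulary matrix, and the toy shapes are all assumptions.

```python
import numpy as np

def reconstruction_loss(embeddings, token_ids, vocab_matrix):
    """Hypothetical sketch: L2-normalize token embeddings, score each
    against a (tied) vocabulary matrix, and take the cross-entropy
    against the original tokens as a reconstruction loss."""
    # L2-normalize each token embedding (assumed reading of
    # "normalized embeddings" in the abstract)
    norms = np.linalg.norm(embeddings, axis=-1, keepdims=True)
    normed = embeddings / np.clip(norms, 1e-8, None)
    # logits over the vocabulary via dot product with tied weights
    logits = normed @ vocab_matrix.T  # shape: (seq_len, vocab_size)
    # numerically stable log-softmax
    logits = logits - logits.max(axis=-1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=-1, keepdims=True))
    # negative log-likelihood of the true tokens, averaged over the sentence
    return -log_probs[np.arange(len(token_ids)), token_ids].mean()

# Toy usage: vocabulary of 10 words, 8-dim embeddings, a 4-token sentence
rng = np.random.default_rng(0)
vocab = rng.normal(size=(10, 8))
tokens = np.array([1, 3, 5, 7])
noisy = vocab[tokens] + 0.1 * rng.normal(size=(4, 8))
loss = reconstruction_loss(noisy, tokens, vocab)
```

Because the loss is a standard cross-entropy, it is non-negative and decreases as the (normalized) embeddings align with the rows of the vocabulary matrix for the true tokens.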

Links: PDF - Abstract

Code:

None

Keywords: reconstruction - sequence - tagging - sentence - resources
