Variational Information Bottleneck for Effective Low-Resource Fine-Tuning

Large-scale pretrained language models often overfit in low-resource scenarios. We propose to use a Variational Information Bottleneck (VIB) to suppress irrelevant features when fine-tuning on target tasks. The VIB model finds sentence representations that are more robust to biases in natural language inference datasets, and thereby generalizes better to out-of-domain data: it improves generalization on 13 of 15 out-of-domain natural language inference benchmarks. The method significantly improves transfer learning in low-resource scenarios, surpassing prior work.
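For intuition, below is a minimal PyTorch sketch of a VIB classification head of the kind the abstract describes: a stochastic bottleneck z is sampled from the encoder output, the task loss is computed on z, and a beta-weighted KL term penalizes information retained about the input. All names and dimensions here (VIBClassifier, bottleneck_dim, beta) are illustrative assumptions, not the paper's actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VIBClassifier(nn.Module):
    """Hypothetical VIB head on top of a pretrained sentence encoder."""

    def __init__(self, encoder_dim=768, bottleneck_dim=128, num_classes=3, beta=1e-3):
        super().__init__()
        self.beta = beta
        # Map the encoder output to the mean and log-variance of the bottleneck z
        self.mu = nn.Linear(encoder_dim, bottleneck_dim)
        self.logvar = nn.Linear(encoder_dim, bottleneck_dim)
        self.classifier = nn.Linear(bottleneck_dim, num_classes)

    def forward(self, h, labels=None):
        # h: (batch, encoder_dim) pooled sentence representation from a pretrained LM
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        logits = self.classifier(z)
        if labels is None:
            return logits
        # Task loss: cross-entropy computed on the compressed representation z
        task_loss = F.cross_entropy(logits, labels)
        # KL( N(mu, sigma^2) || N(0, I) ): compresses away input-specific features
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=-1).mean()
        return task_loss + self.beta * kl
```

The beta hyperparameter trades off compression against task accuracy; larger values discard more of the input features, which is what is hypothesized to suppress dataset biases.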


Code: None

Keywords: language, natural, improves, resource, inference
