Most approaches in few-shot learning rely on costly annotated data related to the goal task domain during (pre-)training. In settings with realistic domain shift, common transfer learning has been shown to outperform supervised meta-learning. We propose a transfer learning approach which constructs a metric embedding that clusters unlabeled prototypical samples and their augmentations closely together. This pre-trained embedding is a starting point for classification by summarizing class clusters and fine-tuning. We demonstrate that our self-supervised prototypical transfer learning approach, ProtoTransfer, outperforms state-of-the-art unsupervised meta-learning methods on few-shot tasks from the mini-ImageNet dataset.
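The following is a minimal PyTorch sketch of the two stages the abstract describes: a self-supervised pre-training loss that pulls each unlabeled sample's augmented views toward that sample's own embedding (treating every sample as its own one-shot prototype), and a downstream step that summarizes labeled support embeddings into class prototypes for nearest-prototype classification. All names here (`embed_net`, `augment`, `n_views`, `few_shot_classify`) are illustrative assumptions, not the authors' API; see the linked repository for the actual implementation.

```python
import torch
import torch.nn.functional as F

def proto_pretrain_loss(embed_net, images, augment, n_views=3):
    """Self-supervised prototypical loss (sketch).

    images:  (B, C, H, W) batch of unlabeled images.
    augment: stochastic augmentation, applied independently per view.
    Each original sample acts as its own class prototype; its augmented
    views must embed closer to it than to any other sample in the batch.
    """
    B = images.size(0)
    prototypes = embed_net(images)                                 # (B, D)
    views = torch.cat([augment(images) for _ in range(n_views)])   # (B*n_views, C, H, W)
    queries = embed_net(views)                                     # (B*n_views, D)

    # Squared Euclidean distances between every view and every prototype.
    dists = torch.cdist(queries, prototypes) ** 2                  # (B*n_views, B)

    # Each view's "label" is the index of the sample it was derived from.
    targets = torch.arange(B, device=images.device).repeat(n_views)

    # Softmax over negative distances: pull views to their own prototype,
    # push them away from the other samples' prototypes.
    return F.cross_entropy(-dists, targets)

def few_shot_classify(embed_net, support_x, support_y, query_x, n_classes):
    """After pre-training (and optional fine-tuning on the support set):
    summarize each class's support embeddings into a mean prototype and
    classify queries by nearest prototype."""
    s = embed_net(support_x)                                       # (N*K, D)
    protos = torch.stack([s[support_y == c].mean(0) for c in range(n_classes)])
    return (-torch.cdist(embed_net(query_x), protos)).argmax(dim=1)
```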

Links: PDF - Abstract

Code: https://github.com/indy-lab/ProtoTransfer

Keywords: learning - transfer - prototypical - supervised - embedding
