Contextual Transformation Networks for Online Continual Learning

Continual learning methods with fixed architectures rely on a single network to learn models that can perform well on all tasks. Dynamic architecture methods can instead maintain a separate network for each task, but they are too expensive to train and not scalable in practice, especially in online settings. We propose a novel online continual learning method named "Contextual Transformation Networks" (CTN) to efficiently model task-specific features. In addition, we propose a novel dual memory design and an objective to train CTN that address both catastrophic forgetting and knowledge transfer simultaneously. CTN is competitive with a large-scale dynamic architecture network and consistently outperforms other fixed architecture methods under the same standard backbone. We will release our implementation upon acceptance.
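To make the trade-off concrete, here is a minimal sketch of how a fixed backbone can be specialized per task with only a lightweight transformation, rather than a full network per task. This is an illustrative toy, not the paper's exact architecture: the names (`W_shared`, `task_scale`, `task_shift`) and the affine scale-and-shift form are assumptions chosen to show why the per-task overhead stays small.

```python
import numpy as np

rng = np.random.default_rng(0)

FEATURE_DIM = 8   # illustrative sizes, not from the paper
NUM_TASKS = 3

# Shared backbone weights, used by every task (FEATURE_DIM**2 parameters).
W_shared = rng.standard_normal((FEATURE_DIM, FEATURE_DIM))

# Hypothetical lightweight per-task parameters: one scale and one shift
# vector per task (2 * FEATURE_DIM parameters each), far cheaper than
# allocating a separate network per task.
task_scale = np.ones((NUM_TASKS, FEATURE_DIM))
task_shift = np.zeros((NUM_TASKS, FEATURE_DIM))

def forward(x, task_id):
    """Compute shared features, then apply a task-specific affine map."""
    shared = np.tanh(x @ W_shared)  # features common to all tasks
    return task_scale[task_id] * shared + task_shift[task_id]

x = rng.standard_normal(FEATURE_DIM)
out = forward(x, task_id=1)
print(out.shape)  # → (8,)
```

The point of the sketch: adding a new task costs only 2 × FEATURE_DIM extra parameters here, while a dynamic architecture method would add on the order of FEATURE_DIM² per task.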

Links: PDF - Abstract

Keywords: CTN, architecture, learning, methods, online
