Continual Learning, Fast and Slow

Journal Title:
IEEE Transactions on Pattern Analysis and Machine Intelligence
Publication Date:
13 October 2023
Pham, Q., Liu, C., & Hoi, S. C. H. (2023). Continual Learning, Fast and Slow. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1–16.
According to the Complementary Learning Systems (CLS) theory [1] in neuroscience, humans achieve effective continual learning through two complementary systems: a fast learning system centered on the hippocampus for rapid learning of specific, individual experiences; and a slow learning system located in the neocortex for the gradual acquisition of structured knowledge about the environment. Motivated by this theory, we propose DualNets (for Dual Networks), a general continual learning framework comprising a fast learning system for supervised learning of pattern-separated representations from specific tasks and a slow learning system for learning task-agnostic, general representations via Self-Supervised Learning (SSL). DualNets can seamlessly incorporate both representation types into a holistic framework to facilitate better continual learning in deep neural networks. Via extensive experiments, we demonstrate the promising results of DualNets on a wide range of continual learning protocols, ranging from the standard offline, task-aware setting to the challenging online, task-free scenario. Notably, on the CTrL [2] benchmark, which contains unrelated tasks with vastly different visual images, DualNets achieves performance competitive with existing state-of-the-art dynamic architecture strategies [3]. Furthermore, we conduct comprehensive ablation studies to validate DualNets' efficacy, robustness, and scalability. Code will be made available at
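The abstract's fast/slow split can be illustrated with a minimal sketch: a slow module that produces task-agnostic features, and a fast head that adapts those features to the current task's labels. This is an illustrative toy in NumPy only; all names, dimensions, and the random-feature stand-in for SSL are assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

class SlowLearner:
    """Stand-in for the slow system: maps raw inputs to general,
    task-agnostic features (here, fixed random ReLU features in place
    of an SSL-trained representation)."""
    def __init__(self, in_dim, feat_dim):
        self.W = rng.standard_normal((in_dim, feat_dim)) * 0.1

    def features(self, x):
        return np.maximum(x @ self.W, 0.0)

class FastLearner:
    """Stand-in for the fast system: a supervised head that quickly
    fits the current task on top of the slow features."""
    def __init__(self, feat_dim, n_classes):
        self.V = np.zeros((feat_dim, n_classes))

    def fit_step(self, feats, y_onehot, lr=0.1):
        # One step of softmax-regression gradient descent.
        logits = feats @ self.V
        probs = np.exp(logits - logits.max(axis=1, keepdims=True))
        probs /= probs.sum(axis=1, keepdims=True)
        self.V -= lr * feats.T @ (probs - y_onehot) / len(feats)

    def predict(self, feats):
        return (feats @ self.V).argmax(axis=1)

# Toy task: which half of the input vector has the larger mean?
X = rng.standard_normal((256, 8))
y = (X[:, :4].mean(axis=1) > X[:, 4:].mean(axis=1)).astype(int)
Y = np.eye(2)[y]

slow = SlowLearner(in_dim=8, feat_dim=32)
fast = FastLearner(feat_dim=32, n_classes=2)

F = slow.features(X)          # slow system: general representation
for _ in range(200):
    fast.fit_step(F, Y)       # fast system: rapid task-specific fit

acc = (fast.predict(F) == y).mean()
print(f"train accuracy: {acc:.2f}")
```

The design point mirrored here is the division of labor: the slow module is shared and stable across tasks, while only the lightweight fast head changes per task.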
License type:
Publisher Copyright
Funding Info:
There was no specific funding for this research.
© 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Files uploaded:

File | Size | Format | Action
pami-upload.pdf | 1,022.91 KB | PDF | Request a copy