Tuned Compositional Feature Replays for Efficient Stream Learning

Title:
Tuned Compositional Feature Replays for Efficient Stream Learning
Journal Title:
IEEE Transactions on Neural Networks and Learning Systems
Publication Date:
25 December 2023 (early access; issue published Feb. 2025)
Citation:
M. B. Talbot, R. Zawar, R. Badkundri, M. Zhang and G. Kreiman, "Tuned Compositional Feature Replays for Efficient Stream Learning," in IEEE Transactions on Neural Networks and Learning Systems, vol. 36, no. 2, pp. 3300-3314, Feb. 2025, doi: 10.1109/TNNLS.2023.3344085.
Abstract:
Our brains extract durable, generalizable knowledge from transient experiences of the world. Artificial neural networks come nowhere close to this ability. When tasked with learning to classify objects by training on non-repeating video frames in temporal order (online stream learning), models that learn well from shuffled datasets catastrophically forget old knowledge upon learning new stimuli. We propose a new continual learning algorithm, Compositional Replay Using Memory Blocks (CRUMB), which mitigates forgetting by replaying feature maps reconstructed by combining generic parts. CRUMB concatenates trainable and re-usable "memory block" vectors to compositionally reconstruct feature map tensors in convolutional neural networks. Storing the indices of memory blocks used to reconstruct new stimuli enables memories of the stimuli to be replayed during later tasks. This reconstruction mechanism also primes the neural network to minimize catastrophic forgetting by biasing it towards attending to information about object shapes more than information about image textures, and stabilizes the network during stream learning by providing a shared feature-level basis for all training examples. These properties allow CRUMB to outperform an otherwise identical algorithm that stores and replays raw images, while occupying only 3.6% as much memory. We stress-tested CRUMB alongside 13 competing methods on 7 challenging datasets. To address the limited number of existing online stream learning datasets, we introduce 2 new benchmarks by adapting existing datasets for stream learning. With only 3.7-4.1% as much memory and 15-43% as much runtime, CRUMB mitigates catastrophic forgetting more effectively than the state-of-the-art.
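
The core mechanism in the abstract, replaying feature maps reconstructed from a small codebook of reusable "memory block" vectors indexed at encoding time, can be sketched in a few lines. The Python below is a minimal illustration only, not the paper's implementation: the codebook size (256), block dimension (8), Euclidean nearest-block matching, and all function names are assumptions made for demonstration.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a codebook of 256 trainable "memory block" vectors,
# each of dimension 8; feature maps with 32 channels (4 blocks of 8).
n_blocks, block_dim = 256, 8
codebook = rng.standard_normal((n_blocks, block_dim)).astype(np.float32)

def encode(feature_map: np.ndarray) -> np.ndarray:
    """Map each block_dim-sized channel slice at every spatial location
    to the index of its nearest memory block (Euclidean distance is an
    assumed matching rule)."""
    c, h, w = feature_map.shape
    assert c % block_dim == 0
    # (num_groups, spatial, block_dim): channel axis sliced into blocks.
    slices = feature_map.reshape(c // block_dim, block_dim, h * w)
    slices = slices.transpose(0, 2, 1).reshape(-1, block_dim)
    d2 = ((slices[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    return d2.argmin(axis=1).reshape(c // block_dim, h, w)

def decode(indices: np.ndarray) -> np.ndarray:
    """Reconstruct an approximate feature map by concatenating the indexed
    memory blocks along the channel axis; only the integer indices need
    to be stored between tasks for replay."""
    g, h, w = indices.shape
    blocks = codebook[indices]              # (g, h, w, block_dim)
    return blocks.transpose(0, 3, 1, 2).reshape(g * block_dim, h, w)

fmap = rng.standard_normal((32, 7, 7)).astype(np.float32)
idx = encode(fmap)                          # compact memory: 4 x 7 x 7 ints
replayed = decode(idx)                      # approximate map for replay
print(idx.shape, replayed.shape)            # (4, 7, 7) (32, 7, 7)

Storing only the integer indices (here 4 x 7 x 7 per example, versus 32 x 7 x 7 floats for the full feature map) illustrates why index-based compositional replay is far cheaper than storing raw images or full feature maps, the memory saving the abstract reports.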
License type:
Publisher Copyright
Funding Info:
This research/project is supported by the National Research Foundation - AI Singapore
Grant Reference No.: AISG2-RP-2021-025

This research/project is supported by the National Research Foundation - National Research Foundation Fellowship Award
Grant Reference No.: NRF-NRFF15-2023-0001
Description:
© 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Files uploaded:
210402206v8.pdf (11.70 MB, PDF)