Class-Incremental Learning via Knowledge Amalgamation

Title:
Class-Incremental Learning via Knowledge Amalgamation
Journal Title:
Lecture Notes in Computer Science
Publication Date:
16 March 2023
Citation:
de Carvalho, M., Pratama, M., Zhang, J., & Sun, Y. (2023). Class-Incremental Learning via Knowledge Amalgamation. In Machine Learning and Knowledge Discovery in Databases (pp. 36–50). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-26409-2_3
Abstract:
Catastrophic forgetting has been a significant problem hindering the deployment of deep learning algorithms in the continual learning setting. Numerous methods have been proposed to address the catastrophic forgetting problem, in which an agent loses its generalization power on old tasks while learning new tasks. We put forward an alternative strategy to handle catastrophic forgetting with knowledge amalgamation (CFA), which learns a student network from multiple heterogeneous teacher models specializing in previous tasks and can be applied to current offline methods. The knowledge amalgamation process is carried out in a single-head manner with only a selected number of memorized samples and no annotations. The teachers and the student do not need to share the same network structure, allowing heterogeneous tasks to be adapted to a compact or sparse data representation. We compare our method with competitive baselines from different strategies, demonstrating our approach’s advantages. Source code: github.com/Ivsucram/CFA
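As a rough illustration of the idea described in the abstract (not the paper's exact CFA procedure), a single-head amalgamation step can be sketched as follows: each teacher, specialized in one past task, produces soft outputs over its own class slice on unlabeled memorized samples; these are merged into one target distribution over the combined class space, and the student is trained to match it with a distillation loss. All function names, the temperature value, and the merging-by-concatenation scheme are illustrative assumptions.

```python
import numpy as np


def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(logits, dtype=float) / temperature
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)


def amalgamation_targets(teacher_logits, temperature=2.0):
    """Merge per-task teacher outputs into one single-head soft target.

    Illustrative assumption: each teacher covers a disjoint slice of the
    combined class space, so its softened outputs are concatenated and
    renormalized into one distribution over all classes seen so far.
    """
    per_teacher = [softmax(t, temperature) for t in teacher_logits]
    merged = np.concatenate(per_teacher, axis=-1)
    return merged / merged.sum(axis=-1, keepdims=True)


def distillation_loss(student_logits, targets, temperature=2.0):
    """KL divergence from the merged teacher targets to the student.

    No labels are needed: the targets come entirely from the teachers,
    matching the annotation-free setting described in the abstract.
    """
    p = softmax(student_logits, temperature)
    eps = 1e-12  # numerical floor to avoid log(0)
    return float(np.sum(targets * (np.log(targets + eps) - np.log(p + eps))))
```

A training loop would minimize `distillation_loss` over the memorized samples; the loss is zero exactly when the student's softened distribution reproduces the amalgamated teacher targets, regardless of whether the student shares the teachers' architectures.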
License type:
Publisher Copyright
Funding Info:
This research / project is supported by the National Research Foundation, Singapore - Advanced Manufacturing and Engineering (AME) Industry Alignment Fund - Pre-Positioning
Grant Reference no. : A19C1A0018
Description:
This is a post-peer-review, pre-copyedit version of an article published in Lecture Notes in Computer Science. The final authenticated version is available online at: http://dx.doi.org/10.1007/978-3-031-26409-2_3
ISBN (electronic):
9783031264092
ISBN (print):
9783031264085