Du, F., Yang, Y., Zhao, Z., & Zeng, Z. (2022). Efficient Perturbation Inference and Expandable Network for continual learning. Neural Networks. https://doi.org/10.1016/j.neunet.2022.10.030
Although humans are capable of learning new tasks without forgetting previous ones, most neural networks fail to do so because learning new tasks can override the knowledge acquired from previous data. In this work, we alleviate this issue by proposing a novel Efficient Perturbation Inference and Expandable Network (EPIE-Net), which dynamically expands lightweight task-specific decoders for new classes and employs a mixed-label uncertainty strategy to improve robustness. Moreover, at inference we average the class probabilities of perturbed samples, which generally improves the performance of the model. Experimental results show that our method consistently outperforms other methods with fewer parameters on class-incremental learning benchmarks. For example, on the CIFAR-100 10-step setup, our method achieves an average accuracy of 76.33% and a last-step accuracy of 65.93% with only 3.46M parameters on average.
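The perturbation-averaged inference step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact recipe: the Gaussian noise perturbation, the number of copies `n_perturb`, and the noise scale `noise_std` are all assumptions chosen for clarity.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def perturbation_inference(model, x, n_perturb=8, noise_std=0.05, rng=None):
    """Average class probabilities over noise-perturbed copies of x.

    `model` maps a batch of inputs to logits. The perturbation type and
    its hyperparameters here are illustrative placeholders.
    """
    rng = np.random.default_rng(rng)
    probs = []
    for _ in range(n_perturb):
        noise = rng.normal(0.0, noise_std, size=x.shape)
        probs.append(softmax(model(x + noise)))
    # Averaging valid probability vectors yields a valid probability vector.
    return np.mean(probs, axis=0)

# Usage with a toy linear "model" (hypothetical, for illustration only):
W = np.array([[2.0, -1.0], [0.5, 1.5]])
model = lambda x: x @ W
x = np.array([[1.0, 0.0]])
p = perturbation_inference(model, x, n_perturb=16, noise_std=0.01, rng=0)
```

Because each perturbed copy contributes a full softmax distribution, the averaged output remains a distribution over classes; the ensemble effect over perturbations is what smooths the prediction.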
Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
Financial support was provided by the Postgraduate Research and Innovation Foundation of Yunnan University under Grant No. 2021Z113; the Yunnan Provincial Major Science and Technology Special Plan Project "Digitization Research and Application Demonstration of Yunnan Characteristic Industry" under Grant No. 202002AD080001; the National Natural Science Foundation of China (NSFC) under Grant No. 61876166; and the Yunnan Basic Research Program for Distinguished Young Youths Project under Grant No. 202101AV070003.