Reinforced Knowledge Distillation for Time Series Regression

Title:
Reinforced Knowledge Distillation for Time Series Regression
Journal Title:
IEEE Transactions on Artificial Intelligence
Publication Date:
18 December 2023
Citation:
Xu, Q., Wu, K., Wu, M., Mao, K., Li, X., & Chen, Z. (2023). Reinforced Knowledge Distillation for Time Series Regression. IEEE Transactions on Artificial Intelligence, 1–11. https://doi.org/10.1109/tai.2023.3341854
Abstract:
As one of the most popular and effective methods in model compression, knowledge distillation (KD) transfers knowledge from one or more large-scale networks (i.e., teachers) to a compact network (i.e., the student). In the multi-teacher scenario, existing methods assign either equal or fixed weights to the different teacher models during distillation, which can be inefficient because teachers may perform differently, or even contradictorily, on different training samples. To address this issue, we propose a novel reinforced knowledge distillation method with negatively correlated teachers, generated via negative correlation learning. Negative correlation learning encourages the teachers to learn different aspects of the data, so their ensemble is more comprehensive and better suited to multi-teacher KD. A reinforced KD algorithm is then proposed to dynamically select suitable teachers for different training instances via a dueling Double Deep Q-Network (DDQN). Our proposed method complements the existing KD procedure in both teacher generation and teacher selection. Extensive experimental results on two real-world time series regression tasks clearly demonstrate that the proposed approach achieves superior performance over state-of-the-art methods.
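To make the abstract's pipeline concrete, here is a minimal PyTorch sketch of the two ingredients it names: a negative-correlation-style penalty for training diverse teachers, and a dueling Q-network that scores teachers per training sample, whose greedy choice drives a per-sample regression distillation loss. All names (DuelingQNet, ncl_penalty, distillation_loss), shapes, and hyperparameters (alpha, lam) are illustrative assumptions, not the authors' implementation; the DDQN training loop for the selector (replay buffer, target network, reward signal) is omitted.

```python
import torch
import torch.nn as nn


class DuelingQNet(nn.Module):
    """Dueling Q-network: scores each teacher (action) for a given state,
    here a feature vector summarising one training sample."""

    def __init__(self, state_dim: int, n_teachers: int, hidden: int = 64):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
        self.value = nn.Linear(hidden, 1)               # state value V(s)
        self.advantage = nn.Linear(hidden, n_teachers)  # per-teacher advantage A(s, a)

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        h = self.backbone(state)
        v, a = self.value(h), self.advantage(h)
        # Standard dueling aggregation: Q(s, a) = V(s) + A(s, a) - mean_a A(s, a)
        return v + a - a.mean(dim=-1, keepdim=True)


def ncl_penalty(teacher_preds: torch.Tensor, lam: float = 0.5) -> torch.Tensor:
    """Negative-correlation-learning style penalty: rewarding deviation of each
    teacher from the ensemble mean pushes the teachers to diversify.
    teacher_preds: (batch, n_teachers, out_dim)."""
    ensemble = teacher_preds.mean(dim=1, keepdim=True)
    return -lam * ((teacher_preds - ensemble) ** 2).mean()


def distillation_loss(student_pred, teacher_preds, target, q_values, alpha=0.5):
    """Per-sample regression KD loss: ground-truth MSE plus MSE towards the
    teacher chosen greedily from the Q-values (one teacher per sample)."""
    chosen = q_values.argmax(dim=-1)                                     # (batch,)
    rows = torch.arange(teacher_preds.size(0), device=teacher_preds.device)
    selected = teacher_preds[rows, chosen]                               # (batch, out_dim)
    hard = nn.functional.mse_loss(student_pred, target)
    soft = nn.functional.mse_loss(student_pred, selected)
    return alpha * hard + (1.0 - alpha) * soft


# Toy shapes only: 4 samples, 8 state features, 3 teachers, 1-dim output.
state = torch.randn(4, 8)
q_values = DuelingQNet(state_dim=8, n_teachers=3)(state)
student_pred, target = torch.randn(4, 1), torch.randn(4, 1)
teacher_preds = torch.randn(4, 3, 1)
loss = distillation_loss(student_pred, teacher_preds, target, q_values)
```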
License type:
Publisher Copyright
Funding Info:
This research/project is supported by the Agency for Science, Technology and Research (A*STAR) under the AME Young Individual Research Grant.
Grant Reference No.: A2084c0167
Description:
© 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
ISSN:
2691-4581
Files uploaded:

camera-ready-pdf-for-reinforced-kd-for-time-series-regression.pdf (4.02 MB, PDF)