Contrastive Distillation With Regularized Knowledge for Deep Model Compression on Sensor-Based Human Activity Recognition

Title:
Contrastive Distillation With Regularized Knowledge for Deep Model Compression on Sensor-Based Human Activity Recognition
Journal Title:
IEEE Transactions on Industrial Cyber-Physical Systems
Publication Date:
28 September 2023
Citation:
Xu, Q., Wu, M., Li, X., Mao, K., & Chen, Z. (2023). Contrastive Distillation With Regularized Knowledge for Deep Model Compression on Sensor-Based Human Activity Recognition. IEEE Transactions on Industrial Cyber-Physical Systems, 1, 217–226. https://doi.org/10.1109/ticps.2023.3320630
Abstract:
Deep learning (DL) approaches have been widely applied to sensor-based human activity recognition (HAR). However, existing approaches fail to distinguish human activities that have similar sensory reading patterns, resulting in poor recognition accuracy on those activities. Moreover, these deep models often rely on complex network architectures for performance improvement; their expensive computational cost and memory requirements hinder deployment in resource-limited environments such as smartphones. As one of the most popular model compression techniques, knowledge distillation (KD) can be leveraged to boost the performance of a compact student with the knowledge from a cumbersome teacher. However, most existing KD works ignore the bias introduced by the teacher's logits during distillation, leading to sub-optimal student training. To address the above issues, we propose a novel Contrastive Distillation framework with Regularized Knowledge (ConDRK) for sensor-based human activity recognition. In particular, we first utilize multiple intra-class samples to formulate a novel unbiased soft target as regularized knowledge, which significantly reduces the bias introduced by the teacher. Then, we propose a contrastive distillation scheme that employs the proposed unbiased soft target as the positive pair and samples from other classes as negative pairs to transfer the knowledge. Our approach not only reduces the intra-class variance but also maximizes the inter-class distances, further enhancing the recognition performance of the compact student on similar activities. Extensive experimental results on two sensor-based HAR datasets clearly show that our proposed method achieves superior and consistent performance over other state-of-the-art methods.
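The two core ideas in the abstract — averaging teacher soft targets over multiple intra-class samples to form a "regularized knowledge" target, and using that target as the positive in a contrastive distillation loss with other classes as negatives — can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's exact formulation: the function names, the InfoNCE-style form of the loss, the dot-product similarity, and the temperature value are all hypothetical choices made here for clarity.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax along the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def regularized_knowledge(teacher_logits, labels, target_class, T=4.0):
    """Average the teacher's soft targets over all intra-class samples of
    `target_class`, damping the bias of any single sample's logits."""
    mask = labels == target_class
    return softmax(teacher_logits[mask], T).mean(axis=0)

def contrastive_distillation_loss(student_logits, teacher_logits, labels, T=4.0):
    """InfoNCE-style sketch: pull each student prediction toward its class's
    regularized teacher target (positive), push it away from the regularized
    targets of all other classes (negatives)."""
    classes = np.unique(labels)
    # One class-averaged (regularized) soft target per class: shape (C, K)
    targets = np.stack([regularized_knowledge(teacher_logits, labels, c, T)
                        for c in classes])
    p_student = softmax(student_logits, T)           # (N, K)
    sim = np.exp(p_student @ targets.T)              # (N, C) similarities
    pos = sim[np.arange(len(labels)), np.searchsorted(classes, labels)]
    return float(np.mean(-np.log(pos / sim.sum(axis=1))))
```

A usage example: with a teacher logit matrix of shape `(N, K)` and integer labels, `contrastive_distillation_loss(student_logits, teacher_logits, labels)` returns a scalar that decreases as the student's softened predictions align with their own class's averaged teacher target and diverge from the other classes' targets.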
License type:
Publisher Copyright
Funding Info:
This research / project is supported by the Agency for Science, Technology and Research (A*STAR) - NRF AME Young Individual Research Grant
Grant Reference no. : A2084c0167
Description:
© 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
ISSN:
2832-7004
Files uploaded:

contrastive-distillation-with-regularized-knowledge-for-deep-model-compression-on-sensor-based-human-actitivty-recognition.pdf (PDF, 3.05 MB)