Zhu, C., Chen, Z., Zhao, R., Wang, J., & Yan, R. (2021). Decoupled Feature-Temporal CNN: Explaining Deep Learning-Based Machine Health Monitoring. IEEE Transactions on Instrumentation and Measurement, 70, 1–13. doi:10.1109/tim.2021.3084310
Abstract:
Machine learning, especially deep learning, has been extensively applied and studied in machine health monitoring. For machine health monitoring systems (MHMS), most effort has gone into designing and deploying increasingly complex machine learning models. These black-box models are opaque with respect to their working mechanism, and this research trend carries substantial risk in practice. Since machine health monitoring is a high-stakes decision application, the outputs of autonomous monitoring systems should be trustworthy and reliable, which requires explainability. This leads to the key question: why does a deployed MHMS predict what it predicts? In this paper, we shed light on this meaningful research direction: explainable machine health monitoring systems (EMHMS). In EMHMS, the machine doctor can act like a real doctor who not only makes a diagnosis but also describes the patient's symptoms. First, we propose a specific convolutional neural network (CNN) structure, named DecouplEd Feature-Temporal CNN (DEFT-CNN), to balance the precision-explainability trade-off. Specifically, feature and temporal information are encoded in different stages of the model, and a spatial attention module is added to boost performance. Then, to explain the model's decisions, we adopt gradient-based methods to generate feature and temporal saliency maps that highlight which features and time steps are key to the model's predictions. Finally, we conduct experimental studies on two real datasets to verify the effectiveness of the proposed framework.
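The abstract describes an architecture that encodes feature and temporal information in separate stages, re-weights intermediate maps with spatial attention, and explains predictions via gradient-based saliency. The sketch below is a minimal, hedged illustration of that general recipe, not the authors' exact DEFT-CNN: all module names, layer sizes, and the toy input shape are assumptions introduced for illustration.

```python
# Hedged sketch of a decoupled feature-temporal CNN with spatial attention
# and gradient-based saliency, in the spirit of the abstract above.
# Layer choices and shapes are illustrative assumptions, not the paper's design.
import torch
import torch.nn as nn


class SpatialAttention(nn.Module):
    """Per-location attention weights computed from channel-pooled maps."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv1d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):                      # x: (batch, channels, time)
        avg = x.mean(dim=1, keepdim=True)      # channel-average map
        mx, _ = x.max(dim=1, keepdim=True)     # channel-max map
        attn = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * attn                        # re-weight time steps


class ToyDEFTCNN(nn.Module):
    """Feature encoding first (pointwise convs), temporal encoding second."""
    def __init__(self, in_channels=2, num_classes=4):
        super().__init__()
        self.feature_stage = nn.Sequential(            # encodes sensor features
            nn.Conv1d(in_channels, 16, kernel_size=1), nn.ReLU())
        self.attention = SpatialAttention()
        self.temporal_stage = nn.Sequential(           # encodes temporal patterns
            nn.Conv1d(16, 32, kernel_size=9, padding=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1))
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        h = self.feature_stage(x)
        h = self.attention(h)
        h = self.temporal_stage(h).squeeze(-1)
        return self.classifier(h)


def saliency_map(model, x, target_class):
    """Gradient of the target logit w.r.t. the input: large values mark
    the channels (features) and time steps that drive the prediction."""
    x = x.clone().requires_grad_(True)
    logits = model(x)
    logits[:, target_class].sum().backward()
    return x.grad.abs()                        # (batch, channels, time)


if __name__ == "__main__":
    model = ToyDEFTCNN()
    signal = torch.randn(1, 2, 128)            # 2 sensor channels, 128 time steps
    sal = saliency_map(model, signal, target_class=0)
    print(sal.shape)                           # torch.Size([1, 2, 128])
```

Because the input gradient keeps the (channel, time) layout of the raw signal, it can be read both as a feature saliency map (which sensor channels matter) and as a temporal saliency map (which time steps matter), matching the two views of explanation mentioned in the abstract.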
License type:
Publisher Copyright
Funding Info:
This research / project is supported by the Agency for Science, Technology and Research - AME Programmatic Funds - Learning with Less Data
Grant Reference no. : A20H6b0151
This research / project is supported by the Agency for Science, Technology and Research - Career Development Award - Contrastive Learning for Time Series Domain Adaptation
Grant Reference no. : C210112046