LiteFormer: A Lightweight and Efficient Transformer for Rotating Machine Fault Diagnosis

Title:
LiteFormer: A Lightweight and Efficient Transformer for Rotating Machine Fault Diagnosis
Journal Title:
IEEE Transactions on Reliability
Publication Date:
24 October 2023
Citation:
Sun, W., Yan, R., Jin, R., Xu, J., Yang, Y., & Chen, Z. (2023). LiteFormer: A Lightweight and Efficient Transformer for Rotating Machine Fault Diagnosis. IEEE Transactions on Reliability, 1–12. https://doi.org/10.1109/tr.2023.3322860
Abstract:
The Transformer has shown impressive global feature-modeling performance in many applications. However, two drawbacks induced by its intrinsic architecture limit its use, especially in fault diagnosis. First, the quadratic complexity of its self-attention scheme sharply increases the computational cost, making the Transformer difficult to deploy on computationally limited platforms such as industrial systems. Second, the Transformer's sequence-based modeling increases training difficulty and requires a large-scale training dataset; this drawback becomes serious in fault diagnosis, where only limited data are available. To mitigate these issues, we rethink this common approach and propose a new Transformer that is better suited to fault diagnosis. In this paper, we first show, both mathematically and experimentally, that the attention module can be replaced with, and under some conditions even surpassed by, a convolutional layer. We then integrate convolutions into the Transformer, which alleviates the computational burden and significantly improves fault classification accuracy. Furthermore, to increase computational efficiency, a lightweight Transformer called LiteFormer is developed using depth-wise convolutional layers. Extensive experiments are carried out on four datasets: CWRU, PU, and two gearbox datasets from DDS. In our experiments, LiteFormer not only reduces the computational cost of model training but also sets new state-of-the-art results, surpassing its counterparts in both fault classification accuracy and model robustness.
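To illustrate the core idea described in the abstract (this is a minimal sketch of the general technique, not the authors' published implementation; all module names, kernel sizes, and dimensions below are hypothetical), a PyTorch encoder block that swaps multi-head self-attention for a depth-wise 1-D convolution might look like this. The depth-wise convolution mixes information along the sequence axis with one kernel per channel, at linear rather than quadratic cost in sequence length:

```python
import torch
import torch.nn as nn

class DepthwiseConvMixer(nn.Module):
    """Hypothetical drop-in replacement for self-attention: a depth-wise
    1-D convolution mixes tokens along the sequence axis with a separate
    kernel per channel, giving linear cost in sequence length."""
    def __init__(self, dim, kernel_size=31):
        super().__init__()
        self.conv = nn.Conv1d(
            dim, dim, kernel_size,
            padding=kernel_size // 2,  # odd kernel keeps sequence length
            groups=dim,                # depth-wise: one kernel per channel
        )

    def forward(self, x):           # x: (batch, seq_len, dim)
        x = x.transpose(1, 2)       # -> (batch, dim, seq_len) for Conv1d
        x = self.conv(x)
        return x.transpose(1, 2)    # -> (batch, seq_len, dim)

class LiteFormerStyleBlock(nn.Module):
    """Transformer-style encoder block with the attention module
    replaced by the depth-wise convolutional mixer above."""
    def __init__(self, dim, mlp_ratio=4, kernel_size=31):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.mixer = DepthwiseConvMixer(dim, kernel_size)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, dim * mlp_ratio),
            nn.GELU(),
            nn.Linear(dim * mlp_ratio, dim),
        )

    def forward(self, x):
        x = x + self.mixer(self.norm1(x))  # token mixing, residual
        x = x + self.mlp(self.norm2(x))    # channel mixing, residual
        return x

# Usage: a batch of 8 sequences of 128 embedded vibration-signal patches
x = torch.randn(8, 128, 64)                # (batch, seq_len, dim)
y = LiteFormerStyleBlock(dim=64)(x)
print(y.shape)                             # torch.Size([8, 128, 64])
```

Grouping the convolution by channel (`groups=dim`) is what makes the mixer lightweight: parameter count and compute grow with `dim * kernel_size` rather than with the square of the sequence length, which is the efficiency argument the abstract makes.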
License type:
Publisher Copyright
Funding Info:
This research/project is supported by the Agency for Science, Technology and Research (A*STAR), AME Programmatic Funds.
Grant Reference no.: A20H6b0151

This work was supported in part by the National Natural Science Foundation of China (Grant No. 51835009).
Description:
© 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
ISSN:
0018-9529
1558-1721
Files uploaded:
final-version.pdf (4.35 MB, PDF), available on request