TSLANet: Rethinking Transformers for Time Series Representation Learning

Title:
TSLANet: Rethinking Transformers for Time Series Representation Learning
Journal Title:
41st International Conference on Machine Learning
DOI:
Keywords:
Publication Date:
02 May 2024
Citation:
Eldele, E., Ragab, M., Chen, Z., Wu, M., & Li, X. (2024). TSLANet: Rethinking transformers for time series representation learning. ICML 2024.
Abstract:
Time series data, characterized by its intrinsic long- and short-range dependencies, poses a unique challenge across analytical applications. While Transformer-based models excel at capturing long-range dependencies, they face limitations in noise sensitivity, computational efficiency, and overfitting with smaller datasets. In response, we introduce a novel Time Series Lightweight Adaptive Network (TSLANet) as a universal convolutional model for diverse time series tasks. Specifically, we propose an Adaptive Spectral Block, harnessing Fourier analysis to enhance feature representation and to capture both long-term and short-term interactions while mitigating noise via adaptive thresholding. Additionally, we introduce an Interactive Convolution Block and leverage self-supervised learning to refine the capacity of TSLANet for decoding complex temporal patterns and improve its robustness on different datasets. Our comprehensive experiments demonstrate that TSLANet outperforms state-of-the-art models in various tasks spanning classification, forecasting, and anomaly detection, showcasing its resilience and adaptability across a spectrum of noise levels and data sizes. The code is available at https://github.com/emadeldeen24/TSLANet.
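The core idea of the Adaptive Spectral Block described in the abstract — transforming the series to the frequency domain and suppressing noisy components via a threshold — can be illustrated with a minimal sketch. Note this is a simplified, hypothetical illustration, not the paper's implementation: TSLANet learns the threshold and filter weights during training, whereas here the threshold is a fixed fraction of the peak spectral magnitude.

```python
import numpy as np

def adaptive_spectral_block(x, threshold=0.1):
    """Simplified spectral filtering sketch (illustrative, not the paper's code).

    1. Map the 1-D series to the frequency domain with a real FFT.
    2. Zero out frequency bins whose magnitude falls below a fraction
       (`threshold`) of the dominant bin's magnitude -- a stand-in for
       the learned adaptive thresholding in TSLANet.
    3. Invert back to the time domain.
    """
    spec = np.fft.rfft(x)
    mag = np.abs(spec)
    mask = mag >= threshold * mag.max()  # keep only dominant frequencies
    return np.fft.irfft(spec * mask, n=len(x))

# Example: denoising a noisy sine wave by keeping dominant frequencies.
t = np.linspace(0.0, 1.0, 128, endpoint=False)
clean = np.sin(2 * np.pi * 4 * t)
noisy = clean + 0.3 * np.random.default_rng(0).standard_normal(t.size)
filtered = adaptive_spectral_block(noisy, threshold=0.2)
```

Because the clean signal concentrates its energy in a single frequency bin while the noise spreads across the whole spectrum, thresholding removes most of the noise energy, which is the intuition behind using spectral filtering for noise-robust representations.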
License type:
Attribution 4.0 International (CC BY 4.0)
Funding Info:
This research/project is supported by the National Research Foundation, Singapore (NRF), under its AI Singapore Programme (AISG)
Grant Reference no. : AISG2-RP-2021-027
Description:
ISSN:
N.A.