Self-Supervised Learning for Label-Efficient Sleep Stage Classification: A Comprehensive Evaluation

Title:
Self-Supervised Learning for Label-Efficient Sleep Stage Classification: A Comprehensive Evaluation
Journal Title:
IEEE Transactions on Neural Systems and Rehabilitation Engineering
Publication Date:
14 February 2023
Citation:
Eldele, E., Ragab, M., Chen, Z., Wu, M., Kwoh, C.-K., & Li, X. (2023). Self-Supervised Learning for Label-Efficient Sleep Stage Classification: A Comprehensive Evaluation. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 31, 1333–1342. https://doi.org/10.1109/tnsre.2023.3245285
Abstract:
The past few years have witnessed remarkable advances in deep learning for EEG-based sleep stage classification (SSC). However, the success of these models hinges on access to massive amounts of labeled training data, limiting their applicability in real-world scenarios. In such scenarios, sleep labs can generate massive amounts of data, but labeling it is expensive and time-consuming. Recently, the self-supervised learning (SSL) paradigm has emerged as one of the most successful techniques for overcoming label scarcity. In this paper, we evaluate the efficacy of SSL in boosting the performance of existing SSC models in the few-labels regime. We conduct a thorough study on three SSC datasets and find that fine-tuning the pretrained SSC models with only 5% of the labeled data achieves performance competitive with supervised training on full labels. Moreover, self-supervised pretraining makes SSC models more robust to data imbalance and domain shift.
License type:
Attribution 4.0 International (CC BY 4.0)
Funding Info:
This research / project is supported by the Ministry of Education - Academic Research Fund Tier 2
Grant Reference no. : MOE2019-T2-2-175

This research / project is supported by the A*STAR - AME Programmatic Funds
Grant Reference no. : A20H6b0151

This research / project is supported by the A*STAR - Career Development Award
Grant Reference no. : C210112046

This research / project is supported by the A*STAR Singapore International Graduate Award (SINGA) Scholarship
ISSN:
1534-4320
1558-0210
Files uploaded:
self-s1.pdf (1.70 MB, PDF)