Deep learning-based plastic classification using spectroscopic data

Title:
Deep learning-based plastic classification using spectroscopic data
Journal Title:
Journal of Cleaner Production
Publication Date:
17 October 2025
Citation:
Singh, A. R., Neo, E. R. K., Lai, C. M., Hazra, S., Coles, S., Peijs, T., & Debattista, K. (2025). Deep learning-based plastic classification using spectroscopic data. Journal of Cleaner Production, 530, 146793. https://doi.org/10.1016/j.jclepro.2025.146793
Abstract:
Plastic recycling represents a significant global challenge. Effective sorting is critical to achieving high-quality recyclate. While infrared (IR) spectroscopy-based classification has become a standard approach, current methods rely on traditional chemometric models and shallow deep learning (DL) architectures, which often struggle with noisy spectral data and limited feature extraction capabilities. To address these challenges, this study proposes a trainable preprocessing module incorporating average pooling layers, layer normalisation, or a combination of both to reduce noise and improve model stability. Additionally, we introduce a Transformer-based deep learning model for efficient plastic classification, alongside improved Convolutional Neural Network (CNN) and Artificial Neural Network (ANN) architectures. Our approach was evaluated on three spectroscopic datasets, including a newly introduced Fourier Transform Mid-Infrared (FTIR) dataset and two open-source near-infrared (NIR) datasets with varying spectral resolutions. The results demonstrate that our proposed methods outperform state-of-the-art machine learning and deep learning models, achieving a 5.4% improvement in F1-score on the FTIR dataset, along with 1.8% and 1.3% improvements in F1-score on NIR datasets, reaching near-perfect classification performance.
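The abstract describes a trainable preprocessing module that combines average pooling and layer normalisation to suppress noise in spectral data. As a minimal illustration only (the function names, window size, and NumPy-based formulation below are assumptions for this sketch, not the paper's implementation, which uses trainable layers inside a deep network), such a step applied to a 1-D spectrum might look like:

```python
import numpy as np

def avg_pool_1d(spectrum, window=5):
    """Smooth a 1-D spectrum with a moving-average (average-pooling) kernel,
    stride 1, 'same' padding so the output length matches the input."""
    kernel = np.ones(window) / window
    return np.convolve(spectrum, kernel, mode="same")

def layer_norm(spectrum, eps=1e-5):
    """Normalise a spectrum to zero mean and (approximately) unit variance,
    analogous to layer normalisation over the spectral axis."""
    return (spectrum - spectrum.mean()) / np.sqrt(spectrum.var() + eps)

def preprocess(spectrum, window=5):
    """Combined step: pooling to reduce high-frequency noise,
    then normalisation to stabilise the input scale."""
    return layer_norm(avg_pool_1d(spectrum, window))

# Synthetic example: a Gaussian absorbance band with additive noise
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
noisy = np.exp(-((x - 0.5) ** 2) / 0.01) + 0.1 * rng.standard_normal(200)
clean = preprocess(noisy)  # same length, zero mean, unit scale
```

In the paper these operations are placed inside the network as trainable layers, so their effect is learned jointly with the classifier rather than fixed in advance as in this sketch.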
License type:
Attribution 4.0 International (CC BY 4.0)
Funding Info:
This research received no specific funding.
ISSN:
0959-6526