Ng, H. P., Chan, Q., Tan, Z. J., Ssebaggala, R., & Lifton, J. J. (2025). Segmenting spatter particles on additively manufactured surfaces using deep learning. Surface Topography: Metrology and Properties, 13(1), 015006. https://doi.org/10.1088/2051-672x/ada6e1
Abstract:
Metal additively manufactured (AM) surfaces do not exhibit the same surface features as machined surfaces. Rather than cutting marks, an additive surface may display surface features such as spatter particles, weld tracks, cracks, and surface-breaking pores. These features are not well described by surface height parameters that were developed for machined surfaces; therefore, an AM-specific surface characterisation approach is required. Feature-based surface characterisation is a promising approach, but it requires surface features to be manually segmented, which is a subjective process. In this work, a U-Net spatter particle segmentation algorithm is developed that removes the subjectivity of manual surface feature segmentation. A U-Net model is trained to segment spatter particles from optical measurements of 20 different metal AM samples. The performance of the U-Net segmentation algorithm is compared with segmenting the spatter particles using manual thresholding. The results show that the U-Net segmentation approach outperforms manual segmentation for 2 of the 3 test samples considered. It is found that, for 2 of the 3 samples, the U-Net segmentation algorithm detects spatter particles that are missed by the manual segmentation approach. It is concluded that further training of the U-Net approach is required before it can fully supersede manual segmentation. In the future, it may be possible to replace human operators who subjectively segment surface features with robust machine learning-based surface feature segmentation algorithms. This novel application of U-Net for AM surface feature segmentation has the potential to automate surface characterisation for metal AM process optimisation, and for quality control in production environments.
License type:
Attribution 4.0 International (CC BY 4.0)
Funding Info:
This research is supported by the Agency for Science, Technology and Research (A*STAR), Singapore, under the Central Research Fund.
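Illustrative code sketch (not part of the published record):
The abstract above applies a U-Net, an encoder-decoder convolutional network with skip connections, to per-pixel segmentation of spatter particles in optical surface measurements. The sketch below is a minimal, hypothetical PyTorch example of that general architecture only; the class name TinyUNet, the channel widths, the input size, and the single-channel height-map input are assumptions made for illustration and do not reflect the authors' model, data, or training setup.

# Minimal U-Net-style binary segmentation sketch in PyTorch (illustrative, not the authors' code).
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with ReLU: the basic U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    """Hypothetical two-level U-Net: encoder, bottleneck, and decoder with skip connections."""
    def __init__(self, in_channels=1, n_classes=1):
        super().__init__()
        self.enc1 = conv_block(in_channels, 32)
        self.enc2 = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(64, 128)
        self.up2 = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
        self.dec2 = conv_block(128, 64)   # 128 in: upsampled features concatenated with enc2 skip
        self.up1 = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.dec1 = conv_block(64, 32)    # 64 in: upsampled features concatenated with enc1 skip
        self.head = nn.Conv2d(32, n_classes, kernel_size=1)  # per-pixel spatter logit

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)

# Example: a single-channel surface-measurement patch -> per-pixel spatter probability map.
model = TinyUNet()
height_map = torch.randn(1, 1, 128, 128)         # placeholder for optical/height-map data
spatter_prob = torch.sigmoid(model(height_map))  # threshold (e.g. at 0.5) for a binary mask

In practice a model of this kind would be trained on annotated spatter masks; in the study summarised above, the learned segmentation is compared against manual thresholding of the same surfaces.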