Liu, T., Wei, Q., Chen, J., Liu, W., Huang, W., Srivastava, R., Cheng, Z., Zeng, Z., Veeravalli, B., & Yang, X. (2024). Unsupervised 3D Lung Segmentation by Leveraging 2D Segment Anything Model. 2024 46th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 1–4. https://doi.org/10.1109/embc53108.2024.10782129
Abstract:
Lung segmentation is the first important step for lung nodule detection and lung cancer analysis. Deep neural networks have achieved state-of-the-art performance on most tasks in medical image analysis, including lung segmentation. However, training a deep learning model requires a large number of annotated samples, which is often impractical in medical imaging. In this study, we perform unsupervised lung segmentation on 3D lung CT data by leveraging the foundational 2D Segment Anything Model (SAM). The approach uses SAM to segment 2D slices and generate 2D masks, then reconstructs the multiple 2D masks from the same subject into one 3D mask. In this way, we can train a 3D lung segmentation model using the reconstructed 3D masks without requiring any ground-truth annotations, i.e., in an unsupervised manner. Evaluation on the LUNA16 dataset shows that our proposed unsupervised 3D model achieves results comparable to the supervised model trained with ground-truth annotations, with enhanced stability.
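The sketch below illustrates the slice-wise pseudo-labelling idea outlined in the abstract, using the public segment-anything package: run SAM on each axial CT slice and stack the resulting 2D masks into a 3D pseudo-mask that could serve as a training target for a 3D model. The abstract does not specify how lung masks are selected from SAM's proposals or how the CT is windowed, so the selection heuristic, window values, checkpoint name, and thresholds here are illustrative assumptions, not the authors' method.

```python
# Minimal sketch of slice-wise SAM pseudo-labelling for a 3D CT volume.
# Assumes the public `segment-anything` package and a downloaded ViT-B checkpoint.
import numpy as np
from segment_anything import sam_model_registry, SamAutomaticMaskGenerator


def ct_slice_to_rgb(slice_hu, window=(-1000.0, 400.0)):
    """Window a Hounsfield-unit slice and convert it to the HxWx3 uint8 image SAM expects."""
    lo, hi = window  # window values are an assumption, not from the paper
    clipped = np.clip(slice_hu, lo, hi)
    scaled = ((clipped - lo) / (hi - lo) * 255.0).astype(np.uint8)
    return np.stack([scaled] * 3, axis=-1)


def pseudo_label_volume(volume_hu, checkpoint="sam_vit_b.pth", min_area=2000):
    """Segment each axial slice with SAM and stack the 2D masks into one 3D pseudo-mask."""
    sam = sam_model_registry["vit_b"](checkpoint=checkpoint)
    mask_generator = SamAutomaticMaskGenerator(sam)

    pseudo_mask = np.zeros(volume_hu.shape, dtype=np.uint8)
    for z in range(volume_hu.shape[0]):
        rgb = ct_slice_to_rgb(volume_hu[z])
        proposals = mask_generator.generate(rgb)  # list of dicts with 'segmentation', 'area', ...
        for p in proposals:
            # Heuristic lung selection (assumption): keep large, dark (air-filled) regions.
            if p["area"] >= min_area and rgb[..., 0][p["segmentation"]].mean() < 60:
                pseudo_mask[z][p["segmentation"]] = 1
    return pseudo_mask  # reconstructed 3D mask, usable as an unsupervised training target
```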
License type:
Publisher Copyright
Funding Info:
This research/project is supported by the Agency for Science, Technology and Research - Manufacturing, Trade, and Connectivity Programmatic Funds
Grant Reference no.: M23L7b0021