Pahwa, R. S., Chang, R., Jie, W., Xun, X., Zaw Min, O., Sheng, F. C., Ser Choong, C., & Rao, V. S. (2022). Automated Detection and Segmentation of HBMs in 3D X-ray Images using Semi-Supervised Deep Learning. 2022 IEEE 72nd Electronic Components and Technology Conference (ECTC). https://doi.org/10.1109/ectc51906.2022.00297
Abstract:
Deep learning is widely used to identify and segment various 2D and 3D structures in voxelized data in fields such as robotics and medical imaging. Automated object detection and segmentation have a rich history in semiconductor inspection and defect detection technologies spanning the past few decades. Deep learning-based object detection and image segmentation have the potential to further improve defect detection accuracy and reduce the manpower required for the quality inspection process. We develop a novel framework that applies advances in deep learning-based object detection and image segmentation to leverage partially labeled data together with the remaining unlabeled data, significantly improving the performance of locating microscopic bumps and defects such as voids in the defect detection process. We apply our semi-supervised learning approach to various buried structures such as memory bumps and logic bumps. We briefly describe our fabrication and scanning process and thereafter explain in detail our approach to locating these different structures in 3D scans. We extract virtual 2D slices from the 3D scans and perform semi-supervised object detection and image segmentation to classify each pixel of these individual slices into solders, voids, Cu-pillars, and Cu-pads. We compare our approach with state-of-the-art fully supervised techniques and perform a thorough analysis to discuss the advantages and disadvantages of our approach in both the object detection and image segmentation steps.
License type:
Publisher Copyright
Funding Info:
This research/project is supported by the A*STAR Career Development Fund (Grant Reference No. C210812046).
This research/project is supported by the A*STAR AME Programmatic Funds (Grant Reference No. A20H6b0151).